
Real name: Alex Meiburg.

Physics PhD student at UCSB (expected grad ~2024).

I'm trying to compile information on types of graph containment. For each type, the complexity entries should give upper and lower bounds on the difficulty, both with H fixed and with H varying.

For instance, for many types of substructure (such as subgraph), finding an H inside G can trivially be done in O(G^H) time. (I use G synonymously with |V(G)|, its vertex count.) In the worst case this is exponential in H. But if you fix H, then there may (or may not) be algorithms that 'move' the difficulty, e.g. O(H! * G^2). The problem would then be fixed-parameter tractable.
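
To make the trivial bound concrete, here is a minimal brute-force sketch in Python (my own illustration -- the function name and the edge-list input format are assumptions, not any standard API). It enumerates every injective map from V(H) into V(G), which is exactly the O(G^H)-style algorithm described above.

<pre>
from itertools import permutations

def contains_subgraph(G_edges, G_vertices, H_edges, H_vertices):
    """Brute-force test of whether H is a subgraph of G.

    Tries every injective map f: V(H) -> V(G) and checks that each edge
    (u, v) of H is carried to an edge (f(u), f(v)) of G.  There are at
    most G^H such maps, each checked in O(|E(H)|) time -- the trivial
    bound from the text, exponential in H but polynomial for fixed H.
    """
    G_adj = {(a, b) for a, b in G_edges} | {(b, a) for a, b in G_edges}
    for image in permutations(G_vertices, len(H_vertices)):
        f = dict(zip(H_vertices, image))   # an injective map V(H) -> V(G)
        if all((f[u], f[v]) in G_adj for u, v in H_edges):
            return True
    return False

# Example: the triangle K_3 sits inside K_4 but not inside the path P_4.
K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
P4 = [(0, 1), (1, 2), (2, 3)]
K3 = [(0, 1), (1, 2), (0, 2)]
print(contains_subgraph(K4, range(4), K3, range(3)))   # True
print(contains_subgraph(P4, range(4), K3, range(3)))   # False
</pre>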

Being well-quasi-ordered (WQO) is important, because it implies that any property closed under these operations can be described by a finite set of forbidden substructures. The classic case is minors, which form a WQO, and for which containment of a fixed H can be tested in O(G^2) time. There is a ''very'' nontrivial constant factor that depends on H, though, rendering most of these algorithms impractical. I'll use the notation '''RS'''(H) to indicate that an algorithm uses the Robertson-Seymour subroutine, so that although the dependence on G is a fixed polynomial, the constant depending on H matters. A weaker property than being WQO is the descending chain condition (DCC), which means that there are no infinite descending chains of containment. Relations that respect some integral measure of size (e.g. |V| or |E|) always have the DCC, but others, such as graph homomorphism, might not. When a relation is not a WQO, I will usually give an example of why, such as the cliques K<sub>n</sub>, cycles C<sub>n</sub>, or complete bipartite graphs K<sub>n,n</sub>, which form an infinite antichain. Finally, there is also an ascending chain condition (ACC), which usually applies when the vertex or edge count is fixed: all graphs containing a given H have bounded size.
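
As a small self-contained illustration (my own, only meant for checking tiny cases), the snippet below verifies by brute force that no cycle contains a different cycle as a subgraph, so the cycles C<sub>n</sub> really do form an infinite antichain of the kind mentioned above.

<pre>
from itertools import permutations

def cycle_edges(n):
    """Edge set of the cycle C_n on vertices 0..n-1, as frozensets."""
    return {frozenset({i, (i + 1) % n}) for i in range(n)}

def cycle_contains(n, m):
    """Brute-force check: does C_n contain C_m as a subgraph?"""
    G, H = cycle_edges(n), cycle_edges(m)
    for image in permutations(range(n), m):   # injective maps V(C_m) -> V(C_n)
        f = dict(zip(range(m), image))
        if all(frozenset({f[u], f[v]}) in G for u, v in map(tuple, H)):
            return True
    return False

# Distinct cycles never contain one another as subgraphs, so
# {C_3, C_4, C_5, ...} is an infinite antichain: the subgraph order is not a WQO.
for m in range(3, 7):
    for n in range(3, 7):
        assert cycle_contains(n, m) == (m == n)
print("cycles C_3..C_6 are pairwise incomparable under the subgraph order")
</pre>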

For relations that require G and H to be of the same size in some sense (such as Graph Isomorphism, Spanning Subgraph, Subdivision, etc. -- technically, those that admit a kernelization algorithm for fixed H), the "fixed H" problem becomes trivial. This includes all relations with the ascending chain condition, as there are only finitely many G's to check.
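
As a hypothetical sketch of the "finitely many G's" point (my own illustration, not a standard routine): with H fixed, every graph containing H as a spanning subgraph is isomorphic to one of the finitely many graphs obtained from H by adding edges, so they can simply be listed.

<pre>
from itertools import combinations

def spanning_supergraphs(H_edges, k):
    """All graphs obtained from H (on vertices 0..k-1) by adding edges.

    Every graph that contains H as a spanning subgraph is isomorphic to
    one of these, and there are at most 2^C(k,2) of them -- a finite
    list once H is fixed, which is the ACC/kernelization observation.
    """
    H = {frozenset(e) for e in H_edges}
    non_edges = [frozenset(p) for p in combinations(range(k), 2)
                 if frozenset(p) not in H]
    return [H | set(extra)
            for r in range(len(non_edges) + 1)
            for extra in combinations(non_edges, r)]

# The path 0-1-2 has exactly two edge-supergraphs on the same vertex set:
# itself and the triangle K_3.
print(len(spanning_supergraphs([(0, 1), (1, 2)], 3)))   # 2
</pre>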

Complexities are usually given in the form {complete class; upper bound on runtime; lower bound on runtime under reasonable assumptions}. I assume the ETH (that 3SAT can't be solved in 2<sup>o(n)</sup> time), which has the FPT corollary that Clique can't be solved in f(k)n<sup>o(k)</sup> time. If there is a simple case that shows why something is NP-Hard, I will mention it: in particular, finding Hamiltonian cycles and finding cliques are hard.

{| class="wikitable" style="text-align: center; width: 1000px; height: 200px;"
|+ Graph Substructure Table
! scope="col" | Name
! scope="col" | Reductions
! scope="col" | Map ''f'': H → G
! scope="col" | Ordering Properties
! scope="col" | Complexity (varying H)
! scope="col" | Complexity (fixed H)
|-
! scope="row" | Graph Isomorphism
| ''None'' || '''E'''(u,v) ↔ '''E'''(''f''(u), ''f''(v))<br />''f''<sub>V</sub> bijective || Equivalence <br />Not WQO (e.g. K<sub>n</sub>)<br />DCC, ACC || GI-Complete <br /> e<sup>O(log<sup>3</sup> G)</sup> ''[1]'' <br /> - || ''Trivial'' <br />(Fixed vertex count; ACC)
|-
! scope="row" | Induced Subgraph
| Vertex deletion || '''E'''(u,v) ↔ '''E'''(''f''(u), ''f''(v))<br />''f''<sub>V</sub> injective || Not WQO (e.g. C<sub>n</sub>)<br />DCC || NP-Complete (cliques) <br /> G<sup>H+2</sup> <br /> 2<sup>o(G)</sup> || W[1]-Complete (cliques) <br /> G<sup>H+2</sup> <br /> f(H)G<sup>o(H)</sup>
|-
! scope="row" | Subgraph
| Vertex deletion, Edge deletion || '''E'''(u,v) → '''E'''(''f''(u), ''f''(v))<br />''f''<sub>V</sub> injective || Not WQO (e.g. C<sub>n</sub>)<br />DCC || NP-Complete (cliques) <br /> G<sup>H+2</sup> <br /> 2<sup>o(G)</sup> || W[1]-Complete (cliques) <br /> G<sup>H+2</sup> <br /> f(H)G<sup>o(H)</sup>
|-
! scope="row" | Spanning Subgraph
| Edge deletion || '''E'''(u,v) → '''E'''(''f''(u), ''f''(v))<br />''f''<sub>V</sub> bijective || Not WQO (e.g. C<sub>n</sub>)<br />DCC, ACC || NP-Complete (Hamiltonian cycle) <br /> G<sup>H+2</sup> <br /> 2<sup>o(G)</sup> || ''Trivial'' <br />(Fixed vertex count; ACC)
|-
! scope="row" | Dissolution <br />Equivalent to ''Smoothing''<br />Dual to ''Subdivision'' or ''Expansion''
| Edge deletion || ''f''<sub>V</sub> injective<br />''f''<sub>E</sub> carries edges to distinct induced paths || Not WQO (e.g. K<sub>n</sub>)<br />DCC || NP-Complete ''[2]'' <br /> G<sup>H+2</sup> <br /> 2<sup>o(G)</sup> || -
|}

References:

[1] https://cstheory.stackexchange.com/a/40373/12211

[2] Own work