Encyclopaedia of Design Theory: Glossary
The label "Topics" or "Encyc" against a glossary entry is a link to either a topic essay or an encyclopaedia entry for this subject; the label "Example" points you to an example, and "External" to a page on the Web with more information. (Of course we are not responsible for the availability or content of external web pages.)
This Warning sign is used for terms where there are conflicting meanings in the literature, or other problems of terminology. This is not infrequent in design theory, partly because the combinatorial and statistical aspects of the subject have developed with relatively little interaction. |
Many variations are possible. For example, in a multigraph we can take the (v,w) entry to be the number of edges incident with v and w. In a digraph we can take it to be the number of edges with initial vertex v and terminal vertex w.
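These two conventions can be illustrated with a short Python sketch (the edge list and vertex count here are made up purely for illustration):

```python
# Edges given as (tail, head) pairs; parallel edges may repeat.
edges = [(0, 1), (0, 1), (1, 2), (2, 0)]
n = 3

# Digraph convention: entry (v, w) counts edges with initial vertex v
# and terminal vertex w.
digraph = [[0] * n for _ in range(n)]
for v, w in edges:
    digraph[v][w] += 1

# Multigraph convention: entry (v, w) counts edges incident with both
# v and w, ignoring direction.
multigraph = [[0] * n for _ in range(n)]
for v, w in edges:
    multigraph[v][w] += 1
    multigraph[w][v] += 1

print(digraph)     # [[0, 2, 0], [0, 0, 1], [1, 0, 0]]
print(multigraph)  # [[0, 2, 1], [2, 0, 1], [1, 1, 0]]
```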
If the ring multiplication satisfies the associative law, we have an associative algebra. Similarly, if the commutative law holds, we have a commutative algebra. The prototypical example of an associative algebra is the set of all n×n matrices over the field F. Other kinds of algebra such as Lie algebras and Jordan algebras can be defined by different laws.
Sometimes we allow the use of different symbols in different coordinate positions, in which case there is an alphabet associated with each coordinate.
Some authors use the term "association scheme" for any coherent configuration. Others use the term for a homogeneous configuration, that is, one for which the identity is one of the matrices. Yet others don't require the matrices to be symmetric but do require that they commute. In view of this confusion, we strongly advocate using the term association scheme in the sense defined here, and using the terms commutative coherent configuration, homogeneous coherent configuration, and coherent configuration for the more general concepts. |
Sometimes, different choices of what the "relevant structure" is lead to different concepts of automorphism (which may be distinguished as strong automorphism, weak automorphism, etc.) For example, given a resolved design, we could define a weak automorphism to preserve the design and the resolution, while a strong automorphism fixes every parallel class in the resolution.
Another of those words used differently in combinatorics and statistics. See D. A. Preece, Balance and designs: Another terminological tangle, Utilitas Math. 21C (1982), 85-186. |
In combinatorics, pairwise balance means that two points are contained in a constant number of blocks. This obviously generalises to t-wise balance.
In statistics, there are various different kinds of balance (variance balance, efficiency balance, etc.), none of which is equivalent to the combinatorialist's pairwise balance, though they become equivalent to it if some extra conditions are satisfied (e.g. in binary equireplicate uniform block designs, aka 1-designs). To add to the complication, the combinatorialist's 3-design is called "doubly balanced"(!)
"Balance" also arises in other types of designs such as weighing matrices.
A basis of a matroid is a maximal independent set.
A bilinear form B can be represented by a matrix A:
B(x,y) = x A y^T.
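For instance, a small Python sketch (with an arbitrary illustrative matrix A) computes B(x,y) = x A y^T by writing out the double sum:

```python
# An arbitrary 2x2 matrix chosen for illustration.
A = [[1, 2],
     [0, 3]]

def B(x, y):
    # x A y^T, written out as a double sum over the entries of A.
    return sum(x[i] * A[i][j] * y[j]
               for i in range(len(x)) for j in range(len(y)))

x, y = [1, 1], [2, 0]
print(B(x, y))  # 2  (= [1,1] A [2,0]^T)
```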
In incidence structures, "point" and "block" are the usual terms for the two types of element. Often a block is identified with a subset of the set of points, namely, those points incident with it. If the same set of points arises from more than one block, we say that the structure has repeated blocks.
Be warned that some combinatorialists use the term as a synonym for "2-design" (see t-design). |
If the sets A and B carry some structure, for example if they are groups or graphs, then we would expect that their cartesian product would carry the same kind of structure, defined coordinatewise. Indeed, the direct product of groups, or the direct sum of abelian groups or vector spaces, is defined in exactly this way. However, for various reasons, the coordinatewise product of graphs is called the categorical product; the cartesian product of graphs is defined differently.
Other graph products include the lexicographic product and the strong product.
Not every vertex-transitive graph is a Cayley graph; the Petersen graph is an exception.
For some structures (groups, or more generally loops or quasigroups), the Cayley table is a Latin square.
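This can be checked directly for a small example; the sketch below builds the Cayley table of the cyclic group Z_5 (chosen purely for illustration) and verifies the Latin square property:

```python
# Cayley table of (Z_5, +): entry (i, j) is i + j mod 5.
n = 5
table = [[(i + j) % n for j in range(n)] for i in range(n)]

def is_latin_square(t):
    # Every symbol must occur exactly once in each row and each column.
    symbols = set(range(len(t)))
    rows_ok = all(set(row) == symbols for row in t)
    cols_ok = all(set(col) == symbols for col in zip(*t))
    return rows_ok and cols_ok

print(is_latin_square(table))  # True
```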
For an arbitrary group, a representation is a homomorphism to the group of invertible n×n complex matrices, and the corresponding character is the function which takes any element to the trace of the representing matrix. A character is called irreducible if the representing matrices have no common invariant subspace other than the zero subspace and the whole space. A character is constant on the conjugacy classes of the group, and the numbers of irreducible characters and conjugacy classes are equal; the character table is the square table which gives the values of the irreducible characters on the conjugacy classes.
A circulant graph is a graph which is also a cyclic design. The cyclic graph is a circulant graph, but not every circulant graph is cyclic! |
An association scheme is a symmetric coherent configuration.
The basis matrices of a permutation group form a coherent configuration. Such a configuration is called Schurian.
Some authors use the term "association scheme" for any coherent configuration. Others use the term for a homogeneous configuration, that is, one for which the identity is one of the matrices. Yet others don't require the matrices to be symmetric but do require that they commute. In view of this confusion, we strongly advocate using the term association scheme for a coherent configuration all of whose matrices are symmetric, and using the terms commutative coherent configuration, homogeneous coherent configuration, and coherent configuration for the more general concepts. |
A block design is connected if its incidence graph is.
If the group happens to be Abelian, or if H is a normal subgroup of G, then the left and right cosets coincide and we simply speak of "cosets". In any case, the numbers of left and right cosets are equal; this number is the index of H in G.
Distinct left cosets are disjoint, and each has the same cardinality as the subgroup H; so they form a partition of the group. The same applies to right cosets. This is the basis of Lagrange's Theorem.
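A small Python sketch (using the cyclic group Z_12 and the subgroup H = {0, 4, 8} as an arbitrary illustration) shows the cosets partitioning the group:

```python
# Cosets of H = {0, 4, 8} in (Z_12, +).  Since the group is abelian,
# left and right cosets coincide.
G = set(range(12))
H = {0, 4, 8}

cosets = {frozenset((h + x) % 12 for h in H) for x in G}
print(len(cosets))                            # 4  (the index of H in G)
print(all(len(c) == len(H) for c in cosets))  # True: equal cardinality
```

Lagrange's Theorem is visible here: |G| = (number of cosets) × |H| = 4 × 3 = 12.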
See also tight single-change covering design.
If the second condition is deleted, the set is a uniquely completable set.
The analogous concept for permutation groups is the direct product.
If D is a difference set in G with |G|=v and |D|=k, then the translates of D (the sets D+x, for x in G) are the blocks of a square 2-(v,k,lambda) design.
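For example, D = {1, 2, 4} is a (7,3,1) difference set in Z_7; the sketch below forms its translates and checks that every pair of points lies in exactly one block (the resulting design is the Fano plane):

```python
from itertools import combinations

# Translates of the difference set D = {1, 2, 4} in Z_7.
D = {1, 2, 4}
blocks = [frozenset((d + x) % 7 for d in D) for x in range(7)]

# Every pair of points should lie in exactly lambda = 1 block.
counts = {pair: sum(set(pair) <= b for b in blocks)
          for pair in combinations(range(7), 2)}
print(set(counts.values()))  # {1}
```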
Some people write this group as D2n, others as Dn. |
If Gi is a permutation group acting on a set Xi for i=1,2, there are two natural actions of the direct product G1×G2: the intransitive action on the disjoint union of X1 and X2, and the product action on the cartesian product X1×X2.
A distance-regular graph of diameter 2 is strongly regular, and any connected strongly regular graph is distance-regular with diameter 2.
Any distance-transitive graph is distance-regular.
Any distributive lattice can be represented as the lattice of all ancestral subsets of some partially ordered set, where the order is set-theoretic inclusion and the operations of meet and join are union and intersection.
If the multiplication satisfies the commutative law, then the two distributive laws are equivalent.
The rows of a Hadamard matrix form an equidistant code, since any two distinct rows agree in exactly half of their entries and disagree in the remainder.
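This can be verified for the Sylvester Hadamard matrix of order 4; the Kronecker-product construction used below is standard and appears here only as an illustration:

```python
# Sylvester construction: H4 = H2 (x) H2, the Kronecker product.
H2 = [[1, 1], [1, -1]]

def kron(A, B):
    # Kronecker product of two matrices given as lists of rows.
    return [[a * b for a in ra for b in rb] for ra in A for rb in B]

H4 = kron(H2, H2)

# Count agreements between each pair of distinct rows.
agreements = {sum(x == y for x, y in zip(r, s))
              for i, r in enumerate(H4) for s in H4[i + 1:]}
print(agreements)  # {2}: every pair agrees in exactly half of 4 entries
```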
If the block design is binary, then this condition is equivalent to regularity of the corresponding hypergraph; in other words, each point is in a constant number of blocks.
If this holds, then the relation on lines of being equal or disjoint is an equivalence relation called parallelism, and each parallel class of lines covers the point set exactly (that is, each point is incident with just one line of each parallel class).
So an incidence structure satisfying Euclid's parallel postulate is resolvable.
Because the total number of treatment combinations (the product of the numbers of levels of the factors) may be very large, we may only be able to test some of these combinations, giving a fractional factorial design. A suitable fraction may be an orthogonal array, or a subgroup of the direct product of abelian groups of orders equal to the number of levels of each factor.
A filter (in the strong sense) in a finite lattice must be "principal", that is, the set of all elements b satisfying b >= a for some fixed element a.
The Galois field of prime order p consists of the integers modulo p.
If n=9 and the regions are the 3×3 subsquares, this is precisely a solution to a Sudoku puzzle.
If the regions are the symbol positions in another Latin square L', then the condition asserts that L and L' are orthogonal.
A more restricted concept, sometimes called a simple graph, has no loops and no multiple edges. In this case, an edge can be identified with an unordered pair of vertices. Two vertices are called adjacent if there is an edge incident with that pair of vertices.
In a directed graph or digraph, edges are ordered (rather than unordered) pairs.
This is a term which has different meanings to different people! |
Originally, a group divisible design was one in which we are given a partition of the set of points into "groups", all of the same size (the term here has no connection with the algebraic sense) so that two points in the same group lie in lambda1 blocks, while any two points in different groups lie in lambda2 blocks. Such a design (assuming constant block size) is partially balanced (that is, a PBIBD), with respect to the association scheme whose classes are "equal", "in same group but not equal", and "in different groups".
In combinatorial design theory, the conditions of constant group size and constant block size are very often relaxed, but it is often assumed that lambda1=0, that is, a block and a group share at most one common point.
A map between relational structures of the same type is said to be a homomorphism if it preserves the relations. For example, a graph homomorphism is a map which takes edges to edges (with no assumption about what happens to non-edges).
If there are just two types (traditionally called "points" and "blocks"), we usually speak of an incidence structure.
The incidence graph of an incidence geometry has all the varieties as vertices, two distinct varieties being joined if they are incident. It is a multipartite graph; the parts of the multipartition are the types of varieties.
Sometimes the transpose of this matrix is used (especially in coding theory).
L = X_T'(I - X_B K^{-1} X_B')X_T, where X_T and X_B are the plot-treatment and plot-block incidence matrices respectively, and K is the diagonal matrix whose entries are the block sizes.
u·v = u_1v_1 + ... + u_nv_n. In coding theory we speak of the standard inner product on a vector space over a finite field; it is given by the above formula and satisfies the first two conditions (but not of course the third).
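The failure of the third condition can be seen over GF(2), where a non-zero vector may be orthogonal to itself; a minimal sketch:

```python
# Standard inner product on GF(2)^n: the usual formula, reduced mod 2.
def dot_gf2(u, v):
    return sum(ui * vi for ui, vi in zip(u, v)) % 2

u = [1, 1, 0]
print(dot_gf2(u, u))  # 0: u is self-orthogonal although u is non-zero
```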
Sometimes it is not clear exactly what structure is being preserved. For example, for Latin squares, are the roles of rows, columns and symbols interchangeable? If not, then the relation is called "isotopy", and the classes are "isotopy classes"; if they are, the classes are called "main classes". Other terminology is also used. A topic essay discusses the many different notions of equivalence for Latin squares. |
The term "isotopy" also describes a triple of bijections from the sets of rows, columns, and symbols of the first square to those of the second which demonstrates that the squares are isotopic.
x o (y o x^2) = (x o y) o x^2. The most familiar example is the space of real symmetric matrices of given order n, with the operation
A o B = (AB+BA)/2.
It follows from Hall's marriage theorem that any m×n Latin rectangle can be extended (by the addition of n-m extra rows) to a Latin square.
Sometimes it is permitted that the size of the alphabet is greater than the number of columns, and required that each symbol occurs at most once in each row or column. Such an object is a special type of partial Latin square. |
The graph whose adjacent pairs are first associates is a strongly regular graph of Latin square type.
1. A lattice is a partially ordered set in which any two elements have a greatest lower bound and a least upper bound. Typical examples are the lattice of all subsets of a set, and the lattice of all subgroups of a group, ordered by inclusion in each case.
2. A lattice is a discrete subgroup of the additive group of a Euclidean space. This means that the lattice is closed under addition and negation, and that there is a non-zero minimum distance between two points of the lattice. An example is the integer lattice consisting of all points with integer coordinates.
Based on the second meaning, various combinatorial and statistical designs have the word lattice in their names, suggesting the regular arrangement of points in a low-dimensional integer lattice.
For a square lattice design, there are n^2 treatments in rn blocks of size n. The treatments are allocated to the cells of an n × n square array. The blocks of the first two replicates are the rows and columns, respectively, of the array. The remaining replicates, if any, are defined by r-2 mutually orthogonal Latin squares of order n; the letters of each square define the blocks in the corresponding replicate. The lattice design is simple if r=2. Square lattice designs are also called nets. They are partially balanced with respect to an association scheme of Latin square type; first associates concur once in blocks, second associates never.
For a rectangular lattice design, there are n(n-1) treatments in rn blocks of size n-1. The construction is similar to that for a square lattice design, except that n cells are removed from the square array. The removed cells should form a transversal to all the Latin squares used in the construction. A rectangular lattice design is partially balanced if r=2, n-1 or n, but not otherwise.
For a cubic lattice design, there are n^3 treatments in 3n^2 blocks of size n. The treatments form a cube of order n; the blocks are the 1-dimensional slices.
The best-known and commonest example is PSL(2,p), the group of unimodular linear fractional transformations over the integers modulo p (where p is prime): the elements are mappings of the form
z -> (az+b)/(cz+d), where a, b, c, d are integers mod p and ad-bc = 1.
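A short enumeration sketch (for the illustrative choice p = 5) counts these maps, identifying the matrix pairs ±M that induce the same transformation:

```python
from itertools import product

p = 5
# All matrices (a, b; c, d) over Z_p with determinant 1: this is SL(2, p).
matrices = [(a, b, c, d) for a, b, c, d in product(range(p), repeat=4)
            if (a * d - b * c) % p == 1]

# M and -M induce the same linear fractional map, so group them in pairs.
maps = {frozenset({(a, b, c, d), (-a % p, -b % p, -c % p, -d % p)})
        for a, b, c, d in matrices}
print(len(matrices), len(maps))  # 120 60: |SL(2,5)| = 120, |PSL(2,5)| = 60
```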
A loop in a graph is an edge whose two vertices are the same.
A loop is a quasigroup containing an element e which is an identity for the operation (that is, ae=ea=a holds for all elements a.) If the elements are 1,...,n, with e=1, then the Cayley table is a normalised Latin square.
In a matroid, a basis is a maximal independent set (all bases have the same size); a circuit is a minimal dependent set; the rank of a subset is the size of the largest independent set it contains; and a flat is a maximal subset of given rank.
f(x,y) = Sum g(x,z), then
g(x,y) = Sum f(x,z) µ(z,y), where both summations run over all elements z with x <= z <= y.
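Möbius inversion can be checked on a small example. The sketch below uses the one-variable special case (summing from the bottom element) on the lattice of subsets of a 3-element set, where mu(A,B) = (-1)^(|B|-|A|) for A a subset of B; the test data g is arbitrary:

```python
from itertools import chain, combinations

ground = (0, 1, 2)
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(ground, r) for r in range(4))]

def mu(a, b):
    # Moebius function of the Boolean lattice, for a a subset of b.
    return (-1) ** (len(b) - len(a))

g = {s: len(s) ** 2 for s in subsets}          # arbitrary test data
f = {b: sum(g[a] for a in subsets if a <= b) for b in subsets}
g_back = {b: sum(f[a] * mu(a, b) for a in subsets if a <= b)
          for b in subsets}
print(g_back == g)  # True: inversion recovers g from f
```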
v=n^2, k=s(n+1), lambda=s^2+3s-n, mu=s(s+1).
For example, Rees defined a neighbour design to be a 1-design with a circular structure on each block satisfying this condition; these have subsequently been called "Rees neighbour designs" or "balanced Ouchterlony neighbour designs".
The analogous concept for permutation groups is the wreath product.
A necessary condition for a graph to have a 1-factorisation is that it is regular. It follows from Hall's marriage theorem that, for bipartite graphs, this condition is also sufficient.
One of those words with many different meanings. Be especially careful of the first two. To make things worse, the word "order" is also used as a synonym for (partially) ordered set! |
The order of a finite group or field is the number of elements it contains.
The order of an element g of a finite group is the smallest positive number m such that g^m=1. It follows from Lagrange's Theorem that the order of any element of a group divides the order of the group.
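As an illustration, a short sketch computes element orders in the additive group Z_12 (chosen arbitrarily); every order obtained divides 12, as Lagrange's Theorem requires:

```python
# Order of g in (Z_n, +): the least m > 0 with m*g = 0 mod n.
def order_mod(g, n):
    m, x = 1, g % n
    while x != 0:          # 0 is the identity of (Z_n, +)
        x = (x + g) % n
        m += 1
    return m

print([order_mod(g, 12) for g in range(1, 5)])  # [12, 6, 4, 3]
```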
The order of a Latin square is the number of its rows (or columns or symbols).
The order of a projective (resp. affine) plane is the number n such that the plane has n^2+n+1 (resp. n^2) points.
Sometimes, the order of another type of design (such as a Steiner triple system) is used to mean the number of points of the design.
The word "orthogonal" has many different meanings. See, for example, D. A. Preece, "Orthogonality and designs: a terminological muddle", Utilitas Math 12 (1977), 201-223. |
Two vectors v and w are orthogonal if v·w=0, where v·w denotes the inner product. Two subspaces V and W are orthogonal if every vector in V is orthogonal to every vector in W.
Two partitions P1 and P2 of a set X are said to be orthogonal if, for any parts Y1 and Y2 of P1 and P2, the size of their intersection is equal to |Y1|·|Y2|/|X|. In other words, the proportion of elements of each part of P1 which belong to a given part Y2 of P2 is constant.
Two Latin squares L1 and L2 are orthogonal if, for each pair (k,l) of symbols, there is a unique cell (i,j) in which L1 has symbol k and L2 has symbol l.
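A quick check for two orthogonal squares of order 3 (the squares below are standard examples):

```python
L1 = [[0, 1, 2],
      [1, 2, 0],
      [2, 0, 1]]
L2 = [[0, 1, 2],
      [2, 0, 1],
      [1, 2, 0]]

# Superimpose the squares: collect the ordered pair of symbols in each cell.
pairs = [(L1[i][j], L2[i][j]) for i in range(3) for j in range(3)]
print(len(set(pairs)))  # 9: every ordered pair occurs once, so L1, L2 are orthogonal
```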
The term orthogonal design has two entirely different meanings.
An orthogonal array is different again - see below.
There is a more general definition which permits using alphabets of different cardinalities in the different coordinate positions. The requirement is that the number of rows where given elements occur in t prescribed columns is constant over all choices of the elements, but is allowed to vary for different sets of columns. Such an orthogonal array does not have an index. The example is of this "mixed alphabet" type.
The word also refers to the equivalence relation thus generated on the set of Latin squares.
A theorem of Ryser asserts that if all the symbols in a partial Latin square are confined to i rows and j columns, where i+j <= n, then the array can be completed to a Latin square (by putting symbols in the blank cells).
Permutations in active form are functions and can be composed. This composition is the group operation in the symmetric group.
In other words, it is a subgroup of the symmetric group on X.
Among its many remarkable properties, it is strongly regular and vertex-transitive.
A common form of structure consists of one or more partitions. In the case of a single partition, we use a block design. For a pair of partitions, we use a nested block design or a row-column design if the partitions are nested or crossed respectively. More complicated structures are possible!
There may be further structure as well, such as a neighbour relation, for which we use a neighbour design.
If we order the points as p_1,...,p_n, and the blocks as p_1^t,...,p_n^t, then the incidence matrix of the incidence structure is symmetric.
The poset block structures corresponding to the 2-element antichain and chain correspond respectively to crossing and nesting.
The corresponding term for hypergraphs is uniform.
v=n^2, k=s(n-1), lambda=s^2-3s+n, mu=s(s-1).
Quadrics in finite projective space are classified as elliptic, parabolic or hyperbolic according to the dimension of the largest subspace of the projective space contained in the quadric. If this dimension is r and the quadric is in n-dimensional space, then n=2r+3 for an elliptic quadric, n=2r+2 for a parabolic quadric, and n=2r+1 for a hyperbolic (or ruled) quadric.
In odd characteristic, the set of absolute points of an orthogonal polarity is a quadric.
In other words, left and right division by any element are possible and give a unique result. This means that the equation ax=b has a unique solution x, given a and b; and similarly the equation ya=b has a unique solution y.
The rank of a subset of a matroid is equal to the size of the largest independent set which it contains.
The rank of a permutation group is equal to the number of orbits of the group on the set of ordered pairs of points; in other words, the number of basis matrices for the group.
The analogous term for block designs is equireplicate.
The corresponding term for a hypergraph is valency.
More generally, an incidence structure is r-resolvable if its blocks can be partitioned into sets such that each point lies in r blocks of each set.
This is equivalent to saying that the permutations form the rows of a Latin square.
The finite simple groups have been classified.
It is a special kind of orthogonal block structure.
A code meeting this bound is called maximum-distance separable.
In coding theory, a code which attains this bound is called a perfect code.
Each parallel class in a resolution is a spread.
It is common to use the parameter v for the number of vertices, and the numbers k, lambda, mu for the numbers of common neighbours of, respectively, a single vertex, two adjacent vertices, and two non-adjacent vertices.
A matrix is symmetric if it is equal to its transpose. Thus, the adjacency matrix of a graph is symmetric.
For further meanings of the term, see below.
Hall's marriage theorem gives a necessary and sufficient condition for a family of sets to have an SDR.
Square 2-designs (BIBDs) are often called "symmetric" in the literature, though there is no requirement that the incidence matrix should be symmetric. (The latter condition is equivalent to the existence of a polarity of the design.)
In other words, it has the edges of both the cartesian and categorical products of the two graphs.
2. A binary relation R is transitive if, whenever R(x,y) and R(y,z) hold, then R(x,z) also holds. Typical examples of transitive relations are equivalence relations and order relations in partially ordered sets.
The word "transposition" also refers to the action of replacing a matrix by its transpose.
Uniquely completable sets can be defined for other classes of designs as well; see the entry on Sudoku.
For a block design, this parameter is called replication number.
For a connected, proper, equireplicate block design, variance balance coincides with the combinatorial condition of pairwise balance. See also balance.
Another representation is as a k×v Latin rectangle with v different symbols, having the property that the sets of symbols used in the columns of the rectangle are the blocks of a square 2-(v,k,lambda) design.
A Youden "square" is equivalent to a 1-factorisation of the incidence graph of a square 2-design. Hall's marriage theorem implies that any square 2-design can be represented as a Youden "square".
Peter J. Cameron
4 August 2006