Multiple correspondence analysis

In statistics, multiple correspondence analysis (MCA) is a data analysis technique for nominal categorical data, used to detect and represent underlying structures in a data set. It does this by representing data as points in a low-dimensional Euclidean space. The procedure thus appears to be the counterpart of principal component analysis for categorical data.[1][2] MCA can be viewed as an extension of simple correspondence analysis (CA) in that it is applicable to a large set of categorical variables.

As an extension of correspondence analysis

MCA is performed by applying the CA algorithm to either an indicator matrix (also called a complete disjunctive table, CDT) or a Burt table formed from the same variables.[3] An indicator matrix is an individuals × variables matrix in which the rows represent individuals and the columns are dummy variables representing the categories of the variables.[4] Analyzing the indicator matrix allows the direct representation of individuals as points in a geometric space. The Burt table is the symmetric matrix of all two-way cross-tabulations between the categorical variables, and is analogous to the covariance matrix of continuous variables. Analyzing the Burt table is a more natural generalization of simple correspondence analysis, and individuals or the means of groups of individuals can be added as supplementary points to the graphical display.
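
As a brief illustration, here is a minimal Python sketch (using pandas, with made-up variable and category names) showing that the Burt table is simply the cross-product of the indicator matrix with itself:

```python
import pandas as pd

# Toy data: 4 individuals described by 2 categorical variables
# (variable and category names are illustrative only).
data = pd.DataFrame({
    "colour": ["red", "blue", "red", "green"],
    "size":   ["small", "large", "large", "small"],
})

# Indicator matrix (complete disjunctive table): one dummy column per category.
indicator = pd.get_dummies(data).astype(int)

# Burt table: all two-way cross-tabulations at once, obtained as X'X.
burt = indicator.T @ indicator

print(indicator)
print(burt)
```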

In the indicator matrix approach, associations between variables are uncovered by calculating the chi-square distance between different categories of the variables and between the individuals (or respondents). These associations are then represented graphically as "maps", which eases the interpretation of the structures in the data. Oppositions between rows and columns are then maximized, in order to uncover the underlying dimensions best able to describe the central oppositions in the data. As in factor analysis or principal component analysis, the first axis is the most important dimension, the second axis the second most important, and so on, in terms of the amount of variance accounted for. The number of axes to be retained for analysis is determined by calculating modified eigenvalues.
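
One common way to obtain such modified eigenvalues is a Benzécri-style correction of the raw indicator-matrix eigenvalues. The sketch below is a hedged illustration of that correction (the function name and the example values are made up), assuming K active categorical variables:

```python
import numpy as np

def benzecri_adjust(eigenvalues, n_vars):
    """Benzécri-style adjustment of raw indicator-matrix eigenvalues.

    Only eigenvalues larger than 1/K (K = number of active variables)
    are considered meaningful; the others are set to zero.
    """
    lam = np.asarray(eigenvalues, dtype=float)
    threshold = 1.0 / n_vars
    adjusted = np.where(
        lam > threshold,
        (n_vars / (n_vars - 1.0) * (lam - threshold)) ** 2,
        0.0,
    )
    return adjusted

# Example with illustrative values (not from a real data set):
print(benzecri_adjust([0.45, 0.30, 0.20, 0.05], n_vars=4))
```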

Details

Since MCA is designed to draw statistical conclusions from categorical variables (such as multiple-choice questions), the first step is to transform any quantitative data (such as age, size, weight, time of day, etc.) into categories, for instance by cutting them at statistical quantiles.

When the dataset is completely represented by categorical variables, one can build the corresponding so-called complete disjunctive table, denoted $X$. If $I$ persons answered a survey with $J$ multiple-choice questions with 4 answers each, $X$ will have $I$ rows and $4J$ columns.
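
As an illustration, here is a short Python sketch of both steps (recoding a quantitative variable into quantile-based categories, then expanding all variables into the complete disjunctive table); the variable names and data are made up:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Illustrative survey-like data: one quantitative and one categorical variable.
df = pd.DataFrame({
    "age": rng.integers(18, 80, size=100),
    "answer_q1": rng.choice(["A", "B", "C", "D"], size=100),
})

# Step 1: recode the quantitative variable into categories using quartiles.
df["age_group"] = pd.qcut(df["age"], q=4, labels=["q1", "q2", "q3", "q4"])

# Step 2: build the complete disjunctive table X
# (I rows, one dummy column per category of every variable).
X = pd.get_dummies(df[["age_group", "answer_q1"]]).astype(int)
print(X.shape)   # (100, 8): 4 age categories + 4 answer categories
```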

More theoretically,[5] assume $X$ is the complete disjunctive table of $I$ observations of $K$ categorical variables. Assume also that the $k$-th variable has $J_k$ different levels (categories) and set $J = \sum_{k=1}^{K} J_k$. The table $X$ is then an $I \times J$ matrix with all coefficients equal to $0$ or $1$. Let $N$ be the sum of all entries of $X$ and introduce $Z = X/N$. In an MCA, there are also two special vectors: $r$, which contains the sums along the rows of $Z$, and $c$, which contains the sums along the columns of $Z$. Write $D_r = \operatorname{diag}(r)$ and $D_c = \operatorname{diag}(c)$ for the diagonal matrices having $r$ and $c$, respectively, on their diagonals. With these notations, computing an MCA consists essentially in the singular value decomposition of the matrix

$$ M = D_r^{-1/2} \, (Z - r c^T) \, D_c^{-1/2} . $$
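
The following is a minimal NumPy sketch of this computation, assuming `X` is the 0/1 complete disjunctive table stored as a NumPy array (the function and variable names are illustrative, not taken from a specific library):

```python
import numpy as np

def mca_svd(X):
    """Return the singular value decomposition underlying MCA
    for a 0/1 complete disjunctive table X."""
    Z = X / X.sum()                        # Z = X / N
    r = Z.sum(axis=1)                      # row sums (row masses)
    c = Z.sum(axis=0)                      # column sums (column masses)
    D_r_isqrt = np.diag(1.0 / np.sqrt(r))
    D_c_isqrt = np.diag(1.0 / np.sqrt(c))

    # M = D_r^{-1/2} (Z - r c^T) D_c^{-1/2}
    M = D_r_isqrt @ (Z - np.outer(r, c)) @ D_c_isqrt

    # Ordinary SVD of M gives P, the singular values, and Q.
    P, s, Qt = np.linalg.svd(M, full_matrices=False)
    return P, s, Qt.T, r, c
```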

The decomposition of $M$ gives $P$, $\Delta$ and $Q$ such that $M = P \Delta Q^T$, with $P$ and $Q$ two unitary matrices and $\Delta$ the generalized diagonal matrix of the singular values (with the same shape as $Z$). The positive coefficients of $\Delta^2$ are the eigenvalues of $Z$.

The interest of MCA comes from the way observations (rows) and variables (columns) in $Z$ can be decomposed. This decomposition is called a factor decomposition. The coordinates of the observations in the factor space are given by

$$ F = D_r^{-1/2} \, P \, \Delta . $$

The $i$-th row of $F$ represents the $i$-th observation in the factor space. Similarly, the coordinates of the variables (in the same factor space as the observations) are given by

$$ G = D_c^{-1/2} \, Q \, \Delta . $$
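
Continuing the NumPy sketch above, the row and column coordinates follow directly from that SVD (again, the function names are illustrative):

```python
import numpy as np

def mca_coordinates(X):
    """Row (observation) and column (category) coordinates in the factor space.

    Relies on mca_svd() from the sketch given earlier in this section.
    """
    P, s, Q, r, c = mca_svd(X)
    Delta = np.diag(s)

    F = np.diag(1.0 / np.sqrt(r)) @ P @ Delta   # F = D_r^{-1/2} P Delta
    G = np.diag(1.0 / np.sqrt(c)) @ Q @ Delta   # G = D_c^{-1/2} Q Delta
    return F, G
```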

Recent works and extensions

In recent years, several students of Jean-Paul Benzécri have refined MCA and incorporated it into a more general framework of data analysis known as geometric data analysis. This involves the development of direct connections between simple correspondence analysis, principal component analysis and MCA with a form of cluster analysis known as Euclidean classification.[6]

Two extensions have great practical use.

Application fields

In the social sciences, MCA is arguably best known for its application by Pierre Bourdieu,[7] notably in his books La Distinction, Homo Academicus and The State Nobility. Bourdieu argued that there was an internal link between his vision of the social as spatial and relational (captured by the notion of field) and the geometric properties of MCA.[8] Sociologists following Bourdieu's work most often opt for the analysis of the indicator matrix, rather than the Burt table, largely because of the central importance accorded to the analysis of the 'cloud of individuals'.[9]

Multiple correspondence analysis and principal component analysis

MCA can also be viewed as a PCA applied to the complete disjunctive table. To do this, the CDT must be transformed as follows. Let $y_{ik}$ denote the general term of the CDT: $y_{ik}$ equals $1$ if individual $i$ possesses category $k$ and $0$ otherwise. Let $p_k$ denote the proportion of individuals possessing category $k$. The transformed CDT (TCDT) has as general term

$$ x_{ik} = \frac{y_{ik}}{p_k} - 1 . $$

The unstandardized PCA applied to the TCDT, with column $k$ weighted by $p_k$, leads to the results of MCA.
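
A short NumPy sketch of this transformation (the function name is illustrative; only the construction of the TCDT and of the column weights is shown, not the weighted PCA itself):

```python
import numpy as np

def transform_cdt(Y):
    """Transform a 0/1 complete disjunctive table Y into the TCDT.

    x_ik = y_ik / p_k - 1, where p_k is the proportion of individuals
    possessing category k; the p_k also serve as the column weights
    in the unstandardized PCA.
    """
    p = Y.mean(axis=0)        # proportion possessing each category
    X_t = Y / p - 1.0         # general term of the TCDT
    return X_t, p
```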

This equivalence is fully explained in a book by Jérôme Pagès.[10] It plays an important theoretical role because it opens the way to the simultaneous treatment of quantitative and qualitative variables. Two methods analyze both types of variables simultaneously: factor analysis of mixed data and, when the active variables are partitioned into several groups, multiple factor analysis.

This equivalence does not mean that MCA is a particular case of PCA, just as it is not a particular case of CA. It only means that these methods are closely linked to one another, as they belong to the same family of factorial methods.

Software

Numerous data analysis software packages include MCA, such as Stata and SPSS. The R package FactoMineR also features MCA; this package is related to a book describing the basic methods for performing MCA.[11] There is also a Python package, mca (https://pypi.org/project/mca), which works with NumPy array matrices; the package has not yet been implemented for Spark dataframes.

References

  1. Le Roux, B. and H. Rouanet (2004). Geometric Data Analysis: From Correspondence Analysis to Structured Data Analysis. Dordrecht: Kluwer, p. 180.
  2. Greenacre, Michael and Blasius, Jörg (eds.) (2006). Multiple Correspondence Analysis and Related Methods. London: Chapman & Hall/CRC.
  3. Greenacre, Michael (2007). Correspondence Analysis in Practice, Second Edition. London: Chapman & Hall/CRC.
  4. Le Roux, B. and H. Rouanet (2004). Geometric Data Analysis: From Correspondence Analysis to Structured Data Analysis. Dordrecht: Kluwer, p. 179.
  5. Abdi, Hervé and Valentin, Dominique (2007). "Multiple correspondence analysis".
  6. Le Roux, B. and H. Rouanet (2004). Geometric Data Analysis: From Correspondence Analysis to Structured Data Analysis. Dordrecht: Kluwer.
  7. Scott, John and Gordon Marshall (2009). Oxford Dictionary of Sociology, p. 135. Oxford: Oxford University Press.
  8. Rouanet, Henry (2000). "The Geometric Analysis of Questionnaires: The Lesson of Bourdieu's La Distinction". Bulletin de Méthodologie Sociologique 65, pp. 4–18.
  9. Lebaron, Frédéric (2009). "How Bourdieu “Quantified” Bourdieu: The Geometric Modelling of Data". In Robson and Sanders (eds.), Quantifying Theory: Pierre Bourdieu. Springer, pp. 11–30.
  10. Pagès, Jérôme (2014). Multiple Factor Analysis by Example Using R. London: Chapman & Hall/CRC, The R Series, 272 p.
  11. Husson, F., Lê, S. and Pagès, J. (2009). Exploratory Multivariate Analysis by Example Using R. London: Chapman & Hall/CRC, The R Series.
