# Non-parametric Tests

A number of non-parametric tests are available. I mention only a sample of the procedures that social scientists need most frequently. Note that in the examples as I present them here, the first slash, which separates the keyword `npar tests` from the following keyword that indicates the specific procedure requested, is *not* necessary if you request only one test. However, you may run several analyses with a single `npar tests` command, and in that case the different procedures *have* to be separated by slashes.

## Tests for two or more independent samples

Many tests in this category deal with situations in which the dependent variable either consists of ranks or consists of metric data that are highly skewed or otherwise fail to fulfill the requirements of parametric tests such as the t-test or analysis of variance.

*Mann-Whitney U-test (and Wilcoxon test) for two groups*

```
npar tests
  / m-w testscore1 by tgroup (1,2).
```

This will produce both Mann-Whitney's U and Wilcoxon's W. Both test statistics lead to the same decision.
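If you want to cross-check such a result outside SPSS, the same test is available in Python's `scipy.stats`. The data below are invented purely for illustration:

```python
from scipy import stats

# Hypothetical scores for two treatment groups (tgroup = 1 and 2)
group1 = [12, 15, 9, 20, 14, 11]
group2 = [18, 25, 16, 22, 30, 19]

# Two-sided Mann-Whitney U test
u, p = stats.mannwhitneyu(group1, group2, alternative="two-sided")
print(u, p)
```

The U statistic reported by scipy may correspond to the first or the second group depending on the scipy version, but the p-value, and hence the decision, is the same either way.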

*Kruskal-Wallis test for more than two groups*

```
npar tests
  / k-w testscore1 by tgroup (1,4).
```
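The corresponding scipy function takes one sequence of values per group (again, the data are made up for illustration):

```python
from scipy import stats

# Hypothetical scores for four groups (tgroup = 1 to 4)
g1 = [12, 15, 9, 20]
g2 = [18, 25, 16, 22]
g3 = [14, 11, 13, 17]
g4 = [28, 24, 31, 27]

# Kruskal-Wallis H test for more than two independent samples
h, p = stats.kruskal(g1, g2, g3, g4)
print(h, p)
```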

## Tests for two or more dependent samples

*Wilcoxon test for two samples*

```
npar tests
  / wilcoxon testscore1 testscore2.
```
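In scipy, the paired (signed-rank) version of the Wilcoxon test takes the two dependent measurements directly (illustrative data):

```python
from scipy import stats

# Hypothetical paired measurements on the same subjects
testscore1 = [12, 15, 9, 20, 14, 11, 16, 13]
testscore2 = [15, 19, 12, 22, 13, 16, 21, 18]

# Wilcoxon signed-rank test for two dependent samples
w, p = stats.wilcoxon(testscore1, testscore2)
print(w, p)
```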

*Friedman's test for more than two samples*

```
npar tests
  / friedman testscore1 to testscore5.
```
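The scipy equivalent, `friedmanchisquare`, takes one sequence per repeated measurement; here five hypothetical time points for six subjects:

```python
from scipy import stats

# Hypothetical repeated measurements on six subjects (five time points)
t1 = [10, 12, 9, 14, 11, 13]
t2 = [12, 14, 10, 15, 13, 14]
t3 = [13, 13, 12, 17, 14, 16]
t4 = [15, 16, 13, 18, 16, 17]
t5 = [14, 18, 15, 20, 17, 19]

# Friedman test for more than two dependent samples
chi2, p = stats.friedmanchisquare(t1, t2, t3, t4, t5)
print(chi2, p)
```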

## Goodness-of-fit (comparison of an empirical distribution with a pre-defined distribution)

*Non-metric data: Chi-square test*

The following command will test the distribution of a variable (var31) against the null hypothesis that the distribution in the population is uniform.

```
npar tests
  / chisquare var31.
```
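Outside SPSS, the same goodness-of-fit test against a uniform distribution can be run with scipy; the observed frequencies below are invented:

```python
from scipy import stats

# Hypothetical observed frequencies for the categories of var31
observed = [320, 35, 90, 206]

# With no expected frequencies given, a uniform distribution is assumed
chi2, p = stats.chisquare(observed)
print(chi2, p)
```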

You may also test against any arbitrary distribution as follows:

```
npar tests
  / chisquare var31
  / expected 310 40 85 216.
```

Note that the sum of the expected values need not correspond to the sum that was observed in var31. SPSS will take the values as indicating the *proportion* of cases in each category and adjust the figures accordingly. However, the number of frequencies given has to correspond to the number of values var31 has.
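One point of contrast: scipy's `chisquare` does *not* rescale the expected values for you the way SPSS does, so you have to convert them to the observed total yourself. A sketch with invented observed frequencies:

```python
import numpy as np
from scipy import stats

# Hypothetical observed frequencies for var31
observed = np.array([300, 45, 98, 210])

# Expected values as in the SPSS example; unlike SPSS, scipy does not
# rescale them automatically, so convert them to the observed total first
weights = np.array([310, 40, 85, 216])
f_exp = weights / weights.sum() * observed.sum()

chi2, p = stats.chisquare(observed, f_exp=f_exp)
print(chi2, p)
```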

*Metric data: Kolmogorov-Smirnov test*

You may test a variable against a normal, Poisson, uniform, or exponential distribution. I discuss only the case of the normal distribution, but the extension to the other distributions is straightforward.

The first example tests whether the variable income is normally distributed (if your output says it is, you may have special data: income is almost never normally distributed). The mean and the standard deviation are taken from the data.

```
npar tests
  / k-s (normal) = income.
```
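A rough equivalent with scipy, using simulated skewed data in place of a real income variable:

```python
import numpy as np
from scipy import stats

# Simulated skewed "income" data (log-normal, hence not normal)
rng = np.random.default_rng(42)
income = rng.lognormal(mean=7.5, sigma=0.6, size=500)

# Mean and standard deviation estimated from the data
d, p = stats.kstest(income, "norm", args=(income.mean(), income.std()))
print(d, p)
```

A caveat that applies in SPSS as well: when the parameters are estimated from the same data, the standard K-S p-value is only approximate (strictly, a Lilliefors-type correction would be needed).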

You may also test a variable against a normal distribution with an arbitrary mean and standard deviation, as follows (the first value being the mean):

```
npar tests
  / k-s (normal, 1763, 1164) = income.
```

This works very similarly with the other distributions mentioned. You may either provide the appropriate keyword only, or you may supply additional values of your choosing: for the Poisson and the exponential distribution the mean, and for the uniform distribution the minimum and the maximum value, respectively.
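In scipy the same idea is expressed through the `args` tuple of `kstest`, though the parametrization differs from SPSS: distributions are specified via location and scale, so for the uniform distribution you pass the minimum and the range rather than the minimum and the maximum. A sketch, again with simulated data and invented parameter values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
income = rng.lognormal(mean=7.5, sigma=0.6, size=500)

# Normal distribution with a fixed mean and standard deviation
d_norm, p_norm = stats.kstest(income, "norm", args=(1763, 1164))

# Exponential distribution with a given mean (loc = 0, scale = mean)
d_exp, p_exp = stats.kstest(income, "expon", args=(0, 1763))

# Uniform distribution between a minimum and a maximum
# (loc = minimum, scale = maximum - minimum)
d_unif, p_unif = stats.kstest(income, "uniform", args=(0, 8000))
print(d_norm, d_exp, d_unif)
```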

© W. Ludwig-Mayerhofer, IGSW | Last update: 07 Dec 2009