Sum Of Squares

We remember pretty clearly from geometry class that it wasn’t possible to add any shapes together, even squares, so this must be talking about the other kind of squares, where we multiply a number by itself. The sum of squares, sometimes called the variation, is a way to measure the spread of a data set: how far the data points sit from the mean, and from each other.

We find the sum of squares by subtracting the mean from every data point as a way to get a handle on how far each point is from the mean. We then square those differences and add them up. You might be wondering why we don’t just add the differences themselves without squaring. If we did that, the data points above the mean (with a positive difference) would cancel out with the data points below the mean (with a negative difference), giving us exactly zero every time, no matter how spread out the data are.
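If you want to see that cancellation happen, here is a tiny Python sketch using a made-up data set (the numbers are just for illustration):

```python
# Toy data set (made up for illustration) showing why raw
# differences from the mean can't measure spread.
data = [4.0, 6.0, 10.0]
mean = sum(data) / len(data)  # 20.0 / 3

raw_diffs = [x - mean for x in data]
# Positive and negative differences cancel: the sum is
# essentially zero (up to floating-point rounding).
print(sum(raw_diffs))

# Squaring first makes every term positive, so nothing cancels.
squared_diffs = [d ** 2 for d in raw_diffs]
print(sum(squared_diffs))
```

No matter what numbers you put in `data`, the first sum comes out (essentially) zero, which is exactly why the squaring step earns its keep.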

By squaring the differences, we make them all positive, so both positive and negative differences contribute to the measure of spread. Let’s say we weighed our three office cats, Zok, Igoo, and Tundro, and got weights of 7.2 lbs, 9.1 lbs, and 9.3 lbs. The mean weight is 25.6 ÷ 3 ≈ 8.53 lbs, which we’ll round to 8.5 lbs to keep the arithmetic friendly. The sum of squares would be found using (7.2 – 8.5)² + (9.1 – 8.5)² + (9.3 – 8.5)² = (-1.3)² + (0.6)² + (0.8)² = 1.69 + 0.36 + 0.64. We get 2.69 for our sum of squares value. A relatively large sum of squares indicates data points quite spread out from one another, while a relatively small sum of squares indicates the points are quite close together.

Find other enlightening terms in Shmoop Finance Genius Bar(f)