JiscMail - Email discussion lists for the UK Education and Research communities

ALLSTAT Archives

allstat@JISCMAIL.AC.UK

ALLSTAT 1999

Subject:

RE: Query: Degrees of Freedom

From:

spotter <[log in to unmask]>

Reply-To:

spotter <[log in to unmask]>

Date:

Sat, 17 Apr 1999 10:58:42 -0500

Content-Type:

text/plain

Parts/Attachments:

text/plain (312 lines)

I want to sincerely thank everyone who replied to my query. Each of
your answers (below) provided me with a better understanding of the
concept and the application. Thank you so very much.

Teresa Beck

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

original message

Dear Allstaters,

I am researching what I thought would be a simple concept: degrees of
freedom. While reading the archives, I noticed that a number of people
found errors in their data that are directly related to this topic. I'm
beginning to feel as if degrees of freedom are blindly applied by people
like me, but never fully understood. Moreover, the somewhat mysterious
concept of degrees of freedom (typically defined as n-1 and used to
determine mean squares in ANOVAs) is not explained very well in any of
the stats books that I've looked at. I would sincerely appreciate
everyone's input and I will compile the answers.

1. Where does it come from?

2. Why is it always 1 less than n?

3. What, exactly, is degrees of freedom?


Thank you,

Teresa Beck

[log in to unmask]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Teresa,
I can't give you the math derivations, but rather a gut feel for what it
does. See below.

Take a pen, with a pocket clip on it, and toss it in the air. It can
spin about its center on three axes - rotate about the long axis, about
a line at right angles through the clip, and about another line at right
angles to both of those. It can fly up in the air in 3 dimensions - up
down, left right, forward & back. In mechanics, we call those the 6
degrees of freedom of a free body.

Now hold the pen so that it can only rotate about the long axis. It can
still move up down, right left, forward & back, but it cannot rotate
through the clip (you hold it that way!). You have reduced the degrees
of freedom. You can then hold it so that it slides along a line, in the
direction of the long axis. Fewer degrees of freedom. Eventually, you
will have pinned the pen down completely, and it cannot move at all. Do
it yourself, so you can feel the pen being tied down.

Now for the numbers. If I tell you that I have 4 measurement numbers,
those four could be anything. If I tell you the average value of 4
numbers, three of those 4 numbers can individually be anything. The
fourth one, however, will depend on the first 3. Those first 3 numbers
are 'free,' or have degrees of freedom. If I give you the value of 3
numbers, and give you the average of the 4, then you can dope out the
fourth value. It is not 'loose,' or free. Thus, 4 numbers have 4
degrees of freedom. When I state the average, I have only 3 (4-1)
degrees of freedom.
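Jay's four-number bookkeeping is easy to check by hand. A minimal sketch (the values here are made up for illustration):

```python
# Three of the four values are chosen freely; the stated average then
# pins down the fourth, so only 3 degrees of freedom remain.
known = [2.0, 5.0, 7.0]    # hypothetical freely-chosen values
stated_mean = 6.0          # the announced average of all four numbers

# The mean constraint (x1 + x2 + x3 + x4) / 4 == stated_mean forces:
fourth = 4 * stated_mean - sum(known)
print(fourth)                        # 10.0 -- no freedom left
print(sum(known + [fourth]) / 4)     # 6.0, matching the stated mean
```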

If I then tell you the estimated standard deviation, I will use up one
more degree of freedom. I know this because when I know 2 values, an
average and an est. stdev, I can dope out the other two measurements.
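That second claim can also be checked numerically: once the mean and the sample standard deviation are both stated, two known values determine the remaining two (up to their order). A sketch with made-up numbers, where the underlying data are 1, 3, 5, 7:

```python
import math

x1, x2 = 1.0, 3.0       # the two values we are told
m = 4.0                 # stated mean of all four values
var = 20.0 / 3.0        # stated sample variance (n - 1 divisor)

# Sum constraint on the two unknowns, written as deviations from m:
T = (4 * m - x1 - x2) - 2 * m                  # (x3 - m) + (x4 - m)
# Sum-of-squares constraint coming from the variance:
Q = 3 * var - (x1 - m) ** 2 - (x2 - m) ** 2    # (x3 - m)^2 + (x4 - m)^2

# The two deviations are the roots of t^2 - T*t + (T^2 - Q)/2 = 0:
disc = math.sqrt(2 * Q - T * T)
x3 = m + (T - disc) / 2
x4 = m + (T + disc) / 2
print(x3, x4)    # approximately 5.0 and 7.0
```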

In the Anova, which you mentioned, you will see that each time we
provide a summary calculation of some sort, the degrees of freedom
(number of 'loose' measurements) is reduced. In a linear regression,
each coefficient reduces the df again.
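In a regression that bookkeeping can be read straight off the fit. A sketch on synthetic data (the residual degrees of freedom are n minus the number of fitted coefficients):

```python
import numpy as np

# Fit a straight line (intercept + slope = 2 coefficients) to 10 points.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 + 0.5 * x + rng.normal(scale=0.1, size=x.size)

X = np.column_stack([np.ones_like(x), x])   # design matrix
coef, _, rank, _ = np.linalg.lstsq(X, y, rcond=None)

residual_df = len(y) - rank   # n minus number of independent coefficients
print(residual_df)            # 8
```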

Now for your specific questions:

> 1. Where does it come from?

Either does not compute, or demonstrated above.

> 2. Why is it always 1 less than n?

'always'? When one summary value is calculated, then df = n-1. It could
be n - k, too.

> 3. What, exactly, is degrees of freedom?

Did I get that?

If you get a good mathematical demonstration, please send it on to me,
too.

BTW, there was a big discussion in the 1930's on whether the stdev
should be n or n-1. So you are hardly alone. See my web site for a
discussion of which to use - the comparison of 2 groups (Student 't')
page.

Jay
--
Jay Warner
Principal Scientist
Warner Consulting, Inc.
4444 North Green Bay Road
Racine, WI 53404-1216
USA

Ph: (414) 634-9100
FAX: (414) 681-1133
email: [log in to unmask]
web: http://www.a2q.com

Power to the data!

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

   From:
        "Nigel Griggs" <[log in to unmask]>
   

Teresa,

Here are a couple of messages which went out a while ago via the
teaching-stats mailing list. They might be of use to you, as they were
to me, in terms of thinking about the concept of DoF.

Number 1

I let the class provide a small population, say of size 5. Then I let
them sample randomly from the above population. Fix the sample size to
be three. Then I let them calculate the mean of the population, then the
mean of each of the samples. The average of the sample means is
demonstrated to be equal to the population mean. The mean is the number
that forces the average of sample means to be fixed. In return the mean
of each sample forces one of the members of the sample not to be free.
Therefore when one population parameter is not known one degree of
freedom is lost. Similarly, if two parameters are unknown two degrees
of freedom are lost, etc. I would like to demonstrate the link between
the number of unknown parameters and the degree of freedom without
having to go through the extensive demonstration in class.

Number 2
At 2:35 PM 6/24/97, Dr. Shahdad Naghshpour wrote:
>What is a good way of explaining the concept of degrees of freedom to
>beginning statistics students?

You have three numbers that must equal ten. Two are free to vary, but
once you know the first two, the remaining number is no longer free to
vary. So, you have two degrees of freedom in this situation. E.g., You
know you have 5 and 3 as your two numbers that are free to vary. What
must the remaining number be? 2. The last number is not free to vary
but the first two are.
Hope this helps,

TBN
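The classroom example above is one line of arithmetic; a tiny sketch:

```python
# Three numbers must total 10: choose any two freely, and the third
# is forced -- two degrees of freedom.
total = 10
free_choices = [5, 3]                 # any two values you like
forced = total - sum(free_choices)    # no choice remains here
print(forced)                         # 2
```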

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
       From:
             "Nick Cox" <[log in to unmask]>
 Organization:
             University of Durham
        


I think you are right that the idea is sometimes not well explained.

The term `degrees of freedom' comes from an analogy with degrees of
freedom in classical mechanics, and refers to the number of ways in
which a body can move. This isn't likely to help anyone much who knows
less mechanics than statistics. I suspect that now few people learning
the idea of df will have previously encountered it in mechanics, even if
they are specialising in mathematics and statistics. The reverse was
possibly true in, say, the early decades of this century.

The number of df is not always n - 1 by any means, but depends on the
number of constraints that must be satisfied.

In general, at least in the simplest situations,

number of degrees of freedom = number of data - number of linear
constraints

In one situation, the requirement that a set of n numbers add up to
a total imposes one constraint. The numbers then have n - 1 degrees of
freedom because any n - 1 can vary freely but the last is then fixed by
that requirement.
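The counting rule above extends to several constraints: the degrees of freedom are the number of data minus the number of *independent* linear constraints, i.e. the rank of the constraint matrix. A sketch using NumPy, with constraints invented for illustration:

```python
import numpy as np

n = 5

# One constraint: the five numbers add up to a fixed total.
A1 = np.ones((1, n))
print(n - np.linalg.matrix_rank(A1))   # 4 degrees of freedom

# Two independent constraints: fixed total, and x1 == x2.
A2 = np.array([[1.0, 1.0, 1.0, 1.0, 1.0],
               [1.0, -1.0, 0.0, 0.0, 0.0]])
print(n - np.linalg.matrix_rank(A2))   # 3 degrees of freedom
```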

There is a good expository article by Helen Walker in Journal of
Educational Psychology 31, 253-69 (1940). There is good material in
George W. Cobb, Introduction to design and analysis of experiments.

A fuller answer would have to explain why there are situations in which
the degrees of freedom is not an integer. Here the original analogy is
less relevant.


Nick

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

   From:
        [log in to unmask] (John Whittington)

Teresa,


The concept is all about how many of the items of data in your sample
are 'free to vary' for any particular value of the parameter(s) you are
estimating/testing. Hence, if you have a sample of N, and wish to
estimate the mean, N-1 of them are 'free' to take *any* values they
might like - and the mean could still take literally ANY value,
depending upon the value of the Nth one. In other words, you could give
me values for N-1 items, and I could still make the mean ANYTHING by
appropriately assigning the value of the last item. The number of DF
is therefore N-1 in that situation.

In the case of an unpaired t-test, say with groups of N1 and N2, we are
estimating/testing the difference between the two means. By the above
logic, N1-1 and N2-1 respectively will be 'free to vary' without
constraining the two means (hence the difference between them), so that
the total DF will be (N1-1) + (N2-1) = (N1 + N2 - 2).
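That (N1 - 1) + (N2 - 1) count is exactly the weighting used by the pooled variance of a Student t-test. A sketch with made-up data:

```python
import statistics

g1 = [4.1, 5.0, 4.8, 5.3]          # group 1, N1 = 4 (invented data)
g2 = [3.9, 4.2, 4.0, 4.5, 4.1]     # group 2, N2 = 5 (invented data)

n1, n2 = len(g1), len(g2)
df = (n1 - 1) + (n2 - 1)           # = n1 + n2 - 2 = 7

# Each group contributes its own df as the weight on its sample variance:
sp2 = ((n1 - 1) * statistics.variance(g1) +
       (n2 - 1) * statistics.variance(g2)) / df
print(df)    # 7
```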

If you were wishing to estimate *two* parameters (say mean and variance)
simultaneously, then all but TWO of the variables would be 'free to
vary' (take any values they wished). If N-1 were allowed to vary, then
the choice of the last value would allow either the mean or the variance
(but not both) to be made equal to ANY given value. If only N-2 were
free to vary, then both variance and mean could take any given values,
according to the values of the last two items. In this situation, DF
would be N-2.

Does that help at all?

Kind Regards,

John

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

       From:
             "Philippe.NIVLET" <[log in to unmask]>
 Organization:
             IFP

 Dear Teresa,

The number of degrees of freedom represents exactly the number of
independent parameters of an n-variable system. This means that it
equals the number of parameters minus the number of constraints between
them. In chemistry, e.g., the number of degrees of freedom is explicitly
given by Gibbs' rule. More generally, you can fully understand this
concept geometrically:

If you study a system with n parameters x_i, i=1,...,n, you can
represent it in an n-dimensional space. Any point of this space
represents a potential state of your system. If your n parameters could
vary independently, then your system would be fully described in an
n-dimensional hyper-volume. Now, imagine you've got one constraint
between the parameters (an equation relating your n parameters); then
your system would be described by an (n-1)-dimensional hyper-surface.

In statistics, your n parameters are your n data points. To evaluate the
variance, you first need to infer the mean E(X). So when you evaluate
the variance, you've got one constraint on your system (the expression
of the mean), and only (n-1) degrees of freedom remain.
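Python's standard library makes the same distinction: statistics.variance divides by n - 1 (the mean is estimated from the data, spending one degree of freedom), while statistics.pvariance divides by n. A quick check:

```python
import math
import statistics

data = [2.0, 4.0, 6.0, 8.0]
n = len(data)
m = statistics.mean(data)
ss = sum((x - m) ** 2 for x in data)    # sum of squared deviations = 20.0

print(math.isclose(ss / (n - 1), statistics.variance(data)))   # True
print(math.isclose(ss / n, statistics.pvariance(data)))        # True
```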

I hope this can be helpful to you,

Regards,

************************************************
Philippe NIVLET
Institut Francais du Petrole
Division Geophysique et Instrumentation
tel : 01-47-52-60-00 (poste 8824)
mail : [log in to unmask]
************************************************

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

  
       From:
             "Miland A I Joshi" <[log in to unmask]>
    Reply-To:
             [log in to unmask]
 Organization:
             University of Manchester, UK

Dear Teresa,

Greetings - I am a Medical, i.e. Applied, Statistician, so my
strength is not theoretical. 'Degrees of freedom' is indeed a
concept that is easy to teach 'slickly' in standard situations, but
its mathematical definition is actually very difficult. However, a
simple 'working definition' is 'sample size minus the number of
estimated parameters' - so it is not always n-1. I suspect
that the best way to deal with this problem would be to look
at a number of good worked examples, and you will find some in D.
Altman's Practical Statistics for Medical Research (Chapman and
Hall, ISBN 0412276305).
I hope this helps.

Regards
Miland Joshi(Mr.)




%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
