Question
What happens to the T distribution if the sample size increases?
Solution
To answer this question, you should be familiar with probability distributions and their various types, such as the normal distribution and the T distribution. Once you understand these concepts, the answer follows naturally, so focus on understanding them first.
Complete step-by-step solution:
To answer this question we should first understand the concept of the T distribution. The T distribution is also known as Student's t-distribution. It is a probability distribution used when estimating the mean of a population from a sample, particularly when the population standard deviation is unknown.
The idea behind the T distribution is to test a hypothesis, that is, to decide whether the hypothesis is accepted or rejected. The T distribution is generally used when the sample size is small, commonly when there are fewer than about 30 observations.
To calculate the T value, the formula used is
\[ t = \frac{\bar{x} - \mu}{s / \sqrt{N}} \]
Where,
\( t \) = T value
\( \bar{x} \) = Mean of the sample
\( \mu \) = Mean of the population
\( s \) = Standard deviation of the sample
\( N \) = Sample size
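As a quick illustration, here is a minimal Python sketch that computes the T value from a small sample using the formula above; the sample values and the assumed population mean are hypothetical, chosen only for the example.

```python
import math

# Hypothetical small sample (N = 5) and an assumed population mean
sample = [4.8, 5.1, 5.6, 4.9, 5.3]
mu = 5.0                      # population mean under the hypothesis

N = len(sample)
x_bar = sum(sample) / N       # sample mean
# sample standard deviation (divide by N - 1)
s = math.sqrt(sum((x - x_bar) ** 2 for x in sample) / (N - 1))

t = (x_bar - mu) / (s / math.sqrt(N))
print(f"t = {t:.4f} with {N - 1} degrees of freedom")
```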
There is not just one T distribution but a whole family of them, and the particular form is determined by the degrees of freedom. The degrees of freedom are the number of independent observations in a set of data. When estimating a mean from a single sample, the degrees of freedom equal the sample size minus one. Hence, the T statistic computed from a sample of size 5 is described by the T distribution with 5 − 1 = 4 degrees of freedom. Similarly, if the sample size is 21, the T distribution has 20 degrees of freedom.
As the sample size (and therefore the degrees of freedom) increases, the T distribution becomes more and more similar to the standard normal distribution; its heavier tails shrink, and for large samples the two are practically indistinguishable.
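To see this convergence numerically, the sketch below (assuming SciPy is available) compares the density of the T distribution at one arbitrary point with the standard normal density for increasing degrees of freedom.

```python
from scipy.stats import t, norm

x = 2.0  # an arbitrary point at which to compare the densities
for df in (1, 4, 20, 100):
    print(f"df = {df:3d}: t pdf = {t.pdf(x, df):.5f}, normal pdf = {norm.pdf(x):.5f}")
# As df grows, the t density at x approaches the normal density,
# reflecting the T distribution converging to the standard normal.
```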
Note: The normal distribution is a probability distribution for a real-valued random variable. It is sometimes called a bell curve because of its shape. A normal distribution with mean zero and variance one is known as the standard normal distribution.
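For instance, sampling from a standard normal in Python (a small sketch assuming NumPy, with an arbitrary seed) shows the sample mean close to zero and the sample variance close to one:

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)   # draws from the standard normal
print(f"mean ≈ {z.mean():.3f}, variance ≈ {z.var():.3f}")  # close to 0 and 1
```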