What is the purpose of $\frac{1}{\sigma \sqrt{2 \pi}}$ in $\frac{1}{\sigma \sqrt{2 \pi}}e^{-\frac{(x - \mu)^2}{2\sigma^2}}$?
I have been studying the probability density function

$$\frac{1}{\sigma \sqrt{2 \pi}}\,e^{-\frac{(x - \mu)^2}{2\sigma^2}}.$$

For now I remove the constant and, using the standard proof, show that

$$\int_{-\infty}^{\infty}e^{-\frac{x^2}{2}}\,dx = \sqrt{2 \pi}.$$

The way I interpret this is that the area under the Gaussian curve is $\sqrt{2 \pi}$. But I am still having a hard time figuring out what the constant is doing: it seems to divide by this area and by $\sigma$ as well. Why is this done?
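Both statements can be checked numerically. Below is a minimal Python/SciPy sketch added for illustration (not part of the original post); the values $\mu = 1.5$ and $\sigma = 2$ are arbitrary example parameters.

# Numerical check of the two integrals above (illustration only).
import numpy as np
from scipy.integrate import quad

# The constant-free Gaussian integrates to sqrt(2*pi) ...
area, _ = quad(lambda x: np.exp(-x**2 / 2), -np.inf, np.inf)
print(area, np.sqrt(2 * np.pi))        # both print ~2.5066

# ... so dividing by sigma*sqrt(2*pi) makes the full pdf integrate to 1.
mu, sigma = 1.5, 2.0                   # arbitrary example values
pdf = lambda x: np.exp(-((x - mu) / sigma) ** 2 / 2) / (sigma * np.sqrt(2 * np.pi))
total, _ = quad(pdf, -np.inf, np.inf)
print(total)                           # ~1.0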
probability statistics probability-distributions normal-distribution gaussian-integral
asked yesterday by Bolboa, edited 14 hours ago by user21820
so the integral of the probability density function over the entire space is equal to one
– J. W. Tanner, yesterday
3 Answers
If you consider every possible outcome of some event, you should expect the probability of it happening to be $1$, not $\sqrt{2\pi}$, so the constant scales the distribution to conform with the usual convention of ascribing a probability between zero and one.

answered yesterday by CyclotomicField
(In case it's not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
– John Doe, yesterday
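Taking the comment literally, the continuous "sum" can be approximated by a discrete one. A minimal Python sketch added for illustration (not part of the answer); the choice $\mu=0$, $\sigma=1$ and the grid settings are arbitrary.

# Illustration only: approximate the continuous "sum of probabilities" by a
# Riemann sum over a fine grid (mu = 0, sigma = 1 chosen arbitrarily).
import numpy as np

mu, sigma = 0.0, 1.0
x = np.linspace(mu - 10 * sigma, mu + 10 * sigma, 200_001)  # covers essentially all the mass
dx = x[1] - x[0]

unnormalised = np.exp(-((x - mu) / sigma) ** 2 / 2)
print(unnormalised.sum() * dx)                                 # ~ sigma * sqrt(2*pi) ~ 2.5066
print(unnormalised.sum() * dx / (sigma * np.sqrt(2 * np.pi)))  # ~ 1.0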
It is doing that, but observe that you are also stretching in the horizontal direction by the same factor (in the exponential). Say $\sigma>1$: you are decreasing your area by a factor of $\sigma$, but you are increasing it by the same factor because you replace $x$ by $x/\sigma$ (the shift does not change the area, of course).

answered yesterday by GReyes
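A quick numerical illustration of this stretching argument (a sketch added here, not part of the answer; the $\sigma$ values are arbitrary):

# Illustration only: replacing x by x/sigma stretches the graph horizontally,
# multiplying the area by sigma -- exactly what the leading 1/sigma cancels.
import numpy as np
from scipy.integrate import quad

for sigma in (0.5, 1.0, 3.0):          # arbitrary example values
    area, _ = quad(lambda x: np.exp(-(x / sigma) ** 2 / 2), -np.inf, np.inf)
    print(sigma, area, sigma * np.sqrt(2 * np.pi))   # the last two columns agree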
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(x\mid\mu,\sigma^2)=\frac1{\sigma\sqrt{2\pi}}\exp\left(-\frac12\left(\frac{x-\mu}\sigma\right)^2\right)$$ where the parameter space is $\mathit\Theta=\{(\mu,\sigma^2)\in\Bbb R^2:\sigma^2>0\}$. This is essentially saying that the mean is a value on the real line, and the variance is one on the positive real line.

Now consider the simple case where $\mu=0$ and $\sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=\frac1{\sqrt{2\pi}}\exp\left(-\frac12x^2\right).$$ If we integrate this over the interval $(-\infty,\infty)$, we will get $1$. This is by definition always the case, as for all $x\in\mathit X$ (in this instance $\mathit X=\Bbb R$), $$\int_{\mathit X}f(x)\,dx=1.$$ That is, the sum of all the probabilities of $x$ being in each region in $\mathit X$ is $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.

A further example is the Beta distribution, with p.d.f. $$f(x\mid\alpha,\beta)=\frac{x^{\alpha-1}(1-x)^{\beta-1}}{\text B(\alpha,\beta)}$$ where $1/\text B(\alpha,\beta)$ is the normalising constant.

answered 19 hours ago by TheSimpliFire
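A numerical check of both normalising constants mentioned in this answer (a Python/SciPy sketch added for illustration, not part of the answer; $\alpha=2.5$ and $\beta=4$ are arbitrary example parameters):

# Illustration only: verify the normalising constants for the standard normal
# and for a Beta(alpha, beta) density (alpha = 2.5, beta = 4.0 are arbitrary).
import numpy as np
from scipy.integrate import quad
from scipy.special import beta as beta_fn

# Standard normal: 1/sqrt(2*pi) makes the density integrate to 1.
print(quad(lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi), -np.inf, np.inf)[0])  # ~1.0

# Beta: the bare kernel integrates to B(alpha, beta), so 1/B(alpha, beta)
# is the normalising constant.
a, b = 2.5, 4.0
kernel_area, _ = quad(lambda x: x**(a - 1) * (1 - x)**(b - 1), 0, 1)
print(kernel_area, beta_fn(a, b))      # the two agree
print(kernel_area / beta_fn(a, b))     # ~1.0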
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3181774%2fwhat-is-the-purpose-of-frac1-sigma-sqrt2-pi-in-frac1-sigma-sqrt%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
3 Answers
3
active
oldest
votes
3 Answers
3
active
oldest
votes
active
oldest
votes
active
oldest
votes
$begingroup$
If you consider every possible outcome of some event you should expect the probability of it happening to be $1$, not $sqrt{2pi}$ so the constant scales the distribution to conform with the normal convention of ascribing a probability between zero and one.
$endgroup$
2
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
add a comment |
$begingroup$
If you consider every possible outcome of some event you should expect the probability of it happening to be $1$, not $sqrt{2pi}$ so the constant scales the distribution to conform with the normal convention of ascribing a probability between zero and one.
$endgroup$
2
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
add a comment |
$begingroup$
If you consider every possible outcome of some event you should expect the probability of it happening to be $1$, not $sqrt{2pi}$ so the constant scales the distribution to conform with the normal convention of ascribing a probability between zero and one.
$endgroup$
If you consider every possible outcome of some event you should expect the probability of it happening to be $1$, not $sqrt{2pi}$ so the constant scales the distribution to conform with the normal convention of ascribing a probability between zero and one.
answered yesterday
CyclotomicFieldCyclotomicField
2,6331316
2,6331316
2
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
add a comment |
2
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
2
2
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
$begingroup$
(In case its not clear, which it probably is) More precisely, you want the sum of the probabilities of all possible events to equal $1$, and since an integral represents a sum over such continuous variables, that's what this is.
$endgroup$
– John Doe
yesterday
add a comment |
$begingroup$
It is doing that, but observe that you are also stretching in the horizontal direction by the same factor (in the exponential). Say if $sigma>1$ you are decreasing your area by a factor $sigma$ but you are increasing it by the same factor because you replace $x$ by $x/sigma$ (the shift does not change the area of course)
$endgroup$
add a comment |
$begingroup$
It is doing that, but observe that you are also stretching in the horizontal direction by the same factor (in the exponential). Say if $sigma>1$ you are decreasing your area by a factor $sigma$ but you are increasing it by the same factor because you replace $x$ by $x/sigma$ (the shift does not change the area of course)
$endgroup$
add a comment |
$begingroup$
It is doing that, but observe that you are also stretching in the horizontal direction by the same factor (in the exponential). Say if $sigma>1$ you are decreasing your area by a factor $sigma$ but you are increasing it by the same factor because you replace $x$ by $x/sigma$ (the shift does not change the area of course)
$endgroup$
It is doing that, but observe that you are also stretching in the horizontal direction by the same factor (in the exponential). Say if $sigma>1$ you are decreasing your area by a factor $sigma$ but you are increasing it by the same factor because you replace $x$ by $x/sigma$ (the shift does not change the area of course)
answered yesterday
GReyesGReyes
2,43815
2,43815
add a comment |
add a comment |
$begingroup$
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(xmidmu,sigma^2)=frac1{sigmasqrt{2pi}}expleft(-frac12left(frac{x-mu}sigmaright)^2right)$$ where the parameter space is $mathitTheta={(mu,sigma^2)inBbb R^2:sigma^2>0}$. This is essentially saying that the mean is a value on the real line, and the variance is one on the positive real line.
Now consider the simple case where $mu=0$ and $sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=frac1{sqrt{2pi}}expleft(-frac12x^2right).$$ If we integrate this in the interval $(-infty,infty)$, we will get $1$. This is by definition always the case as for all $xinmathit X$ (in this instance $mathit X=Bbb R$), $$int_{mathit X}f(x),dx=1.$$ That is, the sum of all the probabilities of $x$ being in each region in $mathit X$ is $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.
A further example is the Beta distribution, with p.d.f. $$f(xmidalpha,beta)=frac{x^{alpha-1}(1-x)^{beta-1}}{text B(alpha,beta)}$$ where $1/text B(alpha,beta)$ is the normalising constant.
$endgroup$
add a comment |
$begingroup$
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(xmidmu,sigma^2)=frac1{sigmasqrt{2pi}}expleft(-frac12left(frac{x-mu}sigmaright)^2right)$$ where the parameter space is $mathitTheta={(mu,sigma^2)inBbb R^2:sigma^2>0}$. This is essentially saying that the mean is a value on the real line, and the variance is one on the positive real line.
Now consider the simple case where $mu=0$ and $sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=frac1{sqrt{2pi}}expleft(-frac12x^2right).$$ If we integrate this in the interval $(-infty,infty)$, we will get $1$. This is by definition always the case as for all $xinmathit X$ (in this instance $mathit X=Bbb R$), $$int_{mathit X}f(x),dx=1.$$ That is, the sum of all the probabilities of $x$ being in each region in $mathit X$ is $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.
A further example is the Beta distribution, with p.d.f. $$f(xmidalpha,beta)=frac{x^{alpha-1}(1-x)^{beta-1}}{text B(alpha,beta)}$$ where $1/text B(alpha,beta)$ is the normalising constant.
$endgroup$
add a comment |
$begingroup$
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(xmidmu,sigma^2)=frac1{sigmasqrt{2pi}}expleft(-frac12left(frac{x-mu}sigmaright)^2right)$$ where the parameter space is $mathitTheta={(mu,sigma^2)inBbb R^2:sigma^2>0}$. This is essentially saying that the mean is a value on the real line, and the variance is one on the positive real line.
Now consider the simple case where $mu=0$ and $sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=frac1{sqrt{2pi}}expleft(-frac12x^2right).$$ If we integrate this in the interval $(-infty,infty)$, we will get $1$. This is by definition always the case as for all $xinmathit X$ (in this instance $mathit X=Bbb R$), $$int_{mathit X}f(x),dx=1.$$ That is, the sum of all the probabilities of $x$ being in each region in $mathit X$ is $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.
A further example is the Beta distribution, with p.d.f. $$f(xmidalpha,beta)=frac{x^{alpha-1}(1-x)^{beta-1}}{text B(alpha,beta)}$$ where $1/text B(alpha,beta)$ is the normalising constant.
$endgroup$
As you have correctly stated, the p.d.f. of the normal distribution is given by $$f(xmidmu,sigma^2)=frac1{sigmasqrt{2pi}}expleft(-frac12left(frac{x-mu}sigmaright)^2right)$$ where the parameter space is $mathitTheta={(mu,sigma^2)inBbb R^2:sigma^2>0}$. This is essentially saying that the mean is a value on the real line, and the variance is one on the positive real line.
Now consider the simple case where $mu=0$ and $sigma^2=1$. Then the standard normal distribution has p.d.f. $$f(x)=frac1{sqrt{2pi}}expleft(-frac12x^2right).$$ If we integrate this in the interval $(-infty,infty)$, we will get $1$. This is by definition always the case as for all $xinmathit X$ (in this instance $mathit X=Bbb R$), $$int_{mathit X}f(x),dx=1.$$ That is, the sum of all the probabilities of $x$ being in each region in $mathit X$ is $1$. In fact, the constant that makes this happen is so important in statistics (especially Bayesian statistics) that it is given a name: the normalising constant.
A further example is the Beta distribution, with p.d.f. $$f(xmidalpha,beta)=frac{x^{alpha-1}(1-x)^{beta-1}}{text B(alpha,beta)}$$ where $1/text B(alpha,beta)$ is the normalising constant.
answered 19 hours ago
TheSimpliFireTheSimpliFire
13.2k62464
13.2k62464
add a comment |
add a comment |
Thanks for contributing an answer to Mathematics Stack Exchange!
- Please be sure to answer the question. Provide details and share your research!
But avoid …
- Asking for help, clarification, or responding to other answers.
- Making statements based on opinion; back them up with references or personal experience.
Use MathJax to format equations. MathJax reference.
To learn more, see our tips on writing great answers.
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3181774%2fwhat-is-the-purpose-of-frac1-sigma-sqrt2-pi-in-frac1-sigma-sqrt%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
Required, but never shown
2
$begingroup$
so the integral of the probability density function over the entire space is equal to one
$endgroup$
– J. W. Tanner
yesterday