
Probability theory or probability calculus is the branch of mathematics concerned with probability. Although there are several different probability interpretations, probability theory treats the concept in a rigorous mathematical manner by expressing it through a set of axioms. Typically these axioms formalise probability in terms of a probability space, which assigns a measure taking values between 0 and 1, termed the probability measure, to a set of outcomes called the sample space. Any specified subset of the sample space is called an event.

Central subjects in probability theory include discrete and continuous random variables, probability distributions, and stochastic processes (which provide mathematical abstractions of non-deterministic or uncertain processes or measured quantities that may either be single occurrences or evolve over time in a random fashion). Although it is not possible to perfectly predict random events, much can be said about their behavior. Two major results in probability theory describing such behaviour are the law of large numbers and the central limit theorem.

As a mathematical foundation for statistics, probability theory is essential to many human activities that involve quantitative analysis of data.[1] Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in statistical mechanics or sequential estimation. A great discovery of twentieth-century physics was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics.[2]

History of probability

The modern mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example the "problem of points").[3] Christiaan Huygens published a book on the subject in 1657.[4] In the 19th century, what is considered the classical definition of probability was completed by Pierre Laplace.[5]

Initially, probability theory mainly considered discrete events, and its methods were mainly combinatorial. Eventually, analytical considerations compelled the incorporation of continuous variables into the theory.

This culminated in modern probability theory, on foundations laid by Andrey Nikolaevich Kolmogorov. Kolmogorov combined the notion of sample space, introduced by Richard von Mises, and measure theory and presented his axiom system for probability theory in 1933. This became the mostly undisputed axiomatic basis for modern probability theory, but alternatives exist, such as the adoption of finite rather than countable additivity by Bruno de Finetti.[6]

Treatment

Most introductions to probability theory treat discrete probability distributions and continuous probability distributions separately. The measure-theoretic treatment of probability covers the discrete case, the continuous case, mixtures of the two, and more.

Motivation

Consider an experiment that can produce a number of outcomes. The set of all outcomes is called the sample space of the experiment. The power set of the sample space (or equivalently, the event space) is formed by considering all different collections of possible results. For example, rolling an honest die produces one of six possible results. One collection of possible results corresponds to getting an odd number. Thus, the subset {1,3,5} is an element of the power set of the sample space of dice rolls. These collections are called events. In this case, {1,3,5} is the event that the die falls on some odd number. If the results that actually occur fall in a given event, that event is said to have occurred.

Probability is a way of assigning every "event" a value between zero and one, with the requirement that the event made up of all possible results (in our example, the event {1,2,3,4,5,6}) be assigned a value of one. To qualify as a probability distribution, the assignment of values must satisfy the requirement that if you look at a collection of mutually exclusive events (events that contain no common results, e.g., the events {1,6}, {3}, and {2,4} are all mutually exclusive), the probability that any of these events occurs is given by the sum of the probabilities of the events.[7]

The probability that any one of the events {1,6}, {3}, or {2,4} will occur is 5/6. This is the same as saying that the probability of event {1,2,3,4,6} is 5/6. This event encompasses the possibility of any number except five being rolled. The mutually exclusive event {5} has a probability of 1/6, and the event {1,2,3,4,5,6} has a probability of 1, that is, absolute certainty.
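
As a minimal sketch of these requirements (not part of the standard treatment; the helper function prob below is hypothetical), the additivity over mutually exclusive events can be checked directly in Python:

    from fractions import Fraction

    # Sample space for one roll of a fair die; each outcome has probability 1/6.
    p_outcome = {face: Fraction(1, 6) for face in range(1, 7)}

    def prob(event):
        """Probability of an event, i.e. a subset of the sample space."""
        return sum(p_outcome[face] for face in event)

    # The mutually exclusive events from the text: {1,6}, {3}, {2,4}.
    events = [{1, 6}, {3}, {2, 4}]
    union = set().union(*events)

    # Additivity: the probability of the union equals the sum of the parts.
    assert prob(union) == sum(prob(e) for e in events) == Fraction(5, 6)
    print(prob(union))  # 5/6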

When doing calculations using the outcomes of an experiment, it is necessary that all those elementary events have a number assigned to them. This is done using a random variable. A random variable is a function that assigns to each elementary event in the sample space a real number. This function is usually denoted by a capital letter.[8] In the case of a die, the assignment of a number to certain elementary events can be done using the identity function. This does not always work. For example, when flipping a coin the two possible outcomes are "heads" and "tails". In this example, the random variable X could assign to the outcome "heads" the number "0" ($X(\text{heads}) = 0$) and to the outcome "tails" the number "1" ($X(\text{tails}) = 1$).
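
For instance (a small illustrative sketch), the coin-flip random variable can be written as an ordinary function from outcomes to real numbers:

    # A random variable is just a function from outcomes to real numbers.
    def X(outcome: str) -> float:
        return {"heads": 0.0, "tails": 1.0}[outcome]

    print(X("heads"), X("tails"))  # 0.0 1.0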

Discrete probability distributions

Figure: The Poisson distribution, a discrete probability distribution.

Discrete probability theory deals with events that occur in countable sample spaces.

Examples: Throwing dice, experiments with decks of cards, random walk, and tossing coins.

Classical definition: Initially the probability of an event to occur was defined as the number of cases favorable for the event, over the number of total outcomes possible in an equiprobable sample space: see Classical definition of probability.

For example, if the event is "occurrence of an even number when a die is rolled", the probability is given by $\tfrac{3}{6} = \tfrac{1}{2}$, since 3 faces out of the 6 have even numbers and each face has the same probability of appearing.
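
The same counting computation, as a quick sketch in Python:

    from fractions import Fraction

    faces = range(1, 7)
    favorable = [face for face in faces if face % 2 == 0]  # even faces: 2, 4, 6

    # Classical definition: favorable cases over total cases.
    print(Fraction(len(favorable), len(faces)))  # 1/2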

Modern definition: The modern definition starts with a finite or countable set called the sample space, which relates to the set of all possible outcomes in classical sense, denoted by $\Omega$. It is then assumed that for each element $x \in \Omega$, an intrinsic "probability" value $f(x)$ is attached, which satisfies the following properties:

  1. $f(x) \in [0,1]$ for all $x \in \Omega$;
  2. $\sum_{x \in \Omega} f(x) = 1.$

That is, the probability function f(x) lies between zero and one for every value of x in the sample space Ω, and the sum of f(x) over all values x in the sample space Ω is equal to 1. An event is defined as any subset $E$ of the sample space $\Omega$. The probability of the event $E$ is defined as

$P(E) = \sum_{x \in E} f(x).$

So, the probability of the entire sample space is 1, and the probability of the null event is 0.

The function $f$ mapping a point in the sample space to the "probability" value is called a probability mass function, abbreviated as pmf.
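
A sketch of a pmf for the fair die, checking the two defining properties and computing an event probability (the names below are illustrative):

    from fractions import Fraction

    omega = {1, 2, 3, 4, 5, 6}
    f = {x: Fraction(1, 6) for x in omega}  # the probability mass function

    # The two defining properties of a pmf.
    assert all(0 <= f[x] <= 1 for x in omega)
    assert sum(f.values()) == 1

    def P(E):
        """P(E) = sum of f(x) over x in E, for a subset E of omega."""
        return sum(f[x] for x in E)

    print(P({1, 3, 5}))  # 1/2, the odd-number event
    print(P(omega))      # 1, the entire sample space
    print(P(set()))      # 0, the null event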

Continuous probability distributions

Figure: The normal distribution, a continuous probability distribution.

Continuous probability theory deals with events that occur in a continuous sample space.

Classical definition: The classical definition breaks down when confronted with the continuous case. See Bertrand's paradox.

Modern definition: If the sample space of a random variable X is the set of real numbers ($\mathbb{R}$) or a subset thereof, then a function called the cumulative distribution function (CDF) $F$ exists, defined by $F(x) = P(X \le x)$. That is, F(x) returns the probability that X will be less than or equal to x.

The CDF necessarily satisfies the following properties.

  1. $F$ is a monotonically non-decreasing, right-continuous function;
  2. $\lim_{x \to -\infty} F(x) = 0;$
  3. $\lim_{x \to \infty} F(x) = 1.$

The random variable $X$ is said to have a continuous probability distribution if the corresponding CDF $F$ is continuous. If $F$ is absolutely continuous, then its derivative exists almost everywhere and integrating the derivative gives us the CDF back again. In this case, the random variable X is said to have a probability density function (PDF) or simply density $f(x) = \dfrac{dF(x)}{dx}.$

For a set $E \subseteq \mathbb{R}$, the probability of the random variable X being in $E$ is

$P(X \in E) = \int_{x \in E} dF(x).$

In case the PDF exists, this can be written as

$P(X \in E) = \int_{x \in E} f(x)\,dx.$

Whereas the PDF exists only for continuous random variables, the CDF exists for all random variables (including discrete random variables) that take values in $\mathbb{R}.$
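
As a numerical sketch (the exponential distribution with rate 1 is my choice of example), integrating the density recovers the CDF:

    import math

    # Exponential(1): CDF F(x) = 1 - exp(-x), density f(x) = exp(-x), x >= 0.
    def F(x): return 1.0 - math.exp(-x)
    def f(x): return math.exp(-x)

    # Midpoint-rule integration of f over [0, 5] should approximate F(5).
    n = 100_000
    h = 5.0 / n
    integral = sum(f((i + 0.5) * h) for i in range(n)) * h
    print(round(integral, 6), round(F(5.0), 6))  # both ≈ 0.993262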

These concepts can be generalized for multidimensional cases on $\mathbb{R}^n$ and other continuous sample spaces.

Measure-theoretic probability theory

The utility of the measure-theoretic treatment of probability is that it unifies the discrete and the continuous cases, and makes the difference a question of which measure is used. Furthermore, it covers distributions that are neither discrete nor continuous nor mixtures of the two.

An example of such distributions could be a mix of discrete and continuous distributions, for example, a random variable that is 0 with probability 1/2 and takes a random value from a normal distribution with probability 1/2. It can still be studied to some extent by considering it to have a PDF of $(\delta[x] + \varphi(x))/2$, where $\delta[x]$ is the Dirac delta function and $\varphi(x)$ is the normal density.
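
A sampling sketch of this particular mixed variable (half a point mass at 0, half a standard normal draw):

    import random

    def sample_mixed():
        """With probability 1/2 return exactly 0; otherwise draw from N(0, 1)."""
        return 0.0 if random.random() < 0.5 else random.gauss(0.0, 1.0)

    draws = [sample_mixed() for _ in range(100_000)]
    # The point mass shows up as a spike of exact zeros, which no
    # ordinary density could produce.
    print(sum(d == 0.0 for d in draws) / len(draws))  # ≈ 0.5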

Other distributions may not even be a mix: for example, the Cantor distribution assigns no positive probability to any single point, nor does it have a density. The modern approach to probability theory solves these problems using measure theory to define the probability space:

Given any set $\Omega$ (also called the sample space) and a σ-algebra $\mathcal{F}$ on it, a measure $P$ defined on $\mathcal{F}$ is called a probability measure if $P(\Omega) = 1.$

If $\mathcal{F}$ is the Borel σ-algebra on the set of real numbers, then there is a unique probability measure on $\mathcal{F}$ for any CDF, and vice versa. The measure corresponding to a CDF is said to be induced by the CDF. This measure coincides with the pmf for discrete variables and the PDF for continuous variables, making the measure-theoretic approach free of fallacies.

The probability of a set $E$ in the σ-algebra $\mathcal{F}$ is defined as

$P(E) = \int_{\omega \in E} \mu_F(d\omega)$

where the integration is with respect to the measure $\mu_F$ induced by $F.$

Along with providing better understanding and unification of discrete and continuous probabilities, measure-theoretic treatment also allows us to work on probabilities outside $\mathbb{R}^n$, as in the theory of stochastic processes. For example, to study Brownian motion, probability is defined on a space of functions.

When it is convenient to work with a dominating measure, the Radon–Nikodym theorem is used to define a density as the Radon–Nikodym derivative of the probability distribution of interest with respect to this dominating measure. Discrete densities are usually defined as this derivative with respect to a counting measure over the set of all possible outcomes. Densities for absolutely continuous distributions are usually defined as this derivative with respect to the Lebesgue measure. If a theorem can be proved in this general setting, it holds for both discrete and continuous distributions as well as others; separate proofs are not required for discrete and continuous distributions.

Classical probability distributions

Certain random variables occur very often in probability theory because they well describe many natural or physical processes. Their distributions, therefore, have gained special importance in probability theory. Some fundamental discrete distributions are the discrete uniform, Bernoulli, binomial, negative binomial, Poisson and geometric distributions. Important continuous distributions include the continuous uniform, normal, exponential, gamma and beta distributions.
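
As an illustrative sketch, NumPy's random generator provides samplers for most of these distributions (the parameter values below are arbitrary):

    import numpy as np

    rng = np.random.default_rng(seed=0)
    n = 5  # a few draws from each, just to show the API

    print(rng.integers(1, 7, size=n))     # discrete uniform on {1, ..., 6}
    print(rng.binomial(10, 0.3, size=n))  # binomial(n=10, p=0.3)
    print(rng.poisson(4.0, size=n))       # Poisson(lambda=4)
    print(rng.geometric(0.2, size=n))     # geometric(p=0.2)
    print(rng.uniform(0.0, 1.0, size=n))  # continuous uniform on [0, 1)
    print(rng.normal(0.0, 1.0, size=n))   # normal(mu=0, sigma=1)
    print(rng.exponential(1.0, size=n))   # exponential(scale=1)
    print(rng.gamma(2.0, 2.0, size=n))    # gamma(shape=2, scale=2)
    print(rng.beta(2.0, 5.0, size=n))     # beta(a=2, b=5)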

Convergence of random variables

In probability theory, there are several notions of convergence for random variables. They are listed below in the order of strength, i.e., any subsequent notion of convergence in the list implies convergence according to all of the preceding notions.

Weak convergence
A sequence of random variables $X_1, X_2, \ldots$ converges weakly to the random variable $X$ if their respective CDFs $F_1, F_2, \ldots$ converge to the CDF $F$ of $X$, wherever $F$ is continuous. Weak convergence is also called convergence in distribution.
Most common shorthand notation: $X_n \xrightarrow{\mathcal{D}} X$
Convergence in probability
The sequence of random variables $X_1, X_2, \ldots$ is said to converge towards the random variable $X$ in probability if $\lim_{n \to \infty} P\left(\left|X_n - X\right| \ge \varepsilon\right) = 0$ for every ε > 0.
Most common shorthand notation: $X_n \xrightarrow{P} X$
Strong convergence
The sequence of random variables $X_1, X_2, \ldots$ is said to converge towards the random variable $X$ strongly if $P(\lim_{n \to \infty} X_n = X) = 1$. Strong convergence is also known as almost sure convergence.
Most common shorthand notation: $X_n \xrightarrow{\mathrm{a.s.}} X$

As the names indicate, weak convergence is weaker than strong convergence. In fact, strong convergence implies convergence in probability, and convergence in probability implies weak convergence. The reverse statements are not always true.
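
A Monte Carlo sketch of convergence in probability (my example: $X_n$ the mean of n Uniform(0,1) draws, which converges to 1/2): the estimated deviation probability shrinks as n grows.

    import random

    def deviation_prob(n, eps=0.05, trials=2_000):
        """Estimate P(|X_n - 1/2| >= eps), X_n = mean of n Uniform(0,1) draws."""
        hits = 0
        for _ in range(trials):
            x_n = sum(random.random() for _ in range(n)) / n
            if abs(x_n - 0.5) >= eps:
                hits += 1
        return hits / trials

    for n in (10, 100, 1_000):
        print(n, deviation_prob(n))  # estimates decrease towards 0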

Law of large numbers

Common intuition suggests that if a fair coin is tossed many times, then roughly half of the time it will turn up heads, and the other half it will turn up tails. Furthermore, the more often the coin is tossed, the more likely it should be that the ratio of the number of heads to the number of tails will approach unity. Modern probability theory provides a formal version of this intuitive idea, known as the law of large numbers. This law is remarkable because it is not assumed in the foundations of probability theory, but instead emerges from these foundations as a theorem. Since it links theoretically derived probabilities to their actual frequency of occurrence in the real world, the law of large numbers is considered as a pillar in the history of statistical theory and has had widespread influence.[9]

The law of large numbers (LLN) states that the sample average

$\overline{X}_n = \frac{1}{n} \sum_{k=1}^{n} X_k$

of a sequence of independent and identically distributed random variables $X_1, X_2, \ldots$ converges towards their common expectation (expected value) $\mu$, provided that the expectation of $|X_k|$ is finite.

It is the different forms of convergence of random variables that separate the weak and the strong law of large numbers:[10]

Weak law: $\overline{X}_n \xrightarrow{P} \mu$ for $n \to \infty$
Strong law: $\overline{X}_n \xrightarrow{\mathrm{a.s.}} \mu$ for $n \to \infty$

It follows from the LLN that if an event of probability p is observed repeatedly during independent experiments, the ratio of the observed frequency of that event to the total number of repetitions converges towards p.

For example, if $X_1, X_2, \ldots$ are independent Bernoulli random variables taking values 1 with probability p and 0 with probability 1 − p, then $\mathrm{E}(X_i) = p$ for all i, so that $\overline{X}_n$ converges to p almost surely.
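
A short simulation of this Bernoulli example (p = 0.3 is an arbitrary choice); the running sample mean settles near p:

    import random

    p = 0.3
    n = 100_000
    running_sum = 0
    for i in range(1, n + 1):
        running_sum += 1 if random.random() < p else 0
        if i in (10, 100, 1_000, 10_000, 100_000):
            print(i, running_sum / i)  # drifts towards p = 0.3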

Central limit theorem

The central limit theorem (CLT) explains the ubiquitous occurrence of the normal distribution in nature, and this theorem, according to David Williams, "is one of the great results of mathematics."[11]

The theorem states that the average of many independent and identically distributed random variables with finite variance tends towards a normal distribution irrespective of the distribution followed by the original random variables. Formally, let $X_1, X_2, \ldots$ be independent random variables with mean $\mu$ and variance $\sigma^2 > 0.$ Then the sequence of random variables

$Z_n = \frac{\sum_{i=1}^{n} (X_i - \mu)}{\sigma\sqrt{n}}$

converges in distribution to a standard normal random variable.
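
A simulation sketch of this statement (exponential summands are an arbitrary choice; any finite-variance distribution works): empirical probabilities for $Z_n$ approach standard normal values.

    import math
    import random

    def z_n(n):
        """One draw of Z_n = (sum of X_i - n*mu) / (sigma * sqrt(n)), X_i ~ Exp(1)."""
        mu, sigma = 1.0, 1.0  # mean and standard deviation of Exp(1)
        s = sum(random.expovariate(1.0) for _ in range(n))
        return (s - n * mu) / (sigma * math.sqrt(n))

    draws = [z_n(500) for _ in range(20_000)]
    # P(Z_n <= 0) should be near Phi(0) = 0.5 ...
    print(sum(d <= 0 for d in draws) / len(draws))
    # ... and P(Z_n <= -1.96) near Phi(-1.96) ≈ 0.025.
    print(sum(d <= -1.96 for d in draws) / len(draws))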

For some classes of random variables, the classic central limit theorem works rather fast, as quantified by the Berry–Esseen theorem; for example, this is the case for distributions from the exponential family with finite first, second, and third moments. On the other hand, for some random variables of the heavy-tail and fat-tail variety it works very slowly or may not work at all; in such cases one may use the generalized central limit theorem (GCLT).


References

Citations

  1. ^ Inferring From Data
  2. ^ "Quantum Logic and Probability Theory". The Stanford Encyclopedia of Philosophy. 10 August 2021.
  3. ^ Lightner, James E. (1991). "A Brief Look at the History of Probability and Statistics". The Mathematics Teacher. 84 (8): 623–630. doi:10.5951/MT.84.8.0623. ISSN 0025-5769. JSTOR 27967334.
  4. ^ Grinstead, Charles Miller; Snell, James Laurie. "Introduction". Introduction to Probability. pp. vii.
  5. ^ Daston, Lorraine J. (1980). "Probabilistic Expectation and Rationality in Classical Probability Theory". Historia Mathematica. 7 (3): 234–260. doi:10.1016/0315-0860(80)90025-7.
  6. ^ Shafer, Glenn; Vovk, Vladimir. "The origins and legacy of Kolmogorov's Grundbegriffe" (PDF). Retrieved 2025-08-14.
  7. ^ Ross, Sheldon (2010). A First Course in Probability (8th ed.). Pearson Prentice Hall. pp. 26–27. ISBN 978-0-13-603313-4. Retrieved 2025-08-14.
  8. ^ Bain, Lee J.; Engelhardt, Max (1992). Introduction to Probability and Mathematical Statistics (2nd ed.). Belmont, California: Brooks/Cole. p. 53. ISBN 978-0-534-38020-5.
  9. ^ "Leithner & Co Pty Ltd - Value Investing, Risk and Risk Management - Part I". Leithner.com.au. 2025-08-14. Archived from the original on 2025-08-14. Retrieved 2025-08-14.
  10. ^ Dekking, Michel (2005). "Chapter 13: The law of large numbers". A Modern Introduction to Probability and Statistics: Understanding Why and How. London: Springer. pp. 180–194. ISBN 978-1-85233-896-1.
  11. ^ Williams, David. Probability with Martingales. Cambridge, 1991/2008.
