
Bayesian statistics (/ˈbeɪziən/ BAY-zee-ən or /ˈbeɪʒən/ BAY-zhən)[1] is a theory in the field of statistics based on the Bayesian interpretation of probability, where probability expresses a degree of belief in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation, which views probability as the limit of the relative frequency of an event after many trials.[2] More concretely, analysis in Bayesian methods codifies prior knowledge in the form of a prior distribution.

Bayesian statistical methods use Bayes' theorem to compute and update probabilities after obtaining new data. Bayes' theorem describes the conditional probability of an event based on data as well as prior information or beliefs about the event or conditions related to the event.[3][4] For example, in Bayesian inference, Bayes' theorem can be used to estimate the parameters of a probability distribution or statistical model. Since Bayesian statistics treats probability as a degree of belief, Bayes' theorem can directly assign a probability distribution that quantifies the belief to the parameter or set of parameters.[2][3]

Bayesian statistics is named after Thomas Bayes, who formulated a specific case of Bayes' theorem in a paper published in 1763. In several papers spanning from the late 18th to the early 19th centuries, Pierre-Simon Laplace developed the Bayesian interpretation of probability.[5] Laplace used methods now considered Bayesian to solve a number of statistical problems. While many Bayesian methods were developed by later authors, the term "Bayesian" was not commonly used to describe these methods until the 1950s. Throughout much of the 20th century, Bayesian methods were viewed unfavorably by many statisticians due to philosophical and practical considerations. Many of these methods required much computation, and most of the widely used approaches at that time were based on the frequentist interpretation. However, with the advent of powerful computers and new algorithms like Markov chain Monte Carlo, Bayesian methods have gained increasing prominence in statistics in the 21st century.[2][6]

Bayes's theorem


Bayes's theorem is used in Bayesian methods to update probabilities, which are degrees of belief, after obtaining new data. Given two events $A$ and $B$, the conditional probability of $A$ given that $B$ is true is expressed as follows:[7]

$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$$

where $P(B) \neq 0$. Although Bayes's theorem is a fundamental result of probability theory, it has a specific interpretation in Bayesian statistics. In the above equation, $A$ usually represents a proposition (such as the statement that a coin lands on heads fifty percent of the time) and $B$ represents the evidence, or new data that is to be taken into account (such as the result of a series of coin flips). $P(A)$ is the prior probability of $A$, which expresses one's beliefs about $A$ before evidence is taken into account. The prior probability may also quantify prior knowledge or information about $A$. $P(B \mid A)$ is the likelihood function, which can be interpreted as the probability of the evidence $B$ given that $A$ is true. The likelihood quantifies the extent to which the evidence $B$ supports the proposition $A$. $P(A \mid B)$ is the posterior probability, the probability of the proposition $A$ after taking the evidence $B$ into account. Essentially, Bayes's theorem updates one's prior beliefs $P(A)$ after considering the new evidence $B$.[2]
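A minimal numerical sketch of this update, assuming two hypothetical propositions about a coin (a fair coin versus a biased coin with heads probability 0.8) and an illustrative sequence of flips, might look as follows:

```python
# Bayes' theorem for two competing propositions about a coin: "fair"
# (heads probability 0.5) versus a hypothetical biased alternative
# (heads probability 0.8). The evidence is an observed sequence of flips.

def likelihood(heads, tails, p_heads):
    """Probability of the observed flips given a heads probability."""
    return (p_heads ** heads) * ((1 - p_heads) ** tails)

# Prior beliefs over the two competing propositions.
prior = {"fair": 0.5, "biased": 0.5}
heads_prob = {"fair": 0.5, "biased": 0.8}

# Evidence: 7 heads and 3 tails in 10 flips.
heads, tails = 7, 3

# P(evidence): sum of prior * likelihood over the competing propositions.
evidence = sum(prior[h] * likelihood(heads, tails, heads_prob[h]) for h in prior)

# Bayes' theorem: posterior = likelihood * prior / evidence.
posterior = {h: prior[h] * likelihood(heads, tails, heads_prob[h]) / evidence
             for h in prior}
print(posterior)  # roughly {'fair': 0.37, 'biased': 0.63}
```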

The probability of the evidence $P(B)$ can be calculated using the law of total probability. If $\{A_1, A_2, \dots, A_n\}$ is a partition of the sample space, which is the set of all outcomes of an experiment, then,[2][7]

$$P(B) = \sum_{i} P(B \mid A_i)\,P(A_i)$$

When there are an infinite number of outcomes, it is necessary to integrate over all outcomes to calculate $P(B)$ using the law of total probability. Often, $P(B)$ is difficult to calculate as the calculation would involve sums or integrals that would be time-consuming to evaluate, so often only the product of the prior and likelihood is considered, since the evidence does not change in the same analysis. The posterior is proportional to this product:[2]

$$P(A \mid B) \propto P(B \mid A)\,P(A)$$

The maximum a posteriori, which is the mode of the posterior and is often computed in Bayesian statistics using mathematical optimization methods, remains the same. The posterior can be approximated even without computing the exact value of $P(B)$ with methods such as Markov chain Monte Carlo or variational Bayesian methods.[2]
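As a minimal sketch of this idea, a random-walk Metropolis sampler can approximate the posterior over a coin's heads probability using only the unnormalized product of prior and likelihood; the flat prior, the data, and the tuning constants below are illustrative assumptions:

```python
# Random-walk Metropolis sketch: approximate the posterior over a coin's heads
# probability without computing the normalizing constant P(evidence); only the
# product prior * likelihood enters through the acceptance ratio.
import math
import random

heads, tails = 7, 3  # illustrative observed data

def log_unnormalized_posterior(p):
    """log(prior * likelihood); a flat prior on [0, 1] is assumed here."""
    if not 0.0 < p < 1.0:
        return -math.inf
    return heads * math.log(p) + tails * math.log(1.0 - p)

samples, p = [], 0.5                         # arbitrary starting point
for _ in range(20_000):
    proposal = p + random.gauss(0.0, 0.1)    # random-walk proposal
    # Accept with probability min(1, posterior ratio); P(evidence) cancels.
    log_ratio = log_unnormalized_posterior(proposal) - log_unnormalized_posterior(p)
    if math.log(random.random()) < log_ratio:
        p = proposal
    samples.append(p)

burned = samples[2_000:]                     # discard burn-in
print(sum(burned) / len(burned))             # posterior mean, close to 8/12 ≈ 0.667
```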

Construction


The classical textbook equation for the posterior in Bayesian statistics is usually stated as

$$\pi(\theta \mid x) = \frac{f(x \mid \theta)\,\pi(\theta)}{\int_\Theta f(x \mid \theta)\,\pi(\theta)\,d\theta}$$

where $\pi(\theta \mid x)$ is the updated probability of $\theta$ being the true parameter after collecting the data $x$, $f(x \mid \theta)$ is the likelihood of collecting the data $x$ given the parameter $\theta$, $\pi(\theta)$ expresses the prior belief about $\theta$, and the integral in the denominator gives the probability of collecting the data $x$.

Mathematically, this version of Bayes' theorem can be constructed in the following way: Suppose $(\mathcal{X}, \mathcal{F}, \{P_\theta\}_{\theta \in \Theta})$ to be some parametric statistical model and $(\Theta, \mathcal{G}, \mu)$ to be a probability space over the parameter space, so that $\mu$ encodes the prior. We can construct a new probability space $(\mathcal{X} \times \Theta, \mathcal{F} \otimes \mathcal{G}, P)$ where $P$ is a sort of product measure defined as:

$$P(A \times B) = \int_B P_\theta(A)\,\mu(d\theta), \qquad A \in \mathcal{F},\; B \in \mathcal{G}$$

Now, let $X(x, \theta) = x$ and $\vartheta(x, \theta) = \theta$ be the coordinate projections onto the data and the parameter, then we get:

$$P(X \in A) = \int_\Theta P_\theta(A)\,\mu(d\theta)$$

and hence

$$P(\vartheta \in B) = \mu(B)$$

both as empirically might be expected. Thus, Bayes' theorem states:

$$P(\vartheta \in B \mid X \in A) = \frac{P(X \in A \mid \vartheta \in B)\,P(\vartheta \in B)}{P(X \in A)} = \frac{\int_B P_\theta(A)\,\mu(d\theta)}{\int_\Theta P_\theta(A)\,\mu(d\theta)}$$

If $P_\theta \ll \lambda$ (absolutely continuous w.r.t. Lebesgue measure), then there exists a density $f(\cdot \mid \theta)$ such that $P_\theta(A) = \int_A f(x \mid \theta)\,dx$ and we can write:

$$P(\vartheta \in B \mid X = x) = \frac{\int_B f(x \mid \theta)\,\mu(d\theta)}{\int_\Theta f(x \mid \theta)\,\mu(d\theta)}$$

Else, if $P_\theta \ll \#$ (absolutely continuous w.r.t. counting measure), then the density is the probability mass function $f(x \mid \theta) = P_\theta(\{x\})$ and analogously we can write:

$$P(\vartheta \in B \mid X = x) = \frac{\int_B P_\theta(\{x\})\,\mu(d\theta)}{\int_\Theta P_\theta(\{x\})\,\mu(d\theta)}$$

Thus, by identifying $\mu(d\theta)$ with the prior $\pi(\theta)\,d\theta$ and $f(x \mid \theta)$ with the likelihood, we arrive at the classical equation stated above.
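A minimal numerical sketch of the classical equation, assuming Bernoulli observations and a uniform prior chosen purely for illustration, evaluates the posterior density on a grid by approximating the integral in the denominator:

```python
# Grid evaluation of the classical posterior equation: the denominator (the
# probability of the data) is approximated by numerical integration over the
# parameter space. Data, prior, and grid size are illustrative assumptions.
import numpy as np

x = np.array([1, 1, 0, 1, 0, 1, 1, 1, 0, 1])   # hypothetical coin-flip data

def likelihood(theta, data):
    """f(x | theta) for independent Bernoulli observations."""
    k, n = data.sum(), data.size
    return theta ** k * (1.0 - theta) ** (n - k)

def prior(theta):
    """pi(theta): a uniform density on [0, 1]."""
    return np.ones_like(theta)

theta_grid = np.linspace(0.0, 1.0, 1001)
d_theta = theta_grid[1] - theta_grid[0]
unnormalized = likelihood(theta_grid, x) * prior(theta_grid)

# Denominator of the classical equation: integral of f(x|theta) * pi(theta) d(theta).
evidence = (unnormalized * d_theta).sum()

posterior_density = unnormalized / evidence
print((posterior_density * d_theta).sum())     # integrates to approximately 1
```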

Bayesian methods


The general set of statistical techniques can be divided into a number of activities, many of which have special Bayesian versions.

Bayesian inference


Bayesian inference refers to statistical inference where uncertainty in inferences is quantified using probability.[8] In classical frequentist inference, model parameters and hypotheses are considered to be fixed. Probabilities are not assigned to parameters or hypotheses in frequentist inference. For example, it would not make sense in frequentist inference to directly assign a probability to an event that can only happen once, such as the result of the next flip of a fair coin. However, it would make sense to state that the proportion of heads approaches one-half as the number of coin flips increases.[9]

Statistical models specify a set of statistical assumptions and processes that represent how the sample data are generated. Statistical models have a number of parameters that can be modified. For example, a coin can be represented as samples from a Bernoulli distribution, which models two possible outcomes. The Bernoulli distribution has a single parameter equal to the probability of one outcome, which in most cases is the probability of landing on heads. Devising a good model for the data is central in Bayesian inference. In most cases, models only approximate the true process, and may not take into account certain factors influencing the data.[2] In Bayesian inference, probabilities can be assigned to model parameters. Parameters can be represented as random variables. Bayesian inference uses Bayes' theorem to update probabilities after more evidence is obtained or known.[2][10] Furthermore, Bayesian methods allow for placing priors on entire models and calculating their posterior probabilities using Bayes' theorem. These posterior probabilities are proportional to the product of the prior and the marginal likelihood, where the marginal likelihood is the integral of the sampling density over the prior distribution of the parameters. In complex models, marginal likelihoods are generally computed numerically.[11]
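For the Bernoulli model mentioned above, a conjugate Beta prior makes both the posterior and the marginal likelihood available in closed form. The following sketch assumes illustrative hyperparameters and data:

```python
# Conjugate Beta-Bernoulli inference: the posterior is Beta(a + heads, b + tails),
# and the marginal likelihood (the integral of the sampling density over the
# prior) reduces to a ratio of Beta functions. Hyperparameters are illustrative.
from math import exp, lgamma

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

# Prior Beta(a, b) on the heads probability, and observed flips.
a, b = 2.0, 2.0
heads, tails = 7, 3

# Conjugate update: the posterior is Beta(a + heads, b + tails).
post_a, post_b = a + heads, b + tails
posterior_mean = post_a / (post_a + post_b)

# Marginal likelihood of the data: B(a + heads, b + tails) / B(a, b).
log_marginal = log_beta(post_a, post_b) - log_beta(a, b)

print(posterior_mean)        # 9/14 ≈ 0.643
print(exp(log_marginal))     # probability of this particular sequence of flips
```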

Statistical modeling


The formulation of statistical models using Bayesian statistics has the identifying feature of requiring the specification of prior distributions for any unknown parameters. Indeed, parameters of prior distributions may themselves have prior distributions, leading to Bayesian hierarchical modeling,[12][13][14] also known as multi-level modeling. A special case is Bayesian networks.
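A minimal sketch of a two-level hierarchical model, with all distributional choices and constants assumed purely for illustration, can be written as an unnormalized joint log-density over group-level parameters and a shared hyperparameter:

```python
# Two-level hierarchical model sketch: each group j has its own mean theta[j],
# the group means share a Normal(mu, tau) prior, and mu itself has a wide
# hyperprior. Data, scales, and distributional choices are illustrative.
import numpy as np

# Hypothetical data: observations grouped by index.
y = [np.array([2.1, 1.9, 2.4]),
     np.array([3.0, 2.7]),
     np.array([1.2, 1.5, 1.1, 0.9])]
sigma, tau = 0.5, 1.0          # assumed known observation and group-level scales

def normal_logpdf(x, mean, sd):
    return -0.5 * np.log(2 * np.pi * sd ** 2) - 0.5 * ((x - mean) / sd) ** 2

def log_posterior(mu, theta):
    """log p(mu, theta | y) up to an additive constant, for the hierarchy above."""
    lp = normal_logpdf(mu, 0.0, 10.0)                      # hyperprior on mu
    for theta_j, y_j in zip(theta, y):
        lp += normal_logpdf(theta_j, mu, tau)              # group-level prior
        lp += normal_logpdf(y_j, theta_j, sigma).sum()     # likelihood of group data
    return lp

print(log_posterior(2.0, [2.1, 2.8, 1.2]))   # evaluate at one candidate point
```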

For conducting a Bayesian statistical analysis, best practices are discussed by van de Schoot et al.[15]

For reporting the results of a Bayesian statistical analysis, Bayesian analysis reporting guidelines (BARG) are provided in an open-access article by John K. Kruschke.[16]

Design of experiments


The Bayesian design of experiments includes a concept called 'influence of prior beliefs'. This approach uses sequential analysis techniques to include the outcome of earlier experiments in the design of the next experiment. This is achieved by updating 'beliefs' through the use of prior and posterior distributions. This allows the design of experiments to make good use of resources of all types. An example of this is the multi-armed bandit problem, sketched below.
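For the multi-armed bandit example, Thompson sampling is one Bayesian strategy: each arm's success probability has a Beta posterior, the next arm is chosen by sampling from those posteriors, and the played arm's posterior is updated with the outcome. The success rates below are hypothetical:

```python
# Thompson sampling sketch for a Bernoulli multi-armed bandit. Each arm keeps a
# Beta posterior over its success probability; the posterior of the chosen arm
# is updated sequentially as rewards are observed.
import random

true_rates = [0.3, 0.5, 0.7]                  # unknown to the experimenter
alpha = [1.0] * len(true_rates)               # Beta(1, 1) priors for each arm
beta = [1.0] * len(true_rates)

for _ in range(5_000):
    # Draw one sample from each arm's current posterior and play the best draw.
    draws = [random.betavariate(a, b) for a, b in zip(alpha, beta)]
    arm = draws.index(max(draws))
    reward = 1 if random.random() < true_rates[arm] else 0
    # Sequential Bayesian update of the chosen arm's posterior.
    alpha[arm] += reward
    beta[arm] += 1 - reward

estimates = [a / (a + b) for a, b in zip(alpha, beta)]
print(estimates)   # posterior means; the best arm receives most of the pulls
```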


Exploratory analysis of Bayesian models


Exploratory analysis of Bayesian models is an adaptation or extension of the exploratory data analysis approach to the needs and peculiarities of Bayesian modeling. In the words of Persi Diaconis:[17]

Exploratory data analysis seeks to reveal structure, or simple descriptions in data. We look at numbers or graphs and try to find patterns. We pursue leads suggested by background information, imagination, patterns perceived, and experience with other data analyses

The inference process generates a posterior distribution, which has a central role in Bayesian statistics, together with other distributions like the posterior predictive distribution and the prior predictive distribution. The correct visualization, analysis, and interpretation of these distributions is key to properly answering the questions that motivate the inference process.[18]

When working with Bayesian models there are a series of related tasks that need to be addressed besides inference itself:

  • Diagnosis of the quality of the inference; this is needed when using numerical methods such as Markov chain Monte Carlo techniques
  • Model criticism, including evaluations of both model assumptions and model predictions
  • Comparison of models, including model selection or model averaging
  • Preparation of the results for a particular audience

All these tasks are part of the Exploratory analysis of Bayesian models approach and successfully performing them is central to the iterative and interactive modeling process. These tasks require both numerical and visual summaries.[19][20][21]
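As one example of these tasks, the following sketch performs a simple posterior predictive check: replicated datasets are drawn from the posterior predictive distribution and compared with the observed data through a test statistic. The model (a Bernoulli likelihood with a Beta posterior), the data, and the statistic are illustrative assumptions:

```python
# Posterior predictive check sketch: draw a parameter from the posterior, then a
# replicated dataset from the likelihood, and compare a test statistic of the
# replicates against the same statistic of the observed data.
import random

observed = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
post_a = 1 + sum(observed)                      # Beta posterior under a Beta(1, 1) prior
post_b = 1 + len(observed) - sum(observed)

def longest_run_of_heads(seq):
    """Test statistic: length of the longest run of 1s."""
    best = run = 0
    for s in seq:
        run = run + 1 if s == 1 else 0
        best = max(best, run)
    return best

observed_stat = longest_run_of_heads(observed)

more_extreme, n_reps = 0, 4_000
for _ in range(n_reps):
    p = random.betavariate(post_a, post_b)
    replicate = [1 if random.random() < p else 0 for _ in observed]
    if longest_run_of_heads(replicate) >= observed_stat:
        more_extreme += 1

print(more_extreme / n_reps)   # posterior predictive p-value for the run-length statistic
```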


References

  1. ^ "Bayesian". Merriam-Webster.com Dictionary. Merriam-Webster.
  2. ^ a b c d e f g h i Gelman, Andrew; Carlin, John B.; Stern, Hal S.; Dunson, David B.; Vehtari, Aki; Rubin, Donald B. (2013). Bayesian Data Analysis (Third ed.). Chapman and Hall/CRC. ISBN 978-1-4398-4095-5.
  3. ^ a b McElreath, Richard (2020). Statistical Rethinking: A Bayesian Course with Examples in R and Stan (2nd ed.). Chapman and Hall/CRC. ISBN 978-0-367-13991-9.
  4. ^ Kruschke, John (2014). Doing Bayesian Data Analysis: A Tutorial with R, JAGS, and Stan (2nd ed.). Academic Press. ISBN 978-0-12-405888-0.
  5. ^ McGrayne, Sharon (2012). The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy (First ed.). Chapman and Hall/CRC. ISBN 978-0-3001-8822-6.
  6. ^ Fienberg, Stephen E. (2006). "When Did Bayesian Inference Become "Bayesian"?". Bayesian Analysis. 1 (1): 1–40. doi:10.1214/06-BA101.
  7. ^ a b Grinstead, Charles M.; Snell, J. Laurie (2006). Introduction to probability (2nd ed.). Providence, RI: American Mathematical Society. ISBN 978-0-8218-9414-9.
  8. ^ Lee, Se Yoon (2021). "Gibbs sampler and coordinate ascent variational inference: A set-theoretical review". Communications in Statistics - Theory and Methods. 51 (6): 1549–1568. arXiv:2008.01006. doi:10.1080/03610926.2021.1921214. S2CID 220935477.
  9. ^ Wakefield, Jon (2013). Bayesian and frequentist regression methods. New York, NY: Springer. ISBN 978-1-4419-0924-4.
  10. ^ Congdon, Peter (2014). Applied Bayesian modelling (2nd ed.). Wiley. ISBN 978-1119951513.
  11. ^ Chib, Siddhartha (1995). "Marginal Likelihood from the Gibbs Output". Journal of the American Statistical Association. 90 (432): 1313–1321. doi:10.1080/01621459.1995.10476635.
  12. ^ Kruschke, J K; Vanpaemel, W (2015). "Bayesian Estimation in Hierarchical Models". In Busemeyer, J R; Wang, Z; Townsend, J T; Eidels, A (eds.). The Oxford Handbook of Computational and Mathematical Psychology (PDF). Oxford University Press. pp. 279–299.
  13. ^ Hajiramezanali, E. & Dadaneh, S. Z. & Karbalayghareh, A. & Zhou, Z. & Qian, X. Bayesian multi-domain learning for cancer subtype discovery from next-generation sequencing count data. 32nd Conference on Neural Information Processing Systems (NIPS 2018), Montréal, Canada. arXiv:1810.09433
  14. ^ Lee, Se Yoon; Mallick, Bani (2021). "Bayesian Hierarchical Modeling: Application Towards Production Results in the Eagle Ford Shale of South Texas". Sankhya B. 84: 1–43. doi:10.1007/s13571-020-00245-8.
  15. ^ van de Schoot, Rens; Depaoli, Sarah; King, Ruth; Kramer, Bianca; Märtens, Kaspar; Tadesse, Mahlet G.; Vannucci, Marina; Gelman, Andrew; Veen, Duco; Willemsen, Joukje; Yau, Christopher (January 14, 2021). "Bayesian statistics and modelling". Nature Reviews Methods Primers. 1 (1): 1–26. doi:10.1038/s43586-020-00001-2. hdl:1874/415909. S2CID 234108684.
  16. ^ Kruschke, J K (Aug 16, 2021). "Bayesian Analysis Reporting Guidelines". Nature Human Behaviour. 5 (10): 1282–1291. doi:10.1038/s41562-021-01177-7. PMC 8526359. PMID 34400814.
  17. ^ Diaconis, Persi (2011) Theories of Data Analysis: From Magical Thinking Through Classical Statistics. John Wiley & Sons, Ltd 2:e55 doi:10.1002/9781118150702.ch1
  18. ^ Kumar, Ravin; Carroll, Colin; Hartikainen, Ari; Martin, Osvaldo (2019). "ArviZ a unified library for exploratory analysis of Bayesian models in Python". Journal of Open Source Software. 4 (33): 1143. Bibcode:2019JOSS....4.1143K. doi:10.21105/joss.01143. hdl:11336/114615.
  19. ^ Gabry, Jonah; Simpson, Daniel; Vehtari, Aki; Betancourt, Michael; Gelman, Andrew (2019). "Visualization in Bayesian workflow". Journal of the Royal Statistical Society, Series A (Statistics in Society). 182 (2): 389–402. arXiv:1709.01449. doi:10.1111/rssa.12378. S2CID 26590874.
  20. ^ Vehtari, Aki; Gelman, Andrew; Simpson, Daniel; Carpenter, Bob; Bürkner, Paul-Christian (2021). "Rank-Normalization, Folding, and Localization: An Improved R̂ for Assessing Convergence of MCMC (With Discussion)". Bayesian Analysis. 16 (2): 667. arXiv:1903.08008. Bibcode:2021BayAn..16..667V. doi:10.1214/20-BA1221. S2CID 88522683.
  21. ^ Martin, Osvaldo (2018). Bayesian Analysis with Python: Introduction to statistical modeling and probabilistic programming using PyMC3 and ArviZ. Packt Publishing Ltd. ISBN 9781789341652.
