Chain rule (probability)






In probability theory, the chain rule[1] (also called the general product rule[2][3]) describes how to calculate the probability of the intersection of events that are not necessarily independent, or, respectively, the joint distribution of random variables, using conditional probabilities. This rule allows one to express a joint probability in terms of only conditional probabilities.[4] The rule is notably used in the context of discrete stochastic processes and in applications, e.g. the study of Bayesian networks, which describe a probability distribution in terms of conditional probabilities.

Chain rule for events

Two events

For two events $A$ and $B$, the chain rule states that

$$\mathbb{P}(A \cap B) = \mathbb{P}(B \mid A) \, \mathbb{P}(A),$$

where $\mathbb{P}(B \mid A)$ denotes the conditional probability of $B$ given $A$.

Example

Urn A contains 1 black ball and 2 white balls, and urn B contains 1 black ball and 3 white balls. Suppose we pick an urn at random and then select a ball from that urn. Let event $A$ be choosing the first urn, i.e. $\mathbb{P}(A) = \mathbb{P}(\overline{A}) = 1/2$, where $\overline{A}$ is the complementary event of $A$. Let event $B$ be choosing a white ball. The chance of choosing a white ball, given that we have chosen the first urn, is $\mathbb{P}(B \mid A) = 2/3$. The intersection $A \cap B$ then describes choosing the first urn and a white ball from it. The probability can be calculated by the chain rule as follows:

$$\mathbb{P}(A \cap B) = \mathbb{P}(B \mid A) \, \mathbb{P}(A) = \frac{2}{3} \cdot \frac{1}{2} = \frac{1}{3}.$$
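This calculation can be reproduced numerically. The following Python sketch (the data structure and variable names are illustrative, not part of the example) derives $\mathbb{P}(B \mid A)$ from the contents of urn A and applies the two-event chain rule using exact fractions:

```python
from fractions import Fraction

# Urn contents as described in the example above (illustrative data structure).
urns = {
    "A": ["black", "white", "white"],
    "B": ["black", "white", "white", "white"],
}

p_A = Fraction(1, 2)  # P(A): each urn is chosen with probability 1/2
p_B_given_A = Fraction(urns["A"].count("white"), len(urns["A"]))  # P(B | A) = 2/3

# Chain rule for two events: P(A ∩ B) = P(B | A) · P(A)
p_A_and_B = p_B_given_A * p_A
print(p_A_and_B)  # 1/3
```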

Finitely many events

For events $A_1, \ldots, A_n$ whose intersection does not have probability zero, the chain rule states

$$\mathbb{P}(A_1 \cap A_2 \cap \ldots \cap A_n) = \mathbb{P}(A_n \mid A_1 \cap \ldots \cap A_{n-1}) \, \mathbb{P}(A_1 \cap \ldots \cap A_{n-1}) = \cdots = \mathbb{P}(A_1) \, \mathbb{P}(A_2 \mid A_1) \, \mathbb{P}(A_3 \mid A_1 \cap A_2) \cdots \mathbb{P}(A_n \mid A_1 \cap \ldots \cap A_{n-1}).$$

Example 1

For $n = 4$, i.e. four events, the chain rule reads

$$\mathbb{P}(A_1 \cap A_2 \cap A_3 \cap A_4) = \mathbb{P}(A_4 \mid A_3 \cap A_2 \cap A_1) \, \mathbb{P}(A_3 \mid A_2 \cap A_1) \, \mathbb{P}(A_2 \mid A_1) \, \mathbb{P}(A_1).$$

Example 2

We randomly draw 4 cards without replacement from a deck of 52 cards. What is the probability that we have picked 4 aces?

First, we set $A_n := \{\text{draw an ace in the } n\text{th try}\}$. Obviously, we get the following probabilities:

$$\mathbb{P}(A_1) = \frac{4}{52}, \qquad \mathbb{P}(A_2 \mid A_1) = \frac{3}{51}, \qquad \mathbb{P}(A_3 \mid A_1 \cap A_2) = \frac{2}{50}, \qquad \mathbb{P}(A_4 \mid A_1 \cap A_2 \cap A_3) = \frac{1}{49}.$$

Applying the chain rule,

$$\mathbb{P}(A_1 \cap A_2 \cap A_3 \cap A_4) = \frac{4}{52} \cdot \frac{3}{51} \cdot \frac{2}{50} \cdot \frac{1}{49} = \frac{1}{270725}.$$
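As a sanity check, the following Python sketch (illustrative only) multiplies the four conditional probabilities with exact fractions and compares the result with the counting argument that exactly one of the $\binom{52}{4}$ equally likely four-card hands consists of all four aces:

```python
from fractions import Fraction
from math import comb

# Chain rule: multiply P(A_1), P(A_2 | A_1), P(A_3 | A_1 ∩ A_2), P(A_4 | A_1 ∩ A_2 ∩ A_3).
p_chain = Fraction(1)
aces, cards = 4, 52
for k in range(4):
    p_chain *= Fraction(aces - k, cards - k)  # 4/52, 3/51, 2/50, 1/49

# Counting cross-check: exactly one of the C(52, 4) equally likely hands is all aces.
p_counting = Fraction(1, comb(52, 4))

print(p_chain)                # 1/270725
print(p_chain == p_counting)  # True
```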

Statement of the theorem and proof

Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space. Recall that the conditional probability of an $A \in \mathcal{A}$ given $B \in \mathcal{A}$ is defined as

$$\mathbb{P}(A \mid B) := \begin{cases} \dfrac{\mathbb{P}(A \cap B)}{\mathbb{P}(B)}, & \mathbb{P}(B) > 0, \\ 0, & \mathbb{P}(B) = 0. \end{cases}$$

Then we have the following theorem.

Chain rule: Let $(\Omega, \mathcal{A}, \mathbb{P})$ be a probability space. Let $A_1, \ldots, A_n \in \mathcal{A}$. Then

$$\mathbb{P}(A_1 \cap A_2 \cap \ldots \cap A_n) = \mathbb{P}(A_1) \, \mathbb{P}(A_2 \mid A_1) \, \mathbb{P}(A_3 \mid A_1 \cap A_2) \cdots \mathbb{P}(A_n \mid A_1 \cap \ldots \cap A_{n-1}).$$

Proof

The formula follows immediately by recursion:

$$\mathbb{P}(A_1) \, \mathbb{P}(A_2 \mid A_1) \, \mathbb{P}(A_3 \mid A_1 \cap A_2) \cdots \mathbb{P}(A_n \mid A_1 \cap \ldots \cap A_{n-1}) = \mathbb{P}(A_1 \cap A_2) \, \mathbb{P}(A_3 \mid A_1 \cap A_2) \cdots \mathbb{P}(A_n \mid A_1 \cap \ldots \cap A_{n-1}) = \cdots = \mathbb{P}(A_1 \cap A_2 \cap \ldots \cap A_n),$$

where we used the definition of the conditional probability in the first step.
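The theorem can be verified directly on a small finite probability space. In the Python sketch below, the sample space (two fair coin flips and a fair die roll) and the three events are assumed purely for illustration; each conditional probability is computed from the definition above:

```python
from fractions import Fraction
from itertools import product

# A small finite probability space: two fair coin flips and one fair die roll,
# all outcomes equally likely (chosen purely for illustration).
omega = list(product("HT", "HT", range(1, 7)))
prob = {w: Fraction(1, len(omega)) for w in omega}

def P(event):
    """Probability of an event given as a set of outcomes."""
    return sum(prob[w] for w in event)

# Three illustrative events A1, A2, A3.
A1 = {w for w in omega if w[0] == "H"}  # first flip is heads
A2 = {w for w in omega if w[1] == "H"}  # second flip is heads
A3 = {w for w in omega if w[2] >= 5}    # die shows 5 or 6

# Left-hand side: probability of the intersection.
lhs = P(A1 & A2 & A3)

# Right-hand side: P(A1) · P(A2 | A1) · P(A3 | A1 ∩ A2), with each conditional
# probability computed from the definition P(A | B) = P(A ∩ B) / P(B).
rhs = P(A1) * (P(A1 & A2) / P(A1)) * (P(A1 & A2 & A3) / P(A1 & A2))

print(lhs, rhs, lhs == rhs)  # 1/12 1/12 True
```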

Chain rule for discrete random variables

Two random variables

For two discrete random variables $X, Y$, we use the events $A := \{X = x\}$ and $B := \{Y = y\}$ in the definition above, and find the joint distribution as

$$\mathbb{P}(X = x, Y = y) = \mathbb{P}(X = x \mid Y = y) \, \mathbb{P}(Y = y),$$

or

$$\mathbb{P}_{(X,Y)}(x, y) = \mathbb{P}_{X \mid Y}(x \mid y) \, \mathbb{P}_Y(y),$$

where $\mathbb{P}_X(x) := \mathbb{P}(X = x)$ is the probability distribution of $X$ and $\mathbb{P}_{X \mid Y}(x \mid y)$ the conditional probability distribution of $X$ given $Y$.
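As a concrete check, the Python sketch below builds a small, arbitrary joint probability table for two discrete random variables and verifies that every entry factors as $\mathbb{P}(X = x \mid Y = y) \, \mathbb{P}(Y = y)$; the particular numbers are assumptions made only for illustration:

```python
from fractions import Fraction

# A made-up joint distribution of two discrete random variables X and Y
# (the values and probabilities are illustrative only).
joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(1, 4), (1, 1): Fraction(1, 4),
}

def p_Y(y):
    """Marginal distribution P(Y = y)."""
    return sum(p for (x, yy), p in joint.items() if yy == y)

def p_X_given_Y(x, y):
    """Conditional distribution P(X = x | Y = y)."""
    return joint[(x, y)] / p_Y(y)

# Chain rule: P(X = x, Y = y) = P(X = x | Y = y) · P(Y = y) for every pair.
for (x, y), p in joint.items():
    assert p == p_X_given_Y(x, y) * p_Y(y)
print("chain rule holds for every (x, y)")
```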

Finitely many random variables

Let $X_1, \ldots, X_n$ be random variables and $x_1, \ldots, x_n \in \mathbb{R}$. By the definition of the conditional probability,

$$\mathbb{P}(X_n = x_n, \ldots, X_1 = x_1) = \mathbb{P}(X_n = x_n \mid X_{n-1} = x_{n-1}, \ldots, X_1 = x_1) \, \mathbb{P}(X_{n-1} = x_{n-1}, \ldots, X_1 = x_1),$$

and using the chain rule, where we set $A_k := \{X_k = x_k\}$, we can find the joint distribution as

$$\mathbb{P}(X_1 = x_1, \ldots, X_n = x_n) = \prod_{k=1}^{n} \mathbb{P}(X_k = x_k \mid X_{k-1} = x_{k-1}, \ldots, X_1 = x_1).$$

Example

For $n = 3$, i.e. three random variables, the chain rule reads

$$\mathbb{P}(X_1 = x_1, X_2 = x_2, X_3 = x_3) = \mathbb{P}(X_3 = x_3 \mid X_2 = x_2, X_1 = x_1) \, \mathbb{P}(X_2 = x_2 \mid X_1 = x_1) \, \mathbb{P}(X_1 = x_1).$$
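The three-variable factorization can likewise be checked on a toy example. In the Python sketch below, the joint distribution of three binary variables and the helper functions marginal and conditional are hypothetical, introduced only to illustrate the formula:

```python
from fractions import Fraction
from itertools import product

# A hypothetical joint distribution of three binary random variables X1, X2, X3,
# stored as P(X1 = x1, X2 = x2, X3 = x3).
joint = {xs: Fraction(1, 8) for xs in product((0, 1), repeat=3)}
joint[(0, 0, 0)] = Fraction(1, 4)  # shift mass so the variables are dependent
joint[(1, 1, 1)] = Fraction(0)
assert sum(joint.values()) == 1

def marginal(prefix):
    """P(X1 = prefix[0], ..., Xk = prefix[k-1]) for a prefix of values."""
    k = len(prefix)
    return sum(p for ys, p in joint.items() if ys[:k] == tuple(prefix))

def conditional(value, prefix):
    """P(X_{k+1} = value | X1 = prefix[0], ..., Xk = prefix[-1])."""
    return marginal(tuple(prefix) + (value,)) / marginal(prefix)

# Chain rule for n = 3:
# P(X1, X2, X3) = P(X3 | X2, X1) · P(X2 | X1) · P(X1)
x1, x2, x3 = 0, 0, 1
lhs = joint[(x1, x2, x3)]
rhs = conditional(x3, (x1, x2)) * conditional(x2, (x1,)) * marginal((x1,))
print(lhs, rhs, lhs == rhs)  # 1/8 1/8 True
```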

References

  1. Schilling, René L. (2021). Measure, Integral, Probability & Processes - Probab(ilistical)ly the Theoretical Minimum. Technische Universität Dresden, Germany. p. 136ff. ISBN 979-8-5991-0488-9.
  2. Schum, David A. (1994). The Evidential Foundations of Probabilistic Reasoning. Northwestern University Press. p. 49. ISBN 978-0-8101-1821-8.
  3. Klugh, Henry E. (2013). Statistics: The Essentials for Research (3rd ed.). Psychology Press. p. 149. ISBN 978-1-134-92862-0.
  4. Virtue, Pat. "10-606: Mathematical Foundations for Machine Learning" (PDF).
