# Joint Distributions

The distribution of a random variable is sometimes called its **marginal** distribution, with the term *marginal* emphasizing that the distribution includes information about only a single random variable. If we are interested in two random variables $X$ and $Y$, it is often important to consider their *joint* distribution, which captures probabilistic information about where the pair $(X,Y)$ falls in $\mathbb{R}^2$.

**Definition**

If $X$ and $Y$ are two random variables defined on the same probability space, then the **joint distribution** of $X$ and $Y$ is the measure on $\mathbb{R}^2$ which assigns to each set $A \subset \mathbb{R}^2$ the value $\mathbb{P}((X,Y) \in A)$.

**Example**

Consider the two-fair-coin-flip experiment, and let $X$ be the number of heads in the first flip and $Y$ the number of heads in the second flip. Let $Z$ be the number of tails in the first flip.

Show that $X$, $Y$, and $Z$ all have the same marginal distribution, but that the pairs $(X,Y)$ and $(X,Z)$ have different joint distributions.

*Solution.* The random variables all have the same distribution, because each can be $0$ or $1$ with probability $\frac{1}{2}$ each. On the other hand, the pair $(X,Y)$ can take the values $(0,0)$, $(0,1)$, $(1,0)$, and $(1,1)$ with equal probability $\frac{1}{4}$, while $(X,Z)$ can only be either $(0,1)$ or $(1,0)$, each with probability $\frac{1}{2}$.
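The claim above can be verified by brute-force enumeration of the four equally likely outcomes. Here is a short Python sketch using exact arithmetic via `fractions`:

```python
from fractions import Fraction
from collections import Counter

# Enumerate the four equally likely outcomes of two fair coin flips;
# each flip records 1 for heads and 0 for tails.
joint_XY = Counter()  # X = heads count in flip 1, Y = heads count in flip 2
joint_XZ = Counter()  # Z = tails count in flip 1, so Z = 1 - X
for a in (0, 1):
    for b in (0, 1):
        X, Y, Z = a, b, 1 - a
        joint_XY[(X, Y)] += Fraction(1, 4)
        joint_XZ[(X, Z)] += Fraction(1, 4)

def marginals(joint):
    """Recover the two marginal distributions from a joint distribution."""
    mx, my = Counter(), Counter()
    for (x, y), p in joint.items():
        mx[x] += p
        my[y] += p
    return mx, my

# The marginals agree, but the joint distributions do not:
print(marginals(joint_XY) == marginals(joint_XZ))  # True
print(joint_XY == joint_XZ)                        # False
```

The pair $(X,Y)$ puts mass $\frac{1}{4}$ on all four points, while $(X,Z)$ puts mass $\frac{1}{2}$ on just $(0,1)$ and $(1,0)$, yet every marginal works out to the same fair-coin distribution.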

This exercise shows that the joint distribution of two random variables provides information not present in the marginal distributions alone. Conversely, the marginal distributions of two random variables may be recovered from their joint distribution:

**Exercise**

Consider a computer program which rolls two virtual dice and returns roll results with probabilities shown in the table.

The probability that die 1 shows 4 is

*Solution.* The event that the first die shows 4 can be written as a disjoint union of the events $\{\text{die 1 shows 4 and die 2 shows } j\}$, where $j$ ranges over the integers 1 to 6. We get

$$\mathbb{P}(\text{die 1 shows } 4) = \sum_{j=1}^{6} \mathbb{P}(\text{die 1 shows 4 and die 2 shows } j).$$
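Since the original probability table isn't reproduced here, the sketch below assumes a uniform joint distribution (each of the 36 pairs equally likely) purely to illustrate the marginalization step:

```python
from fractions import Fraction

# Hypothetical joint table (the table from the exercise isn't reproduced
# here): assume each pair (i, j) has probability 1/36.
joint = {(i, j): Fraction(1, 36) for i in range(1, 7) for j in range(1, 7)}

# Marginal probability: P(die 1 = 4) = sum over j of P((4, j))
p_die1_is_4 = sum(joint[(4, j)] for j in range(1, 7))
print(p_die1_is_4)  # 1/6 under the uniform assumption
```

With the actual table, the same sum over the six entries with first coordinate 4 gives the answer.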

**Exercise**

Determine which of the following joint distributions on $\mathbb{R}^2$ has the property that the two random variables $X$ and $Y$ have the same marginal distribution. (Note: each disk indicates a probability mass at a point, with the size of the disk proportional to the mass at that point.)

*Solution.* We find the distribution of $X$ by summing the joint distribution along vertical lines, and we obtain the distribution of $Y$ by summing along horizontal lines. Only for the third distribution do these two procedures give the same results.
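The same row-and-column summation can be carried out programmatically. The two point-mass distributions below are hypothetical stand-ins (the figures aren't reproduced here): one has equal marginals and one does not.

```python
from fractions import Fraction
from collections import Counter

def marginals(joint):
    # Sum along vertical lines (fixed x) for X's marginal and
    # along horizontal lines (fixed y) for Y's marginal.
    mx, my = Counter(), Counter()
    for (x, y), p in joint.items():
        mx[x] += p
        my[y] += p
    return mx, my

# Hypothetical stand-ins for the figures: a point-mass distribution that is
# symmetric across the line y = x ...
symmetric = {(0, 0): Fraction(1, 2), (1, 2): Fraction(1, 4), (2, 1): Fraction(1, 4)}
# ... and one that is not.
asymmetric = {(0, 1): Fraction(1, 2), (1, 2): Fraction(1, 2)}

mx, my = marginals(symmetric)
print(mx == my)  # True: equal marginals
mx, my = marginals(asymmetric)
print(mx == my)  # False: X puts mass on {0, 1}, Y on {1, 2}
```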

**Exercise**

For each of the three joint distributions in the previous exercise, the probability that $X < Y$ is equal to

*Solution.* Since all of the probability mass is in the first quadrant, both $X$ and $Y$ are positive with probability 1. The probability that $X < Y$ is the total amount of probability mass in the region of the plane above the line $y = x$. The figure with the most mass in that region is the first one.
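This "mass above the line $y = x$" computation is easy to carry out for any point-mass distribution. The point masses below are assumed for illustration, since the original figures aren't reproduced here:

```python
from fractions import Fraction

# Hypothetical point masses in the first quadrant (assumed for illustration;
# the figures from the exercise aren't reproduced here).
joint = {(1, 2): Fraction(1, 2), (2, 1): Fraction(1, 4), (2, 3): Fraction(1, 4)}

# P(X < Y) is the total mass strictly above the line y = x.
p_X_less_Y = sum(p for (x, y), p in joint.items() if x < y)
print(p_X_less_Y)  # 3/4 for this assumed distribution
```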