Study Guide

So far, we've been playing with a bunch of statistics topics. Now it's time to move on to something that'll help you win at cards and games of chance: **probability.**

Probability is where we find the words "likelihood'' and "chance.'' Probability is also useful for figuring out the strength of your poker hand. Although if you're still using that card that shows what beats what, you may not be ready for probability just yet.

### Outcomes and Events

Whenever we do an experiment like flipping a coin or rolling a die, we get an **outcome**. For example, if we flip a coin we get an outcome of heads or tails, and if we roll a die we get an outcome of 1, 2, 3, 4, 5, or 6. Unless we're rolling a 20-sided die, in which case we're likely playing Dungeons & Dragons, and the outcome is that we won't go on a date for a few years yet. Ouch.

We call the set of all possible outcomes of an experiment the **sample space**. The sample space for the experiment of flipping a coin is

{heads, tails}

and the sample space for the experiment of rolling a die is

{1, 2, 3, 4, 5, 6}.

An **event** is a set of outcomes. The event of rolling an even number with a die is the set

{2, 4, 6}.

If you were looking for odd numbers, it wouldn't be called an "oddent." Still called an "event." Sorry if that's confusing.

Each of the outcomes in this set is an even number, so if we get any of the outcomes in this set we have successfully rolled an even number. Come on...cat's eyes!

An experiment is called **random** or **fair** if any outcome is equally likely. Unlike the grand experiment that is life, which is both random and *not* fair.

If we flip a fair coin, it means either heads or tails is equally likely. No weighted coins allowed, Mr. Trickster Man. If we draw a card at random from a deck, it means any one of the 52 cards (assuming no jokers) is equally likely to be drawn.

When we talk about finding probabilities, we mean finding the likelihood of events. They're different than the skills certain aliens possess, which are generally referred to as "probe abilities." Important distinction.

If an experiment is random/fair, the probability of an event is the number of favorable outcomes divided by the total number of possible outcomes:

(probability of an event) = (number of favorable outcomes)/(total number of possible outcomes)

A **favorable outcome** is any outcome in the event whose probability you're finding (remember, an event is a set).

### Sample Problem

If you roll a standard 6-sided die, assuming each side is equally likely to land upwards, the probability of rolling a 1 is

1/6.

### Sample Problem

What's the probability of rolling an even number on a 6-sided die?

If you're finding the probability of the event of rolling an even number, any even number is considered a favorable outcome. Especially if it means you can move ahead four spaces and buy Boardwalk.

The probability of rolling an even number on a standard 6-sided die is

(number of ways to roll an even number)/(total number of possible outcomes).

Since there are 3 ways to roll an even number on a standard die, the probability of rolling an even number is

3/6 = 1/2.
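If you'd rather let a computer do the counting, here's a short Python sketch of the favorable-over-total recipe (the variable names are ours):

```python
from fractions import Fraction

# Sample space for one roll of a standard 6-sided die
sample_space = [1, 2, 3, 4, 5, 6]

# Favorable outcomes for the event "roll an even number"
favorable = [n for n in sample_space if n % 2 == 0]

# Probability = favorable outcomes / total outcomes
p_even = Fraction(len(favorable), len(sample_space))
print(p_even)  # 1/2
```

`Fraction` keeps everything as exact fractions, so 3/6 reduces to 1/2 automatically.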

This makes intuitive sense, since half the numbers on a die are even, and half are odd. Although we can't be entirely sure that's true, as we've never been able to look at all six sides at once, and we're always suspicious they keep changing on us when we aren't looking. Okay, so maybe we're paranoid.

### Important Elements

The probability of an event

- is a fraction: (number of favorable outcomes)/(total number of possible outcomes)

- is non-negative (that is, 0 or greater). There's no such thing as a negative number of outcomes.

- cannot be more than 1. There can't be more favorable outcomes than there are possible outcomes, so bring the optimism down a notch.

- is 0 if there are no favorable outcomes; that is, if the event is impossible.

- is 1 if every outcome is favorable; that is, if the event is definitely gonna happen.

- is a fraction or decimal between 0 and 1 (Yes, we know we're repeating ourselves.)

- Also, any fraction or decimal between 0 and 1 can be a probability.

### Sample Problem

What is the probability of rolling 7 with a standard 6-sided die?

There are no favorable outcomes. Not unless you're standing at the counter of a magic shop and testing out one of their trick dice: 7 isn't one of the possible outcomes when we roll an everyday, run-of-the-mill die. Therefore,

(probability of rolling 7) = 0/6 = 0,

so the probability of rolling 7 is 0. The good news is that, as long as you're rolling only one die, you can't crap out.

If an event has a probability of 1, it means the event is absolutely, positively guaranteed to happen when you do the corresponding experiment (for example, rolling a number less than 7 on a die). If an event has a probability of 0, that event can absolutely, positively *not* happen when you do the experiment (for example, rolling 7 on a die). For another example, the television show *The Event* has a 0 probability of winning an Emmy this year, since it's no longer on the air. Voters tend to show their support for programs that weren't so bad they got canceled.

We recommend keeping this in mind when doing problems. If you're asked the probability of an event you *know* can't happen, you know the probability is 0, so you don't need to worry about counting favorable outcomes. If you're asked the probability of an event that *has* to happen, you know the probability is 1, so you don't need to worry about counting favorable outcomes. For example, you know your dad will wake you up tomorrow morning using that obnoxious Donald Duck voice, so the probability is automatically 1. When possible, do things the easy way.

### Odds

The probability of an event is written as a fraction:

(probability of an event) = (number of favorable outcomes)/(total number of possible outcomes)

This probability tells us how likely an event is to happen.

**Odds** are another way of conveying the same information, or another way of saying how likely an event is to happen. Instead of comparing the number of favorable outcomes to the total number of outcomes, we compare the number of favorable and unfavorable outcomes. An **unfavorable** outcome is any outcome not in the event we're looking at. Try to keep this straight from an unflavorable outcome, which is one that's bland and tasteless.

Here's how we write the odds **in favor of** an event:

(number of favorable outcomes):(number of unfavorable outcomes)

The odds **against** an event are:

(number of unfavorable outcomes):(number of favorable outcomes)

### Sample Problem

What are the odds in favor of rolling a 4 with a fair die?

There's 1 favorable outcome (rolling a 4) and there are 5 unfavorable outcomes (rolling anything else). The odds in favor of rolling 4 are 1:5.

Not that we want to encourage you to gamble, but we wouldn't lay even money on that proposition if we were you.

Since an event must either happen or not happen, if we add up the number of favorable and unfavorable outcomes, we get the total number of outcomes. Therefore, we can go from odds to probability, or from probability to odds. If we'll be doing both, it only makes sense to purchase a round-trip ticket ahead of time.

### Sample Problem

If an event has probability 3/4, the number of favorable outcomes is 3 and the number of total outcomes is 4. There's only 1 outcome left to be unfavorable. The odds in favor of the event are 3:1, and the odds against the event are 1:3. Oh, and by the way, there's no such thing as luck. Your odds of losing a coin flip are not higher than someone else's because the world hates you. Sorry to disappoint.

### Sample Problems

If the odds in favor of an event are 1:2, there's 1 favorable outcome and 2 unfavorable outcomes, meaning there are 3 total outcomes, so the probability of the event is

1/3.

If the odds against an event are 4:5, there are 5 favorable outcomes and 4 unfavorable outcomes, for a total of 9 possible outcomes. The probability of the event is

5/9.

By the way, a set of odds can be reduced just like a fraction. If the odds of acing your math midterm are 15:20, you can simplify that to 3:4. Good luck.
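Going back and forth between odds and probability is mechanical enough to script. Here's a Python sketch (the function names are ours, not standard):

```python
from fractions import Fraction

def probability_from_odds_in_favor(favorable, unfavorable):
    """Odds in favor f:u -> probability f / (f + u)."""
    return Fraction(favorable, favorable + unfavorable)

def odds_in_favor_from_probability(p):
    """Probability f/t -> odds in favor f:(t - f)."""
    return (p.numerator, p.denominator - p.numerator)

# Odds in favor of 1:2 -> probability 1/3
print(probability_from_odds_in_favor(1, 2))  # 1/3

# Probability 3/4 -> odds in favor 3:1
print(odds_in_favor_from_probability(Fraction(3, 4)))  # (3, 1)

# 15:20 odds round-trip to the reduced 3:4
print(odds_in_favor_from_probability(probability_from_odds_in_favor(15, 20)))  # (3, 4)
```

Because `Fraction` reduces automatically, odds like 15:20 come back in lowest terms as 3:4, just like the midterm example.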

### Compound Events

A **compound event** is an event that can be described in terms of simpler events, not a riot that occurs on prison grounds. A compound event will often, but not always, involve multiple experiments. Outcomes that result from combining two experiments are written like ordered pairs:

(outcome of first experiment, outcome of second experiment)

For instance, if Dr. Frankenstein tried bringing his creation to life AND curing its acne in one fell swoop, he'd be engaging in a compound event.

### Sample Problem

If we roll a die and flip a coin, the sample space is

{(1, H), (1, T), (2, H), (2, T), (3, H), (3, T), (4, H), (4, T), (5, H), (5, T), (6, H), (6, T)}.

"Rolling an even number and landing a head on the coin flip" is a compound event, since this event can be described in terms of the two simpler events "rolling an even number" and "landing a head on the coin flip."
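That 12-outcome sample space can be generated rather than typed out by hand. A quick Python sketch (names are ours):

```python
from itertools import product

# Every (die, coin) pair from rolling a die and flipping a coin
sample_space = list(product([1, 2, 3, 4, 5, 6], ["H", "T"]))

print(len(sample_space))  # 12
print(sample_space[:2])   # [(1, 'H'), (1, 'T')]
```

`itertools.product` pairs every outcome of the first experiment with every outcome of the second, which is exactly how compound sample spaces are built.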

### Independent and Dependent Events

Suppose we have a jar with 10 pieces of chocolate candy and 5 pieces of vanilla candy. Clearly, the chocolate candy is far superior, which is why we went out and bought twice as many of them.

We take one piece of candy at random from the jar, put it back, then take a second piece of candy at random from the jar. The event of selecting first chocolate and then vanilla candy is a compound event, since this is made up of two events (taking a chocolate candy first, and taking a vanilla candy second). The math would be easiest if we simply took and ate all 15 pieces of candy, but we don't want to ruin our appetite.
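We can count a compound event like this by brute force: list every equally likely (first draw, second draw) pair and count the favorable ones. A Python sketch (names are ours):

```python
from fractions import Fraction
from itertools import product

# 10 chocolate (C) and 5 vanilla (V) candies, drawn twice WITH replacement
candies = ["C"] * 10 + ["V"] * 5
draws = list(product(candies, repeat=2))  # 15 * 15 = 225 equally likely pairs

# Favorable: chocolate first, vanilla second
favorable = [d for d in draws if d == ("C", "V")]

p = Fraction(len(favorable), len(draws))
print(p)  # 2/9
```

There are 10 × 5 = 50 favorable pairs out of 225, and 50/225 reduces to 2/9.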

### Sample Problem

Take that jar with 10 pieces of chocolate candy and 5 pieces of vanilla candy. We take one piece of candy at random from the jar, put it back, then take a second piece of candy at random from the jar.

- What is the probability of the first candy being chocolate?

- What is the probability of the first candy being vanilla?

- What is the probability of the second candy being chocolate?

- What is the probability of the second candy being vanilla?

- What is the probability of us actually being able to put that piece of candy back once we have it in our grasp?

Answers:

- Each of the 15 candies is equally likely, and 10 of them are chocolate, so the probability of the first candy being chocolate is 10/15 = 2/3.

- There are 5 vanilla candies out of 15, so the probability of the first candy being vanilla is 5/15 = 1/3.

- Since we put the first candy back, this is the same as the probability of the first candy being chocolate: 2/3.

- Since we put the first candy back, this is the same as the probability of the first candy being vanilla: 1/3.

- Not good...have you *tasted* these things?

The two events in the experiment above (selecting chocolate first and vanilla second) are **independent**. When you finally move out of your parents' house and are "independent" yourself, you'll be able to eat all of the chocolate and vanilla candy you like.

Intuitively, we know the two events have nothing to do with each other. The probability of selecting vanilla second is the same whether or not the first candy is chocolate. Generalizing this idea, two events are independent if the probability of one event happening stays the same whether the other event happens or not. Case in point, the chances of the Seahawks winning on Sunday are independent of which socks you decide to wear. Despite what you may argue to your less superstitious comrades.

Now let's change the rules of the experiment and see what happens.

### Sample Problem

Suppose we have a jar with 10 pieces of chocolate candy and 5 pieces of vanilla candy. We take one piece of candy at random from the jar, *eat it*, and then take a second piece of candy at random from the jar. Ooh, we like where we're going with this.

- IF the first candy is chocolate, what is the probability of the second candy being chocolate?

- IF the first candy is chocolate, what is the probability of the second candy being vanilla?

- IF the first candy is vanilla, what is the probability of the second candy being vanilla?

And your answers:

- After eating one chocolate candy there are now 9 pieces of chocolate and 5 pieces of vanilla candy in the jar, so the probability of getting chocolate is now 9/14.

- After eating one chocolate candy there are 9 pieces of chocolate and 5 pieces of vanilla candy left, so the probability of getting vanilla is 5/14.

- After eating one vanilla candy there are 10 pieces of chocolate and 4 pieces of vanilla candy left, so the probability of getting vanilla is 4/14 = 2/7.
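These without-replacement counts are easy to check with a quick Python sketch (variable names are ours):

```python
from fractions import Fraction

chocolate, vanilla = 10, 5  # candies in the jar

# After eating one chocolate, 9 chocolate + 5 vanilla remain (14 total)
p_second_choc_given_choc = Fraction(chocolate - 1, chocolate - 1 + vanilla)
p_second_van_given_choc = Fraction(vanilla, chocolate - 1 + vanilla)

# After eating one vanilla, 10 chocolate + 4 vanilla remain (14 total)
p_second_van_given_van = Fraction(vanilla - 1, chocolate + vanilla - 1)

print(p_second_choc_given_choc)  # 9/14
print(p_second_van_given_choc)   # 5/14
print(p_second_van_given_van)    # 2/7
```

Notice the denominators change from 15 to 14 once a candy is eaten; that shrinking jar is exactly what makes the events dependent.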

Now the two events (selecting chocolate first, selecting vanilla second) are **dependent**. The probability of selecting vanilla second depends on whether the first candy was chocolate. Similarly, the chances of the Seahawks winning on Sunday are dependent on whether or not you decide to kidnap their star quarterback. Just kidding...the Seahawks don't have a star quarterback. Look, we said we have a lot of feelings about football.

Formally, we say two events *A* and *B* are **independent** if

(probability *A* occurs AND *B* occurs) = (probability *A* occurs) × (probability *B* occurs).

We don't need to say it while wearing a tuxedo. We don't need to be *that* formal.

### Sample Problem

Let *A* be the event of rolling 1 on a die and *B* be the event of flipping tails on a coin. Then events *A* and *B* are independent.

Look at the sample space for the experiment where we roll a die and flip a coin:

{(1, H), (1, T), (2, H), (2, T), (3, H), (3, T), (4, H), (4, T), (5, H), (5, T), (6, H), (6, T)}

There's one favorable outcome for the compound event of *A* (rolling 1) and *B* (flipping tails), so

(probability *A* and *B*) = 1/12.

Now look at the probabilities of the individual events:

(probability *A*) = 1/6 and (probability *B*) = 1/2.

Since

1/6 × 1/2 = 1/12,

which is the same as the probability we found for the compound event, we conclude that events *A* and *B* are independent.

### Sample Problem

If we roll two dice, the event of rolling 5 on the first die and the event of the numbers on the two dice summing to 8 are dependent.

It might help to look at the possible sums when we roll two dice. The numbers going down the side of the below chart correspond to the first die, and the numbers going across the top correspond to the second die. Just be glad we're not using that 20-sided die we mentioned earlier.

|       | 1 | 2 | 3 | 4 | 5 | 6 |
|-------|---|---|---|---|---|---|
| **1** | 2 | 3 | 4 | 5 | 6 | 7 |
| **2** | 3 | 4 | 5 | 6 | 7 | 8 |
| **3** | 4 | 5 | 6 | 7 | 8 | 9 |
| **4** | 5 | 6 | 7 | 8 | 9 | 10 |
| **5** | 6 | 7 | 8 | 9 | 10 | 11 |
| **6** | 7 | 8 | 9 | 10 | 11 | 12 |

The compound event of rolling 5 on the first die and the numbers summing to 8 has only one favorable outcome, out of 36 total: the outcome (5, 3). Therefore,

(probability of rolling 5 first AND summing to 8) = 1/36.

Not great odds. Hope you didn't bet the farm on that, or we know some cows and chickens who will be very unhappy.

Now we look at the individual events. Rolling 5 on the first die has 6 favorable outcomes out of 36, since the second die can show anything, so

(probability of rolling 5 on the first die) = 6/36 = 1/6.

To check how many ways the numbers on the dice can sum to 8, we look at the table again: the favorable outcomes are (2, 6), (3, 5), (4, 4), (5, 3), and (6, 2).

Since there are 5 ways for the numbers on the dice to sum to 8 out of 36 possible outcomes,

(probability of the numbers summing to 8) = 5/36.

Finally, we check the independence condition:

1/6 × 5/36 = 5/216.

This is *not* the same as 1/36, so the events are NOT independent; they're dependent. In other words, you can claim them on your tax return. #oldpeoplejokes
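One way to double-check this is to enumerate all 36 outcomes in Python and test the independence condition directly (a sketch; the names `p_A`, `p_B`, and so on are ours):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (first die, second die) outcomes
space = list(product(range(1, 7), repeat=2))

A = [o for o in space if o[0] == 5]                      # first die shows 5
B = [o for o in space if sum(o) == 8]                    # the two dice sum to 8
both = [o for o in space if o[0] == 5 and sum(o) == 8]   # both at once

p_A = Fraction(len(A), len(space))        # 6/36 = 1/6
p_B = Fraction(len(B), len(space))        # 5/36
p_both = Fraction(len(both), len(space))  # 1/36

# Independent would require p_both == p_A * p_B; here 1/36 != 5/216
print(p_both == p_A * p_B)  # False -> dependent
```

The same enumeration trick works for any finite experiment: list the sample space, count, compare.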

### Mutually Exclusive Events
### Mutually Exclusive Events

These aren't parties that only mutual fund managers are allowed to attend. Don't worry; you aren't missing much. Not too much roof-raising goes on at those shindigs.

Sometimes two events can't happen at the same time. For example, we can't roll one die and get both an odd number and an even number on the same roll. We also can't roll two dice and get 1 on the first die, but a sum of 8 from the two dice. When events disagree like that, we call them **mutually exclusive**. We try not to call them that right to their faces, however, as we've already mentioned they're sort of disagreeable.

When two events *A* and *B* are mutually exclusive, to find the probability that either *A* or *B* happens, we add the probabilities of *A* and *B*. While this may seem like a new trick, we must confess we're being sneaky; you've actually done this already when finding probabilities like the one in the next example. We won't draw out the suspense any longer...here she is:

### Sample Problem

What's the probability of rolling a 5 or a 6 on a die?

We can't roll 5 and 6 at the same time, so these events are mutually exclusive. The probability of rolling 5 OR 6 on a die is then

1/6 + 1/6 = 2/6 = 1/3.
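As a sanity check, here's the same addition in Python (names are ours):

```python
from fractions import Fraction

p_5 = Fraction(1, 6)  # probability of rolling 5
p_6 = Fraction(1, 6)  # probability of rolling 6

# Rolling 5 and rolling 6 can't both happen on one roll,
# so for mutually exclusive events the probabilities simply add
p_5_or_6 = p_5 + p_6
print(p_5_or_6)  # 1/3
```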

### Sample Problem

The probability that Jenny wears purple shoes is . The probability that Jenny wears green shoes is . What is she, prepping for Mardi Gras? If Jenny can only wear one color of shoes, what is the probability that she wears either green or purple shoes?

She can't wear green and purple at once, no matter how many feet she has. The event of wearing purple shoes and the event of wearing green shoes are mutually exclusive. Good thing, because we feel like we saw this exact same issue addressed once on *What Not to Wear*.

To find the probability that either one happens, we add the individual probabilities:

When we talk about finding the probability that "*A* or *B*" happens, what we actually mean is the probability that *A* happens, *B* happens, or both *A* and *B* happen. When the events *A* and *B* are mutually exclusive it's impossible for both *A* and *B* to happen, so we happen upon this nice shortcut:

(probability *A* or *B* happens) = (probability of *A*) + (probability of *B*)

When events *A* and *B* are not mutually exclusive, both *A* and *B* *could* happen. Fingers crossed. This means (for now), when *A* and *B* are not mutually exclusive, we're going to go back to the formula

(probability of an event) = (number of favorable outcomes)/(total number of possible outcomes).

Here's a useful tidbit about probability: for an event *A*,

(probability *A* happens) + (probability *A* doesn't happen) = 1.

Event *A* and Event "not *A*" are mutually exclusive: we can't have *A* happen and not happen at the same time. The only time such a paradox was created that we can think of was when Keanu Reeves became an actor. Since they're mutually exclusive, our shortcut applies:

(probability *A* happens or "not *A*" happens) = (probability *A* happens) + (probability "not *A*" happens).

On the other hand, look at the left-hand side of that equation:

(probability *A* happens or "not *A*" happens).

What are the favorable outcomes for this event? No matter what, either *A* happens, or it doesn't. Every outcome is favorable. We have what we like to call a "win-win." Eat your heart out, Charlie Sheen. Therefore,

(probability *A* happens or "not *A*" happens) = 1.

Since (probability *A* happens or "not *A*" happens) always equals 1, we can rewrite the equation

(probability *A* happens or "not *A*" happens) = (probability *A* happens) + (probability "not *A*" happens)

as

1 = (probability *A* happens) + (probability "not *A*" happens).

Hold that formula in your brain for a sec. We'll make use of it, promise.

### Sample Problem

Let *A* be the event of rolling an odd number on a die. Then "not *A*" is the event of rolling an even number.

It's useful to know

1 = (probability *A* happens) + (probability "not *A*" happens)

(told ya!), because if we rearrange the equation by subtracting (probability *A* happens) from each side, we get

1 – (probability *A* happens) = (probability "not *A*" happens).

If we know the probability that an event happens, we also know the probability that the event doesn't happen: just subtract it from 1. Again, this discovery may seem intuitive to you, but it's always nice to back up intuition with some good old-fashioned formulas.
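The subtract-from-1 trick is a one-liner in code. A small Python sketch using the odd/even die example (names are ours):

```python
from fractions import Fraction

p_odd = Fraction(3, 6)   # probability of rolling an odd number (event A)
p_not_odd = 1 - p_odd    # complement: probability of rolling an even number

print(p_not_odd)  # 1/2

# The two probabilities always account for everything
print(p_odd + p_not_odd)  # 1
```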