40 Probability Concepts to Remember

Posted on Sep 13, 2024 @ 08:14 AM under Probability

1. Probability

  • Definition: The likelihood or chance of an event occurring, expressed as a number between 0 and 1.
  • Example: The probability of rolling a 3 on a fair six-sided die is $\frac{1}{6} $ because there are six possible outcomes and only one favorable outcome.
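
One way to sanity-check a probability like this is a quick Monte Carlo simulation. A minimal Python sketch (the trial count and seed are arbitrary choices):

```python
import random

random.seed(0)  # fixed seed for reproducibility
n_trials = 100_000
hits = sum(1 for _ in range(n_trials) if random.randint(1, 6) == 3)
estimate = hits / n_trials
print(f"Estimated P(roll a 3) = {estimate:.3f}, theory = {1/6:.3f}")
```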

2. Event

  • Definition: A specific outcome or combination of outcomes of a random experiment.
  • Example: Rolling an even number on a die (2, 4, or 6) is an event.

3. Sample Space

  • Definition: The set of all possible outcomes of a random experiment.
  • Example: For a single roll of a six-sided die, the sample space is {1, 2, 3, 4, 5, 6}.

4. Outcome

  • Definition: A possible result of a random experiment.
  • Example: Getting a 4 when rolling a die is an outcome.

5. Experiment

  • Definition: A procedure or action that results in one or more outcomes.
  • Example: Rolling a die is an experiment.

6. Random Variable

  • Definition: A variable that takes on different numerical values based on the outcome of a random experiment.
  • Example: The number shown on a die roll is a random variable.

7. Discrete Random Variable

  • Definition: A random variable that can take on a countable number of distinct values.
  • Example: The number of heads in 3 coin flips (0, 1, 2, or 3) is a discrete random variable.
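
Because the values are countable, the full distribution of this random variable can be computed by brute-force enumeration of all $2^3$ equally likely flip sequences:

```python
from itertools import product

# All 2**3 equally likely outcomes of 3 coin flips
outcomes = list(product("HT", repeat=3))
pmf = {k: sum(o.count("H") == k for o in outcomes) / len(outcomes)
       for k in range(4)}
print(pmf)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```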

8. Continuous Random Variable

  • Definition: A random variable that can take on any value within a continuous range — uncountably many possible values, rather than a countable list.
  • Example: The height of a person is a continuous random variable because it can be any value within a range.

9. Probability Distribution

  • Definition: A function that describes the likelihood of different outcomes of a random variable.
  • Example: For a fair die, the probability distribution assigns a probability of $\frac{1}{6}$ to each number from 1 to 6.

10. Probability Mass Function (PMF)

  • Definition: A function that gives the probability that a discrete random variable is exactly equal to a specific value.
  • Example: For a fair six-sided die, the PMF is $ P(X=x) = \frac{1}{6} $ for $ x = 1, 2, 3, 4, 5, 6 $.

11. Probability Density Function (PDF)

  • Definition: A function that describes the probability of a continuous random variable falling within a particular range of values.
  • Example: For a normally distributed height, the PDF describes how likely it is for a person’s height to fall within a certain range.

12. Cumulative Distribution Function (CDF)

  • Definition: A function that gives the probability that a random variable is less than or equal to a certain value.
  • Example: For a die roll, the CDF at 4 — the probability of getting a value of 4 or less — is $\frac{4}{6} = \frac{2}{3}$, or approximately 0.67.
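
A CDF is just a running sum of the PMF. A small sketch for the die, using exact fractions to avoid rounding:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair six-sided die

def cdf(v):
    """P(X <= v) for the die roll X."""
    return sum(p for x, p in pmf.items() if x <= v)

print(cdf(4))  # 2/3
```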

13. Expected Value (Mean)

  • Definition: The long-term average or mean value of a random variable over many trials.
  • Example: The expected value of a fair die roll is $\frac{1+2+3+4+5+6}{6} = 3.5$.

14. Variance

  • Definition: A measure of how much the values of a random variable differ from the expected value.
  • Example: For a fair six-sided die, the variance can be calculated as $\frac{(1-3.5)^2 + (2-3.5)^2 + (3-3.5)^2 + (4-3.5)^2 + (5-3.5)^2 + (6-3.5)^2}{6} \approx 2.92 $.

15. Standard Deviation

  • Definition: The square root of the variance, representing the average distance of each value from the mean.
  • Example: For the same fair die, the standard deviation is $\sqrt{2.92} \approx 1.71$.
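
The die-roll mean, variance, and standard deviation above can be checked with Python's standard `statistics` module (using the population formulas, since all six faces are equally likely):

```python
import statistics

faces = [1, 2, 3, 4, 5, 6]
mean = statistics.mean(faces)      # expected value: 3.5
var = statistics.pvariance(faces)  # population variance: 35/12 ≈ 2.92
std = statistics.pstdev(faces)     # square root of the variance ≈ 1.71
print(mean, round(var, 2), round(std, 2))  # 3.5 2.92 1.71
```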

16. Independent Events

  • Definition: Two events are independent if the occurrence of one does not affect the probability of the other occurring.
  • Example: Rolling a die and flipping a coin are independent events because the outcome of one does not affect the outcome of the other.

17. Dependent Events

  • Definition: Two events are dependent if the occurrence of one event affects the probability of the other occurring.
  • Example: Drawing two cards from a deck without replacement; the outcome of the first draw affects the probabilities for the second draw.

18. Conditional Probability

  • Definition: The probability of an event occurring given that another event has already occurred, written $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$.
  • Example: The probability of drawing an ace from a deck, given that a card drawn is a spade, is $\frac{1}{13} $ because there are 13 spades and 1 ace among them.
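
The spade example can be verified by direct counting. In the sketch below, ranks run 1–13 with 1 standing for the ace, and the suit letters are arbitrary labels:

```python
# Build a 52-card deck as (rank, suit) pairs
deck = [(rank, suit) for rank in range(1, 14) for suit in "SHDC"]
spades = [card for card in deck if card[1] == "S"]
aces = [card for card in spades if card[0] == 1]
p_ace_given_spade = len(aces) / len(spades)
print(p_ace_given_spade)  # 1/13 ≈ 0.0769
```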

19. Joint Probability

  • Definition: The probability of two events occurring together.
  • Example: The probability of rolling a 4 on a die and getting heads on a coin flip is $\frac{1}{6} \times \frac{1}{2} = \frac{1}{12}$.

20. Marginal Probability

  • Definition: The probability of an event occurring irrespective of the outcome of another event.
  • Example: The marginal probability of rolling a 4 on a die is $ \frac{1}{6}$.

21. Law of Total Probability

  • Definition: A formula for computing the probability of an event by summing over a partition of the sample space: $P(A) = \sum_i P(A \mid B_i) P(B_i)$, where the events $B_i$ are mutually exclusive and together cover all possibilities.
  • Example: To find the probability that a randomly drawn card is an ace, you can condition on its color: $P(\text{ace}) = P(\text{ace} \mid \text{red})P(\text{red}) + P(\text{ace} \mid \text{black})P(\text{black})$.

22. Bayes’ Theorem

  • Definition: A formula used to update the probability of an event based on new evidence: $P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}$.
  • Example: If you know the probability of having a certain disease given a positive test result, Bayes’ theorem helps update this probability based on the test’s accuracy.
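
A worked sketch of the disease-test example, with illustrative numbers that are purely hypothetical (the prevalence and test accuracies below are not from this post):

```python
# Hypothetical numbers for illustration only:
prevalence = 0.01    # P(disease)
sensitivity = 0.95   # P(positive | disease)
false_pos = 0.05     # P(positive | no disease)

# Denominator via the law of total probability
p_positive = sensitivity * prevalence + false_pos * (1 - prevalence)
# Bayes' theorem: P(disease | positive)
p_disease_given_pos = sensitivity * prevalence / p_positive
print(round(p_disease_given_pos, 3))  # 0.161 — low despite the accurate test
```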

23. Law of Large Numbers

  • Definition: A principle stating that as the number of trials increases, the sample mean will get closer to the expected value.
  • Example: Flipping a fair coin many times will result in the proportion of heads approaching 0.5.
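
The coin-flip example can be watched happening in simulation — the running proportion of heads drifts toward 0.5 as the number of flips grows (seed and sizes are arbitrary):

```python
import random

random.seed(1)  # arbitrary seed for reproducibility
flips = [random.random() < 0.5 for _ in range(100_000)]
for n in (10, 1_000, 100_000):
    print(n, sum(flips[:n]) / n)  # proportion of heads approaches 0.5
```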

24. Central Limit Theorem

  • Definition: A theorem stating that the distribution of the sample mean will approach a normal distribution as the sample size becomes large, regardless of the original distribution.
  • Example: If you repeatedly take samples from any distribution (e.g., rolling dice) and compute their averages, those averages will form a normal distribution if the sample size is large enough.
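
The dice version of this is easy to simulate: take many samples of 30 rolls and look at the averages. The means cluster around 3.5 with spread close to $\sqrt{(35/12)/30} \approx 0.31$, as the theorem predicts (sample sizes and seed are arbitrary):

```python
import random
import statistics

random.seed(2)  # arbitrary seed
sample_means = [statistics.mean(random.randint(1, 6) for _ in range(30))
                for _ in range(5_000)]
print(round(statistics.mean(sample_means), 2),
      round(statistics.stdev(sample_means), 2))
```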

25. Permutation

  • Definition: An arrangement of objects in a specific order.
  • Example: The number of ways to arrange 3 books on a shelf is $3! = 6$.

26. Combination

  • Definition: A selection of items where the order does not matter.
  • Example: The number of ways to choose 2 out of 4 books is $ \binom{4}{2} = 6 $.
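
Both book examples map directly onto Python's `math.perm` and `math.comb` (available since Python 3.8):

```python
import math

print(math.perm(3))     # 3! = 6 orderings of 3 books
print(math.comb(4, 2))  # 6 ways to choose 2 of 4 books (order ignored)
```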

27. Binomial Distribution

  • Definition: A probability distribution that describes the number of successes in a fixed number of independent Bernoulli trials.
  • Example: The number of heads in 10 coin flips follows a binomial distribution.
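
The binomial PMF is $P(X=k) = \binom{n}{k} p^k (1-p)^{n-k}$, which takes a few lines to implement from scratch:

```python
import math

def binom_pmf(k, n, p):
    """P(exactly k successes in n independent trials, success prob p each)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

# Most likely head count in 10 fair flips: 252/1024
print(round(binom_pmf(5, 10, 0.5), 4))  # 0.2461
```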

28. Poisson Distribution

  • Definition: A probability distribution that describes the number of events occurring within a fixed interval of time or space, given the events occur with a known constant mean rate and independently of the time since the last event.
  • Example: The number of emails received in an hour might follow a Poisson distribution.
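
The Poisson PMF is $P(X=k) = \frac{\lambda^k e^{-\lambda}}{k!}$. A sketch using a hypothetical rate of 4 emails per hour (the rate is an assumption, not from the post):

```python
import math

def poisson_pmf(k, lam):
    """P(exactly k events in an interval with mean rate lam per interval)."""
    return lam**k * math.exp(-lam) / math.factorial(k)

# Hypothetical: emails arrive at a mean rate of 4 per hour
print(round(poisson_pmf(4, 4.0), 4))  # ≈ 0.1954
```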

29. Normal Distribution

  • Definition: A continuous probability distribution that is symmetric about the mean, with data near the mean being more frequent in occurrence.
  • Example: Human heights are approximately normally distributed with a mean and standard deviation.

30. Exponential Distribution

  • Definition: A continuous probability distribution that describes the time between events in a Poisson process.
  • Example: The time between arrivals of buses at a station can be modeled with an exponential distribution.
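
The bus example can be simulated with `random.expovariate`, which draws exponential waiting times; with a hypothetical rate of 6 arrivals per hour, the average gap should come out near $1/6$ hour (the rate and seed are assumptions for illustration):

```python
import random

rate = 6.0      # hypothetical: buses average 6 arrivals per hour
random.seed(3)  # arbitrary seed for reproducibility
gaps = [random.expovariate(rate) for _ in range(100_000)]
mean_gap = sum(gaps) / len(gaps)
print(round(mean_gap, 3))  # close to 1/rate ≈ 0.167 hours (10 minutes)
```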

31. Uniform Distribution

  • Definition: A probability distribution where all outcomes are equally likely.
  • Example: Rolling a fair die where each number from 1 to 6 has the same probability.

32. Geometric Distribution

  • Definition: A probability distribution that models the number of trials needed for the first success in a series of Bernoulli trials.
  • Example: The number of coin flips needed to get the first head follows a geometric distribution.
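
The geometric PMF is $P(X=k) = (1-p)^{k-1} p$: the first $k-1$ trials fail, then one succeeds. For a fair coin:

```python
def geometric_pmf(k, p):
    """P(first success occurs on trial k), success probability p per trial."""
    return (1 - p) ** (k - 1) * p

for k in range(1, 5):
    print(k, geometric_pmf(k, 0.5))  # 0.5, 0.25, 0.125, 0.0625
```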

33. Bernoulli Distribution

  • Definition: A discrete probability distribution of a random variable which has exactly two possible outcomes: success and failure.
  • Example: A single coin flip is Bernoulli distributed with heads as success and tails as failure.

34. Conditional Expectation

  • Definition: The expected value of a random variable given that another event has occurred or another random variable has a certain value.
  • Example: If you know the score of a student on a math test and want to find the expected score on a science test given that the student did well in math, you use conditional expectation.

35. Moment Generating Function (MGF)

  • Definition: The function $M_X(t) = \mathbb{E}[e^{tX}]$, whose derivatives at $t = 0$ generate the moments (expected values of powers) of a random variable; useful for identifying distributions and solving problems.
  • Example: The MGF of a normal distribution helps in deriving the distribution of the sum of normal random variables.

36. Characteristic Function

  • Definition: A function that provides a way to uniquely describe the probability distribution of a random variable using complex numbers.
  • Example: The characteristic function of a normal distribution can be used to derive properties like the distribution of the sum of independent normal variables.

37. Covariance

  • Definition: A measure of how much two random variables change together. Positive covariance indicates that the variables tend to increase together, while negative covariance indicates that one tends to increase when the other decreases.
  • Example: If the height and weight of individuals in a population are measured, positive covariance would indicate that taller individuals tend to be heavier.

38. Correlation

  • Definition: A standardized measure of the strength and direction of the linear relationship between two random variables, ranging from -1 to 1.
  • Example: The correlation coefficient between height and weight might be high, indicating a strong linear relationship.
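
Both measures are short formulas over paired data. A sketch with made-up heights and weights for five people (the numbers are purely illustrative), using the sample ($n-1$) versions:

```python
# Hypothetical paired data: heights (cm) and weights (kg) of five people
heights = [160, 165, 170, 175, 180]
weights = [55, 60, 66, 72, 80]

n = len(heights)
mh = sum(heights) / n
mw = sum(weights) / n
# Sample covariance: average product of deviations from the means
cov = sum((h - mh) * (w - mw) for h, w in zip(heights, weights)) / (n - 1)
# Correlation: covariance scaled by both standard deviations
sh = (sum((h - mh) ** 2 for h in heights) / (n - 1)) ** 0.5
sw = (sum((w - mw) ** 2 for w in weights) / (n - 1)) ** 0.5
corr = cov / (sh * sw)
print(round(cov, 1), round(corr, 3))  # positive covariance, correlation near 1
```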

39. Joint Distribution

  • Definition: The probability distribution that describes the likelihood of two or more random variables occurring together.
  • Example: The joint distribution of the scores on two different exams taken by students can show how likely different score pairs are.

40. Marginal Distribution

  • Definition: The probability distribution of a subset of random variables within a joint distribution, obtained by summing or integrating over the other variables.
  • Example: From a joint distribution of height and weight, the marginal distribution of height alone is found by summing over all possible weights.