Posted: December 23, 2024

Derivation of Bayes' Theorem and Concrete Examples

Understanding Bayes’ Theorem

Bayes’ theorem is a fundamental concept in probability theory and statistics, with wide-ranging applications in fields such as machine learning, data science, and decision-making processes.
At its core, Bayes’ theorem provides a way to update probabilities based on new evidence.

This theorem is named after Thomas Bayes, an 18th-century statistician and minister, who first provided a framework for inductive reasoning.

Bayes’ theorem is particularly useful when reasoning under uncertainty and drawing insights from incomplete information.

By understanding its derivation and practical applications, one can effectively utilize it to make informed decisions.

Derivation of Bayes’ Theorem

Bayes’ theorem can be mathematically expressed as:

\[ P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)} \]

This equation can be broken down to understand its components:

– \( P(A|B) \): The probability of event A occurring, given that event B has occurred.
– \( P(B|A) \): The probability of event B occurring, given that event A has occurred.
– \( P(A) \): The prior (marginal) probability of event A, before any information about B is taken into account.
– \( P(B) \): The marginal probability of event B, which acts as a normalizing constant.

To derive Bayes’ theorem, let’s start by considering the definition of conditional probability:

\[ P(A|B) = \frac{P(A \cap B)}{P(B)} \]

This equation highlights that the probability of A given B is equal to the joint probability of A and B, divided by the probability of B.

Similarly, we can express:

\[ P(B|A) = \frac{P(A \cap B)}{P(A)} \]

By rearranging this equation, we get:

\[ P(A \cap B) = P(B|A) \cdot P(A) \]

Now, substituting this expression for \( P(A \cap B) \) into the definition of \( P(A|B) \), we arrive at Bayes’ theorem:

\[ P(A|B) = \frac{P(B|A) \cdot P(A)}{P(B)} \]
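The formula translates directly into code. The following is a minimal sketch in Python; the function name and signature are our own, chosen for illustration:

```python
def bayes_posterior(p_b_given_a, p_a, p_b):
    """Compute P(A|B) via Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)."""
    if p_b == 0:
        raise ValueError("P(B) must be nonzero")
    return p_b_given_a * p_a / p_b

# Example: P(B|A) = 0.5, P(A) = 0.4, P(B) = 0.5  ->  P(A|B) = 0.4
print(bayes_posterior(0.5, 0.4, 0.5))
```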

Concrete Examples of Bayes’ Theorem

Bayes’ theorem may seem abstract at first glance, but it can be applied in many practical situations.
Let’s explore a few examples that illustrate its use in real-world decisions.

Example 1: Medical Diagnosis

Imagine a scenario where a doctor is diagnosing a patient for a particular disease.
Let’s say the prior probability \( P(\text{Disease}) \) of a patient having the disease is 1%.
The probability of receiving a positive test result given the patient has the disease \( P(\text{Positive Test}|\text{Disease}) \) is 99%.

Further, the probability of receiving a positive test result even if the patient does not have the disease \( P(\text{Positive Test}|\text{No Disease}) \) is 5%.
The overall probability of a positive test \( P(\text{Positive Test}) \) is:

\[ P(\text{Positive Test}) = P(\text{Positive Test}|\text{Disease}) \cdot P(\text{Disease}) + P(\text{Positive Test}|\text{No Disease}) \cdot P(\text{No Disease}) \]

Thus:

\[ P(\text{Positive Test}) = 0.99 \cdot 0.01 + 0.05 \cdot 0.99 = 0.0099 + 0.0495 = 0.0594 \]

Now, applying Bayes’ theorem to find the probability that the patient has the disease given a positive test result:

\[ P(\text{Disease}|\text{Positive Test}) = \frac{P(\text{Positive Test}|\text{Disease}) \cdot P(\text{Disease})}{P(\text{Positive Test})} \]

\[ P(\text{Disease}|\text{Positive Test}) = \frac{0.99 \cdot 0.01}{0.0594} \approx 0.1667 \]

This result demonstrates that, despite a positive result from a highly sensitive test, the posterior probability of disease is only about 16.7%: the low prior probability of 1% weighs heavily on the outcome.
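The whole calculation above can be reproduced in a few lines. This is a sketch using the numbers from the example; the variable names are our own:

```python
# Given quantities from the example
p_disease = 0.01                 # prior: P(Disease)
p_pos_given_disease = 0.99       # sensitivity: P(Positive | Disease)
p_pos_given_no_disease = 0.05    # false-positive rate: P(Positive | No Disease)

# Law of total probability: P(Positive)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_no_disease * (1 - p_disease))

# Bayes' theorem: P(Disease | Positive)
posterior = p_pos_given_disease * p_disease / p_pos

print(round(p_pos, 4))      # 0.0594
print(round(posterior, 4))  # 0.1667
```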

Example 2: Spam Email Filtering

Consider an email spam filter that uses Bayes’ theorem to determine whether an email is spam or not.
Let’s assume:

– The prior probability of an email being spam \( P(\text{Spam}) \) is 20%.
– The probability that a spam email contains the word “offer” \( P(\text{“Offer”}|\text{Spam}) \) is 70%.
– The probability that a non-spam email contains the word “offer” \( P(\text{“Offer”}|\text{Not Spam}) \) is 10%.

We need to find the probability that an email is spam given that it contains the word “offer”:

\[ P(\text{Spam}|\text{“Offer”}) = \frac{P(\text{“Offer”}|\text{Spam}) \cdot P(\text{Spam})}{P(\text{“Offer”})} \]

First, calculate \( P(\text{“Offer”}) \):

\[ P(\text{“Offer”}) = P(\text{“Offer”}|\text{Spam}) \cdot P(\text{Spam}) + P(\text{“Offer”}|\text{Not Spam}) \cdot P(\text{Not Spam}) \]

\[ P(\text{“Offer”}) = 0.70 \cdot 0.20 + 0.10 \cdot 0.80 = 0.14 + 0.08 = 0.22 \]

Now, apply Bayes’ theorem:

\[ P(\text{Spam}|\text{“Offer”}) = \frac{0.70 \cdot 0.20}{0.22} \approx 0.6364 \]

This probability indicates that if an email contains the word “offer,” there is approximately a 63.64% chance that it is spam.
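The spam example follows the same pattern as the medical one. A minimal sketch with the example's numbers (variable names are our own):

```python
# Given quantities from the example
p_spam = 0.20                  # prior: P(Spam)
p_offer_given_spam = 0.70      # P("offer" | Spam)
p_offer_given_ham = 0.10       # P("offer" | Not Spam)

# Law of total probability: P("offer")
p_offer = (p_offer_given_spam * p_spam
           + p_offer_given_ham * (1 - p_spam))

# Bayes' theorem: P(Spam | "offer")
posterior = p_offer_given_spam * p_spam / p_offer

print(round(p_offer, 2))    # 0.22
print(round(posterior, 4))  # 0.6364
```

Real spam filters (e.g. naive Bayes classifiers) extend this idea by combining evidence from many words, but each word's contribution is computed exactly as above.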

The Power of Bayes’ Theorem

Bayes’ theorem empowers individuals and systems to make probabilistic inferences based on observed data and prior knowledge.
Its applications extend to various domains, including medical diagnostics, spam filtering, finance, and machine learning.

By continually updating beliefs with new evidence, Bayes’ theorem enables more accurate and data-driven decisions.
Understanding and applying Bayes’ theorem equips you with a valuable statistical tool for navigating uncertainty and making informed predictions.
