An introduction to random variables (part 2)

Taken from Mike Puddephat's PhD, this article follows on from An Introduction to Random Variables (Part 1). Here, fundamental results regarding the expectation and variance of random variables (discrete or continuous) are stated and proved.

Property 1

For some constant c ∈ ℜ and random variable X,

E(cX) = cE(X). (1)

Proof:
From Part 1-(1), for X discrete

E(cX) = Σx cx⋅fX(x) = c⋅Σx x⋅fX(x) = c⋅E(X),

while from Part 1-(8), for X continuous

E(cX) = ∫ cx⋅fX(x) dx = c⋅∫ x⋅fX(x) dx = c⋅E(X).
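As an illustrative numerical check of Property 1 (the distribution below is an arbitrary choice, not one from the article), the expectation of a small discrete random variable can be computed directly from its mass function:

```python
# Check E(cX) = c·E(X) for a small discrete random variable.
# The mass function pmf_X and the constant c are illustrative choices.

def expectation(pmf):
    """E(X) for a discrete random variable given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

pmf_X = {1: 0.2, 2: 0.5, 3: 0.3}  # X takes 1, 2, 3 with these probabilities
c = 4.0

# cX takes the value c·x with the same probability as X takes x
pmf_cX = {c * x: p for x, p in pmf_X.items()}

assert abs(expectation(pmf_cX) - c * expectation(pmf_X)) < 1e-12
```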

Property 2

For discrete or continuous random variables X and Y,

E(X + Y) = E(X) + E(Y). (2)

Proof:
Suppose X and Y have joint mass function fX,Y : ℜ² → [0, 1] given by fX,Y(x, y) = P(X = x and Y = y). Then, for X and Y discrete, an extension of Part 1-(1) gives

E(X + Y) = Σx Σy (x + y)⋅fX,Y(x, y)
= Σx Σy x⋅fX,Y(x, y) + Σx Σy y⋅fX,Y(x, y)
= Σx x⋅Σy fX,Y(x, y) + Σy y⋅Σx fX,Y(x, y)
= Σx x⋅fX(x) + Σy y⋅fY(y)
= E(X) + E(Y),

since summing the joint mass function over all y yields the marginal mass function fX(x), and summing over all x yields fY(y).

Noting Part 1-(8), for X and Y continuous the proof begins

E(X + Y) = ∫∫ (x + y)⋅fX,Y(x, y) dx dy,

where fX,Y : ℜ² → [0, ∞) is the joint density function of X and Y. The proof then proceeds as in the discrete case, with summations replaced by integrations.
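As an illustrative numerical check of Property 2 (the joint distribution below is an arbitrary choice), note that the result does not require independence, so the joint mass function here is deliberately dependent:

```python
# Check E(X + Y) = E(X) + E(Y) using a joint mass function.
# Independence is NOT required; this joint distribution is dependent.

joint = {  # {(x, y): P(X = x and Y = y)}, an illustrative choice
    (0, 0): 0.3, (0, 1): 0.1,
    (1, 0): 0.2, (1, 1): 0.4,
}

E_sum = sum((x + y) * p for (x, y), p in joint.items())  # E(X + Y)
E_X = sum(x * p for (x, y), p in joint.items())          # E(X) via marginals
E_Y = sum(y * p for (x, y), p in joint.items())          # E(Y) via marginals

assert abs(E_sum - (E_X + E_Y)) < 1e-12
```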

Property 3

For discrete or continuous independent random variables X and Y,

E(XY) = E(X)⋅E(Y). (3)

Proof:
The proof of (3) is first presented for discrete random variables X and Y. Let X and Y have joint mass function fX,Y : ℜ² → [0, 1] given by fX,Y(x, y) = P(X = x and Y = y). If X and Y are independent, then (by definition) the events {X = x} and {Y = y} are independent for every x and y, so knowledge of the value taken by X does not affect the distribution of Y. For X and Y independent,

P(X = x and Y = y) = P((X = x) ∩ (Y = y)) = P(X = x)⋅P(Y = y),

so that fX,Y(x, y) = fX(x)⋅fY(y). Therefore,

E(XY) = Σx Σy xy⋅fX,Y(x, y) = Σx Σy xy⋅fX(x)⋅fY(y) = (Σx x⋅fX(x))⋅(Σy y⋅fY(y)) = E(X)⋅E(Y).

For X and Y continuous, the proof begins

E(XY) = ∫∫ xy⋅fX,Y(x, y) dx dy,

where fX,Y : ℜ² → [0, ∞) is the joint density function of X and Y. The proof then proceeds as in the discrete case, with summations replaced by integrations.
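As an illustrative numerical check of Property 3 (the mass functions below are arbitrary choices), independence is enforced by building the joint mass function as the product fX(x)⋅fY(y):

```python
# Check E(XY) = E(X)·E(Y) for independent X and Y.
# pmf_X and pmf_Y are illustrative; independence means the joint
# mass function factorises as fX,Y(x, y) = fX(x)·fY(y).

pmf_X = {1: 0.5, 2: 0.5}
pmf_Y = {0: 0.3, 3: 0.7}

joint = {(x, y): px * py
         for x, px in pmf_X.items()
         for y, py in pmf_Y.items()}

E_XY = sum(x * y * p for (x, y), p in joint.items())
E_X = sum(x * p for x, p in pmf_X.items())
E_Y = sum(y * p for y, p in pmf_Y.items())

assert abs(E_XY - E_X * E_Y) < 1e-12
```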

Property 4

For the discrete or continuous random variable X,

Var(X) = E(X²) − (E(X))². (4)

Proof:
The proof of (4) holds for X discrete or continuous. As a shorthand notation, let μ = E(X). Then,

Var(X) = E((X − μ)²) = E(X² − 2μX + μ²) = E(X²) − 2μ⋅E(X) + μ² = E(X²) − 2μ² + μ² = E(X²) − (E(X))²,

using Properties 1 and 2 together with the fact that the expectation of the constant μ² is μ² itself.
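As an illustrative numerical check of Property 4 (the distribution below is an arbitrary choice), the variance can be computed both from its definition E((X − μ)²) and from the shortcut E(X²) − (E(X))²:

```python
# Check Var(X) = E(X²) − (E(X))² for a small discrete random variable.
# The mass function pmf_X is an illustrative choice.

pmf_X = {1: 0.2, 2: 0.5, 3: 0.3}

mu = sum(x * p for x, p in pmf_X.items())                   # μ = E(X)
var_def = sum((x - mu) ** 2 * p for x, p in pmf_X.items())  # E((X − μ)²)
var_shortcut = sum(x * x * p for x, p in pmf_X.items()) - mu ** 2

assert abs(var_def - var_shortcut) < 1e-12
```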

Property 5

For the discrete or continuous random variable X and the constant c ∈ ℜ,

Var(cX) = c²⋅Var(X). (5)

Proof:
The proof of (5) holds for X discrete or continuous. Again, let μ = E(X). Then,

Var(cX) = E((cX − cμ)²) = E(c²⋅(X − μ)²) = c²⋅E((X − μ)²) = c²⋅Var(X).
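As an illustrative numerical check of Property 5 (the distribution and constant below are arbitrary choices; a negative c shows that the sign of the constant does not matter):

```python
# Check Var(cX) = c²·Var(X). pmf_X and c are illustrative choices;
# c is negative to show the constant's sign is irrelevant.

def variance(pmf):
    """Var(X) = E((X − μ)²) for a discrete {value: probability} pmf."""
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

pmf_X = {1: 0.2, 2: 0.5, 3: 0.3}
c = -3.0

pmf_cX = {c * x: p for x, p in pmf_X.items()}  # cX scales the values of X

assert abs(variance(pmf_cX) - c ** 2 * variance(pmf_X)) < 1e-12
```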

Property 6

For discrete or continuous independent random variables X and Y,

Var(X + Y) = Var(X) + Var(Y). (6)

Proof:
The proof of (6) holds for X and Y discrete or continuous. As a shorthand notation, let μX = E(X) and μY = E(Y). Then,

Var(X + Y) = E((X + Y − μX − μY)²)
= E(((X − μX) + (Y − μY))²)
= E((X − μX)²) + 2E((X − μX)(Y − μY)) + E((Y − μY)²)
= Var(X) + 2E((X − μX)(Y − μY)) + Var(Y).

However, from (3), if X and Y are independent random variables, then

E((X − μX)(Y − μY)) = E(XY − μX⋅Y − μY⋅X + μX⋅μY) = E(XY) − μX⋅E(Y) − μY⋅E(X) + μX⋅μY = E(X)⋅E(Y) − μX⋅μY − μY⋅μX + μX⋅μY = 0,

so that Var(X + Y) = Var(X) + Var(Y).
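As an illustrative numerical check of Property 6 (the mass functions below are arbitrary choices), the variance of X + Y is computed directly from the distribution of the sum, with independence enforced by a product joint mass function:

```python
# Check Var(X + Y) = Var(X) + Var(Y) for independent X and Y.
# pmf_X and pmf_Y are illustrative; the joint mass function is
# built as the product fX(x)·fY(y), which encodes independence.

def variance(pmf):
    """Var(X) = E((X − μ)²) for a discrete {value: probability} pmf."""
    mu = sum(x * p for x, p in pmf.items())
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

pmf_X = {1: 0.5, 2: 0.5}
pmf_Y = {0: 0.3, 3: 0.7}

joint = {(x, y): px * py
         for x, px in pmf_X.items()
         for y, py in pmf_Y.items()}

# Mass function of X + Y: different (x, y) pairs may give the same
# sum, so probabilities are accumulated per value.
pmf_sum = {}
for (x, y), p in joint.items():
    pmf_sum[x + y] = pmf_sum.get(x + y, 0.0) + p

assert abs(variance(pmf_sum) - (variance(pmf_X) + variance(pmf_Y))) < 1e-9
```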
