Joint probability: conditional mean and variance? Please refer to the attached image.

Answer 1

The answer is (d): \( \mathrm{E}(X_1 \mid X_2) = \frac{3}{4}x_2 \) and \( \mathrm{Var}(X_1 \mid X_2) = \frac{3}{80}x_2^2 \).

Part 1: Conditional Mean

By definition,

\[ \mathrm{E}(X_1 \mid X_2) = \int_A x_1 \, f(x_1 \mid x_2) \, dx_1, \]

where the conditional density is

\[ f(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_{X_2}(x_2)} \]

and the marginal density of \( X_2 \) is

\[ f_{X_2}(x_2) = \int_A f(x_1, x_2) \, dx_1. \]

In this case, since \( x_1 \) is bounded by \( 0 < x_1 < x_2 \), the integration interval is \( A = (0, x_2) \). Thus:

\[ f_{X_2}(x_2) = \int_0^{x_2} 21 \, x_1^2 \, x_2^3 \, dx_1 = 21 \, x_2^3 \left[ \tfrac{1}{3} x_1^3 \right]_0^{x_2} = 7 \, x_2^6. \]

Thus,

\[ f(x_1 \mid x_2) = \frac{f(x_1, x_2)}{f_{X_2}(x_2)} = \frac{21 \, x_1^2 \, x_2^3}{7 \, x_2^6} = \frac{3 x_1^2}{x_2^3}. \]

Finally, since we are once again integrating with respect to \( x_1 \), the integration interval is the same as before:

\[ \mathrm{E}(X_1 \mid X_2) = \int_0^{x_2} x_1 \cdot \frac{3 x_1^2}{x_2^3} \, dx_1 = \frac{3}{x_2^3} \left[ \frac{x_1^4}{4} \right]_0^{x_2} = \frac{3}{4} x_2. \]
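As a quick sanity check (not part of the original answer), the conditional mean can be approximated numerically with a midpoint Riemann sum over \( (0, x_2) \), using the conditional density derived above; the function names here are illustrative:

```python
# Sanity check: approximate E(X1 | X2 = x2) with a midpoint Riemann sum,
# using the conditional density f(x1 | x2) = 3*x1**2 / x2**3 derived above.

def cond_density(x1, x2):
    """Conditional density f(x1 | x2) = 3*x1^2 / x2^3 on 0 < x1 < x2."""
    return 3 * x1**2 / x2**3

def cond_mean(x2, n=100_000):
    """Midpoint-rule approximation of E(X1 | X2 = x2) = integral of x1 * f(x1 | x2)."""
    h = x2 / n  # subinterval width
    return sum((i + 0.5) * h * cond_density((i + 0.5) * h, x2)
               for i in range(n)) * h

print(cond_mean(0.8))  # should be close to (3/4) * 0.8 = 0.6
```

Any \( x_2 \in (0, 1) \) gives the same agreement with \( \frac{3}{4}x_2 \).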

Part 2: Conditional Variance

The conditional variance is

\[ \mathrm{Var}(X_1 \mid X_2) = \mathrm{E}(X_1^2 \mid X_2) - \left[ \mathrm{E}(X_1 \mid X_2) \right]^2. \]

I'll leave the calculation of \( \mathrm{E}(X_1^2 \mid X_2) \) as an exercise. (Hint: just replace the factor \( x_1 \) with \( x_1^2 \) in the integrand of \( \mathrm{E}(X_1 \mid X_2) \).)

The result is:

\[ \mathrm{Var}(X_1 \mid X_2) = \frac{3 x_2^2}{5} - \left[ \frac{3 x_2}{4} \right]^2 = \frac{3 x_2^2}{5} - \frac{9 x_2^2}{16} = 3 x_2^2 \left[ \frac{1}{5} - \frac{3}{16} \right] = 3 x_2^2 \cdot \frac{16 - 15}{80} = \frac{3}{80} x_2^2. \]
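The variance result can be checked numerically the same way (a sketch, not part of the original answer): approximate the first and second conditional moments with a midpoint Riemann sum, then take \( \mathrm{E}(X_1^2 \mid X_2) - [\mathrm{E}(X_1 \mid X_2)]^2 \). The helper name `cond_moment` is illustrative:

```python
# Sanity check: approximate the k-th conditional moment E(X1**k | X2 = x2)
# under f(x1 | x2) = 3*x1**2 / x2**3, then Var = E[X1^2] - E[X1]^2.

def cond_moment(k, x2, n=100_000):
    """Midpoint-rule approximation of E(X1**k | X2 = x2)."""
    h = x2 / n  # subinterval width
    return sum(((i + 0.5) * h)**k * 3 * ((i + 0.5) * h)**2 / x2**3
               for i in range(n)) * h

x2 = 0.8
cond_var = cond_moment(2, x2) - cond_moment(1, x2)**2
print(cond_var)  # should be close to (3/80) * 0.8**2 = 0.024
```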
Answer 2

To calculate the mean of \( X \) from a discrete joint probability distribution of \( (X, Y) \):

\[ E(X) = \sum_{i=1}^{n} \sum_{j=1}^{m} x_i \, P(X = x_i, Y = y_j) \]

To calculate the variance of \( X \) from the joint probability distribution:

\[ \mathrm{Var}(X) = \sum_{i=1}^{n} \sum_{j=1}^{m} (x_i - E(X))^2 \, P(X = x_i, Y = y_j) \]

Where:

  • \( E(X) \) is the mean of \( X \) under the joint distribution.
  • \( \mathrm{Var}(X) \) is the variance of \( X \) under the joint distribution.
  • \( x_i \) and \( y_j \) are the possible values of the random variables \( X \) and \( Y \).
  • \( P(X = x_i, Y = y_j) \) is the joint probability that \( X = x_i \) and \( Y = y_j \).
  • \( n \) is the number of possible values of \( X \).
  • \( m \) is the number of possible values of \( Y \).
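The double-sum formulas above can be sketched in a few lines of Python. The joint pmf values here are hypothetical, chosen only for illustration:

```python
# Illustrative sketch with a hypothetical joint pmf of (X, Y):
# apply the double-sum formulas above to get E(X) and Var(X).

joint_pmf = {  # hypothetical values, chosen only for illustration
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# E(X) = sum over i, j of x_i * P(X = x_i, Y = y_j)
mean_x = sum(x * p for (x, _y), p in joint_pmf.items())

# Var(X) = sum over i, j of (x_i - E(X))**2 * P(X = x_i, Y = y_j)
var_x = sum((x - mean_x) ** 2 * p for (x, _y), p in joint_pmf.items())

print(mean_x, var_x)  # E(X) = 0.7, Var(X) = 0.7 * 0.3 = 0.21 (X is Bernoulli(0.7))
```

Summing over both indices amounts to marginalizing out \( Y \) first; here \( P(X = 1) = 0.3 + 0.4 = 0.7 \), so \( X \) is Bernoulli with mean \( 0.7 \) and variance \( 0.7 \times 0.3 = 0.21 \).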