Moments for joint distributions generalize single-variable expectation: the expected value of any function $g(X,Y)$ of two discrete random variables is the double sum of $g(x,y)$ weighted by the joint PMF.

$E[g(X,Y)] = \sum_x \sum_y g(x,y)\, p_{X,Y}(x,y)$
This framework covers joint moments $E[X^r Y^s]$ as well as moments computed from marginal or conditional distributions.
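The double-sum formula can be sketched directly in code. The joint PMF below is a hypothetical one chosen only for illustration (note it is *not* independent); the same `expectation` helper computes joint moments and marginal moments alike by varying $g$:

```python
# Sketch: E[g(X, Y)] as the double sum of g(x, y) * p(x, y) over a joint PMF.
# The joint PMF is a made-up illustrative example, not from the text.

joint_pmf = {
    (0, 0): 0.1, (0, 1): 0.2,   # p(X=0, Y=0), p(X=0, Y=1)
    (1, 0): 0.3, (1, 1): 0.4,   # p(X=1, Y=0), p(X=1, Y=1)
}

def expectation(g, pmf):
    """E[g(X, Y)] = sum over all (x, y) of g(x, y) * p(x, y)."""
    return sum(g(x, y) * p for (x, y), p in pmf.items())

# Joint moment E[XY] (r = s = 1):
e_xy = expectation(lambda x, y: x * y, joint_pmf)   # 0.4

# Marginal moments fall out of the same sum: E[X] uses g(x, y) = x.
e_x = expectation(lambda x, y: x, joint_pmf)        # 0.7
e_y = expectation(lambda x, y: y, joint_pmf)        # 0.6
```

Here $E[XY] = 0.4 \neq E[X]\,E[Y] = 0.42$, which is how dependence shows up in the moments.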
> [!example]- Expected Product of Two Random Variables {💡 Example}
> Let $X$ and $Y$ be independent with $P(X=1)=0.4$, $P(X=2)=0.6$, $P(Y=1)=0.5$, $P(Y=3)=0.5$. Find $E[XY]$.
>
> > [!answer]- Answer
> > Because $X$ and $Y$ are independent, $E[XY] = E[X]\cdot E[Y]$.
> > $E[X] = 1(0.4)+2(0.6) = 1.6, \qquad E[Y] = 1(0.5)+3(0.5) = 2.0$
> > Therefore $E[XY] = 1.6 \times 2.0 = 3.2$.
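The shortcut $E[XY] = E[X]\,E[Y]$ can be checked against the general double-sum definition: under independence the joint PMF factors as $p_{X,Y}(x,y) = p_X(x)\,p_Y(y)$, so summing $xy\,p_X(x)\,p_Y(y)$ over all four pairs must give the same $3.2$. A minimal check using the example's marginals:

```python
# Verify the worked example by the double sum: under independence,
# p(x, y) = p_X(x) * p_Y(y), so E[XY] = sum over x, y of x*y*p_X(x)*p_Y(y).

p_x = {1: 0.4, 2: 0.6}
p_y = {1: 0.5, 3: 0.5}

e_xy = sum(x * y * px * py for x, px in p_x.items() for y, py in p_y.items())
e_x = sum(x * px for x, px in p_x.items())
e_y = sum(y * py for y, py in p_y.items())

# e_xy == e_x * e_y == 3.2 (up to float rounding), matching the answer above.
```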