To calculate the conditional expectation \( E[Y_1 | Y_1 + Y_2 + \cdots + Y_n = y] \), we start by recognizing that \( Y_1, Y_2, \ldots, Y_n \) are independent and identically distributed (i.i.d.) nonnegative random variables. Write \( S_n = Y_1 + Y_2 + \cdots + Y_n \). Because the \( Y_i \) are exchangeable, the conditional expectation is the same for each of them: \[ E[Y_1 | S_n = y] = E[Y_2 | S_n = y] = \cdots = E[Y_n | S_n = y]. \] Summing these \( n \) equal terms and using linearity of conditional expectation gives \[ n\,E[Y_1 | S_n = y] = \sum_{i=1}^{n} E[Y_i | S_n = y] = E[S_n | S_n = y] = y, \] where the last equality holds because, conditioned on \( S_n = y \), the sum \( S_n \) is exactly \( y \). Thus, the final result for the conditional expectation is: \[ E[Y_1 | S_n = y] = \boxed{\frac{y}{n}}. \]
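As a sanity check on this result (not part of the derivation), it can be probed by Monte Carlo: draw i.i.d. nonnegative samples, keep the draws whose sum lands in a narrow band around \( y \), and average the first coordinate. The Exponential(1) distribution, and the particular \( n \), \( y \), and band width below, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, y, eps = 5, 10.0, 0.1            # n variables, target sum y, band half-width

# Draw 1M i.i.d. Exponential(1) vectors of length n (illustrative choice).
samples = rng.exponential(1.0, size=(1_000_000, n))
sums = samples.sum(axis=1)

# Approximate conditioning on S_n = y by keeping sums within (y - eps, y + eps).
band = np.abs(sums - y) < eps
estimate = samples[band, 0].mean()  # Monte Carlo estimate of E[Y_1 | S_n = y]

print(estimate, y / n)              # estimate should be close to y/n = 2.0
```

The band-conditioning introduces a small bias of order \( \epsilon \), so agreement with \( y/n \) is approximate but clearly visible.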
To compute the unconditional expectation \( E[Y_1] \), we can write it as an integral against the joint density: \[ E[Y_1] = \int y_1\, p(y_1, y_2, \dots, y_n) \, dy_1 \, dy_2 \, \cdots \, dy_n, \] where \( p(y_1, y_2, \ldots, y_n) \) is the joint probability density function (PDF) of \( (Y_1, Y_2, \ldots, Y_n) \) and the integral runs over all \( n \) variables.
Given that \(Y_1, Y_2, \ldots, Y_n\) are independent and identically distributed (i.i.d.), the joint density factors, and the variables other than \(y_1\) integrate out: \[ E[Y_1] = \int y_1 \prod_{i=1}^n f(y_i) \, dy_1 \, dy_2 \, \ldots \, dy_n = \int y_1 f(y_1) \, dy_1. \] The second equality holds because each factor \( \int f(y_i) \, dy_i = 1 \) for \( i = 2, \ldots, n \), by the normalization of the density.
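The collapse of the multi-dimensional integral to a one-dimensional one can be checked numerically for a concrete density. The Exponential(1) density and the truncated Riemann-sum grid below are illustrative assumptions (the exact answer in that case is \( E[Y_1] = 1 \)).

```python
import numpy as np

# Illustrative choice: f(t) = exp(-t) on [0, inf), truncated at T for the grid.
f = lambda t: np.exp(-t)
T, m = 30.0, 1000
t = np.linspace(0.0, T, m)
dt = t[1] - t[0]

# One-dimensional integral: E[Y_1] = \int y_1 f(y_1) dy_1.
one_dim = np.sum(t * f(t)) * dt

# Two-variable version (n = 2): \int\int y_1 f(y_1) f(y_2) dy_1 dy_2.
# The dy_2 integral factors out and equals \int f(y_2) dy_2 = 1.
y1, y2 = np.meshgrid(t, t, indexing="ij")
two_dim = np.sum(y1 * f(y1) * f(y2)) * dt * dt

print(one_dim, two_dim)  # both close to E[Y_1] = 1 for Exponential(1)
```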
A natural first attempt at $E[Y_1|Y_1+\cdots+Y_n=y]$ is to write \begin{equation} E[Y_1|Y_1+\cdots+Y_n=y]=\int y_1 \, p(y_1, y_2, \ldots, y_n | Y_1 + Y_2 + \cdots + Y_n = y) \, dy_1 \, dy_2 \ldots dy_n \, , \end{equation} where $p(y_1, y_2, \ldots, y_n | Y_1 + Y_2 + \cdots + Y_n = y)$ denotes the joint probability density function (PDF) of $(Y_1, Y_2, \ldots, Y_n)$ conditioned on the sum taking the value $y$.
However, the integral above is not well defined as written. Conditioned on $y_1+y_2+\cdots+y_n=y$, the distribution is supported on a hyperplane, which has zero $n$-dimensional Lebesgue measure; the conditional joint PDF $p(y_1,y_2,\ldots,y_n|y_1+y_2+\cdots+y_n=y)$ therefore has only $n-1$ degrees of freedom. Consequently, the integration should be carried out over an $(n-1)$-dimensional region (equivalently, against the surface measure on the hyperplane), whereas the expression above is framed as an $n$-dimensional integration.
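One standard way to make the dimensionality explicit (a sketch, using the Dirac-delta device familiar from physics-style derivations) is to restrict the unconditional joint density to the hyperplane and renormalize by the density of the sum: \[ E[Y_1 | S_n = y] = \frac{1}{f_{S_n}(y)} \int_0^\infty \!\cdots\! \int_0^\infty y_1 \Bigl(\prod_{i=1}^{n} f(y_i)\Bigr)\, \delta\Bigl(y - \sum_{i=1}^{n} y_i\Bigr)\, dy_1 \cdots dy_n, \qquad f_{S_n}(y) = \int_0^\infty \!\cdots\! \int_0^\infty \Bigl(\prod_{i=1}^{n} f(y_i)\Bigr)\, \delta\Bigl(y - \sum_{i=1}^{n} y_i\Bigr)\, dy_1 \cdots dy_n. \] Integrating the delta against \( dy_n \) (setting \( y_n = y - y_1 - \cdots - y_{n-1} \)) turns this into an honest \((n-1)\)-dimensional integral, consistent with the degrees-of-freedom count above.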
To find the conditional expectation \( E[Y_1 | Y_1 + Y_2 + \cdots + Y_n = y] \) rigorously, we can use the properties of conditional expectation and the joint distribution of the variables involved. Assume \( Y_1, Y_2, \ldots, Y_n \) are i.i.d. with common density \( f \), and write \( S = Y_1 + Y_2 + \cdots + Y_n \); the density of \( S \) is the \( n \)-fold convolution of \( f \) with itself. By the definition of conditional expectation, \[ E[Y_1 | S = y] = \int_{-\infty}^{\infty} y_1 \cdot f_{Y_1 | S}(y_1 | y) \, dy_1, \] where \( f_{Y_1 | S}(y_1 | y) \) is the conditional density of \( Y_1 \) given \( S = y \). This is a genuinely one-dimensional integral, so it avoids the dimensionality problem noted above. By the symmetry of the i.i.d. variables, the conditional distribution of \( Y_i \) given \( S = y \) is identical for every \( i \), so the \( n \) conditional expectations are all equal. Summing them and using linearity of conditional expectation gives \[ E[Y_1 | S = y] = \frac{1}{n} E[S | S = y] = \frac{1}{n} y. \] In conclusion, \[ E[Y_1 | Y_1 + Y_2 + \cdots + Y_n = y] = \frac{y}{n}, \] with no need to evaluate the \((n-1)\)-dimensional integral explicitly: the symmetry argument distributes the total \( y \) evenly among the \( n \) variables.
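For at least one concrete family the conditional density \( f_{Y_1 | S}(y_1 | y) \) is available in closed form, which lets us verify the symmetry argument directly. If the \( Y_i \) are i.i.d. exponential (an illustrative assumption; the rate cancels out), then given \( S = y \) the vector is uniform on the simplex and \( Y_1 / y \sim \mathrm{Beta}(1, n-1) \), so \( f_{Y_1 | S}(y_1 | y) = (n-1)(y - y_1)^{n-2} / y^{n-1} \) on \( (0, y) \). A quick quadrature confirms that this density normalizes to 1 and has mean \( y/n \):

```python
import numpy as np

n, y = 4, 6.0
m = 20_000
y1 = np.linspace(0.0, y, m)
dy1 = y1[1] - y1[0]

# Conditional density of Y_1 given S = y for i.i.d. exponentials:
# f(y1 | y) = (n-1) * (y - y1)**(n-2) / y**(n-1) on (0, y).
pdf = (n - 1) * (y - y1) ** (n - 2) / y ** (n - 1)

total = np.sum(pdf) * dy1        # should be ~1 (density normalizes)
mean = np.sum(y1 * pdf) * dy1    # should be ~ y/n = 1.5
print(total, mean)
```

The numeric mean matching \( y/n \) for this special case agrees with the general symmetry result, which never required the explicit conditional density.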