### 2024 S.-T. Yau College Student Mathematics Contest (Probability and Statistics) — No Answers

There are $r$ players, with player $i$ initially having $n_i$ units, $n_i>0, i=1,2, \cdots, r$. At each stage, two of the players are chosen to play a game, with the winner of the game receiving 1 unit from the loser. Any player whose fortune drops to 0 is eliminated, and this continues until a single player has all $n=\sum_{i=1}^r n_i$ units, with that player designated as the winner. Note that the mechanism for choosing the two players at each stage is unknown. It can be either deterministic or random. Assume that the results of successive games are independent and that each game is equally likely to be won by either of its two players.

For any set of players $S \subseteq\{1, \cdots, r\}$, let $X(S)$ denote the number of games involving only members of $S$. Does $E(X(S))$ depend on the player selection mechanism? If you think it doesn't depend, calculate the expectation. If you think it depends, give two mechanisms leading to different expectations.
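As a numerical companion to this problem (not a solution), one can simulate the tournament under a concrete selection mechanism and estimate $E(X(S))$ empirically. The uniformly-random pairing below is only one illustrative choice of mechanism; comparing it against other mechanisms is a way to probe the question experimentally.

```python
import random

def play_until_winner(fortunes, S, rng=random):
    """Simulate one full tournament.

    fortunes: list of initial units for players 0..r-1 (all > 0).
    S: set of player indices; we count games whose two players both lie in S.
    Mechanism (an illustrative assumption, not given in the problem): the
    pair is chosen uniformly at random among the surviving players.
    Returns (winner index, number of games involving only members of S).
    """
    fortunes = list(fortunes)
    alive = [i for i, f in enumerate(fortunes) if f > 0]
    count = 0
    while len(alive) > 1:
        i, j = rng.sample(alive, 2)  # the selection mechanism
        if i in S and j in S:
            count += 1
        # each game is a fair coin flip between the two chosen players
        winner, loser = (i, j) if rng.random() < 0.5 else (j, i)
        fortunes[winner] += 1
        fortunes[loser] -= 1
        if fortunes[loser] == 0:
            alive.remove(loser)
    return alive[0], count

def estimate_EX(fortunes, S, trials=10000, seed=0):
    """Monte Carlo estimate of E(X(S)) under the random-pairing mechanism."""
    rng = random.Random(seed)
    return sum(play_until_winner(fortunes, S, rng)[1]
               for _ in range(trials)) / trials
```

Swapping `rng.sample(alive, 2)` for a deterministic rule (e.g. always pairing the two richest survivors) and re-running `estimate_EX` gives an empirical comparison of mechanisms.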

Let $X_1, X_2, \cdots$ be independent Bernoulli random variables satisfying $P\left(X_i=1\right)=p$ and $P\left(X_i=-1\right)=q=1-p$ for some $p \in(0,1)$. Let $S_n=X_1+\cdots+X_n$ and $M=\sup _{n \geq 1}\left(S_n / n\right)$.
(a) Calculate $P(M=0)$.
(b) Show that $P(p-q < M \leq 1)=1$. For any rational number $x \in(p-q, 1]$, is $P(M=x)>0$ ? If so, prove it. If not, find a point with zero probability.
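For numerical experimentation, $M$ can be approximated by truncating the supremum at a finite horizon $N$ (a deliberate approximation: the true $M$ is a supremum over all $n$, and the truncation can only undershoot it).

```python
import random

def approx_M(p, N=10000, rng=random):
    """One draw of max_{1 <= n <= N} S_n / n, a finite-horizon proxy for M.

    S_n is a random walk with +1 steps of probability p and -1 steps of
    probability 1 - p. Since S_1/1 is +1 or -1, initializing at -1 is safe.
    """
    S, best = 0, -1.0
    for n in range(1, N + 1):
        S += 1 if rng.random() < p else -1
        best = max(best, S / n)
    return best
```

By the strong law of large numbers $S_n/n \to p-q$, so the running maximum stabilizes quickly; histogramming many draws of `approx_M(p)` for various $p$ gives a picture of the distribution of $M$ on $(p-q, 1]$.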

Let $X$ have a uniform distribution on the interval $[0,1]$ and let $N_{m, k}$ be the digit in the $m$th place to the right of the decimal point in $X^k$.
(a) Find $\lim _{m \rightarrow \infty} P\left(N_{m, m}=i\right)$ for $i=0,1,2, \cdots, 9$.
(b) Let $k(m)$ be a function of $m$, taking values greater than 1. Find a necessary and sufficient condition on $k(m)$ such that $\lim _{m \rightarrow \infty} P\left(N_{m, k(m)}=i\right)=\frac{1}{10}$ for $i=0,1, \cdots, 9$.
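Floating-point arithmetic cannot reliably extract the $m$th digit of $X^k$ for large $m$, but exact integer arithmetic can. The sketch below discretizes $X$ to a random $d$-digit decimal (an approximation to the continuous uniform, chosen here for exactness) and reads off $N_{m,k}$ directly.

```python
import random

def digit_of_power(a, d, k, m):
    """m-th decimal digit of (a / 10**d) ** k, computed exactly.

    With x = a / 10**d, we have x**k = a**k / 10**(d*k), so for m <= d*k
    the m-th digit after the decimal point is floor(a**k / 10**(d*k - m)) % 10.
    """
    assert 1 <= m <= d * k
    return (a ** k // 10 ** (d * k - m)) % 10

def sample_digit(k, m, d=30, rng=random):
    """Draw X uniform over d-digit decimals in [0, 1) and return N_{m,k}."""
    a = rng.randrange(10 ** d)
    return digit_of_power(a, d, k, m)
```

Tabulating the empirical frequencies of `sample_digit(m, m)` as $m$ grows is a quick way to form a conjecture for part (a) before proving it.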

Assume we have $n$ observations $\left(Y_i, x_i\right), i=1, \cdots, n$, where $Y_i$ is the random response and $x_i=\left(x_{i 1}, \cdots, x_{i p}\right)^T$ is a vector of $p$ fixed covariates for the $i$th observation. Let $\beta=\left(\beta_1, \cdots, \beta_p\right)$ be an unknown $p$-dimensional vector of regression coefficients. Let $\theta_i=\sum_{j=1}^p x_{i j} \beta_j$, $\mu_i=E\left(Y_i\right)$ and $\sigma_i^2=\operatorname{Var}\left(Y_i\right)$. Assume the density of $Y_i$ belongs to the following exponential family:
$$f\left(y_i ; \theta_i\right)=\exp \left\{\theta_i y_i-b\left(\theta_i\right)\right\} \tag{1}$$
where $b^{\prime}\left(\theta_i\right)=\mu_i$ and $b^{\prime \prime}\left(\theta_i\right)=\sigma_i^2$. Suppose that all $\theta_i$'s are contained in a compact subset of a space $\Theta$. Let $\ell_n(\beta)$ be the log-likelihood function of the data, and let $H_n(\beta)=-\frac{\partial^2 \ell_n(\beta)}{\partial \beta \partial \beta^T}$.
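For the reader's convenience, $\ell_n$ and $H_n$ can be written out explicitly from (1); this is a standard computation using only $\theta_i = x_i^T\beta$, $b'(\theta_i)=\mu_i$ and $b''(\theta_i)=\sigma_i^2$:

```latex
\ell_n(\beta) = \sum_{i=1}^n \bigl\{\theta_i y_i - b(\theta_i)\bigr\},
\qquad \theta_i = x_i^T \beta,
```

so that

```latex
\frac{\partial \ell_n(\beta)}{\partial \beta}
  = \sum_{i=1}^n \bigl(y_i - \mu_i\bigr) x_i,
\qquad
H_n(\beta) = -\frac{\partial^2 \ell_n(\beta)}{\partial \beta\, \partial \beta^T}
  = \sum_{i=1}^n \sigma_i^2\, x_i x_i^T .
```

In particular $H_n(\beta)$ is positive semidefinite, consistent with the eigenvalue bounds assumed in condition (I).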

Let $\mathcal{X}$ be the set of all $p$ covariates under consideration. Let $\alpha_0 \subset \mathcal{X}$ be the subset consisting of exactly the important covariates affecting $Y$ (those whose corresponding $\beta_j$'s are nonzero). Let $\alpha$ be any subset of $\mathcal{X}$, and let $\beta(\alpha)$ be the vector of the components of $\beta$ that correspond to the covariates in $\alpha$. Let $A=\left\{\alpha: \alpha_0 \subset \alpha\right\}$ be the collection of models that include all important covariates. We assume:
(I) There exist positive constants $C_1, C_2$ such that for all sufficiently large $n$,
$$C_1 < \lambda_{\min }\left\{\frac{1}{n} H_n(\beta)\right\} < \lambda_{\max }\left\{\frac{1}{n} H_n(\beta)\right\} < C_2$$
where $\lambda_{\min }\left\{\frac{1}{n} H_n(\beta)\right\}$ and $\lambda_{\max }\left\{\frac{1}{n} H_n(\beta)\right\}$ are the smallest and largest eigenvalues of $\frac{1}{n} H_n(\beta)$.
(II) For any given $\varepsilon>0$, there exists a constant $\delta>0$ such that, when $n$ is sufficiently large,
$$(1-\varepsilon) H_n(\beta(\alpha)) \leq H_n(\tilde{\beta}) \leq(1+\varepsilon) H_n(\beta(\alpha))$$
for all $\alpha \in A$ and $\tilde{\beta}$ satisfying $\|\tilde{\beta}-\beta(\alpha)\| \leq \delta$.

For any model $\alpha$, let $\hat{\beta}_\alpha$ be the MLE of $\beta(\alpha)$ based on this model. Show that
$$\max _{\alpha \in A}\left\|\hat{\beta}_\alpha-\beta(\alpha)\right\|=O_p\left(n^{-1 / 3}\right)$$

Consider a random sample of size $n$, and write the data as an $r=r_n$ by $c=c_n$ matrix, $\left\{X_{i j}: i=1, \cdots, r_n ; j=1, \cdots, c_n\right\}$, with $n=r_n c_n$. To specify notation, $\left\{X_{i j}\right\}$ are i.i.d. with c.d.f. $F(x)$ and continuous density $f(x)$. Let $\beta$ denote the median, i.e., $F(\beta)=0.5$. Define an estimator by
$$\hat{\beta}_n=\min _j\left\{\max _i\left\{X_{i j}\right\}\right\} .$$
(a) What is the condition on $r_n$ as $n \rightarrow \infty$ for median-unbiasedness, i.e., for $\beta$ to also be the median of the distribution of $\hat{\beta}_n$?
(b) We further assume $F$ is differentiable in an open neighborhood of $\beta$ and has a positive derivative at $\beta$. For $r_n$ in (a), show that $r_n\left(\hat{\beta}_n-\beta\right)$ converges in distribution, and find the limiting distribution function.
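The estimator itself is easy to compute, which makes the problem amenable to numerical exploration. The sketch below arranges a sample into the $r \times c$ grid, evaluates $\hat{\beta}_n$, and estimates the median of its sampling distribution for Uniform$(0,1)$ data (true median $0.5$); the choice of $r_n$ is left open, since finding the right growth rate is exactly part (a).

```python
import random

def beta_hat(samples, r):
    """min over columns of the max over rows, for an r-by-c arrangement.

    len(samples) must equal r * c; row i is the block samples[i*c:(i+1)*c]
    (any fixed arrangement works, since the entries are i.i.d.).
    """
    n = len(samples)
    assert n % r == 0
    c = n // r
    rows = [samples[i * c:(i + 1) * c] for i in range(r)]
    col_max = [max(rows[i][j] for i in range(r)) for j in range(c)]
    return min(col_max)

def empirical_median_of_estimator(r, c, trials=2000, seed=0):
    """Empirical median of beta_hat over repeated Uniform(0,1) samples."""
    rng = random.Random(seed)
    vals = sorted(beta_hat([rng.random() for _ in range(r * c)], r)
                  for _ in range(trials))
    return vals[trials // 2]
```

Comparing `empirical_median_of_estimator(r, c)` to $0.5$ for different growth rates of $r$ relative to $n = rc$ suggests which choices of $r_n$ are compatible with median-unbiasedness.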
