The Gaussian chain is isotropic when averaged over conformation space and all orientations. Therefore, a Gaussian chain is often treated as a sphere with radius $R_g$, its radius of gyration. In the 2nd chapter of Rubinstein’s Polymer Physics, an exercise shows that $R_g^2$ becomes anisotropic if one sets the coordinate frame on the end-to-end vector $\mathbf{R}_{ee}$, i.e., the $\mathbf{R}_{ee}$ vector is taken as the $x$-axis so that $\mathbf{R}_{ee}=(R,0,0)^T$. Then the three components of $R_g^2$ are $\frac{Nb^2}{36}$ in the $y$ and $z$ directions and $\frac{Nb^2}{36}+\frac{R^2}{12}$ in the $x$ direction. Here I give a very simple proof.
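Throughout, by the “components” of $R_g^2$ I mean the per-axis mean-square distances of the monomers from the centre of mass, which sum to $R_g^2$ (my notation, consistent with the calculation and the simulation below):
$$R_{g,\alpha}^2=\frac{1}{N}\int_0^N\left\langle\left(r_{i,\alpha}-r_{\mathrm{cm},\alpha}\right)^2\right\rangle\mathrm{d}i,\qquad \alpha\in\{x,y,z\},\qquad R_g^2=R_{g,x}^2+R_{g,y}^2+R_{g,z}^2.$$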
Consider a Gaussian chain fixed between $(0,0,0)^T$ and $\mathbf{R}_{ee}$. It is actually a Brownian bridge: the position of monomer $n$ follows a multivariate Gaussian with mean $\frac{n}{N}\mathbf{R}$ and variance $\frac{n(N-n)}{N}b^2$. The proof is simple:
$$P_{0\mathbf{R}}(\mathbf{r},n)=\frac{G(\mathbf{r},0,n)G(\mathbf{R},\mathbf{r},N-n)}{G(0,\mathbf{R},N)}$$
$G(\mathbf{a},\mathbf{b},n)$ denotes the distribution of a Gaussian chain with ends at $\mathbf{a}$, $\mathbf{b}$ and contour length $n$. The meaning is straightforward: it is the probability that a Gaussian chain of length $n$ starting from $0$ and ending at $\mathbf{r}$ is connected to another Gaussian chain of length $N-n$ running from $\mathbf{r}$ to $\mathbf{R}$, while the whole chain is a Gaussian chain of length $N$ with $\mathbf{R}_{ee}=\mathbf{R}$. It is easy to show that the distribution of such a chain
$$P_{0\mathbf{R}}(\mathbf{r},n)=G\left(\mathbf{r}-\frac{n}{N}\mathbf{R}, 0,\frac{n(N-n)}{N}\right)$$
is equivalent to that of a Gaussian chain segment with ends at $\mathbf{r}$ and $\frac{n}{N}\mathbf{R}$ and effective length $\frac{n(N-n)}{N}$. The $R_g^2$ is then
$$\begin{align}R_g^2=&\frac{1}{2N^2}\int_0^N\!\!\int_0^N\langle(\mathbf{r}_i-\mathbf{r}_j)^2\rangle\,\mathrm{d}i\,\mathrm{d}j\\
=&\frac{1}{N^2}\int_0^N\!\!\int_j^N \frac{(i-j)^2 R^2}{N^2}+\frac{(i-j)(N-(i-j))}{N}b^2\,\mathrm{d}i\,\mathrm{d}j\\
=&\frac{R^2+Nb^2}{12}\end{align}$$
In this equivalent method, $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ is taken as $\left(\frac{i}{N}-\frac{j}{N}\right)^2R^2+\frac{|i-j|(N-|i-j|)}{N}b^2$ from the equivalent Gaussian chain. This is because, although the chain is ‘fixed’, it is still a Gaussian chain, so its increments are translation invariant along the contour: $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ depends only on $|i-j|$, and therefore $P_{0\mathbf{R}}(\mathbf{r},n)$ gives the distribution of the segment vector $\mathbf{r}_n-\mathbf{r}_0$, which is the same as that of any $n$-segment on the chain. I tried calculating $P(\mathbf{r}_i-\mathbf{r}_j)$ from the convolution of $P_{0\mathbf{R}}$; this is incorrect because $\mathbf{r}_i$ and $\mathbf{r}_j$ are correlated, and convolving $P_{0\mathbf{R}}$ makes $\langle (\mathbf{r}_i-\mathbf{r}_j)^2\rangle$ depend on $i$ and $j$ separately, violating the translation invariance of the chain.
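As a quick check on the algebra (a sketch using SymPy; not part of the original derivation), the double integral above can be evaluated symbolically:

```python
import sympy as sp

i, j, N, b, R = sp.symbols('i j N b R', positive=True)

# mean-square internal distance of the pinned chain for i > j
msd = (i - j)**2 * R**2 / N**2 + (i - j) * (N - (i - j)) / N * b**2

# R_g^2 = (1/N^2) * double integral over the region j < i
rg2 = sp.integrate(sp.integrate(msd, (i, j, N)), (j, 0, N)) / N**2

print(sp.simplify(rg2 - (R**2 + N * b**2) / 12))  # prints 0, confirming (R^2 + N b^2)/12
```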
Now if $R^2=Nb^2$, we recover $R_g^2=\frac{1}{6}Nb^2$, the familiar result for an unconstrained ideal chain (for which $\langle R^2\rangle=Nb^2$). Per dimension $\alpha$ we have $R_{g,\alpha}^2=\frac{Nb^2}{36}+\frac{R_\alpha^2}{12}$, which reduces to $\frac{Nb^2+R^2}{36}$ in each dimension if the orientation is averaged so that $R_\alpha^2=R^2/3$; in particular, if $\mathbf{R}_{ee}=(R,0,0)^T$, we have $\frac{Nb^2}{36}$ in the $y$ and $z$ directions and $\frac{Nb^2}{36}+\frac{R^2}{12}$ in the $x$ direction.
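The per-direction split quoted above follows from the same double integral taken component by component: the Gaussian fluctuation divides equally among the three directions, while the drift term contributes only along $\mathbf{R}$. For the $x$ component (with $\mathbf{R}_{ee}=(R,0,0)^T$),
$$\langle(x_i-x_j)^2\rangle=\frac{(i-j)^2}{N^2}R^2+\frac{(i-j)(N-(i-j))}{3N}b^2\quad\Longrightarrow\quad R_{g,x}^2=\frac{R^2}{12}+\frac{Nb^2}{36},$$
while the $y$ and $z$ components keep only the fluctuation part, $\frac{Nb^2}{36}$.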
Simple simulation code:
```python
import numpy as np

rg2_ree = rg2 = 0.0
for _ in range(5000):
    # free Gaussian chain of 1000 steps, starting from the origin
    ch = np.random.normal(size=(1000, 3))
    ch = np.append(np.zeros((1, 3)), np.cumsum(ch, axis=0), axis=0)
    ch = ch - ch.mean(axis=0)            # shift to the centre of mass
    ree = ch[-1] - ch[0]                 # end-to-end vector
    ree = ree / np.linalg.norm(ree)      # unit vector along R_ee
    rg2_ree += np.sum(ch.dot(ree)**2)    # squared projections onto R_ee
    rg2 += np.sum(ch**2)                 # total squared distances
print(rg2_ree / rg2)
# Result is 0.666...
```
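The value $0.666\ldots$ is exactly the $2/3$ predicted by the formula: for free chains $\langle R^2\rangle=Nb^2$, so the component along $\mathbf{R}_{ee}$ averages to $\frac{Nb^2}{36}+\frac{Nb^2}{12}=\frac{Nb^2}{9}$, and $\frac{Nb^2/9}{Nb^2/6}=\frac{2}{3}$. One can also check the pinned-chain components directly by turning each free walk into a Brownian bridge with a prescribed $\mathbf{R}_{ee}$ (a sketch; the shift $\mathbf{r}_i\to\mathbf{r}_i-\frac{i}{N}(\mathbf{r}_N-\mathbf{R})$ is the standard bridge construction, consistent with the distribution derived above, and with unit-variance steps per component the step length here is $b^2=3$):

```python
import numpy as np

N, b2 = 1000, 3.0                              # steps; 3D step variance b^2 = 3
R_vec = np.array([np.sqrt(N * b2), 0.0, 0.0])  # pin R_ee along x with R^2 = N b^2
frac = np.arange(N + 1)[:, None] / N           # i/N for every monomer

rg2_xyz = np.zeros(3)
n_samples = 2000
for _ in range(n_samples):
    walk = np.append(np.zeros((1, 3)),
                     np.cumsum(np.random.normal(size=(N, 3)), axis=0), axis=0)
    # bridge construction: shift so the last monomer sits exactly at R_vec
    bridge = walk - frac * (walk[-1] - R_vec)
    bridge -= bridge.mean(axis=0)              # centre-of-mass frame
    rg2_xyz += np.mean(bridge**2, axis=0)      # per-axis Rg^2 of this chain

rg2_xyz /= n_samples
print(rg2_xyz)                                  # should approach the line below
print(N * b2 / 36 + N * b2 / 12, N * b2 / 36)   # N b^2/36 + R^2/12, N b^2/36
```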