Lecture 16

Noise in Gene Activation

We will continue our discussion of noise in gene activation from the previous lecture.

Last lecture we made a simple model for gene activation and the intrinsic noise in the system. Next we will consider how to derive the extrinsic noise. Recall that X is the activator molecule and Y is the protein produced from transcription / translation.

\[n = \text{ number of Y molecules} \\ m = \text{ number of X molecules}\]

The noise in the number of Y molecules is due to both intrinsic and extrinsic noise.

\[\sigma^2_n = \sigma^2_\text{int} + \sigma^2_\text{ext}\]

Last lecture we calculated the intrinsic noise in the production of Y without considering fluctuations in X.

\[\eta^2_\text{int}=\frac{\sigma^2_\text{int}}{\bar{n}^2}=\frac{1}{\bar{n}}\]
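
For a sense of scale (an illustrative number, not from the lecture): with \(\bar{n} = 100\) copies of Y,

\[\eta_\text{int} = \frac{1}{\sqrt{\bar{n}}} = \frac{1}{\sqrt{100}} = 0.1 ,\]

so the intrinsic fluctuations are about 10% of the mean and fall off as \(1/\sqrt{\bar{n}}\) as the protein becomes more abundant.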

Before we tackle how to compute the extrinsic noise, let's look at unregulated production/degradation of Y from a different perspective.

Generating Function

Recall from last lecture the equation governing the time evolution of the probability \(p_n\) of having \(n\) Y proteins:

\[\dot{p}_n = k\,p_{n-1}+r(n+1)\,p_{n+1}-k\,p_n-rn\,p_n \ \ \ \text{for }n\ge 1.\]

Let's perform a Fourier transform on \(p_n\).

\[p(\theta)=\sum_n p_n e^{i\theta n} \\ p_n = \int_0^{2\pi}\frac{d\theta}{2\pi}p(\theta) e^{-i\theta n} \\ \text{let } z\equiv e^{i\theta} \\ p(\theta) \to \sum_n p_n z^n\]

We now have an expression for the Generating Function, \(G(z)\).

\[G(z) = \sum_n p_n z^n\]

We can use the generating function to solve for the probabilities.

\[p_n = \int_C\frac{dz}{2\pi i z}G(z) \frac{1}{z^n}\]

You can evaluate this integral using the Residue Theorem.

\[\Rightarrow p_n=\frac{1}{n!} \left. \partial_z^n G(z) \right|_{z=0}\]

The function \(G(z)\) generates moments:

\[\begin{align*} G(1) &= \sum_n p_n = 1 \\ G'(1) &= \sum_n p_n \left. \partial_z z^n \right|_{z=1} = \sum_n n p_n \\ G'(1) &= \left<n\right> \\ G''(1) &= \left<n(n-1)\right> = \left<n^2\right>-\left<n\right> \end{align*}\]

We can make another function \(F(z)\) from the generating function which also has useful properties.

\[\begin{align*} F(z) &\equiv \ln G(z) \\ F(1) &=0 \\ F'(1) &= \left.\frac{G'(z)}{G(z)}\right|_{z=1}=\left<n\right> \\ F''(1) &= \sigma^2 - \left<n\right> \end{align*}\]
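
As a sanity check on these relations, here is a small SymPy sketch using an arbitrary three-state distribution (the probabilities are purely illustrative, not from the lecture); every printed difference should come out to zero.

```python
# Symbolic check of the generating-function relations above, using an
# illustrative three-state distribution p_0 = 1/5, p_1 = 1/2, p_2 = 3/10.
import sympy as sp

z = sp.symbols('z')
p = [sp.Rational(1, 5), sp.Rational(1, 2), sp.Rational(3, 10)]

G = sum(pn * z**n for n, pn in enumerate(p))   # G(z) = sum_n p_n z^n
F = sp.log(G)                                  # F(z) = ln G(z)

mean = sum(n * pn for n, pn in enumerate(p))          # <n>
second = sum(n**2 * pn for n, pn in enumerate(p))     # <n^2>
var = second - mean**2                                # sigma^2

print(sp.simplify(sp.diff(G, z, 2).subs(z, 0) / 2 - p[2]))        # p_2 = G''(0)/2!        -> 0
print(sp.simplify(G.subs(z, 1) - 1))                               # G(1) = 1               -> 0
print(sp.simplify(sp.diff(G, z).subs(z, 1) - mean))                # G'(1) = <n>            -> 0
print(sp.simplify(sp.diff(G, z, 2).subs(z, 1) - (second - mean)))  # G''(1) = <n^2> - <n>   -> 0
print(sp.simplify(sp.diff(F, z, 2).subs(z, 1) - (var - mean)))     # F''(1) = sigma^2 - <n> -> 0
```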

Let's calculate the time derivative of the generating function.

\[\dot{G} = \sum_{n=0}^\infty \dot{p}_n z^n = \dot{p}_0z^0+\sum_{n=1}^\infty \dot{p}_n z^n \\ \dot{G} = \dot{p}_0 + \sum_{n=1}^\infty \dot{p}_n z^n\]

Now plug in the expressions for the corresponding probability time derivatives we derived in the previous lecture.

\[\dot{G} = -kp_0+rp_1+k \underbrace{\sum_{n=1}^\infty p_{n-1}z^n}_{1} - k \underbrace{\sum_{n=1}^\infty p_n z^n}_{2} + r \underbrace{\sum_{n=1}^\infty (n+1)p_{n+1}z^n}_{3} - r \underbrace{\sum_{n=1}^\infty n p_n z^n}_{4}\]

We can simplify this expression greatly by combining terms and re-indexing some of the summations. Combining \(-kp_0\) with the second summation term yields:

\[-k\sum_{n=0}^\infty p_{n}z^n = -kG\]

Combining \(rp_1\) with the third summation term yields:

\[r\sum_{n=0}^\infty (n+1)p_{n+1}z^n = rG'\]

The first and fourth summations can be expressed as…

\[k\sum_{n=1}^\infty p_{n-1}z^n = kzG \\ -r\sum_{n=1}^\infty n p_n z^n = -rzG'\]

And so we go from an infinite set of coupled ordinary differential equations to one single partial differential equation.

\[\Rightarrow \dot{G} = -kG+rG'+kzG-rzG' \\ \dot{G}=(1-z)(rG'-kG)\]

Generating Function Steady State

\[\dot{G} = 0 \\ 0 = (1-z)(rG'-kG)\]

\(z\) is a free variable, so the factor \((1-z)\) does not vanish in general; therefore the other factor must equal zero.

\[\Rightarrow 0 = rG'-kG \\ \Rightarrow G' = \frac{k}{r}G = \lambda G \ \ \ \text{ with } \lambda \equiv \frac{k}{r} \\ \therefore G(z) = Ae^{\lambda z}\]

Using the normalization condition \(G(1)=1\) we can solve for the unknown coefficient \(A\).

\[G(1) = 1 \Rightarrow A=e^{-\lambda} \\ G(z) = e^{\lambda(z-1)} \\ \therefore p_n = \frac{1}{n!} \lambda^n \left. e^{\lambda(z-1)} \right|_{z=0} \\ p_n = \frac{1}{n!} \lambda^n e^{-\lambda}\]

This is the same Poisson Distribution that we have derived before. Using the generating function we can verify some of our earlier results.

\[\begin{align*} F(z) &= \ln G(z) = \lambda(z-1) \\ F'(1) &= \lambda = \left<n\right> \\ F''(1)&=0=\sigma^2-\left<n\right> \\ \Rightarrow \sigma^2&=\left<n\right> \end{align*}\]
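
As a numerical cross-check on the Poisson result, here is a minimal Gillespie (stochastic simulation) sketch of the unregulated birth-death process; the rate values and run length are illustrative, not taken from the lecture. The sampled mean and variance should both land near \(\lambda = k/r\).

```python
# Gillespie simulation of D -> D + Y at rate k and Y -> 0 at rate r*n,
# sampled on a regular time grid after a burn-in period.
import numpy as np

rng = np.random.default_rng(0)
k, r = 10.0, 1.0                      # illustrative rates; lambda = k/r = 10
n, t = 0, 0.0
t_end, burn_in = 5000.0, 50.0
samples, next_sample = [], burn_in

while t < t_end:
    a_prod, a_deg = k, r * n          # propensities of production and degradation
    a_tot = a_prod + a_deg
    dt = rng.exponential(1.0 / a_tot)
    while next_sample < t + dt and next_sample < t_end:
        samples.append(n)             # the state is constant on [t, t + dt)
        next_sample += 1.0
    t += dt
    if rng.random() * a_tot < a_prod:
        n += 1                        # production event
    else:
        n -= 1                        # degradation event

samples = np.array(samples)
print("sampled mean    :", samples.mean(), "  Poisson prediction:", k / r)
print("sampled variance:", samples.var(),  "  Poisson prediction:", k / r)
```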

Extrinsic Noise

Now, after all that math, let's go back and consider how to find the extrinsic noise in our earlier problem. Our system of interest looks like the following:

\[\text{D} \xrightarrow{g}\text{D}+\text{X} \\ \text{X} \xrightarrow{s}\emptyset \\ \text{D} \xrightarrow{k_\text{eff}(x)}\text{D}+\text{Y} \\ \text{Y} \xrightarrow{r}\emptyset\]

We will simplify our expression for \(k_\text{eff}(x)\) by assuming \(h=1\) and \(x\ll K_d\), and absorbing the ratio \(k/K_d\) into a single rate constant, which we relabel \(k\).

\[k_\text{eff}(x) = k\frac{x^h}{x^h+K_d^h} \approx \frac{k}{K_d}x \to kx\]
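
As a quick check on how good this linearization is (illustrative numbers, \(h=1\)): at \(x = 0.1\,K_d\),

\[\frac{x}{x+K_d} = \frac{0.1}{1.1} \approx 0.091 \quad \text{versus} \quad \frac{x}{K_d} = 0.1 ,\]

so the linear form overestimates \(k_\text{eff}\) by roughly 10%, and the error shrinks as \(x/K_d \to 0\).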

This leads to the following rate equations for both X and Y.

\[\dot{x} = \frac{g}{V}-sx \\ \dot{y}=\frac{kx}{V}-ry\]

Let's define the joint probability of having \(m\) X proteins and \(n\) Y proteins in a cell as

\[p_{mn}\equiv p(m,n) .\] \[\begin{align*} \dot{p}_{mn} = \ & g\,p_{m-1,n} - g\,p_{mn} + s(m+1)\,p_{m+1,n} - sm\,p_{mn} \\ + \ & km\,p_{m,n-1} - km\,p_{mn} + r(n+1)\,p_{m,n+1} - rn\,p_{mn} \end{align*}\]

Two-Variable Generating Function

\[G(u,z) = \sum_{m,n}p_{mn}u^mz^n \\ \{x,m\} \to u, \ \ \ \{y,n\} \to z \\ \dot{G} = (1-u)(s\partial_u G-gG) + (1-z)(r\partial_z G - ku\partial_u G) \\ \dot{F} = \frac{\dot{G}}{G} = (1-u)(s\partial_u F-g) + (1-z)(r\partial_z F - ku\partial_u F)\]

We can calculate the mean and the noise using the following relations

\[\left<n\right> = \left. \partial_zF \right|_{z=1, \ u=1} \\ \sigma^2_n - \left<n\right> = \left. \partial^2_zF \right|_{z=1, \ u=1}\]

After some math (sketched below) we get…

\[\left<n\right> = \frac{k}{r}\left<m\right> \\ \sigma_n^2 = \underbrace{\left<n\right>}_\text{intrinsic} + \underbrace{\frac{k}{r}c}_\text{extrinsic}\]
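
One way to carry out that math (a sketch): expand \(F\) to second order about \(u=z=1\),

\[F(u,z) \approx \left<m\right>(u-1) + \left<n\right>(z-1) + \tfrac{1}{2}\left(\sigma^2_m-\left<m\right>\right)(u-1)^2 + \tfrac{1}{2}\left(\sigma^2_n-\left<n\right>\right)(z-1)^2 + c\,(u-1)(z-1) ,\]

then set \(\dot{F}=0\) and match powers of \((u-1)\) and \((z-1)\). Order by order this gives \(s\left<m\right>=g\), \(r\left<n\right>=k\left<m\right>\), \(\sigma^2_m=\left<m\right>\), \((s+r)c=k\sigma^2_m\), and \(r\left(\sigma^2_n-\left<n\right>\right)=kc\), which combine into the results above and below.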

The term \(c\) is the covariance of \(m\) and \(n\).

\[c =\left. \partial_u\partial_z F \right|_{z=1, \ u=1} \\ c = \frac{k}{s+r}\sigma^2_m \\ \therefore \sigma_n^2 = \left<n\right> + \frac{k}{r}\frac{k}{s+r}\sigma^2_m\]

So the noise has both intrinsic and extrinsic components.

\[\sigma^2_\text{int} = \left<n\right> \\ \sigma^2_\text{ext} = \left(\frac{k}{r}\right)^2 \left(\frac{r}{s+r}\right)\sigma^2_m\]
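
As with the one-gene case, this decomposition can be checked numerically. Below is a minimal Gillespie sketch of the two-stage cascade with illustrative rate values; the sampled variance of \(n\) should land close to \(\left<n\right> + \frac{k}{r}\frac{k}{s+r}\sigma^2_m\).

```python
# Gillespie simulation of the cascade:
#   D -> D + X (rate g),   X -> 0 (rate s*m),
#   D -> D + Y (rate k*m), Y -> 0 (rate r*n).
import numpy as np

rng = np.random.default_rng(1)
g, s, k, r = 20.0, 1.0, 2.0, 0.5      # illustrative rates
m, n, t = 0, 0, 0.0
t_end, burn_in = 5000.0, 100.0
ms, ns, next_sample = [], [], burn_in

while t < t_end:
    a1, a2, a3, a4 = g, s * m, k * m, r * n   # reaction propensities
    a_tot = a1 + a2 + a3 + a4
    dt = rng.exponential(1.0 / a_tot)
    while next_sample < t + dt and next_sample < t_end:
        ms.append(m); ns.append(n)            # sample on a regular time grid
        next_sample += 1.0
    t += dt
    u = rng.random() * a_tot                  # pick which reaction fires
    if u < a1:
        m += 1
    elif u < a1 + a2:
        m -= 1
    elif u < a1 + a2 + a3:
        n += 1
    else:
        n -= 1

ms, ns = np.array(ms), np.array(ns)
predicted = ns.mean() + (k / r) * (k / (s + r)) * ms.var()
print("sampled var(n)  :", ns.var())
print("predicted var(n):", predicted)
```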

A slow response of Y corresponds to \(r\ll s\). In that limit the factor \(r/(s+r)\to 0\), so the extrinsic contribution to the relative noise, \(\eta^2_\text{ext} = \frac{r}{s+r}\frac{\sigma^2_m}{\left<m\right>^2}\), vanishes: Y effectively averages over the fast fluctuations in X.


Written on March 13, 2015