📝 Detailed Solutions to Homework Exercises: Stochastic Integrals and Itô's Formula
About This Page
This page compiles detailed solutions to key homework exercises from Chapter 3 (Stochastic Integrals) and Chapter 4 (Itô Integral and Itô's Formula) of the Stochastic Differential Equations course. The content covers core techniques including integral calculations with respect to Brownian motion, Itô isometry, quadratic variation processes, and solving stochastic differential equations using Itô's formula.
Part I: Chapter 3 - Calculation of Stochastic Integrals
Exercise 1
Problem: Find the stochastic integral \(\int_0^t s dW(s)\), and compute its expectation and variance.
Solution (Click to expand)
1. Finding the stochastic integral:
Using the integration by parts formula for stochastic integrals, let \(g(s) = s\); then:
\[ \int_0^t s\, dW(s) = tW(t) - \int_0^t W(s)\, ds. \]
2. Computing the expectation:
Using the linearity of expectation and the zero-mean property of Brownian motion (\(\mathbb{E}[W(s)] = 0\)):
\[ \mathbb{E}\left[\int_0^t s\, dW(s)\right] = \mathbb{E}[tW(t)] - \int_0^t \mathbb{E}[W(s)]\, ds = 0. \]
3. Computing the variance:
By the Itô isometry, for a deterministic function \(g(s) \in L^2([0,t])\), the variance of its integral with respect to Brownian motion equals the Riemann integral of the square of the function. Since the expectation is 0, the variance is the second moment:
\[ Var\left(\int_0^t s\, dW(s)\right) = \mathbb{E}\left[\left(\int_0^t s\, dW(s)\right)^2\right] = \int_0^t s^2\, ds = \frac{1}{3}t^3. \]
Note: Because the integrand is a deterministic function, this stochastic integral follows a normal distribution \(N(0, \frac{1}{3}t^3)\).
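This distributional claim is easy to check numerically. The sketch below (my own construction, assuming NumPy is available; all parameter names are arbitrary) forms the left-endpoint Itô sums \(\sum_i s_i\,\Delta W_i\) over many simulated paths and compares the sample mean and variance with \(0\) and \(t^3/3\):

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_steps, n_paths = 2.0, 400, 10000
ds = t / n_steps
s_left = np.linspace(0.0, t, n_steps, endpoint=False)  # left endpoints s_i

# Brownian increments Delta W_i ~ N(0, ds), one row per simulated path
dW = rng.normal(0.0, np.sqrt(ds), size=(n_paths, n_steps))

# Ito sums  sum_i s_i * Delta W_i  approximating  int_0^t s dW(s)
integral = (s_left * dW).sum(axis=1)

sample_mean = integral.mean()
sample_var = integral.var()
theory_var = t**3 / 3  # Ito isometry: Var = int_0^t s^2 ds
```

With \(t = 2\) the theoretical variance is \(8/3\); the sample variance should land close to that value.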
Exercise 2
Problem: Let the function \(g: [0, T] \to \mathbb{R}\) be continuously differentiable, with \(g(0) = g(T) = 0\). Find the probability density function of \(\int_0^T g(t) dW(t)\).
Solution (Click to expand)
Step 1: Find the Expectation. Using integration by parts and the boundary conditions \(g(0) = g(T) = 0\):
\[ \int_0^T g(t)\, dW(t) = g(T)W(T) - g(0)W(0) - \int_0^T g'(t)W(t)\, dt = -\int_0^T g'(t)W(t)\, dt. \]
Taking the expectation and using \(\mathbb{E}[W(t)] = 0\):
\[ \mathbb{E}\left[\int_0^T g(t)\, dW(t)\right] = 0. \]
Step 2: Find the Variance. According to the Itô isometry (since \(g(t)\) is a deterministic continuous function, it belongs to \(L^2([0,T])\)):
\[ Var\left(\int_0^T g(t)\, dW(t)\right) = \int_0^T g^2(t)\, dt. \]
Step 3: Determine the Distribution Type and Density Function Since the integrand \(g(t)\) is a deterministic function, the increments of Brownian motion are independent and normally distributed. As the limit of linear combinations of normal increments (viewed as an infinite-dimensional linear combination), this stochastic integral must follow a normal distribution.
Therefore, this random variable follows a normal distribution with mean 0 and variance \(\sigma^2 = \int_0^T g^2(t)\, dt\), and its probability density function is the one-dimensional normal density:
\[ f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{x^2}{2\sigma^2}\right), \qquad \sigma^2 = \int_0^T g^2(t)\, dt. \]
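As a concrete illustration of the result (the particular choice of \(g\) is mine, not part of the problem), take \(g(t) = t(T - t)\), which is continuously differentiable and vanishes at both endpoints; the variance integral then evaluates in closed form:

```latex
% Example: g(t) = t(T - t) on [0, T]
\sigma^2 = \int_0^T t^2 (T - t)^2 \, dt
         = \frac{T^5}{3} - \frac{T^5}{2} + \frac{T^5}{5}
         = \frac{T^5}{30},
\qquad
\int_0^T g(t)\, dW(t) \sim N\!\left(0, \tfrac{T^5}{30}\right),
\qquad
f(x) = \sqrt{\frac{15}{\pi T^5}}\; e^{-15 x^2 / T^5}.
```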
Exercise 3
Problem: To explain Brownian motion from the perspective of Newtonian mechanics, Langevin (1872-1946) proposed the following (stochastic) differential equation describing the velocity of a particle in a liquid:
\[ \dot v(t) = -\beta v(t) + \dot W(t), \qquad v(0) = v_0, \]
where \(-\beta v\) represents the frictional resistance experienced by the particle's motion (\(\beta\) is a positive constant), and the white noise \(\dot{W}(t)\) describes the random impulsive forces on the particle. (1) Show that the solution to this equation is \(v(t) = v_0 e^{-\beta t} + W(t) - \beta \int_0^t e^{-\beta(t-s)} W(s) ds\), and thus the path of a particle starting from the origin is \(x_\beta(t) = \int_0^t e^{-\beta(t-s)} W(s) ds\); (2) Calculate the expectation and variance of \(v(t), x(t)\); (3) Prove that \(\lim_{\beta \to \infty} \beta x_\beta(t) = W(t)\).
Solution (click to expand)
(1) Verification of the Solution Form. Substitute the given solution \(v(t)\) into the original differential equation for verification. Differentiate the solution, using the differentiation rule for integrals with variable upper limits (Leibniz rule):
\[ \dot v(t) = -\beta v_0 e^{-\beta t} + \dot W(t) - \beta W(t) + \beta^2 \int_0^t e^{-\beta(t-s)} W(s)\, ds. \]
Then multiply \(v(t)\) by \(-\beta\):
\[ -\beta v(t) = -\beta v_0 e^{-\beta t} - \beta W(t) + \beta^2 \int_0^t e^{-\beta(t-s)} W(s)\, ds. \]
Comparing the two expressions, it is evident that:
\[ \dot v(t) = -\beta v(t) + \dot W(t). \]
The equation holds. For the displacement \(x(t) = \int_0^t v(s)\, ds\) (taking \(v_0 = 0\) for a particle starting from the origin), write \(v(s) = \int_0^s e^{-\beta(s-u)}\, dW(u)\) and exchange the order of integration:
\[ x(t) = \int_0^t \frac{1 - e^{-\beta(t-u)}}{\beta}\, dW(u) = \int_0^t e^{-\beta(t-s)} W(s)\, ds \doteq x_\beta(t), \]
where the second equality follows from integration by parts. This is the standard expression for the displacement of the O-U (Ornstein-Uhlenbeck) process.
(2) Computing Expectation and Variance
For \(v(t)\): taking expectations term by term,
\[ \mathbb{E}[v(t)] = v_0 e^{-\beta t}. \]
Since another equivalent form of \(v(t)\) is \(v(t) = v_0 e^{-\beta t} + \int_0^t e^{-\beta(t-s)}\, dW(s)\), we compute the variance using the Itô isometry:
\[ Var(v(t)) = \int_0^t e^{-2\beta(t-s)}\, ds = \frac{1 - e^{-2\beta t}}{2\beta}. \]
For \(x_\beta(t)\): since \(\mathbb{E}[W(s)] = 0\),
\[ \mathbb{E}[x_\beta(t)] = \int_0^t e^{-\beta(t-s)}\, \mathbb{E}[W(s)]\, ds = 0. \]
We compute the variance by exchanging the order of integration using Fubini's theorem (similar to the technique in Chapter 2 exercises), with \(\mathbb{E}[W(s)W(u)] = \min(s, u)\):
\[ Var(x_\beta(t)) = \int_0^t\!\!\int_0^t e^{-\beta(t-s)}\, e^{-\beta(t-u)} \min(s, u)\, ds\, du. \]
After simplification, we obtain:
\[ Var(x_\beta(t)) = \frac{t}{\beta^2} - \frac{2(1 - e^{-\beta t})}{\beta^3} + \frac{1 - e^{-2\beta t}}{2\beta^3}. \]
(3) Proving the Limit
Consider \(\beta x_\beta(t)\). Applying integration by parts:
\[ \beta x_\beta(t) = \beta\int_0^t e^{-\beta(t-s)} W(s)\, ds = W(t) - \int_0^t e^{-\beta(t-s)}\, dW(s). \]
As \(\beta \to \infty\), examine the mean-square size of the error term \(R_\beta(t) = \int_0^t e^{-\beta(t-s)}\, dW(s)\). By the Itô isometry,
\[ \mathbb{E}[R_\beta(t)^2] = \int_0^t e^{-2\beta(t-s)}\, ds = \frac{1 - e^{-2\beta t}}{2\beta} \longrightarrow 0. \]
Since the error term converges to 0 in the \(L^2\) sense, it follows in the mean-square sense that:
\[ \lim_{\beta \to \infty} \beta x_\beta(t) = W(t). \]
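The mean-square convergence can also be seen numerically. The sketch below (my own construction, assuming NumPy; parameters are arbitrary) approximates \(\beta x_\beta(t)\) by a Riemann sum on simulated Brownian paths and checks that the mean-square distance to \(W(t)\) shrinks as \(\beta\) grows:

```python
import numpy as np

rng = np.random.default_rng(1)
t, n, n_paths = 1.0, 2000, 500
dt = t / n
times = np.linspace(dt, t, n)  # grid points s_k in (0, t]

# Brownian paths on the grid, one per row
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n)), axis=1)

def mse(beta):
    """Mean-square error of beta * x_beta(t) against W(t)."""
    kernel = np.exp(-beta * (t - times))        # e^{-beta (t - s)}
    bx = beta * (W * kernel).sum(axis=1) * dt   # Riemann sum for beta * x_beta(t)
    return np.mean((bx - W[:, -1]) ** 2)

mse_small, mse_large = mse(5.0), mse(200.0)
```

The theoretical error variance is \((1 - e^{-2\beta t})/(2\beta)\), so the error decays like \(1/(2\beta)\).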
Part II: Chapter 4 - Itô Integral and Quadratic Variation
Exercise 1
Problem: Using the definition of the Itô stochastic integral based on Riemann sums, prove:
\[ \int_0^T W^2\, dW = \frac{1}{3}W(T)^3 - \int_0^T W(t)\, dt. \]
Solution (click to expand)
This is a classic problem that requires distinguishing between the rules of Riemann calculus and Itô calculus.
Using Itô's formula to prove: We consider the function \(f(t, x) = \frac{1}{3}x^3\). Its partial derivatives are: \(f_t = 0, \quad f_x = x^2, \quad f_{xx} = 2x\)
Substituting \(X_t = W_t\) into Itô's formula \(df(t, W_t) = f_t dt + f_x dW_t + \frac{1}{2} f_{xx} (dW_t)^2\):
\[ d\left(\tfrac{1}{3}W_t^3\right) = W_t^2\, dW_t + \tfrac{1}{2}\cdot 2W_t\, (dW_t)^2, \]
where the quadratic variation rule for Brownian motion \((dW_t)^2 = dt\) is used. Simplifying yields:
\[ d\left(\tfrac{1}{3}W_t^3\right) = W_t^2\, dW_t + W_t\, dt. \]
Integrating both sides from \(0\) to \(T\), and noting that \(W(0) = 0\):
\[ \tfrac{1}{3}W(T)^3 = \int_0^T W^2\, dW + \int_0^T W(t)\, dt. \]
Rearranging gives the conclusion:
\[ \int_0^T W^2\, dW = \tfrac{1}{3}W(T)^3 - \int_0^T W(t)\, dt. \]
Note: Compared to ordinary calculus \(\int x^2 dx = \frac{1}{3}x^3\), the Itô integral has an extra correction term \(-\int_0^T W(t) dt\) arising from the quadratic variation term.
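Since the problem asks for the Riemann-sum definition, the identity can also be checked directly on simulated paths (a sketch of mine, assuming NumPy): form the left-endpoint Itô sums for the left-hand side and compare with the right-hand side path by path.

```python
import numpy as np

rng = np.random.default_rng(2)
T, n, n_paths = 1.0, 2000, 1000
dt = T / n

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
W = np.cumsum(dW, axis=1)                                # W(t_{i+1})
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # W(t_i), with W(0) = 0

lhs = (W_left**2 * dW).sum(axis=1)                  # sum W(t_i)^2 Delta W_i
rhs = W[:, -1]**3 / 3 - (W_left * dt).sum(axis=1)   # (1/3) W(T)^3 - int_0^T W dt

mean_gap = np.abs(lhs - rhs).mean()  # shrinks as the partition is refined
```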
Exercise 2
Problem: For the backward integral
\[ \int_0^T W(t)\, dW(t) \doteq \lim_{\delta \to 0} \sum_{i=0}^{n-1} W(t_{i+1})[W(t_{i+1}) - W(t_i)], \]
where \(0 = t_0 < t_1 < \cdots < t_n = T\) and \(\delta \doteq \max_i |t_{i+1} - t_i|\), prove:
\[ \int_0^T W(t)\, dW(t) = \frac{1}{2}W(T)^2 + \frac{1}{2}T. \]
Solution (click to expand)
By performing an identity transformation on the definition of the backward integral, we construct the form of the standard Itô integral. In the summation term, we artificially add and subtract \(W(t_i)\) (write \(\Delta W_i = W(t_{i+1}) - W(t_i)\)):
\[ \sum_{i=0}^{n-1} W(t_{i+1})\,\Delta W_i = \sum_{i=0}^{n-1} \left[W(t_i) + \Delta W_i\right]\Delta W_i. \]
Expanding the parentheses:
\[ \sum_{i=0}^{n-1} W(t_{i+1})\,\Delta W_i = \sum_{i=0}^{n-1} W(t_i)\,\Delta W_i + \sum_{i=0}^{n-1} (\Delta W_i)^2. \]
Observing these two terms:

- The first term is precisely the discrete definition of the standard Itô integral (taking the left endpoint in each subinterval), which converges to \(\int_0^T W\, dW = \frac{1}{2}W(T)^2 - \frac{1}{2}T\).
- The second term is precisely the quadratic variation of Brownian motion on the interval \([0,T]\). From the properties of Brownian motion, it converges in the mean square sense to the length of the interval: \(\lim_{\delta \to 0} \sum_{i=0}^{n-1} (\Delta W_i)^2 = T\).

Combining the two parts, we obtain the proof:
\[ \int_0^T W(t)\, dW(t) = \frac{1}{2}W(T)^2 - \frac{1}{2}T + T = \frac{1}{2}W(T)^2 + \frac{1}{2}T. \]
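A quick simulation (my sketch, assuming NumPy) makes the gap between the two conventions visible: on the same increments, the right-endpoint sums exceed the left-endpoint sums by approximately the quadratic variation \(T\).

```python
import numpy as np

rng = np.random.default_rng(3)
T, n, n_paths = 1.0, 2000, 1000
dt = T / n

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
W = np.cumsum(dW, axis=1)                                # right endpoints W(t_{i+1})
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # left endpoints W(t_i)

forward = (W_left * dW).sum(axis=1)   # Ito (left-endpoint) sums
backward = (W * dW).sum(axis=1)       # backward (right-endpoint) sums

gap = (backward - forward).mean()     # should be close to T
err = np.abs(backward - (W[:, -1]**2 / 2 + T / 2)).mean()
```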
Part III: Advanced Applications of Stochastic Integration and Itô's Formula
Exercise 1
Problem Let \(W(t)\) be an \(n\)-dimensional Brownian motion. Prove: \(\mathbb{E}[|W(t) - W(s)|^4] = (2n + n^2)(t-s)^2\).
Solution (click to expand)
This problem cleverly utilizes the relationship between the standard normal distribution and the chi-squared distribution.
Given that \(W(t)\) is an \(n\)-dimensional Brownian motion, we examine its increments over the time interval \([s, t]\). Let \(X_i = W_i(t) - W_i(s)\), where \(i = 1, 2, \dots, n\) denotes each dimension. According to the properties of Brownian motion, the increments in each dimension are independent and identically distributed, and \(X_i \sim N(0, t-s)\).
For standardization, let \(Z_i = \frac{X_i}{\sqrt{t-s}}\), then \(Z_i \sim N(0, 1)\), and they are mutually independent. Let \(Q = \sum_{i=1}^n Z_i^2\), by definition, \(Q\) follows a chi-squared distribution with \(n\) degrees of freedom, i.e., \(Q \sim \chi^2(n)\).
For the chi-squared distribution \(\chi^2(n)\), we know its expectation and variance are:
\[ \mathbb{E}[Q] = n, \qquad Var(Q) = 2n. \]
From this, we can find the second raw moment of \(Q\):
\[ \mathbb{E}[Q^2] = Var(Q) + (\mathbb{E}[Q])^2 = 2n + n^2. \]
Returning to the original expression, we examine the fourth moment of the increment:
\[ |W(t) - W(s)|^4 = \left(\sum_{i=1}^n X_i^2\right)^2 = (t-s)^2\, Q^2. \]
Taking the expectation on both sides:
\[ \mathbb{E}[|W(t) - W(s)|^4] = (t-s)^2\, \mathbb{E}[Q^2] = (2n + n^2)(t-s)^2. \]
The conclusion is proven.
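The identity is easy to spot-check by Monte Carlo (my own sketch, assuming NumPy), sampling the increment of an \(n\)-dimensional Brownian motion directly as \(n\) independent \(N(0, t-s)\) coordinates:

```python
import numpy as np

rng = np.random.default_rng(4)
n_dim, s, t, n_samples = 3, 0.5, 2.0, 400000

# Coordinates of W(t) - W(s) are i.i.d. N(0, t - s)
inc = rng.normal(0.0, np.sqrt(t - s), size=(n_samples, n_dim))

fourth = (np.sum(inc**2, axis=1) ** 2).mean()   # estimate of E|W(t)-W(s)|^4
theory = (2 * n_dim + n_dim**2) * (t - s) ** 2  # (2n + n^2)(t - s)^2
```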
Exercise 2
Problem Define the second-order integral
\[ \int_0^T f(t)\, [dW(t)]^2 \doteq \lim_{\delta \to 0} \sum_{i=0}^{n-1} f(t_i)[W(t_{i+1}) - W(t_i)]^2, \]
where \(0 = t_0 < t_1 < \dots < t_n = T\), and \(\delta \doteq \max_i |t_{i+1} - t_i|\). Prove that when \(f \in L^2([0,T])\), we have
\[ \int_0^T f(t)\, [dW(t)]^2 = \int_0^T f(t)\, dt \quad \text{(in the mean square sense)}. \]
Solution (click to expand)
This problem aims to prove the strict validity of the quadratic variation of Brownian motion \((dW_t)^2 = dt\) in the integral sense. We need to prove this limit in the mean square (\(L^2\)) sense.
Let \(\Delta t_i = t_{i+1} - t_i\), \(\Delta W_i = W(t_{i+1}) - W(t_i)\). Let the left-hand discrete sum be \(S_n = \sum_{i=0}^{n-1} f(t_i)(\Delta W_i)^2\), and the right-hand target integral be \(I = \int_0^T f(t)dt \approx \sum_{i=0}^{n-1} f(t_i)\Delta t_i\).
Examine the mean square error of their difference, writing \(I_n \doteq \sum_{i=0}^{n-1} f(t_i)\Delta t_i\):
\[ \mathbb{E}\left[(S_n - I_n)^2\right] = \mathbb{E}\left[\left(\sum_{i=0}^{n-1} f(t_i)\big[(\Delta W_i)^2 - \Delta t_i\big]\right)^2\right]. \]
Since Brownian increments over non-overlapping intervals are independent and \(\mathbb{E}[(\Delta W_i)^2 - \Delta t_i] = 0\), the expectation of each cross term (\(i \ne j\)) factors and vanishes. Therefore, after expanding the square, only the squared terms remain:
\[ \mathbb{E}\left[(S_n - I_n)^2\right] = \sum_{i=0}^{n-1} f^2(t_i)\, \mathbb{E}\left[\big((\Delta W_i)^2 - \Delta t_i\big)^2\right]. \]
Compute the single-term expectation, using the fourth moment of \(N(0, \Delta t_i)\): \(\mathbb{E}[(\Delta W_i)^4] = 3(\Delta t_i)^2\):
\[ \mathbb{E}\left[\big((\Delta W_i)^2 - \Delta t_i\big)^2\right] = 3(\Delta t_i)^2 - 2(\Delta t_i)^2 + (\Delta t_i)^2 = 2(\Delta t_i)^2. \]
Substitute back into the original expression, and bound one \(\Delta t_i\) by the maximum step size \(\delta\):
\[ \mathbb{E}\left[(S_n - I_n)^2\right] = 2\sum_{i=0}^{n-1} f^2(t_i)(\Delta t_i)^2 \le 2\delta \sum_{i=0}^{n-1} f^2(t_i)\Delta t_i. \]
As \(\delta \to 0\), \(\sum f^2(t_i)\Delta t_i \to \int_0^T f^2(t)\,dt < \infty\) (for, say, continuous \(f\); the general \(f \in L^2([0,T])\) case follows by approximation). Therefore, the preceding factor \(\delta\) drives the entire bound to \(0\). Mean square convergence is proven, i.e.:
\[ \lim_{\delta \to 0} \sum_{i=0}^{n-1} f(t_i)(\Delta W_i)^2 = \int_0^T f(t)\, dt \quad \text{in } L^2. \]
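The mean-square statement can be illustrated numerically (a sketch of mine, assuming NumPy; the particular \(f\) is an arbitrary choice): with \(f(t) = 1 + \sin 2\pi t\) on \([0,1]\) we have \(\int_0^1 f(t)\,dt = 1\), and the second-order sums should concentrate around that value.

```python
import numpy as np

rng = np.random.default_rng(5)
T, n, n_paths = 1.0, 2000, 1000
dt = T / n
t_left = np.linspace(0.0, T, n, endpoint=False)

f = 1.0 + np.sin(2 * np.pi * t_left)        # f in L^2([0,1]); int_0^1 f dt = 1
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))

S = (f * dW**2).sum(axis=1)                 # sum f(t_i) (Delta W_i)^2, one per path
mean_err = abs(S.mean() - 1.0)
mean_sq_err = np.mean((S - 1.0) ** 2)       # should be O(delta) = O(dt)
```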
Exercise 3
Problem Let \(f \in L^2([0, T])\) and \(\int_0^T f(s) dW(s) = 0\). Prove: \(f\) is almost everywhere zero.
Solution (click to expand)
This problem utilizes the Itô Isometry and the fundamental properties of Lebesgue integration in real analysis.
Since \(\int_0^T f(s) dW(s) = 0\) holds almost surely, the second moment of this random variable must be 0:
\[ \mathbb{E}\left[\left(\int_0^T f(s)\, dW(s)\right)^2\right] = 0. \]
On the other hand, according to the Itô isometry, the second moment of the stochastic integral equals the Lebesgue integral of the square of the integrand (since \(f(s)\) is deterministic, no expectation remains on the right-hand side):
\[ \mathbb{E}\left[\left(\int_0^T f(s)\, dW(s)\right)^2\right] = \int_0^T f^2(s)\, ds. \]
Combining the two equations yields:
\[ \int_0^T f^2(s)\, ds = 0. \]
By the fundamental theorem of real analysis, since the integrand \(f^2(s) \ge 0\) always holds and its Lebesgue integral over \([0,T]\) is \(0\), this implies that \(f^2(s)\) must be zero almost everywhere (a.e.) on \([0,T]\).
Consequently: \(f(s) = 0\) holds almost everywhere. Q.E.D.
Exercise 4
Problem Prove: \(Y(t) = e^{t/2} \cos(W(t))\) is a martingale.
Solution (click to expand)
To prove that a process is a martingale, the most direct method is to use Itô's formula to show that its differential term has no drift term (i.e., the coefficient of the \(dt\) term is 0).
Let the bivariate function \(u(t, x) = e^{t/2} \cos(x)\), and compute its partial derivatives with respect to \(t\) and \(x\):

- \(u_t = \frac{1}{2} e^{t/2} \cos(x)\)
- \(u_x = -e^{t/2} \sin(x)\)
- \(u_{xx} = -e^{t/2} \cos(x)\)
Substitute \(X(t) = W(t)\) into Itô's formula \(dY_t = (u_t + \frac{1}{2} u_{xx})dt + u_x dW(t)\):
\[ dY_t = \left(\tfrac{1}{2} e^{t/2}\cos(W(t)) - \tfrac{1}{2} e^{t/2}\cos(W(t))\right) dt - e^{t/2}\sin(W(t))\, dW(t). \]
It can be clearly seen that the two terms containing \(dt\) perfectly cancel:
\[ dY_t = -e^{t/2}\sin(W(t))\, dW(t). \]
Writing it in integral form, with \(Y(0) = e^0\cos(0) = 1\):
\[ Y(t) = 1 - \int_0^t e^{s/2}\sin(W(s))\, dW(s). \]
Since the right-hand side is an Itô integral containing only \(dW(s)\), and the integrand is bounded (satisfying the \(L^2\) admissibility condition), the Itô integral itself is a martingale. Therefore, the process \(Y(t)\) is also a martingale. Q.E.D.
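Since \(W(t) \sim N(0, t)\), the necessary condition \(\mathbb{E}[Y(t)] = Y(0) = 1\) can be checked directly by sampling (my sketch, assuming NumPy; note this checks only the constant mean, not the full conditional martingale property):

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths = 200000

# E[e^{t/2} cos(W(t))] should equal Y(0) = 1 for every t
devs = []
for t in (0.5, 1.0, 2.0):
    Wt = rng.normal(0.0, np.sqrt(t), size=n_paths)  # exact samples of W(t)
    devs.append(abs(np.exp(t / 2) * np.cos(Wt).mean() - 1.0))
max_dev = max(devs)
```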
Exercise 5
Problem Prove:
1. \(\int_0^T W^2 dW = \frac{1}{3}W(T)^3 - \int_0^T W dt\)
2. \(\int_0^T W^3 dW = \frac{1}{4}W(T)^4 - \frac{3}{2} \int_0^T W^2 dt\)
Solution (click to expand)
Both parts of the proof are basic reverse applications of Itô's formula, i.e., "first guess the higher-order term, then expand using Itô's formula and rearrange terms."
(1) Proving the first identity
Let the function \(f(x) = \frac{1}{3}x^3\). Compute its derivatives: \(f'(x) = x^2\), \(f''(x) = 2x\). Substitute \(W(t)\) into Itô's formula and use the quadratic variation rule \((dW(t))^2 = dt\):
\[ d\left(\tfrac{1}{3}W(t)^3\right) = W(t)^2\, dW(t) + W(t)\, dt. \]
Integrate both sides over \([0, T]\), noting that \(W(0) = 0\):
\[ \tfrac{1}{3}W(T)^3 = \int_0^T W^2\, dW + \int_0^T W\, dt. \]
Rearranging gives:
\[ \int_0^T W^2\, dW = \tfrac{1}{3}W(T)^3 - \int_0^T W\, dt. \]
(2) Proving the second identity
Similarly, let the function \(g(x) = \frac{1}{4}x^4\). Compute its derivatives: \(g'(x) = x^3\), \(g''(x) = 3x^2\). Substitute into Itô's formula:
\[ d\left(\tfrac{1}{4}W(t)^4\right) = W(t)^3\, dW(t) + \tfrac{3}{2}W(t)^2\, dt. \]
Integrate both sides over \([0, T]\):
\[ \tfrac{1}{4}W(T)^4 = \int_0^T W^3\, dW + \tfrac{3}{2}\int_0^T W^2\, dt. \]
Rearranging gives:
\[ \int_0^T W^3\, dW = \tfrac{1}{4}W(T)^4 - \tfrac{3}{2}\int_0^T W^2\, dt. \]
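Both identities can be sanity-checked on simulated paths with left-endpoint Riemann sums (my own sketch, assuming NumPy); here is the second one:

```python
import numpy as np

rng = np.random.default_rng(10)
T, n, n_paths = 1.0, 2000, 1000
dt = T / n

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
W = np.cumsum(dW, axis=1)
W_left = np.hstack([np.zeros((n_paths, 1)), W[:, :-1]])  # left endpoints W(t_i)

lhs = (W_left**3 * dW).sum(axis=1)                         # sum W(t_i)^3 Delta W_i
rhs = W[:, -1]**4 / 4 - 1.5 * (W_left**2 * dt).sum(axis=1)

mean_gap = np.abs(lhs - rhs).mean()  # vanishes as the partition is refined
```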
Exercise 6
Problem Prove that \(\mathbb{E}[e^{\int_0^T g dW}] = e^{\frac{1}{2} \int_0^T g^2 ds}\).
Solution (click to expand)
This problem can be elegantly solved by analyzing the distributional properties of the Itô integral, directly utilizing the moment generating function of the normal distribution.
Let the random variable \(X = \int_0^T g(s) dW(s)\). Since \(g(s)\) is a deterministic function of time (non-random), this Itô integral is an \(L^2\) limit of linear combinations of independent Gaussian increments; therefore \(X\) follows a normal distribution.
According to the properties of stochastic integrals:
- Expectation: \(\mathbb{E}[X] = \mathbb{E}[\int_0^T g dW] = 0\)
- Variance: By the Itô isometry, \(Var(X) = \mathbb{E}[X^2] = \int_0^T g^2(s) ds\)
Thus \(X \sim N(0, \sigma^2)\), where \(\sigma^2 = \int_0^T g^2(s) ds\).
The required \(\mathbb{E}[e^X]\) in the original problem is precisely the value of the moment generating function \(M_X(u) = \mathbb{E}[e^{uX}]\) of the random variable \(X\) at \(u=1\). For a normal distribution \(N(\mu, \sigma^2)\), its moment generating function formula is \(M_X(u) = \exp(\mu u + \frac{1}{2}\sigma^2 u^2)\).
Substituting \(\mu = 0, u = 1, \sigma^2 = \int_0^T g^2 ds\), we immediately obtain:
\[ \mathbb{E}\left[e^{\int_0^T g\, dW}\right] = e^{\frac{1}{2}\int_0^T g^2\, ds}. \]
Q.E.D.
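A Monte Carlo check of the identity (my own sketch, assuming NumPy; the particular \(g\) is arbitrary): build \(X\) as an Itô sum for \(g(s) = \cos s\) and compare \(\mathbb{E}[e^X]\) with \(e^{\frac{1}{2}\int g^2}\).

```python
import numpy as np

rng = np.random.default_rng(7)
T, n, n_paths = 1.0, 500, 10000
dt = T / n
s_left = np.linspace(0.0, T, n, endpoint=False)
g = np.cos(s_left)                         # deterministic integrand g(s) = cos s

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n))
X = (g * dW).sum(axis=1)                   # Ito sums for int_0^T g dW

mc = np.exp(X).mean()                      # Monte Carlo E[e^X]
sigma2 = (g**2).sum() * dt                 # Riemann sum for int_0^T g^2 ds
theory = np.exp(sigma2 / 2)
```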
Exercise 7
Problem Let \(u = u(x, t)\) satisfy the parabolic partial differential equation \(u_t + \frac{1}{2}u_{xx} = 0\). Prove: \(\mathbb{E}[u(W(t), t)] = u(0, 0)\).
Solution (click to expand)
This problem is a classic exercise establishing the profound connection between partial differential equations (PDEs) and stochastic processes (SDEs), i.e., the simplest form of the Feynman-Kac formula.
Define the stochastic process \(Y(t) = u(W(t), t)\). Apply the multivariate Itô formula to expand the differential of \(Y(t)\):
\[ dY = u_t\, dt + u_x\, dW + \tfrac{1}{2}u_{xx}\, (dW)^2. \]
Substitute the quadratic variation \((dW)^2 = dt\) and combine the \(dt\) terms:
\[ dY = \left(u_t + \tfrac{1}{2}u_{xx}\right) dt + u_x\, dW. \]
Since the given condition states that \(u(x, t)\) satisfies \(u_t + \frac{1}{2}u_{xx} = 0\), the drift term is strictly 0. The differential equation simplifies to pure diffusion:
\[ dY = u_x(W(t), t)\, dW(t). \]
Write it in integral form:
\[ Y(t) = Y(0) + \int_0^t u_x(W(s), s)\, dW(s). \]
Take the expectation on both sides. Since the expectation of the Itô integral on the right-hand side is 0:
\[ \mathbb{E}[Y(t)] = Y(0). \]
Substitute back the definition of \(Y(t)\), and note that at the initial time, Brownian motion \(W(0) = 0\) almost surely, so \(Y(0) = u(W(0), 0) = u(0, 0)\):
\[ \mathbb{E}[u(W(t), t)] = u(0, 0). \]
The conclusion is proven.
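A minimal sanity check (my own example, assuming NumPy): \(u(x, t) = x^2 - t\) satisfies \(u_t + \frac{1}{2}u_{xx} = -1 + 1 = 0\), so \(\mathbb{E}[u(W(t), t)] = \mathbb{E}[W(t)^2] - t\) should equal \(u(0, 0) = 0\) for every \(t\).

```python
import numpy as np

rng = np.random.default_rng(8)
n_paths = 500000

# u(x, t) = x^2 - t solves u_t + 0.5 u_xx = 0, with u(0, 0) = 0
devs = []
for t in (0.5, 1.0, 3.0):
    Wt = rng.normal(0.0, np.sqrt(t), size=n_paths)  # W(t) ~ N(0, t)
    devs.append(abs(np.mean(Wt**2 - t)))
max_dev = max(devs)
```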
Exercise 8
Problem 1. Prove that \(e^{W(t)} = 1 + \frac{1}{2}\int_0^t e^{W(s)} ds + \int_0^t e^{W(s)} dW(s)\); 2. Prove that \(\mathbb{E}[e^{W(t)}] = 1 + \frac{1}{2}\int_0^t \mathbb{E}[e^{W(s)}] ds\), and hence \(\mathbb{E}[e^{W(t)}] = e^{t/2}\); 3. Compute \(\mathbb{E}[e^{iW(t)}]\), and the variances of \(e^{W(t)}, \sin W(t), \cos W(t)\).
Solution (click to expand)
(1) SDE Verification. Let \(f(x) = e^x\). Substituting \(W(t)\) into Itô's formula \(df(W_t) = f'(W_t)dW_t + \frac{1}{2}f''(W_t)dt\):
\[ d\,e^{W(t)} = e^{W(t)}\, dW(t) + \tfrac{1}{2}e^{W(t)}\, dt. \]
Integrating both sides over \([0,t]\), and using \(e^{W(0)} = e^0 = 1\):
\[ e^{W(t)} = 1 + \tfrac{1}{2}\int_0^t e^{W(s)}\, ds + \int_0^t e^{W(s)}\, dW(s). \]
Rearranging the terms proves the first part.
(2) Solving the ODE for the Expectation. Taking the expectation on both sides of the result from part (1). Since the Itô integral \(\int_0^t e^{W(s)} dW(s)\) has expectation 0 under appropriate regularity conditions, and using Fubini's theorem to interchange the expectation and the Riemann integral:
\[ \mathbb{E}[e^{W(t)}] = 1 + \tfrac{1}{2}\int_0^t \mathbb{E}[e^{W(s)}]\, ds. \]
Let \(m(t) = \mathbb{E}[e^{W(t)}]\); the above equation transforms into the integral equation \(m(t) = 1 + \frac{1}{2}\int_0^t m(s) ds\). Differentiating it yields the initial value problem ODE:
\[ m'(t) = \tfrac{1}{2}m(t), \qquad m(0) = 1. \]
Solving this ordinary differential equation immediately gives:
\[ \mathbb{E}[e^{W(t)}] = m(t) = e^{t/2}. \]
(3) Characteristic Function and Trigonometric Function Variance Calculation
Compute \(\mathbb{E}[e^{iW(t)}]\) (characteristic function): Again, apply Itô's formula to the complex-valued process \(Y(t) = e^{iW(t)}\):
\[ dY(t) = iY(t)\, dW(t) - \tfrac{1}{2}Y(t)\, dt. \]
Taking expectation and differentiating, let \(m_2(t) = \mathbb{E}[Y(t)]\); we obtain the ODE \(m_2'(t) = -\frac{1}{2}m_2(t)\) with \(m_2(0)=1\). Solving:
\[ \mathbb{E}[e^{iW(t)}] = e^{-t/2}. \]
Compute the variance of \(e^{W(t)}\): By the definition of variance, \(Var(e^{W(t)}) = \mathbb{E}[e^{2W(t)}] - (\mathbb{E}[e^{W(t)}])^2\). Treating \(2W(t)\) as the case with parameter \(2\), from the moment generating function result we have \(\mathbb{E}[e^{2W(t)}] = e^{4t/2} = e^{2t}\), hence:
\[ Var(e^{W(t)}) = e^{2t} - e^{t}. \]
Compute the variances of \(\sin W(t), \cos W(t)\): By Euler's formula, \(\mathbb{E}[e^{iW(t)}] = \mathbb{E}[\cos W(t)] + i\,\mathbb{E}[\sin W(t)] = e^{-t/2}\). Comparing real and imaginary parts:
\[ \mathbb{E}[\cos W(t)] = e^{-t/2}, \qquad \mathbb{E}[\sin W(t)] = 0. \]
Similarly, taking the parameter \(u = 2\) in \(\mathbb{E}[e^{iuW(t)}] = e^{-u^2 t/2}\), we have \(\mathbb{E}[e^{2iW(t)}] = e^{-2t}\), i.e.:
\[ \mathbb{E}[\cos 2W(t)] = e^{-2t}, \qquad \mathbb{E}[\sin 2W(t)] = 0. \]
Using double-angle formulas to reduce powers:
\[ \mathbb{E}[\cos^2 W(t)] = \frac{1 + e^{-2t}}{2}, \qquad \mathbb{E}[\sin^2 W(t)] = \frac{1 - e^{-2t}}{2}. \]
Finally, substituting into the variance formula:
\[ Var(\cos W(t)) = \frac{1 + e^{-2t}}{2} - e^{-t}, \qquad Var(\sin W(t)) = \frac{1 - e^{-2t}}{2}. \]
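These closed forms are easy to verify by sampling \(W(t) \sim N(0, t)\) directly (a quick sketch of mine, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(9)
t, n_paths = 1.0, 500000
Wt = rng.normal(0.0, np.sqrt(t), size=n_paths)

var_sin = np.sin(Wt).var()
var_cos = np.cos(Wt).var()
theory_sin = 0.5 * (1 - np.exp(-2 * t))               # Var(sin W(t))
theory_cos = 0.5 * (1 + np.exp(-2 * t)) - np.exp(-t)  # Var(cos W(t))
```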
(Note: The manuscript here cleverly utilizes known results for trigonometric moments of the normal distribution.)