# Preface

Iteration is the main topic of this section.

# Iterations

Our main method for solving nonlinear and variable-coefficient linear differential equations is the expansion of solutions into power series. Before the middle of the twentieth century, this was the standard method for attacking problems of this kind, and many famous equations, such as the Legendre and Chebyshev equations, have series solutions named after them. With the availability of computers, power series came to prominence once again. In this section, we present an extension and generalization of Picard's iteration scheme. We start with first order differential equations.
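As a baseline, the classical Picard scheme replaces the differential equation by the equivalent integral equation $y_{k+1}(x) = y_0 + \int_{x_0}^x f(s, y_k(s))\,{\text d}s$ and iterates. A minimal SymPy illustration for the textbook case $y' = y$, $y(0) = 1$ (an illustrative choice, not part of the derivation below):

```python
import sympy as sp

x, s = sp.symbols('x s')

# Picard iteration for y' = y, y(0) = 1:
#   y_{k+1}(x) = 1 + Integral(y_k(s), (s, 0, x))
y = sp.Integer(1)
for _ in range(4):
    y = 1 + sp.integrate(y.subs(x, s), (s, 0, x))

# four iterations reproduce the degree-4 Taylor polynomial of e^x
expected = sum(x**k / sp.factorial(k) for k in range(5))
assert sp.expand(y - expected) == 0
```

Each iteration adds one correct term of the Taylor series of the exact solution $y = e^x$.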

## First order separable equations

Although the initial value problem for a separable differential equation can be solved by separation of variables, we demonstrate the advantage of the iteration scheme on this particular class of equations. We rewrite the initial value problem
$y' = f(x)\,g(y), \qquad y(x_0 ) = y_0$
in an equivalent integral form
$y(x) = y_0 + \int_{x_0}^x f(s)\,g(y(s))\,{\text d} s .$
It is assumed that the functions f(x) and g(y) are smooth enough to have all the necessary derivatives used below. Let h(x) be an antiderivative of f(x), so $$\displaystyle f(x) = h' (x) .$$ The integral is expanded by integration by parts, using the given differential equation in the first step:
$y(x) = y_0 + \int_{x_0}^x h' (s)\,g(y(s))\,{\text d} s = y_0 + h (x)\, g(y(x)) - h\left( x_0 \right) g\left( y_0 \right) - \int_{x_0}^x h(s)\,g' (y(s))\,f(s)\,g(y(s))\,{\text d} s .$
It is convenient to rewrite the obtained equation as
$y(x) - y_0 = \left[ h(s)\,g\left( y(s) \right) \right]_{s=x_0}^{s=x} - \int_{x_0}^x g_2 (s)\, \frac{\text d}{{\text d}s} \left( \frac{h^2 (s)}{2} \right) {\text d} s ,$
where $$\displaystyle g_2 (s) = g' (y(s))\,g(y(s)) .$$ Integration by parts again yields
$y(x) - y_0 = \left[ h(s)\,g\left( y(s) \right) \right]_{s=x_0}^{s=x} - \left[ \frac{h^2 (s)}{2} \, g_2 (s)\right]_{s=x_0}^{s=x} + \int_{x_0}^x \frac{h^2 (s)}{2} \, \frac{{\text d}g_2}{{\text d}y}\, f(s)\,g(y(s)) \,{\text d} s .$
Upon introducing a sequence of functions
$$g_0 (y) = y, \qquad g_1 (y) = g(y), \qquad g_{n+1} (y) = g(y)\, \frac{{\text d}g_{n} (y)}{{\text d}y} , \quad n= 1,2,\ldots , \label{EqIter.1}$$
we obtain
$y(x) - y_0 + \left\{ \sum_{n=1}^N \frac{(-1)^n}{n!} \,h^n (s)\, g_n (y(s)) \right\}_{s=x_0}^{s=x} = (-1)^{N} \frac{1}{N!} \, \int_{x_0}^x h^N (s)\, f(s)\, g_{N+1} (y(s))\,{\text d}s .$
The integral term in the equation above is the remainder of the expansion; if it vanishes as N → ∞, the series solution to the separable equation becomes
$$\sum_{n\ge 0} \frac{(-1)^n}{n!} \, \left[ h(x) \right]^n g_n (y) = C , \label{EqIter.2}$$
where C is a constant specified by the initial condition. Within the region of convergence, the series sums to a function of x and y, so it provides an implicit solution to the given separable equation.
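For a concrete illustration, take f(x) = 1 and g(y) = y² (so h(x) = x); these choices are assumptions made only for this check. Then the sum can be carried out in closed form, and a quick SymPy computation confirms that it is constant along the exact solution:

```python
import sympy as sp

x, A, Y = sp.symbols('x A Y', positive=True)

# exact solution of y' = y^2 with y(0) = 1/A
y = 1/(A - x)

# build g_n from the recursion g_{n+1} = g * dg_n/dy with g(y) = y^2
g = [Y, Y**2]
for n in range(1, 5):
    g.append(sp.expand(Y**2 * sp.diff(g[n], Y)))
assert g[4] == sp.factorial(4) * Y**5          # g_n = n! * y^(n+1)

# hence sum_n (-1)^n h^n g_n / n! = y * sum_n (-x*y)^n = y/(1 + x*y),
# which must be constant along the solution
assert sp.simplify(y/(1 + x*y) - 1/A) == 0
```

Here the series is geometric, so its sum y/(1 + x y) is available in closed form; the last assertion verifies that it keeps the constant value 1/A along the exact solution.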

Example: Consider the differential equation

$y' = f(x)\, y^m ,$
where m is a real number. Using $g(y) = y^m$, we obtain from Eq.\eqref{EqIter.1} that
$g_n (y) = m \left( 2m-1 \right) \cdots \left( mn -n-m +2 \right) y^{mn-n+1} , \qquad n= 2,3,\ldots .$
For particular values of m this sequence may terminate at a finite value of n. If m ≠ 1, the general solution is
$\sum_{n\ge 0} (-1)^n \left( 1- m \right)^n \binom{1/(1-m)}{n} \,h^n (x)\, y^{mn-n+1} = C.$
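As a sanity check, the closed-form coefficients can be compared with the recursion for the first few n. The SymPy snippet below (an illustration, not part of the derivation) verifies that $g_n (y) = n!\,(1-m)^n \binom{1/(1-m)}{n} y^{mn-n+1}$ for symbolic m:

```python
import sympy as sp

Y, m = sp.symbols('Y m', positive=True)

# g_n recursion: g_1 = y^m, g_{n+1} = g * dg_n/dy
g = [Y, Y**m]
for k in range(1, 6):
    g.append(Y**m * sp.diff(g[k], Y))

# compare with the coefficients n! (1-m)^n C(1/(1-m), n) of y^(mn-n+1)
for k in range(1, 6):
    coeff = sp.factorial(k) * (1 - m)**k * sp.expand_func(sp.binomial(1/(1 - m), k))
    assert sp.simplify(g[k] - coeff * Y**(k*m - k + 1)) == 0
```

For example, k = 2 gives the coefficient m and k = 3 gives m(2m − 1), matching the product formula above.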

Consider a particular separable equation

$y' = f(x) \, y^{1/2} \qquad \Longleftrightarrow \qquad \frac{{\text d} y}{\sqrt{y}} = f(x)\,{\text d} x .$
Integration gives the general solution in implicit form:
$2\,y^{1/2} = \int_{x_0}^x f(s)\,{\text d}s + C .$
The g-functions from Eq.\eqref{EqIter.1} are
$g_1 (y) = y^{1/2} , \qquad g_2 (y) = \frac{1}{2} , \qquad g_n = 0, \quad n=3,4,\ldots .$
The series solution is reduced to
$y - h(x)\, y^{1/2} + \frac{1}{4}\, h(x)^2 = C \qquad \Longrightarrow \qquad y = \left[ \frac{1}{2}\,h(x) + C \right]^2 .$
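The closed form can be checked by direct substitution. The snippet below makes the illustrative choice f(x) = 2x (so h(x) = x²) and verifies that y = (h/2 + C)² satisfies y′ = f(x)√y:

```python
import sympy as sp

x = sp.symbols('x', positive=True)
C = sp.symbols('C', positive=True)

f = 2*x                      # illustrative choice of f
h = sp.integrate(f, x)       # h(x) = x^2, an antiderivative of f

y = (h/2 + C)**2             # closed-form solution from the series
# check y' = f(x) * sqrt(y) (positive branch)
assert sp.simplify(sp.diff(y, x) - f*sp.sqrt(y)) == 0
```

The positivity assumptions on x and C let SymPy take the positive square root of the perfect square.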
■

## Generalized Riccati equation

Consider the differential equation
$$y' = f(x) + g(y) \qquad\mbox{with} \qquad h' (x) = f(x) \quad\mbox{and} \quad y(x_0 ) = y_0 . \label{EqIter.3}$$
Integrating both sides, we get
$y (x) = y_0 + \left[ h(s) \right]_{s=x_0}^{s=x} + \int_{x_0}^x g(y(s))\,{\text d} s .$
The integral over g(y) can be transformed by integrating by parts:
$\int_{x_0}^x g(y(s))\,{\text d} s = \left[ s\,g(y(s)) \right]_{s=x_0}^{s=x} - \int_{x_0}^x s\, \frac{{\text d}}{{\text d} s} \, g(y(s)) \,{\text d} s = \left[ s\,g(y(s)) \right]_{s=x_0}^{s=x} - \int_{x_0}^x s\, \frac{{\text d}g}{{\text d}y}\left[ f(s) + g(y(s)) \right] {\text d} s .$
Integrating by parts the right-hand side of the equation
$y(x) - y_0 - \left[ h(s) + s\,g(y(s)) \right]_{s=x_0}^{s=x} = - \int_{x_0}^x s\, \frac{{\text d}g}{{\text d}y}\left[ f(s) + g(y(s)) \right] {\text d} s = - \int_{x_0}^x \frac{{\text d}}{{\text d}s} \left( \frac{s^2}{2} \right) \left[ g_2 (s) + g'_1 (s)\,f(s) \right] {\text d} s ,$
we get
$- \int_{x_0}^x s\, \frac{{\text d}g}{{\text d}y}\left[ f(s) + g(y(s)) \right] {\text d} s = - \left[ \frac{s^2}{2}\,g'_1 (s)\,f(s) + \frac{s^2}{2}\,g_2 (s) \right]_{s=x_0}^{s=x} + \int_{x_0}^x \frac{s^2}{2}\, \frac{{\text d}}{{\text d} s} \left[ g'_1 (s)\,f(s) + g_2 (s) \right] {\text d} s ,$
where $$\displaystyle g_1 (s) = g(y(s)) , \quad g'_1 (s) = g' (y(s)) , \quad g_2 (s) = g(y(s))\,g' (y(s)) .$$ Substituting back, we obtain
$y - y_0 = \left[ h(s) + s\,g(y(s)) - \frac{s^2}{2}\,g'_1 (s)\,f(s) - \frac{s^2}{2}\,g_2 (s)\right]_{s=x_0}^{s=x} + \int_{x_0}^x \frac{{\text d}}{{\text d}s} \left( \frac{s^3}{3!} \right) \frac{{\text d}}{{\text d}s} \left[ g'_1 (s)\,f(s) + g_2 (s) \right] {\text d} s .$
Extending the process of integration by parts, we arrive at
$y - y_0 = \left[ h(s) + \sum_{n=1}^N (-1)^{n+1} \frac{1}{n!} \, s^n g_n (s) \right]_{s=x_0}^{s=x} + \sum_{n=1}^N (-1)^{n} \frac{1}{n!} \int_{x_0}^x s^n h' (s)\, g'_n (s) \,{\text d} s + (-1)^N\,\frac{1}{N!} \int_{x_0}^x s^N g_{N+1} (s) \,{\text d} s ,$
where $$\displaystyle g_{n+1} (s) = g(y(s))\, \frac{{\text d}g_n}{{\text d}y}$$ extends the sequence introduced above. Assuming that summation and integration can be interchanged in the second term, and that the remainder (the last integral) vanishes in the limit N → ∞, the series expansion becomes
$y - h(x) + \sum_{n\ge 1} (-1)^{n} \frac{1}{n!} \, x^n g_n (x) - \int_{x_0}^x {\text d} s \,\sum_{n\ge 1} (-1)^{n} \frac{1}{n!} \, s^n g'_n (s)\,h' (s) = C ,$
where C is a constant.
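To gauge the quality of a low-order truncation, take the illustrative case f(x) = 1 (so h(x) = x) and g(y) = y with y(0) = 0, whose exact solution is y = eˣ − 1. Keeping only the bracketed terms through n = 2 (with g′₁ = 1 and g₂ = y) and solving the resulting linear relation for y gives a rational approximation:

```python
import math

# two-term truncation of the bracket: y ≈ x + x*y - x^2/2 - (x^2/2)*y
# (for f = 1, g(y) = y); solving linearly for y:
def y_approx(x):
    return (x - x**2/2) / (1 - x + x**2/2)

x = 0.2
exact = math.exp(x) - 1          # solution of y' = 1 + y, y(0) = 0
assert abs(y_approx(x) - exact) < 1e-2
```

Even this crude truncation stays within about 2·10⁻³ of the exact solution at x = 0.2.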

Example: Consider the Riccati equation

$y' = y^2 + 2x , \qquad y(0) = a.$
Integrating both sides, we obtain
$y - a = x^2 + \int_0^x y^2 (s)\,{\text d} s = x^2 + \left[ s\,y^2 (s) \right]_{s=0}^{s=x} - \int_0^x s\,2y(s)\left( 2s + y^2 (s) \right) {\text d} s .$
We simplify the result as
$y - a = x^2 + x\,y^2 - 4 \int_0^x s^2 y(s)\, {\text d} s - 2 \int_0^x \frac{\text d}{{\text d} s} \left( \frac{s^2}{2} \right) y^3 (s) \, {\text d} s .$
Next integration by parts yields
\begin{align*} \int_0^x s^2 y(s)\, {\text d} s &= \left[ \frac{s^3}{3}\, y(s) \right]_{s=0}^{s=x} - \int_0^x \frac{s^3}{3} \left( y^2 (s) + 2s \right) {\text d} s \\ &= \frac{x^3}{3}\, y - \int_0^x \frac{\text d}{{\text d} s} \left( \frac{s^4}{4\cdot 3} \right) y^2 (s) \, {\text d} s - \frac{2}{15}\, x^5 , \\ \int_0^x \frac{\text d}{{\text d} s} \left( \frac{s^2}{2} \right) y^3 (s) \, {\text d} s &= \left[ \frac{s^2}{2}\, y^3 \right]_{s=0}^{s=x} - \int_0^x \frac{s^2}{2}\,3\,y^2 (s) \left( y^2 (s) + 2s \right) {\text d} s \\ &= \frac{x^2}{2}\,y^3 - \int_0^x y^4 (s) \,\frac{\text d}{{\text d} s} \left( \frac{s^3}{2} \right) {\text d} s - \int_0^x 3\,s^3 y^2 (s) \, {\text d} s . \end{align*}
Disregarding integral terms, we obtain a polynomial approximation
$y - a = x^2 + x\,y^2 - \frac{4}{3}\,x^3 y + \frac{8}{15}\, x^5 - x^2 y^3 .$
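The quality of this implicit approximation can be checked numerically. The pure-Python sketch below (for a = 1, with a hand-rolled RK4 integrator standing in for a library ODE solver) evaluates the residual of the polynomial relation along the numerical solution near x = 0:

```python
import math

# RK4 integration of y' = y^2 + 2x, y(0) = 1 (the case a = 1)
def rhs(x, y):
    return y*y + 2*x

def rk4(x_end, n=1000):
    x, y, dx = 0.0, 1.0, x_end/n
    for _ in range(n):
        k1 = rhs(x, y)
        k2 = rhs(x + dx/2, y + dx*k1/2)
        k3 = rhs(x + dx/2, y + dx*k2/2)
        k4 = rhs(x + dx, y + dx*k3)
        y += dx*(k1 + 2*k2 + 2*k3 + k4)/6
        x += dx
    return y

# residual of the implicit polynomial approximation at (x, y)
def residual(x, y, a=1.0):
    return y - a - (x**2 + x*y**2 - (4/3)*x**3*y + (8/15)*x**5 - x**2*y**3)

y = rk4(0.1)
assert abs(residual(0.1, y)) < 1e-2
```

The residual is of the order of the discarded integral terms, so it is small near x = 0 and grows with x.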
We plot the implicit approximation along with the true solution:

```mathematica
contour2 = ContourPlot[
   y - 1 == x^2 + x*y^2 - (4/3)*x^3*y + (8/15)*x^5 - x^2*y^3,
   {x, 0, 0.8}, {y, 0.8, 1.6},
   ContourStyle -> ColorData[10] /@ Range[10],
   BaseStyle -> Thickness[0.007]];
s = NDSolve[{y'[x] == y[x]^2 + 2*x, y[0] == 1}, y, {x, 0, 2}];
sol = Plot[Evaluate[y[x] /. s], {x, 0, 0.8},
   PlotStyle -> {Orange, Thickness[0.01]}];
t1 = Graphics[Text[Style["True solution", Orange, FontSize -> 18], {0.46, 1.5}]];
t2 = Graphics[Text[Style["Approximation", Purple, FontSize -> 18], {0.5, 1.15}]];
Show[contour2, sol, t1, t2]
```

True solution (in orange) and its approximation.

■

## Second order differential equations

Example: Consider the Abel equation, named after Niels Henrik Abel

$y' = y^3 + x$
■
