Preface

This tutorial was made solely for educational purposes and was designed for students taking Applied Math 0330. It is aimed primarily at students who have little or no experience with Mathematica and would like to learn the basics of this computer algebra system. As a friendly reminder, don't forget to clear the variables in use and/or the kernel.

Finally, the commands in this tutorial are all written in bold black font, while Mathematica output is in normal font. This means that you can copy and paste all commands into Mathematica, change the parameters, and run them. You, as the user, are free to use the scripts for your own needs to learn the Mathematica program, and you have the right to distribute and refer to this tutorial as long as it is credited appropriately.

Existence and Uniqueness

Theorem: Suppose that f(x,y) is a continuous function defined in some rectangular region

$R = \left\{ (x,y)\,: \,| x_0 - x | \le \delta , \quad |y_0 - y | \le \epsilon \right\}$
containing the point $$\left( x_0 , y_0 \right) .$$ Then there exists a number h (possibly smaller than δ) so that a solution $$y = \phi (x)$$ to the initial value problem
$y' = f(x,y) , \qquad y(x_0 ) = y_0$
is defined for $$x \in (x_0 - h , x_0 + h ) .$$

This theorem was proved in 1886 by the Italian mathematician Giuseppe Peano (1858--1932). Peano was a founder of symbolic logic whose interests centred on the foundations of mathematics and on the development of a formal logical language. In 1890 he founded the journal Rivista di Matematica, which published its first issue in January 1891. In 1891 Peano started the Formulario Project. It was to be an "Encyclopedia of Mathematics", containing all known formulae and theorems of mathematical science, expressed in a standard notation invented by Peano.

In addition to his teaching at the University of Turin, Peano lectured at the Military Academy in Turin in 1886. The following year he discovered, and published, a method for solving systems of linear differential equations using successive approximations. However, Émile Picard had independently discovered this method and had credited the German mathematician Hermann Schwarz (1843--1921) with discovering it first. In 1888 Peano published the book Geometrical Calculus, which begins with a chapter on mathematical logic.

Theorem: Let f(y) be a continuous function on the closed interval [a,b] that has a single zero (null) $$y^{\ast} \in (a,b) ,$$ namely, $$f(y^{\ast} ) =0$$ and $$f(y) \ne 0$$ at all other points $$y \in (a,b) .$$ If the integral

$\int_y^{y^{\ast}} \frac{{\text d}y}{f(y)}$
diverges, then the initial value problem for the autonomous differential equation
$y' = f(y) , \qquad y(x_0 ) = y^{\ast}$
has the unique solution $$y (x) \equiv y^{\ast} .$$ If the integral converges, then the initial value problem has multiple solutions.
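This convergence criterion can be checked directly in Mathematica. The snippet below is a minimal sketch, assuming the null is $$y^{\ast} = 0 ,$$ comparing two slope functions: f(y) = y, whose integral diverges, and f(y) = 2√y, whose integral converges.

```mathematica
(* For f(y) = y the integral of 1/f diverges near the null y* = 0,
   so y(x) == 0 is the unique solution of y' = y, y(x0) = 0: *)
Integrate[1/y, {y, a, 0}, Assumptions -> a > 0]
(* Mathematica reports that the integral does not converge *)

(* For f(y) = 2 Sqrt[y] the integral is finite, so multiple solutions exist: *)
Integrate[1/(2 Sqrt[y]), {y, a, 0}, Assumptions -> a > 0]
(* -Sqrt[a], a finite value *)
```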

Theorem: Suppose that f(x,y) is uniformly Lipschitz continuous in y (meaning the Lipschitz constant L in the inequality $$|f(x,y_1 ) - f(x, y_2 )| \le L\,|y_1 - y_2 |$$ can be taken independent of x) and continuous in x. Then, for some positive value δ there exists a unique solution $$y = \phi (x)$$ to the initial value problem

$y' = f(x,y) , \qquad y(x_0 ) = y_0$
on the interval $$\left[ x_0 -\delta , x_0 + \delta \right] .$$

Theorem: Let f(x,y) be continuous for all (x,y) in the open rectangle $$R= \left\{ (x,y)\,:\, |x-x_0 | < a, \quad |y- y_0 | < b \,\right\}$$ and Lipschitz continuous in y, with constant L independent of x. Then there exists a unique solution to the initial value problem

$y' = f(x,y) , \qquad y(x_0 ) = y_0$
on the interval. Moreover, if z(x) is the solution to the same problem with the initial condition $$z(x_0 ) = z_0 ,$$ then
$\left\vert y(x) - z(x) \right\vert \le e^{L(x- x_0 )} \,|y_0 - z_0 | .$
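This stability estimate can be illustrated numerically. The following sketch uses the hypothetical slope function f(x,y) = sin y, which is Lipschitz continuous in y with constant L = 1, and compares two solutions whose initial values differ by 0.01.

```mathematica
(* f(x,y) = Sin[y] is Lipschitz in y with constant L = 1 *)
f[x_, y_] := Sin[y];
y1 = NDSolveValue[{u'[x] == f[x, u[x]], u[0] == 1}, u, {x, 0, 2}];
y2 = NDSolveValue[{u'[x] == f[x, u[x]], u[0] == 1.01}, u, {x, 0, 2}];
(* The difference should stay below Exp[L (x - x0)] |y0 - z0| = 0.01 Exp[x] *)
Plot[{Abs[y1[x] - y2[x]], 0.01 Exp[x]}, {x, 0, 2},
 PlotLegends -> {"|y(x) - z(x)|", "0.01 Exp[x]"}]
```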

Theorem: Suppose that f(x,y) and $$\frac{\partial f}{\partial y}$$ are continuous functions defined in some rectangular region

$R = \left\{ (x,y)\,: \,| x_0 - x | \le \delta , \quad |y_0 - y | \le \epsilon \right\}$
containing the point $$\left( x_0 , y_0 \right) .$$ If these functions are bounded in R:
$\left\vert f(x,y) \right\vert \le M \qquad \mbox{and}\qquad \left\vert \frac{\partial f}{\partial y} \right\vert \le K$
for some positive constants M and K, then the initial value problem
$y' = f(x,y) , \qquad y(x_0 ) = y_0$
has a unique solution in the interval
$\left[ x_0 -h , x_0 +h \right] , \qquad \mbox{where} \qquad h \le \min \left\{ \delta , \frac{\epsilon}{M} \right\} . \qquad ■$
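The guaranteed half-width h can be computed directly. As a sketch, take the slope function f(x,y) = x² + y² (used in the Riccati example later in this section) on the rectangle with δ = ε = 1/√2:

```mathematica
(* Bound M = max |f| on the rectangle |x| <= delta, |y| <= eps
   for f(x,y) = x^2 + y^2 *)
delta = 1/Sqrt[2]; eps = 1/Sqrt[2];
M = delta^2 + eps^2;      (* = 1 *)
h = Min[delta, eps/M]     (* = 1/Sqrt[2], about 0.707107 *)
```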
The above theorem is usually referred to as Picard's theorem (or sometimes the Picard–Lindelöf theorem), named after Émile Picard (1858--1941), who proved this result using an iteration procedure. Ernst Leonard Lindelöf (1870--1946) was a Finnish mathematician, and Charles Émile Picard was a French mathematician. Rudolf Otto Sigismund Lipschitz (1832--1903) was a German mathematician who gave his name to the Lipschitz continuity condition. Picard's mathematical papers, textbooks, and many popular writings exhibit an extraordinary range of interests, as well as an impressive mastery of the mathematics of his time. In addition to his theoretical work, Picard made contributions to applied mathematics, including the theories of telegraphy and elasticity. Picard's popular writings include biographies of many leading French mathematicians, including his father-in-law, Charles Hermite.

Since the initial value problem $$y' = f(x,y) , \qquad y(x_0 ) = y_0$$ is equivalent to the Volterra integral equation (provided that the slope function f(x,y) and the solution y(x) are continuous functions)
$y(x) = y_0 + \int_{x_0}^x f(s,y(s))\, {\text d}s ,$
this problem has a unique solution that is the limit of the sequence of functions $$\left\{ \phi_n (x) \right\}_{n\ge 0}$$ satisfying the recurrence:
$\phi_{n+1} (x) = y_0 + \int_{x_0}^x f(s,\phi_n (s))\, {\text d}s , \qquad n=0,1,2,\ldots ; \quad \phi_0 (x) = y_0 .$
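This recurrence translates directly into Mathematica. The sketch below defines a helper picard (a name of our own choosing) that nests the integral operator n times; it is tried on the simple equation y' = y, y(0) = 1, whose iterates are the partial sums of the exponential series.

```mathematica
(* Picard iteration: phi_{n+1}(x) = y0 + Integrate[f[s, phi_n(s)], {s, x0, x}] *)
picard[f_, x0_, y0_, n_] :=
  Nest[Function[phi, y0 + Integrate[f[s, phi /. x -> s], {s, x0, x}]], y0, n];

(* For y' = y, y(0) = 1 the fourth iterate is the degree-4 Taylor
   polynomial of Exp[x]: *)
picard[Function[{s, u}, u], 0, 1, 4]
(* 1 + x + x^2/2 + x^3/6 + x^4/24 *)
```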

Corollary: The continuous dependence of the solutions on the initial conditions holds whenever slope function f satisfies a global Lipschitz condition.

Corollary: If the solution y(x) of the initial value problem $$y' = f(x,y), \ y(x_0 )= y_0$$ has an a priori bound M, i.e., $$|y(x)| \le M$$ whenever y(x) exists, then the solution exists for all $$x \in \mathbb{R} .$$

Example. Consider the initial value problem

$y' = 2\,\sqrt{y}, \quad y(0)=0 ,$
where the slope function $$f(y) = 2\,\sqrt{y}$$ is continuous on the infinite interval $$[0, \infty )$$ but not Lipschitz continuous. So, according to Peano's theorem, this initial value problem has a solution. Indeed, we can apply Picard's iteration procedure to obtain the solution $$y(x) \equiv 0 .$$ On the other hand, since the given differential equation is autonomous, we can separate variables and integrate:
$\frac{{\text d}y}{2\,\sqrt{y}} = {\text d} x \qquad \Longrightarrow \qquad \sqrt{y} = x-C ,$
where C is an arbitrary constant. Since we consider only the nonnegative branch of the square root function, the above formula is valid only when $$x \ge C .$$ Therefore, we get a family of solutions (which is also called the general solution) depending on a parameter C:
$y = \begin{cases} \left( x-C \right)^2 , & \quad x \ge C , \\ 0 , & \quad x < C . \end{cases}$
Using Mathematica, we plot some solutions:
q[x_, CC_] = Piecewise[{{(x - CC)^2, x >= CC}, {0, x < CC}}];
q2 = Plot[0, {x, -3.5, 3.5}, PlotStyle -> {Thick, Black}]; (* singular solution; note that Plot[y = 0, ...] would assign 0 to y *)
graph4[CC_] :=
 Plot[Evaluate[q[x, CC]], {x, -3.5, 3.5}, AxesLabel -> {x, y},
  PlotRange -> {{-3.5, 3.5}, {-0.5, 6}}, AspectRatio -> 1,
  PlotStyle -> RGBColor[1, 0, 0]];
initlist = {0, 0.5, 1, 1.5, 2, 2.5, 3, 3.5, 4, -1, -2, -3, -4};
graphlist = {};
Do[AppendTo[graphlist, graph4[initlist[[i]]]], {i, 1, Length[initlist]}];
solgraph = Show[q2, graphlist]

In this sequence of commands, we first enter the family of solutions to the differential equation. Since the symbol C is protected in Mathematica, we use CC instead. Then we use two subroutines: one for plotting solutions, and another for looping over the constant C. Finally, we display all graphs.

We can also check that the given initial value problem has multiple solutions by evaluating the integral

$\int_y^0 \frac{{\text d}s}{2\,\sqrt{s}} = -\sqrt{y} ,$
which is finite for every $$y > 0 ,$$ so the integral converges.

Example. Consider the initial value problem for the Riccati equation

$y' = x^2 + y^2, \quad y(0)=0 ,$
which has a unique solution $$y = \phi (x)$$ expressed via Bessel functions
$\phi (x) = x \, \dfrac{Y_{-3/4} \left( \frac{x^2}{2} \right) - J_{-3/4} \left( \frac{x^2}{2} \right)}{J_{1/4} \left( \frac{x^2}{2} \right) - Y_{1/4} \left( \frac{x^2}{2} \right)} .$
This function blows up at $$x \approx 2.003147359$$ --- the first positive root of the transcendental equation $$J_{1/4} \left( \frac{x^2}{2} \right) = Y_{1/4} \left( \frac{x^2}{2} \right) .$$ Meanwhile, Picard's theorem guarantees a unique solution only within the interval $$\left[ 0, 2^{-1/2} \right] \approx [0,0.707107 ] .$$ To find the solution, we use Picard's iteration procedure:
\begin{align*} \phi_0 (x) &= 0 , \\ \phi_1 (x) &= \int_0^x \left( x^2 + 0^2 \right) {\text d}x = \frac{x^3}{3} , \\ \phi_2 (x) &= \int_0^x \left( x^2 + \left( \frac{x^3}{3} \right)^2 \right) {\text d}x = \frac{x^3}{3} + \frac{x^7}{63} , \\ \phi_3 (x) &= \int_0^x \left( x^2 + \left( \phi_2 (x) \right)^2 \right) {\text d}x = \frac{x^3}{3} + \frac{x^7}{63} + \frac{2\,x^{11}}{2079} + \frac{x^{15}}{59535} , \\ \phi_4 (x) &= \int_0^x \left( x^2 + \left( \phi_3 (x) \right)^2 \right) {\text d}x \\ &= \frac{x^3}{3} + \frac{x^7}{63} + \frac{2\,x^{11}}{2079} + \frac{13\,x^{15}}{218,295} + \frac{82\,x^{19}}{37,328,445} + \frac{662\,x^{23}}{10,438,212,015} + \cdots , \end{align*}
and so on.
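These iterates, and the blow-up point quoted above, can be reproduced in Mathematica. The sketch below memoizes the iterates under the helper name phi (our own choice) and locates the root of the Bessel-function equation numerically.

```mathematica
(* Picard iterates for y' = x^2 + y^2, y(0) = 0, memoized for speed *)
phi[0] = 0;
phi[n_] := phi[n] = Integrate[s^2 + (phi[n - 1] /. x -> s)^2, {s, 0, x}];
phi[2]
(* x^3/3 + x^7/63 *)

(* Blow-up point: first positive root of J_{1/4}(x^2/2) == Y_{1/4}(x^2/2) *)
FindRoot[BesselJ[1/4, x^2/2] == BesselY[1/4, x^2/2], {x, 2}]
(* x -> 2.00315, matching the value quoted above *)
```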