In general it is hard to find the general solution of an ODE. Besides making sure that a solution exists, we will be concerned with actually finding solutions. Here we discuss a method, separation of variables, which applies in special cases, and the structure and existence of the solution set of a linear first-order ODE.
Let #y# be a differentiable function of #t#. Suppose that \(f(t)\) and \(g(y)\) are continuous functions, that #g(y)# is not the constant function #0#, that \(F(t)\) is an antiderivative of #f(t)#, and that \(H(y)\) is an antiderivative of #\frac{1}{g(y)}#.
The general solution #y# of the differential equation \[y'(t)={g(y)}\cdot {f(t)}\] satisfies the equality \[H(y)=F(t)+C\] where \(C\) is a constant.
An ODE as above is called a separable differential equation. An equation like #H(y)=F(t)+C# in the theorem is often referred to as an implicit solution of the ODE.
The differential equation \[\frac{\dd y}{\dd t} = {g(y)}\cdot{f(t)} \] can be written as \[\dfrac{1}{ g(y)}\,\frac{\dd y}{\dd t} = f(t)\] and so it is equivalent to \[\dd\bigl(H(y)\bigr) = \dd\bigl(F(t)\bigr)\] Consequently, \[H(y)=F(t)+C\] for an integration constant \(C\).
If #g(y)# is the constant function #0#, then the right-hand side of the ODE is equal to #0#, so #y'=0# and #y# is a constant function. This case is easy to handle.
If the variable #t# occurs only through #y# (that is, #t# does not appear explicitly), the ODE is called autonomous. A first-order autonomous ODE of the form #y' = g(y)# for a function #g# is separable. After all, such an ODE can be written as \(y'(t) = g(y) \cdot f(t)\) with #f(t)# equal to the constant function #1#.
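For example, the autonomous ODE #y'=y^{2}# is separable: for #y\ne0# we can write \[\frac{1}{y^{2}}\,\frac{\dd y}{\dd t}=1\] so that \[-\frac{1}{y}=t+C\] which gives the solutions #y=-\frac{1}{t+C}# next to the constant solution #y=0#.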
In general, the implicit solution of the statement is not a solution of the ODE in the explicit form \( y(t)=\text{a function of }t\), but a relationship between the variables \(y\) and \(t\). Sometimes, an explicit solution can be derived from this relationship.
If there are initial conditions, the implicit solution can be used for finding corresponding values of #C#.
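For instance, the ODE \[y'=\frac{t}{y}\] is of the above form with #g(y)=\frac{1}{y}# and #f(t)=t#. Here #H(y)=\frac{1}{2}y^2# is an antiderivative of #\frac{1}{g(y)}=y# and #F(t)=\frac{1}{2}t^2# is an antiderivative of #f(t)#, so the implicit solution reads \[\frac{1}{2}y^2=\frac{1}{2}t^2+C\] The initial condition #y(0)=2# gives #C=2#. Solving for #y# and taking the positive square root (because #y(0)\gt0#) yields the explicit solution \[y=\sqrt{t^2+4}\]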
Linear first-order differential equations have the following form, where \(a(t)\), \(b(t)\), and \(f(t)\) are functions with #a(t)\ne0#. \[a(t)\cdot \frac{\dd y}{\dd t}+b(t)\cdot y=f(t)\] The equation is called homogeneous if \(f(t)=0\). The functions \(a(t)\) and \(b(t)\) are called coefficients and #f(t)# the inhomogeneous term. If these coefficients are constant, the general solution can be expressed in terms of standard functions.
We may assume that the coefficient #a(t)# is not the constant function #0#, for otherwise the differential equation would be of order zero. We can therefore divide by #a(t)#; in the resulting equation, the coefficient of #y'# is equal to #1#. In this case, we say that the equation is in standard form.
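For example, dividing the equation \[t\cdot \frac{\dd y}{\dd t}+2\cdot y=t^3\] by #a(t)=t# (on an interval not containing #0#) brings it into the standard form \[\frac{\dd y}{\dd t}+\frac{2}{t}\cdot y=t^2\]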
Consider the ODE \[ y' + p(t)\cdot y= q(t)\] where #p# and #q# are functions on an open interval #\ivoo{c}{d}#.
1. Let #t_0# be a point of #\ivoo{c}{d}# (that is, #c\lt t_0\lt d#) and let \(p\) and \(q\) be continuous functions on this interval. Then the initial value problem \[ y' + p(t)\cdot y= q(t), \phantom{xxx}y(t_0) = \alpha\] where #\alpha# is an arbitrary number, has a unique solution defined on the entire interval #\ivoo{c}{d}#.
2. Suppose that #y_{\text{part}}# is a solution of the ODE. Then every solution #y# can be written as the sum \[ y = y_{\text{hom}} +y_{\text{part}}\] where #z=y_{\text{hom}}# is a solution of the corresponding homogeneous differential equation
\[z' + p(t)\cdot z=0 \]
A solution of the homogeneous ODE is called a homogeneous solution of the original ODE. The solution #y_{\text{part}}# is often referred to as a particular solution of the original ODE.
The boundaries #c# and #d# of the interval may be equal to #-\infty# and #\infty#, respectively.
A consequence of the theorem is the fact that the general solution of a linear first-order ODE with the specified properties has a single free parameter (integration constant). In terms of linear algebra: the solution set is a #1#-dimensional affine space.
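For instance, the solutions of #y'+y=1# are exactly the functions #y=1+C\cdot \e^{-t}#: the particular solution #y_{\text{part}}=1# shifted by the homogeneous solutions #C\cdot \e^{-t}#, which form a #1#-dimensional vector space spanned by #\e^{-t}#.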
A very simple example is #y'=q(t)#, where #q# is a continuous function defined on an open interval #\ivoo{c}{d}# containing the origin. A particular solution is found by calculating an antiderivative of the right-hand side (it exists because #q# is continuous). The corresponding homogeneous equation #y'=0# has the general solution #y=C#. This can be seen directly by taking an antiderivative, but we can also deduce it from the theorem: let #y# be any solution of the homogeneous ODE #y'=0# and choose #\alpha=y(0)#. The theorem shows that there is exactly one function #u# with #u'=0# and #u(0)=\alpha#. Both #y# and the constant function #\alpha# meet these conditions, which forces #y=u=\alpha#. Hence every solution of the homogeneous ODE is a constant function #C#. The conclusion is that, if #Q(t)# is an antiderivative of #q(t)#, the general solution of the ODE #y'=q(t)# is
\[ y= Q(t)+C\]
Given an initial condition, the solution is indeed uniquely determined: if we specify #y(0)=\alpha#, then \(y(t) = Q(t)+\alpha-Q(0)\). This solution is defined on the entire interval #\ivoo{c}{d}#.
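As a concrete instance, take #q(t)=\cos(t)# on #\ivoo{-\infty}{\infty}#. An antiderivative is #Q(t)=\sin(t)#, so the general solution of #y'=\cos(t)# is #y=\sin(t)+C#, and the initial condition #y(0)=\alpha# singles out the unique solution #y(t)=\sin(t)+\alpha#.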
The fact that the general solution is a sum of a homogeneous and a particular solution is used to break the problem of finding all solutions down into two steps:
- Finding a single particular solution
- Finding all homogeneous solutions
The homogeneous solutions (of the second step) form a vector space: any linear combination of homogeneous solutions (with constant coefficients) is again a homogeneous solution. This, together with the fact that the homogeneous ODE is simpler, makes it easier to find solutions in the homogeneous case. All that remains after having found all homogeneous solutions is to find a single particular solution (the first step).
Consider the linear first-order ODE \[ y'-p\cdot y = q\] where #p# and #q# are nonzero constants. We know that the homogeneous equation #y'-p\cdot y=0# has solution #y_{\text{hom}}(t) = C\cdot \e^{p\cdot t}#, where #t# is the independent variable and #C# is a constant of integration.
A particular solution can be found in the form of a constant, say #a#. Substituting #y=a# into the ODE gives
\(0-p\cdot a = q\), so #y_{\text{part}} (t)=a= -\frac{q}{p}# is a particular solution. We conclude that the general solution is
\[y(t) = y_{\text{part}} (t)+y_{\text{hom}} (t) =-\frac{q}{p}+C\cdot \e^{p\cdot t}\]
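For example, the initial value problem \[y'-2\cdot y=4,\quad y(0)=1\] has #p=2# and #q=4#, so the general solution is #y(t)=-2+C\cdot \e^{2 t}#. Substituting #t=0# and #y=1# gives #1=-2+C#, so #C=3# and \[y(t)=-2+3\cdot \e^{2 t}\]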
Consider the initial value problem
\[y'=\sqrt[3]{y^{2}}\,\phantom{xxx}\text{ with }\phantom{xxx}y(0)=0\]
The general solution of the differential equation is
\[ y=0\lor y =\left(\frac{1}{3}t+C \right)^{{3}}\] To see this, we set the solution #y=0# apart; that is, we assume for the time being that #y\ne0#. We find
\[\begin{array}{rcl}
\displaystyle y^{-\frac{2}{3}}\,\frac{\dd y}{\dd t} &=& 1\\
\displaystyle{3}\, \frac {\dd \left(y^{\frac{1}{3}}\right)}{\dd t} &=& 1\\
\displaystyle\frac {\dd \left(y^{\frac{1}{3}}\right)}{\dd t} &=&\displaystyle\frac{1}{3} \\
\displaystyle y^{\frac{1}{3}} &=&\displaystyle\frac{1}{3}t+C \\
y &=&\displaystyle\left(\frac{1}{3}t+C \right)^{{3}} \\
\end{array}\]
Adding the initial condition #y(0)=0#, we see that
\[\text{both}\phantom{zzz} y=0\phantom{zzz}\text{and}\phantom{zzz}y=\left(\frac{1}{3}t \right)^{{3}}\]
are specific solutions. The conclusion is that the initial value problem has at least two solutions.
Now we glue the solution #y=0# to the solution #y =\left(\frac{1}{3}t+C \right)^{{3}}#. The gluing can be carried out at a point #\rv{a,0}# whose #t#-coordinate #a# satisfies
\[\eqs{\left(\frac{1}{3}a+C \right)^{{3}}&=&0\cr\left(\frac{1}{3}a+C \right)^{{2}}&=&0\cr}\]
The first equation makes the glued function continuous at #t=a#; the second makes it differentiable there, since #\left(\frac{1}{3}t+C\right)^{2}# is the derivative of #\left(\frac{1}{3}t+C\right)^{3}#. If we take #C=-\frac{a}{3}# with #a\ge0#, then #a# meets these requirements, the initial condition #y(0)=0# is satisfied, and we find the following solution #y_a# of the initial value problem
\[y_a(t) =\begin{cases}0&\text{ if }t\le a\\
\left(\frac{t-a}{3} \right)^{{3}} &\text{ if }t\gt a\\
\end{cases}\]
Therefore, there are infinitely many solutions: one for each #a\ge0#. This does not contradict the uniqueness statement for linear first-order ODEs above, since #y'=\sqrt[3]{y^{2}}# is not linear.
The largest open interval #\ivoo{c}{d}# around a given point #t_0# for which the conditions of the first statement are satisfied is called the interval of validity of the initial value problem with #y(t_0) = \alpha#.
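For example, for the initial value problem \[y'+\frac{1}{t}\cdot y=1,\quad y(1)=\alpha\] the coefficient #p(t)=\frac{1}{t}# is continuous on #\ivoo{0}{\infty}# but not defined at #t=0#, so the interval of validity is #\ivoo{0}{\infty}#.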
Below are some examples.
Solve the following initial value problem in which #y# is a function of #t#:
\[y'=y,\quad y(0)=3\]
\(y=3\cdot \e^{t}\)
The general solution of this differential equation of exponential growth equals \[y= C\cdot \e^t\] We use the condition \(y(0)=3\) by substituting #t=0# and #y=3# into the general solution. This gives the equation \(3=C\cdot \e^{0} \), so #C = 3#. Substituting this value of #C# into the general solution, we find the solution of the initial value problem:
\[ y = 3 \cdot \e ^ t \]
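Here is a similar example. Solve the following initial value problem in which #y# is a function of #t#:
\[y'+2\cdot y=6,\quad y(0)=0\]
A constant particular solution is #y_{\text{part}}=3#, and the corresponding homogeneous equation #y'+2\cdot y=0# has general solution #C\cdot \e^{-2 t}#, so the general solution is \[y=3+C\cdot \e^{-2 t}\] Substituting #t=0# and #y=0# gives #0=3+C#, so #C=-3# and the solution of the initial value problem is \[y=3-3\cdot \e^{-2 t}\]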