Math 309B/C Winter 2012, Homework 3


Due February 1

Recall that if $z=a+ib$, then the real part of $z$ is $a$ and the imaginary part of $z$ is $b$. The real and imaginary parts are both real.

    1. Show that $\overline{e^{a+ib}}=e^{a-ib}$. (Hint: recall that $e^{a+ib}=e^{a}(\cos b+i\sin b)$; we will need this identity a lot.)
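      A sketch of how this goes, using the hinted identity and the fact that conjugation flips the sign of the imaginary part: \[\overline{e^{a+ib}}=\overline{e^{a}(\cos b+i\sin b)}=e^{a}(\cos b-i\sin b)=e^{a}(\cos(-b)+i\sin(-b))=e^{a-ib}.\]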

    2. Show that $z\overline{z}=|z|^2$. Recall that if $z=a+ib$, then $|z|=\sqrt{a^{2}+b^{2}}$.

    3. Show that $\overline{\overline{z}}=z$.
    4. Show that if $z=a+ib$ with $a\neq 0$ and $b\neq 0$, then $z$ and $\overline{z}$ are linearly independent with respect to the real numbers, i.e. the only real constants $c_1$ and $c_2$ with $c_1z+c_2\overline{z}=0$ are $c_1=c_2=0$. Hint: Write $z$ as $z=a+ib$ where $a$ and $b$ are real numbers. \[c_1z+c_2\overline{z} =c_1(a+ib)+c_2(a-ib)=(c_1+c_2)a+i(c_1-c_2)b=0.\] Since the real and imaginary parts must both equal zero, we have the system of equations \[ac_1+ac_2=0\] \[bc_1-bc_2=0,\] i.e. we're solving the matrix equation \[\left(\begin{array}{cc} a & a \\ b & -b \end{array}\right) \left(\begin{array}{c} c_1 \\ c_2 \end{array}\right)=0.\] So now we just have to show that any solution $\left(\begin{array}{c} c_1 \\ c_2 \end{array}\right)$ to this equation must be zero. Hint: look at the determinant.
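      For reference, the determinant here works out to \[\det\left(\begin{array}{cc} a & a \\ b & -b \end{array}\right)=a(-b)-ab=-2ab,\] which is nonzero exactly when $a\neq 0$ and $b\neq 0$; and if a square matrix $M$ has $\det M\neq 0$, the only solution of $Mc=0$ is $c=0$.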


  1. Let $A$ be an $n\times n$ matrix with real entries.
    1. Show that if $v$ is a complex eigenvector with eigenvalue $\lambda$, then $\overline{v}$ (i.e. the conjugate of the vector) is an eigenvector with eigenvalue $\overline{\lambda}$. (What this exercise says is that whenever you have a complex eigenvalue, its complex conjugate is also an eigenvalue. This can be helpful when trying to solve for eigenvalues, since if you find one complex one, you've automatically found a second one. In other words, complex eigenvalues and eigenvectors come in conjugate pairs.)
      Answer: Since $v$ is an eigenvector, $Av=\lambda v$. Conjugating both sides of the equation, we get $\overline{Av}=\overline{\lambda v}$. Now notice that $\overline{A}=A$ since $A$ is real, hence $\overline{Av}=A\bar{v}=\overline{\lambda v}=\bar{\lambda}\bar{v}$, but this is what it means for $\bar{v}$ to be an eigenvector with eigenvalue $\bar{\lambda}$.
    2. Show that if $v=\left(\begin{array}{c} a+ib \\ c+id\end{array}\right)$ is a complex eigenvector for $A$ (a $2\times 2$ matrix) with complex eigenvalue $\lambda$, then $ad-bc\neq 0$. Hint: If $v$ is an eigenvector with eigenvalue $\lambda$, then so is $(a-ib)\cdot v$. Compute what this vector is and note that if $ad-bc=0$, then $(a-ib)v$ is a real eigenvector with a complex eigenvalue, but this can't happen since $A$ is real.

      Answer: Note that \[(a-bi)\cdot v= \left(\begin{array}{c} a^{2}+ b^{2} \\ ac+bd+i(ad-bc)\end{array}\right).\] If $ad-bc=0$, then $(a-bi)\cdot v$ is real, and so $A((a-ib)v)$ is real (since $A$ is real), but $A((a-bi)\cdot v)=\lambda(a-bi)\cdot v$, which is complex (since $\lambda$ is complex), hence $ad-bc\neq 0$.
  2. Suppose $x=a+ib$ is now a vector, with $a$ and $b$ real vectors. Then $x$ and $\overline{x}$ are linearly independent with respect to the reals. (This follows from problem 1d; you don't need to prove it, but you will need to know this fact.)
  3. Show that if $x(t)$ solves $x'=Ax$, where $A$ is a real matrix, then

    1. $\overline{x}(t)$ also solves this equation.
      Answer: Note that if $x'=Ax$, then $\overline{x}'=\overline{x'}=\overline{Ax}=A\overline{x}$ since $A$ is real, so $\overline{x}$ is also a solution.

    2. Suppose $A$ is a $2\times 2$ matrix, and $x(t)=ve^{\lambda t}$ is a solution with $\lambda=a+ib$ complex (i.e. $b\neq 0$). Show that $x(t)$ and $\overline{x}(t)$ are linearly independent for all $t$. Hint: You just need to show that the Wronskian is nonzero somewhere. Let $v=\left(\begin{array}{c} \alpha+i\beta \\ \gamma + i\rho\end{array}\right)$ and compute $\det(v|\overline{v})$. By problem 2b, this will be nonzero at $t=0$. Why?
      Answer: We just need to show that the Wronskian is nonzero at $t=0$. Writing $x(0)=v=\left(\begin{array}{c} a+ib \\ c+id\end{array}\right)$, \[W(0)=\det(x(0)|\bar{x}(0))=\det \left(\begin{array}{cc} a+ib & c+id \\ a-ib & c-id \end{array}\right) = 2i(bc-ad),\] and recall that $v$ is an eigenvector, so by problem 2b, $bc-ad\neq 0$, so $W(0)\neq 0$.
Recall that, when we were solving $x'=Ax$, we found that functions of the form $x(t)=ve^{\lambda t}$, where $v$ is an eigenvector and $\lambda$ is its eigenvalue, are solutions. The same still holds if we get complex eigenvalues, so solving these systems is almost the same as before, but now (by the previous problem) we know that the solutions come in conjugate pairs, i.e., $\overline{v}e^{\overline{\lambda}t}$ is also a solution.
    1. Solve $x'=\left(\begin{array}{cc} 1 & 1 \\ -1 & 1 \end{array}\right)x$ for its general solution. You should get complex eigenvalues and complex solutions \[x^{(1)}=\left(\begin{array}{c} 1 \\ i \end{array}\right) e^{(1+i) t} \;\;\; \mbox{ and } \;\;\; x^{(2)}=\left(\begin{array}{c} 1 \\ -i \end{array}\right) e^{(1-i) t},\] or perhaps multiples of these.
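      (As a quick check that the eigenvector is right: \[Av=\left(\begin{array}{cc} 1 & 1 \\ -1 & 1 \end{array}\right)\left(\begin{array}{c} 1 \\ i \end{array}\right)=\left(\begin{array}{c} 1+i \\ -1+i \end{array}\right)=(1+i)\left(\begin{array}{c} 1 \\ i \end{array}\right),\] so $\left(\begin{array}{c} 1 \\ i \end{array}\right)$ really is an eigenvector with eigenvalue $1+i$.)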
    2. You solve these equations exactly the same way as in the previous section, only now these complex roots show up. Hint: By the previous problem, if you ever solve an equation of the form $x'=Ax$ and get a complex solution $x$, then $\overline{x}$ is another fundamental solution. Hence, if $A$ is $2\times 2$ and you find one fundamental solution, then it's a cinch to find the other one...just conjugate the first one! Notice, in particular, that the solutions in the previous problem are complex conjugates of each other.

      We could consider ourselves done in solving this problem, but in real-life applications we typically are dealing with real functions and real initial conditions, and should expect that our solutions are also real. Hence, it is important to know how to express our general solutions in terms of real fundamental solutions.

    3. With the solution $x^{(1)}$ from the previous problem, compute the real and imaginary parts of this vector function. You will then get two real-valued vector functions $y^{(1)}=\left(\begin{array}{c} \cos t \\ -\sin t\end{array}\right) e^{t}$ and $y^{(2)}=\left(\begin{array}{c} \sin t \\ \cos t\end{array}\right) e^{t}$.
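      A sketch of the computation: \[x^{(1)}=\left(\begin{array}{c} 1 \\ i \end{array}\right)e^{t}(\cos t+i\sin t)=e^{t}\left(\begin{array}{c} \cos t+i\sin t \\ -\sin t+i\cos t \end{array}\right)=\underbrace{e^{t}\left(\begin{array}{c} \cos t \\ -\sin t \end{array}\right)}_{y^{(1)}}+i\underbrace{e^{t}\left(\begin{array}{c} \sin t \\ \cos t \end{array}\right)}_{y^{(2)}}.\]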

    4. Show that $y^{(1)}$ and $y^{(2)}$ are a fundamental set of solutions. Hint: Compute the Wronskian of the two vectors for some value of $t$, say $t=0$.
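      For example, at $t=0$ we have $y^{(1)}(0)=\left(\begin{array}{c} 1 \\ 0 \end{array}\right)$ and $y^{(2)}(0)=\left(\begin{array}{c} 0 \\ 1 \end{array}\right)$, so \[W(0)=\det\left(\begin{array}{cc} 1 & 0 \\ 0 & 1 \end{array}\right)=1\neq 0.\]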


  1. So when dealing with $x'=Ax$ where $A$ is a matrix with two complex eigenvalues $a\pm ib$, to solve for the real-valued general solutions, do the following:

    1. Solve for the eigenvalues $a\pm ib$ and corresponding eigenvectors $v_{1}$ and $v_{2}=\overline{v_{1}}$. The fundamental solutions will look (just as before) like \[x^{(1)}= v_1 e^{(a+ib)t} \;\;\; \mbox{ and } \;\;\; x^{(2)}= v_2 e^{(a-ib)t}.\]

    2. Solve for the real and imaginary parts of one of these fundamental solutions, say $x^{(1)}$, by writing $v_1=u+iw$ (where $u$ and $w$ are real vectors). If you substitute this in the above expression, write $e^{(a+ib)t}=e^{at}(\cos bt+i\sin bt)$ and expand, you should get \[x^{(1)}=e^{at}(u\cos bt-w\sin bt)+ie^{at}(u\sin bt + w \cos bt).\] Then your two real fundamental solutions are the real and imaginary parts of this expression: \[y^{(1)}=e^{at}(u\cos bt-w\sin bt), \;\;\; y^{(2)}=e^{at}(u\sin bt + w \cos bt).\]

    3. Note that if the real part is positive (i.e. $a>0$), then the solution will grow to infinity, and if the real part is negative, then it will decay to zero. In either case, we call the origin a spiral point.
    Try the following problems to get some practice. Solve for a set of real fundamental solutions and write the general form of a real solution (and for fun (i.e. not necessary for the quiz), try plotting some of your solutions):
  2. $x'=\left(\begin{array}{cc} 1 & -3 \\ 3 & 1 \end{array}\right)x$. (To see why we call these solutions spirals, try plotting a solution to this system.)

    Answer: This has eigenvalues $\lambda=1\pm 3i$. To find the eigenvectors, note that \[(A-(1+3i)I)x = \left(\begin{array}{cc} -3i & -3 \\ 3 & -3i \end{array}\right)\left(\begin{array}{c} x_{1} \\ x_{2} \end{array}\right).\] Note that if $x$ is an eigenvector, then so is any constant multiple. Hence, we can actually pick $x_1$ to be whatever we want, so let's pick it to be something so that, when we compute the resulting vector above, there is no $i$, so let $x_1=i$. Then $x_2=1$ by setting the above equation equal to zero and solving for $x_2$. That is the first eigenvector, and by one of our earlier problems, the eigenvector for $1-3i$ is just the conjugate of this one. Hence, our eigenvectors are $\left(\begin{array}{c} \pm i \\ 1 \end{array}\right)$. So one of your fundamental solutions is \[x^{(1)}(t)= \left(\begin{array}{c} i \\ 1 \end{array}\right) e^{(1+3i)t}.\] Let's compute the real and imaginary parts of this: \[x^{(1)}(t) = \left[ \left(\begin{array}{c} 0 \\ 1 \end{array}\right) + i \left(\begin{array}{c} 1 \\ 0 \end{array}\right) \right] e^{t}(\cos 3t+i\sin 3t) \] \[= \underbrace{e^{t}\left[ \left( \begin{array}{c} 0 \\ 1 \end{array}\right)\cos 3t - \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\sin 3t\right]}_{y^{(1)}} +i\underbrace{e^{t}\left[\left( \begin{array}{c} 0 \\ 1 \end{array}\right)\sin 3t + \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\cos 3t\right]}_{y^{(2)}},\] and $y^{(1)}$ and $y^{(2)}$ are your real fundamental solutions, and the general form is \[ c_{1} e^{t}\left[ \left(\begin{array}{c} 0 \\ 1 \end{array}\right)\cos 3t - \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\sin 3t\right] + c_{2} e^{t}\left[ \left(\begin{array}{c} 0 \\ 1 \end{array}\right)\sin 3t + \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\cos 3t\right].\]

  3. $x'=\left(\begin{array}{cc} 3 & -2 \\ 4 & -1 \end{array}\right)x$. \[x= c_{1} e^{t} \left(\begin{array}{c} \cos 2t \\ \cos 2t+\sin 2t \end{array}\right) +c_{2}e^{t}\left(\begin{array}{c} \sin 2t \\ -\cos 2t +\sin 2t\end{array}\right).\]
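    A sketch of where this comes from: the characteristic equation is \[\det\left(\begin{array}{cc} 3-\lambda & -2 \\ 4 & -1-\lambda \end{array}\right)=\lambda^{2}-2\lambda+5=0,\] so $\lambda=1\pm 2i$, and for $\lambda=1+2i$ you can take the eigenvector $\left(\begin{array}{c} 1 \\ 1-i \end{array}\right)$; taking real and imaginary parts of $\left(\begin{array}{c} 1 \\ 1-i \end{array}\right)e^{(1+2i)t}$ as before gives the two solutions above.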

  4. $x'=\left(\begin{array}{cc} -1 & -4 \\ 1 & -1 \end{array}\right)x$.
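    (As a check, not a full solution: here the characteristic equation is \[\det\left(\begin{array}{cc} -1-\lambda & -4 \\ 1 & -1-\lambda \end{array}\right)=(\lambda+1)^{2}+4=0,\] so you should find $\lambda=-1\pm 2i$, and your real fundamental solutions should carry a decaying factor of $e^{-t}$.)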

  5. $x'=\left(\begin{array}{cc} 2 & -5 \\ 1 & -2 \end{array}\right)x$
    Answer: The eigenvalues are $\lambda =\pm i$. To solve for the eigenvectors, let $\lambda =i$, and solve \[0=(A-iI)x=\left(\begin{array}{cc} 2-i & -5 \\ 1 & -2-i \end{array}\right)\left(\begin{array}{c} x_1 \\ x_2\end{array}\right).\] Again, let us pick $x_1$ so that when it gets multiplied by the top left entry in the matrix (i.e. $2-i$), it becomes real, so let $x_1=2+i$ (i.e. the conjugate of $2-i$). Then the above equals \[\left(\begin{array}{c} 5-5x_2 \\ 2+i +(-2-i)x_2\end{array}\right),\] and since this all equals zero, we should pick $x_2=1$, and so our eigenvector for $\lambda =i$ is $\left(\begin{array}{c} 2+i \\ 1 \end{array}\right)$, and the eigenvector for $-i$ is just the conjugate of this. Our first fundamental solution is \[x^{(1)}(t)= \left(\begin{array}{c} 2+i \\ 1 \end{array}\right) e^{it}.\] Let's compute the real and imaginary parts of this: \[x^{(1)}(t) = \left[ \left(\begin{array}{c} 2 \\ 1 \end{array}\right) + i \left(\begin{array}{c} 1 \\ 0 \end{array}\right) \right] (\cos t+i\sin t) \] \[= \underbrace{\left[ \left( \begin{array}{c} 2 \\ 1 \end{array}\right)\cos t - \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\sin t\right]}_{y^{(1)}} +i\underbrace{\left[\left( \begin{array}{c} 2 \\ 1 \end{array}\right)\sin t + \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\cos t\right]}_{y^{(2)}},\] and $y^{(1)}$ and $y^{(2)}$ are your real fundamental solutions, and the general form is \[ c_{1} \left[ \left(\begin{array}{c} 2 \\ 1 \end{array}\right)\cos t - \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\sin t\right] + c_{2} \left[ \left(\begin{array}{c} 2 \\ 1 \end{array}\right)\sin t + \left(\begin{array}{c} 1 \\ 0 \end{array}\right)\cos t\right].\]

  6. $x'=\left(\begin{array}{cc} 3 & -2 \\ 4 & -1 \end{array}\right)x$
  7. This problem deals with a system $x'=Ax$ where the eigenvalues of $A$ have no real part, i.e. they are $\pm ib$ for some real number $b$. The solutions to such an equation just travel around in closed orbits (circles, in the example below): they won't diverge to infinity or converge to the origin, but will orbit $0$. In such a case, we call the origin a center. Solutions are stable in the sense that they don't diverge to infinity, but they are not asymptotically stable because they never converge to any vector.

    1. Solve $x'=\left(\begin{array}{cc} 0 & 1 \\ -1 & 0 \end{array}\right)x$ for its fundamental solutions.

    2. Convert the fundamental solutions into real ones by computing the real and imaginary parts of one of the fundamental solutions. In the end, you should get fundamental solutions of the form \[y^{(1)}(t)=\left(\begin{array}{c} \cos t \\ -\sin t\end{array}\right), \;\;\; y^{(2)}(t)=\left(\begin{array}{c} \sin t \\ \cos t\end{array}\right).\] If you didn't get these exactly, your solution is still correct if your fundamental solutions were multiples of these ones. Note that there is no exponential term, so any linear combination of these guys (i.e. a general solution) won't diverge to infinity or converge to the origin.
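      As a quick check that these really are solutions: \[\frac{d}{dt}\left(\begin{array}{c} \cos t \\ -\sin t \end{array}\right)=\left(\begin{array}{c} -\sin t \\ -\cos t \end{array}\right)=\left(\begin{array}{cc} 0 & 1 \\ -1 & 0 \end{array}\right)\left(\begin{array}{c} \cos t \\ -\sin t \end{array}\right),\] and similarly for $y^{(2)}$.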

    3. In this problem, you'll prove that any general solution will just travel around in a circle. Show that if \[x(t)= c_{1} \left(\begin{array}{c} \cos t \\ -\sin t\end{array}\right)+ c_2 \left(\begin{array}{c} \sin t \\ \cos t\end{array}\right),\] then \[x(t)=c\left(\begin{array}{c} \cos(t+\theta) \\ -\sin(t+\theta)\end{array}\right),\] where $c=\sqrt{c_1^2+c_2^2}$ and $\theta$ is chosen so that \[\cos\theta = \frac{c_1}{c}, \;\;\; \sin\theta = -\frac{c_2}{c}\] (such a $\theta$ exists since $\left(\frac{c_1}{c}\right)^2+\left(\frac{c_2}{c}\right)^2=1$). Hint: Start from the right side of the equation and expand using the angle-addition formulas for sine and cosine.
      Answer: Start by expanding what we claim the solution is, using $\cos\theta=\frac{c_1}{c}$ and $\sin\theta=-\frac{c_2}{c}$: \begin{align*} x(t) & = c\left(\begin{array}{c} \cos(t+\theta) \\ -\sin(t+\theta)\end{array}\right) =c\left(\begin{array}{c} \cos t \cos \theta -\sin t\sin \theta \\ -\sin t\cos \theta -\sin\theta\cos t \end{array}\right) \\ & =c\left(\begin{array}{c} \cos t \frac{c_{1}}{c} +\sin t\frac{c_{2}}{c} \\ -\sin t\frac{c_{1}}{c}+\frac{c_{2}}{c}\cos t \end{array}\right)\\ & =\left(\begin{array}{c} c_1\cos t +c_2\sin t \\ -c_1\sin t+c_2\cos t \end{array}\right) \\ & =c_{1} \left(\begin{array}{c} \cos t \\ -\sin t\end{array}\right)+ c_2 \left(\begin{array}{c} \sin t \\ \cos t\end{array}\right).\end{align*}

    4. Show that $x(t)$ travels in a circle around the origin. What's its radius? Hint: If you're moving around in a circle around the origin, then your distance to zero is constant (imagine someone swinging $x(t)$ around the origin with a rope; then it orbits the point in a circle of radius equal to the length of the rope). Hence, you only need to show that the norm of $x(t)$ is constant (and its norm will be the radius of the circle).
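      A sketch of the computation, using the form from the previous part: \[\|x(t)\|^{2}=c^{2}\cos^{2}(t+\theta)+c^{2}\sin^{2}(t+\theta)=c^{2},\] so $\|x(t)\|=c=\sqrt{c_1^2+c_2^2}$ for all $t$, i.e. $x(t)$ stays on the circle of radius $c$ centered at the origin.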


Suppose now that you have a $3\times 3$ real matrix $A$ that has a complex eigenvalue $\lambda$ with eigenvector $v$. Again, we'd like to express the general solution for this equation in terms of real fundamental solutions.
By problem 2, $\overline{\lambda}$ is also an eigenvalue with eigenvector $\overline{v}$. Then the third eigenvalue $\mu$ must be real (otherwise, $\overline{\mu}$ would also be an eigenvalue, but then we'd have 4 eigenvalues, which is impossible for a $3\times 3$ matrix) and has a real eigenvector $u$. Then the fundamental solutions are $ue^{\mu t}, ve^{\lambda t},$ and $\overline{v}e^{\overline{\lambda}t}$. The general solution will (as always) look like \[x(t)=c_1 ue^{\mu t}+c_2 ve^{\lambda t}+c_3 \overline{v}e^{\overline{\lambda}t}.\] Again, we'd like to have a general solution in terms of real fundamental solutions. While the first term in our general solution (the one with the $\mu$) is fine, since everything there is real, we just have to replace the other two vectors with real functions. You do the same trick as before: decompose one of the complex fundamental solutions (say $ve^{\lambda t}$) into its real and imaginary parts to get two new fundamental solutions $y^{(1)}$ and $y^{(2)}$. Hence, a general solution with real fundamental solutions will have the form \[c_1 ue^{\mu t}+c_2 y^{(1)}+c_3 y^{(2)}.\] Express the general solutions to the following systems in terms of real fundamental solutions:
  1. $x'=\left(\begin{array}{ccc} 1 & 0 & 0 \\ 2 & 1 & -2 \\ 3 & 2 & 1\end{array}\right)x$.
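    (As a check: expanding $\det(A-\lambda I)$ along the first row gives \[(1-\lambda)\left[(1-\lambda)^{2}+4\right]=0,\] so you should find the eigenvalues $\lambda=1$ and $\lambda=1\pm 2i$.)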
  2. $x'=\left(\begin{array}{ccc} -3 & 0 & 2 \\ 1 & -1 & 0 \\ -2 & -1 & 0\end{array}\right)x$.
  3. The following problems deal with repeated eigenvalues. Solve for the general solutions of each system, and solve the initial value problem if there is an initial condition.
    1. $x'=\left(\begin{array}{cc} 3 & -4 \\ 1 & -1 \end{array}\right)x$
    2. $x'=\left(\begin{array}{cc} 4 & -2 \\ 8 & -4 \end{array}\right)x$
    3. $x'=\left(\begin{array}{ccc} 1 & 1 & 1 \\ 2 & 1 & -1 \\ 0 & -1 & 1 \end{array}\right)x$
    4. $x'=\left(\begin{array}{ccc} 0 & 1 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 0 \end{array}\right)x$
    5. $x'=\left(\begin{array}{cc} 1 & -4 \\ 4 & -7 \end{array}\right)x$, $x(0)=\left(\begin{array}{c} 3 \\ 2 \end{array}\right)$
      Answer: This has eigenvalue $\lambda =-3$ with multiplicity 2 and eigenvector $v=\left(\begin{array}{c} 1 \\ 1 \end{array}\right)$. One fundamental solution is \[x^{(1)}=ve^{\lambda t}=\left(\begin{array}{c} 1 \\ 1 \end{array}\right)e^{-3t}.\] To solve for the other one, recall that \[x^{(2)}=ue^{-3t}+vte^{-3t}\] where $v$ is our eigenvector and $u$ satisfies \[(A-\lambda I)u=v,\] i.e. \[(A-(-3)I)u=\left(\begin{array}{c} 1 \\ 1 \end{array}\right).\] Solving this equation gives $u=\left(\begin{array}{c} 1 \\ 1 \end{array}\right)k+\left(\begin{array}{c} 1/4 \\ 0 \end{array}\right)$ for some arbitrary constant $k$. Since we just want one solution, just pick $k=0$, so $u=\left(\begin{array}{c} 1/4 \\ 0 \end{array}\right)$, and our general solution is now \[x(t)= c_{1} x^{(1)}+c_{2}x^{(2)}= c_{1} \left(\begin{array}{c} 1 \\ 1 \end{array}\right)e^{-3t}+ c_{2} \left( \left(\begin{array}{c} 1/4 \\ 0 \end{array}\right)e^{-3t}+ \left(\begin{array}{c} 1 \\ 1 \end{array}\right)te^{-3t}\right).\] To solve the initial value problem, we set \[\left(\begin{array}{c} 3 \\ 2 \end{array}\right)= x(0)=c_{1} \left(\begin{array}{c} 1 \\ 1 \end{array}\right)+ c_{2} \left(\begin{array}{c} 1/4 \\ 0 \end{array}\right)= \left(\begin{array}{cc} 1 & 1/4 \\ 1 & 0 \end{array}\right)\left(\begin{array}{c} c_{1} \\ c_{2}\end{array}\right),\] and now we solve this equation (which is just linear algebra at this point).
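      Carrying out that last step: the second row gives $c_{1}=2$, and then the first row gives $2+\frac{1}{4}c_{2}=3$, i.e. $c_{2}=4$, so \[x(t)=2\left(\begin{array}{c} 1 \\ 1 \end{array}\right)e^{-3t}+4\left(\left(\begin{array}{c} 1/4 \\ 0 \end{array}\right)e^{-3t}+\left(\begin{array}{c} 1 \\ 1 \end{array}\right)te^{-3t}\right).\]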
    6. $x'=\left(\begin{array}{cc} 2 & 3/2 \\ -3/2 & -1 \end{array}\right)x$, $x(0)=\left(\begin{array}{c} 3 \\ -2 \end{array}\right)$
    7. $x'=\left(\begin{array}{cc} 3 & 9 \\ -1 & -3 \end{array}\right)x$, $x(0)=\left(\begin{array}{c} 2 \\ 4 \end{array}\right)$
  4. Consider the equation $x'=\left(\begin{array}{ccc} 1 & 1 & 1 \\ 2 & 1 & -1 \\ -3 & 2 & 4 \end{array}\right)x$. Compute all eigenvalues and eigenvectors (there should be only one of each; call them $\lambda$ and $v$ respectively). Here's how to find some fundamental solutions $x^{(1)},x^{(2)},x^{(3)}$:
    1. The first one is just $x^{(1)}=ve^{\lambda t}$ as usual.
    2. Let $x^{(2)}=v_1 e^{\lambda t}+v_2 t e^{\lambda t}$, solve for $v_1$ and $v_2$.
    3. Let $x^{(3)}=u_1 e^{\lambda t} + u_2 t e^{\lambda t}+ u_3 \frac{t^{2}}{2} e^{\lambda t}$, solve for $u_1, u_2,$ and $u_3$.
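      To see where these equations come from, substitute into $x'=Ax$ and match coefficients. For $x^{(2)}$, matching the coefficients of $e^{\lambda t}$ and $te^{\lambda t}$ gives \[(A-\lambda I)v_{2}=0, \;\;\; (A-\lambda I)v_{1}=v_{2},\] so $v_2$ is just the eigenvector $v$ and $v_1$ is a generalized eigenvector. Doing the same for $x^{(3)}$ (matching coefficients of $e^{\lambda t}$, $te^{\lambda t}$, and $\frac{t^{2}}{2}e^{\lambda t}$) gives \[(A-\lambda I)u_{3}=0, \;\;\; (A-\lambda I)u_{2}=u_{3}, \;\;\; (A-\lambda I)u_{1}=u_{2}.\]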
    For the following systems, find the fundamental matrix $\Phi$ (i.e. the matrix satisfying $\Phi'=A\Phi$) such that $\Phi(0)=I$, and then solve the initial value problems $x(0)=e_{1}$ and $x(0)=e_{2}$ (where $e_{1}$ and $e_{2}$ are standard basis vectors):
    1. $x'=\left(\begin{array}{cc} 3 & -2 \\ 2 & -2 \end{array}\right)x$
    2. $x'=\left(\begin{array}{cc} 2 & -1 \\ 3 & -2 \end{array}\right)x$
    3. $x'=\left(\begin{array}{cc} 2 & -5 \\ 1 & -2 \end{array}\right)x$ (Note that you'll get complex eigenvalues, first find some real fundamental solutions to make your fundamental matrix.)
  5. Show that if $\Phi(t)$ satisfies $\Phi'=A\Phi$ and $\Phi(0)=I$, then $\Phi(t)\Phi(s)=\Phi(t+s)$. (Hint: Fix $s$. Show that, for any vector $v$, the functions $x(t)=\Phi(t)\Phi(s)v$ and $y(t)=\Phi(t+s)v$ satisfy the same initial value problem $x'=Ax$ and $x(0)=\Phi(s)v$. Now apply uniqueness, and recall that two matrices $M$ and $N$ are equal if and only if $Mv=Nv$ for all vectors $v$.)
    Answer: Fix $s$ (that is, suppose it's some number that's not varying; this means that when we differentiate, we differentiate in $t$ only, not in $s$). Note that \[x(t)=\Phi(t)\Phi(s), \;\;\; y(t)= \Phi(t+s)\] both solve the initial value problem $X'=AX$ with initial data $X(0)=\Phi(s)$. Hence, by the uniqueness theorem, they are the same function.