# Suppose #s(x)# and #c(x)# are two functions where: 1) #s'(x) = c(x)# and #c'(x) = -s(x)#; 2) #s(0) = 0# and #c(0) = 1#. What can you say about the quantity #\qquad [s(x)]^2 + [c(x)]^2# ?

##### 3 Answers

#### Explanation:

We have

# d/dx{[s(x)]^2 + [c(x)]^2} = 2s(x)s'(x) + 2c(x)c'(x) #

# \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ = 2s(x)c(x) - 2c(x)s(x) = 0 #

so the quantity is constant; now putting in the initial conditions #s(0)=0# and #c(0)=1#:

# [s(0)]^2 + [c(0)]^2 = 0^2 + 1^2 = 1 #

and finally

# [ s(x) ]^2 + [ c(x) ]^2 =1 #
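The constancy argument can be spot-checked numerically. Below is a minimal sketch in Python, assuming nothing beyond the system itself: it integrates #s' = c#, #c' = -s# with a classic RK4 stepper (the helper names, step size `h`, and interval are arbitrary choices) and tracks how far #s^2 + c^2# drifts from 1.

```python
# Numerically integrate s' = c, c' = -s with classic RK4 and
# check that s(x)^2 + c(x)^2 stays equal to its initial value 1.

def rk4_step(s, c, h):
    """One RK4 step for the system s' = c, c' = -s."""
    def f(s, c):
        return c, -s
    k1s, k1c = f(s, c)
    k2s, k2c = f(s + h/2*k1s, c + h/2*k1c)
    k3s, k3c = f(s + h/2*k2s, c + h/2*k2c)
    k4s, k4c = f(s + h*k3s, c + h*k3c)
    return (s + h/6*(k1s + 2*k2s + 2*k3s + k4s),
            c + h/6*(k1c + 2*k2c + 2*k3c + k4c))

def invariant_drift(x_max=10.0, h=0.01):
    """Return the largest deviation of s^2 + c^2 from 1 on [0, x_max]."""
    s, c = 0.0, 1.0          # initial conditions s(0) = 0, c(0) = 1
    worst = 0.0
    x = 0.0
    while x < x_max:
        s, c = rk4_step(s, c, h)
        x += h
        worst = max(worst, abs(s*s + c*c - 1.0))
    return worst

print(invariant_drift())  # stays tiny: the quantity is conserved
```

The drift never exceeds integrator round-off, which is exactly what a conserved quantity should look like.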

#### Explanation:

We are given that:

# s'(x) = c(x) \ \ \ \ \ \ \ \ \ \ \ # ..... [A]

# c'(x) = -s(x) \ \ \ \ \ \ # ..... [B]

Differentiating the second equation [B] wrt #x#:

#c''(x) = -s'(x) #

And then incorporating the first equation [A]:

# c''(x) = -c(x) #

Or:

# c''(x) + c(x) = 0 #

Which is a Second Order ODE with constant coefficients, so we consider the associated Auxiliary equation:

# m^2 + 1 = 0 => m= +- i #

So as we have two pure imaginary roots, the solution is of the form:

# c(x) = Acosx + Bsinx #

And then using [B] we have:

# s(x) = -c'(x) #

# :. s(x) = -{-Asinx+Bcosx} #

# \ \ \ \ \ \ \ \ \ \ \ = Asinx-Bcosx #

Using the given conditions #c(0)=1# and #s(0)=0#:

# A + 0 = 1 => A = 1#

# 0-B = 0 => B=0 #

Thus we have:

# c(x) = cosx #

# s(x) = sinx #

And so we infer that:

# [ s(x) ]^2 + [ c(x) ]^2 = sin^2x + cos^2x = 1 #
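The closed forms #c(x)=cosx#, #s(x)=sinx# can be spot-checked against the original system. A sketch (the step `h`, sample points, and helper name `check_solution` are arbitrary): approximate #s'(x)# and #c'(x)# with central finite differences and confirm they match #c(x)# and #-s(x)#.

```python
import math

def check_solution(x, h=1e-5):
    """Check that s = sin, c = cos satisfy s' = c and c' = -s at x,
    using central finite differences for the derivatives."""
    ds = (math.sin(x + h) - math.sin(x - h)) / (2 * h)   # approximates s'(x)
    dc = (math.cos(x + h) - math.cos(x - h)) / (2 * h)   # approximates c'(x)
    return abs(ds - math.cos(x)), abs(dc + math.sin(x))

# initial conditions and the invariant
assert math.sin(0.0) == 0.0 and math.cos(0.0) == 1.0
for x in (0.0, 0.7, 2.0, -3.1):
    e1, e2 = check_solution(x)
    assert e1 < 1e-9 and e2 < 1e-9                        # s' = c, c' = -s
    assert abs(math.sin(x)**2 + math.cos(x)**2 - 1.0) < 1e-12
```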

If you're going to submit an inefficient answer, it may as well be interesting !

#### Explanation:

We have:

# ( (s'(x)), (c'(x)) ) = ( (0,1), (-1,0) ) ( (s(x)), (c(x)) ) #

or, writing # mathbf s = ( (s(x)), (c(x)) ) # and # M = ( (0,1), (-1,0) ) #:

# mathbf s' = M mathbf s, \ \ \ \ mathbf s_o = ( (0), (1) ) #

That solves trivially as:

#mathbf s = e^(x M) mathbf s_o#

Now, "what can we say":

# [ s(x) ]^2 + [ c(x) ]^2 = mathbf s^T mathbf s = mathbf s_o^T e^(x M^T) e^(x M) mathbf s_o #

Looking at the matrices:

Anti-symmetry, # M^T = -M #, ....and Commutation, so that # e^(x M^T) e^(x M) = e^(x(M^T + M)) = e^(mathbf 0) = I #

So:

# [ s(x) ]^2 + [ c(x) ]^2 = mathbf s_o^T mathbf s_o = 0^2 + 1^2 = 1 #
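The matrix-exponential route can also be sketched in a few lines of pure Python (the helper names `mat_mul`/`mat_exp` and the series truncation at 30 terms are choices of this sketch, not part of the answer): build #e^(xM)# from the power series and confirm that applying it to #mathbf s_o# preserves #mathbf s^T mathbf s#.

```python
import math

def mat_mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_exp(A, terms=30):
    """exp(A) for a 2x2 matrix via the truncated power series sum A^n / n!."""
    result = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at the identity
    term = [[1.0, 0.0], [0.0, 1.0]]     # running term A^n / n!
    for n in range(1, terms):
        term = mat_mul(term, [[a / n for a in row] for row in A])
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

x = 1.3
M = [[0.0, 1.0], [-1.0, 0.0]]                       # the antisymmetric system matrix
E = mat_exp([[x * a for a in row] for row in M])    # e^(xM)

# e^(xM) applied to s_o = (0, 1)^T picks out the second column of E,
# giving (s(x), c(x)) = (sin x, cos x)
s, c = E[0][1], E[1][1]
print(round(s*s + c*c, 12))   # → 1.0, the invariant
```

Since #M# is antisymmetric, #e^(xM)# is a rotation matrix, so it preserves lengths; that is the whole answer in one sentence.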