How do you convert the Cartesian coordinates (10,10) to polar coordinates?

1 Answer
Aug 6, 2015

Cartesian: #(10, 10)#

Polar: #(10sqrt2, pi/4)#

Explanation:

The problem can be pictured by plotting the point #(10, 10)# in the plane: #R# is the segment from the origin to the point, and #alpha# is the angle that segment makes with the horizontal axis.

In 2D space, a point is located by two coordinates:

The Cartesian coordinates are its horizontal and vertical positions #(x, y)#.

The polar coordinates are its distance from the origin and its inclination from the horizontal axis #(R, alpha)#.

The three vectors #vecx#, #vecy#, and #vecR# form a right triangle, so you can apply the Pythagorean theorem and the basic trigonometric ratios. Thus, you find:

#R=sqrt(x^2+y^2)#

#alpha=cos^(-1)(x/R)=sin^(-1)(y/R)#
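
The same conversion can be sketched in Python as a quick check (the function name `cartesian_to_polar` is just illustrative): `math.hypot` applies the Pythagorean theorem, and `math.atan2` returns the angle, correctly in every quadrant (the #cos^(-1)# and #sin^(-1)# forms above assume the first quadrant).

```python
import math

def cartesian_to_polar(x, y):
    # R = sqrt(x^2 + y^2): distance from the origin
    r = math.hypot(x, y)
    # alpha: angle from the positive x-axis, in radians;
    # atan2 keeps the correct sign in every quadrant
    alpha = math.atan2(y, x)
    return r, alpha
```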

In your case, that is:

#R=sqrt(10^2+10^2)=sqrt(100+100)=sqrt200=10sqrt2#

#alpha=sin^(-1)(10/(10sqrt2))=sin^(-1)(1/sqrt2)=45°=pi/4#
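
Running the sketch above on #(10, 10)# reproduces this result numerically:

```python
r, alpha = cartesian_to_polar(10, 10)
print(r)                  # 14.142135623730951
print(10 * math.sqrt(2))  # 14.142135623730951, i.e. 10sqrt2
print(alpha)              # 0.7853981633974483
print(math.pi / 4)        # 0.7853981633974483, i.e. pi/4
```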