# How do you graph polar coordinates?

Nov 6, 2014

To establish polar coordinates on a plane, we choose a point $O$, called the pole (the origin of coordinates), and a ray $O X$ from this point in some direction, called the polar axis (usually drawn horizontally).

Then the position of any point $A$ on the plane can be defined by two polar coordinates: the polar angle $\varphi$, measured counterclockwise from the polar axis to the ray connecting the pole with our point $A$ (the angle $\angle X O A$, usually measured in radians), and the distance $\rho$, the length of the segment $O A$.
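These definitions translate directly into code. The sketch below (a minimal illustration; the function name is my own) converts polar coordinates $(\rho, \varphi)$ to the familiar Cartesian $(x, y)$ via $x = \rho \cos \varphi$ and $y = \rho \sin \varphi$, with the pole at the origin and the polar axis along the positive $x$-axis:

```python
import math

def polar_to_cartesian(rho, phi):
    """Convert polar coordinates (distance rho, angle phi in radians)
    to Cartesian (x, y), with the pole at the origin and the polar
    axis along the positive x-axis."""
    return rho * math.cos(phi), rho * math.sin(phi)

# A point at distance 2 along the polar axis (phi = 0):
print(polar_to_cartesian(2, 0))            # (2.0, 0.0)
# The same distance, a quarter turn counterclockwise: x ≈ 0, y = 2
print(polar_to_cartesian(2, math.pi / 2))
```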

To graph a function in polar coordinates, we first need its definition in polar coordinates.
Consider, for example, the function defined by the formula
$\rho = \varphi$ for all $\varphi \ge 0$.

The graph of this function starts at the pole $O$ because, if $\varphi = 0$, then $\rho = 0$.
As the polar angle $\varphi$ increases, the distance $\rho$ from the origin increases as well. This gradual increase in both the polar angle and the distance from the origin produces a spiral (known as an Archimedean spiral).
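As a quick numerical sketch (the sampling step is my own choice), we can tabulate points of the spiral $\rho = \varphi$ and confirm that the distance from the pole grows with the angle:

```python
import math

# Sample the spiral rho = phi over two full turns, every quarter turn.
points = []
for k in range(9):               # phi = 0, pi/2, pi, ..., 4*pi
    phi = k * math.pi / 2
    rho = phi                    # the defining equation of the spiral
    x, y = rho * math.cos(phi), rho * math.sin(phi)
    points.append((phi, rho, x, y))

for phi, rho, x, y in points:
    print(f"phi = {phi:6.3f}  rho = {rho:6.3f}  (x, y) = ({x:7.3f}, {y:7.3f})")

# The distance from the origin equals rho and increases with each step.
radii = [math.hypot(x, y) for _, _, x, y in points]
assert all(radii[i] < radii[i + 1] for i in range(len(radii) - 1))
```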

After the first full turn, the graph crosses the polar axis at a distance of $2 \pi$ from the pole. After the second full turn, it crosses the polar axis at a distance of $4 \pi$, and so on.
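We can check these crossing points numerically (a small sketch using the same conversion formulas as above): after $n$ full turns, $\varphi = 2 \pi n$, so the spiral $\rho = \varphi$ meets the polar axis at distance $2 \pi n$:

```python
import math

# After n full turns, phi = 2*pi*n, so rho = 2*pi*n for the spiral rho = phi.
for n in (1, 2, 3):
    phi = 2 * math.pi * n
    x, y = phi * math.cos(phi), phi * math.sin(phi)
    # The point lies (up to floating-point error) on the polar axis,
    # at distance 2*pi*n from the pole.
    assert abs(y) < 1e-9
    assert abs(x - 2 * math.pi * n) < 1e-9
    print(f"turn {n}: crosses polar axis at distance {x:.5f}")
```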