How do you graph polar coordinates?

1 Answer
Nov 6, 2014

To establish polar coordinates on a plane, we choose a point O (the origin of coordinates, or pole) and a ray OX from this point in some direction (the polar axis, usually drawn horizontally).

Then the position of every point A on the plane can be defined by two polar coordinates: a polar angle φ, measured counterclockwise from the polar axis to the ray connecting the origin with point A (the angle XOA, usually in radians), and the length ρ of the segment OA.
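
To connect this with the familiar Cartesian picture, a point with polar coordinates (ρ, φ) lies at x = ρ·cos(φ), y = ρ·sin(φ). Here is a minimal Python sketch of that conversion (the function name polar_to_cartesian is just for illustration):

```python
import math

def polar_to_cartesian(rho, phi):
    # Standard conversion: x = rho*cos(phi), y = rho*sin(phi)
    return rho * math.cos(phi), rho * math.sin(phi)

# The point at angle pi/4 and distance 2 from the pole
x, y = polar_to_cartesian(2, math.pi / 4)
print(x, y)  # both approximately 1.414
```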

To graph a function in polar coordinates, we need its definition in polar coordinates.
Consider, for example, a function defined by the formula
ρ=φ for all φ≥0.

The function defined by this equality has a graph that starts at the origin of coordinates O because, if φ=0, then ρ=0.
Then, as the polar angle φ increases, the distance ρ from the origin increases as well. This simultaneous growth of the angle and the distance produces the graph of a spiral.

After the first full turn (φ=2π), the point on the graph crosses the polar axis at a distance of 2π from the origin. After the second full turn (φ=4π), it crosses the polar axis at a distance of 4π, and so on.
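
As a rough sketch (assuming matplotlib is available), the spiral ρ=φ can be drawn directly with matplotlib's polar projection; the crossings of the polar axis at 2π and 4π can be read off the radial grid:

```python
import numpy as np
import matplotlib.pyplot as plt

# Sample the polar angle over two full turns (0 to 4*pi)
phi = np.linspace(0, 4 * np.pi, 500)
rho = phi  # the spiral rho = phi

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(phi, rho)

# After one full turn the curve crosses the polar axis at rho = 2*pi,
# after two full turns at rho = 4*pi
ax.set_title(r"$\rho = \varphi$")
plt.show()
```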