How did Florida become a territory of the United States in 1821?

1 Answer

Spain ceded it to the US under the Adams-Onís Treaty, signed in 1819 and ratified in 1821, after years of military pressure.

Explanation:

Spain regained Florida from Britain in 1783, but it stationed few soldiers there and had no money to support the Floridian government. American forces repeatedly pushed into the territory, most notably Andrew Jackson's raids during the First Seminole War (1817-1818). Once Spain realized it could not stop the United States, it ceded Florida under the Adams-Onís Treaty of 1819 (ratified in 1821); in exchange, the United States assumed about $5 million in its citizens' claims against Spain.