Is the United States in South America or North America?

1 Answer

The term America (or the Americas) refers to all the lands in the Western Hemisphere, comprising the continents of North America and South America. (Central America is actually part of the North American continent.) The United States of America, or U.S.A., is a country in North America.