This question doesn't bother me too much, because I've always hated the demonym "American," although it seems that at this point in our linguistic history we're stuck with it. But that doesn't mean the country should be called that, since "America" refers to a continent, and our country is the USA. Just one of my pet peeves.
Except there is no continent simply called "America". There are continents called "North America" and "South America". Note the capitalization: "North" and "South" are part of the names, not just descriptors that can be dropped. To claim otherwise would be like claiming West Virginia is Virginia.