The 20th century has often been called the century of the US. The USA has been the dominant political, economic, and moral force in the world since 1900. In recent years, however, scholars have contended that the influence of the US in world affairs is in decline. Some say that China or some other group of powers will fill the void. Please share your opinion on this matter, and please tell me or point me to some facts that either support or contradict this claim.
Thank you!