Question:

I need to know the names of the Western countries.

1 ANSWER


  1. The term Western countries (sometimes the West or the Occident) is somewhat imprecisely defined. Derived from the old dualism of East (Asia) and West (Europe), it is now used to refer to wealthy, industrialised countries regarded as the inheritors of European societies and their colonial legacies. The term is sometimes used as a synonym for Western societies.

    Depending on context, the Western countries may be restricted to the founding members of NATO in addition to Germany, Spain, and the non-aligned Austria, Finland, Sweden and Switzerland. A broader definition might extend to Australia, New Zealand, Japan, South Korea, the Republic of China (Taiwan), Israel and some of the more prosperous Warsaw Pact states.

    Latin America is sometimes considered part of the West and sometimes not. Mainland China, the remainder of the Middle East, India, and Russia are generally not considered part of the West.

    Western countries have in common a relatively high standard of living for most citizens compared with the rest of the world. They generally have democratic, mostly secular governments and developed bodies of law that give some expression of rights to their own citizens. High levels of education and a similar, "modern" popular culture also tend to characterise Western or Westernized societies. Militarily and diplomatically, these "Western" societies have generally been allied with each other to one degree or another since World War Two. In fact, some would argue that this is the defining feature of the West and explains why Japan is usually considered Western while Ecuador is not.

    More typically, the term "the West" carries a pejorative meaning, used simply to describe and delineate the wealthy and dominant societies from the poorer ones - those subjugated economically, militarily, and otherwise by deliberate restraints placed on them by the wealthier countries. "The West" then becomes simply a term meaning: "Wealthy, Colonial (slave-holding), Europe-descended (or allied) societies." The derived meaning, in current use, tends to translate as: "Those who control the world" or "Those who seek to continue in domination of others and their lands."

    Oftentimes, use of the term "the West" was motivated by racist attitudes towards Slavic Europeans, in that the term did not encompass them whereas "Europe" does.

    http://www.encyclopedia4u.com/w/western-...
