Western Society


The Western world, or the West, is a term that refers to different nations depending on the context, most often including at least part of Europe. There are many accepted definitions, all closely interrelated. The Western world is also known as the Occident (from Latin occidens, "sunset, West"), as contrasted with the Orient.

The concept of the Western part of the world has its roots in the Greco-Roman civilization of Europe and the advent of Christianity. In the modern era, Western culture has been heavily influenced by the traditions of the Renaissance, the Protestant Reformation, and the Age of Enlightenment, and shaped by the expansive imperialism and colonialism of the 15th to 20th centuries. Before the Cold War era, the traditional Western viewpoint identified Western civilization with the Western Christian (Catholic and Protestant) countries and their culture. The term's political usage was temporarily changed by the antagonisms of the Cold War in the mid-to-late 20th century (1947–1991).

The term originally had a literal geographic meaning, contrasting Europe with the cultures and civilizations of the Middle East and North Africa, Sub-Saharan Africa, South Asia, Southeast Asia, and the remote Far East, regions that early modern Europeans regarded as the East. In its contemporary cultural meaning, the phrase Western world includes Europe as well as many countries of European colonial origin with substantial European ancestral populations in the Americas and Oceania.
