The Western world


The Western world, or simply the West, refers to various nations depending on the context, most often including at least part of Europe. There are many accepted definitions based on shared history, culture, and values. The Western world is also known as the Occident (from Latin occidens, "sunset, West"), as contrasted with the Orient.

The concept of the Western part of the Earth has its roots in the Greco-Roman world of Europe, in Judaism, and in the advent of Christianity in ancient Israel. In the modern era, Western culture has been heavily influenced by the traditions of the Renaissance, the Protestant Reformation, and the Age of Enlightenment, and shaped by the expansive imperialism and colonialism of the 15th to 20th centuries. Before the Cold War era, the traditional Western viewpoint identified Western civilization with the Western Christian (Catholic and Protestant) countries and culture. The term's political usage was temporarily changed by the antagonisms of the Cold War in the mid-to-late 20th century (1947–1991).

The term originally had a literal geographic meaning. It contrasted Europe with the cultures and civilizations of the Middle East and North Africa, Sub-Saharan Africa, South Asia, Southeast Asia, and the remote Far East, which early-modern Europeans saw as the East. In its contemporary cultural meaning, the phrase Western world includes Europe as well as many countries of European colonial origin with substantial European ancestral populations in the Americas and Oceania.
