TL;DR: This is a series about the impact that the dominance of Western Civilization has had upon the world in the last several centuries, including a long history of colonization, cultural development, and military action. This article gives an extremely broad introduction to the history of the West. After long being relatively undeveloped compared to the rest of the world, in the last several centuries the West became the supreme power in the world, colonizing, influencing, and far surpassing the rest of it in terms of prosperity.
In modern discussion concerning “the West”, I see few views that are really balanced. Here, the West means the nations of Western Europe and the countries founded primarily by their descendants, such as the United States and Canada. In my experience, one either views the West as a ghastly force responsible for unparalleled destruction and loss of life, or one holds a view of Western triumphalism and exceptionalism in which the rise of the West was unequivocally the greatest thing ever to occur, without much, if any, downside. I shall preface all of these discussions by saying simply that I am primarily targeting the anti-West groups, because I believe their narrative to be more politically relevant and socially prevalent at the present time. With that said, I feel that the horrendous sins of the West must be understood and fully appreciated, so I will pull no punches with the pro-West group. This article will describe the general historical situation concerning the triumph of the West on a global scale.
I have to make clear that within this series we are dealing with highly collectivistic concepts. When we discuss “the West”, we are referring to twenty countries that exist presently. Many of these countries were at the far periphery of the events that we will be talking about, and many that were central players employed extremely different policies at different times and in different places. For example, the Scandinavian countries, which one could certainly include in the category of “the West”, were effectively blameless when it came to European colonization; and the structure, size, and behavior of the British Empire experienced gargantuan shifts between the seventeenth and nineteenth centuries. Nevertheless, these discussions are important for understanding many political debates, and certain countries in particular, such as the United States and the United Kingdom. The way that we understand the West influences much of the way we view these countries today and the social groups that traditionally benefited from them.
Now I will give a very general history of the West and the role it played in the world. For the vast majority of human history, Western Europe was something of a backwater. Until the past several centuries, it was in the eastern areas of the Eurasian continent that most great discoveries and civilizations existed. The West had few if any great cities and was technologically inferior to most regions further east. The first civilizations emerged in Ancient Egypt along the Nile, Ancient Sumer in modern-day Iraq, and the Indus River Valley between modern India and Pakistan, not in France, Germany, or England. The Ancient Greeks, Europe’s closest claim to early greatness, were always far more focused on trade and interaction with Egypt and the Middle East than with Western Europe. There is a reason why Alexander the Great led his armies eastward, never looking back: the Middle East, India, and China were where the true wealth was.
The greatest exception to this general rule of Eastern superiority in the ancient world is, of course, the great Roman Empire. Italy lies right at the fringe of what one can consider Western. Yet in the eyes of the Romans, most of the greatest prizes were to be had in the conquests of Egypt and the Middle East. France and Spain were primarily threats to be overcome and territory to be settled, not massively lucrative regions like parts of the eastern portions of the empire. Indeed, in the end it was the Eastern Roman Empire, centered in modern Turkey, that carried on after Rome itself finally fell.
At any rate, with the fall of the Roman Empire, Europe fell into disarray, with Catholic Christian Europe becoming a peripheral part of world affairs for almost a millennium as endless infighting between small states ensued. This was the glorious era of knights and chivalry. Far surpassing it in terms of wealth, technological progress, and production were the vast Muslim empires of the Middle East, the Indian kingdoms, and China. Remember that it was an attempt to find a direct ocean route to India, and to a lesser extent China, that eventually led Christopher Columbus westward from Spain. This in turn led to the discovery of the “New World”, that is to say the continents of North and South America. Within a century of that discovery, the great Central and South American civilizations of the Inca and Aztec empires had fallen to the Spanish, and the world was starting to look very different. Portugal, with its relatively advanced naval and weapons technology at the time, wielded increasingly powerful control over Asia’s sea and ocean routes, at least outwardly claiming a monopoly over all such trade.
The increase in European wealth was dramatic. This was particularly true of the massive plunder of gold and other precious metals in South America by the Spanish, which might be the largest transfer of its kind ever to have taken place. With much greater trade to and from the East came important new ideas, as well as competition between European powers for better and better technology. The first signs of the scientific revolution were showing themselves in Europe.
This led to the full age of European empire and colonization. From the sixteenth to the nineteenth century, various degrees of control were exerted over almost every part of the world by an array of European powers at different times. Which European power was dominant in a particular area shifted dramatically over time, as did the form that dominance took. For instance, control over China was wielded primarily through threats and small but decisive conflicts that opened its ports to trade on terms favorable to European powers, while control over India sometimes involved very direct rule by the British civil service. British colonies in North America were established primarily voluntarily, often either by those looking to make their fortune or by those looking to form religiously pure societies.
Over the course of this time the West became overwhelmingly dominant in world affairs, until by the last third of the twentieth century nearly all the Western empires were either a shadow of their former selves or had disintegrated entirely. Nevertheless, Europe and the United States remain extremely influential internationally, and the United States in particular has been very ready to manipulate other nations to serve its interests, through methods including direct military intervention and the funding of certain regimes.
The full implications of Western empire and the rise of the West are extremely mixed and hard to label in any simple way. The West has committed massive atrocities that are difficult to match historically, and many modern problems in countries outside the West can be traced back to the actions of some set of Western nations. On the other hand, the West laid the foundation of modern science, political rights, and developments that dramatically improved the quality of modern life. We will describe these developments in greater depth in subsequent parts of this series.