5 Historical Biases and Other Problems
John McLean and Scott Rausch
Bias in Historical Writing
Bias is an inclination or outlook to present or hold a partial perspective, often accompanied by a refusal to consider the possible merits of alternative points of view. Biases get in the way of finding the truth, or at least the complete truth. Whether conscious or learned implicitly within cultural contexts, biases have been part of historical investigation since the ancient beginnings of the discipline. As such, history provides an excellent example of how biases change, evolve, and even disappear.
History as a modern academic discipline based on empirical methods (in this case, studying primary sources in order to reconstruct the past based on available evidence) rose to prominence during the Age of Enlightenment. Voltaire, a French author and thinker, is credited with developing a fresh outlook on history that broke from the tradition of narrating diplomatic and military events and emphasized customs, social history (the history of ordinary people), and achievements in the arts and sciences. His Essay on Customs traced the progress of world civilization in a universal context, thereby rejecting both nationalism and the traditional Christian frame of reference. Voltaire was also the first scholar to make a serious attempt to write the history of the world, eliminating theological frameworks and emphasizing economics, culture, and political history. He was the first to emphasize the debt of medieval culture to Middle Eastern civilization. Although he repeatedly warned against political bias on the part of the historian, he did not miss many opportunities to expose the intolerance and frauds of the Catholic Church over the ages — a topic that was Voltaire’s life-long intellectual interest.
Voltaire’s early attempts to make history an empirical, objective discipline did not find many followers. Throughout the 18th and 19th centuries, European historians only strengthened their biases. As Europe gradually benefited from the ongoing scientific progress and dominated the world in the self-imposed mission to colonize nearly all other continents, Eurocentrism prevailed in history. The practice of viewing and presenting the world from a European or generally Western perspective, with an implied belief in the pre-eminence of Western culture, dominated among European historians who contrasted the progressively mechanized character of European culture with traditional hunting, farming and herding societies in many of the areas of the world being newly conquered and colonized. These included the Americas, Asia, Africa and, later, the Pacific and Australasia. Many European writers of this time construed the history of Europe as paradigmatic for the rest of the world. Other cultures were identified as having reached a stage that Europe itself had already passed: primitive hunter-gatherer, farming, early civilization, feudalism and modern liberal-capitalism. Only Europe was considered to have achieved the last stage. With this assumption, Europeans were also presented as racially superior, and European history as a discipline became essentially the history of the dominance of white peoples.
However, even within the Eurocentric perspective, not all Europeans were equal; Western historians largely ignored aspects of history such as class, gender, or ethnicity. Until relatively recently (particularly the rapid development of social history in the 1960s and 1970s), mainstream Western historical narratives focused on political and military history, while cultural or social history was written mostly from the perspective of the elites. Consequently, what was in fact the experience of a select few (usually white males of the upper classes, with occasional mentions of their female counterparts) was typically presented as the illustrative experience of the entire society. In the United States, some of the first to break with this approach were African American scholars who at the turn of the 20th century wrote histories of black Americans and called for their inclusion in the mainstream historical narrative.
Bias in the Teaching of History
The biased approach to historical writing is present in the teaching of history as well. From the origins of national mass schooling systems in the 19th century, the teaching of history to promote national sentiment has been a high priority. Into the present day, in most countries history textbooks are tools to foster nationalism and patriotism and to promote the most favorable version of national history. In the United States, one of the most striking examples of this approach is the continuous narrative of the United States as a state established on the principles of personal liberty and democracy. Although aspects of U.S. history, such as slavery, the genocide of American Indians, or the disenfranchisement of large segments of society for decades after the onset of American statehood, are now taught in most (yet not all) American schools, they are often presented as marginal in the larger narrative of liberty and democracy.
In many countries, history textbooks are sponsored by the national government and are written to put the national heritage in the most favorable light, although academic historians have often fought against the politicization of the textbooks, sometimes with success. Interestingly, 21st-century Germany has attempted to serve as an example of how to remove nationalistic narratives from history education. As the 20th-century history of Germany is filled with events and processes that are rarely a cause of national pride, the history curriculum in Germany (controlled by the 16 German states) is characterized by a transnational perspective that emphasizes the all-European heritage, minimizes the idea of national pride, and fosters the notion of civil society centered on democracy, human rights, and peace. Yet, even in the rather unusual German case, Eurocentrism continues to dominate.
The challenge of replacing national, or even nationalist, perspectives with a more inclusive transnational or global view of human history is also still very present in college-level history curricula. In the United States after World War I, a strong movement emerged at the university level to teach courses in Western Civilization with the aim of giving students a sense of a common heritage with Europe, particularly Western and Northwestern Europe. After 1980, attention increasingly moved toward teaching world history or requiring students to take courses in non-Western cultures. Yet, world history courses still struggle to move beyond the Eurocentric perspective, focusing heavily on the history of Europe and its links to the United States. In many cases, World Civilizations courses are simply Western Civilization courses with a few non-Europeans thrown into the mix.
Despite all the progress and much greater focus on the groups that have been traditionally excluded from mainstream historical narratives (people of color, women, the working class, the poor, the disabled, LGBT+ people, etc.), bias remains a component of historical investigation, whether it is a product of nationalism, the author’s political views, or an agenda-driven interpretation of sources. It is only appropriate to state that the present world history book, while written in accordance with the most recent scholarly and educational practices, has been written and edited by authors trained in American universities and published in the United States. As such, it is not free of national (U.S.) or individual (authors’) biases either.
Myths and Other Problems
Good historians, like any thoughtful group of people trying to make sense of the world, try to be aware of their own preconceptions. Everyone brings biases of different kinds, and it is probably impossible to free oneself of all biases or be 100% objective, but it is helpful when getting at the truth to be cautious about how one’s own preconceptions may be shaping the experience. Every historian is a product of that historian’s own particular context, and as you will see in other related readings in this course, historical thinking changes over time. Things that were assumed to be true in one generation may be critically examined in the next, and vice versa.
Below are some common issues to be aware of when studying history today, because they are often based on dubious assumptions and faulty approaches. They obscure or distract from the truth more than they help to find it. This is not a complete list, but it represents some common biases. None of them is the end of the world, and reasonable people may disagree about how faulty or valid any of these perspectives is. None of them is inherently evil, though most if not all can be used for nefarious purposes.
Historical Myths
Historians and other academics often use the word “myth” in a very particular sense, a sense very different from the way that most people use the word in their daily lives. For historians, a myth is simply a powerful story that people tell, a story that has a lot of influence on the way that people see the world. It is a story that is told and retold, primarily because of the effects of the story (what people “get” from the story). It may be widely believed or assumed to be true, when in fact the popularity of the story is entirely independent of accuracy. Notice that last part does not say that a myth is always false, just that the story’s impact is completely separate from the question of its truth or falsehood.
Saying something is a myth does not necessarily mean it is entirely false or a lie or totally made up. A myth could be totally false, slightly true, mostly true, and even in extremely rare, very limited cases, entirely true. An example of a “true myth” would be a story that someone made up that turns out to be true, for example a slanderous lie someone made up that, unbeknownst to the creator of the lie, is true. Stories may be inadvertently true, just as they may be inadvertently false.
It is often said that every myth contains a kernel of truth. That is not necessarily true of all myths, and in any case the “true” part of a myth may not be very significant or useful. The true part of a myth may not give much if any validity to the rest of the story. For example, stories about the comic-book superhero Superman are a fixture in American popular culture across multiple media. One could find several aspects of the Superman myth that are “true,” but they are somewhat limited in value as proof of anything: there are such things as city newspapers and news reporters; some reporters are men and some women; the state of Kansas is a real place; there are farms in Kansas, and there are adopted children who live in Kansas, some of whom may even have the last name Kent. Similarly, Sherlock Holmes stories mention many real-life parts of 19th-century London, including a street named Baker Street; that is not proof that Sherlock Holmes existed and that all Sherlock Holmes stories are true. Finding verifiably true people and places in a book does not automatically mean the entire book is true.
Most commonly, myths are less than fully accurate for a lot of the same reasons that they are popular. They usually present a very simplified version of the past, often with very clearly marked heroes and villains. They take what is in reality a very complex, seemingly chaotic situation such as a moment in history and present it in a clear, entertaining fashion, usually with a clear and tidy beginning, middle, climax, and ending.
A historical myth is a myth about the past, usually in the form of a narrative story or overarching theme, presented as the truth but ultimately independent of accuracy. Historical myths are commonly stories or explanations that people assume are true because deep down they want them to be true. Historical myths tend to confirm what people already think is true about human nature, their own communities, and the present day compared to the past. Sometimes people remain faithful to the “truth” of a myth even in the face of evidence to the contrary, even if they admit the story is not true. One popular 1990s collection of historical myths captures the feeling best in its title, a quote attributed to Warren G. Harding: I Love Paul Revere Whether He Rode or Not.
Teleology
Teleology refers to several inter-related ideas, all of which tend to shape the study, writing, and teaching of history, sometimes consciously, sometimes unconsciously. In general, teleology refers to the idea that history moves in a particular direction or towards a particular endpoint. Teleologies suggest that human history follows a single, automatic course, as if human societies, civilizations, or even humanity itself follows a predestined track towards a predetermined endpoint. It tends to rely on deep, abstract philosophical assumptions about the nature of human beings, the nature of human societies, and even the nature of cause and effect. These assumptions are usually either unproven or unprovable or a matter of endless debate. The best approach for a good historian is to avoid the assumption that all of history is fated or pre-ordained or all on a single track.
Some examples of teleologies to avoid in history:
1. Progress with a capital P is the assumption that on the whole the historical record shows humanity improving or getting better in meaningful ways, and/or the assumption that progress, however defined, is the natural, inevitable tendency of human societies. The most common examples of this appear in popular versions of the history of science and technology, where humanity over time as a rule increases its knowledge, gets better and better scientific explanations, gets better and better technology, and has more and better knowledge at its disposal. This includes the assumption that humans know more today than they ever did before, that we have better information than we did in earlier generations, and that newer technologies are by definition superior to older technologies.
2. Innovation Equals Improvement is the assumption that introducing new things or ideas is the same as improving a situation or promoting progress. This can be seen in institutions that treat “innovation” as a goal in itself rather than as a means to an end. It also tends to focus attention on historical “firsts” as moments of achievement for a larger society or civilization. This tends to assume that introducing new things is inherently an improvement. For a good historian, innovation or any other change can be positive, negative, both, neither, or something else entirely.
3. Primitive to Advanced. Earlier generations of historians, anthropologists, economists, and other social scientists generated a teleological theory that human societies followed a fairly predictable course very similar to the stages of life of a human being. Under this idea, cultures, nations, empires, and entire civilizations all begin as “primitive” groups of people — ignorant, simple, brutal, childlike, immature, etc. Then, over time, some of them “develop” into more complex societies and “mature” into more advanced societies. Those who subscribe to this point of view almost always have the bias of assuming that their own society is the best, most advanced one, and almost always see their society’s historical trajectory as the best path for others to follow. A further problem is that what counts as “advanced” is constantly updated to include the most recent communities that the author thinks are on the right track. This viewpoint is very common in countries where national economic policies focus heavily on “economic development,” which assumes that older economic systems are inferior to new ones of a particular type. In another example, World History courses have often implied that there was a natural, almost automatic one-way progression from hunting and gathering to horticulture to agriculture to industrialization, instead of seeing these systems as options created, adopted, and adapted by human decisions.
4. This often combines with a Simple to Complex teleological fallacy. Many popular views of human history, even very recent history, take at face value the assumption that people in earlier millennia, earlier centuries, even a few decades ago lived much simpler lives than we do today. Past peoples are often assumed to have been more narrow-minded, more driven by tradition, more literal-minded about their stories, more constrained by their social identities, and more devoted to rigid social distinctions than people in the present day. The presumption is that life in modern society is more complex, less predictable, freer for the individual, less bound by tradition, more creative, and more sophisticated in its media consumption. There are elements of truth to that assertion, but it is an assertion that few people today question, and the true comparison is much more complicated.
5. Teleologies run in the opposite direction as well. A declensionist teleology is, for example, the idea that human history is on a course of decline, or on a collision course with disaster, or that ever since a particular “wrong” turn, humans have been doomed. (For example, humans were doing just fine, totally in harmony with nature, until that fateful day we developed agriculture, which led tragically and inevitably to civilization, which led to industrialization, which is even worse, which now means the end of human life entirely.) Declensionist teleologies also contribute to the idea of “golden ages” mentioned below in “Nostalgia.” Breaking up historical fields into “classical” and “post-classical” eras can be a form of declensionism, because such terms tend to imply that everything after the classical period will always and forever be less impressive, that the end of a classical period was a permanent decline.
Nostalgia
Nostalgia refers to the tendency to see situations in the past as better than they really were. In historical approaches, nostalgia often presents previous eras or previous generations as superior to the present day or to people in the present day. In many cases, nostalgia presents the idea of a “golden age” of a particular society or civilization (or a period in the history of comic books), when everything was at its peak of goodness, greatness, virtue, wealth, success, etc. Often, people in the golden age are retroactively portrayed as simply better people, motivated by better things, following more closely to the ideal than people in later generations, who are by definition less-than, derivative, pale imitations of the better sort. In the U.S., this often takes the form of celebrating a) the unsurpassed wisdom and virtue of the “Founding Fathers” generation, b) the never-repeated heroism, sacrifice, and moral rectitude of the World War II generation, and in more recent years c) the pure idealism and selflessness of those who protested in the 1960s.
Nostalgia by its nature is highly selective. It automatically downplays the negative aspects of earlier periods and tends to focus on the most favorable interpretation of past events. It depends on a highly simplified version of the past and is usually powered by a desire to change the current situation to a situation that the nostalgic person prefers. It is very commonly an expressed sentiment of people who feel that their own community today has less power, prestige, or influence than it did in the past, back when things were “the way they were supposed to be.”
Nostalgia does not depend on an accurate view of the past or actual lived experience; in fact, it can be entirely divorced from reality. The Portuguese word saudade is in some ways a good term for understanding nostalgia. The word has no single counterpart in English but refers to a feeling that combines sadness, longing, a sense of loss, and homesickness. Like nostalgia, it can refer to longing for a moment that never actually happened but that you wish had happened. It can be homesickness for a place that you have never been to but still feels like home. (For many Brazilians, saudade is an example of what is called an auto-stereotype: something that someone from a particular culture says that only someone from that culture can understand. “It’s a [insert identity] thing. You wouldn’t understand.”)
Modern Self-Congratulation
This pitfall combines aspects of many of those listed above. Self-congratulation in history can take many forms. It can be a tendency to see one’s own historical context as a better time than all previous eras, for example seeing one’s own generation as superior to all previous and subsequent generations. It can be a tendency to see current society as the culmination of a process of improvement, seeing present-day civilization as the best, most advanced civilization, the one that “has more things figured out” than earlier societies did. It can go hand-in-hand with a teleology that, for example, treats the history of the United States primarily as a series of problems that Americans solved one after the other, systematically making a freer, more equal, more democratic society.
It shows up in the habit of present-day people of assuming that earlier generations, especially people very far into the past, were simple-minded, closed-minded, stuck in their traditions, straitjacketed by their own rituals, and so trapped in their thinking that they thought all of their stories were literally true. In contrast, according to this modernist snobbery, we modern folks are highly sophisticated, very critical of our own biases, much more creative, more accepting of difference, freed from traditions, and free of empty traditional rituals. We have, supposedly, replaced superstition with science, prejudice with information, and despotism with freedom, and we are, in the words of Lewis Carroll, capable of believing “as many as six impossible things before breakfast.”
In extreme cases, this prejudice assumes that if modern-day people do not know how to do something without modern technology, then earlier people without modern technology certainly could not have accomplished it, either. For example, if we today cannot make highly precise architectural constructions without lasers and GPS, then ancient Egyptians could not have done so either, and thus the only explanation for the precision of the pyramids is therefore extraterrestrial help. In reality, a more likely conclusion is that we in the present may not have all the information we need to understand how the Egyptians did what they did.
To counter this tendency, you should avoid assuming that people in the past were inherently simpler or had inferior minds compared to people today. According to the best paleoanthropology available, the human brain is essentially the same today as it was 100,000 years ago. If a time traveler kidnapped an infant from around a campfire in 80,000 BCE and brought it to the present day, that infant would have the same chance of winning a Nobel Prize in Physics as a child born today.
There is abundant evidence that humans thousands of years ago were just as capable of complex, creative, revolutionary thinking as we are in the twenty-first century. They were capable of fantasy, technological innovation, social rebellion, and highly abstract thought; in fact, many of our modern forms of those things are based on ancient ones. Ancient peoples were not necessarily more loyal to their traditions than people are today. They were not necessarily less resistant to change, more conservative, more obedient to authority, more likely to mindlessly reproduce their culture, or more likely to accept their prescribed social roles, including traditional gender roles.
Even the idea that modern society has better access to more information over time is somewhat debatable. Even if the body of knowledge provided by the internet is in the aggregate larger than the knowledge available to forager communities in the distant past, that does not mean 1) that internet humans are more intelligent, think more complicated thoughts, use more information on a daily basis, or make better, more objective decisions; nor 2) that foragers do not have information, knowledge, and wisdom that have been lost to internet humans.