by Brian Whitney.  Inspired by a devotional address given by Elder D. Todd Christofferson on September 24, 2013 at BYU Idaho.

I like to make up words.  When I feel that the minuscule mental lexicon I have at my disposal fails to provide just the right option for conveying a specific feeling or idea, I simply make up a new word.  “Certaintude.”  I like it because it connotes both certainty and attitude, or, having an attitude of certainty.  That feels right.

Actually, I am not the only person to have thought of “certaintude” as a legitimate term, or to have confused it with “certitude.”  If you perform a Google search, you will come across several listings from people inquiring whether it is an actual word.  The word feels so correct that it is easy, perhaps natural, to conclude that it must, in fact, exist in our vocabulary.

Trust Your Instincts to Fail

This is how the human brain works.  We take a couple of seemingly related ideas, draw a connection or inference, and then form a conclusion.  Perhaps this genetic characteristic was passed down from our primitive ancestors who, for survival, had to be able to quickly “put two and two together.”  The primitive man hears a rustling noise in nearby bushes, smells damp fur, and concludes that it could be either a rabbit or a wolf.  Thinking it better not to take his chances, he backs off, crouches, and waits to see what will emerge.  This is basic, instinctual reasoning, commonly known as the “fight or flight” response.  We do this as children, we do this as adults, and we do this as religious believers or skeptics.  However, as is often the case, we find that our instinctual reasoning is not always a reliable tool.  Our senses are all too ready to rush, accepting insufficient data in an effort to hasten a self-preserving conclusion.  The human species did not survive on “fight or flight” instincts alone (let’s face it, in instinctive categories, humans are not all that remarkable), but also on our capacity for inquiry and innovation.

When René Descartes set off on his philosophical quest to discover what he could possibly know with absolute certainty, the first casualties on his list were anything he knew through his senses.  Why?  Because a stick appears bent in water.  That our eyes could play such a mean trick on us, Descartes reasoned, means that we can’t trust what we see.  He systematically eliminated all of the senses from the list of possibly reliable data collection points, leaving him unable to know whether he was dreaming or awake, or whether he was an actual being at all.  Descartes reasoned that, because he even possessed the capacity to ponder that question, he must exist.  “I think, therefore I am” has since become one of the most often recited philosophical declarations, perhaps largely due to its brevity, if not its profundity.

In religious terms, this manifests itself most clearly in knee-jerk reactions to new information.  Whether from the devout believer who, encountering critical information, quickly cries out “persecution!” or the skeptic who, coming across unflattering information, speedily decries the entire belief system, both are appealing to their most primitive “fight or flight” instincts.  Most often, both are also relying on the unreliable sensory data collection that Descartes warned against – something they thought they saw or heard, or something that simply leaves them with a bad taste or odor.  It just doesn’t feel right.  While we should not completely ignore those impulse reactions, they should always encourage us to explore matters further, never rush us to a conclusion.

Slow-Cooked Certainty

To use a clumsy metaphor: certainty should be a slow-cooked and carefully prepared meal, not cheap and quickly consumed fast food.  However, we live in a fast-paced, modern society where blame can’t be placed solely on the individual for not having the time, energy, or wherewithal to devote to one singular project – whether it be preparing a complex but nutritious meal, or researching complex religious or historical topics.  Most of us are simply trying to maintain enough sanity to keep up with the constant flow of information barraging us at lightning speeds.  The Internet has allowed us to become quasi-experts in many things while remaining masters of none.  Bullet points, summaries, blog posts, status updates, texts, tweets, and memes are all most of us have the motivation or inclination to pay any attention to.  Again, it’s not entirely our fault.  Society has conditioned us to filter information this way.  It’s a survival mechanism.  However, there is a downside to this reality.  Just as our diets suffer from lack of nutrients, so too does our knowledge – being satiated by the fillers, artificial ingredients, and high-fructose syrup of social media, sound bites, and sensationalist media.

I miss the library.  I miss the days when researching meant thumbing through a card catalog in search of sources on a specific topic, perusing the open stacks with fingers lightly grazing the spines of worn and faded books, and sitting down at a table with a comfortable wooden chair and seven or eight thick books eagerly waiting to be opened.  It’s a romantic, if antiquated, image.  Don’t misunderstand; I love Google and Wikipedia as much as the next guy.  They are both great ways to access information quickly, get a good overview of a topic, and hopefully move on to more comprehensive sources.  I say hopefully, because that is where most of us fall short.  To return to my clumsy metaphor: we tend to take the prepared, microwave version rather than taking the time to work with the raw ingredients.

The benefit of pre-Internet research was that it took significantly longer to dig into things, which resulted in a more comprehensive digestion of information.  In looking for material from which to form a thesis, you were exposed to varying scholarly perspectives, and the publishing, editing, and peer-review process gave assurance that there was at least some measure of quality control on the information you were being exposed to.  A scholar could study a subject for five, ten, or fifteen years before feeling qualified to offer an expert opinion.  In the post-Internet world, it seems scholars are born after five, ten, or fifteen minutes of online research.

Just the Facts, Ma’am

What am I getting at?  I am arguing that we often confuse facts and conclusions in our hasty search for knowledge.  As I was completing the general courses for my undergraduate degree, I took a very challenging course titled Deductive Logic of Philosophy.  In this class, we learned how to symbolically frame arguments to look for inconsistencies, contradictions, and invalidity.  An invalid argument was defined as one whose premises can all be true while its conclusion is false.  Premises are nothing more than statements offered as facts.  Let me give a couple of arguments to illustrate:

Early Mormons practiced polygamy
Some early Mormon women were widows
Therefore, the purpose of polygamy was to care for widows

The first two lines (the premises) are, indeed, facts.  Early Mormons did practice polygamy, and some early Mormon women were widows.  However, the last line (the conclusion) has problems.  We can imagine several scenarios in which it isn’t a true statement.  Not all of the women who entered into plural marriage were widows.  Some were single, and others were already married and entered into a polyandrous relationship.  This is an invalid argument: true, factual statements, but a conceivably false conclusion.  We have to separate the facts from the conclusion.

Let me give another example:

The Book of Mormon is reported to contain ancient records written in “Reformed Egyptian”
Archaeologists and linguists have yet to identify a language they would call “Reformed Egyptian”
Therefore, the Book of Mormon is not an ancient record

Again, true, factual premises, but the conclusion presents us with a problem because we can imagine scenarios where this statement isn’t true.  For instance, there are more dead languages than there are living languages, and scientists continue to discover ancient civilizations we did not know about before.  It is also possible that what Joseph Smith called “Reformed Egyptian” isn’t the exact term that the reported author, Mormon, used. Because there are other plausible conclusions, we have to label this argument as invalid.
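
For readers who want to see the symbolic framing from that logic class made concrete, here is a minimal sketch in Python – my own illustration, not anything from the course – that brute-forces a truth table.  The propositions P, Q, and R are hypothetical stand-ins for the premises and conclusion of the polygamy argument above; because R is an independent claim that the premises never force, the checker finds an assignment with true premises and a false conclusion, which is exactly what makes an argument invalid:

    from itertools import product

    def is_valid(premises, conclusion, variables):
        # Valid means: no assignment of truth values makes every
        # premise true while leaving the conclusion false.
        for values in product([True, False], repeat=len(variables)):
            env = dict(zip(variables, values))
            if all(p(env) for p in premises) and not conclusion(env):
                return False  # counterexample: true premises, false conclusion
        return True

    # Hypothetical encoding of the polygamy argument: the conclusion is
    # an independent claim, not derivable from the premises.
    P = lambda env: env["P"]  # early Mormons practiced polygamy
    Q = lambda env: env["Q"]  # some early Mormon women were widows
    R = lambda env: env["R"]  # polygamy's purpose was to care for widows

    print(is_valid([P, Q], R, ["P", "Q", "R"]))  # False: the argument is invalid
    print(is_valid([P, Q], P, ["P", "Q"]))       # True: a premise always follows

The second check shows the contrast: a conclusion that merely restates a premise can never be false while the premises are true, so that argument is trivially valid.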

Don’t confuse facts with conclusions.  What researchers look for are historical facts that line up with other historical facts.  What a good researcher does is refrain from forming conclusions before gathering enough facts.  How many facts are enough to form a conclusion?  Perhaps that depends on the complexity of the question being asked.  For now, it is enough to be able to differentiate between a fact and a conclusion.

Seek First to Understand

Max Weber (1864-1920) is most commonly identified as a sociologist of the “German School”; however, it is important to understand that he was, primarily, a historical researcher.  Unlike many of his contemporaries, Weber developed the main body of his social theories by studying the past.  In his methodology, Weber became closely associated with the expression verstehen, which is German for “understanding.”  For Weber, this was a methodological tool that was necessary in studying other cultures, societies, or the past.  Contrary to how it is often interpreted, Weber was not simply advocating empathy.  For him, it was a much more scientific process.  Weber argued that the social norms, symbols, or rituals of a society could only be understood from that society’s perspective, not ours.  Seems simple enough.  Furthermore, Weber argued for what he called “value-free” research, meaning that you leave your personal values at the door when trying to understand things from another perspective.  We might call this objectivity.

Of course, it is debatable whether true objectivity can or even ought to exist.  We are social creatures who interpret and define everything from a very deterministic set of socially constructed circumstances.  What I mean to say is that bias always exists, so long as human beings are the ones telling the stories.  The history we read and the history we write are all taken from subjective human experience.  There is no way around it.  However, that doesn’t have to be a negative thing.  How boring would it be if everyone told the same story the same way?  Subjectivity is the spice that seasons the stew, so to speak.  The point isn’t to avoid it, but to understand that it exists; and, because it exists, we need to read as wide a variety of subjective perspectives as possible.  It is as important to give time to the believer as to the dissenter.  Both are real experiences.  Both can inform you a great deal about historical fact.

“Facts are stubborn things,” said John Adams in typical lawyerly fashion.  He continued, “Whatever may be our wishes, our inclinations, or the dictates of our passion, they cannot alter the state of facts and evidence.”  But the so-called “facts,” we must remember, are the children of the rather enigmatic system of memory that humans possess.  In short: we make stuff up, even unintentionally.  This is why the practice of law is devoted to the cross-examination of a variety of witnesses and experts.  Even then, the case is argued, with the most compelling argument swaying the judge or jury.  Although logic and evidence are used to build the case, the ruling is, ultimately, a subjective belief that one body of evidence outweighs the other.  Wrote Lewis Carroll in “You Are Old, Father William”:

In my youth
I took to the law
And argued each case with my wife
And the muscular strength
Which it gave to my jaw
Has lasted the rest of my life

Method to the Madness

This brings me to my final point.  The human challenge, as I see it, is trying, like Descartes, to divide truth from error.  Paradoxically, we are ill equipped, both by our natural senses and by our subjective natures, to accomplish so great a task.  History, therefore, is the flawed recollection of experiences told through the flawed interpretations of researchers.  Facing this reality, historians have relied upon methodology to distinguish trustworthy from untrustworthy sources, much in the same way a lawyer may call a witness reliable or unreliable.

Let me give one example from a paper I wrote about the Shaker religion.  One of the sources I chose was a dissenter – someone who had left the religion and later wrote an exposé of the madness and corruption that he witnessed.  Is his perspective biased?  Without a doubt.  However, does that bias merit exclusion from the research?  I didn’t believe it did.  While his views and possibly his memory may be tainted, they are still valuable in forming a better picture of what it was like to live among the Shakers.  Some of his more caustic statements needed to be prefaced, so the reader understood the perspective from which they were spoken, but they provided valuable insights for the argument I was making – possibly insights that a devotee may not have been so willing to admit.  The opposite can certainly be true as well.  Because we tend to dwell on controversy and sensationalism, we must be careful not to ignore the faithful or devoted perspective.  Their experiences are every bit as informative, perhaps more so in many cases, as to what makes a community thrive or fail.

To briefly explain the historical method, one needs to be familiar with primary sources and secondary sources.  As the names imply, a primary source is one written by, or contemporary to, the person or events you wish to study.  Artifacts like journal writings, correspondence, newspaper clippings, affidavits, and meeting minutes are common primary sources.  Although much of this material has been made digitally available through websites like books.google.com or archive.org, the vast body of primary sources is still held in collections at public and private archives.  Secondary sources are books or articles written about people and events of the past.  These are typically written by historians or other scholars and, if they are doing their job, contain a long list of the primary sources they drew from in their bibliographies.  Biographies and general history books are examples of this.  To make it simple: the vast body of letters, journals, and other manuscripts in the Brigham Young Collection housed at the LDS Church History Library are primary sources.  John Turner’s biography, Brigham Young: Pioneer Prophet, is a secondary source.  Most readers of history read secondary sources.

When using the historical method, what researchers are generally looking for are correlations, or multiple accounts from more than one credible witness, in order to establish something as historical fact.  For example, we can say as a matter of fact that in 1852 Brigham Young was in Salt Lake City because we have a number of witnesses who recorded being present at a conference he spoke at.  Furthermore, because we have minutes recorded by several people, we can pretty well piece together what Brigham Young said at this event.  It is the corroboration of the historical record that allows us a greater measure of certainty.  However, just because an event was not witnessed or written about by multiple people does not mean that it did not happen, only that we have scant evidence to make a claim either way.  It is an unknowable, forever given to speculation.

When an author pens a book, they are typically posing some sort of thesis or argument about the topic.  I would go so far as to say that this holds true for biographical work as well.  The author’s bias and underlying assumptions are going to manifest themselves in how the historical record is not only interpreted but also edited – meaning how much information the author shares in order to support their argument.  Here is where we, as readers of history, get ourselves into trouble.  It is easy to assume that an author is treating their subject fairly and to become consumed with and convinced by their work.  This is particularly easy when we have not cross-examined the evidence they present by analyzing other points of view, or interpretations, of the same material.  It is not the author’s obligation to inform the reader of their bias, so the reader must uncover it – a process made possible by comparing and contrasting other perspectives on the same subject.

Unfortunately, this process of comparison and contrast doesn’t happen as often as it should in religious studies.  Partly due to the high-speed, information-driven environment we now live in, partly due to our “fight or flight” natures, and partly due to the highly personal nature of religious belief, we tend to respond emotionally rather than academically to perceived bias.  We confuse assumptions with facts and hastily draw conclusions that may or may not be supportable upon further investigation.  So here are my tips for developing a well-rounded perspective on historical topics when dealing with secondary source material:

  1. Consider the author and publisher (assuming the work is not self-published).  What other work has the author done?  How has it been received?  Is the publisher a small press, the in-house publisher of a retailer (like Deseret Book), or a university press?  There is no reason to avoid one over the other, but be aware that some small presses don’t offer editing services, which can lead to unchecked claims.  University presses generally have a peer-review process that, although not perfect, is usually more reliable than self-edited work or editors who employ a particular slant (yes, editors are subjective humans as well).  As a rule, self-published work is the riskiest because the author’s claims have most likely received no vetting at all.
  2. Skip to the end.  No, not the final closing lines, but past that to the bibliography.  How robust is it?  What kind of sources did the author draw from?  Remember the difference between a primary and secondary source.  Drawing from secondary sources is not a negative thing; just know that the author did not focus their research on primary documentation, which can affect the reliability of the outcome.
  3. Read the preface.  Most often, the argument that the author is trying to make will be found in the preface or introduction of the book.  It may seem silly to say, but it is important to understand the argument an author is about to make beforehand so that you can weigh the evidence the author gives against that argument, bearing in mind that the author will naturally make subjective interpretations of, and possibly edit, the evidence they provide.
  4. Read reviews by experts in the field.  I don’t mean the reviews on Amazon.com.  Academic journals that contain reviews are usually the best place to go.  The reviewer is typically chosen by the journal editor because of their academic background on the subject of the book they are reviewing.  This doesn’t mean that the reviewer won’t be biased, but it is helpful to see what others who are familiar with the topic have said.  A negative review shouldn’t deter you from reading the book, but it should make you aware of potential problems.
  5. I can’t stress this enough – don’t just read one book on a particular subject or person, particularly if the subject is tied to the religion you adhere to.  This is far too emotional and personal a life commitment to allow one book to sway your views dramatically in one direction or another.  If you find yourself reading a book that is particularly challenging, rather than immediately agreeing with it or dismissing it, make it a point to find other books from other perspectives on the same subject.  As I said, there are many scholars who spend a lifetime researching and writing on one subject and still don’t feel that they have mastered it.  Don’t allow yourself to feel like you have become an expert on a subject because you have read (or skimmed) one book on the matter.

To understand the nuances of human history requires patience, a certain degree of charity, and diligence.  Because of our natural predispositions, we are not well equipped to survive on instinct alone; we also need to use our tools of inquiry and innovation.  We need to attempt to understand the perspectives of others by methodically studying a variety of viewpoints, while keeping in mind that each perspective is going to contain bias and a subjective understanding of people and events.  Cross-examining evidence, understanding the difference between a fact and a conclusion, judging a published work on its merits by looking for signs of well-researched and vetted scholarship, and being eager to entertain a wide variety of discourse on a subject will assist us greatly in navigating the waters of scholarship.  Don’t rush the process; allow it to slow-cook, and don’t get caught up in the fallacy of “certaintude.”