Marketing Private-label Designer Brands of Relativism as Absolutism:
Info-wars Destabilize our Common Conceptual Climate

Return to Homepage

Return to next Topic in main file: "How to Hack a Concept: Mental Malicious Code"

Sections

Overview
Cultural Destabilization -- Nihilistic Conceptual Corrosion
Cognitive Relativism: The Reality of Interpretive Frameworks
Why Conceptual Climate Change is Difficult to Perceive
Doesn't Cognitive Relativism imply (or excuse) Moral Relativism?

Overview

Defining Cognitive Relativism:   Every individual's viewpoint(s) necessarily are embedded in some conceptual framework(s), and those frameworks necessarily shape/distort/color that individual's worldview. As a result, the individual views the world as refracted/filtered through their particular interpretive framework. Hence, that individual's Cognition is Relative to some conceptual/interpretive framework(s).

Unethical Info-war strategists argue, "Why not color their thoughts to our advantage? If we don't do it, our opponents surely/probably will. Therefore let's introduce, and propagate into the common cultural environment, conceptual frameworks that will serve our own interests."

There's a very troubling analogy between shooting "CO2 bullets" into our common physical environment -- causing climate change chaos -- and shooting "conceptual bullets" into our common Cultural Environment:   How much Disinformation can our semantic environment absorb, before it gets progressively more disordered and destabilized, and passes various "tipping points"? I argue that we've already done an enormous amount of damage, but -- like the damage to our physical climate system -- it's hard for us to perceive that damage directly, due to our own perceptual and conceptual limitations.

Conceptual disorder, incoherence, or inconsistency -- either within an interpretive framework, or between various frameworks -- presents problems. "Cognitive Dissonance" refers to the detection, realization, or recognition (on some level) of contingent empirical tension -- or perhaps inherent logical contradiction -- in an individual's mental environment. I may also use "Cognitive Dissonance" to refer to analogous things in the Cultural Environment. Whether individual-cognitive or collective-cultural, such dissonance is experienced as discomfort. Often, that discomfort induces a response. Disordered or conflicting concepts may be reconciled via rejection, revision, or synthesis/re-combination/re-organization. Depending on circumstances, various responses to conceptual dissonance may be more or less beneficial or harmful.

Elsewhere, I've described cases of unintentional, but pathological Info-war. (I've attributed those emergent pathological forms to the evolutionary process of Escalation acting on competing variants of Cultural Software.) In this section, I identify longstanding cultural processes ("nihilism") that often are unintentional forms of Info-war, and that also may constitute emergent cultural pathologies.

I argue that, in viable and flourishing cultures, these forces function analogously to a cultural immune system -- attacking dangerous "foreign" ideas that cannot readily be assimilated, and cleaning up the culture's own "waste products" (fragments of "native" concepts that were damaged by internal cultural contradictions). Thus, when it functions properly, a cultural immune system is essential for that culture to thrive, adapt, and evolve. Its forces are akin to "constructive criticism", or conceptual "creative destruction" -- they are both "conservative" and "progressive".

However, various circumstances may trigger a self-destructive cultural auto-immune reaction -- a self-perpetuating disease state of cultural pathology where these forces indiscriminately or disproportionately attack aspects of their own culture. I argue that the "patterns of nihilism", identified by Spengler as characteristic of collapsing civilizations, are due to individuals becoming embedded in various culturally-destructive conceptual frameworks. Moreover, I argue that some of today's Info-warriors -- although intending primarily to advance narrow tactical and strategic objectives -- unintentionally are amplifying broader pathologically-destructive cultural auto-immune reactions, by marketing private-label designer brands of conceptual Relativism disguised as Absolutism.

Top of Section

 

Cultural Destabilization -- Nihilistic Conceptual Corrosion

The following section is relevant, both to Moral Relativism (which I address below), and to my larger thesis -- that Info-war is damaging our shared Cultural Software, our common cognitive infra-structures · · · but that such damage is difficult to perceive, partly because we tend to "measure" those changes with "conceptual yardsticks" whose measuring standard already has changed!

 
	# Insert extended & annotated quote from: 
 "After Virtue:  A Study in Moral Theory" -- 
	by Alasdair MacIntyre (Second Edition.  1984.  Univ. of Notre Dame Press.  pp. 2 - 5). 

MacIntyre's analysis of the problem is insightful and valid -- i.e, Western cultures indeed failed to recognize that the conceptual foundations for their moral theories had been destroyed (yet we continue to use language and concepts that presuppose an intact foundation). That's bad, but it gets worse:   I argue that what MacIntyre describes is but one particular instance of a more general problem.

In his "Will to Power", Nietzsche warned that, "our whole European culture has been moving as toward a catastrophe." Nietzsche predicted this cultural trajectory toward collapse was nearly inevitable, and he attributed it to certain destructive forces, called "nihilism".   Nihilism is a destructive form of skepticism that corrodes all knowledge, values, meaning, and purpose. Nihilism must be contrasted with contructive forms of skepticism -- which contructively criticize claims of absolute certainty (human infallibility) about knowledge, values, meaning, or purpose, in order to improve humanity.

According to the Internet Encyclopedia of Philosophy section on Nihilism:

"Convinced that Nietzsche's analysis was accurate, for example, Oswald Spengler in 'The Decline of the West' (1926) studied several cultures to confirm that patterns of nihilism were indeed a conspicuous feature of collapsing civilizations. In each of the failed cultures he examines, Spengler noticed that centuries-old religious, artistic, and political traditions were weakened and finally toppled by the insidious workings of several distinct nihilistic postures · · · Spengler concludes that Western civilization is already in the advanced stages of decay with all three forms of nihilism working to undermine epistemological authority and ontological grounding."

It's abundantly clear to every intelligent American that "epistemological authority" has been severely wounded by Info-war campaigns of economic special interests and self-serving political demagogues. Accusations of "junk science" are met with counter-claims alleging "scientific conflicts of interest". The intelligent layperson often must invest considerable effort to penetrate the disinformation, and decide which side is "really right".

For laypersons who lack sufficient strength, time, or energy to withstand the Info-war barrages, one alternative is apathy and resigned acceptance of a radically-irresponsible form of Cognitive Relativism -- not only is everyone entitled to their own opinion, but everyone also is entitled to their own facts! This stance typically converges to self-indulgent hedonism, Narcissism, and Moral Relativism. For those unable to withstand Info-war, the other alternative is to "buy-in" -- emotionally, cognitively, and ethically -- to some of the many ill-founded forms of Feel-Good Absolutism that are widely marketed. How does one decide which conceptual brand of Absolutism to choose when aligning one's moral compass, and establishing the truth of "facts"? In America, popular consumer choices of Absolutist brands include the semantic snake-oils sold by Rush Limbaugh, by proponents of "free-market" Economic Darwinism, by "Gospel of Prosperity" preachers, by theologically-illiterate Fundamentalists, by scientifically-dishonest militant Atheists, and by nationalistic "realpolitik" materialists.

Top of Section

 

Cognitive Relativism: The Reality of Interpretive Frameworks

Is Cognitive Relativism a "radical" new idea? No -- it's basic, common-sense reality.

"Cognition" is interpretation. Any thought, sensation, perception, feeling, experience, emotion, concept, image, or word that has any significance, has that "meaning" only because it was interpreted relative to some framework. Our word, "recognize", emphasizes that "raw data" carries no meaning until -- via some interpretive framework -- we have "re-cognized it as" a certain type of data that has a particular significance. For example, consider the letter, "I". Does "I" mean "Me", or does "I" mean "You"? Or does "I" mean "the Roman Numeral signifying the number One"? Every individual has many different interpretive frameworks to choose from, and apply in various circumstances.

Meaning necessarily is relative to some frame of reference; some context. Our English word, "context", originally meant "the weaving together of words". The English words, "context", "text", "textile", and "texture", all derive from the same Latin word, "texere", which means "to weave". If we focus our attention on one particular word or phrase in a book -- thus bringing it into the foreground -- its meaning depends on the surrounding words -- which our concentration has forced into the background -- with which it was "woven together". OK, so what is the meaning of those background words? To determine that, we must shift our concentration, and bring into the foreground, a phrase that previously had been part of the surrounding background. In a "woven textile", no particular strand is absolutely part of the foreground, rather than the background. It is only the relative amount of attention we focus on a particular strand, that makes it appear foreground, rather than background. So it is with the words in a body of text.

When I write, I use various methods to emphasize certain words, because I intend for you to interpret those words as relatively more important foreground, compared to their surrounding background. That is, I may intend for you to use a certain interpretive framework, in order to understand the meaning of my words. But it's not always that simple -- I may intend my words to convey more than one meaning. (Often, sharing these multiple meanings successfully is precisely why we enjoy communicating via the genre of jokes.) Besides intentional ambiguity, my words might reveal multiple meanings that, although true, I did not intend to communicate.

There can be no absolute frame of reference (at least, not for human beings). To understand one word, how much of the surrounding "background context" should we include in our frame of reference? And, even within that "background", which words are relatively more important than others? Elsewhere, I documented how the meanings of the words, subjective and objective, have been reversed -- switched to mean the exact opposites of what they formerly meant. But unless you learned this fact, and you "pay attention" to it now, so you recognize that it might be an essential part of the (historically-surrounding) context -- and unless you know when that switch occurred, and when some author wrote the text you're now reading -- you cannot be absolutely certain that the frame of reference you've chosen is "good enough" or "correct enough" to justify your interpretation of text that contains the words subjective or objective.   And how can you be certain there are no other words, whose meanings have switched?

What about religious interpretations -- to understand the meaning of a text such as the Bible? Is there some absolute frame of reference to use, some "divine dictionary" that will guarantee "the correct" translation (or, at least, a "good enough" interpretation) of the Bible's meaning? (Dare I ask what frame of reference to use when I interpret that "divine dictionary"?)

In America, a longstanding debate rages over the issue of whether the theory of evolution contradicts the Biblical story of creation. I (and many others) dispute the absolute infallibility of Popes, but few would accuse the Vatican of "Godless atheism". Regarding the alleged incompatibility between evolution and creation, the current (2008) Pope, Benedict XVI, stated, "They are presented as alternatives that exclude each other ... This clash is an absurdity." The previous Pope, John Paul II, reasserted the longstanding Catholic interpretation that there is "no [inherent] opposition between evolution and the doctrine of the faith." Cardinal Paul Poupard, the head of the Pontifical Council for Culture, correctly identified the source of the clash -- disagreement over which conceptual framework should be used when interpreting the Bible's account of creation: "Fundamentalists want to give a scientific meaning to words that had no scientific aim." Clearly, this clash offers another case study in the Confusion of Gospel and Culture. Are we justified in using a scientific frame of reference -- a cultural product only about 200 years old -- to interpret the intended meaning of a text written about 2000 years ago? What if, unknown to us or to historians, many words besides subjective and objective have undergone significant changes in meaning? I suggest that all Americans might benefit from reading what the Catholic Encyclopedia has to say about Hermeneutics, a discipline that helps us select appropriate interpretive frameworks. I dispute the "Catholic answers" it gives, but it "asks many of the right questions" about choosing an adequate frame of reference to interpret the meaning of texts -- questions which most Americans don't even think to ask. Written in 1910, that article on hermeneutics offers timeless insights on human cognitive fallibilities that modern Americans -- in our arrogant certainty that we always infallibly apply "the correct" conceptual tools -- seem never to have learned:   "the controversies between Jews and Christians, between Christians and Rationalists, between Catholics and Protestants, are in the end brought back to hermeneutic questions."

When human beings make scientific interpretations to understand the meaning of "raw data" collected by experimental observations, Einstein's theory of relativity recognizes that there can be no absolute frame of reference, no "motionless background" against which to interpret the foreground data. Quantum theory, and Heisenberg's Uncertainty principle, assert that, in some cases, the very act of "measuring" experimental data must be interpreted as part of the interpretive context -- because the act of "measuring" changes the "measurement". The "observer" becomes a participant; the reader becomes a co-author. Therefore, by including the participant's impacts in a combined frame of reference, modern physics theories offer frameworks that, according to numerous physics experiments, seem to be "good enough" to explain and to predict the behavior of the physical universe.

As noted above, every individual has many different interpretive frameworks to choose from, and apply in various circumstances. There are many "scientific" and "engineering" frameworks, and many "economic" frameworks. Several such "single-viewpoint" frameworks can be combined, to yield a framework that incorporates a broader context. If so, we need to evaluate the suitability of that broader framework -- exactly which situations is it good for, and in which situations should it not be applied? It may be difficult to determine the relative importance of "science", "engineering", and "economics" in a particular combined interpretive frame of reference, i.e, how much these different types of considerations are in the relative foreground vs. the relative background. It becomes even more difficult when the observer is an active participant   · · ·   because when an allegedly "detached observer" becomes a participant, this inevitably will perturb the "objective measurement". Therefore, the motivations of every participant must be assessed, and appropriate "correction factors" included in the reference frame. The participant's own interpretive biases must be recognized explicitly as an important part of the interpretive context.
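
For what it's worth, here is a back-of-envelope sketch (my own, with invented numbers and function names) of that idea of a combined frame of reference: several single-viewpoint evaluations are weighted by how far each sits in the foreground, and an explicit correction term stands in for the participant's own motivations. Shifting the foreground weights flips the overall judgement even when none of the underlying evaluations change -- which anticipates the Challenger case described next.

def combined_assessment(scores, foreground_weights, participant_bias=0.0):
    """Weighted blend of single-viewpoint scores (e.g. 'science',
    'engineering', 'economics'), minus an explicit correction factor
    standing in for the participant's own motivations. All numbers are made up."""
    total_weight = sum(foreground_weights.values())
    weighted = sum(scores[k] * foreground_weights[k] for k in scores) / total_weight
    return weighted - participant_bias

# Engineering judges a decision unsafe (0.2); economics judges it attractive (0.9).
scores = {"engineering": 0.2, "economics": 0.9}

# With engineering in the foreground, the combined judgement is low (~0.34) ...
print(combined_assessment(scores, {"engineering": 0.8, "economics": 0.2}))

# ... but pushing economics into the foreground flips the answer (~0.76),
# even though no underlying evaluation has changed.
print(combined_assessment(scores, {"engineering": 0.2, "economics": 0.8}))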

In 1986 the NASA space shuttle, Challenger, broke apart shortly after liftoff, killing all seven crew members. Prior to liftoff, there was extensive discussion about whether it would be safe to proceed with the launch, given that cold temperatures the previous night had made certain flexible parts of the rocket -- its O-ring seals -- very rigid. What interpretive framework should participants in that discussion have chosen, to evaluate this critical safety question? NASA administrators preferred not to delay the launch to wait for warmer temperatures, fearing adverse publicity might cause Congress to reduce their funding. They "strongly" communicated these economic considerations. In the popular press, the Challenger disaster was blamed on "faulty O-rings". But to avoid framing an innocent man (or mechanical part), any frame suitable for assigning blame must include at least 3 factors:   mechanical parts, people (participants), and the interaction process. Any mechanical part will fail, when pushed beyond its limits. And (some of) the decisionmaking participants knew it was extremely likely those limits had been exceeded (by exposing the O-rings to prolonged cold temperatures). A decisive cause of NASA's Challenger disaster was Morton Thiokol Senior V.P. Jerald Mason ordering his Engineering V.P., Robert Lund, to change his frame of reference:   "Take off your engineering hat and put on your management hat," Mason said to Lund, meaning that Lund should change his frame of reference to a context that put economic considerations more in the foreground, and engineering considerations more in the background. When Mr. Lund's new reference frame emphasized which side his company's bread was buttered on, he reversed his previous recommendation and told NASA that Challenger could safely launch, despite the cold-induced rigidity of its O-ring seals. Because Lund was persuaded to choose a dangerously-unsuitable interpretive framework for applying his engineering judgement, the result was a fatal catastrophe. That quote appears in Chap. 9 of "Report to the President by the Presidential Commission on the Space Shuttle Challenger Accident".

It's important to "Use the right tool for the right job," · · · or at least, to use some tool that's "suitable enough" for the situation. Because of the enormous consequences (for the self, and for others) of applying a particular interpretive framework to a particular situation, we are responsible for our "choice" to use any particular interpretive framework. And we are responsible for realizing that humans are fallible, hence the job of recognizing our own interpretive biases -- and revising our interpretive frameworks accordingly -- is never finished.

In critical situations -- where using an inappropriate interpretive framework can be disastrous -- society recognizes the benefits of developing an institutionalized, collective process for choosing an appropriate framework. Because of the Challenger disaster, NASA instituted a more rigorous system that incorporated certain checks and balances to constrain the choice of evaluative frameworks.

In law, judges frequently order juries to ignore, discredit, or dismiss certain evidence they've heard during the trial. Over time, our legal system has evolved strict rules to guide decisionmakers in selecting appropriate frames of reference when deciding matters of guilt or innocence, life or death.

Are these legal rules perfect? No; even if we could define "perfect", these rules continue to evolve, to adapt to changing circumstances. Are there "rules about rules", to constrain the evolution of our legal interpretive frameworks? Yes. But our society hotly debates those rules. "Strict constructionists" argue, first, that we can understand the "original intent" of those who wrote the laws, and second, that we should interpret the meaning of the law today, as being identical to its intended meaning when written (perhaps 200 years ago). Others argue that meanings do, and should, change with the times. The subjective and objective semantic shift is one dramatic example, but more subtle cases abound. For example, the U.S. Supreme Court recently decided (by a 5-to-4 vote) on a specific interpretation of the Constitutional "right to bear arms". Two hundred years ago, there were no machine guns, no biological weapons, etc. But today, we include such weapons in the semantic category of "arms". What frame of reference should we choose, to interpret an instance of the word, "arms", that was written 200 years ago? Who should choose that frame, and how? I.e, via what procedure? Should everybody get to vote on it -- even people with no expertise in assessing the suitability of frameworks, or assessing their own biases? Or should only (selected) lawyers or judges vote? And, collectively, what framework should we choose to evaluate which lawyers or judges to select? These are the difficult questions of how to govern ourselves.

Even to ask such questions is to presuppose certain interpretive frameworks, certain concepts and terms that allow us to "phrase" our questions. And the frameworks we use for "writing" our questions inherently bias the answers we "read" using those frameworks. It's clear that we recognize the reality of Cognitive Frameworks, and the vital importance (at least in some circumstances) of individual or collective procedures to help ensure an appropriate Frame of Reference is selected. Human judgement via any individual's framework is fallible; likewise, any collective decisionmaking framework will be more or less fallible, but in different ways. We are flawed beings, forced to view the world via flawed tools. We don't have the power to change that. But we do have the power to recognize the interactions between our own flaws and our tools' flaws, and therein lies the key to self-improvement and human progress.

 
Survey examples of various interpretive frameworks that demonstrate Cognitive Relativism: 

	Male-female brain differences. 
	good-bad belief structures/demonization/Orwell's 1984. 
	Evaluating applicants' resumes and students' writing -- Male vs. Female attribution. 
		(Similar bias, even when females perform the evaluation.) 
	Psychology experiments:  - "subliminal priming/activation" of economic eval. framework. 
		- MRI studies of enjoying taste of wine when allegedly expensive vs. cheap. 
		- Asian vs. Western perceptual differences (foreground/background biasing). 
	Any *interpretation* necessarily is relative to a particular *conceptual context* -- 
		- Morning Star vs. Evening Star 
	Any *interpretation* necessarily is relative to a particular *motivational context* -- 
		- police lineup (ID the perp, even if all are innocent) 
			Motivation = expectation & desire, besides possible 
				unconscious pre-supposition that only low-lifes get lined-up. 
		- Confirmation Bias:  
			- Conservative critique of Dan Rather 
			- Corresponding critique of Bush WMD "rush to judgement"

To reiterate, Cognitive Relativism is not a disturbing theory, it's a disturbing fact!   Of course we are fallible -- we're only human. We cannot know "objective" truth with certainty, because it may have been distorted by our interpretive framework. (And in assessing our interpretive framework to discover what biases it imparts, we necessarily must use another evaluative framework for that purpose -- a meta-language or meta-framework. But it is conceptual sloppiness to conflate special cases of "recursive" or "self-referential" assessment with the more general case.) By disciplined processes of "intentionally monitoring for unconscious bias", it seems we can better approximate "objective reality". Therefore our responsibility is to strive toward it, continuously. Hence this quote:

"Bring me into the company of those who seek the truth · · · and deliver me from those who have found it."

And, regarding the way our interpretive frameworks shape our worldview, this highly instructive quote:

"Ontology recapitulates philology."

Top of Section

 

Why Conceptual Climate Change is Difficult to Perceive

Every individual has many interpretive frameworks to apply in various circumstances. And we learn new frameworks, and revise (or sometimes discard) old frameworks. Many of these frameworks are communicated, via culture. Some basic versions become so widespread that they warrant being called "Cultural Software". Any framework can change -- intentionally or unintentionally, consciously or unconsciously. "Automatic" evolutionary adaptation of our interpretive frameworks is a "natural process". (The neuro-anatomy and computer-science aspects of neural network evolutionary adaptation are well understood.)
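
As a rough illustration of that "automatic" adaptation, here is a minimal variation-and-selection sketch (my own, assuming a made-up one-number "framework" and a made-up fitness measure; it is not a model of actual neural networks). Nothing in the loop requires anyone to decide to change anything -- small mutations simply accumulate whenever they fit the current environment slightly better:

import random

def fitness(bias, environment):
    # Hypothetical score: how well a framework's "bias" fits its environment.
    return -abs(bias - environment)

def adapt(bias, environment, generations=1000, mutation_size=0.05):
    """Blind variation plus selection: keep any small random change
    that scores at least as well as the current framework."""
    for _ in range(generations):
        mutant = bias + random.uniform(-mutation_size, mutation_size)
        if fitness(mutant, environment) >= fitness(bias, environment):
            bias = mutant  # the change is retained -- and goes unnoticed
    return bias

# Starting at 0.0, the "framework" drifts toward the environment's value of 1.0,
# one imperceptible step at a time.
print(adapt(bias=0.0, environment=1.0))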

Precisely because our conceptual frameworks adapt automatically, we often are unaware when those conceptual structures change. And because our "choice" or application of a particular interpretive framework to a particular situation tends to be automatic and "unthinking", we may not even be aware of the existence of those frameworks. When you are dependent on a tool, and habituated to its use, it tends to become "invisible" -- exactly like eyeglasses become invisible to people who must wear them every day, like public water and sewage infra-structures become invisible to urban dwellers, and like environmental infrastructure services (such as clean air, reliable rainfall, and "forgotten pollinators") had become invisible to modern technological societies. Often, we become aware of the existence of such things -- and our profound dependence on them -- only when they are damaged or destroyed. (Hence the song phrase, "You don't know what you've got 'til it's gone.")

Many cognitive changes seem subtle or small; an individual may not even be aware that they have changed. For example, repeatedly applying a particular interpretive framework may gradually cause a nearly-unbreakable habit -- where we "automatically" apply that interpretive framework to inappropriate situations. As another example, when we ingest "mind-altering substances" that profoundly change our physical behavior, we may remain unaware of our altered mental state. The drunk person who insists their driving is not impaired is an obvious example. Because our "conceptual yardstick" has been distorted, we may become "too drunk to know/realize/recognize that we're drunk."

Frequently, pharmaceutical companies market new prescription drugs widely. But only belatedly, as reports of adverse effects on individuals accumulate -- collected by a central aggregator -- are we able collectively to perceive the magnitude of the problem. At that point, hopefully the FDA (Food and Drug Administration) will require that the drug's label warn patients about psychoactive effects (e.g, don't operate machinery, be on the lookout for mood changes, aggression, etc.).

In many cases, we only become aware of risks to individuals because we can collectively monitor and assess the impacts. Prominent examples of such risks in the physical environment are CFCs that damaged Earth's protective ozone layer, and greenhouse emissions that are damaging the stability of our climate.

 
TOPICS (under construction): 

# Analogy: It's impossible for any individual to directly perceive "climate change". Instead, only indirectly can we perceive it, via instrumentation, statistics, historical measurements, and causal inferences -- E.g, *if* we think about it, only then, when we measure our physical terrain, do we perceive/recognize the *symptom* that our land area is shrinking. We infer that's *caused* by rising sea level, which is *caused* by the ice caps melting, which, we might infer, ultimately is *caused* by our "CO2 bullets". *Or* · · · is that change "natural"?

# Likewise, it's impossible for any individual to directly perceive "conceptual frame change": *If* I notice any change, is that change "in me" or is it "in the world"? And whatever "location" I ascribe to that change, is it a "different kind of change"? I.e, is that change part of "natural variation", or did somebody cause an "un-natural variation" in the distribution of "natural variation"? Instead, we can only perceive it indirectly, via different actions by other people whose Cultural Software embodies a newer "revision" than our own, ... [more?] Causal inferences may be subject to even more serious errors -- i.e, to what extent was a revision a "natural evolutionary update", vs. an intentional "subversive Info-war revolutionary revision"? ===> Escalation: Launch-on-Warning, pre-emptive Info-war, etc.

Top of Section

 

Doesn't Cognitive Relativism imply (or excuse) Moral Relativism?

Objection: Doesn't Cognitive Relativism imply (or excuse) Moral Relativism? (I.e, an "anything goes" approach that undermines any and all moral/ethical standards.) Even if Cognitive Relativism is an inescapable fact, we may not want to *acknowledge* that, because then it would offer a social license to excuse unethical actions.

Response: Cognitive Relativism *is* a fact, but it does *not* imply Moral Relativism. It only implies uncertainty regarding exactly what constitutes the Moral Absolute. Our society already acknowledges (indeed, is riven by) disagreements regarding how tradeoffs/conflicts among various ethical values should be reconciled. Cognitive Relativism offers a framework that helps us address ethical disagreements, by disambiguating them from (some) cognitive disputes. (Case in point: how the clash between Evolution and Biblical Creation boils down to hermeneutics.)

Accusations of "Moral Relativism" often are merely case studies in the confusion/conflation of Ethics with arbitrary Cultural traditions. Example: In America, it's wrong to drive on the left side of a 2-way road (unless you're an emergency vehicle with lights & siren). Yet in England, it's wrong to drive on the right side of a 2-way road. That's not moral relativism, but merely a different cultural "free variable instantiation" or "library routine" in the same Absolute Moral Code. (Which says, "It's wrong to drive on the *non-conventional* side.") Better if we disambiguate Ethical from Cultural interpretive frameworks!
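
Here is a toy rendering of that "free variable" metaphor (my own sketch; the function and parameter names are invented): the same absolute rule is instantiated with a different cultural parameter in each country.

# Cultural "free variables": which side of the road is conventional where.
CONVENTIONAL_SIDE = {"America": "right", "England": "left"}

def is_wrong(country, side_driven, emergency_vehicle=False):
    """The Absolute Moral Code's rule: it's wrong to drive on the
    *non-conventional* side (emergency vehicles excepted)."""
    if emergency_vehicle:
        return False
    return side_driven != CONVENTIONAL_SIDE[country]

# Same absolute rule, different cultural instantiations:
assert is_wrong("America", "left") and not is_wrong("America", "right")
assert is_wrong("England", "right") and not is_wrong("England", "left")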

Society already is replete with people who *behave* according to Moral Relativism. By openly acknowledging Cognitive Relativism -- and making explicit (to self and others) our various Cognitive and Ethical interpretive framework(s) -- we help bring that Moral Relativism out into open public view. Contrast this "sunshine is the best disinfectant" approach with the private "mental reservation", an approach widely (ab-)used in both the Catholic and Protestant traditions. The following [annotated] excerpt is from:

      "Lying: Moral Choice in Public and Private Life" --   by Sissela Bok (1978; Vintage Books) (pp. 14, 15, 37)

Certain religious and moral traditions were rigorously opposed to all lying. Yet some adherents wanted to recognize at least a few circumstances when intentionally misleading statements could be allowed. The only way out for them was ... to define lies in such a way that some falsehoods did not count as lies.   · · · casuist thinkers developed the notion of the "mental reservation", which, in some extreme formulations, can allow you to make a completely misleading statement, so long as you add something in your own mind to make it true.   [i.e, you smuggle in Moral Relativism via a hidden, private interpretive framework!]   Thus, if you are asked whether you broke somebody's vase, you could answer, "No," adding in your own mind the mental reservation "not last year" to make the statement a true one.     · · ·   [This "mental reservation" approach] left out the speaker's intention to deceive, as part of the definition [of lying]. It thereby allowed the following argument: If you say something misleading to another and ... add a qualification to it in your own mind so as to make it true, you cannot be held responsible for the "misinterpretation" made by the listener.   [I.e, that argument makes the patently false and self-serving claim that you could not reasonably have been expected to know that your listener would use a different interpretive framework!]

Parenthetically, note interesting parallels between "too onerous" rules of morality, security (computer & otherwise), and law (e.g, 55 MPH speed limits, or alcohol prohibition). In all these cases, insisting on "too strict" a social standard made the Absolute "perfect" the enemy of the Relative "good" (i.e, the "better").

Top of Section