These words make it all worth it.

From someone to me, regarding recent events:

Thank you. The courage you have shown in this endeavor is remarkable. Instead of randomly throwing a pebble in the water, you unleashed a precision guided boulder to which I will enjoy witnessing any resulting tsunamis. No matter what outcomes occur, your intentions were honorable and unselfish and as with all of the other adjectives floating in my head, are demonstrating what everyone strives to be, a leader.

Words fail me.

Inspiring Men: My Grandfather

(Below is a mostly unedited letter I sent to someone a few months ago. The only changes I have made were grammatical and removing my grandfather’s name for operational security purposes. Even though, at the time of writing, I hadn’t had some of the epiphanies I have had recently, I still think my story may be of some use to some of the readers of the Spearhead.)

Nothing – absolutely nothing – in life is permanent. If you spend your entire life dreading the loss of something, you might not ever get to fully enjoy that thing. This is a lesson I’ve had to learn the hard way, and I think it might help you to hear a story from me, maybe so you can learn the same lesson I did.

Part of the reason I enlisted in the Marine Corps was that my grandfather retired as a Lieutenant Colonel from the Marines. Growing up, he had always been something of a hero to me – even if I didn’t really understand much about the Marine Corps, or feel like I knew him well. In fact, he once made it very clear to me that I was the son of the black sheep of his family – he still loved me, I felt, but my exposure to him was limited. I take my middle name from him.

After I enlisted – and saw what the Marine Corps was really like – my love and admiration for him skyrocketed. It’s something people can never understand unless they go through it. I can’t even try to put words to it; I know it’s futile – others have tried and failed. I heard that my graduation photos from boot camp had circulated back to him, and that for the first time in a long time, his family saw him cry, he was so proud.

Training in the Marine Corps is lengthy and intense. I wanted very much, as soon as I was able, to take leave and meet with him, to talk about the Marine Corps and give him a chance to talk to someone who could understand. As I continued to work on my own memoirs, I realized that my grandfather had stories – amazing stories – that he had probably never told anyone. He had fought – and survived – on Iwo Jima! He never talked about it much with anyone, and after being in the Marine Corps, I can understand why… people on the outside just don’t get it.

As my training was nearing completion, I received bad news. My grandfather was coming down with Alzheimer’s. By the time I graduated, members of my extended family, who were with him, made it clear that his memory was pretty much gone – he couldn’t remember his own children anymore. I had missed my chance – forever – to really talk to him and understand his life.

I grieved, candidly, in my own fashion. I grieve now, as I recall. Grieving is natural. You just can’t get stuck on it. What I realized, as I became stuck on it, was that I couldn’t control the situation – I had no ability to influence his disease or his memory. Furthermore, I was sure that I would do his spirit no honor by remaining paralyzed in grief. Instead of grieving, I took the time to dredge up a deep introspective dialogue – sifting through my own memories of my grandfather.

And as I did this tough work, rather than remember with grief and regret and longing, I focused instead on cherishing each memory, remembering it to the fullest and enjoying it as though I were there again. Some memories I enjoyed for the “first” time: as a child I had not enjoyed that particular experience, but as a man with new understanding, I could cherish it in a new way.

Part of my healing process involved investigating my family history. I tried to learn as much about him as I could – being that I hadn’t known him very well, and what I did know came from youth. He was a beautiful man. (I’m crying a bit right now, but it’s not painful. I am proud to be his grandson.) Consider the trials and tribulations he went through: he was one of MANY children from a very poor family, suffered through abuse, disowned his own family much like I have had to do, paid his own way through college, entered the Marine Corps as an officer in lieu of what he could have done with his education, and served in World War II and the Korean War (he proposed to his wife and got married shortly before reporting in for The Basic School, right after receiving his commission – he would not see his wife again for two years… they stayed married until she died of old age). He then retired in order to become a public servant in another way – by being a teacher and then principal at his local high school.

Did he do everything right? No, perhaps not. My father claims the way he was raised by my grandfather was not fair or healthy – but my grandfather’s other children seemed to have turned out better than my father, so who is to say who is right? But the struggles he went through, the pain and adversity he must have felt, resonated with my own life, and I felt very close to him. There was nothing that I could do for him in his final days, but that was okay, because I could live the rest of my life in his honor. I used to want to change my name in order to disown my family, which I had come to hate. But learning about my grandfather this way, after he was already effectively taken from me, restored my faith and pride in my family name.

Now, when times are dark and when I wish I wasn’t in the Marine Corps or doing other things, I turn my thoughts back towards Bill, and it gives me strength and resolve.

This turned out longer than I intended. A lot of my coping with Bill’s situation was done on a more subconscious, nonverbal level, also. This is the first time I’ve told ANYONE – even my few close friends – about this. The lesson I learned was to not be consumed by grief over loss; to instead channel that grief into something more positive. Everything must eventually come to an end, so it does not make sense to dread that time and to waste your energy being full of regret and sadness. Let the passing of something you’ve cherished be a cause for remembrance and cherishing. Let it be a new beginning – something to live the rest of your life for, rather than spend the rest of your life mourning.

These philosophies also dovetail with a revelation I had during the more intense training phases of my Marine Corps career, when I literally trained every waking moment to kill and be killed. Life is so fragile and transient. We are all so very fragile and vulnerable. It makes no sense, none whatsoever, to dedicate your life to seeking achievement or material gain over emotional depth and well-being. There will always be more work to be done tomorrow, and there will always be another achievement to seek or another record to break. Eventually, we will all pass our peak, and in all likelihood, have things left we still want to achieve or accomplish that we cannot. However, we may not always have a second chance to tell someone we care about that we love them, or another chance to get to know that someone interesting just a bit better. Take risks in the name of enriching your relational life – strip everything else away and the measure of your life, I think, is the impact you had on other people and on the bonds you forged with them – on the families and communities you forged or were a part of.

Communicating Thoughts and Thinking About Communication; The Duality of Reason

(Author’s note: this piece was originally prepared as an essay for a philosophical conference, and so was tailored to be as general as possible. The application to men’s rights should be obvious, and I encourage readers to comment on various points raised in the essay as they apply specifically to men’s rights. There should be plenty of points of departure.)

1. Introduction

Philosophy, in the broadest and most fundamental sense, is concerned with thinking. But this raises the obvious question – what is thinking? One professor at the University of Utah asked his students this simple question, with varying results:

I recently asked some college honor students to define thinking. After pondering the question, a student majoring in sociology said, “It consists of reflecting on some idea or insight and exploring its logical connections and implications for making sense out of something.” In response to the same question, an English major responded: “It’s the ability to write a convincing argument in support of a particular point of view.” According to a premedical student majoring in biology, “Thinking is the ability to use information for analyzing data in order to solve some problem.” A philosophy major said without hesitation, “It’s a critical openness to new ideas as one explores their logical foundations.” And a student whose major is undecided said, “It’s what I’m trying to do in response to your darn question.” (Geertsen, 2003)

It seems that a precise definition eludes consensus; indeed, as American writer and television producer J. Michael Straczynski once famously remarked, “The quality of our thoughts is bordered on all sides by our facility with language” (Lewis, 2009). The purpose of this essay is to more closely examine the link between language and thought, and put forth the foundation for an argument that asserts that language is, more or less, thought, as well as consider the implications of that idea.

2. Evolution of Language

In his work, Evolutionary Biology of Language, Martin A. Nowak goes to great lengths to establish a logical model that tracks the way in which a simple system of symbols could have evolved into modern language. His model involves quite a bit of advanced logic and mathematical concepts (which are beyond the scope of this essay), but the basic idea is that language evolved from a rigid system of limited symbols whereby one object had one symbol to the more modern system which allows for (perhaps) unlimited expression of ideas and events. Another expert who has also tracked the evolution of language remarks on the character of modern language systems: “Present-day human languages can be readily deployed to talk about events, objects, people and places far removed in space and time from the act of speaking, and the signs used to talk about such displaced referents have no detectable physical similarity to the referents themselves” (Urban, 2002).

Nowak (2000) incorporates arguments about evolution in his analysis of language: “Evolution relies on the transfer of information from one generation to the next. For billions of years this process was limited to the transfer of genetic information. Language facilitates the transfer of non-genetic information and thus leads to a new mode of evolution.” Essentially, he asserts that language has evolutionary advantages, and thus more effective (or “fit”) systems of languages would be the ones that get passed down from our ancestors, while less “fit” systems would become extinct. How do systems become more or less fit? It has to do with how many symbols and how many definitions there are in a system:

In other words, adding the possibility of describing more and more objects (or concepts) to the repertoire of a language cannot increase the maximum amount of information transfer beyond a certain limit. If, in addition, we assume that objects have different values, then we find that the maximum fitness is usually achieved by limiting the repertoire of the language to a small number of objects. Increasing the repertoire of the language can reduce fitness. Hence natural selection will prefer communication systems with limited repertoires.

He goes on to assert that “successful communication increases the survival probability or performance during life history and hence enhances the expected number of offspring. Thus, language is of adaptive value and contributes to biological fitness.”

If languages which retain fewer concepts (“reduced repertoires”) have increased biological fitness and evolutionary advantage, how did modern language come to be so complex and accommodate so much ambiguity and confusion? Nowak (2000) offers his thoughts in his conclusion (emphasis my own):

Efficient and unambiguous communication together with easy learnability of the language is rewarded in terms of pay-off and fitness. While we think that these are absolutely fundamental and necessary assumptions for much of language evolution, we also note the seemingly unnecessary complexity of current languages. Certainly, systems designed by evolution are often not optimized from an engineering perspective. Moreover, it seems likely that at times evolutionary forces were at work to make things more ambiguous and harder to learn, such that only a few selected listeners could understand the message. If a good language performance enhances the reputation of the group, we can also imagine an arms race towards increased and unnecessary complexity. Such a process can drive the numbers of words and rules beyond what would be best for efficient information exchange. This should be the subject of papers to come. (Nowak, 2000)

Here Nowak admits to one of the most fundamental problems with language – the way in which it can be ideologized. Before I tackle that idea, which deals with large social systems, let us first examine more mundane dangers associated with misunderstandings of language.

3. Language Assumptions

“Everyone who reads this paper knows of the order of 50,000 words of his primary language. These words are stored in the ‘mental lexicon’ together with one or several meanings, and some information about how they relate to other words and how they fit into sentences” (Nowak, 2000). Language is a fundamental fact of human existence, but it is also one that is taken for granted. As the quote above illustrates, we all have a sort of mental dictionary we tote around, and during communication, it is all too natural to assume that the definition we have in mind for a word we use when we are speaking matches the definition that our listener has in mind for the same word. Often times, conflicts that arise as a result of this are minimal, but other times they can have important consequences. Imagine a scenario where a friendly girl tosses around the phrase “I love you” in the most trivial of ways, utilizing the phrase as a sort of goodbye, which seems so common these days. (Her interpretation of the word love is casual and can be supported by more than one of the many definitions found in any number of dictionaries – the dictionary.com website has no less than 14 definitions for love.) In a communication moment with someone who thinks of the word love as having more gravitas, she is bound to create a miscommunication – the person to whom she is speaking will receive an entirely different message than the one she intends. This could result in disappointment for her listener, such as in the case of her listener being a man who was infatuated with her. Twist the scenario a bit, and imagine she did mean the more serious interpretation of love, whereas her male listener assumed she meant a more casual one, and now she is prone to feel the negative impacts of communication loss.

Recall Nowak’s (2000) idea of language fitness, first visited above: “…The fitness contribution of a language can be formulated as the probability that two individuals know the correct word for a given event summed over all events and weighted with the rate of occurrence of these events.” Essentially, a language is more “fit” if the chances of the speaker and listener having the same definition for a word (love in the previous example) are high. It follows that the more disparate definitions there are for the same word in a given language, the less fit that language becomes. English seems especially rife with words that have numerous and disparate meanings, and it is no exaggeration to say that a 15-page paper could be written on this subject alone. I implore you to consider the use of words such as “socialist,” “communist,” “harassment,” “equality,” “family values,” “oppression,” or nearly any other politically charged word in the public sphere, with the idea that the person speaking the word could be talking about something entirely different from the person listening to the word, even though they are both considering the same word. Since this essay is concerned only with impelling contemplation, the previous analyses should suffice as a primer on this particular point.
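
To make this formulation concrete, here is a minimal sketch in Python. The events, rates, and lexicons are invented purely for illustration (they are not taken from Nowak’s paper); the sketch simply computes a toy language’s fitness as the probability that speaker and listener share a word for an event, weighted by how often each event occurs. Note how a mismatch over the word for affection – the “love” example above – shows up directly as lost fitness.

```python
# Toy illustration of Nowak's language-fitness idea (hypothetical numbers).
# Fitness = sum over events of:
#   P(event occurs) * P(speaker and listener use the same word for it)

# Each person's "mental lexicon": event -> the word they would use for it.
speaker_lexicon = {"danger": "run", "food": "eat", "affection": "love"}
listener_lexicon = {"danger": "run", "food": "eat", "affection": "like"}

# How often each event needs to be communicated (rates sum to 1).
event_rates = {"danger": 0.5, "food": 0.3, "affection": 0.2}

def language_fitness(speaker, listener, rates):
    """Weighted probability that both parties share a word for an event."""
    fitness = 0.0
    for event, rate in rates.items():
        same_word = speaker.get(event) == listener.get(event)
        fitness += rate * (1.0 if same_word else 0.0)
    return fitness

print(language_fitness(speaker_lexicon, listener_lexicon, event_rates))
# -> 0.8: the disagreement over "affection" costs exactly that event's weight.
```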

However, definitions of words are not the only assumptions we make regarding language. The literature establishes that the average person is both ignorant and arrogant when it comes to language, a rather dangerous combination:

In matters of language history, structure, function, and standardization, the average individual is, for the most part, simultaneously uninformed and highly opinionated. When asked directly about language use, most people will draw a very solid basic distinction of ‘standard’ (proper, correct) English vs. everything else. (Lippi-Green, 1994)

That we posit the existence of something called language can itself be considered an assumption, which, furthermore, has an impact on how human societies are organized:

Beliefs about what is or is not a real language, and underlying these beliefs, the notion that there are distinctly identifiable languages that can be isolated, named, and counted, enter into strategies of social domination. Such beliefs…have contributed to profound decisions about, for example, the civility or even the humanity of subjects of colonial domination. They also qualify or disqualify speech varieties from certain institutional uses and their speakers from access to domains of privilege. (Woolard, & Schieffelin, 1994)

More examples of how language has a tangible impact on our lives will be discussed later. We hold other assumptions about language that are of paramount philosophical importance. Indeed, how we think about language may impact perceptions as fundamental as how we define ourselves: “Language socialization studies have demonstrated connections among folk theories of language acquisition, linguistic practices, and key cultural ideas about personhood” (Woolard, & Schieffelin, 1994). Moreover, thoughts regarding language (especially in the Western nations, like the United States) underpin assumptions regarding the nature of reality: “In the vernacular belief system of Western culture, language standards are not recognized as human artifacts, but are naturalized by metaphors such as that of the free market” (Woolard, & Schieffelin, 1994). This stands in stark contrast to a more reasoned and self-examined perspective, informed by a more comprehensive understanding of language, its functions, and its evolution:

Deconstructive rhetorical analysis is based on the premise that all claims to transcendent truths are radically undercut by the fact that they are made within a given language and culture which impose limits on the thought and perception of individuals making the claim. We do not have unmediated access to a truth; rather, our view of the world is a function of a set of culturally constructed assumptions which shape our perception of the truth…Deconstructive critics also assert, though, that rhetoric is always open to multiple interpretations which are themselves a function of the interpreters’ own beliefs and values. Any deconstructive reading is offered as one among many possible interpretations.” (Blanton, McLaughlin, & Moorman, 1994)

A key point here is that not only is language ineffective for establishing perfectly objective observations of reality from the perspective of the speaker, it also depends upon the interpretations of the listener. Thus, miscommunication can result either from poorly phrased speaking or from various deficiencies in the listener. For example, while a speaker’s poor accent can increase the odds of communication loss, a listener’s desire to understand the speaker is perhaps even more important:

…Accent…is most likely to pose a barrier to effective communication when two elements are lacking. The first is a basic level of communicative competence on the part of the speaker…. The second element, even more important but far more difficult to assess, is the listener’s good will. Without the goodwill, the speaker’s…degree of communicative competence is irrelevant. Prejudiced listeners cannot hear what a person has to say, because accent, as a mirror of social identity and a litmus test for exclusion, is more important.” (Lippi-Green, 1994)

The assumptions of language we take for granted can be exploited through the imposition of language ideologies.

4. Language Ideologies

The phrase “language ideology” is defined by different authors in different ways, but for the purpose of this essay I provide the following: “The definition [of language ideology] used here is: a bias toward an abstracted, idealized, homogenous spoken language which is imposed from above, and which takes as its model the written language. The most salient feature is the goal of suppression of variation of all kinds” (Lippi-Green, 1994). In other words, a language ideology seeks to impose a standard of language upon as many potential speakers as possible (for example, within a nation as in a national language, though certain social groups may exclude others on the basis of ‘official’ languages).

Considering that language can be crucial to how one defines one’s place in the world (as examined in the article Language and Borders), such totalitarian impositions have observable consequences:

The phrase “language and borders” suggests that language differences signify categories of person defined by ethnic or national origin and that these categories are opposed to each other. People act in ways that are taken as “having” a language, which is equated to “belonging” to an origin group. Borders emerge in specific contexts as a metonymy of person, language, and origin category. This metonymy can be fleeting or quite rigid and in varying degrees politicized. (Urciuoli, 1995)

Stated more simply, the language a person speaks comes to fully define that person in the perception of others. Ideas like ethnicity and nationality could be deconstructed as nothing more than a difference in language. Urciuoli (1995) goes on further to say, “What does exist, in any society, is the fact of linguistic variation from which people deploy language forms in acts of identity. From such acts, people’s sense of community, group, and language emerge in specific places and times.” This need not be interpreted only with regards to nations – think also of subcultures, such as internet gaming communities, the military, or any other culture which has a language unto itself.

Due to length considerations, this essay cannot fully examine the ways in which various parts of language have important implications. The literature points out that accents play a key role in the way a listener perceives a speaker: “…Prosodics and accents…are key in the perception of ethnic and race boundaries that thread their way through ordinary situations and that have real-world consequences for people’s social options” (Urciuoli, 1995). Additionally, notions regarding systems of writing and literacy are fueled by cultural factors: “Anthropological studies of literacy…recognized belatedly that it is not an autonomous, neutral technology, but rather is culturally organized, ideologically grounded, and historically contingent, shaped by political, social, and economic forces” (Woolard, & Schieffelin, 1994). I would now like to take the time to examine the counterargument that language is not the only way to think, that surely there must be some other way of thinking.

5. Alternative Models of Thinking

The most common split in cognition theories is that humans are capable of thinking verbally and mathematically, and that these two modes of thinking are distinct from each other. Such a worldview is evident in the organization of the SAT exam, for example, which is split between math and verbal components. Nowak (2000) seems to hint that language is not the only mode of thinking when he writes, “Our language performance relies on precisely coordinated interactions of various parts of our neural and other anatomy, and we are amazingly good at it. We can all speak without thinking. In contrast, we cannot perform basic mathematic operations without concentration.”

Let us examine mathematical thinking more closely. At first glance, the literature is convincing in establishing mathematical thinking as distinct from thinking through language:

Most schools assume that teaching mathematics compulsorily and over a number of years they are providing the conditions through which pupils will develop their mathematical thinking. This assumption, usually unchallenged, rests on a view of mathematics as a logically developed discipline, together with the expectation that the logic will spill over and be absorbed by the pupils into all aspects of their lives as they pursue a study of the content of mathematics, for example, in learning number, geometry, trigonometry, or algebra. Experience, however, tells a very different story…Certainly an inordinate amount of time in schools is spent teaching mathematical content and techniques while the process, the means through which mathematics is derived, receives little attention….Exploring process is not very profitable when teachers do not understand the kinds of thinking from which process springs. (Burton, 1984)

In summary, mathematical thinking is a different process for thinking, and one assumes it would be distinct in nature from thinking through language. However, once the literature is read more deeply, apparent distinctions evaporate (emphasis the author’s own): “The process is initiated by encountering an element with enough surprise or curiosity to impel exploration of it by manipulating… Although the sense of what is happening is vague, further manipulating is required until the sense can be expressed as an articulation” (Burton, 1984). Expressing an articulation? Isn’t that the precise point of language? Yet wait, there’s more (emphasis my own):

Pupils need tools to help them structure their responses so that they can build their reflective powers. Further, they need encouragement to capture their feelings at the moment of expression. Consequently, students of all ages have been encouraged to develop the use of particular words that reflect their responses as they tackle questions…The key to recognizing and using mathematical thinking lies in creating an atmosphere that builds confidence to question, challenge, and reflect. (Burton, 1984)

I assert that mathematics is just another type of language which is also based on symbols (generally, numbers instead of words), and that, in fact, all human thinking is symbolic in nature. There is a school of thought which asserts truth through symbolic logic, and it is worth examining at this juncture.

6. Symbolic Logic

What is symbolic logic? Aside from a rule system that has useful applications in computer programming and math, for instance, I assert it is nothing more than another language. In any event, one proponent describes it as follows (emphasis my own):

…Symbolic logic is, in its broadest sense, a new science which studies through use of efficient symbols the nature and properties of all nonnumerical relations, seeking precise meanings and necessary conclusions. As an applied science, it holds immense promise. For example, it may give us an unambiguous language for political, economic, and social fields, which will conveniently reflect the structure of these fields and make discussion and analysis easy. (Berkeley, 1942)

Symbolic logic seeks “precise meanings” and supplies us an “unambiguous language…” It should be clear at this point that it is nothing more than another language (and thus owes no higher claim to The Truth than any other language), but, the following quote about one of symbolic language’s chief powers may help shed some more light on the situation (emphasis my own): “We observe first that symbolic logic can define certain ideas which neither mathematics nor the dictionary can possibly define; for example, symbolic logic can define number.” (Berkeley, 1942) There again, symbolic logic is concerned with defining things (and it is even said later in the article that symbolic language competes with dictionaries); how can one not conclude it is just another language?
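
As a small illustration of this point – that symbolic logic is, in practice, one more system of symbols with its own vocabulary and grammar – here is a minimal, hypothetical sketch in Python. The proposition and the names used are my own invention and are not drawn from Berkeley; the point is only that a logical formula, like an English sentence, carries meaning only through agreed-upon symbols and rules for combining them.

```python
# A proposition written in the "language" of symbolic logic:
#   (P and Q) implies R
# rendered as an ordinary function over truth values.

from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: false only when the antecedent holds
    # and the consequent does not.
    return (not a) or b

def proposition(p: bool, q: bool, r: bool) -> bool:
    return implies(p and q, r)

# The formula's entire "definition" is a table of symbol combinations,
# much as a dictionary entry enumerates a word's senses.
for p, q, r in product([True, False], repeat=3):
    print(p, q, r, "->", proposition(p, q, r))
```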

For those unfamiliar, symbolic logic did indeed take off as a philosophical idea and has had many practical and important impacts. However, as mentioned before, it is no more a path to conceptualizing The Truth or reality than any other path. It is still a language, though a refined one, and still subject to the pitfalls of language (emphasis author’s own):

As we noted earlier, no language, as the product of a given culture and history, can claim to have unmediated access to the real. Making such a simple assertion implies a kind of cultural arrogance, forgetful as it is of the multiplicity of languages and of the linguistic reality that each provides a variety of ways to structure the real. Language is a mediational tool which enables the construction of cultural reality. Such terms as real, authentic, and genuine, especially when they are repeated without much critical self-awareness, give the impression that successful language use provides access to The Truth itself.” (Blanton, McLaughlin, & Moorman, 1994)

Symbolic logic was not without critics, however.

One author wrote a multifaceted rebuttal based on logical arguments, though a full treatment of those arguments is beyond the scope of this essay. Of more relevance is the following analysis, which rejects the idea that the definitions of things should be fixed, or, in other words, that language ideologies should be enforced (emphasis author’s own):

But Formal Logic has perversely chosen to build on the fiction that the meaning of terms is (or ought to be) fixed, and to talk about propositions rather than judgments. So the proposition becomes a helpless formula, totally incapable of reproducing the features of living thought. It has acquired its meanings from past uses; but these do not protect it against ‘willful modifications’ at the hands of masters of language like Humpty Dumpty, who make words mean what they please…Is not the whole history of philosophy one long illustration of philosophic audacity in manipulating language, and does not experience show that philosophers frequently get away with their arbitrary modifications of ‘the’ meaning of words and ‘propositions?’ I can not admit, therefore, that…symbolic logic [is] in any way relevant to the procedures of our actual thinking. (Schiller, 1932)

An important implication of this argument is that words/ideas/what-have-you are defined by past uses, but this provides no protection from those who would manipulate definitions for their own advancement. Thus the Jew in Nazi Germany, for instance, could suddenly become the scapegoat for an entire nation’s woes. There are many examples of this but unfortunately little room for a detailed analysis; I am sure the reader can think of his or her own examples of how words have been manipulated to mean entirely different things, for good and for ill. The takeaway point from this section is that a more accurate conceptualization of human thinking is thus: all human thinking is symbolic – and language can be thought of as nothing more than a system of symbols created for the purposes of communication. Therefore, symbolic logic is ‘just’ another language, like English or mathematics.

7. Practical, or Less Abstract Implications

So far, many of the examples mentioned in this paper have been of an abstract nature. Here, I hope to provide a brief look at some real-world, practical implications of the understanding of language I have outlined above. Geertsen (2003) outlines why it is important to think more clearly:

What makes higher-level thinking so important? To begin with, we live in a world of unprecedented change and expansion in information. New information continues to multiply as old information becomes obsolete…Constant and accelerating shifts in information mean that all members of society need greater skill in assessing and evaluating knowledge.

An understanding of language ideologies may yield answers to significant and challenging contemporary problems, including (but certainly not limited to) those outlined in the quote below:

Many populations around the world, in multifarious ways, posit fundamental linkages among such apparently diverse cultural categories as language, spelling, grammar, nation, gender, simplicity, intentionality, authenticity, knowledge, development, power, and tradition. But our professional attention has only begun to turn to understanding when and how those links are forged – whether by participants or their expert analysts – and what their consequences might be for linguistic and social life. A wealth of public problems hinge on language ideology. Examples from the headlines of United States newspapers include bilingual policy and the official English movement; questions about free speech and harassment; the meaning of multiculturalism in schools and texts; the exclusion of jurors who might rely on their own native-speaker understanding of non-English testimony; and the question of journalists’ responsibilities and the truthful representation of direct speech. Coming to grips with such public issues means coming to grips with the nature and working of language ideology. (Woolard, & Schieffelin, 1994)

Indeed, an understanding of language ideologies may be critical to achieving true intellectual freedom. As quoted from Nowak’s (2000) work above, it is not unfeasible to imagine that certain people/groups/interests have a vested interest in an “arms race” towards ever increasing complexity and ambiguity. As his work demonstrates, this is more than just a social justice issue: if the fitness of our language continues to deteriorate and we can no longer efficiently and effectively communicate with one another, we will be at an evolutionary disadvantage. How much danger we are in is up for debate, but it certainly warrants consideration.

We should be wary of ideological interpretations of language: “Important sociolinguistic changes can be set off by ideological interpretation of language use, although because they derive only from a larger social dialectic, such changes are likely to take an unintended direction, as in the historical case of the second person pronoun shift in English.” (Woolard, & Schieffelin, 1994)

One example I would point out would be the “politically correct” movement, which seeks to define what is and is not acceptable for conversation and even intellectual debate. Such ideas are dangerous because they limit the amount of discourse in society and in the academy, and, further, can allow for certain ideologies to propagate unopposed and without critical evaluation. The whole idea of freedom of speech, after all, is to protect the ideas that we do not like to hear; the ideas that we enjoy hearing need no protection, and yet, if we do not listen to ideas we do not like, we may not be able to see the ways that the ideas we do like potentially poison our thinking (and in turn, our society and world).

Figurative speech is often employed for ideological purposes. This essay cannot hope to examine every use of figurative speech possible, but will provide one example related to the “whole language” movement in education to illustrate the point (emphasis author’s own):

We mean the term rhetoric to refer to the effort to persuade or argue forcefully for a position. More specifically, following a tradition that goes back to antiquity, we use it to refer to…figures of speech….An example…is the use of the word ownership to describe the relationship that the movement wants to foster between student writers and the texts they produce. In this context the word ownership is being used figuratively. It does not refer literally to an act of economic possession; rather, it uses that act as an analogy for the idea that writers can have control over and feel pride in what they write. The use of the word ownership in this metaphorical or figurative way serves as a rallying cry for teachers. The intense control that the word implies is a goal that teachers can strive for, a value that they can share. Thus the use of this figure attempts to persuade the uninitiated and to produce group solidarity. It serves the rhetoric of the movement. (Blanton, McLaughlin, & Moorman, 1994)

Figurative language can be evaluated critically to reveal the deepest assumptions of its users, however:

We argue, though, that the figurative language of a text does more than persuade. Read critically, it also reveals the deepest assumptions that underlie the text’s arguments. All arguments proceed from a set of assumptions held by the persons making the arguments. These assumptions are what can be taken for granted, the unquestioned ‘truths’ that underlie the explicit claims evident in the text. (Blanton, McLaughlin, & Moorman, 1994)

A wise philosophy professor I had the good fortune of studying under once told our class that philosophy was the business of questioning assumptions. If that is the case, we should always be on the lookout for figurative language, and seek to evaluate the assumptions that lie lurking below powerful rhetorical language.

8. Conclusion

We are left then with the simple-seeming question posed at the beginning of this paper: what is thinking, and what is the proper way to go about thinking? The answer is likewise simple: thinking, and the proper way to go about it, are entirely up to the thinker to define. Some may deride this idea as infantile and naive, but I believe a full appreciation of its implications reveals it to be as liberating a philosophy as can be conceived – it permits true freedom of thought. As Kierkegaard once remarked: “The thing is to find a truth which is true for me, to find an idea for which I can live and die.” The extent to which we can conceptualize thinking as correct applies only so far as a thinker can effectively communicate with another through whichever language facilitates the most clarity between the two – be it English, algebra, symbolic logic, or some soon-to-be invented language. Cognizance of the communication medium and respect for differing abilities among speakers/listeners to comprehend messages encoded in that medium are paramount to understanding and commonality. Indeed, “A…crucial concept is that the burden of communication is shared, on every level, by both participants…” (Lippi-Green, 1994). Perhaps, one day, humans will evolve an entirely new system to replace symbolic thinking/language, but until that day we are compelled to live with what we have. A rigorous review and critical evaluation of the mediums we choose to communicate in, and all the associated implications, seems likely to reduce unnecessary conflict and to, dare I say, promote peace.

Works Cited

Geertsen, HR. (2003). Rethinking thinking about higher-level thinking. Teaching Sociology, 31(1), 1-19.

Lewis, JJ. (2009). Wisdom quotes. Retrieved from http://www.wisdomquotes.com/cat_wisdom.html

Urban, G. (2002). Metasignaling and language origins. American Anthropologist, 104(1), 233-246.

Nowak, MA. (2000). Evolutionary biology of language. Philosophical Transactions: Biological Sciences, 355(1403), 1615-1622.

Lippi-Green, R. (1994). Accent, standard language ideology, and discriminatory pretext in the courts. Language in Society, 23(2), 163-198.

Woolard, KA., & Schieffelin, BB. (1994). Language ideology. Annual Review of Anthropology, 23, 55-82.

Blanton, W., McLaughlin, T., & Moorman, G. (1994). The rhetoric of whole language. Reading Research Quarterly, 29(4), 308-329.

Urciuoli, B. (1995). Language and borders. Annual Review of Anthropology, 24, 525-546.

Burton, L. (1984). Mathematical thinking: the struggle for meaning. Journal for Research in Mathematics Education, 15(1), 35-49.

Berkeley, EC. (1942). Conditions affecting the application of symbolic logic. The Journal of Symbolic Logic, 7(4), 160-168.

Schiller, FCS. (1932). The principles of symbolic logic. The Journal of Philosophy, 29(20), 550-552.

New piece coming soon

I just wrapped up an essay I’d been working on for a philosophical conference, and submitted it over to The Spearhead. I am told it should be up within the next day or so. I wouldn’t mind becoming a regular contributor over there, but we shall see. If that becomes the case, I could use this blog to merely talk about video games and the military. Haha!

::EDIT:: Here it is.

Another good comment

I’ll get around to generating my own content again shortly (long weekend, competing priorities) but for now, I rather enjoy pointing out good comments on posts that may otherwise go unnoticed.

The comment is a sort of companion diatribe to this analysis (author’s blog) of common “feminist” (I use the term loosely and apply it generally – I accept/acknowledge the hypothetical possibility of an honest to goodness, logical “feminist” even though I haven’t met one) debate tactics.

Enjoy!
[EDIT] Kudos award to this comment for presenting a great dilemma for those advocating a broadening of the definition of rape. Plus, this keeps language narrow and specific, which we all know I’m a fan of.
And here’s some lulz.

Moral of the Story: Get a Vasectomy

This comment on this post was just too hilarious (and awesome) to let get buried.

TL;DR summary: guy gets a vasectomy. Three months later, he meets a woman with whom he enjoys a casual relationship. She had other designs, however, and about four months into the relationship she plays the “I am pregnant” game, which our hero indulges with a bit of pleasure (knowing, obviously, that the child is not his). After revealing her to be completely crazy and immoral, he performs an awesome coup de grâce that leaves her sobbing.
Comeuppance! Catharsis! Woo.

The Lies We Live With – Independence

I contend that it is impossible to live an absolutely honest life. We all live with lies (and by this I mean the simplest definition of a lie: “an inaccurate or false statement,” whether it is intentional or not). This could be the result of any number of things – indoctrination, propaganda, self-defense, who knows. One of the lies I lived with for a long time was related to my indoctrination into the modern liberal mindset, a process I had very little choice in.

But I’ve lived with a variety of other lies. Human memory is faulty, and so we may come to believe things about our past that are untrue. During my exile in Utah, only after deep reflection and a painful confrontation with my past, did I learn that I had convinced myself of an outright falsehood which had fundamentally altered the way I approached relating to other people. How or why this happened, I cannot say for sure.
This is why my ideal is to strive for ever more honesty; we can never be perfectly honest, I think, but we can try.
In what is the first of an intended series of posts, I want to take a closer look at the lie of “independence” in the West, particularly among modern liberal (and by this I mean the type discussed in my link above) thinkers. We are all raised to value our independence and we fancy ourselves independent thinkers, workers, citizens, etc. Most people’s first taste of “independence” is when they get their first car at the age of 16 (anecdote: I’ve always relied on mass transit). Independence is valued in our society and enshrined as a sort of virtue.
Only an extremely small minority of us are independent, however. In fact, most of us have crippling dependencies on “the system” and on society at large. If, overnight, the entire American infrastructure* were to be swept under a rug (the roads, the system of distribution for foodstuffs and water, just to name a few), we would find that most people would die in short order. As a culture, we fancy ourselves as industrious and deserving of our place in the world. When one talks about their hard-won job or hard-earned achievements, they always talk in terms of their struggles inside the system, spending hours and hours studying at the universities or working long nights or whatever the case may be. Their analysis completely ignores the fact that they had a system to work within at all.
For example, I am absolutely positive that there are people who have been born in less advantaged places in the world who were either equal to me in capacity, or superior. However, I am just as certain that they died well before they reached the age of 21. Why? They lost the “genetic lottery,” and that’s it. They had the misfortune of being born into a terrible part of the world that does not have a grand infrastructure like the United States*. If they did survive, it is because they learned how to be truly independent – how to grow their own food, build their own home, and eke out their own living.
So, while we fancy ourselves independent, we can think all sorts of absurdities. I was recently talking to someone who expressed that she could never imagine herself “shackled to a man” in a marriage, how this would be absolutely catastrophic to her “independence.” She did not think it was fair game or even relevant for me to point out that she was absolutely taking for granted her day to day dependence upon the social system that has been set up for her sustenance. She does not want to be “dependent” upon one man (a husband), when in reality, she is dependent upon countless men (and some women) who pay taxes and work for the state to build and maintain the institutions she uses to survive.
This thinking is poisonous. We devalue community and cooperation in favor of “independence,” but yet again, we’re not even talking about “independence.” True independence would be off-the-grid living, growing your own crops and perhaps tending to your own herds for sustenance. Such living is often ridiculed as backwards and “crazy.” So when people talk about how they’re very “independent,” what they’re probably talking about is how they’re very ignorant and very irresponsible. They feel no responsibility to society and see no reason why they should contribute back, and often pursue purely luxurious endeavors – things like art. (Art is great, and I am a fan of it. But it contributes nothing to our survival.)
Yet another example, methinks, of a thought-terminating cliché or doublespeak. I would love to see all those enlightened, “open-minded” college students (in whom these attitudes seem to be very prevalent) have to do things like build their own schools, pave their own roads, provide for their own defense, make their own budgets, grow their own food, and on and on… They might finally learn what being “independent” is really all about.
*I mention the US specifically, but these arguments could apply to any civilization. If you do not grow your own food and provide for your own basic sustenance and survival needs (to include defense from hostile aggressors – be they bandits or be they invaders), you are not independent, and in fact you are dependent upon the system to provide a method for you to acquire the things you need to survive. In case you’re wondering, I’m entirely dependent. There’s nothing inherently “wrong” with being dependent, but there is something wrong with being ignorant about it.

Organized!

I just finished organizing my blog, somewhat. I’ve revamped the labels, applied applicable labels to every post in the blog, and removed redundant ones. If you’ve a label suggestion for a post, recommend it in the comments – if you’ve a question as to why something was labeled the way it was, ask in the comments! Check out the sidebar for all the labels I’m using.

I’ve also gotten rid of any references to my actual name and cleaned up some posts that had too much jargon which made them hard to read.
Next step in organizing: have a “most important” posts sort of column, perhaps for the major labels, à la Female Misogynist’s blog.

Video Games and the Suspension of Disbelief

Preamble
First off, I’ve been rather stressed out lately. I haven’t been writing here as much as I’d like to, probably because I am not in optimal condition to digest and interpret matters which I take rather seriously. Therefore, I’m going to take a break (even if for but a post) from major examinations of philosophy and society to talk about something that I usually derive great pleasure from: video games.
This is not a post that will attempt to establish that video games are art. I am no expert when it comes to art, and I am not the sort that could attempt to establish and support such a thesis. I have a pretty liberal idea of what constitutes “art” in any case, and my line of thinking is similar to this quote from Man on Fire: “A man can be an artist… in anything, food, whatever. It depends on how good he is at it. Creasey’s art is death. He’s about to paint his masterpiece.” To me, video games can be art, there is an art to warfare (“The Art of War”), writing is an art form, and so on.
Generally, there is not nearly so much controversy when one asserts that writing/literature is art, and I am going to apply the idea of “suspension of disbelief” to a discussion about video games. I once fancied myself a writer and wanted to pursue being a novelist, so I understand more about the art of writing than I do about other art forms. I’ve also been playing video games for years and years, so combining ideas from both seems rather natural.
The main thesis is that once that suspension of disbelief is broken, a gamer stops playing a game – much like a reader would stop reading a book.
Initial Concerns – Interface
I believe it is fair to boil down the idea of “suspension of disbelief” in literature to the idea that the reader must buy into the writer’s world, that even though the reader knows what’s going on is fiction, they choose to suspend their disbelief and behave as though what they were reading was not fiction, to get into the mood. This can be achieved in a variety of ways, and it does not necessarily mean a writer need be overly concerned with realism or describing the mechanics of their fictional (and sometimes fantastic) universes; but if a reader will not suspend their disbelief, it is unlikely they will continue reading. Therefore, it’s a critical concern. When it comes to a video game and for the purpose of my analysis, “suspension of disbelief” refers to the gamer’s willingness to continue playing a video game despite objections the gamer may have to the various stages of game play – from interface, to mechanics, to immersion.
Video games are a unique medium with a unique interface. Generally, one needn’t worry about interface concerns when it comes to writing – we are all very used to interfacing with books and the written word. Not many surprises there – black ink on white paper, read from left to right and top to bottom, usually in book form…you get the idea. With a video game, however, we don’t interface this way, even though the ability to read may be crucial to enjoying the game. There are many other factors, and the interface may be a big enough hurdle that some people give up before they’ve even begun playing (stereotypical example: old people).
I agree with a lot of what David Sirlin has to say about interface. Here’s a quote from one of his interviews (responding to why he thinks designers make a lot of mistakes with interface):

I think there are many reasons that all contribute to that. One of them is that game designers like to think about system or story―big ideas. And that [interface] is not big ideas. It’s mundane and boring and not sexy to care about. And yet you can end up with this great story that’s written in children’s handwriting. It’s ridiculous. It’s that extra level of polish that we as an industry need to care about more.

Sometimes, however, bad interface choices are defended by fans of certain games, claiming that they add elements of “tension” or “excitement” to the game. One example is with Resident Evil 5, where you can’t pause the game to manage your inventory and you only get a limited number of spaces. Fans claim this creates tension in a firefight. This is analogous to claiming that using an illegible or cryptic font style in a novel adds tension to a fight scene. Why would you ever think it is a good idea to make it harder to interface with your product? Stellar ideas are the ones that are easily accessed and still brilliant, not ones that are hidden away under layers of bad interface choices.
However, interface is certainly a matter of “suspension of disbelief.” Different people have different tolerances when it comes to clunky interface design, and may choose to play a game with frustrating controls so long as the game has something else to offer – is lots of fun, deeply engaging, tells a great story, whatever the case may be. Having a good interface is never a bad thing, but having a poor interface isn’t necessarily deal breaking either. It contributes overall to the suspension of disbelief, and interface ranks at different levels of importance for different gamers.
Intermediate Concerns – Mechanics
One of the earliest reasons, I would argue, that games ever caught on in the first place is that people found them to be a lot of fun. This is primarily due to game mechanics – a great game design that is executed well. This is a meaty subject that fuels a lot of thinking and debating, and is usually the major topic of concern for those who talk about “game design.” You’ll see Sirlin talk about mechanics all the time. Mechanics factor into suspension of disbelief insofar as one may give up playing a game if one does not like the mechanics of that game. Like interface, objections over mechanics may not yet be enough to break a gamer’s suspension of disbelief – particularly in games that are more about immersion. This is more true of seasoned gamers than it is newbies, who may have bowed out already at the interface stage. (The analogy to literature holds true, still – an early reader, such as a middle schooler, is not going to want to read War and Peace, despite any literary merit it has. The early reader hasn’t mastered the interface in the same way an adult reader may have – such as having a large enough vocabulary or long enough attention span – and may be more prone to appreciating style rather than substance.)
If I ever got into reviewing video games, I would forgo the conventional wisdom that arbitrarily assigns scores to arbitrary facets of a game (look at any game review site and you’ll likely find this breakdown: Graphics – 9, Sound – 8, Story – 7, Gameplay – 10, Tilt – 7, Overall – 8…just for example) and instead focus solely on interface, mechanics and immersion. Assigning arbitrary scores here would not make much sense either, and I would talk merely about the things done correctly or incorrectly in each of these categories, perhaps suggesting how much time one could expect to spend with a game (while acknowledging that ten hours spent with one game may be more fulfilling than one hundred with another, for various reasons)…but I’m getting off topic.
Mechanics basically boils down to concern over whether or not the game is pleasurable to play. Is there enough challenge, and is the game challenging in a way that is fair or in a way that is cheap? If it is strategy focused, does it have depth and allow for creative use of game assets, or is it shallow and affords the player only canned strategies? If it’s about action, is it fast and furious or light and, well, boring? Again, there are a ton of things that factor into game mechanics, and no game will ever have the perfect formula (I define the perfect formula as being one that succeeds so brilliantly you would never need to play any other game ever again – and furthermore, all people would agree that it is the perfect game). There is the possibility that you may find the perfect game for you, but I highly doubt it. I thought I had found such games, but I also found that after a significant investment of time, I eventually grew bored and turned to other games.
Certain genres of video games are designed to rely on mechanics more than are other video games. Examples would include action games, fighting games, or platforming games. People don’t generally play these games because they tell a great story or otherwise immerse a player in a fantastic game world (escapism). People generally play these games because they are fun to play, because the game mechanics are smartly designed and satisfying to learn. Interface is usually important in primarily mechanical games, though not necessarily so – some interfaces are hard to learn initially but can be wielded with impunity after a certain amount of investment, at which point the mechanics can shine through. Likewise, immersive factors can be ignored – a game that initially looks or sounds ‘ugly’ will still attract a large audience if the mechanics are highly refined.
Advanced Concerns – Immersion
As games have evolved, so too have their reasons for being played. It is hard to call any 8-bit game a pleasure to visually behold, but nowadays, games can be very visually enticing. In about two decades, the same series of video games went from the simple visuals of Final Fantasy I to those of Final Fantasy XIII (for the non-gaming audience, an in-depth analysis tracking the growth of this series can be found here), both depicting the same mechanics (a battle sequence). Even the earlier of the two is worlds ahead of the earliest video games, especially in the same genre – some were purely text-based adventures akin to a “choose your own adventure” novel! Visuals are just one area where games have improved, however. Increased technology has allowed for better visuals, more realistic sounds and more memory (allowing for things like, initially, more text, and later, more video and audio data storage – all contributing factors to ‘better stories’). The “old guard” of video game reviewers have understood that people like shiny things, and thus given consideration to the artistic and technical merits of graphics. They’ve considered the artistic and technical merits of a video game’s sound-scape, and even discussed the artistic and technical merits of a game’s story. No large game review outlet that I have seen has successfully woven these seemingly disparate elements together into a single cohesive theory, however. I doubt very much that a person will play a game for very long that is merely very pretty but has no other merits, or merely sounds very good without any other merits, or has a great story without any other merits. The reason all of the things discussed in this paragraph matter is that they all contribute to a game’s immersion.
For this discussion, however, a game’s immersion is a high-level factor of consideration for a gamer’s suspension of disbelief. It is possible that a gamer may play a game that is hard to control (poor interface), and not very fun (poor mechanics) if the game is superbly immersive. Some games get by on their immersion alone, offering convoluted or clunky interfaces and stale mechanics but satiating a gamer’s desire to escape to another realm (see also: World of Warcraft).
To a certain extent, a game must pass a gamer’s bare minimum for interface and mechanical checks – if an interface is simply too cumbersome or mechanics are simply too boring or disengaging, a gamer isn’t going to stick around to get immersed – no matter how beautiful the graphics, how fitting the music or how wonderfully penned and executed the story. Furthermore, some gamers plain won’t give a shit about the immersion at all! Then there are the types of gamers who may be able to forgive poor interface and poor mechanics, but who won’t be able to be immersed in a game which is of a genre they dislike. For example, I think Braid is an amazingly well designed game, but if a gamer does not enjoy platforming or puzzle games, it is unlikely they will be able to play and appreciate Braid. (More on Braid later – Braid was originally going to be the subject of this post, but I thought a more general discussion of video games would serve me well here).
Conclusions
I am a fan of trying to communicate and explain things in ways that people can understand. The goal here was to communicate my thoughts on why people play games and why they may bow out of the process at various stages. It all starts with interface and whether or not a person will agree to play the game, basically. After that, the next hurdle is mechanical – is the game fun or otherwise enjoyable to play? If a game succeeds brilliantly on its mechanics alone, that may be enough to keep gamers coming back for more. If not, the game needs to be immersive – it needs to draw the gamer in and keep them coming back in order to be a part of a fully realized alternative game world.
I hope this was not a complete waste of time for either the non-gaming or gaming members of my reading audience. Expect a post on Braid next.