Reading Heidegger continually reimpresses upon me the joyful recognition that I am, after all — and despite what anyone however well-informed may say about me — a philosopher.
Progression, Regression, Aggression
Good writing is rewriting.
— Truman Capote
I remember hearing Truman Capote say that on The Tonight Show decades ago. Over the years since, I’ve often found that the same thing applies to reading: all reading is rereading. That’s true at least of anything really worth reading in the first place.
All my life, beginning all the way back in my childhood before I even knew how to read myself, I have realized that anything worth reading deserves — and richly repays — rereading. As I may have mentioned before in one or more of my previous blog posts (I may well reread a bunch of them to find out for sure), back when I was only about four years old my mother thought that I was ready to learn how to read, given how incessantly I asked her to read and reread things to me. The Earth for Sam, which was originally published way back in 1930, and Space Cat, which first came out when I was six, the year I began to be taught how to read, and which was one of the first books I read myself, were my two favorites.
Wanting to do the right thing, my mother made the mistake of asking teachers in the Jefferson County, Colorado, school system—that being the county in which we lived—whether she should teach me how to read, which she would have been able and happy to do. She was informed that it would be best for me to wait and learn how to read along with other children my age once I entered public school and made it past kindergarten into the first grade, where reading would be taught. So till then my mother, who took the teachers’ supposedly expert but utterly wrong-headed advice to heart, had to keep doing all my reading and rereading aloud for me.
At any rate, once I got to my mid-teens one of the authors I started reading and rereading on my own was Søren Kierkegaard. Once I became a college classroom teacher myself, I often used texts by him in various classes at various academic levels, from introductory lower-division undergraduate philosophy classes to seminars for graduate students and advanced undergraduates.
Despite my having read and reread most of Kierkegaard’s works, one thing of his I never read until just recently, and at least parts of which I have already reread more than once, was his academic dissertation, The Concept of Irony, with Continual Reference to Socrates. Among the passages of that text I have already reread more than once is the one that follows, in which Kierkegaard addresses the “Socratic outlook” on irony. My initial reading of the passage is what helped give me the idea for the blog post you are currently reading. (The passage occurs on p. 60 of the Howard V. and Edna H. Hong English translation of the text, first published by Princeton University Press in 1989.)
If we now inquire further and ask to what more universal view this Socratic outlook may be traced, in what totality it rests, then it obviously is in the meaning ascribed to [what Plato calls] recollection; but recollection is in fact the retrograde development [. . .]. [. . .] It is Socratic to disparage all actuality and to direct man to a recollection that continually retreats further and further back toward a past that itself retreats as far back in time as that noble family’s origin that no one could remember. [. . .]
In the next section of this post, I will share some of what that passage suggested to me when I first read it.
Kierkegaard’s grave, Copenhagen (picture from Wikimedia Commons)
* * *
Profound progression is ever-recurrent regression into the groundless ground. Such regressive progression/progressive regression is never aggressive, at least not in the ordinary sense of “attacking”: it is, rather, always serene, and spreads the seeds of peace wherever it goes. However, in the original and originating sense to which the word aggression itself always goes back, progressing step by step in its retreat, through French to Latin and eventually to a presumed Proto-Indo-European root, it joins up with both progression and regression, as I’ll now try to explain.
The presumed Proto-Indo-European root of all three words — progression, regression, and aggression — is *ghredh-, assigned the meaning “to walk, go.” To progress is to walk or go forth or forward. To regress is to walk or go back. To aggress is to walk or go against, as one sometimes needs to go “against the current,” as we say, to do what one is called to do as one progressively regresses/regressively progresses.
Progression is walking the path forward into the terrain of what will be. Regression is walking the path back into the terrain of what has been. The walking itself traverses, step by step, the terrain of what truly is, regardless of what the current opinions about what is may be.
* * *
To come to a full understanding of the preceding section of this post, you may need to resist temptations to just go with the flow. Instead, you may need to go back to reread that section, and perhaps even the first section of this post, carefully and attentively again and again.
I will leave it to you to decide if I am speaking ironically in the preceding paragraph.
Note to readers: This will be my last post before I take my usual summer break. My next post will go up on Monday, September 12, 2022.
Health, Care, and Healthcare
The less meaningful our lives become, the more tightly we cling to them.
That is why, in our global economic society, new institutions and provisions of institutionalized medical “healthcare” proliferate. They do so especially in the most commercially developed and advantaged countries. Those in power in the same economically developed countries then often dangle the carrot of such institutionalized healthcare in front of the faces of those who live in less developed countries. At the same time, the commercially powerful countries use the promise of healthcare as a stick to drive such disadvantaged countries ever forward, toward taking full advantage of all the touted advantages of planned, provided, medicalized healthcare.
If life in such healthcare-enticed and -driven countries becomes ever less meaningful the further forward those countries are driven along the healthcare road, so what? That’s a small price to pay for all the profits to be made by healthcare providers.
* * *
I’ve told the story before. The first time I told it in print was in my first published book, The Stream of Thought (available through the “Shop” at the top of this blog website), which came out in 1984. It’s the story of how, a few years before that, in spring 1976, I hosted Ivan Illich for a visit to the University of Denver.
That was just the year after his book Medical Nemesis: The Expropriation of Health had come out in its American edition, receiving a great deal of attention. Illich’s thesis in that book was that the modern institutionalized medicalization of healthcare had long ago passed the point of “specific counter-productivity.” Not to be confused with what economists call the point of “diminishing returns,” the point of “specific counter-productivity” is that point beyond which the process of institutionalization begins to work contrary to the very purpose for which that institutionalization was established in the first place. Thus, Illich argues that the current institutionalization of medicalized healthcare, which was supposedly designed to foster health, is actually making society as a whole more unhealthy than it ever was before such medicalization was initially instituted.
As I first described it in The Stream of Thought (in note 98, on page 644), on April 22, 1976, I hosted Illich as he gave an address at the University of Denver: “During the course of his ‘address’ [. . .] an individual identifying himself as a member of the ‘health-care profession’ working at a local hospital expressed enthusiasm for Illich’s ‘position’ and asked what he, the ‘health-care professional,’ might, as such a professional, do to rectify the ills to which Illich was pointing. Illich’s reply was, ‘Burn down the hospital!’ The frustration this reply engendered was obvious on the questioner’s face and in his demeanor.”
At the time I wrote The Stream of Thought, I shared that questioner’s frustration with Illich’s response. What’s more, for years I continued to feel the same way.
That was because my thought during all that time continued to be oriented toward the goal of changing the very institutions that, as I fully agreed with Illich, had progressed beyond the point at which they had become specifically counterproductive, making worse and worse what they were claimed to be designed to make better and better.
However, as my own thinking continued to mature, I came to a different perspective, one from which I was able to see that Illich’s response was, in fact, completely appropriate. Over the years I came more and more clearly to see what was genuinely at issue for him, and found myself to be in full, emphatic agreement with his position.
That issue, I came to see clearly, was not at all to change the present system, or any of its ever-proliferating institutions. The issue, rather, was to live in the irrelevance of that entire system — its irrelevance to anything that truly matters in human life, individually or collectively. That is exactly what I myself explicitly argued in my recent book The Irrelevance of Power, just published two years ago, in 2020 (and also available from the “Shop” at the top of this blog website).
If we were to live in full awareness of just such irrelevance, we might well take advantage of whatever opportunities might present themselves at times to “burn down” the whole, irrelevant system — at least if we had nothing better to do, such as taking a stroll, snoozing, or eating a cookie. That, I came to see, was the real gist of Illich’s answer to the hospital healthcare worker’s question at D.U. in April 1976.
He thereby gave the most genuinely responsible answer possible.
Ivan Illich
* * *
Yes, we suffer pain, we become ill, we die. But we also hope, laugh, celebrate; we know the joy of caring for one another; often we are healed and we recover by many means. We do not have to pursue the flattening out of human experience.
I invite all to shift their gaze, their thoughts, from worrying about health care to cultivating the art of living. And, today, with equal importance, to the art of suffering, the art of dying.
A little over fourteen years after I hosted Illich’s talk at D.U., he spoke again on the topics of health and healthcare. That talk occurred in Hanover, Germany, on September 14, 1990. What he said then became the basis for a short article he wrote called “Health as One’s Own Responsibility — No, Thank You!” The lines above are the closing two paragraphs of that article.
The whole piece is less than seven pages long, and it is easily available online at https://www.pudel.samerski.de/pdf/Illich_1429id.pdf. I recommend it highly. It is well worth reading carefully and thoughtfully. If you do want to do that, be sure when you read the article to keep clearly in mind what I said in the preceding section of this post.
It was only in February of this year that I became aware of that article myself, and I read it right away. Reading it is what occasioned my writing of this current post.
Some Wisdom from Wisdom
John Wisdom (1904-1993)
“My dear, it’s the Taj Mahal.” That is what the 20th century British philosopher John Wisdom imagines one woman saying to another, a friend who is trying on a hat and who is “studying the reflection in a mirror like a judge considering a case.” After a pause, her friend says (“in tones too clear,” writes Wisdom) the remark about the Taj Mahal. “Instantly the look of indecision leaves the face in the mirror. All along she has felt there was about the hat something that wouldn’t quite do. Now she sees what it is.”
According to Wisdom’s analysis, talk of God has important similarities with such everyday uses of metaphor. What he is saying is not that the things people say about God (such as that God is all powerful or all knowing, for example, or that God is love) are said of God metaphorically (as the woman in Wisdom’s story says metaphorically of the hat that it’s the Taj Mahal). That idea—that the things believers “predicate” of God are predicated metaphorically—is an ancient one. But that is not the point that Wisdom is making.
Instead, Wisdom is making the point that people use sentences such as “God exists” or “God does not exist” in much the same way that the woman in his little story uses the remark about the Taj Mahal. By his thinking, the way the woman in the story uses that line is not at all to convey any new information. It’s not to tell anybody anything they didn’t already know, or already see clearly enough. As Wisdom is careful to note, “the hat could be seen clearly and completely before the words ‘The Taj Mahal’ were uttered.” He goes on:
And the words were not effective because they referred to something hidden like a mouse in a cupboard, like germs in the blood, like a wolf in sheep’s clothing. To one about to buy false diamonds the expert friend murmurs “Glass,” to one terrified by what [s]he takes to be a snake the good host whispers “Stuffed.” But that’s different. That is to tell somebody something [s]he doesn’t know—that that snake won’t bite, that cock won’t fight. But to call a hat the Taj Mahal is not to inform someone that it has mice in it or will cost a fortune. It is more like saying to someone “Snakes” of snakes in the grass but not concealed by the grass but still so well camouflaged that one can’t see what’s before one’s eyes. Even this case is different from that of the hat and the woman. For in the case of the snakes the element of warning, the element of predictive warning, is still there mixed, intimately mixed, with the element of revealing what is already visible. This last element is there unmixed when someone says of a hat which is plainly and completely visible “It’s the Taj Mahal.”
Wisdom suggests that a lot of religious discourse, including “There is a God” or “God exists,” is just like that. It doesn’t tell anybody anything new. Nor does it (at least it doesn’t have to) mix the issuing of warnings or similar things with just letting something be seen. Instead, all it does is just that: let something be seen. But what’s most interesting about such religious discourse that just lets something be seen is the same thing that’s most interesting about his Taj Mahal hat example: what’s said—what’s revealed or opened up to be seen by such discourse—is nothing that isn’t already clearly visible and completely evident even before anything’s said. It’s a matter of revealing what was never covered over in the first place, but was hiding in plain sight all along, like telling someone who’s looking for her glasses that she’s wearing them already. Such remarks, such discourse, let those who have the eyes for it see, and those who have the ears for it hear.
Oftentimes, what most needs to be pointed out is not what’s hidden away or disguised, but precisely what’s there in plain sight for everybody to see. It’s something so obvious that we need special attention or help not to overlook it. What we overlook is not what’s glaring and showy, what’s flashy and fresh, what calls attention to itself one way or another. It’s what’s right there in front of us but does not call attention to itself, what’s inconspicuous and unobtrusive. That’s what trips us up most easily and often.
What’s obvious, so inconspicuous and unobtrusive that we almost always overlook it, is sometimes what’s most important. Sometimes it’s what’s really most striking and wonder-provoking, what most rivets our attention once it is pointed out. Like pointing out the snakes plainly visible in the grass all along, but for that very reason unnoticed until someone says “Snakes!” and lets what’s so plainly visible be seen.
Wisdom is suggesting that religious discourse is at least often like that (except, as he is careful to note, that the element of warning that goes with the example of the snakes need not be present—though, I would add, it can be). It doesn’t so much tell us anything new, convey to us any information that isn’t already readily available and even, in one clear sense, already known by everybody, as it does let us see it again for the first time. It is like the line in the poem by T. S. Eliot about how we finally, after all our journeying, come back again to the same place we started, only now we know it for the first time.
“God is dead in Salem!” John Proctor, the central character in The Crucible—Arthur Miller’s drama about the Salem witch-trials, written during the same period that Wisdom made his remarks, and occasioned for Miller by the McCarthy witch-hunt for “communists” everywhere in the United States—cries out at a critical juncture in the action of the play. In some ways that statement is less about God than about Salem—or, rather, so far as that goes, than about the United States during the McCarthy period. Or, rather, it is about God, but no less about Salem, about the U.S. in the nineteen-fifties—and about much more, which is finally beyond cataloguing.
God is dead in Proctor’s Salem, which is and is not the United States in the Eisenhower years, or the United States (and elsewhere) even today. But the Devil is alive and well there.
Note to readers: The above post is most of a section from “Hushed Talk of God,” a chapter in my book God, Prayer, Suicide, and Philosophy: Reflections on Some of the Issues of Life (available in the “Shop” at the top of this blog site). All the quotations from John Wisdom are from his piece “The Modes of Thought and the Logic of God,” contained in John Hick, editor, The Existence of God (New York: Macmillan, 1964).
Neglecting Ourselves
To neglect ourselves is to fail to gather ourselves together. Such failure most often derives from ignorance. In turn, that ignorance itself is sometimes imposed upon us, in which case we are not to blame for our self-neglect. Even then, however, once we become aware of our self-neglect, we are responsible for our own recovery from it. No one else can do it for us; we must do it ourselves.
At other times, the ignorance at the root of our self-neglect is not imposed upon us but is instead a manifestation of our own stupidity. Then our self-neglect derives from our own willful refusal to let ourselves see what is in front of us. It derives from our own motivated blindness, as I call it in my book The Irrelevance of Power (available in the “Shop” at the top of this blog-site). When that is our situation, then we are indeed to blame for our own self-neglect; and we add greatly to our own blameworthiness by refusing to assume responsibility for recovering from our neglecting of ourselves.
In either case, whether our self-neglect is imposed upon us or is our own fault, to recover from neglecting ourselves is always a matter of self-recollection. That is, to recover from neglecting ourselves is always a matter of collecting ourselves back together from out of the dispersal of ourselves. Recovery from self-neglect is always and only such self-recollection. If we are ever to recover from neglecting ourselves, whether by our own fault or by the fault of others, we must glean the fields wherein the seeds of ourselves have taken root and then grown.
In short, to recover from neglecting ourselves we must — either once again or for the very first time — harvest our very selves.
* * *
neglect (v.): from ne- as negative prefix, plus PIE root *leg- “to collect, to gather.”
recollect (v.): from Latin recollectus, past participle of recolligere "to take up again, regain," etymologically "to collect again," from re- "again" + colligere “gather” (see collect (v.)).
re- (prefix): directly from Latin re- an inseparable prefix meaning "again; back; anew, against." Often merely intensive.
collect (v.): from Latin collectus, past participle of colligere "gather together," from assimilated form of com "together" + legere "to gather," from PIE root *leg- "to collect, gather."
glean (v.): “to gather by acquisition, scrape together,” especially grains left in the field after harvesting, but the earliest use in English is figurative; from Old French glener (14c., Modern French glaner) “to glean,” from Late Latin glennare “make a collection,” of unknown origin.
harvest (n.): Old English hærfest “autumn,” as one of the four seasons, “period between August and November,” from Proto-Germanic *harbitas (source also of Old Saxon hervist, Old Frisian and Dutch herfst, German Herbst “autumn,” Old Norse haust “harvest”), from PIE root *kerp- “to gather, pluck, harvest.”
All the above entries have been collectively harvested from the Online Etymology Dictionary. If one does not neglect them but keeps them firmly in one’s recollection and then rereads the preceding section of this blog, one may well glean more from crossing back over that section than one did from one’s first crossing.
Concepts, just like individuals, have their history and are no more able than they to resist the dominion of time, but in and through it all they nevertheless harbor a kind of homesickness for the place of their birth.
—Søren Kierkegaard, The Concept of Irony (trans. Howard V. & Edna H. Hong)
* * *
Now my head was like permanently located halfway between being bummed from no dope and unconscious from too much, only it was like my true self that I’d locked onto there, the self that hadn’t been fucked up by my childhood and all and the self that wasn’t completely whacked in reaction.
— Russell Banks
As Faulkner once famously said, well-crafted fiction is truer than the most accurate journalistic account. If one wants to experience the truth of neglecting oneself, and the truth of then hearing and obeying the command to assume responsibility for one’s self-neglect by pulling one’s self together from out of one’s scatteredness, one can glean a rich harvest from reading Russell Banks’s fine novel Rule of the Bone, first published in 1995, from which I have myself harvested the quotation at the beginning of this section of today’s post.
That novel is the story told through the voice of a teenager who. . .
Well, read it yourself and find out!
Russell Banks at the 2011 Texas Book Festival
The End of Education
The title of this blog post is polysemic, as my post titles often are. It can mean more than one thing. Taken one way, the phrase “the end of education” means what education aims to achieve, its goal or purpose. Taken another way, the same phrase means the point at which all education is brought to a stop, frustrated of ever reaching its goal.
I will address those two senses of the end of education in turn, one in each of the following two sections of this post.
* * *
Education’s end, its goal or purpose, lies beyond all mere acquisition of accurate information concerning what one is being educated about. The goal of education is not the acquisition of such information, which is at best one tool, useful in some cases though far from all, for attaining that goal. Since such information is what is often called knowledge, we could put the same point by saying that the end of education, in the sense of its goal or purpose, lies altogether beyond all mere acquisition of knowledge.
Instead of aiming at any increase in knowledge, education aims at imparting vision, at giving one eyes to see. The goal of education is to induce insight.
In-sight is still sight. However, it is a seeing into rather than a seeing that remains only on the surface, gaping at the spectacle of whatever comes before the eyes like an image on a screen.
Insight brings understanding. The seeing of those granted insight is anything but mere sight-seeing, the sort of seeing in which we often indulge during travel-vacations. The seeing that is insight and brings understanding is not like such sight-seeing. Instead, it “puts things in proper perspective,” as we say. That is, insight lets us see things in their own proper places. Thus, insight is a placing vision: it lets us see the place to which belongs whatever we are encountering.
Such insight goes far beyond the mere beholding of any spectacle. There is in that sense nothing at all “spectacular” about insight. Insight displays nothing to view that has not been open to view before. Rather, it lets us see clearly at last just what has been displaying itself before us, but that we theretofore lacked the eyes truly to see.
Insight, then, gives nothing new to see. Rather, it gives us new seeing, letting the scales fall from our eyes. It is not seeing anything new; it is seeing things newly.
If you want to store a bunch of information, it would be wise to use a computer. That way you can keep yourself open for gifts of insight into what all that information is about in the first place.
The end of education is to open the mind to receiving such gifts.
* * *
When schooling becomes no more than a process of training and imparting new information, schooling works against education. All too often in our contemporary global consumer society, schooling becomes nothing more than such training and such conveying of information. Where that occurs, openness to insight is not fostered. It becomes at best accidental, a matter of pure chance, rather than what the schooling aims to accomplish. Education ceases to be what schools impart when the imparting of information and the provision of training become paramount.
When that occurs, it is the end of education.
Argh! An Argument!
“Don’t let arguing get in the way of the argument!”
I used to say that to the students in my philosophy classes from time to time. Sometimes, I would explain what I meant along the following lines:
Our word argument eventually traces back to the presumed Proto-Indo-European root *arg-, which meant “to shine,” like the white metal we call silver, the archaic name of which was argent. Keep that in mind as you go on to consider two very different sorts of conversations, in both types of which we have all been engaged many times, I’m sure. Reflect upon how the two sorts differ from one another. Let that difference itself shine forth!
First, there is that sort of conversation in which the parties involved exchange information concerning their respective views or opinions about the topic under discussion. This is the sort of conversation in which all the parties involved more or less patiently and politely — typically less and less of either, the longer the conversation goes on — tell one another what propositions they hold for true about the given topic.
In this first type of conversation, listening is often at a minimum, whereas talking is at a maximum. Before the conversation even begins, everyone participating already knows what they think; their individual opinions are already formed before the conversation begins, and what’s at issue is simply to vocalize those opinions. In such a conversation, the real topic of the conversation is those opinions themselves, rather than that about which the communicants have those opinions.
For example, imagine a small group of people at a cocktail party who fall into conversation about current United States policy toward Russia and Ukraine. In such cocktail-party chatter that policy itself is not really what the chatter is all about. Rather, it focuses on the exchange of the party-goers’ opinions of that policy. When the party is over, everyone comes away knowing quite a bit that they did not know before. Everybody now has new information about what everybody else holds for true.
However, it is only accidentally, if at all, that any of the cocktailers comes away knowing anything new about US policy. What one essentially acquires from such conversations is just new knowledge of the opinions of the other cocktailers—“new” if one didn’t already know all that even before the start of the conversation, thanks to prior conversations of the same type with the same co-partyers. (We will leave it up to information brokers to determine the exchange-value of such knowledge.)
In contrast, consider a different type of conversation, one in which the parties involved do not already cling to any cherished opinions about the matter ostensibly under discussion and in which, therefore, the real matter at issue is not the exchange of opinions but is, rather, the genuine exploration of whatever is being addressed. The focus of this second kind of conversation is not, in fact, acquiring any new information at all, most especially any information about the opinions of the other communicants. The focus is instead precisely on exploring the topic at issue in all such opinions, exploring that topic in an effort to acquire new insight into it, rather than information about anyone’s opinions on it (or anything else).
In sharp contrast to the first type of conversation, the cocktail-party sort, for this second type of conversation what has premium status is precisely attentive listening rather than talking. However, what one is focally listening to is not so much the other persons present as it is what, following the German philosopher Edmund Husserl, founder of 20th century phenomenology, could best be called “the thing itself,” whatever that thing under discussion might be.
In conversations of this second type, we let whatever we are talking about, that thing itself, hold our attention. We grant it room to unfold itself like a flower in bloom in and through our conversation, building its own case, making its own argument, shining in its own brilliance.
Arguing with one another just dulls the shine of the thing’s own silvery argument.
Philosophy? What's That?
Das eigens übernommene und sich entfaltende Entsprechen, das dem Zuspruch des Seins des Seienden entspricht, ist die Philosophie.
That’s a line from near the end of a talk Martin Heidegger gave in August 1955 in Cerisy-la-Salle, France. Here’s my own translation: “The expressly adopted and self-unfolding corresponding, which corresponds to the appeal of the Being of beings, is philosophy.”
Heidegger used both a French and a German lecture title. The German title he used was Was ist das — die Philosophie? Given his audience, he tried with that title to approximate as closely as possible the French title Qu’est-ce que la philosophie? An English rendering matching the German title would be What is that — philosophy?
The English translation of the essay by Jean T. Wilde and William Kluback, which came out early in 1958, less than three years after Heidegger first delivered his address, bore the more standard English title What Is Philosophy? However, I prefer the less standard English version, and have even tweaked it a bit more to use as the title of this post.
Don’t take that to mean this post is about Heidegger. It isn’t. It’s about philosophy. Or rather, at least at the deepest level this post isn’t even so much about philosophy itself as it is simply about thinking. That, too, is in accord with Heidegger’s own way, and serves to acknowledge how indebted my own thinking is to his work.
Heidegger aside, perhaps the most descriptively accurate title I could have given this post would have been:
Philosophy? What’s that got to do with thinking?
The answer, I fear, is that all too often what has come to be called “philosophy” in academic circles has all too little to do with thinking—and in fact that remark comes closest to telling you what this post is about.
* * *
ladder (n.): Old English hlæder "ladder, steps," from Proto-Germanic *hlaidri (source also of Old Frisian hledere, Middle Dutch ledere, Old High German leitara, German Leiter), from suffixed form of PIE root *klei- "to lean."
— Online Etymology Dictionary
You might say that this post is about the degeneration that already long ago now began to beset philosophy in such a way as to separate philosophy from thinking. That was done above all by the academicization of philosophy, that is, by making philosophy into just one more academic specialty among all the others from algebra to zoology.
However, once separated from thinking, philosophy in fact ceases to be philosophy at all any longer — at least in my own philosopher’s judgment. It ceases to be anything resembling “the love of wisdom,” which is etymologically and originally the meaning of that word, derived as it is from the Greek philo- “loving” plus sophia “wisdom.”
A few decades ago, I had the unmitigated audacity to foreground a discussion of that etymology of the term philosophy to begin An Introduction to the History of Philosophy. That is what I entitled a two-volume work I wrote back in the mid-1980s. I wrote it solely for the purpose of using it as the text for the general education introductory philosophy courses I was regularly assigned as a faculty member in the philosophy department of the University of Denver. Two of my department colleagues also found my text to be worth using in their own introduction to philosophy courses.
The text worked so well for all three of us that I decided to send it to a book publisher. The editor at that press followed standard procedure and sent my manuscript on to another philosophy professor for review for possible publication. That reviewer recommended against publication, and the publisher accordingly declined to publish the work, attaching the reviewer’s negative comments to my rejection letter.
The academically certified professor of philosophy who wrote the negative review of my two-volume work haughtily dismissed my guiding use in my text of the etymology of the word philosophy by writing that the etymology of that word was no more enlightening than was that of the word ladder, which ultimately derives from a root that means “to lean.”
I remember thinking, when I read that comment, how revealing, in fact, that etymology of ladder actually is, contrary to what the reviewer was saying. After all, anyone who has ever used a straight ladder knows perfectly well that using it requires one to lean it against the structure up which one is trying to climb. Thus, only thoughtlessness would lead anyone to dismiss the etymology of ladder as irrelevant to that term’s current meaning.
I cannot speak for that reviewer or others of the same ilk, but I am happy to attest that in the way I have always used the term, philosophy is the provoking — literally, if I may be permitted reference to the etymology of that word, “the calling forth” — of thought, including any provocation thereof by exploration of the etymology of a word. For me, the answer to the question “Philosophy? What’s that?” is that philosophy is indeed the friendly love for, or loving friendship toward, wisdom.
Such love always seeks the good of whom or which it loves for that beloved’s own sake — as Aristotle, for one, knew and said — and gladly communicates such loving friendship to others, also for the sake of the beloved as such. As I use the word, anything that truly provokes thought is indeed “philosophical”: it is “of,” or “pertaining to,” philosophy itself, precisely in the etymologically original sense of that term. It is only in that original and originary sense of the word that philosophy has ever attracted any love from me personally, at any rate.
So much for academic certification, I guess.
Hoping for Hopeless Hope
Hope, now, is an exercise to see if I still have it in me to hope, despite all the reasons not to that are staring me in the face. The effort of hoping yields its own rewards, no matter the outcome and as intangible as they may sometimes seem. [. . .] Right now, it’s all we’ve got as we stand like Pippin waiting for the next battle, hoping to have hope.
— William Rivers Pitt
Pitt is senior editor and lead columnist for the online alternative news source Truthout. The lines above come from his opinion piece “Two Years of COVID Have Forced Us to Recalibrate Our Concept of Hope,” which was published in Truthout on Christmas Day, 2021.
Nor is Christmas, that time of the fulfillment of expectant awaiting, a bad time to think hope all the way through. Perhaps we can thereby even learn better how to think—by learning something of how to let hope call thought forth.
However, any hope that could initiate thinking by calling it forth would have to be a thoroughly hope-less hope in one important sense. It would have to be authentic hope—hope that has clarified itself entirely, freeing itself from every bit of in-authentic hope that clings to it.
Such hopeless hope is what is truly most worth hoping for.
The forlorn hope is not only a real hope, it is the only real hope of mankind.
— G. K. Chesterton
* * *
In reality hope is the worst of all evils because it prolongs our torments.
— Friedrich Nietzsche
Waiting is more initiating and far-reaching than all hoping, which always counts with something on something.
— Martin Heidegger
Any hope the fulfillment of which counts on some specific outcome to a given situation tends to denature hope. Such hoping for a pre-defined outcome robs hope of its full power, its infinite capacity to keep open the way to whatever the future may bring. It turns hope from an open and opening willed expectancy that can never be extinguished into a closed and closing wishful expectation that is always doomed to eventual frustration — something that, as Nietzsche says in the line from him above, always just prolongs an underlying torment.
Authentic hope accepts with gratitude whatever is given, and then communicates such grateful acceptance to others. True hope refuses ever to let itself be closed off and shut down, and it always extends itself broadly. It is infectious.
In contrast, false hope is no more than a disguised wish, the mere simulacrum of genuine hope. Such false hope is selfish, self-centered, and easily shut down. It remains incommunicable.
Sartre famously said that life begins on the other side of despair. Given the etymology of the word despair (from Latin de- “without” plus sperare “to hope”), that means that life begins on the other side of hopelessness.
It is only there, too, that true hope is born.
* * *
Hoping against hope, he believed that he would become “the father of many nations,” according to what was said, “So numerous shall your descendants be.”
— Romans 4:18 (NRSV)
True hope does not prolong one’s torment, as does the false hope Nietzsche addresses. Instead, true hope overcomes torment. It continues without reservation to trust that the promises it has received will be fulfilled, regardless of how desolate things may appear at any given moment. Only such persistent, unfathomably faithful trust is authentic hope, freed from all the tormenting illusions of false hope. True hope simply trusts. It is filled full of nothing but faith.
True hope does not count on things turning out in some given way at some given time, if only one manages well — manages, for example, to have the luck to place one’s bets on the winning number in some game of roulette. True hope opens out beyond all issues of efficacy and expectation, as Heidegger knew and said in the second epigraph to the preceding section of this post.
True hope also has nothing of selfishness or the calculation of private interest about it. In contrast with all such self-centered concern, true hope opens beyond itself, embracing everyone, oneself included, with pure love.
In Christian scripture, Paul knew all that; and let it be known to others, as when he wrote in the following famous verses:
Love is patient; love is kind; love is not envious or boastful or arrogant or rude. It does not insist on its own way; it is not irritable or resentful; it does not rejoice in wrongdoing, but rejoices in the truth. It bears all things, believes all things, hopes all things, endures all things.
— 1 Corinthians 13:4-7 (NRSV)
* * *
To live without hope is to cease to live.
— Fyodor Dostoevsky
It is only for the sake of those without hope that hope is given to us.
— Walter Benjamin
Dostoevsky and Benjamin are speaking of true hope in the lines above.
They knew what they were talking about.
A hope-inspiring picture.
Our Disconnection
Typical U. S. commuters in the 1950s
The purpose of newspapers is to paper over what’s new. News-casts—whether over the radio, on TV, or through the internet—are a great way of casting the new away. In general, processing information, however it’s done, can always be relied upon to keep us blind and bewildered. Staying connected to our cell phones and computers is especially effective for fostering our disconnection.
Diversion is everywhere to be found.
Diversion from what?
From ourselves—and from one another.
Wherever you go, be sure to take numerous selfies. Then post them on Facebook, to make sure all your FB “friends” feel jealous. How entertaining!
What’s more, taking pictures of yourself on your cell phone when you are in Venice or Prague or at the Eiffel Tower in Paris keeps you from having to be wherever you are, connecting with that place itself. What a pleasure!
That way you can even visit Auschwitz and pretend to remember the millions who were murdered in the Holocaust, without having to concern yourself with doing anything to bear witness to their suffering or to change the underlying societal conditions that produced genocide then and continue to produce it today—perpetrated especially by the United States of America, as has always been our country’s wont. Who needs connection with such horrors, when posturing ourselves before the cameras of our cell phones is such a painless and self-inflating way to divert our attention from anything that really calls for attention? Is it any wonder that everyone does it?
* * *
A simple sentence will suffice for modern man: he fornicated and read the paper. After that vigorous definition, the subject will be, if I may say so, exhausted.
That line is one Albert Camus puts in the mouth of Jean-Baptiste Clamence, the fictional and always ironical narrator of The Fall, published in 1956, the last novel Camus completed before his death in an automobile accident in 1960 when he was only 46.
Reading newspapers is so exhausting of attention that it puts readers to sleep even when their eyes stay wide open, just as fornicating also tires one out while saving one the bother of loving.
In They Thought They Were Free: The Germans, 1933–45, first published in 1955, Milton Mayer quotes a German scholar he interviewed as speaking to him about “those who understand what is happening—the motion, that is, of history, not the reports of single events or developments.” In fact, though neither Mayer nor his interviewee makes note of it, devoting oneself to the latter is an effective way of keeping oneself ignorant of the former: the more informational “news” one processes, the less one has to bother with such a messy thing as history.
After all, if one allows oneself to come to an understanding of history, one will not be able to avoid experiencing the call to take part in history. Any Germans who made the mistake of letting themselves experience such a thing in the 1930s and 1940s either had to protest against the Nazi regime, which was a dangerous thing to do, or experience the shame of failing so to protest, which was humiliating. It was much easier for them just to read the papers, losing themselves in the plethora of “reports of single events or developments.” A model to be followed!
Günther Anders, who escaped Nazi Germany with his then-wife Hannah Arendt, captured well the advantage of such a life lived in the oblivion of history when he wrote, in the first volume of his work Die Antiquiertheit des Menschen, which I would translate as “the antiquation of humanity”: “When the world comes to us, rather than we to it, then we are no longer ‘in the world,’ but rather no more than its consumers, denizens in a dreamland of milk and honey.”
Sweet deal!