Redefining Smart
January 1985
This year, we subscribed to cable television, mostly because when cable television comes around, subscribing to it is one of the things with-it households do, even as, 50 years ago, they would have subscribed eventually to larger encyclopedias, larger dictionaries; bought more magazines.
But suddenly I realized the subscribing--to encyclopedias, dictionaries, magazines, newspapers, newsletters, book clubs, catalogs, still other cable networks, etc.--had to stop. Go to a large newsstand. Do you know there are more than 400 magazines devoted to computing alone? More than 40,000 books published per year? More television played commercially in one year than movies produced since the industry began? And, through all this flood of information, occasionally you will want to take time to remind yourself that the sky is blue, the grass green, the waters pure (except for those Gary Hart talked about in a speech in which Ronald Reagan featured).
Which brings us to the question at hand: How is it possible to keep up in today's world?
The answer is that it isn't possible to "keep up," not even at a rudimentary level. To which dismaying observation one reasonably asks, "What do you mean by a rudimentary level?" To which I answer--why not?--People magazine. It is rudimentary, isn't it, to have a working knowledge of the stars and the starlets of the society we live in?
Well, hear this. Last Christmas, my wife and I sailed in the Caribbean with a couple with whom we have for many years shared the season. Richard Clurman is my best-informed friend in the entire world. When serving as chief of correspondents for Time and Life, he cultivated and developed those habits that required that he know everything about everything going on. So he arrived, as usual, with his heavy rucksack of books and magazines. Among the latter, I remember offhand Scientific American, The Economist, The Atlantic, Harper's, The New Republic, The Nation, National Review, Esquire, Time, Newsweek, Playboy, Business Week, Foreign Affairs, and I am certain to have forgotten a supplementary dozen. He reads at a rate that would leave the ordinary computer puffing to keep up. After a day or two, he had gone through the magazines and started in on the books.
One week later, in the Virgin Islands, I sauntered about an old colonial town in search of periodical matter, finding, at the drugstore, only People, for a copy of which I exchanged a dollar and a quarter.
It was the year-end issue, and thumbing through it in the cockpit that night, sipping a planter's punch, I came upon what is evidently a yearly feature, enumerating 16 persons who had committed renowned gaffes of one type or other and 25 persons who had performed extraordinary feats of one kind or other. My eyes traveled down the list with progressive dismay in search of a name I recognized. I did discover one, finally, in each category, and paused for a moment, taking a deep draft of rum to console myself over my confirmed deracination from my own culture.
It occurred to me to recite to Richard Clurman the names I had just read. So I gave them out, one after another. He scored better than I did, recognizing three out of 41. (Neither one of us--this was December 1983--had ever heard the name Michael Jackson.) I am 59, Clurman a year older. Was this merely a generational gap? Is it that each of us develops habits of mind, perhaps needing to do so for self-protection, winnowing the flood of information that comes at us so that certain phenomena become, for all that they are ubiquitous, for all intents and purposes imperceptible?
Or was it sheer chance? Individual lacunae? But I told the story of going over the names of the featured galaxy of People to Henry Grunwald at a party a few months later, and he shrugged his shoulders. He is, after all, among other things the editor in chief of People, even as he is editor in chief of all the publications put out by Time, Inc. "I know what you mean," Grunwald said. "When they tell me who they have scheduled for the cover of the next issue of People, half the time I never heard of him or her."
•
Someone once said that Erasmus (1466-1536) was the last man on earth about whom it could more or less safely be said that he knew everything there was to know. But even in the 16th Century, "everything" was defined as everything common to Western culture. Erasmus could hardly have known very much about cultures whose existence neither he nor anyone else in the Western world had written about. What they meant to say was that Erasmus had probably read every book then existing in those Western languages in which books were then written.

The library at the University of Salamanca, founded in the 13th Century, still has, framed and hanging over the little arched doorway that leads into the room in which all of the books of one of the oldest universities in Europe were once housed, a papal bull of excommunication directed automatically at any scholar who left the room with one of those scarce, sacred volumes hidden in his vestments. Books copied out by hand can be very valuable. The tradition is not dead, thanks to the Russian samizdat, by which Soviet dissenters communicate with one another, even as early Christians communicated by passing about tablets in the catacombs.

Knowledge in those days, in the early years of movable type, was difficult to come by. But then there was not so much of it as to overwhelm. In that relatively small room in Salamanca were housed all the books an Erasmus might be expected to read--granted that his mind was singular and his memory copious. So had been Thomas Aquinas', a man modest except when laying down certitudes, who admitted, sheepishly one must suppose, that he had never come across a single page he had not completely and instantly understood. If, per impossibile, Thomas was required to linger a few days in purgatory for committing the sin of pride, I am certain that the torturers stood over him demanding that he render the meaning of the typical "documentation" (that is what they call instructions) of a modern computer.
Never mind the exceptional intelligence. It is sufficient to meditate that in the 16th Century it was acknowledged as humanly possible to be familiar with all the facts and theories then discovered or developed; to read all the literature and poetry then set down. To know the library of Western thought.
Move forward now 250 years and ask whether or not Benjamin Franklin could have been surprised by an eldritch scientific datum, an arcane mythological allusion, a recondite historical anecdote, an idiosyncratic philosophical proposition. Of course he could have been, even bearing in mind that Benjamin Franklin was a singular intelligence, eclectically educated, and that he was surrounded, at the convention in Philadelphia, by men most of whom moved sure-footedly in the disciplines then thought appropriate to the background of statesmen. The standards at Philadelphia were high; indeed, it has been opined that at no other deliberative assembly in history was there such a concentration of learning and talent.
But these are anomalies. We ask, and continue to do so, How much was there lying about to be learned? Two hundred and fifty years having passed since the last man died who "knew" everything, it follows by definition that there were "things" Ben Franklin didn't know. Perhaps we are circling the target. "Things." What things?
•
It is said that twice as much "knowledge" was charted in 1980 as in 1970. How can one make an assertion of that kind? At a purely technical level, it isn't all that hard to conceive. Suppose, as an example, that every decade, the penetrating reach of a telescope doubles. In that case, you begin the decade knowing X about astronomic phenomena. At the end of the first decade, you know 2X; at the end of the second decade, 4X; and so on.
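One might state the telescope rule generally: if X is what is known at the outset and n the number of decades of doubling, then the stock of knowledge after n decades--call it K(n)--is

\[ K(n) = 2^{n} X \]

so that three decades of such doubling leave the astronomers with 8X, and ten decades with better than a thousandfold the original store.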
It is so (the epistemologists tell us) primarily because computer science advances us (we fall back on ancient metaphors) at an astronomic rate. It was somewhere reported that when George Bernard Shaw was advised that the speed of light was equal to 186,000 miles per second, he greeted that finding as a madcap effrontery--either that or a plain, bald lie.
Such sullen resistance to the advancement of physical knowledge is behind us; indeed, it has left us blasé rather than awed. When we pick up the telephone and lackadaisically dial Hong Kong, we simply submit--to a kind of magic we never presume to understand. The inquisitive minority among those who use such instruments for such purposes is mindful that something quite extraordinary is going on, triggered by rudimentary digital exertions by one finger of one hand, the result of which is to rouse a friend (he had better be a friend, considering that it's midnight in Hong Kong) by ringing his telephone 8000 miles away: a process that combines a knowledge of "things"--things such as transistors, transmitters, radio beams, oscilloscopes, etc., etc., etc.--that they will simply never understand and are unlikely to burden themselves with the challenge of attempting to understand.
So it is that the knowledge explosion, as we have come to refer to it, is acquiescently and routinely accepted by both the thoughtful and the thoughtless, the grateful and the insouciant. Every now and then one identifies a little cry of frustrated resentment. Ten years ago, I took to Bermuda a self-effacing boatwright in his mid-60s to give expert testimony in a lawsuit. He was asked by the defendant's lawyer how he could presume to qualify as an expert in all that had to do with the construction of a seagoing boat--woodwork, electricity, engine, rigging, plumbing, sail. William Muzzio answered diffidently that, in fact, he knew as much as any of the specialists who worked for him, each of whom had mastered only the expertise of his separate field.
He then paused for a brief moment in the little, attentive courtroom....
He did not, he corrected himself, know--himself--how to fabricate transistors for ships' radio gear. Thus the sometime complete boatwright formally acknowledged the progressive relative finiteness even of his own very wide expert knowledge of all that used to be required to launch a seagoing yacht. Others acknowledge their progressive relative ignorance by the simpler expedient of paying no attention to it whatever.
•
Consider, in the light of our general concern about our increasing ignorance, the obsessive interest in the working habits of the President of the United States. It is widely acknowledged that Ronald Reagan devotes fewer hours to studying the data that flow into the Executive cockpit than his predecessor did. But two questions are begged by those who stress invidiously the comparison. The first is: Is this difference reflected in the quality of Reagan's performance as Chief Executive? And the second, How could his predecessor, Jimmy Carter, reasonably assume that he had mastered all the data conceivably relevant to the formulation of the most enlightened decision? How do we correlate--or do we?--knowledge and performance in nonscientific situations? Unflattering things have been said about Carter's handling of the Presidency, but nobody ever accused him of dereliction at the homework level. And then again, five Presidents back, John F. Kennedy was once overheard to say that the Presidential work load was entirely tolerable. Notwithstanding this nonchalant evaluation of arguably the most taxing job in the world, Kennedy, as Chief Executive, had probably more full-time bards working to apotheosize him than any President since, oh, Abraham Lincoln.
What are we to make of all this confusion on the matter of time devoted to the acquisition of knowledge?
•
So we move in on an intimation of the painless acclimation of our culture to an unspoken proposition: that every day, in every way, man knows more and more, while every day, in every way, individual men know less and less. The question arises whether we give in, by our behavior, to complacency, or acknowledge philosophically, even stoically, force majeure, much as we acknowledge biological aging and, eventually, death. There is, after all, nothing an epistemological reactionary can do to erase human knowledge. Buckminster Fuller remarked that it is impossible to learn less. Valiant efforts at Luddite nescience have been made, most notably by Pol Pot, who recently set out to kill everyone in Cambodia who was literate--save, presumably, those in his circle who needed to read his instructions to kill everyone else who could read his instructions. He was stopped, finally, after he had killed somewhere between one quarter and one third (the estimates vary) of all Cambodians. But poor Pol Pot, all he ultimately accomplished was the premature death of millions of people and a testimonial dinner in his honor by Communist China.
Given, then, that we cannot hope to read, however much time we give over to the effort, one one-hundredth of the books published in America alone every year, nor read one periodical out of every 100 published, and all of this to say nothing of catching up with those masterpieces written yesterday that silt up into public recognition, some of them 10, 20, even 50 years after first published, how can we hope to get about with any sense of--self-satisfaction isn't quite the right word, because self-satisfaction is not something we ought ever to strive after--rather, well: Composure is probably as good a word for it as comes readily to mind?
•
Nothing I have ventured until now is, I think, controversial. Is it controversial to bridge over to the final point; namely, that inhabitants of a common culture need to have a common vocabulary, the word vocabulary here used in the most formal sense as the instrument of intercommunication?
It is probably not a culturally disqualifying civic delinquency, or even civic abnegation, to come late, say six months or even a year late, to the recognition of Who is Michael Jackson? and What exactly is it that makes him, after two hours at a studio, create something the price of which Picasso would not have dared to ask after 20 hours' work at his easel? But I do think it hovers on civic disqualification not to know what is meant, even if the formulation is unfamiliar, when someone says, "Even Homer nodded."
Now, any time anybody comes up with something everybody ought to know on the refined side of, say, The world is round, not flat, or, A day comprises 24 hours, you will encounter an argument over whether knowledge of that particular datum is really necessary to integration as a member of a culture. So that what I just said about Homer's nodding will be objected to by some as not intrinsic to a "common vocabulary" in the same way that, let us say, it is intrinsic to know the answer to the question What was Hitler's holocaust? Subgroups within a culture will always feel that a knowledge of certain "things"--even of certain forms, certain recitations--is indispensable to a common knowledge and that without them, intercourse (social intercourse, I suppose I should specify, writing for Playboy) is not possible. These "things" go by various names and are of varying degrees of contemporary interest. For instance, there is "consciousness enhancement" as regards, oh, black studies, or malnutrition, or Reagan's favoritism toward the rich. But these are, I think, faddist in any large historical perspective. Not so much more remote "things," such as Homer's nodding.
With the rise of democracy and the ascendancy of myth-breaking science, the need arose to acknowledge man's fallibility, preferably in a way that also acknowledged man's vanity. This was the period during which a belief in the divine right of kings began to wither on the overburdened wings of certitude. So that it became common in the 17th Century, the lexicographers tell us, to reflect that if it--i.e., human fallibility--could strike out at Homer, the more so could it overtake us. Homer was the symbol for the poet universally regarded as unerring (the divine Homer); yet objectivity raises its obdurate voice to point to errors (mostly factual inconsistencies) committed by the presumptively unerring. Only just before the beginning of the Christian era, Horace had written that "even Homer sometimes nods." And as recently as 1900, Samuel Butler spotted in the Odyssey the description of a ship with its rudder at the front.
And so an entire complexion of social understanding unfolds before us: so that by recalling that even Homer nodded, we are reminded of the vulnerable performance of lesser human beings--indeed, of all human beings. And if we acknowledge our weaknesses, then we inherit insight into such terms as "government by laws, not by men"; of such propositions as that "nobody is above the law"; and of such derivative things as checks and balances; insights, even, into the dark side, and black potential, of human nature.
In the age of the knowledge explosion, the struggle, by this reckoning, should be not so much to increase our knowledge (though that is commendable even if we recognize, fatalistically, that we fall further behind every day) as to isolate those things that no data that have been discovered have ever persuasively challenged and--here we approach an act of faith--no data will ever plausibly challenge. These are known, sometimes, as the "eternal verities." A secular version of one of these verities is that no one has the right to deprive another man of his rights. Let the discussion proceed over exactly what that man's rights are but not over the question of whether or not he has rights. But in order to carry on that discussion intelligibly, we need to share that common vocabulary that reaches out and folds protectively into a common social bosom those common verities. If, next Monday, all Americans were to suffer an amnestic stroke, forgetting everything we had ever known, what is it that would be required before we reassembled--if ever--around such propositions as are asseverated in the Declaration of Independence and in Lincoln's Gettysburg Address?
Western culture is merely a beachhead in space, Whittaker Chambers reminds us. That insight is what distinguishes today the Renaissance man. He is not the man who, with aplomb, can fault the Béarnaise sauce at Maxim's before attending a concert at which he detects a musical solecism, returning to write an imperishable sonnet before preparing a lecture on civics that the next day will enthrall an auditorium. No: The Renaissance man is, I think, someone who bows his head before the great unthreatened truths and, while admitting and even encouraging all advances in science, nevertheless knows enough to know that the computer does not now exist, nor ever shall, that has the power to repeal the basic formulas of civilization. "We know," Edmund Burke wrote, "that we have made no discoveries; and we think that no discoveries are to be made, in morality--nor many in the great principles of government, nor in the ideas of liberty, which were understood long before we were born, altogether as well as they will be after the grave has heaped its mold upon our presumption, and the silent tomb shall have imposed its law on our pert loquacity."
"We cannot hope to read ... one one-hundredth of the books published in America alone every year."