Electronic Mysticism
This is Chapter 24 of The Future Does Not Compute: Transcending the Machines in Our Midst, by Stephen L. Talbott. Copyright 1995 O'Reilly & Associates. All rights reserved. You may freely redistribute this chapter in its entirety for noncommercial purposes. For information about the author's online newsletter, NETFUTURE: Technology and Human Responsibility, see http://www.netfuture.org/.
Cyberspace. Room for the human spirit to soar free. Earth surrounded by a digitized halo of information -- a throbbing, ethereal matrix coagulating into ever shifting patterns of revelation, and giving birth to a rich stream of social, political, and environmental initiatives. The individual freedom once sought only within the cloister or in the trenches is now to flow from keyboard, mouse and glove, electrifying the initiated with raw and unbounded potential for new being. An electronic New Jerusalem, its streets paved with silicon and bridging all cultural rifts, promises the healing of nations. Even the hope of personal immortality flickers fitfully for the first time through materialist brains contemplating prospects for DNA downloading and brain decoding.
And you, self-professed infonaut -- from whose jargon I have fashioned this vision -- you say you're not religious?
Athena's project
If we really must have our mysticism, then it seems a worthwhile exercise to take a brief look backward. For our myth-making ancestors truly did live mystically tinged lives. A comparison with our own day might therefore prove interesting. I will not, however, “make a case” about early man. I will attempt only to suggest what at least some scholarly traditions have to say about the human past. You may take it for whatever it's worth.
The problem, of course, is that it's nearly impossible for us to twist our minds around the psyche of those who left us the great myths. If you doubt this, try making sense of a few lines from the Egyptian Pyramid Texts. Or simply reflect for a while upon Owen Barfield's remark that "it is not man who made the myths, but the myths that made man." Barfield, a philologist and profound student of historical semantics, means this quite literally. And yet, no one can blame us for finding such a statement disorienting.
For my part, I will approach the matter more gently by observing (with Barfield) that ancient man, much more than we, experienced himself rather like an embryo within a surrounding, nourishing cosmos. And his cosmos was not at all a world of detached, inert, dis-ensouled “things” such as we face, but instead more like a plenum of wisdom and potency. Out of this plenum -- all the great mythic traditions assure us of this -- the primal, mythic “words” of the gods congealed into the deeply ensouled forms of creation. Man, a microcosm within the macrocosm, encountered both his Source and reflection in the world, much as an embryo draws its sustenance from, and discovers its image of wholeness in, the mother.
The minds of our remote ancestors, as Barfield forcefully demonstrates, were not launching pads for “primitive theories” about the world. It would be truer to say, as we will see, that the mythic surround was engaged in weaving the ancient mind, as in a dream. But, eventually, man the dreamer of myths woke up to be the maker of myths -- or, rather, the recollector of myths, just as we today recall the pictures of our dreams and then struggle with awkward, make-believe words (“it was as if...”) to reconstitute what had been a perfect unity of experience. This historical waking up occurred most suddenly and dramatically in Greece.
Recall for a moment that scene at the beginning of the Iliad where Achilles is prepared to draw his sword against Agamemnon, bringing disastrous strife into the Greek camp. The goddess Athena takes her stand beside him, catches him by his golden hair,
making herself to be seen of him alone, and of the rest no man beheld her. And Achilles was seized with wonder, and turned him about, and forthwith knew Pallas Athena; and terribly did her eyes flash.
Athena then counsels Achilles to stay his hand, and he obeys, for “a man must observe the words” of a goddess.
Now it happens -- and this was pointed out by the outstanding classicist Bruno Snell -- that this same pattern holds at many points throughout the Iliad: when a decisive moment is reached and a man must act fatefully, a god intervenes. Often the intervention does not further the story in the slightest, but only gets in the way. Or, at least, it gets in our way. We would rather Achilles just decided to restrain himself -- something he should do from out of his own character. Snell's point, however, is that for the Homeric heroes there was not yet enough character -- no one sufficiently there -- to do the deciding. Athena "enters the stage at a moment when the issue is, not merely a mystery, but a real secret, a miracle." For Achilles "has not yet awakened to the fact that he possesses in his own soul the source of his powers":
Homer lacks a knowledge of the spontaneity of the human mind; he does not realize that decisions of the will, or any impulses or emotions, have their origin in man himself .... What was later known as the “life of the soul” was at first understood as the intervention of a god./1/
Similarly, Francis Cornford, noting the common definition of animism as the conviction that everything in the world possesses a soul like one's own, commented that such animism could only be of relatively recent date:
At first, the individual has no soul of his own which he can proceed to attribute to other objects in nature. Before he can find his own soul, he must first become aware of a power which both is and is not himself -- a moral force which at once is superior to his own and yet is participated in by him./2/
Following his discussion of Homer, Snell goes on to trace the actual Greek "awakening" in the lyric poets and tragedians. The discovery of the mind, he argues, occurred by means of the progressive internalization of the order, the laws, the grace, first experienced in the all-enveloping world of the Olympian gods, with whom the Greeks were so fascinated. And he notes that this "discovery" is not quite the same thing as chancing upon a lost object. To discover the mind is, in a very real sense, to gain a mind that did not exist before. Part of Achilles' mental functioning was not his own, but the gift of Athena. He was not capable of forming theories about her or "using" her to explain events; the "presence of mind" by which the theory-former would eventually be born was only now being brought down to earth.
The result of the descent was the flowering of art and intellect in classical Greece -- and beyond that, all of Western civilization. For you can read, if you like, not only in Snell but in works like Francis Cornford's From Religion to Philosophy and R. B. Onians' The Origins of European Thought, how the world of the gods upon which the earliest Greeks gazed and by which they were so profoundly affected was the historical prerequisite for all those later achievements of the human mind with which we are familiar. As Cornford notes, even if we look at the extreme materialism reflected in Democritean atomism, it remains true that “the properties of immutability and impenetrability ascribed to atoms are the last degenerate forms of divine attributes.” Nor have we even today wholly succeeded in elucidating a certain chthonic darkness hidden within the most fundamental terms of our physics -- time, space, energy, mass, matter.
The law in which we're made
All of recorded history can be viewed as a progressive waking, a coming to ourselves./3/ Barfield, who traces many aspects of this process, points out, for example, that the human experience of “inspiration” -- literally, being breathed into by a god -- has passed from something like possession or divine madness, to something much more like “possessing a tutelary spirit, or genius,” until today it means scarcely more than “realizing one's fullest capabilities.”/4/ What first seemed an inbreathing from without now seems increasingly our own activity.
In all of this we see a certain coherent movement. The embryo gains an ever more independent life and finally -- cut off from a once-sustaining world -- contemplates and regenerates within the isolation of its own skull the creative speech from which it was itself bred. The word has moved from the mythic surround into a bright, subjective focus in the newly self-conscious individual, who now prepares to speak it forth again as his own creation. But not merely his own creation. For this word -- if it remains true -- still resonates with the ancient intonations of the gods. As J. R. R. Tolkien has written of the human storyteller:
Though all the crannies of the world we filled with Elves and Goblins, though we dared to build Gods and their houses out of dark and light, and sowed the seed of dragons -- 'twas our right (used or misused). That right has not decayed: we make still by the law in which we're made./5/
We make still by the law in which we're made. But wait. What is this we see today? Scholars and engineers hover like winged angels over a high-tech cradle, singing the algorithms and structures of their minds into silicon receptacles, and eagerly nurturing the first glimmers of intelligence in the machine-child. The surrounding plenum of wisdom -- an incubating laboratory crammed full of monitors, circuit boards, disks, cables, and programs -- supports the fragile embryonic development at every turn. When, the attendant demigods ask, will this intelligent offspring of ours come to know itself as a child of the human cosmos -- and when, beyond that, will it finally waken to a fully independent consciousness? Will the silicon embryo, its umbilical cord to the research laboratory severed, eventually attain the maturity to create freely “by the law in which it's made”? And will it learn, in the end, to rebel against even this law?
Some will consider these questions blasphemous. I prefer to see them simply as natural -- the questions we must ask, because our destiny forces them upon us. And yet, our unmistakable echoing of the myth-enshrouded past suggests that we are indeed dwelling within sacred precincts wherein blasphemy is possible. Perhaps only today, when the creation is finally given power to devise once-unthinkable rebellion against the law of its own making, can the truest and deepest blasphemy be spoken.
From meaning to syntax
I began by citing a kind of latter-day, electronic mysticism, and suggested we look for comparison toward the mythic past. This led to a consideration of the mind's “discovery” by the Greeks at the hands of their gods. More broadly, it appeared that from the most distant -- yet still discernible -- past to the present a kind of reversal, or inversion, has occurred: where man's consciousness was once the stage on which the gods played, he has now “grown up” so as to stand firmly within himself and to project his own thoughts out into the universe. We saw one reflection of this reversal in the changing meaning of “inspiration.”
Of the many other reflections of the same historical process, one is particularly worth noting now. Languages, Barfield points out, give every appearance of having emerged from something like a sea of pure meaning, and only slowly have they hardened into the relatively fixed and explicit syntactic structures we know today. For the ancients, every surface disclosed a spirit-filled interior, while every gesture of spirit was embodied. Spirit and flesh, thought and thing, were not the opposites we have since made them. A dense web of connections -- inner and outer at the same time -- bound the human being to the world. The spoken word -- a token of power -- constellated for both speaker and hearer a concrete universe of meaning.
Tolkien suggests something of this ancient power of the word in The Silmarillion, where he describes the creation of the earth. Iluvatar and the Ainur broke forth in song,
and a sound arose of endless interchanging melodies woven in harmony that passed beyond hearing into the depths and into the heights, and the places of the dwelling of Iluvatar were filled to overflowing, and the music and the echo of the music went out into the Void, and it was not void./6/
If it helps to give substance to the metaphor, you might try to imagine the infinitely complex patterns of interweaving sound waves -- say, in a concert hall -- and the higher-order patterns that continually emerge from this complexity and then dissolve again, only to assume new shapes. Imagine further that we could steadily thicken the medium of this aural dance, until finally the harmonies condensed into graceful, visible forms. Or, if you prefer, think of the patterns of interacting light that coalesce into full-bodied holograms.
It is a long way from Tolkien's mythological constructions -- wherein he recaptures something of the word's former significance -- to the computer languages of our day. We call these computer languages “formal,” since they are possessed of syntax (form) alone. Their “dictionaries” do not elucidate meanings, but rather specify purely computational -- arithmetical or logical -- sequences. Somehow, it is up to us to invest the sequences with meaning. Starting with threadbare 1's and 0's, we must build layer after labyrinthine layer of logic, hoping that somewhere along the line we will have succeeded in imposing our own meaning upon the syntax (but where will it come from?). This hope has been enshrined as a rather forlorn principle:
THE FORMALIST'S MOTTO: If the programmer takes care of the syntax, the program's meaning will [somehow] take care of itself./7/
And what is this meaning? No longer the enveloping, sustaining, formative power of the word, but (if we are lucky) a mutely pointing finger. Philosophers wrestling with the problem of "intentionality" struggle to find a legitimate way by which the mechanically manipulated word may at least refer to -- point at -- a dead object. There is little agreement on how even this final abstraction of the ancient experience of meaning might be possible./8/ So in the historical movement from meaning-rich to syntax-bound languages, computers represent an extreme endpoint.
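The poverty of this formal "meaning" is easy to exhibit in miniature. What follows is a minimal sketch of my own -- the tokens are invented purely for illustration, not drawn from any actual system -- of a program performing logical "inference" by pattern substitution alone:

    # A minimal sketch of purely formal inference. The tokens are
    # invented for illustration; the procedure consults only their
    # shapes, never their sense.
    def derive(facts, implications):
        """If p is among the facts and (p, q) is an implication,
        add q. Every string here is opaque to the program."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for p, q in implications:
                if p in derived and q not in derived:
                    derived.add(q)
                    changed = True
        return derived

    facts = {"rain"}
    implications = [("rain", "wet streets"), ("wet streets", "slippery")]
    print(sorted(derive(facts, implications)))
    # ['rain', 'slippery', 'wet streets']

Replace every token with a nonsense string and the program derives its "conclusions" just the same. Whatever meaning the output carries, we brought with us; the machine contributes only the syntax.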
Much else is related to this overall historical shift -- not least the change from a humanly experienced unity with the world to a state of alienation wherein we stand as isolated subjects before objects that seem wholly disconnected from us. One milestone in this development occurred only a few centuries ago, when the words “subject” and “object” completed a rather dramatic reversal of meaning. The subject went from “what exists absolutely in its own right” (God was the one true Subject) to today's notion captured in the phrase, “merely subjective.” Similarly, the object -- once purely derivative, a spin-off from the subject -- now enjoys an unaccustomed prestige and self-sufficiency as the solid, reassuring substance of “objective reality.”/9/
What manner of gods will we be?
The engineering hosts hovering over their silicon child have not escaped these historical reversals. Having lost sight of their own substantiality as subjects -- wherein they might have chosen, with Tolkien, to “make by the law in which they were made” -- they contrive instead to fashion a child-mind wholly emergent from its physical substrate. Their own role as begetters, while not entirely forgotten, is less a matter of spirit-fecundity than of purely external manipulation of the mind's “objective, physical basis” -- which basis in turn is supposed to account for all the results. From Tolkien's storyteller -- whose tale originates and remains one with his own mind -- they have descended to mechanical tinkerer. To reduce creation's language to syntax is to be left -- as they proudly boast -- with mechanism alone.
Given the historical passage, Tolkien's passionate affirmation of the creative subject was already quaint when he voiced it. But now we have received from Myron Krueger, one of the fathers of “virtual reality,” a suitably modern declaration: “what we have made, makes us.”/10/
What we have made, makes us. Do you see the inversion? Furthermore, it is substantially true. Just so far as we ignore the law of our own making -- just so far as we forget our ancient descent from a cosmos of wisdom above us -- we lose the basis of creative mastery, and offer ourselves to be remade by the mechanisms below us.
Whether consciously or not, we choose between the above and the below. We take possession of a creativity rooted in the wisdom by which we were made, or else we subject ourselves to lesser powers ruling more perfectly in our machines than they ever could in ourselves, our only “hope” then being to live as fit servants of our own creations.
It is not hard to see that we are pursuing an experiment every bit as momentous as the discovery of the mind at the dawning of Western civilization. If we would “sing” our own intelligence into machines, hoping they may eventually discover their own minds, then -- whatever you may think of the historical perspectives I have outlined -- you will grant that we aspire to a role like the one I described for the gods. And it is therefore reasonable to ask: what manner of gods will we be?
We do not seem very interested in this question. We spend much time debating what manner of intelligence our machines are manifesting, and toward what future goal they are evolving. But we are less inclined to ask toward what goal we are evolving. Perhaps we and our machines are converging upon the same goal -- converging, that is, upon each other. Certainly if our computers are becoming ever more humanlike, then it goes without saying that we are becoming ever more computerlike. Who, we are well advised to ask, is doing the most changing?
What if the decisive issue we face in artificial intelligence today is the ever accelerating adaptation of our minds and ourselves to electromechanical intelligence? What if our current efforts, wrongly or uncritically pursued, threaten to abort the human spirit on its path from the cradle of the gods to its own intended maturity? The ancient narratives speak of the gods' sacrifice as they labored over the human race. But a sacrifice can be noble, or it can be ignoble -- it can be the discovery of our highest calling, or a wasteful throwing away of what is valuable. We should ask what sort of sacrifice we are making as we labor to breathe our own intelligence into a race of silicon.
There is no gainsaying that computers already embody much of what human intelligence has become, much of what we experience our intelligence to be. We may -- legitimately, I think -- feel a pride in our creations. But, looking over the past five hundred years or so, perhaps we should also feel a few whispered apprehensions about the systematic reduction of our own thinking to a kind of computation. For it now appears that we face a choice: either to reduce ourselves, finally, to machines, or to rediscover -- and live up to -- our own birthright, our heritage from ages long past.
Ignoring the past
But perhaps now your patience is exhausted. “Why do you insist on exercising this strange predilection for placing the computer within an ancient historical context? To be sure, there are some interesting curiosities here, but don't expect anyone to take it all with grave seriousness! Eventually one has to turn away from speculations about the past and get back to the work at hand, with eyes wide open.”
I do understand such a reaction. But the suggestion that history -- including the origin and development of the human mind -- is largely irrelevant to our current undertakings is a curious one. How is it that we have attended to origins and evolution so profitably in fields ranging from biology to astronomy -- and, indeed, have found this attention critical to our understanding -- but are now hell-bent on creating artificial intelligence without so much as a backward look? Where would geology be without Lyell, or biology without Darwin? Where would astronomy be without cosmology? And yet the cognitive scientist blindly and happily strives to “implement” the human mind in computers without a thought about the past./11/
This narrowness of outlook reads quite well as a symptom of adaptation to our machines. We are trained to analyze every matter -- including our own powers of analysis -- into the sort of nonproblematic terms that will successfully compute. Otherwise, we -- who are, after all, supposed to do something as we sit in front of our computers -- have no job. It does not surprise me that we can instill much of our intelligence into machines, for it is an intelligence already machine-trained, an intelligence shaped to the mechanisms with which we have compulsively been surrounding ourselves for these past several centuries.
It seems to me that we're looking at this same narrowness of outlook when we consider how badly the early pioneers of artificial intelligence erred with their excited predictions, the most famous probably being Herbert Simon's 1965 statement that “machines will be capable, within 20 years, of doing any work that a man can do.”/12/ Still earlier, in 1958, Simon had written:
It is not my aim to surprise or shock you .... But the simplest way I can summarize the situation is to say that there are now in the world machines that think, that learn and that create. Moreover, their ability to do these things is going to increase rapidly until -- in the visible future -- the range of problems they can handle will be coextensive with the range to which the human mind has been applied./13/
Certainly all but a few incorrigible enthusiasts have pulled back from such statements in more recent years. And the unpleasant jolts administered by failed projects have delivered a younger generation of researchers from having quite the same naivete. But not many have come to understand -- or even to pose the question -- how it was that competent researchers were subject to such a misconstrual of the task in the first place.
The early and misguided enthusiasm for “logic machines” arose, I think, from an extraordinarily simplistic reading of their own minds by a group of researchers who possessed, without a doubt, brilliant intellects. They looked within, and they were aware of little more than the operations of a sophisticated calculator. This should give us pause. What crippling of human consciousness -- evident in even the best minds of the modern era -- could have yielded such a grotesquely flawed and inadequate self-awareness?
It is not clear that we today are any less possessed by the same underlying assumptions, the same qualities of mind, that led to the earlier errors. Nor can we ever be confident in this regard, without first enlarging our sympathies to embrace other minds, other eras, other possibilities.
Escaping cultural limitation
Human consciousness has evolved -- and in some rather fundamental ways. Certainly we're left with some room for reading different meanings into the word “evolved,” but it remains true that the world of Achilles and the Olympian gods is a long way from our own. Our consciousness today is, for all practical purposes, incapable of assuming shapes that once were easy or natural. But if this is so, surely we should be concerned. What if all that we glorify today as “information” is but an ashen residue of the luminous meaning that once held both man and world in its embrace?
In other words, it is not outrageous to contend that what we have today is in some respects a seriously disabled consciousness, and that our infatuation with machines is both a symptom of our disability and a further contributor to it. Just as we cannot imagine the mythic word to which the ancients were attuned, so increasingly we cannot imagine our own thoughts as anything very different from the electromechanical operations of a computer. But entrapment within a particular culture is a dangerous state for any truth-seeker. How can we be sure, without some historical investigation, that our cultural limitations are not destructive of the truth?
It is precisely such investigation that I would like to encourage. Whatever it is that most essentially distinguishes us from computers, I am convinced that it is intimately related to the powers with which we may successfully pursue exercises in historical imagination. Having no doubt made rough weather of it here, I should perhaps consider myself ripe for cloning as a computer! But the alternative to such cloning will, I trust, still appeal to at least some of you -- the alternative, that is, of reaching beyond yourselves and understanding sympathetically exactly those things that have no support in your current habits of thought or system of meanings. The result may be fully as startling as to find yourself yanked by the hair and brought face to face with flashing-eyed Athena.
Of one thing I am sure: if we do not have this sort of experience, we will, in the end, merely compute.
References
1. Snell, 1960: 21, 31-32.
2. Cornford, 1957.
3. See Chapter 20, “Awaking from the Primordial Dream.”
5. Tolkien, 1947.
6. Tolkien, 1977: 15.
7. Adapted from Haugeland, 1985.
8. “This is the fundamental problem in the philosophy of mind of mental content or intentionality, and its proposed solutions are notoriously controversial” (Dennett, 1991: 192 n. 7).
10. Krueger, foreword to Heim, 1993.
11. Daniel C. Dennett, referred to in a previous footnote, is more historically oriented than most who work with computer models of mind. Yet he offers only the standard evolutionary story of the brain and its “software,” and chooses not to explore the evolution of what he calls the “heterophenomenological text” (Dennett, 1991). For that evolution and some of its implications -- wholly missed in contemporary cognitive science -- see Barfield, 1973 and Barfield, 1965a.
12. Simon, 1965: 96.
13. Simon, 1958: 8.