the unlikely story of how America slipped the surly bonds of earth & came to
believe in signs & portents that would make the middle ages blush
this site is a labor of love. i.e., if you love me enough,
I'll be able to complete it. send proof of love via buttons above. please. if you can. thanks.
Within a generation the problem of creating
"artificial intelligence" will be substantially solved. ~ Marvin Minsky, 1967
Had Dr. Minsky (I presume) been dropping half as much acid as I was that year, he might be forgiven such a hallucinatory forecast. For my own part, I was less concerned with the future of machine intelligence than with the question of how to disinherit myself from the acquired characteristics so typical of postwar childhood. Orphans against the night, strange voices from the sky, the world unraveling. You never quite get over something like that. Missing in action, miscast, cast adrift. Picture yourself in a boat on a river...
Read my lips, HAL: open the pod bay doors, motherfucker.
As this Amazon search shows, "cultural imaginary" is a term of art much in vogue at the rarefied heights of contemporary literary criticism. Here's a clip from a review I found at the University of Toronto's EDGE Magazine that explains...
...[the book] also explores pop culture -- comic strips, soap operas, sermons, scientific essays. By casting their net a little wider, researchers hope to form a better understanding of how this wider body of literature is both created and received, and how it has an impact on the cultures in these areas over time and across geographical regions.
This eclectic mix of literature reflects what [the author] calls "the cultural imaginary" -- how average people see themselves and make sense of the world around them in a given time and place.
Science, technology, politics, the military, religion, culture. Worlds within worlds, dreams within dreams: each seemingly unconscious of the others' existence, yet all interleaving, interweaving, fabricating your so-called reality.
Imagine that.
posted by Christopher Locke at #
Monday, April 30, 2007
In response to my previous post, a longtime reader wrote: "I think you're on to something big. How the hell to get there, or what to do when you arrive, I can barely begin to imagine."
When I started the research for this (gods willing) book-to-be, I could barely begin to imagine, myself, where it was headed. My hunch was that there is something in the water, metaphorically speaking, that is causing people these latter days to be like, you know, real weird. The resultant weirdness I labeled NewAge++. The "++" bit was intended to signal two things:
a sort of kinship (if only suggestive) with the C++ programming language, and by extension, the entire range of so-called high technologies, perhaps especially as they may have influenced various contemporary notions about psychotherapy; and
the existence of a distinct (if still fuzzy) social class quite a bit larger than what we usually think of as "New Agers."
On the heels of my foregoing ruminations here about cognitive science, and on the theory that a picture is worth 1000 words, perhaps this book cover graphic will begin to bring us full circle.
Now for starters, there are three things you need to know about Ray Kurzweil:
Nearly two years ago (in Technology and Superstition) I wrote about Are We Spiritual Machines?: "Proving that you don't need to be into angels or astrology to be a modern-day soothsayer, Ray Kurzweil weighs in again on his favorite subject" -- his favorite subject being The Singularity. Quoting from that page...
Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.
Ta-da!
So here's my next-installment hypothesis about another slice of the NewAge++ demographic, to wit: that hare-brained hallucinations like Kurzweil's -- and the AI/CogSci fever-dreams they spring from -- are essentially New Religious Movements, no different in kind from A Course in Miracles, UFO cults, and The Urantia Book.
every picture tells a story don't it
~ rod stewart
posted by Christopher Locke at #
Friday, April 27, 2007
I should have followed my second mind...
and you know I wouldn't be here now people
down on the killing floor. ~ electric flag
Why deny it? Mystic Bourgeoisie has run the gamut from enigmatic to abstruse to plain obscure. And perhaps my previous post should take some sort of prize for impenetrability: an ambiguous portrait of a woman, her decadent visage worthy of an Aubrey Beardsley print, accompanied only by the callout "MEAT MACHINES." What could I have been thinking?
OK, I'll tell you. I was thinking that some of you -- though I knew it would be few -- would click on that cover graphic and discover that the book it comes from -- Mindware: An Introduction to the Philosophy of Cognitive Science -- can be Searched Inside™ on Amazon. Moreover, I was hoping that maybe a few (certainly a much smaller subset than those who bothered to click on the cover shot) would call up the table of contents and notice that Chapter 1 is titled "Meat Machines: Mindware as Software."
But why do I make it so hard for you? Simple: when readers are forced to participate in the learning experience, such engagement leaves deeper neuronal traces in their cognitive hamburger.
But there is, as usual (I hope), a larger point here. If I'm annoyed by the "meatspace" neologism -- as I mentioned two posts ago -- I am positively repulsed by the idea of human beings being referred to as "meat machines." And these sorts of metaphors are not merely random Wired-Burning-Man hipspeak. They derive directly from the Computational Theory of Mind. For all you non-clickers (you know who you are), that Wikipedia link would have informed you that...
The computational theory of mind is the view that the human mind is best conceived as an information processing system very similar to or identical with a digital computer. In other words, thought is a kind of computation performed by self-reconfigurable hardware (the brain). This view is common in modern cognitive psychology and is one of the foundations of evolutionary psychology.
Now, it's funny that whoever wrote that should mention evolutionary psychology, because in that same post where I objected to "meatspace," I quoted from an Amazon review about a book called Mindblindness: An Essay on Autism and Theory of Mind. And of course it's not funny at all, because that book includes a foreword by Leda Cosmides and John Tooby -- the guiding lights, if not the outright founders, of the field of evolutionary psychology.
I'm not confusing you, am I? I hope not, because we're fast coming up on the aforementioned larger point. In fact, here it is now, in the form of a string of partial quotes from that very Cosmides & Toobyesque foreword...
The machinery that causes these experiences...
specialized circuitry computes in our minds...
the specialized neural circuitry responsible...
We inhabit mental worlds populated by the computational outputs of battalions of evolved, specialized neural automata.
the neural automata responsible for these constructions...
these devices are present in all human minds...
the representations produced by these universal mechanisms...
these evolved inference engines...
the machinery that constitutes most of the evolved architecture of the human mind...
These mechanisms solve the many computational problems involved in constructing the world...
evolved, specialized, computational problem solvers...
our cognitive architecture resembles a confederation of hundreds or thousands of functionally dedicated computers...
Each of these devices has its own agenda...
There is a "theory of mind" module, and a multitude of other elegant machines.
social exchange algorithms that define a social world of agents...
the psychological architecture can now be mapped...as a system of computational relationships
a triumph of automated modules and evolutionary cognitive engineering.
I could go on, but you get the idea. This is a major clue about what happens when the psychological study of human beings is hijacked by computer "scientists" -- and, to reiterate this telling passage...
We inhabit mental worlds populated by the computational outputs of battalions of evolved, specialized neural automata.
...about what happens when that "science" had already been hijacked by the US Department of Defense, whose funding for all of the above was driven by its usual goals: battlefield command and control.
The "evolution" we are concerned with here doesn't carbon-date back to anywhere even remotely close to the Pleistocene epoch. Rather it begins with the military-industrial-academic complex that evolved during the Cold War of the 1940s and '50s, entailing highly unnatural selection by very undisinterested institutions.
Put that in your meat machine -- and give it a good hard crank.
"This book is about computers, as machines and as metaphors, in the politics and culture of Cold War America."
Note: I despise the term "meatspace." However, as it's a sort of left-hand derivative of an uncritical acceptance of the computational model of mind, perhaps the irony will gel at some point in what follows. No guarantee is implied.
Financial concerns were partly responsible for [Frederick] Terman's attention to output and external support. A desire to see Stanford produce large numbers of Ph.D.s and to build steeples in particular fields of national importance also reflected his commitment to increasing Stanford's influence and prestige. To these concerns Terman brought a particular interest in efficiency and a faith in statistical measures of merit, both of which would have pleased his father, who had dedicated his professional life to the dubious goal of quantifying intelligence and had regarded the university as a business enterprise, albeit an inexcusably inefficient one.
If Terman was carrying into the postwar period the interests of some prewar corporate progressives, he was also representing (although perhaps in extreme fashion) the enthusiasms of the cold war decades. The early postwar years were marked by a general faith in objective measures of merit and the widespread use of intelligence, achievement, and personality tests at all levels of the educational system. The application of standards of efficiency and productivity to the university was also in keeping with the interests of the period.
Free associating off the above passage, I immediately think of three other books. Occupational hazard.
The third issue that generated controversy within the discipline (and outside of it) was over the characterization of intelligence as a biological potential genetically determined from birth. Especially potent at the level of groups, where this use of intelligence ... had proven integral to the development of scientific racism, the biological basis for an individual's degree of intelligence had been promoted by most of the first generation of American testers as well as by their allies in the eugenics movement.
Both "first generation of American testers" and "their allies in the eugenics movement" apply specifically to Fred's pop, Lewis Terman.
The book describes (pp. 271-73) the January 1934 translation by one Dr. George Dock -- "a member of the advisory board for California's Human Betterment Foundation and one of the most highly regarded physicians in the United States" -- of a recently published Nazi compulsory sterilization statute: the "Law for the Prevention of Hereditarily Diseased Offspring."
...as he translated the official proclamation of this far-reaching German law, Dr. Dock did not really expect to discover what he did: immediately after the full text of the law, the Nazi government had singled out the example of California and the statistics provided by the Human Betterment Foundation to justify what it knew would be a controversial national program....
[Dock wrote:] "I think the reference to the California work, and the work of the Foundation is a very significant thing. The matter has given me a better opinion of Mr. Hitler than I had before."
All the leaves are brown. And the sky is grey.
Finally, with respect to Fred Terman's "particular interest in efficiency," how could I not think of that turn-of-the-[last]-century elitist scumbag Frederick Taylor, "the father of Scientific Management"? Library Journal said this about Taylorism Transformed: Scientific Management Theory Since 1945 (University of North Carolina Press, 1991).
...this intellectual history of modern management theory traces the two sides of Taylorism (power and value) and its effect on modern management writers such as Herbert Simon, Peter Drucker, Elton Mayo, and Abraham Maslow. He discusses the bureaucracy of centralized power and the corporatism of worker participation, which he describes as "the opposite side of the same Taylorist coin." ...this title is a detailed argument that "Taylor's bureaucratic and corporate successors transcended his techniques but not his premises."
We've run into Maslow before here on Mystic B -- and will again. For now, just keep in mind that his mentor was Edward L. Thorndike, a colleague of Lewis Terman who was equally involved in intelligence testing and its use to support "scientific racism."
Herb Simon is another story -- one that will link us back to where this started: with Fred Terman at Stanford. One of the signal events of the cold war, and a major driver of government funding for Big Science, was the successful launch of Sputnik by the Soviet Union in 1957. As a direct result, the Advanced Research Projects Agency (ARPA) was created a year later under the US Department of Defense. Over the intervening decades, DARPA, as the agency came to be known, expanded its funding to include what was, by the early '60s, coming to be called "cognitive science" -- a new field which owed much to Herbert Simon. I would occasionally run into the guy in the halls of the Robotics Institute at Carnegie Mellon University. But cutting to the chase, this graphic from DARPA's 50th anniversary Strategic Plan (PDF, March 2007) will, at a glance, give you the basic idea of what developed over the ensuing five decades. God Bless America: your PAL.
Howard Gardner documented this paradigm shift -- and it actually was one, in this case, in the true Kuhnian sense -- in The Mind's New Science: A History of the Cognitive Revolution. In a section titled "De-Emphasis on Affect, Context, Culture, and History," he writes about some of the pushback (now largely forgotten) that the new field was met with (pp. 41-42)...
Though mainstream cognitive scientists do not necessarily bear any animus against the affective realm [i.e., emotion], against the context that surrounds any action or thought, or against historical or cultural analyses, in practice they attempt to factor out these elements to the maximum extent possible.... And so, at least provisionally, most cognitive scientists attempt to so define and investigate problems that an adequate account can be given without resorting to these murky concepts.
Critics of cognitivism have responded in two principal ways. Some critics hold that factors like affect, history, or context will never be explicable by science: they are inherently humanistic or aesthetic dimensions, destined to fall within the province of other disciplines or practices. Since these factors are central to human experience, any science that attempts to exclude them is doomed from the start. Other critics agree that some or all of these features are of the essence in human experience, but do not feel that they are insusceptible to scientific explanation. Their quarrel with an antiseptic cognitive science is that it is wrong to bracket these dimensions artificially. Instead, cognitive scientists should from the first put their noses to the grindstone and incorporate such dimensions fully into their models of thought and behavior.
The main feature of our architecture is the notion of self-aware cognition that we believe is necessary for human-like cognitive growth. Our approach is inspired by studies of the human brain-mind: in particular, by theoretical models of representations of agency in the higher associative human brain areas. This feature (a theory of mind including representations of one’s self) allows the system to maintain human-like attention, focus on the most relevant features and aspects of a situation, and come up with ideas and initiatives that may not follow from formal logic. The result is a robust cognitive system capable of significant cognitive growth.
Note the present-tense use of "is" in that last sentence. This rhetorical trope is commonly known as "wishful thinking" and it's been relentlessly deployed for most of DARPA's 50-year history. As it turns out, your PAL is a wholly imaginary friend. So much for getting rid of "murky concepts."
In the references at the end of that paper, I was interested -- fascinated is a better word -- to find this: Mindblindness: An Essay on Autism and Theory of Mind by Simon Baron-Cohen (MIT Press, 1995). Amazon reader Hubert Cross, self-identifying as autistic, includes the following in his review:
As a kid, I didn't see people like objects, but I didn't quite see them as people either. They were there, but they were not very important. That is as far as I can go explaining how it was for me. The only thing I can add is that I am not giving you anything more than a faint idea of how it really was.
What does Simon Baron-Cohen do? He introduces the concept of "skinbags." Bags of skin that move and talk like people but that are not quite people.
"Skinbags" is precisely what people were for me. They moved and talked, but they had no feelings. It was not that I believed that they had no feelings; it was that it never crossed my mind to consider the possibility.
A Newsweek review of Mindblindness ("Blind to other minds," Geoffrey Cowley, 14 August 1995) contains statements such as: "Scientists sometimes think of the mind as an all-purpose learning machine, equally receptive to any kind of information." And: "...by the time they turn 4, [children] display a 'theory-of-mind mechanism,' an awareness that other people are capable of different mental actions, such as pretending or believing or misunderstanding."
Machines and mechanisms: it has become quite common to encounter such "digital analogs" of mental abilities and dynamics. The ease with which such assumptions are made and accepted is itself a product of the so-called cognitive revolution -- and the cold war Big Science that fostered it. In The Closed World: Computers and the Politics of Discourse in Cold War America (MIT Press, 1996), Paul Edwards writes (p.2)...
...computers inspired new psychological theories built around concepts of "information processing." Cybernetics, AI, and cognitive psychology relied crucially upon computers as metaphors and models for minds conceived as problem-solving, self-controlling, symbol-processing systems.
After a while, Bruner himself became disenchanted with the Cognitive Revolution, or at least with what it had become. "That revolution," he wrote at the beginning of his 1990 Acts of Meaning, a "goodbye to all that" proclamation of a new direction,

was intended to bring "mind" back into the human sciences after a long and cold winter of objectivism... [But it] has now been diverted into issues that are marginal to the impulse that brought it into being. Instead, it has been technicalized in a manner that undermines that original impulse. This is not to say that it has failed: far from it, for cognitive science must surely be among the leading growth shares on the academic bourse. It may rather be that it has become diverted by success, a success whose technological virtuosity has cost dear. Some critics... even argue that the new cognitive science, the child of the revolution, has gained its technical success at the price of dehumanizing the very concept of mind it had sought to reestablish in psychology, and that it has thereby estranged itself from the other human sciences and the humanities."
Here's a question to ponder at the end (for now) of these loosely connected and intertwining threads. Is it possible that "autism" is, if not an effect of, at least a metaphorical mirror held up to the "antiseptic cognitive science" that has increasingly come to define human beings as mindblinded skinbags? In the case of the computational model of mind, minus even the skin.
If you want to really spook yourself on this general theme, go read The Geek Syndrome by Steve Silberman (Wired, December 2001). The sub-title slug reads: "Autism - and its milder cousin Asperger's syndrome - is surging among the children of Silicon Valley."
As ever, to be continued...
posted by Christopher Locke at #
Saturday, April 21, 2007
Frederick Emmons Terman (born June 7, 1900 in English, Indiana; died December 19, 1982) was an American academic. He is widely credited (together with William Shockley) with being the father of Silicon Valley....
Terman's father Lewis Terman, the man who popularized the IQ test in America, was also a professor at Stanford.
Lewis Madison Terman (born 15 January 1877 in Johnson County, Indiana, died 21 December 1956 in Palo Alto, California) was a U.S. psychologist, noted as a pioneer in cognitive psychology in the early 20th century at Stanford University. He is best known as the inventor of the Stanford-Binet IQ test. He was a prominent eugenicist and was a member of the Human Betterment Foundation.
...the Human Betterment Foundation, a Pasadena-based eugenics group founded by E.S. Gosney in 1928 which had as part of its agenda the promotion and enforcement of compulsory sterilization laws in California.
Lewis Terman was the father of Frederick Terman, who, as provost of Stanford University, greatly expanded the science, statistics and engineering departments that helped catapult Stanford into the ranks of the world's first-class educational institutions, as well as spurring the growth of Silicon Valley.
The "cold war university" is the academic component of the military-industrial- academic complex, and its archetype, according to Rebecca Lowen, is Stanford University. Her book challenges the conventional wisdom that the post-World War II "multiversity" was created by military patrons on the one hand and academic scientists on the other and points instead to the crucial role played by university administrators in making their universities dependent upon military, foundation, and industrial patronage.
~ Publishers Weekly
“That every feeble-minded woman is a potential prostitute would hardly be disputed by any one. Moral judgment, like business judgment, social judgment, or any other kind of higher thought process, is a function of intelligence.”
me, I'm waiting so patiently
lying on the floor
I'm just trying to do this jigsaw puzzle
before it rains anymore
~ stones
The following MIT Press book description for Strategic Computing: DARPA and the Quest for Machine Intelligence, 1983-1993 is in support of something larger that's in the works and on the way. If you've been following Mystic B and are like WTF? right now, don't worry. It took me over 20 years to get it myself.
This is the story of an extraordinary effort by the U.S. Department of Defense to hasten the advent of "machines that think." From 1983 to 1993, the Defense Advanced Research Projects Agency (DARPA) spent an extra $1 billion on computer research aimed at achieving artificial intelligence. The Strategic Computing Initiative (SCI) was conceived as an integrated plan to promote computer chip design and manufacture, computer architecture, and artificial intelligence software. What distinguished SCI from other large-scale technology programs was that it self-consciously set out to advance an entire research front. The SCI succeeded in fostering significant technological successes, even though it never achieved machine intelligence. The goal provided a powerful organizing principle for a suite of related research programs, but it did not solve the problem of coordinating these programs. In retrospect, it is hard to see how it could have. In Strategic Computing, Alex Roland and Philip Shiman uncover the roles played in the SCI by technology, individuals, and social and political forces. They explore DARPA culture, especially the information processing culture within the agency, and they evaluate the SCI’s accomplishments and set them in the context of overall computer development during this period. Their book is an important contribution to our understanding of the complex sources of contemporary computing.
posted by Christopher Locke at #
Tuesday, April 10, 2007
the harmonicas play the skeleton keys and the rain
~ dylan
So I was just sitting here foolin around with my way-cool IrisPen OCR scanner and thinking oh man, I really gotta capture that thing I just read in the bathroom. You know how it is some days. So here's a snatch (can I say that?) of the wit and wisdom of Barbara Marx Hubbard as delivered in Conscious Evolution: Awakening Our Social Potential (p. 43):
In 1987 the Harmonic Convergence sent millions to resonate with sacred places on Earth. At dawn people went to mountains, to groves, to parks, to pyramids, and to temples to empathize with the living Earth. Empathy for one another and for nature charged the global mind with love. During the event I was in Boulder, Colorado, with Jose Arguelles, the originator of the Harmonic Convergence. He said to me, "Barbara, the world will never be the same again."
Wow. And it wasn't. Not that the difference had anything to do with either of these people's delusional fantasies.
Unfortunately, I knew Jose. I did mushrooms with him at the end of 1981. Fortunately, he did a lot more than I did, so I was able to escape without more serious psychic injury. I am still healing. Since then, he has morphed himself into Valum Votan, Messenger of the Law of Time -- best I can figure it, a sort of Mayan Dr. Who. Click image below for (a lot) more, if you can handle it.
"Native American teacher and mystic, Jose Arguelles, delivered a talk at the recent June 10th Harmony Festival in Santa Rosa California. Jose warns of severe earth changes slated to occur by 2012, with the collapse of US society."
"Slated to occur" -- you like that? Yeah, me too. Damn! Rhetoric: don't leave home w/o it.
posted by Christopher Locke at #
Thursday, April 05, 2007