mindblind in meatspace
Pursuant to my last post ("Termanal Velocity"), here's a quote from Creating the Cold War University: The Transformation of Stanford, from the chapter titled "Building Steeples of Excellence" (p. 158):

Financial concerns were partly responsible for [Frederick] Terman's attention to output and external support. A desire to see Stanford produce large numbers of Ph.D.s and to build steeples in particular fields of national importance also reflected his commitment to increasing Stanford's influence and prestige. To these concerns Terman brought a particular interest in efficiency and a faith in statistical measures of merit, both of which would have pleased his father, who had dedicated his professional life to the dubious goal of quantifying intelligence and had regarded the university as a business enterprise, albeit an inexcusably inefficient one.

Free associating off the above passage, I immediately think of three other books. Occupational hazard.
Herb Simon is another story -- one that will link us back to where this started: with Fred Terman at Stanford. One of the signal events of the cold war, and a major driver of government funding for Big Science, was the Soviet Union's successful launch of Sputnik in 1957. As a direct result, the Advanced Research Projects Agency (ARPA) was created a year later under the US Department of Defense. Over the following decades, DARPA, as the agency came to be known, expanded its funding to include what was, by the early '60s, coming to be called "cognitive science" -- a new field which owed much to Herbert Simon. I would occasionally run into the guy in the halls of the Robotics Institute at Carnegie Mellon University. But cutting to the chase, this graphic from DARPA's 50th-anniversary Strategic Plan (PDF, March 2007) will, at a glance, give you the basic idea of what developed over the ensuing five decades. God Bless America: your PAL (that's DARPA's "Personalized Assistant that Learns" program).
Howard Gardner documented this paradigm shift -- and it actually was one, in this case, in the true Kuhnian sense -- in The Mind's New Science: A History of the Cognitive Revolution. In a section titled "De-Emphasis on Affect, Context, Culture, and History," he writes about some of the pushback (now largely forgotten) that met the new field (pp. 41-42)...

Though mainstream cognitive scientists do not necessarily bear any animus against the affective realm [i.e., emotion], against the context that surrounds any action or thought, or against historical or cultural analyses, in practice they attempt to factor out these elements to the maximum extent possible.... And so, at least provisionally, most cognitive scientists attempt to so define and investigate problems that an adequate account can be given without resorting to these murky concepts.

To get a sense of this transition, Google "cognitive science" on the DARPA site. I brought up this paper: An Integrated Self-Aware Cognitive Architecture (PDF), dated September 27, 2006.

The main feature of our architecture is the notion of self-aware cognition that we believe is necessary for human-like cognitive growth. Our approach is inspired by studies of the human brain-mind: in particular, by theoretical models of representations of agency in the higher associative human brain areas. This feature (a theory of mind including representations of one's self) allows the system to maintain human-like attention, focus on the most relevant features and aspects of a situation, and come up with ideas and initiatives that may not follow from formal logic. The result is a robust cognitive system capable of significant cognitive growth.

Note the present-tense use of "is" in that last sentence. This rhetorical trope is commonly known as "wishful thinking," and it's been relentlessly deployed for most of DARPA's 50-year history. As it turns out, your PAL is a wholly imaginary friend. So much for getting rid of "murky concepts."

In the references at the end of that paper, I was interested -- fascinated is a better word -- to find this: Mindblindness: An Essay on Autism and Theory of Mind by Simon Baron-Cohen (MIT Press, 1995). Amazon reader Hubert Cross, self-identifying as autistic, includes the following in his review:

As a kid, I didn't see people like objects, but I didn't quite see them as people either. They were there, but they were not very important. That is as far as I can go explaining how it was for me. The only thing I can add is that I am not giving you anything more than a faint idea of how it really was.

A Newsweek review of Mindblindness ("Blind to other minds," Geoffrey Cowley, 14 August 1995) contains statements such as: "Scientists sometimes think of the mind as an all-purpose learning machine, equally receptive to any kind of information." And: "...by the time they turn 4, [children] display a 'theory-of-mind mechanism,' an awareness that other people are capable of different mental actions, such as pretending or believing or misunderstanding."

Machines and mechanisms: it has become quite common to encounter such "digital analogs" of mental abilities and dynamics. The ease with which such assumptions are made and accepted is itself a product of the so-called cognitive revolution -- and the cold war Big Science that fostered it. In The Closed World: Computers and the Politics of Discourse in Cold War America (MIT Press, 1996), Paul Edwards writes (p. 2)...
...computers inspired new psychological theories built around concepts of "information processing." Cybernetics, AI, and cognitive psychology relied crucially upon computers as metaphors and models for minds conceived as problem-solving, self-controlling, symbol-processing systems.

The following is cognitive scientist Jerome Bruner as quoted in Clifford Geertz's Available Light: Anthropological Reflections on Philosophical Topics (p. 189). The first part is Geertz, the second Bruner...

After a while, Bruner himself became disenchanted with the Cognitive Revolution, or at least with what it had become. "That revolution," he wrote at the beginning of his 1990 Acts of Meaning, a "goodbye to all that" proclamation of a new direction,

was intended to bring "mind" back into the human sciences after a long and cold winter of objectivism... [But it] has now been diverted into issues that are marginal to the impulse that brought it into being. Instead, it has been technicalized in a manner that undermines that original impulse. This is not to say that it has failed: far from it, for cognitive science must surely be among the leading growth shares on the academic bourse. It may rather be that it has become diverted by success, a success whose technological virtuosity has cost dear. Some critics... even argue that the new cognitive science, the child of the revolution, has gained its technical success at the price of dehumanizing the very concept of mind it had sought to reestablish in psychology, and that it has thereby estranged itself from the other human sciences and the humanities.

Here's a question to ponder at the end (for now) of these loosely connected and intertwining threads. Is it possible that "autism" is, if not an effect of, then at least a metaphorical mirror held up to, the "antiseptic cognitive science" that has increasingly come to define human beings as mindblinded skinbags? In the case of the computational model of mind, minus even the skin.

If you want to really spook yourself on this general theme, go read The Geek Syndrome by Steve Silberman (Wired, December 2001). The subtitle slug reads: "Autism - and its milder cousin Asperger's syndrome - is surging among the children of Silicon Valley."

As ever, to be continued...