
Transcript of emergence_1.17.02.doc.doc

Page 1: emergence_1.17.02.doc.doc

Towards an Unknown State: Interaction, Evolution, and Emergence in Recent Art

Breeding an “Evolutionary”:

by Dan Collins

…we can no longer accept causal explanations. We must examine phenomena as products of a game of chance, of a play of coincidences…

--Vilém Flusser, from “Next Love in the Electronic Age,” 1991

Learning is not a process of accumulation of representations of the environment; it is a continuous process of transformation of behavior…

--Humberto Maturana, 1980 (quoted at http://pangaro.com/published/cyber-macmillan.html)

Art is not the most precious manifestation of life. Art has not the celestial and universal value that people like to attribute to it. Life is far more interesting.

--Tristan Tzara, “Lecture on Dada” (1922)

INTRODUCTION

In August 2000, researchers at Brandeis University made headlines when they announced the development of a computerized system that would automatically generate a set of tiny robots—very nearly without human intervention. “Robots Beget More Robots?,” asked the New York Times on its front page. Dubbed the Golem project (Genetically Organized Lifelike Electro Mechanics) by its creators, this was the first time that robots had been robotically designed and robotically fabricated. While machines making machines is interesting in and of itself, the project went one step further: the robot offspring were “bred” for particular tasks. Computer scientists Jordan Pollack and his colleague Hod Lipson had developed a set of artificial life algorithms—instruction sets—that allowed them to “evolve” a collection of “physical locomoting machines” capable of goal-oriented behavior. (footnote the rest of the story)

The Golem project is just one example of a whole new category of computer-based creative research that seeks to mimic--somewhat ironically, given its dependence on machines--the evolutionary processes normally associated with the natural world. The new research is characterized by high levels of interactivity; dynamic change over time, including evolution, mutation, and emergent behaviors; and a more fluid and active involvement on the part of the user.

The research suggests a new and evolving role for artists and designers working in computer aided design process and interactive media, as well as an expanded definition of the user/audience. Instead of exerting total control over the process and product where the artist has a “direct” relationship to every aspect of the creative process, the task of the artist/scientist becomes one of interacting effectively with machine-based systems in ways that supercharge the investigative process. Other individuals—research collaborators and audience members alike--are called upon to “interact” with the work and further the dialogue.

How can we begin to understand the wealth of processes and the shift in philosophical perspective that such an approach to artmaking represents? The unpredictable nature of the outcomes provides an ideational basis for art making that is less deterministic, less bound to inherited style and method, less totalizing in its aesthetic vision (footnote to historical precedents), and, perhaps, less about the ego of the individual artist. In addition to the mastery of materials and the harnessing of imagination that we expect of the professional artist, our new breed of artist—call her an “evolutionary”--is equally adept at developing new algorithms, envisioning useful and beautiful interfaces, and managing/collaborating with machines exhibiting non-deterministic and emergent behaviors. Like a horticulturalist who optimizes growing conditions for particular species but is alert to the potential beauty of mutations in evolutionary strains, the evolutionary works to prepare and optimize the conditions for conceptually engaging and aesthetic outcomes. To do this, this new breed of artist must have a fuller understanding of interactivity, a healthy appreciation of evolutionary theory, and a gift for setting emergent behavior into motion.

We begin this admittedly utopian vision of the artist by trying to understand and extend what we mean by the concept of “interaction.”

-------------Most computer-based experiences claiming “interactivity” are a sham. Ask any twelve-year-old who has exhausted the “choices” in a state-of-the-art “interactive” computer game. The choices offered are not significant choices. Most games, even games of great complexity, are finite and depend upon a user accessing predefined routines stored in computer memory.


The “game” is not limited to the arcade, of course: the links and rollovers that clog the margins of our computer screens—and increasingly our television sets (footnote TV Guide Interactive)—add to the illusion of infinite possibility. Ironically, as we become acculturated to a “full menu of choices,” the options become less distinct, less meaningful. Try it. You can “Build Your Lexus” on the Toyota website—just click on any of the Lexus 2002 product line and “interactively” select exterior color, interior fabrics, and accessories.1 Never mind that nearly identical options can be found at Saturn, Ford, GM, and Daimler-Benz.2

The promise of “interactive TV” has also been receiving a lot of attention of late—particularly in Europe where a full range of interactive programming has been available for several years.

http://www.itvt.com/sky.html (see text in “notes”)

While interactive TV represents an impressive wedding of the broadcast model (one-to-many, with passive viewing) with the experience of the Internet (one-to-many, many-to-many, and many-to-one, with active participation), we are still a long way from high-level interactive experiences, such as generating new content on the fly in collaboration with another person or a machine. (see notes for diagrams of network topology)

Stephen Wilson writes in “Information Arts” (p. 344):

The inclusion of choice structures does not automatically indicate a new respect for the user’s autonomy or intelligence, or call out significant psychic participation.


In fact, some analysts suggest that much interactive media is really a cynical manipulation of the user, who is seduced by a semblance of choice.3

Wilson argues further that the “nature of the interactive structure is critical” and requires a “deeper involvement by viewers.”

But what is an “interactive structure”? Is there a consensus on what constitutes interactivity? Is there a definition, a set of rules, or a handbook for designing effective interactive experiences? What distinguishes systems that provide a sense of user autonomy and control? Can the new sciences of “computational evolution” and “emergence” help us to transform computer-based systems from simply attractive data management and navigation tools into collaborative partners for creation, research, and learning?

Educational technologist Ellen Wagner defines interaction as "… reciprocal events that require at least two objects and two actions. Interactions occur when these objects and events mutually influence one another." (Wagner, 1994).

High levels of “interactivity” are achieved in human/machine couplings that enable reciprocal and mutually transforming activity. Interactivity—particularly the type that harnesses emergent forms of behavior—requires that both the user and the machine be engaged in open-ended cycles of productive feedback and exchange. Beyond simply providing an on/off switch or a menu of options leading to “canned” content, an ideal system lets users interact with it in ways that produce new information.

While the demand for “interactivity” is a relatively recent phenomenon in the arts, the culture at large has long been obsessed with the idea of machines that learn—and can in turn interact with a user. From media spectacles such as IBM Deep Blue’s defeat of World Chess Champion Garry Kasparov in May of 1997 to quieter revolutions in teaching autistic children (see NY Times article), computers that master the behaviors of their users are beginning to find a place in contemporary society.

There is more than a hint of narcissism in our desire to be personally reflected in the machines we make. Our desire for “interaction” can be understood as a kind of “Pygmalion complex”(*) in which the world is animated according to our own designs and desires. This has both negative and positive aspects—negative in the sense of a self-absorption in which we see ourselves reflected in the world around us; positive in the sense that we have within us the energy to transform and give life to inanimate material through our powers of invention. In any event, there is a trend away from “dumb” tools and toward “intelligent” machines that respond and learn by interacting with their owner/operators. While our cooking appliances and VCRs are already “programmable” to reflect individual tastes, the idea of agents and wizards that know our “personal tastes and preferences” represents a rapidly growing trend. (See “Interface Culture,” pp. 189-190.) (In contrast to the “push” of direct marketing and junk mail, we need the “pull” of just-in-time delivery of information and goods on an “as needed” basis.)

Few art schools provide courses for producing, let alone interpreting or critiquing, “interactive” or “emergent” artworks. Though the borderline between the fine arts and other cultural practices (such as science, technology, and entertainment) is becoming increasingly blurred, it is clear that the development of “interactive art” is largely dependent on “non-art” traditions. Interactive and emergent art practices, at least from a technical perspective, have more in common with computer gaming, combat simulation, and medical diagnostics than mainstream art history or criticism. Theorizing this territory is less a matter of mining, say, the Art Index, and more a matter of conducting systematic research into areas such as communications theory, human-computer interaction, educational technology, and cognitive science. With this in mind, it may be helpful to briefly review how other disciplines are looking at the issues surrounding interaction.

Historical and Theoretical Context

A brief look at the history of communication theory shows an evolution from "one-way" systems of communication to "multi-directional" systems. C. E. Shannon, the founder of “information theory,” developed a mathematical theory of communication in the 1940s (Shannon, 1948) that revolutionized the way we think about information transfer. He introduced the term “bit”—short for “binary digit”—as a fundamental particle, an irreducible unit of measure that could be used to represent virtually any kind of information, be it smoke signals, music videos, or satellite images. Initially, Shannon posited a highly linear engineering model of information transfer involving the one-way transmission of information from a source to a destination using a transmitter, a signal, and a receiver. Later theorists built upon Shannon's model to include the concepts of interactivity and feedback.
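Shannon's unit can be made concrete in a few lines of code. The sketch below is my own illustration, not drawn from the essay's sources: it computes the entropy of a message, the average number of bits per symbol, from Shannon's formula H = -sum(p * log2(p)) over the symbol probabilities p.

```python
import math
from collections import Counter

def entropy_bits(message):
    """Average information per symbol, in Shannon's bits:
    H = sum over symbols of -p * log2(p)."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(entropy_bits("aaaa"))  # 0.0 -- one symbol, no surprise, no information
print(entropy_bits("abab"))  # 1.0 -- like a fair coin, one bit per symbol
print(entropy_bits("abcd"))  # 2.0 -- four equally likely symbols, two bits
```

The same formula works whether the symbols stand for smoke signals, pixels, or musical notes; that indifference to content is exactly what made the bit a universal unit of measure.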

The feedback loop is perhaps the simplest representation of the relationships between input and output elements in a system. One element or agent (the 'regulator' or control) sends information into the system, other agents act based upon their reception/perception of this information, and the results of these actions go back to the first agent. It then modifies its subsequent information output based on this response, to promote more of this action (positive feedback), or less or different action (negative feedback). System components (agents or subsystems) are usually both regulators and regulated, and feedback loops are often multiple and intersecting (Clayton, 1996; Batra, 1990). (Morgan, 1999)

Feedback is essential for a system to maintain itself over the course of time. Negative feedback leads to adaptive, or goal-seeking, behavior such as sustaining the same level, temperature, concentration, speed, or direction in a given system. In some cases the goal is self-determined and is preserved in the face of evolution: the system has produced its own purpose (to maintain, for example, the composition of the air or the oceans in the ecosystem or the concentration of glucose in the blood). In other cases humankind has determined the goals of the machines (automata and servomechanisms). In a negative loop every variation toward the positive triggers a correction toward the negative, and vice versa. There is tight control; the system oscillates around an ideal equilibrium that it never attains. A thermostat or a water tank equipped with a float are simple examples of regulation by negative feedback.

http://pespmc1.vub.ac.be/FEEDBACK.html
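The thermostat example lends itself to a short simulation. The sketch below is a hypothetical illustration (the target temperature, gain, and heat-leak values are arbitrary choices of mine): on each cycle the deviation from the goal triggers a correction in the opposite direction, and the room settles near, but never exactly at, the goal temperature.

```python
def thermostat_step(temp, target=20.0, gain=0.5, leak=0.1):
    """One cycle of negative feedback: the correction opposes the error."""
    error = target - temp                 # positive when the room is too cold
    heating = gain * error                # corrective action
    heat_loss = leak * (temp - 10.0)      # the room leaks heat toward 10 C
    return temp + heating - heat_loss

temp = 15.0
for _ in range(50):
    temp = thermostat_step(temp)
print(round(temp, 2))  # settles at 18.33, close to but short of the 20 C goal
```

Because the heat leak constantly perturbs the system, the loop converges to an equilibrium slightly below the set point, a small-scale version of the "ideal equilibrium that it never attains."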

While the concept of feedback provides a concise description of how inputs and outputs interact in a closed system, the science of cybernetics uses the circularity of feedback mechanisms as a key to understanding organization, communication, and control in systems of all kinds.

The term “cybernetics” was coined in 1947 by the mathematician Norbert Wiener (see http://www.well.com/user/mmcadams/wiener.html), who used it to name a discipline apart from, but touching upon, such established disciplines as electrical engineering, mathematics, biology, neurophysiology, anthropology, and psychology. Wiener and his colleagues, Arturo Rosenblueth and Julian Bigelow, needed a new word to refer to their new concept; they adapted a Greek word meaning "steersman" to invoke the rich interaction of goals, predictions, actions, feedback and response in systems of all kinds (the term "governor" derives from the same root) [Wiener 1948]. Early applications in the control of physical systems (aiming artillery, designing electrical circuits and maneuvering simple robots) clarified the fundamental roles of these concepts in engineering; but the relevance to social systems and the softer sciences was also clear from the start. <http://pangaro.com/published/cyber-macmillan.html>

Cybernetics grew out of Shannon's information theory which, as mentioned above, was designed to optimize the transmission of information through communication channels, and the feedback concept used in engineering control systems. As cybernetics has evolved, it has placed increasing emphasis on how observers construct models of the systems with which they interact to maintain, adapt, and self-organize (*). Such circularity or self-reference makes it possible to make precise, scientific models of purposeful activity, that is, behavior that is oriented towards a goal or preferred condition. In that sense, cybernetics proposes a revolution with respect to the linear, mechanistic models of traditional Newtonian science. In classical science, every process is determined solely by its cause, that is, a factor residing in the past. While classical science is based in understanding cause/effect relationships, cybernetic science seeks to understand the behavior of living organisms in some future, unknown state--a state of being that does not as yet exist and, therefore, cannot be said to have a relationship to a definable “cause.”

Cybernetics has discovered that teleonomy (or finality) and causality can be reconciled by using non-linear, circular mechanisms, where the cause equals the effect. The simplest example of such a circular mechanism is feedback. The simplest application of negative feedback for self-maintenance is homeostasis. The non-linear interaction between the homeostatic or goal-directed system and its environment results in a relation of control of the system over the perturbations coming from the environment.

http://pespmc1.vub.ac.be/CYBERN.html

But in this carefully balanced, homeostatic condition, what accounts for change—and for the ability to adapt to change? As it turns out, the degree to which a system can both introduce new information and respond to changes in its environment depends on the degree to which it can accommodate new information and process it in an “unbiased” way. Unbiased systems allow us to reverse the old adage, “if the only tool you have is a hammer, you tend to treat everything as if it were a nail,” rewriting it as, “if the tools you have are adaptable, you tend to treat everything on its own terms.” (Reflect upon and refine this…)

INSERT MORE INFO ON COMPUTATIONAL EVOLUTION AND EMERGENCE

In their book Swarm Intelligence (2001), computer scientists James Kennedy and Russell Eberhart explain some of the special characteristics of adaptive systems, with emphasis on the power of “random” (read “unbiased”) numbers:

If we consider an adaptive system as one that is adjusting itself in response to feedback, then the question is in finding the appropriate adjustment to make; this can be very difficult in a complex system…because of both external demands and the need to maintain internal consistency. In adaptive computer programs, randomness usually serves one of two functions. First, it is often simply an expression of uncertainty. Maybe we don’t know where to start searching for a number, or where to go next, but we have to go somewhere—a random direction is as good as any…The second important function of random numbers is, interestingly, to introduce creativity or innovation. Just as artists and innovators are often the eccentrics of a society, sometimes we need to introduce some randomness just to try something new, in hopes of improving our position. And lots of times it works.4
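Kennedy and Eberhart's two roles for randomness can be sketched in code. The toy search below is my own illustration (the landscape and all parameters are invented): it mostly takes small random steps, because it does not know which direction is best, but occasionally makes a large random jump, the "creative" move that can carry it out of a local optimum.

```python
import random

def noisy_search(f, x, steps=2000, step_size=0.1, jump_prob=0.05):
    """Maximize f by trial and error, keeping only improvements."""
    best_x, best_val = x, f(x)
    for _ in range(steps):
        if random.random() < jump_prob:
            # innovation: try something entirely new
            candidate = random.uniform(-10.0, 10.0)
        else:
            # uncertainty: a small random step is as good as any
            candidate = best_x + random.uniform(-step_size, step_size)
        value = f(candidate)
        if value > best_val:
            best_x, best_val = candidate, value
    return best_x, best_val

random.seed(1)
bumpy = lambda x: -x * x + 2.0 * abs(x % 1 - 0.5)  # many local peaks, best near 0
x, val = noisy_search(bumpy, x=8.0)
print(round(x, 2), round(val, 2))
```

Without the jumps, the search would stall on the first local peak it reached; with them, it usually wanders into the neighborhood of the global one. "And lots of times it works."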

------------

random: in genetic algorithms and simulated evolution in a computer, one needs a plethora of "random" choices: for choosing mates probabilistically based on their fitness scores, for selecting sites along the genetic code for sexual crossover, etc. How does one introduce such randomness into that bastion of determinism, the digital computer? Most commonly through the use of a pseudorandom number generator (above). But Rupert Sheldrake has suggested the use of genuine realtime random noise generators in such simulations to test an intriguing though highly controversial theory of causation called morphic resonance. Generating evolutionary art with such a realtime random number generator may be more satisfying to an artist because no work could ever be repeated, unlike restarting a run with a preserved seed from a pseudorandom number generator.

http://www.azstarnet.com/~srooke/glossary.html#meme
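The glossary's two uses of chance (picking mates in proportion to fitness, and picking crossover sites along the genome) can be sketched in a toy genetic algorithm. The example below is my own minimal illustration, not code from the Golem project: genomes are 8-bit strings and fitness is simply the count of 1s. Note how the preserved seed makes the pseudorandom run exactly repeatable, the very property a genuine real-time noise source would forgo.

```python
import random

def select_parent(population, fitnesses):
    """Mates are chosen probabilistically, in proportion to fitness."""
    return random.choices(population, weights=fitnesses, k=1)[0]

def crossover(mom, dad):
    """Single-point sexual crossover at a randomly chosen site."""
    site = random.randrange(1, len(mom))
    return mom[:site] + dad[site:]

def mutate(genome, rate=0.05):
    """Occasionally flip a gene to a random value."""
    return "".join(random.choice("01") if random.random() < rate else g
                   for g in genome)

random.seed(42)  # a preserved seed: restarting replays the identical "evolution"
population = ["".join(random.choice("01") for _ in range(8)) for _ in range(20)]
for generation in range(40):
    fitnesses = [g.count("1") + 1 for g in population]  # +1 keeps weights nonzero
    population = [mutate(crossover(select_parent(population, fitnesses),
                                   select_parent(population, fitnesses)))
                  for _ in range(20)]
best = max(population, key=lambda g: g.count("1"))
print(best)
```

After a few dozen generations the fittest genome is typically all (or nearly all) 1s; swapping the seeded generator for true noise would make each run unrepeatable, as the glossary suggests.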

--------

Most natural and living systems are both productive and adaptive. They produce new material (e.g., blood cells, tissue, bone mass) even while adapting to a constantly changing environment. While most natural and living systems are "productive" in the sense of creating new "information," human-made machines that can respond with anything more than simple binary "yes/no" responses are a relatively recent phenomenon. To paraphrase media artist Jim Campbell, most machines are simply "reactive," not interactive.

If “adaptability,” “mutual influence,” and reciprocity are criteria for true interactivity, then the system should be capable of delivering more than pre-existing data on demand. Sophisticated interactive systems, ideally, should be able to generate “custom” responses to input and queries. In short, the system needs to be smart enough to produce output that is not already part of the system. Interactivity must be more than following predetermined prompts to preprogrammed conclusions, as in a video game.

"Intelligent" machines, being developed with the aid of "neural networks" (footnote) and "artificial intelligence," can interact by learning new behaviors and changing their responses based upon user input and environmental cues. Over time, certain properties begin to "emerge" such as self-replication or patterns of self-organization and control. These so-called "emergent properties" represent the antithesis of the idea that the world is simply a collection of facts waiting for adequate representation. The ideal system is a generative engine that is simultaneously a producer and a product.

"INTERACTIVE" ARTWORKS

Concrete examples from art and technology research can illustrate how different individuals, groups, and communities engage in interactive experiences—from the locally controlled parameters characteristic of the video game and the LAN bash, to large-scale interactions involving distributed collaborative networks over the Internet. Artists and scientists such as Eric Zimmerman (game designer, theorist, and artist); John Klima (artist and webgame designer); Hod Lipson and Jordan B. Pollack (the Golem Project); Pablo Funes (computer scientist and EvoCAD inventor); Eduardo Kac (transgenic art and distributed systems); Sarah Roberts (interactive art); Christa Sommerer and Laurent Mignonneau (interactive systems); Ken Rinaldo (artificial intelligence); Yves Amu Klein (living sculpture); and Jim Campbell (“responsive systems”) are doing pioneering work in an area that could be called “evolutionary art.” What differentiates the work of these artists from more traditional practices? What educational background, perceptual skills, and conceptual orientations are required of the artist—and of the viewer/participant? What systems, groups, or individuals are acknowledged and empowered by these new works?

The capability of exhibiting or tracking “emergent properties” is seen by the author as a future hallmark and essential feature of all interactive systems. With projects that enable these heightened levels of interactivity, we may begin to see the transformation of the discrete and instrumental character of “information” into a broad—and unpredictable—“knowledge base” that honors the contexts and connections essential to global understanding and exchange.

Creating an experience for a participant in an interactive artwork must take into account that interactions are, by definition, not "one-way" propositions. Interaction, as we’ve seen, depends on feedback loops that include not just the messages that preceded them, but also the manner in which those messages were received and reacted to. When a fully interactive level is reached, communication roles are interchangeable, and information flows across and through intersecting fields of experience that are mutually supportive and reciprocal. The degree to which a given interactive system attains such reciprocity could offer a standard by which to critique interactive artwork.

Many artists have developed unique attempts at true interaction, addressing problems of visual display, user control, navigation, and system response. Different works exhibit varying levels of audience participation, different ratios of local to remote interaction, and the presence or absence of emergent behaviors. Moreover, different artistic attempts at interactivity suggest that different approaches to interaction could be used for diverse kinds of learners in a variety of educational settings. Understanding experiments with interaction in an art context may help us to better understand interaction in pedagogical settings.

A Gallery of Interactive Experiments and Artworks

Christa Sommerer and Laurent Mignonneau: Interactive Plant Growing (1993)

Austrian-born Christa Sommerer and French-born Laurent Mignonneau teamed up in 1992, and now work at the ATR Media Integration and Communications Research Laboratories in Kyoto, Japan. In nearly a decade of collaborative work, Sommerer and Mignonneau have built a number of unique virtual ecosystems, many with custom viewer/machine interfaces. Their projects allow audiences to create new plants or creatures and influence their behavior by drawing on touch screens, sending e-mail, moving through an installation space, or by touching real plants wired to a computer.


Artist's rendering of the installation showing the five pedestals with plants and the video screen.

Interactive Plant Growing is an example of one such project. The installation connects actual living plants, which can be touched or approached by human viewers, to virtual plants that are grown in real time in the computer. In a darkened installation space, five different living plants are placed on five wooden columns in front of a high-resolution video projection screen. The plants themselves are the interface. They are connected to a computer that sends video signals from its processor to a high-resolution video data projection system. Because the plants are essentially antennae hard-wired into the system, they are capable of responding to differences in the electrical potential of a viewer's body. Touching the plants or moving your hands around them alters the signals sent through the system. Viewers can influence and control the virtual growth of more than two dozen computer-based plants.


Screen shot of the video projection during one interactive session.

Viewer participation is crucial to the life of the piece. Through their individual and collective involvement with the plants, visitors decide how the interactions unfold and how their interactions are translated to the screen. Viewers can control the size of the virtual plants, rotate them, modify their appearance, change their colors, and control new positions for the same type of plant. Interactions between a viewer's body and the living plants determine how the virtual three-dimensional plants will develop. Five or more people can interact at the same time with the five real plants in the installation space. All events depend exclusively on the interactions between viewers and plants.

The artificial growing of computer-based plants is, according to the artists, an expression of their desire to better understand the transformations and morphogenesis of certain organisms (Sommerer et al, 1998).

What are the implications of such works for education? How can we learn from this artistic experimentation to use technological systems to become better teachers? Educators have long recognized the importance of two-way or multi-directional communication. Nevertheless, many educators perpetuate the mindset of the one-way "broadcast"--a concept that harks back to broadcast media such as radio and echoes the structure of the standard lecture, where the teacher as "source" transmits information to passive "receivers." The notion of a "one-to-many" model that reinforces a traditional hierarchical, top-down approach to teaching is at odds with truly democratic exchange. In Interactive Plant Growing, Sommerer and Mignonneau invert this one-to-many model by providing a system for multiple users to collaborate on the creation of a digital wall projection in real time. The system in effect enables a real-time collaboration that takes many diverse inputs and directs them to a common goal. And this is exactly what good teaching is. This conceptualization of critical pedagogy has been developed in many different situations, but here it is combined with technology that mirrors its structure.

Sommerer and Mignonneau: Verbarium (1999)

In a more recent project the artists have created an interactive "text-to-form" editor available on the Internet. At their Verbarium web site, on-line users are invited to type text messages into a small pop-up window. Each of these messages functions as a genetic code for creating a visual three-dimensional form. An algorithm translates the genetic encoding of text characters (i.e., letters) into design functions. The system provides a steady flow of new images that are not pre-defined but develop in real time through the interaction of the user with the system. Each different message creates a different organic form. Depending on the composition of the text, the forms can be either simple or complex. Taken together, all text images are used to build a collective and complex three-dimensional image. This image is a virtual herbarium, comprised of plant forms based on the text messages of the participants. On-line users not only help to create and develop this virtual herbarium, but also have the option of clicking on any part of the collective image to de-code earlier messages sent by other users.
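Sommerer and Mignonneau's actual algorithm is not given here; the sketch below is a hypothetical illustration of the general "text-to-form" idea. Every name and formula in it is my own invention: each letter of a message nudges a growth parameter, so the same message always regrows the same virtual plant, while different messages yield differently parameterized forms.

```python
def text_to_form(message):
    """Hypothetical text-to-form mapping: letters become growth parameters."""
    codes = [ord(c) for c in message.lower() if c.isalpha()]
    if not codes:
        return None  # an empty message grows nothing
    return {
        "branches":     2 + sum(codes) % 6,               # 2..7 branches
        "branch_angle": 15 + (codes[0] * 7) % 50,         # degrees, 15..64
        "depth":        3 + len(codes) % 4,               # recursion depth
        "leaf_hue":     sum(c * i for i, c in enumerate(codes, 1)) % 360,
    }

print(text_to_form("purple people eater"))
```

A rendering step (not shown) would then draw the branching form recursively from these parameters, in the spirit of an L-system; clicking a plant to "de-code" its message would simply look up the stored text behind each form.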

Screen shot of the Verbarium web page showing the collaborative image created by visitors to the site. The text-to-form algorithm translated "purple people eater" into the image at the upper left. This image was subsequently collaged into the collective "virtual herbarium."

In both the localized computer installations and web-based projects realized by Sommerer and Mignonneau, the interaction between multiple participants operating through a common interface represents a reversal of the topology of information dissemination. The pieces are enabled and realized through the collaboration of many participants remotely connected by a computer network. In an educational setting, this heightened sense of interaction needs to be understood as crucial. Students and instructors alike become capable of both sending and receiving messages. Everyone is a transmitter and a receiver, a publisher and a consumer. In the new information ecology, traditional roles may become reversed--or abandoned. Audience members become active agents in the creation of new artwork. Teachers spend more time facilitating and "receiving" information than lecturing. Students exchange information with their peers and become adept at disseminating knowledge.

Ken Rinaldo: Autopoiesis (2000)


Overview of all fifteen robotic arms of the Autopoiesis installation. Photo credit: Yehia Eweis.

A work by American artist Ken Rinaldo was recently exhibited in Finland as part of "Outoaly, the Alien Intelligence Exhibition 2000," curated by media theorist Erkki Huhtamo. Rinaldo, who has a background in both computer science and art, is pursuing projects influenced by current theories on living systems and artificial life. He is seeking what he calls an "integration of organic and electro-mechanical elements" that point to a "co-evolution between living and evolving technological material."

Rinaldo's contribution to the Finnish exhibition was an installation entitled Autopoiesis, which translates literally as "self-making." The work is a computer-based installation consisting of fifteen robotic sound sculptures that interact with the public and modify their behaviors over time. These behaviors change based on feedback from infrared sensors that detect the presence of participant/viewers in the exhibition, and on the communication between the separate sculptures.

The series of robotic sculptures--mechanical arms suspended from an overhead grid--"talk" with each other (exchange audio messages) through a computer network and audible telephone tones. The interactivity engages the participants, who in turn affect the system's evolution and emergence. This interaction, according to the artist, creates a

Page 14: emergence_1.17.02.doc.doc

system-wide evolution as well as an overall group sculptural aesthetic. The project presents an interactive environment that is immersive, detailed, and able to evolve in real time by utilizing feedback and interaction from audience members.

What are the pedagogical implications of systems such as Autopoiesis that exhibit "emergent properties"? Participant/learners interacting with such systems are challenged to understand that cognition is less a matter of absorbing ready-made "truths" and more a matter of finding meaning through iterative cycles of inquiry and interaction. Ironically, this may be what good teaching has always done. So would we be justified in building a "machine for learning" that does essentially the same thing that good teachers do? One argument is that by designing such systems we are forced to look critically at the current manner in which information is generated, shared, and evaluated. Further, important questions surface, such as "who can participate?"; "who has access to the information?"; and "what kinds of interactions are enabled?" The traditional "machine for learning" (the classroom), with its single privileged source of authority (the teacher), is hardly a perfect model. Most of the time, it is not a system that rewards boundary breaking, the broad sharing of information, or the generation of new ideas. It IS a system that, in general, reinforces the status quo. Intelligent machines such as Rinaldo's Autopoiesis can help us to draw connections between multiple forms of inquiry, enable new kinds of interactions between disparate users, and increase a sense of personal agency and self-worth. While intelligent machines will surely be no smarter than their programmers, the pedagogical models they embody can be more easily shared and replicated. Curricula (programs for interactions) can be modified or expanded to meet the special demands of particular disciplines or contexts. Most importantly, users are free to interact through the system in ways that are suited to particular learning styles, personal choices, or physical needs.

IMPLICATIONS FOR ART AND EDUCATION

Interactive artworks of the future will enable interactions that are at once personal and universal. These interactions will be characterized by a subtle reciprocity between the body and the natural environment, and an expanded potential for self-knowledge and learning. Truly interactive experiences are already empowering individuals (consider the "disabled" community or autistic learners, for example).

Returning to various theories of interaction (particularly those of Ellen Wagner), several recommendations for artists emerge that begin to trace a trajectory for the education of the interactive artist. They include training on and empowerment with various technologies; understanding media-structured feedback loops (1) and methods for enhancing "mutual reciprocity"; rethinking where meaning is constituted (cognitive theory now suggests that "meaning" is something that happens between rather than inside individuals); and redefining the roles of educators and learners. Rapid evolution in the art profession as a whole is creating changes in the definitions and roles played by art teachers and prospective artists.

There is no question that the uses of technology outlined here need to be held against the darker realities of life in a high-tech society. The insidious nature of surveillance and control, the assault on personal space and privacy, the commodification of aesthetic experience, and the ever-widening "digital divide" between the technological haves and have-nots are constant reminders that technology is a double-edged sword.

But there is at least an equal chance that a clearer understanding of the concepts of interaction, evolution, and emergence, enabled by technology, will yield a broader palette of choices from which human beings can come together to create meaning. In watching these processes unfold, educators will surely find new models for learning.

CONCLUSION

Because we are modeling processes and behaviors that more closely approximate the complexity of "real life," we put ourselves in a position of appreciation rather than continuing the misguided hubris of simple domination and control. Interacting in collaboration with our environment and seeking out unexpected outcomes through systems of emergence provide new models for life on a tightly packed but incredibly diverse planet.

Robert Axelrod, author of The Evolution of Cooperation writes,

There is a lesson in the fact that simple reciprocity succeeds without doing better than anyone with whom it interacts. It succeeds by eliciting cooperation from others, not by defeating them. We are used to thinking about competitions in which there is only one winner, competitions such as football or chess. But the world is rarely like that. In a vast range of situations, mutual cooperation can be better for both sides than mutual defection. The key to doing well lies not in overcoming others, but in eliciting their cooperation.

<http://www.globalcommunity.org/breakthrough/book/chapters/axelrod.html>

_____________

Notes

(1) "The feedback loop is perhaps the simplest representation of the relationships between elements in a system, and these relationships are the way in which the system changes. One element or agent (the 'regulator' or control) sends information into the system, other agents act based upon their reception/perception of this information, and the results of these actions go back to the first agent. It then modifies its subsequent information output based on this response, to promote more of this action (positive feedback), or less or different action (negative feedback). System components (agents or subsystems) are usually both regulators and regulated, and feedback loops are often multiple and intersecting (Clayton, 1996, Batra, 1990)." (Morgan, 1999)
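The regulator-and-agent cycle Morgan describes can be sketched in a few lines of code. The following is a minimal illustration only; the thermostat scenario, setpoint, and step sizes are my own invention, not drawn from the sources cited:

```python
def thermostat_step(temp, setpoint, heater_on):
    """One cycle of a negative feedback loop.

    The regulator compares the sensed temperature to the setpoint and
    feeds a control action back into the system; the room temperature
    then responds, and the changed reading shapes the next decision.
    """
    heater_on = temp < setpoint          # act on the perceived error
    temp += 0.5 if heater_on else -0.3   # the system responds to the action
    return temp, heater_on

temp, heater_on = 15.0, False
for _ in range(100):
    temp, heater_on = thermostat_step(temp, 20.0, heater_on)

# Negative feedback holds the system near the setpoint instead of
# letting it run away.
assert abs(temp - 20.0) < 1.0
```

Note that the regulated variable (temperature) is also what regulates: the same value is both output and input of the loop, which is the sense in which system components are "both regulators and regulated."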


References

Flax, Carol. (2000). URL: http://www.arts.arizona.edu/cflax/

Hillman, D., Willis, D.J. & Gunawardena, C.N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8(2).

Huhtamo, Erkki (1993). Seeking deeper contact: interactive art as metacommentary. URL: http://www.ccms.mq.edu.au/mus302/seeking_deeper_contact.html

Moore, M. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

Morgan, Katherine Elizabeth (1999). A systems analysis of education for sustainability and technology to support participation and collaboration. Unpublished Master's Thesis at the University of British Columbia. http://www.interchange.ubc.ca/katherim/thesis/index.html

Penny, Simon (1996) Embodied agents, reflexive engineering and culture as a domain. p. 15. (talk given at the Museum of Modern Art in New York City, May 20, 1996) URL: http://adaweb.walkerart.org/context/events/moma/bbs5/transcript/penney01.html

Penny, Simon. (2000). URL: http://www-art.cfa.cmu.edu/Penny/works/traces/Tracescode.html

Rinaldo, Ken (2000).URL: http://ylem.org/artists/krinaldo/emergent1.html

Shannon, C.E. (1948). A mathematical theory of communication, Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October. URL: http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html

Sommerer, Christa and Laurent Mignonneau (1998). Art as a living system. Leonardo, Vol. 32, No. 3., pp. 165-173.

Sommerer, Christa and Laurent Mignonneau (2000). URL: http://www.mic.atr.co.jp/~christa

Wagner, E. D. (1994). In support of a functional definition of interaction. The American Journal of Distance Education, 8(2).

Wilson, Stephen. (1993). The aesthetics and practice of designing interactive computer events. URL: http://userwww.sfsu.edu/~swilson/papers/interactive2.html

Notes

1 This is the automotive equivalent of what in agriculture would be termed a "monoculture"--that is, farming practices dependent upon a single crop. While monocultures are efficient in the short term, we know from experience that they do not represent the diversity characteristic of any healthy ecosystem, and they do not adapt well to changes or unexpected stress in the environment. Edward O. Wilson has described the loss of biodiversity as "...far more complex, and not subject to reversal," a complexity that begs us to "...rescue whole ecosystems, not only individual species." Much of the diversity of rainforests is lost to agriculture or, more properly, to monoculture: the destruction of a diverse ecosystem and its replacement with a single-species system, most often a crop of little local value but with enormous direct and/or indirect profit potential in other regions or countries. By analogy, our consumer-based economy is also a kind of "monoculture" that, in its promotion of narrow stylistic and functional categories, suffers from a lack of real choice. Toyota, GM, Volvo...what's the difference? They all represent the same fossil-fuel-based transportation predicated upon the privately owned automobile powered by an internal combustion engine.

2 http://config.lexus.com/Lexus/Dispatch.jsp

or this pitch from TV Guide Interactive (http://www.tvguideinteractive.tv/default.asp): "Best of all, you are in control, all at the touch of a button…all interactive."

WHAT IS INTERACTIVE TELEVISION?

How It Looks and Feels

Interactive TV is, essentially, video programming that incorporates some style of interactivity--be it data on video, graphics on video, video within video, or retrieving video programming and possibly recording it on a digital hard disk drive for further use. To the viewer, "enhancements" appear as graphical and sometimes purely informational elements overlaying the screen (some technologies, such as HyperVideo, actually incorporate the data enhancements in the video MPEG stream). Often these elements are opaquely colored and cover the broadcast in part, or are transparent or semi-transparent. Specific recurring elements are icons, banners, labels, menus, interface structures, open text fields in which you can insert your email address, forms to fill out in order to buy a product, or commands to retrieve and manage video streams and graphics on a relevant Web page. Interactive or accessible information is, of course, the most important new addition to the television landscape.

http://www.itvt.com/etvwhitepaper-2.html Copyright 2000 by the American Film Institute | Intel Corporation | Tracy Swedlow

Figure 1: Broadcast Network typifies the "one-to-many" topology of broadcast television.

Figure 2: Switched Network typifies the Internet, which allows for "one-to-many," "many-to-many," and "many-to-one" communication.

http://www.cs.ukc.ac.uk/people/staff/asi1/ictp/rtp/s2node2.html

-------

*Self-organization is a process where the organization (constraint, redundancy) of a system spontaneously increases, i.e., without this increase being controlled by the environment or an encompassing or otherwise external system. http://pespmc1.vub.ac.be/SELFORG.html

*Neural networks depend upon parallel processing. Parallel processors are computers that excel at pattern recognition, or inductive thinking. Parallel processors that can handle many instructions at once can implement neural networks (or nets). Neural nets excel at inductive tasks, such as pattern recognition, for which many commercial applications are now being developed.


http://www.funderstanding.com/neural_networks.cfm
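The inductive learning attributed to neural nets above can be illustrated at its smallest scale: a single artificial neuron that learns a classification from examples rather than being explicitly programmed. The AND-gate data and learning rate below are illustrative choices, not drawn from the source:

```python
# A single perceptron learning the logical AND function by adjusting
# its weights in response to its own mistakes.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
b = 0.0
rate = 0.1

def predict(x):
    # Fire (output 1) when the weighted sum of inputs crosses the threshold.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                       # repeated exposure to the examples
    for x, target in data:
        error = target - predict(x)       # feedback from each mistake
        w[0] += rate * error * x[0]
        w[1] += rate * error * x[1]
        b += rate * error

# After training, the neuron classifies all four cases correctly.
assert [predict(x) for x, _ in data] == [0, 0, 0, 1]
```

The point of the sketch is the contrast with rule-based programming: nothing in the code states the AND rule; the behavior is induced from the examples.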

More on the Golem project:

Combining automatic manufacturing techniques with evolutionary computing, the two scientists had found ingenious ways to harness the mathematical principles at the core of "a-life" (artificial life) to a series of computerized mechanical processes. With the exception of humans snapping motors into locations determined by the computer, the collection of machines utilized for the project performed "as a single manufacturing robot" capable of creating other robots. Over the two-plus-year course of the project, multiple computers were harnessed to perform the thousands of iterative calculations necessary to find the best configurations for the required tasks. At one point, over 9000 users had logged into the researchers' website and were acting as beta testers--each running still more variations on the basic kit of parts and behaviors at the core of the project. The software generated different designs and methods of movement, creating traits that worked and failed. Mimicking natural selection's mantra, "survival of the fittest," the most promising designs survived and passed their success to future generations. Finally, hundreds of generations later, three robots were manufactured by a rapid prototyping machine. These eight-inch-long robots had evolved surprising and largely unpredictable kinds of locomotive behaviors that enabled them to achieve their genetically programmed destiny: pulling themselves across a table top or ratcheting their way across a bed of sand.

In all computational evolution one or more “parents” (derived from virtually any media) are mutated and/or crossbred to produce a number of "children", which are then selected again. The more advanced systems allow the researcher to assign a "goodness" or “fitness” factor to each child. The results of this "selection" are then used to produce the next "generation". While there are countless evolutionary dead-ends in any selection process (be it natural…or unnatural), surprisingly robust and creative outcomes are often achieved.
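The parent/child/selection cycle described here can be sketched as a toy program. A bit-string "genome" with an all-ones fitness goal stands in for the Golem project's far richer robot morphologies; every name and parameter below is illustrative, not taken from the project:

```python
import random

random.seed(42)

def fitness(genome):
    # The "goodness" factor: here, simply the number of 1-bits.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Each bit has a small chance of flipping.
    return [1 - g if random.random() < rate else g for g in genome]

def crossover(a, b):
    # Splice the front of one parent onto the back of another.
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(60):
    # Selection: the fittest half become the parents of the next generation.
    population.sort(key=fitness, reverse=True)
    parents = population[:15]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(15)]
    population = parents + children

best = max(population, key=fitness)
# Selection pressure drives fitness well above the random-genome
# expectation of 10 ones out of 20.
assert fitness(best) >= 16
```

Evolutionary dead-ends appear here too: most mutated children are worse than their parents and are discarded at the next sort, exactly as the paragraph above describes.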

Footnote: The Dada Spirit and Chance Operations: Conceptual precedents for the spirit, if not the method, of non-deterministic processes can be found in the work of many earlier artists--Marcel Duchamp, Jean Arp, Jackson Pollock, John Cage, and George Brecht, for example--who embraced systems of artmaking rooted in chance operations and aleatory practices. Favoring "chance over choice," they attempted to bypass conscious decision making and open up new creative territory. Of course, chance operations and other bids for non-predictability have never been about advocating chaos over composition, nor has the use of such stratagems succeeded in completely erasing the ego of the artist. But chance operations have explored the potential of optimizing the conditions for unexpected relationships and of remaining open to that which is patently unpredictable. "The importance of chance to the unconscious has manifold facets, not only in modern psychology, but also (and particularly) in Oriental thought (such as that manifested in the I-Ching or in Zen)." Ref. Brecht, George, "Chance-Imagery," in Aesthetics Contemporary. First published 1966.

Why is the Golem project significant? Humans distinguish themselves largely by the ability to make and use tools. Insert Core technology idea… The idea of creating useful artifacts depends on a culture that embraces tools. But to make a tool that can make a tool--this has always been a special skill. Imagine the technology required to make a simple hand tool like a rasp. The technology required to make the rasp is far more complex than the rasp itself. To use the rasp requires only a basic understanding of how to hold the tool and how to properly move it across a surface to subtract unwanted material. But to make the rasp requires not only an understanding of the desired performance of the tool, but countless other technologies--industrial design, metallurgy, manufacturing, etc.--to build the tools that can manufacture it.

---------------------------------

more stuff at end of intro:

What is Evolutionary Art?

Evolutionary art is a comparatively recent art form, and is generated almost exclusively on computers.

The basic idea behind evolutionary art is that the artist is able to control the development of a piece of work through some form of "selection", in a manner analogous to natural selection. In all evolutionary art one or more parent pictures or virtual sculptures are mutated and/or crossbred to produce a number of "children", which are then selected again. The more advanced systems allow the artist to assign a "goodness" factor to each child.

The results of this "selection" are then used to produce the next "generation". Evolutionary systems allow artists to generate complex computer artwork without needing to delve into the actual programming used.
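In the interactive variant, the artist's eye replaces the programmed fitness function: the system proposes children, and the artist assigns each one a "goodness" score. The sketch below substitutes a scoring function for the human judgment; the color "genomes," target color, and all parameters are invented for the example:

```python
import random

random.seed(7)

def breed(parents, n_children, rate=0.4):
    """Mutate and crossbreed parent "genomes" (RGB color triples here)."""
    children = []
    for _ in range(n_children):
        a, b = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(a, b)]        # crossover
        child = [min(255, max(0, c + random.randint(-15, 15)))     # mutation
                 if random.random() < rate else c
                 for c in child]
        children.append(child)
    return children

def evolve(artist_score, generations=60):
    population = [[random.randint(0, 255) for _ in range(3)] for _ in range(12)]
    for _ in range(generations):
        # "Selection": the four highest-scored children become parents.
        population.sort(key=artist_score, reverse=True)
        population = population[:4] + breed(population[:4], 8)
    return max(population, key=artist_score)

# Stand-in for the artist's eye: prefer colors close to a warm orange.
target = (255, 140, 0)
score = lambda color: -sum(abs(c - t) for c, t in zip(color, target))
best = evolve(score)
assert score(best) > -90   # the population drifts toward the preferred color
```

In a real evolutionary art system `artist_score` would be a person clicking on preferred images each generation, which is precisely how the artist steers the work without writing any code.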

Most, if not all, genetic art systems, and many organic art systems, are "evolutionary."

“A computer programmed to follow the rules of evolution has for the first time designed and manufactured simple robots with minimal help from people,” wrote the journalist.


The basic idea behind this and other demonstrations of “computational evolution” is that the researcher/artist is able to control the development of a piece of work through some form of "selection" in a computer program in a manner analogous to natural selection.

Like “biological lifeforms whose structure and function exploit the behaviors afforded by their own chemical and mechanical medium,” Pollack and Lipson’s “evolved creatures” took advantage of the nature of their own medium--thermoplastic, motors, and artificial neurons.

*Pygmalion complex: http://keirsey.com/pygmalion/mirroroffiction.html

*Claude Shannon: The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.
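Shannon's point that information concerns selection from a set of possible messages is exactly what his entropy measure quantifies. A brief illustration (the example distributions are mine, not Shannon's):

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: the average information per selected message."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A source that can only ever send one message carries no information;
# a source choosing uniformly among 8 messages carries 3 bits per choice.
assert entropy([1.0]) == 0.0
assert abs(entropy([1 / 8] * 8) - 3.0) < 1e-9
```

Information, on this view, is a property of the set of possibilities, not of any single message, which is why the semantics of the message are "irrelevant to the engineering problem."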


References

Shannon, Claude (1948). A mathematical theory of communication.

Wilson, Edward O. (1992). The Diversity of Life. W.W. Norton & Company.

Soderquist, David. Monoculture vs. Multiculture. <http://kids.osd.wednet.edu/Marshall/homepage/monoculture.html>

------

A common example used to explain negative feedback is the standard thermostat.


Feedback can be understood as both a regulating force in a given system as well as a metaphor for productive exchange between a “sender and receiver.”

What does this mean in the context of art?

Creating an experience for a participant in an interactive artwork must take into account that interactions are, by definition, not "one-way" propositions. Interaction depends on feedback loops that include not just the messages that preceded them, but also the manner in which previous messages were reactive. When a fully interactive level is reached, communication roles are interchangeable, and information flows across and through intersecting fields of experience that are mutually supportive and reciprocal. The degree to which a given interactive system attains such reciprocity could offer a standard by which to critique interactive artwork.

The capabilities of communication systems have evolved from "one-way" toward "multi-directional" systems. It is only recently that interactive systems supporting both synchronous and asynchronous exchanges among multiple users have been available. Common examples of synchronous exchange include live satellite uplinks, telephones, and chat rooms on the Internet. E-mail is the prototypical example of an "asynchronous" exchange system.

Other trends supporting the development of interactive systems come from research in artificial intelligence and cognitive science.

Artificial intelligence

Artificial Intelligence (AI) is the study of how computer systems acquire, represent, and use knowledge. Intelligent computer programs (such as systems for medical diagnostics, systems for oil field exploration, or chess programs) can be developed by means of explicit knowledge representation and the use of inference rules. The central hypothesis of AI is that all intelligent behavior can be simulated in computer systems as the explicit manipulation of symbolic structures by programs. AI encompasses a broad range of sub-disciplines that attempt to understand three central theoretical problems: knowledge representation, inference, and learning.


http://www.tilburguniversity.nl/faculties/flw/education/courses/artificial.html
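The "explicit knowledge representation and inference rules" mentioned in this definition can be made concrete with a tiny forward-chaining engine of the kind used in diagnostic expert systems. The facts and rules below are invented for illustration:

```python
# Forward chaining: repeatedly apply if-then rules to the known facts
# until no new conclusion can be derived.
rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_isolation"),
    ({"has_fever"}, "recommend_fluids"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # Fire a rule when all its conditions are known facts.
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = infer({"has_fever", "has_rash"})
# "recommend_isolation" is reached only by chaining two rules together.
assert "recommend_isolation" in derived
```

The knowledge lives in the `rules` data structure, not in the program logic--which is exactly the "explicit knowledge representation" the definition refers to.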

While most natural and living systems are "productive" in the sense of creating new "information," human-made machines that can respond with anything more than simple binary "yes/no" responses are a relatively recent phenomenon. To paraphrase media artist Jim Campbell, most machines are simply "reactive," not interactive.

If "mutual influence" and reciprocity are criteria for true interactivity, then the system should be capable of delivering more than pre-existing data on demand. Interactive systems need to be able to generate "custom" responses to input and queries. In short, the system needs to be smart enough to produce output that is not already part of the system. Interactivity must be more than following predetermined prompts to preprogrammed conclusions, as in a video game.

"Intelligent" machines, being developed with the aid of "neural networks" and "artificial intelligence," can interact by learning new behaviors and changing their responses based upon user input and environmental cues. Over time, certain properties begin to "emerge" such as self-replication or patterns of self-organization and control. These so-called "emergent properties" represent the antithesis of the idea that the world is simply a collection of facts waiting for adequate representation. The ideal system is a generative engine that is simultaneously a producer and a product.
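Emergence of this kind is easiest to demonstrate with a cellular automaton: each cell obeys one purely local rule, yet coherent global behavior appears that no rule mentions. Conway's Game of Life is the standard example; it is offered here as an analogy, not as anything from Rinaldo's system:

```python
from collections import Counter

def life_step(live):
    """One update of Conway's Game of Life on a set of live (x, y) cells."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell lives next step if it has 3 neighbors, or 2 and is already alive.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": none of the rules mention motion, yet the pattern travels
# diagonally -- an emergent property of purely local interactions.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
after_four = glider
for _ in range(4):
    after_four = life_step(after_four)

# After four steps the glider reappears shifted by (1, 1).
assert after_four == {(x + 1, y + 1) for x, y in glider}
```

The glider's motion is the sense in which emergent properties "cannot be predicted from the input": it is a fact about the whole pattern, not about any single cell or rule.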

Evolutionary Art and Design: Harnessing Emergent Behavior

Rules of Emergence:

-Hierarchical organization
-Modularization with interdependencies
-Cyclomatic complexity

http://www.cs.brandeis.edu/~pablo/pres/intro/stance2.html

-----------

Dan Collins
Institute for Studies in the Arts
Arizona State University
Tempe, AZ 85287-3302
USA


telephone: 480-965-0972

[email protected]

return to Dan Collins's bibliography

The Golem Project

-------------

http://www.cs.ucl.ac.uk/staff/P.Bentley/AIEDAM.html

Evolutionary Design

However, despite the success of these methods, many questions remain unanswered. For example: should we continue to use evolutionary computation as generative tools, instead of simply optimizers? How can we convince designers of the fact that an unpredictable, unexplainable, stochastic method is of use to them? Can we use ideas from other fields, including biology, to increase the capabilities of our computational models? What are the best ways to interface evolutionary search with existing analysis tools? Is there a future in using Evolutionary Computation in design, or will its limitations (e.g. being unable to backtrack or 'undo' stages of evolution) ultimately prevent us from tackling unsimplified real-world problems?

Papers should contain original and unpublished material, describing the use of evolutionary techniques such as genetic algorithms, genetic programming, memetic algorithms, evolutionary strategies and evolutionary programming, for design problems. Relevant topics include:

Evolutionary optimization of designs. Evolutionary generative design. Creative evolutionary design. Conceptual evolutionary design. Representations suitable for evolutionary design. The integration of aesthetics or techniques from Artificial Life in evolutionary design. Investigations of key aspects within evolutionary design systems, e.g., creation or interfacing of fitness functions, multiobjective optimization, constraint handling, variable-length chromosomes, epistasis.


EDUCATING FOR INTERACTION

By Dan Collins

A first version of this paper was given at the Performative Sites conference at Penn State in October 2000. A published version appeared in New Art Examiner in February 2001.

What is interaction? How can we begin to make sense of the avalanche of educational toys, computer programs, and artworks that claim to be "interactive"? What would a new pedagogy structured around the rules of interactivity look like? ------ Some theorists have argued that all "art is interactive," effectively shutting the door on the discrete contributions of new digital/computer-based interactivity. Conversely, other theorists such as [clipped off front end and added to new introduction…]

Interactivity and Emergence in the Arts
Training Jellyfish: Finding patterns in the unpredictable
Infinite set of corrections
managing the chaos
managing choice

Necessary Coincidences: Optimizing interactive systems through

Happy Accidents
Accidental Tourist
Accidental Play

by Dan Collins

A philosophy of the new stretch of time comes into being automatically. Not only because the themes change, but prior to everything, because the method of the thinking changes. One of the descriptions of this transition is that we can no longer accept causal explanations. We must examine the phenomena as products of a game of chances, whereby the chances bend statistically for that purpose, to become necessary.

--Vilém Flusser, from Next love in the electronic age, 1991


Moving beyond what is generally understood as constituting “interactive” elements…

[more conventional definitions of “interactivity” developed by theorists in the cognitive sciences and educational technology,]

…this essay discusses interactive art and its potential as a vehicle for exploring so-called “emergent behaviors”

Emergent behavior may be defined as “levels of self organization that cannot be predicted from the input into a given system.”

Starting with such fundamental concepts as reciprocity, pattern recognition, feedback, and locus of control, the author provides a template for understanding and evaluating interaction and emergent behavior in works of art.

Several threads from recent art and science history are brought together in this essay. Buzz words such as "cybernetics" (Wiener), "systems aesthetics" (Burnham), "artificial intelligence," and "complexity and chaos theory" (Santa Fe Institute) are set into their respective historical and research contexts. Each orientation provides a piece of the puzzle that is slowly being revealed under the broader rubric of "emergence."

Concrete examples from art and technology research are used to illustrate how different individuals, groups, and communities engage in interactive experiences--from the locally controlled parameters characteristic of the video game and the LAN bash, to large-scale interactions involving distributed collaborative networks over the Internet. Artists and scientists discussed include Eric Zimmerman (game designer, theorist, and artist); John Klima (artist and webgame designer); Hod Lipson and Jordan B. Pollack (the Golem Project); Pablo Funes (computer scientist and EvoCAD inventor); Eduardo Kac (transgenic art and distributed systems); Sarah Roberts (interactive art); Christa Sommerer and Laurent Mignonneau (interactive systems); Ken Rinaldo (artificial intelligence); Yves Amu Klein (Living Sculpture); and Jim Campbell ("responsive systems"). What differentiates the work of these artists from more traditional practices? What educational background, perceptual skills, and conceptual orientations are required of the artist--and of the viewer/participant? What systems, groups, or individuals are acknowledged and empowered by these new works?

The capability of exhibiting or tracking “emergent properties” is seen by the author as a future hallmark and essential feature of all interactive systems. With projects that enable these heightened levels of interactivity, we may begin to see the transformation of the discrete and instrumental character of “information” into a broad—and unpredictable—“knowledge base” that honors the contexts and connections essential to global understanding and exchange.


---------------------------

Steven Johnson’s recent book (Emergence: The Secret Lives of Ants, Brains, Cities, and Software) will be referenced in terms of the internet and his argument that "knowledge" is produced by the interactions of multiple people working collaboratively online...whether by conscious design or otherwise. Artists who may or may not be moving into this territory include Eduardo Kac, Sarah Roberts, Ken Ringold, Christa Sommerer and Laurent Mignonneau, and Jim Campbell.

Starting with examples of interactive, computer-controlled art operating locally, the author sketches an outline of how large-scale interactions, facilitated by distributed collaborative networks, hold promise for various forms of productive emergent behavior.

Understanding interactivity and emergent behavior requires a familiarity with fundamental concepts such as reciprocity, pattern recognition, feedback, and locus of control. The essay offers definitions of terms and concrete examples of levels of interaction and emergence drawn from both the arts and sciences.

Indices of interactivity are explored including such fundamental concepts as reciprocity, pattern recognition, and feedback.

Reciprocity implies mutual influence. Interactions occur when two or more objects and actions actively modify one another. (see Wagner, 1994)

Feedback

Concepts such as pattern recognition are explored first.

Interactivity and emergence are inextricably linked. Fully interactive systems often lead to emergent behaviors. And emergent behavior is predicated on notions of interaction—either among the elements of a given system, or through interactions with some external agent.

Emergence is often discussed in terms of levels of intelligence. A distinction is made between information simply stored in a system and the active application of information to do purposeful work.


(See my definition of intelligence in KDI)

Of course “Intelligence”—whether the “real” or “artificial” variety—presumes an ability to synthesize unique responses on the fly—not simply to hit a series of presets on demand. An intelligent system needs to be able to generate "custom" responses to input and queries. In short, the system needs to be smart enough to produce output that is not already part of the system. Intelligence and interactivity must be more than following predetermined prompts to preprogrammed conclusions, as in a video game.

I will build on the definition of interactivity that I developed in the New Art Examiner essay from last Spring (interactivity defined as "mutual reciprocity"...which a smart person told me was redundant...and that I should simply characterize it as "reciprocity"). The concept of Feedback will be used--both from a technical point of view and as a metaphor--to get into the territory of systems that foster dynamic change by feeding output back into input. "Emergent properties" (i.e., self-organizing behaviors that cannot be predicted from input into a given system) will be a characteristic of truly interactive systems. A recent book by Steven Johnson that you may know

(Emergence: The Secret Lives of Ants, Brains, Cities, and Software) will be referenced in terms of the internet and his argument that "knowledge" is produced by the interactions of multiple people working collaboratively online...whether by conscious design or otherwise. Artists who may or may not be moving into this territory include Eduardo Kac, Ken Ringold, Christa Sommerer and Laurent Mignonneau, and Jim Campbell. I will be finding appropriate cool examples to make my point.

How do we define “emergence”? In his recent book of the same name, Steven Johnson offers the following:

In the simplest terms, (emergent systems) solve problems by drawing on masses of relatively stupid elements, rather than a single, intelligent “executive branch.” They are bottom-up systems, not top-down. They get their smarts from below. In more technical language, they are complex adaptive systems that display emergent behavior. In these systems, agents residing on one scale start producing behavior that lies one scale above them: ants create colonies; urbanites create neighborhoods; simple pattern-recognition software learns how to recommend new books. The movement from low-level rules to higher-level sophistication is what we call emergence. (p. 18)
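Johnson's "low-level rules to higher-level sophistication" can be made concrete in a few lines of code. The sketch below (my own illustration, not from Johnson) implements Conway's Game of Life: every cell obeys the same local rule, yet a five-cell "glider" emerges that travels diagonally across the grid—behavior visible only one scale above the rule that produces it.

```python
from collections import Counter

def step(live):
    """One generation of Life; `live` is a set of (x, y) cells."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0))
    # A cell lives next step if it has 3 neighbors, or 2 and is alive now.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "glider": after 4 steps the same five cells reappear shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = step(state)
assert state == {(x + 1, y + 1) for (x, y) in glider}
```

No individual cell "knows" about the glider; the motion belongs to the colony of rules, which is precisely the bottom-up smartness Johnson describes.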


Stephen Wilson

http://www.stanford.edu/group/SHR/4-2/text/wilson.html

The section that follows focuses on my own personal art research. The reader should note that there is a growing number of artists addressing AI issues and that this article is not a comprehensive review. For example, Harold Cohen, Peter Beyls, and artists represented in the journal Languages of Design are exploring the possibilities of developing algorithms that enable computers to generate behaviors that would be called creative. Extensive work has been undertaken on automatic composition systems in music. Joseph Bates and his associates are working on a graphic world called OZ in which autonomous objects interact with each other trying to achieve private desires, reacting emotionally to events that occur, and forming simple relationships with other creatures. Naoko Tosa is working on a project called "Neuro Baby" in which a graphic creature responds to feelings it detects in a human's voice and synthesizes appropriate facial expressions. Artistic activity in this area will undoubtedly continue to increase.

What is artificial intelligence?

http://www.tilburguniversity.nl/faculties/flw/education/courses/artificial.html

Artificial intelligence

The aim of this course is to give a broad overview of the state of the art in Artificial Intelligence (AI). AI is the study of how computer systems acquire, represent, and use knowledge. Intelligent computer programs (such as systems for medical diagnostics, systems for oil field exploration, or chess programs) can be developed by means of explicit knowledge representation and the use of inference rules.

The central hypothesis of AI is that all intelligent behaviour can be simulated in computer systems as the explicit manipulation of symbolic structures by programs. This course is an introduction to the three central theoretical disciplines within AI: knowledge representation, inference, and learning.

See also the "Computing Language and Meaning" pages at http://cwis.kub.nl/~fdl/general/people/rmuskens/clm/ for more information about this course.

http://www.playboy.com/arts-entertainment/movies-tv/intheaters/ai/

The most engaging segment of the 140-plus-minute epic is its frenetic second act, when a petrified David becomes quarry for the Flesh Fair, a horrifying Battlebots-like holocaust. To the delight of rabid redneck fans, sentient androids of varying forms are chopped, shredded, minced, diced or melted in this sadistic arena show while Ministry throbs on the mainstage. Mental connections can be made to any number of ghastly historical precedents, but our humane reaction is complicated by a question: no matter how sophisticated the Mechas are, is it any more inhumane to junk one of them than it is to scrap a Lexus? There are relevant digital-age philosophical questions looming over Spielberg's summer blockbuster.

http://music.calarts.edu/~david/dcr_courses/dcr-AIArtEd.html

ARTIFICIAL INTELLIGENCE AND ART EDUCATION

Created for a course on AI ideas for artists given at the San Francisco Art Institute. The text and class outline are presented in their original forms, without significant updates, in order to preserve the creative spirit of the time.

David Rosenboom

1970 & 1981 (slightly revised 1983)

Artificial intelligence belongs to that class of disciplines in which the primary focus is placed on the creation of formal procedures (i.e., complex algorithms), which are inspired by and based on hypothetical models of the functioning of human intelligence. In this it shares something fundamental with artistic disciplines which look to nature, in the broadest sense, for stimulation and guidance.

-------

http://www.usask.ca/art/digital_culture/wiebe/index.html

Harold Cohen’s website; has numerous links.

http://www.heise.de/tp/english/special/vag/6063/2.html

Like the Christian Apocalypse, the arrival of this Artificial Intelligence is perpetually postponed. Yet, it remains a powerful contemporary myth. In science fiction, conscious computers and sassy robots are essential elements of the genre. From Rachel in 'Bladerunner' to Data in 'Star Trek TNG', sci-fi stories use the fantasy of artificial life to express the modernist dilemma: what makes us truly human? Yet, when embraced by mystical positivism, this sci-fi anthropomorphism ironically becomes the repository of some very pre-modern desires. As in traditional religion, the cult of Artificial Intelligence feeds off atavistic fantasies: making babies without sex; being the master of slaves; achieving immortality; and even turning into pure Spirit. With secular utopias discredited, old myths are reborn as sci-fi monsters.

http://www.topsiteslinks.com/ussites/computer/ai.htm

1. About.com
2. Botspot
3. Journal of Artificial Intelligence
4. Artificial Brains
5. CMU Artificial Intelligence
6. Generation5

http://www.artnode.dk/text/text_uk/rinaldo.html

Technology Recapitulates Phylogeny: Artificial Life Art

by Kenneth E. Rinaldo

 

Abstract: This paper discusses the notion of emergence, the result of the collapse of both scientific and artistic barriers which have contributed to the rise of Artificial Life art. A discussion of artists who use biology as model and computers as material will lead into a description of some current Artificial Life art works. The author finishes with conclusions about the work of art in relation to Artificial Life techniques and interactive art in particular.

--------------

http://www.feedmag.com/templates/default.php3?a_id=1524

FEED article on Mark Tilden and solar-powered robot bugs that exhibit “emergent” behaviors.


Consider also the Turing Test…

http://www.homeworkspot.com/know/artintelligence.htm

Artificial intelligence (AI) uses computers to model or imitate human reasoning and learning. This field of science is associated with an area of cybernetics which combines the study of biology and computer science. Scientists may one day use this knowledge to create thinking robots.

What was a distant desire has become a demand: We want to be able to interact with our machines.

While broad interest in the constructs of “interactivity” and “emergence” is relatively recent in the arts, the culture at large has long been obsessed with the idea of machines that learn. The evidence is mounting. From media spectacles such as Deep Blue's defeat of World Chess Champion Garry Kasparov in May of 1997 to quieter revolutions in teaching autistic children, intelligent machines and computer-based systems that master the behaviors of their users are beginning to find a place in the culture.

There is more than a hint of narcissism in our desire to be personally reflected in the machines we make. We don't want simply "dumb" tools; we want smart machines that respond and learn through intuitive interaction. Even our cooking appliances and car radios are "programmable"—albeit crudely—to reflect individual tastes.

In the arts, the revolution in digital media, interactive techniques, and distributed computing has helped to spawn a series of conferences worldwide dedicated to interactive systems, artificial intelligence, emergent behavior, and cybernetics. Siggraph, Ars Electronica, ISEA and many others are staples of the international art and technology crowd. More recently, major exhibitions at the Whitney (BitStreams) and SFMOMA (010101) have provided a kind of “coming out” for digital media in the mainstream art world.


“Intelligence” is of course much more than the ability to hit a series of pre-sets at a specified hour (e.g., the coffee maker kicking in at 6:30 am).

---------

Our consumer culture paints itself as purveying an infinite set of choices. But while the move from the department store to the 500 stations of cable television may seem like choice, “the broadcast,” as Gene Youngblood has characterized it, is only ONE choice.

But what are the advantages of creating systems that enable “mutual influence” between players? Other trends supporting the development of interactive systems come from research in artificial intelligence and cognitive science.

Flusser

Now, however, to the question of whether a philosophy of the computer era is called for. A philosophy of the present develops automatically: not only because the topics change, but above all because the method of thinking changes. One characteristic of the transition is that we can no longer be content with causal explanations. We must regard phenomena as products of a play of coincidences, whereby the coincidences tend, statistically, to become necessary.


Interactivity and Emergence in the Arts

Training Jellyfish: Finding patterns in the unpredictable

Infinite set of corrections

Managing the chaos

Managing choice

INTERACTIVITY and EMERGENCE

Out of Control, Letting Go,

Building Knowledge by Relinquishing Control

Autonomous Design and Intelligent Artworks

ARTifcial Systems, Evolutionary Design

Emergence

Neighbor interaction, pattern recognition, feedback, and indirect control. (p. 22)

Pattern Recognition

Pattern recognition is the research area that studies the operation and design of systems that recognize patterns in data. It encompasses subdisciplines like discriminant analysis, feature extraction, error estimation, cluster analysis (together sometimes called statistical pattern recognition), and grammatical inference and parsing (sometimes called syntactical pattern recognition). Important application areas are image analysis, character recognition, speech analysis, man and machine diagnostics, person identification, and industrial inspection.

Related areas of research:

Artificial Intelligence (expert systems and machine learning)
Neural Networks
Vision
Cognitive Sciences and Biological Perception
Mathematical Statistics (hypothesis testing and parameter estimation)
Nonlinear Optimization
Exploratory Data Analysis

http://www.ph.tn.tudelft.nl/PRInfo/prarea.html
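As a minimal illustration of the statistical pattern recognition described above, the sketch below implements a nearest-centroid classifier, one of the simplest discriminant methods: each class is summarized by the mean of its training examples, and a new sample is assigned to the nearest mean. The data and labels are invented for the example.

```python
def train(samples):
    """samples: list of (feature_vector, label) -> dict of class means."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lab: [s / counts[lab] for s in acc]
            for lab, acc in sums.items()}

def classify(means, vec):
    """Return the label whose class mean is nearest (squared Euclidean)."""
    def dist2(m):
        return sum((a - b) ** 2 for a, b in zip(m, vec))
    return min(means, key=lambda lab: dist2(means[lab]))

# Two invented clusters: "small" points near (0, 0), "large" near (10, 10).
data = [([0, 1], "small"), ([1, 0], "small"),
        ([9, 10], "large"), ([10, 9], "large")]
means = train(data)
assert classify(means, [0.5, 0.5]) == "small"
assert classify(means, [9.5, 9.5]) == "large"
```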

------------------------------------------

Actors and neighbors. Easel's architecture is designed so that it can simulate very large numbers of independent "actors" (analogous to the participants discussed above). Actors are simulated entities of the physical world (e.g., a system administrator, user, intruder, automobile, bird, or the moon), of the electronic world (e.g., a computer, router, or peripheral device), or of the software world (e.g., a software agent or task). Each actor has its own thread of control and this allows for a high degree of parallelism in Easel's execution. Actors can interact directly only with their near neighbors and only in ways prescribed by their neighbor relationships. Neighbor relationships are protocols of interaction and are defined as types that can be associated with any actor. Thus, in a simulation of birds in flight, a bird's near neighbors might be any bird or other object that the bird can see from its current position and heading. In an organizational simulation, an actor's near neighbors might be only those actors that are connected by formal organizational ties. In the latter example, neighbor operations might include sending and receiving messages.

http://www.prosim.org/prosim2000/paper/ProSimEA11.pdf
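The actor/neighbor scheme can be sketched in a few lines (my own toy illustration, not Easel's actual syntax): actors hold a list of near neighbors, and the only prescribed protocol of interaction here is message passing.

```python
class Actor:
    def __init__(self, name):
        self.name = name
        self.neighbors = []   # the only actors this one is allowed to reach
        self.inbox = []

    def send(self, other, msg):
        # Interaction is permitted only along a neighbor relationship.
        if other not in self.neighbors:
            raise ValueError(f"{self.name} cannot reach {other.name}")
        other.inbox.append((self.name, msg))

a, b, c = Actor("a"), Actor("b"), Actor("c")
a.neighbors = [b]          # a may talk to b, but not to c

a.send(b, "hello")
assert b.inbox == [("a", "hello")]

try:
    a.send(c, "hello")     # not a neighbor: the interaction is refused
except ValueError:
    pass
```

The global behavior of such a system arises only from these constrained local exchanges, which is what makes it a natural substrate for simulating emergence.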

-----------------------------------

Page 37: emergence_1.17.02.doc.doc

Neural Networks

Computers are the ideal metaphor for the human mind. Cognitive scientists have long used the serial processor as their model for the brain because this type of computer excels in deductive reasoning.

Researchers are also exploring whether parallel processors can serve as models for how the brain functions. Parallel processors are computers that excel at pattern recognition, or inductive thinking. Parallel processors that can handle many instructions at once are called neural networks (or nets). Neural nets excel at inductive tasks, such as pattern recognition, for which many commercial applications are now being developed.

It's possible these researchers will conclude that the brain is not a linear tool, as originally suggested by the serial model, but that the parallel model of processing information more closely represents how the mind works. Maybe the ultimate model of the human brain would be one that combines both the serial and parallel analogies.

http://www.funderstanding.com/neural_networks.cfm
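The inductive, learn-from-examples style attributed to neural networks above can be shown with the smallest possible case: a single perceptron, sketched below, that learns logical OR from labeled examples rather than from an explicit rule. (A real neural network is many such units operating in parallel; the learning rate and epoch count here are arbitrary.)

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Adjust two weights and a bias from labeled (inputs, target) pairs."""
    w = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0
            err = target - out
            # Nudge each weight toward reducing the error on this example.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            bias += lr * err
    return w, bias

def predict(w, bias, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + bias > 0 else 0

# Labeled examples of logical OR; no rule for OR is ever written down.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(examples)
assert all(predict(w, b, x1, x2) == t for (x1, x2), t in examples)
```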

------------

-----

1.1 Artificial Intelligence and Intelligence

There are two goals to AI. The biggest one is to produce an artificial system that is about as good as or better than a human being at dealing with the real world. The second goal is more modest: simply produce small programs that are more or less as good as human beings at doing small specialized tasks that require intelligence. To many AI researchers, simply doing these tasks that in human beings require intelligence counts as Artificial Intelligence even if the program gets its results by some means that does not show any intelligence; thus much of AI can be regarded as "advanced programming techniques".

http://www.dontveter.com/basisofai/ch1.html

Golem Project

Without reference to specific organic chemistry, life is an autonomous design process that is in control of a complex set of chemical factories allowing the generation and testing of physical entities that exploit the properties of the medium of their own construction. Using a different medium, namely off-the-shelf rapid manufacturing, and evolutionary design in simulation, we have made progress towards replicating this autonomy of design and manufacture. This is the first time any artificial evolution system has been connected to an automatic physical construction system. All together, our evolutionary design system, solidification process, and rapid prototyping machine form a primitive "replicating" robot. While there are many, many further steps before this technology is dangerous, we believe that if indeed artificial systems are to ultimately interact and integrate with reality, they cannot remain virtual; it is crucial that they cross the simulation-reality gap to learn, evolve and affect the physical world directly. Eventually, the evolutionary process must accept feedback from the live performance of its products.

http://www.demo.cs.brandeis.edu/golem/conclusions.html

emergent rules * hierarchical organization * modularization with interdependencies * cyclomatic complexity

http://www.cs.brandeis.edu/~pablo/pres/intro/stance2.html

Page 39: emergence_1.17.02.doc.doc

Feedforward for faster control response

Control systems often rely on feedforward to improve their ability to respond to the command signal. Feedforward is based on the structure in control systems where one loop is nested inside another. For example, in motion control systems, a velocity loop is often enclosed inside a position loop. The position loop generates a velocity command, which is shuttled to the velocity loop. In this control structure, the inner loop is always faster than the outer loop. The feedforward path is essentially a shortcut around the slower outer loop, allowing the command signal direct access to the faster inner loop.

http://www.controleng.com/archives/2000/ctl1001.00/0010bb.htm
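The nested-loop structure can be sketched as a toy simulation (my own, with invented gains and an idealized plant): a slow proportional position loop issues velocity commands, and the feedforward path adds the command trajectory's own slope directly to the inner velocity command, so the system need not wait for position error to accumulate.

```python
def track(target_fn, feedforward, dt=0.01, steps=200, kp=5.0):
    """Follow target_fn(t) with a proportional position loop driving an
    ideal velocity-controlled plant; optionally add a feedforward
    velocity term equal to the target's own rate of change."""
    pos, t, worst = 0.0, 0.0, 0.0
    for _ in range(steps):
        vel_cmd = kp * (target_fn(t) - pos)      # slow outer position loop
        if feedforward:
            # shortcut: the command's slope goes straight to the inner loop
            vel_cmd += (target_fn(t + dt) - target_fn(t)) / dt
        pos += vel_cmd * dt                      # idealized inner loop/plant
        t += dt
        worst = max(worst, abs(target_fn(t) - pos))
    return worst

def ramp(t):
    return 2.0 * t                               # target moving at 2 units/s

err_fb = track(ramp, feedforward=False)
err_ff = track(ramp, feedforward=True)
assert err_ff < err_fb                           # feedforward tracks closer
```

With proportional feedback alone, a moving target leaves a persistent lag (roughly slope/kp); the feedforward term supplies the needed velocity outright and the lag all but disappears.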

Self-modeling is a process where people watch themselves performing a desired behavior or skill, generally on videotape. It has proven to be effective in changing behavior, improving skills, and increasing self-confidence. Video feedforward is a special type of self-modeling in which images of future behavior, or a skill not yet acquired, are created using hidden supports and editing techniques. ACE Reading uses video feedforward to create images of children reading successfully, which can actually help them improve their reading skills.

http://www.cds.hawaii.edu/reading/feedforward.html

Feedforward: anticipating perturbations

Feedback and feedforward both require action on the part of the system, to suppress or compensate the effect of the fluctuation. For example, a thermostat will counteract a drop in temperature by switching on the heating. Feedforward control will suppress the disturbance before it has had the chance to affect the system's essential variables. This requires the capacity to anticipate the effect of perturbations on the system's goal. Otherwise the system would not know which external fluctuations to consider as perturbations, or how to effectively compensate their influence before it affects the system. This requires that the control system be able to gather early information about these fluctuations. For example, feedforward control might be applied to the thermostatically controlled room by installing a temperature sensor outside of the room, which would warn the thermostat about a drop in the outside temperature, so that it could start heating before this would affect the inside temperature. In many cases, such advance warning is difficult to implement, or simply unreliable. For example, the thermostat might start heating the room, anticipating the effect of outside cooling, without being aware that at the same time someone in the room switched on the oven, producing more than enough heat to offset the drop in outside temperature. No sensor or anticipation can ever provide complete information about the future effects of an infinite variety of possible perturbations, and therefore feedforward control is bound to make mistakes. With a good control system, the resulting errors may be few, but the problem is that they will accumulate in the long run, eventually destroying the system.

http://pespmc1.vub.ac.be/MECHCONT.html
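The thermostat example can be turned into a toy simulation (my own illustration; all constants are invented). Pure feedback heats only after the inside temperature has already fallen, while feedforward reads the outside sensor and compensates for a cold snap before it reaches the room.

```python
def simulate(use_feedforward, steps=100):
    """Return the worst dip below the setpoint over the run."""
    inside, setpoint = 20.0, 20.0
    worst_dip = 0.0
    for step in range(steps):
        outside = 10.0 if step < 50 else -10.0   # outdoor cold snap at step 50
        heat = 0.0
        if inside < setpoint:                    # feedback: react to the error
            heat += 0.5 * (setpoint - inside)
        if use_feedforward:                      # anticipate the disturbance
            heat += 0.05 * (setpoint - outside)  # driven by the outside sensor
        # simple room model: heat loss proportional to in/out difference
        inside += heat - 0.05 * (inside - outside)
        worst_dip = max(worst_dip, setpoint - inside)
    return worst_dip

# Feedforward holds the room much closer to the setpoint through the snap.
assert simulate(use_feedforward=True) < simulate(use_feedforward=False)
```

The passage's caveat also shows up here: the feedforward gain only works because it matches the room model exactly; a wrong model (the "oven" case) would make the anticipation itself a source of error.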

Complexity

Characteristics of a complex system

A complex system is emergent. In an emergent system, smaller parts comprise a larger system. This larger system has properties the smaller units lack. For example, the brain is made up of individual neurons that, when functioning together, are capable of tasks no single neuron can perform alone. The new properties only emerge when the neurons work together.

A complex system is unpredictable.

A complex system contains many iterations and feedback/feedforward loops.

In a complex system, decision-making is decentralized.

Learning is typically a "complex" activity. Most learning systems contain a number of separate parts that must work together for learning to occur. For example, a typical learning system consists of students, a teacher, a content focus, and resources. This system operates according to a fixed plan--the students follow the teacher's "rules."

Learning environments

A learning environment can be emergent. Working together, a group of learners can collectively build their knowledge of a topic, for instance, the phases of the moon. To do so, each learner might research a particular lunar phase, then share what he or she has learned with the rest of the group. This way, the group amasses a body of knowledge that no one person could have acquired alone.

A learning environment can be unpredictable. An exploration of the phases of the moon could result in the group considering whether planets also have phases.

A learning environment can contain many iterations and feedback/feedforward loops. People learn by trial and error--in other words, they learn from their mistakes.

Decision-making in a learning environment can be decentralized. Groups can really thrive when students control the learning process, rather than the instructor.

http://www.funderstanding.com/complexity.cfm

What is interaction? How can we begin to make sense of the avalanche of educational toys, computer programs, and artworks that claim to be "interactive?" What would a new pedagogy structured around the rules of interactivity look like?

Some theorists have argued that all "art is interactive," effectively shutting the door on the discrete contributions of new digital/computer-based interactivity. Conversely, other theorists such as Simon Penny have insisted that traditional artworks such as painting and sculpture are simply "instances of representation" and as such should not be defined as truly "interactive systems." For Penny, interactive artworks are "virtual machines which...produce instances of representation based on real time inputs." (Penny 1996) Still other theorists distinguish between system interactivity (the enabling hardware and software) and the nature of the interaction (the actual exchange--be it aesthetic, educational, political, etc.) (Hillman et al, 1994). For our purposes here, a high level of interaction equals "mutual reciprocity"--a state of dialectical exchange between two or more entities. Ideally, interactive systems--whether a high-tech computer game or a Socratic dialogue--can be tools for learning, providing intelligent feedback that refines and amplifies user input.

While the demand for "interactivity" is a relatively recent phenomenon in the arts, the culture at large has long been obsessed with the idea of machines that learn. The evidence is mounting. From media spectacles such as Deep Blue's defeat of World Chess Champion Garry Kasparov in May of 1997 to quieter revolutions in teaching autistic children, computers that master the behaviors of their users are beginning to find a place in the culture. There is more than a hint of narcissism in our desire to be personally reflected in the machines we make. We don't want simply "dumb" tools; we want "intelligent" machines that respond and learn by interacting with their owners. Even our cooking appliances and car radios are "programmable" to reflect individual tastes.

Few art schools provide courses for producing, let alone interpreting or critiquing, "interactive artworks." Though the borderline between the fine arts and other cultural practices (such as science, technology, and entertainment) is becoming increasingly blurred, it is clear that the development of "interactive art" is largely dependent on "non-art" traditions. From a technical and theoretical perspective, such strange bedfellows as computer gaming, combat simulation, and medical diagnostics have more in common with much recent digital and interactive art practice than mainstream art history or criticism. Theorizing this territory is less a matter of mining, say, the Art Index, and more a matter of conducting systematic research into areas such as communications theory, human computer interaction, educational technology, and cognitive science. With this in mind, it may be helpful to briefly review how other disciplines are looking at the issues surrounding interaction.

THEORIES OF INTERACTION

"Interaction" is a useful construct in helping to understand the complex relationships occurring in a computerized learning environment. Educational technologist Ellen Wagner defines interaction as "… reciprocal events that require at least two objects and two actions. Interactions occur when these objects and events mutually influence one another." (Wagner, 1994)

Wagner points to historical examples of communication theory to illustrate the move from "one-way" systems of communication to "multi-directional" systems. C.E. Shannon's mathematical theory of communication (Shannon, 1948), for example, was a highly linear engineering model of information transfer involving the one-way transmission of information from a source to a destination using a transmitter, a signal, and a receiver. Later theorists built upon Shannon's model to include the concepts of interactivity and feedback. It is only recently that truly interactive systems that support both synchronous and asynchronous exchanges among multiple users have been available. Common examples of synchronous exchange include live satellite uplinks, telephones, and chat rooms on the Net. E-mail is the prototypical example of an "asynchronous" exchange system.


Other trends supporting the development of interactive systems come from research in artificial intelligence and cognitive science. If "mutual influence" and reciprocity are criteria for true interactivity, then the system needs to be capable of delivering more than pre-existing data on demand. Interactive systems need to be able to generate "custom" responses to input and queries. In short, the system needs to be smart enough to produce output that is not already part of the system. Interactivity must be more than following predetermined prompts to preprogrammed conclusions, as in a video game.

While most natural and living systems are "productive" in the sense of creating new "information," human-made machines that can respond with anything more than simple binary "yes/no" responses are a relatively recent phenomenon. To paraphrase media artist Jim Campbell, most machines are simply "reactive," not interactive. "Intelligent" machines, being developed with the aid of "neural networks" and "artificial intelligence," can interact by learning new behaviors and changing their responses based upon user input and environmental cues. Over time, certain properties begin to "emerge" such as self-replication or patterns of self-organization and control. These so-called "emergent properties" represent the antithesis of the idea that the world is simply a collection of facts waiting for adequate representation. The ideal system is a generative engine that is simultaneously a producer and a product.

"INTERACTIVE" ARTWORKS

Creating an experience for a participant in an interactive artwork must take into account that interactions are, by definition, not "one-way" propositions. Interaction depends on feedback loops that include not just the messages that preceded them, but also the manner in which previous messages were reactive. When a fully interactive level is reached, communication roles are interchangeable, and information flows across and through intersecting fields of experience that are mutually reciprocal. The level of success at attaining mutual reciprocity could offer a standard by which to critique interactive artwork.

Many artists have made unique attempts at true interaction, addressing problems of visual display, user control processes, navigation actions, and system responses. Different works have varying levels of audience participation, different ratios of local to remote interaction, and either the presence or absence of emergent behaviors. Moreover, these artistic experiments suggest that different approaches to interaction could serve diverse kinds of learners in a variety of educational settings.


Understanding experiments with interaction in an art context may help us to better understand interaction in pedagogical settings.

Carol Flax: Journeys: 1900/2000

Detail showing a viewer turning a page of the book and triggering a video projection. Photo credit: Patricia Clark.

Arizona artist Carol Flax has created an "interactive book" entitled Journeys: 1900/2000 at the Institute for Studies in the Arts at Arizona State University, where I serve as Interim Director. At the heart of the project is a reproduction of a 19th-century travel album that trades in fragments of memory, pieces of voyages, and bits of history. It uses single images from various existing albums, reproducing and recontextualizing them to create a completely original "voyage." Movement-sensing technology (computerized tracking devices) senses the presence of a viewer. "Bend sensors" embedded in the pages send an electrical signal to a computer when a page is turned. Video and audio clips are triggered in turn that support, amplify, or contest the veracity of the photographic prints of idealized ancient settings through the simple juxtaposition of contemporary imagery and sound with historical photographs. For example, a black-and-white image of a Middle Eastern market--a classic example of the "exotic" and the "picturesque"--is overlaid with a color video closeup of an orange being passed from the hand of one person to another in an Arizona back yard. Text is used throughout, both descriptively and ironically, to throw into question the truth value of what we are seeing. The message is deliberately multi-leveled and ambiguous, but one thing is clear: we are creating a journey in which we are complicit, not simply voyeurs.
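The sensing chain just described can be caricatured in a few lines. The page numbers and clip names below are hypothetical stand-ins, not the actual contents of Flax's album: a bend sensor reports which page turned, and the computer draws one clip from a fixed pool.

```python
import random

# Hypothetical page -> clip-pool mapping; the real album's contents differ.
CLIP_POOLS = {
    3: ["market_with_orange.mov", "caravan_audio.wav"],
    4: ["temple_overlay.mov"],
}

def on_page_turn(page: int, rng: random.Random) -> str:
    """A bend sensor closes, the computer learns which page turned,
    and one clip is drawn from a fixed pool -- 'reactive', not learning."""
    pool = CLIP_POOLS.get(page, ["default.mov"])
    return rng.choice(pool)
```

Because the pool is fixed in advance, repeated readings feel varied without the system ever generating anything new, which is exactly the limitation discussed below.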

The work enables a unique method of navigating the content and scores high in providing a seamless encounter between the user and the subject matter. While the artwork is described as an interaction between a single user and variable content, the entire set of options remains fixed in the computer's database. The work does not claim any degree of "reciprocity" between the object and the user. It does not "learn" the reader's habits. Therefore, it is not, strictly speaking, interactive. This does not diminish the project's ability to inspire repeated visits and reward the user with unexpected discoveries. Given its non-linear organization and randomized sequences of multiple video clips, each user's experience of the book is actively engaging and unique.

As a model for a different kind of "textbook," perhaps, the project points toward a new class of books that are constructed with the individual user in mind and that respond with some intelligence to a reader's choices. The fact that Flax insists on preserving the essential kinesthetic aspects of reading--the feel of the paper, the turning of the pages--implies that certain direct forms of knowing simply cannot be improved upon. However, the use of moving images, scrolling text, and audio clips that spill beyond the boundaries of the book has more in common with immersive experiences such as VR than with reading. This is neither a conventional book nor an over-scaled e-book, but rather a hybrid of traditional forms and "reactive" high-tech processes.

Christa Sommerer and Laurent Mignonneau: Interactive Plant Growing (1993)

Austrian-born Christa Sommerer and French-born Laurent Mignonneau teamed up in 1992, and now work at the ATR Media Integration and Communications Research Laboratories in Kyoto, Japan. In nearly a decade of collaborative work, Sommerer and Mignonneau have built a number of unique virtual ecosystems, many with custom viewer/machine interfaces. Their projects allow audiences to create new plants or creatures and influence their behavior by drawing on touch screens, sending e-mail, moving through an installation space, or by touching real plants wired to a computer.

Artist's rendering of the installation showing the five pedestals with plants and the video screen.

Interactive Plant Growing is an example of one such project. The installation connects actual living plants, which can be touched or approached by human viewers, to virtual plants that are grown in real time in the computer. In a darkened installation space, five different living plants are placed on five wooden columns in front of a high-resolution video projection screen. The plants themselves are the interface. They are in turn connected to a computer that sends video signals from its processor to a high-resolution video data projection system. Because the plants are essentially antennae hard-wired into the system, they are capable of responding to differences in the electrical potential of a viewer's body. Touching the plants or moving your hands around them alters the signals sent through the system. Viewers can influence and control the virtual growth of more than two dozen computer-based plants.
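One way to picture the plant-as-antenna interface is as a mapping from a sensed signal to growth parameters. The function below is a guess at the shape of such a mapping only; the numbers and parameter names are invented, and Sommerer and Mignonneau's actual signal processing is their own.

```python
def growth_params(signal_delta: float) -> dict:
    """Map a normalized change in the plant-antenna signal
    (0 = untouched, 1 = firm touch) to hypothetical virtual-plant
    growth parameters.  The mapping is illustrative, not the artists'."""
    closeness = max(0.0, min(1.0, signal_delta))   # clamp to [0, 1]
    return {
        "growth_rate": 0.1 + 0.9 * closeness,   # nearer hands -> faster growth
        "branching": 1 + int(4 * closeness),    # ...and a bushier plant
    }
```

The essential point survives the simplification: the viewer's body, via the plant, continuously modulates parameters that the rendering system reads in real time.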


Screen shot of the video projection during one interactive session.

Viewer participation is crucial to the life of the piece. Through their individual and collective involvement with the plants, visitors decide how the interactions unfold and how their interactions are translated to the screen. Viewers can control the size of the virtual plants, rotate them, modify their appearance, change their colors, and control new positions for the same type of plant. Interactions between a viewer's body and the living plants determine how the virtual three-dimensional plants will develop. Five or more people can interact at the same time with the five real plants in the installation space. All events depend exclusively on the interactions between viewers and plants.

The artificial growing of computer-based plants is, according to the artists, an expression of their desire to better understand the transformations and morphogenesis of certain organisms (Sommerer and Mignonneau, 1998).

What are the implications of such works for education? How can we learn from this artistic experimentation to use technological systems to become better teachers? Educators have long recognized the importance of two-way or multi-directional communication. Nevertheless, many educators perpetuate the mindset of the one-way "broadcast"--a concept that harks back to broadcast media such as radio and echoes the structure of the standard lecture, where the teacher as "source" transmits information to passive "receivers." The notion of a "one-to-many" model that reinforces a traditional hierarchical, top-down approach to teaching is at odds with truly democratic exchange. In Interactive Plant Growing, Sommerer and Mignonneau invert this one-to-many model by providing a system for multiple users to collaborate on the creation of a digital wall projection in real time. The system in effect enables a real-time collaboration that takes many diverse inputs and directs them toward a common goal. And this is exactly what good teaching does. This conceptualization of critical pedagogy has been developed in many different situations, but here it is combined with technology that mirrors its structure.

Sommerer and Mignonneau: Verbarium (1999)

In a more recent project the artists have created an interactive "text-to-form" editor available on the Internet. At their Verbarium web site, on-line users are invited to type text messages into a small pop-up window. Each of these messages functions as a genetic code for creating a visual three-dimensional form. An algorithm translates the genetic encoding of text characters (i.e., letters) into design functions. The system provides a steady flow of new images that are not pre-defined but develop in real time through the interaction of the user with the system. Each different message creates a different organic form. Depending on the composition of the text, the forms can be either simple or complex. Taken together, all text images are used to build a collective and complex three-dimensional image. This image is a virtual herbarium, comprised of plant forms based on the text messages of the participants. On-line users not only help to create and develop this virtual herbarium, but also have the option of clicking on any part of the collective image to decode earlier messages sent by other users.
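The "text as genetic code" idea can be sketched minimally: treat each character as a gene whose code point controls one growth step of the form. This is a guess at the principle only; the artists' actual algorithm is unpublished, and the angle/length encoding below is invented.

```python
def text_to_form(message: str) -> list:
    """Translate a message into a sequence of hypothetical growth steps,
    one (branch_angle, segment_length) pair per character.  The same
    message always yields the same form; different messages differ."""
    steps = []
    for ch in message:
        gene = ord(ch)                 # the character is the 'gene'
        angle = (gene % 91) - 45       # branch angle in degrees, [-45, 45]
        length = 1 + gene % 7          # segment length, [1, 7]
        steps.append((angle, length))
    return steps
```

Determinism is what makes the decoding feature possible: because the mapping from text to form is fixed, a stored form can always be traced back to the message that produced it.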


Screen shot of the Verbarium web page showing the collaborative image created by visitors to the site. The text-to-form algorithm translated "purple people eater" into the image at the upper left. This image was subsequently collaged into the collective "virtual herbarium."

In both the localized computer installations and web-based projects realized by Sommerer and Mignonneau, the interaction between multiple participants operating through a common interface represents a reversal of the topology of information dissemination. The pieces are enabled and realized through the collaboration of many participants remotely connected by a computer network. In an educational setting, this heightened sense of interaction needs to be understood as crucial. Students and instructors alike become capable of both sending and receiving messages. Everyone is a transmitter and a receiver, a publisher and a consumer. In the new information ecology, traditional roles may become reversed--or abandoned. Audience members become active agents in the creation of new artwork. Teachers spend more time facilitating and "receiving" information than lecturing. Students exchange information with their peers and become adept at disseminating knowledge.


Ken Rinaldo: Autopoiesis (2000)

Overview of all fifteen robotic arms of the Autopoiesis installation. Photo credit: Yehia Eweis.

A work by American artist Ken Rinaldo was recently exhibited in Finland as part of "Outoaly, the Alien Intelligence Exhibition 2000," curated by media theorist Erkki Huhtamo. Rinaldo, who has a background in both computer science and art, is pursuing projects influenced by current theories on living systems and artificial life. He is seeking what he calls an "integration of organic and electro-mechanical elements" that point to a "co-evolution between living and evolving technological material."

Rinaldo's contribution to the Finnish exhibition was an installation entitled Autopoiesis, which translates literally as "self-making." The work is a computer-based installation consisting of fifteen robotic sound sculptures that interact with the public and modify their behaviors over time. These behaviors change based on feedback from infrared sensors, which detect the presence of the participant/viewers in the exhibition, and on the communication between each separate sculpture.

The series of robotic sculptures--mechanical arms that are suspended from an overhead grid--"talk" with each other (exchange audio messages) through a computer network and audible telephone tones. The interactivity engages the participants, who in turn affect the system's evolution and emergence. This interaction, according to the artist, creates a system evolution as well as an overall group sculptural aesthetic. The project presents an interactive environment that is immersive, detailed, and able to evolve in real time by utilizing feedback and interaction from audience members.
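A toy model (mine, not Rinaldo's) suggests how one arm's behavior could drift under the two feedback channels just described: its own infrared sensor and the messages exchanged with neighboring arms. Every name and coefficient here is an assumption for illustration.

```python
class Arm:
    """One hypothetical Autopoiesis-style arm.  Its 'attraction' to
    viewers drifts with local sensing and with the group's consensus."""
    def __init__(self):
        self.attraction = 0.5            # 0 = withdraw from viewers, 1 = approach

    def update(self, viewer_present: bool, neighbor_mean: float):
        target = 1.0 if viewer_present else 0.0
        self.attraction += 0.10 * (target - self.attraction)         # IR sensor
        self.attraction += 0.05 * (neighbor_mean - self.attraction)  # network "talk"

arms = [Arm() for _ in range(15)]        # the installation's fifteen sculptures
```

No single arm is programmed with a group behavior; any coordinated pattern that appears comes from the repeated local updates, which is the sense of "emergence" the text develops next.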

What are the pedagogical implications of systems such as Autopoiesis that exhibit "emergent properties"? Participant/learners interacting with such systems are challenged to understand that cognition is less a matter of absorbing ready-made "truths" and more a matter of finding meaning through iterative cycles of inquiry and interaction. Ironically, this may be what good teaching has always done. So would we be justified in building a "machine for learning" that does essentially the same thing that good teachers do? One argument is that by designing such systems we are forced to look critically at the current manner in which information is generated, shared, and evaluated. Further, important questions surface, such as "Who can participate?", "Who has access to the information?", and "What kinds of interactions are enabled?" The traditional "machine for learning" (the classroom), with its single privileged source of authority (the teacher), is hardly a perfect model. Most of the time, it is not a system that rewards boundary breaking, the broad sharing of information, or the generation of new ideas. It IS a system that, in general, reinforces the status quo. Intelligent machines such as Rinaldo's Autopoiesis can help us to draw connections between multiple forms of inquiry, enable new kinds of interactions between disparate users, and increase a sense of personal agency and self-worth. While intelligent machines will surely be no smarter than their programmers, pedagogical models can be more easily shared and replicated. Curricula (programs for interactions) can be modified or expanded to meet the special demands of particular disciplines or contexts. Most importantly, users are free to interact through the system in ways that are suited to particular learning styles, personal choices, or physical needs.

IMPLICATIONS FOR ART AND EDUCATION

Interactive artworks of the future will enable interactions that are at once personal and universal. These interactions will be characterized by a subtle reciprocity between the body and the natural environment, and an expanded potential for self-knowledge and learning. Truly interactive experiences are already empowering individuals (consider the "disabled" community or autistic learners, for example).

Returning to various theories of interaction (particularly those of Ellen Wagner), several recommendations for artists emerge that begin to trace a trajectory for the education of the interactive artist. They include training on and empowerment with various technologies; understanding media-structured feedback loops (1) and methods for enhancing "mutual reciprocity"; rethinking where meaning is constituted (cognitive theory now suggests that "meaning" is something that happens between rather than inside individuals); and redefining the roles of educators and learners. Rapid evolution in the art profession as a whole is creating changes in the definitions and roles played by art teachers and prospective artists.

There is no question that the uses of technology outlined here need to be held against the darker realities of life in a high-tech society. The insidious nature of surveillance and control, the assault on personal space and privacy, the commodification of aesthetic experience, and the ever-widening "digital divide" between the technological haves and have-nots are constant reminders that technology is a double-edged sword.

But there is at least an equal chance that a clearer understanding of the concept of interaction--specifically interaction enabled by technology--will yield a broader palette of choices from which human beings can come together to create meaning. In watching these processes unfold, educators will surely find new models for learning.

_____________

Notes

(1) "The feedback loop is perhaps the simplest representation of the relationships between elements in a system, and these relationships are the way in which the system changes. One element or agent (the 'regulator' or control) sends information into the system, other agents act based upon their reception/perception of this information, and the results of these actions go back to the first agent. It then modifies its subsequent information output based on this response, to promote more of this action (positive feedback), or less or different action (negative feedback). System components (agents or subsystems) are usually both regulators and regulated, and feedback loops are often multiple and intersecting (Clayton, 1996, Batra, 1990)." (Morgan, 1999)
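The note's distinction between positive and negative feedback can be made concrete with a short simulation; this is a generic illustration under the obvious simplifications, not anything from Morgan's text.

```python
def positive_feedback(x: float, gain: float = 0.1, steps: int = 20) -> float:
    """Each result amplifies the next input: the snowball effect."""
    for _ in range(steps):
        x += gain * x                    # every plus invites another plus
    return x

def negative_feedback(x: float, setpoint: float = 20.0,
                      gain: float = 0.3, steps: int = 20) -> float:
    """Each deviation triggers a correction toward the setpoint:
    thermostat-style stabilization."""
    for _ in range(steps):
        x += gain * (setpoint - x)       # variation toward plus is pushed minus
    return x
```

Left to itself the positive loop diverges exponentially, while the negative loop converges on (and oscillates ever closer to) its setpoint, mirroring the runaway/equilibrium contrast the note describes.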

References

Flax, Carol. (2000). URL: http://www.arts.arizona.edu/cflax/

Hillman, D., Willis, D.J. & Gunawardena, C.N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. The American Journal of Distance Education, 8(2).

Huhtamo, Erkki (1993). Seeking deeper contact: interactive art as metacommentary. URL: http://www.ccms.mq.edu.au/mus302/seeking_deeper_contact.html


Moore, M. (1989). Editorial: Three types of interaction. The American Journal of Distance Education, 3(2), 1-6.

Morgan, Katherine Elizabeth (1999). A systems analysis of education for sustainability and technology to support participation and collaboration. Unpublished Master's Thesis at the University of British Columbia. http://www.interchange.ubc.ca/katherim/thesis/index.html

Penny, Simon (1996) Embodied agents, reflexive engineering and culture as a domain. p. 15. (talk given at the Museum of Modern Art in New York City, May 20, 1996) URL: http://adaweb.walkerart.org/context/events/moma/bbs5/transcript/penney01.html

Penny, Simon. (2000). URL: http://www-art.cfa.cmu.edu/Penny/works/traces/Tracescode.html

Rinaldo, Ken (2000). URL: http://ylem.org/artists/krinaldo/emergent1.html

Shannon, C.E. (1948). A mathematical theory of communication, Bell System Technical Journal, vol. 27, pp. 379-423 and 623-656, July and October. URL: http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html

Sommerer, Christa and Laurent Mignonneau (1998). Art as a living system. Leonardo, Vol. 32, No. 3., pp. 165-173.

Sommerer, Christa and Laurent Mignonneau (2000). URL: http://www.mic.atr.co.jp/~christa

Wagner, E. (1994). In support of a functional definition of interaction. The American Journal of Distance Education, 8(2).

Wilson, Stephen. (1993). The aesthetics and practice of designing interactive computer events. URL: http://userwww.sfsu.edu/~swilson/papers/interactive2.html

-----------

Dan Collins
Institute for Studies in the Arts
Arizona State University
Tempe, AZ 85287-3302
USA

telephone: 480-965-0972

[email protected]






