
Unshackling Evolution: Evolving Soft Robots with Multiple Materials and a Powerful Generative Encoding

Nick Cheney, Robert MacCurdy, Jeff Clune*, Hod Lipson
Creative Machines Lab, Cornell University, Ithaca, New York, USA
*Evolving AI Lab, University of Wyoming, Laramie, Wyoming, USA
[email protected], [email protected], [email protected], [email protected]

ABSTRACT
In 1994 Karl Sims showed that computational evolution can produce interesting morphologies that resemble natural organisms. Despite nearly two decades of work since, evolved morphologies are not obviously more complex or natural, and the field seems to have hit a complexity ceiling. One hypothesis for the lack of increased complexity is that most work, including Sims', evolves morphologies composed of rigid elements, such as solid cubes and cylinders, limiting the design space. A second hypothesis is that the encodings of previous work have been overly regular, not allowing complex regularities with variation. Here we test both hypotheses by evolving soft robots with multiple materials and a powerful generative encoding called a compositional pattern-producing network (CPPN). Robots are selected for locomotion speed. We find that CPPNs evolve faster robots than a direct encoding and that the CPPN morphologies appear more natural. We also find that locomotion performance increases as more materials are added, that diversity of form and behavior can be increased with different cost functions without stifling performance, and that organisms can be evolved at different levels of resolution. These findings suggest the ability of generative soft-voxel systems to scale towards evolving a large diversity of complex, natural, multi-material creatures. Our results suggest that future work that combines the evolution of CPPN-encoded soft, multi-material robots with modern diversity-encouraging techniques could finally enable the creation of creatures far more complex and interesting than those produced by Sims nearly twenty years ago.

Categories and Subject Descriptors: I.2.11 [Distributed Artificial Intelligence]: Intelligent Agents

General Terms: Algorithms, Design, Experimentation
Keywords: Genetic Algorithms, Generative Encodings, CPPN-NEAT, Soft-Robotics, HyperNEAT, Evolving Morphologies

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
GECCO'13, July 6-10, 2013, Amsterdam, The Netherlands.
Copyright 2013 ACM 978-1-4503-1963-8/13/07 ...$15.00.

Figure 1: An example of a natural looking morphology and behavior evolved by combining a generative encoding with voxel-resolution soft, actuatable materials. The soft robot gallops from left to right across the image with a dog-like gait.

1. INTRODUCTION
In 1994, Karl Sims' evolved virtual creatures showed the potential of evolutionary algorithms to produce natural, complex morphologies and behaviors [30]. One might assume that nearly 20 years of improvements in computational speed and evolutionary algorithms would produce far more impressive organisms, yet the creatures evolved in the field of artificial life today are not obviously more complex, natural, or intelligent. Fig. 2 demonstrates an example of similar complexity in robots evolved 17 years apart.

One hypothesis for why there has not been a clear increase in evolved complexity is that most studies follow Sims in evolving morphologies with a limited set of rigid elements [21, 4, 3, 16, 22]. Nature, in contrast, composes organisms with a vast array of different materials, from soft tissue to hard bone, and uses these materials to create subcomponents of arbitrary shapes. The ability to construct morphologies with heterogeneous materials enables nature to produce more complex, agile, high-performing bodies [35].

Figure 2: (left) The scale and resolution of robots evolved by Sims in 1994 [30]. (middle) The scale and resolution at which evolutionary robotics commonly occurs today (from Lehman and Stanley, 2011 [21]). (right) The scale and resolution of robot fabrication techniques (from Lipson and Pollack, 2000 [22]).


An open question is whether computational evolution will produce more natural, complex forms if it is able to create organisms out of many material types. Here we test that hypothesis by evolving morphologies composed of voxels of different materials. They can be hard or soft, analogous to bone or soft tissue, and inert or expandable, analogous to supportive tissue or muscle. Contiguous patches of homogeneous voxels can be thought of as different tissue structures.

Another hypothesis is that the encodings used in previous work limited the design space. Direct encodings lack the regularity and evolvability necessary to consistently produce regular morphologies and coordinated behaviors [9, 6, 34, 16], and overly regular indirect encodings constrict the design space by disallowing complex regularities with variation [16, 31, 34]. We test this hypothesis by evolving morphologies with the CPPN-NEAT encoding [31], which has been shown to create complex regularities such as symmetry and repetition, both with and without variation (Fig. 3). CPPN-NEAT has shown these abilities in 2D images [29] and 3D objects [7] and morphologies [4]. To test the impact of the CPPN encoding, we compare it to a direct encoding.

Overall, we find that evolution does utilize additional materials made available to it; their availability led to a significant amount of diverse, interesting, complex morphologies and locomotion behaviors without hindering performance. Furthermore, the generative encoding produced regular patterns of voxel 'tissue', leading to fast, effective locomotion. In contrast, the direct encoding produced no phenotypic regularity and led to poor performance.

Because it is notoriously difficult to quantify attributes such as "impressiveness" and "complexity", we make no effort to do so here. Instead, we attempt to visually represent the interesting diversity of morphologies and behaviors that evolved once evolution was provided with more materials and a sophisticated encoding. We also demonstrate the ability of this system to scale to higher resolutions and greater material diversity without hindering performance. Finally, we investigate the effects of different fitness functions, revealing that evolution with this encoding and material palette can create different bodies and behaviors in response to different environmental and selective pressures.

2. BACKGROUND
There are many Evolutionary Robotics papers with rigid-body robots [25]. However, few attempts have been made to evolve robots composed of soft materials [27], and most of those attempts are limited to only a few components. This paucity is due largely to the computational costs of simulating flexible materials and because many genetic encodings do not scale to large parameter spaces [5, 18].

The CPPN encoding abstracts how developmental biology builds natural complexity, and has been shown to produce complex, natural-appearing images and objects (Fig. 3) [29, 7, 31]. Auerbach and Bongard used this generative encoding to evolve robotic structures at finer resolutions than previous work. The systems evolved demonstrated the ability to take advantage of geometric coordinates to inform the evolution of complex bodies. However, this work was limited to rigid building blocks which were actuated by a large number of hinge joints [1, 4, 3], or had no actuation at all [2].

Rigid structures limit the ability of robots to interact with their environments, especially when compared to the complex movements of structures in biology composed of muscle and connective tissue. These structures, called muscular hydrostats, often display incredible flexibility and strength; examples from biology include octopus arms or elephant trunks [35].

Figure 3: (left) Examples of high resolution, complex, natural-looking images evolved with CPPN-NEAT that contain symmetry, repetition, and interesting variation [29]. (right) Examples of CPPN-encoded 3D shapes with these same properties [7].

While soft robots can be designed that provide outstanding mobility, strength, and reliability, the design process is complicated by multiple competing and difficult-to-define objectives [35]. Evolutionary algorithms excel at such problems, but have historically not been able to scale to larger robotic designs. To demonstrate that evolution can design complex, soft-bodied robots, Hiller and Lipson created a soft-voxel simulator (called VoxCAD) [11]. They showed a preliminary result that CPPNs can produce interesting locomotion morphologies, and that such designs can transfer to the real world (Fig. 4) [13]. However, this work did not take advantage of the NEAT algorithm, with its historical markings, speciation, crossover, and complexification over time, which have been shown to greatly improve the search process [33]. Additionally, these preliminary results consisted of only three trials per treatment. Here we conduct a more in-depth exploration of the capabilities of CPPNs when evolving soft robots in VoxCAD.

Figure 4: A time-series example of a fabricated soft robot, which actuates with cyclic 20% volumetric actuation in a pressure chamber [13]. This proof-of-concept shows that evolved, soft-bodied robots can be physically realized. Current work is investigating soft robot actuation outside of a pressure chamber.


Figure 5: A CPPN is iteratively queried for each voxel within a bounding area and produces output values as a function of the coordinates of that voxel. These outputs determine the presence of voxels and their material properties to specify a soft robot.

3. METHODS

3.1 CPPN-NEAT
CPPN-NEAT has been repeatedly described in detail [31, 9, 7, 10], so we only briefly summarize it here. A compositional pattern-producing network (CPPN) is similar to a neural network, but its nodes contain multiple math functions (in this paper: sine, sigmoid, Gaussian, and linear). CPPNs evolve according to the NEAT algorithm [31]. The CPPN produces geometric output patterns that are built up from the functions of these nodes. Because the nodes have regular mathematical functions, the output patterns tend to be regular (e.g. a Gaussian function can create symmetry and a sine function can create repetition). In this paper, each voxel has an x, y, and z coordinate that is input into the network, along with the voxel's distance from center (d). One output of the network specifies whether any material is present, while the maximum value of the 4 remaining output nodes (each representing an individual material) specifies the type of material present at that location (Fig. 5). This method of separating the presence of a phenotypic component and its parameters into separate CPPN outputs has been shown to improve performance [36]. Robots can be produced at any desired resolution. If there are multiple disconnected patches, only the most central patch is considered when producing the robot morphology.
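To make the per-voxel querying procedure concrete, the sketch below queries a stand-in CPPN over a bounding cube. It is a minimal illustration, not the authors' implementation: the `cppn` function is a hand-written placeholder for an evolved network, the material names are assumed, and the step of keeping only the most central contiguous patch is omitted.

```python
import numpy as np

MATERIALS = ["phase 1 muscle", "soft support", "phase 2 muscle", "hard support"]  # assumed names

def cppn(x, y, z, d):
    """Placeholder for an evolved CPPN: a real one is a small evolved network of
    sine, sigmoid, Gaussian, and linear nodes. Returns (presence, material scores)."""
    presence = np.sin(3 * x) + np.cos(2 * y) - d
    scores = np.array([np.sin(5 * z), x * y, np.cos(4 * d), z - x])
    return presence, scores

def build_robot(resolution=10):
    """Query the CPPN once per voxel in a bounding cube to specify a soft robot."""
    robot = np.zeros((resolution,) * 3, dtype=int)  # 0 = empty voxel
    center = (resolution - 1) / 2.0
    for ix in range(resolution):
        for iy in range(resolution):
            for iz in range(resolution):
                # scale coordinates to roughly [-1, 1]; d is the distance from center
                x, y, z = ((i - center) / center for i in (ix, iy, iz))
                d = np.sqrt(x * x + y * y + z * z)
                presence, scores = cppn(x, y, z, d)
                if presence > 0:  # one output gates whether any material is present
                    robot[ix, iy, iz] = 1 + int(np.argmax(scores))  # max of the rest picks the material
    return robot

robot = build_robot(10)
print({name: int((robot == i + 1).sum()) for i, name in enumerate(MATERIALS)})
```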

3.2 VoxCAD
Fitness evaluations are performed in the VoxCAD soft-body simulator, which is described in detail in Hiller and Lipson 2012 [14]. The simulator efficiently models the statics, dynamics, and non-linear deformation of heterogeneous soft bodies. It also provides support for volumetric actuation of individual voxels (analogous to expanding and contracting muscles) or passive materials of varying stiffness (much like soft support tissue or rigid bone). For visualization, we display each voxel, although a smooth surface mesh could be added via the Marching Cubes algorithm [23, 7].
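As an aside, the smooth surface mesh mentioned above can be extracted from a voxel occupancy grid with any off-the-shelf Marching Cubes implementation. The sketch below uses scikit-image and is our illustration of the idea, not part of the authors' VoxCAD pipeline.

```python
import numpy as np
from skimage import measure  # scikit-image provides a Marching Cubes implementation

def voxels_to_mesh(voxel_grid):
    """Turn a voxel grid (0 = empty) into a smooth triangle mesh via Marching Cubes [23]."""
    occupancy = (np.asarray(voxel_grid) > 0).astype(float)
    occupancy = np.pad(occupancy, 1)  # pad with empty space so the surface closes
    verts, faces, normals, values = measure.marching_cubes(occupancy, level=0.5)
    return verts, faces

verts, faces = voxels_to_mesh(np.random.default_rng(0).random((10, 10, 10)) > 0.5)
print(len(verts), "vertices,", len(faces), "triangles")
```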

3.2.1 MATERIALS
Following [12], there are two types of voxels: those that expand and contract at a pre-specified frequency, and passive voxels with no intrinsic actuation, which are either soft or hard. We expand upon [12] to include multiple phases of actuation. Unless otherwise noted, four materials are used: Green voxels undergo periodic volumetric actuations of 20%. Light blue voxels are soft and passive, having no intrinsic actuation, with their deformation caused solely by nearby voxels. Red voxels behave similarly to green ones, but with counter-phase actuations. Dark blue voxels are also passive, but are stiffer and more resistant to deformation than light blue voxels. In treatments with fewer than 4 materials, voxels are added in the order above (e.g. two-material treatments consist of green and light blue voxels).
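Written as a configuration table, this palette might look like the sketch below. The numeric fields are illustrative assumptions (the paper specifies a 20% volumetric actuation and a soft versus stiff distinction, but not concrete parameter values), not actual VoxCAD settings.

```python
# Illustrative palette; 'stiffness' and 'phase' values are placeholders, not VoxCAD settings.
MATERIAL_PALETTE = {
    1: {"color": "green",      "role": "muscle, phase 1", "actuation": 0.20, "phase": 0.0},
    2: {"color": "light blue", "role": "soft support",    "actuation": 0.00, "stiffness": "soft"},
    3: {"color": "red",        "role": "muscle, phase 2", "actuation": 0.20, "phase": 0.5},
    4: {"color": "dark blue",  "role": "hard support",    "actuation": 0.00, "stiffness": "stiff"},
}

def palette_subset(n_materials):
    """Treatments with fewer materials add them in the order above, e.g. a
    two-material treatment uses only the green and light blue voxel types."""
    return {k: v for k, v in MATERIAL_PALETTE.items() if k <= n_materials}

print(palette_subset(2))
```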

3.3 GAlib
The direct encoding is from GAlib (fully described in [37]), a popular off-the-shelf genetic algorithm library from MIT. In the direct encoding genome, each voxel has its own independent values representing its presence and material outputs. The first value is binary, indicating whether a voxel at that position exists. If the voxel exists, the highest of the material property values determines the type of voxel. Thus, a 10 × 10 × 10 ("10³") voxel soft robot with 4 possible materials would have a genome size of 10³ × 5 = 5000 values.
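Under the genome layout just described (one presence value plus one score per material for every voxel), decoding a direct-encoded individual could look like the following sketch. The flat value ordering and the 0.5 presence threshold are our assumptions; GAlib itself only supplies the generic genome representation and operators.

```python
import numpy as np

def decode_direct_genome(genome, resolution=10, n_materials=4):
    """Decode a flat direct-encoding genome into a voxel grid (0 = empty).
    Each voxel owns 1 presence value + n_materials material scores, so a
    10 x 10 x 10 robot with 4 materials needs 10**3 * 5 = 5000 values."""
    values_per_voxel = 1 + n_materials
    genes = np.asarray(genome, dtype=float).reshape(resolution**3, values_per_voxel)
    present = genes[:, 0] > 0.5                      # first value: does this voxel exist?
    material = 1 + np.argmax(genes[:, 1:], axis=1)   # highest remaining value picks the material
    return np.where(present, material, 0).reshape((resolution,) * 3)

rng = np.random.default_rng(0)
robot = decode_direct_genome(rng.random(10**3 * 5))
print("non-empty voxels:", int((robot > 0).sum()))
```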

3.4 Experimental Details
Treatments consist of 35 runs, each with a population size of 30, evolved for 1000 generations. Unless otherwise noted, fitness is the difference in the center of mass of the soft robot between initialization and the end of 10 actuation cycles. If any fitness penalties are assessed, they consist of multiplying the above fitness metric by (1 - penalty metric / maximum penalty metric). For example, if the penalty metric is the number of voxels, an organism with 400 non-empty voxels out of a possible 1000 would have its displacement multiplied by 1 - 400/1000 = 0.6 to produce its final fitness value. Other CPPN-NEAT parameters are the same as in Clune and Lipson 2011 [7].
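As a quick check of the arithmetic, the penalized fitness described above reduces to a one-line function; this is a sketch of the stated formula, not the authors' evaluation code.

```python
def penalized_fitness(displacement, penalty_metric, maximum_penalty_metric):
    """Final fitness = displacement * (1 - penalty_metric / maximum_penalty_metric)."""
    return displacement * (1.0 - penalty_metric / maximum_penalty_metric)

# Worked example from the text: 400 non-empty voxels out of a possible 1000
# scale the distance travelled by 1 - 400/1000 = 0.6.
print(penalized_fitness(displacement=2.5, penalty_metric=400, maximum_penalty_metric=1000))  # 1.5
```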

4. RESULTS
Quantitative and qualitative analyses reveal that evolution in this system is able to produce effective and interesting locomoting soft robots at different voxel resolutions and using different materials. We also discover that imposing different environmental challenges in the form of penalty functions provides an increased diversity of forms, suggesting the capability to adapt to various selective pressures.

Videos of soft robot locomotion are available at http://tinyurl.com/EvolvingSoftRobots. So that the reader may verify our subjective, qualitative assessments, we have permanently archived all evolved organisms, data, source code, and parameter settings at the Dryad Digital Repository.

4.1 Direct vs. Generative Encoding
The CPPN-NEAT generative encoding far outperforms the direct encoding (Figure 8), which is consistent with previous findings [9, 6]. The most stark difference is in the regularity of the voxel distributions (compare Figs. 1, 6, 12, 13 to Fig. 7). CPPN-NEAT soft robots consist of homogeneous patches of materials akin to tissues (e.g. one large patch of muscle, another patch of bone, etc.).


Figure 6: CPPN-NEAT-encoded soft robots can scale to any resolution. Pictured here are soft robots sampled at voxel resolutions of 5 × 5 × 5 (left), 10 × 10 × 10 (center), and 20 × 20 × 20 (right).

The direct encoding, on the other hand, seems to randomly assign a material to each voxel. These homogeneous tissue structures are beneficial because similar types of voxels can work in a coordinated fashion to achieve the locomotion objective. For example, all the voxels in one large section of green voxels will expand at the same time, functioning as muscle tissue. This global coordination leads to jumping, bounding, stepping, and many other behaviors. In the direct encoding, each voxel works independently from, and often at odds with, its neighboring voxels, preventing coordinated behaviors. Instead, final organisms appear visually similar to those at initialization, and performance barely improves across generations (Figure 8).

Another reason for the success of the CPPN-NEAT encoding is one of the key properties of the NEAT algorithm: it starts with CPPN networks that produce simple geometric voxel patterns and complexifies those patterns over time [31].

4.2 Penalty Functions
To explore performance under different selective or environmental pressures, we tested four different penalty regimes. All four require the soft robot to move as far as possible, but have different restrictions. In one environment, the soft robots are penalized for their number of voxels, similar to an animal having to work harder to carry more weight. In another, the soft robots are penalized for their amount of actuatable material, analogous to the cost of expending energy to contract muscles. In a third treatment, a penalty is assessed for the number of connections (adjoining faces between voxels), akin to animals that live in warm environments and overheat if their surface area is small in comparison to their volume. Finally, there is also the baseline treatment in which no penalties are assessed.

While a cost for actuated voxels does perform significantly worse than a setup with no cost (p = 1.9 × 10⁻⁵ comparing final fitness values), all treatments tend to perform similarly over evolutionary time (Fig. 9). This rough equivalence suggests that the system has the ability to adapt to different cost requirements without major reductions in performance. However, drastically different types of body-plans and behaviors evolved for the different fitness functions. There are differences in the proportions of each material found in evolved organisms, indicating that evolution utilizes different material distributions to fine-tune morphologies to various environments (Fig. 10). For example, when no penalty cost is assessed, more voxels are present (p < 2 × 10⁻¹³). When there is a cost for the number of actuated voxels, but not for support tissue, evolution uses more of these inert support materials (p < 0.02).

More revealing are the differences in behaviors. Fig. 11 categorizes locomotion strategies into several broad classes, and shows that different task requirements favor different classes of these behaviors. To limit subjectivity in the categorization process, we made clear category definitions, as is common in observational biology, and provide an online archive of all organisms for reader evaluation (see Sec. 4).

Fig. 12 displays the common locomotion strategies and Fig. 11 shows how frequently they evolved. They are described in order of appearance in Fig. 12. The L-Walker is named after the "L" shape its rectangular body forms, and is distinguished by its blocky form and hinge-like pivot point in the bend of the L. The Incher is named after its inchworm-like behavior, in which it pulls its back leg up to its front legs by arching its back, then stretches out to flatten itself and reach its front legs forward. Its morphology is distinguished by its sharp spine and diagonal separation between actuatable materials. The Push-Pull is a fairly wide class of behaviors and is tied together by the soft robot's powerful push with its (often large) hind leg to propel itself forward, which is usually coupled with a twisting or tipping of its front limb/head to pull itself forward between pushes. The head shape and thinner neck region are surprisingly common features.

Figure 7: A representative example of a soft robot evolved with a direct encoding. Note the lack of regularity and organization: there are few contiguous, homogeneous patches of one type of voxel. Instead, the organism appears to be composed of randomly distributed voxels. The resolution is the default 10³.


[Figure 8 plot: run champion distance (body lengths) vs. generation (0-1000); curves for the Generative Encoding and the Direct Encoding.]

Figure 8: The best individuals from 35 independent runs with a direct or generative encoding. Note how the generative encoding sees large improvements early in evolution, while it is exploring new locomotion types. It then settles on specific types and gradually improves coordination, timing, etc., to exploit a given strategy. The direct encoding is unable to produce globally coordinated behavior to develop new locomotion strategies, resulting in very minor improvements as it exploits its initial random forms. Here, and in all figures, thick lines are medians ±95% bootstrapped confidence intervals.

Next, the Jitter (or Bouncer) moves by bouncing its (often large) back section up and down, which pushes the creature forward. It is distinguished by its long body and is often composed mainly of a single actuatable material. The Jumper is similar in that it is often comprised of a single actuatable material, but locomotes in an upright position, springing up into the air and using its weight to angle its jumping and falling in a controlled fashion to move forward. The Wings is distinguished by its unique vertical axis of rotation. It brings its arms (or wings) in front of it, then pushes them down and out to the sides, propelling its body forward with each flapping-like motion. Fig. 13 demonstrates other, less-common behaviors that evolved.

These example locomotion strategies display the system's ability to produce a diverse set of morphologies and behaviors, which likely stems from its access to multiple types of materials. Our results suggest that with even more materials, computational evolution could produce even more sophisticated morphologies and behaviors. Note that different behaviors show up more frequently for different task settings (Fig. 11), suggesting the ability of the system to fine-tune to adapt to different selective pressures.

4.3 Material Types
To meet its full potential, this system must scale to arbitrarily large numbers of materials and resolutions. We first explore its ability to compose soft robots out of a range of materials by separately evolving soft robots with increasing numbers of materials (in the order outlined in Sec. 3.2.1).

[Figure 9 plot: run champion distance (body lengths) vs. generation (0-1000); curves for No Costs, Cost for Voxels, Cost for Connections Between Voxels, and Cost for Actuated Voxels.]

Figure 9: Performance is mostly unaffected by different selection pressures (i.e. fitness functions).

[Figure 10 plot: voxels used (out of 1000 per robot) for each material type (Empty, Phase 1 Muscle, Soft Support, Phase 2 Muscle, Hard Support); bars for No Costs, Cost for Voxels, Cost for Connections Between Voxels, and Cost for Actuated Voxels.]

Figure 10: The amount of each material that evolved for different cost functions, revealing the system's ability to adapt material distributions to different environments. For example, without a cost, evolution used more voxels to produce actuation (p < 2 × 10⁻¹³). With a cost for actuated voxels, evolution tends to use more inert support tissue (p < 0.02).

Adding a second, and then a third, material significantly improved performance (Fig. 14, p < 2 × 10⁻⁶), and adding a further hard, inert material did not significantly hurt performance (Fig. 14, p = 0.68). This improved performance suggests that CPPN-NEAT is capable of taking advantage of the increase in morphological and behavioral options. This result is interesting, as one might have expected a drop in performance associated with the need to search in a higher dimensional space and coordinate more materials.


[Figure 11 plot: number of individuals of each locomotion type (L-Walker, Incher, Push-Pull, Jitter, Jumper, Wings, Other); bars for No Cost, Cost for Actuated Voxels, Cost for Voxel Connections, and Cost for Total Voxels.]

Figure 11: Common behaviors evolved under different cost functions, summed across all runs. These behaviors are described in Sec. 4.2 and visualized in Fig. 12. Some behaviors occur more frequently under certain selective regimes. For example, the L-Walker is more common without a voxel cost, while Jitter, Jumper, and Wings do not evolve in any of the no-cost runs.

4.4 Resolution
This system is also capable of scaling to higher resolution renderings of soft robots, involving increasing numbers of voxels. Fig. 6 shows example morphologies evolved at each resolution. The generative encoding tended to perform roughly the same regardless of resolution, although the computational expense of simulating large numbers of voxels prevented a rigorous investigation of the effect of resolution on performance. Faster computers will enable such research and the evolution of higher-resolution soft robots.

5. DISCUSSION
The results show that life-like, complex, interesting morphologies and behaviors are possible when we expand the design space of evolutionary robotics to include soft materials that behave similarly to organic tissue or muscle, and search that design space with a powerful generative encoding like CPPN-NEAT. Our preliminary experiments suggest that soft robotics at the voxel resolution will someday provide complex and breathtaking demonstrations of lifelike artificial forms. Soft robotics will also showcase the ability of evolutionary design because human intuitions and engineering fare poorly in such entangled, non-linear design spaces.

We challenged multiple scientists to design fast, locomoting soft robots by hand, using the same resolution and materials. While the sample size is not sufficient to report hard data, all participants (both those with and without engineering backgrounds) were unable to produce organisms that scored higher than the evolved creatures. Participants noted the surprising difficulty of producing efficient walkers with these four materials.

Figure 12: Time series of common soft robot behaviors as they move from left to right across the image. From top to bottom, we refer to them as L-Walker, Incher, Push-Pull, Jitter, Jumper, and Wings. Fig. 11 reports how frequently they evolved.

Figure 13: Time series of other evolved strategies. (top) Opposite leg stepping creates a traditional animal walk or trot. (middle) A trunk-like appendage on the front of the robot helps to pull it forward. (bottom) A trot, quite reminiscent of a galloping horse, demonstrates the inclusion of stiff material to create bone-like support in longer appendages.

This preliminary experiment supports the claim that systems like the CPPN-NEAT generative encoding will increasingly highlight the effectiveness of automated design relative to a human designer.

This work shows that the presence of soft materials alone is not sufficient to provide interesting and efficient locomotion, as soft robots created from the direct encoding performed poorly.


[Figure 14 plot: run champion distance (body lengths) vs. generation (0-1000); curves for Two Actuated, Soft and Hard Support; Two Actuated and Soft Support; One Actuated and Soft Support; and One Actuated Material Only.]

Figure 14: The number of materials also affects performance. With only one, only simple behaviors like Jumping or Bouncing are possible, so performance peaks early and fails to discover new gaits over time. Upon adding a second material, more complex jumping and L-Walker behavior develops. When a second actuatable material is added, most behavior strategies from Fig. 12 become possible. Adding a stiff support material broadens the range of possible gaits, but is only rarely taken advantage of (such as in the bottom gallop of Fig. 13) and thus has a minimal impact on overall performance. These observational assessments may be verified, as all evolved organisms are available online (Sec. 4).

Our results are consistent with work evolving rigid-body robots that shows that generative encodings outperform direct encodings for evolutionary robotics [17, 19, 9, 6]. Unfortunately, there have been few attempts to evolve robot morphologies with CPPN-NEAT [2], and there is no consensus in the field on a proper measurement of "complexity", "interestingness", or "natural" appearance, so we cannot directly compare our soft robots to their rigid-body counterparts. However, we hope that the reader will agree about the potential of evolved soft robots upon viewing the creatures in action [http://tinyurl.com/EvolvingSoftRobots].

6. FUTURE WORK
The ability to evolve complex and intricate forms lends itself naturally to other questions in the field. Auerbach and Bongard have explored the relationship between environment and morphology with rigid robots in highly regular environments [4]. Because our system allows more flexibility in robot morphology and behavior, it may shed additional, or different, light on the relationship between morphology, behavior, and the environment. Preliminary results demonstrate the ability of this system to produce morphologies well suited for obstacles in their environments (Fig. 15).

While our research produced an impressive array of diverse forms, it did use a target-based fitness objective, which can hinder search [38]. Switching to modern techniques for explicitly generating diversity, such as the MOLE algorithm by Mouret and Clune [24, 8] or algorithms by Lehman and Stanley [21], has the potential to create an incredibly complex and diverse set of morphologies and behaviors.

Figure 15: An example of a soft robot that has evolved "teeth" to hook onto the obstacle rings in its environment and propel itself across them.

Additionally, we are currently pursuing methods to minimize the need for expensive simulations and to evolve specific material properties instead of having a predefined palette of materials. These avenues are expected to allow increased complexity and diversity in future studies.

The HyperNEAT algorithm [32], which utilizes CPPNs, has been shown to be effective for evolving artificial neural network controllers for robots [9, 20, 6]. The same encoding from this work could thus co-evolve robot controllers and soft robot morphologies. Bongard and Pfeifer have argued that such body-brain co-evolution is critical toward progress in evolutionary robotics and artificial intelligence [26].

Soft robots have shown promise in multiple areas of robotics, such as gripping [15] or human-robot interaction [28]. The scale-invariant encoding and soft actuation from this work have potential in these other areas of soft robotics as well.

In order to compare different approaches, the field would benefit from general, accepted definitions and quantitative measures of complexity, impressiveness, and naturalness. Such metrics will enable more quantitative analyses in future work.

7. CONCLUSION
In this work we investigate the difficult-to-address question of why we as a field have failed to substantially improve upon the work of Karl Sims nearly two decades ago. We show that combining a powerful generative encoding based on principles of developmental biology with soft, biologically-inspired materials produces a diverse array of interesting morphologies and behaviors. The evolved organisms are qualitatively different from those evolved in previous research with more traditional rigid materials and either direct, or overly regular, encodings. The CPPN-NEAT encoding produces complex, life-like organisms with properties seen in natural organisms, such as symmetry and repetition, with and without variation. Further, it adapts to increased resolutions, numbers of available materials, and different environmental pressures by tailoring designs to different selective pressures without substantial performance degradation. Our results suggest that investigating soft robotics and modern generative encodings may offer a path towards eventually producing the next generation of impressive, computationally evolved creatures to fill artificial worlds and showcase the power of evolutionary algorithms.



8. ACKNOWLEDGMENTS
NSF Postdoctoral Research Fellowship in Biology to Jeff Clune (DBI-1003220), NSF Graduate Research Fellowship (DGE-0707428) to Robert MacCurdy, and DARPA Grant W911NF-12-1-0449. Thanks also to Jon Hiller for help with the VoxCAD simulator.

9. REFERENCES
[1] J. E. Auerbach and J. C. Bongard. Dynamic resolution in the co-evolution of morphology and control. In Artificial Life XII: Proc. of the 12th International Conf. on the Simulation and Synthesis of Living Systems, 2010.
[2] J. E. Auerbach and J. C. Bongard. Evolving CPPNs to grow three-dimensional physical structures. In Proc. of the Genetic & Evolutionary Computation Conf., pages 627-634. ACM, 2010.
[3] J. E. Auerbach and J. C. Bongard. On the relationship between environmental and mechanical complexity in evolved robots. In Proc. of the Artificial Life Conf., pages 309-316, 2012.
[4] J. E. Auerbach and J. C. Bongard. On the relationship between environmental and morphological complexity in evolved robots. In Proc. of the Genetic & Evolutionary Computation Conf., pages 521-528. ACM, 2012.
[5] J. Bongard. Evolving modular genetic regulatory networks. In Proc. of the 2002 Congress on Evolutionary Computation, volume 2, pages 1872-1877. IEEE, 2002.
[6] J. Clune, B. Beckmann, C. Ofria, and R. Pennock. Evolving coordinated quadruped gaits with the HyperNEAT generative encoding. In Proc. of the IEEE Congress on Evolutionary Computation, pages 2764-2771, 2009.
[7] J. Clune and H. Lipson. Evolving three-dimensional objects with a generative encoding inspired by developmental biology. In Proc. of the European Conf. on Artificial Life, pages 144-148, 2011.
[8] J. Clune, J.-B. Mouret, and H. Lipson. The evolutionary origins of modularity. Proc. of the Royal Society B, 280(20122863), 2013.
[9] J. Clune, K. O. Stanley, R. T. Pennock, and C. Ofria. On the performance of indirect encoding across the continuum of regularity. IEEE Trans. on Evolutionary Computation, 15(4):346-367, 2011.
[10] J. Gauci and K. O. Stanley. Generating large-scale neural networks through discovering geometric regularities. In Proc. of the Genetic & Evolutionary Computation Conf., pages 997-1004. ACM, 2007.
[11] J. D. Hiller and H. Lipson. Multi-material topological optimization of structures and mechanisms. In Proc. of the Genetic & Evolutionary Computation Conf., pages 1521-1528. ACM, 2009.
[12] J. D. Hiller and H. Lipson. Evolving amorphous robots. Artificial Life XII, pages 717-724, 2010.
[13] J. D. Hiller and H. Lipson. Automatic design and manufacture of soft robots. IEEE Trans. on Robotics, 28(2):457-466, 2012.
[14] J. D. Hiller and H. Lipson. Dynamic simulation of soft heterogeneous objects. ArXiv:1212.2845, 2012.
[15] S. Hirose and Y. Umetani. The development of soft gripper for the versatile robot hand. Mechanism and Machine Theory, 13(3):351-359, 1978.
[16] G. S. Hornby, H. Lipson, and J. B. Pollack. Generative representations for the automated design of modular physical robots. IEEE Trans. on Robotics and Automation, 19(4):703-719, 2003.
[17] G. S. Hornby and J. B. Pollack. Evolving L-systems to generate virtual creatures. Computers & Graphics, 25(6):1041-1048, 2001.
[18] M. Joachimczak and B. Wrobel. Co-evolution of morphology and control of soft-bodied multicellular animats. In Proc. of the Genetic & Evolutionary Computation Conf., pages 561-568. ACM, 2012.
[19] M. Komosinski and A. Rotaru-Varga. Comparison of different genotype encodings for simulated three-dimensional agents. Artificial Life, 7(4):395-418, 2001.
[20] S. Lee, J. Yosinski, K. Glette, H. Lipson, and J. Clune. Evolving gaits for physical robots with the HyperNEAT generative encoding: the benefits of simulation. 2013.
[21] J. Lehman and K. O. Stanley. Evolving a diversity of virtual creatures through novelty search and local competition. In Proc. of the Genetic & Evolutionary Computation Conf., pages 211-218. ACM, 2011.
[22] H. Lipson and J. B. Pollack. Automatic design and manufacture of robotic lifeforms. Nature, 406(6799):974-978, 2000.
[23] W. E. Lorensen and H. E. Cline. Marching cubes: A high resolution 3D surface construction algorithm. In ACM Siggraph, volume 21, pages 163-169, 1987.
[24] J.-B. Mouret and J. Clune. An algorithm to create phenotype-fitness maps. Proc. of the Artificial Life Conf., pages 593-594, 2012.
[25] S. Nolfi, D. Floreano, O. Miglino, and F. Mondada. How to evolve autonomous robots: Different approaches in evolutionary robotics. In Artificial Life IV, pages 190-197. Cambridge, MA: MIT Press, 1994.
[26] R. Pfeifer and J. C. Bongard. How the body shapes the way we think: a new view of intelligence. MIT Press, 2006.
[27] J. Rieffel, F. Saunders, S. Nadimpalli, H. Zhou, S. Hassoun, J. Rife, and B. Trimmer. Evolving soft robotic locomotion in PhysX. In Proc. of the Genetic & Evolutionary Computation Conf., pages 2499-2504. ACM, 2009.
[28] S. Sanan, J. Moidel, and C. G. Atkeson. A continuum approach to safe robots for physical human interaction. In Int'l Symposium on Quality of Life Technology, 2011.
[29] J. Secretan, N. Beato, D. B. D'Ambrosio, A. Rodriguez, A. Campbell, and K. O. Stanley. Picbreeder: evolving pictures collaboratively online. In Proc. of the 26th SIGCHI Conf. on Human Factors in Computing Systems, pages 1759-1768. ACM, 2008.
[30] K. Sims. Evolving virtual creatures. In Proc. of the 21st Annual Conf. on Computer Graphics and Interactive Techniques, pages 15-22. ACM, 1994.
[31] K. O. Stanley. Compositional pattern producing networks: A novel abstraction of development. Genetic Programming and Evolvable Machines, 8(2):131-162, 2007.
[32] K. O. Stanley, D. B. D'Ambrosio, and J. Gauci. A hypercube-based encoding for evolving large-scale neural networks. Artificial Life, 15(2):185-212, 2009.
[33] K. O. Stanley and R. Miikkulainen. Evolving neural networks through augmenting topologies. Evolutionary Computation, 10(2):99-127, 2002.
[34] K. O. Stanley and R. Miikkulainen. A taxonomy for artificial embryogeny. Artificial Life, 9(2):93-130, 2003.
[35] D. Trivedi, C. Rahn, W. Kier, and I. Walker. Soft robotics: Biological inspiration, state of the art, and future research. Applied Bionics and Biomechanics, 5(3):99-117, 2008.
[36] P. Verbancsics and K. O. Stanley. Constraining connectivity to encourage modularity in HyperNEAT. In Proc. of the 13th Annual Conference on Genetic and Evolutionary Computation, pages 1483-1490. ACM, 2011.
[37] M. Wall. GAlib: A C++ library of genetic algorithm components. Mechanical Engineering Department, Massachusetts Institute of Technology, 87, 1996.
[38] B. G. Woolley and K. O. Stanley. On the deleterious effects of a priori objectives on evolution and representation. In Proc. of the Genetic and Evolutionary Computation Conf., pages 957-964. ACM, 2011.
