VPH-FET: Future and Emerging Technologies for the Virtual Physiological Human
Support Action in FET Proactive – FP7-ICT-258087

VPH-FET Research Roadmap: Advanced Technologies for the Future of the Virtual Physiological Human


Advanced Technologies for the Future of the Virtual Physiological Human

Editors: Marco Viceconti, Gordon Clapworthy – September 2011

VPH-FET Research Roadmap: a research roadmap to identify future and emerging technologies necessary for the furtherance of the Virtual Physiological Human initiative. Written by a panel of international experts and coordinated by the VPH-FET consortium.

© 2011 VPH-FET consortium – all rights reserved


    Executive Summary

We, the community of experts that contributed to this document (see Annex #1 for a complete list), recommend that the European Commission issue Calls for Proposals in the area of FET Proactive that will target topics related to the Virtual Physiological Human.

We make this recommendation because biomedical research and clinical practice are finding it increasingly difficult to sustain the "rate of discovery" that has characterised biomedicine over the last two centuries. This is largely due to the fact that we are reaching the limits of causal reductionism, which has driven this research field so far. Causal reductionism implies that "the causes acting on the whole are simply the sum of the effects of the individual causalities of the parts", that is, if we can understand how the parts work, an understanding of how the whole works will naturally follow.

This excessive attention to the so-called upward causation (genes cause the cell behaviour, which causes the behaviour of the tissue, which in turn causes the behaviour of the organs, which subsequently defines the behaviour of our whole body) has for a long time overshadowed the strong evidence that, in most living organisms, upward causation coexists with mechanisms of downward causation. This term, introduced by Donald T. Campbell in the context of systems theory, can be summarised by the following statement: "the whole is to some degree constrained by the parts (upward causation), but at the same time the parts are to some degree constrained by the whole (downward causation)". Some of the most challenging research questions in biomedicine require that we investigate the biological processes involved in considering the human body as a whole, accounting for all interactions between the microscopic and the macroscopic levels, but this is not feasible with the methods and technologies currently available. We need to develop a new generation of technologies to make this dream of a holistic approach to biomedicine possible. This future technology is called the Virtual Physiological Human.

The Virtual Physiological Human (VPH) is "a methodological and technological framework that, once established, will enable collaborative investigation of the human body as a single complex system". First formulated in 2005, and the subject of several Calls for Proposals in FP7, with the earliest projects commencing in 2008, the VPH vision has been explored so far with particular attention to the so-called "low-hanging fruit" – specific clinical targets for which the existing core technologies could be developed and adapted to provide an integrative approach.

But the full VPH vision is rather more radical in nature, and while for some clinical targets it has now become possible to develop ICT solutions that provide a certain level of integration in the approaches adopted, there is consensus among researchers that the full-scale deployment of this radical vision will require equally radical technological innovation. The current evolutionary approach to VPH technology development, embedded within individual VPH applications, cannot produce this. Instead, we need to take a revolutionary approach to VPH technology development – to step back, consider the broader ongoing technological needs and address these head on.

To this purpose, the VPH-FET support action sustained a consensus process that involved over 150 experts worldwide, the primary output from which is this research roadmap on Future and Emerging Technologies for the Virtual Physiological Human.

In this roadmap, the panel of experts that participated in the VPH-FET consultation and consensus process delineated what are perceived by this research community as the grand challenges for blue-sky technological research associated with the full realisation of the VPH vision: that of a technology enabling a fully personalised and integrative investigation of human physiology in health and disease by means of computer modelling.

While the research targets described in this document seem quite different from each other, looking at them from a distance we can see a pattern emerging, a big picture that links these topics into a coherent whole.

Everything that is man-made craves simplicity. Occam's razor epitomises how the human mind reacts when faced with an infinitely complex reality – reduce it to simpler terms. The more we understand about life, the more we realise how far biological systems deviate from the ideal of functional perfection. In fact, processes in a biological system frequently involve a number of unnecessary or inefficient steps or imply a number of redundancies, and they always embody a certain level of randomness. In other words, all biological processes are inherently highly complex.

For years, biomedical research has tried hard to reduce these processes to simpler terms, achieving important results in the process. However, it is becoming increasingly evident that such an approach can take us only so far in our quest for knowledge: there are biological processes that cannot be fully explained if we reduce them to their parts; in these, the whole affects the parts while, in turn, the parts affect the whole, and only by embracing the complexity can we achieve a full understanding of the systemic nature of these processes.

Ultimately, this is the essence of the VPH vision: develop a new framework of methods and technologies that make it possible for the human mind to embrace the complexity at the root of many important pathophysiological processes. Given our human limitations, to embrace complexity we need to frame it, build it as a collective endeavour, and then tame it for our purposes. Frame, build and tame complexity; these are the three characteristic elements that somehow connect all eight topics discussed in this research roadmap in a larger picture:

– Framing the complexity: integrative modelling of human pathophysiology. We first need to create a global reference frame, which will allow us to organise data, information and knowledge that we accumulate about any biological system in a coherent and integrated way. Then, we need to represent, properly and generically, the transition from a healthy state to the diseased condition. Last, but not least, we need to find ways of translating what we observe and learn from observing other species into this global body of knowledge of human pathophysiology.

– Building the complexity: distributed and collaborative modelling of human pathophysiology. Human pathophysiology is infinitely complex, and can be tackled only as a collective endeavour. By building a web of predictive pathophysiological models that can be reused and composed freely, we can bring structure to this collective endeavour. Similarly, we can use information technology to radically modify the process of knowledge validation, also a collaborative process, by ensuring that studies involving computational models of pathophysiology can be reproduced and tested just as physical experiments are.

– Taming the complexity: interactive fruition of multi-input multi-output integrative models. However, all of the above would be futile if we could not employ the integrative knowledge that is being progressively and collectively assembled when trying to solve the problems of humanity. That is, we must also develop truly enabling technologies that will be capable of collecting and integrating disparate data and information somehow related to the health status of an individual, and transforming them into knowledge on how this status will evolve in the future. We need technology to explore this maze of complex knowledge to enable us to identify the proper leverage points or the specific actions that can heal or at least reduce the symptoms and the discomfort of our patients.

These are the goals that we believe the VPH research community should set itself over the coming years. Because of the impact that the development of such technologies would have not only on biomedical research and clinical practice, but also on other knowledge domains, we recommend that the European Commission should fund one or more calls for proposals in the area of Future and Emerging Technologies for the Virtual Physiological Human.

Acknowledgement
The VPH-FET project acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the ICT theme of the Seventh Framework Programme for Research of the European Commission.

    30 September 2011


Contents

Executive Summary

Introduction
  Motivations
  The VPH Community
  Consensus process
  Roadmap structure

1. Global space-time reference system for human biomedical data integration
  The vision
  The VPH motivation
  The challenge: Global Reference Body
  Challenge description
  Impact on Biomedicine
  General Impact
  Editorial Team

2. Transforming biological and physiological data across species
  The Vision
  The VPH motivation
  The challenge
  Impact on Biomedicine
  General Impact
  Editorial Team

3. Physio-environmental sensing and live modelling
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  General Impact
  Editorial Team

4. A software engineering approach to computational modelling research
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  General Impact
  Editorial Team

5. Disease modelling and virtual physiology
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  General Impact
  Editorial Team

6. Visual analytics for exploratory analysis of complex results sets
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  General Impact
  References
  Editorial Team

7. Data exploration and analysis in large heterogeneous imaging databases
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  General Impact
  Editorial Team

8. Web of models
  The Vision
  The challenge
  The VPH motivation
  Impact on Biomedicine
  Clinical Setting
  General Impact
  Editorial Team

Conclusions
  The big picture
  Tentative calls text

Annex #1: List of contributors


    Introduction


Motivations
The Virtual Physiological Human (VPH) initiative had its origins in the summer of 2005 and, since its inception, its primary focus has been the rapid deployment of innovative methods, products and services that incorporate a highly integrated, multiscale approach within the clinical context. While this focus on the final application is probably a significant factor in the success of the VPH initiative, which has attracted over €200m of public research funding, it has come at a cost.

The VPH vision, defined as "a framework of methods and technologies that, once established, will make it possible to investigate the human body as a whole", is somewhat radical in nature, and while for specific clinical targets it has now become possible to develop ICT solutions that provide a certain level of integration in the approaches adopted, there is consensus among researchers that the full-scale deployment of this radical vision will require equally radical technological innovation.

The current evolutionary approach to VPH technology development, embedded within individual VPH applications, cannot produce this. Instead, we need to take a revolutionary approach to VPH technology development – to step back, consider the broader ongoing technological needs and address these head on. This approach, involving "blue-sky" technological research whose impact may actually go beyond the specific context of VPH, will require a considerable change in the emphases that we, as a research community, place when defining the future VPH research agenda.

We need to move from a collective thinking that revolves around the question "what can we do with the available core technologies to bring the VPH vision into hospitals within five years?" to a more expansive perspective in which we ask ourselves "what are the fundamental technological challenges that we need to overcome in order to make the VPH vision come true, without any compromise or limitation?" In this process, we should anticipate both the need to bring into the VPH spectrum technologies and expertise not previously adopted in this context, and the expectation that progress made using these technologies to overcome problems inherent in VPH can provide results that ultimately prove fruitful in a variety of fields beyond VPH.

In summary, we need to identify a set of novel or newly emerging technologies that will assist the long-term development of the VPH, along with possible obstacles/bottlenecks that will constrain their adoption. This is the primary motivation of the project VPH-FET: Future and Emerging Technologies for the Virtual Physiological Human, which is a Support Action funded within the FET Proactive strand of the Future & Emerging Technologies priority of the FP7 ICT Programme. VPH-FET began on 1 September 2010, and its major deliverable is the VPH-FET Roadmap – the document before you. After 13 months of intense work, the community of experts who participated in the consensus process have proposed a number of Grand Challenges, which are presented in detail in the following pages.

The VPH Community
The VPH community is still quite young. It was initially formed around the activities associated with the FP6 Coordination Action STEP: A Strategy for the EuroPhysiome (FP6-ICT-027642), which concluded in March 2007, but the first EC-funded projects formally linked to VPH started only in the summer of 2008. In general, these projects involve biomedical researchers, bioengineers, computer scientists and clinicians. In these projects, the participants have had to address a significant number of challenges associated with broadening both the scope of the work involved in encompassing an integrative context, and the range of multidisciplinary involvement that this requires.

Thus, it is perhaps not surprising that, given the range of immediate problems faced by the initial VPH projects, the VPH community has had little previous contact with the FET Programme and, collectively, had a relatively low prior knowledge of the type of project that the FET Programme is designed to support. As a result, VPH-FET has had not only to identify the future technological directions but also to alert the VPH community to the means by which the technological advances could be achieved via the FET Programme.

Consensus process
As indicated in the discussion above, VPH-FET had several facets:
– stimulation: to awaken, in the VPH community, a common view about the long-term technological needs of the VPH, to motivate the members to embrace the technological challenges involved, and to develop an appreciation of how a successful outcome could be achieved
– insight: identify the specific expertise, probably in mathematics, computer science, etc., that is necessary to bring the plans to fruition
– outreach: attract people with that expertise to VPH and motivate them to become involved in the VPH:
  – as a source of problems requiring fundamental research
  – as an arena within which fundamental research outcomes can be applied and tested.

The first step in developing the consensus process was to try to establish a core set of issues from which to start the discussion and a small, but active, set of participants who would help to establish firm foundations from which the discussion could be driven forward and expanded into broader participation and a wider range of topics.


As mentioned above, it was important to encourage input from disciplines that had previously not had strong interaction with VPH, to give the opportunity for cross-fertilisation and the introduction of novel applications of technology to resolve VPH issues that may arise. Consequently, in addition to experts from within the VPH community, a number of experts from other disciplines were contacted with a view to participating in the roadmap development. We received only two refusals, from scientists who already had heavy commitments that prevented their participation, so we ended up with an initial group of 60 researchers, whose interests covered a wide area.

The experts were asked to suggest the types of novel technology that would help to accelerate VPH progress in the coming years. The returns from the experts were classified according to the type of researcher (technologist, modeller, VPH user) and area of activity (data integration, information integration, knowledge integration, automated testing) to which they most closely related. This produced a structure within which topics that were somehow related were clustered, enabling the subsequent discussions to be undertaken in a manageable way.

The experts were invited to a closed meeting in London in December 2010. The main part of the day consisted of 3 sessions in which an "animator" for each of the three categories of interaction took charge of the debate to identify a list of topics to be taken forward to the Internet discussions that would follow. In fact, the 26 research topics that emerged from the meeting were subsequently reduced to 21, due to some duplication and overlap.

A discussion forum was created on the VPH-FET pages on the Biomed Town web site and its presence was made public by widespread dissemination. Participation in the debates was open to all. The topics were introduced for discussion a few at a time to retain focus, according to a schedule covering the period February to June 2011 that was announced in advance. Some topics failed to elicit much interest, while others produced a lively debate. Some of the original experts encouraged colleagues and contacts who had relevant expertise to join the discussions, which proved very helpful.

Ultimately, 9 topics were selected for detailed discussion at the VPH-FET Conference, Technologies for the Future of the Virtual Physiological Human, which took place on 27 June 2011 in London. The conference, which had no registration fee, was open to all. Ahead of the conference, animators for 8 of the topics produced documents, which were circulated in advance to all registered delegates; these were discussion papers to aid the debates that would take place at the conference. The final topic, Uncertainty: Modelling and Visualisation, was one of the last to be introduced and it engendered an Internet discussion which suddenly took off just ahead of the conference, so it proved impractical to distribute a discussion paper for it.

At the conference, each topic had an individual session in which the animator laid the ground in a brief summary and then orchestrated a debate to elicit aspects of the problem that would prove fruitful for future investigation. The session also provided an opportunity to create a small editorial team who would assist the animator in producing an extended article on the topic. These articles form the various sections in the remainder of this document. Unfortunately, it did not prove possible to produce a section on Uncertainty. However, Uncertainty has considerable relevance to visual analytics and to data exploration in imaging datasets, so some discussion of this important topic has been included within Sections 6 and 7.

As the sections were completed, they were added to the draft roadmap and posted on Biomed Town to enable any interested party to make a further contribution if they wished.

Roadmap structure
Sections 1-8 below are the final versions of the articles produced on the various topics that emerged from the VPH-FET discussions as the most significant in terms of VPH-related Future & Emerging Technology.

The final section provides a summary of the features of the previous sections and draws some conclusions about how advanced technology can influence the future of the VPH. A draft Call for Proposals is also included to illustrate how the findings encapsulated in this Roadmap can be brought to fruition within the Future & Emerging Technologies programme of the ICT theme of the European Commission.


1. Global space-time reference system for human biomedical data integration

Chapter Editors: William Lalinde, Peter Hunter

The vision
One piece of pioneering work in this area was the Visible Human Project1. Using advanced cryogenic serial sectioning methods, a whole cadaver was sliced at 1mm thickness, and high-resolution digital images were taken for each slice. This very large digital data collection was then reconstructed into a 3D stack, which provided the first high-resolution mapping of human anatomy.

The VH project started the field of digital anatomy, which targeted mostly medical education services, such as the Visible Body on-line service2. While the project was considered a turning point when the data were released, we rapidly realised that what was missing was very important for most real-world applications. These limitations were summarised in the Living Human Project manifesto3: "To put it simply – he was only a single human, and he is dead. We do not know how he breathed, walked, swallowed, digested – the VH data totally lacks multiplicity and functionality". Today, there is a widespread consensus4 that the best way to provide virtual humans with these features involves the development of the so-called Virtual Physiological Human5, intended as a framework of methods and technologies that will make it possible to describe human physiology and pathology in a complete and integrated way.

In parallel, the increased sophistication of computer hardware and software has made it possible for synthetic human performers to be used in animations, games, etc. Today, this is a vital ICT research field in its own right, with various top-class labs entirely dedicated to it all over the world. Their research agenda is broad: examples, taken from the Virtual Human Interaction Lab at Stanford University6, include:
– Avatars and Behavioural Modelling
– The Proteus Effect: how the human tends to represent its avatar
– Haptic Communication in Social Interaction
– Facial Tracking and Emotion Abstraction in Communication
– Homuncular Flexibility: remapping neuromotor control
– Diversity Simulation: try to be someone else for some time.

In these areas, attention is focused increasingly on building synthetic humans that can talk, make facial expressions, act, or interact with us. While the motivation here is different from that of the VPH, as these synthetic humans become more sophisticated, so the points of common interest multiply. In order to improve digital characters, we need to make them move, talk, and behave more like real humans, so this leads to the computer modelling of human physiology and behaviour. On the other hand, in order to make a good computer animation of a car racing in the street, you do not need to model the internal combustion of the engine, so we must recognise that research on synthetic humans is generally more interested in modelling what you can see from the outside, rather than what happens inside the body.

A third dimension, more recent in conception, is that of using a virtual human visual interface to help users to navigate the maze of data, information, and knowledge that we are accumulating on the human body and its health. We can cite the BodyBrowser7 prototype that Google Labs has released, primarily as a demonstration of the capabilities offered by HTML5 and WebGL, of which Google is a strong supporter, or vpHUMAN8, a visualisation from anatomical to genetic models in an open portal guided by the VPH framework and the evolution of taxonomy standards. Now we can also extend this to BodyMedia and BioDigitalHuman. Another related aspect is the work being done by Antonio Criminisi's group at Microsoft Research Cambridge, UK, as part of the Inner Eye9 project, which aims to develop automatic segmenters for 3D medical images; these could be the basis for a search engine dedicated to the human body.

The VPH motivation
The VPH perspective on this matter is focused on the problem of integrating heterogeneous data, information and knowledge. In this light, the aforementioned research lines converge toward a common Grand Challenge: the definition of a global reference system for human biomedical data, information, and knowledge integration – namely, the Global Reference Body (GRB).

If we have such a global reference system, in principle every single bit of data, information and knowledge on the human body that is shared in digital form can be searched, browsed, and interactively explored, for a broad range of reasons, including leisure, education, healthcare, etc. In particular, with respect to biomedical research, the possibility to find, retrieve, and reuse all of the data, information and knowledge that is available on an organ, its functions and its physiological and diseased states could truly revolutionise the field, increasing exponentially the potential users for VPH technology.

The challenge: Global Reference Body
During the discussion, three aspects emerged to compose the challenge:
– dimensionality: how many dimensions should this "global space" have?
– user interface: how can we provide an interactive visualisation of such a complex space?
– knowledge management: how should the data, information, and knowledge be categorised, organised, annotated, etc., to allow it to be usefully and automatically exposed in such a global space?


Dimensionality
This is probably the most challenging aspect. At the current state of our understanding, such a system should consider the following dimensions:

D1) Space. We must account for how the body is organised in space and what references are being utilised (model or actual).

D2) Time. All physiological and pathological processes need to be described in the context of time.

D3) Idealisation. Initially this was proposed as a Scale dimension, but later revised as Idealisation of what is viewed at each scale. Key points are the transformation of what is visualised in each space dimension and the adoption of different idealisations to describe the same process at each scale, accounting for other processes that could not be observed at the next (higher/lower) scale.

D4) Population. At the origin of this axis there is an average generic human, then there are all the average humans that we obtain by clustering individuals with similar attributes (male, high blood pressure, over 70, etc.), and then each single individual in the world.

D5) Alterations. Possible alterations of space based on changes in the body anatomy, accounting for changes due to interventions or body modifications.

D6) Treatment. These are the possible actions that we (or our carers) can undertake to transform our health status.

User interface
D1) Space. While the amount of information and its complexity is considerable, the anatomical (spatial) representation of the human body and the user interfaces to properly interact with it are already at the centre of an intense research activity worldwide. Current solutions appear effective in a number of cases, in so far as we limit the interaction to a single dimensional scale.

D2) Time. Interactive visualisation of time-varying quantities is also quite established as a research topic, again in so far as we limit ourselves to a single temporal scale.

D3) Idealisation. The user interface issue is considerable, but to a certain extent it maps back to the problem of providing multiscale interactive visualisation, which is being explored by the MSV project10, at least with respect to space-time dimensions. A further aspect to consider is the role of reference ontologies that represent body structure in informing anatomy-driven user interfaces. Also, O2 Health Link, a Spanish company, has announced the Activa Central portal, which is intended to be a unique entry point to databases and tools on biomedical informatics, in this exceeding the current scope of Google Body Browser.

D4) Population. How can the difference between the multidimensional models of a set of individuals be defined in a mathematical manner usable in real time? No solution exists at this time that considers a continuous approach, e.g. defining the human face, body or any part of it as a continuous function through time and space. In such a case, morphing from one model to another would be nothing more than calculating their functional differences and creating a continuous transition function between them. However, it should be possible to address this problem also in the discrete domain (time and space not being continuous, but well defined at certain points and moments), supported with lower-dimensional data (such as images of the respective model/object).

In this case, our problem becomes more specific – how do we morph from one model to another if we are able to compute the differences between their 2D representations? This solution is evolving and requires agreed 2D perspective references, volumetric data and models but, as always, the challenge is to take into consideration the end user's available hardware resources, so that execution takes place at an acceptable level of "real time" (a toy sketch of such a transition function is given at the end of this subsection).

This approach also does not reflect the process of ageing and, with regard to the impact of morphing one part of the model on the model as a whole, further research still has to be conducted, with different "rules" having to be established and defined (partial morphing and interpolation of model parts, e.g. the neck when modifying the head). With respect to this, the VPH-FET experts established contact with Roni Zeiger, developer of the Google Body Browser at Google, who invited the panel to submit new specifications for future versions of the application. Thus, while it might be sensible to provide specifications for these endeavours as we develop the rest of the VPH-FET roadmap, it does not seem necessary to include this aspect in the roadmap itself. Still, it might remain relevant with regard to the ancillary problem of what users will want VPH data to provide and what type of interface will then be needed to allow them a suitable form of access.
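To make the continuous-transition idea above concrete, here is a minimal sketch in Python, assuming two point-matched shape models (all names are hypothetical; a real pipeline would work on registered meshes or agreed 2D perspective views rather than raw landmark arrays):

```python
import numpy as np

def morph(source: np.ndarray, target: np.ndarray, t: float) -> np.ndarray:
    """Continuous transition between two point-matched shape models.

    source, target: (N, 3) arrays of corresponding landmark positions.
    t: transition parameter from 0.0 (pure source) to 1.0 (pure target).
    """
    if source.shape != target.shape:
        raise ValueError("models must share the same point correspondence")
    # The "functional difference" between the models is their displacement
    # field; the transition function interpolates linearly along it.
    return source + t * (target - source)

# Example: ten intermediate frames between two registered face models.
# frames = [morph(face_a, face_b, t) for t in np.linspace(0.0, 1.0, 10)]
```

Partial morphing (e.g. the neck when modifying the head) would replace the single parameter t with a per-region weight field, which is exactly where the "rules" mentioned above would enter.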

Knowledge management
Research on the use of explicit knowledge representation (such as ontologies) to manage VPH resources is well underway within the VPH community. Collaborations between the VPH NoE11 and the VPH RICORDO12 project, for instance, aim to provide the VPH community with tools to effect, share and reason over resource annotations that map to an ontological representation of the body. This work sustains interoperability operations, based on the criterion of anatomical relatedness, ranging from automated dataset matching to model merging and managing complex simulation workflows.

Challenge description
If we accept the pyramid-of-knowledge approach that we used to partition the discussion topics, here too we can separate the problem of building a global reference system into three parts:

Data
Biomedical data involve a huge degree of heterogeneity, in type, accuracy, resolution, information content, etc. Most of the biomedical data generated by imaging or instrumentation, or derivable by processing these data, can be represented by a single super-type defined as a posed (positioned and oriented) and bounded multi-scalar field, where location, boundary and field are time-varying.
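As a rough illustration, the super-type could be rendered as a small data structure; this is a hypothetical sketch, not a VPH or FieldML standard:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class PosedBoundedField:
    """A posed, bounded, multi-scalar field sampled over time.

    pose:     (T, 4, 4) homogeneous transforms placing the local frame
              in the global reference space at each of T time points.
    boundary: (T, N, 3) time-varying boundary vertices in local coordinates.
    field:    (T, N, C) C scalar channels sampled at N points per time point.
    """
    pose: np.ndarray
    boundary: np.ndarray
    field: np.ndarray
```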

a. Space. The general problem is to transform the super-type, or any sub-type, to and from the space in which it was generated (by measure or computation) to the global reference space. These transformations are called registrations. The operational scenario imagined here would require that every time we add a dataset to the global collection, it is transformed to the global space, and the average representations are updated to include this new dataset. Since manual registration (based on manually annotated anatomical landmarks) is a tedious procedure, there is a vast amount of literature on automated methods. In most work, registration is formulated as an optimisation problem. Numerically, a spatial coordinate transformation is estimated that minimises some "cost function" measuring the residual misalignment of the data. While there is a huge body of literature on the registration of specific data types, currently existing automatic registration methods usually rely on many user-defined parameters, such as the choice of cost function, transformation model and optimisation method, which may have a large impact on the results. Configuration of these parameters is often more art than science and requires a deep understanding of the underlying methodology. A great challenge for the VPH-FET community is the development of a registration method that automatically configures itself based on data characteristics (noise, texture), shape and size of structures of interest, and performance constraints regarding computation time and accuracy. Having such an adaptive registration framework will enable us to circumvent the subjectivity associated with manual parameter optimisation (a toy sketch of the optimisation formulation is given after this list).

b. Time. The same operation in time is usually called synchronisation. The problem is conceptually similar to that of registration, but it has specific operational issues, for example regarding the best way to interpolate poses or deformations (changes of the boundary) over time.

c. Idealisation. At first glance, one could argue that data do not contain idealisations. The discussion could become very philosophical, as we notice that the way we observe the data always implies a degree of idealisation; but these reflections would probably take us far from the point. In practice, data generation methods are being developed that have such high resolution that they span two or more scales. In this sense, we can imagine the space being partitioned into consecutive scales, with only a portion of the dataset being exposed at each scale.

d. Population. Primary data should always refer to a specific individual. However, a very large body of data available in the literature is provided not as single values, but as average values over a population. In these cases, specific procedures are required to reconcile these "average" data with the single values. As this begins to be applied to personalised cases, the use may become more sophisticated and pose more challenging problems.

e. Care. The most relevant aspect is to be able to associate datasets that come from the same subject but over different times, during which a specific care pathway has been followed.
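As a toy illustration of the optimisation formulation sketched in point a. above, the following registers one 2D image to another by minimising a sum-of-squared-differences cost over a rigid transform (real pipelines would use dedicated registration libraries; everything here is a minimal sketch):

```python
import numpy as np
from scipy.ndimage import affine_transform
from scipy.optimize import minimize

def ssd_cost(params, fixed, moving):
    """Cost function: residual misalignment after a rigid transform."""
    theta, tx, ty = params
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    warped = affine_transform(moving, rotation, offset=(tx, ty), order=1)
    return np.sum((fixed - warped) ** 2)

def register(fixed, moving):
    """Estimate (theta, tx, ty) aligning `moving` to `fixed`.

    The choices hard-coded here (SSD cost, rigid transform, Powell
    optimiser) are exactly the user-defined parameters that an adaptive
    registration framework would have to configure automatically.
    """
    result = minimize(ssd_cost, x0=np.zeros(3), args=(fixed, moving),
                      method="Powell")  # derivative-free optimiser
    return result.x
```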

Information
The creation of a global reference system for information is primarily related to the existence of a standardised syntactic and semantic framework within which information is expressed. However, some special cases exist.

a. Space. A special case is related to concepts that, while expressing a space-time association, do so in a semi-quantitative, qualitative or inherently ambiguous way: concepts such as medial, proximal or apical, used to define anatomical locations, cannot by themselves be immediately translated into a precise space-time region in the global reference system; a manual mapping must be performed between sub-regions of the template and these concepts, and such a partitioning must be transformed back and forth against subject-specific data, to preserve the association.

b. Time. Frequently, time information is not present in the data, but is embedded in the associated information (e.g. metadata, or the patient folder). Such associations must be preserved whenever possible.

c. Idealisation. The type of idealisation used is information. As such, it should map directly on to this dimension.

d. Population. Similarly, the subject ID, or the clustering attributes used to define populations, are information. There is an interesting element, however. In a scenario of public databases, the biomedical information must be provided in anonymous format, which means that the only association to an individual is based on an ID number that can be reversed back to the true identity only by the institution authorised to treat these personal data, and then only in the cases considered by the local legislation, and when authorised by the subject him/herself. This typically implies a coding scheme (as in DICOM) in which there is a unique code for each institution, and then a unique code for each subject examined in that institution. This means that my data collected in two different hospitals could be presented anonymously as if they belonged to two different persons. While in many cases this is not a problem, in others losing the fact that these two observations refer to the same person is an essential problem. This points to a general ICT problem, Universal ID. There is a set of candidate technologies (OAuth, OpenID, Facebook Connect) that might be used to build a user-centric universal ID mechanism, in which the user can authorise such reconnection of information (a toy illustration of institution-scoped coding follows this list).

e. Care. If the care information is standardised into guidelines, or even better modelled as workflows, the problem is trivial. However, clinical practice is far from being so uniform; in reality, the description of the care received (from the subject information, e.g. the electronic health record) or of the care that can be provided (from literature, industry information, etc.) is far from being formalised and encoded, and it usually spreads over many databases, written in a myriad of formats and encodings, etc. Thus, the problem of transforming this information into a global care dimension poses major challenges.
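The institution-scoped coding problem of point d. can be illustrated with a toy pseudonymisation scheme (entirely hypothetical, not the DICOM mechanism itself):

```python
import hashlib

def pseudonym(institution_code: str, national_id: str) -> str:
    """Derive an anonymous subject code scoped to one institution.

    Only the institution holding the mapping can reverse the
    association to the true identity.
    """
    digest = hashlib.sha256(f"{institution_code}:{national_id}".encode())
    return f"{institution_code}-{digest.hexdigest()[:12]}"

# The same person visiting two hospitals receives two codes that cannot
# be linked without further authorisation:
code_a = pseudonym("HOSP-A", "19570312-1234")
code_b = pseudonym("HOSP-B", "19570312-1234")
assert code_a != code_b
```

A user-centric universal ID mechanism (e.g. built on OAuth or OpenID) would, in effect, let the subject authorise the re-linking of code_a and code_b.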

Knowledge
Statistical associations (or causal relationships) between different sources of data and information represent one kind of knowledge: "Atrophy of the hippocampus is associated with Alzheimer's disease", "The use of drug A causes side-effect B", etc. More generally, knowledge can be seen as a mapping between different domains (of different dimensionality): from blood pressure to the distribution of white matter lesions in the brain, from stenosis grade to stroke risk, from genotype to drug response. Ontologies represent a second kind of knowledge, and provide an explicit and independent identification of biomedical concepts and the relationships between them. Considerable progress has been made in developing reference ontologies for key domains in biology, including gene functions and processes, chemical entities, proteins, anatomy and phenotypes.

a. Space. As knowledge provides the link between different sources of data and information, it cannot always be pinned down to a single anatomical region. The global reference model should link a certain piece of knowledge to all the anatomical regions involved.

b. Time. Longitudinal studies result in knowledge that relates data/information at different time points.

c. Idealisation. A causal relationship (pathway) at one level of idealisation may translate into a statistical association at a higher level. For example, tumour cell death induced by a certain drug can be explained at a molecular level, but may also manifest itself as a change in local magnetic relaxation time, as measured by an MRI scan.

d. Population. Knowledge obtained on a population level should be translated back to a patient-specific report/diagnosis/prognosis.

e. Care. Practitioners should be able to upload patient-specific data/information and, with a few clicks, generate a report that summarises relevant knowledge that was extracted from the patient data, based on the models present in the system.

Addressing the volumetric GRB challenge: putting physiological data in a Global Reference Body
In this section, we discuss the question of how to link physiological data and models into the Global Reference Body, based on ideas being developed in the VPH RICORDO project. A key concept is that of 'embedded' or 'material' coordinates. A region of infarcted tissue in the heart, for example, occupies a unique position within the myocardial wall but moves in space during the beating heart cycle. To locate the infarct position relative to the anatomy of the heart, we introduce coordinates that are attached to the tissue (and so move and deform with the tissue). FieldML, an XML markup language standard for describing spatial (and temporal) fields that is being developed under the VPH project, contains all the concepts discussed here.

RICORDO is exploring the combined use of anatomical ontologies and volumetric image data analysis to provide patient-specific information to VPH modelling tools. This approach makes use of statistical radiological models (SRMs) – 3D representations of anatomical variation among individuals. These models have the crucial property of being able to map, for any individual under investigation, the equivalent anatomical location in the subject's radiological organ structure (for 2D images, 3D reconstructions, as well as time series). The volumetric annotation of such models with RICORDO anatomical IDs allows for the automatic labelling of an individual's anatomical features with standard ontological terms. This step creates the opportunity to:
– determine automatically the patient-specific shape and size of a functionally important anatomical feature, and pass that on to otherwise generic volumetric models of organ physiology
– link up with other VPHDMs annotated using the same ontological terms to integrate regional anatomical modelling with further information associated with that region (e.g. gene expression)
– establish the basis for the community-based annotation of SRMs, and explore the provision of public SRM repositories based on the concept pioneered by the Protein Data Bank (PDB) for X-ray and NMR molecular structures.

The work is part of the wider issue of interoperability across VPH resources, volumetric as well as non-volumetric (Figure 1.1). Whilst we focus on the issues across different types of volumetric resources, questions of how to link these to the rest of a VPH infrastructure are also being considered. In particular, recommendations for volumetric metadata models and standards need to be articulated with the larger VPH Toolkit development activities. The work is building on three existing representative efforts in the areas of 3D atlas frameworks, statistical volumetric models and computational volumetric models. Respectively, these are the Edinburgh Mouse Atlas Project (EMAP), the GIMIAS system and a computational heart model.

    Figure 1.1: Volumetric Data and Models as part of the wider VPH infrastructure (from the Ricordo project application).


These different representations are optimised for specific research and clinical applications; interoperability will enable full transfer of knowledge from the basic research resources through to computational modelling and clinical practice that can produce better understanding of the underlying systems biology, hypothesise targets for treatment and build the infrastructure for translational application of basic biological research. The project is establishing the standards and protocols needed for this level of interoperability by studying use cases selected to establish the mappings required between the different representations. These are modelling the specific behaviour of a patient heart – systems analysis and modelling utilise FieldML models to explore underlying biological processes, and statistical models are mapped to a standard atlas to query anatomy and volumetric-based resources.

The primary use case for exploration of the interoperability issues is based around clinical assessment, physiological modelling and basic biological research into heart function. This is, of course, one of the best developed cases in terms of physiological modelling, and is of major clinical relevance, as recent molecular biological studies have linked adult heart function and repair to developmental processes and regulatory networks.

The work is divided into 4 tasks, T6.1-T6.4, the first three of which focus on particular sub-problems, as illustrated in Figure 1.2.

A key question will be whether some form of canonical human atlas or a more abstract conceptual human anatomical space (illustrated as 'Atlas' in Figure 1.2) would be more beneficial for integration of volumetric VPH resources.

We establish embedded (material) coordinate systems in FieldML models that allow regions of tissue to be assigned FMA13 labels. This will be done for heart models as an example, but it is applicable to all tissues/organs in the body. These coordinates are illustrated in Figure 1.3 for the left and right ventricular myocardium of the heart. The lines shown in Figure 1.3 form three coordinates: h1 running circumferentially around the heart, h2 running from the base to the apex, and h3 running transmurally. Any material point can be labelled by a triplet of embedded coordinates (h1, h2, h3) and any material region, such as the posterior basal segment of the myocardial septum or the right ventricular free wall endocardium, both shown in Figure 1.3, can be labelled by a specified range of these coordinates.
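A minimal sketch of such range-based labelling in embedded coordinates follows; the region names echo the examples above, but all numeric bounds are invented for illustration:

```python
# Material regions of the myocardium defined by ranges of the embedded
# coordinates (h1: circumferential, h2: base-to-apex, h3: transmural).
REGIONS = {
    "right_ventricular_free_wall_endocardium":
        {"h1": (0.55, 0.95), "h2": (0.0, 1.0), "h3": (0.0, 0.1)},
    "posterior_basal_septum":
        {"h1": (0.10, 0.25), "h2": (0.0, 0.2), "h3": (0.0, 1.0)},
}

def label_point(h1: float, h2: float, h3: float) -> list:
    """Return every labelled region containing the material point (h1, h2, h3).

    Because the coordinates are attached to the tissue, the labels remain
    valid as the heart deforms through the beating cycle.
    """
    point = {"h1": h1, "h2": h2, "h3": h3}
    return [name for name, bounds in REGIONS.items()
            if all(lo <= point[axis] <= hi
                   for axis, (lo, hi) in bounds.items())]
```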

Figure 1.2: Volumetric Data and Models as part of the wider VPH infrastructure (from the Ricordo project application).

Physiological modelling of the heart has resulted in detailed FieldML models associating the computational model parameters within a defined coordinate frame associated with the topology and geometry of the heart. This curvilinear model coordinate frame can be mapped on to a standard Cartesian frame (see Task 6.1 in Figure 1.2). From clinical radiological imaging it is possible to build statistical models that represent the normal and abnormal variation of heart geometry within a given population. This model is a series of linked surface models that share node-matched triangulated-surface representations, which enables statistical analysis, for example principal components, to establish normal modes of variation. These then allow, via mapping of a new subject, tests of normality in a statistically significant fashion. In basic biology research, spatial data such as in situ expression patterns or protein distributions are captured and mapped in the context of models based on a simple volumetric image coordinate framework. These are overlaid with anatomy ontology markup (delineated anatomy) and can be navigated in direct 3D coordinates or via anatomical locations.

13 The Foundational Model of Anatomy – an ontology for human anatomy
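The statistical analysis step described above (principal components over node-matched surfaces, followed by a normality test for a newly mapped subject) can be sketched as follows (a minimal illustration, not the actual GIMIAS pipeline):

```python
import numpy as np

def build_shape_model(shapes: np.ndarray, n_modes: int):
    """PCA over node-matched surfaces.

    shapes: (S, 3N) array, one row of flattened vertex coordinates per
    subject; node matching guarantees row-wise correspondence.
    """
    mean = shapes.mean(axis=0)
    # SVD of the centred data yields the normal modes of variation.
    _, singular, modes = np.linalg.svd(shapes - mean, full_matrices=False)
    variances = singular**2 / (shapes.shape[0] - 1)
    return mean, modes[:n_modes], variances[:n_modes]

def abnormality(new_shape, mean, modes, variances) -> float:
    """Mahalanobis distance of a mapped new subject in mode space;
    large values flag geometry outside the normal population range."""
    coeffs = modes @ (new_shape - mean)
    return float(np.sqrt(np.sum(coeffs**2 / variances)))
```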


Figure 1.3: Embedded coordinates for the heart. (h1, h2, h3) label material points in the myocardium and move with the heart as it deforms. The red surfaces on the left represent the endocardial surfaces of the right and left ventricles; the diagram on the right is a cross-section of the heart model.

Note that, in order to reference the embedded coordinates for the heart with the embedded coordinates for any other organ in the body, a global Cartesian reference system (x1, x2, x3) is required. This is illustrated in Figure 1.4, where similar embedded coordinate systems are shown for the lungs, skin, skeleton and muscles.

Figure 1.4: Embedded coordinate systems for the heart, lungs, skin, bones and muscles, shown in relation to a Cartesian reference system in a standardised neutral body position.

Note that once the mapping from (h1, h2, h3)-space to (x1, x2, x3)-space is established for a given organ, other embedded coordinate systems can be established and used for labelling the same points. These mappings must be established in a convenient neutral position for the body, such as that shown on the RHS of Figure 1.4. To assist with annotation, we overlay these coordinates and the associated models with images to which the models are registered, so that the annotations can be applied in relation to image data. Organs are easier to model, while items like nerves and veins vary greatly in position. This is illustrated in Figure 1.5, where the heart model is shown registered to a human MRI image set.


Finally, as part of the RICORDO effort, the CellML and FieldML metadata standards are being developed to include model annotation with GO, FMA and other bio-ontologies.

Addressing the GRB knowledge challenge: the RICORDO ontology-based topological frameworks for the anatomical integration of health record data and physiology models14

Mechanistic models in the Virtual Physiological Human (VPH) domain describe physiology knowledge in a formal and quantifiable manner. In particular, such models depict physiological processes in terms of the influence that linked physiological variables have on each other (e.g. see Figure 1.6). A significant proportion of variables in physiology models represent either a direct biological measurement, or an inference calculated from these measurements. At the ontological level, most variables correspond to a quality (e.g. pressure, concentration, rate of some biological process, etc.) associated with an anatomical location (e.g. an organ, or one of its parts).

Figure 1.5: Registration of model to image in order to use additional information in the image to help define appropriate labels. A tool for doing this is shown on the right, using the cmgui component of the VPH visualisation toolkit.

Figure 1.6: An example of a functional dependency series of six variables (rounded boxes) extracted from the Guyton 1992 model, linked via five equations (circles). Independent variables are linked to equation nodes via input arrows, dependent variables via output arrows (therefore, PRA is both an independent variable to Eq2 and a dependent variable to Eq1). Each equation may involve more than one independent variable (hence the dotted vertical arrows). The free-text definition associated with each variable symbol (shown above each symbol) has been copied verbatim from the available documentation of the original model. AAR represents the arteriolar resistance in the kidneys. (ANP: Atrial Natriuretic Peptide).


Equations in mechanistic models formalise the quantitative effect that the value of one variable has on another. In such models, however, provision is rarely made to explicitly depict the anatomical connectivity that conveys the influence variables have on one another. In Figure 1.6, for example, the anatomical route conveying the effect of VRE (a quality of the right atrial lumen) on AAR (a quality of the kidneys) is not represented per se in the model from which the VRE-to-AAR mathematical relationships were derived. In such models, reference to anatomical location is made indirectly (conventionally via free-text comments associated with physiology variable metadata). The application of an explicit representation of anatomical connections is crucial to a number of key modelling tasks and, in particular, to support:
– the linking of different physiology models to create a single functional network of variables
– the adaptation of a physiology model for a disease scenario – a significant number of diseases may be defined in terms of a set of altered anatomical connections.
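The variable-equation structure of Figure 1.6 is naturally represented as a bipartite dependency graph; the sketch below is hypothetical, with only the Eq1/Eq2 links taken from the figure caption:

```python
# Equations consume independent variables and produce a dependent one.
EQUATIONS = {
    "Eq1": {"inputs": ["VRE"], "output": "PRA"},
    "Eq2": {"inputs": ["PRA"], "output": "ANP"},
    # ... further equations continuing the chain towards AAR
}

def downstream(variable: str) -> set:
    """All variables whose values ultimately depend on `variable`."""
    reached, frontier = set(), {variable}
    while frontier:
        frontier = {eq["output"] for eq in EQUATIONS.values()
                    if frontier & set(eq["inputs"])
                    and eq["output"] not in reached}
        reached |= frontier
    return reached

# downstream("VRE") -> {"PRA", "ANP", ...}
```

What such a graph cannot express, as the text above notes, is the anatomical route along which each influence is conveyed; that is precisely the gap the connectivity framework is meant to fill.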

Anatomical connectivity is a key organising principle for a broad spectrum of mechanistic biological knowledge, ranging from long-distance molecular interactions (e.g. endocrine signalling) to developmental biology (e.g. topological operations on anatomical relationships). Establishing an independent ontological framework depicting connectivity relationships between anatomical compartments would therefore support a number of integrative goals in physiology, pharmacology and medicine. In a pharmacological context, for instance, a shared standard for the representation of topological relationships between body compartments would act as a key design template for the construction of pharmacokinetic models, as well as the statistical analysis of anatomically aggregated datasets (e.g. safety and efficacy clinical trial data).

In physiology, a connectivity framework would provide a communal structural basis for the rational organisation of functional (i.e. mathematical) relationships between model variables and an independent compartmental framework on which to integrate individual models. The ongoing effort by the VPH community15 to annotate datasets and model variables using a communal core set of structural ontologies is therefore crucial in preparing VPH resources to make use of this type of connectivity framework. However, a key second step is required to achieve the above interoperability goals, namely: the consistent application of a multiscale compartmental connectivity framework across the entire core set of ontologies used by the VPH for annotation. The RICORDO project addresses this requirement.

We define compartmental connectivity as follows: two anatomical compartments are connected when a structural (i.e. anatomical) path between them is necessary for the occurrence of a process involving both sites. This core definition of connectivity is therefore established through the ontological integration of biological structure and process, explicitly bridging anatomy and physiology knowledge. Furthermore, the main types of anatomical connections that are essential to convey physiological interactions are classified in this work, based on the above definition applied to the core set of VPH ontologies discussed above. As connectivity categories also take into account biological size scale and topological features, such a framework will provide models with contextual information that is relevant to equations governing the interaction between physiology variables (e.g. boundary conditions, distributions of spatial measurements, etc.).
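Read operationally, the definition above turns connectivity into a reachability query over a graph of structural links between compartments; a minimal sketch, with all compartments and links invented for illustration:

```python
from collections import deque

# Directed structural (anatomical) links between compartments.
LINKS = {
    "right_atrial_lumen": ["right_ventricular_lumen"],
    "right_ventricular_lumen": ["pulmonary_arterial_lumen"],
    "pulmonary_arterial_lumen": [],
}

def connected(a: str, b: str) -> bool:
    """True when a structural path from `a` to `b` exists, i.e. when a
    process involving both sites could be conveyed anatomically."""
    seen, queue = {a}, deque([a])
    while queue:
        node = queue.popleft()
        if node == b:
            return True
        for nxt in LINKS.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

A full framework would additionally type each link (e.g. luminal continuity vs. endocrine transport) and attach scale and boundary-condition metadata, as described above.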

Furthermore, we describe the use of the above ontology-based framework to model and visualise key anatomical connections and physiological operations. Specifically, we illustrate:
– the application of this approach to integrate location knowledge pertaining to variables in cardiovascular physiology models on to a connected compartmental frame of reference
– the role of this methodology in the development of novel visualisation formalisms in support of the graphical navigation of, and interaction with, VPH models and clinical datasets. To this end, we show the application of this formalism to a clinical scenario drawn from the investigation datasets of endocrine disease involving a number of distinct anatomical locations (see Figure 1.7).

Human anatomy is central to the study and practice of medicine. A significant proportion of clinical symptoms, signs, phenotypes, diagnostic tests and medical procedures are recorded and described with reference to anatomical location. The VPH domain focuses on the mechanistic modelling of a significant number of functional relationships, as well as the statistical correlation of biomedical datasets, that pertain to distinct anatomical locations. Both types of study require an independent and explicit representation of compartmental connectivity that takes into account how physiological effects are conveyed between body sites. The connectivity framework discussed in this work draws upon key reference ontologies of biological structure to provide a widely accessible and scalable method for model integration and data analysis, and lends itself to visual interaction with biomedical information in a topologically-correct anatomical context.

14 This section is derived from an abstract presented by Bernard de Bono at the VPH2010 meeting in Brussels.
15 Through the joint effort of the VPH Network of Excellence (http://www.vph-noe.eu/) and the RICORDO project (http://www.vph-ricordo.eu/).



Impact on Biomedicine

The idea of a global reference system on which we can map all data, information and knowledge about the human body (Global Reference Body) could have a huge impact on biomedicine, at least at these three levels:

a) An interface to data. The driving consoles of modern cars, in order to present the complex information they receive from extensive electronic monitoring, represent the information space as… a car. Similarly, we can imagine a generic human body as the primary visual element around which to build the interaction between the users and all the information available. This mostly revolves around advanced interactive multiscale visualisation and human-machine interfaces.

b) A blender of information. Information is generally acquired in a fragmented way, and the Global Reference Body provides an integrative representation of such information. Here the accent is on integration – the ability to combine, integrate and fuse information in a synergistic way, and to return such fusion visually to the user, if and when useful. It is about knowledge management, data fusion, image processing, multimodal visualisation and visualisation of uncertainty.

c) An avatar of our knowledge. Literally, avatar means embodiment or manifestation, and the Virtual Physiological Human is embodied by the Global Reference Body. Here the accent is on the modelling of physiological and pathological processes and their representation in a way that fosters understanding, exploration, and possibly the production of new knowledge.

Figure 1.7: A topological graph for the visual organisation of clinical data elements relevant to the investigation of Cushing's Syndrome. In a clinical setting, the wealth of data garnered by diagnostic sampling (green circle symbols) and interventions (grey star symbols) is subsequently collated and integrated by physicians to infer the biological state of key anatomical sites (blue triangle symbols). By mapping VPH models onto the same anatomical connectivity framework that integrates medical data, this work further improves the clinical applicability of VPH modelling.

    Then there are three dimensions related to the use case:

a) Simulation: for training, retraining, rehearsal, and to support doctor-patient communication, we use the Global Reference Body to provide perceptually accurate information about the physiological and pathological processes, as well as about the procedures used to diagnose, treat or monitor the patient. "Perceptually accurate" means that what matters is the accuracy of the perceptual (visual, tactile, haptic, auditory, etc.) information exchanged with the user, not the quantification of the underlying pathophysiological processes. The knowledge represented is entirely clinical and completely generic (population-based).

b) Planning and decision support: here the Global Reference Body is used to take diagnostic, prognostic or treatment decisions that might involve the precise quantification of some specific indicators (predictors) that are recognised by the clinical user as the most important in driving that decisional process. The predictive accuracy of the models must be good, but it is limited to a few selected indicators. The knowledge represented is mostly clinical, although the precise quantification of the predictors might also require some physical, chemical, and biological knowledge. The knowledge represented is partially personalised, limited to the quantities that directly influence the predictors.

c) Explanatory medicine: the Global Reference Body is used as a source of knowledge, as well as a means for generating new knowledge. The Global Reference Body is used to explain why certain symptoms are observed, why a specific pathological scenario will produce certain outcomes, or why a certain treatment will resolve the pathological scenario. The knowledge represented is integral, in the sense that all available knowledge is captured into predictive models, properly integrated, and then exposed via the Global Reference Body. The knowledge is always subject-specific, although previous predictions can be used to estimate certain parameters when part of the information is not available.

This 3×3 matrix yields many combinations when we further discriminate by organ system, pathology, or stage of the clinical process (prevention, diagnosis, prognosis, treatment planning, treatment execution, monitoring, rehabilitation).



General Impact

The problems posed by the creation of a Global Reference Body, if solved, could open up very interesting applications in many other industrial or scientific sectors where the body of knowledge is large, heterogeneous, and complex.

For example, if we imagine the world financial market – where the data are all the financial transactions; the information is all the metadata on companies, funds, banks, financial instruments, etc.; and the knowledge is that of economists, but also that fixed by regulatory policies, trade rules, agreements, etc. – we end up with a scenario that is not much different in terms of size, heterogeneity and complexity.

    Editorial Team

    Name Affiliation

    Peter Hunter University of Auckland

    William Lalinde O2 Health Link

    Marco Viceconti Istituto Ortopedico Rizzoli

    Gordon Clapworthy University of Bedfordshire

    Rainer von Ammon Centrum für Informations-Technologie Transfer

    Stefan Klein Erasmus University

    Bernard de Bono European Bioinformatics Institute

    Richard Christie University of Auckland



    Chapter Editors: Ralph Müller, Nic Smith

2 Transforming biological and physiological data across species




The Vision

In the introduction to his seminal book "What is Life?", first published in 1944, Erwin Schrödinger wrote: "We are only now beginning to acquire reliable material for welding together the sum total of all that is known into a whole; but on the other hand it has become next to impossible for a single mind to command more than a small specialised portion of it."

The acquisition of quantitative and high-quality biological data has continued over the last six decades, and Schrödinger's insight remains salient today. In parallel, improvements in the performance per unit cost of high-performance computing (HPC), together with the numerical techniques that underpin much of the VPH, now provide powerful tools for Schrödinger's process of "welding", or mapping, this biological knowledge.

A key component of this mapping is the translation of insights gained from animal measurements into human-relevant contexts. Such translation is not only a fundamental goal of a wide array of life-science disciplines but is also, in many cases, an underlying assumption. In traditional wet-lab research, this assumption is often so ingrained that where, or how, specific animal data can inform human-focused clinical practice is not even articulated. While this is clearly an issue, the ongoing developments in high-throughput measurement techniques and genetically modified animals (in particular, mice) are providing new opportunities to address it. Specifically, the collection of large and unique datasets both provides the means to address this issue directly, and increasingly requires it to be addressed.

More than five decades after Schrödinger's insight, Feri et al. (Mol Cancer Ther June 2006 5:1550) wrote: "Translating modeling results from mouse to human is a challenging problem that is partially addressed by allometric scaling. Some physiologic, species-specific model parameters (e.g., volume flow rates) can be translated across species using the allometric equation, an empirical power law that relates the biological parameter to body weight."

This points to mathematical models being parameterised using animal data, "scaled up" to the human, and then used to predict the biodistribution of the same drug for clinical applications – a theme echoed by many researchers, including Geoffrey West and Jim Bassingthwaighte, among others, e.g. Baxter et al. (Cancer Res. 1995 Oct 15;55(20):4611-22): "Other interspecies differences that can potentially affect drug pharmacokinetics, such as production and clearance rates of free antigen in blood and binding to target antigen in normal tissues, can be accounted for by making structural changes to the model when scaling from mouse to human."

This citation exemplifies two of the three strategies that are currently used to cope with the problem of translating to humans what we observe in animal models:

    a) ignore the problem and pretend that a man is a mouse;

b) reduce the problem to an isometric or allometric scaling problem (see the sketch after this list);

c) refactor the model so as to account for the different mechanisms observed in the two species, while retaining the same input-output set, which makes it possible still to use the two models as a transformation function.
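As a minimal illustration of strategy (b), the sketch below applies the allometric power law y = a·Wᵇ quoted above to translate a parameter between body weights; the exponent and the example values are illustrative assumptions, not figures from this roadmap.

```python
# Minimal sketch of allometric inter-species scaling (strategy b).
# Assumes the parameter obeys the empirical power law y = a * W**b in both
# species, so the coefficient a cancels in the cross-species ratio.
# The exponent and example values below are illustrative assumptions.

def allometric_scale(y_animal, w_animal_kg, w_human_kg, exponent=0.75):
    """Scale a physiological parameter from animal to human by body weight."""
    return y_animal * (w_human_kg / w_animal_kg) ** exponent

# Example: a volume flow rate measured in a 25 g mouse, scaled to a 70 kg human.
flow_mouse = 0.014  # L/min, illustrative value
print(allometric_scale(flow_mouse, 0.025, 70.0))  # ~5.4 L/min
```

The limits of this strategy are exactly those discussed in the text: the power law only re-scales a parameter whose governing mechanism is assumed identical in both species; it cannot capture structural differences, which is what strategy (c) addresses.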

The VPH motivation

The development of mathematical models using VPH frameworks and tools presents the opportunity to quantitatively assimilate data from multiple sources. Furthermore, because eukaryotic physiology is highly conserved, the structure of existing examples of VPH-style animal-based models is very similar. This means that, using species-, temperature- and genetically-consistent animal models, we may ultimately be able to track parameter changes through the phylogenetic trees to predict human parameters and to quantify the relevance of an animal result or disease model to an individual or to a human population. If this were actually possible, we would be able to tap into the vast pool of data generated for mouse and other animal models to build models of physiology, including molecular markers of genomic and proteomic origin. Such information is readily available for all mouse work, including genetically modified animals, funded by the National Institutes of Health (NIH) in the United States (http://www.nih.gov/science/models/mouse/sharing/), as NIH-funded investigators are strongly encouraged to make these data public in standardised repositories after initial publication in peer-reviewed journals.

As its name suggests, the VPH primarily targets humans. However, it is clear that there are observations that cannot be made in human subjects, primarily because of ethical limitations. Such ethical limitations are fewer for animals, making these observations possible using a number of animal models. Many features of human biology at the cell and molecular levels are shared across the spectrum of life on earth; our more advanced organism-based characteristics are shared in a more limited fashion with other species. At one extreme are a small number of human characteristics (brain functions and behaviour) that are shared by no other species or, at most, by primates. However, at a lower level, there are characteristics that are shared only with mammals. In this context, the importance of mice in genetic studies was first recognised in the biomedical fields of immunology and cancer research, for which a mammalian model was essential. Although it has been obvious that many other aspects of human biology and development should be amenable to mouse models, until recently the tools just did not exist to allow for a genetic dissection of these systems.


The movement of mouse genetics to the forefront of modern biomedical research was catalysed by the recombinant DNA revolution, which began 30 years ago. With the ability to isolate cloned copies of genes and to compare DNA sequences from different organisms came the realisation that mice and humans, as well as all other placental mammals, are even more similar genetically than was previously thought. An astounding finding has been that all human genes have counterparts in the mouse genome, which can almost always be recognised by cross-species hybridisation. Thus, the cloning of a human gene leads directly to the cloning of a mouse homolog, which can be used for genetic, molecular, and biochemical studies that can then be extrapolated back to an understanding of the function of the human gene. Although the haploid chromosome number associated with different mammalian species varies tremendously, the haploid content of mammalian DNA remains constant at approximately three billion base pairs.

It is not only the size of the genome that has remained constant among mammals; the underlying genomic organisation has also remained the same. Large genomic segments (on average, 10-20 million base pairs) have been conserved virtually intact between mice, humans and other mammals. In fact, the available data suggest that a rough replica of the human genome could be built by simply breaking the mouse genome into 130-170 pieces and pasting them back together again in a new order. Although all mammals are remarkably similar in their overall body plan, there are some differences in the details of both development and metabolism, and occasionally these differences can prevent the extrapolation of mouse data to humans, and vice versa. Nevertheless, the mouse has proven itself over and over again as being the model experimental animal par excellence for studies of nearly all aspects of human genetics.

Besides the strong homology in the genome, the mouse is, among mammals, ideally suited for genetic analysis for several other reasons. First, it is one of the smallest mammals known; second, it has a short generation time – on the order of 10 weeks from being born to giving birth; third, females breed prolifically in the laboratory, with an average of 5-10 pups per litter; fourth, an often forgotten advantage is the fact that fathers do not harm their young and that laboratory-bred strains are relatively docile and easy to handle. Finally, investigators are even able to control the time of pregnancies.

The close correspondence discovered between the genomes of mice and humans would not have been sufficient to drive researchers into mouse genetics without the simultaneous development, during the last decade, of increasingly sophisticated tools to study and manipulate the embryonic genome. Today, genetic material from any source (natural, synthetic or a combination of the two) can be injected directly into the nuclei of fertilised eggs; two or more cleavage-stage embryos can be teased apart into component cells and put back together again in new "chimeric" combinations; nuclei can be switched back and forth among different embryonic cytoplasma; embryonic cells can be placed into tissue culture, where targeted manipulation of individual genes can be accomplished before these cells are returned to the embryo proper. Genetically altered live animals can be obtained subsequent to all of these procedures, and these animals can transmit their altered genetic material to their offspring.

Progress has also been made at the level of molecular analysis within the developing embryo. With the polymerase chain reaction (PCR) protocol, DNA and RNA sequences from single cells can be characterised, and enhanced versions of the somewhat older techniques of in situ hybridisation and immuno-staining allow investigators to follow the patterns of individual gene expression through the four dimensions of space and time.

Finally, with the automation and simplification of molecular assays that have occurred over the last several years, it has become possible to determine chromosomal map positions to a very high degree of resolution. Genetic studies of this type are relying increasingly on extremely polymorphic microsatellite loci, to produce anchored linkage maps, and on large-insert cloning vectors, to move from the observation of a phenotype, to a map of the loci that cause the phenotype, to clones of the loci themselves.

All of these techniques provide the scientific community with the ability to search for answers to the many questions posed. This will invariably lead to more questions, but the potential is there to elucidate the mechanisms of many diseases and to realise effective treatments. The use of one species, such as the mouse, also allows the generation of full physiome maps in a relevant animal model, as has recently been performed within VPHOP (www.vphop.eu), an integrated project funded in the 7th Framework Programme of the European Commission. In this project, the effects of anabolic (parathyroid hormone; PTH) and antiresorptive (bisphosphonate; BIS) treatment, and of mechanical loading, on vertebrae in a mouse model of postmenopausal osteoporosis were investigated in longitudinal fashion using in vivo high-resolution imaging. The ultimate goal of this project was to combine all of the results in a physiome map that allows direct comparison of all physiological effects on a standardised time axis.

For the purpose of the study, bone loss was induced in 15-week-old female C57Bl/6 mice by ovariectomy (OVX). Four weeks of treatment was started either 5 or 11 weeks after OVX, and included a combination of mechanical loading (0 N or 8 N on the 6th tail vertebra at 3000 cycles and 10 Hz, 3x/week) and either injection of zoledronate (100 µg/kg once), PTH (80 µg/kg daily), or the corresponding vehicle. An additional group of sham-operated animals was subjected to the mechanical loading regime. Bone microarchitectural parameters of the 6th tail vertebra were assessed by in vivo micro-CT before OVX, at treatment start, and after 2 and 4 weeks of treatment. To create the physiome map, the values at treatment start (2nd point) were normalised to the basal measurement for each mouse. The effect of the treatment was calculated with regard to the treatment start, and the percentage change was scaled according to the main scale of the first measurement. Figure 2.1 shows the map of the trabecular bone volume fraction, but corresponding maps were also created for both the cortical and the trabecular morphometric parameters.
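The sketch below illustrates one plausible reading of this normalisation: values at treatment start are expressed relative to each mouse's basal (pre-OVX) measurement, and the treatment effect is then plotted relative to treatment start on the same basal scale. The function name and the numbers are hypothetical; the exact scaling used in VPHOP may differ in detail.

```python
# Hypothetical sketch of the physiome-map normalisation described above.
# Both returned values are percentages of the basal measurement, so groups
# can be compared on a single, standardised axis.

def physiome_map_segment(basal, at_treatment_start, after_treatment):
    """Return (start, end) of one treatment segment on the map's vertical axis."""
    start = 100.0 * at_treatment_start / basal
    # Treatment effect computed relative to treatment start, scaled to basal.
    change = 100.0 * (after_treatment - at_treatment_start) / basal
    return start, start + change

# Example: BV/TV drops from a basal 0.20 to 0.15 at treatment start,
# then recovers to 0.19 after four weeks of treatment.
print(physiome_map_segment(0.20, 0.15, 0.19))  # -> (75.0, 95.0)
```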

All OVX mice showed a significant bone loss of 24% and 30% at 5 and 11 weeks after surgery, respectively. Mechanical loading resulted in 16-28% of bone gain. While zoledronate stopped bone loss and slightly increased BV/TV, PTH treatment restored the values to the basal level. The combination of pharmaceutical treatment and mechanical loading was additive in the BIS groups, while a slightly synergistic effect was found in the PTH early-loading group. The physiome map allows a fast overview of the relative efficacy of each intervention and presents a powerful tool to compare the different groups on a single time axis. By including results from future longitudinal animal studies, the physiome map can easily be extended.

    Figure 2.1: Physiome map in a mouse model of human osteoporosis.

The problem is that what we observe in these mouse models needs to be translated into something that will help in understanding human physiology. Sometimes this translation is straightforward, but in most cases it is not; while it may be difficult, it is essential that it is carried out properly. If a model needs inputs that cannot be observed directly in patients, and thus requires the use of estimates from animal models, then unless we perform a correct transformation there is a significant risk that these data will turn out to be highly inaccurate when used to make predictions for humans.

One option would be to rely more on phylogenetic closeness: for humans, this would involve the use of primates with a particularly close genetic make-up as the animal model. Unfortunately, this works only to a certain extent (primarily for the same reasons that motivate the VPH approach to biomedicine). In addition, the use of animal models based on species that are closer to humans tends to raise greater ethical concerns and always involves much higher costs.


An extension of the phylogenetic closeness strategy is to develop multiple models across different genetic strains, populations and species, to track parameter variation, both to identify missing human parameters and to quantify the relevance of animal insights in human contexts. The capacity of mark-up languages such as CellML and FieldML to unambiguously represent and share biophysically based models has the potential to support this process (Terkildsen et al., Experimental Physiology 2008). However, achieving this goal will require the connection between model parameters and the experimental data on which they are based to be transparent. The maintenance of this link is also important when models are coupled or components are reused. This is because this information is key to identifying the sources of model uncertainties introduced, often unavoidably, by combining data from different species or temperatures.

Making model parameter and experimental data links both clear and updatable is thus fundamental to defining the context within which model results can be interpreted in a way that is relevant to the physiological system. This will, in turn, reduce the cases in which the failure of a model to represent experimental data reflects an inappropriate structure or parameterisation, thereby increasing the likelihood that such a failure identifies a genuine gap in understanding; this can, in turn, lead to new insights being gained. For these reasons, a future challenge for the VPH is to develop integrated databases of mathematical and experimental models focused on consistent sets of data in terms of species, temperature and experimental conditions.
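One possible, purely illustrative way of keeping such a link clear and updatable is to store provenance metadata alongside each parameter value, as in the sketch below; the Parameter structure, its field names and the citation shown are assumptions for illustration, not an existing VPH or CellML standard.

```python
# Illustrative sketch: a parameter carries its experimental provenance, so
# species/temperature mismatches can be flagged when models are reused.

from dataclasses import dataclass

@dataclass
class Parameter:
    name: str
    value: float
    units: str
    species: str          # e.g. "mouse", "human"
    temperature_K: float  # temperature at which the value was measured
    source: str           # citation or dataset identifier

# Hypothetical entry; the source identifier below is a placeholder.
g_Na = Parameter("fast sodium conductance", 13.0, "mS/uF",
                 species="mouse", temperature_K=298.0,
                 source="doi:10.0000/example-dataset")

def flag_mismatches(params, species, temperature_K):
    """Return parameters whose provenance does not match the model's context."""
    return [p for p in params
            if p.species != species or p.temperature_K != temperature_K]

# A human, body-temperature model reusing g_Na would flag it for review.
print(flag_mismatches([g_Na], species="human", temperature_K=310.0))
```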

Thus, it is important to develop robust and reliable methods to accurately transform observations performed on animals into their equivalent for humans. In much contemporary biomedical research, observations made on mice are directly translated to humans, without any further consideration. This is to forget that animal models are models, no less and no more than computer models; like any other model, they have their own limits of validity. Such limits, however, could be significantly extended if we could consider not only those observables that essentially do not change from mouse to man, but also those that do transform, but according to some transformation law that we can model. In such a case, the model used to translate the observations to the human would become a combination of the animal model and the computer model.

The challenge

Three challenges emerged from the discussion:

    a) space, time and space-time transformations

b) parametric adjustment for inter-species integrative models

    c) compositional transformations into integrative models

Space, time and space-time transformations

The most evident difference between a mouse and a man is the body shape and posture. In many cases, these anatomical differences are relevant to developing a correct transformation of the observation made on the mouse into the human.

The simplest case is the spatial transformation of a single organ, which in many cases is homothetic. In this case, well-defined elastic registration algorithms are available.
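For illustration, a homothetic transformation is simply a uniform scaling about a fixed centre; the sketch below shows this special case with toy geometry. The function and the example points are illustrative assumptions – real inter-species mapping would rely on the elastic registration algorithms mentioned above.

```python
# Sketch of the homothetic special case: every point p is mapped to
# centre + ratio * (p - centre), i.e. a uniform scaling about a centre.

import numpy as np

def homothety(points, centre, ratio):
    """Apply a homothetic (uniform scaling) transformation to a point set."""
    points = np.asarray(points, dtype=float)
    return centre + ratio * (points - centre)

# Example: scale a toy three-point 'organ surface' (mm) 4x about its centroid.
organ = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
centroid = organ.mean(axis=0)
print(homothety(organ, centroid, 4.0))
```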

A more complex case is when the transformation involves a change of both shape and posture. This means that we are morphing a complex portion of the anatomy, one composed of parts that can move relative to each other while remaining connected. For each individual part, the transformation would still be homothetic, but the connectivity negates this condition. Some work is currently being done on the heart and on the skeleton, where global and local elastic and rigid registrations are performed and combined cleverly, but we are far from a general solution to the problem.

A third case is when the transformation, even for a single component, is not homothetic. A typical, though rather trivial, case is when the skeleton of one species has two distinct bones where the other has a synostosis – a fusion that forms a single bone. Here, no general approaches are available, but some attempts are being made to incorporate evolutionary adaptation logic into the transformation to derive rules for the spatial remapping.

While temporal transformations in themselves are usually not a challenge, owing to the development of signal processing methods which allow accurate synchronisation, in most cases the combination of space-time transformation (motion) remains quite challenging. In computer animation, there are some interesting methods for what is called "motion re-targeting", which in principle could be explored, and eventually adapted, to solve this problem.

Parametric adjustment for inter-species integrative models

Once a complete space-time transformation is available, we need to transform the observations we make in the animal to the human. If such observations are not merely anatomo-functional or chemo-physical, such a transformation involves a much greater problem – transforming the causation.

As outlined above, given the conservation of physiology between species, the causation that links two observables can often be assumed to be the same in both species. In these cases, we can use the same integrative model for both the animal and the human, and the problem is reduced to a multidimensional re-parameterisation.
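The sketch below illustrates this idea under strong simplifying assumptions: a single shared model structure (a one-compartment clearance model, chosen only for brevity) evaluated with species-specific parameter vectors. All values are hypothetical.

```python
# Sketch of re-parameterisation: the same model structure (same causation)
# is evaluated with species-specific parameter vectors.

import numpy as np

def drug_concentration(t, dose, volume, clearance):
    """Shared one-compartment model: C(t) = (dose/V) * exp(-(CL/V) * t)."""
    return (dose / volume) * np.exp(-(clearance / volume) * t)

t = np.linspace(0.0, 12.0, 5)  # hours

# Same equations, different parameter vectors (hypothetical values).
mouse = dict(volume=0.002, clearance=0.001)  # L, L/h
human = dict(volume=42.0, clearance=21.0)    # L, L/h

print(drug_concentration(t, dose=0.01, **mouse))   # mg/L in the mouse
print(drug_concentration(t, dose=500.0, **human))  # mg/L in the human
```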





Central to achieving an effective re-parameterisation is the maintenance of a tight link between model parameters within modelling frameworks and experimental data. This link is fundamental for underpinning and defining the context within which a model component can be used. In contexts in which model components can be appropriately reused without substantial re-parameterisation, coupling methods can be applied to quickly develop new models to study an enlarged system of interest. This powerful feature of component reuse and knowledge capture within biophysically based modelling frameworks is increasingly being applied within the VPH community. However, a difficulty of this approach is that the inheritance of model components, due to repeated reuse, now often extends through multiple generations of a given class of models. This has, paradoxically, resulted in the obfuscation of the link between parameter values and the original experimental data sources. Without clear provenance of sources, the identification of inherited parameters requiring updating with more recent measurements is problematic, and the relevance of the model can be eroded.

Thus, as model complexity increases in parallel with available data, processes to identify, assess and critique the basis of model parameterisation must be developed. Specifically, new tools are required to analyse the models themselves to ensure a transparent link between models and experimental data. The application of these tools will be critical in supporting the iterative model development process and in guiding a productive, cyclic collaboration between experimentalists and modellers. Again, our expectation is that this will reduce the cases in which the failure of a model to represent experimental data is due to an inappropriate structure or parameterisation, and this will increase the probability that the failure is the result of a gap in understanding with respect to the specific physiology of the species being investigated. This process will be important for providing a robust modelling foundation for mapping both parameters and results between species.

The incorporation of data from multiple species within VPH models focused on interpreting specific animal experiments or human clinical data remains a challenge in many areas of the VPH. To provide a tangible demonstration of this, Figure 2.2 shows the wide range of data sources behind a well-known model of cardiac myocyte electrophysiology.

The result is, in many cases, indicative of other models. Once a model is fully defined and validated on the animal, in which both the inputs and the outputs are observable, it is used to predict the outputs (or the inputs, using inverse identification) for the human, for which only the inputs (or the outputs) are observable.

Figure 2.2: The citation (or "genetic") tree of modelling and experimental studies which were used to define parameters for the Bondarenko mouse ventricular action potential model. Species dependence is defined by colour codes: blue – mouse, purple – guinea pig, pink – dog, yellow – bullfrog, light blue – rabbit, red – human, green – ferret, orange – sheep, grey/light pink – squid, brown – starfish (images from the work of Liren Li).


Compositional transformations into integrative models

A much more complex problem is when the causation that links sets of observables involves different mechanisms in the two species, i.e. the underlying model is different. This is where we expect the VPH approach to make a greater impact.

Integrative models are predictive models made of component idealisations and of relation idealisations defining the relations between components. In the classic VPH approach, each idealisation is captured in a separate sub-model, so we have component sub-models and relation sub-models, which together form an integrative model.

The idea is to develop an integrative model for each species, using an abductive cycle that starts from the scale at which the phenomenon to be predicted is observed, and adds component sub-models describing processes at larger and smaller scales, until the two integrative models expose the same set of external inputs and outputs, in spite of the fact that internally they are formed by different sets of component sub-models.
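The sketch below is a deliberately schematic illustration of this idea: two integrative models assembled from different internal sub-models, yet exposing the same external input-output set, so that one can stand in for the other as a transformation function. All class names and mechanisms are hypothetical placeholders.

```python
# Schematic sketch: species-specific integrative models with different
# internal sub-models but an identical external predict(inputs) -> outputs
# interface. The mechanisms below are hypothetical placeholders.

class IntegrativeModel:
    def __init__(self, sub_models):
        # Component and relation sub-models, applied in order.
        self.sub_models = sub_models

    def predict(self, inputs):
        state = dict(inputs)
        for sub_model in self.sub_models:
            state.update(sub_model(state))
        # Both species expose the same external output set.
        return {"output_marker": state["output_marker"]}

# Mouse: two internal sub-models mediated by an intermediate quantity.
mouse_model = IntegrativeModel([
    lambda s: {"intermediate": 2.0 * s["dose"]},
    lambda s: {"output_marker": 0.5 * s["intermediate"]},
])
# Human: a single, direct mechanism.
human_model = IntegrativeModel([
    lambda s: {"output_marker": 0.9 * s["dose"]},
])

print(mouse_model.predict({"dose": 1.0}), human_model.predict({"dose": 1.0}))
```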

Impact on Biomedicine

The impact on biomedicine would be dramatic. Today, it is quite common to develop, based on animal models, new pharmacological compounds that are expected to produce certain effects, only to find out, after long, potentially dangerous, and hugely expensive clinical trials, that this is not the case in humans. Any technology that could reduce this problem, even if only marginally, would have a dramatically positive impact on the pharmaceutical and medical device industries.

General Impact

The problem of predicting something about a relevant system, based on observations made on a simpler but similar one, appears to be of general relevance; it is possible to imagine applications in fields such as telecommunications, finance, socio-economic modelling, etc.



    Editorial Team

    Name Affiliation

    Ralph Müller ETH Zurich

    Nicholas Smith King’s College London

    Marco Viceconti Istituto Ortopedico Rizzoli





    Chapter Editors: Filippo Castiglione, Vanessa Diaz

3 Physio-environmental sensing and live modelling
