Special theme: Mathematics for Finance and Economy



European Research Consortium for Informatics and Mathematics
www.ercim.org

ERCIM NEWS Number 78, July 2009

Also in this issue:

Keynote by Michael Dempster

Joint ERCIM Actions: ERCIM celebrates 20th Anniversary in Paris

R&D and Technology Transfer: OpenViBE: Open-Source Software for Brain-Computer Interfaces

Special theme:

Mathematics for Finance and Economy



Keynote

Quantitative Finance and the Credit Crisis

“Guns don’t kill people, people kill people.” (US National Rifle Association).

The thoughts and research of a long line of social philosophers and economists attest to the fact that the current financial crisis is just the latest in centuries of frequently occurring events. Indeed, beginning with Adam Smith in the eighteenth century, Marx, Mill, Marshall, Wicksell, Fisher, Keynes, Schumpeter, Minsky and Kindleberger have all studied the causes and consequences of financial crises which have usually been appropriately 'global' in their time. While perhaps not yet generally agreed, the current truly global crisis, now hopefully in its final 'resolution' stage, fits virtually perfectly the stages of crisis discussed in Kindleberger's famous book in an historical context. Three potential causes, usually promoted by different communities and interests in isolation, have been blamed for the current situation: 'globalization' in the form of financial imbalances between developing exporting and developed importing nations, improper regulation of macroeconomic policy and markets by governments and central banks, and greedy 'Anglo Saxon' bankers developing ever more complex derivative products in search of personal and corporate profits. The truth is that these three potential causes have all contributed to the current situation and all must be addressed appropriately in its resolution. As a mathematician who has worked with the global financial services industry over the past two decades, I wish to address the third issue here.

First, I would ask the reader to reflect on the fact that banking is the latest of the major industries to go 'high tech' in the sense of extensive day-to-day employment of mathematics and informatics. This follows military logistics in the second world war (operations research), the oil industry, aerospace, manufacturing, airlines, logistics, even film making (Lucas Labs). Although beginning with the breakdown of the Bretton Woods fixed currency exchange rate system and the precise valuation of equity options in the early 1970s, 'modern' banking is at most a matter of the past twenty years.

In no case that I know of, including the nuclear industry, has government regulation – independent of whether or not it is justified – succeeded in reversing technological progress. Nevertheless people in high places are currently considering regulation based on the widely quoted opinions of the world's most successful investor, Warren Buffett, Chairman of Berkshire Hathaway, that 'derivatives are weapons of mass destruction', and of Lord Turner, Chairman of the UK Financial Services Authority, who blames 'excessive reliance on sophisticated mathematical models' for the crisis. The true explanation of the role of derivative products in its development is more complicated.


Editorial Information

ERCIM News is the magazine of ERCIM. Published quarterly, it reports on joint actions of the ERCIM partners, and aims to reflect the contribution made by ERCIM to the European Community in Information Technology and Applied Mathematics. Through short articles and news items, it provides a forum for the exchange of information between the institutes and also with the wider scientific community. This issue has a circulation of 10,000 copies. The printed version of ERCIM News has a production cost of €8 per copy. Subscription is currently available free of charge.

ERCIM News is published by ERCIM EEIG, BP 93, F-06902 Sophia Antipolis Cedex, France
Tel: +33 4 9238 5010, E-mail: [email protected]
Director: Jérôme Chailloux
ISSN 0926-4981

Editorial Board:
Central editor: Peter Kunz, ERCIM office ([email protected])
Local Editors:
Austria: Erwin Schoitsch ([email protected])
Belgium: Benoît Michel ([email protected])
Denmark: Jiri Srba ([email protected])
Czech Republic: Michal Haindl ([email protected])
France: Bernard Hidoine ([email protected])
Germany: Michael Krapp ([email protected])
Greece: Eleni Orphanoudakis ([email protected])
Hungary: Erzsébet Csuhaj-Varjú ([email protected])
Ireland: Ray Walsh ([email protected])
Italy: Carol Peters ([email protected])
Luxembourg: Patrik Hitzelberger ([email protected])
Norway: Truls Gjestland ([email protected])
Poland: Hung Son Nguyen ([email protected])
Portugal: Paulo Ferreira ([email protected])
Spain: Christophe Joubert ([email protected])
Sweden: Kersti Hedman ([email protected])
Switzerland: Harry Rudin ([email protected])
The Netherlands: Annette Kik ([email protected])
United Kingdom: Martin Prime ([email protected])
W3C: Marie-Claire Forgue ([email protected])

Contributions
Contributions must be submitted to the local editor of your country.

Copyright Notice
All authors, as identified in each article, retain copyright of their work.

Advertising
For current advertising rates and conditions, see http://ercim-news.ercim.org/ or contact [email protected]

ERCIM News online edition
The online edition is published at http://ercim-news.ercim.org/

Subscription
Subscribe to ERCIM News by sending email to [email protected] or by filling out the form at the ERCIM News website: http://ercim-news.ercim.org/

Next issue: October 2009, Special theme: Green ICT



It should also be noted that these simple but popular opinions fly in the face of Mr Buffett's recent investment in Goldman Sachs and the widely acknowledged role of the Basel international capital requirements in the explosion of off-balance-sheet entities set up by the banks to securitize and structure credit derivatives, most of which ended in disaster. The law of unintended consequences in regulation remains a significant force to be reckoned with!

Good scientists know the truth of Box's maxim that "all models are wrong, but some models are useful". Financial models are no exception. The seminal contributions of Samuelson and his associates, Black, Scholes and Merton, in pricing equity options – albeit under somewhat unrealistic assumptions – are based on the concept of dynamic market equilibrium involving no arbitrage opportunities, using a continuously rebalanced portfolio of cash and the underlying stock to replicate the option's value. This approach is the perfect example of Keynes' maxim that "it is better to be approximately right than precisely wrong" and is the principle upon which the pricing and hedging of current complex derivative products are based in the equity, foreign currency exchange and fixed income (government bond) markets. Moreover, the models involved have by and large continued to function properly throughout the crisis.
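For readers who want the replication argument in symbols, here is a minimal textbook-style sketch (standard notation, not taken from this article). With the stock following $dS_t = \mu S_t\,dt + \sigma S_t\,dW_t$, hold the option and short $\Delta = \partial V/\partial S$ shares, so the portfolio is $\Pi = V(S,t) - \Delta S$. Itô's lemma gives

\[
d\Pi = \left(\frac{\partial V}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2}\right)dt,
\]

which is riskless, so the absence of arbitrage forces $d\Pi = r\Pi\,dt$ and hence the Black-Scholes equation

\[
\frac{\partial V}{\partial t} + \frac{1}{2}\sigma^2 S^2 \frac{\partial^2 V}{\partial S^2} + rS\frac{\partial V}{\partial S} - rV = 0.
\]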

Alas this is not the case in the credit (mortgage, loan and corporate bond) derivative markets, which have grown over the past decade since their inception by the JP Morgan bank from a few tens of billions of US dollars to $160,000 billion today, nearly three times global GDP. Throughout, both academics and practitioners have been criticizing, as mere "curve fitting" to market data, the basic Gaussian copula model and its variants used to value over-the-counter (OTC) tranches of synthetic collateralized debt obligations (CDOs) written on indices of either European or US corporate bonds. As a result, hedging these widely traded derivatives in the absence of theoretical approximate hedges is notoriously difficult and shows that for these products more rather than less complication is needed. The hedging problem for CDOs has led to the unfortunate linking of the banking and insurance industries, involving firms such as AIG and the so-called "monoline" insurers, who wrote tens of billions of dollars of insurance in the form of credit default swaps (CDSs) on corporate bonds which in the end has had to be honoured by the US government.

The recent history of Citigroup, two years ago the world's largest bank and now effectively owned by the US taxpayer, under its previous CEO, Charles "while the music's still playing you've got to get up and dance" Prince, exemplifies the failure of top banking management to understand the true complexities of the structure of – and the risks being run by – their organizations. To quote the late Peter Bernstein, long-time Wall Streeter, before the demise of the Bear Stearns investment bank in March 2008, "We have passed through a period where the appetite for returns was so voracious that considerations of riskiness were significantly downgraded" – the undeniable signature of a bubble. But there were even worse anomalies in the derivatives markets as a consequence of the top-heavy incentive structures in banks ignored in the bubble period.

My own recent experience concerns valuing structured fixed income and FX products for litigation brought by continental European individuals, corporations and local authorities against their trusted banks. It has been said that "30% of derivatives are 'bought', and 70% are 'sold'". The implication is that 30% of instruments are traded between 'consenting adults' who are technically equipped to evaluate the risks involved, but the other 70% are sold to investors "who have no idea what they are buying". The role, in the larger proportion of OTC derivative trades, of sales and structuring staff who set the parameters of new complex products – many now on the street – must be distinguished from the role of 'quants' who create their valuation and hedging tools in the first place. This egregious aspect of 'financial engineering' of derivatives needs to be examined going forward.

It is not as they say 'rocket science' to require by regulation the visual presentation of the risks involved to the client, which would have precluded purchase in the first place of many 'toxic' assets so revealed as grossly unfair. This of course might call into question the profitability of investment banking in the future – even if the Glass-Steagall act of 1933 separating commercial and investment banking were to be reinstated. With President Obama's current advisors – who were instrumental in the demise of the act in Bill Clinton's second term – this is currently an unlikely event. Nevertheless the proposed risk transparency might eventually lead to the support of true financial engineering, over product structuring at the client's expense, and encourage the widespread valid use of derivatives in reducing global risk.

Whatever happens one thing is certain, 'high tech' banking is here to stay!

Michael A H Dempster
Centre for Financial Research, Statistical Laboratory, University of Cambridge & Cambridge Systems Associates



Contents

2 Editorial Information

KEYNOTE

2 Quantitative Finance and the Credit Crisis
by Michael A H Dempster

JOINT ERCIM ACTIONS

6 ERCIM Celebrates 20th Anniversary in Paris
by Peter Kunz

9 ERCIM Working Group Joint Meeting: Dependable Embedded Systems and Formal Methods for Industrial Critical Systems
by Erwin Schoitsch

SPECIAL THEME

Mathematics for Finance and Economy
coordinated by Ralf Korn, University of Kaiserslautern and Fraunhofer ITWM, Germany

Introduction to the Special Theme
10 Modern Mathematics for Finance and Economics: From Stochastic Differential Equations to the Credit Crisis
by Ralf Korn

Invited articles
12 The Quality of Quantitative Finance
by Paul Wilmott
14 Balancing Theory and Practice: Modern Financial Mathematics at Fraunhofer ITWM
by Marlene Müller

Theoretical Finance
16 Fundamental Modelling of Financial Markets
by László Gerencsér, Balázs Torma and Zsanett Orlovits
18 Macroeconomic and Financial Scenarios for Cost/Risk Analysis in Public Debt Management
by Massimo Bernaschi, Maria Francesca Carfora, Albina Orlando, Silvia Razzano and Davide Vergni
19 Rehabilitating Fractal Models in Finance
by Miklós Rásonyi
20 Improving Banks' Credit Risk Management
by Xinzheng Huang and Cornelis W. Oosterlee
22 Fast Pricing of Hybrid Derivative Products
by Cornelis W. Oosterlee and Lech A. Grzelak
23 Modelling and Optimization of Fixed Income Investments
by Jacek Jakubowski and Andrzej Palczewski

Applications in Economics
24 Valuing CO2 Emission Allowances with Stochastic Control
by Olivier Davidau, Mireille Bossy, Nadia Maïzi and Odile Pourtallier
25 Revenue Management in the Parking Industry: New Business Opportunities
by Rob van der Mei, Kevin Pak, Maarten Soomer and Ger Koole
27 SNIPER: A Data Mining Methodology for Fiscal Fraud Detection
by Stefano Basta, Fosca Giannotti, Giuseppe Manco, Dino Pedreschi and Laura Spinsanti
29 Enterprise Network Dynamics Explored through Computational Modelling
by Marco Remondino and Marco Pironti
30 A New Approach to Explicable Sales Forecast Models for the German Automobile Market
by Marco Hülsmann, Christoph M. Friedrich and Dirk Reith
31 Multiple Dimensions in Resource Allocation and Investment Decisions
by João Oliveira Soares

Software/Hardware for Applications
33 Premia: A Numerical Platform for Pricing Financial Derivatives
by Agnès Sulem and Antonino Zanette
34 Accelerating Applications in Computational Finance
by John Barr
35 Simulating Large Portfolios of Credit: The CreditCruncher Project
by Gerard Torrent

Closing Paper
36 Mathematics for Economics: A Statistical Mechanics Perspective
by Pierluigi Contucci and Francesca Romiti

R&D AND TECHNOLOGY TRANSFER

38 The ASPIRE Project – Sensor Networks for Immersive Multimedia Environments
by Athanasios Mouchtaris and Panagiotis Tsakalides
39 OpenViBE: Open-Source Software for Brain-Computer Interfaces
by Anatole Lécuyer and Yann Renard
41 Adaptive Committees of Feature-Specific Classifiers for Image Classification
by Fabrizio Falchi, Tiziano Fagni and Fabrizio Sebastiani
42 3D MEDIA: A Unique Audio and Video R&D Project Spanning Wallonia
by Jacques G. Verly, Jerome Meessen and Benoit Michel
43 Model-Based Early Warning and Decision Support to Improve Infrastructure Surveillance
by Francesco Flammini, Andrea Gaglione and Concetta Pragliola
45 Evolutionary Testing for Complex Systems
by Tanja E. J. Vos
46 The WPU Project: Web Portal Usability
by Janne Jul Jensen, Mikael B. Skov and Jan Stage
47 ADOSE: New In-Vehicle Sensor Technology for Vehicle Safety in Road Traffic
by Jürgen Kogler, Christoph Sulzbachner, Erwin Schoitsch and Wilfried Kubinger

EVENTS

47 Announcements

51 In Brief


Joint ERCIM Actions

ERCIM Celebrates 20th Anniversary in Paris
by Peter Kunz

On 28 May, in conjunction with the ERCIM Spring Days meetings in Paris, ERCIM and INRIA organised ERCIM's 20th anniversary celebration. The anniversary was marked by a seminar including presentations by renowned personalities from research and industry, and representatives from the ERCIM community.

Set in the heart of the historical Le Marais district, 'Les Jardins du Marais' was the venue not only for the celebration, but also for the ERCIM Board of Directors and Editorial Board meetings, several ERCIM Working Group meetings and two European project meetings, namely InterLink and EuroIndia. In all, 180 attendees participated in ten meetings over three days.

Anniversary Seminar
The highlight of the anniversary celebration was the seminar held on the afternoon of 28 May. It was organised into three sessions: 'Science and Society', 'ERCIM Activities' and a 'historical round table'. Following the welcome speeches by the ERCIM President, Keith Jeffery, and President d'honneur, Cor Baayen (a founding father of ERCIM), Wendy Hall (University of Southampton) gave the first talk. As President of the Association for Computing Machinery (ACM), a position to which she was elected in July 2008 as the first person from outside North America, Hall presented plans to create a European chapter of the ACM. She then talked about the new discipline known as 'Web Science', the main concern of the Web Science Research Initiative that she recently founded together with Tim Berners-Lee, Nigel Shadbolt and Daniel J. Weitzner. The goals of Web Science are to promote and encourage multidisciplinary collaborative research, to study the development of the Web, to provide a global forum enabling academia, government and industry to understand the scientific, technical and social factors that drive the growth of the Web, and to encourage innovation.

Mazin Yousif (Avirtec Inc. CEO, former Intel executive and chair of the ERCIM Advisory Committee) talked about 'green computing'. When looking at the worldwide energy consumption of data centres, for example, the importance of this topic is obvious. The energy consumption of all data centres worldwide is similar to the entire energy consumption of a country like Argentina or The Netherlands. Reducing energy consumption relies on four pillars: new technology (new materials, virtualisation of computing facilities etc), efficiency (energy-efficient equipment, minimising resources executing workload etc), conservation (energy caps, resource consolidation, certificates) and operations (advanced cooling technology, best practices, holding IT accountable for cost etc). This is an area in which ERCIM could take the lead, promoting green computing in Europe, encouraging the establishment of a Green Computing Working Group, drafting a green computing research vision, encouraging green computing initiatives, and working with European standardisation bodies and the EU to expand the code of conduct.


Left column (from top): Cor Baayen, Philipp Hoschka, Dimitris Plexousakis. Right column: Wendy Hall, Gerard Berry, Michal Haindl. All photos by N. Fagot, © INRIA.


Gerard Berry (INRIA) delivered a remarkable speech about his personal experience of teaching computer science to school children. Unlike the current generation of teachers and scientists, children today grow up with technology forming a natural part of their life. These days, there is little emphasis on teaching children the concepts of how computers work. Yet getting today's children enthused about scientific subjects such as computer science is crucial to ensuring a good cohort of engineers and scientists tomorrow. As Berry pointed out, what we must do is educate children in such a way that they choose to develop from consumers into creators.

Philipp Hoschka (W3C) spoke about the work of the W3C (of which ERCIM is the European host) and progress towards the Ubiquitous Web. Ubiquitous computing is now at an inflection point, where both hardware and software offer solutions for applications. Recent examples include the 'Sekai Camera' that can display and create 'floating air tags' when a user is navigating through a museum, and 'SixthSense', developed by the MIT Media Lab, a wearable gestural interface that augments the physical world with digital information and lets us use natural hand gestures to interact with that information. W3C is currently working in fields such as geolocation, camera API and widgets. More work is needed on model-based user interface design and device-to-device communication. Hoschka concluded that mobile Web applications will play an important role for ubiquitous computing and that they are a good opportunity for European research and industry to get involved.

Top (from left): Walter Weigel, Keith Jeffery, Alessandro Fantechi, Mazin Yousif.
Bottom left: Seminar audience.
Bottom right: Andreas Rauber.

Walter Weigel, Director General of ETSI, the European Telecommunications Standards Institute, gave a talk entitled 'Standardisation of research results'. ERCIM and ETSI have developed a well-established cooperation, through five years of shared Grid technology experience, partnerships in European projects, and through the joint 'Infinity Initiative', a series of annual advanced seminars. Weigel explained the key challenges for successful innovation. With innovation cycles accelerating, market segmentation demands faster product and service takeoff and thus an increased investment in R&D. However, it is not sufficient simply to be innovative, or to master the innovation environment: the crucial question is how to ensure that an innovative advantage is not wasted. This is why work in standards is so important for transforming innovation leadership into market leadership. 'First movers' can set the standards and use them to impose their market leadership. Standardisation is also an important 'market access tool'. As Weigel explained, if we look at standardisation as a business process, successful innovation then needs to manage the increasing 'overlaps' in the 'research', 'test and development' and 'product' phases. A closer link is therefore needed between research and standardisation.


Opening the second session of the seminar, Alessandro Fantechi (University of Florence and ISTI-CNR) gave a presentation on the Formal Methods for Industrial Critical Systems (FMICS) Working Group as an example of an ERCIM success story. The Working Group was created in 1992 by Stefania Gnesi and Diego Latella (CNR), following an initial successful workshop in Pisa, Italy. Although model checking was then in its early days, the FMICS community was already aware of the great potential of formal verification techniques. Since then, the Working Group has advanced with the development of formal verification techniques and model checking in particular. The series of annual workshops, started in 1996, is now a well-established conference in the Formal Methods community. It has promoted an ongoing scientific discussion focused on identifying the most efficient verification techniques, with a keen eye on their industrial applicability. Most members of the FMICS community have strong links with industry; this has aided in the gradual introduction over the last decade of formal methods into the development cycle of industrial critical systems.

'Beyond the Horizon' (BTH) and INTERLINK are two examples of how ERCIM has affected EC strategy, and were presented by Dimitris Plexousakis (ICS-FORTH). BTH - Anticipating Future and Emerging Information Society Technologies, a project carried out in 2005-06, defined the major challenges and promising research directions in ICT-related strategic areas. Support for these was provided through a well-organised, extensive and systematic consultation of the relevant research community in Europe. The project was composed of six thematic groups investigating the topics 'Pervasive Computing and Communications', 'Nanoelectronics and Nanotechnology', 'Security, Dependability and Trust', 'Bio-ICT Synergies', 'Intelligent and Cognitive Systems' and 'Software Intensive Systems'. The intended impact was to enhance Europe's reactivity to emerging scientific and technological challenges, to build research communities and research networks in these fields, to encourage interdisciplinary research, and to increase industry's awareness of new trends, challenges and visions in information society technology-related research. As a result, a large majority of the proposals were adopted as proactive initiatives in the Future and Emerging Technologies Work Programme of the EU 7th Framework Programme.

INTERLINK - International Cooperation Activities - takes a similar approach. This project had several missions: to identify and address globally relevant basic research problems where significant added value is expected to be gained from worldwide cooperation; to establish communication and cooperation mechanisms within and beyond Europe to support the formation and functioning of a related scientific community; to identify complementarities in selected thematic areas among EU and non-EU countries that can give rise to knowledge and technology exchange; and to define joint basic research agendas, road-mapping activities and joint RTD initiatives. The thematic areas of INTERLINK are 'Software intensive systems and new computing paradigms', 'Ambient computing and communication environments', and 'Intelligent and cognitive systems'. The outcomes of the project are comprehensive state-of-the-art reports and research roadmaps in the three thematic areas, as well as a number of new proposed research themes, including 'ensemble engineering' and 'socially-aware ambient intelligence'.


"Historical round table" composed of two former representatives of ERCIM's Board of Directors, Executive Committee and Editorial Board members. From left: Alain Bensoussan, Paul Williams, Bob Hopgood, Georges Nissen, Henk Nieland and Siegfried Münch.

Left: Seminar participants.
Below (from left): Keith Jeffery, Alain Bensoussan, Cor Baayen and Gerard van Oortmerssen.


Both projects were carried out under the guidance of, and in collaboration with, the European Commission. As with the BTH project, it is expected that INTERLINK will have an impact on the European Commission's work programmes. It is also expected that the INTERLINK results will be noted by the research programs of the US Congress on Robotics and the Japanese Science Foundation on Cognitive Systems.

The session on ERCIM activities ended with presentations from Andreas Rauber (TU Vienna/AARIT) and Michal Haindl (Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic/CRCIM), who reported on their personal experiences and views of the Cor Baayen Award and the ERCIM Fellowship Programme respectively. Andreas was the 2002 Cor Baayen award winner; Michal was one of the first three ERCIM fellows in 1990, and since then has hosted an ERCIM fellow himself. In a very diverting and personal manner, they demonstrated how these two ERCIM activities contribute to an increase in scientific reputation, scientific community building and the establishment of international contacts.

The seminar ended with a 'historical round table' composed of two former representatives of ERCIM's Board of Directors, and members of the Executive Committee and Editorial Board, namely Alain Bensoussan, Paul Williams, Bob Hopgood, Georges Nissen, Henk Nieland and Siegfried Münch. They explained how ERCIM began in 1989, discussed the early achievements and visions and answered questions from the audience.

The celebration continued into the evening with a gala dinner to which the participants were welcomed by Keith Jeffery and Michel Cosnard, President of INRIA. The dinner set the scene for speeches from former ERCIM presidents Gerard van Oortmerssen and Cor Baayen, presenting what they have contributed to ERCIM as presidents, what has been achieved during their term of office, and their recommendations for the future.

More information:
http://paris2009.ercim.org

ERCIM Working Group Joint Meeting: Dependable Embedded Systems and Formal Methods for Industrial Critical Systems
by Erwin Schoitsch

The ERCIM Working Groups Dependable Embedded Systems (DES) and Formal Methods for Industrial Critical Systems (FMICS) seized the opportunity at the 20th anniversary event of ERCIM to have a joint technical meeting. Experiences were exchanged, projects and scientific work were presented, and further cooperation was discussed, especially in light of the successful jointly edited special theme 'Safety-Critical Software' in ERCIM News 75. The meeting took place on 27 May 2009 in Paris.

The agenda covered the following issues:
• Radu Mateescu (INRIA): 'Specification and Analysis of Safety-Critical and Biological Systems'
• Jaco van de Pol (UTwente): 'High-Performance Model Checking for Embedded Systems'
• Wolfgang Herzner (ARC): 'Dependable Networked Embedded Systems Research at ARC: Overview on European Research Projects which are related to ERCIM DES WG Topics'
• Amund Skavhaug (NTNU): 'The Creation of a Special Interest Group for Embedded Systems at NTNU'
• Pedro Merino Gómez (UMA): 'Software Model Checking'
• Alessandro Fantechi (ISTI): 'Modelling Guidelines for Code Generation in Safety-Critical Systems'
• Erwin Schoitsch (ARC): 'ProSE – or How to Get Research Involved in Standardization'.

Possible cooperation in the editing of a book on safety-critical software was also discussed. The request for such a book came from Cambridge University Press following the above-mentioned special theme of ERCIM News 75.

The next workshops planned for the ERCIM DES Working Group are:
• 'Special Session: Dependable Embedded Systems' at Euromicro-SEAA, 27-29 August 2009, Patras, Greece. www.euromicro.org, http://seaa2009.vtt.fi/
• Session on 'Safety and security as a systemic challenge' at IDIMT 2009 (Interdisciplinary Information Management Talks), Jindrichuv Hradec, Czech Republic, 9-11 September 2009
• 'ERCIM/EWICS/DECOS Dependable Embedded Systems Workshop' at SAFECOMP 2009, 15-18 September 2009, Hamburg. www.safecomp.org

The next FMICS Working Group workshop will be held in Eindhoven, The Netherlands, on 2-3 November 2009.

All members of ERCIM or ERCIM Working Groups are invited to participate.

Links:
http://www.inrialpes.fr/vasy/fmics/index.html
http://www.ercim.at/

Please contact:
Erwin Schoitsch, ERCIM DES Working Group coordinator, ARCS/AARIT, Austria
E-mail: [email protected]

Alessandro Fantechi, ERCIM FMICS Working Group coordinator, University of Florence and ISTI-CNR, Italy
E-mail: [email protected]


Special Theme: Mathematics for Finance and Economy

Introduction to the Special Theme

Modern Mathematics for Finance and Economics: From Stochastic Differential Equations to the Credit Crisis
by Ralf Korn

With so much science and so many applications focused on quantitative output, mathematics plays a key role in the design and evaluation of quantitative models. While in some cases simple or standard mathematics is enough to formulate models and support theories, sometimes methods are required that are very recent or even yet to be developed.

An application that falls into the second case is modern financial mathematics. From its first tentative steps in the early seventies it has undergone extensive development, via a great deal of hype in the eighties and nineties, to the critical attention it is currently receiving in the face of the credit crisis.

Financial mathematics can be divided roughly into four areas:
• modelling of price movements (for stocks, commodities, interest rate products etc) and underlying factors (such as interest rates, exchange rates, inflation)
• portfolio optimization, ie the determination of an optimal investment strategy in a financial market
• option pricing, ie the determination of a price for a derivative, a security that consists of an uncertain future payment stream (such as simple call or put options or more complicated, so-called exotic variants)
• risk management, ie the measurement and management of risk connected with an investment in securities (such as credit risk, market risk or operational risk).

While the other three areas seem to be the more natural ones when one is thinking about mathematical applications in finance, it is option pricing which is the star area of modern financial mathematics. Here, the Black-Scholes formula for the prices of European call and put options earned Myron Scholes and Robert E. Merton the Nobel prize. On the theoretical side, the area is one of the rare cases in which very recent mathematical methods such as stochastic processes, stochastic calculus and stochastic differential equations are indispensable. What is more, option pricing actually motivated theoretical research in these areas of probability. On the application side, tools were developed that allowed the pricing – at least in principle – of all kinds of derivative securities, which could be shaped according to the demands of the market. This then called for efficient numerical methods for calculating option prices, and as there is a close relation between stochastic differential equations and parabolic partial differential equations, numerical mathematicians and physicists entered the scene.
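As a small illustration of the numerical side just mentioned, the following Python sketch prices a European call both with the Black-Scholes closed-form formula and with a crude Monte Carlo simulation under geometric Brownian motion; the parameter values are made up for illustration and are not taken from this article.

import math, random

def bs_call(S0, K, r, sigma, T):
    # Black-Scholes closed-form price of a European call
    Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * Phi(d1) - K * math.exp(-r * T) * Phi(d2)

def mc_call(S0, K, r, sigma, T, n_paths=100000, seed=42):
    # Monte Carlo estimate under risk-neutral geometric Brownian motion
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths

print(bs_call(100, 100, 0.05, 0.2, 1.0))   # about 10.45
print(mc_call(100, 100, 0.05, 0.2, 1.0))   # close to the closed-form value

For path-dependent or exotic payoffs the closed form disappears and only simulation or a PDE solver remains, which is exactly the point made above.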

For some years, therefore, the inventive activity of the market in creating new derivative products resulted in new option pricing problems. At the same time, the huge influx of researchers from other areas into financial mathematics led to new theoretical results and to more complicated models and valuation methods. The climax of this was the introduction in this decade of credit derivatives. These are nothing more than insurance against the defaults of creditors. With the reasons for a default already hard to model, the financial market made the problem even harder by putting many credits together and then selling tranches of different riskiness in this package. This technique resulted in the now infamous collateralized debt obligation (CDO). In principle this is a package of credits, but it is divided into pieces: the equity tranche (the highly risky piece from which all default losses are paid as long as this tranche still exists), the mezzanine tranche (the mildly risky piece from which default losses are paid when the equity tranche has completely disappeared), and the senior tranche (relatively safe pieces that only suffer losses due to credit defaults if the other tranches are already used up).
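The tranche mechanism just described is easy to make concrete. The short Python sketch below allocates a given portfolio loss to equity, mezzanine and senior tranches; the attachment and detachment points (0-3%, 3-10%, 10-100%) are hypothetical and chosen only for illustration.

def tranche_loss(portfolio_loss, attachment, detachment):
    # Loss absorbed by one tranche, everything expressed as a fraction of notional
    return min(max(portfolio_loss - attachment, 0.0), detachment - attachment)

# Hypothetical tranching of a credit portfolio
tranches = {"equity": (0.00, 0.03), "mezzanine": (0.03, 0.10), "senior": (0.10, 1.00)}
for loss in (0.01, 0.05, 0.20):
    hit = {name: round(tranche_loss(loss, a, d), 4) for name, (a, d) in tranches.items()}
    print("portfolio loss", loss, hit)

A 1% portfolio loss is borne entirely by the equity tranche; only once it is wiped out does the mezzanine tranche suffer, and the senior tranche is reached only by very large losses.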

As CDOs are very complicated securities, many investors bought them without fully understanding the mechanism by which they work. In addition, the credits used in the packages were sometimes of questionable quality, the main issue that led to the current credit crisis. On the modelling side, determining the risk of a CDO investment requires an understanding of the correlation structure among the credits within it (at least the correlation when it comes to defaults). Here, very simplistic models became popular in the industry and, on top of that, were regarded as perfect images of the real world. This proved to be wrong as soon as the first defaults occurred, and as a consequence the values of CDOs dropped dramatically. Since credit derivatives and in particular CDOs had been enormously popular in recent years, the losses were huge and the consequence is the current financial crisis.
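The 'very simplistic' industry model alluded to here is the one-factor Gaussian copula named in the keynote; in its standard textbook form (sketched here only for orientation, not as a description of any particular implementation) the latent variable of obligor $i$ is

\[
X_i = \sqrt{\rho}\, M + \sqrt{1-\rho}\, Z_i, \qquad M, Z_1,\dots,Z_n \sim N(0,1) \ \text{i.i.d.},
\]

and obligor $i$ defaults before the horizon if $X_i \le \Phi^{-1}(p_i)$. Conditionally on the common factor $M$, defaults are independent with probability

\[
p_i(M) = \Phi\!\left(\frac{\Phi^{-1}(p_i) - \sqrt{\rho}\,M}{\sqrt{1-\rho}}\right),
\]

so a single correlation parameter $\rho$ carries the entire dependence structure; one reason such models proved too simple once defaults actually began to cluster.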

Examples of more off-the-shelf applications of mathematics in economics include operations research methods applied to revenue management problems, queuing theory applied to storage problems, and dynamical systems used in macro-economic theory.

Although financial mathematics has been the most popular application of mathematics in economics in recent decades, this special theme contains reports on a variety of applications. The contributions can be grouped in three blocks, plus three survey-type papers:
• an article by Paul Wilmott that sheds a slightly controversial light on 'Financial Modelling' and points out some important open problems
• a paper by Marlene Müller that surveys what modelling can do for the finance and insurance industry
• a block of papers on theoretical finance, with topics such as option pricing, scenario generation and price modelling
• a block containing applications of operations research in economics, dealing with topics such as revenue management, sales forecasting, CO2 emissions and fiscal fraud
• a block on software/hardware for applications in finance and economics, describing software projects and recent hardware design for problems in finance
• a closing paper by Pierluigi Contucci and Francesca Romiti that gives a statistical mechanics perspective of the application of mathematics in economics.

While reading these contributions, one should keep in mind a key difference between this area of applied mathematics and more classical physics and engineering. The quality of models in finance and economics is not fixed: their correctness can change with time, since it is not nature being modelled but the much more changeable and unpredictable interactions of humans.

Please contact:

Ralf Korn, University of Kaiserslautern and Fraunhofer ITWM, Germany
E-mail: [email protected]


The Quality of Quantitative Finance
by Paul Wilmott

I'm not sure readers of this magazine are going to be pleased with the following statement, but here goes: "There is too much mathematics in finance". Ooh, er!

Recently the Financial Times had a short article on quantitative finance and the current crisis, taking the position that what was needed was more mathematics. In any formal debate one side takes one position and the other side takes the opposite. Hence my statement above. In truth it's not so much about quantity of mathematics as quality. The Financial Times article simply seemed to think that more is better, without any thought for the type of mathematics involved. My position is therefore better stated as:

"The quality of mathematical modelling in finance is poor. And the quality has been decreasing for at least the last decade."

Twenty years ago people from all backgrounds, whether mathematics, science, economics, physics, etc. were welcomed into finance and were encouraged to contribute. There was a great deal of attempted technology transfer. Yes, most of it fell flat, that's inevitable, but the point is that their opinions and ideas were listened to with respect. Somewhere in the mid nineties the subject of quantitative finance became axiomatized, and very rapidly there became just the one way to do quant finance, and every other approach was dismissed. Even the physicists joining in had to adopt the 'accepted' way of thinking. Groupthink began to dominate.

"I think I know how this happened, and I certainly know why it is bad."

How? I blame the ubiquitous and overrated Masters programmes. Sadly such courses prey on the young and impressionable, usually those straight out of a first degree, leaving them heavily in debt and having to be re-educated once/if they get a job. These courses are almost invariably taught by academics with no practical experience. The courses are made abstract and overcomplicated because that's what the professors know. Final word on this: even Madoff only preyed on the rich and didn't exploit poor young students.

Bad? It is extremely bad because now there is an entire generation of quants and wannabe quants who really do believe that knowing mathematics is all that is needed. But they are incapable of questioning assumptions, or building their own models unless they are but a minor, trivial tweaking of somebody else's model. They cannot think outside any box. They have no awareness of the link between human behaviour and the markets, but that is what drives everything! I know all this because I am one of those people having to do the re-education.

I launched the Certificate in Quantitative Finance (CQF), see www.cqf.com, in 2003; it has become an antidote to the Masters programs. The CQF is fundamentally about the practice of quantitative finance. This still means that delegates must know about the mathematical methods and models, but they also learn how to criticize models, and build their own. This program has now grown to be the largest quant program in the world at that high level. Most people taking the CQF come from banks, and are people who appreciate the uneasy relationship between theory and practice in this subject.

Recently I was giving a talk about the CQF to an audience of people from all sorts of backgrounds. Afterwards one of them told me that he'd like to take the CQF but wanted to skip the first three modules on the grounds that he knew it all, having taken a Masters. And do you know what he meant by 'knowing it all'? Apparently he could 'derive the Black-Scholes equation'…and that was it! Well, sunshine, that's the trivial bit! What about when it breaks down, about its robustness, what about the popular 'improvements' that make it worse? No, he didn't know any of this. And I said to him "You can be a Nobel Prize winner and you still start the CQF at module one, because even in the first lecture there are things Nobel laureates don't know!"

Having set the tone I am going to spend the rest of this paper describing, in brief, some of the main problems with current quantitative finance modelling and suggest areas for research. In my view these are where people should concentrate their attention, not on yet another closed-form solution of a stochastic volatility model:

Real challenges of real quantitative finance

Lack of diversification
Despite the benefits of diversification being well known among the general public, and despite there being elegant mathematical and financial formulations of this principle, it is still commonly ignored in banking and fund management. This is partly due to the compensation structure within the industry whereby bonuses are linked simply to profit. The end result is that people will try to maximize the amount they invest or bet without any concern for risk.

Supply and demand
Prices are ultimately dictated by supply and demand. But 'price' is different from 'value'. Valuation is what quants do: coming up with a theoretical value of a product based on various assumptions. The role of the salesperson is then to sell the contract for as much above that theoretical value as possible, the difference being the profit margin. But then the quant, who rarely distinguishes price from value, often gets confused. He sees a contract selling for $10, its price, and by confusing this price with the theoretical value of the contract deduces something about the mathematical model for that contract.

Jensen's Inequality arbitrage
The expectation of a non-linear function of a random variable and the function of the expectation are not the same. If the function is convex then the former is greater. But options are just non-linear functions of something random, and the difference between the expectation of the function and the function of the expectation is where the value is in an option. The moral is: do not assume things are deterministic when they are random, otherwise you will be missing value and mispricing.
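The point is easy to verify numerically. This Python sketch (illustrative parameters, not from the article) compares the expectation of a call payoff with the payoff of the expectation for a lognormal terminal price whose mean equals today's price.

import math, random

rng = random.Random(0)
S0, K, sigma, T, n = 100.0, 100.0, 0.2, 1.0, 200000

# Lognormal terminal price with E[S_T] = S0 (drift chosen accordingly)
samples = [S0 * math.exp(-0.5 * sigma**2 * T + sigma * math.sqrt(T) * rng.gauss(0.0, 1.0))
           for _ in range(n)]

payoff = lambda s: max(s - K, 0.0)
expectation_of_payoff = sum(payoff(s) for s in samples) / n   # E[f(S_T)], roughly 8
payoff_of_expectation = payoff(sum(samples) / n)              # f(E[S_T]), roughly 0

print(expectation_of_payoff, payoff_of_expectation)

Treating the random price as deterministic (the second number) assigns the option essentially no value; the convexity premium (the first number) is exactly what the paragraph above is about.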

Sensitivity to parameters
No one really knows what volatility will be in the future. So risk managers and quants do sensitivity analyses in which they assume a number for volatility, see what is then the value of a contract, change the volatility and see how the contract's value changes. This gives them a range of possible values for the contract. Wrong. They assume a quantity is a constant and then vary that constant! That does not give a true sensitivity, and is completely different from assuming at the start that volatility is a variable. Such analyses can easily give a false sense of security. Because the sensitivity of a value to another quantity is called a greek, I have christened sensitivities to parameters 'bastard greeks' because they are illegitimate.

Correlation
There are two problems with correlation. First it is not quite what people think it is and second it doesn't do a good job of modeling the relationship between two financial quantities. It's not what people think it is because when asked they talk about two highly correlated stocks trending in the same direction. But more often than not correlation to the quant is about the relationship between two random numbers, returns, for example, over the shortest timescale, and nothing to do with trending. And correlation is a very unsubtle tool for modeling the very subtle relationship between quantities.

Reliance on continuous hedging (arguments)
Derivatives theory is based almost entirely on the idea of continuous hedging eliminating risk. But in practice people don't and can't hedge continuously. They hedge every day, for example. Such discrete hedging means that risk has not been eliminated and so all the theories based on risk elimination cannot necessarily be used. That's fine, as long as they openly admit that there is far more risk than they've been leading people to believe.
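The residual risk left by discrete rebalancing is easy to see in a simulation. The Python sketch below (hypothetical parameters, zero interest rate for simplicity) sells one call at its Black-Scholes value and delta-hedges it at a finite number of dates; the standard deviation of the final P&L shrinks roughly like one over the square root of the number of rebalancing dates, but never reaches zero.

import math, random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def call_price(S, K, sigma, tau):
    d1 = (math.log(S / K) + 0.5 * sigma**2 * tau) / (sigma * math.sqrt(tau))
    return S * norm_cdf(d1) - K * norm_cdf(d1 - sigma * math.sqrt(tau))

def call_delta(S, K, sigma, tau):
    return norm_cdf((math.log(S / K) + 0.5 * sigma**2 * tau) / (sigma * math.sqrt(tau)))

def hedged_pnl(n_rebalances, S0=100.0, K=100.0, sigma=0.2, T=1.0, rng=None):
    # Sell one call at the model price, delta-hedge at n_rebalances equally spaced dates
    rng = rng or random.Random()
    dt = T / n_rebalances
    S, delta = S0, call_delta(S0, K, sigma, T)
    cash = call_price(S0, K, sigma, T) - delta * S0
    for i in range(1, n_rebalances + 1):
        S *= math.exp(-0.5 * sigma**2 * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        tau = T - i * dt
        new_delta = call_delta(S, K, sigma, tau) if tau > 1e-12 else (1.0 if S > K else 0.0)
        cash -= (new_delta - delta) * S   # rebalance the hedge
        delta = new_delta
    return cash + delta * S - max(S - K, 0.0)   # unwind hedge, pay the option payoff

rng = random.Random(0)
for n in (4, 52, 252):
    pnls = [hedged_pnl(n, rng=rng) for _ in range(2000)]
    mean = sum(pnls) / len(pnls)
    std = (sum((x - mean) ** 2 for x in pnls) / len(pnls)) ** 0.5
    print(n, "rebalances: P&L standard deviation about", round(std, 3))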

Feedback
For every buyer there's a seller. Correct. Therefore derivatives are a zero-sum game. Incorrect. One side, usually the option seller if it's an exotic, is also hedging the options by trading in the underlying asset. And this trading can move the markets. Such feedback is currently not adequately addressed in derivatives modelling or in trading, even though it has been blamed for the 1987 crash.

Reliance on closed-form solutions
There are good models and there are bad models. You'd think that people would choose models based on comparison with statistics, or maybe on some insight gained from fundamental economic considerations. No. Most popular models are chosen because they have simple closed-form solutions for basic contracts. This is seen especially in fixed-income models. Despite the ease of number crunching of most finance models quants still like to work with elegant models when they should be working with the most accurate models. Typical closed-form models used are the Vasicek or Hull & White models.
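For concreteness, the Vasicek model mentioned above writes the short rate as (standard notation, quoted here only for orientation)

\[
dr_t = a(b - r_t)\,dt + \sigma\,dW_t,
\]

and owes much of its popularity to the closed-form zero-coupon bond price

\[
P(t,T) = A(t,T)\,e^{-B(t,T)\,r_t}, \qquad B(t,T) = \frac{1 - e^{-a(T-t)}}{a},
\]

with $A(t,T)$ an explicit function of the same parameters. The Hull & White model replaces the constant mean level by a time-dependent one so that the initial yield curve is fitted exactly, again without giving up the closed form, which is precisely the convenience criticized in this section.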

Valuation is not linear
Despite people's everyday experience of going to the shops, popular valuation models are almost all linear. The value of a portfolio of derivatives is the same as the sum of the values of the individual components. The relevance of the visit to the shops is that we have all experienced nonlinearity in terms of either discount for bulk, economies of scale, buy two get one free. Or the opposite, only one item per customer. There are several great non-linear quant models, with some fantastic properties.

Calibration
Calibration means making sure that the output of a model matches data in the market. It has been likened to the calibration of a spring. That is a completely misleading analogy. Springs behave the same way time and time again. Hooke's law is a very good model relating the tension in a spring and its extension, as long as that extension is not too great. Finance is not like this. Stock prices do not behave the same way, even probabilistically, time and time again because they are driven by human beings. For this reason calibration is fundamentally flawed.


Too much precision
Why bother with too much accuracy in your numerical analysis? First, you don't know whether volatility is going to be 10% or 20%. Second, you don't ever monitor individual contracts to expiration, together with cash flows and hedging instruments, to see whether you made a profit. Third, you lump all contracts together into one portfolio for hedging, and it's the portfolio that makes a profit (or doesn't). You make money from the Central Limit Theorem, so there's no point in pretending that you know the contract's value more precisely than you do.

Too much complexity
One can no longer see the wood for the trees in quantitative finance. Most researchers are so caught up with their seven-factor stochastic correlation model, with umpteen different parameters, calibrated to countless snapshots of traded prices, that they cannot see that their model is wrong. If the inventors and users of the model cannot see what is going on, then what hope has the senior management, and those who have to sign off on valuations and risk-management practices? Quant finance needs to get back to basics, to work with more straightforward models, to accept right from the get-go that financial models will never be perfect. Robustness is all.

Conclusions
Quantitative finance is one of the most fascinating branches of mathematics for research. But it is not fascinating because of the sophistication of the mathematics. Classical quantitative finance, unfortunately, as taught in universities only encompasses a relatively small amount of mathematical tools and techniques. No, quantitative finance is fascinating because of its potential to embrace mathematics from many different fields, a potential that has mainly not been realized. And fascinating because we are trying to model that which is extremely difficult to model, human interactions.

Links:

For further information on the above topics please visit http://www.paulwilmott.com.

Or to discuss the subjects with your peers just join the free discussion forum at http://www.wilmott.com.

Articles on these research areas and others may be found in Wilmott magazine and Wilmott journal, see http://www.wilmott.com/subscription.cfm

Please contact:

Paul Wilmott
E-mail: [email protected]


Balancing Theory and Practice: Modern Financial Mathematics at Fraunhofer ITWM
by Marlene Müller

The financial mathematics department at the Fraunhofer Institute for Industrial Mathematics (ITWM) covers all important topics of modern financial mathematics, which has been an extremely innovative research area in recent years.

The underlying mathematics is often new and at the same time very sophisticated, due to the complex structure of the modelling process. Financial mathematicians need to be familiar with stochastic processes, stochastic calculus and partial differential equations, as well as parametric and non-parametric statistics. The Financial Mathematics Department at the Fraunhofer Institute for Industrial Mathematics (ITWM) covers all the important topics of financial mathematics, in particular:
• option pricing using stochastic and local volatility models
• continuous-time portfolio optimization
• credit risk, Basel II, valuation of credit derivatives
• asset liability management, insurance mathematics, Solvency II.

In addition, risk management as an overall topic complements these more specific areas.

Option Pricing with Stochastic Volatility Models
Options are financial products with a price that depends on the price dynamics of one or several underlyings. A simple example is stock prices following a geometric Brownian motion and leading to the famous Black-Scholes formula for simple call or put options. Various other underlyings are possible, which include not only bonds, interest rates, commodities, exchange rates and investment funds, but also weather observations or CO2 emissions.

Option pricing is one of the most popular fields in financial mathematics. Option trading is an important issue, particularly when market conditions are unfavourable. In order to hedge risks and to offer investors attractive products with a limited risk of loss, banks offer all kinds of derivatives with complex payoff structures, such as products with minimum capital guarantees or products that limit the gain via an upper bound. To calculate precise and market-conformal prices for such derivatives, it is necessary to choose a flexible and appropriate stochastic model for the underlying prices.

A popular generalization of the Black-Scholes model is the Heston (1993) stochastic volatility model. To calculate the price of a complex derivative as a discounted expected value of the payoff (under the risk-neutral measure), appropriate numerical methods (tree approaches, Monte Carlo, PDE methods) must be used. The price calculation is often complemented by the calculation of the 'Greeks', the sensitivities of the price to various model parameters.
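For reference, the Heston model couples the asset price to a square-root variance process (standard formulation, quoted here for orientation):

\[
dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^S, \qquad
dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dW_t^v, \qquad
d\langle W^S, W^v\rangle_t = \rho\,dt,
\]

where $\theta$ is the long-run variance, $\kappa$ the speed of mean reversion, $\xi$ the volatility of variance and $\rho$ the correlation that drives the skew. Semi-closed-form prices for vanilla calls are available via the characteristic function, which is what makes the fast calibration mentioned below possible.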

The Fraunhofer ITWM has implemented the Heston model within an integrated option pricing software framework. Due to closed-form solutions for standard calls, this model offers a time-efficient calibration of the model parameters to actual market situations. In addition, the Heston model has the ability to reproduce the volatility smile observed in, for example, stock markets.

Credit Risk: Basel II and Credit Derivatives
Modelling of credit risks has been an important issue in recent years, for two reasons. The first is the implementation of the new capital regulations (Basel II) and their evaluation by the banking supervision, which is essentially completed now for many banks. The second is the development of several types of credit derivatives that have allowed banks to give away part of their credit risk. (This led to the subprime credit crisis and eventually also the current financial crisis.)

The Fraunhofer ITWM supports banks in their implementation of the Basel II regulations by providing expert knowledge in statistics, gained over the course of several successful research and consulting projects. A typical task is to assess the rating system from a statistical point of view, which means assessing the discriminatory and predictive power of the scoring method, possibly recalibrating the rating system, and validating specific terms, in particular score weights, default probability estimates or the estimated loss distributions. From a mathematical point of view, the main methods that we apply are classification and regression approaches with a special focus on the specific restrictions required by a rating system. In particular, the rating scores should be parametrically specified, additive and monotonic functions of the underlying risk factors, in order that their specification might be extrapolated to rating segments with no or few credit default observations.
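As an illustration only (synthetic data and hypothetical risk factors; this is not the ITWM methodology itself), a minimal scorecard of the kind described, i.e. a parametric, additive score with a monotone link to the default probability, can be fitted by logistic regression:

import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic portfolio: two standardized risk factors per borrower, y = 1 means default
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
true_score = 1.5 * X[:, 0] - 1.0 * X[:, 1] - 2.0
y = (rng.uniform(size=1000) < 1.0 / (1.0 + np.exp(-true_score))).astype(int)

# The rating score is additive in the factors; the estimated default probability
# is a monotone (logistic) function of that score, as a rating system requires.
model = LogisticRegression().fit(X, y)
pd_estimates = model.predict_proba(X)[:, 1]
print("score weights:", model.coef_[0], "intercept:", model.intercept_[0])
print("first five PD estimates:", np.round(pd_estimates[:5], 3))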

In the field of credit derivatives, many standardized products have been developed during the last few years. An example of such a standardized product based on only one underlying subject is the so-called credit default swap (CDS). The attractiveness of this product comes from the fact that it allows the protection buyer to 'insure' against losses due to credit default of the underlying entity. CDSs are traded not only for large corporations but also for banks and sovereign states, which typically have very low default risk. When CDSs are traded on a sufficiently liquid market, their market prices can be used to calibrate a debtor's default probability (even if an assessment on the basis of historical data is almost impossible).
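In its simplest textbook form this calibration is the so-called credit triangle (sketched here as a back-of-the-envelope rule, not as the full bootstrapping used in practice): with a flat default intensity $\lambda$ and recovery rate $R$, the fair running CDS spread roughly balances premium and protection legs when

\[
s \approx (1 - R)\,\lambda, \qquad \text{so} \qquad \lambda \approx \frac{s}{1 - R};
\]

a 120 basis point spread with 40% recovery, for example, implies a risk-neutral default intensity of about 2% per year. A whole term structure of spreads allows a time-dependent $\lambda(t)$ to be bootstrapped in the same spirit.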

In addition to single-name-based CDSs, there are more complex types of credit derivatives that depend on several underlyings. Examples of these products are first-to-default swaps (FtD), Nth-to-default swaps (NtD), or the (now rather infamous) collateralized debt obligations (CDO). The valuation of these complex products places a significant demand on mathematical modelling and economical and financial knowledge, since existing models have not been able to characterize the market efficiently. In several research and consulting projects, the Fraunhofer ITWM has dealt with the implementation (particularly with the help of copula approaches) of different models for credit derivatives.

Figure 1: Implied volatility surface calculated from option prices on the S&P 500 index.

Figure 2: Asset liability management for pension funds: the long-term liabilities are the focus of the optimization. The effects of different trading strategies, in particular their long-term risks, can be studied with a simulation tool such as ALMSim.

Continuous-Time Portfolio Optimization
Continuous-time portfolio optimization (ie the determination of optimal investment strategies in continuous-time models) is one of the main research areas of the Fraunhofer ITWM. It is one of our goals to demonstrate that such modern portfolio optimization methods can be applied to real-world problems (including transaction costs, nonlinear goods and worst-case bounds), and outperform the simplistic but still widespread one-period Markowitz mean-variance approach.
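The classical benchmark behind this line of work is Merton's continuous-time solution, quoted here only for orientation: for an investor with constant relative risk aversion $\gamma$ in a Black-Scholes market with stock drift $\mu$, volatility $\sigma$ and riskless rate $r$, the optimal fraction of wealth invested in the stock is the constant

\[
\pi^{*} = \frac{\mu - r}{\gamma\,\sigma^{2}},
\]

and the practically relevant extensions mentioned above (transaction costs, worst-case or crash scenarios, constraints) modify this simple rule.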

Asset Liability Management, Insurance Mathematics, Solvency II
Asset liability management (ALM) forms an important part of running a company (especially a bank or insurance company), and it is usually company policy to carry it out regularly. The interdependence between assets and liabilities is about to become even more important, as Basel II (for banks) and Solvency II (for insurance companies) are now being implemented on national and international levels.

The Fraunhofer ITWM has developed the ALMSim simulation software tool to address this problem. Assets and liabilities can now be modelled individually with client-specific interdependences incorporated. ALMSim has already been used successfully by two pension funds. The software obtains the simulation model from a user-defined input file (ALMSim comprises an intuitive simulation language, very close to mathematical formulae), and then automatically recognizes and resolves logical model dependencies. ALMSim will then simulate the model using the appropriately chosen Monte Carlo methods and discretization step sizes. This enables a user (possibly without programming skills) to adapt the ALM simulation model as required. Model parameters, starting values and results are saved in ASCII format (.csv) such that input data can be imported from Excel, R/S-Plus, Matlab etc, and the results can be exported to these programs where they can be further processed and evaluated.
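ALMSim and its simulation language are proprietary; the sketch below only illustrates the general pattern described above — Monte Carlo simulation of assets against long-term liabilities with CSV output for further processing — under entirely hypothetical model assumptions.

```python
import csv
import numpy as np

rng = np.random.default_rng(42)
n_paths, n_years = 10_000, 30
asset_return_mean, asset_return_vol = 0.04, 0.10    # hypothetical asset model
liability_growth = 0.03                             # hypothetical liability indexation
assets0, liabilities0 = 100.0, 90.0

returns = rng.normal(asset_return_mean, asset_return_vol, size=(n_paths, n_years))
assets = assets0 * np.cumprod(1.0 + returns, axis=1)
liabilities = liabilities0 * (1.0 + liability_growth) ** np.arange(1, n_years + 1)

funding_ratio = assets / liabilities                # per path and year
shortfall_prob = (funding_ratio[:, -1] < 1.0).mean()

# Results saved as CSV so they can be post-processed in Excel, R or Matlab.
with open("alm_results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["year", "mean_funding_ratio", "5pct_quantile"])
    for t in range(n_years):
        writer.writerow([t + 1, funding_ratio[:, t].mean(),
                         np.quantile(funding_ratio[:, t], 0.05)])
```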

Link:

http://www.itwm.fraunhofer.de/en/fm/indexfm/

Please contact:

Marlene Müller
Fraunhofer ITWM Kaiserslautern, Germany
E-mail: [email protected]



Fundamental Modelling of Financial Markets
by László Gerencsér, Balázs Torma and Zsanett Orlovits

In modelling stock prices, two competing approaches exist: technical modelling and fundamental modelling. Technical models capture statistical phenomena observed directly in stock prices, while fundamental models consist of mathematical descriptions of the behaviour of economic agents that affect the stock price. Fundamental models incorporate minute details of the price generating process, and are thus more authentic than technical models. The development and analysis of multi-agent fundamental models has been the subject of research in the Stochastic Systems Research Group of SZTAKI since 2006.

Our model of market microstructures consists of various trading agents, and a stock exchange with order processing rules analogous to real exchanges such as the New York Stock Exchange. The agents observe the market price process, and determine the limit price and order amount in each trading period according to their trading strategy. Agents also consume or invest money. In combination, this gives us a nonlinear stochastic feedback system as the mathematical model, with the stock exchange playing the role of the plant, and the agents playing the role of controllers.

Stock trading literature led us to identify three types of behaviour for trading agents. ‘Trend followers’ act aggressively in the direction of the current price momentum, meaning they buy more if prices rise sharply and sell when prices fall. ‘Mean reverters’ take the opposite action to trend followers in the hope that the stock price reverts to some mean value. ‘Fundamentalists’ develop a belief about the ‘correct’ price according to the information they receive; they then buy if the observed price falls below this level, and sell otherwise. The believed price process is itself generated by a stochastic price model, which they expect to adequately describe the value of a company.
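The SZTAKI model is considerably richer (limit orders, order processing, consumption and investment); as a hedged toy, the following sketch only shows how excess demand from the three stylized agent types could move a simulated price. All rules, weights and parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n_steps = 1_000
prices = [100.0]
fundamental_value = 100.0

# Hypothetical relative weights of the three agent types.
w_trend, w_mean_rev, w_fund = 0.4, 0.3, 0.3

for t in range(1, n_steps):
    momentum = prices[-1] - prices[-2] if t > 1 else 0.0
    demand_trend = w_trend * momentum                        # buy into rising prices
    demand_mean_rev = -w_mean_rev * momentum                 # bet on reversal
    demand_fund = w_fund * (fundamental_value - prices[-1])  # trade towards believed value
    excess_demand = demand_trend + demand_mean_rev + demand_fund
    noise = rng.normal(0.0, 0.5)
    prices.append(prices[-1] + 0.1 * excess_demand + noise)  # simple price-impact rule

log_returns = np.diff(np.log(prices))
```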

Figure 1: A simulated GARCH(1,1) process.

According to well-known findings in behavioural finance, real traders tend to divert from their trusted trading strategies according to some random pattern. Randomness is also introduced when announcements or news about a company trigger analysts to pay more attention to that company and thus change its valuation. This in turn results in further analysts' announcements about that company. This phenomenon can be represented mathematically using self-exciting point processes, which are incorporated in our model.
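Self-exciting (Hawkes) point processes of this kind can be simulated by Ogata-style thinning; a minimal sketch with an exponential kernel follows. The parameters are invented and this is not the group's calibrated news model.

```python
import numpy as np

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=100.0, seed=3):
    """Event times of a self-exciting Hawkes process with intensity
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)),
    simulated by Ogata's thinning algorithm (requires alpha < beta)."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        # The intensity only decays until the next event, so its current value
        # is a valid upper bound for thinning.
        lam_bar = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            return np.array(events)
        lam_t = mu + alpha * np.exp(-beta * (t - np.array(events))).sum()
        if rng.random() * lam_bar <= lam_t:     # accept with probability lam_t / lam_bar
            events.append(t)

news_times = simulate_hawkes()                   # clustered 'announcement' times
```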

To validate our fundamental model we analysed the volatility characteristics of the generated stock price process using a widely accepted technical model, the so-called Generalized Autoregressive Conditional Heteroscedasticity (GARCH) model. This model generates volatility via a stochastic, nonlinear, Wiener-Hammerstein feedback system, driven by log-returns. It nicely captures the well-known volatility clustering phenomenon, as shown in Figure 1 using simulated data.
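Figure 1 shows a simulated GARCH(1,1) path; a minimal simulation of that model, with illustrative parameters rather than those estimated by the group, looks as follows.

```python
import numpy as np

def simulate_garch11(omega=1e-5, alpha=0.08, beta=0.90, n=2_000, seed=11):
    """Simulate log-returns r_t = sigma_t * z_t with
    sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)
    sigma2 = np.full(n, omega / (1.0 - alpha - beta))   # start at unconditional variance
    for t in range(1, n):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
        r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return r, np.sqrt(sigma2)

returns, vol = simulate_garch11()   # returns show the volatility clustering of Figure 1
```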

In fitting a GARCH model we relied on a preceding PhD research project on the statistical analysis of GARCH models, in particular the real-time estimation of its technical parameters. The theoretical basis of this work was the highly nontrivial application of an advanced theory of stochastic approximation allowing Markovian dynamics in the underlying state process. We have also developed a highly efficient numerical method for off-line estimation that improves the efficiency of the corresponding module of the MATLAB GARCH toolbox by a factor of 4.5. Simulations have shown that our fundamental model successfully reproduces the volatility properties of price processes generated by GARCH models.

An interesting research topic that we investigated is to analyse the relation between the structure of the artificial stock market and the price volatility; ultimately this was captured by fitting a GARCH model. The market structure is defined and parameterized by the relative weights of agents. These are calculated by dividing the number of agents of any of the three types given above by the number of all agents on the market. All agents are assumed to have the same initial endowment, so relative weights can be seen as the initial distribution of wealth among different groups of agents of a given type. In particular, we investigated the problem of detecting abrupt changes in the market structure, which may be an indicator of insider trading.

We have carried out extensive experiments in which we simulated stock price time series for different relative agent weights, and then fitted a GARCH model. The parameters of fitted GARCH models are highly dependent on the agent weights, as shown in Figure 2 for the case of GARCH(1,1) models with just two parameters. We indicate the relative weights of mean reverters and trend followers on the x and y axes respectively. Thus each box in the figure corresponds to a relative weight setting, defining a market structure. Shades of grey indicate the range of values of one of the two coefficients of the fitted GARCH model: the lighter the shade the higher the coefficient. This is known as a heat map. The figures indicate a high correlation between agent weights and GARCH parameters. Using simple technical arguments, Figure 2a indicates that an increase in the relative weight of trend followers increases the volatility. Similarly, Figure 2b can be interpreted as saying that an increase in the relative weight of fundamentalists also increases the volatility. Both of these findings are in agreement with intuition.

We conclude that fundamental modelling of stock price processes is a realistic alternative to the widely used GARCH models. Given the advanced mathematical technology that is currently available to analyse stochastic, non-linear, Wiener-Hammerstein feedback systems, we think our approach has great potential for future research in problems such as market stability and robustness.

Please contact:

László Gerencsér
SZTAKI, Hungary
Tel: +36 1 279 6138
E-mail: [email protected]

Balázs Torma
SZTAKI, Hungary
Tel: +36 1 279 6158
E-mail: [email protected]

Zsanett Orlovits
Mathematical Institute, Budapest University of Technology and Economics, Hungary
Tel: +36 1 463 1384
E-mail: [email protected]


Figure 2a, 2b: Heat map of GARCH(1,1) parameters. Figure 2a (left) indicates that an increase in the relative weight of trend followers increases the volatility. Similarly, Figure 2b can be interpreted as saying that an increase in the relative weight of fundamentalists also increases the volatility.


Macroeconomic and Financial Scenarios for Cost/Risk Analysis in Public Debt Management
by Massimo Bernaschi, Maria Francesca Carfora, Albina Orlando, Silvia Razzano and Davide Vergni

Sovereign debt management is the process of establishing and executing a strategy for managing government debt. Among the main goals of a sound public debt management are i) to fulfil the funding requirements; ii) to minimize the cost of the debt and the related risk; and iii) to set up and control an efficient market for government securities. Our research is aimed at providing the mathematical tools for a cost/risk analysis of the portfolio of securities issued every month by the Italian Treasury. The final goal is to provide public debt managers with a flexible, efficient and user-friendly decision support tool.

The Istituto per le Applicazioni del Calcolo ‘Mauro Picone’, part of the National Research Council of Italy, was founded in 1927 in Napoli by Prof. Mauro Picone as the first institute in the world devoted to the applications of calculus. Since 2002, the Institute has been involved in joint projects with the Italian Treasury that include high-level training on interest rate modelling for the Ministry staff. For the Department in charge of domestic public debt management, the Institute developed a software prototype that uses the simplex algorithm to determine, for a given scenario of interest rate evolution, the optimal sequence of public debt securities issuances. At the same time, this sequence must fulfil the constraints (imposed by the law or by market best practices) that the Treasury must take into consideration. Figure 1 shows the block diagram for the optimization process.

Figure 1: Optimization process for public debt management. Inputs (ESA95 and total debt data, interest rate evolution, past issuances and security information, surplus forecasts) feed an optimizer subject to refinancing, portfolio-composition, net-yearly-issuance and other constraints; the output is the optimal issuance plan together with the resulting nominal debt and duration.

Recently a contract has been signed for the extension of this prototype with Consip S.p.A, a company of the Ministry of Economy operating on services for the Ministry itself. The prototype was presented in several international contexts (eg the World Bank) and has received great attention and appreciation for its utility.
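The Treasury prototype itself is not public; as a hedged toy, the following shows how a single period of the issuance problem could be written as a linear program — minimize interest cost subject to covering the funding requirement and simple composition constraints. All securities, costs and limits are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical securities: 1-, 5- and 10-year bonds with these annual interest costs.
interest_cost = np.array([0.020, 0.030, 0.035])
funding_requirement = 100.0          # amount that must be raised this period

# Inequality constraints A_ub @ x <= b_ub:
#   raise at least the requirement; short bonds at most 30% of issuance
#   (refinancing-risk cap); long bonds at most 40% of issuance.
A_ub = np.array([
    [-1.0, -1.0, -1.0],              # -(x1 + x5 + x10) <= -requirement
    [ 0.7, -0.3, -0.3],              # x1  <= 0.3 * (x1 + x5 + x10)
    [-0.4, -0.4,  0.6],              # x10 <= 0.4 * (x1 + x5 + x10)
])
b_ub = np.array([-funding_requirement, 0.0, 0.0])

result = linprog(c=interest_cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
optimal_issuance = result.x          # amounts of each security to issue
```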

The final goal of the project is to determine the optimal sequence of public debt securities issuance. To this aim, we will develop methodologies to simulate realistic scenarios for the dynamics of interest rates, taking into account the effects of the main macroeconomic variables. This is crucial to support a robust cost/risk analysis that includes the impact of macroeconomic changes (variations in monetary policy, modifications in the economic cycle etc) on debt management.

The strong correlation between every rate (regardless of its maturity) and the official rate as fixed by the European Central Bank (ECB) leads us to assume the latter as a reference rate and to consider all other rates as generated by a combination of this reference rate with a characteristic, maturity-dependent fluctuation. On the other hand, the impact of expectations of the future economic trend on the choices made by the ECB cannot be neglected. We must therefore include in our control methodology a model describing the ECB rate dynamics as a function of the main macroeconomic variables.

The project is being developed through six modules:

1. Implementation of Markov Switching models to describe the dynamics of economic and inflation cycles, and subsequent validation on Italian and suitably aggregated European data. As far as inflation is concerned, we are comparing two options: modelling its dynamics by means of an exogenous variable relating the inflation cycle to the economic cycle (eg the output gap) and developing a bivariate model describing both cycles.

2. Development of a jump-diffusion model for the ECB rate. Our working hypothesis is that the ECB interventions can be modelled as a combination of a Poisson process describing the intervention times of the ECB, and a Markov process accounting for the direction and width of such jumps. Such a model must be linked to the evolution of the economic cycle, by means of terms that vary depending on the recession or expansion phase.

3. As a complementary step with respect to the model in 2, implementation of a ‘forward-looking’ Taylor rule to model the ECB rate dynamics. Both the models in 2 and 3 will be introduced in the software tool to offer multiple choices for the simulations.

4. Evaluation, for both the models in 2 and 3, of the interest rate fluctuations with respect to the ECB official rate. The basic idea is that the ECB official rate is a reference rate and all the other rates are generated by a composition of this reference rate plus a characteristic maturity-dependent fluctuation. Interest rate evolution will be described by means of different models (CIR, HJM, Nelson-Siegel); a small simulation sketch of the CIR model follows this list. Powerful calibration techniques for the parameters will be implemented.

5. Extension of the current software tool through i) development of modules for the simulation of the dynamics of macroeconomic variables; and ii) stochastic programming techniques that allow the uncertainty in the evolution of the control variables of the system to be included in the optimal strategy.

6. Assessment of the risk associated with fluctuations in interest rates. Besides traditional risk measures such as Cost at Risk (CaR) and Expected Shortfall (ES), we are also exploring spectral measures and dynamic risk measures.
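For module 4, one of the candidate interest rate models is CIR; a minimal Euler-type simulation (full-truncation scheme, hypothetical parameters, no calibration to market data) is sketched below.

```python
import numpy as np

def simulate_cir(r0=0.02, kappa=0.5, theta=0.03, sigma=0.08,
                 horizon=10.0, n_steps=2_500, n_paths=1_000, seed=5):
    """Simulate dr = kappa*(theta - r)dt + sigma*sqrt(r)dW with a
    full-truncation Euler scheme (negative values are floored inside the sqrt)."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    r = np.full((n_paths, n_steps + 1), r0)
    for t in range(n_steps):
        r_pos = np.maximum(r[:, t], 0.0)
        dW = rng.standard_normal(n_paths) * np.sqrt(dt)
        r[:, t + 1] = r[:, t] + kappa * (theta - r_pos) * dt + sigma * np.sqrt(r_pos) * dW
    return r

paths = simulate_cir()                # interest rate scenarios around the reference rate
mean_rate_10y = paths[:, -1].mean()
```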

The final result will be an analysis with clear indications on the robustness of the securities issuance strategy of the Italian Treasury.

Please contact:

Albina Orlando
Istituto per le Applicazioni del Calcolo ‘Mauro Picone’ – CNR, Italy
Tel: +39 081 6132395
E-mail: [email protected]

Davide Vergni
Istituto per le Applicazioni del Calcolo ‘Mauro Picone’ – CNR, Italy
Tel: +39 06 49270951
E-mail: [email protected]


Rehabilitating Fractal Models in Finance
by Miklós Rásonyi

Despite providing a good fit to financial data, asset price models with a fractal structure have often been rejected as they contradict a basic principle of economic theory: that arbitrage should be absent. In light of new results, their role must be reconsidered.

In the Stochastic Systems Research Group of SZTAKI, our current research in mathematical finance (in cooperation with Paolo Guasoni, Boston University, and Walter Schachermayer, University of Vienna) has a three-fold objective. We wish to incorporate certain market imperfections (ie features of the stock exchange that are neglected in idealized settings) in our modelling of the trading mechanism; to reintegrate an important class of ‘fractal-like’ stochastic processes into mainstream quantitative finance; and to develop further tools for treating these processes in a mathematical finance context.

Market imperfections include brokerage fees, taxes and liquidity constraints (where certain products are not available in the required quantities at a given time). These imperfections have been mostly neglected in the mathematical framework. We are focusing on models where proportional transaction costs are taken into account: each time the investor buys or sells an asset a certain percentage (eg 1% of the transaction value) must be paid to the broker. One of our aims was to achieve a deeper understanding of continuous trading in these more realistic models.

Identifying the right class of stochastic processes with which to model stock prices is of paramount practical importance. Ideally, a tractable class with a good fit to real data should be chosen. One family of stochastic processes, the so-called fractional Brownian motions, appears to be a promising candidate. These processes admit a fractal structure.

A fractal is generally thought of as a geometric object with rugged boundaries whose parts (however small they are) resemble the shape of the original, ie they show self-similarity. Most often they are recursively generated and hence ideal for computer implementations. There also exist stochastic processes with a self-similar structure, the prime example being Brownian motion. This is used to describe diffusive phenomena in physics and it serves as a basis for the overwhelming majority of current models in mathematical finance.

However, it is not the only fractal-like process of interest. Fractional Brownian motions are characterized by a number H ∈ (0,1) (the so-called Hurst index), which is related to their self-similarity properties but also affects the roughness of the sample paths: as H decreases, the trajectories of the process appear wilder and more chaotic. Standard Brownian motion corresponds to the case H=1/2 and is thus only one particular point in an interval of possibilities!
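A fractional Brownian motion path with Hurst index H can be generated exactly (if slowly) from the Cholesky factor of its covariance matrix; a small sketch follows. The grid size and jitter are illustrative choices, and the published figures were produced by other means.

```python
import numpy as np

def fbm_path(hurst=0.3, n=400, horizon=1.0, seed=2):
    """Exact simulation of fractional Brownian motion on a grid via Cholesky,
    using Cov(B_H(s), B_H(t)) = 0.5 * (s^{2H} + t^{2H} - |t - s|^{2H})."""
    rng = np.random.default_rng(seed)
    t = np.linspace(horizon / n, horizon, n)
    s, u = np.meshgrid(t, t)
    cov = 0.5 * (s ** (2 * hurst) + u ** (2 * hurst) - np.abs(s - u) ** (2 * hurst))
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))   # small jitter for numerical stability
    return t, L @ rng.standard_normal(n)

grid, rough_path = fbm_path(hurst=0.3)     # rougher, more chaotic trajectories
grid, smooth_path = fbm_path(hurst=0.7)    # smoother, persistent increments
```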

Determining the ‘right’ value of H for stock prices is nontrivial, and there is evidence that it may well be different from 1/2. Despite this, models with H ≠ 1/2 have so far been rejected as unsuitable. The reason is that if the stock price followed such a process then agents in the market would have an arbitrage opportunity, meaning they could produce a positive return from zero investment by clever trading. As arbitrage contradicts economic equilibrium, these processes were not much used in option pricing.

In light of our recent progress, there is hope that the situation will soon change.


In cooperation with Paolo Guasoni (Boston University) and Walter Schachermayer (University of Vienna), we have shown that if proportional transaction fees are incorporated into the trading mechanism, stock prices can be modelled as fractional Brownian motion with any H. Not only does arbitrage then disappear, but one can also construct functionals to evaluate options.

We plan to go on investigating these exciting models based on fractional Brownian motion and to develop option pricing techniques for them. One popular method is based on the preferences of investors: we wish to describe what strategy should be used to attain maximal satisfaction with respect to these preferences. This problem has strong links with optimal control, though it seems that the traditional approach (the Bellman principle) is useless for fractional Brownian motion models and we will need to apply alternative techniques.

We hope that these advances will lead to effective new methods for evaluating derivative securities in models that better approximate reality. In the near future, fractional Brownian motion may play a significant role in derivative pricing, which – as econometric investigations support – it deserves. We look forward to collaborating in these new developments.

Link:

See article no. 130 at: http://www.mat.univie.ac.at/~schachermayer/pubs

Please contact:

Miklós Rásonyi
SZTAKI, Hungary
Tel: +36 1 2796190
E-mail: [email protected]


Figure: Two trajectories of fractional Brownian motion corresponding to the parameter values H=0.3 and H=0.7, respectively. The graphs were generated with the help of Tyrone Duncan (University of Kansas) and Balázs Torma (SZTAKI).

Improving Banks’ Credit Risk Management
by Xinzheng Huang and Cornelis W. Oosterlee

The last year has seen financial institutions worldwide announce writedowns of hundreds of billions of dollars as a result of participating in the US mortgage market, in particular in subprime and other lower-rated mortgage-backed securities. Improving the practice of credit risk management by banks has thus become a top priority. Researchers Huang (Delft University of Technology) and Oosterlee (CWI) in the Netherlands focus on quantifying portfolio credit risk by advanced numerical techniques with an eye to active credit portfolio management. This work is sponsored by the Dutch Rabobank.

Credit risk is the risk of loss resulting from an obligor's inability to meet its obligations. Generally speaking, credit risk is the largest source of risk facing banking institutions. For these institutions, sound management involves measuring the credit risk at portfolio level to determine the amount of capital they need to hold as a cushion against potentially extreme losses. In practice, the portfolio risk is often measured by Value at Risk (VaR), which is simply the quantile of the distribution of portfolio loss for a given confidence level. With the Basel II accords (the recommendations on banking laws and regulations issued in 2004 by the Basel Committee on Banking Supervision), financial regulators aim to safeguard the banking institutions' solvency against such extreme losses.

From a bank’s perspective, a high level of credit risk management means more than simply meeting regulatory requirements: the aim is rather to enhance the risk/return performance of credit assets.



To achieve this goal, it is essential to measure how much a single obligor in a portfolio contributes to the total risk, ie the risk contributions of single exposures. Risk contribution plays an integral role in risk-sensitive loan pricing and portfolio optimization.

Extrapolation of credit risk from individual obligors to portfolio level involves specifying the dependence among obligors. Widely adopted in the industry is the Vasicek model, on which the Basel II internal rating-based approach is built. It is a Gaussian one-factor model, with default events being driven by a single common factor that is assumed to follow the Gaussian distribution, and obligors being independent conditional on the common factor. Under certain homogeneity conditions, the Vasicek one-factor model leads to very simple analytic asymptotic approximations of the loss distribution, VaR and VaR Contribution. However, these analytic approximations can significantly underestimate risks in the presence of exposure concentrations, ie when the portfolio is dominated by a few obligors.
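Under the homogeneity conditions mentioned above, the Vasicek one-factor model yields the well-known asymptotic loss quantile underlying the Basel II IRB formula; a minimal sketch with hypothetical inputs follows. It is exactly this kind of approximation that breaks down when a few large exposures dominate the portfolio.

```python
from scipy.stats import norm

def vasicek_var_fraction(pd: float, rho: float, alpha: float) -> float:
    """Asymptotic (infinitely granular) loss quantile of the Vasicek one-factor model:
    the fraction of the portfolio defaulting at confidence level alpha."""
    return norm.cdf((norm.ppf(pd) + rho ** 0.5 * norm.ppf(alpha)) / (1.0 - rho) ** 0.5)

# Hypothetical homogeneous portfolio: PD = 1%, asset correlation 12%, 99.9% confidence.
loss_fraction = vasicek_var_fraction(pd=0.01, rho=0.12, alpha=0.999)
var_999 = loss_fraction * 1_000_000_000 * 0.45   # exposure 1bn, LGD 45% (also assumptions)
```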

In their research, Huang and Oosterlee showed that the saddle-point approximation with a conditional approach is an efficient tool for estimating the portfolio credit loss distribution in the Vasicek model and is well able to handle exposure concentration. The saddle-point approximation can be thought of as an improved version of the central limit theorem and usually leads to a small relative error, even for very small probabilities. Moreover, the saddle-point approximation is a flexible method which can be applied beyond the Vasicek model to more heavily tailed loss distributions which provide a better fit to current financial market data.

The single factor in the Vasicek model generally represents the state of the economy. More factors are necessary if one wishes to take into account the effects of different industries and geographical regions in credit portfolio loss modelling. For example, in the current crisis the financial industry is taking the hardest hit, while back in 1997 East Asian countries suffered most. Multiple factors can be used to incorporate these details, but they generally complicate the computational process, as high-dimensional integrals need to be computed. For this, the researchers proposed efficient algorithms of adaptive integration for the calculation of the tail probability, with either a deterministic multiple integration rule or a Monte Carlo type random rule.

In the Vasicek model the loss given default (LGD) – the proportion of the exposure that will be lost if a default occurs – is assumed to be constant. However, extensive empirical evidence shows that it tends to go up in an economic downturn. A heuristic justification is that the LGD is determined by the value of collateral (eg house prices in the case of mortgage loans), which is sensitive to the state of the economy. To account for this, Huang and Oosterlee proposed a new flexible framework for modelling systematic risk in LGD, in which the quantities have a simple economic interpretation. The random LGD framework, combined with the fat-tailed models, further provides possibilities to replicate the spreads of the senior tranches of credit market indices (eg CDX), which have widened dramatically since the emergence of the credit crisis to a level that the industry-standard Gaussian one-factor model cannot produce even with 100% correlation.

This research provides useful tools to fulfill the needs of active credit portfolio management within banks. Banks can improve their insight into credit risk and take appropriate measures to maximize their risk/return profile.

Link:

http://homepages.cwi.nl/~oosterle

Please contact:

Cornelis W. Oosterlee
CWI, The Netherlands
Tel: +31 20 592 4108
E-mail: [email protected]

Figure: The global economy is the driver in a credit risk one-factor model. Source: Shutterstock.


Fast Pricing of Hybrid Derivative Products
by Cornelis W. Oosterlee and Lech A. Grzelak

When the financial sector is in crisis, stocks go down and investors escape from the market to reduce their losses. Global banks then decrease interest rates in order to increase cash flow: this may lead to an increase in stock values, since it becomes less attractive for investors to keep their money in bank accounts. It is clear, therefore, that movements in the interest rate market can influence the behaviour of stock prices. This is taken into account in the so-called hybrid models currently being developed by researchers Grzelak (Delft University of Technology) and Oosterlee (CWI) from the Netherlands.

Looking in a dictionary, we see that hybrids are, for example, creatures combining body parts of more than one real species (such as mermaids or centaurs). The hybrid contracts from the financial industry are based on products from different asset classes, like stock, interest rate and commodities. Often these products have different expected returns and risk levels. Proper construction of a new hybrid product may give reduced risk and an expected return greater than that of the least risky asset. A simple example is a portfolio containing a stock with a high risk and return and a bond with a low risk and return. If one introduces an equity component into a pure bond portfolio the expected return will increase. If the percentage of the equity in the portfolio is increased, it eventually starts to dominate the structure and the risk may increase with a higher impact for a low or negative correlation.

Advanced hybrid models can be expressed by a system of stochastic differential equations (SDEs), for example for stock, volatility and interest rate, with a full correlation matrix. Such an SDE system typically contains many parameters that should be determined by calibration with financial market data. This task is challenging: European options need to be priced repeatedly within the calibration procedure, which should therefore be done extremely fast and efficiently.

At a major financial institution like a bank, one can distinguish a number of tasks that must be performed in order to price a new financial derivative product. First, the new product is defined as the market asks for it. If this is a derivative product, then there is an underlying, modelled by stochastic differential equations (SDEs). Each asset class has different characteristics, leading to different types of SDEs. To achieve a reasonable model that is related to the present market, one calibrates the SDEs. These products also form the basis for the hedge strategies used by the banks to reduce the risk associated with selling the new product.

Once the asset price model is determined, the new derivative product is modelled accordingly. The product of interest is then priced by means of a Monte Carlo simulation for the integral version of the problem, or by the numerical approximation of a partial differential equation. The choice of numerical pricing method is thus based on whether one is aiming for the model calibration, in which speed of a pricing method is essential, or for the pricing of the new contract, for which robustness of the numerical method is of highest importance.

Fourier-based option pricing methods are computationally fast, and can be used relatively easily for the hybrid models mentioned above. They are particularly well suited for the calibration process, as their efficiency is obvious for pricing basic option products. They work whenever the so-called characteristic function of the asset price process, ie the Fourier transform of the probability density function, is available.

The contribution of the Dutch group to pricing algorithms is the development of the COS option pricing method, based on Fourier cosine expansions. The basis of this method is that the Fourier-cosine expansion of a probability density function is directly related to the characteristic function. This not only allows probability density functions to be recovered but, in addition, means highly efficient option pricing can be performed simultaneously for a large number of basic options. A unique property of the pricing method is that the pricing error decreases exponentially with the number of terms in the cosine expansion. Accurate pricing with the COS method therefore takes place in just a few milliseconds. Experience has been gained for a variety of stochastic processes for the modelling of the assets, like sophisticated hybrid models.
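The essence of the relation described above is that the Fourier-cosine coefficients of a density follow directly from the characteristic function. The sketch below recovers a standard normal density that way; in the full pricing method, the density integral is then replaced by analytically known cosine integrals of the option payoff. Truncation interval and number of terms are illustrative choices.

```python
import numpy as np

def cos_density(phi, x, a=-8.0, b=8.0, n_terms=64):
    """Recover a density from its characteristic function phi via the
    Fourier-cosine (COS) expansion on the truncation interval [a, b]."""
    k = np.arange(n_terms)
    u = k * np.pi / (b - a)
    coeffs = 2.0 / (b - a) * np.real(phi(u) * np.exp(-1j * u * a))
    coeffs[0] *= 0.5                                    # first term is weighted by 1/2
    return coeffs @ np.cos(np.outer(u, x - a))

phi_normal = lambda u: np.exp(-0.5 * u ** 2)            # standard normal characteristic function
x = np.linspace(-4.0, 4.0, 9)
approx_density = cos_density(phi_normal, x)             # ~ exp(-x^2/2) / sqrt(2*pi)
```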

At present, the researchers are improving the applicability of their pricing method. They have already generalized it successfully to the pricing of options with early exercise opportunities, as well as to so-called swing options, which allow contract holders to buy or sell electricity at certain times in the contract. Another application, related to the recent credit crisis, is the calibration of credit default swaps, one of the problematic product classes of recent years, modelled by means of jump processes, to last year's market. A remarkable correspondence between model and market prices was achieved, and calibration could be performed within reasonable time, due to the use of the COS method.

Link:

http://homepages.cwi.nl/~oosterle

Please contact:

Cornelis W. Oosterlee
CWI, The Netherlands
Tel: +31 20 592 4108
E-mail: [email protected]


Figure: A peculiar Dutch hybrid product.


Modelling and Optimization of Fixed Income Investments
by Jacek Jakubowski and Andrzej Palczewski

Modelling fixed income securities and managing risk in their portfolios is a challenging mathematical task requiring models with an infinite number of risk factors. We have designed new methods of dealing with the risk of defaultable bonds and risk diversification in large bond portfolios.

The Financial Mathematics Group at the University of Warsaw was created in the early 1990s, when a group of mathematicians at the university became interested in financial mathematics. One of the main research interests of the group members is the modelling of fixed income derivatives and the optimization of bond portfolios. Fixed income markets are in certain aspects more complicated than stock markets due to a strong correlation between instruments. There is also no widely accepted 'standard' theory for all interest rate instruments.

Credit Risk
Investment opportunities in fixed income markets include both default-free and defaultable instruments. To model the credit risk of defaultable bonds, one relates the probability of default to the rating class of the issuer (rating classes reflect a firm’s economic standing). To manage the financial risk of defaultable instruments, it is important to model not only asset prices in every rating class, but also a credit rating migration process. Investigations by our group are concentrating on a bond market model: this has a finite/infinite number of sources of randomness in the Heath-Jarrow-Morton (HJM) framework, driven by Lévy processes admitting discontinuous trajectories. The importance of models with an infinite number of risk factors has been stressed in recent papers (eg Carmona & Tehranchi, Ekeland & Taflin, Cont). In the model containing default-free and defaultable bonds with rating migrations driven by a Markov chain, we have found no-arbitrage conditions (generalized HJM conditions) for all basic types of recovery. This model has been further extended by defining a new class of stochastic processes: F-doubly stochastic Markov chains, used to model credit rating migrations. The extended model is used to value defaultable claims. The advantage of the new approach is twofold. First, the known two-state analysis of default/no-default is extended to a multi-state case. Second, we give a pricing formula for defaultable rating-sensitive claims in terms of characteristics of the rating migration process.

Optimal Investments in Bond Portfolios
The construction of a good combination of financial instruments involves an optimization problem: select the best investment portfolio from all admissible portfolios. The seminal paper by Markowitz formulates this as a risk-return trade-off. In its original formulation, it is in fact a mean-variance (MV) optimization, with the mean as a measure of return and the variance as a measure of risk. To solve this problem, it is necessary to know the distribution of random returns of risky assets. The solution procedure is then executed under the implicit assumption that we know both the mean and the covariance matrix. In fact, this assumption does not hold, and the estimation of the mean and covariance is an important part of the solution of the optimization problem.

Members of the Financial Mathematics Group have collaborated with an industrial partner to design an optimization method that uses a reasonable set of market data and produces 'good' optimal portfolios, ie portfolios that are well diversified and have asset shares that are stable with respect to estimation errors. The group has had an interesting experience in the optimal tactical asset allocation for a multi-currency bond portfolio.

Figure 1: Optimal 10-currency bond portfolios. The figure shows asset weights as a function of portfolio risk measured by standard deviation (in percent). Observe that the portfolios are well diversified and there are no corner solutions (except for very risky portfolios).

As mentioned above, the main difficulty in implementing MV optimization is the estimation of parameters. While the existing literature points to the difficulty in estimating mean returns, this problem has been overcome using the Black-Litterman method. Our experience shows that the estimation of the covariance matrix is also a difficult task. Standard maximum likelihood estimators lead to unstable portfolios that are sensitive to small changes in data. In order to improve the portfolio stability we apply robust estimators (MCD-type estimators) and implement methods that account for a memory effect in the data. We have also performed a theoretical analysis of the sources of instabilities, and concluded that they are mostly due to small eigenvalues of the covariance matrix. Although this finding is not in itself a surprise, the extent of the instability incurred by small eigenvalues has exceeded our expectations. No remedy could be found in improved estimators; rather, the solution to the portfolio stability issue is to modify the objective function by adding a penalty term of a transaction cost type.
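A hedged sketch of the kind of modification described above: a mean-variance objective with a transaction-cost-type penalty that discourages large deviations from the currently held portfolio. The data are random placeholders and this is not the group's production method.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n_assets = 10
mu = rng.normal(0.03, 0.01, n_assets)                  # hypothetical expected returns
A = rng.normal(size=(n_assets, n_assets))
cov = A @ A.T / n_assets + 1e-3 * np.eye(n_assets)     # hypothetical covariance estimate
w_current = np.full(n_assets, 1.0 / n_assets)          # portfolio held today
risk_aversion, penalty = 5.0, 0.5                      # penalty plays the role of transaction costs

def objective(w):
    return (-mu @ w
            + 0.5 * risk_aversion * w @ cov @ w
            + penalty * np.abs(w - w_current).sum())   # transaction-cost-type term

constraints = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
bounds = [(0.0, 1.0)] * n_assets                       # long-only weights
opt = minimize(objective, w_current, bounds=bounds, constraints=constraints)
w_stable = opt.x        # stays close to w_current unless the return estimates justify trading
```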

Link:

http://mathfinance.mimuw.edu.pl

Please contact:

Jacek Jakubowski
University of Warsaw, Poland
Tel: +48 22 5544 538
E-mail: [email protected]

Andrzej Palczewski
University of Warsaw, Poland
Tel: +48 22 5544 467
E-mail: [email protected]


Valuing CO2 Emission Allowances with Stochastic Control
by Olivier Davidau, Mireille Bossy, Nadia Maïzi and Odile Pourtallier

The carbon market was launched in the European Union in 2005 as part of the EU's initiative to reduce its greenhouse gas (GHG) emissions. This means that industrial players must now include their GHG emissions in their production costs. We aim to model the behaviour of an industrial agent who faces market uncertainties, and we compare the expected value of an optimal production strategy when buying (or not buying) supplementary emission allowances. This allows us to compute the agent's carbon indifference price, which defines its market behaviour (buyer or not). We aim to use the resulting indifference price to study the sensitivity of the carbon market to the shape of the penalty and the emission allowances allocation.

The global warming phenomenon and the related increase in GHG emissions are subjects of great concern in the international community. Developing international cooperation on GHG emissions reduction has led to the ratification of the Kyoto Protocol by most countries. For the European Union, it means an 8% reduction in its GHG emissions in comparison with 1990 levels, to be attained during the period 2008-2012. Such an objective is closely linked to carbon value determination over different time scales.

Determining the value of carbon is therefore an important economic issue and has led to a variety of research activities. It is at the heart of the prospective-based research effort underway at the Center for Applied Mathematics (CMA) at Mines ParisTech. In collaboration with the French Environment and Energy Management Agency (ADEME) and CMA, specialists in stochastic modelling and numerics from the TOSCA and COPRIN teams at INRIA are working on a short-term carbon value derived from the so-called financial ‘carbon market’, the European Union Emission Trading Scheme (EU ETS).

The EU ETS is a framework for GHG emissions reduction in European industry. It covers specific industrial sectors including energy production, cement, iron and steel, and oil refineries. In February, the beginning of the annual trading period, each production unit receives a certain quantity of emissions allowances. By April, the unit must possess at least the same level of allowances as its emissions during the previous year. Meanwhile, the producer is allowed to sell or buy allowances on the carbon market. If allowances do not cover emissions, the producer must pay a penalty. Such a market is called a compliance or ‘cap and trade’ market.

We aim to evaluate the consequences of the design of the allowance allocations and penalties on industries’ behaviour and GHG emissions reduction. Our approach is based on the ‘indifference price’ evaluation methodology, used to price complex Over The Counter (OTC) financial contracts. We propose to study the EU ETS indifference price according to a given sector’s players aggregation.

We model an agent (the producer) taking part in the EU ETS and seeking to optimize its production planning and CO2 allowances trading strategy for one period of the carbon market. The agent faces uncertainties in product sales and in production costs (energy, commodities etc); these are modelled using stochastic processes. The agent then chooses an adequate policy for production, investment and carbon trading in order to maximize the expected utility, a function of its final wealth which represents its preferences. For example, an electricity producer can dynamically switch between coal, gas or hydro power plants, and/or buy/sell emission allowances. To determine whether or not the agent enters the carbon market, we compare the optimal expected utility when buying or selling extra emission allowances at a given price, with the optimal expected utility without a carbon market. The indifference price is the price at which the two utilities become equal. A comparison of the current carbon price with the indifference price shows whether it is more or less financially advantageous for the producer to cover its GHG emissions. That is, when the market (buying) price is higher than the (buying) indifference price, the producer will cover its emissions; otherwise it prefers to pay the penalty. This indifference price gives us information on the investment level the producer will deploy to reduce its GHG emissions.

Mathematically, this optimization problem is a stochastic control problem, where the state variables are accumulated wealth and CO2 emissions, revenue from product sales and production costs. The optimal expected utility is the solution of a Hamilton-Jacobi-Bellman equation. Computing this price and studying its sensitivity to the shape of the allowances allocation and penalty leads us to solve this equation and eventually to invert the resulting value function. We then compute the derivatives of this solution with respect to the level of penalty/allocation. From a mathematical point of view, this leads us to developing efficient numerical schemes. An approach based on backward stochastic differential equations may provide alternative representations, qualitative information on regularity and derivatives of the solutions, and numerical tools.

Link:

http://www.modelisation-prospective.org/

Please contact:

Olivier Davidau
INRIA and Mines ParisTech, France
E-mail: [email protected]

Figure: With the European carbon market, electricity producers must include their greenhouse gas emissions in their production costs. Photo: Vattenfall.

Revenue Management in the Parking Industry: New Business Opportunities
by Rob van der Mei, Kevin Pak, Maarten Soomer and Ger Koole

Revenue Management (RM) techniques provide a powerful means of increasing revenue, creating new business opportunities for companies in a wide variety of business areas. Recently, the authors have successfully applied RM techniques for one of the world’s largest parking companies, creating millions in additional yearly revenue.

Suppose you are sitting in a plane and you ask the person next to you what price he paid for his ticket. It is likely that it will be different from the price you paid for the same ticket. This price differentiation is a result of Revenue Management (RM). While the use of RM techniques is now common practice in the airline industry, these techniques have recently started to create new business opportunities in the parking industry. In a joint effort between CWI, the Dutch company ORTEC and VU University in the Netherlands, the authors have successfully applied RM for one of the world’s largest parking companies. This has boosted the company’s annual revenue and given it a new competitive edge.

RM is a method for increasing revenue whereby available capacity (for example an airplane seat or a parking spot) is sold in a smart manner by offering different prices to different customers. In order to determine how much capacity can be offered at what moment and at what price, statistical methods and optimization models are needed.

RM emerged in the 1980s in the airline industry. Today, for many airlines RM makes the difference between profit and loss. For each flight, airlines determine in detail how many seats will be offered at a given moment and price. The commercial success of RM has triggered the interest of other business areas, such as hotels, tour operators and car rental companies.

RM can be used to stimulate demand by offering services at lower prices in a controlled manner during periods when the demand is relatively low. It can be applied to products or services where the amount of capacity is fixed and has only a limited lifetime: the number of seats in an aircraft is fixed and cannot be extended if demand happens to exceed capacity. On the other hand, an empty seat cannot be sold once the aircraft has departed.

In order to be profitable, the available capacity should be offered at different prices. For each flight, the airline determines what fraction of the (remaining) seats are to be offered at what price. Of course, price differentiation is only useful when different groups of customers exist who are willing to pay different prices. These customer groups must also be distinguishable by the selling channel they choose, the time of reservation, or their choice of additional features (such as possibilities for cancellation).

To illustrate the use of RM for airport parking, let us assume for simplicity that a parking company offers a regular price of $10 per hour, but also wants to offer a discount rate at $6 per hour. The company must then identify how many parking spots are likely to be sold at the discount rate. To this end, it is crucial to have good forecasting models that can accurately predict the demand for any given day for the different customer classes based on both historical and current data.

The next step is to use these demand forecasts to optimize the expected revenue. To this end, one needs to determine the maximum number of spots to be sold at the discount rate of $6. This is done by the so-called Expected Marginal Seat Revenue (EMSR) model, which is based on comparing the revenue of one additional parking spot at the regular rate, multiplied by the probability of filling that spot, against the discount rate. As long as the first exceeds the second, the parking company should reserve this parking spot for customers willing to pay the regular rate; otherwise, the spot should be sold at the discount rate. In this way, it can be easily determined how many parking spots should be reserved for customers willing to pay the regular price.
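A hedged sketch of the comparison described above, in its two-class special case (Littlewood's rule): keep protecting one more spot for the regular rate as long as the regular rate times the probability of still filling that spot exceeds the discount rate. The demand distribution, capacity and prices below are hypothetical.

```python
from scipy.stats import poisson

regular_rate, discount_rate = 10.0, 6.0     # dollars per hour, as in the example
capacity = 200                              # parking spots available on the day
expected_regular_demand = 120               # forecast mean demand at the regular rate

# Protect one more spot for regular-rate customers while the expected marginal
# revenue of doing so still beats selling that spot now at the discount rate.
protection_level = 0
while (protection_level < capacity and
       regular_rate * poisson.sf(protection_level, expected_regular_demand) > discount_rate):
    protection_level += 1

discount_spots = capacity - protection_level   # booking limit for discount customers
```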

For one of the world’s leading parking companies, we have developed and implemented prediction models and EMSR-based revenue optimization, in combination with smart price differentiation and market segmentation. This has led to a multi-million dollar increase in annual revenue, strongly enhancing the competitive edge of our customer.

To summarize, RM techniques originally developed in the airline industry have been successfully applied for one of the world’s leading parking companies. This case demonstrates the existence of tremendous business opportunities in the parking industry.

The authors are the founders of PreMa, the Dutch national network for Pricing and Revenue Management, a joint effort of CWI, VU University Amsterdam and the Dutch company ORTEC. The goal of PreMa is to bring together practitioners and researchers to exchange knowledge and experience in the area of RM.

Link:

http://www.prema-netwerk.nl

Please contact:

Rob van der Mei
CWI, Amsterdam, The Netherlands
Tel: +31 20 5924129
E-mail: [email protected]

Kevin Pak
ORTEC, Gouda, The Netherlands
Tel: +31 18 2540500
E-mail: [email protected]


Figure 1: Revenue management in the parking industry: who is paying what? Source: Shutterstock.


SNIPER: A Data Mining Methodology for Fiscal Fraud Detection
by Stefano Basta, Fosca Giannotti, Giuseppe Manco, Dino Pedreschi and Laura Spinsanti

An effective audit strategy is a key success factor for ‘a posteriori’ fraud detection applications in fiscal and insurance domains. ‘Sniper’ is an auditing methodology with a rule-based system, which is capable of dealing with conflicting issues such as maximizing audit benefits, minimizing false-positive audit predictions and deterring probable future fraud.

Fraud detection represents a challenging issue in several application scenarios, and the automatic discovery of fraudulent behaviour is crucial in many real-life situations. In this context, the Value Added Tax (VAT) fraud detection scenario is receiving increasing interest for both its practical and theoretical issues. Like any tax, the VAT is open to fraud and evasion. There are several ways in which it can be abused, eg by underdeclaring sales or overdeclaring purchases. In addition, opportunities and incentives to fraud are provided by the credit mechanism that characterizes VAT: tax charged by a seller is available to the buyer as a credit against his liability on his own sales and, if in excess of the output tax due, refunded to him. Thus, fraudulent claims for credit and refunds are an extensive and problematic issue in fiscal fraud detection. From this perspective, a mathematical modelling methodology capable of producing a predictive analysis tool is of great significance. The tool should be able to identify the taxpayers with the highest probability of being VAT defrauders, in order to support the activity of planning and performing effective fiscal audits.

There are several issues that make the problem difficult to address. First, each government agency has a limited auditing capability, which severely restricts the amount of audited data available. In Italy for example, audits are performed on only 0.4% of the overall population of taxpayers who file a VAT refund request. This restriction inevitably raises a sample selection bias: while auditing is the only way to produce a training set upon which to devise models, auditors focus only on subjects who according to certain criteria seem particularly suspicious. As a consequence, the proportion of positive subjects (individuals who are actually defrauders) in the training set is vast compared with that in the overall population.

The limited auditing capability of a generic revenue agency poses severe constraints also on the design of the scoring system. Auditing is a time-consuming task involving several investigation and legal steps, and ultimately requires a significant commitment of human resources. Hence, the scoring system should concentrate on a user-defined fixed number of individuals (representing the auditing capability of the agency), with high fraudulent likelihood and with a minimum false positive rate.

The situation is further exacerbated by the quest for a multi-purpose modelling methodology: in general, several objective functions characterize the fraud detection scenario, and a traditional classification scheme may fail in accomplishing such a multi-purpose task. Typically, experts are interested in scoring individuals according to three criteria:

• Proficiency: scoring and detection should not rely only on a binary decision boundary separating defrauders from non-defrauders. Rather, higher fraud amounts make defrauders more significant. For example, it is better to detect a defrauder whose fraud amounts to $1000 than one whose fraud amounts to $100.

• Equity: a weighting mechanism should highlight those cases where the fraud represents a significant proportion of the business volume. For example, an individual whose fraud amounts to $1000 and whose business volume is $100,000 is less interesting than an individual whose fraud amounts to $1000 but whose business volume is only $10,000.

• Efficiency: since the focus is on refunds, scoring and detection should be sensitive to total/partial frauds. For example, a subject claiming a VAT refund equal to $2000 and entitled to $1800 is less significant than a different subject claiming $200 who is entitled to nothing.

There are several mathematical tools based on machine learning and statistics which can be adopted to address the above issues. Approaches based on the estimation of the underlying distribution and a direct modelling of the fraudulent behaviour exhibit shortcomings due to both the complexity of the domain under consideration and the presence of noise which prevents suitable model fitting. In general, supervised techniques (using a training set of known fraudulent cases) based on hybrid or cost-sensitive classification suffer from low interpretability, which makes them inadequate for the problem at hand. In addition, the aforementioned problem of sample selection bias makes it difficult to devise a proper training set. Recently, semi-supervised and unsupervised methods have been proposed to partially overcome these drawbacks. Unfortunately, these techniques fail to provide interpretable explanations of the ‘outlierness’ of a fraudster.

Figure 1: Flowchart of the SNIPER technique.

Rule-based approaches are preferable from a socio-economic point of view. Intelligible explanations of why individuals are scored as fraudulent are far more important than the scores themselves, as the former allow auditors to thoroughly investigate the behavioural mechanisms behind a fraud. Unfortunately, rule-based classifiers exhibit poor predictive accuracy when the underlying data distribution is inherently characterized by rarity and primary aspects of the concept being learnt are therefore infrequently observed.

In this context, we developed Sniper, a flexible methodology devised to accommodate all the above-mentioned issues in a unified framework. Sniper is an ensemble method that combines the best of several rule-based baseline classification tools, each of which addresses a specific problem from among those described above. The idea of the approach is to progressively learn a set of rules until all the above requirements are met. The approach is summarized in Figure 1.

Sniper devises a scoring function that associates an individual with a value representing degree of interest according to the proficiency, equity and efficiency parameters. Clearly, the training set of audited subjects allows the computation of such a function and its analytical evaluation over those known cases. Tuning the function allows different aspects of VAT fraud to be emphasized, from which baseline classification tools can then be trained in order to associate class labels to individuals according to their relevance to the aspect of interest. Figure 2a reports an example segmentation of the audited subjects on the basis of their relevance to the scoring function. Specifically, from the lighter to the darker-coloured slice, the figure reports the percentage of subjects in four segments. Conversely, Figure 2b reports the percentage of the total amount of fraud associated to these segments.

Baseline classifiers can thus be used to filter out rules that degrade the overall accuracy of the system. This is an iterative step, which selects the best subset of rules from that obtained by the baseline classifiers. The result is a final binary classifier capable of selecting subjects to be audited. Experimentally, Sniper has proven to be simple and effective, producing a prediction model that outperforms those models generated by traditional techniques.

Please contact:

Stefano Basta
ICAR CNR, Italy
E-mail: [email protected]

Fosca Giannotti
ISTI CNR, Italy
E-mail: [email protected]

Giuseppe Manco
ICAR CNR, Italy
E-mail: [email protected]

Dino Pedreschi
Computer Science Dept., Univ. Pisa, Italy
E-mail: [email protected]

Laura Spinsanti
ISTI CNR, Italy
E-mail: [email protected]


Figure 2: (a) Subject portioning – an example segmentation of the audited subjects on the basis of their relevance to the scoring function. Specifically, from the lighter to the darker-coloured slice, the figure reports the percentage of subjects in four segments. (b) Retrieved fraud – the percentage of the total amount of fraud associated to these segments.


Enterprise Network Dynamics Explored through Computational Modelling
by Marco Remondino and Marco Pironti

We apply computational modelling and simulation to enterprise economics and the underlying network dynamics. Here, we briefly describe a comprehensive framework.

At the e-Business L@B at the University of Turin, Italy, we are developing a computational method based on iterative agent-based simulation to analyse patterns and trends in enterprise clusters. Clustering is definable as the tendency of vertically and/or horizontally integrated firms in related lines of business to concentrate geographically, or, to a more general extent, virtually. Our goal is to create a comprehensive framework that can be used as a tool to study the dynamics of network and cluster formation and modification when specific events occur. The aim is to be able to explore the qualitative and quantitative results deriving from the application of different strategies and different exogenous parameters (such as infrastructure quality, market complexity, the introduction of technological innovation, market shocks etc). Our model is at an advanced stage of development, but at present employs reactive agents: future plans include cognitive agents and real-world case analysis.

The development of a firm depends on its ability to expand its capabilities and resources and introduce its products into the market. Two main kinds of exploratory strategy are involved when developing new skills. The first is internally focused: firms can try to develop new competence through research and development activities. Alternatively, externally focused firms may look for partners with whom they can create links and exchange skills. The results will depend on the strategy employed, and this will have an impact on the topology of the enterprise’s network.

Our model proposes three categories of agent: the enterprises, the emissaries and the environment (E³). We introduce the 'heat' metaphor to represent the advantage of an enterprise having competence. Enterprises are rewarded with a quantum of heat for every competence. If the competence is internal (ie, developed by the enterprise and not outsourced) this value is higher. The goal of each enterprise is to remain above a certain heat threshold: below this it will cease activity and disappear from the grid. It achieves this by increasing its competence, both through internal and external exploration.

In the model, the environment is metaphorically represented by a lattice with dimensions n×m and each cell of this grid is assigned a heat value. A set of enterprises is uniformly distributed in the environment. Each enterprise can use its own emissary, also scattered in the environment; these represent probes that are able to move around and report information to the parent enterprise during an external exploration phase. Each enterprise is represented by a vector whose length defines the complexity of the industry in which the enterprise operates. This vector metaphorically defines the competences possessed (both internal and outsourced). Each element C_l of the vector is a Boolean variable (where 1 means that the l-th competence is possessed, while 0 means that it is not).

At each step, the cells are updated to reflect the heat produced by the enterprises according to the competences possessed and the dispersion effect managed by the environment itself. The heat of each enterprise is a function of the competences it possesses and hence of its previous behaviour (ie, its success or otherwise in creating new competences). When idle, an enterprise can decide, at each step, which strategy to choose (externally or internally explorative). Each strategy will take several steps to complete, during which the enterprise cannot carry out any other action. In addition to the physiologically scattered heat, the enterprise consumes additional energy (used by the emissary, during external exploration, or by the research and development process, during internal exploration).
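The following is a minimal sketch of a single simulation step under the metaphor just described: a Boolean competence vector, a heat reward per competence with a premium for internal ones, dissipation, and a survival threshold. The numerical values and success probabilities are hypothetical, not those of the E³ model.

import random

# Hypothetical parameters: reward per competence, premium for internal
# competences, per-step heat dissipation, and the survival threshold.
HEAT_INTERNAL, HEAT_EXTERNAL, DISSIPATION, THRESHOLD = 2.0, 1.0, 0.5, 1.0

class Enterprise:
    def __init__(self, n_competences):
        self.competences = [0] * n_competences   # Boolean competence vector
        self.internal = [False] * n_competences  # True if developed in-house
        self.heat = 5.0                          # initial endowment
        self.alive = True

    def step(self):
        # Heat produced by the competences currently possessed.
        produced = sum((HEAT_INTERNAL if self.internal[l] else HEAT_EXTERNAL)
                       for l, c in enumerate(self.competences) if c)
        self.heat += produced - DISSIPATION
        if self.heat < THRESHOLD:                # below threshold: leave the grid
            self.alive = False
            return
        # Choose a strategy for the next phase (purely random in this sketch).
        if random.random() < 0.5:
            self.explore_internally()
        else:
            self.explore_externally()

    def explore_internally(self):
        l = random.randrange(len(self.competences))
        if random.random() < 0.3:                # R&D succeeds with probability 0.3
            self.competences[l], self.internal[l] = 1, True

    def explore_externally(self):
        l = random.randrange(len(self.competences))
        if random.random() < 0.5:                # a partner is found, a link is created
            self.competences[l], self.internal[l] = 1, False

if __name__ == "__main__":
    firm = Enterprise(n_competences=8)
    for _ in range(20):
        if firm.alive:
            firm.step()
    print("alive:", firm.alive, "heat:", round(firm.heat, 2),
          "competences:", sum(firm.competences))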

If a given action is successful, the end of a phase of internal or external exploration will see the enterprise possess a new competence.

Enterprise Network Dynamics Explored through Computational Modelling by Marco Remondino and Marco Pironti

We apply computational modelling and simulation to enterprise economics and the underlying network dynamics. Here, we briefly describe a comprehensive framework.

Graphical Output from the Model, Showing the Network Dynamics (left), and the Geographical Dynamics (right).


An action is unsuccessful if, after several steps of research and development (internal exploration) or external searching to find a partner (through the emissary), nothing new is found and the l-th competence remains set to zero. While internal competences are rewarded more than external ones, they are more expensive and difficult to produce. However, internal exploration is important, since it also permits the exchange of innovative competences during the external exploration phase, with a link being created every time at least one competence is exchanged between two enterprises. At each time-step, the set of links connecting enterprises is updated according to the competences exchanged among the enterprises.

As a result, the model defines the dynamics of link creation among enterprises. Many parameters are definable by the user at the beginning of the simulation; the framework is comprehensive and still under development. At present, the agents are reactive or stochastic, meaning their behaviour is determined by predefined rules based on the stimulus-reaction paradigm, or by a probability distribution, but we are developing cognitive strategies for them. In particular, agents will act realistically, by considering past data through a trial and error technique based on reinforcement learning. In order to simulate as far as possible the behaviour of humans in the real world, the agents will not be totally rational; they will have an internal bias that will affect the way they cognitively form their own actions for the next step.

Link:

http://e-lab.di.unito.it/

Please contact:

Marco Remondino University of Turin E-mail: [email protected]


Successful corporate management depends on efficient strategic and operative planning. In this context, reliable forecasts make an important contribution. As the automobile industry is one of the most important sectors of the German economy, its development is of the utmost interest. Developments in both mathematical algorithms and computing power have increased the reliability of forecasts enormously. Enhanced methods such as data mining, and advanced technology allowing the storage and evaluation of large empirical data sets, mean that forecasts are more reliable than ever before. At the same time, the explicability of a forecast model is as important as its reliability. In cooperation with the service company BDW Automotive, which consists of market experts in the automobile industry, the Fraunhofer-Institute for Algorithms and Scientific Computing (SCAI) has developed a software tool for sales forecasts, wherein the methods applied are highly accurate and at the same time easily explicable.


A New Approach to Explicable Sales Forecast Models for the German Automobile Market by Marco Hülsmann, Christoph M. Friedrich and Dirk Reith

The Fraunhofer-Institute for Algorithms and Scientific Computing (SCAI) has developed a software tool for sales forecasts using a methodology that consists of a combination of time series analysis and data mining techniques.

Figure 1: Model evaluation and forecast for quarterly data. In the training period (1992-2006), the blue line indicates the time series model; in the test period (2007/2008), it shows the predictions based on this model. Using a Support Vector Machine as multivariate trend estimator, this leads to an average error of 5.91% in the test period, and a forecast up to the year 2011 was performed. The decrease of automobile registrations due to the financial crisis in the last two quarters of 2008 is depicted correctly. However, this is only because perturbations indicated by the orange rugs were considered. Lower rugs represent positive and upper rugs represent negative perturbations.

[Figure 1 axes: registrations of new automobiles (vertical) versus time, quarterly from 1/92 to 1/11 (horizontal); annotation: Average Error 2007-2008: 5.91%.]


Data
The data for this model consists largely of registrations of new automobiles as well as economic exogenous parameters like the Gross Domestic Product, the Consumer Price Index, the Unemployment Rate, the Interest Rate and the Industrial Investment Demand: these are available from the Federal Statistical Office and the German Federal Bank, mostly as monthly or quarterly data. In addition, market experts provide domain-specific economic factors like Latent Replacement Demand and Model Policy. Our software tool is able to generate forecasts for monthly, quarterly and yearly time intervals; if the data is not published at the required time interval, conversions are performed. As the German automobile market increased extraordinarily due to the reunification of the two German states in 1990, this could only be treated as a massive shock event, which caused all data prior to 1992 to be discarded. We therefore used yearly, monthly and quarterly data from 1992 to 2008 for our software tool. This has led to a model based on the past that is considered capable of making reliable predictions for the future.

Methodology
The methodology consists of a combination of time series analysis and data mining techniques. New car registrations are considered as a time series composed additively of trend, seasonal and calendar components, which are assumed to be independent. The seasonal and calendar components are estimated by standard time-series analysis methods, whereas the trend component is multivariate, ie it depends on economic exogenous parameters and is therefore estimated by data mining techniques. The simplest method applied in this context was multivariate linear regression. However, it delivered poor results, as the trend could not be assumed to be roughly linear. Hence, nonlinear trend estimators were used, which yielded much more reliable results, eg a Support Vector Machine with a Gaussian kernel, Decision Trees, K-Nearest-Neighbour and Random Forest.
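A toy version of this decomposition on synthetic quarterly data is sketched below, assuming scikit-learn's SVR as the Gaussian-kernel Support Vector Machine; it is not the SCAI/BDW tool, the calendar component is omitted and the exogenous inputs are reduced to a single GDP-like series.

import numpy as np
from sklearn.svm import SVR

# registrations_t = trend(exogenous_t) + seasonal_t (+ calendar term, omitted)
rng = np.random.default_rng(0)
n_quarters = 60
gdp = np.linspace(100, 160, n_quarters) + rng.normal(0, 2, n_quarters)
season = np.tile([15.0, -5.0, 10.0, -20.0], n_quarters // 4)
registrations = 5 * gdp + season + rng.normal(0, 10, n_quarters)

# 1) Estimate the seasonal component by averaging per quarter.
quarter = np.arange(n_quarters) % 4
seasonal_est = np.array([registrations[quarter == q].mean()
                         - registrations.mean() for q in range(4)])[quarter]

# 2) Fit the multivariate trend on exogenous data with a Gaussian-kernel SVM.
X = gdp.reshape(-1, 1)                    # several exogenous columns in practice
trend_model = SVR(kernel="rbf", C=1000.0, gamma="scale")
trend_model.fit(X[:48], (registrations - seasonal_est)[:48])   # training period

# 3) Forecast the held-out quarters and report the average relative error.
pred = trend_model.predict(X[48:]) + seasonal_est[48:]
print("average relative error:",
      np.mean(np.abs(pred - registrations[48:]) / registrations[48:]))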

Model Evaluation and Limitation of Forecasts
In order to evaluate the model, a training period and an adjacent test period must be defined. In the former, the model is built by estimating trend, seasonal and calendar components, and for the latter, a forecast is made, with the predicted registrations of automobiles being compared to real data. Taking the period from 1992 to 2006 as the training period and 2007/2008 as the testing period, the average error on the predicted registrations is very low (0.1-1% for yearly, 4-6% for quarterly and 6-7% for monthly data) using high-performance nonlinear data mining techniques. These low error rates represent a very good performance of our forecast tool.

Due to the expertise of BDW Automotive, external perturbations can be taken into consideration. For example, the rise in the German sales tax in 2007 from 16% to 19% led to increased sales in 2006, and the recent financial crisis led to a decrease in the last two quarters of 2008. It should be pointed out that forecasts are always limited by uncertainties caused by perturbations in the future, which are not predictable or whose effects cannot be assessed. The current grant of the scrapping bonus in Germany, which led to higher sales in the year 2009, is another such occurrence. As a result, reliable trend forecasts can always be achieved but more detailed forecasts must be interpreted with care, as they cannot handle random events.

Future Work
The focus now lies on applying our software tool to other traditional major markets, eg France, Japan and the USA. This will require different exogenous parameters to be considered along with different economic perturbations.

Please contact:

Dirk Reith Fraunhofer-Institute for Algorithms and Scientific Computing (SCAI), Germany Tel: +49 2241 142746 E-mail: [email protected]

Daily we face numerous decisions. Many of these decisions are not subject to a formal process of analysis, but may implicitly incorporate multiple criteria and frequently involve uncertainty. A classical example is the choice of what to wear each day, which depends on the weather forecast, the agenda for the day, comfort and the image we wish to project. Despite this being a familiar occurrence to all of us, most of the decision models for investment analysis and resource allocation still try to capture the complex nature of decision processes with very simplified rules and one sole criterion. Our research at CEGIST goes in the opposite direction.

Let us start with an example at a macro-economic level: the allocation of regional funds within the European Union. These funds represent a significant fraction (1/3) of the European budget, but are distributed based on a single indicator (GDP per head), which is used to segment the European regions (NUTS 2). Such segmentation leads to very heterogeneous regions, and is insufficient for characterizing the types of dissimilarity among regions, which is necessary to design solutions tailored to the different needs within the EU territory.

Multiple Dimensions in Resource Allocation and Investment Decisions by João Oliveira Soares

Diverse case studies carried out at CEGIST (Centre for Management Studies) at the Technical University of Lisbon have a common root: the support and analysis of investment decisions made by public authorities, banks and private firms. The approach comprises multiple dimensions and reflects on the use of statistical and quantitative methods along with qualitative criteria and judgment-based procedures. Recent work also adds flexibility and risk issues to this multidimensional context.


Our work has been to develop approaches based on multivariate statistical techniques that allow the identification of the different axes of socio-economic development and, subsequently, the segmentation of regions into more coherent and homogeneous clusters.
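One simple way to obtain such a segmentation, assuming standardization, principal component analysis for the axes of development and k-means for the clusters (the exact techniques used at CEGIST may differ), is sketched below on synthetic indicators.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic socio-economic indicators for a set of regions (columns could be
# GDP per head, employment rate, R&D spending, schooling levels, ...).
rng = np.random.default_rng(1)
indicators = rng.normal(size=(250, 6))

# 1) Standardize and extract the main axes of socio-economic development.
axes = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(indicators))

# 2) Group the regions into more coherent clusters along those axes.
labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(axes)
print("cluster sizes:", np.bincount(labels))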

Credit Risk Analysis
Segmentation, classification and multivariate statistical methods are also used in credit risk analysis, namely to help credit-granting decisions. For some types of loans, however, the modelling procedure must permit the inclusion of expert judgements, since qualitative criteria, particularly management's experience and reliability, show significant negative correlation with banks' default records. Within this context, we extended our work to the use of multicriteria decision analysis models as a way of assessing credit risk by integrating both qualitative and quantitative data.

Similarly, we developed a multicriteria value model for selecting and managing portfolios of financial assets. It consists in defining a benchmark portfolio and scoring its different stocks according to expected return criteria. A procedure is then proposed to suggest adjustments to the proportions of stocks in the portfolio. Finally, the risk is taken into consideration in an optimization module that includes constraints concerning the limits of variation for the proportion of each stock.

This approach is different from others found in the literature, eg mono-objective optimization with constraints associated to the different factors; and multi-objective and goal programming. The reason is the ambiguous nature of some indicators. Think of the interpretation of a usual market indicator: P/E, the price-to-earnings ratio. A high value of P/E, for instance, can indicate that the market expects future earnings to grow fast or that the company has a low risk. However, it can also reveal that the stock is overpriced. The question then is not whether the ratio is high or low, but whether or not one believes the market is correctly (efficiently) pricing the asset. A multicriteria value approach is well suited to the subjective (and complex) nature of investors' judgments on quantitative and qualitative criteria.

The final link in our conceptual chain adds multiple dimensions with flexibility and risk. The focus is again on the support of investment decisions, now in terms of real assets. Imagine that the decision to invest in a new industrial unit might be postponed, waiting for more information about the expected market behaviour. This postponement option has a value that can be estimated by adapting the usual financial option models to real assets. In this case, the volatility is assumed to be represented by the variation in prices of the underlying asset (shares of the firm). However, projects are subject to multiple risks: variation in output price, variation in input prices, and, among other things, uncertainty in the future prices of CO2 licences. These multiple sources of risk are not contemplated in option models. The solution we are working on is to use Monte Carlo methods to generate different streams of cash flow and to use their variance as a proxy for the volatility of the real options. Finally and in parallel, we are working on the development of multicriteria models to incorporate sustainable development concerns into the investment decisions framework.
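The volatility-proxy idea can be illustrated with entirely hypothetical figures: correlated output, input and CO2-licence prices are simulated, the dispersion of the discounted cash-flow streams is taken as the volatility input, and a one-year deferral option is then valued with a Black-Scholes-type formula. The sketch below illustrates the approach being explored, not a calibrated model.

import numpy as np
from math import log, sqrt, exp, erf

rng = np.random.default_rng(2)
n_paths, n_years, r, invest = 5000, 10, 0.04, 600.0   # hypothetical figures

vols = np.array([0.20, 0.15, 0.30])     # output, input, CO2-price volatilities
corr = np.array([[1.0, 0.3, 0.2],
                 [0.3, 1.0, 0.1],
                 [0.2, 0.1, 1.0]])
chol = np.linalg.cholesky(corr)

pv = np.zeros(n_paths)
for p in range(n_paths):
    prices = np.array([10.0, 4.0, 2.0])  # output, input and CO2-licence prices
    for t in range(1, n_years + 1):
        z = chol @ rng.standard_normal(3)
        prices = prices * np.exp(-0.5 * vols**2 + vols * z)   # driftless lognormal step
        cash_flow = 12 * prices[0] - 8 * prices[1] - 3 * prices[2]
        pv[p] += cash_flow * exp(-r * t)

v0 = pv.mean()
sigma = np.std(np.log(np.maximum(pv, 1e-6)))   # volatility proxy from simulated PVs

def ncdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))    # standard normal CDF

d1 = (log(v0 / invest) + r + 0.5 * sigma**2) / sigma      # one-year deferral option
d2 = d1 - sigma
print("expected PV %.1f, volatility proxy %.2f, deferral option value %.1f"
      % (v0, sigma, v0 * ncdf(d1) - invest * exp(-r) * ncdf(d2)))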

Link:

https://fenix.ist.utl.pt/investigacao/ceg-ist/inicio

Please contact:

João Oliveira Soares University of Lisbon, Portugal E-mail: [email protected]


Figure 1: The European Regional Policy and the Socioeconomic Diversity of European Regions: A Multivariate Analysis. Left: EU-25: convergence and competitiveness objectives, 2007-2013 - EC classification. Right: EU-25: clusters of socio-economic similarity.

Figure 2: Credit Scoring - the software PROBE is used to perform sensitivity analysis on the weight of a component.


Efficient computation of prices and hedges for derivative products is a major issue for financial institutions. The development of increasingly complex financial products requires advanced stochastic and numerical analysis techniques. The software 'PREMIA' offers numerical solutions to the problems of pricing and hedging financial derivatives, with a collection of algorithms stemming from recent research in financial mathematics.

Premia has been developed in the framework of the MATHFI research team, which gathers researchers in probability and mathematical finance from INRIA, ENPC (Ecole Nationale des Ponts et Chaussées) and the University of Paris-Est Marne-la-Vallée. The development of Premia started in 1999.

This project keeps track of the most recent advances in the field of computational finance. Premia contains various numerical algorithms: deterministic methods (finite difference and finite element algorithms for partial differential equations, wavelets, Galerkin, sparse grids etc), stochastic algorithms (Monte-Carlo simulations, quantization methods, Malliavin calculus-based methods), tree methods and approximation methods (Laplace transforms, Fast Fourier transforms etc). These algorithms are implemented for the evaluation of plain vanilla and exotic options on equities, interest rates, inflation, credit and energy derivatives. For equity derivatives, for example, the multi-dimensional Black-Scholes model is available, as well as stochastic volatility models (Heston, Hull-White, Fouque-Papanicolaou-Sircar) and various Lévy models with jumps (Merton, Kou, Variance Gamma, normal-inverse Gaussian (NIG), Tempered stable). The most recent Monte-Carlo algorithms are implemented for high-dimensional American options. Various models of interest rate derivatives are included, for example affine models, quadratic term structure models, the Heath-Jarrow-Morton model and the Libor Market Model. Moreover, Premia provides a calibration toolbox for the Libor Market Model, using a database of swaptions and cap implied volatilities, and a toolbox for pricing credit derivatives (credit default swaps (CDS) and collateralized debt obligations (CDOs)) using the most recent algorithms.
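The routines collected in Premia are C/C++ code; as a language-neutral illustration of the simplest kind of algorithm involved (and emphatically not Premia code or its API), the sketch below prices a European call under Black-Scholes both in closed form and by Monte-Carlo simulation.

import numpy as np
from math import log, sqrt, exp, erf

def bs_call(s0, k, r, sigma, t):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (log(s0 / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    ncdf = lambda x: 0.5 * (1.0 + erf(x / sqrt(2.0)))
    return s0 * ncdf(d1) - k * exp(-r * t) * ncdf(d2)

def mc_call(s0, k, r, sigma, t, n_paths=200_000, seed=0):
    """Monte-Carlo price of the same call by simulating the terminal price."""
    z = np.random.default_rng(seed).standard_normal(n_paths)
    s_t = s0 * np.exp((r - 0.5 * sigma**2) * t + sigma * sqrt(t) * z)
    return exp(-r * t) * np.maximum(s_t - k, 0.0).mean()

print(bs_call(100, 100, 0.03, 0.2, 1.0), mc_call(100, 100, 0.03, 0.2, 1.0))

The two numbers should agree to within the Monte-Carlo error, which is the kind of cross-check and benchmarking the platform is meant to support.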

The Premia software provides a collection of C/C++ routines and scientific documentation in PDF and HTML, and is available for both Windows and Linux operating systems. Interfaces for Excel and NSP/Scilab are available. More precisely, Premia is composed of (i) a library designed to describe derivative products, models and pricing methods, and which provides basic input/output functionalities; (ii) a unified numerical library (Premia Numerical Library (PNL)) available for contributors; (iii) a collection of pricing routines which can easily be plugged, if necessary, into other financial software; and (iv) a scientific documentation system.

The development of Premia is being undertaken along with a consortium of financial establishments. Consortium Premia is presently composed of CALYON, Natixis, Société Générale, Bank of Austria and RZB (Raiffeisen Zentralbank Österreich). The participants in the consortium contribute to finance the development of Premia and help to determine the directions in which the project evolves. Every year, a new release is delivered to the consortium members. All algorithms implemented in Premia are provided with their source codes, solid research documentation and extended references. A restricted, open-source version of Premia is also available on the Premia Web site and can be downloaded with a special licence for academic and evaluation purposes.

Premia aids research and promotes technology transfer, creating a bridge between academic researchers and professional R&D teams. It supplies a benchmark for new numerical pricing methods and provides useful teaching support for graduate students in mathematical finance.

Links:

http://www.premia.fr
http://www-rocq.inria.fr/mathfi/

Please contact:

Agnès Sulem INRIA Paris-Rocquencourt, France E-mail: [email protected]

Antonino Zanette Universita di Udine, Italy E-mail: [email protected]

Premia: A Numerical Platform for Pricing Financial Derivatives by Agnès Sulem and Antonino Zanette

PREMIA is a computational platform designed to set up a technology watch for numerical problems related to the evaluation of financial derivative products and the management of pertinent risks. It is developed by the MATHFI research team at INRIA.

Figure 1: the implied volatility surface computed by the software Premia in the Heston model.


Two financial and two technology issues are combining to force a radical rethink of how financial algorithms are computed.

First, the trading landscape is changing. The 'Markets in Financial Instruments Directive' (MiFID) regulation in Europe has encouraged the creation of a raft of new execution venues. Combined with the growth of automated and algorithmic trading, the amount of market data that must be processed - and the speed with which that must be accomplished in order to deliver profitable trades - means that at every stage of the trade lifecycle, high-performance computation is required to deliver low-latency results.

Second, in the aftermath of the credit crunch there is a requirement for more sophisticated risk calculations, both from trading firms themselves and from regulators. Overnight risk calculations are no longer sufficient, with intra-day risk runs tending towards real-time calculations, covering multiple departments and asset classes in addition to counterparty and liquidity risk.

On the technology side, processor clock speed, which for many years advanced in line with Moore's Law, has stalled at around 3 GHz, as chips running faster than that consume over 150 Watts. In addition, the pressure to reduce our carbon footprint forces us to explore alternatives in many areas of life, particularly in transport and in IT.

A first approach to dealing with these issues is to exploit multicore by parallelizing applications. While this is a good option, it is non-trivial and doesn't fix all of the issues raised above. Compilers will only automagically parallelize a very small class of applications; some will be unable to exploit multicore at all, while others will require a significant (possibly a total) rewrite. In addition, if we look forward five years or more we need to consider not just dual or quad core, but dozens of cores with heterogeneous architectures.
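For an embarrassingly parallel workload such as revaluing independent chunks of a portfolio under a scenario, the parallelization itself can be as simple as the sketch below (written in Python for brevity; the portfolio, scenario and figures are hypothetical, and real trading-floor code would look very different).

import os
import random
from concurrent.futures import ProcessPoolExecutor

def revalue_chunk(positions):
    """Revalue one chunk of (quantity, price) positions under a single shock."""
    random.seed(42)                              # same scenario for every chunk
    shock = 1.0 + random.gauss(0.0, 0.02)
    return sum(qty * price * shock for qty, price in positions)

def parallel_portfolio_value(positions, workers=os.cpu_count()):
    size = max(1, len(positions) // workers)
    chunks = [positions[i:i + size] for i in range(0, len(positions), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(revalue_chunk, chunks))

if __name__ == "__main__":
    portfolio = [(10, 100.0)] * 100_000
    print(parallel_portfolio_value(portfolio))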

An alternative to this is to deploy specialist architectures. Many examples exist of appliances, driven by exotic processing technology, being deployed to handle one or more specific functions such as latency monitoring and analysis, market data feed handling, ticker plant management, message acceleration or risk analysis. The processors used include Field Programmable Gate Arrays (FPGA), Graphics Processing Units (GPU) and the IBM Cell Broadband Engine (a heterogeneous high-performance computing architecture used in the Sony Playstation 3 games console).

FPGAs, in which the personality of the chip must be programmed by the developer, are not as good for computationally intensive workloads but are well suited to the processing of streaming data. Examples of FPGA use in support of financial markets include:
• Endace for latency monitoring
• Celoxica for data feed handling
• Exegy for ticker plant acceleration
• ActivFinancial for market data processing
• Solace, Tervela and Tibco for message acceleration
• TS Associates and Vhayu for data compression.

FPGAs run at low clock speeds and deliver high performance through pipelining and parallelization. Due to this complexity, their use by mainstream developers is very limited, but their value in appliances where they can be programmed once - and used broadly - is significant. A single FPGA-based appliance can replace a rack of commodity servers, significantly reducing up-front cost, power and cooling charges, and saving potentially very expensive rack space in colocation facilities shared with stock exchanges.

Both GPUs and IBM Cell have been used successfully in performance experiments with financial applications. IBM Research has demonstrated excellent performance of computationally intensive risk analytics applications, and particularly good scaling results when increasing the number of Cell processors targeted. The InRush ticker plant appliance from Redline Trading uses the IBM Cell as an acceleration co-processor. Professor Mike Giles' group at Oxford University has developed a series of demonstrators for financial applications based on Nvidia CUDA GPUs with up to 240 cores, generally delivering between 50 and 100 times speed-up compared to a single Xeon core.

One obvious conclusion is that numerically intensive compute performance for financial applications running on future platforms will be delivered through the complexity of parallelism and heterogeneity. In order that the performance is accessible to a broad range of developers it is crucial that appropriate programming paradigms and tools, together with portable libraries and simple APIs, are developed. One example of this is the C library of mathematical functions from the Numerical Algorithms Group that supports the IBM Cell processor (in addition to many others). Another is the Fortran and C compilers from The Portland Group that support multicore x86 processors as well as GPUs from AMD/ATI and Nvidia. For multicore and other, more exotic processor technologies to be successful in meeting the performance needs of financial markets, new programming paradigms, languages, tools and libraries must follow.

Please contact:

John Barr The 451 Group, UK Tel: +44 7988 764900 E-mail: [email protected]


Accelerating Applications in Computational Finance by John Barr

The computational power required to participate successfully in today's financial markets is growing exponentially, but the performance of commodity processors is not keeping pace. How can firms harness innovative technology to achieve success?


While the portfolios of loans given to SMEs and portfolios of corporate bonds do not have the glamour of the equity or foreign exchange markets, they are nevertheless the core business of many financial institutions. Such portfolios tend to have values of thousands of millions of dollars and may be composed of tens of thousands of small loans.

At first glance, credit portfolios may seem harmless because payments are based on pre-agreed dates. Nothing could be farther from the truth. Defaults frequently occur in clusters, affecting entire sectors of industry. The correlation between defaults is the main difficulty in risk assessment, and this is the rationale behind many of the approaches to credit risk valuation. The impact of the correlation on the loss distribution function is pronounced, causing an asymmetrical distribution function with long tails.

CreditCruncher is an open-source project for the simulation of massive portfolios of loans in which the only risk considered is default risk. It is aimed at financial institutions searching for a well-documented and efficient tool. It consists of an engine that runs in batch mode and is designed to be integrated into the risk management systems of financial institutions for risk assessment and stress testing. The method used to determine the distribution of losses in the portfolio is the Monte Carlo algorithm, because it allows us to consider the majority of variables involved, such as the date and amount of each payment. This approach is conceptually equivalent to the valuation of portfolios of equities and derivatives using the Monte Carlo method. In the case of market risk, the price of the underlying assets is simulated using a geometric Brownian motion with a given volatility and trend. In the credit risk case, borrowers' defaults are simulated using a copula with given survival rates and correlations.


Simulating Large Portfolios of Credit: The CreditCruncher Project by Gerard Torrent

Financial institutions need tools to simulate their credit portfolios in the same way that they simulate their equity portfolios. CreditCruncher allows each payment in a massive loan portfolio to be simulated, taking into account the probability of default for each creditor and the correlations between them.

[Figure 1 content: probability distribution of the portfolio loss at time T (from 'nobody defaults' to 'everybody defaults') for a portfolio of 100 loans of $1 with P(default) = 10% at a one-year horizon. X: uncorrelated defaults, Expected Loss = 10, VaR99% ≈ 17. Y: correlation = 0.2 with Normal copula, Expected Loss = 10, VaR99% ≈ 42. Z: correlation = 0.2 with T-Student(3) copula, Expected Loss = 10, VaR99% ≈ 58.]

Figure 2: Simulation schema implemented by CreditCruncher.

Figure 1: Impact of correlation and copula selection on portfolio loss distribution.


In recent times, we have witnessed a large-scale economic turmoil whose future is hard to predict. The crisis has so deeply involved the world's population that it is mentioned constantly in the media and is having a daily influence on government agendas. People's reactions and opinions are as diversified as their experience of these difficult matters.

How can mathematics be of help in such a situation? The spectacular success of mathematics in the physical sciences is based on a long interaction between theory and experiment, with trial and error procedures and multiple cycles of feedback. Reality is quantitatively 'understood' when a theory (whose language is mathematics) is able to deduce observed phenomena from a small number of simple principles and predict the output of new experiments. When even a single experiment contradicts the theoretical predictions, the whole machinery must be modified at the cost of replacing some of the principles with new ones.

Economics, however, has followed an apparently different path. On the one hand, the large amount of available data has only begun to come under serious consideration in the last century. The discovery that the tails of the probability distributions of price changes are generally non-Gaussian is quite a recent achievement. On the other hand, the axiomatic method of deductive science has been applied without performing proper feedback checks with observations: the principles of rationality of economic agents, market efficiency etc, have prospered within some schools of economics more like religious precepts than scientific hypotheses.

Yet testable and predictive theories have appeared in economics. The study by D. McFadden (2000 Nobel Laureate in Economics) on the use of the San Francisco BART (Bay Area Rapid Transit) transportation system is a celebrated example. It is interesting to notice that from a mathematical point of view, this work is equivalent to the Langevin theory of a small number of types of independent particles. However, when applied to cases in which the particles or agents interact and the resulting peer-to-peer effects play a more substantial role, that theory turns out to be inefficient.



Mathematics for Economics: A Statistical Mechanics Perspective by Pierluigi Contucci and Francesca Romiti

Will the noble (and sometimes snobbish) queen of sciences, mathematics, have a role in the future study of economics? Will the role it plays (if any) be as crucial as in the physical sciences? We argue that mathematics will very likely have a pivotal beneficial mutual exchange with economics, particularly through the study of statistical mechanical models of complex systems.

Sklar's theorem states that any multivariate distribution can be decomposed into two parts, the marginal distributions and a copula that reflects the structure of dependencies between components. This result is important because it allows us to separate the default probabilities from the default correlations.
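In symbols, Sklar's theorem guarantees that any joint distribution F with margins F1, ..., Fn can be written, for some copula C, as

F(x1, ..., xn) = C(F1(x1), ..., Fn(xn)),

so that in the credit setting the margins (the individual default-time distributions) and the dependence structure between borrowers can be specified separately.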

CreditCruncher implements two of the most commonly used copulas in finance, the Normal and T-Student copulas. Because the default time of each borrower is modelled as a random variable, the size of the copula can be over 50,000, where each component represents a borrower. Simulating low-dimensional copulas poses no problem, but many of the algorithms operating in low dimensions are unsuitable for large dimensions. To solve this problem, the Cholesky decomposition was adapted to operate with a large correlation matrix composed of blocks, in order to reduce the memory space and the number of operations involved.

Each Monte Carlo run simulates the default time for each debtor by combining a copula sampling with the survival functions. This default time is used to determine the loss caused by the default, mitigated by an estimated recovery at a given time. The default amount is discounted to the present value using the interest rate provided by the yield curve. The sum of all losses provides the loss value of the portfolio for that simulation. The completion of thousands of simulations allows us to obtain the loss distribution function and compute the usual risk indicators, such as the Value at Risk or Expected Shortfall. Depending on the portfolio size and the required accuracy, the computational cost may be high; CreditCruncher is therefore compatible with execution on parallel computers.
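A minimal sketch of one such run is given below, assuming a Normal copula sampled through a one-factor representation, exponential survival functions with a flat hazard rate, a fixed recovery rate and a flat discount curve. It illustrates the scheme, it is not CreditCruncher code, and all parameter values are hypothetical.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n_borrowers, n_sims = 1000, 5000
hazard, exposure, recovery, r, horizon = 0.10, 1.0, 0.40, 0.03, 1.0
rho = 0.2                                     # pairwise correlation

def one_run():
    # Sample the Normal copula via a one-factor representation of an
    # equicorrelated Gaussian vector (no large Cholesky factor needed).
    common = rng.standard_normal()
    z = np.sqrt(rho) * common + np.sqrt(1 - rho) * rng.standard_normal(n_borrowers)
    u = norm.cdf(z)
    default_time = -np.log(1.0 - u) / hazard  # invert the exponential survival function
    defaulted = default_time <= horizon
    loss = exposure * (1.0 - recovery) * np.exp(-r * default_time[defaulted])
    return loss.sum()                         # discounted portfolio loss for this run

losses = np.array([one_run() for _ in range(n_sims)])
print("expected loss:", losses.mean(), "VaR 99%:", np.quantile(losses, 0.99))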

CreditCruncher is currently a one-man project with no financial support, and is looking for collaborators with expertise in applied and/or theoretical financial mathematics to participate in the project, which is released under the GPL (GNU General Public License). The CreditCruncher system has been tested against both simple hand-made portfolios and large automatically generated portfolios. Future work being considered includes stochastic recovery rates, pure risk operations like bank guarantees, and the creation of a graphical interface to make the tool accessible to a wider public, such as bond portfolio managers.

Links:

http://www.generacio.com/ccruncher/
http://sourceforge.net/projects/ccruncher/

Please contact:

Gerard Torrent Mediapro I+D, Spain Tel: +34 606.179.518 E-mail: [email protected]


There are several factors responsible for the delay in the advent of the scientific method within the economic sciences. The intrinsic difficulty of its topics and the gap between these and the available mathematical techniques is one. Until a few decades ago, mathematics only treated simple models with translation or permutation invariance. From the point of view of statistical mechanics, only uniform interactions were understood. However, as the physicist Giorgio Parisi likes to phrase it, science has become more robust and the theory of complex systems has made enormous progress. Among the things that have been learned is how to treat systems in which imitative and counter-imitative interactions occur, and where interactions themselves are generally random variables and are related to novel topological properties.

The challenge we now face is to fill the gap between phenomenological and theoretical approaches. Data analysis must increase in depth and in particular must follow a theoretical guide. Performing an extensive search of data without having an idea of what to hunt for is an illusion no less dangerous than the search for principles without regard for experiments. In the same way, the refinement of the theoretical background of economics must proceed in parallel with data searching and analysis. The group of the Strategic Research Project in Social and Economical Sciences at the University of Bologna is working on these themes (see Link 1). Among the approaches being explored is the use of statistical mechanics to extend the McFadden theory to include interacting systems. There are good indications that this approach could lead to interesting results. First, it has the potential to include sudden changes in aggregate quantities even for small changes of the external parameters, as happens in an economic crisis (Link 2). Second, it may eventually make use of the complex systems theory of spin glasses, whose applicability to economics is well understood (Link 3). Third, it has built into it the capability to include acquaintance topologies, especially those that have been observed in network theory, like the small-world and scale-free ones (Link 4).

Mathematics can provide a substantial contribution in the crucial phase of checking that questions are well posed and then solving them. It is obvious that new mathematical instruments will be necessary and that the process of developing them will be lengthy. An early phase in which mathematics will be involved is the so-called 'inverse problem'. Unlike physics, where interactions between agents have generally been established by pre-existing theories, in the realm of economics effective interactions should be deduced from data, possibly at a non-aggregate level. From a mathematical point of view, the computation of interaction coefficients from real data is a statistical mechanics inverse problem, a research setting to which many fields of science are turning their attention. The inverse problem solution is structurally linked to the monotonic behaviour of observed quantities with respect to parameters (Link 5).
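One of the simplest instances of such an inverse problem is the reconstruction of pairwise couplings by naive mean-field inversion, in which the coupling matrix is approximated by minus the inverse of the connected correlation matrix. The sketch below applies the formula to placeholder binary data and ignores the many refinements a real application to economic data would require.

import numpy as np

rng = np.random.default_rng(4)
n_agents, n_samples = 20, 50_000
samples = rng.choice([-1, 1], size=(n_samples, n_agents))   # placeholder data

m = samples.mean(axis=0)                                    # mean orientations
C = (samples.T @ samples) / n_samples - np.outer(m, m)      # connected correlations
J_est = -np.linalg.inv(C)                                   # naive mean-field inversion
np.fill_diagonal(J_est, 0.0)                                # keep only the couplings
print(J_est[:3, :3])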

For the time being, the simple economics models considered in mathematics and derived from theoretical physics look like rough metaphors of reality. Nevertheless, they are able to describe some features of the observed phenomena and are in any case a necessary step towards a more refined approximation of reality.

Finally, the attempt to provide soluble and tractable mathematical models for economics will be an important opportunity to fertilize mathematics itself with new paradigms and to develop new parts of Galileo's 'language of nature'.

Links:

(1) http://www.dm.unibo.it/~contucci/srp.html
(2) http://arxiv.org/abs/0810.3029
(3) http://arxiv.org/abs/0904.0805
(4) http://arxiv.org/abs/0812.1435
(5) http://arxiv.org/abs/cond-mat/0612371

Please contact:

Pierluigi Contucci Bologna University, Italy Tel: +39 051 2094404 E-mail: [email protected]

Francesca Romiti Bologna University, Italy Tel: +39 051 2094015 E-mail: [email protected]


Will mathematics have a role in the future study of economics?


Realizing the potential of large, distributed wireless sensor networks (WSN) requires major advances in the theory, fundamental understanding and practice of distributed signal processing, self-organized communications, and information fusion in highly uncertain environments using sensing nodes that are severely constrained in power, computation and communication capabilities. The European project ASPIRE (Collaborative Signal Processing for Efficient Wireless Sensor Networks) aims to further basic WSN theory and understanding by addressing problems including adaptive collaborative processing in non-stationary scenarios; distributed parameter estimation and object classification; and representation and transmission of multichannel information. This highly diverse field combines disciplines such as signal processing, wireless communications, networking, information theory and data acquisition.

In addition to basic theoretical research, ASPIRE tests the developed theories and heuristics in the application domain of immersive multimedia environments. The project aims to demonstrate that sensor networks can be an effective catalyst for creative expression. We focus on multichannel sound capture via wireless sensors (microphones) for immersive audio rendering. Immersive audio, as opposed to multichannel audio, is based on providing the listener with the option of interacting with the sound environment. This translates to listeners having access to a large number of recordings that they can process and mix themselves, possibly with the help of the reproduction system using some pre-defined mixing parameters.

These objectives cannot be fulfilled by current multichannel audio coding approaches. Furthermore, in the case of large venues which are possibly outdoors or even underwater, and for recording times of days or even months, the traditional practice of deploying an expensive high-quality recording system which cannot operate autonomously becomes impossible. We would like to enable 'immersive presence' for the user at any event where sound is of interest. This includes concert-hall performances; outdoor concerts performed in large stadiums; wildlife preserves and refuges, studying the everyday activities of wild animals; and underwater regions, recording the sounds of marine mammals. The capture, processing, coding and transmission of the audio content through multiple sensors, as well as the reconstruction of the captured audio signals so that immersive presence can be facilitated in real time to any listener, are the ultimate application goals of this project.

So far, we have introduced mathematical models specifically directed towards facilitating the distributed signal acquisition and representation problem, and we have developed efficient multichannel data compression and distributed classification techniques. In the multisensor, immersive audio application, we tested and validated novel algorithms that allow audio content to be compressed and allow this processing to be performed on resource-constrained platforms such as sensor networks. The methodology is groundbreaking, since it combines in a practical manner the theory of sensor networks with audio coding. In this research direction, a multichannel version of the sinusoids plus noise model was proposed and applied to multichannel signals, obtained by a network of multiple microphones placed in a venue, before the mixing process produces the final multichannel mix.


R&D and Technology Transfer

The ASPIRE Project – Sensor Networks for Immersive Multimedia Environments by Athanasios Mouchtaris and Panagiotis Tsakalides

Art, entertainment and education have always served as unique and demanding laboratories for information science and ubiquitous computing research. The ASPIRE project explores the fundamental challenges of deploying sensor networks for immersive multimedia, concentrating on multichannel audio capture, representation and transmission. The techniques developed in this project will help augment human auditory experience, interaction and perception, and will ultimately enhance the creative flexibility of audio artists and engineers by providing additional information for post-production and processing.

As immersive spatialised sound and user-controlled interactivity depend on enhanced audio content, the ability to efficiently capture, process, transmit, and render multiple recordings containing numerous sound sources is of crucial importance.


Coding these signals makes them available to the decoder, allowing for interactive audio reproduction, which is a necessary component in immersive applications.

The proposed model uses a single reference audio signal in order to derive an error signal per spot microphone. The reference can be one of the spot signals or a downmix, depending on the application. Thus, for a collection of multiple spot signals, only the reference is fully encoded; the sinusoidal parameters and corresponding sinusoidal noise spectral envelopes of the remaining spot signals are retained and coded, resulting in bitrates for the side information in the order of 10 kbps for high-quality audio reconstruction. In addition, the proposed approach moves the complexity from the transmitter to the receiver, and takes advantage of the plurality of sensors in a sensor network to encode high-quality audio with a low bitrate. This innovative method is based on sparse signal representations and compressive sensing theory, which allows sampling of signals significantly below the Nyquist rate.
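A single-channel toy version of the sinusoids-plus-noise idea, in which the strongest spectral peaks of a windowed frame form the sinusoidal part and the remainder is treated as noise, is sketched below; the multichannel, spot-microphone model developed in ASPIRE is considerably more elaborate.

import numpy as np

def sinusoids_plus_noise(frame, n_peaks=20):
    """Split one windowed frame into a sinusoidal part and a noise residual."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.fft.rfft(windowed)
    keep = np.argsort(np.abs(spectrum))[-n_peaks:]        # strongest bins
    sinusoidal_spec = np.zeros_like(spectrum)
    sinusoidal_spec[keep] = spectrum[keep]
    sinusoidal = np.fft.irfft(sinusoidal_spec, n=len(frame))
    return sinusoidal, windowed - sinusoidal

# Toy signal: two tones plus noise, analysed as a single frame.
t = np.arange(2048) / 16000.0
frame = (np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
         + 0.05 * np.random.default_rng(5).standard_normal(t.size))
sin_part, noise_part = sinusoids_plus_noise(frame)
print("residual energy ratio:",
      float(np.sum(noise_part**2) / np.sum((frame * np.hanning(frame.size))**2)))

Only the peak parameters and an envelope of the residual would then need to be transmitted per spot signal, which is how the side information can be kept in the order of a few kbps, as described above.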

In the future, we hope to implement exciting new ideas; for example, immersive presence of a user in a concert hall performance in real time, implying interaction with the environment, eg being able to move around in the hall and appreciate the hall acoustics; virtual music performances, where the musicians are located all around the world; collaborative environments for the production of music; and so forth. A central direction in our future plans is to integrate the ASPIRE current and future technology with the Ambient Intelligence (AmI) Programme recently initiated at FORTH-ICS. It is certain that the home and work environments of the future will be significantly enhanced by immersive presence, including entertainment, education and collaboration activities.

ASPIRE is a €1.2M Marie Curie Transfer of Knowledge (ToK) grant funded by the EU for the period September 2006 to August 2010. The University of Valencia, Spain and the University of Southern California, USA are valuable research partners in this effort.

Link:

http://www.ics.forth.gr/~tsakalid

Please contact:

Panagiotis Tsakalides, Athanasios Mouchtaris FORTH-ICS, Greece Tel: +30 2810 391 730 E-mail: {tsakalid,mouchtar}@ics.forth.gr


Brain-Computer Interfaces (or BCI) are communication systems that enable a user to interact with a computer or a machine using only brain activity. The brain activity is generally measured by electroencephalography (EEG). BCI is a rapidly growing area of research and several impressive prototypes are already available. The aim of the OpenViBE consortium was to develop open-source software for brain-computer interfaces, which is expected to promote and accelerate research, development and deployment of BCI technology. Key features of the resulting software are its modularity, its high performance, its multiple-user facilities and its connection with high-end virtual reality displays. The development of OpenViBE involved six academic and industrial French partners (INRIA, INSERM, FRANCE TELECOM R&D, CEA, GIPSA-LAB and AFM). Applications of OpenViBE are initially concerned with the provision of access for disabled people to multimedia and telecommunication services.

The OpenViBE partners have conducted innovative research and published around forty papers covering the entire spectrum of the BCI field: neuroscience, electrophysiology, EEG signal processing, human-computer interaction and virtual reality. For example, the consortium has proposed novel techniques for processing and identifying cerebral data (eg classification of EEG signals based on fuzzy sets theory), as well as new paradigms for BCI based on neurophysiological experiments (eg the use of auditory signals for BCI). The connection between BCI and virtual reality technology has also been investigated.

In parallel, the OpenViBE consortium has developed free and open-source software devoted to the design, testing and use of brain-computer interfaces.

OpenViBE processes brain signals: it can be used to acquire, filter, process, classify and visualize brain signals in real time. It is also notable for its high modularity. The platform comprises a set of software modules written in C++ that can be integrated easily and efficiently to design BCI applications. OpenViBE proposes a user-friendly graphical language to allow non-programmers to design a BCI without writing a single line of code (see Figure 1). As such, OpenViBE addresses the needs of all users, regardless of whether they are programmers or not. OpenViBE is portable, independent of hardware or other software, can run under Windows and Linux and is entirely based on free and open-source software. Various 2D and 3D visualization tools allow brain activity to be displayed in real time. Thanks to its generic acquisition server, the platform is compatible with many EEG machines.

OpenViBE: Open-Source Software for Brain-Computer Interfaces by Anatole Lécuyer and Yann Renard

Brain-computer interfaces enable commands or messages to be sent to computers by means of brain activity. Created by a consortium of academic and industrial partners, OpenViBE is free and open-source software, which makes it simple to design, test and use brain-computer interfaces.


OpenViBE also includes preconfigured scenarios that can be used for different applications including neurofeedback and BCI.
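The general chain (band-pass filtering, feature extraction, classification) can be illustrated outside OpenViBE in a few lines of Python; the sketch below is not OpenViBE code (OpenViBE scenarios are built graphically or in C++) and runs on synthetic two-class data.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic EEG: 100 trials, 8 channels, 2 s at 250 Hz; class 1 gets more
# power on channel 0 so that there is something to classify.
rng = np.random.default_rng(6)
fs, n_trials, n_channels, n_samples = 250, 100, 8, 500
eeg = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)
eeg[labels == 1, 0] *= 1.5

b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")   # mu/beta band
filtered = filtfilt(b, a, eeg, axis=-1)
features = np.log(filtered.var(axis=-1))                        # log band power

clf = LinearDiscriminantAnalysis().fit(features[:80], labels[:80])
print("held-out accuracy:", clf.score(features[80:], labels[80:]))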

The OpenViBE software is freely available from the INRIA forge under the terms of the LGPL licence. It has already been downloaded by numerous users, and an initial two-day training session was organized in Rennes, France, in January 2009.

Several applications and demonstrators have been built based on OpenViBE, and illustrate the numerous possibilities offered. For instance, a mental speller enables words to be written 'by thought', using a famous brain signal called 'P300'. A video game inspired by a famous sequence from the movie 'Star Wars' was also developed, in which users lift a virtual spaceship in the air by way of motor imagery. After an initial baseline-measuring phase, users can perform real or imagined foot movements to lift the spaceship as if they were controlling the 'force' with their minds (see Figure 2).

The OpenViBE project continues with national funding for both advanced research on BCI and future developments of the open-source software. One follow-up project is OpenViBE2, which is funded by the French National Agency of Research and will focus on video game applications. The project will assess future impacts of BCI on video games by making prototypes and testing them in a fully 'working environment'. The OpenViBE2 project will involve ten partners (including the game publisher UBISOFT) and will run for three years, commencing in the fall of 2009.

Link:

http://openvibe.inria.fr

Please contact:

Anatole Lécuyer INRIA, France E-mail: [email protected]

Figure 2: OpenViBE applications: virtual-reality application called 'Use-the-force', in which a participant equipped with an EEG cap is trained to control his mental activity. Imaginary foot movements control the take-off of a virtual 'tie-fighter', the famous spaceship from 'Star Wars'.

Figure 1: OpenViBE software: (left) the OpenViBE designer helps to program a BCI scenario with a graphical user interface and a graphical language; (right) 3D real-time topography of brain activity with OpenViBE visualization facilities.


An automated classification system is normally specified by defining two essential components. The first is a scheme for internally representing the data items to be classified; this representation scheme, which is usually vectorial in nature, must be such that a suitable notion of similarity (or closeness) between the representations of two data items can be defined. Here, 'suitable' means that similar representations must be attributed to data items that are perceived to be similar. If this is the case, a classifier may identify, within the space of all representations of data items, a limited region of space where the objects belonging to a given class lie; here, the assumption of course is that data items that belong to the same class are 'similar'. The second component is a learning device that takes as input the representations of training data items and generates a classifier from them.

In this work, we address single-label image classification, ie the problem of setting up an automated system that classifies an image into exactly one from a predefined set of classes. Image classification has a long history, with most existing systems conforming to the pattern described above.

We take a detour from this tradition by designing an image classification system that uses not one but five different representations for the same data item. These representations are based on five different descriptors or 'features' from the MPEG-7 standard, each analysing an image from a different point of view. As a learning device we use a 'committee': an appropriately combined set of five feature-specific classifiers, each based on the representation of the image specific to a single MPEG-7 feature. The committees that we use are adaptive; to classify each image, they dynamically decide which of the five classifiers should be entrusted with the classification decision, or determine whose decisions should be trusted more. We study experimentally four different techniques for combining the decisions of the five individual classifiers, namely dynamic classifier selection and weighted majority voting, each of which is realized in standard and in confidence-rated forms.

As a technique for generating the individual members of the classifier committee, we use distance-weighted k nearest neighbours, a well-known example-based learning technique.

Technically, this method does not require a vectorial representation of data items to be defined, since it simply requires that the distance between any two data items is defined. This allows us to abstract away from the details of the representation specified by the MPEG-7 standard, and simply specify our methods in terms of distance functions between data items. This is not problematic, since distance functions both for the individual MPEG-7 features and for the image as a whole have already been studied and defined in the literature.
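A compact sketch of the underlying mechanism, one distance-weighted k-NN per feature-specific distance combined by weighted majority voting, is given below; the adaptive selection and confidence-rated variants studied in the article are more refined, and the toy distances shown are hypothetical stand-ins for the MPEG-7 ones.

import numpy as np
from collections import defaultdict

def knn_committee(train_items, train_labels, query, distance_fns,
                  feature_weights, k=5):
    """Weighted majority vote over one distance-weighted k-NN per feature."""
    votes = defaultdict(float)
    for dist, w_feature in zip(distance_fns, feature_weights):
        d = np.array([dist(query, x) for x in train_items])
        for i in np.argsort(d)[:k]:
            votes[int(train_labels[i])] += w_feature / (d[i] + 1e-9)
    return max(votes, key=votes.get)

# Toy usage: two hypothetical "features" seen through different distances.
rng = np.random.default_rng(7)
items = rng.normal(size=(40, 4))
labels = (items[:, 0] > 0).astype(int)
dists = [lambda q, x: np.linalg.norm(q[:2] - x[:2]),   # distance for feature 1
         lambda q, x: np.linalg.norm(q[2:] - x[2:])]   # distance for feature 2
print(knn_committee(items, labels, rng.normal(size=4), dists, [0.6, 0.4]))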

Since distance computation is so fundamental to our methods, we have also studied how to compute distances between data items efficiently, and have implemented a system that makes use of metric data structures explicitly devised for nearest neighbour searching.

We have run experiments on these techniques using a dataset consisting of photographs of stone slabs classified into different types of stone. All four committee-based methods largely outperform, in terms of accuracy, a baseline consisting of the same distance-weighted k nearest neighbours fed with a 'global' distance measure obtained by linearly combining the five feature-specific distance measures. Among the four committee-based methods, the confidence-rated methods are not uniformly superior to those that do not use confidence values, while dynamic classifier selection methods are found to be definitely superior to weighted majority voting methods.

Link:

http://www.isti.cnr.it/People/F.Sebastiani/Publications/IMTA09.pdf

Please contact:

Fabrizio Sebastiani ISTI-CNR, Italy Tel: +39 050 3152 892 E-mail: [email protected]

Adaptive Committees of Feature-Specific Classifiers for Image Classification by Fabrizio Falchi, Tiziano Fagni, and Fabrizio Sebastiani

Researchers from ISTI-CNR, Pisa, are working on the effective and efficient classification of images through a combination of adaptive image classifier committees and metric data structures explicitly devised for nearest neighbour searches.

Figure 1: Photographs of stone slabs classified into different types of stone, used in our experiments on automated classification.


3D MEDIA has already given some spectacular results, such as NeuroTV's capability of projecting in 3D stereo a synthetic avatar ('Toon') animated backstage by an artist (see figure).

While not directly financed by the project, industry is a key player in the project. First, the consortium reaches out to the audio and video industry to identify scientific and technical problems that industry does not have the time, the manpower, or the skills to solve. These problems are turned into well-defined challenges; the universities work on them for, say, 18 months, and the results are then transferred back to industry with appropriate licensing agreements. The overall view is a series of challenges staggered in an innovation pipeline. Second, the three Universities, which are fully financed, are also free to explore more challenging and risky avenues and, where warranted, to transfer this technology to industry. Given that Multitel is only partially financed, it generally needs to find contracts with industry, which also helps in finding technical challenges for the consortium to work on. Once an industrial partner has identified a potential partnership within the project, it can go to the regional government and ask for its own funding.

The consortium partners – Multitel, FPMs, UCL, and ULg – have complementary expertise. Early on in the 3D MEDIA project, they all listed their existing capabilities and looked for ways to combine them to come up with innovative capabilities. One approach was to identify each partner's strengths, and to look for ways in which they could be combined in visual demonstrations, which also proved to be very useful in establishing a dialog with industry.

The different research themes of the 3D MEDIA project can be found at different points along the traditional chain of acquisition, processing, and exploitation/immersion. The project considers advanced acquisition means, such as pairs of stereoscopic cameras, multiple-camera and omni-directional-camera systems, and 3D cameras that can acquire range information directly. The use of several cameras may call for the registration of views, the reconstruction of a 3D scene, and the automatic determination of the position of the cameras. Applications often call for tracking objects and persons, for recognizing them, and for interpreting their motion. Transmission and security call for compression and for techniques such as watermarking. Mixing real and synthetic data is becoming increasingly important. ‘Match-moving’ is one technique that allows this to be done automatically. Indexation, annotation, and summarizing of video are important for archiving large amounts of video data. Data mining allows one to quickly retrieve information from large databases of images. Image understanding also plays a role in man-machine interfaces, such as in recognizing gestures and facial expressions. The result of acquisition and processing must ultimately be experienced by the user. Here, we mainly talk about 3D stereo visualization and the creation of immersive acoustical and visual environments.

The application domains of 3D MEDIA are varied, such as 3D cinema, entertainment, sports, and video surveillance.

3D MEDIA: A Unique Audio and Video R&D Project Spanning Wallonia
by Jacques G. Verly, Jerome Meessen and Benoit Michel

3D MEDIA is part of a new portfolio of R&D projects that will run for five years, financed by the European Regional Development Fund (ERDF). 3D MEDIA covers most scientific and technological aspects concerning the management and processing of audio and video contents. It involves the three largest Universities in Wallonia, the Faculté Polytechnique de Mons (FPMs), the Université Catholique de Louvain (UCL), and the Université de Liège (ULg); it is managed by the Multitel Research Center.

Figure 1: During the preview of the 3D Stereo MEDIA event on 17 March 2009 in Liège, Belgium, NeuroTV showed, as a world first, the live animated character Toon in full 3D stereoscopy. Above are examples of the slightly different left-eye (left) and right-eye (right) views of a 3D projection. An overlay of both (bottom) shows the difference (courtesy of NeuroTV, member of the 3D MEDIA project).


One concrete example in sports is the automatic tracking of players from multiple cameras for resolving occlusions (TRICTRAC project), which then leads to the automatic analysis of games and, eg, to the possibility of raising an alarm in case of off-side.

As the name indicates, it was judged important by the partners of the consortium that the project have a significant 3D flavour. This is in line with the sudden acceleration of interest worldwide for everything 3D. One of the major factors behind this impetus is the arrival of digital cinema in theatres, and the fact that the jump from 2D to 3D in digital cinema is relatively straightforward, at least in terms of visualization.


To meet this exploding interest in 3D, the first author has led the organization of a new event called 3D Stereo MEDIA, which will take place in Liège, Belgium, on 1-3 December 2009, and for which a very successful preview was held on 17 March 2009. A unique feature of this event is the bringing together of artists and engineers who share a passion for 3D acquisition, processing, and stereo visualization. Importantly, the event will feature 3D projection capabilities, not only for movies but, more unusually, for scientific and technical presentations. It is during the March preview of 3D Stereo MEDIA that the Toon avatar was shown for the first time ever in 3D stereo (see figure).

Links:

http://mediatic.multitel.be/platforms/3dmedia.html
http://www.montefiore.ulg.ac.be/
http://intelsig.montefiore.ulg.ac.be/~verly/
http://www.neurotv.com
http://www.3dmedia2009.com

See also the announcement for the International 3D Stereo Film & Technology Festival on page 49 in this issue.

Please contact:

Jacques G. Verly
University of Liège, Belgium
E-mail: [email protected]


The best way to face threats is to stop them before they cause catastrophic consequences. Unfortunately, if the sequence of events is heterogeneous, geographically distributed or rapidly evolving, visual surveillance of video streams and sensor alarms provided by current security systems does not provide operators with satisfactory situational awareness. Operators are therefore less likely to recognize sequences of events that are indicative of a possible threat, due to their limited alert threshold and knowledge base. Furthermore, operators cannot guide and coordinate alarm responses or emergency interventions if they are not precisely aware of what is happening or has happened. The adoption of early warning and decision support systems is a way of coping with these issues.

To this end, we have designed the DETECT framework. The basic assumption behind the framework is that threats can be detected by predicting the set of basic events (ie the patterns) that constitute their ‘signature’. For instance, Figure 1b shows the multi-layered asset protection provided by modern security systems. In each layer a set of sensors (eg video, motion, temperature, vibration, sound, smoke) is installed. Threat scenarios must be precisely identified during vulnerability assessment and risk analysis.

DETECT operates by performing a model-based logical, spatial and temporal correlation of basic events detected by different sensor subsystems, in order to recognize sequences of events which indicate likely threats. DETECT is based on a real-time detection engine which implements the concepts of data fusion and cognitive reasoning by means of soft computing approaches. The framework can be interfaced or integrated with existing SMS (Security Management Software). It can serve as an early warning tool or even automatically trigger adequate countermeasures for emergency/crisis management. As such, it may allow for a quick, focused and automatic response to emergencies, though manual confirmation of detected alarms remains an option.

In fact, human management of critical situations, possibly involving many simultaneous events, is a very delicate task that is both prone to error and subject to forced inhibition. Used as a warning system, DETECT can alert operators to the likelihood and nature of a threat; used as an autonomous reasoning engine, it can activate responsive actions, including audio and visual alarms, unblocking of turnstiles, air-conditioning flow inversion, activation of sprinklers, and emergency calls to first responders. Furthermore, the correlation among basic events detected by diverse redundant sensors can be employed to lower the false alarm rate of the security system, thus improving its overall reliability.

Model-Based Early Warning and Decision Support to Improve Infrastructure Surveillance
by Francesco Flammini, Andrea Gaglione and Concetta Pragliola

Critical Infrastructure Protection (CIP) against both natural and intentional threats has become a major issue in modern society. CIP involves a set of multidisciplinary activities and requires the adoption of appropriate protection mechanisms operated by centralized monitoring systems. Such systems are still highly dependent on human operators for supervision and intervention. One of the challenging goals of the research community in this field is the automatic and early detection of threats, including strategic terror attack scenarios. DETECT (Decision Triggering Event Composer & Tracker) is a new framework able to recognize complex events. This is achieved by a model-based correlation of basic events detected by possibly heterogeneous sensorial subsystems.


Threats are described in DETECT using a specific Event Description Language (EDL) and stored in a Scenario Repository. Starting from this repository, one or more detection models are automatically generated using a suitable formalism (eg event graphs, Bayesian networks, neural networks etc). In the operational phase, a model manager macro-module performs queries on the Event History database for the real-time feeding of the detection model according to predetermined policies. When a composite event is recognized, the output of DETECT consists of the identifier(s) of the detected/suspected scenario(s); an alarm level, associated with the scenario evolution (used as a progress indicator); and a probability of attack (used as a threshold in heuristic detection). A high-level architecture of the framework is depicted in Figure 2.
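DETECT's EDL and detection models are not detailed here; purely to illustrate the idea of correlating time-stamped basic events against a stored scenario ‘signature’, the following Python sketch matches an ordered pattern of event names within a time window and reports the progress reached. The scenario structure, the event names and the window policy are assumptions made for illustration, not DETECT's actual formalism.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BasicEvent:
    """A time-stamped event reported by some sensor subsystem."""
    name: str          # eg 'smoke_detected', 'turnstile_forced' (invented names)
    timestamp: float   # seconds since some epoch

@dataclass
class Scenario:
    """A hypothetical threat 'signature': an ordered list of basic event
    names that must all occur within a time window."""
    identifier: str
    pattern: List[str]
    window: float      # seconds

def match_progress(scenario: Scenario, history: List[BasicEvent]):
    """Scan the event history and report how far the scenario has
    progressed; the alarm level here is simply the number of matched steps."""
    idx, start = 0, None
    for ev in sorted(history, key=lambda e: e.timestamp):
        if start is not None and ev.timestamp - start > scenario.window:
            break                                  # time window expired
        if ev.name == scenario.pattern[idx]:
            start = start if start is not None else ev.timestamp
            idx += 1
            if idx == len(scenario.pattern):
                break                              # full match: composite event
    return {"scenario": scenario.identifier,
            "alarm_level": idx,                    # progress indicator
            "detected": idx == len(scenario.pattern)}

# Usage sketch with invented event names:
scenario = Scenario("unattended_bag_then_smoke",
                    ["bag_left", "crowd_dispersal", "smoke_detected"], window=300)
history = [BasicEvent("bag_left", 10.0), BasicEvent("crowd_dispersal", 95.0)]
print(match_progress(scenario, history))   # alarm level 2 of 3, not yet detected
```

The real detection models (event graphs, Bayesian networks) additionally handle uncertain, out-of-order or partially observed events, which is what the probability-of-attack output reflects.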

Together with a sensor network integration framework, DETECT can perform fusion and reasoning of data generated by smart wireless sensors. To this aim, DETECT is being integrated with SeNsIM (Sensor Networks Integration and Management), as illustrated in Figure 2b. SeNsIM is a framework used to integrate heterogeneous and distributed sensor systems (including ad-hoc networks), by offering the user a common view of all data sources.

We are currently working on the implementation of a detection model based on Bayesian networks. The next operational step will be to interface the overall system with a real security management system for field trials. The integration will be performed using Web services and/or the OPC (OLE for Process Control) standard protocol.

DETECT is a collaborative project carried out by the Innovation Unit of Ansaldo STS Italy and the Department of Computer and Systems Engineering (Dipartimento di Informatica e Sistemistica, DIS) at the University of Naples ‘Federico II’.

Link:

DETECT and SeNsIM project pages in the Seclab research group Web site: http://www.seclab.unina.it/

Please contact:

Francesco Flammini
ANSALDO STS Italy
E-mail: [email protected]

Andrea Gaglione
University of Naples Federico II, Italy
E-mail: [email protected]

Figure 2a: The DETECT framework (top) and its integration with external systems (Figure 2b below). [Diagram labels: DETECT Engine, Scenario Repository, Event History, Detected Attack Scenario, Criticality Level (1, 2, 3, ...).]


Figure 1a: CIP protection life-cycle. [Diagram phases: analysis and assessment, remediation, indications and warning, mitigation, response, reconstitution; spanning pre-event, event and post-event.]
Figure 1b: Multi-layer sensing in modern security systems. [Diagram labels: threat route, sensing points.]

[Figure 2b diagram labels: System 1 ... System N, SeNsIM (integration), Event History, DETECT (reasoning), SMS.]


The EvoTest project addresses a fundamental problem faced by the European software industry: quality assurance of complex software systems. There is strong empirical evidence that deficient testing of both functional and non-functional properties is one of the major sources of software and system errors. Even though many test automation tools are currently available to aid test planning and control as well as test case execution and monitoring, all these tools share a similar passive philosophy towards test case design, selection of test data and test evaluation. They leave these crucial, time-consuming and demanding activities to the human tester. This is not without reason; test case design and test evaluation are difficult to automate with the techniques available in current industrial practice. The domain of possible inputs (potential test cases), even for a trivial program, is typically too large to be exhaustively explored. One of the major challenges associated with test case design is the selection of test cases that are effective at finding flaws without requiring an excessive number of tests to be carried out. This is the problem that EvoTest has taken on.

The strategy is to combine various visions of the IST programme: a multidisciplinary approach to evolving, adaptive and automated testing as a solution to some of the challenges of mastering complexity in software development. During its running period, 2006-2009, the project has successfully developed, applied and evaluated Evolutionary Testing techniques. Evolutionary Testing is an exciting, novel, nature-inspired solution, which transforms testing into an optimization problem. This allows the problem to be attacked using search techniques inspired by Darwinian evolution. To increase the test efficiency for complex systems, EvoTest has developed an Automated Evolutionary Testing Framework. This framework is able to automatically search for high-quality test cases with a high probability of error detection. Automation is widely regarded as the key to efficient testing.

The impossibility of anticipating or testing all possible uses and behaviours of a (complex) system suggests a prominent role for Evolutionary Testing, because it relies on very few assumptions about the underlying problem it is attempting to solve. In addition, optimization and search techniques are adaptive and, therefore, able to modify their behaviour when faced with unforeseen situations. These two properties make evolutionary testing approaches ideal for handling complex systems.
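As a purely illustrative sketch of how testing becomes a search problem, the following Python fragment evolves integer test inputs towards covering a hard-to-hit branch of a toy function; the fitness definition, the toy program and all parameter values are assumptions made for the example and are not taken from the EvoTest framework.

```python
import random

def program_under_test(x, y):
    """Toy program: the target branch is the 'x * y == 720' case."""
    if x > 0 and y > 0 and x * y == 720:
        return "target branch reached"
    return "other path"

def fitness(individual):
    """Branch-distance style fitness (smaller is better): how far the
    inputs are from making the target predicate true."""
    x, y = individual
    dist = max(0, 1 - x)             # distance to x > 0
    dist += max(0, 1 - y)            # distance to y > 0
    dist += abs(x * y - 720)         # distance to x * y == 720
    return dist

def evolve(pop_size=50, generations=200, mutation=0.3):
    """Minimal generational genetic algorithm over pairs of integers."""
    population = [(random.randint(-100, 100), random.randint(-100, 100))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        if fitness(population[0]) == 0:
            break                                   # target branch covered
        parents = population[: pop_size // 2]       # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                    # one-point crossover
            if random.random() < mutation:          # small random mutation
                child = (child[0] + random.randint(-5, 5),
                         child[1] + random.randint(-5, 5))
            children.append(child)
        population = children
    return min(population, key=fitness)

best = evolve()
print(best, program_under_test(*best))
```

In white-box evolutionary testing, the fitness would typically be derived from coverage information gathered while executing the instrumented C code under test, rather than from a hand-written predicate distance as above.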

EvoTest has reached the following objectives:
• An extensible and open Automated Evolutionary Testing Framework has been developed. The framework provides general components and interfaces to facilitate the automatic generation, execution, monitoring and evaluation of black-box and white-box testing of C programs.

• The performance and the quality of Evolutionary White-box Testing have been increased by combining evolutionary testing with other software engineering techniques, such as slicing and program transformation. These advanced techniques have been tailored both to reduce the size of the search space and to transform the search space.

• The EvoTest tools and techniques have been evaluated using real-world case studies from the automotive and telecommunications industries. Case studies have been performed for black-box and white-box testing activities.

EvoTest results have made it abundantly clear that evolutionary testing holds great promise as a solution to many challenges faced by the software industry. Future work on Evolutionary Testing should concentrate on theoretical foundations, search technique improvements, new testing objectives, tool environment/testing infrastructure, and new application areas.

The EvoTest project has brought together European entities that are world leaders in this emerging field. ITI (Spain) is project coordinator and research partner providing expertise in software testing and evolutionary computation. Moreover, ITI acts as software developer, implementing various components of the Framework. INRIA (France) serves as research partner providing expertise in evolutionary computation.

Evolutionary Testing for Complex Systems
by Tanja E. J. Vos

Complex systems exhibit emergent behaviour, which makes them hard to predict. This presents particularly challenging problems during testing. However, this is a challenge that simply cannot be avoided: testing is a vital part of the quality assurance process. With important partners from Spain (ITI), UK (Kings College), France (INRIA), Germany (Fraunhofer, Daimler, BMS), Bulgaria (RILA) and Greece (European Dynamics), the EvoTest project, funded under the 6th Framework programme (IST-33742), is attacking the problem of testing complex systems using evolutionary algorithms.

Black-box evolutionary testing at Daimler.


Fraunhofer FIRST (Germany) is a research partner providing expertise in software development, technology, validation and verification. Kings College (United Kingdom) is a research partner providing expertise in software engineering techniques like slicing and program transformations. Daimler (Germany) is a case study provider of a complex automotive application. Moreover, they also serve as a research partner contributing expertise on evolutionary testing. Berner & Mattner (Germany) is a case study provider of testing infrastructure. Rila (Bulgaria) is a case study provider of a complex accessibility solution, and acts as a software developer. European Dynamics (Greece) is a technology provider and software developer.

Link:

http://www.evotest.eu

Please contact:

Tanja E.J. Vos
Universidad Politécnica de Valencia/SpaRCIM, Spain
Tel: +34 690 917 971
E-mail: [email protected]


Usability evaluation and user interaction design are two key activities in the development of an interactive system. They are mutually dependent, but in practice there is often little or no fruitful interplay between them. Considerable efforts have been devoted to improving the interplay between usability evaluation and software development, with two approaches being typical. The first focuses on better methods, and the second on better feedback.

Compared to these approaches, Web site development is particularly challenging. Web sites involve vast amounts of information, services and purchasing possibilities, and the users are a tremendously heterogeneous group who employ Web sites for a multitude of purposes. Because of this, Web site developers must accommodate the massive variety in user preferences and capabilities.

Yet many contemporary Web sites suffer from problems with low usability. The challenges of developing user-friendly Web portals originate from two major sources. First, the projects that develop Web portals usually have a very short duration. Second, the users of most Web portals are exceedingly diverse.

Usability is “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use” (ISO, 1998). The purpose of conducting usability evaluations is to facilitate a feedback loop: the results of a usability evaluation are fed back into the software development activities that create and shape the product in order to enhance usability.

State-of-the-art usability engineering methods have so far had a very limited influence on Web portal development, because they provide very few solutions to the main challenges. Extensive research has documented the fact that users encounter serious problems when they use Web services, wasting large amounts of time and often giving up before they achieve what they set out to do.

The WPU project is based on the following hypotheses:
1. The usability of Web portals can be improved considerably by applying selected usability engineering methods that are tailored to this domain.
2. It is possible to develop usability engineering methods that are directly relevant to Web portal development, and to reduce the demands for resources and expertise to a level where these methods can be integrated with Web portal projects.

The aim of the WPU project is to confirm these hypotheses by developing and testing usability engineering methods. The project will produce the following results:
• a set of new methods for usability engineering in Web portal development
• a set of guidelines for selection and application of the methods
• a training programme for Web portal developers
• research training for two PhD students and a post-doctoral researcher.

The WPU Project will involve a combination of state-of-the-art survey, method creation, method training and experimental assessment. The state-of-the-art survey will collect experiences with usability issues in Web portal development that are documented in the literature. The method creation will build the catalogue of methods that will be a key result of the project. This will include the experiences and the list of fragments of usability engineering methods compiled in the state-of-the-art survey. Method training will involve the design of training programmes for the methods created in the project, and the use of these programmes in a participating company. The experimental assessment will collect experience with the new methods. The quality of the methods will be assessed through a series of experiments, conducted by a participating software company and in a laboratory setting.

The project is supported in part by the Danish Research Councils under grant number 2106-08-0011. It also receives support from Aalborg University and the two participating companies.

Please contact:

Janne Jul Jensen, Mikael B. Skov and Jan Stage
Aalborg University/DANAIM, Denmark
E-mail: {jjj,dubois,jans}@cs.aau.dk

The WPU Project: Web Portal Usability
by Janne Jul Jensen, Mikael B. Skov and Jan Stage

The Web Portal Usability (WPU) project is working on usability engineering methods, which are important in the development of Web portals. These methods are tested with companies that design modern Web portals.


ADOSE: New In-Vehicle Sensor Technology for Vehicle Safety in Road Traffic
by Jürgen Kogler, Christoph Sulzbachner, Erwin Schoitsch and Wilfried Kubinger

Reliable Advanced Driver Assistance Systems (ADAS) aid drivers in a variety of traffic and environment/weather conditions. Growing traffic volumes require sensors and systems that handle difficult urban and non-urban scenarios. For such systems, the EU FP7 project ADOSE is developing and evaluating new cost-efficient sensor technology that will provide vehicles with a ‘virtual safety belt’ by addressing complementary safety functions.

The EU-funded project ADOSE (reliable Application-specific Detection of road users with vehicle On-board SEnsors), coordinated by Centro Ricerche Fiat (CRF), is evaluating new sensors and sensor systems. Such sensors are necessary for ADAS like lane departure warning, collision warning or high-beam assist. Figure 1 illustrates the various sensors from different project members. Additionally, the operating distance of each sensor and examples of its use are depicted.

Cost-effective solutions for ADAS are still missing, which is preventing both extensive market penetration and an increase in the number of sensors and supported safety functions. Studies were performed in 2005 to evaluate customers' desire and willingness to pay for active and passive safety systems in passenger cars (Frost & Sullivan, European Markets for Advanced Driver Assistance Systems, B844-18, 2006).

Figure 1: Sensor overview.


In general, the price that consumers are willing to pay for their ideal package of safety features is significantly lower than what they perceive its market price to be. Researchers and manufacturers therefore need to find ways to reduce the prices of safety options to increase customer acceptance. For this reason, it is critical to develop and implement high-performance sensors that considerably reduce the costs of ADAS for safety in passenger cars. As an example, the penetration of ADAS dependent on classical vision sensors in the highly cost-driven automotive market is still limited by the cost of the electronics required to process images in real time.

The five sensor technologies to be developed further are (partners in brackets):
• far infrared (FIR) add-on sensor with good thermal and spatial resolution at lower cost, to be combined with a high-resolution imager for enhanced night-vision applications (more reliable obstacle detection and classification) [Bosch]
• low-cost multi-functional CMOS vision sensor, detecting critical environmental parameters (fog, rain etc) and providing information on the driving scenario (oncoming vehicles, vulnerable road users (VRUs) in night conditions) [CRF, STMicroelectronics, Fraunhofer Institut für Zuverlässigkeit und Mikrointegration (IZM)]
• high spatial resolution and low-cost 3D range camera based on 3D packaging, optical CMOS and laser radar technology for short-range safety requirements (eg for pre-crash) [Interuniversity MicroElectronics Centre (IMEC)]
• harmonic radar combined with passive nonlinear reflective and active tags enabling easy detection and identification of traffic obstacles and vulnerable road users, even in dark or adverse weather conditions [Valtion Teknillinen Tutkimuskeskus (VTT), Triad, Uppsala University]
• high temporal resolution and low-cost bio-inspired silicon retina stereo sensor (SRS), addressing time-critical decision applications [Austrian Institute of Technology (AIT)].

The approach of the Austrian Institute of Technology (AIT) to reducing the costs of ADAS is to use an SRS. An SRS is a vision-based sensor that delivers information about the illumination changes (‘events’) in the visual field. Figure 1 illustrates an example of the SRS output, which rather than reflecting static images, records events when either the object or the vehicle (or both) are moving. Derived from the human vision system, the bio-inspired silicon retina sensor is a new type of imager. It detects intensity changes in an observed scene, with each pixel delivering its address and event data separately and independently. This type of sensor is intended to overcome certain obstacles in classical vision systems: high temporal resolution allows quick reactions to fast motion in the visual field, on-sensor pre-processing significantly reduces both memory requirements and processing power, and wide dynamic range helps in the difficult lighting situations encountered in real-world traffic.

The SRS is specifically tailored to serve as a pre-crash warning and preparation sensor for side impacts. Pre-crash applications must reliably react in real time to prepare the vehicle (eg activate the pretensioner, preparation of a side airbag) for the imminent impact (which, in case of side impact, cannot be avoided by a reasonable reaction of the impacted vehicle). For the pre-crash sensor, it is necessary to take distance measurements of objects approaching the sensor. Two silicon retinas have therefore been coupled to a stereo vision unit, allowing depth information to be extracted from moving objects in the viewed scenery. Before the depth information is available, it is necessary to match the corresponding SRS data in the acquired left and right images. The so-called ‘stereo-matching’ step is an essential part of each stereo vision system. A new kind of stereo vision algorithm has been developed within the ADOSE project especially for the silicon retina sensors, along with advanced sensors with higher resolution than ever before.
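The ADOSE stereo-matching algorithm itself is not described in this article; as a rough sketch of the underlying idea of matching address-events from two silicon retinas, the following Python fragment pairs left and right events that share a row and polarity and are close in time, and converts the resulting column offset (disparity) into a depth estimate via the standard pinhole-stereo relation depth = focal length x baseline / disparity. All parameter values are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Event:
    """Address-event from a silicon retina: pixel address, polarity, time."""
    x: int         # column
    y: int         # row
    polarity: int  # +1 brighter, -1 darker
    t: float       # timestamp in seconds

def match_events(left: List[Event], right: List[Event],
                 max_dt: float = 1e-3, max_disparity: int = 60):
    """Naive temporal-correlation matching: for each left event, pick the
    closest-in-time right event on the same row with the same polarity."""
    pairs = []
    for le in left:
        best: Optional[Event] = None
        for re in right:
            if re.y != le.y or re.polarity != le.polarity:
                continue
            disparity = le.x - re.x
            if not (0 < disparity <= max_disparity):
                continue
            if abs(le.t - re.t) > max_dt:
                continue
            if best is None or abs(le.t - re.t) < abs(le.t - best.t):
                best = re
        if best is not None:
            pairs.append((le, best, le.x - best.x))
    return pairs

def depth_from_disparity(disparity_px: int,
                         focal_length_px: float = 400.0,
                         baseline_m: float = 0.30) -> float:
    """Standard pinhole-stereo relation; the constants are illustrative only."""
    return focal_length_px * baseline_m / disparity_px

# Usage sketch with two invented events:
left = [Event(x=120, y=40, polarity=1, t=0.0102)]
right = [Event(x=112, y=40, polarity=1, t=0.0101)]
for le, re, d in match_events(left, right):
    print(f"disparity {d} px -> approx. {depth_from_disparity(d):.2f} m")
```

Practical event-based matchers exploit the fine-grained timestamps far more aggressively and operate incrementally as events arrive, rather than over buffered lists as in this sketch.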

Links:

http://www.adose-eu.org
http://www.arcs.ac.at
http://www.smart-systems.at

Please contact:
Wilfried Kubinger, Jürgen Kogler, Christoph Sulzbachner, Erwin Schoitsch
Austrian Research Centers / AARIT, Austria
E-mail: {wilfried.kubinger, juergen.kogler, christoph.sulzbachner, erwin.schoitsch}@arcs.ac.at

SAFECOMP 2009 - 28th International Conference on Computer Safety, Reliability and Security

Hamburg, Germany, 15-18 September 2009

SAFECOMP focuses on state-of-the-art and innovative approaches to risk assessment and management from the safety, security and reliability viewpoints. The scope includes IT systems and infrastructures considered critical within their present or emerging contexts. All aspects of dependability and survivability of critical computer-based systems and infrastructures are included. In particular, SAFECOMP emphasises multidisciplinary approaches to deal with the nature of complex critical IT systems and applications.

SAFECOMP offers a platform for knowledge and technology transfer between academia, industry, research institutions and licensing bodies.

Topics to be presented include application and industrial sectors, such as aerospace and avionics, automotive, industrial process control, networking & telecommunication, railways, robotics, medical systems, off-shore technology, ship building, international safety and security guidelines and standards, and power systems; as well as research areas such as safety & security risk assessment, design for dependability, diversity, fault tolerance, verification & validation, testing, education & training, human factors, and dependability analysis and modelling.

The ERCIM Working Group on Dependable Embedded Systems will hold a joint workshop at SAFECOMP 2009 in cooperation with EWICS - the European Workshop on Industrial Computer Systems Reliability, Safety and Security - and the DECOS (Dependable Embedded Components and Systems) interest group.

More information:

http://www.safecomp.org/


• N. Izerrouken, M. Pantel, X. Thirioux and O. Ssi yan kai: ‘Integrated Formal Approach for Qualified Critical Embedded Code Generator’
• L. Ladenberger, J. Bendisposto and M. Leuschel: ‘Visualizing Event-B models with BMotionStudio’
• A. Mathijssen, Y. S. Usenko and D. Bosnacki: ‘Behavioural Analysis of an I2C Linux Driver’
• W. Mostowski, E. Poll, J. Schmaltz, J. Tretmans and R. Wichers Schreur: ‘Model-Based Testing of Electronic Passports’
• L. Panizo, M. M. Gallardo, P. Merino and A. Linares: ‘Developing a decision support tool for Dam management with SPIN’

FMICS 2009 is part of the first Formal Methods Week (FMweek), which will bring together a choice of events in the area. For the latest information on FMweek, see http://www.win.tue.nl/fmweek.

Organisation
• PC co-Chairs: María Alpuente (T. U. Valencia, Spain) and Byron Cook (Microsoft, UK)
• Workshop Chair: Christophe Joubert (T. U. Valencia, Spain).

More information:

http://users.dsic.upv.es/workshops/fmics2009/

Competition on Plagiarism Detection - 3rd Intl. PAN Workshop on Uncovering Plagiarism, Authorship and Social Software Misuse

Donostia-San Sebastián, Spain, 10 September 2009

The detection of plagiarism by hand is a laborious retrieval task, a task which can be aided or made automatic. The PAN competition on plagiarism detection shall foster the development of new solutions in this respect. The competition has two main tracks:
• External Plagiarism Analysis - the task is to find all passages within the suspicious documents which have been plagiarised from one or more of the source documents.
• Intrinsic Plagiarism Analysis - the task is to detect paragraphs, in a set of suspicious documents, which have not been written by the document's main author.

The competition, which is sponsored by Yahoo! Research, has raised interest among scientists, companies and private groups (more than 20 teams from all over the world have registered), as well as in the media.

More information:

http://www.webis.de/pan-09/competition.php

CALL FOR PARTICIPATION

FMICS 2009 - 14th Intl. ERCIM Workshop on Formal Methods for Industrial Critical Systems

Eindhoven, The Netherlands, 2-3 November 2009

The aim of the FMICS workshop series is to provide a forum for researchers who are interested in the development and application of formal methods in industry. In particular, these workshops are intended to bring together scientists and practitioners who are active in the area of formal methods and interested in exchanging their experiences in the industrial usage of these methods. These workshops also strive to promote research and development for the improvement of formal methods and tools for industrial applications.

Invited Speakers
This year, FMICS will feature four invited speakers - two outstanding scientists and two prominent industrialists working in top companies with an emphasis on formal methods for critical systems:
• Dino Distefano, Queen Mary, Univ. London, UK
• Diego Latella, ISTI-CNR, Italy
• Thierry Lecomte, ClearSy, France
• Ken McMillan, Cadence, USA

Accepted Papers
• J. B. Almeida, M. Barbosa, J. Sousa Pinto and B. Vieira: ‘Correctness with respect to reference implementations’
• D. Delmas, E. Goubault, S. Putot, J. Souyris, K. Tekkal and F. Védrine: ‘Towards an industrial use of FLUCTUAT on safety-critical avionics software’
• J. de Dios and R. Pena: ‘A Certified Implementation on top of the Java Virtual Machine’
• S. Evangelista and L. M. Kristensen: ‘Dynamic State Space Partitioning for External and Distributed Model Checking’
• A. Goodloe and C. Munoz: ‘Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle’
• J. Mariño, Á. Herranz, M. Carro and J. J. Moreno-Navarro: ‘Formal modeling of concurrent systems with shared resources’
• P. Parizek and T. Kalibera: ‘Platform-Specific Restrictions on Concurrency in Model Checking of Java Programs’
• M. Raffelsieper, M. Reza Mousavi, J.-W. Roorda, C. Strolenberg and H. Zantema: ‘Formal Analysis of Non-Determinism in Verilog Cell Library Simulation Models’
• E. Schierboom, A. Tamalet, H. Tews, M. van Eekelen and S. Smetsers: ‘Preemption Abstraction: a Lightweight Approach to Modelling Concurrency’
• K. J. Turner and K. Leai Larry Tan: ‘A Rigorous Methodology for Composing Services’.

Accepted Posters
• A. Ferrari, A. Fantechi, S. Bacherini and N. Zingoni: ‘Formal Development for Railway Signaling Using Commercial Tools’


Events

Two Networking Workshops in Zurich in November

On November 9-10, 2009 in Zurich, Switzerland, two workshops will be held: the 4th Workshop on "The Future Internet" and the 2nd Workshop on "Economic Traffic Management".

The two main goals of the 4th GI/ITG Kommunikation und Verteilte Systeme [Communication and Distributed Systems] (KuVS) Workshop on Future Internet and the 2nd Workshop on Economic Traffic Management (supported by the FP6 EMANICS Network of Excellence and the FP7 STREP SmoothIT) are to give scientists, researchers, and operators the opportunity to present and discuss their ideas in these areas, and to strengthen cooperation in the field of economic-technology interplay.

The set of topics of these two combined workshops is focused on, but not limited to:
• Cloud and Grid computing infrastructures
• SLA and service management
• Energy efficiency and green ICT
• Service-oriented networks and infrastructures
• Content-based routing
• Routing mediation (pub/sub)
• Economic traffic management mechanisms
• Separation of identity and address (locator/ID split)
• Infrastructure/Platform as a Service
• Cross-layer design, cross-layer optimization
• Predictable QoS and Quality-of-Experience
• Next generation transport, e.g., carrier grade Ethernet
• Network management and control plane
• Sensor networks and applications
• Distributed control and management approaches
• Network virtualization and segmentation
• Future network and services business models
• Regulatory effects on networks and infrastructure
• Future mobile networks
• Clean-slate architectures.

These two workshops will take place at the University of Zurich (UZH), CSG@IFI, in Zurich, Switzerland. They are organized jointly by the Communication Systems Group CSG of UZH and SAP Research, Zurich, Switzerland.

More information: http://www.csg.uzh.ch/events/fi-etm

Future Internet Conference supported by ERCIM

ERCIM was a sponsor of the Future of Internet Conference, a Czech EU Presidency conference on the perspectives emerging from R&D in Europe, held in Prague on 11-13 May 2009. The latest issue of ERCIM News featuring a special theme on ‘Future Internet Technology’ was distributed to the more than 500 participants. Together with the sponsorship, it contributed to the success of the conference.

More information: http://www.fi-prague.eu/

CALL FOR PARTICIPATION

3D Stereo MEDIA – International 3D Stereo Film & Technology Festival

Liège, Belgium, 1-3 December 2009

Imaging is undergoing a true revolution: 3D stereo imaging. While the tip of the iceberg of this revolution is 3D cinema – with the number of 3D-equipped movie theaters and 3D movies quickly increasing – 3D stereo visualization will increasingly be found elsewhere, eg in the home (TV, games, cell phones, etc.), in science and engineering, and in communication and advertising.

The event will feature both a technology festival and a film festival. The technology festival will consist of a main track of presentations and panels (sometimes with two parallel sessions) and of a technical exhibition. Ben Stassen, a pioneer in 3D movie making, and author of "Fly me to the moon", will be the President of Honor of the festival. He will give a preview of his new movie "Around the world in 50 years". The film festival will be open to the general public. Since 3D stereo imaging is of interest to people with a wide variety of backgrounds (artists, scientists/engineers, communication/advertising experts, business people, etc), the conference programme will be designed in such a way that the main track will be appealing to all. The goal is to foster the get-together and interaction of, say, artists and engineers.

The high-level themes of the technology festival are the fundamentals of 3D (stereoscopy, holography, 3D displays), 3D in cinemas and homes (‘3D at home’), 3D visualization in science and engineering, 3D visualization in advertising and communication, advanced topics in 3D acquisition, processing, communication, and visualization (including integral imaging, free-viewpoint TV, and multiview imaging), and the applications of 3D visualization in science and engineering (design and conception; architecture and urban planning; cartography, geomatics, and remote sensing; security and defense; space exploration and astronomy; radar, IR, UV, and passive MMW imaging; simulators for vehicles, defense, and training; guidance, control, and navigation; microscopy, etc). Medical imaging and life sciences will be covered in a separate event to be held in the spring of 2010.

Throughout the main track of presentations, several commented clips of 3D movies will be shown. Needless to say, our conference rooms will be equipped with 3D stereo visualization equipment, both for movie projection and for technical presentations. The event will also feature training sessions in 3D stereo movie making, some with hands-on workshops (limited to small groups). These sessions will be organized by the stereographers Kommer Kleijn and Peter Wilson, and will take place immediately before and after the main event. Companies interested in sponsoring the event and in renting a booth in the exhibition area should contact the organizers.

More information: http://www.3dmedia2009.com


Stefano Baccianella Winner of the 2009 "Franco Denoth Award"

Stefano Baccianella, of ISTI-CNR, Pisa, Italy, is the winner of the 2009 "Franco Denoth Award". The award, now in its first edition, is granted by the "Fondazione Franco ed Enrico Denoth" in memory of Franco Denoth, computer science pioneer and, among others, former member of the ERCIM Board of Directors (see ERCIM News 72 for a tribute to Franco on the occasion of his death). The 2009 "Franco Denoth Award" is granted to the best MSc thesis obtained in 2008 from the University of Pisa, dealing with issues related to the development of the Internet. Stefano obtained his MSc degree in Computer Science with a thesis entitled "Multi-Facet Rating of Online Hotel Reviews: Issues, Methods and Experiments", carried out under the supervision of Andrea Esuli and Fabrizio Sebastiani, both of ISTI-CNR (see ERCIM News 77, pp. 60-61 for a brief description of Baccianella's thesis work). Stefano's thesis is now a book, published by VDM Verlag, Saarbrücken, Germany. Stefano is currently pursuing a PhD in Information Engineering at the University of Pisa.

EEES: Elite Education in Embedded Software

In Brief

Gabriella Petri Denoth presenting the award to Stefano Baccianella.

SpaRCIM Appoints New ERCIM Representatives

During 2008, new representatives for SpaRCIM (Spanish Research Consortium for Informatics and Mathematics) in ERCIM have been appointed. Founding president of SpaRCIM, Juan José Moreno-Navarro, was nominated General Director for Technology Transfer at the Ministry of Science and Innovation (Ministerio de Ciencia e Innovación). Manuel Hermenegildo, IMDEA Software and Technical University of Madrid, was elected as the new SpaRCIM director and representative on the ERCIM Board of Directors. In January 2008, the ERCIM Executive Committee representative, Ernesto Pimentel, was replaced by Pedro Merino, University of Malaga. In December 2008, Christophe Joubert, Technical University of Valencia, replaced Salvador Lucas as Spanish local editor on the ERCIM News editorial board. And SpaRCIM's national contact for the ERCIM Fellowship Programme, Fernando Orejas, was substituted by Javier Larrosa, Technical University of Catalonia. SpaRCIM represents Spanish computer science in ERCIM. It is currently spread over five universities and one research institute from the Spanish Research Council (CSIC).

From right to left: Juan José Moreno-Navarro, Manuel Hermenegildo, Pedro Merino and Christophe Joubert during the ERCIM Days and 20th anniversary of ERCIM in Paris, 28 May 2009.

In April 2009, the Danish Ministry of Science gave the official approval of the elite master education in embedded software at Aalborg University. In total, only 20 elite programmes were approved in Denmark, and at Aalborg University, in addition to embedded systems, three other programmes were approved, including wireless communication, biotechnology, and tourism. The elite education in embedded software aims at recruiting top bachelor candidates with the ambition of carrying out their master education in close collaboration with the research at CISS, the Center for Embedded Software Systems. The educational programme in embedded software also includes aspects of hardware architecture and control theory. With the approval of the education comes a grant to cover the extra-curricular activities, including participation in PhD schools, conferences and workshops, as well as investment in special-purpose equipment and software. The intention is to recruit approximately ten qualified Danish and international students.
http://eliteeducation.aau.dk/
http://www.eliteuddannelser.aau.dk/uddannelsesliste/3803886

Graduated Elite students of 2008.


ERCIM is the European Host of the World Wide Web Consortium.

Institut National de Recherche en Informatique et en Automatique
B.P. 105, F-78153 Le Chesnay, France
http://www.inria.fr/

Technical Research Centre of Finland
PO Box 1000, FIN-02044 VTT, Finland
http://www.vtt.fi/

Irish Universities Association
c/o School of Computing, Dublin City University, Glasnevin, Dublin 9, Ireland
http://ercim.computing.dcu.ie/

Austrian Association for Research in IT
c/o Österreichische Computer Gesellschaft, Wollzeile 1-3, A-1010 Wien, Austria
http://www.aarit.at/

Norwegian University of Science and Technology
Faculty of Information Technology, Mathematics and Electrical Engineering, N-7491 Trondheim, Norway
http://www.ntnu.no/

Polish Research Consortium for Informatics and Mathematics
Wydział Matematyki, Informatyki i Mechaniki, Uniwersytetu Warszawskiego, ul. Banacha 2, 02-097 Warszawa, Poland
http://www.plercim.pl/


Consiglio Nazionale delle Ricerche, ISTI-CNR
Area della Ricerca CNR di Pisa, Via G. Moruzzi 1, 56124 Pisa, Italy
http://www.isti.cnr.it/

Centrum Wiskunde & Informatica
Science Park 123, NL-1098 XG Amsterdam, The Netherlands
http://www.cwi.nl/

Foundation for Research and Technology – Hellas (FORTH)
Institute of Computer Science
P.O. Box 1385, GR-71110 Heraklion, Crete, Greece
http://www.ics.forth.gr/

Fonds National de la Recherche
6, rue Antoine de Saint-Exupéry, B.P. 1777, L-1017 Luxembourg-Kirchberg
http://www.fnr.lu/

FWO
Egmontstraat 5, B-1000 Brussels, Belgium
http://www.fwo.be/

FNRS
rue d'Egmont 5, B-1000 Brussels, Belgium
http://www.fnrs.be/

Fraunhofer ICT Group
Friedrichstr. 60, 10117 Berlin, Germany
http://www.iuk.fraunhofer.de/

Swedish Institute of Computer Science
Box 1263, SE-164 29 Kista, Sweden
http://www.sics.se/

Swiss Association for Research in Information Technology
c/o Professor Daniel Thalmann, EPFL-VRlab, CH-1015 Lausanne, Switzerland
http://www.sarit.ch/

Magyar Tudományos Akadémia
Számítástechnikai és Automatizálási Kutató Intézet
P.O. Box 63, H-1518 Budapest, Hungary
http://www.sztaki.hu/

Spanish Research Consortium for Informatics and Mathematics
D3301, Facultad de Informática, Universidad Politécnica de Madrid, Campus de Montegancedo s/n, 28660 Boadilla del Monte, Madrid, Spain
http://www.sparcim.es/

Science and Technology Facilities Council, Rutherford Appleton Laboratory
Harwell Science and Innovation Campus, Chilton, Didcot, Oxfordshire OX11 0QX, United Kingdom
http://www.scitech.ac.uk/

Czech Research Consortium for Informatics and Mathematics
FI MU, Botanicka 68a, CZ-602 00 Brno, Czech Republic
http://www.utia.cas.cz/CRCIM/home.html

Order Form
If you wish to subscribe to ERCIM News free of charge or if you know of a colleague who would like to receive regular copies of ERCIM News, please fill in this form and we will add you/them to the mailing list.

Send, fax or email this form to:
ERCIM NEWS
2004 route des Lucioles, BP 93
F-06902 Sophia Antipolis Cedex
Fax: +33 4 9238 5011
E-mail: [email protected]

Data from this form will be held on a computer database. By giving your email address, you allow ERCIM to send you email.

I wish to subscribe to the
[ ] printed edition  [ ] online edition (email required)

Name:

Organisation/Company:

Address:

Postal Code:

City:

Country:

E-mail:

You can also subscribe to ERCIM News and order back copies by filling out the form at the ERCIM Web site at http://ercim-news.ercim.org/

Danish Research Association for Informatics and Mathematics
c/o Aalborg University, Selma Lagerlöfs Vej 300, 9220 Aalborg East, Denmark
http://www.danaim.dk/

ERCIM – the European Research Consortium for Informatics and Mathematics – is an organisation dedicated to the advancement of European research and development in information technology and applied mathematics. Its national member institutions aim to foster collaborative work within the European research community and to increase co-operation with European industry.

Portuguese ERCIM Grouping
c/o INESC Porto, Campus da FEUP, Rua Dr. Roberto Frias, nº 378, 4200-465 Porto, Portugal