Data Services Market Guide 2010


£10 - UK, ROW

$20 - Americas

€15 - EMEA

Data Services Market Guide 2010

    Good data, good savings

    ISJ Data Survey inside


Investor Services Journal | Data Services Market Guide 2010

Editor's Letter

The financial data and technology industry has witnessed an unprecedented overhaul in the capital markets, changing the outlook of many of its banking clients, with further regulation to come.

In doing so there has been a shift from an emphasis on assets to that of liabilities: what could be lost, rather than what money could be made. Corporate governance issues, the transparency of operations and, perhaps, the accountability of how banks and other institutions justify their trading choices are all fighting to emerge top of priority lists around the world.

The data services industry will be able to provide some of the solutions. Financial institutions have been looking to stem the outflow of capital, whether through losses or investor redemptions, and conversations with software developers will more often than not boil down to cost saving. Amid a number of articles that touch on this area, Andreas Glatter of Econ explains the numbers game of good systems and good data (p32). He makes a powerful case for the centralising of data, particularly in the context of declining staff numbers. The insistence that management must get the message of needing the right, accurate data to drive a business will, I'm sure, be echoed around the industry.

An important consideration is that innovation in the finance industry will not stop. Just as a lotus can emerge spotless from murky water, new ideas will eventually drive the return to profits. The outsourcing of middle and back office functions is one such innovation that has emerged, potentially suiting firms of all sizes in the transference of data operations to trusted specialists and further depressing costs. Pramod Gupta of HCL Technologies, Geoff Harries of Fiserv and the panellists in our market debate thoroughly expound upon this service offering.

Amid the reconstruction, lines of communication between regulators, committees and institutions must remain open. In this light, Bob Cumberbatch of Interactive Data explains the good work done by associations to work together towards industry standards, as well as the collaboration of data vendors themselves (p39).

Few have any clear idea what the new structure for operations and regulation of financial institutions will look like. But this guide contains many pertinent articles, penned by industry insiders with a wealth of experience and knowledge, that describe how the foundations are to be laid.

    Ben Roberts, Editor


    Foreword

The concept of enterprise data management (EDM) has absolutely taken off; however, the reality is not yet being achieved. Firms are spending far too much on maintenance of existing technology projects rather than initiating projects that would help the organisation to be much more efficient in the way it manages content and data, and therefore enable it to execute its business strategy efficiently and as designed.

EDM is a solution that will quietly simmer away for many years to come, even though it is the nirvana solution firms need as they battle against increased data volumes and niche content requirements from specialist systems. The need for EDM will only increase over time as more people require access to more granular and higher-frequency data, and current niche data requirements become standardised across the organisation. The requirement for EDM absolutely exists, but there are a fairly insignificant number of projects trying to achieve this at the moment.

Historically, firms had little understanding of the granularity and in-depth detail required of assets and complex securities, and as a result created specialised, siloed platforms that didn't let data flow across the organisation properly. By having a centralised data management system, firms can ensure the quality and real-time availability of data is managed effectively and data is understood in a much more intelligent way. As a result, particularly during the current economic situation, better decisions will be made.

Rather than investing in EDM, firms are

spending their budgets on reducing latency and improving efficiency to move data around the organisation.

The pendulum swing of innovation
Stuart Calder of Linedata Services

Sybase is working with firms to add new technology to traditional systems, both to manage the ever-increasing volume of data and to achieve efficient data flow in real time. By providing scalable technology, Sybase is helping firms to improve their existing systems while at the same time providing a framework that shows there is an opportunity to achieve more of an EDM capability.

Over the past twenty years, the pendulum between business driving IT strategies and IT strategies driving business has swung back and forth. In a time of reduced IT spending, the emphasis is now moving towards a focus on technology driving business. Sybase is seeing firms recognise that their previous business models were hampered by technology not being able to cope with changing market demands. If these same businesses are going to progress, it means a change to their business model, with technology facilitating success.

Firms, however, are not going to hedge their bets on new, unproven technology, but will rather look to vendors with proven track records in financial services, such as Sybase, to provide technology that they can be confident will help achieve current and future business objectives. With larger vendors, firms will have the advantage of using technology that can be implemented in one department or scaled to an enterprise level. This ability will provide firms with strategic, long-term partnerships.

The pendulum between business driving IT strategies and IT driving business strategies has swung back and forth


    Foreword

Regulatory Oversight: The Devil is in the Data
Michael Atkin, Managing Director, EDM Council, Inc.

The current angst extending from turmoil in the financial markets is perhaps the best thing that has ever happened to data management. Of course, we wouldn't wish these problems on anyone, but as a result of systemic failure, we've learned two valuable lessons that portend well for data management.

The first is that financial institutions are predominately focused (now and always) on short-term profitability. That's the nature of financial institutions. The second is that the regulatory environment doesn't currently reflect the true nature of how global business operates. It clearly needs to be restructured in order to provide oversight and keep markets in balance. And this is not really a secret.

The problems of dealing with complexity and global interconnectedness have been documented in every regulatory report over the past five years.

All of these reports, from Giovannini to the G-30 to the BIS report on payment and settlement to the recent Treasury blueprint for regulatory reform, basically say that financial processes are interconnected in ways that we haven't experienced before and that they can be undermined by lapses in data, operational processes and systems dysfunction. In essence, these reports all concluded (and we ignored) that it's not a matter of if, it is a matter of when, we are going to experience a systemic crisis.

The real issue is not why we experienced a crisis; it is how to prevent it from happening again. But I don't want to get too far ahead of myself. Let me set the stage of where we came from at the beginning of 2008 and where I think we're headed in 2009.

    Mike Atkin, EDM Council


Finger on the Pulse

The EDM Council did some research early last year to characterize the current state of data management within financial institutions. The bottom line was that it was rising rather nicely. Projects were underway. Data groups existed. And progress was being made. The truth, however, was that most firms were managing data tactically. Data had been reduced to just another task to accomplish, subject to the whims of cutthroat competition for resources and attention.

Executive management generally viewed data management as a low-level concern and a necessary evil. It was required to keep up with the pace of business and the mantra was "do it fast and do it cheap". Unfortunately, if top management doesn't take ownership of data, the philosophy cascades throughout the organization and reinforces silo operating models, making it difficult to manage politics and rewarding tactical workarounds.

Business users clearly understood the value of data that was fit for purpose, but they didn't invest time to peel back the layers of the onion as it related to the details. They paid attention to data when there was a problem. Business management generally took a narrow and functional view of data management, which hid the problems associated with the interrelationship of processes.

Data management was stuck dealing with the huge backlog of problems that existed, putting out brushfires and dealing with immediate challenges. Most data groups were reactive. And when they were not dealing with the crisis du jour, they spent their time justifying their existence, begging business units for funding and dealing with the challenges of organizational politics.

The bottom line is that the data genie was out of the bottle, but we're still operating the way we are used to operating. We just had another project added into the mix.

What's Different Now

The way things have unfolded suggests that the goals of data management are now goals shared by the regulators, making the business case a whole lot easier. We're all coming to the realization that firms can't report and justify, and regulators can't compare and analyze, without the data being right.

Not that it's a surprise, but prepare for more regulation. It's inevitable and the result of proving that you still can't be trusted. Unfortunately, regulation isn't the objective from my perspective. Regulation is a law enforcement response and the price we pay for our greedy sins.

But in addition to regulation, we should be preparing for principles-based, global oversight. That's the gap in the regulatory environment. The current regulatory system was created in the 1930s and is not able to deal with the diversity of market participants, the complexity of financial instruments, the convergence of trading platforms or the nature of global interconnectedness.

This new regulatory oversight regime won't get caught asleep at the wheel again. That means an increasing focus on systemic and operational risk (including global counterparty and exposure management). More requirements related to hierarchies and links. More transparency on pricing and valuations. And automation of payments and settlement processes.

There will be more ad hoc reports: about activities, about groups of trades, about positions for specific counterparties and about relationships between investment activities. And you will need to slice, dice and reconnect according to multiple scenarios. That means cross-vendor comparisons, alignment of internal repositories and enhanced corporate actions processing to be able to unravel the multiple issuer, issue and obligor relationships and get a clear and consolidated view of risks and exposures.
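The "slice, dice and reconnect" requirement can be pictured with a small sketch of rolling exposures up a corporate hierarchy, so positions against subsidiaries consolidate to the ultimate parent. The entity names, parent map and amounts below are invented for illustration, not drawn from any real hierarchy:

```python
# A sketch of rolling exposures up an issuer hierarchy so that positions
# against subsidiaries consolidate to the ultimate parent. The entity
# names, parent map and amounts are invented for illustration.
parent_of = {"SUB_EU": "PARENT_CO", "SUB_US": "PARENT_CO"}

exposures = {"SUB_EU": 400_000, "SUB_US": 250_000,
             "PARENT_CO": 100_000, "OTHER_BANK": 50_000}

def ultimate_parent(entity):
    """Walk the hierarchy until an entity with no recorded parent is reached."""
    while entity in parent_of:
        entity = parent_of[entity]
    return entity

def consolidate(exposures):
    """Sum exposures under each entity's ultimate parent."""
    rolled = {}
    for entity, amount in exposures.items():
        top = ultimate_parent(entity)
        rolled[top] = rolled.get(top, 0) + amount
    return rolled

assert consolidate(exposures) == {"PARENT_CO": 750_000, "OTHER_BANK": 50_000}
```

Without the hierarchy data (the `parent_of` links), the three related exposures would appear as unrelated lines, which is exactly the "clear and consolidated view" problem described above.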

All of these processes are data dependent.


    Contents

    Features

8-9 Investor Services Journal's Data Vendor Survey

    Counterparty Risk

    10-12 Do You Know Who You Are Dealing With?

    14 Are You Exposed?

    Outsourcing

    16 The Outsourcing Phenomenon

    18 Merger Update: A Restructuring Case Study

20 New Architecture In Light Of Regulation

    Panel Debate

    22-30 Panel

    32 Data - Bettering The Bottom Line - Cost Implications

35 Time For SaaS - Speed Of Change

    36 Not Worth The Gamble - Risk Management

39 Working Together Through the Annus Horribilis - Vendor Collaboration

    43 Market Comment - Unisys

44 Keep The Customer Satisfied - Operations

46 Company Profiles

    Supplement editor:

    Ben Roberts

    ([email protected])

    Web design: Peter Ainsworth

    ([email protected])

    Senior account manager:

    Trish De La Grange

    ([email protected])

    Account managers:

    Mark Needham

(Mark.Needham@2ipartners.com)

    Tarik Rekiouak

    ([email protected])

Business development manager:

    James Olweny

    ([email protected])

    Commercial director:

    Jon Hewson

    ([email protected])

    Operations manager:

    Sue Whittle

    ([email protected])

    CEO: Mark Latham

    ([email protected])

    Investor Intelligence partnership

    16-17 Little Portland Street

    London W1W 8BP

    T: +44 (0) 20 7299 7700

    F: +44 (0) 20 7636 6044

    W: www.ISJtv.com

© 2009 2i Media plc.

    All rights reserved.

No part of this publication may be reproduced, in whole or in part, without prior written permission from the publishers.

    ISSN 1744-151X.

    Printed in the UK

    by Pensord Press.


    Survey

    Investor Services Journal: data vendor survey

Question 1: How have the proposed cutbacks in spending on technology by financial institutions affected your business projections this year?

Question 2: A unified system of data management and information feeds has been cited as a major challenge to adopt in the past. How much do you perceive the idea of a bank operating in silos (ie, separate departments that rarely, if ever, share data) as a remaining problem among your prospective customers?

Question 3: How much will your business plan for this year involve entering new markets to seek opportunities?

Investor Services Journal conducted a survey of the leading data providers and developers as a precursor to the features in this guide. Here are the results of their responses.

Question 1 responses: a) very much: 1 (33.3%); b) to some degree: 2 (66.6%); c) not at all: 0 (0.0%).

Question 2 responses: a) very much: 3 (100.0%); b) still occasions of silos: 0 (0.0%); c) not at all: 0 (0.0%).

Question 3 responses: a) to a great degree: 0 (0.0%); b) to some degree, but won't dominate our plans: 3 (100.0%); c) not at all: 0 (0.0%).


Question 4: How much will the collaboration of vendors that have a mutual client be a growing part of the industry in the next year?

Question 5: Are you part of such a collaboration?

Question 6: Please rate the importance that you perceive your customers and prospective customers place on their data management system.

Question 7: Amid global reviews of financial regulation, there are strong signs that some areas will move from a principles-based approach to firmer legislation. How much will this, directly or indirectly, affect the provision of data management?

Question 4 responses: a) to a great degree: 1 (33.3%); b) to some degree: 2 (66.6%); c) not at all: 0 (0.0%).

Question 5 responses: Yes: 2 (66.6%); No: 1 (33.3%).

Question 7 responses: a) to a great degree: 0 (0.0%); b) to some degree: 3 (100.0%); c) not at all: 0 (0.0%).

Question 6 responses (rating from 1 to 5):

                      1       2       3      4       5
a) Low latency        33.3%   0.0%    0.0%   0.0%    33.3%
b) Risk management    0.0%    66.6%   0.0%   33.3%   0.0%
c) Low cost           33.3%   66.6%   0.0%   0.0%    0.0%


For many risk managers in today's market, the pressing issue is not the need for counterparty risk management: this is well understood. Nor is it the need for technical methodology: there are more algorithms and risk engines on the marketplace than there are banks to apply them. The most important need is clean, timely data on which to base their analysis. And because OTC derivatives introduce counterparty risk into the equation, firms need better tools to monitor and manage counterparty risk.

This is easier said than done. However, existing data management solutions can plug the common gap in enterprise risk architecture by enabling a data-centric approach to risk.

Cleanliness is next to Godliness

The use of risk management methodologies in the standard operations of financial institutions has led to accelerating demand for more and more market and reference data, covering a broader range of markets at a higher level of detail and accuracy, and the associated costs in time and resources are growing.

There is a simple reason for the primacy of data collection and management issues when considering risk management systems: garbage in, garbage out. If the data you put into your model, enterprise-wide or not, is wrong or out of date, the analysis produced will be wrong or out of date. Essentially, your organisation is steering either while blindfolded or while looking into the rear-view mirror.
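The garbage-in, garbage-out point can be made concrete as a validation gate that rejects bad records before they ever reach a risk model. This is a minimal sketch: the field names, currency list and staleness limit are assumptions for illustration, not any particular system's rules:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical staleness limit: prices older than this are treated as unusable.
MAX_AGE = timedelta(hours=24)

def validate_price_record(record, now):
    """Return a list of problems; an empty list means the record is usable."""
    problems = []
    if record.get("price") is None or record["price"] <= 0:
        problems.append("missing or non-positive price")
    if record.get("currency") not in {"USD", "EUR", "GBP"}:
        problems.append("unrecognised currency")
    ts = record.get("as_of")
    if ts is None or now - ts > MAX_AGE:
        problems.append("stale or missing timestamp")
    return problems

now = datetime(2010, 1, 4, tzinfo=timezone.utc)
good = {"price": 101.5, "currency": "USD", "as_of": now - timedelta(hours=1)}
bad = {"price": -3.0, "currency": "???", "as_of": now - timedelta(days=5)}

assert validate_price_record(good, now) == []
assert len(validate_price_record(bad, now)) == 3
```

The gate does not make the analysis right; it only stops known-bad inputs from silently polluting it, which is the minimum the garbage-in, garbage-out rule demands.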

Data consolidation and referencing

What makes this issue all the more frustrating is that many banks have the data that they need to work from, but

    Counterparty Risk

Do you know who you're dealing with?

Tackling counterparty risk and ensuring correct oversight of trading partners is not possible without the right data system


it is simply strewn throughout the organisation: held in different systems, or on different pieces of paper in different buildings, or in different formats even within the same information system. Data collection and systems initiatives emerge from one part of the organisation without thought of their use to other parts, or to the strategic data management capabilities of the bank as a whole.

Moreover, when data is stored, it is too often kept without reference to the relationships which turn that data into information. For instance, credit data on particular customers is held by the accounting department without reference to the information that is kept on these customers by trading desks for marketing purposes. Or data from different settlement systems is left unconsolidated, leading to unwarranted credit and operational exposures, simply because the data is collected and analysed on a unit-by-unit basis rather than working from a common data source.

Market data collection processes hold their own perils. Different market data providers have different standards, each of which needs to be integrated with internal systems in a way that enables the market data to retain its accuracy, while still being aggregated up to enterprise level using valid relational rules and comparisons.

Often, market data from one part of the organisation will show significant discrepancies when compared with the same data used in a different part of the same company, simply because the source of that data was a different external system, or a different time zone or language.
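Such cross-source discrepancies are straightforward to surface once the two copies can be compared side by side. A minimal sketch, with invented instrument identifiers, prices and tolerance:

```python
# Two hypothetical internal copies of "the same" market data, sourced from
# different external feeds. Identifiers and prices are invented.
front_office = {"XS0001": 99.875, "XS0002": 101.250, "XS0003": 97.100}
risk_dept = {"XS0001": 99.875, "XS0002": 101.600, "XS0004": 95.000}

TOLERANCE = 0.05  # assumed acceptable price difference

def find_discrepancies(a, b, tol):
    """Compare two price snapshots: report mismatches and one-sided entries."""
    report = {"mismatched": [], "only_in_a": [], "only_in_b": []}
    for key in sorted(set(a) | set(b)):
        if key not in a:
            report["only_in_b"].append(key)
        elif key not in b:
            report["only_in_a"].append(key)
        elif abs(a[key] - b[key]) > tol:
            report["mismatched"].append(key)
    return report

report = find_discrepancies(front_office, risk_dept, TOLERANCE)
assert report["mismatched"] == ["XS0002"]
assert report["only_in_a"] == ["XS0003"]
assert report["only_in_b"] == ["XS0004"]
```

The comparison itself is trivial; the hard part, as the article notes, is getting the two departments' data into a form where it can be compared at all.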

These external interfaces are made all the more complex when combined with internal data production, collection and storage. General ledger, profit and loss accounting, mark-to-market reporting requirements: each of these has unique data collection and analysis needs. Positions and balances continually impacted by transaction events and their relationship to general ledger reports and summaries, and atomic-level information such as trade dates, prices, trader fees and counterparty data, are crucial to an organisation's ability to roll up and drill down through trading data to find information that is essential for an understanding of the company's risk position.

In most banks, the general ledger requires information from different information systems, but transaction systems are product-specific and the data produced is inconsistent. Moreover, the general ledger is not built to store transaction details. The result is that the general ledger is overloaded and requires enormous manual reconciliation. What's more, the final result is separated from the information used to build it, so any further analysis is rendered impossible and the general ledger is separated from any further strategic use it may have. Ideally, transaction data needs to be pulled from individual product-based systems and transformed into a consistent, validated information standard that creates consistent data between product and accounting systems.
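The idea of pulling product-specific transactions into one consistent, validated standard can be sketched as a pair of normalisers feeding a canonical record shape. The raw field names and the canonical schema below are assumptions for illustration only:

```python
# Hypothetical raw records from two product-specific systems; the field
# names and the canonical schema are invented for illustration.
equity_trade = {"sym": "VOD.L", "qty": 1000, "px": 1.50, "side": "B"}
swap_trade = {"notional": 5_000_000, "rate": 0.0312, "pay_receive": "PAY",
              "counterparty": "BANK_A"}

def normalise_equity(raw):
    """Map an equity-system record onto the canonical shape."""
    return {"product": "equity", "reference": raw["sym"],
            "amount": raw["qty"] * raw["px"],
            "direction": "buy" if raw["side"] == "B" else "sell"}

def normalise_swap(raw):
    """Map a swap-system record onto the same canonical shape."""
    return {"product": "irs", "reference": raw["counterparty"],
            "amount": raw["notional"],
            "direction": "pay" if raw["pay_receive"] == "PAY" else "receive"}

# One consistent, validated standard that both product and accounting
# systems can consume.
canonical = [normalise_equity(equity_trade), normalise_swap(swap_trade)]
assert all({"product", "reference", "amount", "direction"} <= set(r)
           for r in canonical)
```

Each product system keeps its own native format; only the normalisation layer knows both sides, which is what keeps the downstream ledger and reporting consistent.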

Once again, these interfaces are made significantly more complex when combined with the need to integrate external market data sources and transform external market information into internal information. When developing mark-to-market reporting


It will not be possible for one institution to be running different data management systems for each of the functions that need data


requirements, external market data must be combined with internal transaction information to produce consistent flows of data useable in accounts and general ledger work.

Too often, banks' data management capabilities trail significantly behind operations, creating additional technology-induced operational risks. These risks can be avoided, but organisations must take a proactive approach to the scale economies available to them by implementing an intelligent and strategically focused data management system.

Data collection for risk analysis

If data is the key issue for the development of risk management, risk managers must ask themselves how well developed their data management strategy is. One serious issue when developing a data management strategy is the extent to which the system will be open: how flexible is the system at absorbing and disgorging different types of data to and from different types of systems? With many risk management analytics systems, the database is integral to the system. This often means that it is impossible to support products and actions which are not integral to the risk engine's database itself: risk managers lose the opportunity to utilise different risk engines for different analysis because the data is held in the proprietary database of a particular risk engine.

The diversity of middleware, data import/export and database management system interfaces demanded by risk analytics engines is baffling. What this diversity means is that many organisations first select the risk engine that they are going to use and then interface back into the data environment that the risk engine needs. This is undoubtedly the wrong way to do things. The more robust method is to develop a timely, accurate and consolidated data structure which can be drawn on by whichever risk engine the risk manager, or indeed other bank functions, wishes to employ.
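The "data first, engine second" approach amounts to a single consolidated store that interchangeable risk engines draw on, rather than each engine owning its own proprietary database. A toy sketch, with invented position records and two stand-in "engines":

```python
# A single consolidated data store queried by interchangeable risk
# engines. The position schema and figures are invented for illustration.
positions = [
    {"counterparty": "BANK_A", "mtm": 1_200_000},
    {"counterparty": "BANK_B", "mtm": -300_000},
    {"counterparty": "BANK_A", "mtm": 450_000},
]

def gross_exposure_engine(store):
    """One 'engine': total gross mark-to-market across all positions."""
    return sum(abs(p["mtm"]) for p in store)

def net_by_counterparty_engine(store):
    """Another 'engine': net mark-to-market per counterparty."""
    out = {}
    for p in store:
        out[p["counterparty"]] = out.get(p["counterparty"], 0) + p["mtm"]
    return out

# Both engines draw on the same consolidated structure; neither owns it,
# so either can be swapped out without touching the data layer.
assert gross_exposure_engine(positions) == 1_950_000
assert net_by_counterparty_engine(positions) == {"BANK_A": 1_650_000,
                                                 "BANK_B": -300_000}
```

The design choice being illustrated is the inversion the article recommends: the consolidated data structure is the stable asset, and the analytics that read it are replaceable.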

In the end, it will not be possible for one institution to be running different data management systems for each of the functions that need data. It simply makes sense for data management to become an enterprise-wide strategic function from which risk analysis, accounting, audit, economics, operations research and other critical functions can be run.

    Time issues

In an ideal world, funding and risk would be managed through a real-time consolidation of all current and anticipated cash and positions, integrated with accurate market and counterparty risk data. Almost by definition, maintaining a real-time stream of constantly updated, clean and accurate data is the only way to close the gap between actual and known liquidity and exposures. This information needs to be continuously derived from transactions, market data feeds, and customer/counterparty updates that are drawn from multiple disparate systems, as discussed above, without manual intervention or batch processing.

For most firms, however, real-time counterparty exposure management remains an ideal. Most firms rely on manual efforts or out-of-date sources to construct counterparty credit ratings and corporate hierarchies. The extra step of linking this improved counterparty risk data to transactions and positions in real time can take hours or days. For many risk scenarios, this is too little, too late. Establishing a firm footing for all critical data sets and integrating them in real time is, however, not a fantasy: it is what enterprise data management solutions can deliver, and mitigation of counterparty credit risk is just the latest high-profile application of its transforming potential.
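An event-driven sketch of the real-time ideal: exposure is kept current as each transaction arrives, rather than rebuilt in a batch hours later. The event shape and limit here are hypothetical, purely to illustrate the pattern:

```python
from collections import defaultdict

class ExposureMonitor:
    """Keeps per-counterparty exposure current as transaction events
    arrive, instead of recomputing it in a periodic batch. The event
    shape and the single-limit model are illustrative assumptions."""

    def __init__(self, limit):
        self.limit = limit
        self.exposure = defaultdict(float)

    def on_trade(self, counterparty, delta):
        """Apply a trade's exposure change; return True on a limit breach."""
        self.exposure[counterparty] += delta
        return self.exposure[counterparty] > self.limit

monitor = ExposureMonitor(limit=1_000_000)
assert monitor.on_trade("BANK_A", 600_000) is False  # within limit
assert monitor.on_trade("BANK_A", 500_000) is True   # breach detected now
```

Because the check runs on every event, a breach is visible at the moment of the trade that causes it, not at the end of the next batch cycle, which is the gap between actual and known exposure described above.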


    Data Management

    Investment Accounting

    For more information, please visit www.eagleinvsys.com or contact one of our specialists:

    Americas: +1 617 219 0100 Asia Pacific: +61 2 9087 7647 Europe: +44 (0) 20 7163 5700

    INNOVATIVE TECHNOLOGY & SERVICE SOLUTIONS FOR:

    Performance Measurement and Attribution

    Information Delivery


    Monitoring Exposure

For all businesses operating in investment markets today, it is imperative to have a proper understanding of counterparty exposures. Managers at investment firms need a regular and frequent supply of such information, as well as the ability to obtain snapshots on demand, accurate to the second. Yet obtaining this is not always straightforward. For example, in recent months we at SimCorp have spoken with several prospective customers unable to obtain comprehensive analysis of exposures for days or even weeks after a counterparty has signalled trouble. It is hard to manage a problem when its scope cannot be determined.

So how is it that such apparently basic information can be so hard to find? In the cases we have seen, one of the main problems has invariably been the existing application systems architecture, which maps discrete applications more or less to the organisational structure. While that makes some sense from a functional point of view, it creates serious difficulties in respect of data. This is because functionality generally supports distinct operational processes, whereas data needs to be shared throughout the enterprise. In organisations where each business function has its own system, the same logical data will be stored in many different physical representations; the same security or trade, say, will be represented in different ways in different systems.

Such a departmentalised approach to application systems architecture can result in huge complexity. This in turn obstructs the creation of a consolidated view, which would necessitate understanding the different data sets involved, then extracting, filtering, manipulating, reconciling and formatting data from each to arrive at the required report. Attempting to obtain a comprehensive understanding of exposure to a counterparty across asset classes as diverse as, say, equities, fixed income, cash, exchange-traded and OTC derivatives exemplifies well how a fragmented architecture can hamper an organisation's ability to operate effectively. Furthermore, because of all the steps

    Are you exposed?

It is imperative to have a proper understanding of counterparty exposures.

    Data warehousing has strengths in assessing counterparty risk


involved, it would be impossible to obtain a consolidated view accurate to the second, even if all the processes required were identified and built in advance of the need arising.

One way to resolve this problem, at least partially, could be to utilise a data warehouse. Typically this would be fed from all relevant systems on a pre-defined frequency, depending upon the specific data involved, and then interrogated as necessary. However, the success of such a solution for consolidated reporting of counterparty exposures depends upon several things. One of these is that the designers of the data warehouse must have accurately predicted what data will be required in a given, possibly urgent, situation in the future; ie, that of all the data in the enterprise, the data required at any moment is included among the data actually stored. Another is that if it is updated on a periodic basis, even if that is daily or intraday, a data warehouse cannot offer up-to-the-second accuracy. Time delays are built into the approach, as a series of sequential processing steps must be involved, by which time positions may have moved.

A better solution, indeed the only kind that can provide an accurate up-to-the-second view, is to use a seamless processing system across the organisation. Such a system offers comprehensive functionality to support each aspect of the operation, so to each department it appears that they have their own system. However, the whole operates around a single core database. This means that there is only one instance of any data in the system: one definition for each security, one record of any trade, position, account and so on. All users use exactly the same physical data, rather than their own department's private version. And since users use the same system, all asset classes are handled in the same database and all reporting is driven from it too.
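With one core database, the consolidated counterparty view reduces to a single query. A minimal stand-in using an in-memory SQLite table; the schema, names and figures are invented for illustration and do not represent any vendor's system:

```python
import sqlite3

# A stand-in for a single core database holding one record per position
# across asset classes. Schema and data are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE positions (
    counterparty TEXT, asset_class TEXT, market_value REAL)""")
db.executemany("INSERT INTO positions VALUES (?, ?, ?)", [
    ("BANK_A", "equity", 2_000_000.0),
    ("BANK_A", "otc_derivative", -500_000.0),
    ("BANK_A", "fixed_income", 750_000.0),
    ("BANK_B", "cash", 100_000.0),
])

# One query over one database yields the consolidated view, with no
# cross-system extraction, reconciliation or reformatting steps.
rows = db.execute("""SELECT counterparty, SUM(market_value)
                     FROM positions
                     GROUP BY counterparty
                     ORDER BY counterparty""").fetchall()
assert rows == [("BANK_A", 2_250_000.0), ("BANK_B", 100_000.0)]
```

Because every asset class lives in the same table, the exposure roll-up is a `GROUP BY`, not a multi-system reconciliation project; that is the practical content of the "one instance of any data" claim.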

The result is that obtaining even complex analyses does not require pulling data from multiple, disparate sources and massaging it into a consistent form. Rather, it involves using standard reporting tools to query the database, which, if it is a modern, widely used platform, will have its structures clearly and openly defined for easy extraction.

To the uninitiated, particularly anyone working in an investment management business where a highly fragmented systems architecture exists, a seamless processing approach may sound almost too good to be true. However, at the same time as those investment organisations mentioned above were taking days and weeks to identify consolidated counterparty exposure, users of seamless processing systems were deriving the same information in seconds, where such reports were predefined, and in no more than a few hours, where new reports had to be created from scratch.

Furthermore, as all users access the exact same physical set of data, there are no time delays in data being refreshed, and all data in the system, and thus all reports, are as up to date as anyone, anywhere across the enterprise knows.

John Mayr is responsible for marketing and partner development at SimCorp. The company's investment management system, SimCorp Dimension, is a powerful and comprehensive system with a combination of seamless processing, broad instrument coverage and a wide range of tools to support all aspects of the

Obtaining even complex analyses does not require pulling data from multiple, disparate sources and massaging it into a consistent form


    Outsourcing

If you can't beat 'em, employ 'em. More financial institutions are outsourcing their back office processes to a third-party specialist. Pramod Gupta, head of financial services product engineering at HCL Technologies, explains the issues surrounding Master Data Management.

    The outsourcing phenomenon

    In the investment management industry, Master Data Management (MDM) is expected to be sophisticated enough to handle the market's difficult-to-understand products and myriad stakeholder relationships, while at the same time being simple enough for decision makers to see through subsets of entities with absolute clarity. These attributes can be difficult to reconcile within a working solution implemented or managed in-house, and analyst house Gartner has warned about the significant incidence of failed MDM deployments. As a result, outsourcing some or all of the deployment and management of MDM to specialists is becoming an option that many institutions are turning to.

    MDM is typically applied to entities, such as customer data integration or product information management, which are of critical importance to the operation and success of an organisation. An MDM solution will have some means for cleansing and aggregating incoming data (ETL tools), and data models for storing the cleansed golden copy, along with


    support mechanisms such as distribution, auditing and certification. In even the simplest financial institution, these tasks will be complex and likely to be beyond the available resources or operationally desired competency of an in-house IT function.

    In capital markets, MDM applies to entities such as counterparties, instruments, accounts, trades and positions, and venues. Some of these entities are more complex because of the lack of standardisation across the industry and the wide variety of compliance considerations that need to be taken into account - for example, the MiFID regulation, which requires client details to be immediately accessible. The massive amount of data in a typical financial institution, together with these complex data-governance requirements, reinforces the position that successful MDM is a specialist field.

    MDM is increasing in importance in capital markets because so many business applications (for example, risk management, FOMS and corporate actions) are dependent on consistent and reliable data. A typical financial institution has a number of incoming sources (for example, Bloomberg, Reuters or Hoovers) and a number of outward destinations (downstream applications, compliance or analytics). Good MDM solutions ensure that these applications and data sources work to make an institution more knowledgeable, rather than more confused, but aren't easy to implement.
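The cleanse-and-aggregate step that produces a golden copy can be illustrated with a toy sketch. The feed names, fields and precedence rule below are assumptions made for the example, not any vendor's actual data model: two incoming records for the same instrument are normalised and then merged field by field, with a configured source precedence deciding conflicts.

```python
# Illustrative raw records for one instrument from two hypothetical feeds.
RAW_FEEDS = {
    "feed_a": {"isin": "US0378331005", "name": " apple inc ", "coupon": None},
    "feed_b": {"isin": "US0378331005", "name": "Apple Inc.", "coupon": None},
}
# Source precedence: which feed wins when both supply a value for a field.
PRECEDENCE = ["feed_b", "feed_a"]

def cleanse(record):
    """Normalise obvious formatting noise before merging."""
    out = {}
    for field, value in record.items():
        out[field] = value.strip() if isinstance(value, str) else value
    return out

def golden_copy(feeds, precedence):
    """Merge cleansed records field by field, highest-precedence value winning."""
    merged = {}
    for source in reversed(precedence):        # lowest precedence first...
        merged.update({k: v for k, v in cleanse(feeds[source]).items()
                       if v is not None})      # ...so higher sources overwrite
    return merged

record = golden_copy(RAW_FEEDS, PRECEDENCE)
print(record)
```

A production MDM layer adds far more (matching, survivorship rules, audit trails, distribution), but the shape of the problem - many inconsistent inputs, one certified output - is the one sketched here.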

    Because of the non-standard nature of MDM, a certain amount of consultancy is always required. However, given the ongoing complexity we've referred to, options have been developed by vendors that go beyond helping with the initial deployment and maintenance. Various levels of outsourcing, with infrastructure, IP and master data being held variously either by the vendor, the client or a combination of both, are now regularly offered and, more's the point, taken up by institutions.

    At one end of the outsourced MDM

    model, a custom-built (or customised off-the-shelf) solution is installed on client premises, with ongoing management provided by consultants locally or remotely. With this option, not only does the master data still reside within the client's perimeter, but the client could also retain the associated solution IP. At the other end of the spectrum, a packaged MDM product is installed at the vendor's site to provide ASP access to clients. There are many combinations and further options between these two extremes. What they have in common is offering financial institutions the flexibility to secure the benefits of MDM in the way that suits them best.

    MDM solutions allow institutions to easily manage and understand what would in former days be a dizzying array of information. Gartner has suggested that by 2012, MDM will lead to a 60% reduction in all costs associated with the elimination of redundant master data. However, the analyst firm has warned that, due to the lack of a sufficiently business-oriented approach, appropriate governance and an accompanying metrics structure, 30% of MDM programs will fail. The message is clear: institutions need successful MDM deployments. Outsourcing to specialist vendors is a very good way to make sure that happens.

    Pramod Gupta, head of financial services product engineering at HCL Technologies


    Merger Update

    Merger Update: A Restructuring Case Study

    Fiserv's acquisition of CheckFree is a key example of how post-trade services may need to be realigned to better suit the client's needs.

    After concluding the acquisition of CheckFree in December 2007, Fiserv developed a new market approach and brand identity. Where previously the company had 77 units across the organisation, and acted more like a holding company, there is now a focus on Fiserv and, underneath, two major divisions. We


    identified the core competencies across Fiserv and came up with five: processing services, payments, risk and compliance, customer and general management, and business intelligence and optimisation, says Harries. Identifying those key competences allowed us to establish and consolidate around a vision of Fiserv rather than each business unit having its own vision.

    This realignment has included some neat footwork in rearranging certain areas of business. An example of this was a business unit called Interactive Technologies, which was part of the Fiserv company. This was moved into the CheckFree investor services division as, according to Harries, it had closer synergies with the needs of existing clients. We found there were lots of mutual clients in terms of the asset managers and custodians which we had relationships with, and it made sense to better service them by bringing them under the investor services umbrella.

    Multiple challenges now present themselves to solution providers such as Fiserv, spurred by the wider tremors in the financial market. Greater transparency is demanded of banks, along with a more thorough understanding of their middle and back office. At the same time, staff levels are declining. Solution providers can help on both scores. Firms need to have the operational capability in place because they're being challenged more often to prove they have the processes and risk mitigation strategies in place.

    The other thing high on the agenda is being able to do the same or more work with fewer people. To do that you need something which improves your operational efficiency, and many of the solutions in investor services are targeted at reducing operational risk and increasing operational efficiency. So the current market climate allows us to position our solutions better and address those particular aspects people need for those requirements.

    The scrutiny of operations will extend to an analysis of the widening book of asset classes being traded, Harries believes, particularly those exchanged over the counter. What we've seen is a diversification of the portfolio out of the mainstream equities, bonds and foreign exchange to incorporate wider asset class coverage.

    In this space, Fiserv offers a post-trade lifecycle system, from confirmation to settlement. Derivatives, as contracts with often multiple variables, require a greater degree of event management during this time period than equities, including terminations, novations and the amendment of existing positions. It is, he says, a hot topic in the industry to ensure people have the capabilities to manage the increasing volumes.

    Often, trading firms will work from a centralised platform such as Omgeo's equity matching or DTCC's Derivserv, and Fiserv works with them. We give investment managers the capability to connect to these platforms, to interoperate with different platforms and manage their book of business across different asset classes, which naturally goes over many platforms.

    This post-trade system has been increasingly used by asset servicing firms to help them offer outsourcing services. Harries says Fiserv supplies seven out of the top ten asset servicing providers. For the end client, on the other side of the asset servicer, Harries explains, either people don't want to have those functions in-house, seeing themselves as money managers, or there's a skills gap. Some firms may also outsource by asset class, he adds. So while they might be happy to run their equity operations, they may not want to run their derivatives. So they may not outsource everything; it may be a particular asset class or capability.

    Geoff Harries is vice president of product strategy at Fiserv.


    New architecture in light of regulation

    Regulatory compliance has had, and will continue to have, an effect on IT spending by financial institutions.

    The Financial Services Authority (FSA) recently demanded that banks spend nearly GBP1 billion on IT upgrades as part of an extensive overhaul of the Financial Services Compensation Scheme (FSCS), which insures individual savings accounts worth up to GBP50,000 in the event of a collapse. The proposed investment is designed to ensure that banks will be equipped to provide a full list of each customer's deposits within two days of a potential collapse, thus ensuring savers can be reimbursed by the FSCS within one week. The central idea is the creation of a single view of each customer where all information - mortgages, savings, credit cards - is fully accessible and in one place. While this is certainly an admirable goal for the future, in today's world the information is likely to be spread across multiple divisions which may even use systems that are unable to speak to one another.

    As a result, many institutions are not currently in a position to facilitate this single view. However, there are steps that can be taken today that will position companies to prepare for the single view that will be required in the future.

    In the context of existing compliance laws such as Sarbanes-Oxley (SOX) and the Data Protection Act, the FSA's proposals underscore the need to invest in data systems that accurately record and structure data so it can be accessed in real time for compliance reports. A research report recently commissioned by DataFlux, a leader in data quality and data integration technology, and conducted by BDRC, an independent research company specialising in the UK financial services sector, points to a developing trend of new regulation.
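The single customer view behind the FSCS proposal can be sketched as a small aggregation. The divisional systems, customers and balances below are invented for the illustration: deposit records held in separate systems are collapsed into one total per customer, which can then be capped at the insured limit.

```python
from collections import defaultdict

# Hypothetical divisional records: (customer id, product, balance).
# In practice these would sit in separate systems that cannot talk to
# each other - the point of the single view is to bridge them.
MORTGAGE_SYSTEM = [("cust-001", "offset_account", 12000.0)]
SAVINGS_SYSTEM = [("cust-001", "easy_saver", 41000.0),
                  ("cust-002", "isa", 9000.0)]

FSCS_LIMIT = 50000.0  # insured per individual under the scheme described above

def single_customer_view(*systems):
    """Collapse per-division deposit records into one total per customer."""
    totals = defaultdict(float)
    for system in systems:
        for customer_id, _product, balance in system:
            totals[customer_id] += balance
    return dict(totals)

view = single_customer_view(MORTGAGE_SYSTEM, SAVINGS_SYSTEM)
insured = {cust: min(total, FSCS_LIMIT) for cust, total in view.items()}
print(insured)
```

The aggregation itself is trivial; the real cost the FSA's GBP1 billion figure reflects is in reconciling customer identifiers and data quality across systems so that "cust-001" means the same person everywhere.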


    The report revealed that 91% of financial institutions surveyed expect further regulation in the near future, with particular focus on security, accuracy and quality of data: "It is going to be ID fraud mainly, and continued maintenance in order to keep data as accurate as possible. Also making sure customers' information is as up to date and accurate as possible in order for customers to have confidence that what we hold is correct." - Verbatim response, data quality manager commenting on data challenges in 2009.

    In such a climate it is vital that financial institutions equip themselves appropriately to handle their data. The same survey pointed out that although 73% of respondents cite compliance as the primary motive for investment in data management, meaning that its importance is central to the legal success of the bank, the responsibility for maintaining data quality is often scattered across departments - 16% of institutions surveyed didn't assign responsibility at all. Where there is no direct responsibility held for managing data quality, it is difficult to be hand-on-heart compliant.

    Financial institutions are facing a tough climate and are eager to rein in spending and maintain existing capital, but data really should be an area that attracts investment. Clean, accurate data is an invaluable strategic asset and should be treated as such, rather than being allowed to degrade and develop into a liability. In order to prevent data becoming a liability, banks must ensure it is trustworthy. This requires a comprehensive data governance scheme. The aforementioned report found that 45% of companies had either already implemented or were in the process of implementing a data governance project and, most encouragingly, a further 19% were considering it. For data governance to be effective, it requires a combination of people, processes and technology.

    The benefits of accurate and accessible data are far-reaching. It is a key enterprise asset, not only in terms of compliance but also in providing crucial information to facilitate targeted sales. Without the single view that the FSA is pushing for, banks often miss opportunities to provide personally tailored help and advice or, worse, approach the wrong customer with the wrong advice and further damage their customer relationships. Data also enables revenue generation. The UK financial services sector, for example, sells on average just 2.5 financial products to each customer. This is below European and US levels and demonstrates the need for improved cross-selling of products.

    If financial institutions are prepared to spend time and money on bringing IT systems in line with FSA demands, they will gain a significant competitive edge over those whose approach remains stilted. They will breed confidence within their current customer base through the knowledge that the FSCS will be able to reimburse them within a week. This aim resonates with the challenges our survey respondents are preparing for over the next 12 months: "I would say security of data is key. If you can hold data securely you'll be trusted by clients; otherwise you'll lose clients on the basis of trust." - Verbatim response, data quality manager commenting on data challenges in 2009.

    Good quality data can be used to identify business risk or for business intelligence analysis. All these benefits occur as valuable side effects of compliance with the FSA proposals. In addition, and perhaps most importantly, with animosity towards banks at a high and confidence at a low, data that is accurate and accessible may enable banks to begin to rebuild trust in increasingly fragile customer relationships. It is likely to prove a sound investment.

    Colin Rickard, Managing Director,North & West Europe, DataFlux



    Panel Debate

    The 2009 Panel Debate

    MARTIJN GROOT: director of market strategy, Asset Control

    COLIN RICKARD: managing director, West and North Europe, DataFlux. Colin has 15 years' experience in the construction and implementation of data warehouses, CRM systems and managed services in both the UK and US markets.

    RICHARD STUMM: vice president, Business Development, Broadridge Financial Solutions

    BOB CUMBERBATCH: European business lines director, Interactive Data

    GERT RAEVES: head of marketing and business solutions, GoldenSource


    1. Research shows a prospective decline this year in the amount hedge funds, as one investment vehicle, will be spending on information technology. How can you maintain and grow business volumes on your side when cost cutting is to occur even in the middle and back office departments that need the best data?

    GROOT: Hedge funds' assets under management (AUM) are taking a hit due to redemptions and declining asset values. Since their revenue model is based on AUM, funding for any project will be harder to find. On the other hand, hedge fund strategies that do not depend on leverage will survive and some hedge funds will prosper from the current turmoil. Those hedge funds that are in a position to exploit the opportunities that the current low valuations bring will do well.

    RICKARD: Although overall IT spend is predicted to fall, investment in data quality is very strong at the moment. This is primarily due to two trends that have recently emerged: firstly, companies have shown an increased interest in data risk mitigation, as they realise that data risk is a business risk. For example, the ability to spot risky loans depends on accurate, real-time data. Secondly, we've seen banks realise that data investment has a very short payback period and a sound financial case - it's cheaper to standardise databases through consolidation than to keep paying for the maintenance of disparate, outdated legacy systems.

    RAEVES: There are strong competing and mutually exclusive forces at work in the market. On one hand, the ongoing fight for survival means that many individual firms and - in the case of the hedge fund industry - entire market segments have little room left for any initiative that does not have an immediate impact on the bottom line. On the other hand, the ongoing market turmoil shows that a clear and consistent understanding of critical investment data is vital and no longer just a luxury.

    The good news is that data management can be applied to the targeted areas that firms want and, in many cases, need to improve immediately, without stepping into the big vision EDM projects that may be too ambitious this year. This includes counterparty credit risk monitoring, pricing and valuation processes for common equities as well as for OTC derivatives, and front-end corporate actions automation. All of these are difficult to accomplish without a sound data management structure in place.

    It is also the case that data management practices are in need of improvement worldwide. While the New York/London axis has taken on much of the post-turmoil impact, we have been doing very brisk business in Central Europe, Asia and Africa in 2008. EDM has gone global - and as a supplier to that market we follow that interest wherever it occurs.

    CUMBERBATCH: While there will be a desire within financial services firms to cut costs in today's difficult business environment, the reality is that the fallout from the credit crunch demands firms invest selectively in more data services rather than less. While the overall budget may be under siege, firms are more likely to have to reallocate existing budget to those data services that meet new business requirements rather than deploy a static spend.

    There are a number of areas that will require financial services firms to reassess how they will deploy their information spend over the coming year. International regulators have identified shortcomings in liquidity risk management practices, not least in the application of stress testing under extreme liquidity events. Liquidity risk management is now high on the regulators' agenda and is fast becoming


    a key issue for investment firms. Firms will have to be able to collate, on an on-demand basis, a wide range of data relevant to all assets, liabilities, contingent assets/liabilities, derivatives positions and other off-balance sheet activities.

    Counterparty risk monitoring has also become a significant part of overall business operations. One of the major drivers for heightened attention to managing counterparty risk is that financial services firms - be they hedge funds, asset managers or sell-side institutions - need to avoid the negative impact on a firm's operations should a key counterparty default on their obligations. At the same time, the volatility and the lack of liquidity in the markets have contributed to the demand for independent evaluations of a wide range of securities, as financial institutions are seeking information to help ensure that they can effectively value their wide-ranging, often complex, portfolios.

    So while budgets will be under pressure, there is a compelling need for data sets addressing key business issues such as business entity data for counterparty risk management and independent evaluations - particularly for hard-to-value assets - combined with analytics to help clients manage the content required for investment applications, as well as meet regulatory requirements to facilitate risk management.

    STUMM: We believe that a managed service is the most cost-effective solution for any firm. As firms look to spend wisely, we believe that firms can save money by moving commoditised functions to a qualified provider that has the proven ability to support their growth objectives. Firms can, therefore, spend their precious IT dollars on those things that will differentiate them in the marketplace.

    2. One subject of discussion at last year's FIMA conference was the growth in vendors with a mutual client collaborating, with Omgeo, SWIFT and others cited as examples. Is this an area that you believe will grow as security and risk management become the top two issues for these banks?

    GROOT: This will certainly grow, as there are many vendors looking to expand business opportunity in a shallower pool. In some cases, more structural partnerships that offer complementary services will emerge, and some of these may lead to industry consolidation. Ultimately, it is the customers who will decide which partnerships bring lasting value and which do not.

    RICKARD: Certainly DataFlux is partnering with data content providers where relevant, and we work regularly with systems integrators on large projects.

    RAEVES: People want to share risk more than ever, so all forms of shared governance and mutualisation have obvious attractions to an industry looking for mitigation. Yes, there will be a growing desire to explore the potential for utility infrastructures for data management, but this will likely remain an intellectual/political debate for the time being. Collaborative efforts are often preceded by years of development; SWIFT and Omgeo, for example, have been building their communities for decades.

    CUMBERBATCH: At Interactive Data, we have a track record of working with other vendors and service providers to help them to address problems and to develop opportunities. Interactive Data has partnered with numerous firms to bring to market innovative new services, including our award-winning ISO 15022 corporate actions and class actions data. Working with software vendors, we aim to ensure that our data not only flows seamlessly


    into these third-party applications, but also that the data is correctly represented in these applications, thus helping to reduce the risks and timescales for firms wishing to implement an automated solution for corporate actions processing and aiming to reduce their operational risk.

    In the business entity arena, Interactive Data formed alliances with Avox and CounterpartyLink to provide business entity data services designed to help firms manage counterparty risk and to comply with new financial market regulations such as UCITS III, Basel II/Capital Requirements Directive (CRD) and MiFID. Our credit default swap (CDS) valuation service was developed through our strategic alliance with Markit Group Limited, and our recent exclusive agreement with Prism Valuation enables us to offer clients valuations of highly complex OTC derivatives and structured products as part of our wide-ranging pricing and evaluation services.

    STUMM: We believe that strategic partnerships can help to solve industry needs in a timely fashion. The Broadridge Global Reference Data Solution (GRDS) is, in fact, a joining of forces to provide a solution that includes Broadridge, GoldenSource and IBM. It is important to choose the right partners and, in the case of our GRDS offering, firms that are best-of-breed in financial services outsourcing, enterprise data management and systems integration.

    3. Is the choice between outsourcing much of the data management to external providers, and developing a system in-house, a matter of the size of institution? What other factors might be involved?

    GROOT: Firms outsource for different reasons and it is not purely a matter of size. Small firms often outsource because they do not want to carry the full cost of an infrastructure. In the case of large firms, there are strategic considerations as to whether they want to do all of the data management in-house. In this case it will depend on whether they view data management as a value-add to their service offerings, which also depends on their product, volume and geography mix. For some financial institutions in the asset servicing and custody world, the ability to construct added data services is a core part of their businesses.

    RICKARD: The obvious factor in the outsourced vs. in-house data management debate is one of cost, but the factor of control must also be considered. Obviously, it costs more to outsource data, and companies will also have less control over outsourced data than if they managed it internally. We also typically see that the people who have the clearest ideas around how to organise and control data are the people within the companies who work most closely with the databases. In these cases all the expertise is in-house, so it makes sense to use that expertise to your business's advantage. We use the terms undisciplined, reactive, proactive and governed to describe how mature a company's data strategy is. For a less mature, or reactive, organisation, outsourcing may be the best short-term solution, at least until the business formulates a more sophisticated data governance programme.

    RAEVES: In theory, everyone likes the idea of outsourcing data management. Most firms rely upon similar or the same sets of data from the usual providers, and much of this data is neither proprietary nor offers immediate competitive benefits. With such a sizeable amount of common data, it is only natural that people ask: why are we all doing the same thing in our own shops, when we could just connect to a shared service?

    The big problem is the lack of precedent. There are hardly any examples of successful


    outsourcing of reference data, often because providers did not always have the right pedigree or architecture in place. GoldenSource chose to partner with Broadridge to offer a managed data service precisely because they had the proven track record and infrastructure necessary to deliver our platform to a global client base.

    2009 will be an interesting year for outsourcing providers, as more firms than ever before are being ordered by boards and investors to shed non-core activities. The question to me remains: can managing data ever be considered non-core in our industry?

    CUMBERBATCH: A number of factors can determine whether a financial services firm will look to either buy a data management system, build their own internally or look to outsource to a third party.

    In considering outsourcing, financial institutions need to look at their own specific circumstances when weighing up the advantages of working with an external data management provider. Outsourcing entails replacing internal staff and processes with more effective external staff and processes. However, before undertaking what can be a complex business relationship with a third party, firms need to understand that while the day-to-day management of data is with a third party, responsibility


    for quality still lies with the firm.

    While it is alluring for the day-to-day problems of data management to be outsourced, organisations need to determine who defines the client and product descriptions; who knows of the correct relationship between client, counterparty and organisation; who resolves the conflicts among these data sources; and, more importantly, who owns the data? Settling data governance policies and practices is an enormous, corporate-wide undertaking. Without business involvement in data management, the project can often be doomed from the start.

    Unlike other IT issues related to data management, data and its management are not merely infrastructure. The business impact of data is direct and its monetary benefits are measurable. Given that the language of business is not the language of data management, the challenge lies in translating how data management should create clear business benefits.

    STUMM: The outsourcing versus in-house question is not related to the size of the institution. It is more related to solving a business need that is driving a line of business to adopt a high-quality and cost-effective solution. Another factor is firms being cautious about outsourcing data related to proprietary information. We feel that with the right provider - such as Broadridge, which is ISO 27001 certified - a firm can actually achieve the goal of enterprise data governance by utilising a single outsourced provider.

    4. Information is only one component of knowledge, hence the argument for sufficient data quality. But where can the line be drawn between receiving multiple data feeds to - for example - assess a counterparty and its products, and simply having too much information?



    GROOT: Having multiple feeds brings benefits in terms of independence, quality improvement by being able to cross-compare data products, and complementary strengths in geographies, products and business continuity. When adding additional feeds, firms have to ascertain whether the information is going to be used on a regular basis, such as a basis for a golden copy, as an additional check to validate data, or as a unique, additional information source. Institutions that centralise incoming data feeds tend to do more with them in terms of cross-comparison, quality checking and feed integration. In addition, they are in a better position to insulate their applications from often frequent vendor changes to provide greater continuity. In turn, centralisation also highlights unused data assets, which can help organisations to assess usage and gain efficiencies.
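Using an additional feed as a validation check can be sketched as a simple cross-comparison. The vendors, instruments, prices and tolerance below are assumptions made for the illustration: prices from two feeds are compared and outsized differences are flagged for review rather than silently accepted into the golden copy.

```python
# Hypothetical prices for the same instruments from two vendor feeds.
VENDOR_A = {"XS0001": 99.85, "XS0002": 101.10, "XS0003": 97.40}
VENDOR_B = {"XS0001": 99.86, "XS0002": 104.75, "XS0003": 97.40}

def cross_compare(feed_a, feed_b, tolerance=0.005):
    """Flag instruments whose prices diverge by more than `tolerance`
    (relative to the mid-point of the two quotes)."""
    exceptions = []
    for instrument in sorted(set(feed_a) & set(feed_b)):
        a, b = feed_a[instrument], feed_b[instrument]
        if abs(a - b) / ((a + b) / 2) > tolerance:
            exceptions.append(instrument)
    return exceptions

flagged = cross_compare(VENDOR_A, VENDOR_B)
print(flagged)  # instruments needing manual review
```

Small discrepancies pass; the instrument with a material divergence lands on an exceptions queue, which is the "additional check to validate data" role described above.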

    RICKARD: All data is useless, no matter how great or small the quantity, if it is not organised properly or cannot be trusted. Businesses are nearly always data-rich in terms of quantity, but when it comes to quality, the problems around accuracy can leave a company without a true picture of what is going on. Organisations need the correct data quality processes to ensure reports provided from business intelligence (BI) systems can be relied upon to provide the appropriate information to senior decision-makers.

    RAEVES: Only the data consumer can be the judge of that. In this context, the level of control and configuration that a data management solution offers becomes a crucial differentiator. If necessary and right for the business, the solution has to be able to support an avalanche of data. In addition, the filtering and selection of data needs to be simple, to allow a business user to fine-tune the flow of data to suit his or her information requirement. No business will allow a one-size-fits-all data management model.

    Achieving this is simpler than it sounds, as smart routing, filtering and request-based processing have all become standard in a post-middleware, service-oriented architecture.

    CUMBERBATCH: Data quality is not a single metric and not all data is created equal. When looking at a low-latency feed the requirement is all about speed, while for front office trading purposes details about a bond may only require a minimum subset of the available attributes, such as name and coupon rate, but much greater detail will be needed for settlement in the back office. There are a number of dimensions, such as accuracy, timeliness, completeness and reliability, which need to be considered when talking about data quality. The first thing to understand is what data (the building block of information) is needed for what purpose, and to qualify a trusted data supplier for that data. Where no single provider can supply all the required data, a combination can be created from multiple suppliers. Indeed, financial services firms often have to have at least two independent sources of data to help ensure accurate and reasonable portfolio values.
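    The point that completeness depends on the consuming use case can be made concrete. The field lists and the sample record in this sketch are illustrative assumptions, not attributes named in the article; the idea is simply that the same record can pass for the front office and fail for the back office:

```python
# Hypothetical attribute requirements for two consumers of bond data.
FRONT_OFFICE = {"name", "coupon"}
BACK_OFFICE = {"name", "coupon", "maturity", "settlement_ccy", "isin"}

def completeness(record: dict, required: set) -> float:
    """Share of the required attributes that are actually populated."""
    present = {field for field in required if record.get(field) is not None}
    return len(present) / len(required)

bond = {"name": "5Y Gilt", "coupon": 4.25, "isin": "GB00B0000000"}

print(completeness(bond, FRONT_OFFICE))  # 1.0: fit for trading
print(completeness(bond, BACK_OFFICE))   # 0.6: not fit for settlement
```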

    In today's risk and regulatory environment, firms need to be able to seamlessly compare any differences in data

    While day-to-day management of data is with a third party, responsibility for quality still lies with the firm

    Bob Cumberbatch, Interactive Data


    Investor Services Journal | Data Services Market Guide 2010

    between vendors. Therefore, rather than too much information, a clear understanding of what data is required for what purpose can enable users to purchase the data they need rather than oversubscribe.

    STUMM: We believe the key is having at least two points of reference for comparison purposes. By simply having a point of comparison, you can begin to evaluate quality in a proactive manner; otherwise, the lack of quality is found after the damage has been done. The other point to be considered is automation. Having the ability to take in multiple sources of information and to compare and contrast them in an automated fashion will allow you to make intelligent decisions as to which the appropriate sources are, and will enable you to get the highest quality data. Without automation, this critical comparative process can seem like too much information.
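    One automated form of the comparison described above is simple voting across sources. The sketch below (with invented coupon values) shows why a third source helps: two sources can only reveal a discrepancy, while three can also suggest which value to trust:

```python
from collections import Counter

def arbitrate(values):
    """Return (candidate, agreed): the most common value across the
    sources, and whether a strict majority of sources agree on it."""
    if not values:
        return None, False
    value, count = Counter(values).most_common(1)[0]
    return value, count > len(values) / 2

print(arbitrate([4.5, 4.5, 4.5]))   # (4.5, True): all sources agree
print(arbitrate([4.5, 4.5, 4.75]))  # (4.5, True): majority outvotes one
print(arbitrate([4.5, 4.75]))       # (4.5, False): discrepancy, no majority
```

    A production version would log every disagreement for review rather than silently accepting the majority, but the voting step itself is this small.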

    5. How far is the data management world from metrics that will thoroughly measure the effectiveness and quality of enterprise data management?

    GROOT: There have been some efforts around constructing a data quality index to provide a snapshot of overall quality. The trouble is, one would need to know that something is wrong. In addition, different stakeholders have differing points of view on quality aspects such as speed, accuracy, completeness, transparency and auditability, which complicates matters. Requirements on data quality differ depending on where the data is consumed in the trade lifecycle (pre-trade and research, execution, clearing and settlement, risk management, asset servicing, financial reporting, etc.). The first step towards quality, and a precondition to measuring any quality level, is establishing a common nomenclature to ensure everybody is on the same page and can compare apples to apples. Metrics drawn from, for example, the world of supply chain management, which highlight quality issues and potential operational risks, can follow from that.

    RICKARD: The industry is already there. In fact, the technology exists today to monitor and report on the accuracy of data over time. Financial services organisations should be asking themselves if they are treating data like other important corporate assets and reporting on it at the board level. There are two types of metrics that can be used to measure this: direct return on investment (ROI) and the lesser-used "is my data quality improving over time?" question. The importance of ROI has been demonstrated when an organisation knows how much it has saved on procurement through having a single view of its supplier data, or how much has been saved in labour costs due to the automation of the data quality function.

    However, organisations also need to be asking "is my data quality improving over time?". It is through ongoing data monitoring that organisations can see whether a data quality implementation is aiding their business; for example, can they see that their customer address database contains 30% fewer inaccuracies than a month ago? It is important to remember that there is no "once and done" formula for a data quality programme: the system needs to be constantly measured to ensure bad data doesn't sneak back in and to ensure your business rules are operating efficiently. Business rules can always be modified to fit changing priorities; they can be

    The ability to spot risky loans relies on accurate, real-time data

    Colin Rickard, DataFlux

    Panel Debate


    aligned against regulatory requirements, for example, but you'll never know how to fit your data to your requirements unless you look at the metrics.
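    The "is my data quality improving over time?" question reduces to a trend metric. The figures below are invented; the 30% reduction simply mirrors the customer-address example in the answer above:

```python
def error_rate(inaccurate: int, total: int) -> float:
    """Fraction of records found to be inaccurate in a period."""
    return inaccurate / total

def improvement(previous: float, current: float) -> float:
    """Fractional reduction in the error rate versus the prior period."""
    return (previous - current) / previous

last_month = error_rate(inaccurate=2_000, total=100_000)  # 0.02
this_month = error_rate(inaccurate=1_400, total=100_000)  # 0.014

print(f"{improvement(last_month, this_month):.0%}")  # 30% fewer inaccuracies
```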

    RAEVES: The industry as a whole is pretty far away from a common or agreed set of metrics, as there is no such thing as a generic business case for data management. Firms start on these projects for very different reasons, and it is often a case of comparing apples and pears when people share their experiences on funding and ROI for data projects.

    Individual firms, however, have become much more adept at creating data management business cases by understanding the many impact sites and stakeholders that need to be included from an ROI aspect.

    CUMBERBATCH: While some progress has been made in developing metrics for data management, there tends to be a focus on technology rather than the business value of the data. Organisations can often suffer from bad data for reasons that have nothing to do with technology. Among the causes of poor-quality data are inaccurate reporting, internal discrepancies over which data is appropriate, and incorrect definitions rendering the data unusable. Improving the quality of their data requires firms to address any internal discrepancies and broken processes. Staff must agree on exactly what constitutes a customer or a counterparty and how to resolve any discrepancies across business units. Departments and divisions need to agree on hierarchies of customers and products and how to resolve duplicate records across sources. Rather than a technology-focused effort, metrics need to be developed that address the business value of data.

    6. How will these metrics help the case for a more detailed consideration and higher budgets towards data management in tomorrow's financial institutions?

    GROOT: Budgets can either be set at a departmental level for local projects, or can be raised through a data tax across profit centres, such as an overhead charged as a fixed percentage or based on actual usage. Metrics on data quality will help justify and allocate budgets to specific areas, while metrics on usage and value-add of specific data sources will help allocate cost. The current turmoil in the financial services industry will speed up consolidation within asset management and banking, which in some areas is still quite fragmented. With mergers and acquisitions comes integration and the need for a scalable information infrastructure, serving as the only basis on which synergies and cost savings can be made. In this case, metrics can also help.

    RICKARD: Central to setting metrics and building a case for increased investment in data management is starting the process with a thorough audit of an organisation's data at the outset. It is impossible to show improvement and benchmark progress without this step being done properly. Using this information, organisations can prove how investments in data quality are improving key performance indicators (KPIs), helping to demonstrate the business case for data governance. ROI metrics are particularly useful for those in a bank who are responsible for data governance projects. Demonstrating a quick ROI for a smaller tactical project can often help data professionals to gain funding to roll out wider schemes across the whole group.

    Banks should plan a two-pronged approach to their IT finances, looking firstly at their compliance needs and taking into account an emerging trend for more regulation in the financial sector. The FSA, for example, has recently announced proposals to ensure banks can provide up-to-date information both on their customers and to their customers in respect of the FSCS (Financial Services Compensation Scheme). The thinking behind this move is that in the event of a bank failing, the high accuracy of its data would enable rapid payment of compensation to savers, which in turn will drive confidence in the banks. Existing issues such as international watch lists must also be considered. Compliant organisations are those with trustworthy, accurate data. The second prong is to examine how you can cut back-office costs; a good example is call resolution in the customer call centre. A customer query answered in one call not only makes the customer happy, it also saves the operator's time and saves the organisation money. This kind of lateral thinking is one way that banks can instigate successful and shrewd data governance programmes.

    RAEVES: It is impossible to generalise about business case and ROI. Building a successful business case means asking a lot of questions; unfortunately, the answers will always be different for each institution.

    I would suggest the challenge is getting the right people behind the business case rather than proving it through metrics. If a potential project is driven from a group of data management specialists or IT staff, it can be a struggle. We have seen more success when CROs and CFOs support data management projects, because they can more easily reach across business lines and identify shared data as a corporate asset.

    CUMBERBATCH: The key consideration is for metrics that show data to be accurate, consistent, timely and comprehensive, which ensures trust and confidence that the data is fit for purpose. Demonstrating that key core data is fit for purpose from trusted suppliers is the essential component in building the business case for a data management budget. For the business, improvements in data quality are all about improving business performance, enhancing customer experience, reducing operational costs and helping to ensure regulatory compliance.

    STUMM: It may not be metrics that drive higher budgets in the data management space. It is more likely to be the necessity of meeting a firm's risk management and reporting requirements.

    With mergers and acquisitions comes integration and the need for a scalable information infrastructure

    Martijn Groot, Asset Control

    Firms can save money by moving commoditised functions to a qualified provider

    Richard Stumm, Broadridge


    Cost

    Data - bettering the bottom line

    1 Price Impact of Correct and Centralised Data

    1.1 Automation Projects and Data Projects

    Prices of services in the financial industry depend on production cost and volume. Lower prices can only be reached by higher volumes and less cost. Increasing volume requires a higher degree of automation. Therefore investments in automation are required in most cases in order to lower the prices of services. It is obvious that automation critically depends on the availability of accurate data in electronic format. This is particularly true for reference data (data describing the nature of financial instruments and the environment they are traded in). Without correct data, automation projects will simply fail.

    Investments in automation projects usually don't pay if they have to finance an associated data management project. If not all automation projects are to fail, due to unavailability of data or due to the inability to support an associated data management project, the decision on data management projects needs to be separated from the decision on individual projects using the data.
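    The price argument can be illustrated with a toy unit-cost model (all numbers invented, not figures from the article): automation raises fixed cost but cuts the marginal cost per transaction, so at sufficient volume the unit cost, and hence the price, can fall.

```python
def unit_cost(fixed: float, variable: float, volume: int) -> float:
    """Average cost per transaction: fixed costs spread over volume,
    plus the marginal cost of each transaction."""
    return fixed / volume + variable

volume = 100_000
# Manual processing: little fixed cost, expensive per transaction.
manual = unit_cost(fixed=50_000, variable=10.0, volume=volume)     # 10.5
# Automated processing: heavy fixed cost (systems plus data), cheap
# per transaction. The data management investment sits in `fixed`.
automated = unit_cost(fixed=500_000, variable=1.0, volume=volume)  # 6.0

print(manual, automated)
```

    At low volumes the inequality reverses, which is the article's point: the automation and data investment only pays once the data serves enough projects and volume.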

    Andreas Glatter (pictured right) at Econ explains the cost argument for centralising data and improving accuracy


    1.2 Data Centralisation

    As a consequence, data management projects are only likely to be justified if data is centralised and serves a large number of other projects and applications. Centralisation in data management is desirable for other reasons too:

    - Centralisation lowers the costs of data acquisition and data maintenance, since infrastructure and manpower are required only once;
    - Centralisation increases the bargaining power in negotiations with suppliers and tends to lower the price of data;
    - Centralisation lowers the number of interfaces required;
    - Centralisation ensures consistency of data across all applications.

    Decisions on data management projects require a strategic view on where the institution should go in terms of services provided, risks associated with the services, flexibility of the offering, and so on, and on what data will be required to support all aspects of the operation. Management needs to be aware that the availability of the right data is crucial to the future development of the firm.

    Nobody can tell exactly the future value of data, but it should be obvious that future developments will require more data, not less. Therefore it is not the exact savings in certain areas, or other metrics, that are decisive, but the strategic perspective of the firm.

    Management has to be aware that thorough data management is crucial, requires a complex infrastructure to be built up and maintained over time, and needs time to establish.

    The more flexibility is desired, the more data has to be readily available. You can only decide to move into another business area if you can make the necessary data available in due time. This in turn requires an infrastructure capable of supporting such moves. Thus there is a correlation between strategic and operational flexibility and investments in the data management infrastructure.

    This is widely accepted in the area of market data, and for many projects in the past only market data and little reference data was needed. Management is now becoming increasingly aware that the same is true for reference data, since all the more sophisticated automation processes in the areas of securities processing, risk management or evaluated prices increasingly need reference data.

    1.3 Outsourcing as Extension to the Centralisation Concept

    If centralisation of data management activities is the right answer for single institutions, the natural extension of the centralisation concept is outsourcing. Why should all financial institutions make the same efforts to get the data right instead of pooling resources and getting the same

    Centralisation lowers the cost of data acquisition and maintenance since infrastructure and manpower is only required once


    output for a fraction of the price?

    In a sense, data vendors have always acted as outsourcing partners to the financial community. In fact, several of today's data vendors (e.g. Telekurs) were explicitly created as outsourcing projects. Today their clients face the same problems as clients of data vendors with different backgrounds. So, if data vendors do not bring the solution, how could an intermediate layer between data vendors and user institutions bring about a better solution?

    There are a number of problems specific to data management outsourcing that have to be overcome in order to make outsourcing projects successful.

    First, it is very hard to ensure data quality without using the data. You have to see what doesn't work when using the data in order to find out what is wrong with the data itself. Therefore outsourcing projects in data management should be accompanied by other outsourced processes, e.g. corporate action processing, evaluated prices projects or similar.

    Second, external data management for multiple entities faces a dilemma. The external supplier tends to provide the data a large number of users are willing to pay for and to neglect data required only by a few users. The supplier has no incentive to go beyond what is stipulated in the SLA just because a user urgently needs to solve a particular problem; on the other hand, for the user it is critical to have those few data items right now, regardless of what the SLA says. Both effects tend to degrade the services of each user institution, because the user services won't get the right finish this way.

    Third, requirements for additional data are usually communicated more easily internally than externally, because the decision chain is much shorter and involves neither an external hierarchy nor a complex contractual framework (even if internal SLAs are in place). Since data management is a very dynamic issue, this is an important consideration.

    For all these reasons it is difficult to make outsourcing successful in the area of data management. It is most likely that what data vendors provide is what can be achieved by outsourcing for a larger number of customers. To provide an intermediate level of service that results in mutual benefits to all parties involved is a challenging task.

    The same reasons that make it difficult to outsource also make it difficult to centralise data management internally. This partly explains why firms organised in silos, with largely independent hierarchies, have much greater difficulty establishing central data management than others. It also explains why not all data in an organisation is ever centralised, and why local spreadsheets and applications with proprietary data will always coexist with centralised data. It is a constant challenge to decide what data is to be centralised and what is to be kept locally.

    1.4 Conclusion

    - Cutting prices of services will require higher automation and more data. More data can best be acquired if data management is centralised, from a technological standpoint and for cost reasons.
    - Centralisation has inherent problems that can more easily be overcome internally than between independent organisations. Even in single-hierarchy organisations, centralised data will always coexist with local data. The challenge is to find the right balance.
    - Outsourcing projects should carefully identify the data suitable for outsourcing, and continue to manage internally the data which is not.

    Future developments will require more data and not less


    SaaS

    Time changes everything. What was once state of the art becomes a cumbersome nuisance at some point. The explosion of activity that occurred in the 1990s, when hundreds of small software companies built applications to sell to asset managers, allowed a big increase in efficiency. Processes that had taken weeks were reduced to hours or minutes. New processes that were impossible dreams became a reality. This was all possible because of Windows and Visual Basic, and the process was not unique to asset managers but affected every business sector. By comparison, the dot.com boom did not reach the business community but was aimed at the retail market, and when it busted prematurely, development slowed.

    Today the IT landscape of the asset management industry is one where any company of any size has 10 to 20 important systems upon which it depends for a variety of essential services. The majority of these systems are provided as software installed on the company's own servers in its own IT centre. It will probably run many other applications of less importance as well. The result is large IT teams, complex processes and relatively high costs.

    Some companies have tried to outsource the whole thing to a third party such as a custodian or IT services provider. However, there are many horror stories that warn this is not a simple solution. In fact, the rule of thumb is that if your IT department works well and efficiently, you can outsource it, but if it is inefficient you can't. The outsourcer cannot be expected to unravel your mess.

    What has worked, however, is selective outsourcing. This typically means getting the supplier of your system to host their own application. This makes sense, as the supplier ought to know how to support their own product, and if they do this for all their clients, they can get economies of scale which a generalist outsourcer cannot hope to achieve.

    The other factor is that Web 2.0 has unleashed a new revolution in IT, making it possible for suppliers to provide Software as a Service (SaaS) solutions over the Internet

    Time for SaaS?

    Justin Wheatley, CEO of StatPro, tracks the speed of change in data and technology provision to asset management, and identifies the importance of Software-as-a-Service


    and with relatively low cost. Incumbent suppliers are reluctant to adopt this new technology as they fear cannibalisation of their existing business and because they calculate that, such is the complexity of the processes that surround their embedded systems, clients will be loath to move anyway. That leaves the door open for start-ups, but this generation have a much bigger battle than the class of '95, as the standards and functionality they have to match are significantly higher and the gains in efficiency are less about time and volume and more about money. Clients want the same for less rather than more for the same.

    This is the thinking behind StatPro's own strategy. We believe that eventually all our clients will want to access their services over the web, just in the way that we all use email rather than a fax (or telex, or telegram). Indeed, email is fast being superseded by instant messaging. However, we also recognise that clients cannot simply throw everything out and start again, but rather need to evolve in a sensible direction according to a plan that has been tried and tested. The last thing anyone wants to do is jeopardise their business for the sake of saving a relatively small amount of money.

    This means that we have focused on deploying new services and products that offer quick savings for clients whilst moving them towards the strategic objective of low-cost web-based applications.

    Many asset managers must be thinking about SaaS as a way to reduce their costs, but the obstacle they will encounter is the complexity of getting there, and of doing so for a wide number of applications.

    Clients need to evolve in a sensible direction

    Justin Wheatley, Chief Executive, StatPro Group plc

    Not worth the gamble

    David Stewart of Misys puts risk management in context

    In the wake of fraud at Société Générale, it's clear that risk management will remain a front-and-centre issue for a considerable period of time. What few expected was that risk would assume so critical a position so soon, especially one that would stretch beyond the trading world and into the broader economy. If Société Générale wasn't enough of a call for a broad review of risk practices, then surely the current financial crisis must be, whether that review is self-triggered or, as is now increasingly likely, imposed.

    The crisis, building on the implementation of regulations such as Sarbanes-Oxley, Basel


    The Risk Management Challenge

    II and others, is certainly changing the playing field with regard to risk practices. Th