
www.a-teamgroup.com

Enterprise Data Management

A-TEAM INSIGHT presents: Reference Data Review

Enterprise Data Management as an Enabler
By Virginie O'Shea, editor, A-Team Group

The very fact that a US Federal Reserve governor is championing the cause of data management is proof positive that the industry can no longer ignore the need for a more structured and standardised approach to its data foundations. In February, Fed governor Daniel Tarullo brought the issue of data standardisation to the attention of the US Senate during his testimony before the Subcommittee on Security and International Trade and Finance, thus kicking off a series of discussions about how the industry should tackle putting its data in order.

The regulatory community has become increasingly aware of the data management challenge within financial institutions, as it struggles with its own task of better tracking systemic risk across financial markets. As noted by Tarullo in February: "The recent financial crisis revealed important gaps in data collection and systematic analysis of institutions and markets."

To rectify these inadequacies, the US regulator is seemingly keen to kick off a standardisation process and also wants the regulatory community to begin collecting additional data in order to better supervise systemically important financial institutions. During his speech, Tarullo discussed the investments the Fed has made thus far to be able to better monitor the markets by evaluating current data sources and adding new sources. This investment should be extended, he suggested, to the entire data arena by establishing a new standalone independent data collection and analysis agency to serve the regulatory community.

So, data is definitely on the regulatory radar and the Fed itself has been investing in its data infrastructure, including adding ex-Citi chief data officer (CDO) John Bottega to its ranks. If data collection and aggregation is required for regulatory reporting purposes, then surely enterprise data management (EDM) is an obvious part of that endeavour?

New regulatory requirements and risk management challenges are cropping up across all of the financial markets and firms need to develop a structured approach to meeting these, or face sinking under their weight. For example, the Committee of European Banking Supervisors (CEBS) has produced a veritable cartload of risk management papers thus far this year, all of which could be used to support a business case for investment in EDM.

In a recent paper on risk management principles, CEBS stresses the need for reliability of risk data in order to allow for all sources of relevant risks to be "identified, measured, and monitored on an aggregated basis and also, to the extent necessary, by entity, business line, and portfolio". To be able to conduct enterprise risk management (ERM) in this manner, surely the underlying data infrastructure of these firms must be robust enough to support these requirements?
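To make that aggregation requirement concrete, here is a minimal Python sketch of rolling risk figures up on an aggregated basis and then by entity, business line and portfolio. The records, field names and figures are hypothetical, not drawn from the CEBS paper.

```python
from collections import defaultdict

# Hypothetical exposure records; in practice these would come from
# position-keeping systems across the firm.
exposures = [
    {"entity": "Bank AG", "business_line": "Equities", "portfolio": "EU-Core", "risk": 1_200_000.0},
    {"entity": "Bank AG", "business_line": "Fixed Income", "portfolio": "Govt", "risk": 850_000.0},
    {"entity": "Bank Ltd", "business_line": "Equities", "portfolio": "US-Growth", "risk": 640_000.0},
]

def aggregate(records, *dimensions):
    """Sum risk figures on an aggregated basis, optionally broken down
    by one or more dimensions (entity, business line, portfolio)."""
    totals = defaultdict(float)
    for rec in records:
        key = tuple(rec[d] for d in dimensions) or ("ALL",)
        totals[key] += rec["risk"]
    return dict(totals)

print(aggregate(exposures))                                 # firm-wide aggregate
print(aggregate(exposures, "entity"))                       # by entity
print(aggregate(exposures, "business_line", "portfolio"))   # finer grain
```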

Other more specific regulatory requirements, such as those around fair value accounting or the UK Financial Services Authority's Single Customer View (SCV) reports, to name just a couple, all individually contribute to the business case for EDM.

To elaborate further on one example, the push for transparency around valuations, which is part and parcel of the ongoing regulatory agenda to restore confidence in the market, is lending credence to the EDM agenda by asking firms to adopt a more transparent and structured approach to pricing data. This data is becoming ever more complex, and data management teams focused on this function are being asked to do even more with fewer resources at hand due to the recessionary climate, so this is where EDM comes into the picture.

Market participants such as Matthew Cox, head of securities data management in Europe for BNY Mellon Asset Servicing, have spoken to A-Team Group about the pressures they have been facing in the pricing space over the last year or so and how EDM fits into the picture. Banks have been faced with a boom in exceptions in the pricing process as a result of increased market volatility, and this has put undue pressure on limited resources. Furthermore, this pressure is likely to get worse: a reader poll at the end of last year highlighted that most firms are being faced with either a moratorium on hiring new staff or a decrease in staff.

In light of these pressures on staffing, Cox suggested during an A-Team Group event last year that firms could instead take a more "creative" approach to data workflow by adding in more automation and by dual sourcing data. He said that risk management has become much more of a driver for investment in market data feeds and automation projects, and this could be leveraged to get buy-in from senior management.
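As an illustration of the dual sourcing Cox describes, here is a minimal Python sketch of automated price validation across two feeds, with disagreements routed to an exceptions queue rather than handled manually. The vendor names, identifiers, prices and tolerance are all invented for the example.

```python
# Minimal sketch of dual-sourced price validation: prices from two
# (hypothetical) vendors are compared, and divergences beyond a
# tolerance are routed to an exceptions queue for analyst review.
TOLERANCE = 0.005  # 0.5% relative difference

vendor_a = {"XS0123456789": 101.25, "US9128285M81": 99.80}
vendor_b = {"XS0123456789": 101.30, "US9128285M81": 98.10}

def validate(primary, secondary, tolerance=TOLERANCE):
    approved, exceptions = {}, []
    for isin, price in primary.items():
        other = secondary.get(isin)
        if other is not None and abs(price - other) / other <= tolerance:
            approved[isin] = price                    # straight-through: no manual touch
        else:
            exceptions.append((isin, price, other))   # needs analyst review
    return approved, exceptions

approved, exceptions = validate(vendor_a, vendor_b)
print(f"auto-approved: {approved}")
print(f"exceptions for review: {exceptions}")
```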

According to an A-Team Group survey conducted in October, 78% of respondents had an EDM strategy in place and 86% of these respondents claimed they had included pricing and valuations in the overall plan. Given that valuations is just one area in which an EDM approach could possibly be of benefit, this is promising news for the strategy as a whole over the next couple of years.


The More Things Change, The More They Stay the Same
By Dani Newland, data management product manager, Eagle Investment Systems

A review of the last decade reveals an ever-changing list of data management challenges for the investment management industry. Ten years ago, as we said goodbye to Y2K fears, the industry began preparing for the looming compliance mandate known as T+1, trade date plus one day trade settlement. Along with T+1, the Securities and Exchange Commission (SEC) was pushing straight-through processing (STP) to be rolled out by mid-year 2004, as trade volumes continued to rise along with the level of risk for many organisations.

Before T+1 could gain real traction, the economy began a slowdown. With the tech crash and the devastating events of 11 September, T+1 efforts slowly faltered and eventually were indefinitely postponed. Then came the challenges of M&A activity and how to deal with complex and derivative security types. Overarching them all was the ever-present 'buy or build' question and, in some cases, evaluations for outsourcing and ASPs began entering the mix.

Today, after the market events in 2009 rocked the economy and industry, the emphasis is on risk mitigation and anticipated federal regulatory changes. Lessons about exposure were learned the hard way, so the need for firms to look through the layers of their business and to drill into their positions to truly understand their direct and indirect exposure has taken centre stage.

What is most interesting about the past 10 years is that while data management challenges have changed, the core tenets of the solution have remained the same. Firms that put a solid foundation in place with a proven EDM solution are the most agile in their response to changes in the industry. What does this mean? First and foremost, it means understanding that data management is a practice and not a one-time project. A quality EDM practice requires the ongoing attention of key business and technology personnel working closely with a carefully chosen and vetted vendor. It can be a differentiator for you in the marketplace. The market changes, investment management practices change in response, and your data management solution needs to change in support of your modified business model.

A core tenet of solid EDM is a thorough knowledge of your firm's current and desired business models. EDM solutions are designed to be configured to support the model, not to define it. To best deploy and maintain an EDM solution, you must conduct a detailed assessment of your needs and keep it current. Alongside this assessment, evaluate the resources you have supporting both the business and the technology models. Align these resources with the future state and hire where you do not have the best fit.

At this point it should go without saying that big bang is not a viable approach in any aspect of EDM. Working regularly with business and technology stakeholders prioritises needs and changes so they can be organised into tactical projects that deliver business solutions. Keeping projects and solutions visible and touting their business benefits, especially cost savings, will foster a culture that is open and supportive of change. Don't be afraid to change things in the existing model. Often workflows are really 'workarounds' that evolved in place of a technology solution.

Key to a solid EDM strategy is the technology solution supporting and sustaining it, so choosing the right solution and vendor is important. Top shelf vendors focus on accommodating data and processing for the spectrum of fund and instrument types and on providing workflows and exception-only processing to reduce operational costs and support T+0. Data and systems integration are second nature to a superior solution, including integration with multiple accounting or trading systems. The solution must be able to provide firm-wide views, whether assets under management, issuer or security exposure, or how performance is tracking to the business strategy, as well as provide the ability to look through complex fund structures, focus on a troubled region, or isolate the outperformer in a group of fund managers.

A solid technology solution must be backed by a tried and true vendor, measured by their experience and the likelihood that they will remain in strong standing for at least the next 10 years. Working with an EDM vendor should be assessed as a long term commitment. Being financially sound, with the ability to invest significant R&D dollars into their product year-over-year, is as crucial as the product quality.

The lesson is, while investment management challenges may change, the best practices of quality data management remain the same. Know your business and its strengths and weaknesses. Keep a disciplined focus on incremental improvements that will contribute toward an EDM strategy. Build a sustainable, flexible, best-of-breed infrastructure with a carefully chosen technology solution at the core. A solid EDM foundation is the best insurance for supporting your business strategy, reducing risk and ensuring internal and external compliance.


Attitudes to EDM Have Gone Full Circle
By Deirdre Sullivan, vice president of marketing, SIX Telekurs USA

Enterprise data management has been high on our industry's agenda for a very long time, and there is no question that EDM is relevant. But more significant progress hasn't been made due to the complexity and expense of developing an EDM strategy that effectively controls data acquisition and cleansing while still making it available to the business lines that consume it in a way that is effective for their individual needs.

The EDM conversations have varied over the years, starting with "EDM – this is great stuff and we need it NOW", but changing to "EDM, what does that REALLY mean?" or even to "EDM is way too big – we need to segment and create a golden copy for one data segment first". But in the wake of the extraordinary events of the last 18 months, it seems that we've come full circle, and the thinking is trending back around to "EDM – we need it now!"

During the height of the recent global financial crisis, there were more requests for data from interested parties, ranging from business unit leaders, clients and counterparties to risk managers, compliance officers and regulators. The frequency and ad hoc nature of these requests undoubtedly stressed IT infrastructures and processes, but also served to highlight the areas most in need of improved data management.

We are close enough to the crisis to be understandably fearful of launching large-scale, big ticket initiatives. But are we so far past the crisis that we've forgotten how hard real information was to come by in those dark days after Lehman fell? In some of the more esoteric or illiquid markets, it was all but impossible to value positions, and firms couldn't confidently aggregate exposures, highlighting gaps in customer and counterparty data. So the time seems right to examine the value of data assets to the institution, and to establish prudent management policies and practices that leverage those assets for maximum benefit.

In our cautiously optimistic post-crisis environment, it is important to act on the lessons learned by implementing data management procedures that will:

- […]ance checks;
- […] costs;
- […]ance; and
- […] account management strategies.

But crisis or not, data management requirements haven't changed, and the myriad reasons why EDM is relevant are unchanged. Essentially, EDM facilitates a reduction in operational expenses and the enhancement of risk management.

Giving data managers the best possible product (complete, scrubbed and readily accessible data) will give tremendous benefit to the institution, not just by reducing costs, but by freeing human assets to do what only experienced data managers can do: leverage their experience for continued process improvements.



Up-to-data.

At SIX Telekurs, we remain committed to delivering the industry’s highest quality, farthest-reaching global market and reference data. Our database includes information on more than 5.6 million instruments, giving you a solid basis for your success and keeping you up-to-date with your compliance obligations and transparency needs.

www.six-telekurs.com

Management of Risk Information is Key
By Michael Rodgers, director of business development, Netik

The financial industry has been shaken by the recent credit and mortgage fallout, which has spotlighted the critical role of risk information management within a financial organisation. Risk information management is a key component of a financial firm's risk infrastructure. Indeed, the recent crisis in the financial system was in many respects a crisis of information. It was the lack of quality information, of the ability to model this information and, finally, of the ability to provide complete transparency to this information that was at the heart of unmanaged systemic, liquidity and counterparty risk. Financial firms that seek to overhaul their risk management methodologies and governance must address these issues.

It begins with quality information. Risk measurement requires high quality historical information for modelling. Many structured products and other derivatives have special features, including intricate priorities of payment, multiple hedges, complex definitions and multiple cash flow triggers, that impact risk. Unfortunately, this data is often locked up in PDFs and is usually incomplete. This means that someone has to gather this information and enter it manually.

Most firms today seek to address this issue by outsourcing this manual process to a third party. Working with some of the largest financial institutions, Netik has been providing outsourcing services for market and reference data since the 1980s. There are processes that can be highly automated and processes that are still manual. It is critical that a firm contemplating outsourcing these processes ensure that the service provider has the proven experience, platform, processes, people and tight service level agreements (SLAs) in place to back up any information they provide, whilst enabling the firm to retain local control over its own data.

After ensuring the quality of data, a firm must be able to accurately model this information. Again, this becomes tricky in dealing with structured products such as CDOs, which may themselves contain other pooled instruments. Secondly, the necessary information to understand risk exposure must often be assembled from multiple sources. Therefore, the underlying data model must be able to aggregate information from the firm level with look-through analysis down to its lowest elements.

Netik is somewhat unusual here. Because of our focus on financial data management and reporting services, we also have one of the leading data warehousing products, Netik InterView. Working with one of the largest global asset management firms, we leveraged a feature of our product that was originally designed for fund managers who needed look-through analysis into the portfolio compositions of their pooled investments down to very granular levels. This capability was extended to incorporate structured products.
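To show the shape of such look-through analysis (a generic sketch, not Netik InterView's actual implementation, which the article does not describe), here is a minimal Python example that resolves a pooled holding recursively down to its lowest-level instruments. The fund names, composition tree and weights are invented.

```python
# Minimal look-through sketch: a fund's holdings may themselves be
# pooled vehicles, so exposure is resolved recursively down to the
# lowest-level instruments. Names and weights are illustrative.
from collections import defaultdict

holdings = {
    "MasterFund": [("SubFundA", 0.6), ("IBM", 0.4)],
    "SubFundA":   [("CDO-1", 0.5), ("MSFT", 0.5)],
    "CDO-1":      [("MortgagePool-7", 1.0)],
}

def look_through(instrument, weight=1.0, leaves=None):
    """Walk the composition tree, accumulating effective weights
    at the lowest (non-pooled) elements."""
    leaves = leaves if leaves is not None else defaultdict(float)
    children = holdings.get(instrument)
    if not children:                 # a leaf: a directly held instrument
        leaves[instrument] += weight
        return leaves
    for child, w in children:
        look_through(child, weight * w, leaves)
    return leaves

print(dict(look_through("MasterFund")))
# {'MortgagePool-7': 0.3, 'MSFT': 0.3, 'IBM': 0.4}
```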

Interestingly, we found that this information was not viewed in isolation. Instead, our clients were aggregating information across their organisations in order to understand the relationship between systemic, liquidity and counterparty risk. This information was then viewed alongside performance attribution to determine how well their managers adjusted to market changes.

Therefore, to be successful, the data model used for this aggregation must be able to incorporate all relevant enterprise-wide information. It is critical for a firm to make certain that whoever designs this data model understands how the enterprise-wide information interrelates, especially if the firm wishes to leverage sophisticated business intelligence technology for analysis and reporting.

A chief risk officer must have a solution that analyses information across the firm all the way down to the underlying elements of a strategy or position. This capability must be flexible to allow the user to follow a chain of thought in their analysis, and it must allow analysis across multiple dimensions such as time, risk, asset allocation and performance attribution. You see this today, as financial firms are required to conduct more thorough stress testing and scenario analysis. What happens if interest rates go up x%, or if half the instruments in a portfolio terminate their contracts in the following year?
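As a toy illustration of the first of those scenario questions, the Python sketch below applies a parallel interest rate shock to some bond positions and estimates the profit and loss impact via modified duration. The positions, durations and the first-order approximation are assumptions for the example, not anything the article prescribes.

```python
# Illustrative stress test: shift interest rates by a number of basis
# points and approximate the P&L impact on bond positions via
# modified duration. All figures are made up.
positions = [
    {"name": "Govt 10y", "market_value": 5_000_000.0, "mod_duration": 8.7},
    {"name": "Corp 5y",  "market_value": 2_000_000.0, "mod_duration": 4.2},
]

def rate_shock_pnl(positions, shift_bps):
    """First-order estimate: dV ~= -MV * duration * dy."""
    dy = shift_bps / 10_000.0
    return {p["name"]: -p["market_value"] * p["mod_duration"] * dy
            for p in positions}

print(rate_shock_pnl(positions, shift_bps=100))  # rates up 1%
```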

In addition to their own use, many of our clients are being asked to provide this data for reporting to regulatory bodies for macro-prudential risk analysis, or to their clients, who are demanding greater transparency into their underlying funds. We see these trends significantly gaining momentum. Therefore, a firm's enterprise-wide data management solution must also incorporate the ability to extract the required data and produce the appropriate reports to meet these growing requirements.


“A chief risk officer must have a solution that analyses information across the firm all the way down to the underlying elements of a strategy or position.”

Reference Data Supply Chain Management is Here Today
By John Randles, CEO of PolarLake

Back office efficiency is top of the business agenda in most investment banks and asset managers, driven by risk management and regulation demanding faster delivery of data into downstream systems. Our clients report that downstream consumers are getting much closer to the data and are becoming more specific about what they want and about the timeliness of its delivery.

This raises the bar on what is really meant by enterprise data management (EDM). The ability of vendors and internal IT to build and run a centralised repository to collect and cleanse data is not in question. What is in question is how much of the end to end 'supply chain' of reference data management is really going to be achieved by implementing EDM alone.

Our clients have been asking us how they can achieve end to end integration and management of supply and demand for reference data. They have recognised that this is needed to deliver on the promise of centralised reference data management.

The requirements coming from the business community for EDM systems are not as straightforward as they used to be. Some consumers want the raw vendor data, others want raw plus cleansed data, others want only cleansed data, and still others don't want to deal with a central authority at all and go straight to the data vendor. This will typically vary according to asset class and market sector within the same firm.

The trend of consumers getting closer to the data and wanting customised services means a new focus on data delivery service level agreements (SLAs), support for multiple formats and differing cardinalities, delivery status control and data access control mechanisms. These are the realities of the reference data supply chain in 2010. It is more like a FedEx supply chain than a traditional view of data management.
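One way to picture these consumer-specific delivery terms is as configuration, as in this Python sketch. The consumer names, formats, schedules and SLA fields are hypothetical; the point is that content (raw versus cleansed), format and delivery SLA become per-consumer attributes rather than one size fits all.

```python
# Sketch of per-consumer delivery configuration: some consumers take
# raw vendor data, some take cleansed data, in different formats and
# on different schedules. All names and values are hypothetical.
delivery_config = {
    "risk_engine": {
        "content": "cleansed",        # golden-copy records only
        "format": "xml",
        "schedule": "15:00 daily",
        "sla_minutes": 30,            # max delay before an SLA breach
    },
    "front_office_analytics": {
        "content": "raw+cleansed",    # vendor feed plus validated copy
        "format": "csv",
        "schedule": "intraday",
        "sla_minutes": 5,
    },
}

def route(consumer, record_raw, record_clean):
    """Return the payload a given consumer is configured to receive."""
    cfg = delivery_config[consumer]
    if cfg["content"] == "raw+cleansed":
        return {"raw": record_raw, "clean": record_clean}
    return {"clean": record_clean}

print(route("risk_engine", {"px": 101.30}, {"px": 101.25}))
```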

This is where the golden copy approach, somewhat unfairly I believe, got a bad name. Clients are telling us that they now see a golden copy system as a subset of an end to end solution or reference data supply chain. What happens, then, when the business starts to focus on one data vendor for a particular type of data and the golden copy system now cleanses from a single source? In this case, the process being automated is primarily a data supply chain process, and it is validation, not cleansing, in the purest sense. And this is a trend that is gaining momentum.

It is a move away from seeing EDM as broad cleansing alone (putting together a 'golden' record with fields from multiple vendors) towards a more flexible approach, where I can decide to trust a particular vendor's data for a particular asset class (for example, Reuters for US equities security master data) and handle exceptions in the quality of this feed alone. In this situation, distribution, access control and exception management are as important as sophisticated cleansing heuristics and a complex data model that builds a 'golden' record.

An extension may be trusting one vendor's data for all data in one asset class except for pricing fields, where firms want to validate pricing against another vendor's data, with associated human workflow and automated business rules. This is especially true as firms consolidate the number of reference data vendors they subscribe to. When the markets were booming, the number of times a piece of data was sourced from multiple vendors was irrelevant. Not anymore. Efficiency in the supply chain now has to ensure there is no redundant vendor reference data being ordered or erroneously reordered.
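A minimal Python sketch of such vendor-trust rules, using the article's Reuters-for-US-equities example. The second vendor, the tolerance and the rule structure are assumptions for illustration; a production system would drive a human workflow and business rules from the raised exception.

```python
# Sketch of the 'trusted vendor per asset class' model: one vendor is
# authoritative for an asset class, except for pricing fields, which
# are validated against a second source. Rules are illustrative.
trust_rules = {
    ("us_equity", "security_master"): {"trusted": "Reuters"},
    ("us_equity", "price"): {"trusted": "Reuters", "validate_against": "VendorB"},
}

def resolve(asset_class, field, candidates, tolerance=0.005):
    """candidates: {vendor_name: value}. Returns the trusted value, or
    raises to an exception workflow when cross-vendor validation fails."""
    rule = trust_rules[(asset_class, field)]
    value = candidates[rule["trusted"]]
    check = rule.get("validate_against")
    if check is not None:
        other = candidates[check]
        if abs(value - other) / other > tolerance:
            raise ValueError(f"exception: {field} differs across sources")
    return value

print(resolve("us_equity", "price", {"Reuters": 25.10, "VendorB": 25.11}))
```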

The APICS (Association for Operations Management) dictionary defines supply chain management as the "design, planning, execution, control, and monitoring of supply chain activities with the objective of creating net value, building a competitive infrastructure, leveraging worldwide logistics, synchronising supply with demand, and measuring performance globally".

This sounds like a more realistic description of what people are trying to achieve in reference data management in 2010 than the pure focus on the collection and cleansing of reference data that EDM has long been known for.

The financial crisis has created a drive for improved operational efficiency, risk management and regulatory compliance. This has raised the bar dramatically on what is meant by reference data management, particularly the end to end delivery on that vision, not just cleansing alone.

EDM may in fact be about to turn into reference data supply chain management.

“Efficiency in the supply chain now has to ensure there is no redundant vendor reference data being ordered or erroneously reordered.”



Roundtable: Enterprise Data Management

OUR PANEL OF enterprise data management (EDM) experts debate the challenges firms are facing with regard to EDM projects in the current market and how best to overcome them. The panel:

- Deirdre Sullivan, vice president of marketing for SIX Telekurs USA
- Tony Brownlee, managing director of data solutions for Kingland Systems
- Michael Rodgers, director of business development for Netik
- John Legrand, managing director of Europe, Middle East and Asia Pacific for Eagle Investment Systems
- John Randles, CEO of PolarLake

What are the main challenges firms will face in 2010 when launching an enterprise data management initiative and how should these be tackled?

Brownlee: We see two primary challenges for 2010. Firms are working through funding issues for EDM initiatives, but that is no surprise to anyone. The good news is that many initiatives are receiving funding and organisations are looking at how to make progress on EDM. This is one of the bright spots of the financial meltdown – it shined the light on the importance of data in these organisations.

The other big challenge is showing progress in 2010. EDM initiatives are often difficult and time consuming to get off the ground, so while funding is available, well run programmes are looking for visible wins to build momentum for 2011.

Sullivan: For those firms in the initial stages of planning an EDM strategy, the biggest problem is creating a culture of awareness and garnering support for the EDM initiative throughout the firm. The specific issues are most likely to be found in two main areas: governance and prioritisation.

Implementing an EDM solution is an enormous undertaking that requires clear goals, unambiguously defined responsibilities and support throughout the organisation. Designating and empowering the responsible parties is not a trivial exercise, nor is establishing an organisational structure that fosters cooperation while maintaining accountability.

An old saying instructs that the only way to eat an elephant is one bite at a time. But defining which ‘bites’ should be devoured and in which order can be an enormously complex process that requires careful attention and analysis from the very start of planning.

Rodgers: One of the main challenges firms face in launching enterprise-wide data management initiatives is in understanding how this information is going to be used and how it interrelates. Often within organisations, you have areas of expertise: risk, performance and general asset management, market and reference data, which results in data silos. This information, however, is not viewed in isolation by end users and, instead, firms are attempting to aggregate this information across their organisations in order to understand the relationship between systemic, liquidity and counterparty risk. This will allow them to see how well asset managers adjust their asset allocation strategies due to changes in market beta and to analyse the performance and attribution results of such changes.

Understanding how this information relates together across silos is critical, especially if a firm wishes to leverage sophisticated business intelligence technology for analysis and reporting.


6 of the top 10 Investment Banks, 2 of the top 5 Prime Brokers and 2 of the top 10 Asset Managers use PolarLake to control Pricing & Reference Data Distribution & Integration.

Pricing & Reference Data Distribution & Integration delivered with:

- Control & Confidence.

- Fast Time to market.

- Managing ongoing complexity.

www.polarlake.com

PolarLake Headquarters

80 Harcourt Street,

Dublin 2, Ireland.

E: [email protected]

P: +353 1 449 1010

PolarLake USA

345 Park Avenue, 17th Floor,

New York, NY 10154, USA.

E: [email protected]

P: +1 212 588 1650

PolarLake UK

40 Basinghall Street,

London EC2V 5DE, England.

E: [email protected]

P: +44 20 7618 6426



Legrand: New regulations, globalisation, market consolidation, and cost containment are all challenges facing firms in 2010. Doing more with less continues to be a key theme facing firms around the world. Those firms that invest time during the planning phase of a data management initiative will be the ones that can capitalise on their technology and maximise the return on investment. When launching their EDM initiative, these firms need to focus on solving the core business issues and leveraging data management solutions that have been proven over time. Market consolidation is not only a reality in the investment management space; technology vendors are consolidating too. So, when firms look for a vendor, they need to know that the technology will continue to be developed to meet future regulation, instrument types and new markets. They also need to research the stability of the vendor to ensure they can weather the current market conditions.

Randles: The main challenges, I think, in launching an EDM initiative in 2010 are twofold. Part one is the business case: building a compelling cash flow-based ROI model is now essential, no longer optional or a nice to have. Part two of the puzzle is to take on something that is achievable. While perhaps obvious, in the past quite a few EDM initiatives tried to solve unsolvable problems. Getting absolutely everyone in a global firm to agree on the same golden copy for every single security has been somewhat discounted. There is now recognition that the practicalities of different lines of business having preferences that are fundamentally incompatible with other business lines need to be accommodated and planned for. A good analogy is that we have moved from a Model T world of any colour as long as it's black to a choice of colours.

Where should firms be focusing their efforts to begin with – what are the key metrics to measure at the start of the process?

Sullivan: There are a variety of really good reasons to implement an EDM solution, from improved risk analysis (avoiding unacceptable losses) to improved compliance (avoiding possible penalties) and many more.

And after all, missing, mismatched or inaccurate data in one or more systems is often the cause of missed opportunity or transaction workflow fails, and fixing the data problem through an EDM implementation could significantly reduce expenses and increase margins. But at the end of the day, the bottom line is…well, the bottom line, and EDM projects can be hugely expensive, which means the metrics are key to getting and keeping funding.

The measurements that seem most appropriate at the start of the process are those that are both intuitive and sustainable, such as a straightforward measurement of data quality improvements over time, or the monitoring of improvements in transaction fail rates over time as EDM projects are implemented.
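A minimal sketch of the two baseline measurements Sullivan suggests, assuming invented monthly figures: data quality as the share of records passing validation checks, and the transaction fail rate, sampled per month so that improvement is visible over time.

```python
# Two illustrative EDM baseline metrics tracked month by month.
# All sample figures are invented for the example.
monthly_stats = [
    # (month, records_checked, records_passing, trades, failed_trades)
    ("2010-01", 120_000, 109_200, 45_000, 630),
    ("2010-02", 125_000, 117_500, 47_000, 517),
]

for month, checked, passing, trades, fails in monthly_stats:
    quality = passing / checked     # share of records passing validation
    fail_rate = fails / trades      # transaction workflow fail rate
    print(f"{month}: data quality {quality:.1%}, trade fail rate {fail_rate:.2%}")
```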

Randles: The main factors driving EDM today that we see are cost efficiencies (data costs, operational costs, manpower savings), operational efficiencies (data quality, STP failures) and risk and compliance issues (VaR data integration, corporate actions handling, trade reporting). Lots of EDM projects have a particular theme, but they tend to be a subset of these overall measures. You cannot say firms should focus on one above another; it really comes down to where the most benefit is to be gained.

Legrand: In order to achieve success when initiating a new EDM strategy, firms need to be thorough in outlining their framework, identify the areas of their business where the most money is being spent or duplicated, and determine a way to align those areas. Risk management practices and compliance reporting are methods being used to measure the success of a firm's data management framework. Also, there are operational tools available to help organisations better understand their overall business and operations. Some provide more detail regarding areas of the business where duplicate efforts are costing money, and identify where excessive time and resources are being spent on less important initiatives.

Rodgers: For many firms that have lived through the recent financial crisis, the number one focus will be on implementing enterprise data management initiatives that support their risk information management infrastructure. Nothing focuses the mind better than surviving a brush with financial ruin or witnessing the demise of an institution similar to your own.

The challenge will be for firms to accurately model risk data, which becomes tricky when dealing with structured products such as CDOs, which may themselves contain other embedded pooled instruments. The underlying data model must be able to aggregate information from the firm level with look-through analysis down to its lowest elements.

Brownlee: I hate to say it, but this depends largely on the goals of the respective programme being put in place; the metrics should support those goals. Generically speaking, though, we are seeing institutions look at quality metrics to confirm the data is reliable for enterprise use, coverage metrics to confirm support for business needs from global business units, and usage metrics to determine how EDM capabilities are and are not being used.

How have the economic environment and increased mergers and acquisitions, for example, impacted firms' EDM systems?


Rodgers: The increase in M&A activity has a huge impact on a firm’s EDM system, which is why smarter firms will want to investigate the IT infrastructure of a target client – what key systems do they have in place, how quickly and accurately is data assembled and transmitted across the organisation, where is there redundancy, and how will this organisation roll into the acquiring firm’s EDM system? A firm will need to address these issues if they are to successfully integrate a target client and deliver a consolidated view of data across the resulting enterprise post-acquisition or merger.

Brownlee: M&A activity has exposed the need for EDM systems and, also, has caused firms to think about M&A in their planning and design stages. Firms often start by looking at a "single view of the customer" problem, working to understand both the total exposure and total opportunity of the customers for the post-acquisition institution. Now firms are progressing to thinking about a "single view of the enterprise" problem, which requires additional types of data beyond client and counterparty, to include hierarchies, business to business relationships, accounts, securities, transactions, events and activities, and other types of data that are common across the multiple institutions. Data such as this requires a very flexible and scalable master data management (MDM) environment.

Legrand: The current economic environment has definitely impacted the way many firms look at their data and reporting methods. Clients are demanding more transparency and better risk management practices. Those demands, coupled with the need to meet compliance reporting requirements, have forced firms to look at their data management systems to address these needs while preparing for the next wave of regulation. However, these firms are looking for lower total cost of ownership, proven return on investment and scalability when augmenting or selecting a new data management system. They are choosing solutions that can meet their growing requirements, and ensuring the systems can handle new financial instruments and complex strategies, global expansion and future M&A activity.

Randles: I think the flurry of M&A has forced people to go back to basics in EDM. Things that people may have assumed were straightforward have become quite complex, such as finding out who all the data providers are, what they are used for, and how people across the combined firm access data (centrally, through a central repository, or a hybrid of the two). This basic groundwork is being done before major system consolidation or new systems implementation begins. I also see people rethinking what they mean by EDM. Rather than the traditional one-size-fits-all centralised repository, firms are now turning to the consumers to see what tailored service each downstream system requires. The economic environment has driven firms to look at this consolidation primarily from a cost management and risk perspective. This is why data vendor consolidation is a higher priority than pure cleansing of data.

Sullivan: It's a pretty safe bet that if data management was a mess in the past, there is a good chance that acquisitions and mergers have created an even bigger mess. One of the key reasons firms struggle with EDM today is that many organisations have been built from multiple institutions merged over time. Those organisations had working infrastructures and databases that were never merged, so adding new companies into the mix is simply pouring gasoline on a fire. But with government involvement, promises of regulatory change and calls for greater transparency, EDM is more important than ever, because institutions need to clearly demonstrate a comprehensive understanding of their positions and risks. Without good data, effectively managed, this isn't possible. So it would seem logical that EDM initiatives will be given high priority going forward.

What internal governance issues are likely to crop up and how should firms tackle these? How can service providers help in this respect?

Legrand: When various groups within an organisation assemble to discuss data governance, many times the dialogue can easily move in the direction of what technologies fit the firm’s need. The most important first step, before choosing a technology solution, should be to clearly identify the business requirements, determine how the data is going to be managed, assign who is ultimately responsible for the data, and design a plan to maintain the data going forward. Other factors to consider are the need to address new regulations, how the firm will support future global expansion plans, and whether or not it makes sense to outsource certain aspects of the plan.

Providers with extensive knowledge and experience can guide a firm through the planning and implementation phases of data governance. Providers that have implemented data management solutions for a number of different types and sizes of firms are those that can offer a broader perspective and show that there isn't a one-size-fits-all when it comes to data management.

Randles: Governance is another word for conflict resolution, in my view. EDM is no different from other cross-department enterprise applications in industry (ERP, CRM): there are going to be conflicting priorities and requirements. Where EDM differs is that it is not always possible to mandate one size fits all for data when different divisions have very valid reasons for having overriding requirements. This is why flexibility on governance is just as important as strict enforcement in EDM. Software vendors in particular can help by designing systems to be flexible, where exceptions are the rule rather than an unpleasant accommodation.

Sullivan: When multiple databases exist for any kind of reference data (client, counterparty, pricing), there can be multiple versions of the "truth", and it is important that the governance structure is flexible and provides a mechanism to support this. As the financial world has become far more global, and as cross-border transactions become more the rule than the exception, these situations occur much more frequently.

To illustrate, let's assume that a trading desk uses a symbol to identify a security, but further down the processing chain the back office uses the local official ID (CUSIP, SEDOL), and the custodian or depository might use the ISIN. These are all valid ways to identify the security, and because each ID fits the environment where it is used, enforcing the use of one ID or another wouldn't necessarily improve the processing in any one area, so it shouldn't necessarily be changed. Rather, the enterprise database needs to store and link all the valid permutations along with the security, so that each processing system along the chain can be fed the appropriate identifier.
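A small Python sketch of that "store and link all the valid permutations" idea: one master record carries every identifier, a cross-reference index resolves any of them back to the master, and each consuming system is fed the scheme it expects. The record, the preference map and the helper function are illustrative, not any particular vendor's data model.

```python
# One security, many valid identifiers; each downstream system is fed
# the ID scheme it expects. Identifiers shown are illustrative.
security = {
    "internal_id": "SEC-000042",
    "identifiers": {
        "ticker": "IBM",
        "cusip": "459200101",
        "sedol": "2005973",
        "isin": "US4592001014",
    },
}

# Reverse index: look up the master record from any identifier.
xref = {(scheme, value): security["internal_id"]
        for scheme, value in security["identifiers"].items()}

def id_for(system: str) -> str:
    """Return the identifier a given consumer system expects."""
    preferred = {"trading_desk": "ticker", "back_office": "cusip",
                 "custodian": "isin"}
    return security["identifiers"][preferred[system]]

print(id_for("back_office"))            # '459200101'
print(xref[("isin", "US4592001014")])   # 'SEC-000042'
```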

At SIX Telekurs, we have long understood that there is not one single truth but many truths, and we've built our own enterprise data model to accommodate this reality. This means that our data can populate a single enterprise database but still effectively service the individual business applications by providing the specific data element needed in a way that is efficient for consumption.

Brownlee: Internal governance issues will inevitably creep in, but a well crafted data governance strategy and forward looking solution architecture can help. As Kingland is a service provider, we've seen clients benefit from leveraging the expertise of our staff. Additionally, we have seen clients inject our services and solutions into key processes, which creates a separation of responsibilities that is often healthy for governance.

Rodgers: Many organisations look to outsource at least some parts of their enterprise-wide data management infrastructure. It is critical that a firm contemplating outsourcing processes ensures that the service provider has the proven experience, platform, processes, people and tight SLAs that complement their own internal governance.

However, whilst a managed service effectively outsources the problem and delivers cost savings from economies of scale, it can still leave data management and technology teams concerned about a perceived loss of control over their data and the managed service arrangements. SLAs, with penalties in Netik's case, go some way to alleviating these concerns, but often this is not enough.

Traditionally, building, or buying and integrating, a software solution has been the answer. However, engineering the financial data architecture, model and structures needed to support not just reference data but also operational data, such as portfolio accounting, custody, performance and risk, is not a trivial exercise, let alone the integration with data sources and downstream systems. And although a buy-and-integrate solution can be cost effective compared with an in-house build, it can be a lengthy project to implement, so the data operations savings can be a long time coming.

So what's the alternative? A hybrid approach, where the commodity functions are outsourced to a managed service, such as Netik GSM, that delivers high quality data at a reduced cost in short timeframes, combined with an on-site software solution for local control. The managed service feeds the on-site software 'container', enabling the data to be viewed, checked, released and distributed, as well as allowing internal data sources and custom business and distribution rules to be added and maintained. The net result is the best of both worlds: functions along the financial data management value chain can be chosen to be part of the outsourced service or implemented using the on-site 'container' software.
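The hybrid split Rodgers describes can be pictured as a simple placement map, as in this hypothetical Python sketch; the function names and their assignment to the managed service or the on-site 'container' are invented for illustration.

```python
# Hypothetical placement of data management functions across a hybrid
# deployment: commodity work goes to the managed service, while
# control-sensitive steps stay in the on-site 'container'.
FUNCTION_PLACEMENT = {
    "vendor_feed_maintenance": "managed_service",
    "identifier_cross_referencing": "managed_service",
    "multi_source_compositing": "managed_service",
    "internal_source_maintenance": "on_site",
    "release_approval": "on_site",        # local control retained
    "downstream_distribution": "on_site",
}

def functions_at(location):
    """List the functions assigned to one side of the hybrid split."""
    return [f for f, loc in FUNCTION_PLACEMENT.items() if loc == location]

print(functions_at("on_site"))
print(functions_at("managed_service"))
```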

What are the risk hotspots that can be addressed via the implementation of an EDM solution?

Randles: Risk is such a broad area, and data can be critical to many facets of it, from operational risk and systemic risk to liquidity risk. While EDM is not going to change the risk taking culture of a firm, a smart EDM implementation will help to remove the danger of unknown risk factors sitting un-integrated at a corporate risk level within the organisation. We also see EDM being pulled into the move toward more real-time risk analysis in the front office.

Brownlee: We advise our clients that they must stay ahead of the curve from an EDM perspective and make sure there are four legs to their EDM strategy. The first is expertise – seek out and retain people that understand the complexities and best practices in EDM. The second is data – it’s fundamental that the data within an EDM environment can be trusted to drive improved business decisions. The third is technology – the MDM and business analytics technologies available are improving and can bring tremendous benefits to these initiatives. The fourth is organisational support – the organisation as a whole must support the initiative today…and tomorrow.

Rodgers: In addition to their own risk reporting use, many of our clients are being asked to provide this data for reporting to regulatory bodies for macro-prudential risk analysis, or to their clients, who are demanding greater transparency into their underlying funds. We see these trends significantly gaining momentum. Therefore, a firm's enterprise-wide data management solution must also incorporate the ability to extract the required data and produce the appropriate reports to meet these growing requirements.

Sullivan: Counterparty risk. The industry has been batting about the idea of unique business identifiers for what seems like forever, and with good reason. Effective risk management depends on the ability to aggregate exposures to issuers, clients and/or counterparties across the enterprise, and the ability to look across segments (fixed income, equities) as well as business lines is critical, as firms often do business with a single entity in a variety of ways (customer trading, proprietary trading, lending). Taking a broad view of those entities (an enterprise view) clearly conveys risk control benefits, but also provides a powerful foundation of information on which to base an overall relationship management strategy vis-a-vis that counterparty and how it fits into the overall goals of the institution.

Legrand: By implementing an enterprise data management strategy, firms can attain better access to, and transparency of, their data, and this directly improves business, operational and systemic risk management. A centralised data management strategy puts data at the core of an investment management firm, whereby all areas of the organisation, such as accounting, performance and trading, access the same set of data. Centralised data management also allows firms to drill into their data and extract information to truly understand direct and indirect exposure to various investments.

The alternative approach requires reliance on multiple systems creating multiple sets of data, which is not only less efficient, but it opens a firm up to various types of risk.

How will the issues faced during the past year and those of 2010 shape the EDM vendor community?

Brownlee: We have seen consolidation already and I expect that we will continue to see some additional consolidation across the vendor community.

Rodgers: EDM vendors need to understand how information is used throughout a financial organisation. This requires the necessary knowledge to support middle and front office functions, such as risk and portfolio management.

Financial firms and EDM vendors also need to look at alternative approaches. Many of the functions undertaken within data management teams across financial firms are the same or, in other words, commoditised, so it stands to reason that there are economies of scale to be realised by undertaking them as a centralised managed service or utility. Commodity functions could include data source and vendor interface maintenance, data vendor SLA monitoring, identifier cross-referencing, comparison of sources to check consistency and data compositing from multiple sources. The managed service model, comprising a technology platform, an industry best practice operating model and data operations teams shared across firms, is compelling to many in terms of economies of scale and data expertise, resulting in cost savings and better quality of data when compared with internal processes and technology.

Sullivan: The issues we've faced recently have resulted in increased demands for transparency and risk controls from stakeholders and regulators alike. This puts the risk and compliance functions under pressure, and these folks will need to be agile as regulations and reporting requirements shift and evolve. To support these efforts, the entire community, not just the firms or just the vendors, needs to work together to facilitate the analysis of the data that is available and help the risk and compliance folks turn that data into information.

That said, it isn't so much a lack of data that is the problem; rather, it is the fact that so much data still resides in so many siloed repositories that creates opacity. The redundant expense that results from sourcing and cleansing data in these silos, combined with the complexity of aggregating data across them, continues to challenge, if not confound, attempts to bring the situation under control through an EDM initiative.

Complicating efforts to implement EDM solutions is the simple fact that many firms haven’t achieved stellar financial results over the last six or eight quarters, which means that funding for huge EDM projects isn’t necessarily available.

In this environment, vendors need to support the drive toward EDM through relevant solutions that can be implemented on an incremental basis, and by providing creative tools that facilitate data analysis even before full data integration is achieved.

Randles: In a word: practicality. That is the underlying legacy for EDM from the crisis. A vision that is not backed up by solid ROI, a contribution to risk management and regulation or a strong focus on integration will not pass the practicality test. The winning vendors will be those that can deliver incremental benefits in very short iterations.

What are customers looking for from their providers in 2010?

Legrand: EDM vendors need to address risk management, transparency and data management requirements for their clients, and they need to do it in the most cost effective manner. Clients are looking for technologies that can solve immediate business needs while accommodating the longer term enterprise data management vision of their organisation, and they must be able to do more with less. As a result, a number of vendors have rolled out software as a service (SaaS) or application service provider (ASP) models to help service clients by allowing them to focus on their business and letting the vendors manage the platforms, architecture and applications. This is not only a cost efficient way of conducting business, but it is also a strategic way to remain current with the latest technology and functionality and to address industry changes without having to upgrade or put significant resources toward these needs.

Rodgers: Customers are looking to their providers for accurate data, the ability to model complex data with transparency for reporting this data, especially with structured financial products, and knowledge of how this information is being used throughout the organisation.

Brownlee: Customers are looking for innovation and value. They expect that providers are investing, will work with them on their specific initiative and will recognise that it is unique. From a Kingland perspective, innovation, value and customisation are themes of our approach to business, all of which are driven by our customers.

Sullivan: Quality and value.

At SIX Telekurs, we pursue a strategy of direct data collection, because we believe that keeping total control of the data, from the original source through various stages of processing to final delivery, is the best way to ensure quality. And because we do maintain control of the data, we are also able to respond quickly to customer inquiries.

Quality is critical, but delivering value is what keeps customers happy. Our customers value SIX Telekurs' data services not just for the breadth and depth of our data coverage, but also for our intelligently structured and linked reference data and for our flexible selection and delivery methods for pricing, corporate actions and reference data.

Randles: We see firms looking for experience and expertise above and beyond what they have in-house today. Beyond the software or service you provide, firms are buying expertise and knowledge, and the assurance that you bring best practice for a particular domain. With widespread cutbacks, projects that might once have been long-term internal builds no longer have that luxury and are now moving into the 'buy a solution' camp. Buyers are quite cautious and will only spend when you can really differentiate your offering from what in-house skills and other vendors can bring to a solution. In our area of focus, integration, we differentiate against generic ETL technology and hand-coded solutions.


Ultra High Performance Technologies for the Financial Markets
New York City, April 26th

www.A-TeamGroup.com/InsightExchange

Conference and Free Exhibition

A-Team Insight Exchange events focus on hot financial markets trends and the specific technologies and applications that are driving them, through an educational program that addresses how IT is driving accelerated innovation in financial trading and risk management. The April 26th event in New York City will bring together A-Team's editors and analysts with IT innovators and financial markets participants to engage in the exchange of ideas, knowledge, experience and, most importantly, business.

Some of the market trends on the conference agenda for April 26th:

- How algorithmic and high frequency trading fuels the low latency arms race
- Building an execution architecture for fragmented liquidity
- How data centres are becoming the new exchange floors
- Approaches to coping with the market data volume explosion
- Pre-trade decision support analytics and the need for speed
- Sponsored access as a driver for real-time risk management
- Building a scalable IT architecture for the financial enterprise
- Reducing TCO through open systems and standards

Attend the exhibitions for free. Attend the conference to listen and learn from industry luminaries. Just $295 for the entire conference ($195 for financial institutions). Conveniently located at The Roosevelt Hotel, situated in the heart of midtown Manhattan, next to Grand Central Station.

For more information or to register, go to: www.A-TeamGroup.com/InsightExchange

Sponsored by:
