Financial Data Analytics: A Core Element Of Business Strategy
by Vipul Parekh
In today's complex business environment, financial services firms are collecting data from numerous sources, both internally through transactional systems, operations workflows, and application logs, and externally through customer interactions and social networking sites. Additionally, in the regulatory arena, new financial oversight rules have expanded the categories of data that financial services firms need to capture, manage, and disseminate. Examples include the capital requirements and risk analytics needs of the Basel III, Dodd-Frank, and EU Solvency II regulations. Large financial organizations also face constant pressure to achieve operational efficiency and reduce operating costs due to thinner margins.
Overall, industry-wide regulatory developments, increased sensitivity to regulatory breaches, pressure to reduce costs, recent technology advancements, and changing customer behavior are converging to create both big challenges and significant opportunities for data-driven decision-making at financial services firms. The best way for firms to meet this challenge is by putting efficient data analytics capabilities in the hands of key decision makers, which creates an excellent opportunity to start treating financial data analytics as a core element of business strategy.
Reality of Current State
In reality, however, key decision makers and operations staff in most financial services firms still rely heavily on traditional reports-based analytics solutions. These solutions mostly generate reports with historical data, with users performing manual analysis using tactical dashboards and Excel spreadsheets. Additionally, with the rapid rate of increase in data volume, these solutions have several key bottlenecks:
3 Vs Challenge
The volume, velocity, and variety of data have far exceeded the capacity of manual analysis and are straining the performance of traditional databases.
Lack of Metadata and Data Quality
Data attributes are not well tagged or analyzed, and significant quality issues make them difficult to use for analytics.
Storage Space Issues
Most reports have fixed sets of attributes and overlapping information, wasting storage space unnecessarily. This is becoming a significant issue given the amount of data financial services firms collect.
Limited Insight
Historical data only allows descriptive analytics (what has happened) through ad-hoc analysis or dashboards. There is no easy way to perform diagnostic analytics (why it happened), predictive analytics (what will happen), or prescriptive analytics (what should happen), which are key to enabling more effective data-driven decisions.
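To make the distinction concrete, here is a minimal sketch in Python using pandas and numpy on a hypothetical daily transaction-volume series (the data and the 30-day horizon are illustrative assumptions, not anything from a real system): descriptive analytics summarizes what has happened, while even a naive trend fit begins to answer what will happen.

```python
import numpy as np
import pandas as pd

# Hypothetical daily transaction volumes for one desk (illustrative data only)
dates = pd.date_range("2024-01-01", periods=90, freq="D")
volumes = pd.Series(
    1_000_000 + 2_500 * np.arange(90) + np.random.default_rng(0).normal(0, 50_000, 90),
    index=dates, name="transaction_volume",
)

# Descriptive analytics: what has happened
print("Average daily volume:", round(volumes.mean()))
print("Busiest day:", volumes.idxmax().date(), "with volume", round(volumes.max()))

# Predictive analytics (naive sketch): fit a linear trend and project 30 days ahead
x = np.arange(len(volumes))
slope, intercept = np.polyfit(x, volumes.values, 1)
forecast_day = len(volumes) + 30
print("Projected volume in 30 days:", round(slope * forecast_day + intercept))
```

Diagnostic and prescriptive analytics build on the same data but add causal investigation and recommended actions, which typically require richer models than this sketch.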
Lack of standardization
The data analysis and metrics used for decision-making are ad-hoc and lack standardization at the department or firm level, resulting in different interpretations of the same data attributes.
Flexibility Issues
Expanding reports to add more information or generate new metrics requires enhancements and further development by technology teams.
Bringing Change
So, how do firms drive their data analytics initiatives for better decision-making? At the highest level, the key is to approach data analytics as an enterprise-wide initiative in a coordinated manner. That means business decision makers from various departments need to clearly articulate the key performance metrics and requirements that will enable better decisions. Organizations should establish an enterprise-wide data analytics program to build a framework that brings key people (data experts, decision makers, IT teams) and processes (data governance, quality, analytics) together. The framework should broadly cover standardization of key performance metrics, data quality rules, data governance processes, the data dictionary, the analytical data model, metadata, and visualization to bring consistency to implementation across all departments.
The data framework should support multiple decisions across various departments. For example, a spike in transaction volume would be important for operations teams studying underlying market events as well as potential capital and counterparty risks, whereas the IT team may want to analyze system performance and capacity needs on high-volume days. So it is important to recognize these nuances by building flexibility into the framework rather than attempting a one-size-fits-all solution for all departments. Additional points to work towards include:
Get to know your data
Catalog data sources and tag data housed across all business lines to develop an enterprise-wide taxonomy and data dictionaries.
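As a minimal sketch, a data dictionary can start as little more than a structured record per attribute, capturing source, owner, business definition, and taxonomy tags so the same attribute means the same thing everywhere. The field names and the `trade_date` example below are hypothetical, not taken from any particular firm's catalog.

```python
from dataclasses import dataclass, field

@dataclass
class DataDictionaryEntry:
    """One catalogued attribute in an enterprise-wide data dictionary."""
    name: str                 # canonical attribute name
    source_system: str        # system of record for this attribute
    owner: str                # accountable department or data steward
    definition: str           # agreed business definition
    data_type: str            # expected type/format
    tags: list[str] = field(default_factory=list)  # taxonomy tags for discovery

# Hypothetical example entry
trade_date = DataDictionaryEntry(
    name="trade_date",
    source_system="equities_order_management",
    owner="Trade Operations",
    definition="Date on which the trade was executed, in the exchange's local time zone.",
    data_type="date (YYYY-MM-DD)",
    tags=["trade", "settlement", "regulatory-reporting"],
)
print(trade_date)
```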
Data Quality
Bad data leads to bad business decisions. So, having better data governance processes that produce better quality data plays a crucial role in the usability and success of data analytics initiatives.
Running data quality initiatives with data experts from each department can help immensely in defining key quality metrics, identifying the root causes of quality issues, feeding back into data governance processes, and improving overall data quality.
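A minimal sketch of such quality metrics, assuming a hypothetical trades table loaded with pandas (the table, rules, and thresholds are illustrative assumptions), is to express each rule as a named check whose pass rate can be tracked over time:

```python
import pandas as pd

# Hypothetical trade records with some deliberate quality issues
trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T2", "T4"],
    "notional": [1_000_000, -5_000, 250_000, None],
    "currency": ["USD", "USD", "usd", "EUR"],
})

# Each rule returns a boolean Series; the pass rate is a simple quality metric
rules = {
    "trade_id_unique": ~trades["trade_id"].duplicated(keep=False),
    "notional_present_and_positive": trades["notional"].gt(0).fillna(False),
    "currency_is_upper_iso": trades["currency"].str.fullmatch(r"[A-Z]{3}"),
}

for name, passed in rules.items():
    print(f"{name}: {passed.mean():.0%} pass rate")
```

Rows that fail a rule can then be routed back to the owning department, which is where the departmental data experts mentioned above add the most value.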
Analytics Model
Having a separate data model for analytics can help immensely in adding the flexibility required for data analytics initiatives.
Selecting data attributes, understanding data quality needs, identifying the right data sets, and formatting data correctly are all crucial in designing a flexible analytical data model.
Support for various views of the data and different analytics methodologies is important, as decision makers interpret data differently.
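One minimal sketch of such a model (the table and column names are hypothetical) is a star-schema-style layout: a narrow fact table keyed to small dimension tables, from which each department can pivot its own view of the same underlying data.

```python
import pandas as pd

# Hypothetical dimension table: reference data about trading desks
desks = pd.DataFrame({
    "desk_id": [1, 2],
    "desk_name": ["Equities", "Rates"],
    "region": ["Americas", "EMEA"],
})

# Hypothetical fact table: one row per day per desk, keyed to the dimension
facts = pd.DataFrame({
    "trade_date": pd.to_datetime(["2024-03-01", "2024-03-01", "2024-03-04", "2024-03-04"]),
    "desk_id": [1, 2, 1, 2],
    "trade_count": [12_000, 3_400, 18_500, 4_100],
    "gross_notional": [9.5e8, 2.1e8, 1.4e9, 2.6e8],
})

model = facts.merge(desks, on="desk_id")

# Operations view: notional by region and day (capital/counterparty exposure angle)
print(model.pivot_table(index="trade_date", columns="region",
                        values="gross_notional", aggfunc="sum"))

# IT view: trade counts by desk and day (capacity planning angle)
print(model.pivot_table(index="trade_date", columns="desk_name",
                        values="trade_count", aggfunc="sum"))
```

The same fact table serves both views, which is the flexibility the separate analytical model is meant to provide.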
Roles and Responsibilities
Clearly define the roles and responsibilities of sponsors and all stakeholders involved in data analytics initiatives. The success of an initiative depends heavily on close collaboration between business stakeholders, data experts, data modelers, and technology teams.
New Skills
Data-driven decisions and analytics solutions require people with stronger design, visualization, and analytical skills. Firms can hire data analysts with these skills from outside or train existing data experts to develop them.
Analytics User Interface
Technology teams should focus on building infrastructure that allows data experts and decision makers to perform ad-hoc analysis on the underlying data, create new metric visualizations that best represent the data, and drill down into detail as required.
The UI should have provisions to perform what-if analysis and forecast varying outcomes of performance metrics.
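Behind such a UI, drill-down and what-if capabilities can start as simple parameterized aggregation and scenario scaling. The sketch below assumes a hypothetical settlement-fails metric and a hypothetical 25% volume-growth scenario, purely for illustration.

```python
import pandas as pd

# Hypothetical daily settlement metrics
metrics = pd.DataFrame({
    "date": pd.to_datetime(["2024-03-01"] * 2 + ["2024-03-04"] * 2),
    "desk": ["Equities", "Rates"] * 2,
    "settlement_fails": [14, 3, 22, 5],
    "trade_count": [12_000, 3_400, 18_500, 4_100],
})

def drill_down(df: pd.DataFrame, by: list[str]) -> pd.DataFrame:
    """Aggregate the fail rate at whatever level the analyst asks for."""
    grouped = df.groupby(by)[["settlement_fails", "trade_count"]].sum()
    grouped["fail_rate"] = grouped["settlement_fails"] / grouped["trade_count"]
    return grouped

print(drill_down(metrics, by=["date"]))          # firm-wide view
print(drill_down(metrics, by=["date", "desk"]))  # drill down to desk level

# What-if: how do fail counts move if volumes grow 25% and the fail rate holds?
baseline = drill_down(metrics, by=["desk"])
scenario = baseline["fail_rate"] * baseline["trade_count"] * 1.25
print("Projected fails at +25% volume:\n", scenario.round(1))
```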
Iterative Development
Data analytics solution development is highly iterative in nature. Most requirements change significantly after the first phase, when the business starts analyzing data and building deeper insights. Plan carefully so that functionality can be added or updated easily while limiting throwaway work.
Data Security
Processes need to be in place to limit indiscriminate use of data and to enforce role- and responsibility-based entitlements that control who has access to data, as well as what and how much they can see, in order to avoid any exposure of sensitive data.
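A minimal sketch of role-based entitlements (the role names and column lists are hypothetical) is a mapping from role to permitted columns, applied before any data reaches the analytics layer:

```python
import pandas as pd

# Hypothetical entitlement policy: which columns each role may see
ENTITLEMENTS = {
    "operations_analyst": ["trade_id", "trade_date", "desk", "notional"],
    "it_support": ["trade_id", "trade_date", "desk"],  # no financial details
}

def apply_entitlements(df: pd.DataFrame, role: str) -> pd.DataFrame:
    """Return only the columns the role is entitled to see; deny unknown roles."""
    allowed = ENTITLEMENTS.get(role)
    if allowed is None:
        raise PermissionError(f"Role '{role}' has no data entitlements")
    return df[[c for c in df.columns if c in allowed]]

trades = pd.DataFrame({
    "trade_id": ["T1", "T2"],
    "trade_date": ["2024-03-01", "2024-03-04"],
    "desk": ["Equities", "Rates"],
    "notional": [1_000_000, 250_000],
    "counterparty": ["Bank A", "Bank B"],  # visible to neither role above
})

print(apply_entitlements(trades, "it_support"))
```

In practice the policy would live in a central entitlement service rather than application code, but the principle of filtering at the boundary stays the same.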
Bottom Line
With the increasing importance of data quality and ongoing regulatory developments, the challenges financial services firms face are creating the need to maintain quality data in-house. Ultimately, sophisticated data analytics built on high-quality data can give deeper insights into new opportunities through a more comprehensive understanding of market trends, customer behavior, risk assessment, capital efficiency, and operational efficiency, as well as the ability to perform forward-looking analysis based on what-if scenarios. Data is truly a sizable and important asset, and having strong data analytics capabilities as a core element of business strategy will be key to the future success of financial services firms.
Vipul Parekh is a leader of Financial Services practices at Optimity Advisors.