Identify claims overpayments for a top 5 US healthcare payer

The Big Picture:

The US healthcare system loses more than $200 billion every year to fraud, waste, and abuse, nearly 10 percent of annual healthcare spending. The Government Accountability Office (GAO) has deemed Medicaid highly vulnerable to fraud, waste, and abuse. A leading multi-billion-dollar healthcare payer, with a growing government business supporting Medicare and Medicaid, wanted to identify claims overpayments and opportunities to better contain costs.

Transformative Solution:

Claims may be overpaid due to fraud, waste, or abuse by providers, pharmacies, members, or payers’ internal claims adjustors. With variations in member demographics, health conditions, condition severities, and treatment patterns, it is imperative to compare claims with other similar claims when looking for irregularities or aberrancies.

An unsupervised learning framework was developed to identify overpayments. For example, one cause of overpayment was the company paying for a service that was not covered for the patient. This ‘bogus billing’ was a driver of cost and was identified as a key focus area.

The framework has a four-step approach:

  • Define the causes of fraud, waste, and abuse, and identify the focus areas with high impact.
  • Develop hypotheses to define the key internal and external data elements to analyze; more than 50 hypotheses were identified.
  • Use multi-level unsupervised decision trees, clustering techniques, and business rules to create homogeneous segments across member, provider, and procedure details.
  • Evaluate millions of claims to identify patterns or thresholds of aberrant dosage and pricing, while controlling for clinically homogeneous segments (see the sketch after this list).
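
A minimal sketch of the segmentation-and-threshold step is shown below, assuming a pandas DataFrame of claim-level features. The column names, number of segments, and 99th-percentile cutoffs are illustrative assumptions, not the payer's actual data model; the production framework layered decision trees and business rules on top of this kind of clustering.

```python
# Sketch: cluster claims into clinically similar segments, then flag
# dosage/price outliers within each segment. Column names are illustrative.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

def flag_aberrant_claims(claims: pd.DataFrame, n_segments: int = 20,
                         quantile: float = 0.99) -> pd.DataFrame:
    features = ["member_age", "condition_severity", "units_billed", "unit_price"]
    X = StandardScaler().fit_transform(claims[features])

    # Step 1: create homogeneous segments across member/provider/procedure traits.
    claims = claims.copy()
    claims["segment"] = KMeans(n_clusters=n_segments, n_init=10,
                               random_state=0).fit_predict(X)

    # Step 2: flag claims whose billed units or price exceed the segment-level
    # threshold, i.e. aberrancy relative to clinically similar claims.
    thresholds = claims.groupby("segment")[["units_billed", "unit_price"]].quantile(quantile)
    joined = claims.join(thresholds, on="segment", rsuffix="_cutoff")
    claims["aberrant"] = (
        (joined["units_billed"] > joined["units_billed_cutoff"])
        | (joined["unit_price"] > joined["unit_price_cutoff"])
    )
    return claims
```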

The aberrancy identification framework flagged claims likely to be overpaid and generated unbiased leads for foragers and cost containment units (CCUs). An independent clinical validation showed identification of fraud, waste, or abuse with 60% accuracy, compared to 10%-20% accuracy with the existing process. The entire framework was automated for scalability and efficiency.

The framework drove efficiency improvements beyond flagging overpaid claims, including sharing reasons with foragers and prioritizing recoveries based on the expected cost of overpayment and the likelihood of recovery. The highest priority was given to low-complexity and less sensitive claims.

The process revealed that more than 75% of the identified overpayments were due to billing for higher dosages or more units than were actually provided, or for units that were not covered. Nearly 25% of the identified overpaid claims were due to systemic inconsistencies.

The Change:

As a result of the engagement, the payer was able to identify more than $45 million in recoverable overpaid claims in the first year by:

  • Developing business rules, leading to systemic changes to hold possible overpayments.
  • Identifying 20 times more claims with dosage and pricing aberrancies.
  • Optimizing the recovery process through the recommended prioritization framework.
Predict physician prescribing declines to protect flagship drug sales

The Big Picture:

A leading pharmaceutical company was facing a decline in the sales volume of its flagship drug, which usually accounts for over $5 billion of its annual sales. Fewer than 30 percent of physicians accounted for a loss of prescription sales worth approximately $400 million. As a result, the company wanted to predict which physicians were likely to show a continued decline in prescribing behavior so that it could engage them well in advance to improve sales.

Transformative Solution:

To address its revenue challenges, the company defined the drivers of prescription sales decline and identified which physicians were likely to reduce their prescribing of the drug. Physicians with a high probability of declining sales were prioritized by potential patient volume and segmented by market share to drive actionable decisions. From there, physicians were targeted through marketing and sales interactions to improve prescription behaviors.

To predict declines in prescription activity, machine learning models were developed using data on physicians targeted by the client’s sales force. Numerous hypotheses were created on characteristics that could impact prescription activity, which helped identify the right data for analysis. This included data from sales, calls, marketing, claims, and physician attributes. It helped the pharma company identify over 1,800 features from these data sources to analyze changes in physician activity over the months of the decline, such as the number of calls to a physician and the extended team, DTC TV impressions, DMARD sales, and the number of approved claims.

A range of traditional and advanced data science techniques, such as logistic regression and other machine-learning methods, was used to build the best model. The model identified the drivers of physician decline behavior, validating various hypotheses and revealing the impact of each factor on prescribing behavior.
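
As a hedged illustration of this modeling step, the sketch below fits a logistic regression on a physician-level feature table and ranks physicians by their predicted probability of decline. The feature and column names are hypothetical stand-ins for the 1,800+ engineered features described above.

```python
# Sketch: predict which physicians are likely to decline in prescribing.
# Feature names are hypothetical stand-ins for the engineered features.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def rank_physicians(df: pd.DataFrame) -> pd.DataFrame:
    features = ["calls_to_physician", "dtc_tv_impressions",
                "dmard_sales", "approved_claims"]
    X, y = df[features], df["declined"]  # declined: 1 if prescriptions fell

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=0)

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X_train, y_train)
    print("holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

    # Score every physician and rank by probability of decline for targeting.
    out = df[["physician_id"]].copy()
    out["p_decline"] = model.predict_proba(X)[:, 1]
    return out.sort_values("p_decline", ascending=False)
```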

The Change:

The solution identified over 1,300 physicians with a high probability of decline in prescribing behavior. The analysis revealed a $105 million annual revenue opportunity that could be achieved with the appropriate sales and marketing interventions for these physicians.

To make the most of the opportunity, personas were created based on the drivers for each physician segment. Also, physicians’ overall prescription volumes were overlaid to further prioritize the selection of the physicians for targeting.

The company received specific recommendations to achieve the desired impact, including increasing penetration during a call by connecting with multiple stakeholders, customizing the detailing message to focus on solving reimbursement issues, and ensuring that call detailing also involves effective co-pay card distribution.

Drive better strategic decisions with analytics for customer insights

The Big Picture:

A Fortune 50 technology company saw a compelling opportunity to use existing enterprise data to answer key strategic questions about its customers, products, geography, and sales channels. It recognized that areas of the business were underperforming but lacked the proper insights to make changes to address the performance issues.

The company wanted to better understand how business performance compared to plans, budgets, and the previous year; where it should smartly allocate marketing resources; and how it could identify areas of improvement to drive performance consistently across businesses, products, channels, customer groups, and marketing.

Transformative Solution:

To address the opportunity, the company built a centralized, consolidated analytical data mart and provided users with analytical dashboards for analysis and decision-making.

The solution integrated many sources of structured enterprise information, including data on customers, products, billing, campaigns, usage, and promotions, as well as unstructured sources such as net promoter scores (NPS), panel surveys, and competition research.

Through an ETL engine and a data hygiene and quality process, the company loaded the data into a centralized, consolidated analytic data mart using a sophisticated star-schema model. A suite of powerful, intuitive dashboards was placed on top of this consolidated data, providing views and analysis to various decision-makers, including self-service solutions using Tableau, analytical dashboards built on a “why-what-how” framework, and an executive dashboard providing key summaries and automated alerts. Using the solution, the company forecasted performance at granular levels, established alerts, and assessed risks to strategic goals and their drivers to mitigate, using advanced analytics.

The project was completed in roughly a year, starting with the conceptualization and planning phases, followed by data mart development and the rollout of the various dashboards and tools.

The Change:

The company realized several key benefits:

  • Gained ‘one source of truth’ resulting in an 80% reduction in the cost and complexities of decision-making
  • Reduced time for strategic “ad hoc” analyses from 13 days to 3 days
  • Identified true drivers of customer satisfaction to improve customer experiences
  • Established consistent KPIs and definitions across business groups
  • Provided standardized and proactive reporting back to senior executives across multiple teams, transforming time-to-market on decision-making

The Big Picture

Driving impact in the self-learning space is imperative, and while there has been a flurry of activities, initiatives, and products with high intentions and passion for change in this space, the envisioned impact has not been delivered. Self-learning mobile applications have revolutionized access, and while access to high-quality educational content is important, it is not enough for learning and behavior change, especially when behavior is self-motivated.

Transformative Solution

FinalMile’s extensive research uncovered that while users have access and a stated intention to use the application, this does not translate into action: usage, completion, and retention. This is known as the ‘intent-action gap’ in the behavioral science literature, and it highlights the need to shift focus toward behavioral-science-led design interventions that engineer engagement and narrow the gap between intent and action.

FinalMile developed a ‘Behavioral Science based Design Blueprint for Self-Learning’ to drive engagement across education and employability, geared towards influencing as many EdTech entrepreneurs as possible to build more engaging products, improve completion rates, and consequently deliver better outcomes and impact. The Blueprint is independent of content and leverages behavioral science to help design app environments that improve engagement by building commitment and motivation, driving thoughtful interactions, and personalizing learning to create a learning mindset.

The Blueprint has been implemented with two cohorts of Calibrator, an accelerator program run by Gray Matters Capital that focuses on driving products towards user engagement and retention. Seventeen companies, spanning K-12, teacher capacity building, vocational training, adult learning and skilling, SME skilling, content platforms, and self-help platforms across India, Kenya, and Vietnam, delivered measured changes in engagement.

The Change

The Blueprint has been applied to 17 companies across the EdTech domain and has demonstrated measurable impact. Implementing templates from the Blueprint has resulted in:

  • Doubling session length in a reading app for young children
  • 30% increase in active users and 10% improvement in course completion in a language learning app for blue-collar workers
  • 20% increase in revenue for a test preparation app.

The Big Picture

As a manufacturer of a patch-based dopamine agonist used to treat Parkinson’s disease, a major global pharmaceutical company was facing a steep attrition rate of 50% within two months of prescription uptake. The company had tried multiple interventions to mitigate attrition, but results were anemic. Recognizing the complex behavioral nature of the problem, it wanted a more effective understanding of the drivers of treatment behavior, and design concepts around which it could model a new, global on-boarding program.

Transformative Solution

At the outset, there were many questions and assumptions regarding the influencers of attrition. What role did the prescribing physician play? Did people understand how to use the patch properly? Were they having adhesion issues? Were they not properly rotating patch placement on a daily basis?

Almost all of the patient-centered behavioral assumptions could be summarized under the heading: “The patient is not using the patch properly. Therefore, they are not getting the clinical benefit. Therefore, they are abandoning the drug.”

In the medical community, this class of assumptions is commonly categorized as adherence issues, i.e., the patient is not adhering to the prescription. When the problem is defined as adherence, the subsequent orientation to behavior change is generally “outside-in”: you exhort the patient, institute social pressure, set up reminders, and threaten punishment and loss for failure to adhere.

After extensive qualitative primary research with patients, caregivers, and physicians, it became apparent that, from a patient-centric perspective, the issue was not proper use of the drug but simple preference. Was it perceived to be effective? Was it tolerable? Understanding preference in the context of Parkinson’s disease treatment is extraordinarily complex, because the disease-treatment context itself is complex.

The Change

Suffice it to say, the resultant reframe of the disease-treatment context, as well as the behavioral problem itself, was extraordinarily powerful in changing the thinking of the business.

A global on-boarding design brief was created that outlined:

  1. A modular approach to interventions, by channel
  2. A portfolio of design briefs, by objective, for each channel
  3. A prototype that reimagined their existing, in-market interventions based on the new thinking

Not only has the new perspective informed tangible interventions, but it has also changed the KPIs by which the company tracks success. While attrition is still an important metric, it is only a lagging indicator. Armed with a new behavioral understanding, the company is better equipped to track and measure leading indicators of preference and truly drive behavior change and treatment outcomes.

The Big Picture

CPG has been underperforming in e-commerce. While online sales are picking up, growth is skewed toward a few categories, with most CPG categories still lagging. What are the possible reasons for the low growth rate of CPG sales online, and how can behavioral-science-informed strategies influence shoppers and improve sales in the segment?

The objective was to identify the reasons for the low growth rate of CPG sales online and devise behavioral science informed strategies to influence shoppers.

Transformative Solution

The vast majority of prevalent research tools depend on respondents’ ability to introspect, deliberate, and consciously provide responses and analysis. Unlike these, Fractal’s research methodology is game-based, designed to go beyond “say-do” gaps: the difference between what consumers say and what they do.

The Conundrum Ethnolab research tool is a group exercise designed to eliminate participants’ personal filters and capture responses representative of the emotions, mental models, and biases underlying behavior. We deployed the Conundrum Ethnolab in New York and the Chicago suburbs among a group of 85 consumers to understand the true drivers of consumer decisions and the barriers to shopping online for CPG products.

The Change

As a result of the engagement, we were able to identify the following insights for clients to take action on:

  • Most CPG categories are bundled together as ‘grocery’ because of the long-standing association of buying them together with other perishable groceries. Consumers don’t differentiate between non-perishable CPG goods and fresh grocery because of this association. Consumers re-evaluate their shopping modes and choices only when their context changes. Until then this dominant ‘grocery mental model’ leads to consumers not even evaluating online channels for CPG purchases.
  • This process revealed that consumers are increasingly shifting to shop on mobile rather than laptops or larger devices. And this holds for CPG/grocery as well. Mobile-based shopping enables interstitial shopping – i.e. shopping between tasks. In this interstitial mode, consumers are far more satisficing than when they are shopping in a brick and mortar environment. This highly satisficing mindset makes shopping narrow and shallow – searching and adding one thing at a time (which gives very little chance for other categories – than the one being bought – to be considered) and making shallow comparisons on prices.
Effectively prioritize and deploy analytics initiatives

The Big Picture:

A leading financial services and insurance company had limited knowledge in cutting-edge insurance analytics. The company wanted to address two key challenges:

First, the company did not know how to prioritize its analytics initiatives. It had difficulty accurately estimating its own analytics readiness in terms of talent, infrastructure, and mind-set change. It also needed to accurately estimate the long-term business impact of analytics initiatives.

Second, the company was not sure if its organization was ready to deliver analytics models that would incorporate industry best practices and use the most advanced machine-learning techniques.

Transformative Solution:

To solve the client’s first challenge of estimating its analytics readiness and the long-term impact of analytics initiatives, Fractal helped the client develop an analytical roadmap based on quantified estimates of effort and business impact.

Fractal interviewed more than 100 business and data SMEs to exhaustively understand the client’s key challenges, as well as the client’s data and infrastructure readiness. Fractal built three roadmaps for the client’s teams: claims analytics, product management, and billing experience. A fourth roadmap, for experience management, started in January 2018.

Fractal helped address the client’s second challenge of ensuring the solutions (developed from the roadmap) were built following industry best practices and used the most advanced machine-learning techniques.

  • This challenge would be resolved by directly collaborating with Fractal in developing the most prioritized solutions. Two specific challenges were addressed from the roadmap:
    • How to identify key drivers of customer satisfaction and quantify the impact of the inter-relationship of these drivers?
    • How to accurately predict customer attrition and understand key actionable drivers?

To get there, the client needed to address a lack of internal expertise and solution benchmarks. It did not have a standard definition of satisfaction or attrition. So, before getting into the modelling aspect, Fractal helped the client define member satisfaction and attrition. This exercise was then followed by alignment on these definitions across the various business teams.

The data used in these projects ranged across claims, policy, billing, competitor premium, banking, marketing spend, employee performance, call center, customer demographics, social media, clickstream, and survey data, to name a few. This was the first time that all these types of data were analyzed together to generate models and insights for the client. The analytical data mart created by harmonizing the varied data sources became a reusable asset for the client.

Advanced machine-learning techniques were used to address the two prioritized solutions: A) identifying customer satisfaction drivers and B) accurately predicting attrition. For these use cases, Fractal applied techniques such as extreme gradient boosting (XGBoost) and Bayesian belief networks (BBNs) to identify the drivers of member satisfaction and attrition.

These techniques enabled the client to understand hidden data patterns and identify intrinsic and extrinsic drivers. This helped in comprehending satisfaction and attrition at a much deeper level. Understanding the drivers of satisfaction and attrition involved analyzing the relationships of many customer and company factors, such as digital preference, number of calls, settlement duration, and adjuster workload.
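
A minimal sketch of the attrition piece is shown below, assuming a member-level pandas DataFrame with an attrition flag; the column names are illustrative, not the client's data model, and the complementary Bayesian belief network work is not sketched here.

```python
# Sketch: gradient-boosted attrition model plus driver importances.
# Column names are illustrative, not the client's data model.
import pandas as pd
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

def attrition_drivers(members: pd.DataFrame) -> pd.Series:
    features = ["digital_preference", "num_calls", "settlement_duration",
                "adjuster_workload", "tenure_years"]
    X, y = members[features], members["attrited"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)

    model = xgb.XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
    model.fit(X_tr, y_tr)
    print("holdout AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))

    # Rank the intrinsic/extrinsic drivers by feature importance.
    return pd.Series(model.feature_importances_,
                     index=features).sort_values(ascending=False)
```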

The Change:

As a result of the engagement, the algorithms developed by Fractal are being used by the client in many other solutions, which has helped the client recognize the value of data collaboration.

The engagement also helped the client identify customer satisfaction drivers, which inspired the creation of a business intelligence solution to track these factors. Attrition model outputs are also being consumed by retention teams to act proactively for the first time.

Fractal has been part of a transformational story for the client’s move towards data-driven decision making, significantly accelerating speed to value by getting to insights and recommendations much faster than the client’s internal teams while using innovative approaches and industry best practices.


The Big Picture

Delinquency is not only an unfamiliar situation for credit consumers but also one that is most often accompanied by temporary or permanent financial difficulty. As a result, consumers display very little ability to cope with it. In this context, consumers show an ‘avoidance’ tendency, making themselves difficult to reach and making promises that they can’t keep. This leads to significant operations costs, write-offs, and frustration amongst service agents.

Transformative Solution

  1. The behavioral science principle of ‘Commitment’ was deployed in the agent-customer conversation script to strengthen a consumer’s intent to pay the dues once he or she has made a ‘promise to pay’.
  2. The behavioral science principles of ‘Loss Aversion’ and ‘Reciprocity’ were brought into play to ensure that consumers prioritize delinquency resolution over other financial commitments.
  3. The scripts were customized to different delinquent customer segments: fresh tickets, and 90-day and 180-day delinquent customers.

The solution was deployed across the bank’s multiple call centers throughout the country with minimal training requirement.

The Change

The scripting changes were intuitively appealing to the agents as they received a favorable response from customers. The subtleness of the solutions ensured that the company was able to scale the solution across multiple centers in the country. The key results of this deployment were:

  1. A 40% reduction in the number of connects with delinquent consumers before a resolution is achieved
  2. A 20% reduction in unresolved delinquencies
Understand the impact of media on customer web traffic and sales

The Big Picture

An online marketplace uses cross-media network models to understand how media drives web visits.

A regional division of a global online peer-to-peer marketplace wanted to understand the impact of media and trade spending on customer visits (traffic) on the website and what drove subsequent sales. The company wanted to quantify the non-linear impact of various marketing touchpoints in driving traffic and sales. The challenge was to capture the real market phenomenon in terms of customer journeys and subsequently identify the impact of each touchpoint in the journey.

The company was also looking for guidance on measuring the impact of ‘above the line (ATL)’ marketing after accounting for all the other factors affecting the business, guidance on collaboration with retailers and the promotion types to be offered, and quantification of competitors’ impact on its own sales.

Transformative Solution

A Bayesian Belief Network based approach was developed to model cross-media effects and understand complex network relationships between marketing channels. These models depicted the customer journey across all marketing touchpoints.

The team discovered the linkages between paid, owned, and earned media and how their interconnections impact website traffic and sales. It was learned that paid media like TV and display had a significant indirect effect through search engines, partner networks, and other online marketing activities. Decay and saturation levels for TV, radio, and magazine advertising were also evaluated, helping optimize the flighting strategy (i.e., a time-interval advertising strategy).
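
The full solution used a Bayesian belief network; as a simplified, hedged stand-in, the sketch below decomposes TV's effect on web visits into a direct piece and an indirect piece routed through paid search, using two linear regressions on hypothetical weekly data (column names are assumptions).

```python
# Simplified stand-in for the BBN: estimate TV's direct effect on web traffic
# and its indirect effect routed through paid search (mediation-style).
import pandas as pd
import statsmodels.api as sm

def tv_effect_decomposition(weekly: pd.DataFrame) -> dict:
    # weekly has hypothetical columns: tv_spend, search_clicks, visits.
    # Path a: TV drives search activity.
    a = sm.OLS(weekly["search_clicks"],
               sm.add_constant(weekly[["tv_spend"]])).fit()

    # Paths b and c': search and TV jointly drive site visits.
    b = sm.OLS(weekly["visits"],
               sm.add_constant(weekly[["tv_spend", "search_clicks"]])).fit()

    direct = b.params["tv_spend"]                                 # c' (direct effect)
    indirect = a.params["tv_spend"] * b.params["search_clicks"]   # a * b
    return {"direct": direct, "indirect": indirect, "total": direct + indirect}
```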

The entire modeling process was automated to enable standardization across business lines and regions as well as to operationalize at scale.

The Change

With the understanding and quantification of interactions among various paid, earned, and owned media, and their overall impact on driving traffic and sales, the budget allocation was optimized (with the same overall budget), potentially leading to an additional $23MM in annual customer traffic and an incremental $8MM in annual revenue. The automation of the measurement solution also reduced the overall time to measure ad effectiveness by more than 80%.

Build effective communication and product strategies

The Big Picture

A leading brand in the men’s grooming category already had over 80% volume share in all key markets. Category penetration was close to 100%, so the only feasible way to grow the business was to move consumers to higher-tier products. The business wanted to identify the right communication and positioning strategies for high-tier products to attract the consumers of low-tier products. The company also wanted to identify opportunities for a new product introduction to trade up low-tier consumers.

Transformative Solution

The solution was designed to identify the key brand drivers for different consumer segments, and the key product attributes to offer in new products or to emphasize in the communication of existing products in order to trade up low-tier consumers.

Equity drivers analysis was conducted with the help of network models on top of respondent-level equity tracking data to understand how perceptions are built in the minds of consumers. The solution identified what among various product attributes and equity measures drove the overall brand rating. The approach quantified the impact of each of those attributes on the overall rating and uncovered key equity paths that could be adopted in design of global and local campaign messaging and product positioning. This helped discover what consumers look for from the category that drives their perception and choice of brand or product across tiers.

The team established the key equity paths for the category using structural equation models, identified the key drivers overall and for each consumer segment (based on current product usage), and highlighted the differing needs across segments.
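
The work used structural equation models; as a rough, hedged approximation, one equity path can be sketched as chained regressions ("attribute ratings build an equity perception, which drives the overall brand rating"). The survey columns below are hypothetical, not the tracker's actual measures.

```python
# Rough approximation of an equity path via chained regressions
# (the actual analysis used structural equation models).
# Hypothetical respondent-level columns from an equity tracker.
import pandas as pd
import statsmodels.formula.api as smf

def equity_path(survey: pd.DataFrame) -> dict:
    # Path 1: product attributes build the 'performance' perception.
    perception = smf.ols(
        "perceived_performance ~ closeness_of_shave + skin_comfort + price_value",
        data=survey).fit()

    # Path 2: the perception (plus price value) drives the overall brand rating.
    rating = smf.ols(
        "overall_brand_rating ~ perceived_performance + price_value",
        data=survey).fit()

    # Indirect contribution of an attribute = its effect on the perception
    # times the perception's effect on the overall rating.
    indirect_shave = (perception.params["closeness_of_shave"]
                      * rating.params["perceived_performance"])
    return {"perception_model": perception.params.to_dict(),
            "rating_model": rating.params.to_dict(),
            "indirect_effect_closeness_of_shave": indirect_shave}
```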

The Change

Based on the key drivers that emerged for each segment, Fractal identified the winning product proposition for the market and developed strategies to trade up low-tier consumers towards high-tier products and thereby increase revenue for the business. A meta-analysis of studies across different markets was conducted to compare key drivers and investigate commonalities and differences.

Comparison of key brand-drivers across markets revealed opportunities for global campaigns and product innovation, along with market-specific customization needed in communication strategies based on distinct drivers.

Run systematic remodeling experiments to optimize retail store sales

The Big Picture

A store remodeling exercise is a significant investment, sometimes running into millions of dollars. Additionally, store operations sometimes need to be put on hold for a few days, which further impacts store sales and revenue. A top 10 specialty retailer wanted to remodel 27 stores, taking a measured approach based on incremental benefits. However, measuring the impact of remodeling was subject to misinterpretation due to factors such as seasonality and the difficulty of identifying a control group.

Transformative Solution

The retailer used Fractal’s Trial Run product to run systematic experiments on individual store enhancements. Each of the 27 stores had different test levers. Trial Run’s proprietary algorithms were applied to simulate a control group for each of the tests, after first assembling granular data for all the stores in the US. Statistical techniques were applied to quantify significant lifts in sales, with an interactive visual capability to diagnose the drivers of incremental sales.
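
A hedged sketch of the core idea is shown below: match each remodeled store to its most similar non-test stores on pre-period weekly sales, then test the post-period lift. Trial Run's actual algorithms are proprietary and more sophisticated than this nearest-neighbor stand-in; the matrices and k=10 choice are illustrative.

```python
# Simplified stand-in for a simulated control group: match each test store to
# the non-test stores with the most similar pre-period weekly sales, then
# compare post-period growth. (Trial Run's actual algorithms are proprietary.)
import numpy as np
from scipy import stats

def remodel_lift(pre: np.ndarray, post: np.ndarray, test_idx: list, k: int = 10):
    """pre/post: stores x weeks sales matrices; test_idx: remodeled store rows."""
    control_pool = [i for i in range(pre.shape[0]) if i not in set(test_idx)]
    growth = post.mean(axis=1) / pre.mean(axis=1)  # per-store post/pre ratio

    test_growth, control_growth = [], []
    for t in test_idx:
        # Nearest control stores by Euclidean distance on pre-period sales.
        dists = np.linalg.norm(pre[control_pool] - pre[t], axis=1)
        nearest = [control_pool[j] for j in np.argsort(dists)[:k]]
        test_growth.append(growth[t])
        control_growth.append(growth[nearest].mean())

    lift = np.mean(test_growth) - np.mean(control_growth)
    t_stat, p_value = stats.ttest_rel(test_growth, control_growth)
    return lift, p_value
```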

The Change

As a result of the engagement, 22 of the company’s stores generated a positive lift, with a few experiencing a lift of more than 15% in sales. In addition, break-even for large stores (i.e., those with sales greater than $10 million) was expected within 2.5 years, whereas break-even for small stores (i.e., those with sales less than $5 million) was expected in 5-7 years. Based on the results, the retailer decided to prioritize remodeling for large stores.

Use AI to enhance customer experience and drive digital sales

The Big Picture

For most legacy brick-and-clicks that sell mid- to high-complexity products, conversions on digital continue to remain sluggish, and businesses face a high degree of customer drop-off, cart abandonment, and confused and irrelevant customer conversations. The impediments to digital sales can be unique to each visitor, such as insufficient product information, in-store pick-up issues, improper search results, price shock, poor product recommendations, and more.

Since most large businesses cannot recreate the digital experience from scratch, they need to adopt a rapid mode of continuous improvement. That means identifying what went wrong in the sales experience, what were the causes, and how the various improvement options can be tested.

Transformative Solution

For a leading retailer, Fractal deployed an advanced AI-based framework to create features from every digital interaction down to minute click-events and identified high-impact causative issues that negatively impacted the customer sales journey.

The Change

As a result of the engagement, the retailer gained more than a 25% increase in digital sales by removing the impediments in the customer purchase journey.

Understand and predict key drivers of customer satisfaction

The Big Picture

One of the world’s largest tech companies was looking to better understand the key drivers of its customer and partner experience program. The company was already tracking customer and partner satisfaction through equity trackers across segments and markets. It wanted to understand what drove satisfaction for each segment and to be able to predict the satisfaction score if it invested in a specific driver.

Transformative Solution

Several key considerations went into designing the solution. The first step was to identify the connection between various product and service features as well as equity measures to increase customer satisfaction. Several factors, such as quality of experience, service quality, and value, rated highly in terms of driving customer satisfaction.

Next was determining the optimal path that could be leveraged to drive satisfaction in a segment and quantify the importance of each factor. Then, the approach highlighted the brand’s strengths and areas of opportunity across segments and markets. This meant plotting the importance of various factors against the performance of a technology industry leader, and then identifying the “sweet spot” areas and improvement opportunities. The findings showed that ‘Customer Experience’ and ‘Understanding Customer Needs’ were well received, while ‘Innovation’ was rated much lower, offering the client an area of improvement to address.

The solution also helped identify commonalities versus differences across segments. This involved plotting the contribution of drivers, such as quality of products and ease of doing business, across segments and markets. It also involved identifying the drivers where common interest lay, or where a disparity stood out, across segments and markets.

A dashboard was created to simulate the impact of change in the satisfaction rating of a given driver through different elements in the equity map. The simulator used a nodal tree to then mimic the likely impact on overall satisfaction based on changes in the rating of different factors.

The Change

As a result, the client gained a globally live analysis that was capable of generating new viewpoints into trackers and satisfaction metrics.

Determine the contributions of media and the drivers of sales

The Big Picture

A leading technology company was looking to break down the sales of its gaming product into contributions from marketing and promotional activity and those coming from other base activity. It also wanted to identify the drivers of sales in general and to understand its marketing mechanics. To optimize future planning, it was also looking for marketing recommendations.

Transformative Solution

Two primary areas of focus were involved in the solution. The first was the company’s marketing mix modeling (MMM), used to evaluate the performance of various marketing levers and identify their contribution to sales over the years. A Bayesian belief network (BBN) was used to create a network of media variables and redistribute the media contribution. The data used was sourced from POS, the promotional calendar, and paid media, along with earned and owned media. The second focus area was to assess the ROI of the commercial programs currently in place.

The solution approach revealed that 35.7% of media volume contribution was driven directly by TV, 25% of total media contribution came from social media, and 54.1% of social media contribution was driven by PR.

From all of this, an optimization engine was built over the MMM results in order to run different spend-based “what if” analyses. The approach took as inputs the baseline sales (in the absence of TV GRPs) across a series of weeks and the total budget for TV ads (or the total pool of GRPs available). The optimization was framed in terms of the resources available, the constraints, and the main objective. The output then determined the optimized allocation of GRPs across the selected weeks.
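
A hedged sketch of such a "what if" optimizer is shown below: it allocates a fixed pool of GRPs across weeks to maximize incremental volume under a diminishing-returns response curve. The log response shape and the weekly effectiveness numbers are illustrative assumptions, not the client's MMM coefficients.

```python
# Sketch: allocate a fixed pool of TV GRPs across weeks to maximize
# incremental volume under a diminishing-returns response curve.
# Response parameters are illustrative, not the client's MMM coefficients.
import numpy as np
from scipy.optimize import minimize

def allocate_grps(weekly_effectiveness: np.ndarray, total_grps: float) -> np.ndarray:
    n_weeks = len(weekly_effectiveness)

    def neg_incremental_volume(grps):
        # log(1 + GRPs) gives saturating, diminishing returns per week.
        return -np.sum(weekly_effectiveness * np.log1p(grps))

    result = minimize(
        neg_incremental_volume,
        x0=np.full(n_weeks, total_grps / n_weeks),            # start from an even split
        bounds=[(0, None)] * n_weeks,                          # no negative GRPs
        constraints=[{"type": "eq",
                      "fun": lambda g: g.sum() - total_grps}], # spend the full pool
        method="SLSQP",
    )
    return result.x

# Example: weeks with stronger seasonal response receive more GRPs.
alloc = allocate_grps(np.array([1.0, 1.4, 2.0, 1.2]), total_grps=400)
print(np.round(alloc, 1))
```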

The Change

As a result, the optimization engine generated an additional 7% volume from the same spend through superior allocation. Using the BBN, TV flighting was optimized, increasing sales of the product by 5.6%, and it was determined that prioritizing TV support within key seasonal periods increased effectiveness. Digital laydown was another significant variable per the BBN. Optimizing it helped boost sales of the product by 2.5%, and prioritizing continuity over short, high-impression bursts drove higher sales.

Use automation to scale up services and manage growing demand

The Big Picture

A consulting company, providing services to card issuers, acquirers, and retailers across the globe, wanted to systematically scale up its services to manage fast growing demand. Its approach of delivering bespoke consulting engagements on an individual customer basis involved heavy manual data processing, limiting its reach to service and engage all players in the market.

Transformative Solution

To solve this problem, it was necessary to build a scalable solution out of its standard services by bringing data, consulting, domain expertise, and visualization all together. This would allow for the creation of a visually-engaging delivery platform for consultants across the globe with predetermined recommendations ready to offer their clients.

A solution suite of scaled data products was developed to service end-clients based on the consulting company’s in-house data and technologies. It was designed with scale and a wide range of global clients in mind, and leveraged process automation to reduce the dependencies on manual effort for refreshes.

Each solution in the suite was developed in a phased manner, refining the product features and outputs at each stage. Phases included design and development, go-to-market, enhancement with additional features, and operationalization. The solution enabled the client to scale up and engage with multiple markets and end-clients in a short time-frame, through data products based on an in-house data repository.

The Change

As a result, the company saw multiple benefits:

  • Greater data-driven, standardized insights and recommendations to end-clients across different engagements.
  • Powerful visualizations, helping clients understand and dig deeper into their own data to uncover new trends and insights.
  • Faster go-to-market times (60%-80% time savings across different solutions).
  • 100% adoption by the consultants to deliver all solutions to more clients, therefore increasing their coverage within the market.
  • Scalable approach to quickly expand the company’s product portfolio.
  • Design thinking and visual storytelling were leveraged to create engagement for consultants during delivery.
  • Automation of manual tasks wherever possible.
Manage data size and complexity with a single data delivery platform

The Big Picture

A leading insurer was faced with a data delivery challenge. It had an internal system for collecting policy quotes data across various lines of business. Although the data was very rich for analytical purposes, it was not usable due to its sheer size (>67 TB compressed) and complexity (>12k nested fields in XML).

The company needed to set-up a single data delivery platform that would enable different analytical and product user groups to query the data in an automated way. Another challenge the company faced was how to handle the various, disparate extractors that different teams across the organization had built, which could not handle the scale of the data and required additional man hours.

There were many other challenges: inconsistent business rules that conflicted with the new data strategy, less granular data due to normalization, difficulty leveraging all of the historical data for insight generation, manual intervention required for data generation, the need for a robust governance mechanism, and the absence of an existing production-grade system that could be extended.

Transformative Solution

Addressing the company’s challenges meant solving three key problems:

  • Data ingestion and management: Ingesting and collecting big data (~68 TB) and storing it on Hadoop.
  • Data harmonization: De-personalizing sensitive (PII) fields and partitioning the data in a distributed environment. Optimizing the stored data using Avro and Parquet containers.
  • Data extraction: Creating a robust Flask-based UI to query the data in an automated manner and generate analytics-ready data easily, using MapReduce and Spark.

A single data delivery platform was created to streamline the process of data extraction while consuming the data in its purest form (XML). The platform had three main components:

  • Partitioning the data in a Hadoop environment on the basis of certain parameters. The parameters were identified upon thorough assessment of business requirements.
  • Building a custom user interface (UI) to facilitate requests for data extraction based on the identified parameters, and to rectangularize the nested raw data into CSV files, easing the analytical consumption of the data (a sketch of this step follows the list).
  • End-to-end integration of the platform with the existing framework, further enhancing and improving easy scale up and operationalization.
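
As a hedged illustration of the rectangularize-to-CSV step, a PySpark job might flatten the nested quote records roughly as follows. The paths, field names, nesting, and line-of-business filter are hypothetical, not the insurer's actual schema.

```python
# Sketch: flatten nested quote records into an analytics-ready CSV extract.
# Paths and field names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("quote-extract").getOrCreate()

# Raw quotes, already ingested and stored as partitioned Parquet on Hadoop.
quotes = spark.read.parquet("/datalake/quotes_raw")

flat = (
    quotes
    .withColumn("coverage", F.explode("quote.coverages"))   # un-nest repeated nodes
    .select(
        F.col("quote.quote_id").alias("quote_id"),
        F.col("quote.line_of_business").alias("line_of_business"),
        F.col("quote.effective_date").alias("effective_date"),
        F.col("coverage.code").alias("coverage_code"),
        F.col("coverage.premium").alias("coverage_premium"),
    )
    .where(F.col("line_of_business") == "HOME")              # UI-selected parameter
)

flat.write.mode("overwrite").option("header", True).csv("/exports/home_quotes")
```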

The developed platform acted as a single source of truth for any data-related needs and was scalable to the enterprise (usable across multiple business units, functions, and teams). The platform was developed with the business rules in mind and in alignment with the data strategy. Highly granular data was made available through a direct interface with the raw layer, and data points were available at the lowest degree of granularity. Optimized storage and pre-processing made the historical data available for extraction. A fully automated platform eliminated manual intervention and enabled on-demand data extraction. A robust governance and security mechanism was built in, using security groups and Kerberos to limit data access. The platform and application were developed to industry and enterprise standards and were flexible for new enhancements.

The Change

As a result of this process, the company received a first-of-its-kind big data platform for data hosting, MDM and custom data access, and business intelligence handling ~1TB of data. This provided a single source of information to different analytical teams, offered support for different filtering requirements, and delivered a streamlined process by abandoning the existing multi-layered architecture.

Track customer journeys across insurance functions

The Big Picture

A leading insurer wanted to track customer journeys across insurance functions and also optimize contact center operations. The company’s existing data was relational in nature, and even though it captured the customer history, a standard data structure (table) would not be able to capture the history in its entirety. The data the company housed was too big to be transformed or extracted using traditional tools.

More specifically, the company needed a flexible and robust ETL mechanism to convert relational parent attributes into a key-value-based data structure that tracks consumer lifecycle journeys. Manual intervention was required for data generation, and different pipelines needed to be created to merge data, which was becoming a tedious process. Similar functionality had to be replicated across different data pipelines, resulting in additional effort and no standardization. A better security and governance mechanism was needed. Also, data was manually validated and loaded into Hive tables.

Transformative Solution

Solving the company’s data challenges meant addressing three key focus areas:

  • Developing a raw layer: The approach scheduled data ingestion from different structured and semi-structured sources using Airflow, an open source scheduler.
  • Developing an integration layer: The approach merged raw data in Spark to create a key value-based data structure that contained the entire historical information, with scalability to add more data points. Parquet and Avro-based compression were used for optimized storage.
  • Developing the blended layer: The approach used Airflow-driven, use-case data generation, which was scheduled and free from manual intervention.

A data lake was proposed, developed, and implemented that would act as a single source of truth for any analytical data-related needs in the client vertical. A multi-layered architecture was developed and implemented, leveraging Airflow for automation and scheduling. The raw layer served as the holding area for the historical data and any new data that was ingested. The ingestion and validation were then automated and scheduled using Airflow.

The integration layer served as the merging area for performing transformations, as well as converting data to a merged key-value format, containing end-to-end consumer history from different parent attribute tables. The blended layer contained data for various use cases and was scheduled using Airflow. The merged data was stored in a compressed format, allowing the layered architecture to make incremental additions to the data lake. The system was governed and secure, using Unix and Kerberos-based access to the data.
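
A minimal Airflow sketch of the three-layer schedule is shown below. The task bodies are placeholders, and the DAG id and daily schedule are assumptions rather than the insurer's actual configuration.

```python
# Sketch: Airflow DAG chaining the raw, integration, and blended layers.
# Task bodies are placeholders; the DAG id and schedule are assumptions.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest_raw_layer():
    """Land structured and semi-structured sources into the raw layer."""

def build_integration_layer():
    """Merge raw data in Spark into the key-value customer-history structure."""

def build_blended_layer():
    """Generate use-case-specific views from the integration layer."""

with DAG(
    dag_id="customer_journey_data_lake",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    raw = PythonOperator(task_id="ingest_raw_layer",
                         python_callable=ingest_raw_layer)
    integrate = PythonOperator(task_id="build_integration_layer",
                               python_callable=build_integration_layer)
    blend = PythonOperator(task_id="build_blended_layer",
                           python_callable=build_blended_layer)

    raw >> integrate >> blend   # ingestion, validation, and loads run in sequence
```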

The Change

As a result of the engagement, the company attained:

  • A layered architecture: This was scalable, robust, and secure, enabling standardized development.
  • Modular reusable components: These provided plug-and-play components for data ingestion, validation, and loading. They were developed in a standardized manner across different data teams, saving time and effort.
  • Workflow manager-driven automation and scheduling: This enabled a process with no manual overhead. Data generation could be scheduled as well as triggered manually from a user interface.
  • Automated ingestion, validation, and data loads: Using Airflow resulted in faster, parallel, and error-free execution.

The data lake, by means of leveraging a key value-based data structure, allowed for transforming data to track customer histories and journeys across functions.

Gain early alerts and insights if metrics deviate from business goals

The Big Picture

A major P&C insurer wanted to develop an early alert system for tracking deviations from its goals and benchmarks in its business portfolio. This involved developing a guided analytics dashboard to track the KPI ‘Items in Force’, its metrics, and underlying drivers. The core parameters included: early identification of ‘where’ results were deviating from key business goals; answering ‘why’ results were deviating from goals; and providing bandwidth for leadership to focus more on improving the future of the business.

The client had many challenges with its existing process. Multiple, isolated reports were being produced, and disparate data sources were resulting in delayed agreement on direction. A slice-and-dice capability was not available to understand why results were deviating from goals, and there was no mechanism to alert users about deteriorating performance. There was also a long gap between data becoming available and being ready for consumption, resulting in delayed decision making.

Transformative Solution

An agile and iterative approach was taken to accelerate requirements gathering, design, and development. The first step was to understand existing needs, challenges, goals, and objectives, then identify the key measures and performance indicators that align with the needs. Next was the design and development of the data layer, along with development of an interactive visual dashboard, and then implementation of automation and governance processes.

The solution offered centralized reporting with a highly performant, web-based interactive solution to slice and dice data for decisions. It consolidated data sources, resulting in the well-thought-out strategic metrics the client was looking for to enable effective and efficient decision making. Pre-defined storyboards were created to easily identify the exact problem area and its drivers, with a self-alerting mechanism to highlight deteriorating areas of the business. A fully automated solution was put in place to process data and deliver it on time for quick decision making. Seasonality-adjusted benchmarks produced accurate business trends.

The Change

As a result, the client now had a way of identifying drivers impacting the business portfolio early on. It had a centralized product for users across multiple levels, with a significant reduction in cycle time for solution availability to business stakeholders by way of automation. The platform also delivered accurate forecasting for better business planning.

Improve customer claims experiences using advanced analytics

The Big Picture

A leading P&C insurer wanted to build a market-leading analytical foundation. This would help it rapidly and profitably capture whitespace growth opportunity and identify ‘micro-drivers’ of customer satisfaction, profitability, cost, and ROI. The foundation was also needed to enable enhanced ongoing prediction, monitoring, and improvement in performance of claims settlement and the customer experience.

Transformative Solution

To solve the insurer’s challenges, more than 30 interviews were quickly conducted to understand the existing processes, challenges, and aspirations of the organization, as well as to assess the current state and formulate the desired end state (including data and analytics maturity level). The solution identified focus areas around financials, customers, and employees. In addition, more than 50 use cases were identified and prioritized, revealing an opportunity of $100M+ in revenue across use cases.

The roadmap development approach consisted of

  • knowing the organization’s pain points, vision, and goals,
  • identifying the use cases by breaking down the problems using issue trees,
  • understanding the status of current processes, activities, and data accessibility, and
  • prioritizing use cases based on impact, need, and readiness.

The Change

The solution revealed a three-year impact of more than $100M that could be delivered through prioritized use cases drawn from the more than 50 analytical opportunities identified. The company identified a $25M-$35M incremental EBITDA opportunity. Nearly 30 new analytics opportunities were identified to reduce costs, improve efficiencies, and enhance the customer experience.

More than 25 data sources were identified for building models around claims severity prediction and fraud detection. Robust models, leveraging 2k+ variables coupled with human evaluation, drove 10X more correct predictions and 10% fewer inaccurate predictions.

Match individuals across data sources without a unique key

The Big Picture

A leading credit bureau provided analytics and intelligence support to local credit rating agencies. The company had access to multiple data sources, such as bank data, voter IDs, and tax returns, from which it pulled information on individuals to create their credit scores.

However, it was not possible to use information from all these data sources because there was no single unique key. For example, in a bank data set, an individual would have a driver’s license number, but in a voter ID data set, the same person would be identified by a voter ID number. The company had no way of knowing that the individual was the same person in both data sets. Therefore, it needed logic to match these different data sets without a unique key.

This meant that the company needed an improved framework for working with unstructured address data so that it could build a single view of customers across different data sources. It needed to drive measurable performance improvements by improving match accuracy and reducing false positives.

Transformative Solution

To solve the company’s challenges, a solution was deployed to match datasets using names and addresses, since this information was present in all data sources. Because the format of names and addresses differed across sources, the solution needed intelligent, fuzzy logic to standardize names and addresses for mapping purposes.

The approach took raw data and deployed a name and address matching algorithm that was configurable at different levels. The solution incorporated a search capability along with optimization and improvement of matching. Three key steps were:

  • Data standardization: Data was cleaned and normalized to remove components that did not add value to addresses. Addresses were segregated into logical components: house number, locality information, and PIN code.
  • Address search: The approach searched for the request address in the candidate data using the PIN code (and its derivatives) as a key.
  • Name and address matching: This step used Fractal’s dCrypt to match all the addresses in a key-value pair with the request address and selected the top 100. For those top 100 addresses, the corresponding names were also matched, and the best output was generated on the basis of the name and address matching scores.

The final output provided a list of names and addresses from the candidate data that matched the names and addresses in the reference data. Using the algorithm, a matching score was generated between two strings, which could be compared with the base matching score already present in the client’s sample file. Randomly selected samples were manually checked, and a confusion matrix was created for both algorithms.
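
A hedged, standard-library-only sketch of the standardize-block-score flow is shown below. Fractal's dCrypt and the production scoring logic are not public, so this uses difflib similarity and an assumed 60/40 address/name weighting purely for illustration.

```python
# Sketch: standardize, block by PIN code, then fuzzy-score name + address.
# dCrypt and the production weights are not public; difflib and the 60/40
# weighting here are assumptions for illustration.
import re
from difflib import SequenceMatcher

def standardize(text: str) -> str:
    text = re.sub(r"[^a-z0-9 ]", " ", text.lower())   # drop punctuation/case
    return re.sub(r"\s+", " ", text).strip()          # collapse whitespace

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, standardize(a), standardize(b)).ratio()

def best_matches(request, candidates, top_n=100):
    """request: dict with name/address/pincode; candidates: list of such dicts."""
    # Blocking step: only compare records sharing the request's PIN code.
    block = [c for c in candidates if c["pincode"] == request["pincode"]]

    # Address search: keep the closest addresses, then add the name score.
    scored = sorted(block,
                    key=lambda c: similarity(request["address"], c["address"]),
                    reverse=True)[:top_n]
    results = []
    for c in scored:
        score = (0.6 * similarity(request["address"], c["address"])
                 + 0.4 * similarity(request["name"], c["name"]))
        results.append((c, round(score, 3)))
    return sorted(results, key=lambda r: r[1], reverse=True)
```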

The Change

As a result of the engagement, the company achieved several benefits:

  • Improvement in accuracy by 10% on 11 million household addresses.
  • Incorporation of a search capability in the matching algorithm.
  • Three different algorithms were used for matching as opposed to a single algorithm for name and address matching, which led to better coverage and efficiency.
Use data to identify ideal premiums and non-renewable policies

The Big Picture

A major home insurer sought to grow its policy book by selectively targeting potentially profitable business. The company wanted to predict non-catastrophic losses per household per year using home attributes. The company sought to determine profitability by estimating unrestricted and ideal premiums. In addition, it wanted to identify policies to be non-renewed based on profitability and catastrophic exposure.

Transformative Solution

To determine the profitability of policies, the approach combined data across claims, policy, and premium to create the modeling data for in-force policies. It then identified the key drivers of non-catastrophic losses for homeowners insurance. Advanced machine-learning techniques were applied to predict expected losses using publicly available home attribute data. The approach created ensemble models to reduce variation from individual techniques and compared the machine-learning results with the existing techniques.

Building models by peril captured different relationships between rating variables and losses for different perils. Different machine-learning techniques generated different insights, enhancing the predictive power.
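
A hedged sketch of the ensemble idea is shown below: two different learners predict expected non-catastrophic loss per household from home attributes, and their predictions are averaged to damp technique-specific variation. The feature names and the 50/50 blend weight are illustrative assumptions.

```python
# Sketch: ensemble of two regressors predicting expected non-catastrophic
# loss per household from home attributes. Feature names are illustrative.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

def expected_loss_model(homes: pd.DataFrame):
    features = ["home_age", "square_footage", "roof_age",
                "construction_type_code", "distance_to_hydrant"]
    X, y = homes[features], homes["annual_noncat_loss"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    gbm = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
    rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)

    # Ensemble: average the two techniques to reduce individual-model variation.
    blend = 0.5 * gbm.predict(X_te) + 0.5 * rf.predict(X_te)
    print("holdout MAE:", mean_absolute_error(y_te, blend))
    return gbm, rf
```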

The recommended model helped the client answer ‘who’, ‘how’, and ‘when’ to target. It helped identify renewable and non-renewable policies and predicted losses, clarifying the significant drivers that separate the lowest losses from the highest.

The Change

The resulting unrestricted premium model was significantly better than the client’s previous model and the model of one of the top insurers. The model enabled a 19% reduction in loss ratio from non-renewal of the worst 10% of policies. The unrestricted pure premium model picked roughly 50% of the premium, and the loss ratio for the policies it picked was only 34%, versus 57% with the company’s previous model.

Improve claims anomaly identification and tracking

The Big Picture

A top 5 US payer wanted to improve its ability to identify and track claims anomalies. Its existing process was business-rules-driven, largely manual, applied only in a post-pay scenario, and lacked a visual solution to track improvements and business impact. As a result, a few of the most common anomalies were being only partially addressed, and unknown anomalies were not being identified.

The ‘partially addressed’ common anomalies included: billing for services not provided, billing for non-covered services or items, improper billing practices, and billing for unnecessary services and items. Many uncommon anomalous patterns, such as high-cost out-of-network claims without prior authorization and ambulatory claims without ER visits, were being missed. As a result, on average, only a $1-2M annual opportunity was identified for recovery or better utilization management in previous years.

Transformative Solution

To solve the organization’s challenges, the solution addressed the situation along three key components:

  • Better problem solving: The approach considered both aspects of anomaly detection, known anomalies and unknown anomalies. Additionally, several hypotheses were generated to prioritize claims for review by the anomaly detection huddle group.
  • Sophistication: The solution applied predictive analytics, AI, and unsupervised methods to drive detection effectiveness.
  • Accelerated consumption: An interactive tool was developed to identify new patterns, track anomalies and recoveries, and monitor and evaluate the impact of interventions. Data refreshes and metrics generation were also automated.

The solution primarily focused on identifying unknown anomalies, starting with business rules-based prioritization of claims for review. Many hypotheses were created to identify a set of business rules for claims prioritization, such as high cost thresholds, ambulatory claims with certain procedures, behavioral health related claims, outpatient surgeries, and more.

The solution was further complemented by applying AI and predictive analytics for even better prioritization. Several algorithms were used in the analysis of claims characteristics, Rx characteristics, member lifestyle and demographics, provider characteristics, and external data.
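
As a hedged sketch of the unknown-anomaly piece, an unsupervised detector such as an isolation forest can rank claims for the huddle group's review. The feature columns and review capacity below are illustrative assumptions, not the payer's actual data model.

```python
# Sketch: unsupervised prioritization of claims for anomaly review.
# Feature columns are illustrative, not the payer's actual data model.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import IsolationForest

def prioritize_claims(claims: pd.DataFrame, review_capacity: int = 500) -> pd.DataFrame:
    features = ["billed_amount", "units", "out_of_network_flag",
                "prior_auth_flag", "er_visit_flag", "provider_claim_volume"]
    X = StandardScaler().fit_transform(claims[features])

    # Isolation forest scores how easily a claim is isolated from the rest;
    # the most isolated claims become the review candidates.
    detector = IsolationForest(n_estimators=200, contamination="auto", random_state=0)
    detector.fit(X)
    claims = claims.copy()
    claims["anomaly_score"] = -detector.score_samples(X)   # higher = more anomalous

    return claims.sort_values("anomaly_score", ascending=False).head(review_capacity)
```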

The Change

The solution led to superior outcomes compared to the previous year, mainly through the identification of new and unknown anomalies in 2017. These included:

  • Overpaid dollar recovery, e.g., high cost out-of-network claims without prior authorization
  • Better utilization management, e.g., members with consecutive day ER visits
  • Network optimization, e.g., in-network facilities transferring patients to out-of-network facilities

Identification of new and unknown anomalies led to a higher business impact in savings opportunities. The new business rules and POC predictive model identified 4X more opportunity in the subsequent year. These additional savings helped the business self-fund a suite of advanced analytics and AI-based solutions proposed for realizing future benefits.

Leverage external data to improve pricing and underwriting decisions

The Big Picture

A leading health insurer believed it could better predict claims experience using external data as a supplement to internal claims data. The organization wanted to leverage the predicted claims experience to improve pricing for new business by considering external data in addition to internal factors such as age, gender and region.

Transformative Solution

As an initial objective, a rating modifier was to be built leveraging the expected claims experience informed by external data.

Two key considerations for designing the solution were to:

  • validate the business case under consideration through external studies before investing in the use case, and
  • achieve a meaningful lift over and above the claims experience informed by age, gender and location factors.

With these considerations in mind, the approach began with a discovery phase to ascertain the validity of the use case. Upon completion of discovery, several external variables that were predictive of claims experience were identified and mapped to the client’s external data ecosystem for further validation. These included factors like buying behavior, socio-economic and financial indicators, and health interests.

Post validation of the use case in discovery, several hypotheses were identified to create a set of 200+ features from external data to validate factors that could potentially predict claims experience. The variables were created to represent various behavioral aspects and characteristics of members, using external lifestyle census data, zip data, and account level variables.

The approach explored multiple traditional and advanced machine-learning techniques to predict claims experience (loss) at the member level. The model that was built identified meaningful drivers of a member’s claims experience. Several key insights were uncovered; for example, individuals with prime-time television usage were associated with lower loss amounts.

To adjust rating factors, the expected loss (claims experience) at member level was rolled up to the group level to create a group level rating modifier. The rating modifier created by leveraging external data reduced the gap between predicted and actual loss.
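
A minimal sketch of that roll-up, assuming hypothetical column names and a simple ratio-style modifier: member-level predicted losses are summed per group and compared with the baseline expectation from age/gender/region factors alone.

```python
import pandas as pd

# Hypothetical member-level output of the claims-experience model.
members = pd.DataFrame({
    "group_id":       ["A", "A", "A", "B", "B"],
    "predicted_loss": [1200.0, 800.0, 400.0, 2500.0, 1500.0],  # model informed by external data
    "baseline_loss":  [1000.0, 900.0, 600.0, 1800.0, 1700.0],  # age/gender/region factors only
})

# Roll member-level expected losses up to the group level.
group = members.groupby("group_id")[["predicted_loss", "baseline_loss"]].sum()

# A simple ratio-style rating modifier scales the baseline rate toward
# the externally informed claims expectation.
group["rating_modifier"] = group["predicted_loss"] / group["baseline_loss"]
print(group)
```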

The Change:

As a result, the rating modifier built on external data reduced the gap between actual and predicted risk. Adjustments were then proposed to the quotation process to incorporate the new modifier into the pricing of new group policies.

Improve medication adherence to lower health costs and improve patient outcomes

The Big Picture

Medication non-adherence is one of the most critical problems in treating patients with chronic conditions. Patients who do not follow the prescribed drug regimen are more likely to suffer poor health outcomes, significantly contributing to the total cost of care. A top 5 health insurance payer wanted to improve medication adherence of patients with chronic conditions to lower health risks, improve health outcomes and lower costs. The organization also wanted to improve customer engagement. To achieve all of this, the organization needed insights to understand and anticipate medication non-adherence to drive more effective intervention strategies.

Transformative Solution

The solution involved developing a medication non-adherence framework to identify individual patients less likely to adhere to their prescribed drug regimen during one year. A structured problem-solving framework was designed to identify the types of non-adherence, understand the drivers of non-adherence, build a model to predict members likely to exhibit such behavior and then implement the right intervention plan to decrease medication non-adherence. The solution involved three stages:

Stage 1 – Problem solving: This stage identified over 200 potential hypotheses by integrating data from claims, medication utilization history, medication information, member demographics and consumer data, and classified medication adherence based on key drivers.

Stage 2 – Predictive modeling: This stage evaluated the various types of medication non-adherence that could be measured and acted upon: 1) no prescription or refill filled, 2) incorrect dosage, 3) medication at the wrong time, 4) forgetting to take doses and 5) stopping therapy too soon. The scope was limited to defining non-adherence based on prescriptions filled, due to the non-availability of data to ascertain other behaviors. We leveraged advanced machine-learning models to predict non-adherence, looking at various factors, including patient-related, healthcare system and provider-related, therapy-related, condition and disease-related, cost-related and socio-economic factors.

Stage 3 – Assigning action plans by segment: This stage mapped out recommended strategies for three key patient segments that behaved differently in their medication adherence for chronic diseases.
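
Since non-adherence was ultimately defined from prescriptions filled, a common operationalization is the proportion of days covered (PDC), with members below a threshold (often 80%) labeled non-adherent. The sketch below assumes a simple fill-history table with illustrative field names; it is not the payer’s actual definition.

```python
import pandas as pd

# Hypothetical fill history: one row per fill, with days of supply dispensed.
fills = pd.DataFrame({
    "member_id":   ["M1", "M1", "M1", "M2", "M2"],
    "fill_date":   pd.to_datetime(["2023-01-01", "2023-02-05", "2023-03-10",
                                   "2023-01-01", "2023-04-01"]),
    "days_supply": [30, 30, 30, 30, 30],
})

MEASUREMENT_DAYS = 180   # observation window
PDC_THRESHOLD = 0.80     # common adherence cut-off

def proportion_of_days_covered(member_fills: pd.DataFrame) -> float:
    """Fraction of days in the window covered by at least one fill."""
    start = member_fills["fill_date"].min()
    covered = set()
    for _, row in member_fills.iterrows():
        offset = (row["fill_date"] - start).days
        covered.update(range(offset, offset + row["days_supply"]))
    covered = {d for d in covered if d < MEASUREMENT_DAYS}
    return len(covered) / MEASUREMENT_DAYS

pdc = fills.groupby("member_id").apply(proportion_of_days_covered)
labels = (pdc < PDC_THRESHOLD).rename("non_adherent")
print(pd.concat([pdc.rename("pdc"), labels], axis=1))
```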

The Change

Armed with more accurate prediction and classification, the payer was able to create targeted intervention and outreach programs to improve medication adherence and health outcomes for patients with chronic diseases, resulting in $5 million in annual care-related savings.

Develop prediction framework to address high attrition

The Big Picture

Wholesale drug distributors have experienced strong competition and consolidation leaving only a few surviving entities to service most of the US market. A leading distributor of drugs to long-term care facilities was losing business to competitors over the last several quarters. The hypothesis was that facilities were leaving for addressable reasons such as poor response times, inaccuracies in executing orders, or delays in addressing customer complaints. The organization wanted to understand the drivers of attrition and proactively identify at-risk accounts to develop effective retention strategies.

Transformative Solution

The solution proposed that the organization maximize the total lifetime customer relationship value, with the first phase focused on determining the drivers of attrition. Involuntary attrition related to the organization terminating the relationship or closing a facility was not included in the initial scope.

To understand drivers of attrition and predict the observed attrition behavior, over 40 hypotheses were developed to identify likely churners, covering facility and pharmacy characteristics, customer service, account management details, contract information and billing data. Attrition was defined separately for each segment – for Skilled Nursing Facilities, attrition was defined as cancellation of the contract, whereas for Assisted Living Facilities, it was defined as a significant drop in prescription volume.

A variety of machine-learning techniques, including Gradient Boosting Machine and Random Forest, were used to build advanced predictive models. The current profitability was considered in prioritizing facilities for retention with a segment-specific intervention strategy based on the drivers of attrition, which varied significantly across high-risk segments.
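
A minimal sketch of the modeling step, using scikit-learn’s gradient boosting on a handful of invented facility-level features (the real models used the 40+ hypothesis-driven variables described above):

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical facility-level features; the real model used 40+ hypothesis-driven variables.
data = pd.DataFrame({
    "avg_speed_of_answer_sec": [20, 45, 120, 15, 90, 200, 30, 60],
    "abandoned_call_rate":     [0.02, 0.05, 0.20, 0.01, 0.15, 0.30, 0.03, 0.08],
    "clear_time_pct":          [0.95, 0.90, 0.60, 0.97, 0.70, 0.50, 0.92, 0.85],
    "active_months_12m":       [12, 11, 6, 12, 8, 4, 12, 10],
    "attrited":                [0, 0, 1, 0, 1, 1, 0, 0],
})
X, y = data.drop(columns="attrited"), data["attrited"]

# Gradient Boosting Machine for attrition risk; a Random Forest could be fit the same way.
gbm = GradientBoostingClassifier(n_estimators=200, learning_rate=0.05, max_depth=3, random_state=0)
gbm.fit(X, y)

# Rank facilities by predicted attrition risk to prioritize retention outreach.
data["attrition_risk"] = gbm.predict_proba(X)[:, 1]
print(data.sort_values("attrition_risk", ascending=False).head())
```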

Based on the analysis, the following insights were provided to address customer pain points:

  • Facilities served by pharmacies with high clear time percentages and high satisfaction scores attrite less
  • Facilities that experience calls with high average speed of answer (ASA) and a high rate of abandoned calls attrite more
  • Facilities with a high number of active months in the past 12 months attrite less (active months are the ones with non-zero script volume)

The Change

As a result of the predictive analytics solution, $45MM in revenue opportunity was identified by retaining profitable facilities:

  • An estimated $40MM in potential revenue could be saved through retention targeting of 40% of Skilled Nursing Facilities
  • $5MM in revenues could be saved by targeting Assisted Living Facilities

In addition, the ability to ensure high service levels through close monitoring of drug supplies, customized kiosks for easy order placement and locating pharmacies in close proximity to facilities can significantly improve retention.

Improve collectability of the self-pay portion of medical expenses

The Big Picture

Escalating healthcare costs have forced employees with employer-provided insurance to bear a higher portion of self-payment costs such as co-pays, coinsurance, deductibles and out-of-pocket expenses. Hospitals are challenged to collect the patients’ portion of medical expenses, which, according to the American Hospital Association, comprises 6.1% of all services.

Multi-specialty practices collect only 56.6% of their accounts receivables in the first 30 days. Many hospitals, especially a growing number of non-profit companies, are particularly vulnerable and are seeing their access to capital weakened and their capital ratings downgraded due to bad debts.

A leading multi-specialty, multi-location healthcare provider, serving eight million patients annually, wanted to deploy analytical methods to manage and increase the collectability of the self-pay portion of services delivered.

Transformative Solution

While maximizing point-of-service collections was a key goal, improving accounts receivable collections after patient discharge was extremely important for hospitals. Successful risk scoring approaches for self-pay accounts required segmenting them by understanding patients’ ability as well as their willingness to pay.

To design a well-rounded collection strategy, it was critical to: know which patients required assistance with payment plans, know where to apply discounts, and define what, if any, additional collection resources were needed to be deployed.

The approach started with applying over 50 hypotheses to identify different data elements for the analysis, including patient characteristics, policy benefits, credit worthiness, and medical and procedures data to define the key drivers of patient payment behavior.

A suite of advanced predictive models was developed using machine-learning techniques to gain insights into a patient’s propensity to pay, the likely amount a patient would pay and the timeframe within which a patient was likely to pay.
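
The three complementary predictions might be structured as in the sketch below: a classifier for propensity to pay and two regressors for the likely amount and time to pay. Account fields and values are invented for illustration; the production suite was an ensemble built on far richer data.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

# Hypothetical self-pay account data; the real models used 50+ hypothesis-driven features.
accounts = pd.DataFrame({
    "balance":        [250, 1200, 4000, 600, 80, 3000],
    "credit_score":   [720, 640, 580, 690, 750, 600],
    "prior_payments": [3, 1, 0, 2, 5, 0],
    "paid":           [1, 1, 0, 1, 1, 0],     # did the patient pay anything?
    "amount_paid":    [250, 600, 0, 600, 80, 0],
    "days_to_pay":    [20, 75, 0, 40, 10, 0],
})
features = accounts[["balance", "credit_score", "prior_payments"]]

# 1) Propensity to pay (classification).
propensity = GradientBoostingClassifier(random_state=0).fit(features, accounts["paid"])

# 2) Likely amount paid and 3) time to pay (regressions, fit on payers only).
payers = accounts["paid"] == 1
amount = GradientBoostingRegressor(random_state=0).fit(features[payers], accounts.loc[payers, "amount_paid"])
timing = GradientBoostingRegressor(random_state=0).fit(features[payers], accounts.loc[payers, "days_to_pay"])

accounts["p_pay"] = propensity.predict_proba(features)[:, 1]
accounts["expected_amount"] = amount.predict(features)
accounts["expected_days"] = timing.predict(features)
print(accounts[["p_pay", "expected_amount", "expected_days"]])
```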

Five key segments were identified with distinctive payment behaviors and the following treatments were recommended:

  • Continue to bill 40% of patients with little or no additional follow-up
  • Offer payment plans and low discounts to 20% of patients to collect sooner
  • Leverage collectors and higher discounts to target the remaining segments that were highly unlikely to pay

The Change

The solution helped the provider target the right accounts, speed up collection receivables and collect more unpaid fees. The solution revealed that providers can recover significantly higher incremental dollars by selectively offering payment plans, discounts and collection resources. The client received a best-in-class solution of ensemble machine-learning models that predicted amount, time to pay and propensity to pay.

The recommendations helped the provider identify segments to collect $20MM in payments from unpaid health service invoices by targeting 30% of outstanding accounts.

Reduce high costs of care associated with avoidable ER visits

The Big Picture

The high maintenance costs and limited availability of Emergency Room (ER) facilities are under intense scrutiny by payers, the government, providers and employers. According to the Centers for Disease Control and Prevention (CDC), Americans made 136 million ER visits in 2014, a number that is likely to increase further. Yet a study in the American Journal of Managed Care found that more than 30% of ER visits could have been avoided.

Avoidable ER visits stem from a lack of coordinated medical attention that drives higher costs of care, longer wait times and sub-standard health outcomes. Redirecting only 20% of ER visits to lower-cost alternatives, such as urgent care or Primary Care Physicians (PCP), could save $4.4 billion, according to HealthAffairs.org.

A multi-billion dollar healthcare payer wanted to identify members likely to make avoidable ER visits, and steer them to more cost effective alternatives.

Transformative Solution

Members may be visiting an ER unnecessarily for convenience, desire for a more effective PCP, insufficient co-pay funds, or an unmanaged condition. To address these challenges, clinical rules were used to identify low intensity conditions where an ER visit could have been avoided. The approach offered more than 50 hypotheses for factors which could be predictive of avoidable ER visits.

To test these hypotheses, we identified different structured and unstructured data sources such as call center notes, geographic details for members and providers, and the availability of providers.

For unstructured data, we applied multiple feature selection algorithms such as InfoGain (information gain) and BNS (Bi-Normal Separation). For structured data, we tested hypotheses such as distance to the Primary Care Physician or urgent care facilities, ease of access to an ER, and difficulty finding quality providers. An ensemble of classifier models was developed to predict the likelihood of visiting an ER for low intensity conditions, using advanced analytics such as machine-learning, text mining, and traditional modeling techniques.
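
As a hedged illustration of this pipeline, the sketch below approximates the information-gain feature selection step with scikit-learn’s mutual-information selector and feeds the selected features into a soft-voting ensemble of classifiers; the synthetic data simply stands in for the combined structured and text-derived feature matrix.

```python
from sklearn.datasets import make_classification  # stand-in for engineered claim/member features
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic, imbalanced data to mimic the rare-event flavour of avoidable ER visits.
X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           weights=[0.9, 0.1], random_state=0)

# Information-gain-style feature selection followed by an ensemble of classifiers.
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("gbm", GradientBoostingClassifier(random_state=0)),
    ],
    voting="soft",
)
pipeline = make_pipeline(SelectKBest(mutual_info_classif, k=15), ensemble)

print("CV AUC:", cross_val_score(pipeline, X, y, cv=3, scoring="roc_auc").mean())
```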

The solution identified 65% of all avoidable visits among 30% of the population. This yielded an opportunity to save more than $10M annually by targeting a small group of members for alternative care management and provider interventions.

The Change

The payer was able to gather from this project that

  • Members with past ER visits were 8 times more likely to visit the ER unnecessarily.
  • Members visiting multiple PCPs were twice as likely to make an avoidable ER visit.
  • Each avoided ER visit could reduce costs by $1,500, leading to $10M in potential cost savings.
  • Optimized ER utilization could substantially improve member health outcomes.
  • Creating a framework of text-mining and machine-learning methods could improve accuracy in rare event scenarios.
Identify preferred store locations to enable personalized marketing

The Big Picture

A leading payment provider wanted to understand the relationship between a shopper’s home location and preferred store locations in order to improve marketing efforts and make store-related business decisions.

Transformative Solution

The solution deployed an iterative machine-learning algorithm on geolocation data and merchant data to identify patterns in shopping behavior. The team identified source data from transactional, merchant, and geo sources, creating extensive rules to drive the initial cleansing and mining.

The data was matched with external data sources to identify merchant locations using a scalable matching solution that leveraged probabilistic search techniques.

A two-way learning algorithm was applied to estimate merchant and cardholder location. This merchant location data was used to identify transaction level data for customers. Insights from the analysis were used to improve personalized marketing tactics.
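
The proprietary two-way learning algorithm is not described in detail here, but the alternating idea can be sketched as follows: estimate each cardholder’s home as the centroid of the known merchant locations they visit, then estimate unlocated merchants from their visitors’ estimated homes, and repeat. The coordinates and IDs are invented.

```python
import numpy as np
import pandas as pd

# Hypothetical transactions: some merchants have known coordinates, some do not.
tx = pd.DataFrame({
    "cardholder": ["c1", "c1", "c1", "c2", "c2", "c2"],
    "merchant":   ["m1", "m2", "m3", "m1", "m3", "m4"],
})
known_merchants = {"m1": (51.50, -0.12), "m2": (51.52, -0.10)}   # known (lat, lon)
merchant_loc = dict(known_merchants)
card_loc = {}

for _ in range(10):  # alternate until estimates stabilize
    # Step 1: cardholder home = centroid of currently located merchants they visit.
    for c, grp in tx.groupby("cardholder"):
        pts = [merchant_loc[m] for m in grp["merchant"] if m in merchant_loc]
        if pts:
            card_loc[c] = tuple(np.mean(pts, axis=0))
    # Step 2: unlocated merchants = centroid of their visitors' estimated homes.
    for m, grp in tx.groupby("merchant"):
        if m not in known_merchants:
            pts = [card_loc[c] for c in grp["cardholder"] if c in card_loc]
            if pts:
                merchant_loc[m] = tuple(np.mean(pts, axis=0))

print("cardholders:", card_loc)
print("merchants:  ", merchant_loc)
```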

The Change

The program achieved accuracy levels of 97%, 72%, and 58% for the Mine, Match and Learn stages respectively, with fill rates of 13%, 55%, and 99% respectively. The derived insights helped the client provide card-linked offers and obtain shopping insights for retailers. Additionally, the initiative identified digital targeting opportunities based on customer home and work locations.

Deliver next best product recommendations during customer interactions

The Big Picture:

A leading retail bank was facing low customer engagement and satisfaction. The existing analytical models on product propensities had low accuracy, missed critical data elements such as offline and online interactions and transactions, and prevented objective arbitration among multiple competing product offers. This resulted in a sub-optimal customer experience and lower response rates.

Transformative Solution:

To address the company’s challenges, a new next best product and service recommender was built using deep learning. It was designed to predict the top three recommendations from a wide suite of products and services.

A single customer view was prepared with 4,000+ attributes such as customer product holdings, transactions, in-bank transactions, and online interactions.

The models were tested on a select population within the lead scoring platform and deployed centrally.

The Change:

As a result of the engagement, customer product-offtake rates jumped by 60%, resulting in significantly higher marketing ROI.

Identify major customer events using analytics

The Big Picture

A leading bank had its highest share of customers among young adults in the 18 to 26 age group, but this proportion tailed off in the older age groups. Given that customer profitability peaks in the older age groups, with the 35-45 age group having the highest customer lifetime value (CLV), this was both a cause for concern that needed investigation and a significant opportunity to acquire and retain more profitable customer segments.

Transformative Solution

The company decided to create a unified view of the customer to improve customer service, and to identify and determine the impact of, and time between, events in a customer journey in order to retain customers. Through an exhaustive review of the bank’s requirements, a comprehensive list of business hypotheses was drawn up across different product verticals, and a customer base was identified for analysis. Data from various sources was integrated into a single view of the customer.

Sequence pattern mining was implemented in Java to identify event sequences that resulted in attrition. Upon identifying likely attrition events in the customer journey, customers were profiled month-to-month on key KPIs to identify specific behaviors prior to attrition.

The Change

Although the sequence pattern mining analysis pointed towards ‘fee’ as a driver of attrition, the bank was not convinced enough to take the analysis to production. The client, however, was highly satisfied with the upskilling of its internal team through monthly meetings and sessions, and acknowledged this in the final meeting.

Recommend replacement products to advisors to drive sales

The Big Picture

Financial advisors, looking to add value for consumer investors, frequently rebalance their clients’ portfolios to keep up with investors’ needs for risk and return. This presents an opportunity for asset management companies to intervene with financial advisors and recommend replacement products, often replacing an underperforming competitor product with a better performing proprietary product. A leading global asset management company needed a holistic, yet quantitative, basis for fund replacement at scale.

Transformative Solution

The solution was to develop a holistic framework to evaluate fund attractiveness across returns, ratings, costs, advisor relevance, and other attributes. An entity extraction algorithm was applied to modularize and score the attractiveness of every owned and competitor fund held by financial advisors. K-means clustering and collaborative filtering were applied to map relevance to advisors. Finally, a simple rule-based recommendation engine was put in place to score and suggest replacements based on incremental attractiveness.
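
A minimal sketch of the scoring-and-segmentation idea, with invented fund attributes, ad-hoc weights, and a toy advisor table; the real framework scored many more dimensions and used collaborative filtering on actual holdings.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical fund attributes; the real framework scored many more dimensions.
funds = pd.DataFrame({
    "fund":        ["own_A", "own_B", "comp_X", "comp_Y"],
    "return_3yr":  [0.08, 0.06, 0.04, 0.07],
    "star_rating": [5, 4, 2, 3],
    "expense":     [0.004, 0.006, 0.009, 0.007],
}).set_index("fund")

# Weighted attractiveness score on standardized attributes (expense counts negatively).
z = pd.DataFrame(StandardScaler().fit_transform(funds), index=funds.index, columns=funds.columns)
funds["attractiveness"] = 0.4 * z["return_3yr"] + 0.4 * z["star_rating"] - 0.2 * z["expense"]

# Cluster advisors by the mix of assets they hold, to map fund relevance to advisor segments.
advisors = pd.DataFrame({
    "pct_equity": [0.8, 0.2, 0.75, 0.3],
    "pct_fixed":  [0.1, 0.7, 0.15, 0.6],
    "avg_ticket": [50_000, 20_000, 60_000, 25_000],
}, index=["adv1", "adv2", "adv3", "adv4"])
advisors["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(advisors))

print(funds.sort_values("attractiveness", ascending=False))
print(advisors)
```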

The Change

The POC identified an immediate opportunity of $1.5B in incremental sales by replacing underperforming funds.

Automate insight generation from unstructured sources

The Big Picture

Asset management companies are perpetually in the race to capture mind-share from financial advisors by being relevant, contextual, and timely to drive product sales and adoption. A large portion of advisor interest remains locked in unstructured documents such as analyst reports, home office recommendations, sales notes, and other documents.

A leading global asset management company needed to enable a scaled, automated solution that could extract meaningful insights from vast volumes of unstructured text, and provide a timely, digestible format for consumption by relationship managers.

Transformative Solution

The solution was to apply advanced text-mining algorithms to extract topics and sentiments from volumes of PDF documents, married with industry-specific text libraries, to parse out business-ready content. The solution was then engineered as an HTML application that could be queried in real time and consumed by relationship managers – in this case, through an automatically generated PDF report that was refreshed and published quarterly.

The Change

The solution enabled a 98% improvement in productivity for a relationship manager to identify critical insights from published documents.

Identify growth opportunities across functions

The Big Picture

A global CPG company needed to be able to predict and identify potential areas of growth that would apply to future scenarios in multiple markets and product lines (e.g., household detergent and food products). Multiple cross-functional business teams needed to understand which areas of their business would grow and which would decline, to ensure future, sustainable profitability.

Transformative Solution

The company chose to implement a growth driver analytics suite of the right tools to have a ‘forward-looking view’ of likely performance and identify granular opportunities for growth. The initiative aimed to empower cross-functional business teams to preempt declines in business in specific markets and take corrective measures to bring back momentum in demand and sales.

A Bayesian Network approach was used to bring together all KPIs from across business functions and tie them together in a network structure, to capture the interrelationships among various dimensions and attributes. Through these models, Fractal captured the direct and indirect impact of various KPIs on business performance (sales), and hence the total impact in terms of elasticity. With the help of these network structures, Fractal could identify the set of drivers for any element in the structure and determine how to influence future outcomes.

Additionally, the teams were able to capture lead-lag relationships among various KPIs so they could preempt changes in any KPI with the use of other driving KPIs.
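
As a simplified stand-in for the Bayesian Network machinery, the sketch below computes direct and indirect (mediated) effects along a tiny hand-specified chain – distribution → awareness → sales – with ordinary regressions; the variable names and data are synthetic.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500

# Tiny hand-specified network: distribution -> awareness -> sales, plus a direct edge distribution -> sales.
distribution = rng.normal(size=n)
awareness = 0.6 * distribution + rng.normal(scale=0.5, size=n)
sales = 0.3 * distribution + 0.8 * awareness + rng.normal(scale=0.5, size=n)

# Edge 1: awareness as a function of distribution.
a_on_d = LinearRegression().fit(distribution.reshape(-1, 1), awareness).coef_[0]

# Edge 2: sales as a function of distribution and awareness (direct effects).
m = LinearRegression().fit(np.column_stack([distribution, awareness]), sales)
direct_d, direct_a = m.coef_

# Total impact of distribution on sales = direct effect + indirect effect via awareness.
total = direct_d + a_on_d * direct_a
print(f"direct={direct_d:.2f}, indirect={a_on_d * direct_a:.2f}, total={total:.2f}")
```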

The Change

A front-end platform for business users was enabled providing the required intelligence, empowering the cross-functional business teams to be agile and accurate in decision-making. The platform employed three core modules:

  • An Early Warning System that provides a forward-looking view of likely changes to the momentum of sales in the next few weeks or months. This helped spot areas of issue or opportunities, which could require interventions.
  • A Drivers Discovery Module that is used to uncover key drivers in a market and identify potential levers to pull to gain momentum. This augmented CCBT with differentiated drivers and brought out the relative importance and impact of different levers in the current environment.
  • A Simulator Module that was used to run different scenarios of interventions, and predict the likely outcome, and hence enable the company to course correct the investment plan.
Optimize product portfolios using Bayesian Regression analytics

The Big Picture

A leading CPG manufacturer recently changed its messaging strategy from attribute-oriented to customer experience-oriented and thus wanted to compare the impact of its old messaging to its new messaging on brand volume sales at the retailer level. Additionally, the company wanted to understand how its campaign had impacted private label product lines and how it could enhance its campaigns in the future to maximize net ROI.

Transformative Solution

A Hierarchical Bayesian Regression technique and a Bayesian Belief Network were used to build marketing mix models to measure campaign performance at the total US level by evaluating the impact of each campaign in generating incremental sales. The team further refined the models at a more granular retailer level using business information (priors) from the national level model as an input to the retailer models. The team could effectively establish the performance of each campaign for all of the retailers and their impact on both brand sales and private label sales.

The Bayesian Belief Network approach was further adopted to identify the interrelationships between all of the media channels and tie them together in a network structure in order to capture the direct and indirect impact of media channels on sales. With the help of these network structures, the true contribution from a media channel on incremental sales was quantified, enabling reporting of the true ROI of each product line.
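
A minimal partial-pooling sketch of the hierarchical idea, written here in PyMC (a library choice assumed for illustration): retailer-level campaign effects are drawn from a national-level prior, mirroring the use of national results as priors for the retailer models. The data and priors are synthetic.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
n_retailers, n_weeks = 5, 52
retailer = np.repeat(np.arange(n_retailers), n_weeks)
campaign_grps = rng.gamma(2.0, 1.0, size=n_retailers * n_weeks)   # weekly campaign pressure
sales = 100 + 3.0 * campaign_grps + rng.normal(0, 5, size=retailer.size)

with pm.Model() as mmm:
    # National-level prior on the campaign effect, shared across retailers.
    mu_beta = pm.Normal("mu_beta", mu=0.0, sigma=5.0)
    sigma_beta = pm.HalfNormal("sigma_beta", sigma=2.0)

    # Retailer-level effects are partially pooled toward the national effect.
    beta = pm.Normal("beta", mu=mu_beta, sigma=sigma_beta, shape=n_retailers)
    intercept = pm.Normal("intercept", mu=100.0, sigma=20.0)
    sigma = pm.HalfNormal("sigma", sigma=10.0)

    pm.Normal("obs", mu=intercept + beta[retailer] * campaign_grps,
              sigma=sigma, observed=sales)

    idata = pm.sample(500, tune=500, chains=2, random_seed=1, progressbar=False)

# Posterior mean campaign effect per retailer.
print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```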

The Change

With the measurement of incremental sales of each campaign, along with true channel performance by Bayesian Network, it was established that customer-centric campaigns performed better than attribute-centric, increasing sales in both brand and private product lines, thus growing the entire category. Additionally, the true impact of campaign messaging was isolated from the channel mix strategy and other factors, providing further, more actionable insights.

The organization could easily transform the learnings from the study to enhance its marketing strategy and its trade strategy with all the retailers by highlighting the impact of advertising on private label goods.

Build effective communication and product strategies

The Big Picture

A leading brand in the men’s grooming category already had over 80% volume share in all key markets. Category penetration was close to 100%, so the only feasible way to grow the business was to move consumers to higher-tier products. The business wanted to identify the right communication and positioning strategies for high-tier products to attract the consumers of low-tier products. The company also wanted to identify opportunities for a new product introduction to trade up low-tier consumers.

Transformative Solution

The solution was designed to identify key brand drivers for different segments of consumers, and to identify key product attributes to offer in new products or in the communication of existing products, in order to trade up low-tier consumers.

Equity drivers analysis was conducted with the help of network models on top of respondent-level equity tracking data to understand how perceptions are built in the minds of consumers. The solution identified what among various product attributes and equity measures drove the overall brand rating. The approach quantified the impact of each of those attributes on the overall rating and uncovered key equity paths that could be adopted in design of global and local campaign messaging and product positioning. This helped discover what consumers look for from the category that drives their perception and choice of brand or product across tiers.

The team established the key equity paths for the category using Structural Equation Models, identified the key drivers for each consumer segment (based on current product usage) and highlighted the different needs across segments.

The Change

Based on the key drivers that emerged for each of the segments, Fractal identified the winning product proposition for the market and developed strategies to trade-up the low-tier consumers towards high-tier products and hence increase revenue for the business. Meta analysis of studies across different markets was conducted in order to compare key drivers and investigate commonalities and differences.

Comparison of key brand-drivers across markets revealed opportunities for global campaigns and product innovation, along with market-specific customization needed in communication strategies based on distinct drivers.

Reduce shipment complexity and cost through network optimization

The Big Picture

A Fortune 100 technology company had high complexity in its distribution network and SKU portfolio, which led to suboptimal shipment costs. It sought to reduce its cost of shipment through network optimization. The challenge was setting the correct replenishment cycle time and quantity, based on inventory values, freight costs, and an assessment of service levels. The other challenge was to identify which class of SKUs to target and to optimize the network for the shipments of these SKUs. It was determined that network optimization through best-in-class inventory rules, along with various network management techniques at the distribution center level, would solve this problem.

Transformative Solution

The following inventory and network optimization methods were used to address the company’s challenges:

  • Analyzed network shipments by integrating order, inventory, and shipment data, along with systematic modelling of the supply chain network.
  • Engaged across functions, i.e., network design, channel, planning, logistics & distribution, order fulfilment and customer services.
  • Identified major drivers for out-of-region shipments by assessing network KPIs, i.e., number of transit days, days-on-hand, distance travelled and cost per shipment. These KPIs were used to simulate and assess the impact of proposed inventory levels.
  • Provided actionable recommendations to reduce freight costs, and identified areas to improve service levels based on prescriptive analytics on allocation principles, transportation routes, and inventory planning.

The Change

The network was optimized to run with two DCs instead of three, resulting in multi-million-dollar cost savings. The company identified opportunities to reduce ~5% of total freight costs by eliminating sub-optimal shipments. Also, a deeper look into demand vs. replenishment profiling for distribution centers resulted in a recommended action plan to shorten the replenishment cycle from four weeks to three.

Identify relevant report information from unstructured data

The Big Picture

A leading bank was conducting projects on socioeconomic issues, which resulted in the production of unstructured data in the form of a plethora of files. Given a database of documents and concepts as use cases, the challenge was to analyze and identify insights in the documents related to ‘road safety’.

Transformative Solution

To solve the company’s challenge, an analysis and assessment framework was used that leveraged text mining to enhance productivity. Deep-learning-based techniques were used to identify semantically similar keywords to expand the scope of the search. The approach also cross-tabulated documents and search terms by frequency or correlation to generate cross-document heat maps. The solution computed distributions of search term frequencies, project counts, and other statistics across countries and report start years.
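
The keyword-expansion step might resemble the gensim sketch below, which trains a small word2vec model on a toy corpus and pulls semantically similar terms to widen the ‘road safety’ search; in practice, pre-trained embeddings and the bank’s actual documents would be used.

```python
from gensim.models import Word2Vec

# Toy corpus standing in for the bank's project documents (already tokenized and lower-cased).
corpus = [
    ["road", "safety", "audit", "reduced", "traffic", "fatalities"],
    ["highway", "crash", "barriers", "improved", "road", "safety"],
    ["pedestrian", "crossings", "and", "speed", "limits", "reduce", "accidents"],
    ["the", "project", "financed", "rural", "road", "rehabilitation"],
] * 50  # repeat so the toy model has something to learn from

model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, seed=0, epochs=20)

# Expand the seed search terms with semantically similar vocabulary.
seed_terms = ["safety", "crash"]
expanded = {t: [w for w, _ in model.wv.most_similar(t, topn=5)] for t in seed_terms}
print(expanded)
```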

The Change

As a result of the solution, the company was able to use co-occurrence statistics to identify other terms related to search keywords. 60% of the documents were found to be relevant. The solution enabled the company to categorize relevant documents based on ranking.

Identify social media followers’ interests to deliver targeted offers

The Big Picture

A leading loyalty analytics company wanted to identify the personality traits of ~64K of its loyalty shoppers based on their Twitter feeds. This uncovered a few challenges that needed to be examined before creating a solution. Out of the total number of Twitter followers, only 5K were identified in the company’s user base. In addition, auras that accurately and comprehensively represented personality segments needed to be built. Lastly, Twitter is noisy, full of abbreviations, emoticons, and non-descriptive text which made sifting through the data a challenge.

Transformative Solution

To start, auras of personality segments were constructed using an initial taxonomy/dictionary. This was enhanced through additional sources (lexical resources and pre-trained neural network word embeddings). Then, text from Twitter streams was extracted using Kafka in real time, and dCrypt was run for text cleaning and preprocessing. Key phrases were then extracted using unsupervised methods, and the similarity score to the personality segments was calculated by distance measures (such as cosine similarity). Lastly, DBPedia was leveraged to find the relatedness of terms to personality segments.

This technique successfully identified the personality traits of the loyalty shoppers. Out of 60K social media followers, traits could be identified for 3K (5%) of them. Movies and music were recognized as top interests of followers of its company page.

The Change

As a result, this information is being used by the company for targeted marketing. The analysis offers the opportunity for the data to be extended to other social media pages. Also, the specific followers identified can be targeted with relevant offers using their social media handle.

Match customer information across multiple data sources

The Big Picture

One of the largest providers of credit information and information management services in the world wanted to develop an algorithm to appropriately link names and addresses across multiple datasets. The company was looking to achieve measurable performance improvements by refining match accuracy and reducing misclassification rates.

The company already had a basic framework for matching customers by their names and addresses from different data sources, but was striving to improve the current process. The company wanted an improved framework that could work with unstructured address data so that it could provide a single view of customers across different data sources.

Transformative Solution

It was determined that dCrypt’s Fuzzy Matching module would provide the solution. It combines several different fuzzy matching techniques such as Levenshtein Distance and token-based distance measures, and applies heuristics to give an overall match score, resulting in higher accuracy and recall. The system offered the following:

  • Cleaning and normalizing the data to work within a standardized form.
  • Segregating the addresses into logical components such as house number, post code, etc.
  • Using the post code as a reference so the request address can be searched within the data of the same post code.
  • Matching the addresses and shortlisting the top 100 candidates. The corresponding customer names are also matched, and the best outputs are generated based on the overall score.
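
dCrypt’s internals are proprietary, but the spirit of combining a Levenshtein-style character measure with a token-based measure into one heuristic score can be sketched with the standard library alone; the addresses and weights below are illustrative.

```python
from difflib import SequenceMatcher

def char_similarity(a: str, b: str) -> float:
    """Character-level similarity (a stand-in for a Levenshtein-style measure)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def token_similarity(a: str, b: str) -> float:
    """Jaccard overlap of address tokens."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def match_score(a: str, b: str, w_char: float = 0.6, w_token: float = 0.4) -> float:
    """Heuristic overall score combining both measures."""
    return w_char * char_similarity(a, b) + w_token * token_similarity(a, b)

request = "12 High Stret, Camden, NW1 7JR"
candidates = [
    "12 High Street Camden NW1 7JR",
    "121 High Street Camden NW1 7JR",
    "12 High Street Croydon CR0 1PB",
]

# Within the same post code block, rank candidates and keep the best matches.
for c in sorted(candidates, key=lambda c: match_score(request, c), reverse=True):
    print(f"{match_score(request, c):.2f}  {c}")
```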

The Change

Using dCrypt’s Fuzzy Matching module for higher accuracy enabled the company to track almost 80 million more candidates for credit rating agencies, which were not captured by the existing system. The overall accuracy of matching improved by 7 percentage points, from 74.2% to 81.2%.

Enable guided selling using natural language search queries

The Big Picture

A large health and wellness products company was looking to build a cognitive learning platform that would enable guided selling based on specific consumer needs. It was also looking to create a customized wellness experience across its brands.

The company’s key challenges included extracting text from sources such as Lotus Notes Rich Text Format and PDFs (where extraction is imperfect) and harmonizing the data from different sources at the correct levels. It needed a solution that could understand the intent of long and indirect queries and terms expressed in multiple forms (e.g., ‘head ache’ and ‘headache’). It also needed a framework that could be used by several systems and easily enhanced with additional data sources.

Transformative Solution

It was determined that dCrypt would solve the product recommendation problem using an Information Retrieval (IR) based approach. The framework was created using several components of Natural Language Processing, IR, and software engineering for a fully-operational search-engine system. This consisted of: ingesting and extracting data from various sources harmonized at the right levels for a single view; cleaning, preprocessing, and feature extractions for the models; and building ranking models for providing product recommendations to user queries.

Several types of data, including product descriptions, health and wellness content, and other sources, were used to build a richer recommendation engine. The solution was exposed through an API, enabling a business-agnostic solution that could be plugged in and consumed by various other systems.
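
A minimal information-retrieval sketch of the ranking idea: index product descriptions with TF-IDF and rank them against a natural-language query by cosine similarity. The catalogue and query are invented; the production system layered richer NLP, ranking models, and an API on top of this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical product catalogue (description text harmonized from several sources).
products = {
    "herbal_tea":   "calming herbal tea that helps with sleep and relaxation",
    "pain_relief":  "fast acting tablets for headache and minor aches",
    "multivitamin": "daily multivitamin supporting energy and immunity",
}

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(products.values())

def recommend(query: str, top_n: int = 2):
    """Rank products for a natural-language query by cosine similarity."""
    q_vec = vectorizer.transform([query])
    scores = cosine_similarity(q_vec, doc_matrix).ravel()
    ranked = sorted(zip(products.keys(), scores), key=lambda x: x[1], reverse=True)
    return ranked[:top_n]

print(recommend("something for my headache and aches"))
```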

The Change

The new search engine provided relevant product recommendations for natural language search queries with higher accuracy and recall than the company’s existing e-commerce search. The system was capable of understanding long sentence and indirect queries. It was integrated with the company’s mobile app, and it could be improved upon by plugging in new data sources easily. Logging was incorporated to allow for future improvements such as recommendations and personalization.

Leverage unstructured data to improve preventive care

The Big Picture

A major US health insurance firm wanted to assess the riskiness of its customers. Traditionally, the company used structured data sources, such as customer demographics, past claims data, and past health details, to predict the likelihood of a customer raising a claim within a specified number of days. The company wanted to augment the structured data approach by leveraging insights from unstructured data to enable more accurate predictions.

Transformative Solution

To solve the company’s challenges, a large repository of call center data was used. This was unstructured data in the form of call transcripts. The approach was to use a big data platform and Spark to process the call center data and use Python to develop a model on ‘propensity to claim’ using only the unstructured data.

The output of this model was then used as an input in the model with structured data. The final model was an ensemble of the models of structured data and unstructured data. The enhanced data set was used to build member risk scores. Members were prioritized based on their risk, so the company could provide better and more focused care. The model performed significantly better than the model that used structured data alone.
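
The stacking idea can be sketched as follows: a text-only model’s score becomes an additional feature in the structured-data model. scikit-learn stands in for the Spark pipeline here, and the transcripts and features are toy stand-ins.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy stand-ins: call transcripts (unstructured) and demographics/claims features (structured).
transcripts = [
    "caller asked about back pain and specialist referral",
    "routine question about premium due date",
    "repeated calls about hospitalization coverage and surgery",
    "updated mailing address",
]
structured = np.array([[45, 2], [30, 0], [61, 5], [28, 0]])   # e.g., age, prior claims
claimed = np.array([1, 0, 1, 0])                              # raised a claim within N days

# Model 1: propensity-to-claim from text only.
tfidf = TfidfVectorizer()
text_model = LogisticRegression(max_iter=1000).fit(tfidf.fit_transform(transcripts), claimed)
text_score = text_model.predict_proba(tfidf.transform(transcripts))[:, 1]
# (In practice, out-of-fold scores would be used to avoid leakage.)

# Model 2 (ensemble): text score enters the structured model as an extra feature.
X_final = np.column_stack([structured, text_score])
final_model = LogisticRegression(max_iter=1000).fit(X_final, claimed)
risk_score = final_model.predict_proba(X_final)[:, 1]
print(np.round(risk_score, 3))
```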

The Change

As a result of the approach, it was found that mining unstructured text added significant value over the structured data analysis techniques alone. The ensemble of methods enabled:

  • An 11%-point improvement in the KS statistic (which measures how well the model separates likely claimants from non-claimants) – from 35% up to 46%.

Several other key indicators of model performance also improved:

  • A 12%-point increase in the success rate in the top three deciles (lift) – from 63% up to 75%.
  • A 10%-point increase in concordance – from 66% to 76%.
  • A 4%-point increase in model classification accuracy – from 68% to 72%.
Identify the most efficient shipping carriers through advanced analytics

The Big picture

A Fortune 100 CPG company had a diverse portfolio of products and high SKU-level complexity. One of the issues the company faced was assessing each carrier and choosing the right carrier for shipments across the globe. Since various metrics were available, the challenge was ranking carrier performance and choosing the right carrier for each shipment; the same set of metrics was not applicable to every shipment or region.

Transformative Solution

Addressing the company’s challenges meant choosing the right metric to assess and simulate carrier performance. Carrier performance was assessed on various dimensions, such as service levels (including reliability, transit time, geo coverage, etc.), carrier (including cost per mile, responsiveness, fleet size and types, etc.), and shipment (including length of haul, shipment volume, product categories, etc.).

Building on this carrier selection framework, the solution followed a step-wise approach:

Identifying correlated attributes: Cost competitiveness was assessed between carrier pricing and industry averages. Contract compliance testing was performed regarding on-time delivery and accuracy. Cost-reduction targets were identified with the help of payment term benchmarks.

Ranking of carriers: The solution assessed the best carrier for daily and weekly shipment characteristics, performed driver analysis for carrier ranking, and assessed opportunity savings resulting from carrier ranking and benchmarking.

Assessing alternate carriers via what-if analysis: This included the ability to override key attributes and simulate the impact of potential changes to lead times and carriers.
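
One simple way to operationalize the ranking step is a weighted score over normalized KPIs, flipping direction for metrics where lower is better; the carriers, metrics, and weights below are illustrative assumptions.

```python
import pandas as pd

# Hypothetical carrier KPIs for one lane; the real framework covered many more dimensions.
carriers = pd.DataFrame({
    "on_time_pct":   [0.96, 0.90, 0.88],
    "cost_per_mile": [2.10, 1.80, 1.95],
    "transit_days":  [3.0, 4.5, 4.0],
}, index=["carrier_A", "carrier_B", "carrier_C"])

weights = {"on_time_pct": 0.4, "cost_per_mile": 0.35, "transit_days": 0.25}
higher_is_better = {"on_time_pct": True, "cost_per_mile": False, "transit_days": False}

# Min-max normalize each KPI, flip direction where lower is better, then weight and sum.
scores = pd.Series(0.0, index=carriers.index)
for col, w in weights.items():
    norm = (carriers[col] - carriers[col].min()) / (carriers[col].max() - carriers[col].min())
    if not higher_is_better[col]:
        norm = 1.0 - norm
    scores += w * norm

print(scores.sort_values(ascending=False).rename("rank_score"))
```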

The Change

As a result of the engagement, the company achieved several benefits:

  • A $1M savings opportunity per category and group of transportation lanes was identified by effectively selecting the best ranked carriers.
  • Renegotiation of SLAs with certain carriers that were sensitive to loading and unloading times.
  • Improved warehouse planning and shipping processing for better service levels and reduced costs.
Reduce inventory levels to release working capital

The Big Picture

Packaging material adds great value to end consumers’ perception of a brand. Packaging material cost has a significant share in the total cost of the product, and its reduction helps to achieve better profit margins. One of the ways to reduce purchase cost is to optimize frequency and volume of purchases.

The conflicting business objectives of a leading company and its supplier posed a major challenge to optimization. As cost was a key business KPI, the supplier favored longer production runs as opposed to frequent shorter runs. In contrast, the company preferred shorter runs for better control over volatile demand as well as reduced inventory and scrap costs.

Transformative Solution

To solve the company’s challenges, the conflicting objectives were analyzed and an approach was developed to optimize order frequency and order volume.

First, demand consolidation for packaging material was performed from multiple sources using sophisticated data harmonization techniques. A genetic algorithm was used to optimize the ABC classification by material spend. Using advanced mathematical techniques, such as mixed integer linear programming, exhaustive enumeration, and the theory of constraints, the approach developed optimized order frequencies and order volumes for each purchase cycle. Then, a Monte Carlo simulation was run to estimate potential savings and scrap risk.
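
A toy single-material lot-sizing formulation in PuLP (an open-source solver interface assumed here for illustration) shows the shape of the mixed-integer program: choose which cycles to order in, and how much, to minimize ordering plus holding cost while meeting demand. Demand and cost figures are invented.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

demand = [400, 300, 500, 200]        # packaging material demand per purchase cycle
order_cost = 250.0                   # fixed cost per order (setup / production run)
hold_cost = 0.5                      # holding cost per unit carried to the next cycle
T = range(len(demand))

prob = LpProblem("lot_sizing", LpMinimize)
q = [LpVariable(f"qty_{t}", lowBound=0) for t in T]        # order quantity per cycle
y = [LpVariable(f"order_{t}", cat=LpBinary) for t in T]    # 1 if an order is placed in cycle t
inv = [LpVariable(f"inv_{t}", lowBound=0) for t in T]      # ending inventory per cycle

prob += lpSum(order_cost * y[t] + hold_cost * inv[t] for t in T)
big_m = sum(demand)
for t in T:
    prev = inv[t - 1] if t > 0 else 0
    prob += prev + q[t] - demand[t] == inv[t]   # inventory balance
    prob += q[t] <= big_m * y[t]                # can only buy in cycles with an order

prob.solve()
print([(int(value(y[t])), round(value(q[t]))) for t in T], "cost:", value(prob.objective))
```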

The Change

As a result of the solution, the company achieved several benefits including:

  • Optimized purchase volume and purchase frequency of the packaging materials led to savings of $1M per material, per purchase cycle.
  • Reduced total inventory holding cost as a result of the streamlined ordering pattern.
  • Reduced scrap risk as there was less unproductive inventory.
  • Released working capital as a result of less inventory holding costs.
Use predictive modeling to control critical process parameters

The Big picture

A Fortune 100 fertilizer manufacturing company produces fertilizers that must meet quality criteria for key nutrient elements like potassium, nitrogen, and phosphorus to be within a defined specification range. The quality of the output is largely dependent on critical process parameters (CPPs) during the manufacturing process, such as temperature, electric current, and flow volume. Using CPPs to predict the output would help keep the finished product grade within acceptable tolerance levels. Having a large number of CPPs, however, necessitates a robust prediction model that balances the trade-off between prediction accuracy and model complexity.

Transformative Solution

After thorough data understanding and brainstorming, the following four-step approach was taken:

  • Harmonized the data to consolidate CPPs and lab-tested output into a single source of truth.
  • Developed multiple random forest models for each output parameter and product grade, as sketched below.
  • Validated the accuracy of the models on test data, then iterated and refined the models.
  • Developed a final prediction summary and variable importance summary.
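
A minimal sketch of one such model, using scikit-learn’s random forest on invented CPP data for a single output parameter; in practice one model was built per output parameter and product grade.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300

# Hypothetical critical process parameters (CPPs) and one quality output (e.g., % nitrogen).
cpps = pd.DataFrame({
    "reactor_temp_c":   rng.normal(180, 10, n),
    "electric_current": rng.normal(55, 5, n),
    "flow_volume":      rng.normal(120, 15, n),
})
nitrogen_pct = 10 + 0.03 * cpps["reactor_temp_c"] - 0.02 * cpps["flow_volume"] + rng.normal(0, 0.2, n)

X_train, X_test, y_train, y_test = train_test_split(cpps, nitrogen_pct, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_train, y_train)

print("R^2 on held-out batches:", round(model.score(X_test, y_test), 3))
print("Variable importance:", dict(zip(cpps.columns, model.feature_importances_.round(3))))
```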

The Change

The results showed that for water-soluble fertilizers, the prediction accuracy exceeded 95% in 98.4% of instances. For other nutrient products, the prediction accuracy exceeded 95% in 99.9% of instances. The client saw multi-million-dollar savings by using these tools to fine-tune the manufacturing process, based on statistically derived, critical and controllable process parameters that drove the output grade.

Reduce inventory levels to release working capital

The Big Picture

A Fortune 100 CPG company had a diverse portfolio of products along with high SKU-level complexity. One of the issues the company faced was a high working-capital blockage due to elevated cycle and safety stock inventory levels. The goal was to analytically determine the right levels of this stock without impacting service, while also generating supply chain cost savings.

To manage a complex portfolio with irregular demand and high seasonality, it was necessary to get the demand forecast right in order to direct other downstream processes. The importance of lead time variability and stock on-hand would further streamline end-to-end inventory optimization. A more accurate demand forecast and visibility over lead-time variability would allow for effectively predicting the required inventory levels to attain target service levels.

Transformative Solution

The following demand and supply planning processes were used to solve the company’s challenges:

  • Generated an accurate demand forecast using demand sensing algorithms.
  • Used this demand forecast to predict safety stocks and arrive at minimum and maximum inventory norms (see the sketch below).
  • Leveraged the predicted safety stocks to simulate their impact on desired service levels, thereby determining the right inventory norms and releasing savings.
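
The safety-stock step typically rests on the classical formula combining demand and lead-time variability at a target service level, as sketched below with illustrative numbers; the demand-sensing forecast would supply the demand inputs.

```python
import math
from statistics import NormalDist

# Illustrative inputs per SKU (weekly demand units, lead time in weeks).
avg_demand, sd_demand = 1_000, 220          # from the demand-sensing forecast
avg_lead_time, sd_lead_time = 3.0, 0.6      # from supplier delivery history
service_level = 0.98

z = NormalDist().inv_cdf(service_level)     # safety factor for the target service level

# Safety stock = z * sqrt( LT * sigma_d^2 + d^2 * sigma_LT^2 )
safety_stock = z * math.sqrt(avg_lead_time * sd_demand**2 + avg_demand**2 * sd_lead_time**2)
reorder_point = avg_demand * avg_lead_time + safety_stock

print(f"safety stock ~ {safety_stock:,.0f} units, reorder point ~ {reorder_point:,.0f} units")
```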

The Change

As a result, the working-capital block was reduced by 26%, releasing $5Mn for one of the client’s top brands, based in Brazil. The average target days of supply were reduced by 33%, the projected service levels were 99.9%, and the total target days of supply for eight out of nine brands was decreased.

Estimate locations of merchants and customers with transaction data

The Big Picture

A leading financial institution had visibility on the card transaction patterns of its customers but did not know where its customers’ home locations were, or the locations of the merchants where customers were transacting. The company wanted to build a scalable algorithm to estimate customer and merchant locations to run location-based targeted campaigns for its customers.

The company had the data, but customer and merchant locations were not identified, as this data typically resides with other parties in the transaction chain. It wanted to identify customer and merchant locations based on transaction behavior. This information could have been purchased, but that option would have been cost-prohibitive given the scale of deployment needed, and would have created an external dependency for the company. The process needed to be scalable across time, volume of data, and velocity of data. The initial sample data was 3-4 GB, and the algorithm was to be deployed on 1+ TB of data for the UK market.

Transformative Solution

Given the low levels of visibility on merchant locations as well as customer locations in the data, the solution was to augment the existing data with third-party data on a sample of all businesses in the UK to enhance the merchant intelligence within the data. The exact third-party data source was finalized after a rigorous data assessment process covering 15+ data sources, across parameters like coverage, quality, granularity, accuracy, and more.

Next, the approach was to leverage text-mining algorithms to extract meaningful location-oriented information from the transaction descriptions captured by the company in its transaction data and from the third-party data source. Machine-learning algorithms were then deployed to iteratively estimate customer home locations and merchant locations in the data.

The Change

As a result of the engagement, a terabyte-scale algorithm was deployed in the customer environment. The solution estimated home locations for 98% of customers and merchant locations for 99% of merchants in the data. 58% of merchant locations were predicted within 0.3 miles of their actual location, and 45% of customer locations were estimated within one mile of their actual home location.

Use data visualization to monitor e-commerce performance

The Big Picture

A leading CPG company for baby foods recently started selling through its own and third-party e-commerce stores. While e-commerce contributed approximately 20% to 30% of sales across markets, there was limited monitoring of consumer behavior for the recently launched websites.

The company wanted to build a long-term and comprehensive roadmap for e-commerce. The company had the data, but the data was dispersed among various online and offline sources including Google Analytics, CRM database Magento, and multiple spreadsheets. To start with, the company wanted to harmonize and integrate the data dispersed in multiple locations.

Then, the company wanted to build a visualization solution to provide an e-commerce business health overview that would deliver automated insights on e-commerce health. The solution also needed to enable guided analysis to identify drivers and drainers of sales, and provide catch-connect-close analysis. This process needed to be scalable across time, volume of data, and velocity of data.

Transformative Solution

To solve the company’s challenges, a dashboard was designed based on the following key points: overall e-commerce health (growing or declining); consumer shopping behavior; drivers and drainers of sales; campaign performance analysis; and content, channel, and device performance analysis.

The approach identified the right KPIs to use for effective decision-making. Various mock-ups were created before finalizing the design. APIs were developed for getting the data from various sources. In addition, a what-why-how framework was deployed to design the critical dashboards.

The Change

As a result of the engagement, the company attained a visualization solution. The solution would enable quicker decision-making by identifying areas of focus in order to improve business results. With minimum clicks, the solution would also provide visibility to e-commerce performance. In addition, a standard data updating process would allow for frequent data refreshes and correlate data from different sources.

Optimize marketing planning and execution through advanced analytics

The Big Picture

An India-based telecom service provider wanted to enhance the efficacy of its marketing campaigns by understanding how past campaigns had impacted key brand KPIs such as intention-to-purchase and awareness. It also wanted to understand how it could enhance its campaigns in the future to maximize ROI, and to understand the key parameters that drive the choice of service provider in the market.

Transformative Solution

To address the company’s challenges, an integrated approach of marketing attribution and equity drivers analysis was introduced. Marketing attribution models were used to identify the historic performance of various campaigns on the key brand KPIs. The initiative quantified the impact of each campaign in generating incremental business performance and established their relative effectiveness. The team identified the kinds of campaigns that had a larger impact on business performance versus lower-performing campaigns.

Equity drivers analysis, with the help of network models on top of respondent-level equity tracking data, enabled the team to understand how perceptions were built in the minds of consumers. The approach identified which product and service attributes and equity measures drove the overall brand rating. The impact of each of those attributes on the overall rating was quantified, uncovering key equity paths that could be adopted in the design of campaign messaging and positioning. This helped discover what consumers look for from the category that drove their perception and choice of brand or service provider.

The Change

The identification of key equity drivers (i.e., consumers’ needs) along with the performance of past campaigns enabled the company to validate why certain campaigns performed better than others. The campaign that talked about the key needs, as identified from the drivers analysis, showed better effectiveness than the other lower-performing campaigns.

The organization applied the learning from marketing attribution and key equity drivers into the design of future marketing campaigns, both in terms of marketing budget allocation across marketing levers and in terms of the right messaging that resonates with consumers. This ‘integrated planning’ helped design far more effective campaigns with higher ROI.

Identify growth opportunities across functions

The Big Picture

A global CPG company needed to be able to predict and identify potential areas of growth that would apply to future scenarios in multiple markets and product lines (e.g., household detergent and food products). Multiple, cross-functional business teams needed to understand which areas of their business would grow and which would decline ensure future, sustainable profitability.

Transformative Solution

The company chose to implement a growth driver analytics suite of the right tools to have a ‘forward-looking view’ of likely performance, and identify granular opportunities for growth. The initiative aimed to empower cross-functional business teams to preempt declines in business in specific markets, and accordingly make the corrective measures to bring back momentum in demand and sales.

A Bayesian Network approach was used to bring together all KPIs that come from across business-functions, and tie them together in a network structure, to capture the interrelationships among various dimensions and attributes. Through these models, Fractal captured the direct and indirect impact of various KPIs on business performance (sales), and hence the total impact in terms of elasticity. With the help of these network structures, Fractal could identify the set of drivers for any element in the structure and determine how to influence future incomes.

Additionally, the teams were able to capture lead-lag relationships among various KPIs so they could preempt changes in any KPI with the use of other driving KPIs.
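A quick way to surface such lead-lag relationships is to correlate one KPI against lagged versions of another and keep the lag with the strongest relationship, as in the sketch below (the column names and the 12-week search window are illustrative).

    # Find the lag (in weeks) at which a driving KPI best anticipates a target KPI.
    import pandas as pd

    df = pd.read_csv("kpi_history.csv", parse_dates=["week"]).set_index("week")
    driver, target = "distribution", "sales"

    best_lag = max(
        range(0, 13),                                            # test 0-12 week leads
        key=lambda k: abs(df[driver].shift(k).corr(df[target])),
    )
    print(f"{driver} leads {target} by ~{best_lag} weeks;",
          "corr =", round(df[driver].shift(best_lag).corr(df[target]), 3))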

The Change

A front-end platform for business users was enabled, providing the required intelligence and empowering the cross-functional business teams to be agile and accurate in decision-making. The platform comprised three core modules:

  • An Early Warning System that provides a forward-looking view of likely changes to sales momentum over the next few weeks or months. This helped spot issues or opportunities that could require intervention.
  • A Drivers Discovery Module used to uncover the key drivers in a market and identify potential levers to pull to regain momentum. This augmented CCBT with differentiated drivers and brought out the relative importance and impact of different levers in the current environment.
  • A Simulator Module used to run different intervention scenarios and predict their likely outcomes, enabling the company to course-correct its investment plan.
Gain behavioral insights from anonymous website customers

The Big Picture

A technology giant, with a sales-centric business model, wanted to trace the journey of its potential customers even before they register on its website. The company wanted to increase conversion rates by identifying anonymous customers with high engagement scores and buying intent. It also wanted to visualize millions of pieces of cookie information in one unified view.

Transformative Solution:

The company’s website had 15 million visitors and 100 million clicks per month. Of these visitors, 12 million were unregistered and 3 million were registered. Among the unregistered users, 10.5 million were anonymous and 1.5 million were identified using cookies and reverse IP lookups. Cookie information on unregistered users was processed using big data ETL processes (Hadoop and Hive), and Tableau was used to visualize the journeys of customers found to have high engagement scores and buying interest.

Cookie-level information was connected with organization information to identify potential customers. The solution showed that every unregistered visitor leaves traces of their web activity that can be traced back to their physical location and used to infer their interests and requirements. Organization information was identified for these anonymous visitors, and potential customers were flagged so that their interests and buying behaviors could be predicted.
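As a rough illustration of the cookie-to-organization mapping, the sketch below resolves visitor IPs through reverse DNS and derives an organization from the hostname. Real engagements typically also rely on commercial IP-intelligence data; the file and column names here are hypothetical.

    # Map anonymous visitors' IPs to organizations via reverse DNS, then summarize.
    import socket
    import pandas as pd

    visits = pd.read_csv("unregistered_visits.csv")   # cookie_id, ip, page, timestamp

    def reverse_dns(ip):
        try:
            return socket.gethostbyaddr(ip)[0]        # e.g. 'gw.examplecorp.com'
        except OSError:
            return None

    ip_to_host = {ip: reverse_dns(ip) for ip in visits["ip"].drop_duplicates()}
    visits["host"] = visits["ip"].map(ip_to_host)

    # crude organization guess from the registered domain of the hostname
    visits["org"] = visits["host"].str.extract(r"([^.]+\.[^.]+)$", expand=False)
    print(visits.groupby("org")["cookie_id"].nunique().sort_values(ascending=False).head())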

The Change:

As a result of the solution, the company was able to:

  • Uncover organization information for 18,000+ non-registered users who left web interaction data.
  • Map 18,000+ non-registered users to 4,000+ organizations.
  • Increase effectiveness of marketing campaigns by targeting 50% of the registered users who were earlier anonymous to the client.
AI based processing of transactions data on Big Data infrastructure

The Big Picture

A leading loyalty firm wanted to launch an online-only coalition program in a region. The plan was to establish a ‘marketplace of marketplaces’ offering loyalty solutions to large ecommerce platforms in the region. To this end, the company’s most pressing need was the efficient management of huge volumes of transaction data (~one million transactions within six months of launch). The company engaged with multiple vendor partners to design, launch, track, and execute the program.

Transformative Solution

The company worked with Fractal to implement an on-premises SaaS delivery model, hosted by Fractal, for setting up the data warehouse. A big data technology stack was used to support the data hosting ecosystem. The approach included periodic data inputs from multiple sources, such as member, transaction, campaign, PCA, product, seller, and clickstream data.

The data warehouse was set up to handle multiple relational and non-relational data sources. It was built on a state-of-the-art data hosting ecosystem utilizing big data technologies such as Apache Hadoop, Spark, Hue, Ambari, and more. In line with the client’s needs, the data warehouse servers were hosted on Fractal premises to ensure data security and a cost-effective solution. Daily visual reports were provided in Tableau for program tracking.
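As a rough illustration of the kind of ingestion this stack supports, the PySpark sketch below lands two of the feeds and joins them into a reporting table. The paths, schemas, and the 'loyalty' Hive database are assumptions, not the actual pipeline.

    # Land member and transaction feeds from HDFS and publish a joined Hive table.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("loyalty_dw_load")
             .enableHiveSupport()
             .getOrCreate())

    members = spark.read.option("header", True).csv("hdfs:///landing/members/")
    txns = spark.read.json("hdfs:///landing/transactions/")

    enriched = txns.join(members, on="member_id", how="left")
    enriched.write.mode("overwrite").saveAsTable("loyalty.transactions_enriched")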

The solution enabled the company to:

  • Understand launch, financial status, and customer profiles.
  • Track merchandising trends and assess performance of scheme(s).
  • Monitor overall program health and review campaign performance.
  • Enable 1-on-1 personalization by combining web click data with transaction data.

The Change

As a result of the engagement, the company gained a highly scalable solution (~one million transactions within six months of launch) handling data input from multiple sources. The company unearthed opportunities to cross-sell and up-sell. The solution approach also provided the foundational building blocks for advanced coalition and loyalty analytics.

Detect anomalous behavior using security surveillance video streams

The Big Picture

A leading French aerospace and defense company sought to use access control data and surveillance videos to identify abnormal behaviors, while minimizing the rate of false alarms and optimizing true positive detections. This would help support its operator in assessing the origin, root cause, and impact of the alarm. The company also wanted to develop a ‘point of view’ and recommendations on improving the efficiency and intelligence of infrastructure.

Transformative Solution

The company performed root-cause analysis to understand people flows, which could help it pre-empt anomalies and improve operational efficiencies. Using statistical and machine-learning techniques, it developed a solution framework to analyze employees’ behavior and flows at various access points and identify anomalous activities. It also developed an interactive dashboard that enabled tracking of the facility by harmonizing data from various sources. Advanced machine-learning techniques, such as self-organizing maps, were used to analyze employee behavior.
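To show how a self-organizing map can surface unusual access behavior, here is a minimal sketch using the open-source minisom package: employees whose feature vectors sit far from their best-matching unit get high anomaly scores. The package choice, features, and file are assumptions, not the solution actually deployed.

    # Flag unusual employee access behavior with a self-organizing map.
    import numpy as np
    import pandas as pd
    from minisom import MiniSom

    df = pd.read_csv("employee_access_features.csv")  # one row per employee (employee_id assumed)
    features = ["avg_entry_hour", "avg_stay_hours", "buildings_visited", "swipes_per_day"]
    X = ((df[features] - df[features].mean()) / df[features].std()).to_numpy()

    som = MiniSom(8, 8, X.shape[1], sigma=1.0, learning_rate=0.5, random_seed=42)
    som.train_random(X, 5000)

    # anomaly score = distance from each employee to its best-matching unit
    weights = som.get_weights()
    df["anomaly_score"] = [np.linalg.norm(x - weights[som.winner(x)]) for x in X]
    print(df.nlargest(10, "anomaly_score")[["employee_id", "anomaly_score"]])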

The approach surfaced several key insights. Statistical and machine-learning models showed that 90% of anomalies were due to higher usage, invalid cards, and unusual building visits. The analysis also yielded insights about employee behavior, such as duration of stay, buildings accessed, campus entry times, access patterns, and more.

The Change

As a result of the solution, the company gained several benefits. The facility analysis dashboard can be used for real-time tracking and reporting of anomalous behavior at any access point, and employee analysis can help in understanding patterns in employee behavior at any level of granularity (overall, department, or individual) and in comparing an individual’s behavior to identify anomalies.
The company also gained the ability to harmonize data from various sources such as access control data, video files, and external sources like live weather and traffic status, and to process video streams and access data together to create a single view of the employee.

Deploy a big data technology strategy

The Big Picture

A leading US health insurer wanted to improve its existing big data infrastructure to support its future big data analytics use cases. The company wanted to evaluate its current big data use cases and the underlying infrastructure, and benchmark it with industry standards.

It wanted to understand its business needs and identify a list of existing and new big data use cases to execute. The client also sought to develop a blueprint for building big data infrastructure to support the execution of these use cases, with clinical analytics as the scope.

Transformative Solution

To solve the company’s challenges, a big data technology strategy roadmap was created. The data strategy components included data shape, data volume, data latency, analytics development, analytics operationalization, insights consumption, and integration. The analysis stages included determining key business objectives, identifying use cases, evaluating existing capabilities, identifying gaps, and proposing a new architecture.

Four steps were taken to deliver big data recommendations and a roadmap, over a ten-week period:

  • Finalized use cases: The approach identified current and new use cases, and then finalized the use cases that fit the big data use case definition. A scope and project plan document was provided.
  • Performed gap analysis: The solution created detailed requirements and evaluated existing capabilities. This provided a current data component architecture, identified data sources, and identified components to replace or upgrade.
  • Evaluated architecture patterns to bridge the gap: The activities in this step included assessing the feasibility of platforms, evaluating alternatives, and outlining short- and long-term investments. This resulted in a gap assessment document and benchmarking. Big data components were proposed such as Hadoop, Spark, Storm, Kafka, and Spring XD Platform.
  • Provided final recommendations: A detailed recommendation and implementation roadmap was created. Best practices for big data setup and governance were provided, along with a guide for advanced visualization techniques and tools.

The Change

As a result of the engagement, the client gained several key benefits. These included: Final recommendations and a roadmap; design and best practices; a risk mitigation and governance plan; and a big data implementation plan. This blueprint provided the company with an approach to support its future big data analytics use cases.

Detect anomalies using sensor data

The Big Picture

A leading European engineering company had sensor data that was continuously collected from its multiple turbines. The company wanted to build an algorithmic workflow to enable automatic detection of gearbox anomalies using gearbox sensor data. This was challenging because the company needed to collate data from multiple sensors, deploy the solution in production on high-velocity data, and predict machine failures at an early stage.

Transformative Solution

Spectral data from different sensor-variant-frequency combinations were analyzed to understand the gearbox behavior. Based on the trends, all time series were classified into different categories and model performance was evaluated for each category. Additional filters were deployed to prevent false alarms.

The approach used two models for anomaly detection. One captured long-term system behavior, and the other captured recent trends. Anomaly scores from the models were passed into a detection engine to identify anomalies. The engine raised an alarm if the anomaly scores were beyond an acceptable threshold for a specified time period.
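A minimal sketch of that two-model pattern is shown below: a long-window baseline and a short-window trend each yield a score, and an alarm fires only when the combined score stays above a threshold for a sustained period. The sensor column, windows, and thresholds are illustrative.

    # Two-model anomaly scoring with a persistence rule before raising an alarm.
    import pandas as pd

    s = pd.read_csv("gearbox_sensor.csv", parse_dates=["ts"]).set_index("ts")["vibration"]

    long_mean, long_std = s.rolling("30D").mean(), s.rolling("30D").std()
    short_mean = s.rolling("6H").mean()

    long_score = ((s - long_mean) / long_std).abs()            # deviation from long-term behavior
    short_score = ((short_mean - long_mean) / long_std).abs()  # drift in the recent trend

    score = 0.5 * long_score + 0.5 * short_score
    above = (score > 3).astype(int)
    sustained = above.rolling("12H").min() == 1                # above threshold for 12h straight
    print(f"{int(sustained.sum())} alarm timestamps")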

The Change

As a result of the solution, the company realized several benefits. Thanks to its anomaly detection capability and flexible, scalable methodology, the model can be deployed as a product on the client’s internal platform, enabling cost savings from early detection of machine failures. The model can make predictions on live streaming (high-velocity) sensor data, and because it was built with adaptive machine-learning methods, it avoids the time that would otherwise be spent rebuilding models at regular intervals.

Build a unified data delivery platform

The Big Picture

A leading insurance provider wanted to build a scalable data preparation tool offering the highest level of granularity for a group of data scientists. To get there, it needed to ingest, process, and load complex, deeply nested data; generate data in a readable format for end users; and develop a unified data delivery platform. The solution also needed to be both extensible and maintainable.

This involved several key challenges, such as processing ~61 TB of deeply nested XML data in compressed format. The solution also needed to flatten the XML so that the processed data contained a consistent set of fields irrespective of schema changes in the source files. The company also sought to automate the entire ETL process as a service.

Transformative Solution

To solve the company’s challenges, a three-step approach was taken: collect, process, and consume the data.

The collect step involved gathering data from the sources. Highly nested, complex XML data was consumed from HDFS, and compressed data was read in a scalable fashion (decompressed during job runs). Custom input formats were written to handle the variety of input data formats.

The process step involved extracting fields and partitioning. The approach was to iterate through the XML and retrieve field values, generate key-value pairs in a semi-flattened structure, partition the data based on the key (varies at XML level), and generate headers required for output rectangularization.
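A simplified sketch of that extract-and-flatten step follows: it streams a large XML file and emits semi-flattened key-value pairs per record, with keys built from the element path. The record tag name and input path are assumptions.

    # Stream-parse XML and flatten each record into path-based key-value pairs.
    import xml.etree.ElementTree as ET

    def flatten(elem, prefix=""):
        """Yield (key, value) pairs for an element subtree."""
        key = f"{prefix}.{elem.tag}" if prefix else elem.tag
        if list(elem):                       # has children -> recurse
            for child in elem:
                yield from flatten(child, key)
        else:
            yield key, (elem.text or "").strip()

    # iterparse keeps memory usage flat even for very large files
    for event, elem in ET.iterparse("claims.xml", events=("end",)):
        if elem.tag == "record":
            row = dict(flatten(elem))
            # partition on a key field and write out (omitted here)
            elem.clear()                     # free memory for processed records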

In the consume step, the data preparation tool was implemented: a simplified user interface, built on Python Flask, that allowed users to prepare data at the highest granularity and choose from a list of available partitions.
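For flavor, a minimal Flask sketch of such a partition-selection front end is shown below; the partition list is stubbed and would in practice be read from the underlying store.

    # Minimal Flask UI: list available partitions and let the user pick one.
    from flask import Flask, request, render_template_string

    app = Flask(__name__)
    PARTITIONS = ["2019-01", "2019-02", "2019-03"]   # stub: would be discovered from the store

    PAGE = """
    <form method="post">
      <select name="partition">
        {% for p in partitions %}<option>{{ p }}</option>{% endfor %}
      </select>
      <button type="submit">Prepare data</button>
    </form>
    {% if chosen %}<p>Preparing extract for partition {{ chosen }}...</p>{% endif %}
    """

    @app.route("/", methods=["GET", "POST"])
    def index():
        chosen = request.form.get("partition") if request.method == "POST" else None
        return render_template_string(PAGE, partitions=PARTITIONS, chosen=chosen)

    if __name__ == "__main__":
        app.run(debug=True)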

The Change

As a result of the engagement, the company gained the ability to:
  • Process TBs of data (61 TB) and automate the complete ETL process.
  • Handle compressed and deeply nested data at large scale.
  • Present the most granular details to the data science group in order to streamline their process and improve efficiency.

Predict ‘propensity to buy’ using big data analytics

The Big Picture

A leading FMCG company wanted to develop purchase propensity models for its customers (who purchase products both online and offline) to better plan its manufacturing and inventory processes. It wanted to improve the accuracy of the current approach, recommend improvements to it, and develop a scalable solution to churn terabytes of data (100 million data points).

To get there, the company needed to:

  • Churn large data sets using a big data stack and perform EDA
  • Perform feature engineering on large data sets
  • Build an automated and scalable machine-learning pipeline

Transformative Solution

The solution leveraged structured data from customer transactions; data preparation included a monthly roll-up of customer IDs, an EDA to choose the prediction window for the target variable, and feature engineering on the observation window. The solution also leveraged semi-structured behavioral data on customers, which was prepared by removing redundancy at the segment level.

Classification models were developed and an ensemble of models, including Lasso, Random Forest, and GBM, was implemented; these were compared to identify the best model. The best model was then used to identify the right probability cut-off for predicting whether a customer is likely to make a purchase. The solution was scaled on a big data platform and delivered an automated script that provides purchase propensity scores for each customer.
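A simplified sketch of that comparison and cut-off selection is shown below; the Lasso component is approximated with an L1-penalized logistic regression, and the features, file, and F1-based cut-off rule are assumptions rather than the engagement's actual choices.

    # Compare candidate classifiers by AUC, then pick a probability cut-off.
    import numpy as np
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score, precision_recall_curve

    df = pd.read_parquet("customer_features.parquet")
    X, y = df.drop(columns=["customer_id", "purchased"]), df["purchased"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)

    models = {
        "lasso_logit": LogisticRegression(penalty="l1", solver="liblinear", C=0.1),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
        "gbm": GradientBoostingClassifier(random_state=0),
    }
    auc = {}
    for name, m in models.items():
        m.fit(X_tr, y_tr)
        auc[name] = roc_auc_score(y_te, m.predict_proba(X_te)[:, 1])
    best = max(auc, key=auc.get)
    print("AUC by model:", auc, "-> best:", best)

    # choose the cut-off that maximizes F1 on the holdout set
    proba = models[best].predict_proba(X_te)[:, 1]
    prec, rec, thr = precision_recall_curve(y_te, proba)
    f1 = 2 * prec * rec / (prec + rec + 1e-9)
    print("chosen cut-off:", thr[np.argmax(f1[:-1])])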

The Change

As a result of the engagement, the client:

  • Identified the average turn-around-time for a customer to buy a product in a particular segment.
  • Extracted insights from TBs of data with an execution time of only two hours.
  • Experimented with Logistic Regression and Random Forest algorithms to arrive at higher accuracy.
  • Enabled an automated scalable solution to procure products for a particular segment.

 

Decoding and influencing mis-selling

The Big Picture

The insurance industry relies heavily on large numbers of sales staff who sell insurance policies after face-to-face discussions with prospects. This large private-sector insurance company was no different, with 10,000+ sales staff. It had been struggling with rampant complaints, internal and external, about sales staff indulging in mis-selling.

The company relied on strict punishment as its compliance mechanism: surprise audits, multiple levels of disciplinary action (cautions, warnings, and terminations), and so on. Even with these policies in place, new employees were seen engaging in the same malpractices as older, more experienced hands. The number of people cautioned, warned, and terminated rose every quarter with no downward trend. The organization commissioned us to unearth the behavioral drivers of these integrity issues and to design interventions to reduce mis-selling.

Transformative Solution

The vast majority of prevalent research tools depend on a respondent’s ability to introspect, deliberate, and consciously provide responses and analysis. Unlike these, our research methodology is game-based, designed to go beyond the “Say-Do” gap: the difference between what consumers say and what they do.

The Conundrum Ethnolab research tool is a group exercise designed to eliminate participants’ personal filters and capture responses representative of the emotions, mental models, and biases underlying behavior. We deployed the Conundrum Ethnolab among 110 employees across four regions to understand the true drivers of sales-staff decisions.

We discovered that, because mis-selling was the norm in branches, the one emotion that could have prevented such practices was conspicuous by its absence: guilt. As a result, while employees knew their actions were illegal, they did not feel they were immoral.

We designed and deployed structural and tactical interventions aimed at reducing mis-selling by increasing the guilt associated with it, strengthening organizational memory of fairness, and engineering a ritual to embed learning of the “right” actions.

The Change

Interventions were deployed across 400+ branches of the organization, since some structural interventions involved the creation of organizational roles and responsibilities that could not be limited to only a part of the organization. Leading and lagging metrics related to mis-selling were defined, and data were captured across all branches for eight months. Historical data provided a baseline for comparison.

About 60-80% of managers saw fewer terminations, warnings, and cautions in the eight months following the intervention compared to the baseline.

Tackling road accidents with behavioral science

The Big Picture

Moradabad Bareilly Expressway (MBEL) is a 121 km stretch of National Highway 24 in India that has been operational since January 2015.

Within the first nine months of operation, over 450 accidents were recorded, an average of 55 accidents per month.

The four-lane highway passes through a mix of densely populated semi-urban areas and many rural transitions. Driving behavior along MBEL was heavily influenced by local norms: unlike on most highways, the drivers here were mostly from villages and towns in the vicinity and were well versed in the various routes and shortcuts. Consequently, wrong-side driving, over-speeding, and other high-risk driving behaviors had become the norm. Moreover, the lack of sensory cues alerting Main Carriageway (MCW) drivers to approach-road entry points added to the number of accidents recorded.

Transformative Solution

FinalMile approached driving through the lens of behavioral science, treating it as a largely non-conscious activity governed by inherent biases and heuristics in human decision-making. The human brain’s limitations in information processing lead road users to make undesired decisions and adopt undesired behaviors.

Combining learnings from behavioral science and design, we developed an intervention strategy to evoke a non-deliberative positive response from the target audience, triggering behavior change at a non-conscious level. These methods work universally, overcoming demographic, regional, and language barriers in the road safety context.

The following principles were adopted for designing interventions:

  1. Managing private-optimism and risk-unavailability of the local drivers that lead to high-risk driving behavior.
  2. Sensory cues for managing entry and exit of vehicles.
  3. Managing goal incongruency that led to rampant wrong side driving.
  4. Managing inter-vehicle distance and unanticipated behavior at merge points to reduce rear-end collisions.

The set of interventions included road markings, road signs, and a disaster mitigation report.

The Change

The interventions helped reduce accidents by 46%.

 

FTI case study

Company Background

Franklin Templeton Investments (FTI) is one of the world’s largest asset managers, selling mutual funds to investors through financial advisors (FAs). One of FTI’s chief goals is to increase overall market share in an industry that is rapidly evolving with the demand for new products, a shift to fee-based business models, and, more recently, the emergence of robo-advisors and numerous regulatory changes. FAs are challenged by this dynamic, highly competitive marketplace and have begun to adopt new digital channels to gain access to fund and market information. Engaging them meaningfully and serving their needs in this omni-channel world is critical to their retention.

Fractal is a great partner that has helped us advance our analytics practice. They contributed a level of experience and expertise that allowed us to move much more quickly than we would have been able to do on our own.

  • Jennifer Ball, SVP Global Product Marketing & Insights, Franklin Templeton Investments

The Big Picture

In this changing competitive landscape, it is imperative to influence the FA’s investment decision at the right time, with the right sales and marketing intervention, and with the right information for their business needs, in order to grow FTI’s share of wallet. How do we better understand the behaviors, preferences, and needs of FAs on a more personal level? How do we best communicate information about relevant products with the right message and channel? How can Franklin’s interactions with them provide maximum value while simultaneously improving their level of digital engagement?

The Transformative Solution

To better serve the needs of its financial advisors, FTI partnered with Fractal Analytics to deeply understand FAs by creating ‘Advisor Genomics®’, a set of proprietary machine-learning algorithms. This enabled Franklin to transform its digital strategy and leverage analytics and machine learning as a competitive advantage, increasing message relevance across all digital channels and providing actionable insights that add day-to-day value to sales and marketing engagement with FAs.

Advisor Genomics leverages machine-learning and pattern-matching algorithms to develop a comprehensive understanding of FAs by using granular historical data to generate a personalized set of ‘FA genomic labels.’ These labels are probabilistic scores indicative of future propensities to transact and interact; they are also self-learning and adaptive, updating with every new data point in an automated manner.

Rather than rely on a traditional RFM segmentation-based framework, Advisor Genomics was developed with the belief that customers ‘are what they engage with, and how they react to those engagements.’ This deep and adaptive understanding enables hyper-contextual and personalized interactions with FAs by delivering the right message through the right channel at the right time.
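To illustrate what a self-updating propensity score can look like in practice, the sketch below maintains an online logistic model that is refreshed as each new interaction arrives. It is a generic illustration with made-up features, not the Advisor Genomics implementation.

    # An adaptive propensity score: update a logistic model one observation at a time.
    import numpy as np
    from sklearn.linear_model import SGDClassifier

    model = SGDClassifier(loss="log_loss", random_state=0)
    classes = np.array([0, 1])                       # did not transact / transacted

    def update(features, outcome):
        """Fold one new observation into the model."""
        model.partial_fit(features.reshape(1, -1), [outcome], classes=classes)

    def propensity(features):
        """Current probability of a transaction given the latest behavior."""
        return model.predict_proba(features.reshape(1, -1))[0, 1]

    # simulated stream: (email_opens, web_visits, recent_purchases) -> outcome
    rng = np.random.default_rng(0)
    for _ in range(500):
        x = rng.poisson([3, 5, 1]).astype(float)
        y = int(x.sum() + rng.normal() > 9)
        update(x, y)
    print("propensity for a highly engaged advisor:", propensity(np.array([6.0, 9.0, 2.0])))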

The Results

FTI’s ‘Advisor Genomics’ personalized behavioral and contact strategy system developed by Fractal Analytics resulted in:

 

Franklin Templeton generates $600mn in new assets with Customer Genomics®

Insights

  • By combining customer genome predictions with product attribute matching, Fractal Analytics helped FTI profile potential advisors with laser precision so they could recommend the company’s products to their clients.
  • Advisor Genomics identified additional leads to target for higher sales, as well as seasonal behavior in advisor purchases.

Innovation

  • Advisor Genomics identified influencers within a specific firm who had the greatest potential to significantly increase sales.
  • Advisors most likely to be active on social media channels were predicted and pro-actively engaged.

Impact

  • Fractal identified 13.2K sales leads with a 65% success rate
  • Insights generated 1,300 incremental leads that garnered $600 million in assets under management
  • Results featured in the ANA Genius Award video produced by FTI

This new marketing channel, combined with the new analysis and the analytics that we get from it, is really helping us expand our practice and reach clients that we wouldn’t have been able to reach otherwise.

  • David McSpadden, Chief Marketing Officer

The Implementation

In order to establish a truly adaptive customer intelligence platform to meet FTI’s objectives, Fractal Analytics led the team to develop and implement the following protocol:

Harmonizing Data

  • Integration of asynchronous data from various sources to enable efficient measurement of the sales impact of multi-channel interactions, including email campaign data, web visit data, online simulation request data, social media data, and event data

Understanding FA Behaviors

  • Attributes were created for 350K advisors across the U.S.
  • FA interactions were measured across various digital channels
  • Estimates were generated of the impact of multi-channel interactions on customer sales, transactions, and conversions

Optimizing FTI’s Interaction Strategy

  • An optimal sequence of digital and non-digital interactions (calls, visits) was identified for each FA based on channel preference, channel sequence, digital engagement, and message response
  • Optimal contact channels for sales interactions were established for 90K of the advisors

Ultimately, we are striving to really build and define and improve an ongoing conversation with our customers and really optimize and deepen our relationship with them.

  • Kate Biagini, Manager, Analytics

Customer Genomics ‘Genome Markers’ enable real-time customer intelligence and engagement

Customer Genomics uses all available signals from each individual consumer across digital channels to build predictive artificial-intelligence algorithms that create markers predicting the likelihood that a specific consumer will respond to digital actions. This approach differs from traditional targeted segmentation in its ability to predict future behaviors, attitudes, and preferences adaptively, with the models automatically updated after each consumer action.

 

An interview with Jennifer Ball, SVP Global Product Marketing & Insights, Franklin Templeton Investments

Which strategic business objectives does analytics support for Franklin Templeton?

  • At Franklin Templeton, our objective is to help investors achieve their goals with confidence so they can create a better future. Analytics plays a key role in helping us help our clients. For example, analytics helps us connect in a relevant way with the financial advisors who recommend our solutions.

What impact has analytics achieved in driving success for these objectives?

  • We partnered with Fractal to implement an adaptive customer intelligence program called “Advisor Genomics” to transform our contact and digital marketing strategies. Genomics helps us determine which advisors to contact, how to contact them, and what to discuss.

Give us a peek under the hood of ‘Advisor Genomics.’ How does it actually work?

  • The analogy to genomics works well as we are really trying to understand the business DNA of a financial advisor. We leverage machine learning and pattern matching algorithms on historical data to identify individual genome markers – from channel preferences and message response to purchase propensities, and the experience they have had with our solutions.

What actions are these ‘markers’ specifically used to drive or advance?

  • We use these markers for lead generation, to identify relevant conversation topics for use by our sales force, and for our digital marketing efforts. Each activity and response feeds back into our models. The goal is to have an environment where sales and marketing are equipped to engage with financial advisors in a way that speaks directly to their preferences and needs.

What role has Fractal played in furthering your analytics goals?

  • The work we did with Fractal was recognized by the ANA for the Genius Award in Digital Marketing Analytics. Fractal is a great partner that has helped us advance our analytics practice. They invested in learning our industry, our business, our company and our people. They contributed a level of analytics experience and expertise that allowed us to move much more quickly than we would have been able to do on our own.

What do you appreciate most about working with Fractal?

  • The Fractal team became an extension of our team. They are smart, collaborative and really fun people to work with.
Enable forecasting for global markets across regions

The Big Picture

The global analytics team of a large manufacturer of medical devices, pharmaceuticals, and consumer packaged goods was developing global-scale country-category forecasts using syndicated data sources. The team wanted to build an automated forecasting solution to generate forecasts of value and volume across 450+ country-category combinations. To give country, regional, and business heads a holistic view of the numbers, the forecast outputs were to be consumed through dashboards and presentations.

Transformative Solution

The company leveraged Fractal’s Centralized Analytics Environment (CAE) to successfully deliver a market-size forecasting solution across 450+ country-category combinations in 90 countries. To achieve this, the following CAE modules were implemented:

  • Data processing: Fractal used its business and market understanding to harmonize the data for all combinations and mapped the external factors that may impact forecasts. This harmonized data was put together from multiple data sources.
  • Data modeling: Using its proprietary forecasting solution, Autocast, Fractal deployed multiple time-series statistical models to generate candidate forecasts and selected the best-fit model output (a simplified sketch of this selection step follows this list).
  • Validation checks: Fractal carried out validation checks on the statistical model outputs using business rules and market understanding to confirm the validity of the outputs.
  • Business consumption: The finalized category forecasts were shared with business stakeholders through a Tableau dashboard.
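The sketch below illustrates the selection idea behind the data modeling step: fit several candidate time-series models per country-category series, score them on a holdout window, and keep the best. It is a generic stand-in using statsmodels, not the Autocast implementation; the file, horizon, and candidate set are assumptions.

    # Fit candidate forecasting models, pick the one with the lowest holdout MAPE.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing
    from statsmodels.tsa.arima.model import ARIMA

    series = pd.read_csv("country_category_sales.csv",
                         parse_dates=["month"], index_col="month")["value"]
    train, test = series[:-12], series[-12:]

    def mape(actual, forecast):
        a, f = np.asarray(actual, dtype=float), np.asarray(forecast, dtype=float)
        return float(np.mean(np.abs(a - f) / a) * 100)

    candidates = {
        "holt_winters": ExponentialSmoothing(train, trend="add", seasonal="add",
                                             seasonal_periods=12).fit(),
        "arima_111": ARIMA(train, order=(1, 1, 1)).fit(),
    }
    errors = {name: mape(test, m.forecast(12)) for name, m in candidates.items()}
    best = min(errors, key=errors.get)
    print("MAPE by model:", errors, "-> selected:", best)
    final_forecast = candidates[best].forecast(24)   # next two years, monthly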

The Change

The CAE provided the client with the following insights and forecasts for the next two years for all global markets across regions:

  • Actual historical values compared with model forecast outputs across categories, country, and region.
  • Monthly and quarterly forecasts with year-on-year growth rates.
  • Identification of non-performing markets.
  • Monitoring of market prices vis-à-vis sales value.

The company also benefited from using these technologies:

  • Autocast was leveraged to build the market-size forecasting solution.
  • Tableau visualization provided a holistic view to stakeholders on market performance and enabled better consumption of the output.