Senior business leaders convened at The Langham in London on February 26 for ai.lcy, the fifth in Fractal’s series of artificial intelligence (AI) events. This time, leading thinkers from academia and business posed key questions about the future relationship between humans and machines on the core theme of AI meets design.
The event arrived at a crucial point for AI within businesses.
As Fractal co-founder Srikanth Velamakanni noted, the question is no longer whether we have the required data or technology to implement AI systems. Rather, we must decide how we will use this phenomenal power.
BBC’s Leo Kelion opened a day of highly stimulating discussions by introducing the central themes: asking the right questions to solve the right problems, designing with AI for customer delight, and combining emotional intelligence with AI.
Reframing the question
Leo cited the classic Albert Einstein quote, “If I had an hour to solve a problem I’d spend 55 minutes thinking about the problem and five minutes thinking about solutions.”
He added that there are times today when AI feels more artificial than intelligent. It works very well in a constrained and controlled environment, but if, for example, we do not phrase our query correctly to an Alexa-enabled device, it struggles to respond.
This served as a fitting curtain-raiser, as the day’s speakers continued to assess how we reframe business questions, to deliver answers through AI and design.
Johan Aurik, Partner and Chairman Emeritus at Kearney, observed that while nine out of ten business leaders say AI is of strategic importance, just four out of ten have implemented an AI solution, and fewer still have reported success.
Fireside chat – AI: Making it work in the real world.
Hannah Fry, Johan Aurik and Srikanth Velamakanni
He built on this by remarking that too many businesses see AI as a tool for delivering efficiency, rather than growth. However, if we begin with genuine customer problems and view technology as an enabler of innovative solutions, we will build the growth mindset that unlocks AI’s potential.
“Don’t start with the technology and what it can do”, Johan advised, as this will inevitably shape how we design our systems.
But how do we even know which questions to ask, to ensure that we are solving the right problems?
There were numerous takes on this pressing query throughout the day.
Natwar Mall, CEO of business analytics platform Cuddle.ai, emphasized that the key indicators are often hidden in plain sight.
We tend to overestimate our ability to digest lots of complex information, Natwar continued. This is evident in the complex performance dashboards so many businesses employ.
With so much data at their disposal, employees are often overwhelmed rather than empowered. Natwar shared the statistic that just 29% of businesses link analytics to action today, so the current approach is clearly lacking in effectiveness.
Instead, AI (most notably, through Natural Language Processing) can let employees speak to the technology and receive useful responses about the company’s data. The data that these exchanges produce can be invaluable when trying to understand the typical pain points people encounter, too.
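The idea can be sketched with a deliberately naive example. Everything here is hypothetical: the metric names, figures, and keyword matching stand in for the far richer intent parsing a real NLP system would perform.

```python
# Toy natural-language lookup over business data. The metrics and
# figures below are invented for illustration only.
SALES = {
    ("revenue", "Q1"): 1.2e6,
    ("revenue", "Q2"): 1.4e6,
    ("churn", "Q1"): 0.031,
    ("churn", "Q2"): 0.027,
}

def answer(question: str) -> str:
    """Very naive intent matching: find a known metric and period in the text."""
    q = question.lower()
    metric = next((m for m, _ in SALES if m in q), None)
    period = next((p for _, p in SALES if p.lower() in q), None)
    if metric is None or period is None:
        return "Sorry, I couldn't map that question to a known metric."
    return f"{metric} for {period}: {SALES[(metric, period)]:g}"

print(answer("What was our revenue in Q2?"))  # → revenue for Q2: 1.4e+06
```

A production system would replace the keyword scan with a trained language model, but the shape of the exchange (question in, grounded answer out) is the same.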
Future of decision-making @ work.
Forrester’s James McCormick later predicted that insights-driven businesses, such as Uber or Tesla, will generate $1.8 trillion in 2021, so the prize on offer is sizable for those that can break through this impasse.
Cuddle.ai uses “nudges” to help people work more effectively, and Stephen Donajgrodzki from Kellogg Company built on this behavioral science concept in his talk on how data can help us understand what makes people tick.
Stephen warned against some common errors when it comes to framing questions.
Companies often employ surveys to gather insights on how the audience perceives their brand. However, companies shape the audience’s answers through the design of the survey questions.
Stephen posited the following scenario: Ask someone, “How much do you love Coca-Cola?” and the question leads them to state that they love the brand. Ask them, “How much do you love Coca-Cola, compared to how much you love your mother?” and the frame of reference is markedly different from the outset.
Stephen concluded on a subtle and perceptive note that many brands could learn from: “Awareness in itself isn’t that useful. Understanding why people act is very useful.”
Human behaviors never exist in a vacuum, so we need to pose deeper questions that focus on what people do and why they do it. To “nudge” those behaviors, businesses must know how they want the audience to act.
As Pier Culazzo, Executive Director of Data Science Lab & Operations at Visa, said, this must also relate to the incentives the business is willing to offer to the customer.
If a company wants to understand its audience’s behavior, it needs the audience’s cooperation to access the data. There must be an obvious benefit for the consumer, or they will opt not to share their private information.
Pier pointed out that Visa processes around 130 billion transactions each year, so there is no shortage of data. The challenge is deciding what to do with the data, in the interests both of the company and its customers.
Visa, for instance, uses AI to tackle the customer problem of credit card fraud by spotting anomalous transactions. It has done so since 1993 and estimates that it saves $25 billion per year by preventing these fraudulent transactions.
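As an illustration of anomaly spotting (and not Visa's actual method), a minimal detector can flag transaction amounts whose modified z-score, computed from the median absolute deviation (MAD), exceeds a threshold. Unlike a plain mean-and-standard-deviation rule, this statistic is robust to the very outliers it is hunting. The transaction history below is invented.

```python
# Toy MAD-based anomaly detector for transaction amounts.
from statistics import median

def flag_anomalies(amounts, threshold=3.5):
    """Return amounts whose modified z-score exceeds the threshold."""
    med = median(amounts)
    mad = median(abs(a - med) for a in amounts)
    if mad == 0:
        return []  # no spread at all: nothing stands out
    # 0.6745 rescales MAD to be comparable with a standard deviation.
    return [a for a in amounts if 0.6745 * abs(a - med) / mad > threshold]

history = [42.0, 39.5, 44.1, 40.7, 43.3, 41.8, 40.2, 43.9, 41.1, 950.0]
print(flag_anomalies(history))  # → [950.0]
```

Real fraud systems combine many more signals (merchant, location, timing) in learned models, but the principle of scoring each transaction against established behavior is the same.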
This is just one example and it represents an impressive win-win for both the company and the consumer, but Pier still sounded a note of caution about using AI without due consideration for how the machine arrives at its answers to our questions.
He suggested, “You constantly need to think about, how do I balance the power of an upgrade versus my need to explain to the consumer how the upgrade came about.” The explainability of some AI systems is still a work in progress, and in many areas we need transparency. If a mortgage or loan application is rejected, the customer should be entitled to an explanation, after all.
How AI and Design together can develop workable solutions and solve problems at scale.
Pier Culazzo, Pranay Agrawal and George Mathew
Designing for Delight: EI + AI
To usher in a new age of partnership between human and machine, we need to know where each party excels – and where they typically fall short.
Dr Hannah Fry tackled this scenario in her fascinating keynote speech, “How to Be Human in the Age of the Machine.”
Hannah, who stars in BBC documentaries about mathematics as well as hosting the superb DeepMind podcast, focuses her academic work on how AI can help solve difficult social problems.
As Hannah noted, the sheer potential of AI brings with it a host of new challenges.
Scientists in the past did not need to worry about the ethics of fluid particles, but AI needs to exist in the dynamic field of the human world. Hannah added, “You can’t just build an algorithm that involves people and decide whether you think it’s good or bad in isolation.”
Hannah offered numerous examples of how bad design has hindered the progress of mathematically sound systems.
She stated that driverless cars are, in strict terms, excellent at driving: in the testing environment, they now execute the component tasks of driving very well.
However, the machines are not always designed with people in mind. Some models only require human intervention in an emergency, which demonstrates a lack of understanding of how people function. We do not perform well under sudden pressure, and yet the cars demand precisely that.
It does not need to be this way.
A good human-centered design would have the machine monitor for potential emergencies and step in to reduce the likelihood of an accident. Tesla and Volvo are just two car companies that have made this shift, taking into account how humans and machines can complement each other’s strengths – and counteract each other’s weaknesses.
Hannah shared a categorization of tasks that can help companies use design and AI to build a partnership between human and machine.
She stated that AI works fantastically well when it comes to specialized situations. It is much more likely to identify minor inconsistencies in patterns, for example, making it highly valuable for disease detection.
On the other hand, people are the experts when it comes to specificity. Hannah saw this as our “human superpower” in the age of AI. We can look at a small number of samples and make a judgment call much more effectively than a machine can.
Hannah summarized her core argument with, “Let the machine filter down all of the boring work, pick out areas of interest, and then let the human with their specificity superpowers come in and check, and decide.”
James McCormick from Forrester agreed with this in his talk, “Digital Data Tech and the Practice of Competing with Experiences.”
He cited the example of the clothing retailer Stitch Fix. Many shoppers face decision paralysis when they shop online, as there are simply too many choices on offer.
Stitch Fix uses AI to filter through its inventory and identify the products that best fit each individual customer, based on their taste profile. It then provides a human stylist with some options to choose from, making the decision process simpler and allowing the stylist to focus on what they do best.
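The filtering step might be sketched like this. The inventory, feature names, and weights are invented for illustration and bear no relation to Stitch Fix's real system:

```python
# Toy item-filtering sketch: score each item against a customer's taste
# profile and shortlist the top matches for a human stylist to review.
INVENTORY = {
    "linen blazer": {"casual": 0.2, "formal": 0.8, "bold": 0.1},
    "graphic tee":  {"casual": 0.9, "formal": 0.0, "bold": 0.7},
    "denim jacket": {"casual": 0.8, "formal": 0.1, "bold": 0.4},
    "silk tie":     {"casual": 0.0, "formal": 0.9, "bold": 0.2},
}

def shortlist(taste, k=2):
    """Rank items by dot product with the taste profile; return the top k."""
    def score(feats):
        return sum(taste.get(f, 0.0) * w for f, w in feats.items())
    ranked = sorted(INVENTORY, key=lambda item: score(INVENTORY[item]), reverse=True)
    return ranked[:k]

print(shortlist({"casual": 1.0, "bold": 0.5}))  # → ['graphic tee', 'denim jacket']
```

The machine narrows thousands of items to a handful; the stylist then applies the human judgment call on the shortlist, exactly the division of labor Hannah described.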
The end customer gets a personalized service, and the business can scale its capabilities without needing to hire hundreds more stylists.
James observed that just 7% of businesses today would qualify alongside Stitch Fix as “insights-driven”. He added that these businesses differentiate themselves by “voraciously competing on experiences”, using a combination of data, insights, and algorithms to create faster, better outcomes for customers.
Dan Makoski, Chief Design Officer at Lloyds Banking Group, provided a unique insight into how his team puts customers in control of their financial lives, using AI and design.
Dan underlined the human element in creating technology products, opening his talk with the suggestion that AI requires EI, or emotional intelligence.
Humanizing FinTech, by Design.
In the banking world as in so many others, it is tempting to think in terms of the products the company can offer.
However, at a deeper level the customer does not think in terms of current accounts or savings accounts. Beyond those surface needs, they think in terms of milestones and moments: saving to put their kids through college, for example, or going on a family vacation.
So far, banks have struggled to incorporate that truth into their customer-facing product.
Dan and his team tackled a complex question, to try and arrive at an altogether new answer: “What would banking feel like if it were more human?”
They visited their bank branches, held focus groups, and had plentiful brainstorms in their office, with the objective of pinning this down to one universal metaphor.
In the end, this was the metaphor of a journey.
Dan explained that people know what they are striving for, yet they feel imprisoned by their finances. They can see the goal, but then they ask, “How do I get there? What are my options? What happens if circumstances change?”
This is where AI should enter the equation. Dan discussed the example of Waze, which uses Global Positioning System (GPS) data and auto-rerouting algorithms to help people get from A to B, or to C, should they change their mind along the way.
Personal finance could apply the same AI principles to help people join the dots between their life milestones.
This was a recurring theme throughout the ai.lcy event: For all of AI’s hugely impressive advances, it works best when it is in the background, empowering people to make better decisions.
The challenge today is not to acquire the power of AI, but rather to harness it. If we apply enough care in framing our questions and designing solutions with people at their heart, AI will help us join the dots to deliver transformational results.
Fractal is excited to welcome Dr. Hannah Fry, Associate Professor in the Mathematics of Cities at the Centre for Advanced Spatial Analysis at UCL as this year’s keynote speaker at ai.lcy at The Langham, London on 26 February, 2020.
We are getting closer to ai.lcy 2020, which is all about a unique approach that combines AI, engineering, and design to drive human behavior and solve problems at scale for organizations.
Here is a quick recap of ai.lcy 2019. Over 70 senior executives from FTSE 250 companies attended the event, which focused on why AI alone is not enough to solve problems at scale. Kenneth Cukier, co-author of the New York Times bestseller “Big Data: A Revolution That Will Transform How We Live, Work, and Think,” delivered the keynote, joined by industry leaders from Google, Visa, Mars, Forrester, Lloyds Banking Group, M&G Prudential, and more.
As business leaders reserve their spots for ai.lcy 2020, looking forward to meeting peers to learn and share AI-led transformation journeys and the value AI has created, we have one more must-see to add to the event.
Introducing our keynote speaker: Dr. Hannah Fry
Dr. Hannah Fry is an Associate Professor in the Mathematics of Cities at the Centre for Advanced Spatial Analysis at UCL, where she studies patterns in human behavior. Her research applies to a wide range of social problems and questions, from shopping and transport to urban crime, riots and terrorism.
Her critically acclaimed BBC documentaries include Horizon: Diagnosis on Demand? The Computer Will See You Now, Britain’s Greatest Invention, City in the Sky (BBC Two), Magic Numbers: Hannah Fry’s Mysterious World of Maths, The Joy of Winning, The Joy of Data, Contagion! The BBC Four Pandemic and Calculating Ada (BBC Four). She also co-presents The Curious Cases of Rutherford and Fry (BBC Radio 4) and The Maths of Life with Lauren Laverne (BBC Radio 6).
Hannah is the author of Hello World, The Indisputable Existence of Santa Claus: The Mathematics of Christmas and The Mathematics of Love: Patterns, Proofs, and the Search for the Ultimate Equation.
About the session – How to be human in the age of the machine
Hannah has spent the last decade working with data, hunting for mathematical patterns in human behaviour. In that time, she has come across some incredible stories – written solely in the numbers – that get right to the heart of who we are as people.
Hannah will share some extraordinary tales about what’s happening at the very cutting edge of data science – but will also explore the idea of humans vs. machines. The modern rhetoric is that machines are waiting in the wings to dominate the workforce – and yet it’s an argument that, she hopes to persuade you, is missing something crucial.
In her talk, Hannah wants to highlight the salient differences between us and our machines. We’ll explore how the prevalent narrative of human replacement is missing the point and examine what happens when technology isn’t designed with humans in mind.
We look forward to having you at ai.lcy 2020.
Consumer goods companies are spending millions of dollars to access data from multiple sources. These data sets must be harmonized for their true power to be realized.
Misaligned data sets lead to incorrect analysis, which could then lead to misdirected business strategies. It also means a poor return on your investment, making it hard to justify additional spending on data. An un-harmonized data environment creates a range of such problems.
Having different data sets pointing in different directions can be worse than not having data at all. At best, incompatible data sets represent a waste of money; at worst, using them for business-critical analytics could lead to misguided strategies and material loss to the company.
Good data can transform a business, however. So your focus should be on turning these disparate sources into a coherent whole that lets you access the relevant data at the right levels. Data harmonization does exactly that, and typically at a fraction of the cost of purchasing the data sets in the first place.
What is data harmonization?
Harmonization is a continuous process that aligns all available data sources – market data, shipment data or customer data – across all key metrics.
By synchronizing data points across products, channels, time periods and geographies, otherwise disparate data sets can easily converse with each other. And that brings the full range of business data to bear, making the entire organization “analytics-ready.”
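In code, the core alignment step amounts to mapping each source's local codes onto a shared master hierarchy before aggregating. A toy sketch, with entirely hypothetical codes and figures:

```python
# Map local codes from two sources to one master product hierarchy,
# then aggregate so the sources can "converse" on the same keys.
MASTER = {  # local code -> harmonized (category, subcategory)
    "OC-TP":  ("oral care", "toothpaste"),
    "OC-MW":  ("oral care", "mouthwash"),
    "TPASTE": ("oral care", "toothpaste"),
    "MWASH":  ("oral care", "mouthwash"),
}

shipments = [("OC-TP", 120), ("OC-MW", 45)]    # internal shipment codes
market    = [("TPASTE", 400), ("MWASH", 150)]  # syndicated market codes

def harmonize(*sources):
    """Aggregate unit counts from all sources onto the master keys."""
    totals = {}
    for source in sources:
        for code, units in source:
            key = MASTER[code]  # align each local code to the master hierarchy
            totals[key] = totals.get(key, 0) + units
    return totals

print(harmonize(shipments, market))
# → {('oral care', 'toothpaste'): 520, ('oral care', 'mouthwash'): 195}
```

The hard work in practice lies in building and maintaining the `MASTER` mapping across thousands of products, channels, and geographies; the aggregation itself is the easy part.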
Harmonization helps develop focused marketing strategies.
A major Fortune 500 company was able to estimate the market size and develop focused marketing strategies for each subcategory, which were originally masked under one big category of oral care in the data provider’s database.
Harmonization vs. transformation
Combining data sources in a single warehouse is not the same as harmonization. There are many tools in the market that claim to transform and integrate different sources, but few are capable of truly harmonizing disparate data sets to the level required to build powerful analytics.
Sometimes, integration is really all that you need. If the objective is to check the data for completeness, validate for accuracy or simply add to your existing data warehouse, then you may not need a complete harmonization. Setting up a simple “extract, transform, load” (ETL) process or commissioning a data service provider to clean your data may suffice.
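A minimal version of such an ETL step, with made-up fields and data, might look like this:

```python
# Toy "extract, transform, load" pass: pull raw rows, validate and
# normalize them, and load the survivors into a clean structure.
import csv
import io

RAW = """sku,units,region
A100, 12 ,UK
A101,,UK
A102,7,de
"""

def etl(raw_csv):
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):  # extract
        units = row["units"].strip()
        if not units.isdigit():                       # validate: drop bad rows
            continue
        cleaned.append({                              # transform
            "sku": row["sku"].strip(),
            "units": int(units),
            "region": row["region"].strip().upper(),
        })
    return cleaned                                    # a list stands in for the load target

rows = etl(RAW)
print(rows)
```

This level of cleaning and validation is often all an organization needs; harmonization, by contrast, would also reconcile the `sku` and `region` codes against every other data source's hierarchy.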
Harmonization cuts down ‘data-to-decisions’ time.
A CPG manufacturer was able to cut down the data-to-decisions time by 45 days through automated harmonization.
However, if your objective is to make the organization “analytics-ready,” then you should embark on the harmonization journey.
Harmonization also simplifies data governance – a process overlooked at most companies. Updates to one data set will automatically be reflected in all others. This can prevent companies from making multi-million dollar errors. Imagine if a retailer consolidation was reported in your internal systems, but not in the syndicated market data; the insights derived and, therefore, the recommendations for your customers could be disastrous.
Harmonization goes beyond linear hierarchies to support business strategies.
A global beverages company was able to create attributes to classify retail outlets based on monthly sales volume and hence support its distributors with promotion strategies for specific retail outlets.
The trouble with data sets
While you should expect data providers to share data sets that are ready to use, that unfortunately is not always the case. Even data from top global providers can arrive with problems of this kind.
This means that, even after spending millions of dollars to access data from multiple sources and even more on having it all streamlined, your organization could still be left with data sets that can’t talk to each other.
Taking the plunge
Choosing the right partner to guide you through harmonization is vital. But before you can choose that partner, setting a clear objective is integral to ensuring that the process does not end up becoming a simple data integration.
A strong vision, and identified use cases, will help you choose the relevant fields and optimize your effort and resources toward enabling the analytics solutions you need. This requires a commitment from any organization embarking on the harmonization journey to deliver the right infrastructure, set up robust governance processes, and ensure that key personnel are engaged in the project.
Fractal has experience harmonizing more than 1100 country categories.
For consumer goods companies, the rewards are considerable. You will finally be able to unlock the enormous value in datasets currently sitting idle or – even worse – generating misleading insights. By turning the flood of data that’s available into a coherent, cohesive whole, data harmonization gives businesses a full-spectrum view of their organization and the markets in which they operate.
Fractal has experience harmonizing more than 20 different data sources (from syndicated to media to shipment to consumer survey).
Due to the disparate nature of data, analytics, and technology in different organizations, the optimal big data deployment strategy may differ for every organization.
Therefore, a customized strategy that is specific to the organization should be drafted for deploying big data technologies without any disruption. A detailed assessment of integration and interoperability, security, governance, and processes should be made before the deployment. All the deliverables should be defined and artifacts made available for seamless implementation.
The objective of this paper is to enable the decision makers to adopt a comprehensive approach while deploying big data technologies and processes in their organizations.
The age of big and smart data
We are in the age of big data, where organizations are exploring ways to store, manage, and harness large amounts of data effectively. With an intention to improve their operations and make informed business decisions, they are applying analytics to convert data into actionable insights. With the help of intelligent algorithms, the attempt is to make data smart so that it can surface patterns and signals for informed decision making. This eventually results in a significant reduction of operational costs and an increase in profits.
The think tank at the USA’s leading health insurance company and Fractal Analytics planned rigorously for eight weeks before successfully deploying big data technologies within the former’s infrastructure.
Immense expertise, however, is required to select and deploy the right combination of big data technologies that enhance operations and address specific business needs. There is a plethora of technology offerings in the market that solve specific problems within big data environments. Moreover, these technologies are evolving at a rapid pace to offer greater efficiency and solve more complex problems. To make the best of these technologies, it is important to assess the existing data and infrastructure, and compare with the industry benchmarks to identify the gaps. It is also essential to articulate the key performance indicators to be achieved and draft a detailed big data deployment and adoption plan tailored for the organization.
Taking the plunge without adequate knowledge and a foolproof big data strategy can result in failure. Ill-informed decisions and flawed deployment roadmaps can drain budgets and have an undesired impact on business performance.
Big data deployment framework
There are several factors that influence the decisions to deploy big data technologies in an organization.
- Factor: Presence of unstructured and/or non-traditional data in the system
Examples: Web chats, call center transcripts, digital notes and records, web metrics
- Factor: Dealing with a huge volume of data that runs into petabytes or zettabytes
Examples: Social media data, calls, web chats, web logs, mobile devices data
- Factor: Flow of ultra-low latency data
Examples: Social media and blog comments, recent calls and chats, web adoption, user authorization information
- Factor: Need for real-time scoring and insights
Examples: Authorization triggers, incoming calls
- Factor: Exploring new analytics algorithms
Examples: Text mining, predictive models, probabilistic learning algorithms, unsupervised learning methods, Bayesian algorithms, neural networks
Data, analytics, and technology are the three main pillars of the big data landscape.
However, not all pillars are equally strong in every organization. Some organizations have structured data with known latency and volume, but have no means to perform analytics with it. Others may have all the analytical tools in place, but no control over managing data. There may be yet others that have control over data and analytics, but are unable to harness the insights for decision making due to outdated technology infrastructure.
These three pillars are integrated and further reinforced with security, governance, and processes.
The subsequent sections delve deeper into this framework.
Data is at the heart of analytics, technology, and informed decision making. Its shape, volume, and latency determine the breed of big data technologies to be deployed in the organization. Meticulous mapping of data properties, sources, and frequencies is important while drafting the deployment strategy.
Data from existing and new sources may be dealt with differently. Besides, there may be different methods to manage transactional and non-transactional data. How structured the data is also influences the deployment decisions. Data’s format, its ability to interact with other data and databases, and its consistency should be thoroughly assessed. Other factors, such as write throughput, data-source prioritization, event-driven message input, data recoverability, fault tolerance, high performance deployment, platform availability, and automation, also need to be considered while strategizing.
Data is at the heart of analytics, technology, and informed decision making. Its shape, volume, and latency determine the breed of big data technologies to be deployed in the organization.
A combination of traditional technologies that are in use (such as relational databases) and big data technologies (such as Apache Hadoop and NoSQL) might be the apt solution for some organizations to achieve their business objectives. For others, adopting new technologies altogether would be the best solution.
Analytics may involve developing and operationalizing descriptive, predictive, text mining, and unsupervised learning models leveraging data sources. Developing an analytical capability in a big data environment involves understanding how it would support the following:
- Data processing, querying, aggregation, and transformation
- Structured query language (SQL) and native programming languages
- Human-acceptable query latency
- Text mining and processing
- Supervised and unsupervised algorithms
- Interoperability with other platforms and analytical tools
Apart from these factors, the analytical models should ideally offer features such as ease of use (coding, debugging, and packaging), open source libraries and application programming interfaces (APIs), and graphical interfaces for visualization. Operationalizing analytics may typically involve assessing fault tolerance and automated recovery of data processing jobs, setting standardized parameters such as dates and strings across platforms, seamless scheduling of jobs, logging, and monitoring.
Analytics may involve developing and operationalizing descriptive, predictive, text mining, and unsupervised learning models leveraging data sources.
To operationalize analytics, a combination of traditional technologies that are in use (such as SAS, R, Python server) and a distributed environment for big data could possibly be the best solution for some organizations to achieve their business objectives.
The existing infrastructure in the organization may have limitations to solve complex data problems. It is important to study the existing software and hardware to identify the gaps that can be filled with new technologies and systems.
Organizations should assess their networking, server, storage, and operations infrastructure thoroughly before deploying big data technologies. The nature of data processing, real-time or batch, also drives the infrastructure needs to a considerable degree. This in turn determines whether the infrastructure should be commissioned on cloud, dedicated servers, or a combination of both.
It is important to study the existing software and hardware to identify the gaps that can be filled with new technologies and systems.
Speed, performance, scalability, and costs are other important factors that can influence the decisions around investments in big data infrastructure.
Lastly, it is of utmost importance to map the technology infrastructure with human skills for deploying and using it.
Integration and interoperability
Big data technologies should be integrated into the existing infrastructure in a seamless fashion to avoid any disruption, business downtime, and cost overruns.
Several integration patterns are possible, with different cost implications:
- Data storage and operationalization happen in both the traditional and big data environments, while analytics remains exclusive to the traditional environment. This provides a certain level of cost optimization.
- Some analytics happens in both environments, providing additional cost savings.
- Data storage and analytics happen in both environments, with operationalization exclusive to the big data environment. This optimizes cost significantly.
Big data technologies should be integrated into the existing infrastructure in a seamless fashion to avoid any disruption, business downtime, and cost overruns.
Various databases, clusters, and nodes should be studied for integration. Other important considerations are metadata and master data management (MDM), extract-transform-load (ETL) preprocessing, data retention, framework for faster deployment with automation, and scalability.
At the organizational level, inter-departmental, cross-project, and multi-platform integration of big data technologies should be planned early on, as it may get difficult to achieve this later.
Security and privacy have become major concerns with the advent of cloud, diversified networks and data sources, and the variety of software platforms. As the organization’s data and infrastructure become more accessible from different platforms and locations, they also become vulnerable to hacking and theft risks.
Security and privacy have become major concerns with the advent of cloud, diversified networks and data sources, and the variety of software platforms.
Traditional security methods and procedures that are suitable for small-scale static data may be inadequate to fortify big data environments.
Among several security considerations, the big data deployment strategy should most importantly encompass the following:
- Securing all the applications and frameworks
- Isolating devices and servers containing critical data
- Introducing real-time security information and event management
- Providing reactive and proactive protection
Finer details of configuring, logging, and monitoring data and applications should be known beforehand to implement the security measures.
Data governance involves having access to audit reports and reporting metrics. The scope of governance should be clearly defined while commissioning the big data environment.
The following are certain important considerations for governance:
- Defining the frequency of refreshing and synchronizing metadata
- Identifying possible risk scenarios along with failovers
- Instilling quality checks for each data source loaded and available within the big data environment
- Disposing of assets that are no longer required, in line with regulations and business needs
- Defining guidelines for acceptable use of social media data of existing and potential customers
- Scheduling audit logging of events along with defined metrics
- Productionizing a logging framework for data access, and for the creation and updating of intermediate datasets, to generate logs of event runs and to identify failures or errors during production runs and analytic operations
- Identifying access patterns across data folders and defining access control rules based on hierarchy (groups, teams, and projects)
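The logging item above can be sketched with Python's standard logging module. The user, dataset, and field names are illustrative, not a prescribed schema:

```python
# Sketch of structured access logging: every dataset read or write is
# captured as a key=value record so failures can be traced after a
# production run.
import logging

logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("data_access")
log.setLevel(logging.INFO)

def access_record(user, dataset, action, ok=True):
    """Build the structured log line for one data-access event."""
    status = "ok" if ok else "failed"
    return f"user={user} dataset={dataset} action={action} status={status}"

def log_access(user, dataset, action, ok=True):
    """Emit the record at INFO for successes and ERROR for failures."""
    record = access_record(user, dataset, action, ok)
    log.log(logging.INFO if ok else logging.ERROR, record)
    return record

log_access("etl_job", "sales_2020", "read")
log_access("etl_job", "sales_agg", "write", ok=False)
```

Because the records are machine-parseable, access patterns across data folders (the next bullet's concern) can be mined from the same log stream.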
Data governance involves having access to audit reports and reporting metrics. The scope of governance should be clearly defined while commissioning the big data environment.
Adopting big data technologies requires a shift in the organization’s way of functioning because it changes the operations of the business. The deployment should be manageable and timely, integrate into the broader enterprise architecture, and involve specialized personnel.
Processes to implement, customize, populate, and use the big data solutions are necessary. Methodologies to control the flow of data into and out of the big data environment should be defined. Furthermore, processes to provide feedback for improvements during big data infrastructure deployment and thereafter should be in place.
Insights in the big data environment
Data, analytics, and technology together shape how insights are consumed for informed business decisions. The tools and technology in place determine how users and applications will access, interpret, and consume the output of analytics performed on data. Maximizing the visibility of this analytical output enables effective decision making and optimizes business outcomes.
Visualizing the output provides the ability to effectively consume aggregated and granular data. Factors such as connecting to multiple data processes and look-up engines, and interfacing with databases and warehouses to push output for downstream consumption should be considered. Furthermore, the ability to modify a reporting module and add new dimensions to it will enhance the output and decision making in a big data environment.
A Fortune 500 global company had an analytics solution developed for surveillance monitoring, identifying anomalies in real time by processing streaming data. The solution also provided an interface for retroactive analysis of large historical datasets.
While planning for the consumption of insights in a big data environment, the constraints and opportunities of the underlying systems should be considered. On one hand, the infrastructure should be configured to provide quick (real-time) insights. On the other, provisions should be made for the longer processing times of certain insights derived from large datasets.
Other important factors to consider while distributing the output include the frequency of consumption, the extent of personalization, access through APIs, message-pushing ability, and reusability.
Intensive, comprehensive, and persistent planning is imperative for deploying big data technologies in any organization. With the changing landscape of big data technologies, the strategy itself should evolve to accommodate such changes. The deployment strategy is more than a piece of paper. It is a mechanism to deploy big data technologies within the existing infrastructure for maximum impact without affecting the core business.
The effort invested at the initial planning stage might determine the success or failure of the deployment, and in many cases, of the organization itself. The nuts and bolts of every aspect of deployment should be fine-grained. Stakeholders from different departments should be involved and a collaborative environment should be created. A roadmap with the deployment phases should be drafted and made handy.
The following are the typical outcomes of a detailed assessment and will help in formulating the big data deployment strategy:
- Big data architecture diagrams
- Data flow diagrams
- Logical architecture diagrams with detailed explanation of the inherent layers
- Technology mapping diagrams
- Process flows of operational patterns
- Reference architecture diagrams for the inherent scenarios
- Deployment recommendations
- Deployment phases
- All the activities within the deployment phases
- Human skill mapping
These artifacts of strategy and roadmap collectively form a construct for deploying big data technologies. Each one plays an important role in articulating, tracking, and controlling the deployment. They are must-haves for any organization seeking a successful deployment, and should therefore be tailored carefully.
Finally, as deployment of big data technologies can be a complex task, organizations need to realistically assess what to execute in-house and where to take the help of external partners. Collaborating with partners can provide access to a proficient talent pool, improve utilization rates, reduce cost, and offer the much needed strategic direction for successful deployment.
Suraj Amonkar – Director (Big Data and Visualization) at Fractal Analytics
Vishal Rajpal – Director (Global Consulting) at Fractal Analytics
- API – Application programming interface
- ETL – Extract, transform, load
- SAS – Statistical Analysis System
- SQL – Structured query language
- USA – United States of America
All roads lead to an omnichannel retail world. In an increasingly competitive market, retailers are starting to lay the foundations for omnichannel. Yet there is a lot to be done before retailers can harmonize all of their customer touchpoints.
Companies have historically focused on improving operational capability, but they have so far struggled to convert this into an enhanced retail experience.
Online retail made up just 8.4% of total U.S. retail sales in 2016, and Forrester Research predicts it will account for only 11% of total U.S. retail sales by 2018. According to an A.T. Kearney study, around two-thirds of customers shopping online use physical stores before or after the transaction. In such cases, stores are essential in converting the sale. Physical stores provide consumers a sensory experience that allows them to touch and feel the product, immerse themselves in the brand experience, and engage with sales associates who provide suggestions and reaffirm shopper enthusiasm for their new purchases. Nothing can replace these aspects of in-store shopping. This suggests that physical stores still dominate the retail landscape, and will continue to do so in the near future.
However, there are certain aspects of online stores from which physical retailers can take inspiration to enhance the customer experience. Prominent among these is experimentation, known as A/B testing in the e-commerce space.
Retailers should innovate, start new initiatives, and bring new technology into existence to transform the store experience. Retailers must test many ideas, quickly and accurately before they decide what works and what doesn’t. Those who have understood this have already started to build a culture of business experimentation within their organizations and are seeing its benefits. For those who haven’t, this is the right time.
New role of the physical store
The new retailer needs to be a combination of store retail and non-store retail. Retailers need to integrate the online and offline advantages to provide a seamless experience across channels.
An indication of how the role of stores is expected to be transformed is evident in the fact that 40 percent of Best Buy’s and more than 50 percent of Walmart’s online sales already are picked up in stores.
According to a McKinsey Insight article, to make informed network choices, retailers must take a long-term view of their real estate. Beyond building stores, what expansion models are available when they look for growth? How can they enable new multichannel experiences?
As prices and inventory availability become more transparent, retailers will not survive just by being “pass through” sellers of national brands. They will have to give consumers a reason to choose their stores over competitors.
No longer will consumers shop at a retailer simply because it happens to be where a product is distributed. Retailers will need to offer deep product expertise and a unique product education.
There are enormous possibilities where modern stores can bring new experiences to customers. The following scenarios will help paint a picture:
- Customers browsing online, locating the nearest store, and purchasing from that store.
- Customers picking up the product in the store and paying online to avoid long checkout queues.
- Customers making an online purchase but returning the product at the physical store.
- Access to an in-store interactive screen where customers can browse various products, read reviews, and pick products up from the shelf.
- Sensors that can detect a customer’s interest and send the data to a screen displaying relevant product information.
- Experience zones within a store that simulate the environment in which a product is designed to be used.
- In-store assistants carrying mobiles/tablets with information on each customer’s profile, personalizing the experience for that customer.
- Personalized promotions sent directly to a customer’s mobile, based on their location within the store.
Some stores have already incorporated a few of the above scenarios.
Amazon Go is pioneering the ‘Just Walk Out’ technology, enabling the customer to completely bypass queues.
Kate Spade Saturday and SONY are experimenting with shoppable windows and revolutionizing the concept of window shopping.
Macy’s and Waitrose have started sending personalized recommendations and offers based on the location of the customer in the store.
One U.K. fashion retailer is fusing its e-commerce site, mobile app, and brick-and-mortar stores into a single, simple shopping experience. If you walk into one of its stores, you’ll find sales associates armed with iPads, available to give you on-the-spot, accurate, and up-to-date product information.
The iPad also acts as a cash register, making it easy for associates to ring you up from anywhere in the store. And the cherry on top? If it appears that something is out of stock, the staff can instantly place an online order for you to have the item shipped directly to your home. This is true seamless customer experience.
For many retailers, future store layouts will need to foster greater customer learning and experimentation. Technology will need to be fully integrated into how stores and employees engage customers. And the lines between physical and digital will have to blur.
Store transformation journey: Ask a lot of questions
It is important to note that none of the stores have adopted a big bang strategy to invest in the latest technology. Amazon Go is currently open only to Amazon employees in their Beta program. Similarly, SONY has installed shoppable windows at only one store. And rightly so. Technology adoption is a high risk, high investment venture. It is important to understand whether making a change translates to an increase in the metrics that matter, be it sales, customer engagement, footfalls, or any metric that the store decides is an objective metric.
Modifications to existing store processes can considerably increase the chances of a sale. It is important to ask the right questions to identify which modification will be most effective.
Once a customer identifies a need, the next step will be to browse online for the relevant product. A considerable digital presence can go a long way in attracting the customer. For a retailer, the key is to understand which digital channels are the most effective.
Will a significant web presence lead to more views or is mobile the more effective channel? Will ads placed on other websites get those views? Are these channels an effective space for running promotions? Are these promotions resulting in additional footfalls to the store?
Will sending personalized promotions and coupons entice the customer to the store? Or is this a privacy concern? Will adding a Google Maps plugin make it easier for the customer to reach the store, resulting in additional footfalls? The permutations and combinations are many, but incremental changes and accurate measurement will simplify many of them. Walmart has recently introduced an app that lets customers buy products online and pick them up in stores. This helps customers conveniently manage their shopping carts while maintaining their in-store experience and loyalty.
Store designs and layouts strongly influence in-store traffic patterns, shopping behavior, and the shopping experience. Understanding the cause-effect relationships here will help retail stores arrive at the optimum store design.
For example: Will changes to the exterior of the store result in more footfalls? Will a redesign of the zones within a store help customers navigate better and reach their desired products faster? Will a change in the shelf layout increase the visibility of a product, which in turn leads to an increase in sales? Why do more than half of customers, as soon as they enter the store, walk towards the beverage section despite visible promotions on packaged foods? Who left without purchasing anything? What did shoppers not buy?
The design of a store is also greatly influenced by the persona of its customers. The Amazon Go store, at roughly 1,800 square feet, is conveniently compact to suit the needs of its target customers: busy shoppers who want to get in and out as quickly as possible. A store which serves leisurely shoppers may want to incorporate more open spaces.
Now that customers have arrived at the product location, they may want to compare different brands, specifications, and prices.
Will the presence of a screen that automatically displays the relevant information add to a positive experience? Will a store assistant, armed with behavior patterns and demographic data for that customer, make it easier for the customer to make a choice? Or is this another privacy concern? Is it feasible to add a premium to the price for the enhanced in-store experience?
Augmented reality (AR) is touted as the next big technology in retail due to encouraging feedback from several customer surveys. However, it is not yet clear whether AR will actually increase sales. For example, will a virtual mirror that can quickly learn preferences and show customers new looks, without requiring them to try on any of the products, result in tangible benefits for the store? Will sales increase if the retailer rolls out an app that lets customers imagine what a pair of shoes would look like on their feet without actually trying them on?
The key here is to take one step at a time, make gradual changes, and measure each change through controlled experiments.
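The measurement step can be made concrete with a small sketch: the lift of a single change is estimated by comparing stores that received the change against matched control stores. The figures below are purely illustrative:

```python
from statistics import mean

def lift_pct(test_sales, control_sales):
    """Percent sales lift of test stores over their matched controls."""
    t, c = mean(test_sales), mean(control_sales)
    return (t - c) / c * 100.0

# Illustrative weekly sales (in $000s) after a store change.
test = [105, 98, 112, 101]   # stores with the change applied
control = [96, 94, 103, 95]  # matched stores without it

print(round(lift_pct(test, control), 1))  # prints 7.2
```

In practice, each controlled experiment would also test whether the lift is statistically significant before any decision to roll the change out more widely.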
Store transformation journey: Find answers to questions through experimentation
Organizations have ideas, and they spend immense amounts of money executing them, but very few companies succeed in the end. For any retailer, effective implementation of ideas is the key challenge. A dearth of talent, limited budgets, high operational costs, and a lack of technical infrastructure reduce a retailer’s appetite for change. Hence, they lag in innovation.
On the other hand, online players like Amazon, Best Buy, etc., frequently bring new features to lure customers.
As Jeff Bezos put it, “If you double the number of experiments you do per year, you’re going to double your inventiveness.”
Clearly, companies like Amazon do not hesitate to try new ideas and thrive on innovation. They also have the advantage of technical infrastructure to experiment with new ideas.
Experimentation as a concept is not new. Conventional retailers have long used manual methods to try and test their ideas, but these have very limited scope. Today, to transform an entire store, retailers need to make gradual changes in the store and test them in a more sophisticated and agile manner. Whether the desired change works or not, it is vitally important to gain insight at a scale large enough to assess results but small enough to avoid the large investments and risks that come with full-scale execution. With numerous options, factors, and possibilities in play, a robust approach to testing ideas is necessary.
The first step in transforming the role of retail stores is to build a mechanism for testing new ideas in an agile manner. This will empower business users to increase their risk appetite, manage their budgets efficiently, and evaluate their ideas to maximize ROI.
A transformation story
Store remodeling is an investment-heavy process of producing an incremental change in a store’s physical design to enhance customer experience. It is very difficult to accurately predict whether a store remodeling exercise will generate returns. The best way to know this for certain is to test the change in a subset of stores and based on the assessment, make a decision on whether it should be rolled out.
A leading US retailer decided to conduct a remodeling experiment to test the effect of introducing customer experience lounges, changes in exterior signage, and upgrade in the existing lighting system to introduce smart lights.
The retailer designed an experiment to measure the impact of remodeling in select representative stores, analyzed the results by comparing with a list of similar control stores, and devised the future action plan.
A remodeling exercise is a significant investment, sometimes running into millions of dollars. Additionally, store operations sometimes need to be put on hold for a few days, which further impacts sales and revenue. The retailer had to quantify the impact accurately to decide how to make this change for any subsequent set of stores.
The retailer decided to remodel 27 stores (including both large and small stores), spread across the US, and wanted to assess the impact. Control stores were simulated algorithmically for each test store. Overall, the experiment generated a 6% lift in sales.
22 stores generated a positive lift, and a few experienced lifts of more than 15% in sales. In addition, break-even for large stores (sales greater than $10 million) was expected within 2.5 years, whereas for small stores (sales less than $5 million) it was expected in 5–7 years. Based on the results, the retailer decided to prioritize remodeling for large stores.
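The break-even logic behind this prioritization can be sketched in a few lines. The sketch assumes the extra profit comes from a sales lift at a given profit margin; the figures are illustrative, chosen only to land near the quoted 2.5-year and 5–7-year ranges, and are not from the actual study:

```python
def incremental_profit(annual_sales, lift_pct, margin_pct):
    """Extra annual profit generated by a sales lift at a given profit margin."""
    return annual_sales * (lift_pct / 100.0) * (margin_pct / 100.0)

def breakeven_years(remodel_cost, extra_profit_per_year):
    """Years needed for the incremental profit to repay the remodel cost."""
    return remodel_cost / extra_profit_per_year

# Illustrative only: a large store with a strong 15% lift...
large = breakeven_years(1_350_000, incremental_profit(12_000_000, 15, 30))
# ...versus a small store with the average 6% lift.
small = breakeven_years(432_000, incremental_profit(4_000_000, 6, 30))

print(round(large, 1), round(small, 1))  # prints 2.5 6.0
```

Because a larger revenue base converts the same percentage lift into far more absolute profit, large stores recover a remodel investment much faster, which is exactly the pattern the retailer observed.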
The role of the retail store remains essential for today’s consumers. Retailers that use technology to transform the in-store experience can capture new opportunities to create true omnichannel customer experiences. It will take innovative thinking, experimentation, and data savvy to create the seamless digital and in-store experiences of tomorrow.
As in the case of the retailer that successfully remodeled its stores, those that take an intelligent approach to experimentation, powered by measurement and data, will drive real results, while minimizing risk. The opportunity is ripe for retailers that take smart action and strive to innovate. The leaders have already started. For those that haven’t, the time is now.
- http://www.middle-east.atkearney.com/consumer-products-retail/featured-article/-/asset_publisher/S5UkO0zy0vnu/content/getting-in-onthe-gcc-ecommerce-game
Ankit Bhardwaj – Senior Manager, Client Development, Fractal Analytics
Gourav Chugh – Product Manager, Fractal Analytics
Nachiket Sane – Product Manager, Fractal Analytics
Shivendu Mishra – Director, Product Management, Fractal Analytics
You may think this article has a provocative title. Well, in this article, I’d like to highlight what I believe to be key success factors that are integral in the journey towards achieving Trade Investment Management Effectiveness (TIME) and I’d also like to describe what a ‘good practice’ looks like in each of those factors.
Trade Investment for most CPG companies is the biggest line item in the P&L after Cost of Goods Sold. In my experience, most CPG companies have ‘nth-degree-detailed’ focus, capabilities, analytics and knowledge when it comes to supply-chain efficiency, asset utilization and cost-to-serve dynamics. However, when it comes to Trade Investment Management Effectiveness (TIME), most CPG companies have much less of an understanding of what’s working, what’s not, why and how to drive higher ROI and improved business results from that investment. But first, I’d like to stress the importance of the concept of Trade Investment as opposed to Trade Spend as it is often referred to.
Spend vs. Investment
As people who know me will (painfully) attest, this is a real point of principle and paramount importance for me. When you look up ‘spend’ in the dictionary, it states: ‘to pay out, disburse, or expend; dispose of (money, wealth, resources, etc.)’. To me, spend does not describe the purpose of trade monies. When you look up ‘investment’ in the dictionary, it states: ‘the investing of money or capital in order to gain profitable returns, as interest, income, or appreciation in value.’ Now to me, investment sounds much more like how CPG CFOs would describe the purpose of, and requirement from, budgeted trade monies. This may seem like a nuance in wording, but it is very important in terms of mindset and culture for an organization as it seeks to achieve TIME.
Now let me introduce – in my experience – the 6 key success factors to achieving TIME:
The first of the success factors that I’ll cover is Focus. In this instance, by Focus, I mean focus on success criteria and balanced objective setting. There are inherent trade-offs between using Trade Investment to drive volume vs. ROI vs. mix and % margin. All too often, there will be stated goals for one of these metrics for a brand / portfolio in an organization, but no communicated guardrails relating to the other metrics. As an example, if I had $10 for every time I’ve been asked ‘Did that promotion work?’ in my career, I’d be taking private villa family holidays in Barbados every year! The answer is straightforward, as are the lessons. Answer: ‘Yes! If your objective was to drive volume. The promotion was a BOGOF (buy one, get one free). So of course it did, but look at the state of your P&L…’
To deploy Trade Investment effectively, companies need to set Focused, balanced objectives that are clearly communicated, e.g., ‘Grow volume by 10% and $ profit by 5%, with no more than a 100bps decrease in % margin.’ This is clear across all key metrics and very easy for everyone in the organization to be measured against. Not easy to achieve, I agree, both in terms of setting objectives and achieving them. But not only does this Focus drive clarity, it also forces more rigor into the objective-setting process: the combination of metrics has to be modelled out and deemed achievable in unison, rather than setting an unachievable combination of objectives and setting the organization (and the individuals within it) up for failure.
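The example objectives above can be expressed as a simple guardrail check. This is a hedged sketch: the thresholds mirror the illustrative targets in the text, and the function name and signature are hypothetical:

```python
def meets_objectives(volume_growth_pct, profit_growth_pct, margin_change_bps,
                     volume_target=10.0, profit_target=5.0,
                     max_margin_drop_bps=100):
    """True only if all three balanced objectives are met in unison."""
    return (volume_growth_pct >= volume_target
            and profit_growth_pct >= profit_target
            and margin_change_bps >= -max_margin_drop_bps)

# A plan that hits both growth targets within the margin guardrail:
assert meets_objectives(12.0, 6.0, -80)
# A plan that grows volume and profit but breaches the margin guardrail:
assert not meets_objectives(15.0, 7.0, -150)
```

The point of modelling the metrics together, rather than one at a time, is that a plan must clear every threshold at once; a strong volume result cannot mask a margin breach.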
The second success factor I’ll address is having the right Visibility into how Trade Investment is actually invested and the purpose of the investment (i.e., what are we aiming to achieve from the investment, and how is it actually recorded internally?). This Visibility is what drives the analytic capability. After all, if you can’t see it, you can’t measure it. If you can’t measure it, you can’t understand it. And if you can’t understand it, how on earth do you make solid business decisions on an often $1B+ global Trade Investment budget?
Usually, Trade Investment is recorded by financial control criteria: off-invoice vs. billback / retrospective payments; lump sum vs. variable cost. I’d like to put forward that whilst these are legitimate accounting methods, with clear criteria and rules to record by, they are not representative of the impact those investments were intended to create relative to strategy and the desired consumer behavioral response.
I also propose that classifying Trade Investment according to intended impact and desired consumer behavioral response is far more useful for feeding the analytics that explain what is working, how, and why, in order to improve business performance. For example, see the figure below for suggested classifications and goals of the investments.
Using the above classifications, aligned to the purpose and desired outcome of these investments, we can far more clearly analyze the key drivers to invest more in and those activities that we should discontinue or at least minimize.
I should add here that the notion that increased Trade Investment is bad and decreased Trade Investment is good, which exists in a majority of CPGs, is in my opinion far too black and white versus the realities of the world of CPG and Retailing.
As an example, if I have a really breakthrough set of innovation products to launch this year, should I really be aiming to keep Trade Investment flat or decrease it? If I try to do that without the Visibility framework I detailed above, I will invest in the innovation at the expense of the core portfolio (from which the savings needed to fund it must come). As this continues, my innovations (the Children) are fed and looked after, while the core portfolio (the Adults), who bring home the revenue to feed the Children, gets weaker and, longer term, may not be able to feed the adolescents.
Using the Trade Investment Visibility approach, I would argue that in the above situation, a year of increased Trade Investment is not only acceptable but possibly desirable, provided that all of the increase is attributed to the breakthrough innovations in the classifications of In-Store Execution and Consumer Price – the objectives of which are to drive visibility, awareness, and trial of the innovation. This is very much in support of the overall longer-term business objectives.
If we now have clear Visibility to strategically aligned investments / fluctuations / explanatory factors around investment decisions and ROI, next comes ‘Did it work?’
The third area I’ll cover is Knowledge. By that I mean a full and regular diet of Pricing and Promotional Effectiveness Analytics that, wherever possible, cover the impacts of differing in-market executions on all major metrics: volume, retail revenue, net sales, and gross profit.
- Impacts of everyday price and combinations of promotional executions on each of these metrics including all cannibalization and competitive switching effects
- Impacts of all of these on manufacturer profitability, retailer profitability and how the profit pool is evolving between manufacturer and retailer
Within each promotional execution, the precise investment Visibility created will break down the investment components which in turn drive the delivered metrics. Tying these back to rounded Focus objectives and guardrail parameters allows much more accurate planning, monitoring and data-driven decision making.
Given the size of the investment that most CPGs have in Trade Investment, it amazes me how few companies have an institutionalized approach to the Knowledge of how these investments work and perform. A former boss of mine had a mantra that I’d like to share here: ‘You cannot save yourself rich.’ This clearly links to my earlier point about thinking of Trade as an ‘Investment’ rather than referring to it as ‘Spend’. If a CPG company invests $1B globally in Trade, surely it is sensible to set aside 0.25%–0.5% of that ($2.5M–$5M) to get detailed Knowledge of the performance and drivers of the investment results, so that future investments can be refined to improve business results?
If I were running a private business, I would approach this with a ‘My Money Mindset’, and this level of investment in Knowledge, relative to the overall investment that Knowledge would influence, seems a no-brainer: it enables my organization to make far more decisions based on ‘what I know’ rather than ‘what I think’.
Having established the Focus and Visibility and the Knowledge of best uses of investment to drive the business objectives, the next challenge is to embed all of this into standard business processes, such as Annual Operating Planning, Sales & Operations Planning, but also to feed into brand / retailer strategy plans and playbooks.
For example: for Brand A in Retailer X, we need to do more of B and stop doing C. This sounds very prescriptive, and there will always be exceptions (often these turn out to be the most difficult decisions and the most relationship-led, successful investments there are!). But there need to be clearly defined playbook guidelines, aligned to the overall Focus-balanced objectives. And when exceptions arise, there must be clear escalation and approval / ‘no go’ processes that can be invoked quickly (either via cyclical meeting cycles or via exception meetings of a select group) based on Knowledge and the contextual business situation. Without Focus, Visibility and Knowledge, these exceptions would never be flagged; they would just happen, and unknown / unapproved consequences would ensue.
Execution probably warrants an entire article in its own right; such are the complexities of every single executional negotiation of Trade Investment. But this is an article, not a book, so I’ll keep it short.
With Focus, Visibility, Knowledge and Control, the time is now for the rubber to hit the road. These four areas combine to provide a playbook for the sales teams to execute with excellence and bring home the Focused business objectives. This means: ensuring the right information gets to the right people, at the right time, to enable them to sell. The Execution of the playbook plans is where the money is made and where the ROI increases become realized.
Investment in the right technology and systems is key to enable this two-way communication, as quickly as possible. What to execute and compliance reporting are paramount in capitalizing on all the good work that has been done internally regarding Focus, Visibility, Knowledge and Control. Without Execution, the rest is rendered almost academic.
Do It Again and Again!
There is a reason that CPG in Europe is called FMCG. F = Fast. The marketplace, consumer attitudes and retailer agendas can change quickly, so the need to continually update and improve these six areas is inevitable. This is not a one-time exercise. This is a continuous improvement journey. Focused objectives and guardrails need revisiting. Trade Investment Visibility needs maintaining, improving and adapting to innovative in-market executions of investment. A regular diet of up-to-date Knowledge is required to keep guidelines, playbooks and executional excellence on track. Controls will always remain important, but so will the escalation process to manage and veto exceptions (you can’t save yourself rich!). Execution is Execution. Without it, nothing is going to improve.
I believe that most CPGs are moving ahead on some, if not all, of these areas. But I'd love to hear from anyone who believes they've fully cracked the Trade Investment Management Effectiveness (TIME) challenge, along with the six areas I've detailed as the key success factors.
There is also a cultural change management component required to truly embed and embrace this ethos, which I've touched on in parts and will summarize here. Within that cultural change is a language element: no longer should we refer to 'Trade Spend', only 'Trade Investment'. Mindset shifts should follow. People should be encouraged to think not 'My budget. I must spend it or else I'll get a lower budget next year' but instead 'If it were my money, would I invest it that way?'.
Focus, Visibility, Knowledge, Control, Execution and Do it again and again.
In order to be successful on the journey to enhance TIME, believe it or not, Business Practices and Culture need to become a lot more FVKCED!
Client Partner, CPG Retail Consulting, Fractal Analytics
Chris leads Fractal's work in CPG and Retail with his domain thought leadership and client consulting expertise. He has 20 years of experience in the CPG industry, starting with IRI in the area of Retail. He spent 13 years with Kellogg Company, where he held various roles across CMI, CatMan, Brand, Sales, S&OP, and Revenue & Trade Management. He was also a member of Sales and Marketing Leadership Teams. Chris moved to Kimberly-Clark as Senior Director – Global Analytics, where he led global best practice deployment in Sales and Marketing ROI. Chris holds a BA in European Marketing from the University of Hull in the UK. He also holds a diploma from the Chartered Institute of Marketing in the EU.
Chris has a real passion for using advanced analytics to tackle very real business challenges. He believes that companies invest huge amounts in Marketing and Sales, but most invest little in understanding the ROI of these investments. He believes those that do will be winners, and he wants to help drive that success.
Crafting superior customer experiences through personalized customer journeys is no longer just an option for insurers. Today's increasingly digitally active consumers demand a significantly higher level of service and a bouquet of choices to earn their continued loyalty. Insurers are at a crossroads, where the challenge of balancing the demands of new-age, tech-savvy customers with the requirements of traditional insurance buyers has magnified significantly.
Address Both Digital-savvy and Traditional Customers
As digitally savvy consumers increasingly fuel growth for insurers by opening new markets and sources of revenue, these new-age consumers are now one of the most influential segments of the insurance market. They are driving expectations for personalization and service to a whole new level. However, traditional customers, who still hold a fair share of the insurance book of business, continue to demand value for money with basic, quality services.
In the current environment, consumers envision having a central role in their relationships with businesses. For insurers, the arm's-length stance of earlier years is resulting in inadequate customer experiences across both segments. The fact that consumers are constantly pampered by the retail and hospitality industries has elevated this expectation of customer centrality and created a new baseline within which insurers must operate.
In a commoditized business like insurance, where the customer experience is one of the last bastions left to product managers, insurers need to balance this dichotomy to be effective in the marketplace. Achieving both balance and customized journeys is a challenging prospect. On the one hand, insurers have customers who are cost-prudent, basic service seekers; on the other, customers who are the demanding prima donnas of today's digital world.
Bridge the Gap Between Insurers and Customers
Traditionally, insurance policies have been sold and managed through intermediaries, separating the insurance firm from its customers. Further, a quantitative view of the risk pool, ascertained only through actuarial principles, has expanded this chasm. Often, this traditional pool comprised bargain seekers and segments that viewed the insurance agent as a surrogate for the firm. This left insurers making little or no effort to reach the consumer directly, which has led to significant inconsistency in customer experiences. The result is that, to customers, all insurance companies look the same: just a product package.
Delivering a superior customer experience takes a concerted effort across business functions, which relies on creating a single thread of customer centrality and seamless experiences. Here, every customer touchpoint becomes a step in the journey and not a discrete event. This journey typically starts with identifying drivers of customer satisfaction and embedding these into operations, and ultimately crafting customer journeys, with cross-functional ownership and omnichannel experience.
Take a Four-Step Process to Win in a Cluttered Marketplace
In our experience, the leaders in the customer experience space are winning by focusing on a few, but very well-defined, customer needs and interventions around them. Leveraging digital channels and maintaining continuous customer dialogue through every touchpoint defines the strategy and its ongoing refinement. Below is a four-step process that we recommend every insurer take to win in this cluttered marketplace using customer experience as a lever:
- Culture: A successful customer experience journey needs a culture of customer centricity in the organization. All processes, interactions and values should be aligned to the centrality of customer needs, and should be put in practice from the CEO’s office to front-line executives.
- Know Your Customer: The first step in designing a customer experience framework is to align on the type of experience a segment of customers wants. A data-driven micro-segmentation, with need, value and behavior as key dimensions, helps in designing the customer experience framework and strategy. Most insurance firms would build their strategies aligned to segments, based on the following levers:
- Product and its customization
- Trust and transparency
- Speed and professionalism
- Unified communication and experience
- Design Thinking and Operationalization of Journeys: Radically redesigning the customer journeys based on needs of segments defines the success of any customer experience initiative. A segment with complex insurance needs may require more touch points, longer sales cycles and better articulation of product features. On the other hand, a simple product for a digitally-savvy customer may require a non-intrusive digital sales push. Similarly, the amount of time spent at first notice of loss, the courtesy extended by employees, the speed and ease of tracking claims status, etc., can be designed differently for each customer segment. Using customer empathy and cross functional integration typically forms the backbone of successful operationalization.
- Digitization and Customer Insight: Digitization of processes and interactions helps capture nuanced views of the customer experience and granular data. Digital insight captured from processes and interactions, coupled with customer experience KPIs, defines how successful a firm has been in the customer experience journey. NPS, loyalty, churn, external benchmarks, new business, etc., are some of the KPIs that insurers have successfully used in defining the success of customer experience strategies.
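To make one of these KPIs concrete: Net Promoter Score (NPS) is conventionally computed as the percentage of promoters (ratings 9-10) minus the percentage of detractors (ratings 0-6) on a 0-10 survey scale. A minimal sketch of that calculation (the sample ratings are illustrative):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 survey ratings.

    Promoters score 9-10, detractors 0-6; NPS is the percentage
    of promoters minus the percentage of detractors (-100 to 100).
    """
    if not ratings:
        raise ValueError("ratings must be non-empty")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives (7-8), 2 detractors out of 10
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 3, 5]))  # → 30.0
```

Tracked over time and cut by segment, a metric like this is what turns "customer experience" from a slogan into something a firm can actually manage against.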
Walking the Walk to Deliver on Customer-Centricity
The opportunity to differentiate through great customer experiences has been demonstrated many times by customer experience leaders across industries. USAA, Apple and AMEX have gained significant market share and customer loyalty through their well-crafted customer experience programs. With the ever-increasing power of digital consumers, the market leaders will be the ones that create deeper relationships with customers. Each interaction and touchpoint judged by the customer will define whether a carrier is meeting its brand's promise or not. With such large value at stake, the insurers that deliver on the customer experience journey will win greater market share and space in the minds of consumers.
Our second annual ai.nyc was held at One World Trade Center on Wednesday, June 5, 2019. This year’s event focused on the perfect recipe for AI problem solving, driving better business outcomes. Our panelists left attendees feeling optimistic and confident about the future of AI, as they shared inspiring messages about how the prospective relationship between humans and AI will elevate businesses, lives and the world.
New York Times best-selling author and mathematician Cathy O'Neil opened ai.nyc with an ethical examination of artificial intelligence, citing multiple documented cases of biased algorithms, from criminal justice to child abuse to college admissions. Cathy also detailed findings from her book, "Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy." For example, drawing on the book, Cathy shared with the audience her views on the potential for bad algorithms to institutionalize injustices, such as racism and xenophobia. She also talked about how the coming together of human oversight and the right algorithms could prevent such unwanted outcomes, helping us to focus on the good that AI can do, if planned and executed in a smart way.
Our co-founder Srikanth Velamakanni delivered the second keynote of the day. In a fascinating presentation, Srikanth shared his recipe for scaled problem solving, including well-built AI, unique design and smart engineering. He referenced Microsoft's Tay chatbot experiment, Google's flu activity algorithm and Google Glass to demonstrate how a strategy missing any one of these ingredients can hamper great problem-solving ideas. He concluded by saying, "augmenting human intelligence is where the action is."
E-commerce optimization expert Joe Keating from Hill's Pet Nutrition, along with Fractal client partner Dipita Chakraborty, led a positive and constructive discussion about mining social data analytics to help e-commerce and CPG brands perform better. Keating, who works in consumer pet products e-commerce, shared examples of real-world insights, cautioning brands against excessive automation: "don't fall into a trap of thinking you don't need people." He also noted that by doubling down on analytics they've seen significant growth. It seems the right combination of both is the key to success.
In her chat, Dipita detailed how to analyze social chatter, advising, “whenever we use an algorithm, we have to figure whether the algorithm is spitting out meaningful data or not.” She also shared that brands are all trying to answer the million-dollar question, “what’s the next big thing consumers want?” through analytics. Dipita helped the audience to understand how to answer these questions for themselves, by sharing insights and her own real-world use cases in which analyzing social chatter correctly helped brands spot innovation, improve brand health and drive conversions.
In the "Problem Solving @ Internet Scale" session, Chris Jasensky, Area IT Director and VP, North America at RB, and Rambabu Vallabhajosula, SVP, Hotels Performance and Revenue Management at Priceline, along with Fractal's own David Yeo, discussed the critical ingredients for implementing AI at scale, including the importance of human capital. "It takes a village to create a successful AI scaling program," shared Yeo. The panelists detailed how implementing AI at scale is not just a technical challenge and agreed that AI projects need organizational buy-in and alignment to succeed. Company colleagues need to feel they have a vested interest in AI projects, which can be stimulated by sharing KPIs and measurement to inspire people to work for the common goal. Jasensky added that "AI projects must deliver a big, dynamic benefit to keep people engaged."
Biju Dominic, CEO and Co-Founder of Final Mile Consulting, energized the delegates with a passionate plea for designing AI products with the human element in mind. Dominic urged the industry to consider human behavior in the creation of its solutions, striving to build stronger emotional connections between humans and AI. Dominic also suggested the industry address the non-conscious barriers to adoption, primarily trust and emotional connection, closing with a thought-provoking message on the future of artificial intelligence: "we can't win by making others within the organization lose."
Panelists Natali Mohanty, Senior VP of Data and Analytics at Pure Insurance, Mike Gualtieri, VP and Principal Analyst at Forrester Research, and Fractal's own Lana Klein discussed the "Magic of Three Ingredients of Transformation." This thought-provoking conversation explored the nexus between engineering, design and human capital as it relates to scaling AI products, referred to as the "trifecta." Natali presented a compelling case study, which demonstrated the difficulties in scaling AI products when either quality engineering or design is inadequate. "No matter how hard the problem is, you cannot divorce from the design mindset," observed Natali, who also defined design skill as "requiring a combination of imagination and empathy." They all agreed that a practical, long-term mindset is mandatory when launching AI products: "it's a marathon, not a sprint."
Our co-founder and CEO Pranay Agrawal concluded the day-long conference with fascinating assertions about the role AI plays in everyday lives and how it has helped make our world a better place in the long run. He also talked about how the world has never been a better place to live in, by any objective measure, and how technology has played a major role in that.
Pranay also covered the framework for a successful AI-driven future, which included algorithms that allow us to match and exceed human capacity, engineering that will successfully feed tremendous amounts of data into the algorithms, and good design, which will help to solve the right problem. He concluded by saying that in order to solve a problem thoroughly, we also need to put users at the center of our decisions.
At the very end, Pranay left the audience with this, “Let’s all be happy, and appreciate the fact that we are living in the best times ever, and that if history has taught us anything, it’s that as technology gets better, which it always does, so will our lives, and the world.”
It's no exaggeration to say that every problem today is an AI problem. But while AI is a critical component of our problem-solving approach, it's no longer enough on its own to overcome these challenges at scale. We now have a new recipe to solve complex challenges at scale and drive action: AI combined with engineering and design.
Our ai.lcy event in London last week focussed on helping businesses drive better decisions when operating at scale. The event was attended by FTSE 250 leaders from a wide range of industries, including CPG, financial services, healthcare, insurance, telecom, retail and more. Speakers from companies including Mars, Visa, Google, Lloyds Banking Group and M&G Prudential provided their insights to 60+ attendees, as well as members of the media. Attendees also experienced AI products and services in the exhibit showcase area. Here is a brief top-line summary of each session:
- Keynote – How AI revolutionises business strategy – Kenneth Cukier (Senior Editor, The Economist)
Businesses need to think of data as a new factor of production. The more data we collect, the more we can do with it and the more we can produce. Because we can now apply artificial intelligence (AI) to different problems, businesses are able to learn things they couldn't before. They can maximise new opportunities and create new value (jobs, services, production, sales). But how an organisation 'frames' the problem it's trying to solve through AI is becoming increasingly important. AI is at its very best when forecasting and making predictions. As a result, businesses need to stop considering the problem as "humans vs. machines" and instead make everything a prediction problem. That needs to be at the core of their AI strategy.
- Keynote – AI is not enough – Srikanth Velamakanni (Co-founder and Group Chief Executive, Fractal Analytics)
AI is becoming ubiquitous – all problems will be reframed as AI problems, and at the core of everything that we are doing around the world of AI is a behavioural problem. When we think of it as humans and machines, as opposed to humans vs. machines, and balance between driving forces and restraining forces, we power decisions that make real progress. And AI alone is not enough to solve problems at scale. Machines are scalable, but so are their errors. As simple automation isn't always the solution, a deep understanding of human behaviour is needed. Therefore, a combination of AI, engineering and intelligent design is needed to solve problems at scale. To maximise the benefits and efficiencies of AI, businesses must combine their solutions with their staff. Simply having algorithms that outperform humans isn't enough; you need to marry super-intelligence to experience. Machines solve problems; humans make sense of things. They have the ability to adapt to a new environment, to a new set of circumstances, very quickly. Machines alone don't possess that adaptability.
- How to use AI to hack tricky problems – Abhijit Akerkar (Head of Applied Sciences, Business Integration, Lloyds Banking Group), Priyank Patwa (Head of AI & ML, M&G Prudential), Rahul Desai (Client Partner, Fractal Analytics)
To solve tricky problems, we need to understand human behaviour. To improve the overall experience – which should be the goal of any business – businesses need to consider the motivating factors that drive customer decision-making. And then we need to ask ourselves: can we predict the next likely event in the customer’s journey and power the “next best action” for the customer? To do this, we need to appreciate that short-, medium- and long-term actions have consequences on the customer journey. So, we need deep learning to extract customer patterns (and memory) from the journey so that we can craft a model that propels that person to the next best action (and deep learning-based models outperform traditional models across various scenarios). We need to change the decision-making journey by making sure that the right information and the right insights are available to the right person at the right time.
- How to solve problems at internet scale – Linden Glen (Digital Transformation Director, Analytics & Data, Mars), Arpan Dasgupta (Client Partner, Fractal Analytics), Sameer Dhanrajani (Chief Strategy Officer, Fractal Analytics)
Businesses should start with a user-centric approach: find out the problem of specific users, using design thinking to uncover the problem and why it needs to be solved. Once this is done, we must then consider how we should use analytics and AI to solve these problems. The final step is working out how to scale a solution. But it all starts with understanding what the initial problem is before thinking about what data and technology are needed. It is the business's goal – whatever it is that it's trying to achieve – that will inform the type of AI solutions and computational architecture that it crafts and deploys. At the end of the day we're talking about a sea change in personal preferences, and so now we have to make a change in business processes.
- How to make it work with design – Pranay Agrawal (Co-founder and Chief Executive Officer, Fractal Analytics)
To fully unravel human behaviour, we need to go beyond data. We need to ask why people take the actions they do, and businesses need to go beyond what the data shows them. Behaviour is driven by a wide variety of emotions, desires, factors and influences, some conscious and some unconscious. The two key factors in any behavioural-change scenario are the driving force and the restraining force. These need to be examined and understood so that we can design solutions for non-conscious behavioural change. So, we need a better understanding not only of behaviour but also of context, because context alters human behaviour. To identify the problem, we need to analyse the behaviour. To solve it, we need to understand it.
- The new recipe and its magic – Ben Neffendorf (Joint Data Science Lab Delivery Lead, Visa), Eleonora Kourtzi (Product Marketing Manager – Digital Growth Lead, Google), Martha Bennett (Principal Analyst, Forrester), Natwar Mall (CEO, Cuddle.ai)
To build and implement AI solutions correctly, organisations should pay special attention to the people involved. It's people who gather the data, people who select which data goes into the model and people who design the algorithms. All the problems that businesses are trying to solve are problems of AI, design and engineering. Through the right combination of all three, we can help business leaders reimagine their business through new technologies, improving the overall experience for the customer and generating high-quality results.
Through ai.lcy, we have challenged some of the assumptions around AI and big data practices, while delivering insightful and thought-provoking sessions on what businesses need to do to apply AI efficiently and scale it effectively. To be successful, organisations need to understand the driving forces behind customer behaviour, the context of that behaviour within the customer journey and the problems that need solving. For businesses to get the best results and deliver the best possible experience, they need a combination of AI, engineering and design.
The vast proliferation of data, faster computing, digital detonation, and the need for continuous innovation to stay ahead of cybercriminals have put the cybersecurity industry at an interesting inflection point, needing a radical paradigm shift in the way we foresee cyber businesses operating in the near future. Cybercrime is increasingly prominent on every boardroom's agenda, as it is costing businesses across industries nearly $118bn annually, and the role of cybersecurity experts is undoubtedly gaining more relevance in today's ever-evolving IT landscape. This paper discusses in detail how cybersecurity officers can respond more proactively and more effectively to cyber threats using AI and machine learning techniques.
Despite the sophistication of the tools, technology and machine learning-driven solutions available, cybersecurity officers have barely been successful in bringing down the time it takes to acknowledge an attack after it has happened, let alone staying ahead of the curve and predicting the next breach or attack before the attackers strike the first blow.
As per the recently published Verizon Data Breach Investigations Report, more than 50 percent of data breaches go undetected for several months. Most traditional approaches tend to focus on aggregating data around malware, hacking attempts, identity thefts, data breaches, phishing campaigns, etc., translating these into threat signatures (digital fingerprints of an attack) and then analyzing streams of historical and real-time data to find similar patterns and behaviors. Not so surprisingly, owing to their inventiveness, cybercriminals have always been one step ahead, constantly advancing and fine-tuning their attack strategies to circumvent existing systems and finding newer, innovative ways to threaten organizations.
Cybersecurity officers today are looking to employ more advanced, intelligent and less human-intensive systems to proactively monitor cybersecurity threats and mitigate them, in order to reduce cost, prevent fraudulent activities from happening in the first place, or even improve the efficacy of their current cybersecurity implementations. And when the rules of the game are changing at such an unprecedented pace, agility and the willingness to let go of old rules and learn new ones are no longer a matter of choice but a necessity to avoid extinction. That said, the critical questions that loom unanswered are:
- How can cybersecurity officers break this endless loop of playing a catch-up game with cybercriminals and gain an advantage? How can they tackle evolving fraud?
- How can they properly channel investments to handle the volume and complexity of today's cyber-attacks?
- How can they move beyond the current sub-optimal approaches of maintaining black-lists and adopt a more signature-free security approach?
- How can they truly differentiate between genuine human activity and intentional misconduct, and minimize false-positive alerts?
- How can they proactively detect out-of-normal behavior by analyzing real-time data streams from multiple network and infrastructure assets, uncovering threats as they emerge?
- How can they automate interventions based on the severity/criticality/complexity of the threat event as well as the risk appetite of the organization?
Growing Beyond a “Reactive” Signature-Based Methodology
Today, a cyber-attack, security breach, hacking attempt or security threat is typically not identified until after the event has occurred. Organizations are looking for options to rapidly mitigate threats in order to avert the ramifications of retrospective identification, rationalize spend and the opportunity cost tied to the investment, and implement a robust, scalable cybersecurity strategy that caters to their future needs.
Fraudulent behavior or misconduct in this context must be looked at through a different lens, a new perspective that most traditional approaches don't cater to today. Most current implementations look at historical evidence of attacks and potential breaches from 'known' sets of events. Instead of just poring over individual areas of anomalous behavior, we should mathematically define 'what is normal', the reason being that fraud is ever-evolving.
By following this approach, we should be able to understand and digest the nuances of what is “Not Normal”. And by doing so, we have a higher likelihood of uncovering out-of-normal activity, improving overall detection, automating incident investigations, improving threat containment and implementing better threat aversion strategies.
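As an illustrative sketch of what "mathematically defining normal" can mean in its simplest form, the snippet below baselines a single per-entity activity metric and flags values that deviate beyond a z-score threshold. The metric, threshold and sample data are assumptions for illustration only; a production system would model many features jointly.

```python
from statistics import mean, stdev

def flag_out_of_normal(baseline, current, threshold=3.0):
    """Flag activity that deviates from a learned 'normal' baseline.

    baseline:  historical values of one activity metric for one entity
               (e.g. MB downloaded per day by a given user).
    current:   the newly observed value to evaluate.
    Returns (is_anomalous, z_score).
    """
    mu = mean(baseline)
    sigma = stdev(baseline)
    if sigma == 0:  # no historical variation: any change is anomalous
        return current != mu, 0.0
    z = (current - mu) / sigma
    return abs(z) > threshold, z

# A user who normally downloads ~50 MB/day suddenly pulls 5 GB
history = [48, 52, 50, 47, 53, 49, 51]
anomalous, _ = flag_out_of_normal(history, 5000)
print(anomalous)  # → True
```

The point is that nothing about this logic depends on a signature of a known attack: anything sufficiently far from the entity's own baseline gets surfaced, including threat types never seen before.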
An Innovative Approach to Combating Evolving Cyber Threats
Using the following three-phased approach, businesses can establish a System of Intelligence for end-to-end cyber threat prediction, detection, prevention and intervention in real-time, thereby improving the overall cyber threat remediation process.
Real-time threat tagging
Real-time threat assessment, evaluation and adjudication strategy
In this phase, historical cyber-threat activity and potential threat actor information is used to determine what is 'normal' and what is 'not normal'. Once normal is understood, a model is developed to identify all non-normal activity, flagging incoming real-time activity from incident reporting (IR) tools or logs/events from systems, applications, networks, security devices and other sources. These events are then bucketed into various 'known' threat categories (threat types defined from historical incidents plus SME knowledge) and unknown/undefined events, which could be newer forms of evolving threats (the discovery of new trends and behavioral patterns).
In real time, the potential threat events flagged 'At Risk' are adjudicated through robotic process automation (RPA) systems based on the client's risk appetite or the severity/criticality/complexity of the threat involved and its associated downstream effects. For example, if a vendor or contractor, an Edward Snowden, say, downloads copious amounts of data and has done the same thing 'X' number of times over a certain period, he will be referred for human investigation, versus an email/text/pop-up message if it is just the first attempt. Similar anomalies in employee and/or vendor/contractor behavior can be brought to light ahead of time and adjudicated for early mitigation.
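The adjudication logic described above can be sketched as a simple rule table. The thresholds, risk scale (0-1) and channel names below are illustrative assumptions, not a reference to any specific RPA product:

```python
def adjudicate(event_count, risk_score, high_risk=0.8, repeat_limit=3):
    """Choose an intervention for a flagged 'At Risk' event.

    First offences get a lightweight automated nudge; repeated or
    high-risk activity is escalated to a human investigator.
    event_count: how many times this entity has triggered the flag.
    risk_score:  model-assessed severity of the event, 0-1.
    """
    if risk_score >= high_risk or event_count >= repeat_limit:
        return "human_investigation"
    if event_count > 1:
        return "automated_warning"   # e.g. email/text/pop-up message
    return "automated_notice"        # first attempt: gentle nudge

print(adjudicate(event_count=1, risk_score=0.2))  # → automated_notice
print(adjudicate(event_count=4, risk_score=0.2))  # → human_investigation
```

In practice the rule table would be tuned to the organization's risk appetite, which is exactly why the source argues that Control and clear escalation paths must be defined before automation is switched on.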
Predicting future threats
Analyzing attacker behavior to predict future threat actors and events
Build profiles of the top 5-10 percent of cybercriminals responsible for extreme threat events in the past by analyzing their longitudinal digital footprints over elongated periods of time, plus profiles of 'known' threat types, and finally ideal profile(s) for users with genuine digital/web activity.
Based on these profiles, build a mathematical model to predict WHO is a potential threat actor and WHEN they are likely to engage in potential cyber threat activity.
When these events are predicted, a robotic process automation (RPA) system can contact the concerned party through the ideal outreach channel (email/text/pop-up window/phone call) based on the event's risk score and the organization's risk appetite, and/or employ a human investigator to intervene for corrective action.
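A minimal sketch of this risk-score-to-channel routing follows; the thresholds and the assumption that risk appetite is expressed on a 0-1 scale (lower appetite escalates sooner) are illustrative, not prescribed by the source:

```python
def choose_outreach(risk_score, risk_appetite):
    """Map a predicted threat's risk score to an outreach channel.

    risk_score:    predicted severity of the event, 0-1.
    risk_appetite: organization's tolerance, 0-1; lower appetite
                   inflates the effective score, escalating sooner.
    """
    effective = risk_score * (1.0 + (0.5 - risk_appetite))
    if effective >= 0.9:
        return "human_investigator"
    if effective >= 0.6:
        return "phone_call"
    if effective >= 0.3:
        return "email_or_text"
    return "pop_up_window"

print(choose_outreach(0.95, risk_appetite=0.5))  # → human_investigator
print(choose_outreach(0.10, risk_appetite=0.5))  # → pop_up_window
```

Keeping the routing rule this explicit makes it auditable, which matters when the same mechanism is expected to reduce false positives rather than amplify them.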
Real-time identification of future attack events and enabling automated interventions ‘ahead of time’ to mitigate risk or minimize losses
This phase combines the results from phases 1 and 2 with unstructured tertiary inputs from cybersecurity SMEs and/or external world events to make the system more intelligent and reduce false positives and negatives.
This phase will drive compliance and education (via training, behavioral coaching, etc.), decreasing the amount of non-compliance, avoiding mistakes from genuine users and/or averting actual intentional cyber threats. This phase will help detect and stop threats ahead of time, and shorten time to remediation when attacks occur (using next best action).
Cybercrime is growing exponentially, faster than most businesses can decipher and embrace the winds of change. Naysayers will perish, existing market incumbents will be toppled, and early winners will rewrite the underlying business fundamentals, disrupting the marketplace in ways unimagined. Cyber businesses across the globe should attune themselves to this paradigm shift in operating models. The situation may seem insurmountable unless businesses are equipped with the right set of tools, technologies and knowledge partners to help. AI-driven cybersecurity implementations hold the future for businesses bracing to flip the markets again.
It’s the tip of the iceberg, or just scratching the surface—call it what you may! There’s a new cybersecurity story waiting to be etched in history, and it’s being driven by AI and powered by analytics.
Principal Consultant, Fractal Analytics
Artificial intelligence (AI) has limitless potential, and for most enterprises, we are only scratching the surface of the opportunity. There are many examples of enterprises embracing artificial intelligence to predict buying patterns, understand customer behavior, create personalization, help in genome research, optimize supply chains, conduct financial trading, or recommend movies. For these companies, artificial intelligence has already become a competitive differentiator.
However, there are many more companies who are trying to figure out the reality vs. the hype and how to either begin or accelerate their own journey. For these companies, it’s still a significant challenge if not a daunting undertaking to begin the AI journey. Many firms also have concerns over technology selection, cost, integration, privacy, security, and regulatory challenges.
These challenges are not unique to AI. They are many of the same challenges that have existed in the widespread effort to adopt, use, and benefit from analytics and machine learning (ML) initiatives. Therefore, given these challenges, how can you start the journey if you haven’t already done so?
The strategies discussed in this white paper are intended to provide guidance and suggestions for organizations and enterprises that are navigating this journey. However, these strategies can also be applied by leaders of business units, divisions, or other entities of larger organizations who wish to adopt or accelerate their analytics and AI initiatives.
1. Gain C-Suite Sponsorship
To ensure success in adopting or accelerating analytics or AI across your enterprise, it’s important to provide active sponsorship from all C-suite leaders. These leaders must make cultural adoption a priority to drive progress, and align your assets, investments, and plans to your corporate strategy.
Executive buy-in is key to success.
Executive sponsorship and buy-in are vital for success. The more engaged the C-suite is, the better the chance of successfully implementing and adopting analytics and AI across the enterprise. Ensure that all senior executives engage, actively participate, and buy in to the initiative. This becomes even more critical as technology, data, complexity, risk, and demand increase. According to the McKinsey Global Institute, "strong executive leadership goes hand-in-hand with stronger AI adoption. Respondents from firms that have successfully deployed an AI technology at scale tended to rate C-suite support nearly twice as high as those from companies that had not adopted AI technology."
It’s not just the job of a CDO, CIO, or CAO—all need to buy-in
It’s not just the job of a single function such as the CDO, CIO, or CAO, if your firm has adopted these roles. If no business leader or function is chartered (and empowered) to spearhead your AI efforts, you are at risk of falling behind your competitors. Therefore, as a critical first step, establish clear executive ownership for data and analytics (e.g., a CDO and/or CAO). Ensure that the functional leader and their teams are staffed at an appropriate level to drive the transformation strategy and change management needed throughout the enterprise. One-off, independent, or siloed efforts will not succeed and may impede progress due to competing strategies, investments, or performance objectives.
Ensure there is active and engaged sponsorship from all C-suite leaders
Align assets, investments, and plans to enable your corporate strategy
It is important that you are able to drive strategy, resources, and investment, and set the tone for the organization and cultural adoption. This includes active engagement and support for BI/AI strategy, assets (both IT and human), investments, and cultural adoption. By contrast, if budgets for AI projects, initiatives, and/or talent are dispersed throughout the enterprise, or not aligned to enterprise priorities, progress will be impeded. Therefore, once you establish commitment at the C-suite level and formalize your strategy, you should also determine how you wish to manage and control your budget and capital throughout the enterprise, particularly if your current landscape consists of competing internal (or external) analytics or AI efforts.
This is not to suggest analytics or AI budgets should not exist throughout the organization but if left unmanaged or not aligned to your enterprise strategy, then the end result may be sub-optimized use of budget, capital and resources, or worse, competing solutions and efforts. The key is to identify the budget and resources, and align them towards your desired outcomes.
Make cultural adoption a priority to remove barriers, obstacles, or blockers
Cultural adoption is perhaps the single biggest challenge and requires the most top-down leadership to define, communicate, and reinforce the vision and strategy. Hold organizational leaders accountable for executing the changes required to drive the transformation. Executive leadership needs to remove barriers, obstacles, or even blockers if required to increase the chance of success.
Champion and communicate wins and progress.
Champion and communicate wins and progress to the broader organization. This will help reinforce the commitment from the top as well as garner support for the transformation. Reward and recognize champions to reinforce and support the behaviors and leadership you need for success. Conversely, a lack of communication and engagement from the top will slow or delay progress by reinforcing that “business as usual” is acceptable.
2. Coordinate Your Enterprise Strategy and Investments
Make sure you drive strategy and investments that align the C-suite, business, IT, data, and analytics functions or organizations. Success also requires a top-down, coordinated strategy for IT investments to create or optimize big data solutions, data storage, apps, BI tools, and platform integration.
Have a strategy, plan, and roadmap that aligns business, IT, and analytics.
You need to have an integrated strategy, plan, and roadmap that aligns business, IT, and analytics. This could take the form of a strategic plan and three-year roadmap to align and drive investments in IT, processes, and talent. This roadmap should include your strategy (and timing) for investments in the entire “BI stack” including data storage and governance, big data technology, BI, analytics, and visualization tools and solutions. Without an integrated plan and roadmap, it will be difficult to yield an appropriate ROI on your IT investments.
Create a strong Business-IT partnership to promote progress.
There must be collaboration by all key stakeholders and functions to ensure enterprise benefits. A strong business-IT partnership is vital for success. It’s not purely IT’s role to implement big data, analytics, or AI projects, or to fix "data problems". Business leaders must take ownership and responsibility to partner with IT and other functions to drive analytics and AI initiatives. Silos are the enemy of progress.
Big data is part of the “BI stack” and overall solution—not the end state.
Investing in big data architecture is critical to success, particularly given the explosion in data and data sources like video, chat, the Internet of Things, and sensor technology—largely "unstructured" data that complements most legacy "structured" data sources like CRM and ERP systems. However, big data solutions should be considered part of the strategy for the "BI stack" of technology spanning data ingestion, storage, discovery, modeling, analytics/ML, visualization, and mobility.
Big data can provide scalable, fast, and responsive BI and AI solutions to manage both structured and unstructured data. In order to maximize success of your investment in big data, it is equally important to determine what data you will store and make available through an enterprise data warehouse, data lake(s), and high performance memory-resident appliances in order to provide fast, responsive solutions for the business. As discussed later in this white paper, this also requires an effective enterprise data governance capability.
On top of this architecture, you also need to consider what tools are required to enable data exploration and visualization by the business and end-users. There are a number of compelling solutions currently available, and many more being introduced so it is important to strategically assess these solutions against your enterprise requirements and rationalize the tools you require for the given business need or use case. If you have too many BI tools, it can result in inefficiencies, confusion, competition, and a drag on productivity. Ultimately, you may choose several depending on your needs, but the key is not to ignore BI analytics and visualization tools as part of your overall BI architecture strategy and roadmap.
Create or scale a BI system that provides a platform for innovation, analytics, and AI.
Building an enterprise-wide business intelligence or business management system can create a robust big data platform not only for descriptive analytics and reporting, but also for implementing predictive analytics solutions, ML, and AI at scale, quickly and effectively. Building once and sharing for the benefit of the enterprise is much more effective than building isolated solutions throughout the organization. A business management system works well when combined with effective data governance policies and practices governing what data is available in data warehouse(s)/data lakes, for whom, and how it can be accessed and utilized. Such a platform can actually accelerate innovation and AI by allowing faster identification of end-user best practices or ideas, and rapid prototyping, replication, and deployment of best practices, algorithms, and solutions.
An enterprise-wide BI or business management system can accelerate analytics or AI in the enterprise.
On top of this platform, you can readily apply analytics and visualization solutions to drive a scalable, fast, single-source-of-truth solution for the enterprise. It also accelerates analytical discovery, since the platform represents a large volume of trusted, normalized, and relevant data which can be easily accessed and analyzed by the business to determine drivers for success or reasons for under-performance. Ultimately, this will lead to improved ways of evaluating business performance, or the introduction of predictive measures and KPIs for success. Therefore, in the context of AI, such a platform enables and accelerates the ability to leverage ML and AI faster and more broadly across the enterprise. It is essentially a platform for analytics and AI innovation.
Establish the ability to do rapid prototyping with business and IT teams.
Utilize a prototyping capability or data lab to allow the business to perform rapid testing of new theories, algorithms, or models prior to production. Speed is essential, but so is scalability, security, and maintainability. Rapid prototyping can not only speed solutions to the business, but it can also raise confidence in the final product before it is released which accelerates adoption. You may also wish to create a cross-functional task force encompassing key business leaders with responsibility for chosen use case(s), data, analytics capability, IT, and IP partner(s) to collaborate on critical initiatives utilizing rapid prototyping and agile methodology to quickly move from idea to testing and action. This would also allow you to iterate quickly in enhancements to improve the quality of the product or solution before going into production.
Agile methodology can help accelerate the development of analytics and AI solutions
Agile methodology also needs to be embraced to develop analytics and AI solutions at both speed and scale. Traditional IT development projects take too long, are too rigid, and often do not meet the needs of the business, which do not remain static. You need to have a way to iterate the development process and deliver more frequent “wins” to the business. Otherwise, they will look for solutions elsewhere, which can result in proliferation of “shadow IT”.
Alongside agile methodology, give the business direct and immediate access to governed data within the data warehouse or data lake, which precludes the need to acquire, replicate, or export the data into offline tools or systems (in effect creating "shadow IT"). This also allows business users to shift their efforts towards analytics insights and action while ensuring compliance with enterprise data security, privacy, regulatory, risk, and scale considerations.
Data integration, harmonization, and governance are critical to success.
Data integration and consolidation into data warehouses and data lakes are critical enablers for success. Collecting, storing, and providing data is the lifeblood of analytics and AI. Fragmented, “shadow” IT (data disbursed throughout the enterprise in various sources, which are often unsupported and unmanaged) is a significant drag on speed, productivity, and ability to implement enterprise-wide AI solutions. Beyond data collection and integration, data governance is essential to make sense of the data, improve (and maintain) data quality, and manage the access, use, and distribution of the data to enable the enterprise AI strategy.
Be open to strategic partnerships for creative, breakthrough thinking and IP
There is significant investment flowing into AI and related fields. Don’t miss out on the opportunity to partner with leaders in innovation and IP. Strategic partnerships can provide access to leading edge IP and capabilities. However, scalability, security, and integration challenges can slow adoption. As more investment flows into AI, this will become increasingly important and will require the ability to speed integration through open source, API, or cloud integration tools or platforms.
3. Establish Data Governance and Management
An enterprise-wide data strategy and governance process is a critical enabler for successful analytics and AI implementation. It’s important to recognize how critical, yet challenging, it is to govern and manage data across the enterprise. An effective data governance strategy will help you understand where your data is, what is important, how you need to manage it, how (and to whom) you want to allow access, and how the data will then be used.
Data strategy, governance, and management is mandatory for success.
Given the proliferation of data and data sources, increased end-user demand, and ever more intuitive and pervasive "self-service" tools and solutions, an effective data governance program is becoming even more critical. Without data governance, all of this data will end up in data warehouses or data lakes that become "data swamps".
Said another way, proliferation of data and uncontrolled user access can provide full freedom for business users. For an enterprise, however, it can result in confusion, duplication, inefficiency, and distrust. The appearance of moving quickly on analytics or AI projects will mask the fact that the enterprise is absorbed in an internal battle over data access and use, rather than allocating its critical assets (people, process, technology) to serve a broader purpose: the organization itself, its customers, and its shareholders.
Establish or scale out data governance and data stewardship across the enterprise.
If there is a weak or nonexistent data governance process or function, then it is critical to create or endorse an organization chartered to address this challenge with appropriate support from the C-suite. The organization or function(s) should include resources committed to manage and improve data collection, accuracy, and use across all critical business functions. The data governance organization must also define and manage data policies, standards, definitions, and manage data quality in order to support an effective analytics and AI strategy while complying with regulatory or legal requirements, privacy, security and other considerations.
Break down data silos to gain access to the data that is most critical, and decide how you want to balance control vs. speed and flexibility in use of the data.
Be maniacal about managing data quality, and invest in the tools and processes to maintain it.
Data quality directly impacts the accuracy of analytics and AI models and their output, and thus the resulting business decisions, so it is critical to have a strategy, tools, and resources dedicated to ensuring data accuracy and availability in source systems and data warehouse(s)/lakes. If there are data quality issues, don’t underestimate how critical it is to have a strategy and capability to maintain ongoing data quality once you’ve corrected or cleaned up the data challenge(s).
Not all data is equal. Determine what data to tightly control, and what data you wish to make available for self-service, discovery and exploration.
Establish the rules, policies, and controls to govern critical data or KPIs, which must be tightly controlled and distributed or published, vs. data you will allow for discovery, exploration, and ad-hoc analysis. This may also change over the life of the data; however, these rules must be strategically determined, managed, and controlled.
Not all data is the same, so it is important to determine what you wish to tightly control as a “single source of truth”, and to comply with privacy, security, risk, or regulatory demands. What data or KPIs are most critical to make decisions, who needs them, how are they delivered, and who “owns” and produces them are a few key questions that need to be answered. Ironically, although tight compliance and governance sounds restrictive, it can actually accelerate innovation and application of predictive analytics at scale. To carry the illustration a bit further, if you have defined the key measurements you need to successfully manage your enterprise (financial or operational), then how powerful is it if you have standardized, normalized, readily available data that you can use to identify trends, variances, business unit or individual performance (and underlying attributes)? It can be argued that your ability to apply analytics solutions increases significantly, including machine learning and ultimately AI solutions.
In parallel, you also need to understand what you will allow for discovery, experimentation, and analysis by end-users and the business, or where you will allow "multiple versions of the truth". Given the increasing availability of user-friendly, intuitive analytics and visualization tools, how far will you go in enabling "self-service" to create new predictive analytical models, new ways of evaluating the business, or new business processes? Whom will you enable, with which datasets, KPIs, and use cases? These are important questions to consider. There is a balance between being too rigid and being too flexible, which underscores the need for a strategy, organizational capability, process, and adherence to a well-defined data governance model to enable an effective enterprise-wide analytics or AI strategy.
- Too much flexibility can result in different or competing versions of the truth, which can create debates, confusion, conflict, and a significant drag on productivity.
- Too much control can result in rigid processes, bureaucracy, slowed or lack of response to the business, and the creation or proliferation of business-led IT solutions (“shadow IT”).
As you make these decisions, it’s important to have a governance process in place that allows you to implement and manage them, including who gets access to what, how much, and what they can do with the data. There are a growing number of vendors and tools in the marketplace that can help enforce and support these decisions (also referred to as "metadata management"), including Informatica, Collibra, DATUM, and Global Data Excellence, to name a few.
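At its core, the "who gets access to what" decision can be expressed as machine-enforceable policy. A minimal sketch of a tiered, default-deny access check follows; the dataset names, tiers, and roles are hypothetical illustrations, not the schema of any particular metadata-management product:

```python
# Sketch of tiered data-access policy enforcement.
# Dataset names, tiers, and roles below are illustrative only.

POLICY = {
    "finance_kpis": {"tier": "controlled", "allowed_roles": {"finance", "executive"}},
    "web_clickstream": {"tier": "self-service", "allowed_roles": {"analyst", "data-scientist"}},
}

def can_access(role: str, dataset: str) -> bool:
    """Return True if the role may read the dataset under the policy."""
    entry = POLICY.get(dataset)
    if entry is None:
        return False  # default-deny: unregistered datasets are never exposed
    return role in entry["allowed_roles"]
```

The default-deny branch matters: data that has not been registered and classified by the governance function stays invisible, which is what keeps a data lake from drifting into a "data swamp".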
Consider creating a Chief Data Officer (CDO) role and function if you haven’t done so already.
There is a lot of literature on how to organize data management functions, including the role of the CDO. However, the primary message is you need to have an enterprise-wide strategy and effective governance model managed by a function that has the charter and ability to effectively lead and manage the acquisition, quality, dissemination, and use of the data across the enterprise.
4. Solve Enterprise-Level Business Problems
Identify specific business problems that you can address. Ensure they are strategic and impactful for the broader benefit of the enterprise vs. department or mid-level business problems. Too often, critical or scarce analytics and data science talent is applied to solve minor challenges, automate reporting, or other mundane tasks. It’s vital to align the talented assets and investments to solve critical business problems or opportunities.
The business must lead in the identification and selection of use cases or projects for AI.
Preferably, this should be done by senior executives. Projects or initiatives should be aligned with or enable the corporate strategy. Ideally, the use case or initiative will have a direct and material impact in driving the P&L (cost reduction, top line growth, profitability), better serve customers through reduced cycle time, better quality products and services, and improved customer experience to highlight a few of the most obvious choices.
It must be business-led to ensure commitment and enhance adoption. Depending on where you are in your AI journey, it may be advisable to pursue a “crawl-walk-run” approach where you identify near-term opportunities that may not be as impactful as you may hope (or plan for), but will allow you to test and learn what you are capable of doing, and how to effectively deploy AI. This “roadmap” approach can be a powerful way to build a foundation and rapidly expand your initiatives, and results, as you learn and iteratively improve. I am not suggesting ignoring breakthrough ideas or capabilities, but would suggest focusing on near-term opportunities, building your AI “muscle” in addition to planning for breakthrough capabilities if you are still early in your journey towards adoption.
AI is well suited to optimize internal processes to enhance your operations, improve customer service, or deliver greater financial outcomes for your enterprise.
Ideally, it is a business process that is repeatable, where the business process may be done manually today but can be automated to the point that you can apply machine learning techniques and AI to refine, improve, learn, and strengthen the process. It’s also important to approach these problems initially like experiments where you can identify the critical success factors you want to influence or change and measure the impact of the change in process or hypothesis by implementing a new analytics/ML/AI solution, algorithm, or model. As you do so, you can readily demonstrate the impact of the change in process with agreed-upon measurements, facts, and data which demonstrate the ROI of the solution.
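Treating each process change as an experiment means agreeing on the success metric up front and comparing it before and after the change. A minimal sketch, with invented figures purely for illustration:

```python
def pct_improvement(baseline: float, candidate: float) -> float:
    """Percentage improvement of candidate over baseline for a
    lower-is-better metric (e.g., forecast error, cycle time)."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return 100.0 * (baseline - candidate) / baseline

# Hypothetical: manual process error 12.0%, ML-assisted process error 9.0%
uplift = pct_improvement(12.0, 9.0)
print(uplift)  # 25.0 (a 25% reduction in error)
```

The arithmetic is trivial; the discipline is not. Locking the metric and the baseline measurement before deploying the new solution is what makes the ROI claim defensible afterwards.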
Tie use cases end-to-end, and do not get bogged down in functional silos.
Although you may start with a specific use case like sales forecasting or demand management for supply chain, it’s important to tie these use cases end-to-end, and do not get bogged down in functional silos. You may need to commence with functional solutions, but the key will be how to tie them “end-to-end” for greatest impact. For example, building an outstanding sales forecast model delivers a lot of value to the corporation, but what if you can tie it to your supply chain demand model?
To provide another illustration, marketing and sales must collaborate on the best use of data and insights to improve sales effectiveness, productivity, and results. Marketing data gleaned from customer browsing history, social media sentiment, or contact information can be shared with sales to provide recommendations and leads for follow-through. If the datasets or analytics solutions are not harmonized or integrated, they become disparate data points requiring the end user (the salesperson) to consult multiple sources, or the information may never reach them at all; the result is lost productivity and the burden of deciding which action is most appropriate or effective. Integrating these datasets into a consolidated solution, recommendation, or set of actions for sales will therefore have a much bigger impact than if they stood alone, or worse, competed for attention.
Ensure the business problem or use case is strategic and understand its impact to the bottom line.
Don’t underestimate the importance of process engineering skills.
Most business problems that can be automated need to be defined in terms of "as-is" and "to-be" states, so you can apply analytics, ML, and AI solutions to automate, learn, and improve. Experiment and test. Measure before and after. Tweak, modify, and learn. Process engineering skills are critical: your organization must have the ability to assess current and proposed future-state processes that require change.
For example, how do you perform sales forecasting or demand management today? Is it being done on spreadsheets using different formulas, gut instinct, or other means and then manually rolled up? If so, how can you automate the process, embed that into a tool that you can apply ML and AI to in order to learn, improve, and dramatically increase the outcome? At the root of this, you need to have the ability to do process engineering and ultimately, process redesign in addition to the analytics/AI work.
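The spreadsheet roll-up described above can be replaced by an automated baseline that a learning system later refines. As a minimal sketch, simple exponential smoothing produces a one-step-ahead forecast from a sales history; the series and smoothing factor here are invented for illustration:

```python
def exp_smooth_forecast(history, alpha=0.5):
    """One-step-ahead forecast via simple exponential smoothing.
    history: past sales figures, oldest first; alpha: smoothing factor."""
    if not history:
        raise ValueError("history must be non-empty")
    level = history[0]
    for x in history[1:]:
        # new level blends the latest observation with the prior level
        level = alpha * x + (1 - alpha) * level
    return level

# Hypothetical monthly sales
print(exp_smooth_forecast([100, 120, 110, 130]))  # 120.0
```

Even a baseline this simple is reproducible and measurable, which the manual spreadsheet process is not; once automated, ML can tune the smoothing (or replace the model entirely) and the improvement can be quantified against the baseline.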
Embed these new AI-driven solutions into decision-making tools and processes.
It’s critical to think about how you can embed these new AI-driven solutions and processes into decision-making or transactional systems. These will yield the greatest impact on the bottom line, customer satisfaction, and productivity. Think about your CRM or sales tools and how you would embed these “recommendations” into your point of quote, transaction, or customer interaction. This can be a powerful differentiator in how you serve customers and how you can improve your P&L.
For illustration, how impactful would it be if you can provide automated recommendations to optimize product configurations or add-on and upsell recommendations at the point of quote or order? Even better, what if these recommendations didn’t require human intervention? Ultimately, integrating AI into workplace processes, decision-making, or transactional tools is the key to long-term success. A redesigned end-to-end process with applied machine learning techniques will not only facilitate faster and better decision-making, enhance business performance and customer experience, but the cost to identify, and deliver on these decisions is dramatically reduced over historically manual (or nonexistent) methods.
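A point-of-quote add-on recommendation of this kind can start as simple co-occurrence counting over historical orders. A hedged sketch (the product names and order history are hypothetical):

```python
from collections import Counter

def recommend_addons(orders, in_cart, top_n=2):
    """Recommend add-ons by counting which products historically
    co-occur with the items already in the cart."""
    counts = Counter()
    cart = set(in_cart)
    for order in orders:
        items = set(order)
        if cart & items:                 # order shares an item with the cart
            counts.update(items - cart)  # count the other products it contains
    return [product for product, _ in counts.most_common(top_n)]

# Hypothetical order history
history = [{"laptop", "mouse"}, {"laptop", "dock"}, {"laptop", "mouse", "dock"}]
print(recommend_addons(history, {"laptop"}))  # the two most frequent co-purchases
```

Embedded in a CRM or quoting tool, such recommendations require no human intervention, and upgrading the logic later (to association rules or a learned model) changes only this function, not the surrounding process.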
Personalized recommendations will accelerate end-user acceptance and adoption.
Even generic algorithms and models are generally more advantageous than pure "gut instinct" (although some might argue otherwise). However, the more personalized the recommendation, the better. This also suggests that you should think about how your customers or end-users consume information and how you can deliver your analytics/AI solutions in a relevant, intuitive, and adaptive manner. The human-machine interface is becoming increasingly important as the amount of data and data-driven solutions multiplies. It’s not sufficient to deliver large static reports with hundreds of rows of data. The key is how you convert the data into insights and actions for the individual end-user to act upon.
5. Build, Scale, and Partner for Talent and Access to Intellectual Property
To be successful, it’s important to develop, partner, or acquire the critical skills you need. It’s well documented that the demand for analytics and data science professionals exceeds the supply of talent, and the gap is increasing.
Ensure you have a strong recruitment process with leading universities and institutions developing analytics talent.
If you have an in-house analytics team, ensure you also have a strong recruitment process and relationships with universities to gain access to critically needed analytics talent. In a highly competitive field like analytics and data science, it goes without saying that you also need to have competitive salaries, benefits, and a well-defined, structured, (and enviable) career path that not only defines upward mobility, but encourages cross-training.
It’s impossible to accelerate creation or adoption of AI without the right talent or partnerships with industry leaders and innovators.
Create or leverage talent development programs to build expertise, skills, and scale.
Once you acquire the talent, it’s also important to retain the talent and expertise. This means investing in talent retention or development programs that allow you to quickly ramp new hires, provide advanced skills, and mitigate the impact of turnover. Given the high demand, and high attrition of talent in this space, a rigorous talent development program can help reduce turnover while accelerating the use of critical knowledge or skills to be successful – whether “hard” or “soft”.
Development programs should also bridge the gap between the "science" of analytics and data science (e.g., predictive analytics, machine learning, deep learning) and the "art". Deep business or domain knowledge is needed to be most successful; alternatively, ensure the analyst is embedded in the business. Don’t ignore the importance of soft skills like verbal and written communication, including "storytelling". Invest in and develop skills in these areas.
Utilize a knowledge management platform and tools to share models and best practices.
This will help provide faster, proven solutions to end-users and customers. Invest or create knowledge management systems or platforms, and tools to share analytics models, automate data ingestion, etc. This will help build scale in your analytics organization, speed solutions to the business, and reduce “re-work” or duplication. More time can be spent on new techniques, or insights derived from the data.
Build the talent and skills needed in complementary functions to be successful.
Don’t overlook the importance of process engineers, IT infrastructure, developers, and project managers. All functions and skills are required in most enterprises to build robust, sustainable AI solutions. If these skills are lacking, it will be difficult to quickly deploy analytics solutions into production.
Partner with firms like Fractal Analytics to gain access to critical expertise, skills, and IP.
This will help you extend or complement your in-house capabilities to move faster to meet internal demand. It will help you obtain access to the talent and proven knowledge, skills, and IP solutions necessary to understand the business problem, build the analytics/ML solutions needed to address the business problem, and access leading-edge IP and knowledge from their deep bench of experts and investments in AI. They also have the ability to provide best practices or solutions from other industries or use cases that may accelerate your own efforts. Given the rapid increase in investments in artificial intelligence, partnering with industry thought leaders can also help keep you current on new and emerging technology and IP that can further accelerate your AI initiatives.
6. Organize for Success
Choosing the right organizational model is also an important factor in accelerating adoption of analytics, machine learning, or AI solutions. There are benefits and drawbacks to choosing a fragmented analytics team vs. one that is centralized in the organization. Consider using a hybrid model to combine the best of both worlds.
Highly dispersed or fragmented analytics talent can create challenges in driving enterprise AI
If your talent is highly fragmented or staffed in departmental and/or mid-level functions, it will be extremely difficult to build enterprise-wide solutions. Analytics or AI solutions developed in a fragmented manner may be interesting, "sexy", or possibly breakthrough in some respects, but do they solve the most critical business problems or challenges? Are they aligned, or are there competing solutions being developed? Is the work effort strategic or tactical, and are you leveraging critical and scarce talent in the most effective manner to achieve your goals? Decentralized analytics organizations may face some of the following challenges:
- Excessive focus on departmental or tactical objectives rather than strategic ones.
- Internal competition for funding, tools, talent, and control.
- Proliferation of BI/Analytics tools or solutions.
- Limited ability, or outright inability, to apply predictive analytics at scale.
- Difficult or impossible to move or develop talent across the organization.
On the other hand, highly centralized analytics organizations run the risk of being irrelevant.
At the other extreme, a highly centralized analytics organization risks becoming irrelevant or lacking the business knowledge to be successful. Centralized teams do provide scale and the ability to quickly understand, share, and leverage best practices, tools, and methodologies, which are constantly and rapidly evolving. But characteristics of an overly centralized analytics organization may include:
- Too rigid or slow in delivering analytics/AI solutions to the business.
- Disproportionate focus on the science of data and analytics vs. the business impact or outcome.
- Lack of business knowledge resulting in ineffective or irrelevant solutions.
- May drive increased growth or proliferation of de-centralized analytics talent.
A well-defined and effective organizational model can accelerate the adoption of analytics and AI solutions in the enterprise.
Consider a hybrid model to combine the best of both worlds.
Therefore, whether you build or partner to acquire your analytics expertise, you must carefully consider how to organize your talent in a way that allows for both expertise and scale in the tools, processes, and techniques of analytics and data science (the “science”), yet also provides close alignment with, and understanding of, the business and the business problems that need to be solved (the “art”).
- Analysts or data scientists must be close to, if not embedded in, the businesses they support in order to better understand the business challenges or processes that need to be improved. Don’t “throw the challenge over the fence and hope for the best”.
- A model that provides the best of both worlds is the “hybrid” or “hub-and-spoke” model. There are different ways to implement these hybrid models and determine where the resources are placed, and who manages them. Overall, this hybrid approach has many advantages over the decentralized or centralized functional models.
No matter which model you choose, it’s ideal to have analytics talent remain close to the business. The more they learn and know about the business, the more effective the solutions will be. Conversely, the more remote (physically, intellectually, organizationally) your analytics talent is from the business, the less successful you will be.
7. Create and Strengthen a Culture of Collaboration and Experimentation
It’s also important to build a culture of collaboration, experimentation, and innovation. Build a roadmap that delivers early and frequent “wins”, and communicate the progress and wins throughout the enterprise to inform, “de-mystify”, and rally support for your AI initiatives.
Remember that AI is a journey.
It’s a journey. Be willing to fail, learn quickly, adapt, and test again. As you do, the organization will collectively learn how to better utilize analytics and AI to solve more complex questions or problems, and how to be more proactive in applying the solutions and outputs.
Enable collaboration and teamwork
Provide incentives, KPIs, or metrics that encourage business, operations, IT, and analytics functions to work together, and engage the right subject matter experts. AI is neither a problem nor a solution that sits solely in the realm of data scientists or IT. You need operational, business, and technical expertise to ensure that the analytic outputs solve the business problem or challenge, and effective collaboration among business, operations, analytics, and IT experts to design, deploy, optimize, and maintain solutions leveraging ML and AI.
Don’t underestimate the challenge or importance of cultural acceptance and adoption.
Identify internal champions, thought leaders, and change agents to help drive cultural awareness and adoption.
Identify and develop champions and change agents who can help drive the cultural awareness and adoption needed to gain traction. Ensure they are recognized as key leaders and operate at a level, or in a function, strategic enough to make an impact. They are instrumental in achieving and communicating early wins and gaining buy-in from their peers, co-workers, and team members. Champions can also provide honest, frank feedback that improves the solutions you deliver and ultimately accelerates broader adoption.
Build your roadmap to yield early wins and successes to increase confidence and momentum.
Build your projects and roadmap to yield early wins and successes. This will build confidence and demonstrate the value and impact of the investments in big data, analytics, and AI. If you succeed, sponsoring businesses and executives who understand the value and impact to the corporation will be more willing to invest further, move faster, and expand the scope.
Communicate early wins and successes to provide encouragement and support for the journey.
Communicate progress throughout the enterprise to garner broader support and momentum for further investments, collaboration, and buy-in. Communications should start with the CEO to underscore the importance of the effort across the enterprise. Change management is paramount to successfully implementing AI in the enterprise; it offsets resistance, fear, uncertainty, and questions over the ROI of AI and its potential impact on the organization.
Be aware of obstacles, roadblocks or even “blockers”, and take appropriate action.
Internal competition for talent, data, control, or even “storytelling” works against the greater good. The more internal silos and competition you have, the slower you will be in adopting analytics and AI and realizing their benefits for the enterprise.
Determine an objective way to recognize the impact of the investments being made on the bottom line.
Speaking of ROI, determine a way to equitably attribute impact between the analytics or AI and the business decision or action. The impact of analytics (or AI) may be “watered down” if the value of the work is hidden, understated, or misunderstood. Conversely, claiming excessive impact for the analytics output versus the business action can breed distrust.
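One common convention for such equitable measurement (our illustration, not a method prescribed by the text; all figures and names below are hypothetical) is a holdout comparison: act on the analytics for a test group, withhold it from a comparable control group, and credit analytics only with the incremental difference.

```python
def incremental_impact(test_total, test_size, control_total, control_size):
    """Credit analytics only with the uplift over the control baseline,
    scaled to the treated population (all figures are hypothetical)."""
    uplift_per_member = test_total / test_size - control_total / control_size
    return uplift_per_member * test_size

# Hypothetical campaign: model-targeted members vs. a random holdout group.
impact = incremental_impact(
    test_total=1_200_000, test_size=10_000,       # $120 avg per targeted member
    control_total=1_000_000, control_size=10_000, # $100 avg per holdout member
)
# Claiming the full $1.2M would overstate the analytics contribution;
# the defensible incremental figure is $200k.
```

Reporting the incremental figure guards against both failure modes above: the value is neither hidden inside the business result nor overstated beyond what the model actually changed.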
The Next AI Conversation
In closing, many critical elements are required to adopt or accelerate analytics and AI solutions in an enterprise. In an ideal state, these strategic elements work in an orchestrated fashion to enhance the chance of success. This doesn’t mean you cannot succeed if some of these elements are missing or immature across the organization. However, senior executives and leaders who utilize these strategies will be more successful in implementing or accelerating their analytics and AI initiatives. Ultimately, enterprises that make AI a strategic priority or imperative will create a competitive advantage in the marketplace.
The Next Step
Contact us at Fractal Analytics to have a conversation to see how we can help you in your journey.
Board Advisor at Fractal Analytics and Former Senior Vice President, Performance Analytics Group at Dell Technologies
Doug provides advisory services to help advance Fractal Analytics’ capabilities, services, and offerings to empower enterprise clients.
Doug held various leadership roles at Dell for more than 19 years. In his most recent role, he was responsible for providing global data, reporting, and analytics services to support Dell’s sales, marketing, finance, services, e-commerce, and operations business units. He also partnered with IT to launch Dell’s first enterprise-wide big data business intelligence solution, creating a platform that significantly improved enterprise-level descriptive analytics while enabling and accelerating predictive analytics at global scale.
In his prior role at Dell, Doug was the Vice President of Dell Services, where he was responsible for delivering data center and desktop managed services and outsourcing engagements for Dell customers worldwide. His responsibilities included ownership of a $1.5B P&L, solution design and delivery, and global leadership of over 6,000 professionals. Prior to that, he held several senior executive roles in Dell Services where he led the growth and scaling of Dell’s enterprise and global service capabilities to serve customers in all segments and regions. Throughout his tenure at Dell, Doug was also a champion and leader for diversity and STEM education for girls, and an advocate for women in technology leadership.
Doug leverages his passion, knowledge, and experience to help Fractal Analytics and clients accelerate the use, adoption, and value creation with data, analytics, and AI in the enterprise.
The lesson: inspirational leadership is essential in our knowledge economy, but inspirational leadership is not “feel good” leadership. It is not about charisma. It’s about creating the conditions that motivate peak performers to seize opportunities and attack problems. It can and must be carefully cultivated through training and development, through personal coaching and example. Inspirational leadership is more likely to enable transformational change to deliver sustainable growth.
A.G. Lafley (Paris, April 5, 2006)
Analytics and BI
Starting in January 2010, I was presented with a unique opportunity by Bob McDonald, P&G CEO – one of the most principled and purpose-driven men I’ve had the privilege to work with in my career – and Filippo Passerini, P&G CIO – one of the most strategic leaders I’ve experienced and learned from in any industry. They believed that “Analytics” was going to transform P&G, our industry, and business in general. However, P&G’s efforts to date were fragmented, driven by IT rather than business needs, a cross-functional nightmare, plagued by under-leveraged talent, and not positioned to take advantage of what would be the “perfect storm” of capability over the following three years.
They wanted me to lead a complete transformation of Analytics, Business Intelligence, and how we approach and drive value for individual business units, functional areas, and the Company overall. This would turn into one of my largest Vertical Startups (VS) to date across strategy, talent development, information technology, strategic partnerships, innovation, and the move into advanced analytics to drive significant business value for P&G. My team and I created a unique blue print for P&G’s business (one that is confidential and not shared here), but also a model that is clearly applicable across any business to leverage analytics as a key capability to win.
“The Magnificent Seven” was a 1960 classic movie (actually an adaptation of a Japanese classic… samurai vs. gunfighters) with screenplay by William Roberts, direction by John Sturges, and starring Yul Brynner, Steve McQueen, Charles Bronson, and James Coburn. “They were seven… they fought like seven hundred.” This was our inspiration! Get more done faster, cheaper, and with bigger impact than anyone thought was possible.
Ignore the status quo, barriers, people playing not to lose, cultural antibodies, and technology challenges… just make it happen. It was also about making choices… where to play.
Driving a holistic / business impacting Analytics Program is not easy. It helps if you can network with someone that has done it; and it definitely requires a little help from your friends!
1. Start with the Business Need & Strategy
Sounds simple, but is done incredibly poorly by most. All analytics start with the business problem you are trying to solve! Not a fancy technology, interesting data set, or impassioned leader preaching to the crowds. It’s about the business need.
If you do not know how to pull a strategy together, start with “Playing to Win” by A.G. Lafley and Roger Martin. Get clear on where to play, how you will win in those areas, the capabilities needed to deliver, and how you will measure success. Move from lots of activities to an articulated strategy and execution!
You need a set of business leaders that will iterate with you on the analytics. The CEO may be necessary, but is not sufficient. Find key leaders in the line businesses, key functional areas, e.g., Supply Chain, and respected forward thinkers who will lead with you. You won’t get the analytics or data right the first time. If they bail after one or two misses, you have the wrong players. I was extremely fortunate to have the President of our European business, a truly strategic, no-nonsense respected leader, and an early adopter, willing to iterate with us to solve his biggest challenges: how to grow profitable share in stagnant markets, how to optimize the supply chain across one Europe, and so on. We then found 3-4 other key leaders (including one of the best Supply Chain Global Officers in the industry) that jumped in as well.
These business leaders will also help you shape your strategy and business areas to focus. Without a doubt, Supply Chain, Retail Execution, Consumer / EBusiness, and Brand Analytics will quickly become some of your internal “Magnificent Seven” analytic domains! Analytics will become a “currency” that adds value to your business and with your external business partners.
2. Invest in Talent
Special Operation Forces, Rule #1: Humans are more important than hardware.
Talent is critical! But like any asset, how you leverage makes all the difference. We had incredible talent that was all off trying to solve problems (often the same problems) in a hundred different ways. You need to get your Top Analysts working as one unit. No, you don’t need to centralize them (it’s actually better if they stay embedded with the business units). You do need to organize, develop, and recognize them as the critical asset they are to your company.
The best analysts have three skills: 1) analytics expertise, 2) deep business knowledge in the domain or business unit they work in, and 3) effective communication skills. Focus on developing all three. Great analytic expertise without context is useless. Great business knowledge and analytics without the ability to communicate and influence makes everything slow and tedious. Great communication without substance is smoke & mirrors – you know, the PowerPoint warriors.
Build internally and leverage analytic know-how from strategic partners! You need to be building your talent pool and leveraging immediate talent & external insights brought from a strategic partner. This investment is one of your most critical choices. Focus on talent, principles/value/purpose, and culture that fits with yours.
3. Partnership between Analytics & IT
If you can’t reorganize to give one leader accountability for IT, analysts, and business algorithm development, then at least get them working as one team. The biggest barrier to any strategy is groups having different measures, definitions of success, or priorities. I was extremely fortunate to have one IT leader, Corrado Azzarita, who understood the technology inside out, garnered the respect of the broader IT organization, and operated with a sense of urgency, playing to win!
Don’t chase the $500 golf driver of the year! Most bad golfers think the answer to their problems is an expensive new club. Bad golfers actually get worse with better equipment… what they really need is a swing coach that can teach them how to play. Don’t get distracted by the new shiny object, software, or tool of the day. Focus on getting better at the fundamentals / make a real difference for the business. Interestingly, once a golfer gets better, giving them great equipment then makes them even stronger. The same is true here.
Be ready to change the tires while driving down the road! We needed to fix the IT Architecture (ADW, Harmonization, Big Data, and more). We did not have time to stop, do these IT projects, and then resume analytics to help the business. Design your project stream to self-fund, invest for business impact, and focus on winning now.
4. Don’t Wait for the Data to be Right
In talking with numerous Fortune 500 companies, one of the insights I always share is don’t wait to get the data right. I remember the CIO during this discussion who turned as white as a ghost! They had been spending money for two years trying to get the data right before trying to do any analytics with the business. This is a waste! First, you don’t know what data will be most critical without driving true business analytics, and second, nothing gets data cleaner faster than presenting it to the senior leadership of the company!
Select tools that allow your analysts and data scientists to adapt/harmonize on the fly. You will get the data right for the most critical business needs… it’s a simple value proposition. However, it is key to select the tools that allow your analysts and business teams to adjust quickly, add new sources of data, and so on. Don’t create a model that is dependent on a central team to “code” for every new business problem or adaptation.
5. Strategic Partnerships to Accelerate
Selecting the right strategic partners is essential to your journey. This includes IT, Analytics, and business transformation partners. I shared that I only had two types of partners: 1) Strategic Partners who put their best people on my business, invested with me to grow, and brought innovation to my business needs, and 2) Tactical Partners who I negotiated only on cost until I could replace them with a strategic partner. The right strategic partnership model (and I’ve learned from some of the best in the Purchases / Procurement field – always a core part of my leadership team / inner circle) is an entirely separate topic. I invested heavily with and in my strategic partners. Not surprising to see who I am still working with now (https://www.linkedin.com/in/andy-walter/ )!
Create a true Joint Business Plan with your strategic partners. Build a win-win plan that includes all aspects of your work with the partner: Operations / SLAs / Savings, Key Projects, Co-Innovation, and Moon-shots you’d like to solve together. Leverage your senior leaders as core to this effort. Design quarterly rigor and annual top-to-top meetings with your strategic partners.
6. Focus on Innovation immediately
Analytics, Delivery, Collaboration, Scale
Don’t wait; organize for innovation within your analytics ecosystem. Innovation is the lifeblood of your product strategy (and you are building an analytics product strategy for your company). Focus it around key elements of your plan: data, analytic algorithms, delivery of insights, analytics team collaboration, scale, etc. New breakthroughs are in progress on all fronts…
VR/AR with advanced analytics, mobile delivery analogous to LinkedIn/Facebook, machine learning, and more. Did I mention checking out who I’m working with now?
Get your innovation briefs articulated to be able to share with Strategic Partners and the broader ecosystem of startups, accelerators, and industry players. Get everyone working on your business problems with you!
7. Network Beyond
I immediately realized there were extremely smart people working in other companies (non-compete), Industry bodies, Academia, and for-profit institutes that could be incredibly valuable to the journey. Seek them out and learn with the best together. I’ve created and am still leveraging a powerful network across companies in Analytics.
Conferences are interesting, but the networking that comes from interacting with the top analytics thought leaders is priceless! I happen to be co-chairing the CGT/RIS Analytics Summit on April 27-28. The leaders we are bringing together across industries, and the ability to interact with and learn from them, are incredible. You as a leader need to be investing in your talent and in yourself!
Fractal Analytics Strategic Adviser
Andy Walter is a business results-driven professional with extensive experience in strategy, development, execution, and operations across Shared Services and IT. He led the Commercial Services & Delivery Organization (over 1500 IT and multifunctional professionals) for Procter & Gamble’s Global Business Services (GBS). He was responsible for IT & Shared Services for all Global Business Units and Markets around the world. His team was accountable for developing cutting-edge digital capabilities for Procter & Gamble to win “where it matters most,” with Consumers, Shoppers, and Retailers. This included all eBusiness, Consumer Services, BI/Analytics, Sales Force Solutions, Project Delivery, Business Process Services, and A&D / Company restructuring efforts.
By Eugene Roytburg, Managing Partner and Lana Klein, Managing Partner
Over the past four years, the growth of CPG sales in the U.S. stalled at under 1%, in contrast to 7% growth between 2006 and 2011. Volumes are mostly stagnant or have declined in many categories – in some cases, sharply. In addition, growth has slowed dramatically in key emerging markets. These statistics are a result of several fundamental drivers.
Fewer People & Slower Growth
First, the number of consumers is stagnant due to flat population growth. Birth rates in Western Europe are below the replacement rate, for example, and without migration, most countries in Europe (with the exception of France, Ireland and Norway) are projected to have negative population growth. In the US, birth rates in 2016 were 14% lower than in 2008.
Economic hurdles have also had an impact. Despite the global economic recovery, wages have stagnated for the two lowest-income quintiles. And help from emerging markets, which fuelled CPG a few years ago, has also dried up as GDP growth in those markets has slowed. Additionally, currency weakness in many emerging markets has impacted consumer purchasing power to compound softness in CPG sales.
Shifting Consumer Preferences
Against this backdrop, consumer attitudes, tastes, needs, and behaviors have also shifted. E-commerce upended shopping behavior and levelled the playing field for small companies, helping them reach consumers more easily than in the past. Buying power has shifted from the baby boom generation, which is well understood by marketers, to the more fickle Millennials, who often shun large brands in favor of newcomers. As a result, small, upstart CPG companies have captured 3% market share from larger players over the past few years.
Consumer behaviors are also becoming less homogeneous. Increasingly, they’re polarizing into “low involvement” and value-seeking buying on the one hand, and highly-selective purchase behavior on the other. To complicate matters further, some consumers show both behaviors – depending on what category they’re shopping for. As a result, large brands are under threat of death by a thousand cuts, losing share to “low involvement” and to a host of smaller, innovative brands who cater to increasingly fragmenting consumer tastes.
Traditional CPGs vs. Digital Natives
So, where can manufacturers find growth in this environment? In many niches across different verticals. All categories show the same pattern: The emergence of niche and micro-segments – largely dominated by start-up brands – taking share from established mainstream products.
So, why can’t established brands adapt? Here are a few reasons:
- The innovation process in large CPG firms isn’t set up for the new environment. It’s slow and risk-averse, with a relatively large “hurdle rate” favoring initiatives that don’t venture too far from the core business. Sure, many companies recognize this and try to become more nimble. The problem, however, is that incremental improvements are usually too weak to change the massive “cultural DNA” of large companies and produce a meaningful shift.
- Many companies are still laggards in e-commerce and digital – at least compared to relative newcomer brands like Dollar Shave Club (later bought by Unilever), who are often digital natives and have entire operating models rooted online.
- Brand Equity – traditionally a huge asset – can play against large brands with long-established mainstream perceptions when they try to venture into a new niche. Small start-up brands have an authentic story that’s typically better aligned with the needs and attitudes of their target consumers.
How Can CPGs Compete?
To survive in these conditions, large CPG companies need to re-think the ways they look for growth opportunities. They must shift their mindset, culture and operations to succeed in a changing environment.
- Develop capabilities to quickly identify growth opportunities. With landscapes shifting so rapidly, traditional category segmentation and a “future will be like the past” approach should be replaced by robust analytics that can quickly scan for emerging growth niches and discern fads from longer-lasting opportunities. These opportunities span the intersections of categories, accounts, product attributes and consumer segments, to name a few.
- Develop a strong acquisition strategy and execution. This is an obvious route that many firms already pursue. But the devil is in the details — creating the ability to rapidly and effectively identify suitable acquisition targets. Another important point is determining optimal deal size. While smaller acquisitions are likely to deliver higher growth rates, incremental revenues may be too small to move the growth needle. What’s more, they require deep resources for execution. Finally, it’s vital to design a post-integration strategy to preserve the entrepreneurial spirit of the new brand, while using corporate muscle to scale it. Hormel (Muscle Milk®, Applegate®, Justin’s®, Wholly Guacamole®) is one company that does this especially well.
- Invest in start-ups like venture capital funds do. Many corporations are engaging with start-ups through internal corporate venture capital arms that invest directly in these types of companies. Nestle, Chobani, General Mills and PepsiCo are some who’ve launched “accelerator” units to participate in growth and support start-ups. Kraft has launched a business unit called Springboard, for example, to develop what the company describes as “disruptive” food and beverage brands. The unit is actively searching for emerging, authentic brands and looking to build a “network of founders.”
- Finally, take a more agile and differentiated approach to your brand portfolio. Ask yourself: Where are unmet needs in your categories? Do consumers care or need anything innovative, or have all meaningful problems and needs been addressed? Too often, the growth targets in mature spaces are unrealistic and driven by inertia. They’re unlikely to produce much growth. Yet, firms keep pouring money into marginally incremental innovation, rather than looking for new spaces.
Ultimately, the current environment for CPG brands is a zero-sum game. Slow population growth, unfavorable economic factors, and the new breed of digital competitors those conditions have given rise to are all putting pressure on them. To compete, CPGs must adapt and forge growth opportunities through innovation and a more differentiated approach.
The sole purpose of the Emergency Room (ER) is to save lives by providing immediate attention to people in life-threatening situations. With 24×7 access, a broad array of services, and the latest technology at hand, ER teams are well equipped and trained to treat medical urgencies, stabilize patient conditions, and prevent further damage.
Today, many ERs are overcrowded[i] as:
- Unlike other treatment facilities, ERs have a federal mandate to provide care to any patient requesting treatment
- Primary care physicians (PCPs) are in short supply
- Poor patient knowledge and socio-economic conditions drive more patients to seek medical care in ERs
Increasing abuse of ERs, whether due to patient ignorance or convenience, demands urgent attention from both payers and providers. There is ample documented evidence of patients using ERs for situations that could have been treated in a more cost-effective care setting such as Urgent Care Clinics (UCCs) or Patient-Centered Medical Homes (PCMHs). A study from Project HOPE[ii] estimated that 13% to 27% of all emergency department visits in the US could be managed in alternative sites, with a potential cost savings of approximately $4.4 billion annually.
This paper presents a comprehensive end-to-end solution to reduce ER utilization for non-emergent conditions. The proposed data-driven solution leverages predictive analytics to identify members likely to use the ER for avoidable reasons in the near future, and recommends designing specific interventions to prevent those visits. In the process, we also build a case for applying analytics in an agile way to rapidly derive maximum value.
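A minimal sketch of what such a predictive scoring step could look like, assuming hypothetical member features and hand-set weights (the actual model, features, and coefficients are not specified in this paper, so everything below is illustrative only):

```python
import math

# Hypothetical features and hand-set weights; a real model would be trained
# on claims and eligibility history. Positive weights raise the risk score.
WEIGHTS = {"prior_avoidable_er_visits": 0.9, "has_pcp": -1.2, "chronic_conditions": 0.4}
BIAS = -1.0

def avoidable_er_risk(member):
    """Logistic score in (0, 1): a likelihood-like ranking of avoidable ER use."""
    z = BIAS + sum(w * member[name] for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

members = [
    {"id": "A", "prior_avoidable_er_visits": 3, "has_pcp": 0, "chronic_conditions": 2},
    {"id": "B", "prior_avoidable_er_visits": 0, "has_pcp": 1, "chronic_conditions": 0},
]

# Rank members so interventions (e.g., PCP outreach) target the highest risk first.
ranked = sorted(members, key=avoidable_er_risk, reverse=True)
```

The key design point the sketch illustrates is that the score is only useful when paired with an intervention pipeline: the ranked list feeds outreach, not just reporting.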
[i] Barish RA, Mcgauly PL, Arnold TC. Emergency Room Crowding: A Marker of Hospital Health. Transactions of the American Clinical and Climatological Association. 2012;123:304-311.
[ii] Copyrighted and published by Project HOPE/Health Affairs as Weinick RM, Burns RM, Mehrotra A. Many emergency department visits could be managed at urgent care centers and retail clinics. Health Aff (Millwood). 2010;29(9):1630-1636. The published article is archived and available online at www.healthaffairs.org.
The ER health care landscape
In 2016, America spent more than $3.3 trillion on health care, or approximately $10,348 per person. This was a 4.3% increase from the previous year, with health care contributing 17.9%[1] of overall US GDP. More importantly, health care expenses grew 1.5% faster than GDP. This faster growth in total spending was partly driven by strong growth in spending on private health insurance, hospital care, and physician and clinical services, by an aging population, and by the expansion of coverage through the Affordable Care Act (ACA).
While health care experts and economists are still debating the exact long-term impact of the probable causes, in a recent study published in JAMA,[2] the Obama administration claimed that since the ACA became law, the uninsured rate has declined by 43%, from 16% in 2010 to 9.1% in 2015, with most of that decline occurring after the law’s main coverage provisions took effect in 2014 (see Figure 1 for details). Further, the Department of Health and Human Services (HHS) estimated that 20 million more people had health insurance in early 2015 because of the law.
However, having more insured people under the health care safety net without improving the supporting infrastructure is expected to put significant pressure on the entire delivery system. A Centers for Disease Control and Prevention (CDC) report[3,4,5] estimated that in 2011 there were over 136 million emergency room (ER) visits, an average of 44.5 visits per 100 persons. Now, with an additional 20 million members gaining coverage in 2015 and many more expected in the following years, there will be an increased spotlight on ER utilization.
A finding from the National Hospital Ambulatory Medical Care Survey (NHAMCS) cites that nationally, 39.5% of ER visits among the general population are primary care sensitive in nature and therefore preventable.[6] A Truven study[7,8] estimates that only 29% of ER visits required emergency attention; with a rough cost estimate of $1,233 per ER visit,[9,10] this wastes billions of dollars in health care costs. Another study projected $4.4 billion[9,11] in annual savings if non-urgent visits were better managed in alternate care settings.
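As a rough back-of-envelope check, the cited figures can be reconciled as follows (the 20% avoidable share, the midpoint of the 13%-27% range quoted earlier, is our assumption, not a figure from the studies):

```python
# Back-of-envelope consistency check using only the figures cited above.
# The 20% avoidable share (midpoint of the 13%-27% range) is our assumption.
total_er_visits = 136_000_000      # CDC estimate of annual ER visits, 2011
avoidable_share = (0.13 + 0.27) / 2
avg_cost_per_er_visit = 1_233      # rough average cost of an ER visit (USD)
projected_savings = 4.4e9          # Project HOPE annual savings estimate (USD)

avoidable_visits = total_er_visits * avoidable_share
implied_saving_per_visit = projected_savings / avoidable_visits

# The implied saving (on the order of $160 per redirected visit) is far below
# the full $1,233 ER cost: the savings estimate reflects the cost *difference*
# between the ER and cheaper settings such as UCCs, not elimination of the visit.
```

This is why the savings figure, while large, is much smaller than the total cost of avoidable ER visits themselves.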
Ideally, ER usage should drop when there is an efficient health care system with better access to care and affordable costs, as expected by ACA provisions. However, ERs are not a substitute for primary care relationships, nor can they address the broader socio-economic factors driving health care costs.[11,12]
The scope of the ER problem
The scope of this paper is to dig deep and answer three broad questions:
- Which members are likely to make avoidable ER visits in the near future?
- Why are these members more likely to make an avoidable ER visit than others?
- How can we prevent such visits in the future?
Academia and organizations have been studying parts of this equation for years. We started with a systematic review of the existing literature to leverage present-day knowledge to understand:
- ER utilization trends: What has been the historic trend and what do experts say about future use?
- Emergent vs. non-emergent ER visits: What factors drive non-emergent visits?iii
- Interventions: What can be done to reduce non-emergent ER visits?
- Efficacy: What works vs. what does not with respect to environment (type of payer) and care delivery settings?
FIGURE 3. Components of ER Over-Utilization Problem
iii Cases where immediate care is not required within 12 hours (e.g., sore throat).
Issues and challenges in existing solutions
- Complexity of the problem
- Limited scope
- Limited data
- High cost of using analytical solutions
- No early cost-benefit analysis
- Analytics in isolation without synergies in interventions
- Solutions specific to the population cohorts under study that cannot be generalized
- No attempt to answer “So what?”
We found that due to the complexity of the problem, the inherent restrictions on sharing patient data, and the vast variety of health care delivery settings (Medicare, Medicaid, Employer Sponsored, Commercial, Individual, etc.), most of the studies answered only some of the questions that we set out to answer. The limited evidence from the academic studies did suggest that age, ease of access to ER compared with other care alternatives, perceived severity, and socio-economic settings all play a role in decisions to seek care in the ER for non-urgent problems.
The typical research study focused on one of the following:
- How should visits be classified as emergent vs. non-emergent?
- What factors drive non-emergent usage?
- What interventions may work for a select population, as assessed through statistical analysis, review of patient charts, or survey techniques?
We observed that there was no systematic end-to-end approach that addressed all parts of the problem holistically. Further, most of the studies were not focused on demonstrating ROI from such initiatives, which may in part be attributed to the nature of their funding: academic, or through grants from non-profit organizations.
The analytics driven ER utilization solution
In a study funded by the California Healthcare Foundation to understand the factors influencing an individual’s decision to visit an Emergency Department (ED) for a non-urgent condition, authors Lori Uscher-Pines et al.13 proposed a conceptual frameworkiv showing how a patient arrives at a decision to seek care in an ER by consciously or unconsciously weighing several considerations. The decision to go to an ER is influenced by an array of causal pathway factors and associated factors. See Section I of Figure 4 for details.
The associated demographic, socio-economic, and lifestyle factors (Section I of Figure 4) can be determined through qualitative and quantitative techniques. We will limit our study to patients who chose the “Go to ER” path and algorithmically determine the associated factors. Then, we will extend the framework (Section II of Figure 4) to identify which visits were, retrospectively, avoidable and what their dollar impact was. Finally, we will show how advanced AI/ML techniques help to prospectively identify members likely to make an avoidable ER visit in the near future.
FIGURE 4. Conceptual Framework
iv Uscher-Pines L, Pines J, Kellermann A, Gillen E, Mehrotra A. Deciding to Visit the Emergency Department for Non-Urgent Conditions: A Systematic Review of the Literature. Am J Manag Care. 2013;19(1):47-59.
Algorithm and techniques to classify ER visits
The foremost problem is to classify ER visits as emergent (unavoidable) or non-emergent (avoidable). The gold standard is to have a panel of clinical experts review patient charts for the selected population and then classify each visit accordingly. However, this process is very resource intensive and not feasible when quick results are needed. There can be multiple alternative approaches to classify ER visits as avoidable or unavoidable:
Option 1: Developing an independent algorithm
Here, a statistically large sample of claims is selected, and the frequency of primary diagnosis codes is analyzed in a “regular setting” vs. an “ER setting.” Diagnosis codes that were more often treated in a “regular setting” and were also present in an “ER setting” are flagged. A threshold value is chosen to further trim down the selection, and then claims with flagged diagnosis codes are considered as “avoidable.”
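The frequency analysis behind Option 1 can be sketched as follows. This is a minimal illustration, not the authors' implementation: the diagnosis codes, claim tuples, and the 75% threshold are invented for demonstration, and a real analysis would run over a statistically large claims sample.

```python
from collections import Counter

def flag_avoidable_codes(claims, threshold=0.75):
    """Flag diagnosis codes mostly treated in a regular (non-ER) setting.

    `claims` is a list of (diagnosis_code, setting) tuples, where setting
    is "regular" or "ER" -- a simplified stand-in for real claims data.
    A code is flagged only if it also appears in an ER setting.
    """
    regular = Counter(code for code, setting in claims if setting == "regular")
    er = Counter(code for code, setting in claims if setting == "ER")

    flagged = set()
    for code in er:                            # must be present in an ER setting
        total = regular[code] + er[code]
        # share of encounters for this code handled in a regular setting
        if regular[code] / total >= threshold:
            flagged.add(code)
    return flagged
```

Claims carrying a flagged code would then be counted as "avoidable" ER visits; tuning the threshold trades off sensitivity against false flags.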
Option 2: Leveraging a publicly available algorithm
The NYU ED algorithmv is widely used to identify diagnosis codes which are avoidable (with greater than 90% probability). There are other variants of the NYU algorithm: The Billings/Ballard algorithm14 and, more recently, the Minnesota algorithm.
Option 3: Leveraging in-house learning to create a hybrid algorithm
Leverage an in-house clinical research team’s prior experience to identify avoidable ER visits for a sample population.
We used “Option 2—The NYU ED Algorithm” to identify avoidable ER visits.
FIGURE 5. NYU Algorithm
v NYU ED Algorithm, NYU Wagner, New York University, wagner.nyu.edu/
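To illustrate how output in the style of the NYU ED algorithm can be applied, the sketch below uses a hypothetical probability table. The real algorithm publishes probability weights per diagnosis code across categories (non-emergent; emergent but primary-care treatable; emergent and preventable; emergent and not preventable); the codes and weights shown here are invented for demonstration.

```python
# Hypothetical weights keyed by primary diagnosis code, in the order:
# (non_emergent, emergent_pc_treatable, emergent_preventable, emergent_not_preventable)
NYU_STYLE_WEIGHTS = {
    "J02.9": (0.92, 0.05, 0.02, 0.01),   # e.g., acute sore throat
    "I21.3": (0.00, 0.02, 0.08, 0.90),   # e.g., acute myocardial infarction
}

def classify_visit(code, avoidable_threshold=0.90):
    """Label a visit 'avoidable' when the combined probability of the
    non-emergent and primary-care-treatable categories exceeds threshold."""
    weights = NYU_STYLE_WEIGHTS.get(code)
    if weights is None:
        return "unclassified"
    non_emergent, pc_treatable, _, _ = weights
    if non_emergent + pc_treatable >= avoidable_threshold:
        return "avoidable"
    return "unavoidable"
```

The greater-than-90%-probability rule mentioned above maps directly to the `avoidable_threshold` parameter.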
Identifying the analysis population and expected ROI
For our pilot study, we selected members with a specific chronic disease and then segmented the entire population into homogenous groups across several key dimensions such as plan (HMO, Non-HMO), health conditions, geography, age groups, etc. We also checked two key factors for the selected population:
a) Is there enough opportunity (avoidable ER visits) to begin with
b) Is support available to drive intervention programs and traverse the last mile
Highly sophisticated algorithms fed with rich, high-quality data can certainly be expected to deliver the best outcomes. However, in the real world there is a need to balance research and implementation costs against expected benefits and ROI.
Figure 6 below shows an illustrative approach to do a quick ROI estimation before moving ahead.
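A back-of-envelope version of that ROI estimation might look like the sketch below. All inputs are assumptions to be replaced with your own baseline data; the only figure taken from this paper is the roughly $1,233 average cost per ER visit cited earlier.

```python
def quick_roi(members, avoidable_rate, visits_per_member, cost_per_visit,
              expected_reduction, program_cost):
    """Rough ROI estimate for an ER-reduction program before a full build.

    avoidable_rate     -- share of ER visits classified as avoidable
    visits_per_member  -- baseline ER visits per member per year
    expected_reduction -- fraction of avoidable visits the program prevents
    """
    avoidable_visits = members * visits_per_member * avoidable_rate
    gross_savings = avoidable_visits * expected_reduction * cost_per_visit
    net_benefit = gross_savings - program_cost
    return {
        "avoidable_visits": avoidable_visits,
        "gross_savings": gross_savings,
        "net_benefit": net_benefit,
        "roi": net_benefit / program_cost,
    }
```

Running this with illustrative assumptions (100,000 members, 0.445 visits per member, 30% avoidable, 20% reduction, a $2M program) indicates whether the opportunity clears the bar before any modeling work begins.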
Leveraging predictive analytics in an agile way
- What proportion of ER visits are avoidable?
- Is intervention program support readily available?
- What are the potential benefits?
- What is the expected cost?
Traditionally, predictive modeling exercises follow a waterfall approach. The requirements must be frozen and all stakeholders aligned before moving to the design, development, and validation phases. This methodology lacks the flexibility to react rapidly to evolving business needs. Further, the cost of failure is relatively high, as benefits cannot be established until the model results are field validated.
We recommend developing predictive solutions in an agile way: start with small data (e.g., claims) and simpler techniques to establish initial value. After each sprint, reassess the benefits to validate the necessity of the next sprint. Once the incremental gains are established, explore options to add either new data sources and/or complex predictive algorithms to further maximize the returns. See Figure 7 for more details.
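The sprint loop described above can be sketched as a simple stopping rule: keep adding data sources or model complexity only while the incremental lift justifies another sprint. The sprint names, metric values, and 2% stopping threshold below are illustrative assumptions.

```python
def run_agile_sprints(sprints, min_gain=0.02):
    """Run modeling sprints in order, stopping when the incremental
    lift no longer justifies the next sprint.

    `sprints` is an ordered list of (name, fit_fn) pairs, where each
    fit_fn trains/evaluates a model and returns a validation metric
    (e.g., top-two-decile capture rate). Returns (name, score, gain)
    for every sprint actually run.
    """
    history, best = [], 0.0
    for name, fit_fn in sprints:
        score = fit_fn()
        gain = score - best
        history.append((name, score, gain))
        if gain < min_gain:        # next sprint not worth the added cost
            break
        best = score
    return history
```

In practice each `fit_fn` would wrap a real training run; the point is that the decision to invest in the next sprint is made from measured incremental value, not upfront scope.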
Using the agile approach shown above, we followed traditional analytical steps to develop our models, starting with logistic regression on claims data and progressing to artificial intelligence (AI)/machine learning (ML) techniques with non-traditional data sources: augmenting traditional claims data with lifestyle and behavior information, bringing in zip-level socio-economic information, and adding macroeconomic indicators, temperature, and pollution information. Refer to Figure 8 for more details.
Our final solution consisted of an ensemble of machine learning and logistic models. The solution was able to capture 50% of all avoidable-ER members within the top two deciles.
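The top-two-decile capture metric quoted above can be computed as follows. This is a generic evaluation sketch, not the paper's code; the scores and labels in the usage example are fabricated.

```python
def decile_capture(scores_and_labels, top_deciles=2):
    """Share of all positives (avoidable-ER members) found in the
    top-scoring deciles.

    `scores_and_labels` is a list of (model_score, label) pairs,
    label 1 = member made an avoidable ER visit, else 0.
    """
    ranked = sorted(scores_and_labels, key=lambda x: x[0], reverse=True)
    cutoff = len(ranked) * top_deciles // 10       # members in the top deciles
    captured = sum(label for _, label in ranked[:cutoff])
    total = sum(label for _, label in ranked)
    return captured / total
```

A value of 0.5 at two deciles means the model concentrates half of all avoidable-ER members in the top 20% of scores, i.e., a 2.5x lift over random targeting.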
Below are some selected insights/validated hypotheses for members likely to make an avoidable ER visit:
- Comorbid conditions, such as Congestive Heart Failure (CHF) or Chronic Obstructive Pulmonary Disease (COPD)
- History of frequent ER use
- Behavioral conditions such as drug or substance abuse, alcohol dependency, etc.
- Depression and bipolar disorders
- Membership in certain ethnic groups
- Lower educational attainment
- Obesity, among other factors
The solution implementation
What can you do about the in-house information that your care management/disease management teams already have?
A bigger question is:
What can you do about the information that you don’t have?
This phase involved identifying the right set of interventions that would benefit the member population identified from our predictive model.
Identifying the right set of interventions
Once the top “x” decile members had been selected, we wanted to further segment this population based on its propensity to respond to a specific intervention, to maximize the return on intervention. However, such information was not readily available, owing to the challenges outlined below:
- Most documented interventions were delivered in a specific care setting
- Many were targeted at a specific geographic region or a certain ethnic group
- Broad disease-specific interventions were force-fitted to the selected population segment
- No longitudinal study tracked the outcomes of historic care management (CM) or disease management (DM) programs linked with predictive models, e.g., the impact of a diabetes management program on members with high risk scores from a predictive model
- Employer workforce performance data was limited, e.g., how a certain wellness program affected member performance: absenteeism, productivity, etc.
In the real world, there are additional practical limitations, such as cost to execute, limited resources in the care management team, limited time to execute, and the need for rapid realization of benefits from the experiments.
Analytics can help organizations optimize the entire value chain of experiments: planning, design, and execution. We need to leverage analytic techniques to fully mine the information that traditional care management/disease management programs have already collected and, where such information does not exist, design experiments to gather it.
In our case, we divided interventions into two broad categories (Figure 9):
- Known interventions linked with member profiles identified through profiling
- Experiential interventions to test and learn new programs through secondary research
FIGURE 9. Designing Prioritized Interventions
We used the following steps to identify and design the right set of interventions:
Step 1: Identify key themes from predictive model results
We identified key themes by reviewing patient profiles from our predictive models. We then reviewed them against the existing medical literature and selected interventions that were most relevant to our study population.
See Table 1 for details.
Step 2: Design collaborative interventions for experiential learning
Here, our objective was to identify which pilot interventions would help most in getting the right data for future interventions. Sample experiential interventions linked with emerging themes identified in step 1 are listed below:
- Smoking cessation program for members in a select Accountable Care Organization (ACO), Patient Centered Medical Home (PCMH), Skilled Nursing facility (SNF), or Long-Term Acute Care Facility (LTAC)
- Pharmacist-led education for new parents or members residing in rural areas
- Providing free nebulizers to the asthma population; this will help determine whether increased medication adherence results in an improved response to the DM program
Step 3: Prioritizing interventions
Once we had a broad set of interventions linked to members’ medical, socio-economic, behavioral, and lifestyle data, we prioritized the specific interventions for maximum ROI within the constraints of time, effort, and budget. See Figure 10 for an illustrative framework.
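One simple way to operationalize that prioritization is a benefit-to-cost ranking under a budget constraint, as sketched below. The intervention names and dollar figures are illustrative assumptions, and a greedy ratio sort is a stand-in for a formal knapsack optimization.

```python
def prioritize(interventions, budget):
    """Greedily select interventions by benefit-to-cost ratio.

    `interventions` maps name -> (expected_benefit, cost), both in
    the same currency units. Returns the chosen names in pick order.
    """
    ranked = sorted(interventions.items(),
                    key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
    chosen, spent = [], 0
    for name, (benefit, cost) in ranked:
        if spent + cost <= budget:       # skip anything that breaks the budget
            chosen.append(name)
            spent += cost
    return chosen
```

In a real program, `expected_benefit` would come from the ROI estimates and model-derived member counts, and effort/time constraints could be added as further filters.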
Step 4: Efficacy of the interventions
As a final step, we designed a detailed IT system to capture the data from the experiments. Our ultimate aim is to use the collected intelligence to evaluate the efficacy of the program and also as an input to future predictive models.
Reducing avoidable ER visits is a complex problem requiring a collaborative effort from multiple functions. Early identification of the problem through a sophisticated predictive analytics solution can provide a competitive edge in mitigating revenue leakage and containing health risks.
To balance program costs against potential benefits, we recommend applying analytics in an agile approach and accounting for the critical success factors below:
Ensure strong executive sponsorship for end-to-end program support: Predictive models have little value if the CM/DM teams cannot act on the results in a timely manner. Multi-divisional collaboration for program execution and implementation is a must.
Perform early ROI estimation through baseline vs. benchmark comparison to ensure that there is value in pursuing the initiative.
Start small but be specific: Identify the right population segments where the problem is severe, and establish clear metrics and thresholds to measure success.
Develop the solution iteratively, starting with easily available small data: Leverage external data to fill in gaps due to lack of internal data, and use non-traditional data sources such as lifestyle, behavioral, and socio-economic data to enrich data quality.
Start with simpler analytic techniques to show value for executive buy-in before making a case to move to complex machine learning algorithms.
Plan, design, and develop systematic experiments to test and learn from interventions. Use this data as an asset for future studies.
Leverage artificial intelligence techniques to scale and automate solutions.
It’s time to reduce unnecessary ER visits and deliver impactful interventions to prevent them from occurring in the future. An analytics-powered approach, delivered in an agile manner, can help organizations deliver on this opportunity.
- Obama B. United States Health Care Reform Progress to Date and Next Steps. JAMA. 2016;316(5):525-532. doi:10.1001/jama.2016.9797
- Weiss AJ (Truven Health Analytics), Wier LM (Truven Health Analytics), Stocks C (AHRQ), Blanchard J (RAND). Overview of Emergency Department Visits in the United States, 2011. HCUP Statistical Brief #174. June 2014. Agency for Healthcare Research and Quality, Rockville, MD. http://www.hcup-us.ahrq.gov/reports/statbriefs/sb174-Emergency-Department-Visits-Overview.pdf
- The Role of Health Centers in Lowering Preventable Emergency Department Use: http://nachc.org/wp-content/uploads/2015/06/ED_FS_20151.pdf
- “The median charge for outpatient conditions in the emergency department was $1,233, which is 40% more than the average American pays in rent each month ($871).” [$1,233/$871=1.42] “Avoidable Emergency Department Usage Analysis.” Truven Health Analytics. (April 25, 2013)
- Caldwell N, Srebotnjak T, Wang T, Hsia R (2013) “How Much Will I Get Charged for This?” Patient Charges for Top Ten Diagnoses in the Emergency Department. PLoS ONE 8(2):e55491. doi:10.1371/journal.pone.0055491
- Cunningham PJ, et al. The use of hospital EDs for nonurgent health problems. Med Care Res Rev. 1995; 52(4):453-474
- Weinick RM, Burns RM, Mehrotra A. Many emergency department visits could be managed at urgent care centers and retail clinics. Health Aff (Millwood).2010;29(9):1630-1636.
- Emergency Department Visits for Nonurgent Conditions: Systematic Literature Review. The American Journal of Managed Care. 2013;19(1):47-59 https://ajmc.s3.amazonaws.com/_media/_pdf/AJMC_13jan_UsherPines_eApx_47to59.pdf
- Ballard DW, Price M, Fung V, et al. Validation of an Algorithm for Categorizing the Severity of Hospital Emergency Department Visits. Medical Care. 2010;48(1). doi:10.1097/MLR.0b013e3181bd49ad.
- Basch CE, Walker EA, Howard CJ, Shamoon H, Zybert P. The effect of health education on the rate of ophthalmic examinations among African Americans with diabetes mellitus. American Journal of Public Health. 1999;89(12):1878–82. [PubMed: 10589324]
- Piette JD, Weinberger M, McPhee SJ, Mah CA, Kraemer FB, Crapo LM. Do automated calls with nurse follow-up improve self-care and glycemic control among vulnerable patients with diabetes? American Journal of Medicine. 2000;108(1):20–27 [PubMed:11059437]
- Clancy DE, Brown SB, Magruder KM, Huang P. Group visits in medically and economically disadvantaged patients with type 2 diabetes and their relationships to clinical outcomes. Topics in Health Information Management. 2003;24(1):8–14. [PubMed: 12674390]
- Banister NA, Jastrow ST, Hodges V, Loop R, Gillham BM. Diabetes self-management training program in a community clinic improves patient outcomes at modest cost. Journal of the American Dietetic Association. 2004;104(5):807–10. [PubMed: 15127069]
- Jaber LA, Halapy H, Fernet M, Tummalapalli S, Diwakaran H. Evaluation of a pharmaceutical care model on diabetes management. Annals of Pharmacotherapy. 1996;30(3):238–43. [PubMed: 8833557]
- Rothman RL, Malone R, Bryant B, Shintani AK, Crigler B, Dewalt DA, Dittus RS, Weinberger M, Pignone MP. A randomized trial of a primary care-based disease management program to improve cardiovascular risk factors and glycated hemoglobin levels in patients with diabetes. American Journal of Medicine. 2005;118(3):276–84. [PubMed: 15745726]
- Gerber BS, Brodsky IG, Lawless KA, Smolin LI, Arozullah AM, Smith EV, Berbaum ML, Heckerling PS, Eiser AR. Implementation and evaluation of a low-literacy diabetes education computer multimedia application. Diabetes Care. 2005;28(7):1574–80. [PubMed: 15983303]
- Davidson MB. Effect of nurse-directed diabetes care in a minority population. Diabetes Care. 2003;26 (8):2281–87. [PubMed: 12882849]
- Erdman DM, Cook CB, Greenlund KJ, Giles WH, El-Kebbi I, Ryan GJ, Gallina DL, Ziemer DC, Dunbar VG, Phillips LS. The impact of outpatient diabetes management on serum lipids in urban African Americans with type 2 diabetes. Diabetes Care. 2002;25(1):9–15. [PubMed: 11772894]
- Anderson-Loftin W, Barnett S, Bunn P, Sullivan P, Hussey J, Tavakoli A. Soul food light: Culturally competent education. The Diabetes Educator. 2005;31(4):555–63. [PubMed: 16100331]
- California Medi-Cal Type 2 Diabetes Study Group. Closing the gap: Effect of diabetes case management on glycemic control among low-income ethnic minority populations. Diabetes Care. 2004;27(1):95–103. [PubMed: 14693973]
- Brown A, Gregg EW, Stevens MR, Karter AJ, Weinberger M, Safford MM, Gary TL, Caputo DA, Waitzfelder B, Kim C, Beckles GL. Race, ethnicity, socioeconomic position, and quality of care for adults with diabetes enrolled in managed care. Diabetes Care. 2005;28(12):2864–70. [PubMed: 16306546]
- Brown SA, Garcia AA, Kouzekanani K, Hanis CL. Culturally competent diabetes self-management education for Mexican Americans: The Starr County Border Health Initiative. Diabetes Care. 2002;25 (2):259–68. [PubMed: 11815493]
Engagement Manager, Fractal Analytics
Anupam Bhatnagar is an Engagement Manager at Fractal Analytics, with over 12 years of experience in data analytics, problem solving, and consulting in the US health care and insurance domains.
Principal Consultant, Fractal Analytics
Kishore Bharatula is a Principal Consultant at Fractal Analytics, with over 13 years of experience in the analytics industry. He is passionate about solving challenges by implementing analytics to deliver measurable topline and bottom-line impact.
“By 2020, only one-third of sales organizations will have embraced predictive and robotic technologies that guide and automate actions to achieve sales goals.”
– Mark Smith, CEO & Chief Research Officer, Ventana Research
Fueled by new levels of sophistication, processing power, and AI solutions in the digital landscape, it is time for enterprise executives and sales leaders to fully embrace AI to empower their sales organizations. Innovations in AI, robotics, and chatbots are expanding at an accelerated rate. As a result, capabilities and solutions are now arriving faster than sales organizations can adapt to them. Moreover, many sales organizations appear to be lagging in their efforts to utilize AI as an enabler of digital sales transformation.
Sales drives the engine of growth and represents your front line in creating and sustaining lasting customer relationships. So why have only one-third of sales organizations adopted predictive analytics, AI, and robotic technologies? Sales organizations also represent the largest and most expensive labor pool in most companies, so any gain in productivity or effectiveness will have a significant impact on business performance, shareholder value, and customer experience.
This white paper provides a roadmap you can utilize to embrace AI in your digital sales transformation and realize the full potential of your most powerful employees – front line sales! There are six critical steps in the journey:
- Understand your customer journey and engagement model
- Activate your data
- Move from many channels to omni-channel
- Embrace predictive analytics
- Stop “pulling data” and start “pushing” information and insights to your sellers
- Reduce complexity – Utilize AI and digital solutions to simplify sales processes
The following discussion and examples are focused on B2B enterprises and sales organizations. There are many more use cases that could be discussed for planning and optimizing your sales organization and the design of your sales force (both “direct” and “indirect”). However, the focus of this white paper is empowering your salespeople to be more productive and effective using AI and emerging digital technology solutions. It may also be useful to review the Fractal Analytics white paper titled “Accelerating AI enterprise-wide to achieve a competitive edge” to learn more about key ingredients and strategies for successfully implementing AI and analytics in enterprises.
Many of these suggestions are not new. However, what is new is the rapid growth in innovation and available solutions, and an increasingly competitive landscape that is adopting AI solutions for sales at varying rates of speed and success. Enterprise executives and sales leaders who employ these best practices will be at a competitive advantage over those who do not or cannot. Perhaps easier said than done, but there are significant rewards for enterprises that lead the way in driving digital sales transformation through AI.
CUSTOMER JOURNEY AND ENGAGEMENT MODEL
According to Forrester1, “Modern B2B buyers want to buy from modern sellers; they want to interact fluidly across channels, and when doing so, they expect to have a consistent brand and engagement experience. B2B companies that fully embrace this journey will thrive, and those that delay risk disintermediation from competitors and/or buyers themselves.”
There are obviously massive implications to this trend, but one clear message is that these digital buyers are generating more data than ever before, through an ever-increasing number of channels. The “art” of selling is being augmented, if not replaced, by the “science” of selling.
Therefore, whether you are early in your digital sales transformation, integrating a new acquisition, or establishing standardized sales processes, the critical first step is to understand and document the customer journey across these various channels and touchpoints. Identify the key touchpoints, events, or actions that shape or influence decisions, perceptions, actions, and customer experience. From there, you can map your sales engagement process to understand where you can make the greatest impact from a sales perspective. Likewise, this will help you identify where you also have the most significant gaps.
Once you’ve identified these critical events, you can then map your data sources and begin to determine where, and how, you can employ AI and new technology to deliver the information your sales teams need, and when they need it. You may also determine where you may have glaring gaps in your data models and information flow, or where you rely on excessively manual processes and corporate knowledge. Once understood and documented, you can then prioritize where you need to invest in process redesign, tools, or capabilities to remedy the gaps.
1 Mary Shea, “Sales Digital Transformation: It’s Now or Never!”, Forrester, January 8, 2018.
Using a simplified and generic customer journey map for B2B customers helps to illustrate the point. By utilizing a customer journey map, you can identify which events or actions your sales team should be aware of, influence, or drive to win business and satisfy customers. It will also help to expose where you have excessive complexity, and where your salespeople require better support and collaboration from specialists, support teams, sales operations, or external partners, including channel partners. The exercise will also help you determine your future-state model to improve the end-to-end process, remove complexity, and make it easier for your salespeople to sell and to satisfy your customers.
Finally, the exercise will also provide you with critical information to help you identify where you may be able to invest in new applications or tools, or redesign your legacy sales processes to utilize data-driven or AI-enabled sales processes. By leveraging AI-driven sales process software and tools, you can redesign, automate, and standardize how you perform activity management, opportunity and deal management, prioritization, pipeline prediction, and sales forecasting. By mapping your customer journey, you can also enable the implementation of more comprehensive quote-to-cash (QTC) and configure-price-quote (CPQ) processes. In all these cases, you will be in a better position to leverage advanced analytics, including machine learning, to compare actual vs. desired activities or actions, make appropriate recommendations, and drive the right behavior throughout the sales engagement process.
FIGURE 1. B2B Customers Illustrative Customer Journey Map
Mapping the complete customer journey through all channels, digital and non-digital, is obviously an extensive and lengthy exercise. However, even some basic mapping of critical events or activities along the journey can be used to begin charting your digital sales transformation. In other words, you can take a “crawl, walk, run” approach vs. “boiling the ocean” to get started. This will also allow you to begin to design or redesign your workflow around digitally enabled processes and tools that can improve customer experience, sales productivity, and win rates or conversions. As an example, in analyzing your online customer engagement or “clicks”, you can apply advanced data engineering and anomaly detection to identify unique customer journeys, determine “drop-offs” in the journey, and establish improved or automated processes to reduce drop-offs and improve conversion rates.
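To make the drop-off idea concrete, here is a minimal sketch of computing per-step drop-off rates from customer event logs. The funnel stages and event names are illustrative assumptions, not tied to any specific product or CRM.

```python
from collections import Counter

def dropoff_rates(journeys, funnel):
    """Per-step drop-off along an ordered funnel of journey events.

    `journeys` is a list of event sequences (one per customer);
    `funnel` is the ordered list of expected touchpoints.
    Returns {step: fraction of customers lost before reaching it}.
    """
    reached = Counter()
    for events in journeys:
        for step in funnel:
            if step in events:
                reached[step] += 1
            else:
                break                  # customer dropped off before this step
    rates = {}
    for prev, step in zip(funnel, funnel[1:]):
        if reached[prev]:
            rates[step] = 1 - reached[step] / reached[prev]
    return rates
```

A step with an unusually high drop-off rate is a candidate for process redesign or an automated nudge; anomaly detection would then flag journeys that deviate from the dominant paths.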
Once you have identified your key processes and touchpoints, it is likely that the data rests in many different sources, forms, and formats (e.g., structured, unstructured). As a result, a salesperson must spend valuable time aggregating this information from these various systems and tools to prepare for or meet new or existing clients, address challenges or questions, and update sales management on progress vs. plans and objectives. This administrative work is obviously unproductive and prevents the salesperson from spending more time in the field or with customers. It can also force the salesperson to determine the appropriate or next best course of action without the benefit of predictive analytics to help guide the process or the decision. Predictive analytics can provide recommendations or guidance based on analysis of historical events or activities, yielding better results than traditional methods or gut instinct. Augmenting the “art” of selling with “science” can make the salesperson’s job easier and generate more effective and productive outcomes. Taking this a step further, if you can complement predictive analytics with technology solutions that aggregate and simplify the delivery of this information to sales, then you can eliminate these administrative tasks and free up your sales teams to spend more time where it matters: with your customers.
ACTIVATE YOUR DATA
Once you’ve defined your customer journey and critical touchpoints, the next obvious question is: “Where is the data, and what do we need?” Data is the “fuel” for AI and for empowering your sales teams to successfully manage the customer journey. Delivering effective AI-driven recommendations and results requires readily available, clean, and trusted data from your source systems.
Starting with your CRM, begin to identify where source data resides in your organization. You will quickly find out that only a portion of the data you need resides in your CRM. You may also need to access data in your ERP system, support systems, customer success, content management, CPQ (configure-price-quote), and other transactional systems. A significant amount of data will reside in your data warehouse or data lake, or the data may reside in “shadow IT” data sources, which are the most challenging of all. Regardless of where the data resides, you need to map each of these data sources to the critical events you defined in the customer journey and determine how easy, or difficult, it will be to integrate, access, cleanse, transform, and deliver the data you need to be successful.
FIGURE 2. Enterprise Data Strategy & Governance Program
As you identify these various sources and map them to the critical touchpoints you’ve defined from your customer journey work, you then need to determine the quality of the data. You must understand if you have consistent, well-defined, and agreed-upon data definitions and standards. Based on these definitions, how clean is your customer data? How are inputs created, by whom, and how frequently? What are the chances you may clean your data only to find that errors creep back into your data due to lack of controls, data stewardship, or data governance? Frequently overlooked, an ongoing data governance framework is vital to ensure that once you clean your data, it remains accurate and reliable.
AI can also provide a very useful benefit here as well. It can improve how you capture customer data and information (e.g., scanning business cards vs. manual data entry, or ingesting LinkedIn data). AI can also help rationalize and cleanse customer data records like names, job titles, addresses, phone numbers, company names, etc.
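As a toy illustration of that record-cleansing idea, the sketch below groups near-duplicate account names with simple fuzzy matching. The names, the 0.85 similarity threshold, and `difflib` itself are illustrative stand-ins for the ML-based entity resolution a production system would use.

```python
from difflib import SequenceMatcher

def dedupe_accounts(names, threshold=0.85):
    """Map each raw account name to a canonical representative.

    Normalizes case and punctuation, then groups names whose
    character-level similarity exceeds `threshold`.
    """
    canonical = []     # one normalized representative per group
    mapping = {}
    for name in names:
        key = name.lower().replace(",", "").replace(".", "").strip()
        match = None
        for rep in canonical:
            if SequenceMatcher(None, key, rep).ratio() >= threshold:
                match = rep
                break
        if match is None:          # start a new group
            canonical.append(key)
            match = key
        mapping[name] = match
    return mapping
```

The same pattern extends to job titles, addresses, and phone numbers; the payoff is that downstream analytics and VDSA-style assistants operate on one clean record per customer.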
Since the foundation for much of your customer data is your CRM system, it is imperative to drive adoption and use of your CRM systems to ensure timely, accurate, and reliable data. Historically, this has been a significant challenge for many sales organizations due to many factors. Updating CRM systems often requires significant administrative time to modify or add records, update opportunities, etc. There is also often an unwillingness or reluctance to provide accurate opportunity or pipeline data due to quota or compensation concerns, complexity, or other factors. As a result, legacy CRM solutions can present a drain on productivity due to the manual data entry required to provide accurate and timely information for sales management. There is also some reluctance to provide too much information, which can then be “micro-managed” by sales leadership. Nevertheless, the more accurate your underlying sales and customer data, the better and easier it will be to make decisions for your sales organization and your company. Given these challenges, how can you improve the accuracy of your customer data?
There are many tactics you can utilize to drive adoption, use, and compliance. Traditionally, these methods work with varying degrees of success, but industry-wide CRM systems remain plagued by low adoption, infrequent usage, complexity, distrust, and erroneous or missing data.
Through innovations in AI, a new class of products and solutions has recently emerged to address this challenge. These solutions are generally described as “Virtual Digital Sales Assistants (VDSAs)” or “Intelligent Assistants”. VDSAs combine AI algorithms and touch, talk, or text interfaces with leading CRM and Salesforce Automation (SFA) solutions, marketing content/automation systems, customer support systems, legacy third-party databases, and other data sources. VDSAs simplify how sales can interact with these source systems to update contacts, leads, and opportunities and to gather critical information at pre-determined times or on demand. Ultimately, VDSAs remove “friction” from current sales processes, allowing sales to be more productive and effective. When coupled with predictive analytics or AI-delivered insights as described elsewhere in this white paper, they become an even more powerful element of your digital sales transformation.
Not only can VDSAs simplify and streamline data integration for sales and help reduce administrative time and friction, but they also offer another strong value proposition. By making sales’ jobs easier and delivering information and insights that help them perform better, these solutions give salespeople a strong incentive to use them. As they do so, the data in the underlying CRM becomes better, more accurate, and timelier. This means that all downstream data-driven processes, including forecasting, supply chain demand planning, and predictive analytics, will also become more accurate and trusted. An example from GE illustrates this point. At GE, “sales, technology, and finance executives have been collaborating on an app to reduce time that sellers spend inputting and addressing forecast questions. This app allows sellers to enter information on the fly, through text and voice solutions, and has eliminated multiple rounds and levels of management inspection of the numbers. Early pilots point to significant ROI as sellers spend more time on customer-facing selling activities.”2
To conclude, as you aggregate and deliver clean, reliable customer data through your CRM and source systems, you can then apply AI predictive models and algorithms to help your sales teams make sense of the mountain of information available to them. Ultimately, this will enable you to make it easier for your sellers to succeed in their core mission, which is to serve your customers and drive growth for your business.
MANY CHANNELS TO OMNI-CHANNEL
According to Forrester,3 “the explosion of content-rich B2B marketing and commerce sites has made buyer research easier than ever. This digital maturation is in stark contrast with the experience buyers receive working with human sellers. More than 90% of B2B buyers prefer to make their purchases online rather than interact with a salesperson, yet highly considered purchases often still require seller involvement. AI in human-assisted sales needs to match the buyer experience of self-service if sellers hope to stand a chance in long sales cycles.”
Buyers today are more educated and connected than ever before, and accustomed to a digital experience in their personal lives. As a result, their expectations for the same experience in their working lives have increased dramatically. When considering new products or services, they will engage multiple channels to explore and learn, frequently before they ever engage a salesperson. They will go online to read marketing literature, white papers, and customer testimonials, watch videos or demos, or perhaps even download sample or trial software. As a result, it is essential to connect all these channels, both to ensure a consistent experience for the customer and to arm your sales team with the information they need to be successful.
2Mary Shea and Jacob Milender, “B2B Sales Force Digital Transformation: Three Global Leaders Share Best Practices”, Forrester, July 26, 2017.
3John Bruno, “How AI Will Transform Sales”, Forrester, December 18, 2017.
This means that coordinating and collaborating across organizations, partnerships, and various channels is more critical than ever. It’s no longer adequate for your marketing organization to collect and distribute recommendations and actions through email, PowerPoint, or other standalone data sources to your inside or outside sales teams. Data-driven information must be delivered in as close to real time as possible to exploit opportunities before they are lost. A recent article by Boston Consulting Group also stated that “the big problem for most companies is that marketing and sales operate in their own silos, each function having its own organization processes, incentives, cultures, and in many cases, objectives”. Furthermore, they state that “companies that do not cooperate closely across their organizations may suffer because of poorly executed customer buying journeys, misaligned objectives, misallocated resources, and poor team morale”, resulting in the potential for “customer alienation, loss of market share, and slowed or no growth”.4 As a result, it is imperative to close the historical gap between marketing and sales by integrating data into a common language, with shared actions and insights delivered through a single, integrated platform. Many CRM systems attempt to provide this connection, but all too often, marketing teams and sales ops or support teams deliver additional insights or recommendations through alternate means like email, chat, PowerPoint, or Excel. Frequently, these recommendations do not align due to underlying data differences or inconsistencies in definitions, KPIs, or analytical models. This leaves the salesperson to consolidate the inputs and decide which course of action to take, which exacerbates the administrative burden and reduces confidence in the predictive analytical models.
One excellent example of addressing this challenge is Cisco’s 2020 initiative5. Cisco’s sales and marketing teams partner on Customer360: a data-driven collaborative effort to better understand buyers and provide guidance and recommendations for sales to take appropriate action. By employing data engineering and predictive analytics and by “connecting” marketing and sales data and insights, the seller no longer has to hunt for information and can focus on selling and on their customers. Not only does this make your reps more effective, but you can also realize significant improvement in your sales cycle time, conversion, win rates, and ultimately revenue.
It’s also imperative to be aware of and align your efforts between your “direct” and “indirect” sales teams or your channel partners. It’s no longer acceptable for the sophisticated B2B buyer to see discrete, uncoordinated engagements from both sales organizations, particularly if each is unaware of the potential “conflict” or, worse, uninformed and openly competitive. It’s equally important to collaborate and align with your support organization (e.g., are there any open service tickets or escalations?) and with your specialist teams who have deeper knowledge of your product and services capabilities and features. Your salespeople need to know who is calling on their customer, what they have bought, and what their experience has been with your company, your products, and your services. All of this requires knowledge and information from various sources and organizations, both inside and outside of the company.
4Phillip Andersen, Robert Archacki, Basir Mustaghni, Roger Premo, “Building an Integrated Marketing and Sales Engine for B2B”, The Boston Consulting Group, June 2018.
5Mary Shea and Jacob Milender, “B2B Sales Force Digital Transformation: Three Global Leaders Share Best Practices”, Forrester, July 26, 2017.
“We take marketing sentiment data, pair it with sales data, and create insights that tie to opportunities and actions that reps can take. This collaborative effort fosters higher- quality interactions with customers and better rep prioritization of selling activities.”
– Forrester describing Cisco’s 2020 initiative
To meet the need to share and exchange information and ideas across these various channels, many new collaboration tools and solutions have entered the marketplace. These solutions are increasingly powered by AI and address use cases such as team collaboration and communication, content or document coordination, editing, and approvals. Collaboration tools and applications can help sales work seamlessly with other teams and customers to accelerate deals and provide a superior customer experience. If designed and implemented with the salesperson at the center, they can facilitate rapid collaboration by diverse teams: enhancing communications, enabling file sharing and document annotation (e.g., proposal content review and coordination), supporting live meetings through video and audio, and incorporating electronic signatures for faster approval of proposals and contract documents. Ultimately, by improving collaboration through these tools, you also remove friction and administrative burden from your selling process.
“Sales force digital transformation requires new and more creative ways of collaborating.”
EMBRACE PREDICTIVE ANALYTICS
It’s an unprecedented era for B2B sales organizations. Buyers and buying patterns are changing. Traditional sales models are being augmented, if not supplanted, by digital channels and expanded routes-to-market. There is a staggering amount of data, computing power, and technology available to sales. Left unmanaged, however, it merely adds to the salesperson’s burden of navigating the growing amount of data and information being sent their way. Sophistication in advanced analytics and machine learning provides the ability to augment traditional sales “instincts” with data-driven information at a scale never possible before. Due to these trends, B2B sales is rapidly evolving from an “art” to a “science”. To many, this is not an entirely comfortable conclusion, even though “science” does not completely replace human knowledge, intuition, experience, and “gut instinct”. However, there is so much data and information available to the salesperson today that it is not humanly possible to make sense of it all without the effective use of big data, analytics, and AI. As in the “needle in the haystack” analogy, it is critical to discern what is important and insightful within the mountains of available data. This is where predictive analytics can make a significant impact. Furthermore, according to Forrester in a recent Forbes article6, “companies that opted to blend AI with human insight report improved satisfaction on the part of sales reps (69%), as well as heightened operational efficiency (68%), agent productivity (66%), and customer satisfaction (61%).”
Embracing advanced and predictive analytics can therefore benefit sales organizations in many ways. From the massive amounts of data available to enterprises today, predictive analytics can evaluate the available data faster and better than any human can. By employing predictive analytics, you can recommend which campaigns will maximize impact and which accounts, customers, or opportunities to prioritize. You can also identify accounts with a higher propensity to buy, optimize product and pricing strategies, suggest the next product to buy, and more.
6Falon Fatemi, “4 Ways AI is Transforming Sales Organizations”, Forbes, February 28, 2018.
The figure below illustrates some predictive analytics use cases that can be applied at each step of the customer journey to provide guidance and recommendations to the salesperson throughout the process. Moreover, as you implement these predictive analytics solutions and apply machine learning, the output will become more comprehensive, refined, and accurate. A fundamental principle of AI and machine learning is the ability to learn, improve, and increase in accuracy over time through repetition and feedback. As you compare actual results vs. predicted outcomes through machine learning processes, you can refine your algorithms. Therefore, as you implement, explore, test, and learn, the software algorithms will improve as well. This means the entire ecosystem gets smarter—sales reps, sales managers, executives, and those who rely on sales to drive (and predict) the engine of growth and customer experience for the enterprise. It also means the outcomes and recommendations gain trust as they improve in accuracy; and based on Forrester’s findings, your sales reps will also be more satisfied, productive, and efficient.
FIGURE 3. Sampling of Predictive Analytic Use Cases
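The learn-and-refine loop described above can be sketched in a few lines. The “model” here is a deliberately trivial stand-in: a calibrated win rate nudged toward each observed outcome. The point is the feedback mechanism of comparing predictions to actual results, not the scoring logic, which in practice would be a real predictive model.

```python
# Sketch of the learn-and-refine loop: compare predicted win probability
# to actual closed-opportunity outcomes and recalibrate. The scoring
# logic is an illustrative stand-in for a real predictive model.
class WinPropensityModel:
    def __init__(self, base_rate=0.5, learning_rate=0.1):
        self.base_rate = base_rate          # current calibrated win rate
        self.learning_rate = learning_rate  # how fast feedback shifts it

    def predict(self):
        return self.base_rate

    def update(self, actual_won):
        """Nudge the calibrated rate toward each observed outcome."""
        target = 1.0 if actual_won else 0.0
        self.base_rate += self.learning_rate * (target - self.base_rate)

model = WinPropensityModel()
for outcome in [True, True, False, True]:  # feedback from closed deals
    model.update(outcome)
print(round(model.predict(), 3))  # 0.582
```

Each closed opportunity feeds back into the model, so the prediction drifts toward observed reality over time; this is the sense in which “the entire ecosystem gets smarter”.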
As you implement these various use cases, you will find that not only can you provide recommendations or guidance based on the algorithms, but you may even be able to introduce proactive measures. For example, you could alert your support teams to act when you observe an event, trend, or pattern that requires attention, while notifying your salesperson of a potential issue. As an illustration, if you have a shipment that may be delayed, you could notify your sales operations and supply chain organizations to proactively address or mitigate the issue to prevent or minimize the impact. Your salesperson would be informed and can help manage the event or issue with your customer. This is far better than not knowing, missing the shipment, and finding out about the issue through customer escalation. It could be as simple as a quote that is expiring, or an opportunity at 90% in your pipeline that has “stalled”. If you measure and track customer sentiment, satisfaction, or NPS, you could also proactively alert your sales rep that one of their accounts is at risk of leaving and recommend actions. If an account is “healthy” and/or you are expanding your presence in the account, you could also identify opportunities for cross-selling or up-selling complementary products and services, or proactively capture renewal opportunities.
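A proactive alerting layer like the one described can be sketched as a small rule engine over incoming events. The event fields, thresholds, recipient lists, and messages below are illustrative assumptions, not a reference to any particular system:

```python
# Sketch of a proactive-alert rule engine over opportunity, quote, and
# shipment events. Fields, thresholds, and routing are assumptions.
RULES = [
    (lambda e: e["type"] == "shipment" and e["days_late"] > 0,
     ["sales_ops", "supply_chain", "account_rep"],
     "Shipment delayed -- mitigate before customer escalation"),
    (lambda e: e["type"] == "quote" and e["days_to_expiry"] <= 3,
     ["account_rep"],
     "Quote expiring -- follow up with customer"),
    (lambda e: e["type"] == "opportunity" and e["stage_pct"] >= 90
     and e["days_idle"] > 14,
     ["account_rep", "sales_manager"],
     "Late-stage opportunity stalled -- review next steps"),
]

def route_event(event):
    """Return (recipients, message) pairs for every rule the event trips."""
    alerts = []
    for condition, recipients, message in RULES:
        try:
            if condition(event):
                alerts.append((recipients, message))
        except KeyError:
            continue  # event lacks the fields this rule needs
    return alerts

print(route_event({"type": "quote", "days_to_expiry": 2}))
```

Notifying both the supporting organizations and the salesperson from the same rule is what keeps the rep informed before the customer ever raises the issue.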
Depending on your business priorities or challenges, there is a virtually infinite number of use cases that can be designed and implemented to improve your sales execution and customer satisfaction. As you employ these predictive analytics and learn from and improve the underlying algorithms, you strengthen your ability to proactively provide recommendations and guidance to your sales teams.
Proactively suggesting actions not only avoids missing critical events or opportunities to win business, expand relationships, or retain accounts; it also moves you away from the reactive, time-draining cycles of managers, support staff, or executives asking what happened, along with the ensuing emails, phone calls, and escalations. Worse, those cycles force a salesperson to spend more time dealing with internal issues and questions instead of spending time with customers. You may even go beyond data science and apply “behavioral science” to understand the drivers of behavior (of customers and/or salespeople) and design levers that “nudge” or encourage the desired action or outcome.
To improve or accelerate the use of predictive analytics, you may also consider how you organize your analytics and data scientists (internal, or external partner) to support your sales organization. While there are benefits to a shared service function like analytics and data science, you will get significantly more impact, relevance, and buy-in from sales if you align your talent to the sales organization they support. This does not suggest you utilize a fragmented organizational model for your analytics talent. They may (and likely should) still report into a central function. However, they should be embedded into the business they support. In this manner, you can benefit from both the advancement in analytics and AI knowledge and skills developed within the central function while learning more about the business processes and challenges. The more domain knowledge the analytics professional or data scientist has, the more relevant and accurate the predictive analytical models. From there, it is imperative to experiment, test, and modify the original hypothesis with actual results and continually refine the algorithms.
As you advance in your knowledge, use, and trust with data-driven recommendations, it will only be natural to see the increase in use of AI, chatbots, and robotics to automate repeatable activities or transactions, freeing up your high-cost, and highly-skilled, sales talent to focus on more complex engagements and opportunities.
From a sales leadership perspective, these predictive analytical models should be used in conjunction with “gut instinct”, not instead of it. The more accurate and useful the prediction, the lower the variability between actual and predicted results. This is a significant improvement over pure “gut instinct” or, even worse, “gaming”. As discussed above, algorithms and models will mature and improve. As they do, sales and sales leadership can rely even more on these data-driven recommendations. This also means sales managers and leaders can provide more effective coaching and training based as much on facts and data as on behaviors and acquired knowledge. The emergence of “behavioral sciences” to complement “data sciences” may provide additional benefit as the field matures and more use cases become evident. There is great promise in what behavioral science can yield: determining how decisions are made (by sellers and buyers), identifying optimal sales candidates, assisting in sales coaching, communication, and goal setting, and improving sales processes to encourage or “nudge” the appropriate behavior.
As discussed earlier in this white paper, it is essential to have good, clean, and reliable data from trusted data sources for these predictive models to work.
PUSH INFORMATION AND INSIGHTS TO YOUR SELLERS
Once you’ve defined the critical events in your customer lifecycle, the single “source of truth” for the data, and have begun utilizing predictive analytics to provide guidance to your salespeople, what is the best way to deliver that information to your front-line sales teams when they need it?
Traditionally, it was up to the individual salesperson to access multiple data sources to plan, manage, and report on their business and their performance against plans and quotas. In many enterprises, sales planning teams, sales operations, marketing, and other organizations send emails, texts, or tasks and actions in your CRM, or use any other available means, to communicate promotions, new offers, events, or recommendations to your sales teams, ultimately overwhelming them with information.
In addition, sales management and executive leadership want to be kept informed. They request status updates through email, chat, text, or most critically – the CRM. As a result, the salesperson would access their CRM system to enter contact data, account information, update opportunities, enter trip reports, and keep their management updated on all assigned accounts. The salesperson may need to create reports, charts, or slides for management review. While this helps keep management informed and allows them to keep track of appropriate actions and activity, it’s very time consuming for the salesperson.
If they are geographically or territory-assigned salespeople, they may need to access multiple data sources to plan their customer visits. They may use D&B, Hoovers, Aberdeen, and LinkedIn to gather information on the customer and account information or gather some competitive intelligence. They may access their CRM and other internal data sources to understand buying patterns, product history, outstanding quotes, and any existing relationships. They might use a mapping program like Google Maps to efficiently plan their route.
These examples highlight the fact that the salesperson must access disparate tools, applications, or data sources and aggregate the information in a manner that is relevant and useful for doing their job. Perhaps work like this is being performed by your inside sales, sales support, or sales operations organization. Either way, it requires manual effort and can divert the salesperson from what they should be doing: spending more time with customers and less time on administrative or data entry work. Moreover, most modern business intelligence (BI) tools and applications used to gather this information require substantial training or force the end user or seller to drill into raw data, create pivots, or query databases, which adds to the challenge of effectively delivering information to sales.
Fortunately, there are emerging platforms to make it easier for sellers (or anyone for that matter) to acquire this information. By using GUI search tools similar to Google, BI and analytics providers are beginning to make it easier to work with BI and analytics applications. Therefore, instead of asking your end users or sellers to navigate dense databases to search for information, they can query these new applications through search or voice commands and obtain the information they need. In return, they will receive real-time, contextual answers to their questions without having to spend valuable time mining databases. Behind the scenes, these search solutions employ AI technologies like NLP, machine learning, and chatbots to query, ingest, and deliver information to the end user or seller in an intuitive, relevant, and contextual manner. While GUI and cognitive search engines simplify how your end users or sellers can acquire valuable information or insights, it is even better if you can proactively communicate or push this information to your sellers, so they don’t have to look for it.
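The search-driven interaction described above can be caricatured as intent routing: map a natural-language question to a pre-built analytics query. Real cognitive search products use NLP models rather than the keyword lookup assumed here, and the intent keywords, table names, and SQL strings are invented for illustration.

```python
# Sketch of how a cognitive-search front end might map a natural-language
# question to a canned analytics query. Keywords, table names, and SQL
# are illustrative assumptions; real products use NLP models.
INTENTS = {
    "pipeline": "SELECT stage, SUM(amount) FROM opportunities GROUP BY stage",
    "quota": "SELECT rep, attainment FROM quota_attainment",
    "renewals": "SELECT account, renewal_date FROM contracts "
                "WHERE renewal_date < DATE('now', '+90 day')",
}

def route_question(question):
    """Pick the canned query whose intent keyword appears in the question."""
    words = question.lower().split()
    for keyword, query in INTENTS.items():
        if keyword in words:
            return query
    return None  # a real assistant would respond with a clarifying prompt

print(route_question("How is my pipeline looking this quarter?"))
```

The seller asks a question in plain language and gets data back; the mechanics of querying the warehouse stay behind the scenes, which is precisely the friction these tools remove.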
To further advance the thought, what if you not only provided data and insights to sales when they need it, but relied on tools and processes to remind or “nudge” them to act? For example, you just left a customer meeting, what if your sales application asked you to document your meeting, provide a trip report, and enter all of this into your CRM seamlessly?
Even better, what if you could alert your salesperson before they enter the meeting that there is an update on pricing for an outstanding quote, expiring warranties or software licenses up for renewal, or products nearing “end of life”? Furthermore, by leveraging your predictive analytics engine, you can also provide recommendations on the next best buying opportunity (e.g., if you are a B2B enterprise selling products and services, are there additional products, peripherals, accessories, or services that would complement a recent quote or, better, an order?). What if the customer has experienced a recent product outage or is having trouble with your services organization? Obviously, it is better to be armed with this information in advance to avoid surprises, better manage the customer experience, and take advantage of selling opportunities.
All these scenarios are possible (and many more) if you have defined your customer engagement and sales process and the underlying data sources, and leveraged your Analytics Center of Excellence or your analytics partner to develop and push these insights and recommendations to sales. As discussed in the next section, ideally you can do so through simple, intuitive mobile solutions.
UTILIZE AI AND DIGITAL SOLUTIONS TO SIMPLIFY SALES PROCESSES
The final piece of the transformation is delivering the data, information, and insights you have generated through an easy and intuitive interface for your sales organization: how, when, and where they need it. Mobile business intelligence, salesforce automation (SFA), and analytics solutions represent an improvement over legacy PC or browser-based applications. Mobile SFA solutions provide an easier way for salespeople to interact with their CRM systems through mobile devices. While these mobile solutions make it easier for sales to use technology in their work processes, they fall short of truly empowering the salesperson and solving the dilemma of poor adoption, use, and accuracy of data in CRM systems. As discussed earlier in this white paper, a contributing challenge is that critical data is typically stored in different source systems, data warehouses, or data lakes, requiring the salesperson to aggregate the information.
As described by The Boston Consulting Group7, “while companies have made massive investments in technology, they haven’t focused on true integration – that is, integrating tools with the way people actually work”. The resulting paradox, they claim, is the “complexity trap” that most companies face. They further assert that “digital technologies and methods are supremely flexible. They enable businesses, end users, and IT departments to design applications and user journeys that are ‘just right’ and adapt processes accordingly – in the end, reducing or eliminating this complexity trap.” Naturally, this is also the dilemma of the average salesperson, who is asked to navigate internal complexities to do their job.
Yet another challenge is that sales reps notoriously delay entering a deal opportunity into the CRM because they don’t want sales managers learning about it and constantly asking what they are doing to move the opportunity along in the sales cycle. In addition, it takes time to translate notes or recollections into CRM records, so sales reps often wait until they work from home on Friday, or prepare on Sunday evening for the week ahead, to update the system. This means the data may be inaccurate, stale, or forgotten. While mobile access is useful for providing information to sales as they travel or visit customers, it falls far short of empowering your sales teams with the data and insights they need to perform their work, generate sales, and spend time with clients.
As suggested by Gartner’s Tad Travis8, this challenge can be addressed by employing a Virtual Digital Sales Assistant (VDSA) solution for your sales organization, particularly when utilized in conjunction with the predictive analytics use cases described in this white paper. VDSAs can integrate data across the CRM, support systems, content management, legacy data sources, and external sources like LinkedIn, email, calendar, and mapping to deliver frictionless and contextual insights. Many VDSA solutions utilize Natural Language Processing (NLP) and chatbots to enable your sellers to interact with these source systems and your CRM through voice commands, similar to how you may use Alexa, Siri, Cortana, or similar applications in your personal life.
7Vanessa Lyon and Anne-Francois Ruand, “Take Control of Your Digital Future”, The Boston Consulting Group, 2018.
8Tad Travis, “2016 Recap: The Third Wave of Sales Automation is Here”, Gartner, January 3, 2017.
“While companies have made massive investments in technology, they haven’t focused on true integration – that is, integrating tools with the way people actually work.”7
– Vanessa Lyon and Anne-Francois Ruand, The Boston Consulting Group
As a result, you could have an AI-powered sales digital assistant proactively deliver information to you in advance of a meeting that you’ve scheduled, or your sellers could ask for information through voice command. The VDSA can then deliver information on the account, contact, prior sales, open opportunities, open service tickets, partners or competitors who may be engaged, etc. All of this means your salesperson is significantly more knowledgeable going into a customer meeting, and with significantly less effort than trying to gather this information on their own. Moreover, you don’t need to employ large teams of support staff to gather and generate information like this either—further freeing up your talent and operating budget for more value-add work and more time with customers. When your salesperson leaves the meeting, their personal digital sales assistant can “nudge” them to capture and document key findings, agreements, next steps, or actions. It can then automatically update your contacts, account information, or opportunity status in your CRM system, meaning the information is timely, accurate, and readily available to your supporting organizations in sales operations, support, supply chain planning, or even your financial forecasting team. Using The Boston Consulting Group’s analysis, this would also allow you to remove complexity from your sales processes, and free up your salespeople.
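The pre-meeting briefing and post-meeting “nudge” flow might be sketched as follows. The source names, record fields, and messages are illustrative assumptions rather than any vendor’s API.

```python
# Sketch of the VDSA flow described above: assemble a pre-meeting
# briefing from several sources, then nudge the rep to log notes after.
# Source names and record fields are illustrative assumptions.
def build_briefing(account, sources):
    """Merge per-source snippets into one pre-meeting briefing."""
    briefing = {"account": account}
    for name, fetch in sources.items():
        briefing[name] = fetch(account)
    return briefing

def post_meeting_nudge(briefing, notes):
    """If the rep skipped note-taking, remind them; else update the CRM."""
    if not notes.strip():
        return f"Nudge: please log notes for {briefing['account']}"
    return f"CRM updated for {briefing['account']}"

sources = {
    "open_opportunities": lambda acct: ["renewal Q3"],
    "open_tickets": lambda acct: [],
    "recent_orders": lambda acct: ["2024-11 order #1842"],
}
briefing = build_briefing("Acme", sources)
print(post_meeting_nudge(briefing, ""))
# Nudge: please log notes for Acme
```

Because the same assistant that briefed the rep also captures the follow-up, the CRM update happens at the moment of the meeting rather than on Friday afternoon, which is what keeps the downstream data timely.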
Ultimately, instead of salespeople being expensive data entry clerks, they can enjoy a seamless, intuitive AI-powered interface (“touch, talk, text”) with their CRM and various data sources, allowing them to focus on the work of selling. The seller becomes the center of the selling universe, not the CRM. Coupled with the use of predictive analytics along the customer journey to deliver insights and information to your salespeople when they need it, you will greatly empower the ability of your entire sales organization.
Your sales organizations can enjoy higher productivity, win rates, and faster sales cycles, while improving employee engagement and customer experience.
“VDSA will become the primary interface by which sales representatives manage their work. When combined with artificial intelligence systems, VDSA will become the cognitive system that removes much of the inefficiencies common in B2B sales processes.”
– Tad Travis Research Director Gartner
CLOSING AND CALL TO ACTION
Unfortunately, there is not yet a complete, end-to-end solution that adequately addresses all of the challenges and inefficiencies within legacy sales operations. Naturally, there will be convergence and consolidation as vendors and service providers mature in their application of AI solutions to these sales process challenges.
The good news is many of these emerging applications, tools, and solutions are focused on simplifying and improving the seller’s (and customer’s) experience through the application of advanced analytics and AI. This is a vast improvement over legacy applications that largely focused on the collection of data and information for management, oversight, and inspection of sales activities.
None of these steps is easy, and some may be too large a leap for many firms. However, these strategies are pivotal to success in driving digital sales transformation in today’s rapidly evolving, complex B2B selling environment. The sooner executives and sales leaders adopt these strategies, or embark on a roadmap to do so, the more competitive they will be in the digital era.
- Define your customer journey and identify your critical touchpoints. Use this to determine how and when your salespeople should engage customers (your methodology or desired process) and where you can apply AI solutions to aid them.
- Map your touchpoints to your data and your data sources. Leverage open source or API solutions to accelerate access to necessary data sources.
- Embrace all channels: online, offline, direct, indirect, support, and social media. Utilize collaboration tools to break down internal and external silos.
- Aggressively employ advanced and predictive analytics; adapt and modify algorithms to improve accuracy and confidence. As your organization learns what is possible, new methods will become apparent, including proactive, predictive, and prescriptive solutions.
- Take the burden of data entry and data consolidation away from your sellers. Streamline, integrate and “push” information and insights to your sellers when they need it.
- Utilize AI-powered digital solutions to deliver the insights and information in a simple, intuitive manner.
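The predictive-analytics step above can be sketched with a simple win-probability model that "pushes" a score to sellers for each open deal. This is a hypothetical illustration on synthetic opportunity data (the features and labels are invented for the example), not any vendor's actual solution.

```python
# Hypothetical sketch: scoring open opportunities with a basic
# predictive model, assuming a feature table already exists.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy historical opportunities: [deal_size, days_in_stage, touchpoints]
X = rng.normal(size=(500, 3))
# Synthetic "won" labels loosely tied to engagement (the third feature)
y = (X[:, 2] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# A win probability per open deal, ready to be pushed to the seller
win_prob = model.predict_proba(X_test)[:, 1]
```

In practice the same pattern extends to the proactive and prescriptive solutions mentioned above: the model's output becomes an insight delivered to the seller, rather than another report to hunt for.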
Analytics and AI are beginning to make a significant impact in enterprise sales organizations. Leaders in adopting AI to enhance their sales processes are already reaping the rewards of their investments.
As stated by Gartner in a Forbes article [9], “30% of all B2B companies will employ AI to augment at least one of their primary sales processes by 2020. The most effective companies, though, will use AI to augment multiple parts of their sales processes”. There will be significant rewards for companies that do so. According to McKinsey [10], “companies that have embraced what we call the ‘science of B2B sales’ have already started to pull ahead of their peers in terms of revenue growth (registering 2.3 times industry average revenue growth), profitability (3 to 5 percent additional return on sales) and shareholder value (8 percent higher total return to shareholders than the industry average).”
[9] Falon Fatemi, “4 Ways AI is Transforming Sales Organizations”, Forbes, February 28, 2018.
[10] Tim Colter, Mingyu Guan, Mitra Mahdavian, Sohail Razzaq, Jeremy Schneider, “What the future science of B2B sales growth looks like”, McKinsey & Company, January 2018.
Advisory Board Member, Fractal Analytics
Doug provides advisory services to help advance Fractal Analytics’ capabilities, services, and offerings to empower enterprise clients. Doug leverages his knowledge and experience to help Fractal Analytics and its clients accelerate the use, adoption, and value creation with data, analytics, and AI in the enterprise. Previously, Doug held various leadership roles at Dell for more than 19 years. In his most recent role at Dell, he was responsible for providing global data, reporting, and analytics services to support Dell’s sales, marketing, finance, services, e-commerce, and operations business units. He was also responsible for leading a transformation strategy to improve the use of data, BI, and analytics across the company to enhance decision-making. Doug also led a digital sales transformation for Dell’s global sales operations by partnering with IT to create a big data platform for sales, enabling the use of enterprise-wide KPIs and BI solutions. He also led the creation and implementation of predictive analytics forecasting solutions for the global sales organization.
What’s the most frustrating aspect of using current enterprise solutions for a senior executive? It is the inability to find timely and reliable answers to their questions. Often, answers do lie somewhere – it is just too challenging and time consuming to get to them.
Imagine being able to get the most important information and insights ahead of every tactical decision you make during your workday.
Too much data; insufficient, unreliable metrics
Consider the case of a global consumer goods company we work with. In a large, growing market, its sales team has little or no idea how its trade promotions are performing, even though it spends well over $100 million a year on them.
At a macroscopic level, they do have data on overall shipments and revenues. But if they want to know whether a specific promotion generated incremental sales, or how it performed across different regions, the answer is so excruciating to find that they have stopped asking the question!
There are just too many data sources (from their distributors) that are highly inconsistent as well as complex – and there’s no agreement in defining true incrementality.
In such a situation, they simply go with gut and experience when deciding which promotions to repeat, which to drop, and which new ones to introduce. This is the equivalent of driving in an unknown city by asking passersby for directions. It’s time to get a map with real-time traffic information and turn-by-turn directions.
Information overload, too little time
There are other cases where companies have great data but are still deluged by information overload. A bank we work with expected its senior executives to read an 800-page document, ironically called the “at-a-glance” report, to understand how the bank was doing!
The idea here is this – since we don’t have a clue what’s really important, let’s just get everything together so that senior executives can find whatever they may be looking for. This is like printing out the map of the whole country because you don’t know exactly where you are going.
BI/Data discovery platforms overpromise, underdeliver
BI and “data discovery” platforms that promise answers aren’t working either. These platforms suffer from GIGO (garbage in, garbage out) syndrome and have performance constraints.
At a healthcare company we recently worked with, reports take forever to load as they pull in hundreds of gigabytes of data. Most importantly, these tools don’t even attempt to understand their users, much like the banking example above. Whether you are the CEO, chief sales officer, marketing director, or finance manager; whether you are in Mexico or look after Western Europe as a region, the reports look more or less the same.
The BI system expects you to learn it and find your own answers, not the other way around. Why is that acceptable?
Your Facebook feed seems to know you quite well; why can’t your BI report understand you likewise and anticipate your questions?
Can AI transform BI?
The answer, I believe, is to bring AI to BI. We need to rethink BI dashboards in light of the advances we are making in AI. Thanks to big data and AI techniques in text analytics, it is easier than ever to bring together disparate, messy, inconsistent data and fill in the missing gaps. AI algorithms in knowledge representation have made it possible to connect fluid data points into a probabilistic yet consistent, highly accurate understanding of what’s happening in the business (KPIs, competitive intelligence, etc.).
Most importantly, AI transforms our understanding of the user, helping us serve information, recommendations and insights that the user really needs to know, even before she “wants” to know. That’s when BI truly becomes personalized and “anticipatory”. Additionally, by instrumenting how the user is interacting with these insights/recommendations and acting on them, the AI within BI can learn to be even more relevant, actionable and dare I say, addictive. Eventually, managers and executives will spend more engaged time with their BI than with their Facebook feed. (OK, the last line went too far :-), but I am optimistic).
Returning to the example of the consumer goods company: thanks to an AI-plus-BI solution, the company’s executives are now beginning to get a clear, in-market read of their trade promotion performance. Machine learning algorithms recommend to the design team which promotions to retain and which to drop, and predict how a new promotion will perform. The sales team, including the distributor sales representatives, will soon have the information they “need to know” on their smartphones. The national sales director will have a real-time understanding of trade promotion performance, and the same app (Cuddle.ai) recommends the right promotion for the right channel to each sales representative.
BI and data discovery platforms will benefit from embracing this “AI meets BI” thinking to move from data discovery (by the user) to user/insight discovery by the platform.
What do you think? Will this be a game changer for your business? Will BI platforms incorporate AI soon enough?
Co-Founder, Group Chief Executive & Executive Vice-Chairman, Fractal Analytics
Srikanth is a co-founder of Fractal Analytics. In his role as Group Chief Executive & Executive Vice-Chairman, he is responsible for all four entities, inorganic growth and the long-term future of the business.
At Fractal, he has played a role in the evolution of the analytics industry. Long before big data became a buzzword, Fractal evangelized the idea of using advanced analytics and data assets of the company to make better decisions.
Over the last 16 years, he has been a thought partner to global corporations as they have embraced analytics to improve the quality and execution of their decisions. He also believes in building a great place to work that attracts the best minds in the world and creating a trusting environment where people are respected and are free to do creative problem solving.
Srikanth considers himself a lifelong student of mathematics, probability & AI and is interested in consumer behavior, behavioral economics and deep learning.
By Amitabh Bose (Ambo), Fractal Analytics
The demographic segment retailers desire most – yet which is often the most elusive – is the millennial consumer. The fervor to unlock the buying power of millennials makes sense: those aged 26 to 42 make up the largest share of working-age adults and currently account for $600 billion in annual purchasing power, a number expected to mushroom to $1.4 trillion by 2020. So, it’s no surprise that forward-thinking retailers are adapting to reach this generation.
Consumer packaged goods (CPG) companies, in particular, are eager to crack how they can market to millennials and increase their mind share. In fact, it was recently projected that millennials alone will spend $65 billion on CPGs over the next decade. So why not double, or even triple, this projection? That’s exactly what CPGs have in mind.
For CPGs to get the attention of millennials, however, they’ll need to identify what differentiates them from other generations – including interpreting their spending habits and understanding their personal needs and wants. This is harder for CPGs than it is for online retailers, as most CPGs are sold through third-party sellers.
This is where AI-driven strategies can come into play.
For example, millennials are known for demanding personalized experiences far more than earlier generations. In fact, 75 percent of millennials say they’re willing to give up personal data to work with businesses that offer instant, on-demand engagement, compared with only 53 percent of baby boomers and traditionalists. CPGs that learn how to leverage AI technologies to create these tailored experiences will become the leaders of their industry.
One way CPGs can harness the power of AI for personalization is by analyzing various customer datasets, both first-party and beyond, to conduct an “equity drivers evaluation.” AI can help identify the most resonant brand drivers for different types of consumers. It can also segment consumers in a very granular way, by the drivers identified for each, and develop brand “ratings” for each segment. This type of analysis can reveal valuable new insights that may inform campaign messaging, product positioning, and even product innovation.
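The segment-then-rate idea can be illustrated with a minimal sketch: cluster consumers on driver scores, then average the drivers per segment as a simple per-segment "rating". This is an illustrative example on synthetic survey-style data with invented driver names, not Fractal's actual methodology.

```python
# Illustrative sketch (hypothetical data and driver names): segment
# consumers with k-means on brand-driver scores, then compute
# per-segment driver "ratings".
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
drivers = ["convenience", "sustainability", "price", "brand_trust"]

# 300 consumers, each scoring the four drivers on a 1-10 scale
df = pd.DataFrame(rng.uniform(1, 10, size=(300, 4)), columns=drivers)

# Standardize, then cluster into three granular segments
X = StandardScaler().fit_transform(df)
df["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Mean driver score per segment serves as a crude brand "rating"
ratings = df.groupby("segment")[drivers].mean().round(1)
```

A real equity-drivers evaluation would use richer first-party and third-party signals and a purpose-built driver model, but the clustering-plus-aggregation shape is the same.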
Leading CPGs understand that scale is no longer a competitive advantage. In fact, millennials, the generation that drove the Direct-to-Consumer (DTC) market to its current size, prefer a much simpler, more personalized approach. To meet this demographic’s demands, CPGs need to adapt their current offerings and develop new ones through analytics, machine learning, and AI. In doing so, they will be better equipped to develop and market products that address millennials’ very specific preferences, helping them expand deeper into this coveted segment of the market.
The CPG that can understand and cater to the millennial consumer in an authentic way will be king in this new age of retail. And, the key to unlocking the lion’s share of that revenue opportunity will be AI.
This interest in using AI to conquer new revenue streams, such as the millennial, is a perfect example of how the consumer goods and retail industries are in the middle of an AI and machine learning transformation. My company, Fractal Analytics, a global leader in artificial intelligence and analytics that powers decision-making in Fortune 500 companies, has much to bring to the table in terms of strategy and implementation for such a transformation, thanks to our ability to identify future consumer and shopper needs, as well as market trends, through our trifecta of enterprise capabilities: AI, engineering, and design.
This year, we are the Title Sponsor for the 2019 Retail and Consumer Goods Analytics Summit. At the summit, we will showcase our recent, exciting work in artificial intelligence, machine learning and behavioral sciences. By presenting specific cases from the industry’s largest brands, we’ll be sharing how our technology and expertise have been used to drive positive results in the form of more sales, reduced costs and beyond.
Automated digital advice platforms, commonly known as robo-advisory platforms, have seen accelerated adoption over the last three years thanks to the promise of low-cost wealth advice, enabling wealth managers to serve new profit pools of mass and mass-affluent clients. Legacy wealth and brokerage firms are vying to build rapid scale and leadership in this new market. The initial accelerated adoption is slowly reaching a plateau as leaders compete to scale to the next heights by incorporating new data, AI, and digital experiences.
Fidelity’s bold foray into this space with the AMP platform is generating keen interest and adoption from leading banks and RIAs. The best, though, is yet to come.
Read the perspective of Vinod Raman, Head of Digital Advice Solutions at Fidelity, as he talks to Arpan Dasgupta, Head of Financial Services at Fractal Analytics.
Arpan Dasgupta: As the builder and product owner of the Fidelity AMP robo-advisory solution, can you tell us a little bit more about the platform?
Vinod Raman: Fidelity AMP is an end-to-end digital wealth management solution. The first leg of the stool is the interaction with end investors and advisors, which offers recommendations and ongoing digital services in a goal-oriented fashion.
The second leg of the stool is Fidelity-offered brokerage services, such as account opening, money movement, and transfer of assets. The core brokerage capabilities, trading, and custody services are offered by Fidelity.
The third leg of the stool is Geode Capital Management offering core investment management services.
Combining this all together, AMP looks at investors’ holistic risk profiles and offers recommendations towards achieving their goals, such as buying a home in three years.
Once that happens, the brokerage services kick in. In a matter of minutes, the investor can open and fund that account. Then, the investment management services kick in, where Geode starts managing that portfolio. From there, the investor continues to interact with the portal and gets access to all the brokerage capabilities and sophisticated investment management capabilities.
This end-to-end solution has received tremendous traction in the last year or so since we launched. We’ve had big banks, broker dealers, and RIAs go live, and that’s where we are right now.
Arpan Dasgupta: Why is robo-advisory important for Fidelity? Is the industry heading towards automated digital advice?
Vinod Raman: Absolutely. The industry is clearly heading there. There’s been a proliferation of robo-advisory solutions that are gaining market share. Providers range from startups to incumbents.
Aside from industry dynamics, there are three important reasons, from my perspective, that we decided to offer robo-advisory services.
One is access to a new customer base. A lot of our clients – such as RIAs, banks, and broker dealers – don’t have the time or ability to manage clients below a certain asset base, like $500K-$1M. Robo-advisory is opening a whole new segment of customers that these firms can begin to target with different strategies. An RIA might be interested in serving the family of a high-net-worth individual that, traditionally, they couldn’t have served profitably. A bank might be interested in cross-selling digital capabilities along with the lending and savings accounts they currently offer. A broker dealer might want to bring someone who’s under half a million onto this digital platform, serve them in an efficient manner, and eventually transition them into a full-service model. So, it opens your world to a whole new customer base or asset base. That’s one reason.
Secondly, investors are asking for more digital technologies that are delivered in a simple, intuitive, and digital fashion. We also want to out-serve other financial services competitors that are providing a full suite of digital capabilities and services online to financial institutions and their end investors.
The third important reason is democratizing financial education. Especially among investors with a lower asset base, financial education is still in its nascent stages. It’s not easy for a layman to understand sophisticated financial products, where they should invest, and how they should invest. Robo-advisory solutions allow you to leverage data, provide simple and intuitive digital experiences, offer financial products that people have not had access to, and help with financial education.
It’s the combination of those three forces that motivated Fidelity to offer our own digital advice solution.
Arpan Dasgupta: Since the AMP platform is targeted to end investors, some of whom may not be financially savvy, how well is the platform adapted to be relevant and targeted with recommendations for individual investors?
Vinod Raman: We’ve done a lot already with Fidelity AMP, which has set it apart in the market. It’s the nation’s first planning-oriented, goals-based digital advice solution. As part of the workflow, we’ve been able to not only collect data but also intelligently analyze it and provide the right recommendations. Additionally, eMoney allows the advisor to collect some information from the investor and other aggregators. The advisor can then run analyses such as: Does this investor have debt somewhere? Has this investor aggressively invested in another portfolio? How is this investor doing with respect to retirement?
The advisor can then come up with additional tailored recommendations for the investor.
It starts with recommending a portfolio, but it doesn’t stop there. It allows the advisor to collaborate with the end investor to intelligently source information from a host of data sources, such as banks, other financial institutions, and brokerage firms. Then, it determines the holistic financial profile of that investor across debt, savings, investments, and their overall risk profile. It goes as far as saying, “Is the investor aggressively invested in retirement versus not?” It’s a heavy, data-driven exercise with all the necessary regulatory guidance taken into consideration.
Arpan Dasgupta: All of this must have substantial investments around data, analytics, technology, and people. What did it take to bring it all together? What are some fundamental choices that you made that caused this product to be a success?
Vinod Raman: It has been a multi-year journey since we first developed the strategy.
Along the way, we have made many transformations in our business and operating model as we worked closely with internal and external partners.
A major ongoing transformation has been to move towards agile development (on the tech side). Fidelity acquired eMoney – a financial planning software – and the underlying AMP technology combined Fidelity and eMoney’s software. However, eMoney is more of an agile shop, while Fidelity is more of a waterfall shop. We made key organizational and structural shifts to re-orient everybody to an agile mindset and collaborated closely with eMoney to accommodate such a large initiative as AMP.
The second big structural change was creating a partnership ecosystem after the strategic evaluation of build vs. buy. Apart from co-developing with eMoney, which was an excellent choice, we also partnered with a host of other providers. We partnered where it made sense, especially with respect to sourcing data, analyzing data, and digital experiences.
We also made a big shift culturally. We emphasized the spirit of pace over perfection, empowered our teams to quickly make decisions, and accelerated development.
We partnered closely in parallel with multiple internal groups such as finance and legal compliance.
Arpan Dasgupta: How do you see AMP evolving over the next couple of years?
Vinod Raman: We launched AMP in 2018 with different firms such as Fifth Third Bank, First Tennessee Bank, HD Vest Financial Services, and a couple of large RIAs. We have over 200 firms in the pipeline.
Going forward, our focus is three-pronged. One area of focus is to offer additional portfolio flexibility/configuration options to our clients by enhancing our platform capabilities.
The second area of focus is retention. How can we continuously learn about the investor, personalize our services, and retain the investor? We’ll continue to help our institutional clients do this more effectively and use data and technology to do that.
The third focus is around scale. Our pipeline continues to grow. Next year, our focus is to quickly get more clients on the platform.
Arpan Dasgupta: With recommendations coming from AI and machine learning, there is always the question of adoption. Have advisors or end investors questioned whether the recommendation is right for them or whether they should believe it?
Vinod Raman: Yes. Especially in the advisor world, we’ve faced challenges around the advisors saying, “If the tool is intelligently taking all this information and providing advice in an unbiased fashion, is it replacing me?” So, that’s a big adoption barrier that we must break. Then, if you look at the end investor side, the tool is taking all this information and providing them the recommendation. The question they ask is, “How do I know this is right for me, and am I educated enough to know that this looks right based on what I think my financial profile is?”
We see these kinds of technology as augmenting the work that advisors do. The way we are approaching that within my team – and broadly at Fidelity as well – is that we are very use-case-driven. The big business use case has been acquisition. How do you acquire more investors? How do you get the right investor on the platform? Then, once you get them on the platform, how do you not only cross-sell, but how do you retain them? That’s because we’ve seen investors come onto the platform and open accounts but hesitate about funding them. Or after they fund the accounts, they start thinking about potentially closing the account and leaving the platform, especially since there was some volatility this year.
We prioritize these business use cases internally and marry them with how data and AI can drive incremental impact. We ask: Is AI the right way to solve this? Does AI have everything it needs to solve this problem? If it does, then we say that’s the number one priority for us next year.
When it comes to adoption challenges, we are continuing to work on education both on the advisor side as well as on the investor side. One of the other big areas of focus this year has been practice management for advisors driven by three or four digital strategy consultants. The sole focus of these practice management consultants is to work with our clients, advisors, broker dealers, and banks to inform them about how the digital world is evolving, how wealth management itself is evolving, and help clients make that transition.
From the investor standpoint, our focus continues to be education. What else can we provide in their ongoing experience that really helps them understand what services we are offering and what products we are selling to improve the adoption metric on the investor side?
Arpan Dasgupta: Who do you see as competition in this space other than the startups? Do you see other institutional wealth managers also providing this service?
Vinod Raman: The one thing I keep telling my teams is, “Don’t think of traditional financial services firms as your competitor. Think of internet firms such as Amazon or Google as your competitor.” Because that’s what’s happening here in this space. Investors are not comparing the Fidelity experience with the experience they might get at other financial institutions. They want to compare that experience with what they get on Amazon. In my mind, it’s the advanced technology companies that are our competitors in the financial services world, more so from an experience, product delivery, and usage standpoint.
From a product innovation standpoint, we’ve done a lot with offering sophisticated financial products through the digital advice solution. But with everything becoming digital, the whole internet ecommerce space is competition. Investors are asking for more digital capabilities, and they want to do everything seamlessly at the click of a button.
This space is certainly up-and-coming and exciting. There’s a lot of innovation going on. What’s really exciting about the space is the true intersection of financial services and technology, powered by data and artificial intelligence. A lot of evolution is going to happen within the next 3-5 years, which is exciting for all of us.
Autonomous and intelligent systems (A/IS) have witnessed unprecedented progress over the last few decades, in an age dominated by artificial intelligence, big data, and machine learning. They touch almost every sphere of our daily lives via automation, process optimization, and efficient resource usage, enabling informed planning and efficient decision-making. Notwithstanding the positives, the prevalence of such technology also raises serious concerns about its effect on us as individuals and on society as a whole. Growing apprehension has been voiced from various forums and panels, by academics and business leaders alike, about the potential harm to privacy, loss of skills, adverse socio-economic impacts, discrimination, and lack of trustworthiness.
Ethically Aligned Design (EAD) – First Edition, published under the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems (The IEEE Global Initiative), is an attempt to go beyond the generic “pros and cons” evaluation of such systems. It is a confluence of thoughts from a large community of global experts across different fields and industries, distilled into a set of high-level ethical principles, identification of critical challenges, and actionable recommendations. The underlying belief is that the true potential of these technologies can only be realized if they are in sync with society’s values and ethics, in terms of fairness, sustainability, freedom, democracy, and trustworthiness. It is a combination of what our community can hope to achieve and what every individual or group involved with or affected by these technologies can do to progress in the right direction.
EAD was developed through an open, collaborative, and consensus-building initiative, among industry leaders, World Economic Forum executives, academics, lawyers, data scientists, and policy-makers, with a common aim to develop focused guidance for standards, certification, regulation or legislation in development and application of A/IS in alignment with societal well-being. It offered a platform for idea exchange via extensive research, scientific analysis, high-level principles, and actionable recommendations, enabling refinement in thoughts on rapidly changing technologies in an efficient, constructive environment.
Chandramauli has been a part of two specific committees:
- A/IS for Sustainable Development: The value of A/IS is closely associated with the generation of insights that could drive a positive socio-economic impact for both high- and low-income societies, in line with the Sustainable Development Goals of the United Nations Agenda for Sustainable Development (2015). The committee’s ethical objective is that A/IS must be utilized for the benefit of humanity, with accountability and sustainability.
- Law: The innovation and impact of A/IS in our everyday lives is relatively new and complex. Moreover, the interplay between the law and such technology can vary based on the legal and societal structures. While challenges do exist, in terms of uncertainty and associated risks, it is crucial to recognize that law can play an extremely critical role in ensuring A/IS are aligned with principles of ethics and well-being. The key objective is, therefore, to identify and define legal guidelines that can help direct this process, leading to societal development, fairness, and equality.
The work has so far inspired the creation of fourteen IEEE standardization projects, A/IS ethics courses, a certification program, and multiple action-based programs now in development. EAD has also motivated collaborations on A/IS governance and policy design with the United Nations, the European Commission, and other governmental and civil-society organizations. The “Draft Ethics Guidelines for Trustworthy AI” of the European Commission’s High-Level Expert Group on AI mentions EAD as a reference. Beyond the policy-making arena, the vast body of work and knowledge base of The IEEE Global Initiative has also influenced the development of industry resources such as IBM Watson.
Download the full report
The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems, First Edition. IEEE, 2019.
The “FashionAI Global Challenge 2018 – Attributes Recognition of Apparel” was conducted to push the ability of AI to help the fashion industry recognize the attributes of clothing from a given image. This capability could be widely applied in applications such as apparel image search, navigation tagging, mix-and-match recommendations, etc. The competition was hosted on Alibaba’s cloud competition site, Tianchi. The dataset released for the competition is the largest available in the domain of apparel attribute recognition. We finished the competition in 30th position out of 2,950 contestants worldwide. In the sections that follow, we define the dataset, our approach, other experiments, results, conclusions, further ideas, and references.
Apparel attributes are the basic knowledge of the fashion field, and they are large and complex. The competition provided us with a hierarchical attributes tree as a structured classification target to describe the cognitive process of apparel, which is shown below. The “subject” refers to an apparel. Our focus for the competition was in the characteristics of the apparel.
The data provided has eight categories, each representing a clothing type. Each category is further broken down into labels describing it in terms of design or length. If the design or length is not clearly visible in the image, an “invisible” label is assigned. The tables below show the categories, the labels within them, and the total number of images for each label:
[Table: labels and image counts per category, e.g., skirt length labels]
CHALLENGES IN THE DATA
- In some images, the way the person is posing might obscure the design or length of the clothes. For example, if the person is sitting, a floor-length skirt might look like an ankle-length skirt
- The background in the image also added noise. In some images it merged with the dress color, making it difficult for the model to distinguish the dress boundary from the background
- In some cases, the model couldn’t differentiate between clothes of similar length (for example, knee- vs. midi-length skirts)
The data was given to us in separate folders for each category. This eliminated the need to first predict the category and then the labels within it; we only needed to predict the labels of each category. The test dataset follows the same structure.
Convolutional neural networks (CNNs) are the standard technique for image classification problems. We used this technique, and our approach can be divided into four parts:
A. Preprocessing the images and data augmentation
B. Choosing network architecture of CNN
C. Optimizing the parameters of the network
D. Test time augmentation
A. PREPROCESSING THE IMAGES
We normalized the pixel values of the images (0-255) by subtracting the mean, to suit the network architecture used. We applied transformations such as zooming (1-1.1x), adjusting the image contrast (randomly between 0-0.05), rotation (randomly between 0-10 degrees), and flipping the images. This helped make the model more invariant to orientation and illumination. In every epoch, a random transformation was chosen, so that each epoch shows a different version of the same image, preventing the network from overfitting to a fixed set of images.
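As a rough sketch (not the actual competition code), the per-epoch sampling of transformation parameters might look like the following, with the ranges taken from the text:

```python
import random

def sample_augmentation(rng=random):
    """Draw one random transform per epoch, within the ranges used in
    training (hypothetical helper; only the parameter ranges come from
    the text above)."""
    return {
        "zoom": rng.uniform(1.0, 1.1),           # 1-1.1x zoom
        "contrast": rng.uniform(0.0, 0.05),      # contrast jitter
        "rotation_deg": rng.uniform(0.0, 10.0),  # 0-10 degree rotation
        "hflip": rng.random() < 0.5,             # random horizontal flip
    }

params = sample_augmentation()
```

In a real pipeline these parameters would be fed to an image-transform library each time an image is loaded, so the network never sees exactly the same pixels twice.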
B. CHOOSING NETWORK ARCHITECTURE OF CNN
We used transfer learning to solve this problem. Transfer learning means using a model trained for another task to assist in solving the problem at hand. This helps create the initial base features and avoids training the model from scratch when data and computational resources are limited. We took a network trained on ImageNet data as a starting point. ImageNet is a large database of images, and every year many researchers try to improve upon the accuracy of object classification on ImageNet by submitting to the Large Scale Visual Recognition Challenge (ILSVRC), which has 1,000 categories to predict. To suit the problem at hand, the final output layer after the fully connected (FC) layers in the architecture was replaced with one sized to the number of labels of the given category.
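Schematically, the head replacement looks like the following numpy sketch. The random "backbone" matrix and the feature size of 512 are illustrative stand-ins for the pretrained CNN, not the real architecture; the point is that the 1,000-way ImageNet output is discarded and a new head sized to the category's labels is attached:

```python
import numpy as np

rng = np.random.default_rng(0)
FEAT_DIM = 512  # backbone feature size (schematic)
N_LABELS = 6    # e.g., the number of skirt-length labels

# Stand-in for the frozen ImageNet backbone: in practice this is a deep
# CNN with pretrained weights; a fixed random projection suffices here
# to show the shapes involved.
W_backbone = rng.standard_normal((FEAT_DIM, 3 * 224 * 224)) * 0.01

# The original 512 -> 1000 ImageNet output layer is discarded and
# replaced by a head sized to the category's label count.
W_head = rng.standard_normal((N_LABELS, FEAT_DIM)) * 0.01
b_head = np.zeros(N_LABELS)

def forward(img_flat):
    feats = np.maximum(W_backbone @ img_flat, 0.0)  # frozen features
    return W_head @ feats + b_head                   # trainable head

logits = forward(np.zeros(3 * 224 * 224))
```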
We experimented with different types of residual networks: ResNet, ResNext, and the current state-of-the-art architectures NasNet and SeNet. In our experiments, ResNext gave better results than the other architectures when considering both accuracy and computational time.
C. OPTIMIZING THE PARAMETERS OF THE NETWORK
FINDING THE LEARNING RATE:
Choosing a starting value of the learning rate is highly important to ensure convergence of the network parameters to the optimal value.
Leslie Smith’s (a researcher in the field of deep learning) recent work, “Cyclical Learning Rates for Training Neural Networks”, describes a method for choosing an initial learning rate for a given problem. In summary, the idea is to start with a very small learning rate and gradually increase it in powers of 2 or 10 for every iteration in the epoch. Initially, when the learning rate is too small, the error decreases at a very slow rate. If you keep increasing it, at some point the learning rate becomes so high that the error skips past the minimum and starts shooting upwards. This indicates that no learning rate beyond this point should be chosen, as it has become too high for the parameters to converge.
The image below shows the learning rate finder for the ResNext-101 architecture when trained on one category of clothing. The loss decreases sharply between 10^-4 and 10^-3, starts increasing from 10^-2, and increases drastically at 10^-1. Ideally, we should choose a learning rate between 10^-4 and 10^-3. We chose 10^-4 to accommodate another technique of adjusting the learning rate.
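The exponential sweep behind the learning rate finder can be sketched as follows (an illustrative schedule, not the exact fastai implementation; the loss would be recorded at each step and plotted against these values):

```python
def lr_finder_schedule(start_lr=1e-7, end_lr=1.0, n_steps=100):
    """Exponentially increase the learning rate from start_lr to end_lr
    over n_steps mini-batches; training loss is logged at each step and
    plotted against the learning rate to find the usable range."""
    mult = (end_lr / start_lr) ** (1.0 / (n_steps - 1))
    return [start_lr * mult ** i for i in range(n_steps)]

lrs = lr_finder_schedule()
```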
CYCLIC LEARNING RATE:
Leslie Smith’s work on “Cyclical Learning Rates for Training Neural Networks” also points out that instead of keeping a constant learning rate across the epochs, the learning rate can be made cyclical. In summary, the number of epochs can equal the number of cycles, and in each cycle the learning rate resets to the original learning rate (the one chosen from the learning rate finder above). Within a cycle, the learning rate decreases gradually for each batch in a cosine fashion. This process helps the network escape narrow regions (local minima) of the error surface in favor of wider regions.
The plots below show the cyclic learning rate. After running a few cycles, we can increase the length of the cycles so that the learning rate decreases more gradually, helping the weights converge. With a cycle-length multiplier of two, each cycle runs for twice as many epochs as the previous one. A total of seven epochs were run in the case shown in the image: three cycles, with the cycle length multiplied by two after each cycle, so the first cycle ran for one epoch, the second for two epochs, and the last for four epochs.
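The schedule described above (cosine decay within each cycle, restart at each cycle boundary, cycle lengths 1, 2, 4 epochs) can be sketched as follows; the 100 steps per epoch is an assumed batch count for illustration:

```python
import math

def sgdr_schedule(max_lr, cycle_epochs=(1, 2, 4), steps_per_epoch=100):
    """Cosine-annealed learning rate that restarts to max_lr at the
    start of each cycle, with cycle lengths doubling (1 + 2 + 4 = 7
    epochs total), mirroring the schedule described above."""
    lrs = []
    for n_epochs in cycle_epochs:
        total = n_epochs * steps_per_epoch
        for t in range(total):
            # cosine decay from max_lr down towards 0 within the cycle
            lrs.append(max_lr * 0.5 * (1 + math.cos(math.pi * t / total)))
    return lrs

schedule = sgdr_schedule(1e-4)
```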
LOSS VERSUS THE ITERATIONS:
We can see that the error surface is not smooth, and the concept of the cyclic learning rate can help us jump past the narrow regions of the error surface. We increased the number of cycles and saw that the loss remained constant, indicating that the solution does not lie in a narrow region of the error surface.
UNFREEZING THE LAYERS AND DIFFERENTIAL LEARNING RATES:
As we are using a pretrained architecture and performing transfer learning, not all layers require additional training. Since these architectures are state of the art on ImageNet, they are already good at identifying low-level abstract features like boundaries and edges, which are captured in the first few layers of the architecture. Hence, those layers don’t require much retraining.
We chose different learning rates for different parts of the network, and the layers are grouped into three parts. The first part corresponds to the initial set of layers, the second part corresponds to the layers in the middle, and the third part corresponds to the last set of layers plus FC layers (Fully Connected layers).
Two steps that are used to train the network are listed below:
- Initially, the network is frozen for all layers except the last fully connected layers. By frozen, we mean those layers are not trained; we simply compute activations up to the layers before the FC layers and tune only the weights between the fully connected layers (two layers of size 512) and the output layer (whose size depends on the category being predicted).
- Next, the network is unfrozen, i.e., all the layers are made trainable. Now, the learning rates for the three groups of layers are set by the rule of thumb [lr/100, lr/10, lr] for ImageNet-like images, but in our case [lr/100, lr, lr] proved to work well: the information captured in the middle layers is of equal importance to the information captured in the layers near the FC layers. (Here, “lr” refers to the learning rate.)
This concept of using different learning rates across different layer groups is termed differential learning rates.
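Building the per-group learning rates is straightforward; the sketch below uses placeholder lists in place of real parameter tensors, and the returned dictionaries mirror the parameter-group format optimizers typically accept:

```python
def differential_lr_groups(layer_groups, lr, scales=(0.01, 1.0, 1.0)):
    """Assign a different learning rate to each of the three layer
    groups. The usual rule of thumb for ImageNet-like data would be
    scales of (0.01, 0.1, 1.0); (0.01, 1.0, 1.0) corresponds to the
    [lr/100, lr, lr] that worked better on this dataset.
    (Schematic: layer_groups stand in for lists of parameters.)"""
    return [{"params": g, "lr": lr * s} for g, s in zip(layer_groups, scales)]

groups = differential_lr_groups([["early"], ["middle"], ["late+FC"]], 1e-4)
```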
INCREASING INPUT SIZE OF IMAGE GRADUALLY:
The images in the data are mostly 512 pixels, and we resized them to 224 (since most ImageNet images are of this size) for the initial tuning of the weights. We then resized the images to 299 and ran the same number of epochs, using the final weights from the 224 stage as initial weights. Finally, we resized all the images to 512 pixels and used the weights generated at 299 pixels as initial weights.
The advantages were twofold:
- We gain computational time, since larger images increase the time it takes to tune the weights of the network. By providing the weights obtained from smaller images of the same problem as a starting point, we give the network near-optimal initial weights, and it converges in less time.
- We gain accuracy from multi-scale training. Because we provide the data at different sizes, classes that are far apart (e.g., sleeveless vs. wrist-length sleeves) are already handled at the smaller sizes, while nearby classes are classified more accurately with higher-resolution input images.
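The progressive-resizing loop described above can be sketched generically; `train_fn` here is a hypothetical stand-in for a full training run at one resolution, returning the weights that seed the next stage:

```python
def progressive_resize(train_fn, sizes=(224, 299, 512), weights=None):
    """Train at increasing image sizes, seeding each stage with the
    weights produced by the previous one. train_fn(size, weights) ->
    weights is a hypothetical stand-in for a training run."""
    for size in sizes:
        weights = train_fn(size, weights)
    return weights

# A dummy train_fn that just records the stages it was called with:
stages = progressive_resize(lambda s, w: (w or []) + [s])
```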
D. TEST TIME AUGMENTATION
During prediction, we applied the same transformation parameters used during training and generated eight images. We chose four of them at random and predicted on this set of images as well as on the original image. We then averaged the prediction probabilities, which increased the accuracy of the predictions. We believe one possible reason is that some center cropping can happen while resizing the image, resulting in a loss of information from the sides. With the transformations, that information is captured in one or more of the images, and averaging the probabilities increases the accuracy of the model.
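The averaging step can be sketched as follows; `predict_fn` and `augment_fn` are hypothetical stand-ins for the trained model and the training-time transforms:

```python
import numpy as np

def tta_predict(predict_fn, image, augment_fn, n_aug=4):
    """Average class probabilities over the original image plus n_aug
    randomly augmented copies (test-time augmentation sketch)."""
    preds = [predict_fn(image)]
    for _ in range(n_aug):
        preds.append(predict_fn(augment_fn(image)))
    return np.mean(preds, axis=0)

# Tiny demo: an identity "augmentation" and a fixed 3-class predictor.
probs = tta_predict(lambda img: np.array([0.2, 0.5, 0.3]),
                    image=None, augment_fn=lambda img: img)
```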
As discussed earlier, a gradual increase in image size brought accuracy improvements, but most of the original images are 512 pixels; hence, we applied super resolution (a deep-learning-based method for resizing images to a higher resolution). We resized the images to 1024 pixels and performed similar experiments, but we did not get accuracy improvements, and computational time grew sharply beyond 720 pixels.
During the semi-finals of the competition, we were provided with a dataset containing images of apparel simply hanging on a hanger or on a wall (i.e., not worn by humans). We used a similar approach but used YOLO to separate images of hangers and humans and built separate models. However, the combined model of human and hanger images always gave better results than the separate models.
The results for all categories follow similar trends across the experiments. Hence, we present the results for one category: skirt length (the same category for which we provided example images).
COMPARISON OF RESULTS ACROSS ARCHITECTURES:
We started with the ResNet architecture and moved on to ResNext-50 and ResNext-101. ResNext-101 outperformed all the other architectures, as shown in the results below. Notation: epoch: the number indicates the cycle index, starting at zero; trn_loss: log loss on training data; val_loss: log loss on validation data; accuracy: classification accuracy on validation data.
It can also be observed that TTA always provided an improvement in prediction accuracy compared to the last epoch’s accuracy.
SEQUENTIALLY INCREASING THE SIZE OF INPUT IMAGE:
Choosing the best performing architecture, ResNext-101, we sequentially increased the size of the input image. Accuracy increased from 85.39% at 224 pixels to 88.25% at 512 pixels.
CONFUSION MATRIX (FOR THE SAME CATEGORY-SKIRT LENGTH):
There is no confusion between short length and floor length (i.e., classes that we as humans can also distinguish very accurately). But the model struggles to correctly classify the nearby classes. We tried other approaches, such as modeling the nearby categories separately and making changes to the loss function, but none of them solved the problem of confusion between nearby classes.
CONCLUSIONS & FURTHER IDEAS
The way the learning rate is chosen is very important for neural network convergence, and the way the network is optimized is another important step in the model-building process. However, we feel the current state-of-the-art architectures throw away a lot of information by the time it reaches the FC layers. Taking all the activations would make the parameter space exponentially bigger, causing overfitting and increasing the time complexity of tuning the network. Current methods take the average value of each channel before the FC layers; by doing so, we lose the detailed information captured up to that layer. We propose the following ideas to improve on this:
- We tried XGBoost at the end of the competition, taking all activation values from all the filters in the layer before the FC layers (the final conv layer). We observed that we can get better results compared to just taking the average values of those filters. Due to the time constraints of the competition, we haven’t been able to complete the experiment, and we will publish those results soon. In summary, using XGBoost on the activations obtained from the filters just before the FC layers could help boost the accuracy on nearby classes by capturing some of the detailed information.
- The approach of bagging could also help, where we selectively expose all the activations of a few important filters (based on their weight importance), repeat this multiple times, and take an average. This might help us capture the detailed information from some important filters.
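The dimensionality argument behind these ideas can be shown in a tiny numeric sketch: global average pooling keeps one number per channel, while the full activation map that a model such as XGBoost could consume is much larger (the values here are arbitrary):

```python
import numpy as np

# One feature map of shape (channels, height, width); the values are
# arbitrary and only illustrate the dimensionality argument.
fmap = np.arange(2 * 4 * 4, dtype=float).reshape(2, 4, 4)

gap = fmap.mean(axis=(1, 2))   # global average pooling: 2 numbers survive
flat = fmap.reshape(-1)        # full activations: 32 numbers available
                               # to a downstream model such as XGBoost
```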
Mathematics and Scientific Computing graduate from IIT Kanpur. Fascinated by the mathematics behind machine learning. Interested in applications of deep learning, especially in the field of computer vision.
Senior Data Scientist
Material Science graduate from IIT Roorkee. Enjoys problem solving and has worked on creating data driven solutions for various domains like Insurance, Supply Chain, CPG.
Material Science graduate from NIT Warangal. Passionate about solving problems that create a meaningful impact to the society/business using machine learning and deep learning.
- ResNet. Deep Residual Learning for Image Recognition. https://arxiv.org/pdf/1512.03385.pdf
- ResNext. Aggregated Residual Transformations for Deep Neural Networks. https://arxiv.org/pdf/1611.05431.pdf
- Cyclical Learning Rates for Training Neural Networks. https://arxiv.org/pdf/1506.01186.pdf
- Lectures by fast.ai. http://www.fast.ai/
We would like to thank Jeremy Howard and Rachel Thomas for generously helping everyone learn state-of-the-art deep learning techniques through their lectures at fast.ai. Many of our ideas were inspired by those lectures.
As per an industry report, for every $1B in revenue, working capital optimization can result in a $20-60M annual benefit1
Working capital optimization is a sweet spot for CFOs, as it determines the financial health and operational success of the business. Organizations today are looking for end-to-end solutions around all the levers impacting working capital and are talking about overall processes rather than just payables, receivables, and inventory. The order to cash (OTC) process has replaced accounts receivable. Companies are looking for solutions that move beyond cost savings to a more strategic impact, and an integrated system is the answer.
Working capital is one of the most important components of the financial operation of a company. Organizations need to explore more efficient ways to manage the levers of working capital, i.e., accounts receivable, accounts payable, and inventory. To manage these levers, we need to look at the overall picture rather than just focusing on receivables and payables.
In this whitepaper, we focus on order to cash (OTC), as it is an important strategic component that can be used not just to improve the organization’s profitability, but also to better manage working capital and customer service for future growth.
With growing business and customer expectations, organizations must go into expert mode and transform their order to cash process using the advanced technology and analytical solutions available.
This paper discusses a step-wise solution that can help organizations transform their accounts receivable process into a smart order to cash process. A detailed analysis of the various subprocesses within the OTC process, and the possible technological and AIML (artificial intelligence and machine learning) interventions, forms the base of the solution. Artificial intelligence (AI) engines can be built to scan through invoice details, customer e-mails, contracts, and other unstructured data, significantly reducing manual effort while improving execution time for collections, deductions, dispute resolutions, short payments, cash posting, etc. Many organizations have explored similar solutions, though only in parts. An end-to-end approach, starting from order processing through to cash application and performance reporting, is the answer.
Reducing the cash conversion cycle can improve EBITDA margins and significantly increase profitability, by as much as 20% in some cases2.
Challenges in OTC
Today, across industries, we see the cash-to-cash cycle, from procurement to receipt of payments from customers, exceeding six months, which means companies have already paid for raw materials and services but haven’t yet sold the product.
To address this issue around working capital, we need to understand each component of working capital, how they interact with each other, and how advanced analytics can be used to enhance the output of each of these levers for an optimal working capital.
Most organizations have accounts receivable, accounts payable, and inventory working in silos. Within these processes there are many system dependencies, and not much importance is given to the interactions between the subprocesses. Figure 1 below explains the dynamics of AP, AR, and inventory in the overall working capital.
Figure 1 – Working capital optimization and roles of the key levers
An analysis by PwC in 2016 suggests that ~1.1 trillion euros are locked up in working capital, and this figure is increasing3.
We’ll discuss in detail how accounts receivable is moving towards OTC and what challenges we see across the value chain.
The biggest challenge that we see across the order to cash process is customer experience and a lack of visibility across the entire OTC process. OTC is one of the most important processes in the entire business cycle, as without this companies won’t make any money. However, system dependencies and the business silos of various departments involved make it highly manual and challenging. For most companies, a huge amount of transaction data available is not being used for any intelligence. Almost one-fourth of the effort goes into collecting payments from on-time payers, and an equal amount of time is spent in resolving disputes that could have been easily avoided.
Let’s have a look at the system dependencies:
Figure 2 – OTC interactions among the various departments
According to ordertocash.net, ~30% of accounting department costs go towards managing the OTC process. If not managed properly, OTC can cost 6-25% of revenue4.
The other pitfalls of the process include inconsistent data and documents, duplicates, errors, reactive measures, unclear deduction rules, and inefficient customer contacts.
One of the key problem areas is collections, where most often the account/customer prioritization for customer contacts, calls, dispute tickets, and resolutions are performed manually and, as a result, are suboptimal and expensive. All of these can be automated through RPA and cognitive automation.
A reduction of one day in DSO for a $10B company reduces WC by around $30M5.
The OTC solution: going beyond efficiency
The solution to the above problems is to go beyond the efficiency of the various subprocesses under OTC to their effectiveness: delivering better cost and cash, and improving days sales outstanding (DSO) and customer satisfaction levels to build long-term relationships. DSO, average payment term (APT), and weighted average payment term (WAPT) are the most important metrics for measuring the health of OTC for any organization. An improved DSO reduces the overall collection cycle time and significantly improves working capital health.
The key to the overall solution is to connect business silos through upstream and downstream interlinkages and use advanced analytical techniques to drive focused customer contacts, proactive collection strategies, and a smart auto cash posting system. Also, rather than working on discrete portions of the process or exploring a complete technology overhaul, a more effective solution is to integrate all the systems to a single platform to form input links at various process stages.
We are suggesting a step-wise approach to gradually move from the current state to a more cutting-edge state as an industry leader in OTC practices.
Below is a snapshot of multiple subprocesses and dependencies on various other departments:
- Order processing: The OTC process starts with sales order processing with high volume and complex orders. The order processing demands instant access to delivery docs and related customer and order details.
- Billing: Billing starts after product/service delivery to customers. It has its own issues, which arise from errors, information breakage from the delivery point, and a lack of customer interaction information.
- Terms: Payment terms are not clearly defined and followed. Non-standard payment terms and variation across divisions, business segments, etc., make it difficult to follow and optimize.
- Collection: Collection is a process of contacting customers and getting the dues cleared. This requires solving a lot of customer queries, and the people involved should have smooth information flow from various departments involved. Typically, collections agents wait for invoices to go delinquent to start acting.
- Dispute resolutions: Disputes arise for multiple reasons, such as incorrect billing details, discounts, goods return, etc. Once an invoice falls into dispute, the cycle time to resolve it is long and delays the collection process. A systematic information flow from the order processing unit, and measures to reduce incorrections at each step, can help significantly reduce the number of disputed invoices.
- Cash application: This is the most manual process in the collection cycle. Once payment is received from customers and the banks share the corresponding remittance information, agents map this remittance data to invoice details.
- Analytics: Here, relevant information is shared with the leadership team, and insights are provided for proactive actions. Measuring the process performance and identifying the gaps for improvement is necessary to move towards a smarter and effective order to cash system.
It is important to note that the root cause of overdues and collection issues is outside of the OTC process. So, it’s essential to understand the interdependencies.
Let’s have a look at the various subprocesses and the issues organizations face within these subprocesses, as well as what KPIs to measure to analyze the health of these processes. This helps us identify various possible technology and AIML interventions across the OTC value chain.
Figure 3 – Approach model: opportunities analysis across accounts receivable process
As mentioned above, OTC starts with order processing and, if not handled properly, impacts all the downstream processes, affecting DSO, on-time collection, and reconciliation. Experience with various organizations across industries shows that sales data, the customer master, price promotion information, and point of delivery information are not integrated properly, which leads to billing/invoice errors and a long lag time to correct them. These errors also result in a high number of customer disputes and a negative impact on long-term relationships.
To transform accounts receivable to a smart order to cash process, we start by studying all the manual interactions and automatic data flow. This will help us identify the information leakages and process loopholes.
Developing a Risk Management Tool to understand the impact of the root causes and taking corrective actions is an effective way to start. Also, providing a web application for the delivery people and sales team that works in both online and offline mode to collect all customer interaction information is a must for this solution. Regular performance measurement and benchmarking of the implemented solution helps in identifying and filling in the gaps.
OTC – Solutions to start with for the OTC process transformation
OTC – Solutions to move to the advanced stage
The develop phase, once taken care of, will help streamline all the downstream processes, from collection to dispute resolution, and finally, to cash reconciliation.
This stage specifically focuses on smart collection strategies and a high hit rate of auto cash posting using AIML techniques.
Smart Collection: Key drivers of customer behavior and invoice level risk profiling.
Automated On-Invoice Cash Application Hit Rate: A combination of OCR, AI, and RPA for auto cash posting, where we have all invoice rules and deductions rules in one place.
OTC – Solutions to become an industry leader
This helps automate and transform the process into a smart OTC process with smooth data flow at each touchpoint and advanced visualization solutions for alerts and insights for proactive decision making. A dynamic management dashboard helps to measure the health of the process and take data-backed actions in any pain area. This includes:
• Monitoring collection performance
• Identifying the most frequently occurring issues
To be a world-class leader in the OTC process, organizations must leverage the technological solutions available. Having a completely integrated system from customer master to final cash application is the key to this solution. There shouldn’t be any lag in the information flow, and RPA can help by eliminating all the manual efforts needed in fixing the billing errors, resolving customer disputes, and looking for supporting data for cash application.
Benefits of the overall solution
The overall solution can help organizations better manage their working capital and significantly improve DSO at a much lower operations cost. An overall OTC solution can enable:
- Working capital optimization: OTC is the most significant lever of the working capital management system; if managed smoothly, it can help companies manage their working capital in the most efficient way.
- Reduced DSO: The overall solution significantly reduces the number of days-to-pay/days-sales-outstanding. With reduced manual interventions, targeted customer contact approaches, and quick access to information, a significant amount of payments can be collected on time, and the energy can be diverted towards risky invoices and customers.
- Improved service levels: Fewer disputes and better dispute resolution in the shortest possible time strengthens customer relationships. With customer behavior understanding and alerts on likely payment defaults, businesses can initiate strategic partner discussions in advance to reduce the risk of payment defaults.
- Improved cash flow: Reduced revenue leakage, lower write-offs, intelligent ways to connect with customers, and smart collection all lead to improving on-time payments and visibly reducing write-offs. AI can be used to identify and validate the short pays, deductions, and charge backs. With all information available at hand, companies can rework their payment terms and considerably lower the >90/120-day transactions.
- Visibility to cash position/working capital in real-time: A visibility into customer payment behavior and invoice risk profiling gives a clear look at future payment trends, and companies can manage their expenses and working capital requirement accordingly.
- Improved productivity and reduced operations cost: As order to cash is mostly a manual process and consumes the highest share of cost in the finance department, cognitive automation reduces the man-hours needed to complete any task. Also, with quick access to information, the lag/wait time is minimal, which improves the productivity of the people managing the OTC process. They can now focus more on analyzing and deep-diving into process performance and issues.
- Improved auto cash posting: With all the invoice details and deductions rules in one place, and pre-built algorithms to match short pays, auto cash applications can be improved up to >90%, which would significantly reduce the effort needed to reconcile the invoice details and remittance data.
- Alerts and role-based actionable insights using role-based visualization techniques: This helps in being proactive and taking necessary actions before the actual damage hits. The complexities around the subprocesses, customer disputes, short payments, lock boxes, payment terms, etc., can be very well understood and managed through a single reporting system with automated alerts and narration, customized reports, and real-time insights within seconds.
The call to action
For multiple client engagements, these solutions have delivered significant, positive business impact, reducing payment cycle times and improving overall process effectiveness. With everyone leveraging current technology trends and advancements in machine learning and artificial intelligence, this is the time for organizations to move away from a typically manual and inefficient order to cash cycle. Now is the time to create a system that breaks silos and allows the smooth movement of information across subprocesses. Technology is the enabler, and with millions of rows of data, the available information can readily be transformed into intelligence to create a smart OTC process.
Practice Lead – Financial Planning & Analytics
Shipra leads the FP&A practice at Fractal Analytics. She has over 12 years of diverse experience in building advanced analytical solutions around finance, pricing, forecasting, and working capital management. She has worked across industries including CPG, retail, energy, OEMs, pharma, and telecom.
An interview with Koley Corte, SVP and Global Head of Business Transformation at AllianceBernstein
How AllianceBernstein is using digital to create better customer experiences in asset management
Digital is helping asset managers deliver more value to the customer experience. Still, going digital can be a complex enterprise-wide effort that requires answers to some challenging questions. For example, where does digital fit in with other business priorities? How can leaders get backing for digital? How can companies leverage, and not duplicate, digital capabilities?
AllianceBernstein is answering these questions. They are strategically deploying digital as a key piece in their business ecosystem. It’s helping them better understand their customers and deliver the right experiences at the right time and place. Through their vision, initiatives, and partnerships, they already have a lot to offer, and they are just getting started.
Read the perspective of Koley Corte, SVP and Global Head of Business Transformation at AllianceBernstein, as she talks to Arpan Dasgupta, Head of Financial Services at Fractal Analytics.
Fractal: What does digital mean for AllianceBernstein when it comes to customer experiences?
Koley Corte: Our main goal is to improve the customer experience and deliver more value to more customers. That means making it easier for customers to engage with us. Digital and data work hand in hand to help us know the customer better, anticipate and deliver against their needs, and reach more of them in greater depth.
Fractal: In taking care of customer needs, are you focused on reducing the need for a human interface?
Koley Corte: No. Our customers should be able to get the information they want, where and when they want it. Whether that’s self-service from a digital platform, engaging with someone on the phone, or meeting with someone in person, we should provide the experience to deliver what they want and anticipate what they need—at the right time and place.
Fractal: What digital channels are you focused on?
Koley Corte: We’re focused on delivering customer value by understanding and anticipating their needs and engaging with them through omnichannel experiences including social, email, and potentially voice, delivered through different kinds of devices. We’re digitally engaging with customers in a way that is platform agnostic. We have a lot of valuable content out there across multiple channels: web, social, and email outreach. There’s an opportunity to increasingly curate that experience and have all the channels work in harmony around the customer and their needs.
Fractal: Where does digital fit into AllianceBernstein’s priorities?
Koley Corte: Digital and data are critical enablers for us to drive continued success with our clients and reach new customers. It works in tandem as part of an ecosystem around the customer, and not in exclusion. We have a clear vision and strategy, we have funded initiatives in place, and we are deploying digital and data-driven solutions across channels.
Fractal: What are some key decisions you’ve made to drive digital initiatives?
Koley Corte: We work alongside and with the organization. We partner outside the organization to bring in fintech skills and capabilities. We are working with firms like Fractal to bring in data knowledge, to accelerate our ability to deliver, and to figure out how to truly partner with an operating model that supports the business.
Fractal: Internally, have you brought together a team that’s focused on digital?
Koley Corte: Yes. My role is broader than digital; it’s business transformation. We have built out a business transformation team in the Americas and recently hired our first team member in Asia. We are in the midst of building out a global model. We’ve supplemented internal resources, so we aren’t trying to duplicate capabilities – we are bringing in strategic thinking, the ability to drive results forward, and the partnerships to execute. We work with the business with a goal of weaponizing the current model and operating at scale.
Fractal: In accelerating your vision of improved customer experiences, what challenges have you experienced?
Koley Corte: The first challenge is managing an environment of experimenting and learning as a mechanism, as opposed to building the perfect mousetrap. We need to deliver on our goals today along with those of the future. Getting people comfortable with playing with data, and giving partners access to data, is a different model and a test-and-learn approach.
Onboarding new partners, tools, and services has taken longer than I’d like. It’s new for us. There are growing pains in our approach.
The other learning for us has been helping the organization balance short-term and long-term priorities. That has sometimes meant learning new ways of working.
Fractal: In prioritizing a digital initiative, how do you get people to agree on what the organization should focus on?
Koley Corte: We formed business transformation committees regionally to create more transparency and alignment within the organization. We have built shared goals with the regional business heads and their teams, so it’s embedded in their goals to both deliver the business today and embrace delivering as a business tomorrow. I do a lot of championing in talking about it and building relationships. As we experiment and deliver highlights and proofs of concept, the more people see it, the more real it becomes, and we build momentum and a fear of missing out.
Fractal: In executing on those strategic decisions, what kinds of challenges have you faced?
Koley Corte: We make sure this remains a priority and gets resources and funding. We manage the data, so we see when things are working and when they are not, so we can be agile in our thinking and approach and can pivot along the way.
Fractal: Are there multiple partners involved here?
Koley Corte: We rely on a lot of internal business partners. We lead the governance around that. We work with partners to develop plans, and we hold people accountable to timelines. We first develop the concepts together with our customers in the business, and then we lay out realistic plans and put resources to deliver against those.
Fractal: How do you figure out what your customers need from digital experiences?
Koley Corte: We observe how customers engage with us, talk to them, and understand what they want. We have online tools to put things in front of customers or potential customers to understand their perspective. We look at the art of the possible: how have new capabilities evolved that we can leverage into understanding what our customers are trying to achieve?
Fractal: How are you leveraging learnings and capabilities across AllianceBernstein?
Koley Corte: We’ve formed an innovation group across the company. We meet regularly and share what we are doing to leverage any capabilities we can. We stay aware of what the other firms and strategic business units are focused on, and we share what we are learning. I spend a lot of time understanding what others are doing and making sure we aren’t reinventing, and instead, leveraging capabilities where we can.
Fractal: Are there any closing thoughts you’d like to share?
Koley Corte: This is a work in progress. It is following the customers’ needs and lead. It is testing, learning, and experimenting to drive more value. It is early days. As a firm, we have a lot to offer, and we need to experiment with ways to reach customers even more effectively, penetrating the market more deeply and broadly. There is a lot of opportunity to continue to improve and grow.
Senior Vice President and Global Head of Business Transformation, AllianceBernstein
A senior executive focused on leveraging data and digital to drive strategic change, Koley Corte is currently Senior Vice President and Global Head of Business Transformation at AllianceBernstein, where she develops transformative growth strategies for next-generation institutional and retail distribution. Her current areas of focus include artificial intelligence, sales enablement, demand generation, robotics and automation, predictive analytics, and new channel development.
Prior to joining AB, Koley was at Reed Elsevier (RELX) where she served as Senior Vice President and Head of Digital, Innovation and Customer Acquisition Strategies, Americas Region for Reed Exhibitions (RX). While at RX, she focused on driving attendee acquisition and engagement, developing digital programs, assets, capabilities and revenue and improving analytics and insights. Previously, Koley was Vice President, Head of Market & Competitive Strategy and Integrated Campaign Management at TIAA, where she led teams responsible for developing the environmental backdrop with insights and implications for revenue growth and diversification strategies, the rebrand and large-scale external communication initiatives, including complex product redesigns. Previously, Koley held progressive leadership roles at American International Group (AIG).
Koley received her MBA with distinction from the Leonard N. Stern School of Business at New York University, with a concentration in Finance, Management, and International Business, and her B.A. cum laude in Economics and Psychology from Brandeis University.
Head of Financial Services, Fractal Analytics
The people who are crazy enough to think they can change the world are the ones who do.
Apple “Think Different” Campaign 1997
Enterprise leaders often consider how and when they should be thinking about innovation in their analytics and digital transformation efforts. But, times have changed. To keep pace in a rapidly changing business and technology environment, innovation is no longer a choice. It’s an imperative.
The market is being reshaped by technology changes in the areas of artificial intelligence, blockchain, and computing. Business models are also changing, as new platforms, direct-to-consumer models, and mobile innovations continue to shake up the market. All this change creates great opportunities for those that make innovation their operating imperative. To win today, leaders must build innovation into the core of their efforts from day one. Creating a strong, integrated innovation engine is critical to analytics and digital transformation success.
As you build out your own winning strategy and operating model, focus in on three key elements to build your innovation engine:
1. “What Matters” Is Changing
At any point in time, understanding “what matters” and how it is changing is critical to your successful transformation. Some call it a landscape assessment, canvassing the industry, bringing the outside in, or articulating your business model; however you express it, it is about understanding “what matters” in your company, industry, and category and how it will evolve now and in the coming years. The only thing for sure is that the forces impacting “what matters” are changing right now, whether you know it and like it or not!
In his seminal book, “What Really Matters”, John Pepper (Former P&G Chairman of the Board, President and Chief Executive Officer) shares the importance of this approach with respect to winning brands. He outlines the “Brand-Building Success Factors” that are required to create leadership brands and keep them young year after year: “It’s only when all of these factors are present – not three or four, but all of them – that we sustain and improve our record of creating and building leadership brands.” He defines the five “what matters” factors and how to shape the innovation engine and forces around them:
Interestingly, the last factor, talent, was and is a passion that John never stops innovating and investing in. Even now, many years after retiring from P&G, his coaching is as relevant as ever.
Evolving success factors in enterprise analytics: From 2010 to today
Let’s look at analytics and how AI, robotic process automation, and other success factor forces will shape your innovation engine. In 2010, MIT published an interesting chart: “Analytics, the new path to value.”
In 2010, this laid out key insights on where to drive innovation in leading analytics programs like P&G’s. At P&G we built the Business Sphere environment to harness real-time global data to power business decision-making. We began testing more advanced analytics at the store/SKU level, created strategic partnerships with key players in these critical analytic domains, and drove significant impact across the businesses.
It’s very clear now that the “What Matters” forces are transforming in the analytics domain. The next evolution is taking shape, and it’s looking something like this:
As you drive your own transformation, build your innovation efforts around “what matters” and the forces (e.g., AI) that are shaping them in the coming years. Leading analytic and digitally transforming companies will leverage both internal talent and strategic partners to create competitive advantage now and for the future.
The greatest danger for most of us is not that our aim is too high and we miss it, but that it is too low and we reach it.
The decision isn’t about whether to embrace a new business model, but about when, how, and how fast to do it!
Filippo Passerini, Former P&G CIO & Global Shared Services President
2. Build an Integrated Innovation Strategy into Your Business Strategy
Like any good strategy, innovation also requires the rigor of making bespoke choices: where to play, how to win, capabilities needed, measuring success, management systems required, and your aspirational vision.
An integrated innovation strategy spans short- and long-term objectives by focusing your innovation in three areas (tightly linked with your business model):
1. Innovation to lead the industry
This is sometimes called “sustaining innovation.” It is focused on your current products or solutions with intent to stay focused on new consumer trends (“Who” focused), vertical or horizontal expansion to adjacencies, or geographic expansion. Continued innovation in the core is critical.
Think about how you will bring analytic innovation to core areas like communications, knowledge management, your existing CRM programs, and more. Sometimes innovation on the day-to-day core can become break-through for your company and employees.
2. Innovation to change the industry
This part of the innovation focus looks to shape the industry landscape in order to tilt it in your favor. It is focused on transformational innovation. It is focused on the “How” with an intent to create a competitive advantage.
Think about how the digital and analytic transformations will tilt external relationships in your favor. Analytics has become a “currency” between business partners. Are you using it to shape external business reviews, product launches, consumer interactions, and other key areas that matter to your consumers?
3. Innovation to create new industries
This area of innovation holds the highest risk and reward. It is focused on building new consumption where it did not exist before. It is focused on the “What” with the intent to disrupt the market. This is also the most difficult for large companies, as so many processes focus on driving the current business model.
These are your “moon shots,” “exponential,” or “black ops” projects. This is digital and analytic transformation set out to completely disrupt your current business, business processes, or reliance on an expensive status quo, or to transform an industry. If you’re not trying to do it, I guarantee you someone is trying to do it to you.
The reason it is so difficult for existing firms to capitalize on disruptive innovations is that the processes and business model that make them good at the existing business actually make them bad at competing for the disruption.
Your innovation plans should have a balance across the three areas and be fully integrated with your current and future business models. Too often I see siloed groups working on the “exponential projects” with no business understanding, credibility, or chance of succeeding. That’s simply a model for burning cash.
3. Unleash Your Hybrid Talent
In the end, innovation comes from talented individuals who make connections, discover insights, solve problems, take risks, and devise approaches that no one has before. Your innovation strategy and results will only be as good as the talent you have working on them.
Stack the odds in your favor and leverage a hybrid talent model to put the very best people in your organization and your strategic partners working on it together. Innovation is critical to growing your business and theirs. “You can’t cost cut yourself to growth” is valid now more than ever before.
Are you putting your top players, your “water walkers”, against your biggest innovation opportunities? James Lafferty, then P&G VP of Europe Family Care, laid out incredible observations on the difference between “swimmers” and “water walkers”. Here are some key points:
- When confronted with a severe business crisis, the water walker focuses all energy on how they will overcome the crisis and still deliver. The swimmer will often focus instead on doing a superb job of “selling” a lower base.
- Water walkers consistently focus on self-improvement and ask themselves, “How do I get better every day?” Swimmers focus more on self-promotion and ask themselves, “How do I sell myself and get myself positioned to get the promotion I deserve?”
- Water walkers make any assignment a great assignment. Swimmers think success or failure is based upon having a “good assignment”.
- Water walkers always approach a topic from the standpoint of “how crisp and clear can I make it?” Swimmers tend to measure success by how long, how many charts, and how many numbers they can put on one page. They erroneously equate quality with quantity.
- Water walkers recognize the power of developing people as the way to achieve their business goals. Swimmers tend to prefer to “go it alone” and consistently believe they alone must be the major driver.
- Finally, water walkers tend to be thinking, “How do I change the game?”, whilst swimmers work on, “How do I grow the business?”
When faced with building and growing innovation, I’ve often pulled out Jim’s memo ahead of staffing and partner decisions. Are you putting your ‘A’ Players on the innovation opportunities (in some cases double-hatting with their current business scope)? My experience is that your truly top talent wants more accountability, scope, and operating freedom. Mine always rose to the occasion!
With the complexity and speed of digital and analytic innovations, it is also critical to have the right strategic partners working closely with you. I call it the hybrid organization design. You should be looking for three characteristics:
1. Their CEOs and leadership are thought-leaders and “get it.”
Companies whose CEOs are good managers are nice to have. However, you’re looking for individuals and teams that are shaping the industry and “What Matters” for you.
2. They truly believe in and want strategic partnerships with their clients.
They hire and put top people working with you, bring innovation to you, and co-invest with you as they believe innovation is a win-win. If you win, they win. If you lose, they lose.
3. They are focused on “speed to value.”
It’s all about winning with your business, not theirs. Too often, suppliers (not partners) will sell you something, and whether you create value is your issue, not theirs. Strategic partners make value creation and speed to value their imperative.
There are great opportunities out there to power innovation through analytics and digital transformation. Those that seize them will see their organizations thrive in a fast-changing landscape and be ready for tomorrow.
So, remember, be crazy enough to think you can change the world:
With the right approach to innovation, you can change the game. Get going.
Fractal Analytics, Strategic Advisor
Andy Walter is a business results-driven professional with extensive experience in strategy, development, execution, and operations across Shared Services and IT. He led the Commercial Services & Delivery Organization (over 1500 IT and multifunctional professionals) for Procter & Gamble’s Global Business Services (GBS). He was responsible for IT & Shared Services for all Global Business Units and Markets around the world. His team was accountable for developing cutting-edge digital capabilities for Procter & Gamble to win “where it matters most,” with Consumers, Shoppers, and Retailers. This included all eBusiness, Consumer Services, BI/Analytics, Sales Force Solutions, Project Delivery, Business Process Services, and A&D / Company restructuring efforts.
Artificial Intelligence (AI), along with its related technologies, is set to impact all aspects of the insurance industry, from underwriting and claims to pricing. Advanced technologies and data are enabling insurers to make decisions quickly and make progress. We can already see this affecting distribution, with policies priced, purchased, and bound in real time.
The winners in AI-based insurance will be enterprises that use new technologies to create innovative products, streamline processes, and go beyond customer expectations for individualization and dynamic adaptation.
Even though all of this is known, it is often unclear how insurance companies can make AI deliver outcomes. Here is Sankar Narayanan (SN) sharing his insights on six emerging ideas for realizing value from AI in insurance.
Transform Underwriting: Text mining offers one of the most interesting solutions to improving underwriting processes and outcomes, and it has the potential to reduce the time to draft policies by 20 to 25 percent.
Improve the customer experience and deliver next-best actions: One of the highest opportunity areas to apply AI within Insurance, across most lines of businesses, is in driving customer engagement and experience.
Reduce customer friction and the cost to serve: There is considerable opportunity to reduce the friction customers face, especially on the digital assets (e.g., websites) of insurance companies, through a three-step approach: sensorizing the parts of the digital assets that cause breaks in the customer journey; applying sophisticated algorithms for anomaly detection and root-cause identification; and A/B testing at scale.
Application of AI across the insurance value chain: Insurers need to deploy a fail-fast approach and run multiple rapid Minimum Viable Proposition (‘MVP’) initiatives in parallel. In our recent experience, the most impactful emerging sources of internal data include Voice of Customer data (calls, social, etc.) and telematics. Our experience also suggests that only about 25-30% of MVPs result in eventual operationalization at the enterprise level, which makes a fail-fast mentality critical.
Beyond AI: A 2019 Gartner CIO survey indicates that most P&C and life insurers are still building out the foundation for analytics and facing legacy system challenges. With more sophisticated approaches emerging, the challenge of improving the transparency and accountability of AI methods is becoming important to manage. This requires a behavioral-sciences-based approach to empathize with, up-skill, and engage the people who need to act on the insights driven by AI.
Make measurement a priority: The philosophy towards measurement is critical to get right. It is important to internalize that the value of AI is not realized from the methods or number of models or statistical uplifts but from the execution of actions coming out of these models.
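As an illustration, the three-step friction-reduction idea above (sensorize journey steps, detect anomalous break points, then A/B test fixes) could be sketched as follows. The journey steps, exit rates, and outlier threshold are all hypothetical, invented for this sketch rather than taken from any real insurer's data:

```python
# Hypothetical sketch of the three-step friction-reduction approach:
# (1) "sensorize" journey steps by logging exit rates, (2) flag
# anomalous steps, (3) queue flagged steps for root-cause analysis
# and A/B testing. All figures below are illustrative.
from statistics import mean, stdev

# Step 1: sensorized exit rates per digital journey step (illustrative).
exit_rates = {
    "landing": 0.22, "plan_select": 0.25, "device_select": 0.24,
    "trade_in": 0.61, "checkout": 0.27, "payment": 0.23,
}

# Step 2: flag steps whose exit rate is an outlier (> 2 standard
# deviations above the mean across steps).
mu, sigma = mean(exit_rates.values()), stdev(exit_rates.values())
anomalies = [s for s, r in exit_rates.items() if (r - mu) / sigma > 2]

# Step 3: anomalous steps become candidates for root-cause
# identification and A/B testing of redesigned experiences.
print(anomalies)
```

A simple z-score rule like this is only a starting point; in practice the anomaly-detection layer would account for seasonality and segment mix.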
Read the in-depth story here
As business leaders look to capitalize on AI opportunities, they are asking how to predict future returns from AI projects. Where should investment be targeted, and what kinds of capabilities can enable better performance?
Here’s Sankar Narayanan (‘SN’) sharing his thoughts on how to measure the ROI of AI in ways that are not limited to financial returns. Drawing on his hands-on experience implementing AI, he also offers business leaders ways of thinking about success in AI projects.
Assessing the future value of AI
When we think about what AI can do for a business, our impressions mostly come from well-marketed examples in well-defined areas, such as the best chess player being beaten by artificial intelligence. But problems like chess have a definitive endpoint. Most problems in Fortune 500 companies do not. A typical problem statement is: will the next product we launch be successful? But what is the definition of success here? Similarly: how do we improve the customer experience in banking or similar sectors?
Most of these problems are vague and complex, and their outcomes can be defined in multiple ways. So the challenge lies in quantifying the return on an investment whose outcome is itself vague. To answer this question, the most important thing to ask is whether we truly understand the business problem we want to solve. In other words, is the business question well framed?
These are open-ended business problems, often in use cases where machine learning was not used before. For example, when we define payment fraud, success can be defined as attaining a specific ratio of false positives and false negatives. However, when we want to improve customer value for a wealth management client over 20 years, how do we get an answer? That problem is not only vague but also sits in an area where AI was not previously used. To work in these areas, the very first thing to do is to lock in and frame precisely what the problem is.
Scaled problem solving
When Fortune 500 companies look at solving problems, they focus on solving them at a scale that can have an impact not only at a functional area but on the overall business and be sustainable. To get a deep understanding of what we are trying to solve, framing and reframing the business problem is a critical pre-requisite. Is it a problem of growth, is it a problem of inefficiency, or is it a problem of a better experience for our consumers? What are we looking to solve here?
When we talk about problem-solving at scale, there are three parts to the solution after problem framing. The first is identifying the levers of experience, growth, or inefficiency through advanced analytics and AI. The second is applying an engineering mindset to identify the elements required for AI initiatives to succeed at enterprise scale, including setting up the appropriate architectural elements and managing data lineage, latency, ingestion, obfuscation, and DevOps.
Ultimately, the algorithms are only as accurate and actionable as the underlying data. And if we are to do things sustainably, we must ask whether the data will be available, accessible, and accurate over time. The third and final component is execution, including design of experiments and human-centered learning. We also call this the final mile of decision-making.
Every decision that enterprises are talking about is going to impact a human and is going to be made by a human. Do we understand those human needs that need a solution? To solve problems at scale, we need three elements to come together, i.e., better algorithmic sophistication, better engineering, and a better understanding of human behavior.
The code to success
The big-picture approach rests on various factors; there is no single way to get everything right. The ability to test and learn, to experiment, and to fail fast while capturing and quantifying the learnings is going to be very important.
Here is an example of a consumer goods company that made a remarkable transformation through AI. The approach the company took was to give every initiative six weeks to show progress. This allowed the company to become more rapid, not just in defining problems but also in defining quantifiable measures of success and progress. In 12 months, 30-40 different initiatives were executed to achieve Minimum Viable Outputs (‘MVOs’), of which 5 to 6 were taken to organizational scale.
An ecosystem of innovation
What we learned from the example above is that, for an initiative to be successful, cross-expertise teams need to work together. Projects succeed when there is an interdisciplinary approach, as opposed to a multi-disciplinary framework. We further realized that an agile mode of working yields better results, and that documenting and quantifying learnings increases the probability of success.
AI is a capability that helps us become more non-linear. A human-plus-machine approach will almost always be substantially better than the most intelligent human or the most sophisticated algorithm alone. The real measure of success is recognizing that AI is a supply chain of several cogs that must come together for the customer experience to improve. When we look at the return on investment, we need to be cognizant that many of these moving parts must come together for value to be added. The process of defining and measuring the ROI of AI is a journey, with truly limitless opportunities to effect change on the business.
Omnichannel customer experience has been the buzzword for a few years now. It is now time for retailers to go beyond it to what we can term Unified Customer Experience, which combines elements of AI, data engineering, and design to deliver the right customer experience irrespective of the point of interaction. This is crucial for driving meaningful customer engagement profitably. The value of doing so is reiterated in industry research, for consumers as well as business buyers. Some retailers are already mature in operationalizing this, but most are at earlier stages of the journey.
Source: Salesforce (The State of Connected Customer report, 2018)
The foundation of Unified Customer Experience lies in getting the data elements right. Customer 360 is the customer master record that enables organizations to go beyond the transactional nature of customer engagement and truly start understanding the various relationships that exist between the retailer and the customer, from the breadth of goods and services the customer purchases to how they engage across servicing channels and campaigns. This makes it the most significant part of improving our understanding of the customer. The key watch-out is that the Customer 360 needs to be dynamic: living and evolving with business changes.
With the Customer 360 in place, the next step is to build customer intelligence on top of it. This typically takes the form of strategic segments to drive the overall customer engagement strategy and micro-segments to drive more targeted customer conversions. The strategic customer segments should be Relevant, Actionable, Stable, and Quantifiable, so that we can craft marketing strategies in alignment with organizational priorities. The micro-segments enable us to incorporate predictive elements and uncover trends at a more granular level. For example, while working with a retail client, we observed that a segment had an average spend of $500, while a more in-depth look at its micro-segments revealed groups with spend ranging from $200 to $5,000, among other differences!
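To make the $500-average example concrete, here is a minimal sketch of how banding customers by spend can expose micro-segments hidden inside one strategic segment. The individual spend values and band edges below are invented for illustration, not the client's actual data:

```python
# Illustrative sketch: a single strategic segment whose headline
# average spend (~$500) hides very different micro-segments.
# All spend values and band edges are hypothetical.
from statistics import mean

segment_spend = [210, 230, 250, 280, 300, 320, 350, 400, 450, 2210]

# The strategic-segment view: one average masks the spread.
print("segment average:", round(mean(segment_spend)))

# A micro-segment view: band customers by spend to recover the spread.
bands = {"low (<$300)": [], "mid ($300-$1000)": [], "high (>$1000)": []}
for s in segment_spend:
    if s < 300:
        bands["low (<$300)"].append(s)
    elif s <= 1000:
        bands["mid ($300-$1000)"].append(s)
    else:
        bands["high (>$1000)"].append(s)

for name, members in bands.items():
    print(name, "count:", len(members), "avg:", round(mean(members)))
```

In practice the bands would come from clustering on many dimensions (product, channel, cadence, and so on), not a single spend variable, but the principle of recovering within-segment variation is the same.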
In essence, micro-segments provide a forward-looking view of customers across dimensions like product, offer, channel, price, cadence, and serviceability, and offer a great way to target customers directly. Together, the strategic segments and micro-segments form what we call the Customer Decisioning Engine, which can then be leveraged to drive various customer engagement activities.
Customer Decisioning Engine
With the key elements of Customer 360 and the Customer Decisioning Engine (CDE) in place, the next step is in-market testing of different campaigns. These campaigns can run at various granularities, with target segments generated dynamically, enabling us to learn from the tests and feed the results back into the various quantitative elements to fine-tune performance over time.
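A minimal sketch of evaluating one such in-market test, assuming a simple control-vs-treatment split and a standard two-proportion z-test. The conversion counts and sample sizes are hypothetical, chosen only to show the mechanics:

```python
# Hedged sketch: did a campaign variant lift conversion for a
# micro-segment? Uses a two-proportion z-test with illustrative counts.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference in two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Control: 400 conversions of 20,000; treatment: 480 of 20,000.
z = two_proportion_z(400, 20000, 480, 20000)

# Two-sided p-value from the normal CDF (via the error function).
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
print("z:", round(z, 2), "p-value:", round(p_value, 4))
```

A significant result (p below the chosen threshold) would feed back into the decisioning engine; a null result is equally valuable as a documented learning.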
“The future of customer interactions and purchases is digital” – practitioners have heard that a million times over several years. Digital transformation as a concept started in the early 2000s, and since then the commerce world has changed, with at least 60% of the customer shopping journey influenced by a digital interaction. In parallel, digital investments have grown enormously: IDC estimates that 40% of all technology spending will go towards digital transformations, with enterprises spending over $2 Tn in 2019.
Digital leaders at 6 of the top 10 global telcos pointed to a few key initiatives.
Major initiatives capturing digital investment
The business imperative for going digital is clear – heavy commoditization that results in:
- Pressured demand-side economics: Cut-throat promotions and continued loyalty to stores (74-85% of sales and upgrades) result in a high cost of acquisition, ranging between $350 and $500 per customer. The cost of service, at $12-$35 per chat/call transaction and $60 per store visit, adds up to a massive, addressable cost.
- Pressure to create supply-side differentiation using 5G, Gigabit, and media content: Telcos are shoring up massive capital investments. The trade-off between a healthy operating P&L and a healthy capital structure has never been more prominent, forcing telco executives to explore new profit pools (e.g., IoT, media distribution). Exploring the validity of these big bets is out of scope for this document, but it has led the CEOs of 8 of the 10 largest telcos to identify CX and efficiency-ratio improvement as critical necessities, with “going digital” and “self-service” as the solution mantras.
Humanizing the digital experience – is it needed?
We are not going to put forward more ideas for digital initiatives – instead, we asked a contrarian question. We believe there is sufficient investment already, so why is digital still not a significantly higher percentage of the customer’s end-to-end purchase/service journey? Why do customers drop off online and complete their journeys with assistance?
Consider the below drop-off rates in a purchase funnel in telecom.
In addition, 70-80% of customers who start a service request online end up calling to get it resolved.
We took a customer-first approach, investigating over 5M chats and call logs and 200M+ digital journeys, and learned a startling fact about customer expectations – specifically, the curious “tolerance” phenomenon.
A customer is willing to put in only 10-12 minutes of active attention (not counting passively open browsers or waiting on a chat) on a telco website, but is willing to spend up to 3X that (~30 minutes) in a store to make a telecom purchase (averaged across device, accessory, and plan).
Let us put that in the context of a smartphone purchase. In 10 minutes, a digital buyer expects to get comfortable with product quality, lifestyle fit, promo applicability, feature customization, peer reviews, trade-in value, delivery options, surcharges, and fees. If the experience either takes too long or leaves open questions, customers drop off. It is no wonder, then, that only 1-2% of prospective digital shoppers end up buying online within a session.
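The kind of drop-off pattern described above can be sketched in a few lines of analysis: given click-level journeys, count how many sessions reach each funnel step and flag the step with the worst drop-off. The step names here are hypothetical; a real telco funnel would be far more granular.

```python
from collections import Counter

def funnel_dropoff(journeys, funnel_steps):
    """Count sessions reaching each funnel step (in order) and compute
    step-to-step drop-off rates. Journeys are lists of visited step names."""
    reached = Counter()
    for journey in journeys:
        seen = set(journey)
        for step in funnel_steps:
            if step not in seen:
                break  # session dropped before this step
            reached[step] += 1
    dropoff = {}
    for prev, nxt in zip(funnel_steps, funnel_steps[1:]):
        dropoff[nxt] = 1 - reached[nxt] / reached[prev] if reached[prev] else 0.0
    return reached, dropoff
```

For example, with journeys `[["browse", "configure", "cart"], ["browse", "configure"], ["browse"]]`, the “cart” step shows the highest drop-off (50%), pointing to where friction diagnosis should start.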
We would encourage you to visit your favorite telco’s website and try buying a Bluetooth device or trading in an existing device to get a first-hand feel!
If this sounds familiar, one might be tempted to believe that digital is only a showroom and actual sales happen in-store – a belief that has led a couple of large telcos to massively increase their store footprints. However, that is inaccurate – we have seen massive increases of 50%+ in digital sales when the experience was made better, i.e., consciously more “humanized.” In other words, digital needs to mirror or exceed the expectations set by an “active salesperson.”
Humanizing the digital customer experience – what and how do we enable it?
If telcos treat their website as a showroom, then customers will treat it as a showroom. Instead, if telcos reimagine their website (or app) as an intelligent, empathetic sales-assistance platform, the “humanization” imperative becomes real. Humanizing boils down to 3 specific concepts:
- Remove true friction that customers face, giving them assurance, information, and navigation intuitively and naturally,
- Make it convenient and relevant by offering the shortest-path journey, peppered with the right products, offers, features and configurations, accessories, shipping, and payment options,
- Provide intelligent assistance that is cognizant of the actual need or concern and does not stop with “information,” but completes the necessary “action” for the customer.
All of this sounds like another $50M in investments, but it is not. Most large telcos already have enough off-the-shelf tools, and more-of-the-same tools will only produce more-of-the-same results. Instead, a structured, iterative evolution of the customer experience helps, as observed at two large telcos.
Three fundamental questions need to be asked:
- Can we identify the root causes of customer friction as they navigate the digital purchase journey?
- We certainly can – by applying a combination of advanced AI and big data engineering to mine millions of journeys at a granular click level. Diagnosing the root causes and removing obvious friction helped a large retailer improve its conversion rate by 25% within 3-4 months.
- When to best intervene through the right product, messaging, and assistance through chat or call?
- This is an excellent example of real-time decisioning, though applying simple business rules will take you back to the robot age. True behavioral change comes when you incorporate a genuine Customer DNA along with a real-time recommendation engine. This degree of 1-1 personalization has driven at least a 15% increase compared to using business rules.
- What are the messages/ experiences to be driven at critical moments of truth?
- AI can help with the when and where, but determining what will appeal is a matter of human-centric message design, powered by behavioral science, which still outperforms any commercially available AI today. Final Mile, a global expert in behavioral science, identified customers’ “fear of a better option” tendencies and redesigned the experience to leave them comfortable and reassured.
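To make the contrast between business rules and 1-1 personalization concrete, here is a deliberately simplified sketch: instead of one static rule for everyone, each offer is scored against a per-customer affinity profile standing in for “Customer DNA.” Feature names and weights are hypothetical.

```python
def recommend(customer_dna, offers):
    """Rank offers by affinity to a per-customer profile, rather than
    applying one static business rule to every visitor."""
    def score(offer):
        # Dot product of the customer's affinities and the offer's features.
        return sum(customer_dna.get(feature, 0.0) * weight
                   for feature, weight in offer["features"].items())
    return sorted(offers, key=score, reverse=True)
```

With this kind of scoring, a price-sensitive customer sees the budget plan first while an early adopter sees the flagship – the same inventory, ordered differently per person, which a single business rule cannot do.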
How can you drive meaningful change?
In large telcos, digital is a melting pot of 5-6 cross-functional teams (Digital Ops, CX, Marketing, IT, Analytics, Vendors) operating under highly visible pressures. It has become infeasible to pursue traditional means of consensus building, large one-stop investments, or clear ownership – just as it is infeasible to take 6 months to build a few AI-driven models.
We pioneered and tested a new approach – the DART approach (a slightly more nuanced form of agile) that combines AI, engineering, and design:
- Detect pain points/human-centric friction: Using AI at scale to identify the root causes of abandonment and size the business case. This is different from just looking at anomalies or exit-rate reports.
- Architect a better experience: Using behavioral sciences and design-thinking to overcome the customer’s hesitation.
- Rapid deployment: Using AI and big data engineering to quickly deploy production-grade models, APIs, and applications.
- Testing: Using behavioral insights to show what is working and how well.
A pod-based approach that brings together cross-functional teams can help smooth out basic inertia and set the stage for rapid iteration. A large telco organized the work process around a war room with large screens displaying performance metrics, daily stand-ups to define and track the sprint, and a defined blueprint to smooth the pod’s operating model.
In the words of the head of digital transformation at a Fortune 100 Telco – “Digital transformation is about small steps that will make customer experience Simple, Easy and Awesome.”
COVID-19 and the hand-wash category
In light of the COVID-19 pandemic, businesses across the world are slowing down. Sectors such as travel and hospitality are already experiencing severe slowdowns. Others, like retail, consumer goods, and financial services, will face pressures on both the demand and supply sides.
Not all CPG categories are facing equal pressures – in fact, some categories are booming! Nielsen reports that sales of hand sanitizer (four weeks ending March 7) increased 228% compared to the same period last year. With 64% of Americans (correctly) believing that washing hands with soap is better than using hand sanitizer, we can expect hand-washing behaviors to also be higher in current circumstances.
Will these new behaviors sustain when eventually the disease is no longer a threat?
Context Driven Behaviors
Contrary to the traditional view that people weigh the desirability and likelihood of outcomes and alternatives, people process risk as feelings. Such assessments are colored by the salience of events and by our emotions. Compare the actual risk of dying from a shark attack (a minuscule 1 in 3,748,067) or a car accident (1 in 84) with the perceived risk of each!
The COVID-19 pandemic has created an environment of information overload and an atmosphere of fear. This fear has been useful in driving short-term behavior change: hand-washing and body-washing behaviors are, expectedly, increasing. The key question is whether these newly acquired, context-driven behaviors and habits will sustain once anxiety and fear levels decrease.
Some of the new behaviors are part of a collective package – social distancing, quarantine, and hand-washing. When components of such a context-dependent package cease to matter, the cues that trigger the other associated behaviors disappear, leaving people to depend solely on their motivation to sustain them. Such a situation decreases the durability of new habits, leading to a decline in hand-washing and body-washing over time. Anticipating this decline, it is imperative that businesses institute programs that drive the sustainability of the new practices.
Change Behavior and Sustain Habits
Current hand/body wash routines are intertwined with salient disease risk – it is a coping strategy to adapt to and control aspects of the situation which people can control. This automatic association needs to be broken (at the right time) for habits to sustain when the context changes.
While we know that communication has a huge role in disassociating these habits from disease context and behavior, we do not know which framing works best. Some of the possible framings are:
- A self-image framing (“Washing hands is the right thing to do as a responsible citizen”)
- A descriptive norm framing (“Most people in your locality continue to wash hands every hour”)
- A de-contextual framing (“Corona or SARS or MERS is likely to be a recurring phenomenon and therefore more hand-washing is required”)
- A power-oriented framing (“YOU have the responsibility and the power to stop the spread”)
CPG manufacturers need to work with some of these moderations in their campaigns to drive sustainability of newly acquired behaviors.
Delinquency is one of the areas of financial services on which COVID-19 is bound to have a direct and immediate impact. Large-scale income disruptions, compounded by a looming recession and growth slowdown, dictate that financial institutions around the world adopt proactive strategic measures to handle the impending increase in delinquency rates. For some perspective, coming off the recession in 2008, credit card delinquency rates in the US reached 6.77% in 2009, before gradually dropping back down to 2.12% by the second quarter of 2015. Mortgage delinquencies were at 9.3% in 2010, before gradually stabilizing around 4.5% between 2016 and 2018.
It is important to view both delinquency and consumers’ interactions with financial institutions during this time from a behavioral standpoint, in order to craft internal strategies and communicate measures to consumers.
A segment-based approach: Learnings
This note focuses on two behavioral segments that in the current context are most likely to be created – consumers with a) temporary and b) permanent financial hardships. The first step is to look at some fundamental truths we’ve learned from studying delinquency behavior for these segments:
Delinquent members are oriented towards resolution: Whether arising from temporary or permanent hardships, delinquency brings with it negative appraisals (snap evaluations of situations, that lead to the emotions we feel at that moment), especially of the self and the ability to manage said hardships. These lead to feelings of guilt, and along with it, a strong intent to resolve – in this case, pay. In light of the COVID-19 context, the window of resolution orientation is likely to be shorter than normal, owing to the attribution of the hardship to external factors, rather than the self. It is important to leverage this window as it does exist.
Delinquency is a coping mechanism: For these segments, delinquency is a coping mechanism: a response to deal with their context of low control and scarcity. Tunneling on immediate goals like food and lifestyle needs causes these goals to be prioritized over pending bills. The consumer’s goal at this point, therefore, is financial management, not delinquency resolution.
Reciprocity Seeking: This low-control situation also leads to reciprocity seeking – whether it is expectation of flexibility from the bank, or a human touch to help guide them through the delinquency landscape (this is going to be especially pertinent in the wake of the current pandemic, where there will exist a strong expectation of ‘allowances’ from the institution). This is the point where, currently, for most consumers, the relationship with the financial institution sours due to a perceived lack of empathy from their side.
Over time, this orientation turns to avoidance: The negative feelings towards the self subside the longer consumers stay in the delinquency funnel, and are replaced with a poor perception of the financial institution’s intent, stemming from the way the institution has failed to step up in their time of need. This poor perception, coupled with punitive strategies adopted by the financial institution, leads to avoidance and, in a few extreme cases, feelings of anger and revenge-seeking.
Turning Avoidance to Approach
Designing for delinquency management needs a nuanced approach; keeping in mind the above factors, it should be based on the segment and the part of the delinquency funnel the consumer is currently in.
Temporary Financial Hardships
For customers with temporary financial hardships, the strategy should be centred around curating their return to ‘back to normal’ before they slip deeper down the funnel. It is important to recognize that any punitive measures adopted here would only increase avoidance. The goal, then, should be to showcase Control, Negotiation and Flexibility. This might involve:
- Communicating empathy and goal alignment (financial management) – e.g., ‘we do not wish to add to your hardships, but want to find ways to work with you to ease them’.
- Showcasing alternate coping strategies rather than delinquency – consumers are looking for a source of authority to help them figure out possible pathways out of their current context.
- Showcasing reciprocity through flexibility, introducing new partial-payment plans skewing towards ‘more time’.
- Strategically leveraging credit health and possible future losses involving credit opportunities in order to gently nudge them towards the ‘right’ path.
Permanent Financial Hardships
For those with permanent financial hardships, the strategy has to be centred around getting them adapted to their ‘new normal’. The first step here would be to turn their avoidance responses into approach – to get them to the table. For this segment, it is imperative to showcase assistance, repair and rebuild trust, and assert authority to ensure that you remain the ‘last card/loan standing’. This might involve:
- Turning avoidance into approach: communicating flexibility and leveraging consumers’ deal-seeking goal to get this segment to the table.
- Communicating empathy by aligning to their current financial management goals.
- Using authority to provide pathways to this segment to adapt to the landscape of delinquency and help minimize damage already done.
- Engineering long-term resolution intent (‘last card standing’) by leveraging reciprocity – frame communication that talks about safeguarding the institution’s long standing relationship with this segment and showcase intent to keep them in the in-group.
Delinquency Management: Channels and Behavior
Different delinquency management channels also afford their own emotional pathways. Phone channels, for example, are associated with expertise, empathy, negotiation and closure, depending on where consumers are in the delinquency funnel. Non-voice channels (messages, e-mails, websites) are associated with flexibility, consistency, exploration and efficiency/ease of use (Approach). It is important to understand these affordances, and factor the channel into the behavioral strategy.
A strong, behavioral science perspective for delinquency management is capable of enabling both consumers and clients to manage and cure delinquencies.
In a matter of just a few weeks, the COVID-19 threat has escalated to an unprecedented global health crisis. As of March 17th, there are over 180,000 people infected across the world and over 4,000 in the US. To contain this global threat, governments are taking extraordinary measures unseen in modern history – closing borders, putting communities, cities, and countries under lockdown, and effectively shutting down entire industries.
Along with the escalating health crisis, there are also skyrocketing economic implications and uncertainties about the impact of COVID-19. While travel, entertainment, hospitality, and restaurants have come to a screeching halt, business leaders across other industries are trying to assess what COVID-19 will mean for their business.
Impact on Consumer Goods companies is likely to be nuanced, involving both the demand and supply sides of the equation. And unlike other industries, a shift in consumer behaviors is leading to the growing demand for many food, beverage, and home care categories, as well as to changing brand preferences. At the same time, manufacturers need to assess how to best address likely supply chain disruptions and capacity limitations that may prevent them from meeting changing consumer needs.
Below are some emerging insights on how pandemic worries are changing consumer behavior.
Changing consumer behaviors:
- Hunkering down at home: with accelerating mass shutdowns and the looming possibility of a lockdown, consumers are huddling at home and minimizing all social contact; travel and parties/social gatherings are discouraged, and eating out comes to a screeching halt as multiple states mandate restaurant/bar closures,
- Stockpiling: sales of a broad array of items, from cleaning/disinfecting and preventative health products to food staples, are surging. While some of this is pantry-loading, much is likely to have significant incrementality,
- Turning to comfort foods and comfort brands: during times of stress, consumers tend to turn towards their comfort foods and old favorites,
- Back to trusted brands: preferences switching back to trusted national brands and away from natural/organic and private label, particularly in cleaning, disinfecting, and health products,
- Ecommerce acceleration: ordering online to avoid extra trips,
- Eating all meals at home – all week: with all kids and adults huddled at home 24/7, all meals will be consumed at home,
- More time: more cooking from scratch, more baking – “projects” to occupy kids,
- More snacking: stress, Netflix, boredom, proximity to the pantry – all these are likely to drive snacking, at least temporarily,
- Provenance awareness: avoidance of products from impacted regions,
- Food safety worries: concerns about produce and fresh products that may have had exposure to the virus.
Anticipated impact on demand by category type:
Supply chain issues and concerns:
- Product availability and capacity constraints – particularly in key stockpiled categories such as hand sanitizer and branded high-efficacy cleaning products, where demand far exceeds available supply,
- Lack of clarity about future demand – traditional forecasting models will not be accurate for the next 6-12 months, until things at least marginally go back to normal,
- Potential long-term import gaps for products made in impacted countries and regions – as well as transportation and shipping disruptions,
- Risks of domestic supply chain disruptions – reduced manufacturing capacity and shipping constraints, shortage of workers if lockdown is implemented, or areas of the country are cordoned off.
Implications to CPG firms:
Most firms are now in the Crisis Management mode: ensuring employees and consumer safety and scrambling to keep critical goods on the shelf. Beyond this, issues that need to be addressed are:
- Assessing and modeling impact on demand: what will the next 1, 3, 6, and 12 months look like? Forecast models need to be retooled to reflect consumer behavior as it progresses through the situation and to pick up field signals rapidly.
- Optimizing manufacturing: balancing diminished capacity due to employee safety and lockdowns with increased demand.
- Optimizing delivery logistics: increased volume will require higher transportation capacity; however, flexibility is needed to meet the volatility of demand as the situation progresses.
- Evaluating pricing and trade investments – normal price sensitivities no longer apply. Moreover, subtleties will be highly dynamic, changing as we progress from pantry-loading to home confinement stage, and as the economy likely deteriorates.
- Accelerating e-commerce and DTC sales – product shortages, the shift online, and more vigilant consumers with more time on their hands can create an opportunity to drive traffic to your website and start selling directly to the consumer.
- Re-assessing marketing investments – advertising investment, messaging, channels.
- Addressing longer-term supply chain implications: target safety stock and inventory levels, supplier diversification.
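On the forecasting point above, one minimal way to “pick up field signals rapidly” is exponential smoothing with a deliberately high responsiveness parameter, so the latest periods dominate the estimate. This is a sketch, not a retooled production model; the alpha value is an assumed setting, not a fitted one.

```python
def adaptive_forecast(history, alpha=0.6):
    """One-step-ahead exponentially weighted forecast.

    A high alpha makes the forecast track recent observations closely,
    which matters when demand breaks from historical seasonality.
    """
    level = history[0]
    for observation in history[1:]:
        level = alpha * observation + (1 - alpha) * level
    return level
```

With alpha = 0.6, a sudden demand spike moves the next-period forecast most of the way toward the new level within a couple of periods, rather than being averaged away by a long seasonal history.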
To add a historical perspective, we compared how COVID-19 is impacting consumer behavior vs. another major shock: the 2008 recession.
Insight and Analytics (I&A) leaders are preparing for tough decisions on budget or resource adjustments, or even reductions – deciding which projects to keep, shelve, or let go – and for even tougher negotiations with management and the CFO. These discussions will be framed as: where do I focus our budgets now? Should I&A be a priority vs. other competing priorities, and if yes, why? Efficiency (doing the same for less) vs. effectiveness (doing more for the same)? Of course, I&A leaders will want to communicate that “it is even more important than ever to understand the consumer now and be prepared for the future to define the new normal” (see my post from last week). However, the reality is that cost reduction and efficiency will be top of mind for management and finance for the foreseeable future – so how should I&A fight this battle of efficiency vs. effectiveness with a good chance of winning?
The answer is… with short- and long-term ROI and Insight Value (IV)! How much incremental dollar revenue, dollar cost reduction, or percentage productivity can be attributed to the recommendations and process improvements coming directly from I&A? It is a similar concept to Marketing Mix, with short- and long-term effects.
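As a back-of-envelope illustration of the short- plus long-term idea (the attribution share, decay rate, and horizon below are assumed inputs, not outputs of a fitted mix model): each period’s measured dollar lift contributes its immediate value plus a geometrically decaying carry-over, and only a share of the total is attributed to I&A.

```python
def insight_value(lifts, attribution_share=0.3, decay=0.5, horizon=4):
    """Mix-model-style Insight Value (IV) estimate.

    lifts: measured $ lifts per period from activated I&A recommendations.
    Each lift counts once immediately (short-term) plus a decaying tail
    over `horizon` future periods (long-term). All parameters here are
    illustrative assumptions.
    """
    carry = sum(decay ** t for t in range(1, horizon + 1))
    total = sum(lift * (1 + carry) for lift in lifts)
    return total * attribution_share
```

For example, a $100 lift with a 50% decay over a 2-period tail is worth $100 × (1 + 0.5 + 0.25) = $175 before applying the attribution share – a simple way to put an explicit dollar figure next to each activated project.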
We first developed the concept of Insight Mix Modeling (IMM) ten years ago to help the I&A function address the significant budget reductions of 2010/2011. At that time, we also began the dialogue with I&A leaders about the need to bring more value-oriented metrics (beyond just total dollar budget and number of projects) to start demonstrating and communicating I&A value. Unfortunately, not much was done in the past ten years by way of introducing these metrics and making them the norm – which will make the looming fight much more challenging to win.
However, hopefully, there is still time to do a few things:
- Make sure every project has a clear connection to business issues/goals and financial/functional performance metrics. Have an EXPLICIT dollar value associated with these projects (we would be happy to share our expertise and experience).
- Make sure every project is and will be activated, and that there is measurable value that can be documented. No zero-activation projects!
- Make sure these values will be communicated frequently across the organization.
- Make sure there is a real-time I&A Value Dashboard to capture and share the value on a monthly or even weekly basis.
- Make sure you start immediately educating stakeholders and management on I&A value metrics and use as a part of frequent communication with them.
I&A folks have talked about moving from cost center to value creator for a long time, and while they have made good progress in driving the strategic thinking of their stakeholders, they need to claim the dollar financial credit as well!
As the world has turned upside down over the past few weeks, changing what/how/why (WHW) we DID things, there is an opportunity for the Insights/Analytics organization to be the one answering WHW we WILL do things in the next 3, 6, and 12 months and after, starting with analyzing the NOW. There will be a NEW NORMAL in WHW consumers buy products and services. Existing need states will be re-prioritized (e.g., immediately after 9/11, men’s shaving needs shifted from closeness of shave to safety for eight months, and then returned to the old normal), new need states will be added, and we will behave differently across all our interactions and touchpoints. This is precisely what Insights/Analytics functions should be defining: the NEW NORMAL!
For years, this function has tried to position itself as an advisor to the business, and it has not happened, for many reasons. However, the current situation may provide an incredible window of opportunity to do just that. Business leaders will need this NEW NORMAL very soon – in the next 2-4 weeks, once they are done with immediate P&L, operations, and organizational issues – and insights teams need to act NOW to be of most help and to demonstrate their relevance and importance. Stakeholders are asking for, and honestly needing, this familiar, stable, logical, data-driven way of working that makes sense to them, as very little has in the past few weeks. They are also thinking about how not to get bogged down in a defensive posture for too long, learning the lesson of the 2008/09 crisis: staying defensive for too long hurt them in the long run.
A few ideas for how the Insights/Analytics team should act proactively (without asking anyone’s permission):
- Organize a Rapid Business Enabling Team, bringing together the best strategic insight thinkers internally as well as externally (from suppliers).
- Begin brainstorming immediately, and then communicate (to stakeholders) key observations on the NOW, based on experience and intuition, while preparing for more data-driven analysis of WHW.
- Do this work and communication frequently (weekly) across the entire organization while addressing specific, most urgent needs of stakeholders
- At the same time, begin planning to define the NEW NORMAL, and be ready with it in the next 4-6 weeks.
- All suppliers should suspend their competitiveness and work closely together within this team to enable quick insights/recommendations creation and communication.
If Insights/Analytics does this, it will come out of this crisis in a different position, commanding a lot more respect and assuring its future place in the NEW WORLD. This window of opportunity will be closing soon, though – if the Insights function does not step up and change, the window to reach the desired state may close forever, and the function will be in danger of becoming obsolete. Hopefully, this will not happen – but they need to act NOW!
I welcome all to share your thoughts and experiences, other ideas that can be helpful; we are all in this together!
We’ve taken care of all the things that we need to work from home – an adequate internet connection, a secluded space, calendars, meeting organizers, digital whiteboards, etc. – but a feeling of discomfort lingers on. Why?
We can walk out from a four hour face-to-face meeting with fantastic outcomes and a feeling of accomplishment, but when we attempt a four-hour video conference, it goes south real quick. Why?
The answers might lie in understanding the factors that guide cognition and emotions, especially when dramatic shifts occur in our primary, otherwise well-organized behaviors like work, for example.
Literature around remote-working usually revolves around three kinds of distances that have to be reckoned with –
- Physical (place and time),
- Operational (team sizes, bandwidth, skill levels) and
- Affinity (softer, more behavioral factors like trust, values, co-operation, and reciprocity).
Organizations mostly tackle reducing physical and operational distances to ‘prepare’ for remote working; this is not nearly enough to guarantee performance, motivation and collaboration.
Reducing affinity distance is crucial to getting a handle on the above factors. Let’s look towards Cognitive Science to give us a handle on the working of some of our higher-order mental processes in the remote-working context.
Remote communication is a cognitive depletion abyss
Uncertainty – the price to pay for flexibility

There are clear boundaries and expectations defined when we head into work every day – work begins when we step into our workplaces and ends when we walk out. Processes have been laid out, and norms already exist; we know what to expect – business hours, attire, meeting cadence, breaks, lunch, informal chats. With remote working, however, these norms are either vague or non-existent and are mostly left to the individual’s discretion. A quick walk across to a colleague’s desk is replaced by decisions in the digital space – ‘should I call her, what if she’s busy?’, ‘should I write first, or schedule a calendar invite?’. This uncertainty, then, becomes the price we pay for flexibility, not to mention the significant strain it places on our cognitive resources.

What companies could do: Set defaults to manage uncertainty. Set expectations for work from home by creating defaults to manage all areas of uncertainty – a work-from-home dress code, working hours, lunch hours, when participants in a meeting are expected to turn on the video, when it’s okay to directly reach out and call colleagues and when meetings ought to be set up, whether it’s okay to take video meetings from different parts of the house, and so on. Open communication about factors like these will significantly ease the cognitive strain on employees.

Diffused attention

Let’s talk about attention. Focusing on a call is hard enough, with the brain working overtime to complete any gaps and fill in missing emotional cues. To make things worse, the limited attention that is left is distributed between people and tasks at home – the brain prioritizing immediate and more proximal demands over remote ones (which, unfortunately, include the people on the other side of our digital meetings).

Intent mis-perception and breakdown in trust

Lack of shared physical space and emotional cues means that we are continuously tracking the intent of the people we are digitally interacting with (‘she just put herself on mute, wonder what she’s doing?’, ‘why is his video not on?’). These moments of negative intent-perception and judgement can immediately distract, and over time, lead to a breakdown of trust, which is extremely harmful for collaboration.

What companies could do: Encourage consistency and honesty.

Future interactions and belief updation

A shared physical space (workspace, meeting room) is an equalizer, as opposed to a video call, for instance, which affords access to participants’ unique physical environments. As a result, a significant portion of our cognitive resources is deployed into belief updating (‘that looks like a nice clock behind him – must have been expensive, wonder where he got it’). The cognitive costs associated with belief updating are unavoidable – these are automatic processes, with resources being continually marshaled towards them. These beliefs also shape future interactions with colleagues and are not easily modified once they have been cemented.

What companies could do: Ensure stability. While it’s okay to be flexible about where in the house one can work and take meetings from, guidelines about these spaces could be established – a relatively neutral, non-stimuli-rich environment is ideal.

Negative outcome expectations and being ‘always on’

Another aspect of working remotely entails mentally calculating outcome probabilities (probabilities of certain events occurring, and the possible outcomes of those events) – ‘what if I’m at lunch and my boss calls – he’ll think I’m shirking work’, ‘what if I’m on a break and I miss something important.’ Expecting negative outcomes could lead to coping mechanisms like being ‘always on’, leading to a distortion of the already blurred lines between home and work. Additionally, the commute home, which gives us time to wind down and switch modes, is also missing – exacerbating the ‘always on’ feeling.

What companies could do: Bracketing – work and not work.
In a nutshell, working remotely really puts our limited resources to the test. Coping cognitively and emotionally is key to ensuring positive outcomes in the extended work-from-home scenario that most of us across the globe are likely to face in the coming months.
At the time of this article, there are 2 million Coronavirus cases, with 640k in the US alone. We are at week 6 of stay-at-home orders, with various government bodies forecasting a further 4 weeks. The early impact on the economy is trickling in – leading the news are employee furloughs, deferred bill payments, and pay cuts.
Retail banks, traditionally at the cross-roads of mass-market economic upheavals, are expectedly preparing short-to-medium-term strategies to manage capital, liquidity, and risk.
We see four major factors impacting banks – each factor may evolve differently based on various COVID-19 recovery scenarios.
Source: Fractal’s analyses of operating trends with global banking clients
- Slowing demand for credit: We already see a 25% reduction in global card spending, with US and select markets in Europe showing a higher reduction. We expect similar declines in credit applications and approvals (due to tightening credit policies), partially offset by low-interest regimes.
- Extended low-NIM scenario: The Fed has reduced interest rates to 0%, and 10-year Treasury yields have dropped to a record low. Historically, low-interest-rate environments are highly correlated with lower banking NIM.
- Credit losses impact: Too early to predict, and highly dependent on recovery scenarios. Still, early metrics on deferred bill payments and loss of income for both consumer and small-business segments are a cause for concern. Loss management will expectedly be the biggest focus for the near-to-medium term.
- Spike in operating expenses: Predominantly driven by customer handling costs, we expect near-term costs to increase by 25-30%. A spike in call volume of 60-70% is a major contributor, partially offset by a reduction in variable sales and marketing costs.
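The opex arithmetic above can be made explicit with a simple decomposition. In this illustrative Python sketch, the baseline cost shares (50% customer handling, 15% variable sales and marketing) are hypothetical assumptions, not figures from the analyses; only the call-volume spike and the 25-30% net range come from the text.

```python
# Illustrative sketch: decompose a bank's near-term operating-expense change
# into a call-center cost spike partially offset by lower variable sales &
# marketing spend. All cost-mix shares below are assumptions for illustration.

def net_opex_change(call_share, smkt_share, call_spike, smkt_cut):
    """Return the overall fractional change in operating expenses.

    call_share : baseline share of opex in customer handling / call centers
    smkt_share : baseline share of opex in variable sales & marketing
    call_spike : fractional increase in call-center costs (0.65 = +65%)
    smkt_cut   : fractional decrease in variable sales & marketing spend
    """
    return call_share * call_spike - smkt_share * smkt_cut

# Hypothetical cost mix: 50% customer handling, 15% variable S&M.
change = net_opex_change(call_share=0.50, smkt_share=0.15,
                         call_spike=0.65, smkt_cut=0.40)
print(f"Net opex change: {change:+.1%}")  # lands in the 25-30% band here
```

Under these assumed shares, a +65% call spike offset by a 40% cut in variable sales and marketing yields roughly a +26.5% net increase, consistent with the 25-30% range above.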
We see reducing operating costs as a critical and addressable imperative for banks – the focus will be on extreme digitization of all customer-facing interactions: "Digitize like a digital native."
The current crisis has exposed the low level of digital preparedness of large banks to deliver client value (in the present scenario: handling issues, communicating implications, providing assurances, planning for exigencies). Most of the spillover from the loss of branch banking has moved to the call center instead of digital self-service channels (web, app, IVR). In the case of one large bank, we have seen digital activity increase by only 20% while call center volumes increased by 90% in one month alone.
These call center numbers are understated (and the digital numbers overstated), since a few large banks have significantly cut down call center hours and use IVR to direct customers to the web. Unfortunately, for several needs, digital interfaces are just not ready today, and customers would rather call back and keep calling.
Today's digital capabilities are not ready to easily migrate customers to self-serve channels – we observed that despite the uptick in digital activity (as measured in logins), the percentage of transactions closed online decreased significantly from steady-state numbers. Specifically, the percentage of digitally closed "purchase transactions" (applications for products/credit) decreased by 33%, while the percentage of assisted "purchase transactions" remained flat. We did not observe a discernible difference in the reduction of digital close rates between existing and new customers. Mystery shopping with a Big 4 US bank revealed that even a basic wire transfer could not be completed online until an agent came on the line two hours later and helped navigate the digital process.
Source: Fractal’s analyses of operating trends with global banking clients
We expect strong persistence of customers' digital expectations in the post-COVID world – digital strength will become an even more critical source of competitive advantage than it was pre-COVID-19.
To be precise, all large banks have well-functioning web, mobile, and fairly sophisticated IVR ecosystems. They just do not have the experience synced up to rapidly rising customer expectations – unlike what fintechs and digital-native banks can provide.
It is most critical to recognize that the expected standards of digital are not to provide a set of tools and capabilities online but to provide an end-to-end frictionless, personalized and cognitive experience that ultimately replaces the need for human assistance.
Our study identified three major categories of disconnect in the digital experience:
Banks have an immediate call to action to drive Extreme Digitization – to rapidly reduce ballooning operating costs and build post-COVID-19 competitive advantage. Digital natives have traditionally operated with the notion of a "completely digital experience, no human intervention," and that is the north star that banks should strive for. The human factor should be deployed only to further enhance the appeal of complex/behavioral selling interactions.
But before we go star-wars on digital transformation, we must realize that banks can already make significant strides in the short term with minimal new investment. The banks that have improved did so by getting past organizational and operating roadblocks and answering fundamental questions:
- How to define the right target metrics
Banks' siloed structures prevent them from defining customer-cost metrics. Operating costs at the customer level are rarely tracked; they are typically bundled into aggregate measures held by each channel.
- Who should own and manage these metrics
Metrics look good individually (at a channel level) but not so good collectively (at a customer level). In these times, digital can proudly claim to handle 20% higher volumes and call centers can claim improved efficiency at spiked volumes – while the customer P&L itself looks negative. Few banks have organized teams and leaders responsible for the customer P&L.
- How to decide what to build, and how much to invest
While banks tend to be reasonably data-driven in investment decisions, the standards required to truly identify customer needs are extremely high – requiring talent with highly sophisticated AI, technology, and business expertise, a rare resource in most legacy banks.
- How to drive eventual adoption by customers
Probably the most often-overlooked challenge – because it is nobody's challenge in particular – is continually striving to help customers achieve self-service through a concerted application of education, incentives, and motivation.
Therefore, the biggest roadblocks are rarely the what or the why, but the how. Large organizations tend to be individually smart, but collectively slow and misaligned. The recurring theme we have observed is that improving customer cost and experience metrics is everybody's problem yet nobody's problem – it is, in fact, widely distributed across channel teams. The imperative of pace necessitates that large banks create an agile cross-functional team comprising a core team from the channels, marketing, analytics, and IT, with extended support from Legal and CX (if any). Such a cross-functional SWAT team would typically report to the business head, maintain a virtual war room, and adopt agile processes and technologies to function efficiently.
Extreme Digitization team: Cross-functional team focused on rapid, agile deployment
KPIs: reduction in assisted-channel volumes by customer need; increase in digital flow-through
- Apply rigorous AI to identify needs from calls and chats, inferring friction and intent from digital and non-digital behaviors, and prioritize based on the number of cases.
- E.g., the bank should know that wire transfers to a specific recipient segment are broken, or that fees bundled inside mortgage escrows are a typical call reason for people who were looking for the same information online.
POC and test
- Use agile methodologies to design experiences – use insights to know where in the journey, and what specific experience, to improve.
- Target a 3-day schedule to design and test new experiences.
- Establish an automated testing rhythm with a few pre-designed digital flow-through metrics.
Engineer for scale
- Engineering (various applications, Pega, etc.) is core to keeping a Continuous Integration/Continuous Deployment flow.
- Big Data deployment of AI/ML algorithms is critical.
- Ensure alignment with the specific legal and CX teams for rapid approval.
Marketing is critical, however, since customers need to be "educated and/or incented to make the behavioral changes and the leap to digital." Our analyses revealed that 30% of the customer base tends to be never-digital, a further 60% are digital dabblers (e.g., people who bank digitally yet continue with paper statements), and only 10% are truly digital natives.
90% of customers would need a behavioral change to go completely digital, even though most of them are reasonably digitally savvy and already conduct some part of their banking transactions online.
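The segment arithmetic above is simple but worth making explicit. A minimal sketch, using only the segment shares reported in the text (segment names are paraphrased):

```python
# Digital-adoption segments and their shares of the customer base,
# as described in the text above.
segments = {
    "never-digital": 0.30,    # never transact digitally
    "digital dabbler": 0.60,  # partly digital, e.g. still on paper statements
    "digital native": 0.10,   # already fully digital
}

# Everyone except the true digital natives needs some behavioral change
# to go completely digital.
needs_change = sum(share for name, share in segments.items()
                   if name != "digital native")
print(f"Share needing behavioral change: {needs_change:.0%}")  # 90%
```

The point of the decomposition is that the 90% is not a homogeneous group: the 60% of dabblers need nudges and incentives, while the 30% of never-digital customers need a much heavier lift of education and motivation.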
Therefore, while banks work feverishly to advance their digital experience, a steady combination of awareness and incentive messaging to customers is important. A Big 4 bank is using its call center touchpoints to make people aware of all the cool online features, but that alone does not translate into customers acting on it. Behavioral science could further help by identifying customers' points of resistance and the incentives they care about – factors that could drive customers to go completely digital.
At the core, banking leaders must recognize that the expected standards of digital are not to provide a set of tools and capabilities online but to provide an end-to-end frictionless, personalized and cognitive experience that ultimately replaces the need for human assistance.
It's been a tough few weeks for multiple retail sectors, as nearly $100 billion in US retail spend disappeared, per a recent study by Forrester. Going forward, US store sales – including apparel, restaurants, and car dealers – are predicted to hit historic lows.
The 'essentials businesses' are going strong, but at the same time, national chains are struggling. Consumer dynamics are changing, and retail needs in-depth data and analytics to effectively manage and predict future performance.
Retail is at an inflection point, making it clear that the consumer shopping experience will be part of a "New Normal" rather than a return to the past normal. We explore the role data and analytics will play in sustaining and flourishing in the "New Normal" retail experience.
It's clear that a global recession is a reality at the beginning of the "New Normal." We reflected on key insights from great companies operating in past recessionary environments. It is more critical than ever that CEOs and their teams focus on the top priorities. A.G. Lafley, the former CEO of Procter & Gamble, captured four tasks that CEOs must uniquely perform in such crises (HBR, May 2009).
- Defining the meaningful outside – Of all the external stakeholders, which are the ones that matter most? What results are the most meaningful? During a crisis, resolve in favor of the ones that matter most.
- Deciding what business you are in – Where should you play to win? Where should you not play at all? Are you focused on your CORE business?
- Balancing present and future – Learning to strike the right balance between the short and long term comes more from experience and judgment than from facts. This applies to the business and also your talent.
- Shaping values and standards – Values establish a company’s identity; they are about behavior. Standards are about expectations; they define what winning on the outside looks like. Staying true to these in crisis will define the company.
These four tasks that only a CEO can do are paramount, whether for a Fortune 500 company or a start-up. The CEO cannot delegate them!
On balancing the present and potential of a recession, the following “ten commandments” were captured (over four past recessions) and hold true for companies of all sizes:
- Keep your prices/value equation as stable as possible for your consumers/clients.
- Minimize the price/value spread between your product and lower-cost alternatives.
- Don't degrade your product to save money; increase the role of innovation.
- Keep pressure on all internal parts of the business and external suppliers.
- Maintain your long-term product initiative plans.
- Insist on top quality in recruiting and in your organization.
- Anticipate the consequences of your actions on clients and suppliers.
- Study money management (cash flow) like never before.
- Engage with your clients and strategic partners, among others, personally, like never before.
- Motivate and connect with colleagues and leaders.
Data and analytics are positioned to help and accelerate all these tasks. The focus should be to turn the massive challenge in front of us into unique opportunities. There are three areas in particular, where the new retail experience will excel.
1. Operational re-alignment
- Demand-supply imbalance – As China reopens production, product supply is coming back up to speed sooner than expected. Though many Chinese manufacturers have moved to meet the demand for medical supplies, many are still manufacturing their original wares. With most countries currently under lockdown, manufacturers will struggle as unprecedented stock returns continue, leading to a surplus with Chinese manufacturers. In the meantime, consumer demand will remain uncertain in the near-term horizon beyond essential categories. This lack of demand predictability will continue to widen the demand-supply gap until equilibrium is eventually established, by which time the overall mismatch may result in significant price fluctuations. Retailers operating in non-essential categories who have invested in robust digital operations are experiencing a definite spike in digital demand, but it is not enough to offset the loss of store sales. Analytical interventions – identifying patterns and surges in demand, re-assessing inventory availability scenarios, simulating fulfillment options, optimizing cash flow, and building integrated decision cockpits for operations – will play a vital role.
- Liquidity will re-emerge as a challenge – Many retailers are running promotions to remain cash-flow positive. Retailers of essential categories, such as pharmacy and grocery, will continue to see the current peak in demand while dealing with inventory shortages. Business re-organization has already begun, with major retailers announcing job cuts at large scale. Simultaneously, in-demand online businesses are hunting aggressively for operational talent by offering better compensation. This trend, specifically, will be short-lived as the demand-supply imbalance is corrected. In the long term, retailers might even need to re-evaluate each store location for resource allocation and ascertain whether it adds to overall profitability.
- Diversify sourcing – Retailers and manufacturers will need to re-evaluate their existing sourcing partners and seek cost-effective alternatives where required. Similarly, they may need to reduce dependence on any one source and instead cast a wider sourcing net across several locations or geographies to hedge the risks that such a situation or trade agreements might bring. Analytical initiatives can identify high-demand zones and re-evaluate sourcing partners on total cost of delivery.
Increased collaboration with existing suppliers: To expedite product availability and ease the capacity pressure on retail DCs, retail organizations could enable their suppliers to ship directly to stores. For non-essential goods, manufacturers and retailers can collaborate with their suppliers to increase cash liquidity by removing incentives for on-time deliveries and advising them to delay raw-material procurement.
2. Consumer Expectations
- Omnichannel presence will be the new normal – As the pandemic struck, businesses had to think on their feet to respond to the change. Retailers who were able to shift focus and learn quickly from the emerging challenges have been able to salvage the situation, pivoting to e-commerce channels to keep going, including setting up partnerships for last-mile coverage. The resurgence of the role of brick-and-mortar stores for essential businesses has been apparent during these times, including heroic efforts from store personnel: many stores have had to remain open while ensuring employee safety, limiting store hours, and controlling the number of people in the store at a given time. The true omnichannel experience has been tested and pushed to its limits. The trend is here to stay and will be one of the key business differentiators in the post-COVID-19 world. Innovations such as curbside pickup, In-Store Pick Up (ISPU), and Buy Online Pick up In-store (BOPIS) are now the new normal. This situation may have been the final push for retailers to take omnichannel seriously and divert investments.
- Experiential retail will emerge – One of the growing trends, before the COVID-19 pandemic struck, was experiential retail. As consumers get more accustomed to a zero-touch way of working, this trend might start gaining popularity, and storefronts might be used for brand interactions and product trials. Making the actual order placement and delivery happen completely online might help keep store operational costs down and limit the number of stores required.
Fractal’s Behavioral Sciences arm is running a project to study the impact of COVID-19 on human behavior, the results from which will be made publicly available. Visit here to participate and share your own story.
One thing the COVID-19 crisis has brought with it is the realization that continuous innovation is no longer an option but a necessary pillar for success as retailers start thinking about returning to business as usual (hopefully sooner rather than later). Such innovation efforts will touch every part of the business. While supply chain innovations are well underway, retailers will need to re-think other aspects of their business, including responsible merchandising strategy, pricing and promotions, store-level allocations, technology-enabled planogramming, and deploying always-on, on-the-go information for store associates, among others. Data will become far more essential than before, and retailers that continue investing in foundational capabilities to harness such data (including Artificial Intelligence/Machine Learning/Advanced Analytics, Cognitive BI, Blockchain, and Cloud capabilities) will lead the way into the future. Such capabilities will also become essential for retailers to connect with their core consumers in a more meaningful way, beyond daily transactions.
A post-COVID-19 world must be viewed as a time when business leaders and management will be more open to new ideas of embracing disruption and necessary transformation for the planet, the people, and profit. For some retailers, this disruption may well be the opportunity to pivot to a newer, more modern, and effective way of conducting business.
Every enterprise that wants to power itself with artificial intelligence (AI) has two fundamental questions. First, how will the future of its business be different as a result of AI? Second, what must enterprises do to stake their claim on that future?
In the mid-90s, the web came along; it transformed customer behavior and remade businesses. Today, AI is about to create an entirely new transformation in how companies design and deliver value to their customers. Yet despite the billions spent on tools and bots, AI continues to stumble. Why can't AI use all the information and data that enterprises are generating and deliver results? Because something is missing.
AI works and delivers when you understand the requirements of the enterprise. Every piece of information matters in a business, from processes to products to people. What makes the difference between the promise of AI and delivering on that promise is selecting the right partner.
Here’s what AI-driven growth means for enterprises.
The need for an AI service provider
Engaging an AI vendor looks like the right solution to business problems. But what's the objective? Companies have to think through how they can embed AI in their strategies and business. Ad hoc approaches might not scale, cannot prove out new technologies, and can fail to build systematic capabilities; ultimately, such efforts make minimal impact. Here's what you should look for:
- Proof of capability: There are two approaches to understanding and proving a vendor's capability, depending on whether you are solving a point problem or using AI broadly across the business. For a point problem, the vendor should have the capabilities for the defined solution. For the latter, the vendor needs the capability to work across complex organizational structures on a range of problems. Tests such as pilots and PoCs can be challenging for AI solutions, which require a lot of data and training to work effectively. If the identified vendor can do something meaningful here, you are on the right path.
- Proof of value: When looking for a vendor, one thing you should be absolutely clear about is that AI alone is not enough to deliver value. So, how does the vendor measure the value of their solution? To deliver value from AI, you need a combination of AI, engineering, and design. AI creates algorithms that can match or exceed human capacity; engineering creates the data pipes and technology infrastructure that move data into the algorithms and decisions out of them, into the operating systems where decisions are made; design is the piece that conceptualizes the problem and puts the user at the center. The key is for the value a vendor can demonstrate to align with the business requirement, and therefore to bring these three things together.
Selecting the right AI service provider
The kinds of service providers that may be required include strategy, change management, and implementation consultancies. To identify whether you have the right vendor beside you, do a pilot or PoC. It first tests whether the idea has the needed potential and can generate value for the organization; second, it shows whether the vendor can work with and deliver for you.
Robust, demonstrable methods, independence, and cultural fit will be influential factors in your selection. Along with it, the following key AI-specific considerations also play an important role.
- Experience: The service provider should demonstrate real depth of experience in implementing AI, with the right approach to tools and software. It is essential to go after challenges that can create a noticeable impact on the organization. The pilot is merely one part of the journey; use the initial 10-12 weeks to understand whether you want to continue in that direction or make course corrections. If you take the second route – testing the vendor – then see whether the service provider is aligned with the company's culture and values: will your teams be able to collaborate, are both parties' approaches aligned, and so on. This is a critical aspect, as this equation will be a foundation stone for successful future projects.
- Partnership and independence: You will want your service provider to have strong relationships with the different tools and approaches that are identified. Is the solution already figured out, or are you looking for software to help you? Do you need a partner who can help you think through the entire process, right from the beginning? A service provider that can offer such solutions will be different from a point-solution provider, and each of these solution capabilities is quite different. Finally, consider how much independence the provider has to work on these projects and deliver solutions at scale.
The framework to work with the service provider
Doing a pilot doesn't necessarily translate into success. Once a pilot model is identified, the next question is: what metrics are you trying to impact? A business could have multiple product lines, yet customers could be limited to using just one or two; the value lies in looking at a broader range of products. With AI, such needs can be identified, the next best action or product decided, and then achieved through set mechanisms. The outcome can then be measured through benchmarks like revenue per customer and products per customer, among others.
Understanding whether you are moving in the right direction is important, even though the impact may not be immediate. The pilot is the base that establishes whether the concept has the potential to move forward.
Moving towards data and analytics
For companies to make a move towards data and analytics, three supportive capabilities are required. First, companies should be able to identify, combine, and handle multiple data sources. Second, they require the capability to build advanced-analytics models for predicting and optimizing the outcomes. Finally, the company must be ready to bring in transformation at multiple levels, so that the data and models yield decisions.
There are two main features to make these competencies work:
- A clear strategy, with proxies in place
- The right technology architecture
Then measure where in the organization decisions are influenced by AI, and how that has changed over time.
Identify case studies where core business processes have improved – in customer experience, product development, or operating efficiency – and how much of that you have been able to infuse into the organization.
Successful product adoption
To make any product a success, design is critical. The best uses of AI are those where the user is oblivious to its existence, whether it is the Google search engine or Netflix recommendations. It is the user interface that makes the entire product look simple, easy, and intuitive, and that's what is needed.
This approach is already established in the consumer segment; you need the same lens for businesses as well. User empathy will help you understand users' core issues – how their daily life works, what they care about, and so on. If you can start providing that, without making them worry too much about the AI and engineering underneath the service, then you have a winner.
Listen to the Podcast
Here's Pranay sharing his thoughts with Daniel Faggella on how to select the right vendor partner when buying and procuring AI for the enterprise:
As anxiety and stress due to the unforeseen outbreak of COVID-19 continue to mount, governments and organizations across the globe are falling behind, left searching for the right measures and strategies. This gap is expected to widen further over at least the next few months before the world reaches a stage of catch-up. As with any pandemic, some industries are already hit hard (e.g. retail, transportation, and travel) while others could arguably benefit (e.g. telecom, broadband, entertainment). The economic implications, positive or negative, for the entertainment industry may sound insignificant compared to the threat the virus poses to human life around the world. Nevertheless, they could have a long-lasting impact on the livelihoods of the millions of people associated with the creation or distribution of movies, music, sports, arts, and much more. Initial estimates suggest that 120,000 people are already out of work in Hollywood, and another 50,000 in the UK entertainment industry could potentially lose their jobs.
At first thought, it is natural to assume that mandatory quarantine, work-from-home, and self-isolation measures would benefit companies in cable TV (e.g. AT&T, Comcast, Sky, Virgin Media) and the streaming business (e.g. Netflix, Prime, Disney+, NOW TV). This is, of course, based on the assumption that subscribers will have more time at their disposal and that some of it will convert into increased viewing hours, leading to lower churn (early results show a massive 60% upswing in the amount of content watched in the US). While this will certainly hold in the short term, the long-term implications will be more complicated – driven strongly by how providers adapt to changes in customer behavior. The concern is that, as the pandemic's economic hardship starts trickling down, customers will be driven to cut discretionary spending, with entertainment budgets most likely part of that cut. Let's unpack this to understand the areas of business that will be impacted and their respective guiding principles.
Implications on entertainment value chain
- Content has always been the key mainstay anchoring customer engagement and thereby preventing churn, so the sudden halt to ongoing content creation across the world has thrown the industry into flux. Remember, this is not a job that can be completed remotely; you need people on the ground. Movie productions are being cancelled or postponed everywhere, including China, India, Europe, and the US, as are sporting events – with Wimbledon 2020, Euro 2020, and Tokyo 2020 the most high-profile events to be cancelled or postponed by a year.
- As viewing hours hit an all-time high, subscribers' negative sentiment about the thin choice of new, relevant content across platforms will be at its peak. To keep refreshing the content library, providers will need to quickly reimagine programming and scheduling decisions. E.g., the release frequency of a TV series' episodes may have to be phased out to cover a lean period of restricted new content supply, some content, especially movies, may be streamed ahead of the planned schedule, and repeat content may be slipped in more than usual.
- Bold decisions from the providers (an example) – In an unprecedented move, NBCUniversal decided to break the theatrical window. It announced that it would make some upcoming movies available digitally the same day they are released in whichever movie theatres remain open. Films like The Hunt, The Invisible Man, and Emma are now available for on-demand viewing alongside their recent theatre release, and Trolls World Tour, slated for an April 10 release, is set to be next.
- An increase in demand is usually a great scenario for the entire industry, increasing subscribers for virtually all providers – but not during the current crisis. As the period of isolation prolongs, the content library, from an individual-preferences standpoint, will start getting exhausted, leading subscribers to look for newer options. That, coupled with already high penetration, will leave little room for established players to increase revenue and hand smaller/regional players an advantage to spike their acquisition rates much faster. E.g., in the US, 87% of paid streaming users already hold a Netflix subscription, whereas the numbers for Prime and Hulu are 52% and 41%, respectively. Services like HBO Max, Peacock, and the upcoming Quibi could also benefit from the unexpectedly high demand for newer content alternatives.
- Assuming a longer-term economic downturn, the surge in immediate acquisition rates is likely to be offset by incremental churn at a later stage, because the stretch in the immediate entertainment budget and financial distress will converge at some point. Once the focus shifts from acquisition to retention, it will be interesting to see the choices subscribers make and which subscriptions they decide to cancel. This could potentially expedite industry consolidation through partnership offerings.
- A surge in viewing is straining the internet globally, and to manage this network congestion, governments have started asking providers to reduce streaming bit rates. E.g., Netflix and YouTube in Europe, acting on a request from the European Union, had to reduce streaming quality from high definition to standard definition, a move expected to cut traffic on the respective ISPs' networks by around 25%. Subscribers can still opt in to high-definition video quality, but by default they will view the reduced quality. YouTube and Disney+ have already confirmed they will implement this in all markets, with many other providers expected to follow suit shortly.
- Many providers are in a tricky situation as their ever-dependable programming asset, live sports, the driver of high-cost subscriptions, has suddenly disappeared. Sports forms the crux of the package around which several other channels and packages are bundled. To prevent customers from running into the "value for money" dilemma, providers like Sky, Virgin Media and BT have introduced the flexibility to pause sports packages until the pandemic eases. The move will hurt revenue significantly in the short term but should translate into increased customer loyalty and trust over the long term. A glimpse of what goes into preparing a healthy sports offering: Sky Sports shelled out $1.63 billion per season, or $12.7 million per game, for the current English Premier League deal, with BT Sport paying $404 million per season, or $12.6 million per match.
- In addition to subscription revenues, advertising income is also expected to tumble. High viewing, as is the current case, usually makes for an excellent advertising platform, yet marketing spend in many industries is being cut drastically, especially in travel, hospitality, and retail. The impact is already visible in the stock prices of many advertising agencies, including WPP, Omnicom and Interpublic. The drop in ad revenues will leave broadcasters with lower budgets, limiting their ability to invest in content creation.
- Being stuck indoors has expanded customers' viewing choices. Catering to the shift, BBC has started offering its UK customers more content on education, fitness, religion and cooking. The same could be expected from other government-run or sponsored channels in various countries, with content that is less flashy and focuses on subscriber wellbeing, national culture and history, mythology, etc. Another example of the increased appetite is content around the pandemic itself: Netflix's documentary "Pandemic: How to Prevent an Outbreak" has become one of the most popular searches on the platform over the last month, after initially failing to garner much interest in its release month (January 2020).
- Other expected changes in viewing habits include an increase in demand driven primarily by younger audiences due to school and college closures, a multifold increase in time spent watching news telecasts (up to 3x in countries like China and Spain), and a significant spike in daytime viewing.
Though every major economy around the world has announced a range of financial packages to help people and industries withstand the impact of the pandemic, there is little doubt that many industries will suffer deeply for as long as it lasts, and possibly for a few years after.
Refining AI frameworks in times of unprecedented change and uncertainty
With these disruptions unfolding, it is imperative for organizations to take a prudent, well-informed approach to customer decisions based on data and analytics. Artificial intelligence, machine learning, and most other analytical models provide insights based on what they see in recent historical data. Unprecedented pandemic events, however, are rarely captured in that time frame. The lack of data on such events is likely to prove a major challenge for analysts and data scientists designing these frameworks.
However, with some innovation, flexibility, and planning, organizations can refine AI-driven strategies to improve customer experience, address business challenges across the value chain, and gain competitive advantage. This process can be initiated with a few fundamental design questions:
- Which AI-driven use cases are most critical to the organization? Prioritization: It is critical to assess the business impact of each use case and the modeling frameworks driving it. This will help identify a list of top-priority use cases that may need to be monitored and refined in these changing scenarios. E.g., forecasting models driving strategic decisions like customer retention or revenue planning will have to be refined to support budgetary planning for the cable TV and streaming business.
- How sensitive are AI frameworks to extreme changes in customer behavior and industry trends? Scenario analysis: For important use cases, one starting point is to analyze events and understand the different scenarios that can arise. It starts with identifying the model features likely to be impacted and planning for a spectrum of changes (optimistic, pessimistic, and business-as-usual) they may undergo, evaluating the movement in individual forecast outcomes.
- How can the impact of such events be incorporated into existing and future AI frameworks?
Effective model management: In AI-driven organizations, where hundreds of models are developed and deployed, model management and monitoring become critical. This applies especially to real-time models, where acceptable downtime is far shorter than for models refreshed weekly or monthly. With ongoing monitoring, analytics teams can deploy solutions more efficiently and keep fall-back options ready should one approach not work, minimizing downtime and reducing the impact on the business. E.g., deploying robust validation frameworks and alert mechanisms that notify the analytics team of poor model performance enables quicker, more efficient turnarounds wherever performance is sub-optimal.
Incorporating rare high-impact events into modeling frameworks: Events like COVID-19 are extreme and rare, severely disrupting the way customers interact with and perceive their environment. Whether it is product upgrades, new subscriptions, or viewership patterns, changes are likely to be observed across the majority of KPIs. It is therefore necessary to have robust yet flexible modeling frameworks that can account for these events. One such approach is to create proxy features: features designed on historical events that had a similar impact on customer behavior can be used to shadow the current events. E.g., in Europe, we observed:
- During the Beast from the East phenomenon in 2018, viewership increased by 12% in Week 9.
- A similar trend is observed during the current COVID-19 situation: a 14% increase in viewership in Week 12 compared to the average of the previous two years.
The similarity between the two events stems from the fact that in both cases, people were forced to remain confined at home.
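As a minimal sketch of the proxy-feature and scenario-analysis ideas above, the snippet below treats the uplift observed during the 2018 event as a proxy for lockdown weeks and scales it under optimistic, business-as-usual, and pessimistic scenarios. All figures, function names, and scenario scalings are illustrative assumptions, not Fractal's production models.

```python
# Illustrative sketch: use a past confinement event (the 2018 "Beast from
# the East") as a proxy feature to adjust a viewership forecast for COVID-19
# lockdown weeks. All figures are hypothetical.

def baseline_forecast(history_by_year, week):
    """Business-as-usual forecast: average of the same week in prior years."""
    return sum(year[week] for year in history_by_year) / len(history_by_year)

def confinement_uplift(observed, baseline):
    """Relative uplift observed during a historical stay-at-home event."""
    return observed / baseline - 1.0

def scenario_forecasts(baseline, uplift, scenarios):
    """Scale the proxy uplift under each named scenario."""
    return {name: baseline * (1.0 + uplift * k) for name, k in scenarios.items()}

# Hypothetical weekly viewership (key = week number) for two prior years.
history = [
    {9: 100.0, 12: 104.0},  # two years ago
    {9: 102.0, 12: 106.0},  # last year
]

# Week 9, 2018: the Beast from the East pushed viewership ~12% above baseline.
beast_baseline = baseline_forecast(history, 9)            # 101.0
beast_uplift = confinement_uplift(113.1, beast_baseline)  # ~0.12

# Forecast lockdown Week 12 under three scenarios for how strongly the
# proxy uplift carries over (the scalings are assumptions, not estimates).
week12_baseline = baseline_forecast(history, 12)
forecasts = scenario_forecasts(
    week12_baseline, beast_uplift,
    {"optimistic": 1.5, "business_as_usual": 1.0, "pessimistic": 0.5},
)
for name, value in forecasts.items():
    print(f"{name}: {value:.1f}")
```

In practice the proxy feature would enter a fitted model rather than a hand-set multiplier, but the structure, a baseline plus a scenario-scaled event effect, stays the same.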
Integrating a behavioral science perspective to understand and cater to customers’ needs
Understanding customers' emotions through a behavioral science lens, in the context of a pandemic where people are homebound and experiencing emotions such as fear, anxiety and loneliness, is key to understanding the related action tendencies. Here are a few behavioral concepts that could help organizations respond to this need-state:
- Customers are experiencing emotions of anxiety, stemming from the high amount of uncertainty.
Anxiety makes one crave what is familiar and safe, as it helps counter feelings of uncertainty, even if the comfort comes from an unrelated source. On a content platform, this means a person is likely to go back to shows they have previously enjoyed.
- Relaunch old content and highlight old favorites. E.g., cue re-watching old comedies, shows, and movies that people have grown up with or seen in the past; being familiar with the outcomes gives people a sense of comfort and control.
- Lockdown is creating a feeling of isolation and loneliness.
Frame the organization's position to recognize and empathize with the emotion of loneliness, working towards bridging the void.
- Creating social viewing in an isolated context addresses the feeling of loneliness and the inability to physically connect with people. Dialling up sharing, commenting, and recommending would bring some socialness to otherwise isolated lives. E.g., framing offers that help a group of people get a better deal so they can stay connected.
- In an unprecedented situation that no one had imagined, customers are experiencing extreme uncertainty.
It is a global shock, and hence, people might seek content that creates “preparedness” or visibility into the worst-case scenarios.
- Allow audiences to live through the end of days vicariously. It’s a form of emergency preparedness for the mind, rendering thinkable the unthinkable and theorizing where the average person’s place in all of it might be. E.g., curating a set of “end of the world” movies.
- Time has lost its relevance and place in people’s days currently.
Positioning content towards "productivity" or "learning" would help make the time spent watching feel meaningful rather than "wasted", e.g., curating a set of documentaries, biopics, and informative content.
To conclude, providers need to strategize for both demand-side changes (customer behavior, price elasticity, etc.) and supply-side disruptions (content delivery and costs, bundling, distribution, etc.). However, this is still an early assessment and will require ongoing monitoring of the pandemic's short- and long-term ripple effects as the new normal takes form.
COVID-19: A need to re-look at digital
In light of the COVID-19 pandemic, businesses across the world are slowing down. Sectors such as travel and hospitality are already experiencing severe slowdowns; others like retail, consumer goods, and financial services will face pressure on both the demand and supply side with social distancing and work-from-home policies in effect1.
While the US Bureau of Labor Statistics estimated that only about 29% of workers could work from home, this is skewed towards the more educated and some industries, like tech and finance2. But the COVID-19 pandemic has surely forced an increase in that number – even if for the short term. An interesting side effect of work from home and social distancing policies is the increase in online services such as:
- Streaming: Netflix usage up by 60%3
- E-commerce: average weekly revenue growth rates up by 52% (but with many complications)4
- Remote collaboration and teleconferencing: Microsoft Teams usage up more than 200%, and Zoom's stock price up 42% since January5
Of these, e-commerce is of most concern to CPG manufacturers because of its twin threat: on the one hand, many CPG categories have been e-commerce laggards, and on the other, e-commerce reduces brick-and-mortar shopping trips (a mainstay for impulse categories such as confectionery). In this situation, it is imperative that CPG manufacturers re-examine their digital strategy and steer it towards areas of growth.
Digital Marketing: Beyond Consumer Journey
Traditional consumer journey models detail an "ideal consumer journey" across various milestones and allow the creation of customized experience-based models. This macro, milestone-based model misses the micro-moments that underlie real-life consumer experiences.
Across these moments, consumer preferences are constructed and reconstructed, making the real-life journey non-linear, contrary to what is traditionally assumed.
Emotionally Congruent Micro-Moments
Most moments are relevant to categories and brands because of their emotional congruence with the category, product, or brand. For example, when feeling cognitively depleted because of a long meeting or reading multiple articles on COVID-19, the resulting emotional low of depletion is relevant for a whole category of products which can be used to get a dopamine refill – from coffee to snacks and so on.
But unlike a consumption or purchase milestone in a journey, many of these moments are neither apparent to nor activated by CPG manufacturers, whose focus remains on milestones.
Identifying and Activating Digital Moments
People have specific action tendencies in various emotionally charged moments. Digital "action tendencies" such as consuming, connecting, sharing, and engaging can be interpreted, along with other signals, as a digital fingerprint of these moments. For example, when depleted we are more likely to be mindlessly browsing content than deeply engaged in a Twitter war.
Activating these moments is not a category-level job: CPG manufacturers have to think sub-category. For instance, while liquid soap is functionally similar to a bar of soap, it is psychologically quite distinct, liquid being more apt for social distancing. Activating social-distancing moments for soap has to incorporate a behavioral understanding of such moments of doubt, the appropriate digital signals across different channels, and context-appropriate messaging: a new form of digital marketing.
The worldwide outbreak of COVID-19 is creating major swings in demand across CPG as businesses selling necessity products experience a surge in sales, while other discretionary items see lower demand. Fractal’s Consumer Hub is enabling CPG clients to get customer insights and tap into the current needs of consumers.
COVID-19 continues to impact global organizations negatively, and the reality is that the virus is expected to spread for a while. As economies worldwide move to a wartime footing and belief in Keynesianism is restored, it is now paramount to continuously evaluate how businesses can not only survive the crisis but also serve their respective communities and consumers.
With this ambivalence facing the industry, we have pressed the pedal on fast-tracking the digital transformation journeys our clients are undertaking.
Rethinking omnichannel and adopting
Over the past year, Fractal has worked to bring to life a vision of AI-powered consumer, category, and brand insights that powers go-to-market pivots, fuels innovation, and focuses content for personalized targeting. It has enabled our clients to continuously identify new market opportunities, spot emerging competition, and detect emerging consumer prospects and consumption patterns across consumer product categories: convenience, shopping, specialty, and unsought goods.
The sheer volume of data around us is unfathomable, and each passing day necessitates a new response. This work traditionally took several months and was outsourced to consultants or agencies, leaving organizations with little clarity on the sources of information. Moreover, reliance on social listening alone, or market research alone, distracts and confuses organizations about the veracity of the insights drawn from trends.
Using AI, Consumer Hub filters out irrelevant signals and continuously monitors consumer pulse, competitor tactics, regulatory maneuvers, government announcements, and policy decisions. It automatically spots hotspots, trend shifts, and real data-based insights. A continuously evolving product such as Consumer Hub keeps our clients at the forefront of transforming operations digitally and helps them adopt a discovery-driven planning (DDP) process, bringing incremental change to the ways insights are generated, consumed, and acted upon. Our steadfast belief stems from the following quote:
If you don’t monitor data, you will regret it. If you don’t measure data, you will regret it
Understanding the shopper to drive sales
The corporate workforce around the world is struggling to comprehend the sudden changes in their routines and changed surroundings due to the COVID-19 pandemic. We can see a run on products like cleaning items, toilet paper, grocery staples, and medicines. To tackle the current situation efficiently, we need to understand the bigger picture first.
With Consumer Hub ready, we quickly pivoted to configure a COVID-19 tracking command center to continually monitor community needs, government announcements, and shifts in industry trends. Our core objective at Fractal is to better understand how we can be of service to the community and clients alike. With the biggest-ever focus on personal hygiene and home care products, we are observing many trend shifts that may just be the beginning of a permanent lifestyle change. Demand for categories such as detergents, dishwashing liquids, and other household cleaners has skyrocketed. Continuous category contouring and tracking of synergies with adjacent categories have helped us understand cross-category improvisation and innovation in consumer usage occasions.
In health and personal care, there is an increased focus on big brands driving product consideration, with top brands driving 5x higher conversations than traditionally defined category need states. Consumers have also become more particular about topics such as hygiene, disinfecting, germs, and antibacterial protection. This change in behavior has thrown up a few worthy substitutes, such as OTC medical sanitizing tablets, and revealed a few key ingredients that have caught the eye of otherwise habitual buyers of daily essentials.
In more sensuous categories such as home décor or ambience creation, staying at home has made us think long and hard about safety and continuous product usage. Such emerging vectors are likely to drive a permanent shift in consumer needs and usage preferences in a post-COVID world. On the bright side, this gives us a tremendous opportunity to bring innovation to the market.
To win in the era of relevance
As much as Consumer Hub has helped our clients stay abreast and activate decisions faster, it has also helped us spot early trends in conversations around their products. As a strategic analytics partner, Fractal carries a tremendous responsibility to monitor any unintended usage of our clients' product portfolios that may trigger serious health concerns. We have set up a weekly drumbeat tracker with crisp, actionable insights to drive business decisions and activation.
With successful activation and business embedment, we are now beginning to envision if Consumer Hub can mine data to provide insights into operations planning in a post COVID-19 world. We currently work to understand and translate this unparalleled transition across industries and the impact it is posited to have. It is about the way we work remotely, cope with access to education and healthcare for our family, navigate vagaries of social distancing, and balance fluctuating emotions while keeping good mental health. Understanding all such signals and engaging in empathetic decision making will be key to a sustainable, digitally transformed organization.
Road to the future
Road to disruption is not an easy one, especially when the goal is to transform traditional spaces such as consumer and market research. Five core principles are fundamental to the vision of Consumer Hub:
- Think consumer,
- Capture intent-driven micro-moments,
- Let data do decision making,
- Tell and integrate consumer, category and brand story,
- Learn & iterate using AI
AI signals many possibilities for organizations in every industry who are trying their best to serve both community and consumers – the decision wheel swings full circle from real-time sensing of consumer pulse to continuous evaluation and course-correction. At Fractal, we are continuously adapting to confront digital challenges and learn our way towards a new business model. The future is unknown, uncertain, and not yet obvious; now is the time to learn as much as possible, connect and serve consumers in every possible way. We are bullish about transforming the world together through consumer, competition, content, and commerce intelligence.
Sometimes a change is so hard and dislocating that it is difficult to see a way through. In both our personal and corporate lives, the global COVID-19 pandemic presents a great challenge: functioning in the "new normal."
The current COVID-19 health scare, along with stay-at-home measures, limited mobility, minimalist business scenarios, and early signs of a looming recession globally, has created a volatile, uncertain, complex, and ambiguous situation.
Businesses find themselves looking into the abyss of a mostly moribund economy, while executives try to manage multi-level organizational challenges. This changed corporate world is leading to long working hours, an overload of meetings and actions, the merging of family and work time, and stressed minds.
It is essential to simplify work tasks, enable agile decision-making, provide easy and fast access to relevant data, and offer aids such as exception alerts and the reduction of non-value-added tasks.
We are looking at a time when social distancing dilutes in-person connections and strengthens comfortable long-distance relationships. Executing tasks, achieving goals productively, and enhancing the quality of personal and family life now happen more remotely than from office spaces, adding meaning to life beyond work.
There may be infrequent in-person meetings and even limited yearly offsites, while all other strategic and operational meetings go virtual. This would reduce much of the travel, congregation, and physically eating, playing, and laughing together, leading to a new virtual working environment that most employees of the world are not yet used to. It will warrant training, education, and simplification of tools to an unprecedented level. Let's explore the new digital workspaces of the coming times.
Accelerating implementation in the digital ecosystem
In our mission of powering every decision in the organization, we have found that traditionalists were the modal business persona in any organization: those used to a systemic process of information collection and dissemination. Circa 2020, COVID-19 has forced the need to work from home, an environment in which the systemic processes and infrastructure so suited to the workplace need a complete rethink.
Zoom is serving up to 200 million users, as people are "zooming" (yes, it is becoming a new term) for everything from business calls, education, and weddings to Zoom parties, and Microsoft has announced more than 75 million users, pushing towards rapid digital transformation. This has brought to the forefront the rise of the self-serve user persona. In today's VUCA world, with COVID-19 stay-at-home measures in place, virtual decision-making meetings are imperative, forcing a digital transformation never thought of before. Traditionalists who relied on business analysts for decision-supporting analyses are forced to become self-serve. Round-the-clock access to precise information to deliberate and take decisive action has become the need of the hour, and requests for AI-based digital ecosystems spanning all devices (laptops, tablets, mobiles) for business analysis have risen.
2020 VUCA world: Need for integrated business operations with fast and agile decisions
Businesses globally are seeing an unprecedented need for agility in decision-making during COVID-19. With economic uncertainty, loss of employment, limited store openings, fluctuating consumer needs, and increased online shopping behavior, all projections of category forecasts, demand plans, operational throughputs, working capital, and inventory estimates have to be redone far more frequently.
With Fractal’s unique integrated business planning, cross-functional executive collaboration is now effortless. All the decisions flow from a marketing forecast that’s driven dynamically by price, advertising, merchandising, seasonal events, and competition. The data is added to AI-based sales planning, including shipment and availability, to avoid out-of-stocks on all channels at the right price.
It is further verified and course-corrected against the on-the-ground situation of factories impacted by COVID-19, including the health of blue-collar and white-collar workers and upstream raw material supply. It is then reconciled with working capital, inventory, and projection estimates to build the YEE P&L opportunities and risks. The entire process is reviewed with the CEO in a management review to agree a go-forward action plan. All of this is enabled by reimagined process engineering, smartly designed tools, and big data.
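The driver-based marketing forecast described above can be sketched, under simplifying assumptions, as a log-linear demand model in which price, advertising, and seasonal events scale a baseline. The coefficients, elasticities, and figures below are illustrative placeholders, not Fractal's actual planning logic.

```python
# Illustrative driver-based demand forecast: a log-linear model in which
# demand responds to price (via elasticity), advertising, and a seasonal
# event flag. All coefficients are assumptions, not fitted values.

def forecast_units(base_units, price, base_price,
                   ad_spend, base_ad_spend, seasonal_event,
                   price_elasticity=-1.5, ad_elasticity=0.1,
                   seasonal_lift=0.2):
    """Forecast units as a baseline scaled by multiplicative driver effects."""
    price_effect = (price / base_price) ** price_elasticity
    ad_effect = (ad_spend / base_ad_spend) ** ad_elasticity
    season_effect = 1.0 + (seasonal_lift if seasonal_event else 0.0)
    return base_units * price_effect * ad_effect * season_effect

# A 10% price cut with doubled ad spend during a seasonal event
# lifts the forecast above the 10,000-unit baseline.
units = forecast_units(
    base_units=10_000, price=0.9, base_price=1.0,
    ad_spend=2.0, base_ad_spend=1.0, seasonal_event=True,
)
print(round(units))
```

In a production setting the elasticities would be estimated from history and re-estimated as COVID-19 shifts behavior, which is exactly why such forecasts now have to be redone far more frequently.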
Remaining agile in crisis
Given daily-changing consumer and customer needs, CPG companies are struggling to meet demand amid active stockpiling and supply chain disruption. It has become essential to get an integrated view of overall business performance in one go: to understand the market performance of portfolio brands and categories across channels and geographies. Identifying emerging trends and acting with speed and agility will define business success in these uncharted times. Businesses are therefore looking for one-stop solutions that reveal the crisis's impact on their top and bottom lines through an integrated business performance system, one that captures shifts in shopping behavior and consumption patterns along with macroeconomic factors such as unemployment and forex rates.
Data to decision journey with AI, engineering, and design
We believe in a comprehensive yet straightforward data-to-decisions journey for Fortune 500 organizations, enabling fast, fact-based decision-making by leveraging the power of AI, engineering, and design. In COVID-19 times, this single source of truth enables efficient consumption of the right data for top-line and bottom-line use cases such as revenue management, marketing optimization, hotspot forecasting, operational intelligence, and demand and supply planning.
Fractal uses data accelerators such as Concordia to ingest and harmonize internal and external, structured and unstructured data sources in a few weeks to build the data foundation. Our solutions like Eugenie & Foresient help companies find anomalies, patterns, and forecasts in the short and long-term to make strategic and operational decisions. Accelerators like Decision Whiz, Cuddle & ERYL enable simple, easy, and human-design friendly consumption of data for higher adoption using behavioral nudges based on contextual organization research.
The maxim "if you are the only one who gets it, you have failed" rings especially true today.
We can simplify work-lives by empowering executives with data that is easily available on their devices: laptops, tablets, and mobiles for quick access on the move, plus desktops and collaborative rooms in office spaces.
We can use algorithms to take daily tactical and operational tasks off the plates of thousands of people in an organization, allowing them to focus on their health, family, and hobbies while working in this new normal.
Intuition and judgment can finally be backed by historical evidence and learning. Through this disruption, legacy organizations can also leapfrog to digital transformation – converting traditionalists to self-serve using next-generation technology. This transformational program can show quick-wins, thus leading to the highest adoption of tools ever, that can be celebrated to propel the organization forward.
Having implemented these transformative set-ups for Fortune 500 companies for a variety of contexts, we feel confident in helping entities – businesses, governments, and every individual navigate through this tough time with less pain, even if work-life becomes a less significant part of many of our lives.
The recent COVID-19 global pandemic is sending shockwaves through financial markets and consumer-goods demand, and significantly changing the way we engage in normal business activities for the foreseeable future. As a result, financial institutions could see a run on cash, limiting their liquidity and restraining access to consumer credit, which will require retraining of credit risk models. This could also lead to a subsequent increase in delinquency rates and the likelihood of default across consumer lending, necessitating a refresh of account servicing strategies. However, the most significant impact on financial institutions may be operational. Many retail banks are currently evaluating ways to minimize COVID-19's impact on their day-to-day operations, testing business continuity plans (BCPs), implementing work-from-home (WFH) protocols, and investing heavily in digital transformation efforts to combat the disruption. As a result, competition amongst banks will be fierce as consumers evaluate their banking options, and early adopters will gain a competitive advantage over slow adopters. Intelligent Automation (IA) solutions can help organizations drive value throughout these challenging times, and amid the opportunities they present.
One of our core operating mantras is, "Don't send a human to do what a machine can!" As per MarketWatch, the market for technological automation, such as robotic process automation (RPA), is growing at 20% per year and is likely to reach $5B by 2024. Intelligent automation leverages advanced technologies like data science and AI to make automation smarter and provide considerably more value to an organization. Over the next three years, polled executives estimate that intelligent automation efforts will drive an average cost reduction of more than 20%, with a corresponding revenue increase of approximately 10%. The benefits of IA include speed and precision in operational productivity, reduced costs, greater accuracy, and improved customer experience. IA enables organizations to build bots that automate critical business processes that are highly repetitive for humans, providing an opportunity to refocus attention on strategy and innovation. With the growth in IA solutioning, FS executives expect automation to increase workforce capacity by more than 25%, equivalent to over 2 million incremental FTEs by 2023. Most financial services leaders agree they will reallocate the time saved through automation to higher-value work that improves customer experience and drives organizational growth.
Let’s evaluate a few of the key operational disruptions to retail banks where IA can be applied:
The global nature of this pandemic highlights the need for retail banks to develop a readiness playbook for dealing with operational disruptors. To prepare, these institutions should evaluate both data and operational governance protocols, as well as work-from-home (WFH) guidelines, to minimize disruption to employees. This playbook should include parameters for streamlined organizational decision-making and an effective communications strategy for both employees and customers. Organizations should set up procedures and strategies to effectively prepare for a complete digital environment: a digital ecosystem where all customer sales and service journeys are handled via web, app, and email. They should also review all BPO activities to evaluate opportunities for greater automation. AI and ML are essential mechanisms that can support banks in refining business operations during this massive digital migration. Once these opportunities are identified, intelligent automation can be effectively deployed to drive efficiency.
As the pandemic advances, traffic to, and therefore reliance upon, brick-and-mortar bank branches diminishes. Consequently, bank customers will remain at home, and we will see a substantial migration to digital channels for conducting banking activities. In response, retail banks will prioritize customer experience and hyper-personalization initiatives to ensure consistent execution for customers who migrate to, and are likely to remain loyal on, these digital platforms. Advanced personalization can be used to proactively direct customers to responsibly maintain and service their accounts. In addition, banks will look to technical consulting and advanced analytics to remove all possible friction from these channels across both sales and servicing journeys, while identifying root causes. Automated, AI-based solutions have been developed to overcome behavioral and data issues and drive digital insights by channel. However, there will always be customers who are less technologically sophisticated, so data-driven enhancements to inbound CSR smart call routing and self-service IVR strategies will be required. In the case of one large bank, we have seen digital activity increase by only 20%, while call center volumes have increased by 90% over one month. Lastly, FS institutions should be encouraged to dig deep into their data lakes during this surge in digital transactions across their web, mobile, IVR, and chat channels. AI and ML can be useful tools to weed through this data, manage these unprecedented volumes, and uncover actionable, strategic insights on changing customer behaviors.
As operations teams are forced to work from home while seamlessly conducting business on behalf of customers, firms will need to perform digital analytics to redesign their technical infrastructure in support of this massive digital migration. This technical assessment should cover information/data security, network capacity, risk monitoring, and systemic tools. There is also a significant opportunity to leverage intelligent automation in alert management and advanced monitoring of information security threats. IA marries standard RPA bots with AI/ML techniques that make the bots more intelligent. A perfect example of where this could be leveraged is retail banks’ KYC (Know Your Customer) automation.
Implemented correctly, automation of critical KYC processes, like customer identity verification and required documentation capture, frees up humans to apply reasoning and analysis. In many cases, this leads to better decisions and decreased risk. Another relevant use case might be integrating intelligent optical character recognition (OCR) with RPA to create a robust workflow for automating document-heavy processes such as invoices, contracts, and sales/purchase orders within accounting.
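The OCR-plus-RPA workflow described above can be sketched in miniature. The snippet below is an illustrative assumption, not an actual Fractal or client implementation: the regex extraction stands in for parsing a real OCR engine’s text output, and the field names and auto-approval threshold are invented.

```python
import re
from dataclasses import dataclass
from typing import Optional

@dataclass
class Invoice:
    number: Optional[str]
    total: Optional[float]

def extract_invoice(ocr_text: str) -> Invoice:
    """Pull key fields out of OCR'd invoice text with simple patterns."""
    num = re.search(r"Invoice\s*#?\s*([\w-]+)", ocr_text)
    amt = re.search(r"Total[:\s]*\$?([\d,]+\.\d{2})", ocr_text)
    total = float(amt.group(1).replace(",", "")) if amt else None
    return Invoice(number=num.group(1) if num else None, total=total)

def route(inv: Invoice, auto_approve_limit: float = 5000.0) -> str:
    """Route the document: auto-process when complete and low-risk,
    otherwise queue it for a human reviewer."""
    if inv.number is None or inv.total is None:
        return "human_review"    # missing fields -> exception queue
    if inv.total > auto_approve_limit:
        return "human_review"    # high value -> human judgment
    return "auto_process"

sample = "Invoice # INV-4481\nVendor: Acme Corp\nTotal: $1,249.50"
inv = extract_invoice(sample)
print(inv.number, inv.total, route(inv))  # INV-4481 1249.5 auto_process
```

The human-review branches embody the point above: the bot handles the repetitive capture and verification, while incomplete or high-value cases are escalated to human reasoning.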
In a recent survey of U.S. financial institutions, “less than 45% of retail banks and credit unions felt they had a high or very high degree of readiness in mobile technologies and digital platforming, with approximately 20% stating they had low or very low level of readiness.” Automation is key, and there are numerous ways to accomplish this objective. Developing a digital roadmap covering all customer sales and service journeys is a critical first step. These institutions should evaluate their use of automated assistants, IVRs, and bots against the likelihood of increased traffic across those channels. In addition, they should invest to accelerate the adoption of digital servicing mechanisms, including electronic statements, self-service troubleshooting, and account maintenance. Intelligent automation of customer responses, alert notifications, and inbound query handling across email, web, and chat channels is critical.
A classic example is where banks are developing programs to provide automated responses to customer inquiries via emails. Typical business rules offer 20% coverage across standard queries, where the integration of more sophisticated AI can drive automated email response to over 60%. That’s a significant lift with the introduction of AI principles.
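The layered design behind that lift can be sketched as follows: a business-rules tier answers exact-keyword queries, and a (toy) bag-of-words intent scorer extends coverage to looser phrasings. A production system would use a trained NLP model; every rule, phrase, and template name here is invented for illustration.

```python
from typing import Optional

# Tier 1: exact-phrase business rules (the ~20% coverage layer).
RULES = {
    "reset password": "password_reset_template",
    "card blocked": "card_unblock_template",
}

# Tier 2: toy intent examples standing in for a trained classifier.
INTENT_EXAMPLES = {
    "password_reset_template": ["forgot my login", "cannot sign in"],
    "statement_template": ["need my statement", "download transactions"],
}

def rules_route(email: str) -> Optional[str]:
    text = email.lower()
    for phrase, template in RULES.items():
        if phrase in text:
            return template
    return None

def intent_route(email: str) -> Optional[str]:
    """Score each intent by word overlap with its example phrases."""
    words = set(email.lower().split())
    best, best_score = None, 0
    for template, examples in INTENT_EXAMPLES.items():
        score = sum(len(words & set(ex.split())) for ex in examples)
        if score > best_score:
            best, best_score = template, score
    return best

def respond(email: str) -> str:
    # Fall through the tiers; anything unmatched goes to a person.
    return rules_route(email) or intent_route(email) or "human_agent"

print(respond("My card blocked since yesterday"))       # rules tier
print(respond("I forgot my login details"))             # intent tier
print(respond("Complicated dispute about a merchant"))  # escalated
```

The fallback chain mirrors the coverage numbers above: rules catch the standard queries, the model layer widens coverage, and the remainder is routed to an agent.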
In the end, these are historically challenging times, and the COVID-19 global pandemic will fundamentally change how we work and engage with customers, now and into the future. The retail banks that survive and thrive will be the ones that react and adapt quickest to these environmental changes. Digital is at the epicenter of it all. There are accessible ways to get started with intelligent automation that can help banks improve business operations and customer experience. As the new normal takes shape, it is critical to address these opportunities now.
- Global Market Insights, “To 2024, robotic process automation market to see 20% CAGR,” MarketWatch, October 5, 2018. https://www.marketwatch.com/press-release/to-2024-robotic-process-automation-market-to-see-20-cagr-2018-10-05
- Deloitte Insights, 2019 Deloitte Global Human Capital Trends, 2019. https://www2.deloitte.com/content/dam/insights/us/articles/5136_HC-Trends-2019/DI_HC-Trends-2019.pdf
- Gartner, “How to Scale RPA and Achieve Business Value in Utilities”, April 2018
- Forrester, “RPA Operating Models Should Be Light and Federated”, Aug 2017
COVID-19 has impacted a large number of countries and is proving more disruptive than the critical economic, strategic, and political clashes happening around the world. The outbreak has impacted, and continues to impact, all industries, including the Information & Communication Technology (ICT) sector.
Tech giants have hit the pause button on marketing operations, canceled essential events and conferences (most notably the Mobile World Congress), and announced financial results below market expectations. Microsoft lowered its revenue estimates for the quarter ended March due to the impact of the epidemic, citing lower sales of Windows software and Surface devices1. Apple likewise said last month that its revenue for the quarter would be below forecast: although demand seems to be in line with expectations, the supply chain is returning to normal operations at a slower pace than anticipated.
According to international law firm Baker McKenzie2, the crisis has fostered corporate introspection and the need for businesses to re-evaluate near-term and long-term supply chains, resource deployment, and liquidity in the face of what looks like “a looming global recession.”
In other words, businesses are tightening their belts and being careful with their spending, just like the rest of us.
According to International Data Corporation (IDC), growth in global IT spending is expected to decline by 3-4% by the end of 2020 under the ‘pessimistic scenario’ for the COVID-19 pandemic. While the major impact is expected to fall on the hardware business, including devices, the software and services businesses are also expected to slow down as the spread of the coronavirus goes beyond the boundaries of Asia. However, collaborative applications and cloud services are seeing a positive impact, followed by technologies such as security, big data, AI, and IoT, where the impact appears relatively small.
Even though businesses are grappling with current losses, in the long run the ICT industry might be one of the few still standing and, in many respects, stronger than before. But it will not be plain sailing for all businesses in the market. The strain on infrastructure networks, contractions in consumer spending, disruptions to the supply chain, reduced availability of components, and the all-around financial impact of the coronavirus are taking their toll in the short term.
In a nutshell, before we see the potential rebound after the pandemic has slowed, we might see some severe short-term implications: a retrenchment in outlook and reduced investment in modernization, as survival instincts trump the drive for prosperity.
Impact on different ICT areas: Growth engines and challenges
The world has never been more interconnected, and this health crisis is affecting the whole world. The ICT industry is already losing many income opportunities, and it is still uncertain when the situation will be contained. Amid this uncertainty, several technology areas will remain a key focus and keep emerging as clear winners.
With organizations promoting remote working, there is already an exponential rise in video and phone calls, as an increasing number of people organize meetings via apps or collaboration platforms. Digital media and Over-the-Top (OTT) content players are benefiting, while Virtual Private Networks (VPNs), cybersecurity, and data security are other technologies that will see a surge as most workforces operate remotely. Cloud services will grow, boosted by higher usage of content, gaming downloads, video conferencing, and remote access to corporate networks. There will also be an increased focus on technologies like artificial intelligence, big data, augmented reality, and virtual reality, among others. Equipment maker Huawei, for example, recently posted a surge in its technological services comprising AI, video conferencing, and wireless network coverage across the Asia Pacific, given the on-ground communication challenges to ensure smooth connectivity.
Although the COVID-19 pandemic has certainly not triggered Industry 5.0, it has brought home the reality of Industry 4.0: digital workflows, robots, and automation are no longer goals; they are requirements. IoT devices have offered organizations a path toward preserving revenue streams during this pandemic.
Other areas that would see an uptick include e-learning, online education, and e-governance. As shoppers begin to self-isolate and avoid crowded areas, the clear winner is the e-commerce sector, with digital payment taking over a lot faster than the physical payment options.
The media industry (both traditional broadcasters and newer streaming platforms) is playing a vital role during the pandemic, providing correct and responsible health information to the public alongside search engines and social media platforms. For the morale of those isolated, access to the wealth of quality content available is also important. (Refer to our whitepaper, Implications of COVID-19 on cable tv and streaming business3, for a more in-depth analysis.)
The current circumstances may also accelerate the adoption of 5G to meet the demands of bandwidth, performance, and network slicing. There will be more focus on the sufficiency of networks to carry the significantly increased traffic as working from home continues to ramp up. Social distancing and self-isolation mean that telecommunication has become an elevated essential service. It will be worth watching how Communications Service Providers (CSPs), both Mobile Network Operators (MNOs) and cable operators, meet the challenge of their new critical role in a changed world.
While people are mostly using home WiFi, they still use their mobile phones for voice communications, and this is an area MNOs need to keep a close eye on. With traffic to video conference services (Skype, Zoom, and others) going through the roof right now, dialing in is an alternative when video conferencing platforms are overloaded.
Also, with the need for social distancing, people are communicating more, which will drive additional voice traffic, whether on legacy circuit-switched systems or on 4G networks with Voice over Long-Term Evolution (VoLTE) calls. The strain on voice services, though, will be eased to some extent by the convergence of WhatsApp, Messenger, Skype, Hangouts, and FaceTime calling within the iOS and Android ecosystems. Nevertheless, some CSPs are predicting significantly higher voice traffic growth against a regular year-on-year increase of only 5%4.
With the potential congestion of fixed broadband service, there is a chance that devices will fall back on mobile networks, and this will cause a significant surge of mobile data traffic. Likewise, in areas where MNOs also provide home broadband connectivity, one would expect a significant increase in data traffic, and in some parts of the world where unlimited data isn’t pervasive, an increase in subscriber costs.
The cable operators, on the other hand, will be more impacted than MNOs due to the increased activity at home – working from home and a greater demand for streaming services. Netflix and Google have already announced that they would be reducing video streaming quality in Europe for a month from high definition (HD) to standard definition (SD) to prevent network overload and collapse.
With significant changes in user behavior, escalating use of telemedicine solutions for remote diagnosis, and the corresponding data traffic shifts, it will be increasingly important for CSPs, whether it’s a mobile or cable operator, to keep vigilant and closely monitor their network.
Overall, the future for the ICT industry is here, but only the savviest businesses will bring it to its full potential.
The pivotal role of AI and analytics in supporting ICT businesses
‘During the crisis and its aftermath, winning companies will reinvent themselves by putting data and AI at the core of their organizations.’ They can leverage AI and Analytics on two major fronts:
- Driven by its internal needs as new challenges, opportunities, and use cases emerge.
- Driven by the systematic shifts in consumer behavior in what will be a ‘New Reality.’
- Key areas (driven by internal needs) where AI and analytics will play a crucial role:
- Revenue and Business Continuity Planning: Some of the massive shift to remote work due to the pandemic might be temporary. However, much of it will persist as more businesses provision for long-term, flexible working arrangements, and on-demand staffing models will become more common. AI is no silver bullet for implementing new ways of working, but it can play an important role. It can help companies better simulate live-work environments, use predictive analytics to forecast sales and operational challenges such as staffing needs and supply disruptions, and build on-demand staffing. Data-driven strategies can address uncertainty effectively by creating scenario-based analyses over key variables, updating the models dynamically as new data comes in.
- Network and operational analytics: AI/ML-based analytics can provide automated anomaly detection at scale. As traffic increases and networks choke up, understanding spikes and breakdowns in a scaled, automated manner will be critical. AI and analytics can track these anomalies far more efficiently and accurately than manually combing through system issues.
- Collaboration services leveraging AI: Communication and collaboration platforms have already seen an exponential rise in voice and video calls. Using voice-based AI, video conference users can start, join, or end a meeting. Voice-to-text transcription, another AI feature, can take meeting notes during video meetings, leaving individuals and their teams free to concentrate on what is being said or shown, boosting efficiency and collaboration. This AI/ML-based technology will get smarter and more accurate as more people use it. With many tech events and summits becoming virtual, demand may increase further.
- Digitalization and Automation: The current crisis is an excellent accelerator of digitalization for both consumers and businesses. Even the most skeptical organizations will be prepared to integrate a digital agenda and will favor automating their operations. For many technology firms, data-driven automation will be a strategic focus beyond Robotic Process Automation (RPA). Businesses will need to leverage automation across multiple areas, including customers, employees, and the network. Marrying AI and analytics to the digital agenda will be vital to building resilience. AI can enable digital transformation across multiple use cases, such as digital relationship management, adoption of digital channels, digital identity verification, digital onboarding, and digital fraud prevention. With more traffic being directed to digital channels, AI-based automated solutions can quickly detect friction points in customer journeys and their root causes, enabling timely intervention and resolution to drive more sales and a better customer experience.
- Augmented data management and integration: As cloud services continue to grow and data continues to accumulate from multiple touchpoints, data management and integration see bolstered growth. This involves leveraging ML capabilities and AI engines to make enterprise information management categories, including data quality, metadata management, master data management, data integration, and Database Management Systems (DBMS), self-configuring and self-tuning. It also automates many manual tasks, allowing less technically skilled users to work with data more autonomously while freeing highly skilled technical resources to focus on higher-value tasks.
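The automated anomaly detection mentioned under network and operational analytics can be illustrated with a minimal trailing-window z-score check: flag any interval whose volume deviates from the recent mean by more than k standard deviations. This is a toy sketch under assumed parameters; production network-analytics stacks account for seasonality and use far richer models.

```python
import statistics

def detect_anomalies(series, window=6, k=3.0):
    """Return indices of points that spike vs. their trailing window."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu = statistics.mean(hist)
        sigma = statistics.stdev(hist)
        # Flag only when the deviation is extreme relative to recent noise.
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            anomalies.append(i)
    return anomalies

# Illustrative hourly session counts with one choke-up spike at index 9.
traffic = [100, 102, 98, 101, 99, 103, 100, 97, 102, 400, 101, 99]
print(detect_anomalies(traffic))  # -> [9]
```

The same loop run continuously over per-cell or per-link counters is what "automated anomaly detection at scale" amounts to in its simplest form; the gains over manual inspection come from applying it to thousands of series at once.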
- Key areas (driven by the shifts in consumer behavior) where AI and analytics can play a crucial role:
- New Consumption patterns: The pandemic is drastically altering consumption habits worldwide as consumers are making more purchases online and digitally. This will fuel more online purchases of technological products & services, an upsurge in cloud-based services, and the need for more devices per household. As their focus shifts to recovery, more companies are likely to deploy AI-enabled solutions to reignite top-line growth. AI can help companies discover emerging trends, detect new consumption patterns, and identify the change in preferences. It also enables the ‘hyper-personalization’ of products to improve customer engagement and sales.
- Customer Experience Optimization: With front-line staff flooded with customer calls and queries, contact centers will need more efficient workforce and capacity planning. Manual processes based on experiential learning can produce inaccurate capacity plans for ramp-ups and unprecedented demand. AI and analytics can provide more accurate and efficient capacity planning models for predicting the number of agents needed, effective staff utilization, better management of agents, and an enhanced customer experience. Effective planning will also reduce Average Handle Time (AHT), the number of complaints, and the overall operational costs of the contact center. Alongside this, businesses can use Natural Language Processing and AI capabilities to reduce wait times in call centers.
- Last mile fulfilment: Increasing trade barriers are forcing companies to rethink their supply chain strategies and reassess the merits of redundancy. Disruption in the global supply chain has moved redundancy higher up companies’ agendas as a means of reducing risk. Rather than heavily concentrating sourcing and production in a few low-cost locations, businesses are looking to build more redundancy into their value chains. But redundancy and duplication entail significant cost. AI offers companies the potential to build resilience into their operations while minimizing the cost and damage to margins. It enables cost optimization in each factory through predictive maintenance and better planning. It also allows them to operate a larger number of small, efficient facilities nearer to customers, rather than a few big factories in low-cost locations. With the deployment of advanced technologies such as 3D printing and advanced robots, automation can now handle tasks that previously required humans.
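One classical, data-driven way to do the contact-center sizing described under Customer Experience Optimization is the Erlang C queueing model: given forecast call volume and average handle time, find the fewest agents that meet a service-level target. The article does not prescribe this particular model, and all input figures below are invented for illustration.

```python
import math

def erlang_c(agents: int, intensity: float) -> float:
    """Probability an arriving call has to wait (Erlang C formula)."""
    if agents <= intensity:
        return 1.0  # offered load exceeds capacity: everyone waits
    top = intensity ** agents / math.factorial(agents)
    top *= agents / (agents - intensity)
    denom = sum(intensity ** k / math.factorial(k) for k in range(agents)) + top
    return top / denom

def agents_needed(calls_per_hour, aht_sec, target=0.8, answer_within=20):
    """Smallest agent count answering `target` of calls within `answer_within` s."""
    intensity = calls_per_hour * aht_sec / 3600.0   # offered load, in Erlangs
    n = max(1, math.ceil(intensity))
    while True:
        pw = erlang_c(n, intensity)
        # Service level: share of calls answered within the threshold.
        sl = 1 - pw * math.exp(-(n - intensity) * answer_within / aht_sec)
        if sl >= target:
            return n
        n += 1

# Assumed forecast: 300 calls/hour, 240 s average handle time, 80/20 target.
print(agents_needed(calls_per_hour=300, aht_sec=240))
```

Fed with forecast volumes instead of fixed inputs, the same calculation becomes the "capacity planning model" the bullet describes: the predictive layer supplies `calls_per_hour` per interval, and the queueing model turns it into a staffing plan.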
AI and analytics will be a must in enabling ICT businesses to thrive and seize competitive advantage in this new environment. AI capabilities will be enormously valuable as businesses confront and adapt to the new reality of the current crisis and its aftermath.
- 2. Baker McKenzie
- 3. Implications of COVID-19 on cable tv and streaming business
- 4. Spending time in the new normal
COVID-19 has spelled disaster for most industries, but for CPG, the road ahead has changed altogether. So far, CPG has made significant advances in understanding the role of pricing and trade promotions in driving growth. However, there is still a lot of room for improvement to combat the new challenges of today.
Companies routinely spend almost half of their total expenditures on trade promotions, a figure that has remained mostly unchanged over the years. Less than half of this trade promotion spend results in profitable growth, revealing a significant opportunity.
The change in channel structures, with consumers’ shift online, has added complexity to revenue growth management. The recent and continuing impact of COVID-19 globally has also created demand shifts and disruptions in the supply chain.
Revenue growth management (RGM) now has a unique opportunity to transform CPG companies by utilizing data and analytics continuously, to make better and faster decisions.
Continuous, consistent and disciplined use of RGM
With advances in data availability, computing power, and analytics/AI algorithms, it is now possible to run a continuous RGM platform, serving up ongoing insights, opportunities, and even prescriptive recommendations.
- Companies can continuously look for RGM opportunities, not only at the strategic level but at the micro-level, e.g., surgical or selected price actions, uncovering pack-channel opportunities based on the marketplace, or continuous optimization of promotion plans to keep them on track.
- Continuous tracking also enables companies to react faster to marketplace changes, e.g., the impact of COVID-19 by sensing, shaping, and fulfilling demand more efficiently.
- The evolution of analytics, AI, and data engineering enables the creation of such a continuous platform.
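The continuous promotion-optimization idea above can be made concrete with a toy model: estimate how demand responds to price, then search candidate discount depths for the most profitable one, re-running as fresh data arrives. The sketch assumes constant-elasticity demand, a common but simplifying choice; all elasticities, prices, and costs are invented for illustration.

```python
def units_at_discount(base_units, elasticity, discount):
    """Constant-elasticity demand: volume lift from a price cut."""
    price_ratio = 1.0 - discount            # e.g. 0.9 for a 10% discount
    return base_units * price_ratio ** elasticity

def best_discount(base_units, base_price, unit_cost, elasticity,
                  candidates=(0.0, 0.05, 0.10, 0.15, 0.20, 0.25)):
    """Pick the discount depth that maximizes total promo profit."""
    def profit(d):
        units = units_at_discount(base_units, elasticity, d)
        return units * (base_price * (1 - d) - unit_cost)
    return max(candidates, key=profit)

# Elastic category (elasticity -2.5): a moderate cut pays off.
print(best_discount(1000, 4.0, 2.0, -2.5))  # -> 0.15
# Inelastic category (elasticity -1): discounting destroys profit.
print(best_discount(1000, 4.0, 2.0, -1.0))  # -> 0.0
```

The "continuous" part of the platform is re-estimating the elasticity from the latest sales data and re-running this optimization every cycle, rather than fixing a promotion calendar once a year.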
Beyond brick ‘n mortar: Integrating online channels in RGM
Multiple CPG categories have seen a steady increase in eCommerce: both pureplay retailers like Amazon, and click ‘n collect for traditional retailers, like Target.
- Integrating eCommerce with brick ‘n mortar in RGM decisions, e.g., pricing and range/mix, can lead to better overall decisions and profitability.
- Interactions between channels and prices can be tracked and simulated in real-time to make more profitable pricing decisions.
- Differentiating pack sizes and value between channels, including marketplaces, can be a better strategy for CPG companies.
Trade promotion spend optimization continues to be an opportunity
Trade spend accounts for up to 25 percent of gross sales for a CPG company, second only to the cost of goods sold. Yet trade promotion productivity underperforms, while users have to navigate multiple legacy systems with incomplete or imprecise data. In this environment, trade planning optimization remains a theoretical exercise.
With the utilization of continuous data systems and planning platforms, CPG companies can drive both topline and bottom-line efficiencies in promotion planning.
Fractal’s TradePulse is a continuous Trade Promotion Planner that provides users with a practical and flexible platform and improves trade promotion productivity by 5-15%.
COVID-19: A Unique Challenge
As CPG companies navigate the impact of the COVID-19 pandemic and the potential economic slowdown, re-examining RGM decisions will be essential.
- Some categories and brands continue to see high demand, while others are less top-of-mind and declining in this new environment.
- We have seen an unprecedented shift to ‘Click and Collect’ and online delivery formats, even for categories that are traditionally brick-and-mortar focused.
- Stockpiling for essential categories has created supply constraints, and pulling back on promotion spend may be needed.
Creating value through continuous RGM
The multiple shifts in consumer preferences, perceived value, and shopping venues have created significant challenges for the CPG industry, and a recessionary environment has added to that. The solution is to understand the levers of consumer demand continuously and create linkages to growth opportunities. Our scaled revenue growth management programs focus on driving value with:
- A design-first approach to ensure we are solving the most valuable problems,
- A harmonized data layer, enabling continuous analytics and data refreshes,
- The latest analytical and AI techniques to uncover insights quickly and continuously,
- Creating simple consumption platforms to discover insights and create scenario simulations; democratizing RGM insights through cognitive consumption applications,
- Agile ways of working, with value uncovered in every sprint.