Harmonizing data across the enterprise to enable powerful analytics

Consumer goods companies are spending millions of dollars to access data from multiple sources. These data sets must be harmonized for their true power to be realized.

Misaligned data sets lead to incorrect analysis, which in turn can lead to misdirected business strategies. They also mean a poor return on your investment, making it hard to justify additional spending on data. An un-harmonized data environment creates a range of such problems.


Having different data sets pointing in different directions can be worse than not having data at all. At best, incompatible data sets represent a waste of money; at worst, using them for business-critical analytics could lead to ill-guided strategies and material loss to the company.

Good data can transform a business, however. So your focus should be on turning these disparate sources into a coherent whole that lets you access the relevant data at the right levels. Data harmonization does exactly that, and typically at a fraction of the cost of purchasing the data sets in the first place.

What is data harmonization?

Harmonization is a continuous process that aligns all available data sources – market data, shipment data or customer data – across all key metrics.

By synchronizing data points across products, channels, time periods and geographies, otherwise disparate data sets can easily converse with each other. And that brings the full range of business data to bear, making the entire organization “analytics-ready.”

Harmonization helps develop focused marketing strategies.

 

A major Fortune 500 company was able to estimate the market size and develop focused marketing strategies for each subcategory, which were originally masked under one big category of oral care in the data provider’s database.
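To make the idea concrete, here is a minimal sketch of the kind of mapping such harmonization involves. The provider names, categories, and figures below are invented purely for illustration and do not come from any real data set.

```python
# Minimal sketch, assuming pandas is available; all names and values are illustrative.
import pandas as pd

# Syndicated market data reports everything under one broad "Oral Care" category.
market = pd.DataFrame({
    "provider_category": ["Oral Care", "Oral Care", "Oral Care"],
    "product": ["Whitening Paste", "Kids Brush", "Mint Rinse"],
    "sales_usd": [1_200_000, 450_000, 300_000],
})

# A harmonization map aligns provider products to the internal subcategory hierarchy.
harmonization_map = pd.DataFrame({
    "product": ["Whitening Paste", "Kids Brush", "Mint Rinse"],
    "subcategory": ["Toothpaste", "Toothbrush", "Mouthwash"],
})

harmonized = market.merge(harmonization_map, on="product", how="left")
subcategory_size = harmonized.groupby("subcategory")["sales_usd"].sum()
print(subcategory_size)  # market size is now visible at the subcategory level
```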


Harmonization vs. transformation

Combining data sources in a single warehouse is not the same as harmonization. There are many tools in the market that claim to transform and integrate different sources, but few are capable of truly harmonizing disparate data sets to the level required to build powerful analytics.

Sometimes, integration is really all that you need. If the objective is to check the data for completeness, validate for accuracy or simply add to your existing data warehouse, then you may not need a complete harmonization. Setting up a simple “extract, transform, load” (ETL) process or commissioning a data service provider to clean your data may suffice.
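As a rough illustration of what such a basic ETL step might look like, consider the sketch below. The file name, fields, and SQLite target are assumptions made for the example, not a recommended architecture.

```python
# A minimal "extract, transform, load" sketch; file name and fields are illustrative.
import csv
import sqlite3

def extract(path):
    # Read raw rows from a source extract, e.g. a shipment file.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Standardize identifiers and drop incomplete records.
    cleaned = []
    for row in rows:
        if not row.get("sku") or not row.get("units_sold"):
            continue
        cleaned.append({"sku": row["sku"].strip().upper(),
                        "units_sold": int(row["units_sold"])})
    return cleaned

def load(rows, db_path="warehouse.db"):
    # Append the cleaned rows to a simple warehouse table.
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS shipments (sku TEXT, units_sold INTEGER)")
    con.executemany("INSERT INTO shipments VALUES (:sku, :units_sold)", rows)
    con.commit()
    con.close()

# Example usage (assumes a shipments.csv extract exists):
# load(transform(extract("shipments.csv")))
```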

Harmonization cuts down ‘data-to-decisions’ time.

 

A CPG manufacturer was able to cut down the data-to-decisions time by 45 days through automated harmonization.

However, if your objective is to make the organization “analytics-ready,” then you should embark on the harmonization journey.


Harmonization also simplifies data governance – a process overlooked at most companies. Updates to one data set will automatically be reflected in all others. This can prevent companies from making multi-million dollar errors. Imagine if a retailer consolidation was reported in your internal systems, but not in the syndicated market data; the insights derived and, therefore, the recommendations for your customers could be disastrous.
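The sketch below illustrates that idea with a single, governed retailer mapping that every harmonized view joins to, so a change recorded once propagates everywhere. The retailer names and figures are purely illustrative.

```python
# Sketch of a shared, governed mapping serving every harmonized data set (illustrative data).
import pandas as pd

# Single retailer master used by both internal and syndicated views.
retailer_master = pd.DataFrame({
    "raw_name": ["Acme Stores", "Acme Stores Inc.", "BigMart"],
    "harmonized_retailer": ["Acme", "Acme", "BigMart"],
})

internal_shipments = pd.DataFrame({"raw_name": ["Acme Stores Inc.", "BigMart"],
                                   "cases": [120, 80]})
syndicated_sales = pd.DataFrame({"raw_name": ["Acme Stores", "BigMart"],
                                 "units": [950, 600]})

# A consolidation (e.g., BigMart absorbed into Acme) is recorded once in the master...
retailer_master.loc[retailer_master["raw_name"] == "BigMart",
                    "harmonized_retailer"] = "Acme"

# ...and every downstream view picks it up automatically.
print(internal_shipments.merge(retailer_master, on="raw_name")
      .groupby("harmonized_retailer").sum(numeric_only=True))
print(syndicated_sales.merge(retailer_master, on="raw_name")
      .groupby("harmonized_retailer").sum(numeric_only=True))
```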

Harmonization helps go beyond linear hierarchies to support business strategies.

A global beverages company was able to create attributes to classify retail outlets based on monthly sales volume and hence support its distributors with promotion strategies for specific retail outlets.

The trouble with data sets

While you should expect data providers to share data sets that are ready to use, that unfortunately is not always the case. Even data from top global providers can come with a variety of problems.


This means that, even after spending millions of dollars to access data from multiple sources and even more on having it all streamlined, your organization could still be left with data sets that can’t talk to each other.

Taking the plunge

Choosing the right partner to guide you through harmonization is vital. But before you can choose that partner, you need to set a clear objective to ensure that the process does not devolve into a simple data integration.

A strong vision, and identified use cases, will help you choose the relevant fields and optimize your effort and resources toward enabling the analytics solutions you need. This requires a commitment from any organization embarking on the harmonization journey to deliver the right infrastructure, set up robust governance processes, and ensure that key personnel are engaged in the project.

Fractal has experience harmonizing more than 1100 country categories.

For consumer goods companies, the rewards are considerable. You will finally be able to unlock the enormous value in datasets currently sitting idle or – even worse – generating misleading insights. By turning the flood of data that’s available into a coherent, cohesive whole, data harmonization gives businesses a full-spectrum view of their organization and the markets in which they operate.

Fractal has experience harmonizing more than 20 different data sources (from syndicated to media to shipment to consumer survey).

AI to Unlock the Buying Power of Millennials for CPGs

By Amitabh Bose (Ambo), Fractal Analytics

The demographic segment retailers desire most – yet often the most elusive – is the millennial consumer. The fervor to unlock the buying power of millennials makes sense: those aged 26 to 42 make up the largest share of working-age adults and currently account for $600 billion in annual purchase power – a number expected to mushroom to $1.4 trillion by 2020. So, it’s no surprise that forward-thinking retailers are adapting to reach this generation.

Consumer packaged goods (CPG) companies, in particular, are eager to crack how to market to millennials and increase their mind share. In fact, it was recently projected that millennials alone will spend $65 billion on CPGs over the next decade. So why not double, or even triple, this projection? That’s exactly what CPGs have in mind.

For CPGs to get the attention of millennials, however, they’ll need to identify what differentiates them from other generations – including interpreting their spending habits and understanding their personal needs and wants. This is harder for CPGs than it is for online retailers, as most CPGs are sold through third-party sellers.

This is where AI-driven strategies can come into play.

For example, millennials are known for demanding personalized experiences much more than earlier generations. In fact, 75 percent of millennials say they’re willing to give up personal data in order to work with businesses that have instant on-demand engagement, as opposed to only 53 percent of baby boomers or traditionalists. CPGs that learn how to leverage AI technologies to create these tailored experiences will become the leaders of their industry.

One way CPGs can harness the power of AI for personalization is by analyzing various customer datasets – both first-party and beyond – to conduct an “equity drivers evaluation.” AI can help identify the most resonant brand drivers for different types of consumers. It can also segment consumers in a very granular way, by the drivers identified for each, and develop brand “ratings” for each segment. This type of analysis can reveal valuable new insights that may be adopted in the design of campaign messaging, product positioning and even product innovation.
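One hedged way to picture such an analysis is the toy sketch below: it clusters consumers on the importance they assign to a few hypothetical brand drivers and then computes a weighted brand rating per segment. The drivers, data, and rating formula are all assumptions made for illustration, not a description of any specific methodology.

```python
# Toy "equity drivers" style analysis: cluster consumers by driver importance,
# then score the brand within each segment (all data is synthetic).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Rows = consumers; columns = importance of drivers (value, trust, convenience, sustainability).
driver_importance = rng.uniform(1, 10, size=(500, 4))
brand_performance = rng.uniform(1, 10, size=(500, 4))  # how the brand scores on each driver

segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(driver_importance)

for s in range(3):
    mask = segments == s
    # Weight brand performance by what this segment cares about most.
    weights = driver_importance[mask].mean(axis=0)
    rating = np.average(brand_performance[mask].mean(axis=0), weights=weights)
    print(f"segment {s}: size={mask.sum()}, brand rating={rating:.2f}")
```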

Leading CPGs understand that scale is no longer a competitive advantage. In fact, millennials, the generation that drove the Direct-to-Consumer (DTC) market to the size it is today, prefer a much simpler, more personalized approach. To adopt a strategy that answers this demographic’s demands, CPGs need to adapt their current offerings and develop new ones through analytics, machine learning and AI. In doing so, they will be better equipped to develop and market products that address millennials’ very specific preferences, helping them to expand deeper into this coveted segment of the market.

The CPG that can understand and cater to the millennial consumer in an authentic way will be king in this new age of retail. And, the key to unlocking the lion’s share of that revenue opportunity will be AI.

This interest in using AI to conquer new revenue streams, such as the millennial segment, is a perfect example of how the consumer goods and retail industry is in the middle of an AI and machine learning transformation. My company, Fractal Analytics, a global leader in artificial intelligence and analytics that powers decision-making in Fortune 500 companies, has much to bring to the table in terms of strategy and implementation for such a transformation, thanks to our ability to identify future consumer and shopper needs, as well as market trends, through our trifecta of enterprise capabilities: AI, engineering and design.

This year, we are the Title Sponsor for the 2019 Retail and Consumer Goods Analytics Summit. At the summit, we will showcase our recent, exciting work in artificial intelligence, machine learning and behavioral sciences. By presenting specific cases from the industry’s largest brands, we’ll be sharing how our technology and expertise have been used to drive positive results in the form of more sales, reduced costs and beyond.

The Physics of Marketing Analytics

Influencing C-suite to advance analytics adoption

Our second annual ai.nyc was held at One World Trade Center on Wednesday, June 5, 2019. This year’s event focused on the perfect recipe for AI problem solving, driving better business outcomes. Our panelists left attendees feeling optimistic and confident about the future of AI, as they shared inspiring messages about how the prospective relationship between humans and AI will elevate businesses, lives and the world.

New York Times best-selling author and mathematician Cathy O’Neil opened ai.nyc with an ethical examination of artificial intelligence, citing multiple documented cases of biased algorithms, from criminal justice to child abuse to college admissions. Cathy also detailed findings from her book, “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.” For example, using information from her book, Cathy shared with the audience her views on the potential for bad algorithms to institutionalize injustices, such as racism and xenophobia. She also talked about how the coming together of human oversight and the right algorithms could prevent such unwanted outcomes, helping us to focus on the good that AI can do, if planned and executed in a smart way.

Our co-founder, Srikanth Velamakanni, delivered the second keynote of the day. In a fascinating presentation, Srikanth shared his recipe for scaled problem solving, including well-built AI, unique design and smart engineering. He referenced Microsoft’s Tay chatbot experiment, Google’s flu activity algorithm and Google Glass to demonstrate how a strategy missing any one of these ingredients can hamper great problem-solving ideas. He concluded by saying, “augmenting human intelligence is where the action is.”

E-commerce optimization expert Joe Keating from Hill’s Pet Nutrition, along with Fractal client partner Dipita Chakraborty, led a positive and constructive discussion about mining social data analytics to help e-commerce and CPG brands perform better. Keating, who works in consumer pet products ecommerce, shared examples of real-world insights, cautioning brands against excessive automation: “don’t fall into a trap of thinking you don’t need people.” He also noted that by doubling down on analytics they’ve seen significant growth. It seems the right combination of both is the key to success.

In her chat, Dipita detailed how to analyze social chatter, advising, “whenever we use an algorithm, we have to figure whether the algorithm is spitting out meaningful data or not.” She also shared that brands are all trying to answer the million-dollar question, “what’s the next big thing consumers want?” through analytics. Dipita helped the audience to understand how to answer these questions for themselves, by sharing insights and her own real-world use cases in which analyzing social chatter correctly helped brands spot innovation, improve brand health and drive conversions.

In the “Problem Solving @ Internet Scale” session, Chris Jasensky, Area IT Director and VP for North America at RB, and Rambabu Vallabhajosula, SVP, Hotels Performance and Revenue Management at Priceline, along with Fractal’s own David Yeo, discussed the critical ingredients for implementing AI at scale, including the importance of human capital. “It takes a village to create a successful AI scaling program,” shared Yeo. The panelists detailed how implementing AI at scale is not just a technical challenge and agreed that AI projects need organizational buy-in and alignment to succeed. Company colleagues need to feel they have a vested interest in AI projects, which can be stimulated by sharing KPIs and measurements that inspire people to work toward the common goal. Jasensky added that “AI projects must deliver a big, dynamic benefit to keep people engaged.”

Biju Dominic, CEO and Co-Founder of Final Mile Consulting, energized the delegates with a passionate plea for designing AI products with the human element in mind. Dominic urged the industry to consider human behavior in the creation of its solutions, striving to build stronger emotional connections between humans and AI. Dominic also suggested the industry address the non-conscious barriers to adoption, primarily trust and emotional connection, closing with a thought-provoking message on the future of artificial intelligence: “we can’t win by making others within the organization lose.”

Panelists Natali Mohanty, Senior VP of Data and Analytics at Pure Insurance, Mike Gualtieri, VP and Principal Analyst at Forrester Research, and Fractal’s own Lana Klein discussed the “Magic of Three Ingredients of Transformation.” This thought-provoking conversation explored the nexus between engineering, design and human capital as it relates to scaling AI products, referred to as the “trifecta.” Natali presented a compelling case study, which demonstrated the difficulties in scaling AI products when either quality engineering or design is inadequate. “No matter how hard the problem is, you cannot divorce from the design mindset,” observed Natali, who also defined a design skill as “requiring a combination of imagination and empathy.” They all agreed that a practical, long-term mindset is mandatory when launching AI products: “it’s a marathon, not a sprint.”

Our co-founder and CEO Pranay Agrawal concluded the day-long conference with fascinating assertions about the role AI plays in everyday life and how it has helped make our world a better place in the long run. He also talked about how the world has never been a better place to live in, on any objective measure, and about how technology has played a major role in that.

Pranay also covered the framework for a successful AI-driven future, which included algorithms that allow us to match and exceed human capacity, engineering that can successfully feed tremendous amounts of data into those algorithms, and good design, which helps us solve the right problem. He concluded by saying that in order to solve a problem thoroughly, we also need to put users at the center of our decisions.

At the very end, Pranay left the audience with this, “Let’s all be happy, and appreciate the fact that we are living in the best times ever, and that if history has taught us anything, it’s that as technology gets better, which it always does, so will our lives, and the world.”


(Machine) Learning to make great decisions

It’s no exaggeration to say that every problem today is an AI problem. But while AI is a critical component of our problem-solving approach, it’s no longer enough to overcome these challenges at scale. We now have a new recipe to solve complex challenges at scale and drive action: AI combined with engineering and design.

Our ai.lcy event in London last week focused on helping businesses drive better decisions when operating at scale. The event drew attendees from FTSE 250 companies across a wide range of industries including CPG, financial services, healthcare, insurance, telecom, retail and more. Speakers from companies including Mars, Visa, Google, Lloyds Banking Group and M&G Prudential provided their insights to 60+ attendees, as well as members of the media. Attendees also experienced AI products and services in the exhibit showcase area. Here is a brief top-line summary of each session:

  1. Keynote – How AI revolutionises business strategy – Kenneth Cukier (Senior Editor, The Economist)
     
    Businesses need to think of data as a new factor of production. The more data we collect, the more we can do with it and the more we can produce. Because we can now apply artificial intelligence (AI) to different problems, businesses are able to learn things they couldn’t before. They can maximise new opportunities and create new value (jobs, services, production, sales). But how an organisation ‘frames’ the problem it’s trying to solve through AI is becoming increasingly important. AI is at its very best when forecasting and making predictions. As a result, businesses need to stop considering the problem as “humans vs. machines” and instead make everything a prediction problem. That needs to be at the core of their AI strategy.
  2. Keynote – AI is not enough – Srikanth Velamakanni (Co-founder and Group Chief Executive, Fractal Analytics)
     
    AI is becoming ubiquitous – all problems will be reframed as AI problems – and at the core of everything we are doing in the world of AI is a behavioural problem. When we think of it as humans and machines, as opposed to humans vs. machines, and balance driving forces against restraining forces, we power decisions that make real progress. And AI alone is not enough to solve problems at scale. Machines are scalable, but so are their errors. As simple automation isn’t always the solution, a deep understanding of human behaviour is needed. Therefore, a combination of AI, engineering and intelligent design is needed to solve problems at scale. To maximise the benefits and efficiencies of AI, businesses must combine their solutions with their staff. Simply having algorithms that outperform humans isn’t enough; you need to marry super intelligence to experience. Machines solve problems; humans make sense of things. They have the ability to adapt to a new environment, to a new set of circumstances, very quickly. Machines alone don’t possess that adaptability.
  3. How to use AI to hack tricky problems – Abhijit Akerkar (Head of Applied Sciences, Business Integration, Lloyds Banking Group), Priyank Patwa (Head of AI & ML, M&G Prudential), Rahul Desai (Client Partner, Fractal Analytics)
     
    To solve tricky problems, we need to understand human behaviour. To improve the overall experience – which should be the goal of any business – businesses need to consider the motivating factors that drive customer decision-making. And then we need to ask ourselves: can we predict the next likely event in the customer’s journey and power the “next best action” for the customer? To do this, we need to appreciate that short-, medium- and long-term actions have consequences on the customer journey. So, we need deep learning to extract customer patterns (and memory) from the journey so that we can craft a model that propels that person to the next best action (and deep learning-based models outperform traditional models across various scenarios). We need to change the decision-making journey by making sure that the right information and the right insights are available to the right person at the right time.
  4. How to solve problems at internet scale – Linden Glen (Digital Transformation Director, Analytics & Data, Mars), Arpan Dasgupta (Client Partner, Fractal Analytics), Sameer Dhanrajani (Chief Strategy Officer, Fractal Analytics)
     
    Businesses should start with a user-centric approach: find the problems of specific users, and use design thinking to uncover each problem and why it needs to be solved. Once this is done, we must then consider how we should use analytics and AI to solve these problems. The final step is finding out how to scale a solution. But it all starts with understanding what the initial problem is before thinking about what data and technology are needed. It is the business’s goal – whatever it is trying to achieve – that will inform the type of AI solutions and computational architecture that it crafts and deploys. At the end of the day, we’re talking about a sea change in personal preferences, and so we now have to make a change in business processes.
  5. How to make it work with design – Pranay Agrawal (Co-founder and Chief Executive Officer, Fractal Analytics)
     
    To fully unravel human behaviour, we need to go beyond data. We need to ask why people take the actions they do, and businesses need to go beyond what the data shows them. Behaviour is driven by a wide variety of emotions, desires, factors and influences, some conscious and some unconscious. The two key factors in any behavioural change scenario are the driving force and the restraining force. These need to be examined and understood so that we can design solutions for non-conscious behavioural change. So, we need to get a better understanding not only of behaviour but also of context, because context alters human behaviour. To identify the problem, we need to analyse the behaviour. To solve it, we need to understand it.
  6. The new recipe and its magic – Ben Neffendorf (Joint Data Science Lab Delivery Lead, Visa), Eleonora Kourtzi (Product Marketing Manager – Digital Growth Lead, Google), Martha Bennett (Principal Analyst, Forrester), Natwar Mall (CEO, Cuddle.ai)
     
    To build and implement AI solutions correctly, organisations should pay special attention to the people involved. It’s people who gather the data, people who select which data goes into the model and people who design the algorithms. All the problems that businesses are trying to solve are problems of AI, design and engineering. Through the right combination of all three, we can help business leaders reimagine their business through new technologies, improving the overall experience for the customer and generating high-quality results.

Through ai.lcy, we have challenged some of the assumptions around AI and big data practices, while delivering insightful and thought-provoking sessions on what businesses need to do to apply AI efficiently and scale it effectively. To be successful, organisations need to understand the driving forces behind customer behaviour, the context of that behaviour within the customer journey and the problems that need solving. For businesses to get the best results and deliver the best possible experience, they need a combination of AI, engineering and design.


Fueling digital sales transformation with AI

“By 2020, only one-third of sales organizations will have embraced predictive and robotic technologies that guide and automate actions to achieve sales goals.”

– Mark Smith, CEO & Chief Research Officer, Ventana Research

ABSTRACT

Fueled by new levels of sophistication, processing power, and AI solutions in the digital landscape, it is time for enterprise executives and sales leaders to fully embrace AI to empower their sales organizations. Innovations in AI, robotics, and chatbots are expanding at an accelerated rate. As a result, capabilities and solutions are becoming available faster than sales organizations can adapt to them. Moreover, many sales organizations appear to be lagging in their efforts to utilize AI as an enabler for digital sales transformation.

Sales drives the engine of growth and represents your front line in creating and sustaining lasting customer relationships. So why are only one-third of sales organizations adopting predictive analytics, AI, and robotic technologies? Sales also represents the largest and most expensive labor pool in most organizations, so any gain in productivity or effectiveness will have a significant impact on business performance and shareholder value, and will enhance the customer experience.

This white paper provides a roadmap you can utilize to embrace AI in your digital sales transformation and realize the full potential of your most powerful employees – front line sales! There are six critical steps in the journey:

  1. Understand your customer journey and engagement model
  2. Activate your data
  3. Move from many channels to omni-channel
  4. Embrace predictive analytics
  5. Stop “pulling data” and start “pushing” information and insights to your sellers
  6. Reduce complexity – Utilize AI and digital solutions to simplify sales processes

The following discussion and examples are focused on B2B enterprises and sales organizations. There are many more use cases that could be discussed for planning and optimizing your sales organization and the design of your sales force (both “direct” and “indirect”). However, the focus of this white paper is empowering your salespeople to be more productive and effective using AI and emerging digital technology solutions. It may also be useful to review the Fractal Analytics white paper titled “Accelerating AI enterprise-wide to achieve a competitive edge” to learn more about key ingredients and strategies for successfully implementing AI and analytics in enterprises.

Many of these suggestions are not new. However, what is new is the rapid growth in innovation and available solutions, and an increasingly competitive landscape that is adopting AI solutions for sales at varying rates of speed and success. Enterprise executives and sales leaders who employ these best practices will be at a competitive advantage over those who do not or cannot. Perhaps easier said than done, but there are significant rewards for enterprises that lead the way in driving digital sales transformation through AI.

CUSTOMER JOURNEY AND ENGAGEMENT MODEL

According to Forrester1, “Modern B2B buyers want to buy from modern sellers; they want to interact fluidly across channels, and when doing so, they expect to have a consistent brand and engagement experience. B2B companies that fully embrace this journey will thrive, and those that delay risk disintermediation from competitors and/or buyers themselves.”

There are obviously massive implications with this trend, but one clear message is that these digital buyers are generating more data than ever before through an ever-increasing number of channels. The “art” of selling is being augmented, if not replaced, by the “science” of selling.

Therefore, whether you are early in your digital sales transformation, integrating a new acquisition, or establishing standardized sales processes, the critical first step is to understand and document the customer journey across these various channels and touchpoints. Identify the key touchpoints, events, or actions that shape or influence decisions, perceptions, actions, and customer experience. From there, you can map your sales engagement process to understand where you can make the greatest impact from a sales perspective. Likewise, this will help you identify where you also have the most significant gaps.

Once you’ve identified these critical events, you can then map your data sources and begin to determine where, and how, you can employ AI and new technology to deliver the information your sales teams need, and when they need it. You may also determine where you may have glaring gaps in your data models and information flow, or where you rely on excessively manual processes and corporate knowledge. Once understood and documented, you can then prioritize where you need to invest in process redesign, tools, or capabilities to remedy the gaps.
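One lightweight way to capture this mapping during the exercise is a simple structure like the sketch below. The touchpoints, systems, and priorities shown are illustrative placeholders, not a prescribed taxonomy.

```python
# Sketch: document journey touchpoints, their data sources, and known gaps (illustrative names).
from dataclasses import dataclass, field

@dataclass
class Touchpoint:
    name: str
    stage: str                       # e.g., awareness, evaluation, purchase, support
    data_sources: list = field(default_factory=list)
    gaps: list = field(default_factory=list)
    priority: int = 3                # 1 = invest first

journey = [
    Touchpoint("Website product research", "awareness", ["web analytics"], ["no account match"], 2),
    Touchpoint("Quote requested", "evaluation", ["CRM", "CPQ"], [], 3),
    Touchpoint("Contract signed", "purchase", ["CRM", "ERP"], ["manual re-keying"], 1),
]

# List the touchpoints in the order you would remediate them.
for tp in sorted(journey, key=lambda t: t.priority):
    print(tp.priority, tp.name, "->", ", ".join(tp.data_sources), "| gaps:", tp.gaps)
```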

1 Mary Shea, “Sales Digital Transformation: It’s Now or Never!”, Forrester, January 8, 2018.

Using a simplified and generic customer journey map for B2B customers helps to illustrate the point. By utilizing a customer journey map, you can identify what events or actions your sales team should be aware of, influence, or drive to win business and satisfy customers. It will also help to expose where you have excessive complexity, and where your salespeople require better support and collaboration from specialists, support staff, sales operations, or external partners, including channel partners. The exercise will also help you determine your future-state model to improve the end-to-end process, remove complexity, and make it easier for your salespeople to sell and to satisfy your customers.

Finally, the exercise will also provide you with critical information to help you identify where you may be able to invest in new applications, tools, or redesign of your legacy sales processes to utilize data-driven or AI-enabled sales processes. By leveraging AI-driven sales process software and tools, you can redesign, automate, and standardize how you perform activity management, opportunity and deal management, prioritization, pipeline prediction, and sales forecasting. By mapping your customer journey, you can also enable the implementation of more comprehensive quote-to-cash (QTC) and configure-price-quote (CPQ) processes. In all these cases, you will be in a better position to leverage advanced analytics, including machine learning, to compare actual vs. desired activities or actions, make appropriate recommendations, and drive the right behavior throughout the sales engagement process.

FIGURE 1. B2B Customers Illustrative Customer Journey Map


Mapping the complete customer journey through all channels, digital and non-digital, is obviously an extensive and lengthy exercise. However, even if you do some basic mapping and understand critical events or activities along the journey, you can use these to begin charting your digital sales transformation. In other words, you can take a “crawl, walk, run” approach vs. “boiling the ocean” to get started. This will also allow you to begin to design or redesign your workflow around digitally enabled processes and tools, which can improve customer experience, sales productivity, and win rates or conversions. As an example, in analyzing your online customer engagement or “clicks,” you can apply advanced data engineering and anomaly detection to identify unique customer journeys, determine “drop-offs” in the journey, and establish improved or automated processes to reduce drop-offs and improve conversion rates.
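A first pass at the drop-off analysis mentioned above could look something like the sketch below. The event names and sessions are fabricated; a production version would work on real clickstream data and apply more rigorous anomaly detection.

```python
# Simple funnel drop-off analysis on clickstream events (synthetic data for illustration).
import pandas as pd

events = pd.DataFrame({
    "session": [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "step":    ["view", "configure", "quote", "view", "configure",
                "view", "configure", "quote", "order"],
})

funnel_order = ["view", "configure", "quote", "order"]
reached = {step: events.loc[events["step"] == step, "session"].nunique()
           for step in funnel_order}

prev = None
for step in funnel_order:
    drop = "" if prev is None else f" (drop-off {100 * (1 - reached[step] / prev):.0f}%)"
    print(f"{step}: {reached[step]} sessions{drop}")
    prev = reached[step]
```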

Once you have identified your key processes and touchpoints, it is likely that the data rests in many different sources and in many different forms and formats (e.g., structured, unstructured). As a result, this requires a salesperson to spend valuable time aggregating this information from these various systems and tools to prepare for or meet new or existing clients, address challenges or questions, and update sales management on progress vs. plans and objectives. This administrative work is obviously unproductive and prevents the salesperson from spending more time in the field or with customers. It can also force the salesperson to determine the appropriate or next best course of action without the benefit of predictive analytics to help guide the process or the decision. Predictive analytics can provide recommendations or guidance based on analysis of historical events or activities that can yield better results than traditional methods or gut instinct. Augmenting the “art” of selling with “science” can make it easier for the salesperson and can generate more effective and productive outcomes. Taking this a step further, if you can complement the use of predictive analytics with technology solutions which can aggregate and simplify the delivery of this information to sales, then you can eliminate these administrative tasks and free up your sales teams to spend more time where it matters: with your customers.

ACTIVATE YOUR DATA

Once you’ve defined your customer journey and critical touchpoints, the next obvious question is: where is the data, and what do we need? Data is the “fuel” for AI and for empowering your sales teams to successfully manage the customer journey. Delivering effective AI-driven recommendations and results requires readily available, clean, and trusted data from your source systems.

Starting with your CRM, begin to identify where source data resides in your organization. You will quickly find out that only a portion of the data you need resides in your CRM. You may also need to access data in your ERP system, support systems, customer success, content management, CPQ (configure-price-quote), and other transactional systems. A significant amount of data will reside in your data warehouse or data lake, or the data may reside in “shadow IT” data sources, which are the most challenging of all. Regardless of where the data resides, you need to map each of these data sources to the critical events you defined in the customer journey and determine how easy, or difficult, it will be to integrate, access, cleanse, transform, and deliver the data you need to be successful.

FIGURE 2. Enterprise Data Strategy & Governance Program


As you identify these various sources and map them to the critical touchpoints you’ve defined from your customer journey work, you then need to determine the quality of the data. You must understand if you have consistent, well-defined, and agreed-upon data definitions and standards. Based on these definitions, how clean is your customer data? How are inputs created, by whom, and how frequently? What are the chances you may clean your data only to find that errors creep back into your data due to lack of controls, data stewardship, or data governance? Frequently overlooked, an ongoing data governance framework is vital to ensure that once you clean your data, it remains accurate and reliable.
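A few automated checks, run on every data refresh, go a long way toward keeping data trustworthy. The sketch below shows the flavor of such checks; the rules and field names are assumptions and would need to reflect your own agreed definitions.

```python
# Minimal data-quality checks that could run on every refresh (rules and fields are illustrative).
import pandas as pd

accounts = pd.DataFrame({
    "account_id": ["A1", "A2", "A2", None],
    "email": ["x@acme.com", "bad-email", "y@acme.com", "z@acme.com"],
})

checks = {
    "missing account_id": accounts["account_id"].isna().sum(),
    "duplicate account_id": accounts["account_id"].dropna().duplicated().sum(),
    "invalid email": (~accounts["email"].str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", regex=True)).sum(),
}

# Report violations per rule; in practice, route failures to the data stewardship process.
for rule, violations in checks.items():
    print(f"{rule}: {violations} violation(s)")
```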

AI can provide a very useful benefit here as well. It can improve how you capture customer data and information (e.g., scanning business cards vs. manual data entry, or ingesting LinkedIn data). AI can also help rationalize and cleanse customer data records such as names, job titles, addresses, phone numbers, company names, etc.

Since the foundation for much of your customer data is your CRM system, it is imperative to drive adoption and use of your CRM systems to ensure timely, accurate, and reliable data. Historically, this has been a significant challenge for many sales organizations due to many factors. Updating CRM systems often requires significant administrative time to modify or add records, update opportunities, etc. There is also often an unwillingness or reluctance to provide accurate opportunity or pipeline data due to quota or compensation concerns, complexity, or other factors. As a result, legacy CRM solutions can present a drain on productivity due to the manual data entry required to provide accurate and timely information for sales management. There is also some reluctance to provide too much information, which can then be “micro-managed” by sales leadership. Nevertheless, the more accurate your underlying sales and customer data, the better and easier it will be to make decisions for your sales organization and your company. Given these challenges, how can you improve the accuracy of your customer data?

There are many tactics you can utilize to drive adoption, use, and compliance. Traditionally, these methods work with varying degrees of success, but industry-wide CRM systems remain plagued by low adoption, infrequent usage, complexity, distrust, and erroneous or missing data.

Through innovations in AI, a new class of products and solutions has recently emerged to address this challenge. These solutions are generally described as “Virtual Digital Sales Assistants (VDSAs)” or “Intelligent Assistants”. VDSAs combine AI algorithms with touch, talk, or text interfaces with leading CRM and Salesforce Automation (SFA) solutions, marketing content/automation systems, customer support, legacy third-party databases, and other data sources. VDSAs simplify how sales can interact with these source systems to provide updates on contacts, leads, opportunities, and gather critical information at pre-determined times or on demand. Ultimately, VDSAs remove “friction” from current sales processes to allow sales to be more productive and effective. When coupled with predictive analytics or AI-delivered insights as described elsewhere in this white paper, it can become an even more powerful element of your digital sales transformation.

Not only can VDSAs simplify and streamline data integration for sales and help reduce administrative time and friction; there is another very strong value proposition. By making sales’ jobs easier and delivering information and insights to sales to help them perform their jobs better, there is a strong incentive for sales to use these solutions. As they do so, the data in the underlying CRM will be better, more accurate, and timelier. This means that all downstream data-driven processes, including forecasting, supply chain demand planning, and predictive analytics, will also become more accurate and trusted. An example from GE illustrates this point. At GE, “sales, technology, and finance executives have been collaborating on an app to reduce time that sellers spend inputting and addressing forecast questions. This app allows sellers to enter information on the fly, through text and voice solutions, and has eliminated multiple rounds and levels of management inspection of the numbers. Early pilots point to significant ROI as sellers spend more time on customer-facing selling activities.”2
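To make the concept tangible, the following toy sketch parses a short text command and applies it to an in-memory stand-in for a CRM. Real VDSAs rely on natural language understanding services and vendor CRM APIs; every name and behavior here is an assumption made purely for illustration.

```python
# Toy "text to your CRM" assistant: parse a short command and update an in-memory record.
import re

crm_opportunities = {"OPP-101": {"stage": "Proposal", "amount": 50_000}}

def handle_command(text):
    # Recognize a very simple pattern like "move OPP-101 to Negotiation".
    m = re.search(r"move (OPP-\d+) to (\w+)", text, flags=re.IGNORECASE)
    if m and m.group(1) in crm_opportunities:
        crm_opportunities[m.group(1)]["stage"] = m.group(2).capitalize()
        return f"Updated {m.group(1)} to stage {m.group(2).capitalize()}"
    return "Sorry, I couldn't parse that update."

print(handle_command("Move OPP-101 to Negotiation"))
print(crm_opportunities)
```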

To conclude, as you aggregate and deliver clean, reliable customer data through your CRM and source systems, you can then apply AI predictive models and algorithms to help your sales teams make sense of the mountain of information available to them. Ultimately, this will enable you to make it easier for your sellers to succeed in their core mission, which is to serve your customers and drive growth for your business.

MANY CHANNELS TO OMNI-CHANNEL

According to Forrester,3 “the explosion of content-rich B2B marketing and commerce sites has made buyer research easier than ever. This digital maturation is in stark contrast with the experience buyers receive working with human sellers. More than 90% of B2B buyers prefer to make their purchases online rather than interact with a salesperson, yet highly considered purchases often still require seller involvement. AI in human-assisted sales needs to match the buyer experience of self-service if sellers hope to stand a chance in long sales cycles.”

Buyers today are more educated and connected than ever before, and accustomed to a digital experience in their personal lives. As a result, their expectations for the same experience in their working lives have increased dramatically. When considering new products or services, they will engage multiple channels to explore and learn about those products and services, frequently doing so before they ever engage a salesperson. They will go online and read marketing literature, white papers, and customer testimonials, watch videos or demos, or perhaps even download sample or trial software. As a result, it is essential to connect all these various channels to ensure not only a consistent experience for the customer, but also to arm your sales team with the information they need to be successful.

2 Mary Shea and Jacob Milender, “B2B Sales Force Digital Transformation: Three Global Leaders Share Best Practices”, Forrester, July 26, 2017.

3 John Bruno, “How AI Will Transform Sales”, Forrester, December 18, 2017.

This means that coordinating and collaborating across organizations, partnerships, and various channels is more critical than ever. It’s no longer adequate to have your marketing organization collect and distribute recommendations and actions through email, PowerPoint, or other standalone data sources to your inside or outside sales teams. Data-driven information must be available and delivered in as close to real time as possible to exploit opportunities before they are lost. A recent article by Boston Consulting Group also stated that “the big problem for most companies is that marketing and sales operate in their own silos, each function having its own organization processes, incentives, cultures, and in many cases, objectives”. Furthermore, they state that “companies that do not cooperate closely across their organizations may suffer because of poorly executed customer buying journeys, misaligned objectives, misallocated resources, and poor team morale”, resulting in the potential for “customer alienation, loss of market share, and slowed or no growth”.4 As a result, it is imperative to close the historical gap between marketing and sales through integration of data and insights into a common language, actions, and insights delivered through a single, integrated platform. Many CRM systems attempt to provide this connection, but all too often, marketing teams and sales ops or support teams deliver additional insights or recommendations through alternate means like email, chat, PowerPoint, or Excel. Frequently, these recommendations may not align due to underlying data differences, inconsistencies in definitions, KPIs, or analytical models. Ultimately, this requires the salesperson to consolidate the inputs and decide what course of action he or she needs to take. This can exacerbate the administrative burden and reduce confidence in the predictive analytical models.

One excellent example of addressing this challenge is Cisco’s 2020 initiative5. Cisco’s sales and marketing teams partner on Customer360: a data-driven collaborative effort to better understand buyers and provide guidance and recommendations for sales to take appropriate action. By employing data engineering, predictive analytics, and “connecting” marketing and sales data and insights, it means the seller doesn’t have to look for the information and can focus on selling and their customers. Not only does this mean your reps are more effective, but you can also realize significant improvement in your sales cycle time, conversion, win rates, and ultimately revenue.

It’s also imperative to be aware of and align your efforts between your “direct” and “indirect” sales teams or your channel partners. Sophisticated B2B buyers will no longer accept disconnected engagements from both sales organizations, particularly when those teams are unaware of the potential “conflict” or, worse, uninformed and openly competitive. It’s equally important to collaborate and align with your support organization (i.e., are there any open service tickets or escalations?) and your specialist teams with deeper knowledge of your product and services capabilities and features. Your salespeople need to know who is calling on their customer, what they have bought, and what their experience is with your company, your products, and services. All of this requires knowledge and information from various sources and organizations, both inside and outside of the company.

4 Phillip Andersen, Robert Archacki, Basir Mustaghni, Roger Premo, “Building an Integrated Marketing and Sales Engine for B2B”, The Boston Consulting Group, June 2018.

5 Mary Shea and Jacob Milender, “B2B Sales Force Digital Transformation: Three Global Leaders Share Best Practices”, Forrester, July 26, 2017.

“We take marketing sentiment data, pair it with sales data, and create insights that tie to opportunities and actions that reps can take. This collaborative effort fosters higher-quality interactions with customers and better rep prioritization of selling activities.”

– Forrester describing Cisco’s 2020 initiative

To meet the need to share and exchange information and ideas across these various channels, there are many new collaboration tools and solutions in the marketplace. These solutions are increasingly powered by AI and address various use cases such as team collaboration and communication, content or document coordination, edits, and approvals. Collaboration tools and applications can help sales work seamlessly with other teams and customers to accelerate deals and provide a superior customer experience. If designed and implemented with the salesperson at the center, you can facilitate rapid collaboration by diverse teams to enhance communications, enable the use of file sharing and document annotation capabilities (e.g., proposal content review and coordination), support live meeting capabilities through video and audio, and incorporate electronic signature capabilities for faster approvals of proposals and contract documents. Ultimately, by improving collaboration through these tools, you also remove friction and administrative burden from your selling process.

“Sales force digital transformation requires new and more creative ways of collaborating.”

– Forrester

EMBRACE PREDICTIVE ANALYTICS


It’s an unprecedented era for B2B sales organizations. Buyers and buying patterns are changing. Traditional sales models are being augmented if not supplanted by digital channels and expanded routes-to-market. There is a staggering amount of data, computing power, and technology solutions available to sales. However, if left unmanaged, it merely adds to the burden a salesperson has, which is to navigate the growing amount of data and information being sent their way. Sophistication in advanced analytics and machine learning provides the ability to augment traditional sales “instincts” with data-driven information at a scale never possible before. Due to these trends, B2B sales is also rapidly evolving from an “art” to a “science”. To many, this is not an entirely comfortable conclusion, given that “science” does not completely replace human knowledge, intuition, experience, and “gut instinct”. However, there is so much data and information available to the salesperson today that it is not humanly possible to make sense of it all without the benefit of big data, analytics, and AI, particularly if utilized effectively. Like the “needle in the haystack” analogy, it is critical to discern what is important and insightful from the mountains of available data. This is where predictive analytics can make a significant impact. Furthermore, according to Forrester in a recent Forbes article6, “companies that opted to blend AI with human insight report improved satisfaction on the part of sales reps (69%), as well as heightened operational efficiency (68%), agent productivity (66%), and customer satisfaction (61%).”

Therefore, embracing advanced and predictive analytics can benefit sales organizations in many ways. From the massive amounts of data available to enterprises today, advanced and predictive analytics can evaluate the available data faster and better than a human is able to. By employing predictive analytics, you can provide recommendations on which campaigns, accounts, customers, or opportunities to prioritize to maximize impact. You can also provide guidance on accounts with a higher propensity to win or buy, help optimize product and pricing strategies, identify the next product to buy, and more.
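A hedged sketch of what a basic propensity-to-win model might look like appears below. The features, synthetic training data, and example deals are invented solely to show the mechanics, not to suggest a particular modeling approach.

```python
# Toy propensity-to-win model on synthetic historical opportunities.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
# Features: deal size (log $), days in pipeline, number of contacts engaged.
X = np.column_stack([rng.normal(11, 1, 400), rng.normal(45, 15, 400), rng.poisson(3, 400)])
# Synthetic outcome: more engaged contacts and shorter cycles win more often.
logits = 0.6 * X[:, 2] - 0.03 * X[:, 1]
y = (rng.random(400) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score two hypothetical open deals.
open_deals = np.array([[11.5, 30, 5], [10.8, 90, 1]])
for deal, p in zip(["Deal A", "Deal B"], model.predict_proba(open_deals)[:, 1]):
    print(f"{deal}: win propensity {p:.0%}")
```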

6 Falon Fatemi, “4 Ways AI is Transforming Sales Organizations”, Forbes, February 28, 2018.

The table below illustrates some predictive analytics use cases that can be applied against each step of the customer journey to provide guidance and recommendations to the salesperson throughout the process. Moreover, as you implement these predictive analytics solutions and apply machine learning, the output will become more comprehensive, refined, and accurate. A fundamental principle of AI and machine learning is the ability to learn, improve, and increase in accuracy over time through repetition and feedback. As you compare actual results vs. predicted outcomes through machine learning processes, you can refine your algorithms. Therefore, as you implement, explore, test, and learn, the software algorithms will improve as well. This means the entire ecosystem gets smarter—sales reps, sales managers, executives, and those who rely on sales to drive (and predict) the engine of growth and customer experience for the enterprise. It also means the outcomes and recommendations gain trust as they improve in accuracy; and based on Forrester’s findings, your sales reps will also be more satisfied, productive, and efficient.

 

FIGURE 3. Sampling of Predictive Analytic Use Cases


As you implement these various use cases, you will find that not only can you provide recommendations or guidance based on the algorithms, but you may even be able to introduce proactive measures. For example, you could alert your support teams to act when you observe an event, trend, or pattern that requires attention or action while notifying your salesperson of a potential issue. As an illustration, if you have a shipment that may be delayed, you could notify your sales operations and supply chain organizations to proactively address or mitigate the issue to prevent or minimize the impact. Your salesperson would be informed and can help manage the event or issue with your customer. This is far better than not knowing, missing the shipment, and finding out about the issue through customer escalation. It could be as simple as a quote that is expiring, or an opportunity at 90% in your pipeline that is “stalled”. If you measure and track customer sentiment, satisfaction, or NPS, you could also proactively alert your sales rep that one of their accounts is at risk of leaving and recommend actions. If your account is “healthy” and/or you are expanding your presence in the account, you could also identify opportunities for cross-selling or up-selling complementary products and services, or proactively capture renewal opportunities.
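A first, rule-based version of such proactive alerts could be as simple as the sketch below. The thresholds, field names, and notification channel are illustrative assumptions; predictive models would eventually complement or replace the hard-coded rules.

```python
# Simple rule-based pipeline alerts (thresholds and fields are assumptions for illustration).
from datetime import date, timedelta

today = date(2019, 6, 1)
pipeline = [
    {"opp": "OPP-7",  "stage_pct": 90, "days_since_update": 21, "quote_expires": today + timedelta(days=3)},
    {"opp": "OPP-12", "stage_pct": 40, "days_since_update": 2,  "quote_expires": today + timedelta(days=60)},
]

def alerts_for(opp):
    found = []
    if opp["stage_pct"] >= 90 and opp["days_since_update"] > 14:
        found.append("late-stage deal looks stalled")
    if (opp["quote_expires"] - today).days <= 7:
        found.append("quote expires within a week")
    return found

for opp in pipeline:
    for msg in alerts_for(opp):
        print(f"{opp['opp']}: {msg}")  # in practice, push to the rep's mobile app or chat
```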

Depending on your business priorities or challenges, there are an infinite number of use cases, which can be designed and implemented to improve your sales execution and customer satisfaction. As you employ these predictive analytics, learn and improve the underlying algorithms, you can improve your ability to proactively provide recommendations or guidance to your sales teams.

Proactively suggesting actions not only avoids missing critical events or opportunities to win business, expand relationships, or retain accounts, but it also moves you away from the reactive and time-draining cycles of managers, support staff, or executives asking what happened, along with the ensuing emails, phone calls, and escalations. Worse still, those cycles mean a salesperson is forced to spend more time dealing with internal issues and questions as opposed to spending time with customers. You may even go beyond data science and apply “behavioral science” in order to understand the drivers of behavior (of customers and/or salespeople) and design levers which “nudge” or encourage the desired action or outcome.

To improve or accelerate the use of predictive analytics, you may also consider how you organize your analytics and data scientists (internal, or external partner) to support your sales organization. While there are benefits to a shared service function like analytics and data science, you will get significantly more impact, relevance, and buy-in from sales if you align your talent to the sales organization they support. This does not suggest you utilize a fragmented organizational model for your analytics talent. They may (and likely should) still report into a central function. However, they should be embedded into the business they support. In this manner, you can benefit from both the advancement in analytics and AI knowledge and skills developed within the central function while learning more about the business processes and challenges. The more domain knowledge the analytics professional or data scientist has, the more relevant and accurate the predictive analytical models. From there, it is imperative to experiment, test, and modify the original hypothesis with actual results and continually refine the algorithms.

As you advance in your knowledge of, use of, and trust in data-driven recommendations, it will only be natural to see an increase in the use of AI, chatbots, and robotics to automate repeatable activities or transactions, freeing up your high-cost, highly skilled sales talent to focus on more complex engagements and opportunities.

From a sales leadership perspective, these predictive analytical models should be used in conjunction with “gut instinct”, not instead of it. The more accurate and useful the prediction, the lower the variability between actual and predicted results. This is a significant improvement over purely “gut instinct” or, even worse, “gaming”. As discussed above, algorithms and models will mature and improve. As they do, sales and sales leadership can rely even more on these data-driven recommendations. This also means sales managers and sales leaders can provide more effective coaching and training based as much on facts and data as on behaviors and acquired knowledge. The emergence of “behavioral sciences” to complement “data sciences” may provide additional benefit as the field matures and more use cases become evident. By applying behavioral science to determine the way decisions are made (by sellers and buyers), identify optimal sales candidates, assist in sales coaching, sales communication and goal setting, and improve sales processes to encourage or “nudge” the appropriate behavior, there is great promise in what behavioral sciences can yield.

As discussed earlier in this white paper, it is essential to have good, clean, and reliable data from trusted data sources for these predictive models to work.

PUSH INFORMATION AND INSIGHTS TO YOUR SELLERS

Once you’ve defined the critical events in your customer lifecycle, the single “source of truth” for the data, and have begun utilizing predictive analytics to provide guidance to your salespeople, what is the best way to deliver that information to your front-line sales teams when they need it?

Traditionally, it was up to the sales individual to access multiple data sources to plan, manage, and report on their business and performance vs. plans and quotas. In many enterprises, sales planning teams, sales operations, marketing, or other organizations are sending either emails, texts, tasks, or actions in your CRM or using any available means to communicate a promotion, new offers, events, or recommendations for your sales teams, ultimately overwhelming them with information.

In addition, sales management and executive leadership want to be kept informed. They request status updates through email, chat, text, or most critically – the CRM. As a result, the salesperson would access their CRM system to enter contact data, account information, update opportunities, enter trip reports, and keep their management updated on all assigned accounts. The salesperson may need to create reports, charts, or slides for management review. While this helps keep management informed and allows them to keep track of appropriate actions and activity, it’s very time consuming for the salesperson.

If they are geographically or territory-assigned salespeople, they may need to access multiple data sources to plan their customer visits. They may use D&B, Hoovers, Aberdeen, and LinkedIn to gather information on the customer and account information or gather some competitive intelligence. They may access their CRM and other internal data sources to understand buying patterns, product history, outstanding quotes, and any existing relationships. They might use a mapping program like Google Maps to efficiently plan their route.

These examples highlight the fact that the salesperson must access disparate tools, applications, or data sources and aggregate the information in a manner that is relevant and useful for them to do their jobs. Perhaps work like this is being performed by your inside sales, sales support, or sales operations organization. However, all of this requires manual effort and can divert the salesperson from what they should be doing: spending more time with customers and less time on administrative or data entry work. Moreover, most modern business intelligence (BI) tools and applications used to gather this information require considerable training or force the end user or seller to drill into raw data, create pivots, or query databases, which adds to the challenge of effectively delivering information to sales.

Fortunately, there are emerging platforms that make it easier for sellers (or anyone, for that matter) to acquire this information. By offering search experiences similar to Google, BI and analytics providers are beginning to make their applications easier to work with. Instead of asking your end users or sellers to navigate dense databases to search for information, they can query these new applications through search or voice commands and receive real-time, contextual answers to their questions without having to spend valuable time mining databases. Behind the scenes, these search solutions employ AI technologies like NLP, machine learning, and chatbots to query, ingest, and deliver information to the end user or seller in an intuitive, relevant, and contextual manner. While GUI and cognitive search engines simplify how your end users or sellers can acquire valuable information or insights, it is even better if you can proactively communicate or push this information to your sellers, so they don’t have to look for it.
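To make this concrete, below is a deliberately tiny sketch of the “ask a question, get a contextual answer” pattern in Python. It is not any vendor’s NLP engine: the account data, intent keywords, and function names are hypothetical, and a production cognitive search layer would use trained language models rather than keyword rules.

# Toy illustration of mapping a natural-language question to a structured answer.
# All data and intent rules here are hypothetical.
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    open_opportunities: int
    pipeline_value: float  # USD

ACCOUNTS = {
    "acme": Account("Acme Corp", 3, 420_000.0),
    "globex": Account("Globex", 1, 85_000.0),
}

def answer(question: str) -> str:
    """Very naive intent detection: find an account name and a metric keyword."""
    q = question.lower()
    account = next((a for key, a in ACCOUNTS.items() if key in q), None)
    if account is None:
        return "Sorry, I couldn't find that account."
    if "opportunit" in q:
        return f"{account.name} has {account.open_opportunities} open opportunities."
    if "pipeline" in q:
        return f"{account.name} pipeline is ${account.pipeline_value:,.0f}."
    return f"{account.name}: {account.open_opportunities} opportunities, ${account.pipeline_value:,.0f} pipeline."

print(answer("How many open opportunities does Acme have?"))
print(answer("What is the pipeline for Globex?"))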

To advance the thought further: what if you not only provided data and insights to sales when they need them, but also relied on tools and processes to remind or “nudge” them to act? For example, what if, just after you left a customer meeting, your sales application asked you to document the meeting, provide a trip report, and enter all of this into your CRM seamlessly?

Even better, what if you could alert your salesperson, before they enter the meeting, that there is an update on pricing for an outstanding quote, that warranties or software licenses are expiring or up for renewal, or that certain products may be nearing “end of life”? Furthermore, by leveraging your predictive analytics engine, you can also provide recommendations on the next best buying opportunity (e.g., if you are a B2B enterprise selling products and services, are there additional products, peripherals, accessories, or services that you can offer to complement a recent quote or, better, order?). What if the customer has experienced a recent product outage or is having trouble with your services organization? Obviously, it would be better to be armed with this information in advance to avoid surprises, better manage the customer experience, and take advantage of selling opportunities.

All of these scenarios (and many more) are possible if you have defined your customer engagement and sales process and the underlying data sources, and leveraged your Analytics Center of Excellence or your analytics partner to develop and push these insights and recommendations to sales. As discussed in the next section, ideally you can do so through simple, intuitive mobile solutions.

REDUCE COMPLEXITY

UTILIZE AI AND DIGITAL SOLUTIONS TO SIMPLIFY SALES PROCESSES

The final piece of the transformation is delivering the data, information, and insights you have generated through an easy, intuitive interface for your sales organization: how they need it, when, and where. Mobile business intelligence, salesforce automation (SFA), and analytics solutions represent an improvement over legacy PC or browser-based applications. Mobile SFA solutions provide an easier way for salespeople to interact with their CRM systems through mobile devices. While these mobile solutions make it easier for sales to utilize technology in their work processes, they fall short of truly empowering the salesperson and solving the dilemma of poor adoption, use, and accuracy of data in CRM systems. As discussed earlier in this white paper, a contributing challenge is that critical data is typically stored in different source systems, data warehouses, or data lakes, requiring the salesperson to aggregate the information.

As described by The Boston Consulting Group7, “while companies have made massive investments in technology, they haven’t focused on true integration – that is, integrating tools with the way people actually work”. The resulting paradox, they claim, is the “complexity trap” that most companies face. They further assert that digital technologies and methods are supremely flexible: they enable businesses, end users, and IT departments to design applications and user journeys that are “just right” and adapt processes accordingly, in the end reducing or eliminating this complexity trap. Naturally, this is also the dilemma of the average salesperson, who is asked to navigate internal complexities to do their job.

Yet another challenge is that sales reps notoriously delay entering a deal opportunity into the CRM because they don’t want sales managers learning about it and constantly asking what they are doing to move the opportunity further along the sales cycle. In addition, it takes time to translate notes or recollections into CRM databases, so sales reps often wait until they work from home on Friday, or prepare on Sunday evening for the week ahead. As a result, the data may be inaccurate, stale, or forgotten. While mobile access is useful for providing information to sales when they are traveling or visiting customers, it falls far short of empowering your sales teams with the data and insights they need to be successful in performing their work, generating sales, and spending time with clients.

As suggested by Gartner’s Tad Travis8, this challenge can be addressed by employing a Virtual Digital Sales Assistant (VDSA) solution for your sales organization, particularly when utilized in conjunction with the predictive analytics use cases described in this white paper. VDSAs can integrate data across the CRM, support systems, content management, legacy data sources, and external data sources like LinkedIn, email, calendar, mapping, and other sources to deliver frictionless and contextual insights. Many of these VDSA solutions utilize natural language processing (NLP) and chatbots to enable your sellers to interact with these source systems and your CRM through voice commands, similar to how you may use Alexa, Siri, Cortana, or similar applications in your personal life.

7Vanessa Lyon and Anne-Francois Ruand, “Take Control of Your Digital Future”, The Boston Consulting Group, 2018.

8Tad Travis, “2016 Recap: The Third Wave of Sales Automation is Here”, Gartner, January 3, 2017.

“While companies have made massive investments in technology, they haven’t focused on true integration – that is, integrating tools with the way people actually work.”7

– Vanessa Lyon and Anne-Francois Ruand, The Boston Consulting Group

As a result, you could have an AI-powered sales digital assistant proactively deliver information to you in advance of a meeting that you’ve scheduled, or your sellers could ask for information through voice command. The VDSA can then deliver information on the account, contact, prior sales, open opportunities, open service tickets, partners or competitors who may be engaged, etc. All of this means your salesperson is significantly more knowledgeable going into a customer meeting, and with significantly less effort than trying to gather this information on their own. Moreover, you don’t need to employ large teams of support staff to gather and generate information like this either—further freeing up your talent and operating budget for more value-add work and more time with customers. When your salesperson leaves the meeting, their personal digital sales assistant can “nudge” them to capture and document key findings, agreements, next steps, or actions. It can then automatically update your contacts, account information, or opportunity status in your CRM system, meaning the information is timely, accurate, and readily available to your supporting organizations in sales operations, support, supply chain planning, or even your financial forecasting team. Using The Boston Consulting Group’s analysis, this would also allow you to remove complexity from your sales processes, and free up your salespeople.

Ultimately, instead of salespeople being expensive data entry clerks, they can enjoy a seamless, intuitive AI-powered interface (“touch, talk, text”) with their CRM and various data sources, allowing them to focus on the work of selling. The seller becomes the center of the selling universe, not the CRM. Coupled with the use of predictive analytics along the customer journey to deliver insights and information to your salespeople when they need it, you will greatly empower the ability of your entire sales organization.

Your sales organizations can enjoy higher productivity, win rates, and faster sales cycles, while improving employee engagement and customer experience.

“VDSA will become the primary interface by which sales representatives manage their work. When combined with artificial intelligence systems, VDSA will become the cognitive system that removes much of the inefficiencies common in B2B sales processes.”

– Tad Travis, Research Director, Gartner

CLOSING AND CALL TO ACTION

Unfortunately, there are no complete, end-to-end solutions which adequately address all of the challenges and inefficiencies that exist within legacy sales operations. Naturally, there will be convergence and consolidation as vendors and service providers mature in their application of AI solutions to solve these sales process challenges.

The good news is many of these emerging applications, tools, and solutions are focused on simplifying and improving the seller’s (and customer’s) experience through the application of advanced analytics and AI. This is a vast improvement over legacy applications that largely focused on the collection of data and information for management, oversight, and inspection of sales activities.

None of these steps is easy, and some may be too large a leap for many firms. However, these strategies are pivotal to success in driving digital sales transformation in today’s rapidly evolving, complex B2B selling environment. Therefore, the sooner executives and sales leaders adopt these strategies, or embark on a roadmap to do so, the more competitive they will be in the digital era.

TO SUMMARIZE:

  1. Define your customer journey and identify your critical touchpoints. Use this to determine how and when your salespeople should engage customers (your methodology or desired process) and where you can apply AI solutions to aid them.
  2. Map your touchpoints to your data and your data sources. Leverage open source or API solutions to accelerate access to necessary data sources.
  3. Embrace all channels: online, offline, direct, indirect, support, and social media. Utilize collaboration tools to break down internal and external silos.
  4. Aggressively employ advanced and predictive analytics: adapt and modify algorithms to improve accuracy and confidence. As your organization learns what is possible, new methods will become apparent, including proactive, predictive, and prescriptive solutions.
  5. Take the burden of data entry and data consolidation away from your sellers. Streamline, integrate, and “push” information and insights to your sellers when they need them.
  6. Utilize AI-powered digital solutions to deliver the insights and information in a simple, intuitive manner.


Analytics and AI are beginning to make a significant impact in enterprise sales organizations. Leaders in adopting AI to enhance their sales processes are already reaping the rewards of their investments.

As stated by Gartner in a Forbes article9, “30% of all B2B companies will employ AI to augment at least one of their primary sales processes by 2020. The most effective companies, though, will use AI to augment multiple parts of their sales processes”. There will be significant rewards for companies that do so. According to McKinsey10, “companies that have embraced what we call the ‘science of B2B sales’ have already started to pull ahead of their peers in terms of revenue growth (registering 2.3 times industry average revenue growth), profitability (3 to 5 percent additional return on sales) and shareholder value (8 percent higher total return to shareholders than the industry average).”

 

9Falon Fatemi, “4 Ways AI is Transforming Sales Organizations”, Forbes, February 28, 2018.

10Tim Colter, Mingyu Guan, Mitra Mahdavian, Sohail Razzaq, Jeremy Schneider, “What the future science of B2B sales growth looks like”, McKinsey&Company, January, 2018.

AUTHOR


DOUG HILLARY

Advisory Board Member, Fractal Analytics

Doug provides advisory services to help advance Fractal Analytics’ capabilities, services, and offerings to empower enterprise clients. Doug leverages his knowledge and experience to help Fractal Analytics and clients accelerate the use, adoption, and value creation with data, analytics, and AI in the enterprise. Previously, Doug held various leadership roles at Dell for more than 19 years. In his most recent role at Dell, he was responsible for providing global data, reporting, and analytics services to support Dell’s sales, marketing, finance, services, e-commerce, and operations business units. He was also responsible for leading a transformation strategy for improving the use of data, BI, and analytics across the company to enhance decision-making. Doug also led a digital sales transformation for Dell’s global sales operations by partnering with IT to create a big data platform for sales, enabling the use of enterprise-wide KPIs and BI solutions. He also led the creation and implementation of predictive analytics forecasting solutions for the global sales organization.

 

Customizing big data strategies to drive success in deployment

Due to the disparate nature of data, analytics, and technology in different organizations, the optimal big data deployment strategy may differ for every organization.

Therefore, a customized strategy that is specific to the organization should be drafted for deploying big data technologies without any disruption. A detailed assessment of integration and interoperability, security, governance, and processes should be made before the deployment. All the deliverables should be defined and artifacts made available for seamless implementation.

The objective of this paper is to enable the decision makers to adopt a comprehensive approach while deploying big data technologies and processes in their organizations.

The age of big and smart data

We are in the age of big data, where organizations are exploring ways to store, manage, and harness large amounts of data effectively. With the intention of improving their operations and making informed business decisions, they are applying analytics to convert data into actionable insights. With the help of intelligent algorithms, the attempt is to make data smart so that it can surface patterns and signals for informed decision making. This will eventually result in a significant reduction in operational costs and an increase in profits.

The think tank at the USA’s leading health insurance company and Fractal Analytics planned rigorously for eight weeks before successfully deploying big data technologies within the former’s infrastructure.

Immense expertise, however, is required to select and deploy the right combination of big data technologies that enhance operations and address specific business needs. There is a plethora of technology offerings in the market that solve specific problems within big data environments. Moreover, these technologies are evolving at a rapid pace to offer greater efficiency and solve more complex problems. To make the best of these technologies, it is important to assess the existing data and infrastructure, and compare with the industry benchmarks to identify the gaps. It is also essential to articulate the key performance indicators to be achieved and draft a detailed big data deployment and adoption plan tailored for the organization.

Taking the plunge without adequate knowledge and a foolproof big data strategy can result in failure. Ill-informed decisions and flawed deployment roadmaps can drain budgets and have an undesired impact on business performance.

Big data deployment framework

There are several factors that influence the decisions to deploy big data technologies in an organization.

  • Factor: Presence of unstructured and/or non-traditional data in the system
    Examples: Web chats, call center transcripts, digital notes and records, web metrics
  • Factor: Dealing with a huge volume of data that runs into petabytes or zettabytes
    Examples: Social media data, calls, web chats, web logs, mobile devices data
  • Factor: Flow of ultra-low latency data
    Examples: Social media and blog comments, recent calls and chats, web adoption, user authorization information
  • Factor: Need for real-time scoring and insights
    Examples: Authorization triggers, incoming calls
  • Factor: Exploring new analytics algorithms
    Examples: Text mining, predictive models, probabilistic learning algorithms, unsupervised learning methods, Bayesian algorithms, neural networks

Data, analytics, and technology are the three main pillars of the big data landscape.

However, not all pillars are equally strong in every organization. Some organizations have structured data with known latency and volume, but have no means to perform analytics with it. Others may have all the analytical tools in place, but no control over managing data. There may be yet others that have control over data and analytics, but are unable to harness the insights for decision making due to outdated technology infrastructure.

These three pillars are integrated and further reinforced with security, governance, and processes.

In the subsequent sections, let us delve deeper to decipher this framework.

Data

Data is at the heart of analytics, technology, and informed decision making. Its shape, volume, and latency determine the breed of big data technologies to be deployed in the organization. Meticulous mapping of data properties, sources, and frequencies is important while drafting the deployment strategy.

Data from existing and new sources may be dealt with differently. Besides, there may be different methods to manage transactional and non-transactional data. The degree to which the data is structured also influences deployment decisions. The data’s format, its ability to interact with other data and databases, and its consistency should be thoroughly assessed. Other factors, such as write throughput, data-source prioritization, event-driven message input, data recoverability, fault tolerance, high-performance deployment, platform availability, and automation, also need to be considered while strategizing.
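As a rough illustration of this assessment step, the sketch below profiles a sample extract for the properties mentioned above (volume, structure, nulls, and a crude latency proxy). It assumes pandas is available; the file name and column names are hypothetical.

# Minimal data-profiling sketch; file and column names are hypothetical.
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize per-column properties that influence platform choice."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_pct": (df.isna().mean() * 100).round(1),
        "distinct": df.nunique(),
    })

df = pd.read_csv("transactions_sample.csv", parse_dates=["event_time"])
print(f"rows={len(df):,}, columns={df.shape[1]}")
# In-memory footprint as a rough proxy for volume planning.
print(f"memory_mb={df.memory_usage(deep=True).sum() / 1e6:.1f}")
# Gap between consecutive events as a crude latency/frequency indicator.
print(df["event_time"].sort_values().diff().describe())
print(profile(df))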


A combination of traditional technologies that are in use (such as relational databases) and big data technologies (such as Apache Hadoop and NoSQL) might be the apt solution for some organizations to achieve their business objectives. For others, adopting new technologies altogether would be the best solution.

Analytics

Analytics may involve developing and operationalizing descriptive, predictive, text mining, and unsupervised learning models leveraging data sources. Developing an analytical capability in a big data environment involves understanding how it would support the following:

  • Data processing, querying, aggregation, and transformation
  • Structured query language (SQL) and native programming languages
  • Human-acceptable query latency
  • Text mining and processing
  • Supervised and unsupervised algorithms
  • Interoperability with other platforms and analytical tools

 

Apart from these factors, the analytical models should ideally offer features such as ease of use (coding, debugging, and packaging), open source libraries and application programming interfaces (APIs), and graphical interfaces for visualization. Operationalizing analytics may typically involve assessing fault tolerance and automated recovery of data processing jobs, setting standardized parameters such as dates and strings across platforms, seamless scheduling of jobs, logging, and monitoring.
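As a simple illustration of the operationalization concerns above (fault tolerance, automated recovery, and logging), here is a minimal job wrapper using only the Python standard library. The job body, retry count, and backoff are illustrative assumptions rather than a reference implementation.

# Minimal sketch: run a scoring job with logging and automated retry.
import logging
import time

logging.basicConfig(level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("scoring_job")

def run_scoring_job() -> None:
    """Placeholder for the actual data processing / model scoring step."""
    log.info("scoring batch started")
    # ... read inputs, apply the model, write outputs ...
    log.info("scoring batch finished")

def run_with_retry(job, attempts: int = 3, backoff_seconds: int = 60) -> None:
    """Retry a failed job with a fixed backoff, then surface the failure."""
    for attempt in range(1, attempts + 1):
        try:
            job()
            return
        except Exception:
            log.exception("attempt %d of %d failed", attempt, attempts)
            if attempt < attempts:
                time.sleep(backoff_seconds)
    raise RuntimeError("job failed after all retry attempts")

run_with_retry(run_scoring_job)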


To operationalize analytics, a combination of traditional technologies that are in use (such as SAS, R, Python server) and a distributed environment for big data could possibly be the best solution for some organizations to achieve their business objectives.

Technology

The existing infrastructure in the organization may have limitations to solve complex data problems. It is important to study the existing software and hardware to identify the gaps that can be filled with new technologies and systems.

The organizations should assess their networking, server, storage, and operations infrastructure thoroughly before deploying big data technologies. Besides, the nature of data processing—real-time or batch—also drives the infrastructure needs to a considerable degree. This in turn determines whether the infrastructure needs to be commissioned on cloud, dedicated servers, or a combination of both.


Speed, performance, scalability, and costs are other important factors that can influence the decisions around investments in big data infrastructure.

Lastly, it is of utmost importance to map the technology infrastructure with human skills for deploying and using it.

Integration and interoperability

Big data technologies should be integrated into the existing infrastructure in a seamless fashion to avoid any disruption, business downtime, and cost overruns.

Initially:

Data storage and operationalization can happen in both traditional and big data environments, whereas analytics can remain exclusive to the traditional environment. This will provide a certain level of cost optimization.

Gradually:

Some analytics can happen in both the environments, thereby providing additional cost savings.

Finally:

Data storage and analytics can happen in both the environments with operationalization becoming exclusive to the big data environment. This will optimize cost significantly.


Various databases, clusters, and nodes should be studied for integration. Other important considerations are metadata and master data management (MDM), extract-transform-load (ETL) preprocessing, data retention, framework for faster deployment with automation, and scalability.

At the organizational level, inter-departmental, cross-project, and multi-platform integration of big data technologies should be planned early on, as it may get difficult to achieve this later.

Security

Security and privacy have become major concerns with the advent of cloud, diversified networks and data sources, and the variety of software platforms. As the organization’s data and infrastructure become more accessible from different platforms and locations, they also become vulnerable to hacking and theft risks.


Traditional security methods and procedures that are suitable for small-scale static data may be inadequate to fortify big data environments.

Amongst several security considerations, big data deployment strategy should most importantly encompass the following:

  • Securing all the applications and frameworks
  • Isolating devices and servers containing critical data
  • Introducing real-time security information and event management
  • Providing reactive and proactive protection

Finer details of configuring, logging, and monitoring data and applications should be known beforehand to implement the security measures.

Governance

Data governance involves having access to audit reports and reporting metrics. The scope of governance should be clearly defined while commissioning the big data environment.

The following are certain important considerations for governance:

  • Defining the frequency of refreshing and synchronizing metadata
  • Identifying possible risk scenarios along with failovers
  • Instilling quality checks for each data source loaded and available within the big data environment
  • As per the regulations and business needs, disposing of the assets that are no longer required
  • Defining guidelines for acceptable use of social media data of existing and potential customers
  • Scheduling audit logging of events along with defined metrics
  • Productionizing a logging framework for data access and for the creation and update of intermediate datasets, so that logs of event runs are generated and failures/errors during production runs and analytic operations can be identified (a minimal sketch follows this list)
  • Identifying access patterns across data folders and defining access control rules based on hierarchy (groups, teams, and projects)
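A minimal sketch of the audit-logging idea from the list above is shown below: every data access or dataset update emits one structured record that can later be queried for failures and access patterns. The field names and the file-based sink are illustrative assumptions, not a prescribed design.

# Minimal audit-logging sketch; field names and the log sink are illustrative.
import json
import logging
from datetime import datetime, timezone

audit = logging.getLogger("audit")
audit.setLevel(logging.INFO)
audit.addHandler(logging.FileHandler("data_access_audit.log"))

def log_event(user: str, dataset: str, action: str, status: str) -> None:
    """Write one structured audit record per data access or update."""
    audit.info(json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "action": action,   # e.g. read / create / update
        "status": status,   # e.g. success / failure
    }))

log_event("analyst_01", "sales.intermediate_daily", "create", "success")
log_event("etl_service", "claims.raw", "read", "failure")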

 


Processes

Adopting big data technologies requires a shift in the organization’s way of functioning because it changes the operations of the business. The deployment should be manageable and timely, integrate into the broader enterprise architecture, and involve specialized personnel.

Processes to implement, customize, populate, and use the big data solutions are necessary. Methodologies to control the flow of data into and out of the big data environment should be defined. Furthermore, processes to provide feedback for improvements during big data infrastructure deployment and thereafter should be in place.

Insights in the big data environment

Data, analytics, and technology all have an impact on the consumption of insights for making informed business decisions. The tools and technology in place determine how users and applications will access, interpret, and consume the output of the analytics performed on the data. Having maximum visibility into the analytical output will enable effective decision making and optimize business outcomes.

Visualizing the output provides the ability to effectively consume aggregated and granular data. Factors such as connecting to multiple data processes and look-up engines, and interfacing with databases and warehouses to push output for downstream consumption should be considered. Furthermore, the ability to modify a reporting module and add new dimensions to it will enhance the output and decision making in a big data environment.

A Fortune 500 global company had an analytics solution developed for surveillance monitoring to identify anomalies in real time by processing streaming data. The solution also provided an interface for retroactive analysis on large historical datasets.

While planning for the consumption of insights in a big data environment, the constraints and opportunities of the underlying systems should be considered. On one hand, the infrastructure should be configured to provide quick (real-time) insights. On the other, provisions should be made for greater processing times for certain insights that are derived from large datasets.

Other important factors to consider while distributing the output are the frequency of consumption, the extent of personalization, access through APIs, message-pushing ability, and reusability.

Concluding note

Intensive, comprehensive, and persistent planning is imperative for deploying big data technologies in any organization. With the changing landscape of big data technologies, the strategy itself should evolve to accommodate such changes. The deployment strategy is more than a piece of paper. It is a mechanism to deploy big data technologies within the existing infrastructure for maximum impact without affecting the core business.


The effort invested at the initial planning stage might determine the success or failure of the deployment, and in many cases, of the organization itself. The nuts and bolts of every aspect of deployment should be worked out in fine-grained detail. Stakeholders from different departments should be involved and a collaborative environment should be created. A roadmap with the deployment phases should be drafted and kept handy.


The following should be the typical outcome of a detailed assessment that will help in formulating the big data deployment strategy:

  • Big data architecture diagrams
  • Data flow diagrams
  • Logical architecture diagrams with detailed explanation of the inherent layers
  • Technology mapping diagrams
  • Process flows of operational patterns
  • Reference architecture diagrams for the inherent scenarios
  • Deployment recommendations
  • Deployment phases
  • All the activities within the deployment phases
  • Human skill mapping

 

These artifacts of strategy and roadmap collectively form a construct for deploying big data technologies. Each one plays an important role in articulating, tracking, and controlling the deployment. They are must-haves for any organization seeking a successful deployment and, therefore, should be tailored carefully.


Finally, as deployment of big data technologies can be a complex task, organizations need to realistically assess what to execute in-house and where to take the help of external partners. Collaborating with partners can provide access to a proficient talent pool, improve utilization rates, reduce cost, and offer the much needed strategic direction for successful deployment.

About the Authors

Suraj Amonkar – Director (Big Data and Visualization) at Fractal Analytics

Vishal Rajpal – Director (Global Consulting) at Fractal Analytics

Acronyms

  • API – Application programming interface
  • ETL – Extract, transform, load
  • SAS – Statistical Analysis System
  • SQL – Structured query language
  • USA – United States of America
Transforming order to cash using advanced analytics

As per an industry report, for every $1B in revenue, working capital optimization can result in $20-60M annual benefit1

Working capital optimization is a sweet spot for CFOs, as it determines the financial health and operational success of the business. Organizations today are looking for end-to-end solutions around all the levers impacting working capital and are talking about overall processes rather than just payables, receivables, and inventory. The OTC (order to cash) process has replaced accounts receivable as the focus. Companies are looking for solutions to move beyond just cost savings to a more strategic impact, and an integrated system is the answer.

Abstract

Working capital is one of the most important components of the financial operation of a company. Organizations need to explore more efficient ways to manage the levers of working capital, i.e., accounts receivable, accounts payable, and inventory. To manage these levers, we need to look at the overall picture rather than just focusing on receivables and payables.

In this whitepaper, we are going to focus on order to cash (OTC), as it is one of the important strategic components that can be used not just to improve the organization’s profitability, but also to better manage working capital and customer service for future growth.

With growing business and customer expectations, organizations must go into expert mode and transform their order to cash process using the advanced technology and analytical solutions available.

This paper discusses a step-wise solution that can help organizations transform their accounts receivable process into a smart order to cash process. A detailed analysis of various subprocesses within the OTC process, and possible technological and AIML (artificial intelligence and machine learning) interventions, forms the base of the solution. Artificial intelligence (AI) engines can be built to scan through invoice details, customer e-mails, contracts, and other unstructured data and significantly reduce the manual effort while improving the execution time for collections, deductions, dispute resolutions, short payments, cash posting, etc. Many organizations have explored similar solutions, though only in parts. An end-to-end approach, starting from order processing to cash application and performance reporting, is the answer.

Reducing the cash conversion cycle can improve EBITDA margins and significantly increase profitability, by as much as 20% in some cases2.

Challenges in OTC

Today, across industries, we see the cash-to-cash cycle, from procurement to receipt of payments from customers, exceeding six months, which means companies have already made payments for the raw materials and services but haven’t sold the product yet.

To address this issue around working capital, we need to understand each component of working capital, how they interact with each other, and how advanced analytics can be used to enhance the output of each of these levers for an optimal working capital.

Most organizations have accounts receivable, accounts payable, and inventory working in silos. If we look within these processes, there are a lot of system dependencies, and not much importance is given to the interactions among the subprocesses. Figure 1 below explains the dynamics of AP, AR, and inventory in overall working capital optimization.

Figure 1 – Working capital optimization and roles of the key levers


An analysis by PwC in 2016 suggests that ~1.1 trillion euros are locked up in working capital, and this figure is increasing3.

We’ll discuss in detail how accounts receivable is moving towards OTC and what challenges we see across the value chain.

The biggest challenge that we see across the order to cash process is customer experience and a lack of visibility across the entire OTC process. OTC is one of the most important processes in the entire business cycle, as without this companies won’t make any money. However, system dependencies and the business silos of various departments involved make it highly manual and challenging. For most companies, a huge amount of transaction data available is not being used for any intelligence. Almost one-fourth of the effort goes into collecting payments from on-time payers, and an equal amount of time is spent in resolving disputes that could have been easily avoided.

Let’s have a look at the system dependencies:


Figure 2 – OTC interactions among the various departments

According to ordertocash.net, ~30% of accounting department costs go towards managing the OTC process. If not managed properly, OTC can cost 6-25% of revenue4.

The other pitfalls of the process include inconsistent data and documents, duplicates, inaccuracies, reactive measures, unclear deduction rules, and inefficient customer contacts.

One of the key problem areas is collections, where the account/customer prioritization for customer contacts, calls, dispute tickets, and resolutions is most often performed manually and, as a result, is suboptimal and expensive. All of this can be automated through RPA and cognitive automation.

A reduction of one day in DSO for a $10B company reduces WC by around $30M5.

The OTC solution: going beyond efficiency

The solution to the above problems is to go beyond the efficiency of the various subprocesses under OTC to their effectiveness: delivering better cost and cash outcomes while improving DSO (days sales outstanding) and customer satisfaction levels to build long-term relationships. DSO, APT (average payment term), and WAPT (weighted average payment term) are the most important metrics for measuring the health of OTC in any organization. An improved DSO reduces the overall collection cycle time and significantly improves working capital health.
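For readers who want the arithmetic, the sketch below shows the standard DSO calculation and why a one-day DSO reduction for a $10B-revenue company frees up roughly one day of sales in working capital (about $27M, in line with the ~$30M figure cited above). The receivables figure is purely illustrative.

# Worked DSO example; the receivables number is illustrative.
def dso(accounts_receivable: float, credit_sales: float, days: int = 365) -> float:
    """Days sales outstanding = (receivables / credit sales) * days in period."""
    return accounts_receivable / credit_sales * days

annual_revenue = 10e9      # $10B company, as in the example above
receivables = 1.2e9        # hypothetical outstanding receivables

print(f"DSO = {dso(receivables, annual_revenue):.1f} days")
# Working capital freed by a one-day DSO reduction is roughly one day of sales.
print(f"One day of sales = ${annual_revenue / 365 / 1e6:.0f}M")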

The key to the overall solution is to connect business silos through upstream and downstream interlinkages and use advanced analytical techniques to drive focused customer contacts, proactive collection strategies, and a smart auto cash posting system. Also, rather than working on discrete portions of the process or exploring a complete technology overhaul, a more effective solution is to integrate all the systems to a single platform to form input links at various process stages.

We are suggesting a step-wise approach to gradually move from the current state to a more cutting-edge state as an industry leader in OTC practices.

Below is a snapshot of multiple subprocesses and dependencies on various other departments:

  1. Order processing: The OTC process starts with sales order processing with high volume and complex orders. The order processing demands instant access to delivery docs and related customer and order details.
  2. Billing: Billing starts post product/service delivery to customers. It has its own issues, arising from inaccuracies, information breakage from the delivery point, and a lack of customer interaction information.
  3. Terms: Payment terms are not clearly defined and followed. Non-standard payment terms and variation across divisions, business segments, etc., make it difficult to follow and optimize.
  4. Collection: Collection is a process of contacting customers and getting the dues cleared. This requires solving a lot of customer queries, and the people involved should have smooth information flow from various departments involved. Typically, collections agents wait for invoices to go delinquent to start acting.
  5. Dispute resolutions: Disputes arise for multiple reasons, such as incorrect billing details, discounts, goods return, etc. Once an invoice falls into dispute, the cycle time to resolve it is long and delays the collection process. A systematic information flow from the order processing unit, and measures to reduce incorrections at each step, can help significantly reduce the number of disputed invoices.
  6. Cash application: This is the most manual process in the collection cycle: once payment is received from customers and the banks share the related remittance information, agents map this remittance data to the invoice details.
  7. Analytics: Here, relevant information is shared with the leadership team, and insights are provided for proactive actions. Measuring the process performance and identifying the gaps for improvement is necessary to move towards a smarter and effective order to cash system.

It is important to note that the root cause of overdues and collection issues is outside of the OTC process. So, it’s essential to understand the interdependencies.

Let’s have a look at the various subprocesses and the issues organizations face within these subprocesses, as well as what KPIs to measure to analyze the health of these processes. This helps us identify various possible technology and AIML interventions across the OTC value chain.


Figure 3 – Approach model: opportunities analysis across accounts receivable process

As mentioned above, OTC starts with order processing and, if not handled properly, it impacts all the downstream processes, affecting DSO, on-time collection, and reconciliation. Based on experience with various organizations across industries, it is seen that sales data, the customer master, price and promotion information, and point of delivery information are not integrated properly, which leads to billing/invoice inaccuracies and a long lag time to correct them. These inaccuracies also result in a high number of customer disputes and a negative impact on long-term relationships.

To transform accounts receivable to a smart order to cash process, we start by studying all the manual interactions and automatic data flow. This will help us identify the information leakages and process loopholes.

Developing a Risk Management Tool to understand the impact of the root causes and taking corrective actions is an effective way to start. Also, providing a web application for the delivery people and sales team that works in both online and offline mode to collect all customer interaction information is a must for this solution. Regular performance measurement and benchmarking of the implemented solution helps in identifying and filling in the gaps.

OTC – Solutions to start with for the OTC process transformation


OTC – Solutions to move to the advanced stage


Collection data

The develop phase, once taken care of, will help streamline all the downstream processes, from collection to dispute resolution, and finally, to cash reconciliation.

This stage specifically focuses on smart collection strategies and a high hit rate of auto cash posting using AIML techniques.

Smart Collection: Key drivers of customer behavior and invoice level risk profiling.
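A minimal sketch of invoice-level risk profiling is shown below: score each open invoice on its likelihood of paying late using historical payment behavior. The features, the tiny training set, and the choice of a logistic regression are illustrative assumptions; a production model would use far richer data and validation.

# Minimal invoice-risk sketch; features, data, and model choice are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Historical invoices: [amount_usd, customer_avg_days_late, prior_disputes]
X_train = np.array([
    [12_000,  2, 0],
    [85_000, 18, 2],
    [ 4_500,  0, 0],
    [60_000, 25, 1],
    [ 9_000,  5, 0],
    [40_000, 30, 3],
])
y_train = np.array([0, 1, 0, 1, 0, 1])   # 1 = paid late / went delinquent

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X_train, y_train)

# Score open invoices so collectors work the riskiest ones first.
open_invoices = np.array([
    [70_000, 22, 1],
    [ 5_000,  1, 0],
])
for inv, p in zip(open_invoices, model.predict_proba(open_invoices)[:, 1]):
    print(f"amount=${inv[0]:,.0f} -> late-payment risk {p:.0%}")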

Automated On-Invoice Cash Application Hit Rate: A combination of OCR, AI, and RPA for auto cash posting, where we have all invoice rules and deduction rules in one place.
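To illustrate the matching logic behind auto cash posting, here is a minimal sketch that pairs each remittance line with an open invoice by reference number, falling back to a customer-plus-amount match within a small deduction tolerance. The data, the 2% tolerance, and the rules are illustrative assumptions; production systems layer OCR, RPA, and far richer deduction rules on top.

# Minimal remittance-to-invoice matching sketch; data and tolerance are illustrative.
OPEN_INVOICES = [
    {"invoice": "INV-1001", "customer": "Acme", "amount": 12_500.00},
    {"invoice": "INV-1002", "customer": "Acme", "amount": 8_200.00},
    {"invoice": "INV-2001", "customer": "Globex", "amount": 4_000.00},
]

def match_remittance(line: dict, tolerance: float = 0.02):
    """Return the best matching open invoice for one remittance line, or None."""
    # 1) Exact match on the invoice reference quoted in the remittance advice.
    for inv in OPEN_INVOICES:
        if inv["invoice"] == line.get("reference"):
            return inv
    # 2) Fall back to customer + amount within tolerance (covers small short pays).
    for inv in OPEN_INVOICES:
        if (inv["customer"] == line["customer"]
                and abs(inv["amount"] - line["amount"]) <= tolerance * inv["amount"]):
            return inv
    return None  # no confident match: route to a human for investigation

remittance_lines = [
    {"customer": "Acme", "reference": "INV-1001", "amount": 12_500.00},
    {"customer": "Acme", "reference": None, "amount": 8_100.00},  # short pay
]
for line in remittance_lines:
    match = match_remittance(line)
    print(line["amount"], "->", match["invoice"] if match else "manual review")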

OTC – Solutions to become an industry leader


This helps automate and transform the process into a smart OTC process with smooth data flow at each touchpoint and advanced visualization solutions for alerts and insights for proactive decision making. A dynamic management dashboard helps to measure the health of the process and take data-backed actions in any pain area. This includes:
• Monitoring collection performance
• Identifying the most frequently occurring issues

Electronic data flow

To be a world-class leader in the OTC process, organizations must leverage the technological solutions available. Having a completely integrated system from customer master to final cash application is the key to this solution. There shouldn’t be any lag in the information flow, and RPA can help by eliminating all the manual efforts needed in fixing the billing errors, resolving customer disputes, and looking for supporting data for cash application.

Benefits of the overall solution

The overall solution can help organizations better manage their working capital and significantly improve DSO at a much lower operations cost. An overall OTC solution can enable:

  1. Working capital optimization: As OTC is the most significant lever of working capital management, managing it smoothly can help companies manage their working capital in the most efficient way.
  2. Reduced DSO: The overall solution significantly reduces the number of days-to-pay/days-sales-outstanding. With reduced manual interventions, targeted customer contact approaches, and quick access to information, a significant amount of payments can be collected on time, and the energy can be diverted towards risky invoices and customers.
  3. Improved service levels: Fewer disputes and better dispute resolution in the shortest possible time strengthens customer relationships. With customer behavior understanding and alerts on likely payment defaults, businesses can initiate strategic partner discussions in advance to reduce the risk of payment defaults.
  4. Improved cash flow: Reduced revenue leakage, lower write-offs, intelligent ways to connect with customers, and smart collection all lead to improving on-time payments and visibly reducing write-offs. AI can be used to identify and validate the short pays, deductions, and charge backs. With all information available at hand, companies can rework their payment terms and considerably lower the >90/120-day transactions.
  5. Visibility to cash position/working capital in real-time: A visibility into customer payment behavior and invoice risk profiling gives a clear look at future payment trends, and companies can manage their expenses and working capital requirement accordingly.
  6. Improved productivity and reduced operations cost: As order to cash is mostly a manual process and consumes the highest share of cost in the finance department, cognitive automation reduces the man-hours needed to complete any task. Also, with quick access to information, the lag/wait time is minimal, which improves the productivity of the people managing the OTC process. They can now focus more on analyzing and deep-diving into process performance and issues.
  7. Improved auto cash posting: With all the invoice details and deductions rules in one place, and pre-built algorithms to match short pays, auto cash applications can be improved up to >90%, which would significantly reduce the effort needed to reconcile the invoice details and remittance data.
  8. Alerts and actionable insights using role-based visualization techniques: These help teams be proactive and take necessary actions before the actual damage hits. The complexities around the subprocesses, customer disputes, short payments, lock boxes, payment terms, etc., can be well understood and managed through a single reporting system with automated alerts and narration, customized reports, and real-time insights within seconds.

The call to action

For multiple client engagements, these solutions have delivered historic and positive business impacts, reducing the payment cycle time and improving the overall process effectiveness. With everyone leveraging the current technology trends and advancements in machine learning and artificial intelligence, this is the time when organizations should move away from a typically manual and inefficient order to cash cycle. Now is the time to create a system that breaks silos and allows for the smooth movement of information across subprocesses. Technology is an enabler for the same, and with millions of rows of data, the available information can be easily transformed into intelligence to make a smart OTC process.

 

Author


SHIPRA SOODEN
Practice Lead – Financial Planning & Analytics

Shipra leads the FP&A practice at Fractal Analytics. She has over 12 years of diverse experience in building advanced analytical solutions around finance, pricing, forecasting, and working capital management. She has worked across industries such as CPG, retail, energy, OEMs, pharma, and telecom.

References

  1. https://magnitude.com/blog/working-capital-performance/
  2. https://www.thehackettgroup.com/working-capital-efficiency-1704/
  3. https://www.pwc.com/gx/en/services/advisory/deals/business-recovery-restructuring/working-capital-opportunity.html
  4. https://www.nchannel.com/blog/order-to-cash-best-practices-for-multichannel-retailers
  5. https://www.cimaglobal.com/Documents/Thought_leadership_docs/Management%20and%20financial%20accounting/using-analytics-to-reduce-dso.pdf
The evolving role of AI in effectively combating cybercrime

The vast proliferation of data, faster computing, digital detonation, and the need for continuous innovation to stay ahead of cybercriminals have put the cybersecurity industry at an interesting inflection point, needing a radical paradigm shift in the way we foresee cyber businesses operating in the near future. Cybercrime is increasingly becoming prominent on every boardroom’s agenda, as it is costing businesses across industries nearly $118bn annually, and the role of cybersecurity experts is undoubtedly gaining more relevance in today’s ever-evolving IT landscape. This paper discusses in detail how cybersecurity officers can respond more proactively and more effectively to cyber threats using AI and machine learning techniques.

Despite the sophistication of the tools, technology, and machine learning-driven solutions available, cybersecurity officers have barely succeeded in bringing down the time it takes to acknowledge an attack after it has happened, let alone stayed ahead of the curve to predict the next breach or attack well before the attackers strike the first blow.

As per the recently published Verizon Data Breach Investigations Report, more than 50 percent of data breaches go undetected for several months. Most traditional approaches tend to focus on aggregating data around malware, hacking attempts, identity thefts, data breaches, phishing campaigns, etc., translating these into threat signatures (the digital fingerprint of an attack) and then analyzing streams of historical and real-time data for similar patterns and behaviors. Not so surprisingly, owing to our adversaries’ inventiveness, cyber criminals have always been one step ahead in terms of constantly advancing and fine-tuning their attack strategy to circumvent existing systems and finding newer, innovative ways to threaten organizations.

Cybersecurity officers today are looking to employ a more advanced, intelligent and less human-intensive system to proactively monitor cybersecurity threats and mitigate them in order to reduce cost, prevent fraudulent activities from happening in the first place or even improve the efficacy of their current cybersecurity implementations. And when the rules of the game are changing at such an unprecedented pace, agility and the right attitude to let go of the old rules and learn new ones is no longer a matter of choice but rather a necessity to avoid the extinction event. Having said that, the critical questions that loom unanswered are:

  • How can cybersecurity officers break this endless loop of playing a catchup game with cyber criminals and have an advantage in the game? How can they tackle evolving fraud?
  • How can they properly channelize investments to handle the volume and complexity of today’s cyber-attacks?
  • How can they move beyond the current sub-optimal approaches of maintaining black-lists and adopt a more signature-free security approach?
  • How can they truly differentiate between genuine human activity and intentional misconduct, and minimize false-positive alerts?
  • How can they proactively detect out-of-normal behavior by analyzing real-time data streams from multiple network and infrastructure assets to uncover threats in real-time?
  • How can they automate interventions based on the severity/criticality/complexity of the threat event as well as the risk appetite of the organization?

Growing Beyond a “Reactive” Signature-Based Methodology

A cyber-attack, security breach, hacking attempt or security threat is not identified until after the event has occurred. Organizations today are looking for options to rapidly mitigate threats in order to avert ramifications associated with retrospective identification, rationalize spends and opportunity cost tied to the investment, and also implement a robust, scalable cybersecurity strategy which caters to their future needs.

Fraudulent behavior or misconduct in this context must be looked at through a different lens, a new perspective that most traditional approaches don’t cater to today. Most current implementations look at historical evidence of attacks and potential breaches from ‘known’ sets of events. Instead of just poring over individual areas of anomalous behavior, we should mathematically define ‘what is normal’, the reason being that fraud is ever-evolving.

By following this approach, we should be able to understand and digest the nuances of what is “Not Normal”. And by doing so, we have a higher likelihood of uncovering out-of-normal activity, improving overall detection, automating incident investigations, improving threat containment and implementing better threat aversion strategies.
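To make the “define normal, then flag what is not” idea concrete, here is a minimal anomaly-detection sketch. The activity features (data downloaded, files touched, off-hours logins), the synthetic baseline, and the use of an Isolation Forest are illustrative assumptions; real deployments would draw on far richer telemetry.

# Minimal "learn normal, flag not-normal" sketch; features and data are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Historical per-user-day activity: [GB downloaded, files accessed, off-hours logins]
normal_activity = np.column_stack([
    rng.normal(0.5, 0.2, 500),   # ~0.5 GB/day
    rng.normal(40, 10, 500),     # ~40 files/day
    rng.poisson(0.2, 500),       # rare off-hours logins
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_activity)

new_events = np.array([
    [0.6, 45, 0],    # looks normal
    [25.0, 900, 6],  # copious downloads, many files, repeated off-hours access
])
for event, flag in zip(new_events, detector.predict(new_events)):
    print(event, "->", "NOT NORMAL - investigate" if flag == -1 else "normal")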

An Innovative Approach to Combating Evolving Cyber Threats

Using the following three-phased approach, businesses can establish a System of Intelligence for end-to-end cyber threat prediction, detection, prevention and intervention in real-time, thereby improving the overall cyber threat remediation process.


Real-time threat tagging

Real-time threat assessment, evaluation and adjudication strategy

In this phase, historical cyberthreat activity and potential threat actor information will be used to determine what is ‘normal’ and what is ‘not normal’. When normal is understood, a model will be developed to identify all non-normal activity, flagging incoming real-time activity from IR (incident reporting) tools or logs/events from systems, applications, networks, security devices, and other sources. These events will then be bucketed into various ‘known’ threat categories (threat types defined based on historical incidents plus SME knowledge) and unknown/undefined events that could be newer forms of evolving threats (discovery of new trends and behavioral patterns).

In real time, the potential threat events that are flagged ‘at risk’ will be adjudicated through RPA (robotic process automation) systems based on the client’s risk appetite or the severity/criticality/complexity of the threat involved and its associated downstream effects. For example, if a vendor or contractor (an Edward Snowden, say) is downloading copious amounts of data and has done the same thing “X” number of times over a certain period, he will be referred for human investigation, versus receiving an email/text/pop-up message if it is just the first attempt. Similar anomalies in employee and/or vendor/contractor behavior can be brought to light ahead of time and adjudicated for early mitigation.
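A minimal sketch of that adjudication step follows: once an event is flagged “at risk”, it is routed to an automated nudge or a human investigator based on its risk score and how often the actor has repeated the behavior. The thresholds encode a purely hypothetical risk appetite.

# Minimal adjudication-routing sketch; thresholds are hypothetical.
def adjudicate(risk_score: float, prior_occurrences: int) -> str:
    """Pick an intervention channel for one flagged event."""
    if risk_score >= 0.9 or prior_occurrences >= 3:
        return "escalate to human investigation"
    if risk_score >= 0.6:
        return "automated alert to manager and account review"
    return "email/text/pop-up nudge to the user"

print(adjudicate(risk_score=0.95, prior_occurrences=5))  # repeated bulk downloads
print(adjudicate(risk_score=0.65, prior_occurrences=1))
print(adjudicate(risk_score=0.30, prior_occurrences=0))  # likely first-time mistake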

Predicting future threats

Analyzing attacker behavior to predict future threat actors and events

Build profiles of the top 5-10 percent of cyber criminals responsible for extreme threat events in the past by analyzing their longitudinal digital footprints over elongated periods of time, along with profiles of ‘known’ threat types and, finally, ideal profiles of users with genuine digital/web activity.

Based on these profiles, build a mathematical model to predict WHO is a potential threat actor and WHEN they will indulge in a potential cyber threat activity.

When these events are predicted, a robotic process automation (RPA) system can be implemented to contact the concerned party through the ideal outreach channel (email/text/pop-up window/phone call) based on the event’s risk score and the organization’s risk appetite, and/or to employ a human investigator to intervene with a corrective action.

Proactive mitigation

Real-time identification of future attack events and enabling automated interventions ‘ahead of time’ to mitigate risk or minimize losses

This phase combines the results from phases 1 and 2 with unstructured tertiary inputs from cybersecurity SMEs and/or external world events to make the system more intelligent and reduce false positives and negatives.

This phase will drive compliance and education (via training, behavioral coaching, etc.), decreasing the amount of non-compliance, avoiding mistakes from genuine users and/or averting actual intentional cyber threats. This phase will help detect and stop threats ahead of time, and shorten time to remediation when attacks occur (using next best action).

Conclusion

Cybercrime is growing exponentially, faster than most businesses can decipher and adapt to. Naysayers will perish, existing market incumbents will be toppled and early winners will rewrite the underlying business fundamentals, disrupting the marketplace in unimagined ways. Businesses across the globe should become attuned to this paradigm shift in the operating model. The situation may seem insurmountable unless businesses are equipped with the right tools, technologies and knowledge partners to help. “AI-driven” cybersecurity implementations hold the future for businesses preparing to flip the markets again.

It’s the tip of the iceberg, or just scratching the surface—call it what you may! There’s a new cybersecurity story waiting to be etched in history, and it’s being driven by AI and powered by analytics.

About Author

Rohit Kewlani

Principal Consultant, Fractal Analytics

Accelerating AI enterprise-wide to achieve a competitive edge

Artificial intelligence (AI) has limitless potential, and for most enterprises, we are only scratching the surface of the opportunity. There are many examples of enterprises embracing artificial intelligence to predict buying patterns, understand customer behavior, create personalization, help in genome research, optimize supply chains, conduct financial trading, or recommend movies. For these companies, artificial intelligence has already become a competitive differentiator.

However, there are many more companies who are trying to figure out the reality vs. the hype and how to either begin or accelerate their own journey. For these companies, it’s still a significant challenge if not a daunting undertaking to begin the AI journey. Many firms also have concerns over technology selection, cost, integration, privacy, security, and regulatory challenges.

These challenges are not unique to AI. They are many of the same challenges that have existed in the widespread effort to adopt, use, and benefit from analytics and machine learning (ML) initiatives. Therefore, given these challenges, how can you start the journey if you haven’t already done so?

The strategies discussed in this white paper are intended to provide guidance and suggestions for organizations and enterprises who are navigating this journey. However, these strategies can also be applied by leaders of business units, divisions, or other entities of larger organizations who wish to adopt or accelerate their analytics and AI initiatives.


1. Gain C-Suite Sponsorship

To ensure success in adopting or accelerating analytics or AI across your enterprise, it’s important to provide active sponsorship from all C-suite leaders. These leaders must make cultural adoption a priority to drive progress, and align your assets, investments, and plans to your corporate strategy.

Executive buy-in is key to success.

Executive sponsorship and buy-in is vital for success. The more engaged and bought-in the C-suite is for AI, the better the chance of success in implementing and adopting Analytics and Artificial Intelligence across the enterprise. Ensure that all senior executives engage, actively participate, and “buy-in” to the initiative. It is even more critical as technology, data, complexity, risk, and demand increase. According to McKinsey Global Institute, “strong executive leadership goes hand-in-hand with stronger AI adoption. Respondents from firms that have successfully deployed an AI technology at scale tended to rate C-suite support nearly twice as high as those from companies that had not adopted AI technology.”

It’s not just the job of a CDO, CIO, or CAO—all need to buy-in

It’s not just the job of a single function such as the CDO, CIO, and/or CAO, if your firm has adopted these roles. If there is no business leader or function that is chartered (and empowered) to spearhead your AI efforts, then you are at risk of falling behind your competitors. Therefore, as a critical first step, it is important to establish clear executive ownership for data and analytics (e.g., a CDO and/or CAO). Ensure that the functional leader and their teams are staffed at an appropriate level to drive the transformation strategy and change management needed throughout the enterprise in order to be successful. One-off, independent, or siloed efforts will not succeed and may impede progress due to competing strategies, investments, or performance objectives.

Ensure there is active and engaged sponsorship from all C-suite leaders

Align assets, investments, and plans to enable your corporate strategy

It is important that you are able to drive strategy, resources, and investment, and set the tone for the organization and cultural adoption. This includes active engagement and support for BI/AI strategy, assets (both IT and human), investments, and cultural adoption. By contrast, if budgets for AI projects, initiatives, and/or talent are dispersed throughout the enterprise, or not aligned to enterprise priorities, progress will be impeded. Therefore, once you establish commitment at the C-suite level and formalize your strategy, you should also determine how you wish to manage and control your budget and capital throughout the enterprise, particularly if your current landscape consists of competing internal (or external) analytics or AI efforts.

This is not to suggest analytics or AI budgets should not exist throughout the organization but if left unmanaged or not aligned to your enterprise strategy, then the end result may be sub-optimized use of budget, capital and resources, or worse, competing solutions and efforts. The key is to identify the budget and resources, and align them towards your desired outcomes.

Make cultural adoption a priority to remove barriers, obstacles, or blockers

Cultural adoption is perhaps the single biggest challenge and requires the most tops-down leadership to define, communicate, and reinforce the vision/strategy. Be sure to hold organizational leaders accountable for executing the changes required to drive the transformation. Executive leadership needs to remove barriers, obstacles, or even blockers if required to increase the chance for success.

Champion and communicate wins and progress.

Champion and communicate wins and progress to the broader organization. This will help reinforce the commitment from the top as well as garner support for the transformation. Reward and recognize champions to reinforce and support the behaviors and leadership you need for success. Conversely, a lack of communication and engagement from the top will slow or delay progress by reinforcing that “business as usual” is acceptable.

2. Coordinate Your Enterprise Strategy and Investments

Make sure you drive strategy and investments that align the C-suite, business, IT, data, and analytics functions or organizations. Success also requires a tops-down, coordinated strategy for IT investments to create or optimize big data solutions, data storage, apps, BI tools, and platform integration.

Have a strategy, plan, and roadmap that aligns business, IT, and analytics.

You need to have an integrated strategy, plan, and roadmap that aligns business, IT, and analytics. This could take the form of a strategic plan and three-year roadmap to align and drive investments in IT, processes, and talent. This roadmap should include your strategy (and timing) for investments in the entire “BI stack” including data storage and governance, big data technology, BI, analytics, and visualization tools and solutions. Without an integrated plan and roadmap, it will be difficult to yield an appropriate ROI on your IT investments.

Create a strong Business-IT partnership to promote progress.

There must be collaboration among all key stakeholders and functions to ensure enterprise benefits. The business-IT partnership is vital for success. It’s not purely IT’s role to implement big data, analytics, or AI projects, or to fix “data problems”. Business leaders must take ownership and responsibility to partner with IT and other functions to drive analytics and AI initiatives. Silos are the enemy of progress.

Big data is part of the “BI stack” and overall solution—not the end-state.

Investing in big data architecture is critical to success, particularly given the explosion in data and data sources like video, chat, Internet of Things, and sensor technology—largely “unstructured” data which complements most legacy “structured” data sources like CRM and ERP systems. However, big data solutions should be considered part of the strategy for the “BI stack” of technology from data ingestion, storage, discovery, modeling, analytics/ML, visualization, and mobility.

Big data can provide scalable, fast, and responsive BI and AI solutions to manage both structured and unstructured data. In order to maximize success of your investment in big data, it is equally important to determine what data you will store and make available through an enterprise data warehouse, data lake(s), and high performance memory-resident appliances in order to provide fast, responsive solutions for the business. As discussed later in this white paper, this also requires an effective enterprise data governance capability.

On top of this architecture, you also need to consider what tools are required to enable data exploration and visualization by the business and end-users. There are a number of compelling solutions currently available, and many more being introduced so it is important to strategically assess these solutions against your enterprise requirements and rationalize the tools you require for the given business need or use case. If you have too many BI tools, it can result in inefficiencies, confusion, competition, and a drag on productivity. Ultimately, you may choose several depending on your needs, but the key is not to ignore BI analytics and visualization tools as part of your overall BI architecture strategy and roadmap.

Create or scale a BI system that provides a platform for innovation, analytics, and AI.

Building an enterprise-wide business intelligence or business management system can create a robust big data platform for not only descriptive analytics and reporting, but an effective and fast way to implement predictive analytics solutions, ML, and AI at scale. Build once and share for the benefit of the enterprise is much more effective than building isolated solutions throughout the enterprise. A business management system works well when combined with effective data governance policies and practices governing what data is available in data warehouse(s)/ data lakes, for whom, and how it can be accessed and utilized. Such a platform can actually accelerate innovation and AI by allowing faster identification of end-user best practices or ideas, rapid prototyping, replication, and deployment of best practices, algorithms, and solutions.

An enterprise-wide BI or business management system can accelerate analytics or AI in the enterprise.

On top of this platform, you can readily apply analytics and visualization solutions to drive a scalable, fast, single-source-of-truth solution for the enterprise. It also accelerates analytical discovery, since the platform represents a large volume of trusted, normalized, and relevant data which can be easily accessed and analyzed by the business to determine drivers for success or reasons for under-performance. Ultimately, this will lead to improved ways of evaluating business performance, or the introduction of predictive measures and KPIs for success. Therefore, in the context of AI, such a platform enables and accelerates the ability to leverage ML and AI faster and more broadly across the enterprise. It is essentially a platform for analytics and AI innovation.

Establish the ability to do rapid prototyping with business and IT teams.

Utilize a prototyping capability or data lab to allow the business to perform rapid testing of new theories, algorithms, or models prior to production. Speed is essential, but so is scalability, security, and maintainability. Rapid prototyping can not only speed solutions to the business, but it can also raise confidence in the final product before it is released which accelerates adoption. You may also wish to create a cross-functional task force encompassing key business leaders with responsibility for chosen use case(s), data, analytics capability, IT, and IP partner(s) to collaborate on critical initiatives utilizing rapid prototyping and agile methodology to quickly move from idea to testing and action. This would also allow you to iterate quickly in enhancements to improve the quality of the product or solution before going into production.

Agile methodology can help accelerate the development of analytics and AI solutions

Agile methodology also needs to be embraced to develop analytics and AI solutions at both speed and scale. Traditional IT development projects take too long, are too rigid, and often do not meet the needs of the business, which do not remain static. You need to have a way to iterate the development process and deliver more frequent “wins” to the business. Otherwise, they will look for solutions elsewhere, which can result in proliferation of “shadow IT”.

Through agile methodology, the business is allowed direct and immediate access to all data within the data warehouse or data lake which precludes the need to acquire, replicate, or export the data into offline tools or systems (in effect creating “shadow IT”). This also allows the business users to shift efforts towards analytics insights and action while ensuring compliance to enterprise data security, privacy, regulatory, risk, and scale considerations.

Data integration, harmonization, and governance are critical to success.

Data integration and consolidation into data warehouses and data lakes are critical enablers for success. Collecting, storing, and providing data is the lifeblood of analytics and AI. Fragmented, “shadow” IT (data dispersed throughout the enterprise in various sources, which are often unsupported and unmanaged) is a significant drag on speed, productivity, and the ability to implement enterprise-wide AI solutions. Beyond data collection and integration, data governance is essential to make sense of the data, improve (and maintain) data quality, and manage the access, use, and distribution of the data to enable the enterprise AI strategy.

Be open to strategic partnerships for creative, breakthrough thinking and IP

There is significant investment flowing into AI and related fields. Don’t miss out on the opportunity to partner with leaders in innovation and IP. Strategic partnerships can provide access to leading edge IP and capabilities. However, scalability, security, and integration challenges can slow adoption. As more investment flows into AI, this will become increasingly important and will require the ability to speed integration through open source, API, or cloud integration tools or platforms.

3. Establish Data Governance and Management

An enterprise-wide data strategy and governance process is a critical enabler for successful analytics and AI implementation. It’s important to recognize how critical, yet challenging, it is to govern and manage data across the enterprise. An effective data governance strategy will help you understand where your data is, what is important, how you need to manage it, and how (and whom) you want to allow access to the data, and then how it will be used.

Data strategy, governance, and management is mandatory for success.

Given the proliferation in data, data sources, and increased end-user demand, along with more intuitive and pervasive “self-service” tools and solutions, the need to have an effective data governance program is becoming even more critical. Without data governance, all of this data will end up in a data warehouse or data lakes and become “data swamps”.

Said another way, proliferation of data and uncontrolled user access can provide full freedom for business users. However, for an enterprise, it can result in confusion, duplication, inefficiencies, and distrust. The appearance of moving quickly on analytics or AI projects will mask the fact that enterprises are absorbed in an internal battle over data access and use, and not allocating the critical assets (people, process, technology) to serve a broader purpose—the organization itself, its customers, and shareholders.

Establish or scale out data governance and data stewardship across the enterprise.

If there is a weak or nonexistent data governance process or function, then it is critical to create or endorse an organization chartered to address this challenge with appropriate support from the C-suite. The organization or function(s) should include resources committed to manage and improve data collection, accuracy, and use across all critical business functions. The data governance organization must also define and manage data policies, standards, definitions, and manage data quality in order to support an effective analytics and AI strategy while complying with regulatory or legal requirements, privacy, security and other considerations.

Break down data silos to gain access to the data that is most critical, and decide how you want to balance control vs. speed and flexibility in use of the data.

Be maniacal about managing data quality and invest in the tools and processes to maintain data quality.

Data quality directly impacts the accuracy of the analytics and AI models and their output, and the resulting business decisions, so it is critical to have a strategy, tools, and resources dedicated to ensuring data accuracy and availability in source systems and data warehouse(s)/lakes. If there are data quality issues, don’t underestimate how critical it is to have a strategy and capability to maintain ongoing data quality once you’ve corrected or cleaned up the data challenge(s).

Not all data is equal. Determine what data to tightly control, and what data you wish to make available for self-service, discovery and exploration.

Establish the rules, policies, and controls to govern critical data or KPIs, which must be tightly controlled and distributed or published vs. data you will allow for discovery, exploration, and ad-hoc analysis. This may also change over the life of the data, however, these rules must be strategically determined, managed, and controlled.

Not all data is the same, so it is important to determine what you wish to tightly control as a “single source of truth”, and to comply with privacy, security, risk, or regulatory demands. What data or KPIs are most critical to make decisions, who needs them, how are they delivered, and who “owns” and produces them are a few key questions that need to be answered. Ironically, although tight compliance and governance sounds restrictive, it can actually accelerate innovation and application of predictive analytics at scale. To carry the illustration a bit further, if you have defined the key measurements you need to successfully manage your enterprise (financial or operational), then how powerful is it if you have standardized, normalized, readily available data that you can use to identify trends, variances, business unit or individual performance (and underlying attributes)? It can be argued that your ability to apply analytics solutions increases significantly, including machine learning and ultimately AI solutions.

In parallel, you also need to understand what you will allow for discovery, experimentation, and analysis by end-users and the business, or where you will allow “multiple versions of the truth”. Given the increasing availability of user-friendly, intuitive analytics and visualization tools, how far will you go to enable “self-service” to create new predictive analytical models, new ways of evaluating the business, or new business process(es)? Who will you enable, with which datasets, KPIs, and use cases? These are important questions to consider. There is a balance to strike between being too rigid and being too flexible, which underscores the need for a strategy, organizational capability, process, and adherence to a well-defined data governance model to enable an effective enterprise-wide analytics or AI strategy:

  • Too much flexibility can result in different or competing versions of the truth, which can create debates, confusion, conflict, and a significant drag on productivity.
  • Too much control can result in rigid processes, bureaucracy, slowed or lack of response to the business, and the creation or proliferation of business-led IT solutions (“shadow IT”).

As you make these decisions, it’s important to have a governance process in place that allows you to implement and manage these decisions, including who gets access to what, how much, and what they can do with the data. There are a growing number of vendors and tools in the marketplace that can help enforce and support these decisions (also referred to as “Metadata Management”), including Informatica, Collibra, DATUM, and Global Data Excellence, to name a few.

Consider creating a Chief Data Officer (CDO) role and function if you haven’t done so already.

There is a lot of literature on how to organize data management functions, including the role of the CDO. However, the primary message is you need to have an enterprise-wide strategy and effective governance model managed by a function that has the charter and ability to effectively lead and manage the acquisition, quality, dissemination, and use of the data across the enterprise.

4. Solve Enterprise-Level Business Problems

Identify specific business problems that you can address. Ensure they are strategic and impactful for the broader benefit of the enterprise vs. department or mid-level business problems. Too often, critical or scarce analytics and data science talent is applied to solve minor challenges, automate reporting, or other mundane tasks. It’s vital to align the talented assets and investments to solve critical business problems or opportunities.

The business must lead in the identification and selection of use cases or projects for AI.

Preferably, this should be done by senior executives. Projects or initiatives should be aligned with or enable the corporate strategy. Ideally, the use case or initiative will have a direct and material impact in driving the P&L (cost reduction, top line growth, profitability), better serve customers through reduced cycle time, better quality products and services, and improved customer experience to highlight a few of the most obvious choices.

It must be business-led to ensure commitment and enhance adoption. Depending on where you are in your AI journey, it may be advisable to pursue a “crawl-walk-run” approach where you identify near-term opportunities that may not be as impactful as you may hope (or plan for), but will allow you to test and learn what you are capable of doing, and how to effectively deploy AI. This “roadmap” approach can be a powerful way to build a foundation and rapidly expand your initiatives, and results, as you learn and iteratively improve. I am not suggesting ignoring breakthrough ideas or capabilities, but would suggest focusing on near-term opportunities, building your AI “muscle” in addition to planning for breakthrough capabilities if you are still early in your journey towards adoption.

AI is well suited to optimize internal processes to enhance your operations, improve customer service, or deliver greater financial outcomes for your enterprise.

Ideally, it is a business process that is repeatable, where the business process may be done manually today but can be automated to the point that you can apply machine learning techniques and AI to refine, improve, learn, and strengthen the process. It’s also important to approach these problems initially like experiments where you can identify the critical success factors you want to influence or change and measure the impact of the change in process or hypothesis by implementing a new analytics/ML/AI solution, algorithm, or model. As you do so, you can readily demonstrate the impact of the change in process with agreed-upon measurements, facts, and data which demonstrate the ROI of the solution.

Tie use cases end-to-end, and do not get bogged down in functional silos.

Although you may start with a specific use case like sales forecasting or demand management for supply chain, it’s important to tie these use cases end-to-end, and do not get bogged down in functional silos. You may need to commence with functional solutions, but the key will be how to tie them “end-to-end” for greatest impact. For example, building an outstanding sales forecast model delivers a lot of value to the corporation, but what if you can tie it to your supply chain demand model?

To provide another illustration, marketing and sales must collaborate on the best use of data and insights to improve sales effectiveness, productivity, and results. Marketing data gleaned from customer browsing history, social media sentiment, or contact information can be shared with sales to provide recommendations and leads for follow-through. If the datasets or analytics solutions are not harmonized or integrated, then these will become disparate data points requiring the end user (sales person) to access multiple sources to seek the information, or they may not get the information at all. As a result, integrating these different datasets into a consolidated solution, recommendation, or set of actions for sales will have a much bigger impact than if they stood alone, or worse, competed for attention. For the sales end user, this means a loss in productivity as well as forcing them to decide what is the most appropriate or effective action to take.

Ensure the business problem or use case is strategic and understand its impact to the bottom line.

Don’t underestimate the importance of process engineering skills.

Most business problems that can be automated need to be defined in an “as-is”, “to-be” state where you can apply analytics, ML, and AI solutions to automate, learn, and improve. Experiment and test. Measure before and after. Tweak, modify, and learn. Process engineering skills are critical. Your organization must have the ability to assess current and proposed future state processes that require change.

For example, how do you perform sales forecasting or demand management today? Is it being done on spreadsheets using different formulas, gut instinct, or other means and then manually rolled up? If so, how can you automate the process, embed that into a tool that you can apply ML and AI to in order to learn, improve, and dramatically increase the outcome? At the root of this, you need to have the ability to do process engineering and ultimately, process redesign in addition to the analytics/AI work.
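As a toy example of what “automating the spreadsheet” can mean, the sketch below fits a retrainable model to invented monthly history and produces the next quarter’s forecast. The data, features and linear model are placeholders, not a recommended forecasting method.

# Toy illustration of moving a spreadsheet-based forecast into a retrainable model.
# Data, features, and the choice of model are illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

history = pd.DataFrame({
    "month_index": range(24),
    "promo_spend": np.random.rand(24) * 100,
    "units_sold":  200 + 5 * np.arange(24) + np.random.randn(24) * 10,
})

model = LinearRegression().fit(history[["month_index", "promo_spend"]], history["units_sold"])

# Forecast the next quarter; in practice the model would be retrained on a schedule
# and its output embedded in the planning tool rather than a spreadsheet.
future = pd.DataFrame({"month_index": [24, 25, 26], "promo_spend": [80, 90, 60]})
print(model.predict(future))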

Embed these new AI-driven solutions into decision-making tools and processes.

It’s critical to think about how you can embed these new AI-driven solutions and processes into decision-making or transactional systems. These will yield the greatest impact on the bottom line, customer satisfaction, and productivity. Think about your CRM or sales tools and how you would embed these “recommendations” into your point of quote, transaction, or customer interaction. This can be a powerful differentiator in how you serve customers and how you can improve your P&L.

For illustration, how impactful would it be if you can provide automated recommendations to optimize product configurations or add-on and upsell recommendations at the point of quote or order? Even better, what if these recommendations didn’t require human intervention? Ultimately, integrating AI into workplace processes, decision-making, or transactional tools is the key to long-term success. A redesigned end-to-end process with applied machine learning techniques will not only facilitate faster and better decision-making, enhance business performance and customer experience, but the cost to identify, and deliver on these decisions is dramatically reduced over historically manual (or nonexistent) methods.
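For illustration only, the sketch below generates add-on recommendations at the point of quote from historical co-purchase counts; the product names, data and simple frequency-count approach are hypothetical stand-ins for a production recommender.

# Minimal sketch of an automated add-on recommendation at the point of quote,
# based on historical co-purchase counts. Product names and data are invented.
from collections import Counter

past_orders = [
    {"laptop", "docking_station", "monitor"},
    {"laptop", "docking_station"},
    {"laptop", "warranty_plus"},
    {"server", "support_contract"},
]

def recommend_addons(quoted_items, orders, top_n=2):
    counts = Counter()
    for order in orders:
        if quoted_items & order:                 # order shares at least one quoted item
            counts.update(order - quoted_items)  # count items not already on the quote
    return [item for item, _ in counts.most_common(top_n)]

print(recommend_addons({"laptop"}, past_orders))  # e.g. ['docking_station', 'monitor']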

Personalized recommendations will accelerate end-user acceptance and adoption.

Generic algorithms and models are generally preferable to pure “gut instinct” (although some might argue otherwise). However, the more personalized the recommendation, the better. This also suggests that you should think about how your customers or end-users consume information and how you can deliver your analytics/AI solutions in a relevant, intuitive, and adaptive manner. The man-machine interface is becoming increasingly important as the amount of data, and of data-driven solutions, multiplies. It’s not sufficient to deliver large static reports with hundreds of rows of data. The key is how you convert the data into insights and actions for the individual end-user to act upon.

5. Build, Scale, and Partner for Talent and Access to Intellectual Property

To be successful, it’s important to develop, partner, or acquire the critical skills you need. It’s well documented that the demand for analytics and data science professionals exceeds the supply of talent, and the gap is increasing.

Ensure you have a strong recruitment process with leading universities and institutions developing analytics talent.

If you have an in-house analytics team, ensure you also have a strong recruitment process and relationships with universities to gain access to critically needed analytics talent. In a highly competitive field like analytics and data science, it goes without saying that you also need to have competitive salaries, benefits, and a well-defined, structured, (and enviable) career path that not only defines upward mobility, but encourages cross-training.

It’s impossible to accelerate creation or adoption of AI without the right talent or partnerships with industry leaders and innovators.

Create or leverage talent development programs to build expertise, skills, and scale.

Once you acquire the talent, it’s also important to retain the talent and expertise. This means investing in talent retention or development programs that allow you to quickly ramp new hires, provide advanced skills, and mitigate the impact of turnover. Given the high demand, and high attrition of talent in this space, a rigorous talent development program can help reduce turnover while accelerating the use of critical knowledge or skills to be successful – whether “hard” or “soft”.

Development programs should also bridge the gap between the “science” (e.g., predictive analytics, machine learning, deep learning) of analytics and data science and the “art”. Deep business or domain knowledge is needed to be most successful, or ensure the analyst is embedded in the business. Don’t ignore the importance of soft skills like verbal and written communications, including “storytelling”. Invest and develop skills in these areas.

Utilize a knowledge management platform and tools to share models and best practices.

This will help provide faster, proven solutions to end-users and customers. Invest or create knowledge management systems or platforms, and tools to share analytics models, automate data ingestion, etc. This will help build scale in your analytics organization, speed solutions to the business, and reduce “re-work” or duplication. More time can be spent on new techniques, or insights derived from the data.

Build the talent and skills needed in complementary functions to be successful.

Don’t overlook the importance of process engineers, IT infrastructure, developers, and project managers. All functions and skills are required in most enterprises to build robust, sustainable AI solutions. If these skills are lacking, it will be difficult to quickly deploy analytics solutions into production.

Partner with firms like Fractal Analytics to gain access to critical expertise, skills, and IP.

This will help you extend or complement your in-house capabilities to move faster to meet internal demand. It will help you obtain access to the talent and proven knowledge, skills, and IP solutions necessary to understand the business problem, build the analytics/ML solutions needed to address the business problem, and access leading-edge IP and knowledge from their deep bench of experts and investments in AI. They also have the ability to provide best practices or solutions from other industries or use cases that may accelerate your own efforts. Given the rapid increase in investments in artificial intelligence, partnering with industry thought leaders can also help keep you current on new and emerging technology and IP that can further accelerate your AI initiatives.

6. Organize for Success

Choosing the right organizational model is also an important factor in accelerating adoption of analytics, machine learning, or AI solutions. There are benefits and drawbacks to choosing a fragmented analytics team vs. one that is centralized in the organization. Consider using a hybrid model to combine the best of both worlds.

Highly dispersed or fragmented analytics talent can create challenges in driving enterprise AI

If your talent is highly fragmented or staffed in department and/or mid-level functions, it will be extremely difficult to build enterprise-wide solutions. Analytics or AI solutions developed in a fragmented manner may be interesting, “sexy”, or possibly breakthrough in some respects, but do they solve the most critical business problems or challenges? Are they aligned, or are there competing solutions being developed? Is the work effort strategic or tactical, and are you leveraging critical and scarce talent in the most effective manner to achieve your goals? De-centralized analytics organizations may result in some of the following challenges:

  • Excessive focus on departmental or tactical objectives rather than strategic ones.
  • Internal competition for funding, tools, talent, and control.
  • Proliferation of BI/Analytics tools or solutions.
  • Limited or inability to apply predictive analytics at scale.
  • Difficult or impossible to move or develop talent across the organization.

On the other hand, highly centralized analytics organizations run the risk of being irrelevant.

By contrast, highly centralized analytics organizations run the risk of being irrelevant or lacking the business knowledge to be successful. Centralized analytics teams can provide scale and the ability to quickly understand, share, and leverage best practices, tools, and methodology, which are constantly and rapidly evolving. Characteristics of an overly centralized analytics organization may include:

  • Too rigid or slow in delivering analytics/AI solutions to the business.
  • Disproportionate focus on the science of data and analytics vs. the business impact or outcome.
  • Lack of business knowledge resulting in ineffective or irrelevant solutions.
  • May result in increased growth or proliferation of de-centralized analytics talent.

A well-defined and effective organizational model can accelerate the adoption of analytics and AI solutions in the enterprise.

Consider a hybrid model to combine the best of both worlds.

Therefore, whether you build or partner to acquire your analytics expertise, you must carefully consider how to organize your talent in a way that allows for both expertise and scale in the tools, processes, and techniques of analytics and data science (the “science”), yet also provides for close alignment and understanding of the business and the business problems that need to be solved (the “art”).

  • Analysts or data scientists must be close to, if not embedded in, the businesses they support in order to better understand the business challenges or processes that need to be improved. Don’t “throw the challenge over the fence and hope for the best”.
  • A model that provides the best of both worlds is the “hybrid” or “hub-and-spoke” model. There are different ways to implement these hybrid models and determine where the resources are placed, and who manages them. Overall, this hybrid approach has many advantages over the decentralized or centralized functional models.

No matter which model you choose, it’s ideal to have analytics talent remain close to the business. The more they learn and know about the business, the more effective the solutions will be. Conversely, the more remote (physically, intellectually, organizationally) your analytics talent is from the business, the less successful you will be.

7. Create and Strengthen a Culture of Collaboration and Experimentation

It’s also important to build a culture to collaborate, experiment, and innovate. Build a roadmap that delivers early and frequent “wins”, and communicate the progress and wins throughout the enterprise to inform, “de-mystify”, and rally support for your AI initiatives.

Remember that AI is a journey.

It’s a journey. Be willing to fail, learn quickly, adapt, and test again. As you do so, the organization will collectively learn how to better utilize analytics and AI to solve more complex questions or problems, and how to be more proactive in applying the solutions and outputs.

Enable collaboration and teamwork

Provide incentives, KPIs, or metrics to encourage coordination across functions like business, operations, IT, and analytics. Engage the right subject matter experts. AI is neither a problem nor a solution that sits solely in the realm of data scientists or IT. You need operational, business, and technical expertise to ensure that the analytic outputs solve the business problem or challenge. It requires effective collaboration by business, operations, analytics, and IT experts to solve complex challenges and to design, deploy, optimize, and maintain solutions leveraging ML and AI.

Don’t underestimate the challenge or importance of cultural acceptance and adoption.

Identify internal champions, thought leaders, and change agents to help drive cultural awareness and adoption.

Identify and build champions or change agents who can help drive the cultural awareness and adoption needed to gain traction. Ensure they are recognized as key leaders and operate at a level or in a function that is strategic enough to make an impact. They are instrumental in achieving and communicating early wins and gaining buy-in from their peers, co-workers, and team members. Champions can also provide honest and frank feedback which can help improve the solutions you deliver which can ultimately accelerate broader adoption

Build your roadmap to yield early wins and successes to increase confidence and momentum.

Build your projects or roadmap in a manner to yield early wins or successes. This will enhance confidence and demonstrate the value or impact of the investments in big data, analytics, and AI. If you are successful in doing so, you will gain further buy-in from the sponsoring business or executives to invest, move faster, and more broadly if they understand the value and impact to the corporation.

Communicate early wins and successes to provide encouragement and support for the journey.

Communicate the progress throughout the enterprise to garner broader support and momentum for further investments, collaboration, and buy-in. Communications should start with the CEO in order to instill the importance throughout the enterprise. Change management is paramount to success in implementing AI in the enterprise in order to offset resistance, fear, uncertainty, or questions over the ROI of AI, and potential impact to the organization.

Be aware of obstacles, roadblocks or even “blockers”, and take appropriate action.

Internal competition for talent, data, control, or even “storytelling” works against the greater good. The more internal silos and competition you have, the slower you will be in adopting analytics or AI and gaining its benefits for the enterprise.

Determine an objective way to recognize the impact of the investments being made and impact to the bottom line.

Speaking of ROI, determine a way to equitably measure the impact of analytics or AI vs. the business decision or action. The impact of analytics (or AI) may be “watered down” if the value of the work is hidden, understated, or misunderstood. Alternatively, proclaiming excessive impact due to the analytics output vs. the business action can result in distrust.

The Next AI Conversation

In closing, there are many critical elements required to adopt or accelerate analytics and AI solutions in an enterprise. In an ideal state, these strategic elements work in an orchestrated fashion to enhance the chance for success. This doesn’t suggest you cannot be successful if all of these elements are not in place, or mature throughout the organization. However, senior executives and leaders who utilize these strategies will be more successful in implementing or accelerating their analytics and AI initiatives. Ultimately, enterprises which make AI a strategic priority or imperative will create a competitive advantage in the marketplace.

The Next Step

Contact us at Fractal Analytics to have a conversation to see how we can help you in your journey.

About Author


Doug Hillary

Board Advisor at Fractal Analytics and Former Senior Vice President, Performance Analytics Group at Dell Technologies

Doug provides advisory services to help advance Fractal Analytics’ capabilities, services, and offerings to empower enterprise clients.

Doug held various leadership roles at Dell for more than 19 years. In his most recent role, he was responsible for providing global data, reporting, and analytics services to support Dell’s sales, marketing, finance, services, e-commerce, and operations business units. He also partnered with IT to launch Dell’s first enterprise-wide big data business intelligence solution to create a platform that significantly improved enterprise-level descriptive analytics while enabling and accelerating predictive analytics at global scale.

In his prior role at Dell, Doug was the Vice President of Dell Services, where he was responsible for delivering data center and desktop managed services and outsourcing engagements for Dell customers worldwide. His responsibilities included ownership of a $1.5B P&L, solution design and delivery, and global leadership of over 6,000 professionals. Prior to that, he held several senior executive roles in Dell Services where he led growth and scaling of Dell’s enterprise and global service capabilities to serve customers in all segments and regions. Throughout his tenure at Dell, Doug was also a champion and leader for diversity, STEM education for girls, and an advocate for women in technology leadership.

Doug leverages his passion, knowledge, and experience to help Fractal Analytics and clients accelerate the use, adoption, and value creation with data, analytics, and AI in the enterprise.

Preventing unnecessary ER visits to reduce health care costs

Abstract

The sole purpose of the Emergency Room (ER) is to save lives by providing immediate attention to people with life threatening situations. With 24×7 access, a broad array of services, and the latest technology at hand, ER teams are well equipped and trained in treating medical urgencies, stabilizing patient conditions, and preventing further damages.

Today, many ERs are overcrowdedi as:

  • Unlike other treatment facilities, ERs have a federal mandate to provide care to any patient requesting treatment
  • Primary care physicians (PCPs) are in short supply
  • Poor patient knowledge and socio-economic conditions drive more patients to seek medical care in ERs

An increasing abuse of ERs, either due to patient ignorance or convenience, demands urgent attention from both payers and providers. There is a lot of documented evidence where patients have used ERs for situations that could have been treated in a more cost-effective care setting such as Urgent Care Clinics (UCCs) or Patient-Centered Medical Homes (PCMHs). A study from Project HOPEii estimated that 13% to 27% of all emergency department visits in the US could be managed in alternative sites, with a potential cost savings of approximately $4.4 billion annually.

This paper presents a comprehensive end-to-end solution to reduce ER utilization for non-emergent conditions. The proposed data-driven solution leverages predictive analytics to develop a framework for identifying members likely to use the ER for avoidable reasons in the near future, and it recommends designing specific interventions to prevent future visits. In the process, we also build a case for leveraging analytics in an agile way to rapidly derive maximum value.

iBarish RA, Mcgauly PL, Arnold TC. Emergency Room Crowding: A Marker of Hospital Health. Transactions of the American Clinical and Climatological Association. 2012;123:304-311.

iiCopyrighted and published by Project HOPE/Health Affairs as Weinick RM, Burns RM, Mehrotra A. Many emergency department visits could be managed at urgent care centers and retail clinics. Health Aff (Millwood). 2010;29(9):1630-1636. The published article is archived and available online at www.healthaffairs.org.

The ER health care landscape

In 2016, America spent more than $3.3 trillion on health care, or approximately $10,348 per person. This was a 4.3% increase from the previous year, contributing 17.9%1 to overall US GDP. More importantly, health care expenses grew 1.5% faster than GDP. This faster growth in total spending was partly driven by strong growth in spending on private health insurance, hospital care, and physician and clinical services, an aging population, and the expansion of coverage through the Affordable Care Act (ACA).

While health care experts and economists are still debating the exact long-term impact of the probable causes, in a recent study published in JAMA,2 the Obama administration claimed that since the ACA became law, the uninsured rate has declined by 43%, from 16% in 2010 to 9.1% in 2015, with most of that decline occurring after the law’s main coverage provisions took effect in 2014 (see Figure 1 for details). Further, the Department of Health and Human Services (HHS) estimated that 20 million more people had health insurance in early 2015 because of the law.


However, having more insured people under the health care safety net without improving the supporting infrastructure is expected to put significant pressure on the entire delivery system. A Centers for Disease Control and Prevention (CDC) report3,4,5 estimated that in 2011 there were over 136 million emergency room (ER) visits, for an average of 44.5 visits per 100 persons. Now, with an additional 20 million members getting coverage in 2015 and many more expected to have it in the following years, there will be an increased spotlight on ER utilization.

A finding from the National Hospital Ambulatory Medical Care Survey (NHAMCS) cites that nationally, 39.5% of ER visits among the general population are primary care sensitive in nature and therefore preventable.6 A Truven study7,8 estimates that only 29% of ER visits required emergency attention, with a rough cost estimate of $1,233 per ER visit,9,10 wasting billions of dollars in health care costs. Another study projected approximately $4.4 billion in annual savings9,11 if non-urgent visits are better managed in alternate care settings.

Ideally, ER usage should drop when there is an efficient health care system with better access to care and affordable costs, as expected by ACA provisions. However, ERs are not a substitute for primary care relationships, nor can they address the broader socio-economic factors driving health care costs.11,12

The scope of the ER problem

The scope of this paper is to dig deep and answer three broad questions:

  • Which members are likely to make avoidable ER visits in the near future?
  • Why are these members more likely to make an avoidable ER visit than others?
  • How can we prevent such visits in the future?

Three broad questions

 

Literature review

Academia and organizations have been studying parts of this equation for years. We started with a systematic review of the existing literature to leverage present-day knowledge to understand:

  1. ER utilization trends: What has been the historic trend and what do experts say about future use?
  2. Emergent vs. non-emergent ER visits: What factors drive non-emergent visits?iii
  3. Interventions: What can be done to reduce non-emergent ER visits?
  4. Efficacy: What works vs. what does not with respect to environment (type of payer) and care delivery settings?

Components of ER Over-utilization

FIGURE 3. Components of ER Over-Utilization Problem

iiiCases where immediate care is not required within 12 hours (e.g., sore throat).

Issues and challenges in existing solutions

  • Complexity of the problem
  • Limited scope
  • Limited data
  • High cost of using analytical solutions
  • No early cost benefit analysis
  • Analytics in isolation without synergies in interventions
  • Solutions specific to population cohorts under study and cannot be generalized
  • No attempt to answer “So what?”

We found that due to the complexity of the problem, the inherent restrictions on sharing patient data, and the vast variety of health care delivery settings (Medicare, Medicaid, Employer Sponsored, Commercial, Individual, etc.), most of the studies answered only some of the questions that we set out to answer. The limited evidence from the academic studies did suggest that age, ease of access to ER compared with other care alternatives, perceived severity, and socio-economic settings all play a role in decisions to seek care in the ER for non-urgent problems.

The usual research studies focused on identifying:

  • How the visits should be classified: emergent and non-emergent? Or…
  • What are the factors driving non-emergent usage? Or…
  • What intervention may work for a select population through statistical analysis, review of patient charts, or survey techniques?

We observed that there was no systematic end-to-end approach that addressed all parts of the problem holistically. Further, most of the studies were not focused on demonstrating ROI from such initiatives which may, in part, be attributed to the nature of their funding itself: academic or through grants from non-profit organizations.

The analytics driven ER utilization solution

Conceptual framework

In a study funded by the California Healthcare Foundation to understand factors influencing an individual’s decision to visit an Emergency Department (ED) for a non-urgent condition, authors Lori Uscher-Pines et al13 proposed a conceptual frameworkiv to show how a patient arrives at a decision to seek care in an ER by consciously or unconsciously weighing several considerations. The decision to go to an ER is influenced by an array of causal pathway factors and associated factors. See Section I of Figure 4 for details.

The associated demographic, socio-economic, and lifestyle factors (Section I of Figure 4) can be determined through qualitative and quantitative techniques. We will limit our study to patients who chose the “Go to ER” path and algorithmically determine the associated factors. Then, we will extend the framework (Section II of Figure 4) to identify which visits, retrospectively, were avoidable and what was the dollar impact of it. Finally, we will show how advanced AI/ML techniques will help to prospectively identify members likely to make an avoidable ER visit in the near future.

Conceptual Framework

FIGURE 4. Conceptual Framework

ivConceptual Model of Non-Urgent ED Use. Deciding to Visit the Emergency Department for Non-Urgent Conditions: A Systematic Review of the Literature. Am J Manag Care. 19(1):47-59.

Uscher-Pines L, Pines J, Kellermann A, Gillen E, Mehrotra A. Deciding to Visit the Emergency Department for Non-Urgent Conditions: A Systematic Review of the Literature. The American journal of managed care. 2013;19(1):47-59.

Methodology

Algorithm and techniques to classify ER visits

The foremost problem is to classify ER visits as emergent (unavoidable) or non-emergent (avoidable). The gold standard is to have a panel of clinical experts review patient charts for the selected population and then classify each visit accordingly. However, this process is very resource intensive and not feasible when quick results are needed. There can be multiple alternative approaches to classify ER visits as avoidable or unavoidable:

Option 1: Developing an independent algorithm

Here, a statistically large sample of claims is selected, and the frequency of primary diagnosis codes is analyzed in a “regular setting” vs. an “ER setting.” Diagnosis codes that were more often treated in a “regular setting” and were also present in an “ER setting” are flagged. A threshold value is chosen to further trim down the selection, and then claims with flagged diagnosis codes are considered as “avoidable.”
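A minimal sketch of this frequency comparison is shown below. The column names, toy diagnosis codes and the 80% “regular setting” threshold are assumptions used only to illustrate the mechanics of Option 1, not a validated rule.

# Sketch of the Option 1 frequency comparison: flag diagnosis codes treated mostly in
# regular care settings that also appear in ER claims. Data and threshold are toy values.
import pandas as pd

claims = pd.DataFrame({
    "primary_dx": ["J02.9"] * 5 + ["I21.9"] * 2,
    "setting":    ["regular"] * 4 + ["ER"] + ["ER"] * 2,
})

share_regular = (
    claims.assign(is_regular=claims["setting"].eq("regular"))
          .groupby("primary_dx")["is_regular"].mean()
)

# Codes treated in a regular setting most of the time are flagged as avoidable when
# they show up in the ER (the threshold would be tuned with clinical input).
avoidable_codes = set(share_regular[share_regular >= 0.8].index)
er_claims = claims[claims["setting"] == "ER"]
print(er_claims.assign(avoidable=er_claims["primary_dx"].isin(avoidable_codes)))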

Option 2: Leveraging a publicly available algorithm

The NYU ED algorithmv is widely used to identify diagnosis codes which are avoidable (with greater than 90% probability). There are other variants of the NYU algorithm: The Billings/Ballard algorithm14 and, more recently, the Minnesota algorithm.

Option 3: Leveraging in-house learning to create a hybrid algorithm

Leverage an in-house clinical research team’s prior experience to identify avoidable ER visits for a sample population.

We used “Option 2—The NYU ED Algorithm” to identify avoidable ER visits.

NYU Algorithm

FIGURE 5. NYU Algorithm

vNYU ED Algorithm, NYU Wagner, New York University, wagner.nyu.edu/
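The sketch below illustrates how an NYU-style lookup could be applied to claims: each diagnosis code carries category probabilities, and visits whose combined “avoidable” probability exceeds 90% are flagged. The probability values and column names are placeholders, not the published NYU ED algorithm tables.

# Illustrative application of an NYU-style lookup; values and columns are placeholders.
import pandas as pd

nyu_lookup = pd.DataFrame({
    "primary_dx":     ["J02.9", "I21.9"],
    "p_non_emergent": [0.70,    0.00],
    "p_pc_treatable": [0.25,    0.02],   # emergent but primary-care treatable
})

er_visits = pd.DataFrame({"member_id": [101, 102], "primary_dx": ["J02.9", "I21.9"]})

scored = er_visits.merge(nyu_lookup, on="primary_dx", how="left")
scored["p_avoidable"] = scored["p_non_emergent"] + scored["p_pc_treatable"]
scored["avoidable"] = scored["p_avoidable"] > 0.9   # the >90% probability cutoff
print(scored[["member_id", "primary_dx", "p_avoidable", "avoidable"]])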

Identifying the analysis population and expected ROI

For our pilot study, we selected members with a specific chronic disease and then segmented the entire population into homogenous groups across several key dimensions such as plan (HMO, Non-HMO), health conditions, geography, age groups, etc. We also checked two key factors for the selected population:
a) Is there enough opportunity (avoidable ER visits) to begin with?
b) Is support available to drive intervention programs and traverse the last mile?

Highly sophisticated algorithms built on rich, high-quality data are certainly expected to deliver the best outcomes. However, in the real world there is a need to balance research and implementation costs against expected benefits and ROI.

Figure 6 below shows an illustrative approach to do a quick ROI estimation before moving ahead.

FIGURE 6. ROI Estimation Framework
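In the spirit of Figure 6, a back-of-the-envelope ROI calculation might look like the following. Apart from the per-visit cost cited earlier in this paper, all inputs are hypothetical placeholders to be replaced with the client’s own figures.

# Quick, illustrative ROI estimate; every input except cost_per_er_visit is invented.
members           = 100_000
avoidable_er_rate = 0.05      # share of members with an avoidable ER visit per year
cost_per_er_visit = 1_233     # rough per-visit cost cited earlier in this paper
diversion_rate    = 0.20      # share of predicted visits the interventions prevent
program_cost      = 500_000   # modeling plus intervention spend

gross_savings = members * avoidable_er_rate * diversion_rate * cost_per_er_visit
roi = (gross_savings - program_cost) / program_cost
print(f"gross savings ~ ${gross_savings:,.0f}, ROI ~ {roi:.1f}x")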

Leveraging predictive analytics in an agile way

  • What proportion of ER visits are avoidable?
  • Is intervention program support readily available?
  • What are the potential benefits?
  • What is the expected cost?

Traditionally, predictive modelling exercises follow a waterfall approach. The requirements must be frozen and all stakeholders aligned before moving to design, development, and validation phases. This methodology lacks the necessary flexibility to rapidly react to evolving business needs. Further, the cost of failure is relatively high, as benefits cannot be established until the model results are field validated.

We recommend developing predictive solutions in an agile way: start with small data (e.g., claims) and simpler techniques to establish initial value. After each sprint, reassess the benefits to validate the necessity of the next sprint. Once the incremental gains are established, explore options to add either new data sources and/or complex predictive algorithms to further maximize the returns. See Figure 7 for more details.

Model Performance

Using the agile approach shown above, we used traditional analytical steps to develop our models, starting from logistic regression and claims data to artificial intelligence (AI)/machine learning (ML) techniques with non-traditional data sources, e.g., augmenting traditional claims data with lifestyle and behavior information, bringing in zip-level socio-economic information, adding macroeconomic indicators, temperature, pollution information, etc. Refer to Figure 8 for more details.
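The sketch below mimics this sprint-wise progression on synthetic data: sprint 1 fits a logistic model on claims-only features, sprint 2 adds external features and a richer learner, and the lift between sprints informs whether to continue. Feature counts, models and data are illustrative assumptions, not the models built for the client.

# Sprint-wise model development on synthetic data; models and features are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
claims_X = rng.random((5000, 6))       # claims-derived features
external_X = rng.random((5000, 4))     # zip-level socio-economic / lifestyle features
y = rng.binomial(1, 0.1, 5000)         # avoidable ER visit in the next period

Xc_tr, Xc_te, Xe_tr, Xe_te, y_tr, y_te = train_test_split(
    claims_X, external_X, y, test_size=0.3, random_state=0)

# Sprint 1: claims only, simple logistic model.
auc1 = roc_auc_score(y_te, LogisticRegression(max_iter=1000)
                     .fit(Xc_tr, y_tr).predict_proba(Xc_te)[:, 1])

# Sprint 2: add external data and a richer learner; keep it only if the lift is material.
X2_tr, X2_te = np.hstack([Xc_tr, Xe_tr]), np.hstack([Xc_te, Xe_te])
auc2 = roc_auc_score(y_te, GradientBoostingClassifier()
                     .fit(X2_tr, y_tr).predict_proba(X2_te)[:, 1])

print(f"sprint 1 AUC={auc1:.3f}, sprint 2 AUC={auc2:.3f}")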

Our final solution consisted of an ensemble of machine learning and logistic models. The solution was able to capture 50% of all avoidable-ER members within the top two deciles.
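A decile-capture check of this kind can be computed as in the sketch below, using synthetic scores and outcomes; in practice the model’s out-of-sample predictions and the observed avoidable-ER flags would be used.

# Decile capture: sort members by predicted risk into deciles and compute the
# cumulative share of avoidable-ER members captured. Data here is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
scored = pd.DataFrame({
    "score": rng.random(10_000),                     # model-predicted risk
    "avoidable_er": rng.binomial(1, 0.08, 10_000),   # observed outcome
})

scored["decile"] = pd.qcut(scored["score"], 10, labels=False)  # 9 = highest-risk decile
capture = (
    scored.groupby("decile")["avoidable_er"].sum()
          .sort_index(ascending=False)               # start from the top decile
          .cumsum() / scored["avoidable_er"].sum()
)
print(capture.head(2))   # cumulative capture in the top one and two deciles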

Below are some selected insights/validated hypotheses for members likely to make an avoidable ER visit:

  1. Comorbid conditions, such as Congestive Heart Failure (CHF) or Chronic Obstructive Pulmonary Disease (COPD)
  2. History of frequent ER use
  3. Behavioral conditions such as drug or substance abuse, alcohol dependency, etc.
  4. Depression and bipolar disorders
  5. Membership in certain ethnic groups
  6. Lower education levels
  7. Obesity, among other factors

FIGURE 8. Predictive modelling

The solution implementation

Question:

What can you do about the in-house information that your care management/disease management teams already have?

A bigger question is:

What can you do about the information that you don’t have?

This phase involved identifying the right set of interventions that would benefit the member population identified from our predictive model.

Identifying the right set of interventions

Once the top “x” decile members had been selected, we wanted to further segment this population based on its propensity to respond to a specific intervention, to maximize the return on intervention. However, due to the challenges outlined below, such information was not readily available:

  • Past interventions were delivered only in a specific care setting
  • Past interventions were targeted at a specific geographic region or a certain ethnic group
  • Broad disease-specific interventions were force-fitted to the selected population segment
  • No longitudinal study tracks the outcomes of historic care management (CM) or disease management (DM) programs linked with predictive models – e.g., the impact of a diabetes management program on members with high risk scores from a predictive model
  • Limited employer data exists on workforce performance – e.g., how a certain wellness program affected member absenteeism, productivity, etc.

In the real world, there are additional practical limitations, such as cost to execute, limited resources in the care management team, limited time to execute, and the need for rapid realization of benefits from the experiments.

Analytics can help organizations optimize the entire value chain of experiments: planning, design, and execution. We need to leverage analytic techniques to fully mine the information that traditional care management/disease management programs have collected so far and, where that information does not exist, design experiments to gather it.

Approach

In our case, we divided interventions into two broad categories (Figure 9):

  1. Known interventions linked with member profiles identified through profiling
  2. Experiential interventions to test and learn new programs through secondary research


FIGURE 9. Designing Prioritized Interventions

We used the following steps to identify and design the right set of interventions:

Step 1: Identify key themes from predictive model results


We identified key themes by reviewing patient profiles from our predictive models. We then reviewed them against the existing medical literature and selected interventions that were most relevant to our study population.
See Table 1 for details.

Step 2: Design collaborative interventions for experiential learning

Here, our objective was to identify which pilot interventions would help most in getting the right data for future interventions. Sample experiential interventions linked with emerging themes identified in step 1 are listed below:

  • Smoking cessation program for members in a select Accountable Care Organization (ACO), Patient Centered Medical Home (PCMH), Skilled Nursing facility (SNF), or Long-Term Acute Care Facility (LTAC)
  • Pharmacist-led education for early parents or members residing in a rural area
  • Providing free nebulizers to the asthma population – this will help in seeing whether increased medication adherence results in an increased response to the DM program

Step 3: Prioritizing interventions

Once we had a broad set of interventions linked to members’ medical, socio-economic, behavioral, and lifestyle data, we prioritized the specific interventions for maximum ROI within the constraints of time, effort, and budget. See Figure 10 for an illustrative framework.

FIGURE 10. Prioritization framework
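
One simple way to operationalize such a framework is to score each candidate intervention on expected benefit per unit of cost and select greedily within the available budget. The sketch below illustrates the idea; the intervention names and numbers are purely illustrative, not figures from our program.

    from dataclasses import dataclass

    @dataclass
    class Intervention:
        name: str
        expected_benefit: float   # projected savings, in dollars
        cost: float               # program cost, in dollars

    def prioritize(interventions, budget):
        """Greedy selection by benefit-to-cost ratio within a fixed budget."""
        ranked = sorted(interventions, key=lambda i: i.expected_benefit / i.cost, reverse=True)
        selected, spent = [], 0.0
        for item in ranked:
            if spent + item.cost <= budget:
                selected.append(item)
                spent += item.cost
        return selected

    # Illustrative candidates and budget only
    candidates = [
        Intervention("Smoking cessation pilot (ACO)", 400_000, 150_000),
        Intervention("Pharmacist-led education (rural)", 250_000, 60_000),
        Intervention("Free nebulizers for asthma members", 180_000, 90_000),
    ]
    chosen = prioritize(candidates, budget=200_000)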

Step 4: Efficacy of the interventions

As a final step, we designed a detailed IT system to capture the data from the experiments. Our ultimate aim is to use the collected intelligence to evaluate the efficacy of the program and also as an input to future predictive models.

Conclusion

Reducing avoidable ER visits is a complex problem requiring a collaborative effort from multiple functions. Early identification of the problem through a sophisticated predictive analytics solution can provide a competitive edge in mitigating revenue leakage and containing health risks.

To balance program costs and potential benefits, we recommend applying analytics in an agile way and accounting for the critical success factors below:

Ensure strong executive sponsorship for end-to-end program support: Predictive models have little value if the CM/DM teams cannot act on the results in a timely manner. Multi-divisional collaboration for program execution and implementation is a must.

Perform early ROI estimation through baseline vs. benchmark comparison to ensure that there is value in pursuing the initiative.

Start small but be specific: Identify the right population segments where the problem is severe, and establish clear metrics and thresholds to measure success.

Develop the solution iteratively, starting with easily available small data: Leverage external data to fill in gaps due to lack of internal data, and use non-traditional data sources such as lifestyle, behavioral, and socio-economic data to enrich data quality.

Start with simpler analytic techniques to show value for executive buy-in before making a case to move to complex machine learning algorithms.

Plan, design, and develop systematic experiments to test and learn from interventions. Use this data as an asset for future studies.

Leverage artificial intelligence techniques to scale and automate solutions.

It’s time to reduce unnecessary ER visits and deliver impactful interventions to prevent them from occurring in the future. An analytics-powered approach, delivered in an agile manner, can help organizations deliver on this opportunity.

REFERENCES

  1. https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/NationalHealthExpendData/downloads/highlights.pdf
  2. Obama B. United States Health Care Reform Progress to Date and Next Steps. JAMA. 2016;316(5):525-532. doi:10.1001/jama.2016.9797
  3. https://www.cdc.gov/nchs/data/ahcd/nhamcs_2011_ed_factsheet.pdf
  4. https://www.cdc.gov/nchs/fastats/emergency-department.htm
  5. Weiss AJ (Truven Health Analytics), Wier LM (Truven Health Analytics), Stocks C (AHRQ), Blanchard J (RAND). Overview of Emergency Department Visits in the United States, 2011. HCUP Statistical Brief #174. June 2014. Agency for Healthcare Research and Quality, Rockville, MD. http://www.hcup-us.ahrq.gov/reports/statbriefs/sb174-Emergency-Department-Visits-Overview.pdf
  6. The Role of Health Centers in Lowering Preventable Emergency Department Use: http://nachc.org/wp-content/uploads/2015/06/ED_FS_20151.pdf
  7. http://blog.bcbsnc.com/2017/12/5-emergency-room-myths-busted/
  8. https://truvenhealth.com/media-room/press-releases/detail/prid/113/study-finds-most-emergency-room-visits-made-by-privately-insured-patients-avoidable
  9. “The median charge for outpatient conditions in the emergency department was $1,233, which is 40% more than the average American pays in rent each month ($871).” [$1,233/$871≈1.42] “Avoidable Emergency Department Usage Analysis.” Truven Health Analytics. (April 25, 2013)
  10. Caldwell N, Srebotnjak T, Wang T, Hsia R (2013) “How Much Will I Get Charged for This?” Patient Charges for Top Ten Diagnoses in the Emergency Department. PLoS ONE 8(2):e55491. doi:10.1371/journal.pone.0055491
  11. Cunningham PJ, et al. The use of hospital EDs for nonurgent health problems. Med Care Res Rev. 1995; 52(4):453-474
  12. Weinick RM, Burns RM, Mehrotra A. Many emergency department visits could be managed at urgent care centers and retail clinics. Health Aff (Millwood).2010;29(9):1630-1636.
  13. Emergency Department Visits for Nonurgent Conditions: Systematic Literature Review. The American Journal of Managed Care. 2013;19(1):47-59 https://ajmc.s3.amazonaws.com/_media/_pdf/AJMC_13jan_UsherPines_eApx_47to59.pdf
  14. Ballard DW, Price M, Fung V, et al. Validation of an Algorithm for Categorizing the Severity of Hospital Emergency Department Visits. Medical Care. 2010;48(1):10.1097/MLR.0b013e3181bd49ad. doi:10.1097/MLR.0b013e3181bd49ad.
  15. Basch CE, Walker EA, Howard CJ, Shamoon H, Zybert P. The effect of health education on the rate of ophthalmic examinations among African Americans with diabetes mellitus. American Journal of Public Health. 1999;89(12):1878–82. [PubMed: 10589324]
  16. Piette JD, Weinberger M, McPhee SJ, Mah CA, Kraemer FB, Crapo LM. Do automated calls with nurse follow-up improve self-care and glycemic control among vulnerable patients with diabetes? American Journal of Medicine. 2000;108(1):20–27 [PubMed:11059437]
  17. Clancy DE, Brown SB, Magruder KM, Huang P. Group visits in medically and economically disadvantaged patients with type 2 diabetes and their relationships to clinical outcomes. Topics in Health Information Management. 2003;24(1):8–14. [PubMed: 12674390]
  18. Banister NA, Jastrow ST, Hodges V, Loop R, Gillham BM. Diabetes self-management training program in a community clinic improves patient outcomes at modest cost. Journal of the American Dietetic Association. 2004;104(5):807–10. [PubMed: 15127069]
  19. Jaber LA, Halapy H, Fernet M, Tummalapalli S, Diwakaran H. Evaluation of a pharmaceutical care model on diabetes management. Annals of Pharmacotherapy. 1996;30(3):238–43. [PubMed: 8833557]
  20. Rothman RL, Malone R, Bryant B, Shintani AK, Crigler B, Dewalt DA, Dittus RS, Weinberger M, Pignone MP. A randomized trial of a primary care-based disease management program to improve cardiovascular risk factors and glycated hemoglobin levels in patients with diabetes. American Journal of Medicine. 2005;118(3):276–84. [PubMed: 15745726]
  21. Gerber BS, Brodsky IG, Lawless KA, Smolin LI, Arozullah AM, Smith EV, Berbaum ML, Heckerling PS, Eiser AR. Implementation and evaluation of a low-literacy diabetes education computer multimedia application. Diabetes Care. 2005;28(7):1574–80. [PubMed: 15983303]
  22. Davidson MB. Effect of nurse-directed diabetes care in a minority population. Diabetes Care. 2003;26 (8):2281–87. [PubMed: 12882849]
  23. Erdman DM, Cook CB, Greenlund KJ, Giles WH, El-Kebbi I, Ryan GJ, Gallina DL, Ziemer DC, Dunbar VG, Phillips LS. The impact of outpatient diabetes management on serum lipids in urban African Americans with type 2 diabetes. Diabetes Care. 2002;25(1):9–15. [PubMed: 11772894]
  24. Anderson-Loftin W, Barnett S, Bunn P, Sullivan P, Hussey J, Tavakoli A. Soul food light: Culturally competent education. The Diabetes Educator. 2005;31(4):555–63. [PubMed: 16100331]
  25. California Medi-Cal Type 2 Diabetes Study Group. Closing the gap: Effect of diabetes case management on glycemic control among low-income ethnic minority populations. Diabetes Care. 2004;27(1):95–103. [PubMed: 14693973]
  26. Brown A, Gregg EW, Stevens MR, Karter AJ, Weinberger M, Safford MM, Gary TL, Caputo DA, Waitzfelder B, Kim C, Beckles GL. Race, ethnicity, socioeconomic position, and quality of care for adults with diabetes enrolled in managed care. Diabetes Care. 2005;28(12):2864–70. [PubMed: 16306546]
  27. Brown SA, Garcia AA, Kouzekanani K, Hanis CL. Culturally competent diabetes self-management education for Mexican Americans: The Starr County Border Health Initiative. Diabetes Care. 2002;25 (2):259–68. [PubMed: 11815493]

Authors

Anupam Bhatnagar

Engagement Manager, Fractal Analytics

Anupam Bhatnagar is an Engagement Manager with Fractal Analytics and has over 12 years of experience in data analytics, problem solving, and consulting in the US health care and insurance domain.

Kishore Bharatula

Principal Consultant, Fractal Analytics

Kishore Bharatula is a Principal Consultant with Fractal Analytics and has over 13 years of experience in the analytics industry. He is passionate about solving challenges by implementing analytics to deliver measurable topline and bottom-line impact.

 

Redefining growth in a zero-sum game environment

By Eugene Roytburg, Managing Partner and Lana Klein, Managing Partner

 

Over the past four years, the growth of CPG sales in the U.S. stalled at under 1%, in contrast to 7% growth between 2006 and 2011. Volumes are mostly stagnant or have declined in many categories – in some cases, sharply. In addition, growth has slowed dramatically in key emerging markets. These statistics are a result of several fundamental drivers.

Fewer People & Slower Growth

First, the number of consumers buying consumer products is stagnant due to flat population growth. Birth rates in Western Europe are below the replacement rate, for example. And without migration, most countries in Europe (with the exception of France, Ireland and Norway) are projected to have negative population growth. In the US, birth rates in 2016 were 14% lower than in 2008.

Economic hurdles have also had an impact. Despite the global economic recovery, wages have stagnated for the two lowest-income quintiles. And help from emerging markets, which fuelled CPG growth a few years ago, has also dried up as GDP growth in those markets has slowed. Additionally, currency weakness in many emerging markets has reduced consumer purchasing power, compounding the softness in CPG sales.

Shifting Consumer Preferences

Against this backdrop, consumer attitudes, tastes, needs, and behaviors have also shifted. E-commerce upended shopping behavior and levelled the playing field for small companies, helping them reach consumers more easily than in the past. Buying power has shifted from the baby boom generation, which is well understood by marketers, to the more fickle Millennials, who often shun large brands in favor of newcomers. As a result, small, upstart CPG companies have captured 3% market share from larger players over the past few years.

Consumer behaviors are also becoming less homogeneous. Increasingly, they’re polarizing into “low involvement” and value-seeking buying on the one hand, and highly-selective purchase behavior on the other. To complicate matters further, some consumers show both behaviors – depending on what category they’re shopping for. As a result, large brands are under threat of death by a thousand cuts, losing share to “low involvement” and to a host of smaller, innovative brands who cater to increasingly fragmenting consumer tastes.

Traditional CPGs vs. Digital Natives

So, where can manufacturers find growth in this environment? In many niches across different verticals. All categories show the same pattern: The emergence of niche and micro-segments – largely dominated by start-up brands – taking share from established mainstream products.

So, why can’t established brands adapt? Here are a few reasons:

  • The innovation process in large CPG firms isn’t set up for the new environment. It’s slow and risk-averse, with a relatively large “hurdle rate” favoring initiatives that don’t venture too far from the core business. Sure, many companies recognize this and try to become more nimble. The problem, however, is that incremental improvements are usually too weak to change the massive “cultural DNA” of large companies and produce a meaningful shift.
  • Many companies are still laggards in e-commerce and digital – at least compared to relative newcomer brands like Dollar Shave Club (later bought by Unilever), who are often digital natives and have entire operating models rooted online.
  • Brand Equity – traditionally a huge asset – can play against large brands with long-established mainstream perceptions when they try to venture into a new niche. Small start-up brands have an authentic story that’s typically better aligned with the needs and attitudes of their target consumers.

How Can CPGs Compete?

To survive in these conditions, large CPG companies need to re-think the ways they look for growth opportunities. They must shift their mindset, culture and operations to succeed in a changing environment.

  • Develop capabilities to quickly identify growth opportunities. With landscapes shifting so rapidly, traditional category segmentation and a “future will be like the past” approach should be replaced by robust analytics that can quickly scan for emerging growth niches and discern fads from longer-lasting opportunities. These opportunities span the intersections of categories, accounts, product attributes and consumer segments, to name a few.
  • Develop a strong acquisition strategy and execution. This is an obvious route that many firms already pursue. But the devil is in the details — creating the ability to rapidly and effectively identify suitable acquisition targets. Another important point is determining optimal deal size. While smaller acquisitions are likely to deliver higher growth rates, incremental revenues may be too small to move the growth needle. What’s more, they require deep resources for execution. Finally, it’s vital to design a post-integration strategy to preserve the entrepreneurial spirit of the new brand, while using corporate muscle to scale it. Hormel (Muscle Milk®, Applegate®, Justin’s®, Wholly Guacamole®) is one company that does this especially well.
  • Invest in start-ups like venture capital funds do. Many corporations are engaging with start-ups through internal corporate venture capital arms that invest directly in these types of companies. Nestle, Chobani, General Mills and PepsiCo are some who’ve launched “accelerator” units to participate in growth and support start-ups. Kraft has launched a business unit called Springboard, for example, to develop what the company describes as “disruptive” food and beverage brands. The unit is actively searching for emerging, authentic brands and looking to build a “network of founders.”
  • Finally, take a more agile and differentiated approach to your brand portfolio. Ask yourself: Where are unmet needs in your categories? Do consumers care or need anything innovative, or have all meaningful problems and needs been addressed? Too often, the growth targets in mature spaces are unrealistic and driven by inertia. They’re unlikely to produce much growth. Yet, firms keep pouring money into marginally incremental innovation, rather than looking for new spaces.

Ultimately, the current environment for CPG brands is a zero-sum game. Everything from slow population growth to unfavorable economic factors — which have given rise to a new breed of digital competitors — is putting pressure on them. To compete, CPGs must adapt and force growth opportunities through innovation and a more differentiated approach.

AI meets BI

What’s the most frustrating aspect of using current enterprise solutions for a senior executive? It is the inability to find timely and reliable answers to their questions. Often, answers do lie somewhere – it is just too challenging and time consuming to get to them.

Imagine how powerful it would be to get the most important information and insights before every tactical decision you make during your workday.

Too much data, insufficient, unreliable metrics

Consider the case of a global consumer goods company we work with. In a large growing market, their sales team has little or no idea how their trade promotions are performing (even though they spend significantly more than $100 million/year).

At a macroscopic level, they do have data on their overall shipments and revenues, but if they want to know whether a specific promotion generated incremental sales or how it performed across different regions, the answer is so excruciating to find that they have stopped asking the question!

There are just too many data sources (from their distributors) that are highly inconsistent as well as complex – and there’s no agreement on how to define true incrementality.
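
For illustration only: one common working definition treats incremental sales as actual sales during the promotion minus a baseline projected from non-promoted periods. The sketch below uses that definition with a simple pre-period-average baseline; the column names and the baseline method are assumptions, not the company’s agreed approach.

    import pandas as pd

    def promotion_incrementality(sales: pd.DataFrame, promo_start: str, promo_end: str) -> float:
        """Incremental sales = actual promoted-period sales minus a projected baseline.

        sales: hypothetical daily sales table with a DatetimeIndex and a 'units' column.
        The baseline here is simply the average daily sales over the pre-promotion
        period, scaled to the promotion length - one of many possible definitions.
        """
        promo = sales.loc[promo_start:promo_end, 'units']
        pre = sales.loc[sales.index < pd.Timestamp(promo_start), 'units']
        baseline = pre.mean() * len(promo)
        return promo.sum() - baseline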

In such a situation, they just go with their gut and experience when they decide which promotions to repeat, which to drop and which new ones to introduce. This is the equivalent of driving in an unknown city by asking passersby for directions. It’s time to get a map with real-time traffic information and turn-by-turn directions.

Information overload, too little time

There are other cases where companies have great data but are still deluged with information overload. A bank we work with expected its senior executives to read an 800-page document, ironically called an “at-a-glance” report, to understand how the bank was doing!

The idea here is this – since we don’t have a clue what’s really important, let’s just get everything together so that senior executives can find whatever they may be looking for. This is like printing out the map of the whole country because you don’t know exactly where you are going.

A bank we work with expected its senior executives to read an 800-page document, ironically called an “at-a-glance” report, to understand how the bank was doing!

BI/Data discovery platforms overpromise, underdeliver

BI & “data discovery” platforms that promise answers aren’t working either. These platforms suffer from GIGO (garbage in garbage out) syndrome and have performance constraints.

At a healthcare company we recently worked with, reports take forever to load as they attempt to pull in hundreds of gigabytes of data. Most importantly, these tools don’t even attempt to understand their users – as in the banking example discussed above. Whether you are the CEO, Chief Sales Officer, marketing director or finance manager; whether you are in Mexico or look after Western Europe as a region; the reports look more or less the same.

The BI system expects you to learn it and find your own answers and not the other way round. Why is that acceptable?

BI & “data discovery” platforms that promise answers aren’t working either. These platforms suffer from GIGO (garbage in garbage out) syndrome and have performance constraints.

Your Facebook feed seems to know you quite well; why can’t your BI report understand you the same way and anticipate your questions?

Can AI transform BI?

The answer, I believe, is to bring AI to BI. We need to rethink BI dashboards in light of the advances we are making in AI. Thanks to big data & AI techniques in text analytics, it is easier than ever before to bring together disparate, messy, inconsistent data and fill in the missing gaps. AI algorithms in knowledge representation have made it possible to connect fluid data points into probabilistic but consistent, highly accurate understanding of what’s happening in the business (KPIs, competitive intelligence, etc.).

Most importantly, AI transforms our understanding of the user, helping us serve information, recommendations and insights that the user really needs to know, even before she “wants” to know. That’s when BI truly becomes personalized and “anticipatory”. Additionally, by instrumenting how the user is interacting with these insights/recommendations and acting on them, the AI within BI can learn to be even more relevant, actionable and dare I say, addictive. Eventually, managers and executives will spend more engaged time with their BI than with their Facebook feed. (OK, the last line went too far :-), but I am optimistic).

AI transforms our understanding of the user, helping us serve information, recommendations and insights that the user really needs to know, even before she “wants” to know.

Returning to the example of the consumer goods company, thanks to an AI plus BI solution, the company executives are now beginning to get a clear, in-market read of their trade promotion performance. Machine learning algorithms make recommendations to the design team on which promotions to retain, which to drop, and how a new promotion is likely to perform. The sales team, including the distributor sales representatives, will soon have on their smartphones the information they “need to know.” The national sales director will have a real-time understanding of trade promotion performance, and the same app (Cuddle.ai) recommends the right promotion for the right channel to each sales representative.

BI/Data discovery platforms will benefit by embracing this “AI meets BI” thinking to move from data discovery (by the user) to user/insights discovery by the platform.

What do you think? Will this be a game changer for your business? Will BI platforms incorporate AI soon enough?

About Author


Srikanth Velamakanni

Co-Founder, Group Chief Executive & Executive Vice-Chairman, Fractal Analytics

Srikanth is a co-founder of Fractal Analytics. In his role as Group Chief Executive & Executive Vice-Chairman, he is responsible for all four entities, inorganic growth and the long-term future of the business.

At Fractal, he has played a role in the evolution of the analytics industry. Long before big data became a buzzword, Fractal evangelized the idea of using advanced analytics and data assets of the company to make better decisions.

Over the last 16 years, he has been a thought partner to global corporations as they have embraced analytics to improve the quality and execution of their decisions. He also believes in building a great place to work that attracts the best minds in the world and creating a trusting environment where people are respected and are free to do creative problem solving.

Srikanth considers himself a lifelong student of mathematics, probability & AI and is interested in consumer behavior, behavioral economics and deep learning.

 

Evolving the role of the retail store in an omnichannel world

All roads lead to an omnichannel retail world. In an increasingly competitive market, retailers are starting to lay the foundations for omnichannel. Yet there is a lot to be done before retailers can harmonize all touchpoints with their customers.

Companies have historically focused on improving operational capability, but they have so far struggled to convert this into an enhanced retail experience.

Online retail made up just 8.4% of total U.S. retail sales in 2016, and Forrester research predicts it will account for only 11% of total U.S. retail sales by 2018. According to an A.T. Kearney study, around two-thirds of customers shopping online use physical stores before or after the transaction. In such cases, stores are essential in converting the sale. Physical stores provide consumers a sensory experience that allows them to touch and feel the product, immerse themselves in the brand experience, and engage with sales associates who provide suggestions and reaffirm shopper enthusiasm for their new purchases. Nothing can replace these aspects of in-store shopping. This suggests that physical stores still dominate the retail landscape, and will continue to do so in the near future.


However, there are certain aspects of online stores that physical retailers can take inspiration from to enhance the customer experience. Prominent among them is experimentation, known as A/B testing in the e-commerce space.

Retailers must test many ideas, quickly and accurately before they decide what works and what doesn’t. 

Retailers should innovate, start new initiatives, and bring new technology into existence to transform the store experience. Retailers must test many ideas, quickly and accurately before they decide what works and what doesn’t. Those who have understood this have already started to build a culture of business experimentation within their organizations and are seeing its benefits. For those who haven’t, this is the right time.

New role of the physical store

The new retailer needs to be a combination of store retail and non-store retail. Retailers need to integrate the online and offline advantages to provide a seamless experience across channels.

An indication of how the role of stores is expected to transform is the fact that 40 percent of Best Buy’s and more than 50 percent of Walmart’s online sales are already picked up in stores.

According to a McKinsey Insight article, to make informed network choices, retailers must take a long-term view of their real estate. Beyond building stores, what expansion models are available when they look for growth? How can they enable new multichannel experiences?


The new retailer needs to be a combination of store retail and non-store retail. Retailers need to integrate the online and offline advantages to provide a seamless experience across channels.

As prices and inventory availability become more transparent, retailers will not survive just by being “pass through” sellers of national brands. They will have to give consumers a reason to choose their stores over competitors.

No longer will consumers shop at a retailer simply because it happens to be where a product is distributed. Retailers will need to offer deep product expertise and a unique product education.

There are enormous possibilities for modern stores to bring new experiences to customers. The following scenarios help paint a picture:

  • Customers browsing online, locating the nearest store and purchasing from that store.
  • Customers picking up the product in the store and paying online to avoid long checkout queues.
  • Customers making an online purchase but returning the product at the physical store.
  • Access to an in-store interactive screen where customers can browse through various products, read reviews and pick items up from the shelf.
  • Sensors that detect a customer’s interest and send the data to a screen that displays relevant product information.
  • Experience zones within a store that simulate the environment in which a product is designed to be used.
  • In-store assistants carrying mobiles/tablets with information on each customer’s profile, personalizing the experience for that customer.
  • Personalized promotions sent directly to the customer’s mobile, based on their location within the store.

Some stores have already incorporated a few of the above scenarios.

Amazon Go

is pioneering the ‘Just Walk Out’ technology, enabling the customer to completely bypass queues.

Kate Spade Saturday and SONY

are experimenting with shoppable windows and revolutionizing the concept of window shopping.

Macy’s and Waitrose

have started sending personalized recommendations and offers based on the location of the customer in the store.

Oasis

is a U.K. fashion retailer that’s fusing its ecommerce site, mobile app, and brick-and-mortar stores into a simple shopping experience. If you walk into one of its stores, you’ll find sales associates armed with iPads that are available to give you on-the-spot, accurate, and up-to-date product information.

The iPad also acts as a cash register, making it easy for associates to ring you up from anywhere in the store. And the cherry on top? If something appears to be out of stock, the staff can instantly place an online order to have the item shipped directly to your home. This is a truly seamless customer experience.

For many retailers, future store layouts will need to foster greater customer learning and experimentation. Technology will need to be fully integrated into how stores and employees engage customers. And the lines between physical and digital will have to blur.

Store transformation journey: Ask a lot of questions

It is important to note that none of these stores has adopted a big-bang strategy of investing in the latest technology. Amazon Go is currently open only to Amazon employees in its beta program. Similarly, SONY has installed shoppable windows at only one store. And rightly so. Technology adoption is a high-risk, high-investment venture. It is important to understand whether making a change translates into an increase in the metrics that matter, be it sales, customer engagement, footfalls, or any other metric the store sets as its objective.


 

It is important to understand whether making a change translates into an increase in the metrics that matter, be it sales, customer engagement, footfalls, or any other metric the store sets as its objective.

Modifications to these existing processes can increase the chances of a sale considerably. It is important to ask the right questions to identify which modification will be most effective.

Digital presence

Once a customer identifies a need, the next step will be to browse online for the relevant product. A considerable digital presence can go a long way in attracting the customer. For a retailer, the key is to understand which digital channels are the most effective.

Will a significant web presence lead to more views or is mobile the more effective channel? Will ads placed on other websites get those views? Are these channels an effective space for running promotions? Are these promotions resulting in additional footfalls to the store?

Once a customer identifies a need, the next step will be to browse online for the relevant product. A considerable digital presence can go a long way in attracting the customer.

Will sending personalized promotions and coupons entice the customer to the store? Or is this a privacy concern? Will adding a Google Maps plugin make it easier for the customer to reach the store, resulting in additional footfalls? The permutations and combinations are many, but incremental changes and accurate measurement will simplify many of them. Walmart has recently introduced an app that lets customers buy products online and pick them up in stores. This helps customers conveniently manage their shopping cart while maintaining their in-store experience and loyalty.

Store design

Store designs and layouts strongly influence in-store traffic patterns, shopping behavior, and the shopping experience. Understanding the cause-effect relationships here will help retail stores arrive at the optimum store design.

Store designs and layouts strongly influence in-store traffic patterns, shopping behavior, and the shopping experience. Understanding the cause-effect relationships here will help retail stores arrive at the optimum store design.

For example: Will changes in the exterior of the store result in more footfalls? Will a redesign of the zones within a store help customers navigate better and reach their desired products faster? Will a change in the shelf layout increase the visibility of a product, and in turn lead to an increase in sales? Why do more than half of customers, as soon as they enter the store, walk towards the beverage section despite visible promotions on packaged foods? Who left without purchasing anything? What did the shoppers not buy?

The design of a store is also greatly influenced by the persona of its customers. The Amazon Go store, at roughly 1,800 square feet, is conveniently compact to suit the needs of its target customers: busy shoppers who want to get in and out as quickly as possible. A store which serves leisurely shoppers may want to incorporate more open spaces.

In-store assistance

Now that customers have arrived at the product location, they may want to compare different brands, specifications, and prices.

Now that customers have arrived at the product location, they may want to compare different brands, specifications, and prices.

Will the presence of a screen that automatically displays the relevant information add to a positive experience? Will a store assistant, armed with behavior patterns and demographic data of that customer, make it easier for the customer to make a choice? Or is this another privacy concern? Is it feasible to add a premium on the price for the enhanced in-store experience?

Augmented reality (AR) is touted as the next big technology in retail due to encouraging feedback from several customer surveys. However, it is not yet clear whether AR will actually increase sales. For example, will a virtual mirror that can quickly learn preferences and show customers new looks, without requiring them to try on any of the products, result in tangible benefits for the store? Will sales increase if the retailer rolls out an app that lets customers imagine what a pair of shoes would look like on their feet without actually trying them on?

The key here is to take one step at a time, make gradual changes, and measure each change through controlled experiments.
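
As an illustration of what “measure each change through controlled experiments” can mean in practice, the sketch below compares conversion between a test group and a control group with a two-proportion z-test (using SciPy’s normal distribution); the numbers are invented and the test choice is only one reasonable option.

    from math import sqrt
    from scipy.stats import norm

    def two_proportion_z_test(conv_test, n_test, conv_ctrl, n_ctrl):
        """Return (z statistic, two-sided p-value) for a difference in conversion rates."""
        p_test, p_ctrl = conv_test / n_test, conv_ctrl / n_ctrl
        p_pool = (conv_test + conv_ctrl) / (n_test + n_ctrl)
        se = sqrt(p_pool * (1 - p_pool) * (1 / n_test + 1 / n_ctrl))
        z = (p_test - p_ctrl) / se
        p_value = 2 * (1 - norm.cdf(abs(z)))
        return z, p_value

    # Illustrative numbers: 540 of 4,000 shoppers converted with the new layout,
    # 480 of 4,100 with the old one.
    z, p = two_proportion_z_test(540, 4_000, 480, 4_100)
    print(f"z = {z:.2f}, p = {p:.4f}")   # roughly z = 2.4, p = 0.015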

Store transformation journey: Find answers to questions through experimentation

Organizations have ideas and spend an immense amount of money to execute and implement them, but very few companies succeed in the end. For any retailer, effective implementation of ideas is the key challenge. A dearth of talent, limited budgets, high operational costs, and a lack of technical infrastructure reduce a retailer’s appetite for change. Hence, retailers lag in innovation.

For any retailer, effective implementation of ideas is the key challenge.

On the other hand, online players like Amazon, Best Buy, etc., frequently bring new features to lure customers.

As per Jeff Bezos, “If you double the number of experiments you do per year you’re going to double your inventiveness.”

Clearly, companies like Amazon do not hesitate to try new ideas and thrive on innovation. They also have the advantage of technical infrastructure to experiment with new ideas.

Experimentation as a concept is not new. Conventional retailers have long used manual methods to try and test their ideas, but these have very limited scope. Today, to transform an entire store, retailers need to introduce gradual changes and test them in a more sophisticated and agile manner. Whether or not a desired change works, it is vitally important to gain insight at a scale large enough to assess results but small enough to avoid the large investments and risks that come with full-scale execution. With numerous options, factors and possibilities in play, a robust approach to testing ideas is necessary.

…today, to transform an entire store, retailers need to introduce gradual changes and test them in a more sophisticated and agile manner.

The first step in transforming the role of retail stores is to build a mechanism for testing new ideas in an agile manner. This will empower business users to increase their risk appetite, manage their budgets efficiently, and evaluate their ideas to maximize ROI.

A transformation story

Store remodeling is an investment-heavy process of producing an incremental change in a store’s physical design to enhance the customer experience. It is very difficult to accurately predict whether a store remodeling exercise will generate returns. The best way to know for certain is to test the change in a subset of stores and, based on the assessment, decide whether it should be rolled out.

A leading US retailer decided to conduct a remodeling experiment to test the effect of introducing customer experience lounges, changes in exterior signage, and an upgrade of the existing lighting system to smart lights.

The retailer designed an experiment to measure the impact of remodeling in select representative stores, analyzed the results by comparing with a list of similar control stores, and devised the future action plan.

The retailer designed an experiment to measure the impact of remodeling in select representative stores, analyzed the results by comparing with a list of similar control stores, and devised the future action plan.

Challenge

A remodeling exercise is a significant investment, sometimes running into millions of dollars. Additionally, store operations at times need to be put on hold for a few days, which further impacts sales and revenue. The retailer had to quantify the impact accurately to decide how to approach this change for any subsequent set of stores.

Solution

The retailer decided to remodel 27 stores (including both large and small stores) and wanted to assess the impact. These 27 stores were spread across the US. Control stores were simulated algorithmically for each test store. Overall, the experiment generated a 6% lift in sales.
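
The mechanics behind such an analysis can be sketched as follows: match each test store to the candidate store with the most similar pre-period sales, then compare growth against the matched controls. This is only one plausible matching approach – the retailer’s actual simulation method may differ – and the column names are hypothetical.

    import pandas as pd

    def match_controls(test_stores: pd.DataFrame, candidates: pd.DataFrame) -> pd.Series:
        """For each test store, pick the candidate store with the closest pre-period sales."""
        matches = {}
        for store_id, row in test_stores.iterrows():
            matches[store_id] = (candidates['pre_sales'] - row['pre_sales']).abs().idxmin()
        return pd.Series(matches, name='control_store')

    def estimate_lift(test_stores: pd.DataFrame, control_stores: pd.DataFrame) -> float:
        """Difference in growth between test stores and their matched controls."""
        test_growth = test_stores['post_sales'].sum() / test_stores['pre_sales'].sum()
        ctrl_growth = control_stores['post_sales'].sum() / control_stores['pre_sales'].sum()
        return test_growth / ctrl_growth - 1.0

    # Hypothetical usage: store tables indexed by store id, with 'pre_sales' and
    # 'post_sales' columns for comparable pre- and post-remodel periods.
    # controls = candidates.loc[match_controls(test_stores, candidates)]
    # lift = estimate_lift(test_stores, controls)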

Results

22 stores generated a positive lift, and a few experienced sales lifts of more than 15%. In addition, break-even for large stores (sales greater than $10 million) was expected within 2.5 years, whereas break-even for small stores (sales less than $5 million) was expected in 5-7 years. Based on the results, the retailer decided to prioritize remodeling for large stores.

Conclusion

The role of the retail store remains essential for today’s consumers. Retailers that use technology to transform the in-store experience can capture new opportunities to create true omnichannel customer experiences. It will take innovative thinking, experimentation, and data savvy to create the seamless digital and in-store experiences of tomorrow.

As in the case of the retailer that successfully remodeled its stores, those that take an intelligent approach to experimentation, powered by measurement and data, will drive real results, while minimizing risk. The opportunity is ripe for retailers that take smart action and strive to innovate. The leaders have already started. For those that haven’t, the time is now.


About Authors

Ankit Bhardwaj – Senior Manager, Client Development, Fractal Analytics

Gourav Chugh – Product Manager, Fractal Analytics

Nachiket Sane – Product Manager, Fractal Analytics

Shivendu Mishra – Director, Product Management, Fractal Analytics

 

Transforming enterprise analytics and BI in seven steps

The lesson: inspirational leadership is essential in our knowledge economy, but inspirational leadership is not “feel good” leadership. It is not about charisma. It’s about creating the conditions that motivate peak performers to seize opportunities and attack problems. It can and must be carefully cultivated through training and development, through personal coaching and example. Inspirational leadership is more likely to enable transformational change to deliver sustainable growth.

 

A.G. Lafley (Paris, April 5, 2006)

Analytics and BI

Starting in January 2010, I was presented with a unique opportunity by Bob McDonald, P&G CEO – one of the most principled and purpose-driven men I’ve had the privilege to work with in my career – and Filippo Passerini, P&G CIO – one of the most strategic leaders I’ve experienced and learned from in any industry. They believed that “Analytics” was going to transform P&G, our industry, and business in general. However, P&G’s efforts to date were fragmented, IT-driven rather than business-needs-driven, a cross-functional nightmare, plagued by under-leveraged talent, and not positioned to take advantage of what would be the “perfect storm” of capability over the following three years.

They wanted me to lead a complete transformation of Analytics, Business Intelligence, and how we approach and drive value for individual business units, functional areas, and the Company overall. This would turn into one of my largest Vertical Startups (VS) to date across strategy, talent development, information technology, strategic partnerships, innovation, and the move into advanced analytics to drive significant business value for P&G. My team and I created a unique blueprint for P&G’s business (one that is confidential and not shared here), but also a model that is clearly applicable across any business to leverage analytics as a key capability to win.

“The Magnificent 7” was a 1960 classic movie (actually an adaptation of a Japanese classic… samurais vs. gunfighters) with screenplay by William Roberts, direction by John Sturges, and starring Yul Brynner, Steve McQueen, Charles Bronson, and James Coburn. “They were seven… they fought like seven hundred.” This was our inspiration! Get more done faster, cheaper, and with bigger impact than anyone thought was possible.

Ignore the status quo, barriers, people playing not to lose, cultural antibodies, and technology challenges… just make it happen. It was also about making choices… where to play.

Driving a holistic / business impacting Analytics Program is not easy. It helps if you can network with someone that has done it; and it definitely requires a little help from your friends!


1. Start with the Business Need & Strategy

Sounds simple, but is done incredibly poorly by most. All analytics start with the business problem you are trying to solve! Not a fancy technology, interesting data set, or impassioned leader preaching to the crowds. It’s about the business need.

If you do not know how to pull a strategy together, start with “Playing to Win” by A.G. Lafley and Roger Martin. Get clear on where to play, how you will win in those areas, the capabilities needed to deliver, and how you will measure success. Move from lots of activities to an articulated strategy and execution!

You need a set of business leaders that will iterate with you on the analytics. The CEO may be necessary, but is not sufficient. Find key leaders in the line businesses, key functional areas, e.g., Supply Chain, and respected forward thinkers who will lead with you. You won’t get the analytics or data right the first time. If they bail after one or two misses, you have the wrong players. I was extremely fortunate to have the President of our European business, a truly strategic, no-nonsense respected leader, and an early adopter, willing to iterate with us to solve his biggest challenges: how to grow profitable share in stagnant markets, how to optimize the supply chain across one Europe, and so on. We then found 3-4 other key leaders (including one of the best Supply Chain Global Officers in the industry) that jumped in as well.

These business leaders will also help you shape your strategy and business areas to focus. Without a doubt, Supply Chain, Retail Execution, Consumer / EBusiness, and Brand Analytics will quickly become some of your internal “Magnificent Seven” analytic domains! Analytics will become a “currency” that adds value to your business and with your external business partners.

2. Invest in Talent

Special Operation Forces, Rule #1: Humans are more important than hardware.

Talent is critical! But like any asset, how you leverage it makes all the difference. We had incredible talent that was all off trying to solve problems (often the same problems) in a hundred different ways. You need to get your Top Analysts working as one unit. No, you don’t need to centralize them (it’s actually better if they stay embedded with the business units). You do need to organize, develop, and recognize them as the critical asset they are to your company.

The best analysts have three skills: 1) Analytics expertise, 2) Deep Business knowledge in the domain or business unit they are working in, and 3) Effective Communications skills. Focus on developing all three aspects of your analysts. Great analytic expertise without context is useless. Great Business knowledge and analytics without the ability to communicate and influence makes it slow and tedious. Great communication without substance is smoke & mirrors – you know, the PowerPoint warriors.

Build internally and leverage analytic know-how from strategic partners! You need to be building your talent pool while leveraging immediate talent and external insights brought by a strategic partner. This investment is one of your most critical choices. Focus on talent, principles/values/purpose, and a culture that fits with yours.

3. Partnership between Analytics & IT
One Leadership

If you can’t reorganize and give one leader accountability for IT, Analysts, and Business Algorithm development, at least get them working as one team. The biggest barrier to any strategy is groups having different measures, definitions of success, or priorities. I was extremely fortunate to get one IT Leader, Corrado Azzarita, who understood the technology inside out, garnered the respect of the broader IT organization, and operated with a sense of urgency / playing to win!

Don’t chase the $500 golf driver of the year! Most bad golfers think the answer to their problems is an expensive new club. Bad golfers actually get worse with better equipment… what they really need is a swing coach that can teach them how to play. Don’t get distracted by the new shiny object, software, or tool of the day. Focus on getting better at the fundamentals / make a real difference for the business. Interestingly, once a golfer gets better, giving them great equipment then makes them even stronger. The same is true here.

Be ready to change the tires while driving down the road! We needed to fix the IT Architecture (ADW, Harmonization, Big Data, and more). We did not have time to stop, do these IT projects, and then resume analytics to help the business. Design your project stream to self-fund, invest for business impact, and focus on winning now.

4. Don’t Wait for the Data to be Right

In talking with numerous Fortune 500 companies, one of the insights I always share is don’t wait to get the data right. I remember the CIO during this discussion who turned as white as a ghost! They had been spending money for two years trying to get the data right before trying to do any analytics with the business. This is a waste! First, you don’t know what data will be most critical without driving true business analytics, and second, nothing gets data cleaner faster than presenting it to the senior leadership of the company!

Select tools that allow your analysts and data scientists to adapt/harmonize on the fly. You will get the data right for the most critical business needs… it’s a simple value proposition. However, it is key to select the tools that allow your analysts and business teams to adjust quickly, add new sources of data, and so on. Don’t create a model that is dependent on a central team to “code” for every new business problem or adaptation.

5. Strategic Partnerships to Accelerate

Selecting the right strategic partners is essential to your journey. This includes IT, Analytics, and business transformation partners. I shared that I only had two types of partners: 1) Strategic Partners who put their best people on my business, invested with me to grow, and brought innovation to my business needs, and 2) Tactical Partners with whom I negotiated only on cost until I could replace them with a strategic partner. The right strategic partnership model (and I’ve learned from some of the best in the Purchases / Procurement field – always a core part of my leadership team / inner circle) is an entirely separate topic. I invested heavily with and in my strategic partners. Not surprising to see who I am still working with now (https://www.linkedin.com/in/andy-walter/)!

Create a true Joint Business Plan with your strategic partners. Build a win-win plan that includes all aspects of your work with the partner: Operations / SLAs / Savings, Key Projects, Co-Innovation, and Moon-shots you’d like to solve together. Leverage your senior leaders as core to this effort. Design quarterly rigor and annual top-to-top meetings with your strategic partners.

6. Focus on Innovation immediately
Analytics, Delivery, Collaboration, Scale

Don’t wait – organize for innovation within your analytics ecosystem. Innovation is the lifeblood of your product strategy (and you are building an analytics product strategy for your company). Focus it around key elements of your plan: Data, Analytic Algorithms, Delivery of Insights, Analytics Team Collaboration, Scale, etc. New breakthroughs are in progress on all fronts…

VR/AR with advanced analytics, mobile delivery analogous to LinkedIn/Facebook, Machine Learning, and more. Did I mention checking out who I’m working with now?

Get your innovation briefs articulated to be able to share with Strategic Partners and the broader ecosystem of startups, accelerators, and industry players. Get everyone working on your business problems with you!

7. Network Beyond

I immediately realized there were extremely smart people working in other companies (non-compete), Industry bodies, Academia, and for-profit institutes that could be incredibly valuable to the journey. Seek them out and learn with the best together. I’ve created and am still leveraging a powerful network across companies in Analytics.

Conferences are interesting, but the networking that occurs from interacting with the top analytics thought-leaders is priceless! I happen to be co-chairing the CGT/RIS Analytics Summit on April 27-28th. The leaders we are bringing together across industries, and the ability to interact with and learn from them, are incredible. You as a leader need to be investing in your talent and in yourself!

About Author


Andy Walter

Fractal Analytics Strategic Adviser

Andy Walter is a business results-driven professional with extensive experience in strategy, development, execution, and operations across Shared Services and IT. He led the Commercial Services & Delivery Organization (over 1500 IT and multifunctional professionals) for Procter & Gamble’s Global Business Services (GBS). He was responsible for IT & Shared Services for all Global Business Units and Markets around the world. His team was accountable for developing cutting-edge digital capabilities for Procter & Gamble to win “where it matters most,” with Consumers, Shoppers, and Retailers. This included all eBusiness, Consumer Services, BI/Analytics, Sales Force Solutions, Project Delivery, Business Process Services, and A&D / Company restructuring efforts.

Innovation is core to a winning analytics and digital transformation

The people who are crazy enough to think they can change the world are the ones who do.

 

                                                                                                    Apple “Think Different” Campaign 1997

Enterprise leaders often consider how and when they should be thinking about innovation in their analytics and digital transformation efforts. But, times have changed. To keep pace in a rapidly changing business and technology environment, innovation is no longer a choice. It’s an imperative.

The market is being reshaped by technology changes in the areas of artificial intelligence, blockchain, and computing. Business models are also changing, as new platforms, direct-to-consumer models, and mobile innovations continue to shake up the market. All this change creates great opportunities for those that make innovation their operating imperative. To win today, leaders must build innovation into the core of their efforts from day one. Creating a strong, integrated innovation engine is critical to analytics and digital transformation success.

As you build out your own winning strategy and operating model, focus in on three key elements to build your innovation engine:


1. “What Matters” Is Changing

At any point in time, understanding “what matters” and how it is changing is critical to a successful transformation. Some call it a landscape assessment, canvassing the industry, bringing the outside in, articulating your business model, or any other expression of understanding “what matters” in your company, industry, and category and how it is evolving now and in the coming years. The only thing for sure is that the forces impacting “what matters” are changing right now – whether you know it and like it, or not!

In his seminal book, “What Really Matters”, John Pepper (Former P&G Chairman of the Board, President and Chief Executive Officer) shares the importance of this approach with respect to winning brands. He outlines the “Brand-Building Success Factors” that are required to create leadership brands and keep them young year after year: “It’s only when all of these factors are present – not three or four, but all of them – that we sustain and improve our record of creating and building leadership brands.” He defines the five “what matters” factors and how to shape the innovation engine and forces around them:


Interestingly, the last factor, talent, was and is a passion that John never stops innovating and investing in. Even now, many years after retiring from P&G, his coaching is as relevant as ever.

Evolving success factors in enterprise analytics: From 2010 to today

Let’s look at analytics and how AI, robotic process automation, and other success-factor forces will shape your innovation engine. In 2010, MIT published an interesting chart: “Analytics: The New Path to Value.”


In 2010, this laid out key insights on where to drive innovation in leading analytics programs like P&G’s. At P&G we built the Business Sphere environment to harness real-time global data to power business decision-making. We began testing more advanced analytics at the store/SKU level, created strategic partnerships with key players in these critical analytic domains, and drove significant impact across the businesses.

It’s very clear now that the “What Matters” forces are transforming in the analytics domain. The next evolution is already taking shape.


As you drive your own transformation, build your innovation efforts around “what matters” and the forces (e.g., AI) that are shaping them in the coming years. Leading analytic and digitally transforming companies will leverage both internal talent and strategic partners to create competitive advantage now and for the future.

The greatest danger for most of us is not that our aim is too high and we miss it, but that it is too low and we reach it.

                                                                                                                                                 Michelangelo

The decision isn’t about whether to embrace a new business model, but about when, how, and how fast to do it! 

            Filippo Passerini, Former P&G CIO & Global Shared Services President

2. Build an Integrated Innovation Strategy into Your Business Strategy

Like any good strategy, innovation also requires the rigor of making bespoke choices: where to play, how to win, capabilities needed, measuring success, management systems required, and your aspirational vision.

An integrated innovation strategy spans short- and long-term objectives by focusing your innovation in three areas (tightly linked with your business model):


1. Innovation to lead the industry

This is sometimes called “sustaining innovation.” It is focused on your current products or solutions, with the intent to keep pace with new consumer trends (“Who” focused), expand vertically or horizontally into adjacencies, or expand geographically. Continued innovation in the core is critical.

Think about how you will bring analytic innovation to core areas like communications, knowledge management, your existing CRM programs, and more. Sometimes innovation on the day-to-day core can become a breakthrough for your company and employees.

2. Innovation to change the industry

This part of the innovation focus looks to shape the industry landscape in order to tilt it in your favor. It is focused on transformational innovation. It is focused on the “How” with an intent to create a competitive advantage.

Think about how the digital and analytic transformations will tilt external relationships in your favor. Analytics has become a “currency” between business partners. Are you using it to shape external business reviews, product launches, consumer interactions, and other key areas that matter to your consumers?

3. Innovation to create new industries

This area of innovation holds the highest risk and reward. It is focused on building new consumption where it did not exist before. It is focused on the “What” with the intent to disrupt the market. This is also the most difficult for large companies, as so many processes focus on driving the current business model.

These are your “moon shots,” “exponential,” or “black ops” projects. This is digital and analytic transformation set out to completely disrupt your current business, business process, or reliance on an expensive status quo, or to transform an industry. If you’re not trying to do it, I guarantee you someone is trying to do it to you.

The reason why it is so difficult for existing firms to capitalize on disruptive innovations is that their processes and their business model that make them good at existing business actually make them bad at competing for the disruption.

                                                                                                                                            Clayton Christensen

Your innovation plans should have a balance across the three areas and be fully integrated with your current and future business models. Too often I see siloed groups working on the “exponential projects” with no business understanding, credibility, or chance of succeeding. That’s simply a model for burning cash.

3. Unleash Your Hybrid Talent

In the end, innovation comes from talented individuals who make connections, discover insights, solve problems, take risks, or devise approaches that no one has made before. Your innovation strategy and results will only be as good as the talent you have working against it.

Stack the odds in your favor and leverage a hybrid talent model to put the very best people from your organization and your strategic partners to work on it together. Innovation is critical to growing your business and theirs. “You can’t cost cut yourself to growth” is valid now more than ever before.

Are you putting your top players, your “water walkers”, against your biggest innovation opportunities? James Lafferty, then P&G VP of Europe Family Care, laid out incredible observations on the difference between “swimmers” and “water walkers”. Here are some key points:

  • When confronted with a severe business crisis, the water walker focuses all energy on how they will overcome the crisis and still deliver. The swimmer will often focus instead on doing a superb job of “selling” a lower base.
  • Water walkers consistently focus on self-improvement and ask themselves, “How do I get better every day?” Swimmers focus more on self-promotion and ask themselves, “How do I sell myself and get myself positioned to get the promotion I deserve?”
  • Water walkers make any assignment a great assignment. Swimmers think success or failure is based upon having a “good assignment”.
  • Water walkers always approach a topic from the standpoint of “how crisp and clear can I make it?” Swimmers tend to measure success by how long, how many charts, and how many numbers they can put on one page. They erroneously equate quality with quantity.
  • Water walkers recognize the power of developing people as the way to achieve their business goals. Swimmers tend to prefer to “go it alone” and consistently believe they alone must be the major driver.
  • Finally, water walkers tend to be thinking, “How do I change the game?”, whilst swimmers work on, “How do I grow the business?”

When faced with building and growing innovation, I’ve often pulled out Jim’s memo ahead of staffing and partner decisions. Are you putting your ‘A’ Players on the innovation opportunities (in some cases double-hatting with their current business scope)? My experience is that your truly top talent wants more accountability, scope, and operating freedom. Mine always rose to the occasion!

With the complexity and speed of digital and analytic innovations, it is also critical to have the right strategic partners working closely with you. I call it the hybrid organization design. You should be looking for three characteristics:


1. Their CEOs and leadership are thought leaders and “get it.”

Companies whose CEOs are good managers are nice to have. However, you’re looking for individuals and teams that are shaping the industry and “What Matters” for you.

2. They truly believe in and want strategic partnerships with their clients.

They hire and put top people working with you, bring innovation to you, and co-invest with you as they believe innovation is a win-win. If you win, they win. If you lose, they lose.

3. They are focused on “speed to value.”

It’s all about winning with your business, not theirs. Too often, suppliers (not partners) will sell you something, and whether you create value is your issue, not theirs. Strategic partners make value creation and speed to value their imperative.

What’s Next?

There are great opportunities out there to power innovation through analytics and digital transformation. Those that do will see their organizations thrive in a fast-changing landscape and be ready for tomorrow.

So, remember, be crazy enough to think you can change the world.


With the right approach to innovation, you can change the game. Get going.

About Author


Andy Walter

Fractal Analytics, Strategic Advisor

Andy Walter is a business results-driven professional with extensive experience in strategy, development, execution, and operations across Shared Services and IT. He led the Commercial Services & Delivery Organization (over 1500 IT and multifunctional professionals) for Procter & Gamble’s Global Business Services (GBS). He was responsible for IT & Shared Services for all Global Business Units and Markets around the world. His team was accountable for developing cutting-edge digital capabilities for Procter & Gamble to win “where it matters most,” with Consumers, Shoppers, and Retailers. This included all eBusiness, Consumer Services, BI/Analytics, Sales Force Solutions, Project Delivery, Business Process Services, and A&D / Company restructuring efforts.

 

Enhancing Trade Investment Management Effectiveness (TIME) for CPGs

You may think this article has a provocative title. In it, I’d like to highlight what I believe to be the key success factors that are integral to the journey towards achieving Trade Investment Management Effectiveness (TIME), and I’d also like to describe what ‘good practice’ looks like for each of those factors.

Trade Investment for most CPG companies is the biggest line item in the P&L after Cost of Goods Sold. In my experience, most CPG companies have ‘nth-degree-detailed’ focus, capabilities, analytics and knowledge when it comes to supply-chain efficiency, asset utilization and cost-to-serve dynamics. However, when it comes to Trade Investment Management Effectiveness (TIME), most CPG companies have much less of an understanding of what’s working, what’s not, why and how to drive higher ROI and improved business results from that investment. But first, I’d like to stress the importance of the concept of Trade Investment as opposed to Trade Spend as it is often referred to.

Spend vs. Investment

As people who know me will (painfully) attest, this is a real point of principle and paramount importance for me. When you look up ‘spend’ in the dictionary, it states: ‘to pay out, disburse, or expend; dispose of (money, wealth, resources, etc.)’. To me, spend does not describe the purpose of trade monies. When you look up ‘investment’ in the dictionary, it states: ‘the investing of money or capital in order to gain profitable returns, as interest, income, or appreciation in value.’ Now to me, investment sounds much more like how CPG CFOs would describe the purpose of and the requirement from budgeted trade monies. This may seem like a nuance in wording, but it’s very important in terms of mindset and culture for an organization as it seeks to achieve TIME.

Now let me introduce – in my experience – the 6 key success factors to achieving TIME:


Focus!

The first of the success factors that I’ll cover is Focus. In this instance, by Focus, I mean Focus on success criteria and balanced objective setting. There are inherent trade-offs between using Trade Investment to drive volume vs. driving ROI vs. driving mix and % margin. All too often, there will be stated goals for one of these metrics for a brand / portfolio in an organization, but no communicated guardrails relating to the other metrics. As an example, if I had $10 for every time I’ve been asked ‘Did that promotion work?’ in my career, I’d be taking private villa family holidays in Barbados every year! The answer is straightforward, as are the lessons: ‘Yes! If your objective was to drive volume. The promotion was a BOGOF, so of course it did, but look at the state of your P&L…’

In order to set Focused, balanced objectives and deploy Trade Investment effectively, companies need objectives that are clearly communicated, e.g., ‘Grow volume by 10% and $ Profit by 5% with no more than a 100bps decrease in % margin.’ This is clear across all key metrics and very easy for everyone in the organization to be measured against. Not easy to achieve, I agree, both in terms of objective setting and objective achieving. But not only does this Focus drive clarity, it also forces more rigor into the objective-setting process, meaning the combination of metrics has to be modelled out and deemed achievable in unison, rather than setting an unachievable combination of objectives and setting the organization (and the individuals within it) up for failure.

Visibility!

The second success factor I’ll address is having the right Visibility into how Trade Investment is actually invested and the purpose of the investment (i.e., what are we aiming to achieve from the investment, and how is it actually recorded internally?). These factors, in turn, provide the Visibility needed to drive the analytic capability. After all, if you can’t see it, you can’t measure it. If you can’t measure it, you can’t understand it. If you can’t understand it, how on earth do you make solid business decisions on the often $1B+ global Trade Investment budget?

Usually, Trade Investment is recorded by financial controls criteria: off-invoice vs. billback / retrospective payments; lump sum vs. variable cost. I’d like to put forward that whilst these are legitimate accounting methods, with clear criteria and rules to record by, they are not representative of the impact that those investments were intended to create relative to strategy and the desired consumer behavioral response.

I also propose that classifying Trade Investment according to intended impact and desired consumer behavioral response is far more useful for feeding the analytics needed to understand what is working, how, and why, and for improving business performance. For example, see the figure below for suggested classifications and goals of the investments.

[Figure: suggested Trade Investment classifications and their goals]

Using the above classifications, aligned to the purpose and desired outcome of these investments, we can far more clearly analyze the key drivers to invest more in and those activities that we should discontinue or at least minimize.

I should add here that the notion that increased Trade Investment is bad and decreased Trade Investment is good, which exists in a majority of CPGs, is in my opinion far too black and white versus the realities of the world of CPG and Retailing.

As an example, if I have a really breakthrough set of innovation products to launch this year, should I really be aiming to keep Trade Investment flat or decrease it? If I try to do that without the Visibility framework I detailed above, I will invest in the innovation at the expense of the core portfolio (from where the savings need to be made to fund it). As this continues, my innovations (the Children) are fed and looked after, while the core portfolio (the Adults), who bring home the revenue to feed the Children, gets weaker and, longer term, may not be able to feed the adolescents.

Using the Trade Investment Visibility approach, I would argue that in the above situation, a year of increased Trade Investment is not only acceptable, but possibly desirable, providing that all the increase is being attributed to the breakthrough innovations in the classifications of InStore Execution and Consumer Price – the objectives of which are to drive Visibility, awareness and trial of the innovation. This is very much in support of the overall longer term business objectives.

If we now have clear Visibility to strategically aligned investments / fluctuations / explanatory factors around investment decisions and ROI, next comes ‘Did it work?’

Knowledge!

The third area I’ll cover is Knowledge. By that I mean a full and regular diet of Pricing and Promotional Effectiveness Analytics that, wherever possible, covers the impacts of the differing in-market executions on all major metrics: volume, retail revenue, net sales and gross profit.

This includes:

  • Impacts of everyday price and combinations of promotional executions on each of these metrics including all cannibalization and competitive switching effects
  • Impacts of all of these on manufacturer profitability, retailer profitability and how the profit pool is evolving between manufacturer and retailer

Within each promotional execution, the precise investment Visibility created will break down the investment components which in turn drive the delivered metrics. Tying these back to rounded Focus objectives and guardrail parameters allows much more accurate planning, monitoring and data-driven decision making.

Given the size of the investment that most CPGs have in Trade Investment, it amazes me how few companies have an institutionalized approach to the Knowledge of how these investments work and are performing. A former boss of mine used to have a mantra that I’d like to share here: ‘You cannot save yourself rich.’ This clearly links to my earlier point about thinking of Trade as an ‘Investment’ rather than referring to it as ‘Spend’. If a CPG company invests $1B globally in Trade, surely it’s sensible to set aside 0.25%–0.5% of that (i.e., $2.5M–$5M) to get detailed Knowledge of the performance and drivers of the investment results, so those investments can be refined in the future to improve business results?

If I were running a private business, I would approach this with a ‘My Money Mindset’, and this level of investment in Knowledge vs. the overall investment that Knowledge would influence seems a no-brainer to enable my organization to make far more decisions based on ‘what I know’ vs. ‘what I think’.

Controls!

Having established the Focus and Visibility and the Knowledge of best uses of investment to drive the business objectives, the next challenge is to embed all of this into standard business processes, such as Annual Operating Planning, Sales & Operations Planning, but also to feed into brand / retailer strategy plans and playbooks.

For example, for Brand A in Retailer X, we need to do more of B and stop doing C. This sounds very prescriptive. There will always be exceptions (and often these can turn out to be the most difficult decisions and the most relationship-led, successful investments there are!). But there need to be clearly defined playbook guidelines, aligned to the overall Focus-balanced objectives. And when exceptions to those arise, there are clear escalation and approval / ‘no go’ processes that can be invoked quickly (either through regular meeting cycles or through exception meetings of a select group to make a decision) based on Knowledge and the contextual business situation. Without Focus, Visibility and Knowledge, these exceptions would never be flagged; they would just happen, and unknown / unapproved consequences would ensue.

Execution!

Execution probably warrants an entire article in its own right; such are the complexities of every single executional negotiation of Trade Investment. But this is an article, not a book, so I’ll keep it short.

With Focus, Visibility, Knowledge and Control, the time is now for the rubber to hit the road. These four areas combine to provide a playbook for the sales teams to execute with excellence and bring home the Focused business objectives. This means: ensuring the right information gets to the right people, at the right time, to enable them to sell. The Execution of the playbook plans is where the money is made and where the ROI increases become realized.

Investment in the right technology and systems is key to enable this two-way communication, as quickly as possible. What to execute and compliance reporting are paramount in capitalizing on all the good work that has been done internally regarding Focus, Visibility, Knowledge and Control. Without Execution, the rest is rendered almost academic.

Do It Again and Again!

There is a reason that CPG in Europe is called FMCG. F = Fast. The marketplace, consumer attitudes and retailer agendas can change quickly. So the need to continually update and improve on these six areas is inevitable. This is not a one-time exercise. This is a continuous improvement journey. Focused objectives and guardrails need revisiting. Trade Investment Visibility needs maintaining, improving and adapting to innovative in-market executions of investment. A regular diet of up-to-date Knowledge is required to keep guidelines, playbooks and executional excellence on track. Controls will always remain important, but so will the escalation process to manage and veto exceptions (you can’t save yourself rich!). Execution is Execution. Without it, nothing is going to improve.

Closing Comments

I believe that most CPGs are moving ahead on some, if not all, of these areas. But I’d love to hear from anyone who believes they’ve fully cracked the Trade Investment Management Effectiveness (TIME) challenge, along with the six areas I’ve detailed as the key success factors.

There is also a cultural change management component required to truly embed and embrace this ethos, which I’ve touched on in parts and will summarize here. Within that cultural change is a language element. No longer should we refer to ‘Trade Spend’, only ‘Trade Investment’. Mindset shifts should follow. People should no longer think, ‘My budget. I must spend it or else I’ll get a lower budget next year,’ and should instead be encouraged to think, ‘If it were my money, would I invest it that way?’


Focus, Visibility, Knowledge, Control, Execution and Do it again and again.

In order to be successful on the journey to enhance TIME, believe it or not, Business Practices and Culture need to become a lot more FVKCED!

About Author


Chris Dootson

Client Partner, CPG Retail Consulting, Fractal Analytics

Chris leads Fractal’s work in CPG and Retail with his domain thought leadership and client consulting expertise. He has 20 years of experience in the CPG industry, starting with IRI in the area of Retail. He spent 13 years with Kellogg Company, where he held various roles across CMI, CatMan, Brand, Sales, S&OP, Revenue & Trade Management. He was also a member of Sales and Marketing Leadership Teams. Chris moved to Kimberly Clark as Senior Director – Global Analytics, where he led global best practice deployment in Sales and Marketing ROI. Chris holds a BA in European Marketing from the University of Hull in the UK. He also holds a diploma from the Chartered Institute of Marketing in the EU.

Chris has a real passion for using advanced analytics to tackle very real business challenges. He believes that companies invest huge amounts in Marketing and Sales, but most invest little in understanding the ROI of these investments. He believes those that do will be winners, and he wants to help drive that success.

Better AI needs better design
AI’s apparel eye

ABSTRACT

The “FashionAI Global Challenge 2018 – Attributes Recognition of Apparel” was conducted to push the ability of AI to help the fashion industry recognize the attributes of clothing from a given image. This capability could be widely applied in applications such as apparel image search, navigation tagging, mix-and-match recommendations, etc. The competition was hosted on Tianchi, the Alibaba Cloud competitions site. The dataset released for the competition is the largest dataset available in the domain of apparel attribute recognition. We finished the competition in 30th position out of 2,950 contestants across the world. In the sections that follow, we define the dataset, our approach, other experiments, results, conclusions and further ideas, and references.

THE DATASET

Apparel attributes are basic knowledge in the fashion field, and they are large and complex. The competition provided us with a hierarchical attributes tree as a structured classification target to describe the cognitive process of apparel, which is shown below. The “subject” refers to an item of apparel. Our focus for the competition was on the characteristics of the apparel.

[Figure: hierarchical apparel attributes tree]

Data provided has eight categories, each representing a clothing type. Each category is further broken down into labels defining it in terms of design or length. If the design or length is not clearly visible in the image, an invisible label is assigned. Tables below show the categories, the labels in them and the total number of images inside each label:

[Tables: categories, their labels, and the number of images per label]

EXAMPLE IMAGES

CATEGORY SKIRT LENGTH LABELS

[Example images: skirt length category labels]

CHALLENGES IN THE DATA

  • In some images, the way the person is posing might obscure the design or length of the clothes. For example, if the person is sitting, a floor length skirt might seem like an ankle length skirt
  • The background in the image also added to the noise. For some images, it merged with the dress color, making it difficult for the model to distinguish between the dress boundary and the background
  • Also, in some cases, the model couldn’t differentiate between clothes of similar length (example, knee vs. midi length skirt)

Data is given to us in separate folders for each category. This eliminates the requirement to first predict the category and then the labels inside it; we only need to predict the labels within each category. As obvious as it may sound, the test dataset has a similar structure.

OUR APPROACH

Convolutional neural networks (CNNs) are the standard technique for image classification problems. We used the same technique, and our approach can be divided into four parts:

A. Preprocessing the images and data augmentation

B. Choosing network architecture of CNN

C. Optimizing the parameters of the network

D. Test time augmentation

A. PREPROCESSING THE IMAGES

We normalized the pixel values of the images (0-255) by subtracting the mean, to suit the network architecture used. We applied transformations such as zooming (1-1.1x), adjusting the image contrast (randomly between 0-0.05), rotation (randomly between 0-10 degrees) and flipping the images. This helped make the model more invariant to orientation and illumination in the image. In every epoch, a random transformation was chosen, so that in every epoch we show a different version of the same image, thus preventing the network from overfitting to a fixed set of images.
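
A rough sketch of this kind of augmentation pipeline is shown below. It assumes torchvision; the exact transforms and parameter values in our pipeline came from the fastai library, so treat the specific values here as illustrative.

```python
# Minimal augmentation sketch (torchvision). The values mirror the ranges
# described above: mild zoom, slight contrast jitter, up to 10 degrees of
# rotation, horizontal flips, and normalization by dataset statistics.
from torchvision import transforms

train_tfms = transforms.Compose([
    transforms.RandomResizedCrop(224, scale=(0.9, 1.0)),   # mild zoom
    transforms.ColorJitter(contrast=0.05),                 # small contrast change
    transforms.RandomRotation(10),                          # up to 10 degrees
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],        # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])
```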

B. CHOOSING NETWORK ARCHITECTURE OF CNN

We used transfer learning to solve this problem. Transfer learning means using a model that was trained for another task to assist in solving the problem at hand. This helps in creating the initial base features and avoids training the model from scratch when you have limited data and computational resources. We took a network trained on ImageNet data as a starting point. ImageNet is a large database of images, and every year many researchers try to improve upon the accuracy of classifying the objects in ImageNet and submit their results to the ImageNet Large Scale Visual Recognition Challenge (ILSVRC). This challenge has 1,000 categories to predict. To suit the problem at hand, the final output layer after the Fully Connected layers (FC layers) in the architecture was replaced with one sized to the number of labels of the given category.
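
As an illustration, replacing the pretrained ImageNet head with a new head sized to one category’s labels could look like the sketch below. It assumes torchvision’s ResNeXt-101 (32x8d) variant and a two-layer 512-unit head, matching the FC layers described later; the exact variant, head, and label count are assumptions.

```python
# Sketch: load an ImageNet-pretrained backbone and replace the final layer
# with a head sized to the labels of the category being predicted.
import torch.nn as nn
from torchvision import models

def build_model(num_labels: int) -> nn.Module:
    model = models.resnext101_32x8d(pretrained=True)   # ImageNet weights as the starting point
    in_features = model.fc.in_features
    # Swap the 1,000-class ImageNet head for two 512-unit FC layers plus an
    # output layer for this category's labels.
    model.fc = nn.Sequential(
        nn.Linear(in_features, 512), nn.ReLU(), nn.Dropout(0.5),
        nn.Linear(512, 512), nn.ReLU(), nn.Dropout(0.5),
        nn.Linear(512, num_labels),
    )
    return model

model = build_model(num_labels=6)  # hypothetical label count for one category
```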

We experimented with different types of residual networks: ResNet [1], ResNeXt [2] and the current state-of-the-art architectures NASNet and SENet. In our experiments, ResNeXt gave better results than the other architectures when considering both accuracy and computational time.

C. OPTIMIZING THE PARAMETERS OF THE NETWORK

FINDING THE LEARNING RATE:

Choosing a starting value of the learning rate is highly important to ensure convergence of the network parameters to the optimal value.

Leslie Smith (a researcher in the field of deep learning), in his recent work “Cyclical Learning Rates for Training Neural Networks” [3], describes a method for choosing an initial learning rate for a given problem. In summary, the idea is to start with a very small learning rate and gradually increase it, in powers of 2 or 10, for every iteration in the epoch. Initially, when the learning rate is too small, the error will decrease at a very slow rate. If you keep increasing it, at some point the learning rate becomes so high that the error skips past the minimum and starts shooting upwards. This indicates that a learning rate beyond this point shouldn’t be chosen, as it has become too high for the parameters to converge.

The learning rate finder for the ResNeXt-101 architecture, when trained on one category of clothing, showed the loss decreasing sharply between 10^-4 and 10^-3, starting to increase from 10^-2, and rising drastically at 10^-1. Ideally, we should choose a learning rate between 10^-4 and 10^-3. We chose 10^-4 to accommodate another technique of adjusting the learning rate.
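
A bare-bones version of such a learning rate range test could be sketched as follows. This assumes a standard PyTorch setup; `model`, `train_loader`, and `criterion` stand in for the pieces described above, and the specific stopping rule is an assumption.

```python
# Sketch of a learning rate range test: start tiny, grow the LR each batch,
# and record the loss so the "elbow" before divergence can be chosen.
import torch

def lr_range_test(model, train_loader, criterion, lr_min=1e-7, lr_max=1.0):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_min, momentum=0.9)
    mult = (lr_max / lr_min) ** (1 / max(len(train_loader) - 1, 1))  # geometric growth
    lr, lrs, losses = lr_min, [], []
    model.train()
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        lrs.append(lr)
        losses.append(loss.item())
        if losses[-1] > 4 * min(losses):       # stop once the loss blows up
            break
        lr *= mult
        for group in optimizer.param_groups:   # apply the increased learning rate
            group["lr"] = lr
    return lrs, losses                          # plot losses vs. lrs and pick the elbow
```
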
CYCLIC LEARNING RATE:

The same work on “Cyclical Learning Rates for Training Neural Networks” [3] also points out that instead of keeping the learning rate constant across the epochs, it can be made cyclical. In summary, the number of epochs can equal the number of cycles, and in each cycle the learning rate resets back to the original learning rate (the rate chosen from the learning rate finder above). Within a cycle, the learning rate decreases gradually for each batch in a cosine fashion. This process helps the network escape narrow regions (local minima) of the error surface and favors wider regions.

After running for a few cycles, we can increase the length of the cycles so that the learning rate decreases more gradually and helps the weights converge. When the cycle length is two, the learning rate of the next epoch continues from the latest learning rate of the last epoch rather than resetting.

A total of seven epochs were run in the case shown here: the total number of cycles is three, and each cycle is twice the length of the previous one. The last cycle ran for four epochs, the second cycle for two epochs, and the first for one epoch.
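
This restart schedule (cycle lengths of 1, 2, and then 4 epochs) can be reproduced with PyTorch’s built-in cosine-annealing-with-warm-restarts scheduler; the sketch below reuses the `model`, `train_loader`, and `criterion` placeholders from the earlier snippets.

```python
# Sketch: cyclical (cosine-annealed) learning rate with warm restarts.
# T_0=1 with T_mult=2 gives cycles of 1, 2 and 4 epochs: 7 epochs, 3 cycles.
import torch

optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)  # lr from the range test
scheduler = torch.optim.lr_scheduler.CosineAnnealingWarmRestarts(
    optimizer, T_0=1, T_mult=2, eta_min=1e-6)

for epoch in range(7):
    for step, (images, labels) in enumerate(train_loader):
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        # Fractional epoch so the LR decays smoothly within each cycle.
        scheduler.step(epoch + step / len(train_loader))
```
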
LOSS VERSUS THE ITERATIONS:

The error surface is not smooth, and the concept of a cyclic learning rate can help us jump past narrow regions of the error surface. We increased the number of cycles and saw that the loss remained constant, indicating that the solution found does not sit in a narrow region of the error surface.

UNFREEZING THE LAYERS AND DIFFERENTIAL LEARNING RATES:

As we are using a pretrained architecture and performing transfer learning, not all layers require additional training. Since the architectures are state of the art on ImageNet, they are already good at identifying low-level abstract features like boundaries and edges. Those are captured in the first few layers of the architecture, which therefore don’t require much re-training.

We chose different learning rates for different parts of the network, and the layers are grouped into three parts. The first part corresponds to the initial set of layers, the second part corresponds to the layers in the middle, and the third part corresponds to the last set of layers plus the FC layers (Fully Connected layers).

Two steps that are used to train the network are listed below:

  • Initially, the network is frozen for all layers except the last Fully Connected layers. By frozen, we mean those layers are not trained; we simply compute the values up to the layers before the FC layers and tune only the weights between the Fully Connected layers (two layers of size 512) and the output layer (whose size depends on the category we are predicting).
  • Next, the network is unfrozen, i.e., all the layers are made trainable. Now, the learning rate for the three groups of layers is set by the rule of thumb of [lr/100, lr/10, lr] (in that order) for images like ImageNet, but in our case [lr/100, lr, lr] proved to work well: the information captured in the middle layers is just as important as the information captured in the layers near the FC layers. (Here, “lr” refers to the learning rate.)

This concept of using different learning rates across different layer groups is termed differential learning rates [4].
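
In plain PyTorch terms, the two steps above translate roughly to the sketch below: first train only the new head, then unfreeze everything and assign per-group learning rates of [lr/100, lr, lr]. The exact layer grouping is an assumption for illustration (our fastai-based code grouped layers for us).

```python
# Sketch: (1) train only the new FC head, (2) unfreeze all layers and use
# differential learning rates across three layer groups: [lr/100, lr, lr].
import torch

lr = 1e-4

# Step 1: freeze the pretrained backbone and train only the FC head.
for param in model.parameters():
    param.requires_grad = False
for param in model.fc.parameters():
    param.requires_grad = True
head_optimizer = torch.optim.SGD(model.fc.parameters(), lr=lr, momentum=0.9)

# Step 2: unfreeze everything and assign one learning rate per layer group.
for param in model.parameters():
    param.requires_grad = True
early  = list(model.conv1.parameters()) + list(model.bn1.parameters()) \
       + list(model.layer1.parameters()) + list(model.layer2.parameters())
middle = list(model.layer3.parameters())
late   = list(model.layer4.parameters()) + list(model.fc.parameters())
optimizer = torch.optim.SGD([
    {"params": early,  "lr": lr / 100},
    {"params": middle, "lr": lr},
    {"params": late,   "lr": lr},
], momentum=0.9)
```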

INCREASING INPUT SIZE OF IMAGE GRADUALLY:

The images we received in the data are mostly 512 pixels, and we resized them to 224 (since most ImageNet images are of this size) for the initial tuning of the weights. We then resized the images to 299 and ran the same number of epochs, using the final weights generated at 224 as the initial weights. Finally, we resized all the images to 512 pixels and used the weights generated at 299 pixels as the initial weights.
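
The resulting training loop amounts to the short sketch below, where `make_loader(size)` is a hypothetical helper that rebuilds the transforms and dataloaders at a given resolution and `train(...)` is a training routine like the one sketched earlier.

```python
# Sketch: progressive resizing. Train at 224, reuse the weights at 299,
# then again at 512, so each stage starts from already-reasonable weights.
for size in (224, 299, 512):
    train_loader = make_loader(size)      # hypothetical helper: rebuild transforms/loaders
    train(model, train_loader, epochs=7)  # weights carry over between sizes
```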

The advantages were twofold:

  • We get a computational time advantage, since larger images increase the time it takes to tune the weights of the network. Hence, by providing the weights obtained from the smaller image sizes for the same problem, we start from near-optimal weights, and the network converges in less time than it otherwise would.
  • We get accuracy gains from providing the data at different sizes; classes that are very far apart (e.g., sleeveless vs. wrist-length sleeves) are already taken care of at the smaller sizes, while the nearby classes are classified more accurately with higher-resolution input images.

D. TEST TIME AUGMENTATION

During prediction, we applied the same transformation parameters that we used during training and generated eight images. We chose four of them randomly and predicted on this set of images as well as on the original image. We averaged the prediction probabilities, and this increased the accuracy of the predictions obtained. We believe one possible reason is that some center cropping can happen while resizing an image, which can result in a loss of information from the sides. When we apply the transformations, that information is captured in one or more of the images, and averaging the probabilities increases the accuracy of the model.
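
A minimal sketch of this test-time augmentation is shown below. It assumes the trained `model`, the training-time transforms (`train_tfms`) from the earlier sketch, and a plain resize/normalize pipeline `eval_tfms` for the original image; all of these names are placeholders.

```python
# Sketch: test-time augmentation. Predict on the original image plus several
# randomly augmented copies and average the class probabilities.
import torch
import torch.nn.functional as F

def predict_with_tta(model, pil_image, train_tfms, eval_tfms, n_aug=4):
    model.eval()
    batch = [eval_tfms(pil_image)]                           # the original (resized) image
    batch += [train_tfms(pil_image) for _ in range(n_aug)]   # random augmented copies
    with torch.no_grad():
        logits = model(torch.stack(batch))
        probs = F.softmax(logits, dim=1)
    return probs.mean(dim=0)                                 # averaged class probabilities
```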

OTHER EXPERIMENTS

As discussed earlier, a gradual increase in image size brings accuracy improvements, but most of the original images are 512 pixels; hence, we applied super resolution (a deep learning based method to upscale images to a higher resolution). We resized the images to 1024 pixels and performed similar experiments, but we didn’t get accuracy improvements, and computational time grew exponentially beyond 720 pixels.

During the semi-finals of the competition, we were provided with a dataset containing images of apparel hanging on a hanger or on a wall (i.e., not worn by humans). We used a similar approach but used YOLO to separate the hanger images from the human images and built separate models. However, the combined model trained on both human and hanger images always gave better results than the separate models.

RESULTS

The results for all the categories follow similar trends across the experiments. Hence, we present the results of one category: skirt length (the same category for which we provided example images).

COMPARISON OF RESULTS ACROSS ARCHITECTURES:

We started with the ResNet architecture and moved on to ResNeXt-50 and ResNeXt-101. ResNeXt-101 outperformed all the other architectures, as shown in the results below. Notation: Epoch: the number indicates the cycle index, starting at zero; trn_loss: log loss on the training data; val_loss: log loss on the validation data; accuracy: classification accuracy on the validation data.

It can also be observed that the TTA has always provided an improvement in the prediction accuracy when compared to the last epoch’s accuracy.

[Table: results across architectures (epoch, trn_loss, val_loss, accuracy)]

SEQUENTIALLY INCREASING THE SIZE OF INPUT IMAGE:

Choosing the best performing architecture, ResNeXt-101, we sequentially increased the size of the input image. Accuracy increased from 85.39% at 224 pixels to 88.25% at 512 pixels.


CONFUSION MATRIX (FOR THE SAME CATEGORY: SKIRT LENGTH):

There is no confusion between short length and floor length (i.e., categories that we as humans can also distinguish very accurately), but the model struggles to correctly classify nearby classes. We tried other approaches, such as modeling the nearby categories separately and making changes to the loss function, but none of them solved the problem of confusion between nearby classes.


CONCLUSIONS & FURTHER IDEAS

The way the learning rate is chosen is very important for the convergence of neural nets, and the way the network is optimized is another important step in the model-building process. However, we feel that the current state-of-the-art architectures throw away a lot of information by the time it reaches the FC layers. Taking all the activations would make the parameter space exponentially bigger, causing overfitting and increasing the time it takes to tune the network. Current methods take the average value of each channel before the FC layers; by doing so, we lose the detailed information captured up to that layer. We propose the following ideas to improve on this:

  • We tried XGBoost at the end of the competition by taking all the activation values from all the filters in the layer before the FC layers (the final convolutional layer); a rough sketch of this idea is shown after the list. We observed that we can get better results compared to just taking the average values of those filters. Due to the time constraint of the competition, we haven’t been able to complete the experiment, and we will publish those results soon. In summary, using XGBoost on the activations of the filters just before the FC layers could help boost the accuracy on nearby classes by capturing some of the detailed information.
  • The approach of bagging could also help, where we selectively expose all the activations of a few important filters based on their weight importance, repeat this multiple times, and take an average. This might help us capture the detailed information from some of the important filters.
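
A rough sketch of the first idea (the experiment we did not finish): capture the full activation maps just before global average pooling with a forward hook, flatten them, and fit a gradient-boosted classifier on top. This assumes the xgboost package and the torchvision backbone from the earlier sketches; it illustrates the idea rather than reproducing the code we ran.

```python
# Sketch: train XGBoost on the raw activations of the last conv block
# (before global average pooling) instead of only their channel averages.
import numpy as np
import torch
import xgboost as xgb

features = []

def save_activations(module, inputs, output):
    # output shape: (batch, channels, h, w) -> flatten all filter activations
    features.append(output.flatten(start_dim=1).cpu().numpy())

handle = model.layer4.register_forward_hook(save_activations)  # last conv block

X_parts, y_parts = [], []
model.eval()
with torch.no_grad():
    for images, labels in train_loader:
        model(images)                    # the hook stores this batch's activations
        X_parts.append(features.pop())
        y_parts.append(labels.numpy())
handle.remove()

X, y = np.concatenate(X_parts), np.concatenate(y_parts)
clf = xgb.XGBClassifier(n_estimators=300, max_depth=6, learning_rate=0.1)
clf.fit(X, y)                            # gradient-boosted classifier on raw activations
```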

AUTHORS

ABHEET AGGARWAL

Data Scientist

Mathematics and Scientific Computing graduate from IIT Kanpur. Fascinated by the mathematics behind machine learning. Interested in applications of deep learning, especially in the field of computer vision.

EKTA SINGH

Senior Data Scientist

Material Science graduate from IIT Roorkee. Enjoys problem solving and has worked on creating data driven solutions for various domains like Insurance, Supply Chain, CPG.

GEETHASAIKRISHNA ANUMUKONDA

Data Scientist

Material Science graduate from NIT Warangal. Passionate about solving problems that create a meaningful impact to the society/business using machine learning and deep learning.

REFERENCES

      1. ResNet. Deep Residual Learning for Image Recognition. https://arxiv.org/pdf/1512.03385.pdf
      2. ResNeXt. Aggregated Residual Transformations for Deep Neural Networks. https://arxiv.org/pdf/1611.05431.pdf
      3. Cyclical Learning Rates for Training Neural Networks. https://arxiv.org/pdf/1506.01186.pdf
      4. Lectures by fast.ai. http://www.fast.ai/

ACKNOWLEDGEMENTS

We would like to thank Jeremy Howard and Rachel Thomas for generously helping everyone learn the state of the art deep learning techniques through their lectures at fast.ai [4]. Many of our ideas are inspired from the lectures there.

 

How Fidelity’s automated digital advice solution is transforming wealth management for investors and advisors

Automated digital advice platforms, commonly known as robo-advisory platforms, have seen accelerated adoption over the last three years due to the promise of low-cost wealth advice, enabling wealth managers to serve new profit pools of mass and mass-affluent clients. Legacy wealth and brokerage firms are vying to build rapid scale and leadership in this new market. The initial accelerated adoption is slowly reaching a plateau as leaders compete to scale to the next heights by incorporating new data, AI, and digital experiences.

Fidelity’s bold foray into this space with the AMP platform is generating keen interest and adoption from leading banks and RIAs. And the best is yet to come.

Read the perspective of Vinod Raman, Head of Digital Advice Solutions at Fidelity, as he talks to Arpan Dasgupta, Head of Financial Services at Fractal Analytics.

Arpan Dasgupta: As the builder and product owner of the Fidelity AMP robo-advisory solution, can you tell us a little bit more about the platform?

Vinod Raman: Fidelity AMP is an end-to-end digital wealth management solution. The first leg of the stool is the interaction with end investors and advisors, which offers recommendations and ongoing digital services in a goal-oriented fashion.

The second leg of the stool is Fidelity-offered brokerage services, such as account opening, money movement, and transfer of assets. The core brokerage capabilities, trading, and custody services are offered by Fidelity.

The third leg of the stool is Geode Capital Management offering core investment management services.

Combining this all together, AMP looks at investors’ holistic risk profiles and offers recommendations towards achieving their goals, such as buying a home in three years.

Once that happens, the brokerage services kick in. In a matter of minutes, the investor can open and fund that account. Then, the investment management services kick in, where Geode starts managing that portfolio. From there, the investor continues to interact with the portal and gets access to all the brokerage capabilities and sophisticated investment management capabilities.

This end-to-end solution has received tremendous traction in the last year or so since we launched. We’ve had big banks, broker dealers, and RIAs go live, and that’s where we are right now.

Arpan Dasgupta: Why is robo-advisory important for Fidelity? Is the industry heading towards automated digital advice?

Vinod Raman: Absolutely. The industry is clearly heading there. There’s been a proliferation of robo-advisory solutions that are gaining market share. Providers include startups as well as incumbents.

Aside from industry dynamics, there are three important reasons, from my perspective, that we decided to offer robo-advisory services.

One is access to a new customer base. A lot of our clients – such as RIAs, banks, and broker dealers – don’t have the time or the ability now to manage clients below a certain asset base like $500K-$1M. Robo-advisory is opening a whole new segment of customers that these firms can begin to target with different strategies. An RIA might be interested in serving the family of an HNI that, traditionally, they couldn’t have served profitably. A bank might be interested in cross-selling digital capabilities along with the lending and savings accounts they currently offer. A broker dealer might want to bring someone who’s under half a million onto this digital platform, serve them in an efficient manner, and eventually transition them into a full-service model. So, it opens your world to a whole new customer base or asset base. That’s one reason.

Secondly, investors are asking for more digital technologies that are delivered in a simple, intuitive, and digital fashion. We also want to out-serve other financial services competitors that are providing a full suite of digital capabilities and services online to financial institutions and their end investors.

The third important reason is democratizing financial education. Especially with lower asset base investors, financial education is still in its nascent stages. It’s not easy for a layman to understand sophisticated financial products, where they should invest, and how they should invest. Robo-advisory solutions allow you to leverage data, provide simple and intuitive digital experiences, offer financial products that people have not had access to, and help with financial education.

It’s the combination of those three forces that motivated Fidelity to offer our own digital advice solution.

Arpan Dasgupta: Since the AMP platform is targeted to end investors, some of whom may not be financially savvy, how well is the platform adapted to be relevant and targeted with recommendations for individual investors?

Vinod Raman: We’ve done a lot already with Fidelity AMP, which has set it apart in the market. It’s the nation’s first planning-oriented, goals-based digital advice solution. As part of the workflow, we’ve been able to not only collect data but also intelligently analyze it and provide the right recommendations. Additionally, eMoney allows the advisor to collect some information from the investor and other aggregators. The advisor can then run analyses such as: Does this investor have debt somewhere? Has this investor aggressively invested in another portfolio? How is this investor doing with respect to retirement?

The advisor can then come up with additional tailored recommendations for the investor.

It starts with recommending a portfolio, but it doesn’t stop there. It allows the advisor to collaborate with the end investor to intelligently source information from a host of data sources, such as banks, other financial institutions, and brokerage firms. Then, it determines the holistic financial profile of that investor across debt, savings, investments, and their overall risk profile. It goes as far as saying, “Is the investor aggressively invested in retirement versus not?” It’s a heavy, data-driven exercise with all the necessary regulatory guidance taken into consideration.

Arpan Dasgupta: All of this must have substantial investments around data, analytics, technology, and people. What did it take to bring it all together? What are some fundamental choices that you made that caused this product to be a success?

Vinod Raman: It has been a multi-year journey since we first developed the strategy.

Along the way, we have made many transformations in our business and operating model as we worked closely with internal and external partners.

A major ongoing transformation has been the move towards agile development (on the tech side). Fidelity acquired eMoney – a financial planning software company – and the underlying AMP technology combines Fidelity’s and eMoney’s software. However, eMoney is more of an agile shop, while Fidelity is more of a waterfall shop. We made key organizational and structural shifts to re-orient everybody to an agile mindset and collaborated closely with eMoney to accommodate such a large initiative as AMP.

The second big structural change was creating a partnership ecosystem after the strategic evaluation of build vs. buy. Apart from co-developing with eMoney, which was an excellent choice, we also partnered with a host of other providers. We partnered where it made sense, especially with respect to sourcing data, analyzing data, and digital experiences.

We also made a big shift culturally. We emphasized the spirit of pace over perfection, empowered our teams to quickly make decisions, and accelerated development.

We partnered closely in parallel with multiple internal groups such as finance and legal compliance.

Arpan Dasgupta: How do you see AMP evolving over the next couple of years?

Vinod Raman: We launched AMP in 2018 with different firms such as Fifth Third Bank, First Tennessee Bank, HD Vest Financial Services, and a couple of large RIAs. We have over 200 firms in the pipeline.

Going forward, our focus is three-pronged. One area of focus is to offer additional portfolio flexibility/configuration options to our clients by enhancing our platform capabilities.

The second area of focus is retention. How can we continuously learn about the investor, personalize our services, and retain the investor? We’ll continue to help our institutional clients do this more effectively and use data and technology to do that.

The third focus is around scale. Our pipeline continues to grow. Next year, our focus is to quickly get more clients on the platform.

Arpan Dasgupta: With recommendations coming from AI and machine-learning, there is always the question of adoption. Have advisors or end investors questioned whether the recommendation is right for them or whether they should believe it?

Vinod Raman: Yes. Especially in the advisor world, we’ve faced challenges around the advisors saying, “If the tool is intelligently taking all this information and providing advice in an unbiased fashion, is it replacing me?” So, that’s a big adoption barrier that we must break. Then, if you look at the end investor side, the tool is taking all this information and providing them the recommendation. The question they ask is, “How do I know this is right for me, and am I educated enough to know that this looks right based on what I think my financial profile is?”

We see these kinds of technology as augmenting the work that advisors do. The way we are approaching that within my team – and broadly at Fidelity as well – is that we are very use-case-driven. The big business use case has been acquisition. How do you acquire more investors? How do you get the right investor on the platform? Then, once you get them on the platform, how do you not only cross-sell, but how do you retain them? That’s because we’ve seen investors come onto the platform and open accounts but hesitate about funding them. Or after they fund the accounts, they start thinking about potentially closing the account and leaving the platform, especially since there was some volatility this year.

We prioritize these business use cases internally and marry them with how data and AI can drive incremental impact. We ask the questions: Is AI the right way to solve it? Does AI have everything it needs to be able to solve this problem? And if it does, then we say that’s the number one priority for us next year.

When it comes to adoption challenges, we are continuing to work on education both on the advisor side as well as on the investor side. One of the other big areas of focus this year has been practice management for advisors driven by three or four digital strategy consultants. The sole focus of these practice management consultants is to work with our clients, advisors, broker dealers, and banks to inform them about how the digital world is evolving, how wealth management itself is evolving, and help clients make that transition.

From the investor standpoint, our focus continues to be education. What else can we provide in their ongoing experience that really helps them understand what services we are offering and what products we are selling to improve the adoption metric on the investor side?

Arpan Dasgupta: Who do you see as competition in this space other than the startups? Do you see other institutional wealth managers also providing this service?

Vinod Raman: The one thing I keep telling my teams is, “Don’t think of traditional financial services firms as your competitor. Think of internet firms such as Amazon or Google as your competitor.” Because that’s what’s happening here in this space. Investors are not comparing the Fidelity experience with the experience they might get at other financial institutions. They want to compare that experience with what they get on Amazon. In my mind, it’s the advanced technology companies that are our competitors in the financial services world, more so from an experience, product delivery, and usage standpoint.

From a product innovation standpoint, we’ve done a lot with offering sophisticated financial products through the digital advice solution. But with everything becoming digital, the whole internet ecommerce space is competition. Investors are asking for more digital capabilities, and they want to do everything seamlessly at the click of a button.

This space is certainly up-and-coming and exciting. There’s a lot of innovation going on. What’s really exciting about the space is the true intersection of financial services and technology, powered by data and artificial intelligence. A lot of evolution is going to happen within the next 3-5 years, which is exciting for all of us.

Digital Transformation at AllianceBernstein

An interview with Koley Corte, SVP and Global Head of Business Transformation at AllianceBernstein

How AllianceBernstein is using digital to create better customer experiences in asset management

Digital is helping asset managers deliver more value to the customer experience. Still, going digital can be a complex enterprise-wide effort that requires answers to some challenging questions. For example, where does digital fit in with other business priorities? How can leaders get backing for digital? How can companies leverage, and not duplicate, digital capabilities?

AllianceBernstein is answering these questions. They are strategically deploying digital as a key piece in their business ecosystem. It’s helping them better understand their customers and deliver the right experiences at the right time and place. Through their vision, initiatives, and partnerships, they already have a lot to offer, and they are just getting started.

Read the perspective of Koley Corte, SVP and Global Head of Business Transformation at AllianceBernstein, as she talks to Arpan Dasgupta, Head of Financial Services at Fractal Analytics.

Fractal: What does digital mean for AllianceBernstein when it comes to customer experiences?

Koley Corte: Our main goal is to improve the customer experience and deliver more value to more customers. That means making it easier for customers to engage with us. Digital and data work hand in hand to help us know the customer better, anticipate and deliver against their needs, and reach more of them in greater depth.

Fractal: In taking care of customer needs, are you focused on reducing the need for a human interface?

Koley Corte: No. Our customers should be able to get the information they want, where and when they want it. Whether that’s self-service from a digital platform, engaging with someone on the phone, or meeting with someone in person, we should provide the experience to deliver what they want and anticipate what they need—at the right time and place.

Fractal: What digital channels are you focused on?

Koley Corte: We’re focused on delivering customer value by understanding and anticipating their needs and engaging with them through omnichannel experiences including social, email, and potentially voice, delivered through different kinds of devices. We’re digitally engaging with customers in a way that is platform agnostic. We have a lot of valuable content out there across multiple channels: web, social, and email outreach. There’s an opportunity to increasingly curate that experience and have all the channels work in harmony around the customer and needs.

Fractal: Where does digital fit into AllianceBernstein’s priorities?

Koley Corte: Digital and data are critical enablers for us to drive continued success with our clients and reach new customers. It works in tandem as part of an ecosystem around the customer, and not in exclusion. We have a clear vision and strategy, we have funded initiatives in place, and we are deploying digital and data-driven solutions across channels.

Fractal: What are some key decisions you’ve made to drive digital initiatives?

Koley Corte: We work alongside and with the organization. We partner outside the organization to bring in fintech skills and capabilities. We are working with different firms like Fractal to bring in data knowledge to accelerate our ability to deliver as well as figure out how to work together and truly partner with an operating model that supports the business.

Fractal: Internally, have you brought together a team that’s focused on digital?

Koley Corte: Yes. My role is broader than digital; it’s business transformation. We have built out a business transformation team in the Americas and recently hired our first team member in Asia. We are in the midst of building out a global model. We’ve supplemented internal resources, so we aren’t trying to duplicate capabilities – we are bringing in strategic thinking, the ability to drive results forward, and the partnerships to execute. We work with the business with a goal of weaponizing the current model and operating at scale.

Fractal: In accelerating your vision of improved customer experiences, what challenges have you experienced?

Koley Corte: The first challenge is managing an environment of experimenting and learning as a mechanism, as opposed to building the perfect mousetrap. We need to deliver on our goals today as well as for the future. Getting people comfortable with exploring the data, and giving partners access to that data, is a different model and a test-and-learn approach.

Onboarding new partners, tools, and services has taken longer than I’d like. It’s new for us. There are growing pains in our approach.

The other learning for us has been helping the organization balance short- and long-term priorities, which sometimes requires adjusting how we work.

Fractal: In prioritizing a digital initiative, how do you get people to agree on what the organization should focus on?

Koley Corte: We formed business transformation committees regionally to create more transparency and alignment within the organization. We have built shared goals with the regional business heads and their teams, so it’s embedded in their goals to both deliver the business of today and embrace delivering the business of tomorrow. I do a lot of championing, talking about it and building relationships. As we experiment and deliver highlights and proofs of concept, the more people see it, the more real it becomes, and we build momentum and a fear of missing out.

Fractal: In executing on those strategic decisions, what kinds of challenges have you faced?

Koley Corte: We make sure this remains a priority and gets resources and funding. We manage the data, so we see when things are working and when they are not, which lets us be agile in our thinking and approach and pivot along the way.

Fractal: Are there multiple partners involved here?

Koley Corte: We rely on a lot of internal business partners. We lead the governance around that. We work with partners to develop plans, and we hold people accountable to timelines. We first develop the concepts together with our customers in the business, and then we lay out realistic plans and put resources to deliver against those.

Fractal: How do you figure out what your customers need from digital experiences?

Koley Corte: We observe how customers engage with us, talk to them, and understand what they want. We have online tools to put things in front of customers or potential customers to understand their perspective. We look at the art of the possible: how have new capabilities evolved that we can leverage into understanding what our customers are trying to achieve?

Fractal: How are you leveraging learnings and capabilities across AllianceBernstein?

Koley Corte: We’ve formed an innovation group across the company. We meet regularly and share what we are doing to leverage any capabilities we can. We stay aware of what the other firms and strategic business units are focused on, and we share what we are learning. I spend a lot of time understanding what others are doing and making sure we aren’t reinventing, and instead, leveraging capabilities where we can.

Fractal: Are there any closing thoughts you’d like to share?

Koley Corte: This is a work in progress. We are following customers’ needs and their lead, and testing, learning, and experimenting to drive more value. It is early days. As a firm, we have a lot to offer, and we need to experiment with ways to reach customers even more effectively, penetrating the market more deeply and broadly. There is a lot of opportunity to continue to improve and grow.

Authors

Koley Corte

Senior Vice President and Global Head of Business Transformation, AllianceBernstein

A senior executive focused on leveraging data and digital to drive strategic change, Koley Corte is Senior Vice President and Global Head of Business Transformation at AllianceBernstein, where she develops transformative growth strategies for next-generation institutional and retail distribution. Her current areas of focus include artificial intelligence, sales enablement, demand generation, robotics and automation, predictive analytics, and new channel development.

Prior to joining AB, Koley was at Reed Elsevier (RELX), where she served as Senior Vice President and Head of Digital, Innovation and Customer Acquisition Strategies, Americas Region, for Reed Exhibitions (RX). While at RX, she focused on driving attendee acquisition and engagement, developing digital programs, assets, capabilities, and revenue, and improving analytics and insights. Previously, Koley was Vice President, Head of Market & Competitive Strategy and Integrated Campaign Management at TIAA, where she led teams responsible for developing the environmental backdrop, with insights and implications for revenue growth and diversification strategies, as well as the rebrand and large-scale external communication initiatives, including complex product redesigns. Earlier, Koley held progressive leadership roles at American International Group (AIG).

Koley received her MBA with distinction from the Leonard N. Stern School of Business at New York University, with a concentration in Finance, Management, and International Business, and her B.A. cum laude in Economics and Psychology from Brandeis University.

Arpan Dasgupta

Head of Financial Services, Fractal Analytics