Latecomers Beware: How Artificial Intelligence Spawns the Next Largest Divide

Posted: 04/06/2019 - 06:13
Artificial Intelligence (AI) is spawning the next largest divide and latecomers must beware. As science fiction author William Gibson so eloquently put it, “The future is already here – it's just not evenly distributed.”  
AI is undoubtedly the big story of this decade and the next. Organizations everywhere are enchanted with the notion of using AI to improve everything from customer experience to the identification of new market opportunities. However, there is a real and growing danger that can easily leave many far behind in the "have nots" camp. They may never benefit from applying AI to business operations. 
The reality is that most organizations lack the necessary knowledge and capabilities in two key categories critical to success: first, having a solid approach to identify opportunities to utilize machine learning; and second, implementing AI in a practical manner with ready access to the right information and the data science wherewithal to realize true improvement.  
How to Approach Incorporating AI  
While business media often covers AI as a general topic, it is anything but general. Apart from the concept of "artificial general intelligence," which would have machines truly think and adapt like humans, AI is very much a technology that must find the right "fit" within a given process to provide value. AI left to itself is no silver bullet. Approaching AI should not be an exercise left to itself either, but rather should be built into specific process improvement plans.  
As cited by consultancy HFS Research, organizations must look at the application of AI as one part of a three-legged stool, alongside automation and analytics, as applied to a specific business process. Their research on Robotic Process Automation (RPA) has shown that tackling the AI portion first provides more support for future RPA scalability, because the output of AI becomes a key enabling input to the automation portion. Tackling AI last can impose significant roadblocks to automation, because teams then have only a limited understanding of exactly how AI would be implemented to drive automation.  
The other key success criterion for establishing an approach to benefit from AI is identifying the business processes that can gain the most from its introduction. Since one key beneficial attribute of AI is that it can replace tedious, low-value human tasks, it is important to target processes where its introduction frees staff to focus on higher-value areas.  
Another consultancy, Everest Group, views the application of AI as a means to automate the various "input-oriented" tasks involved in business processes, such as data entry and enrichment, so that expensive staff can focus on "output-oriented" tasks such as taking meaning from the results. Automation of input tasks is certainly the main theme of initiatives such as "cognitive RPA" and "cognitive analytics," where the AI portion does the work of armies of data entry clerks and data scientists in shepherding business processes along.   
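As a toy illustration of automating an "input-oriented" task, the sketch below turns a free-text expense line into structured fields that downstream automation could consume. The format, field names and pattern are hypothetical examples, not anything described in the article; real "cognitive" capture systems use trained models rather than a fixed pattern.

```python
import re

# Hypothetical pattern for lines like "2019-04-06  Office supplies  $42.50"
EXPENSE_LINE = re.compile(r"(\d{4}-\d{2}-\d{2})\s+(.+?)\s+\$(\d+\.\d{2})")

def extract_record(text):
    """Turn one free-text expense line into a structured record --
    the kind of input-oriented task automation can take over."""
    m = EXPENSE_LINE.match(text)
    if m is None:
        return None  # route to a human for exception handling
    date, description, amount = m.groups()
    return {"date": date, "description": description, "amount": float(amount)}

print(extract_record("2019-04-06  Office supplies  $42.50"))
```

Lines the extractor cannot parse return `None`, which mirrors the common design of routing exceptions to staff while the automated path handles the routine volume.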
Tackling Moonshot Projects 
As our friends at Capgemini point out, you can look at AI as a way to tackle processes that you do not have currently because they are impossible without AI, often called “moonshot projects.” Alternatively, you can take a more pragmatic route by tackling processes that leverage AI’s speed and comprehensiveness. Processes such as claims adjudication and fraud detection require access to complex data that AI can parse to detect patterns, which can then be routed to the proper staff for further handling, if even necessary. 
The perspective of pragmatically tackling rote processes first is echoed in research presented by Harvard Business Review, which provides a useful construct by defining three types of AI: one applied to automation, another to delivering insight and a third to customer engagement. It is no surprise that automation-oriented AI delivers the quickest results, which aptly mirrors the advice of Everest Group.  
Data Science: the Key to Successful AI 
Even when an organization thoughtfully approaches AI as a component within an overall process reengineering effort, a significant hurdle remains: the ability to produce and curate the data necessary to power an AI-infused project. Even though many software vendors include AI in their products, customizing the desired outcomes to a given organization's own needs requires a highly disciplined approach to understanding what data is required, and in what form and quantity.  
Many high-profile AI initiatives by the likes of Google and other large, well-heeled enterprises have gone awry, producing output that is incorrect and potentially harmful. This is due to bias hidden in the data and in the training of AI algorithms—bias that all humans have but may fail to recognize. Overcoming this bias is one problem. Another challenge is simply understanding the data requirements and then producing a correctly representative data set to make AI work as intended.  
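One low-tech way to surface the representativeness problem described above is simply to measure how classes are distributed in a training set before any model is built. The following is a minimal, illustrative sketch; the label names and the 10% threshold are assumptions, not a standard, and real bias audits go far beyond class counts.

```python
from collections import Counter

def class_balance(labels):
    """Return each label's share of the data set."""
    counts = Counter(labels)
    total = len(labels)
    return {label: count / total for label, count in counts.items()}

def flag_underrepresented(labels, min_share=0.10):
    """Flag labels whose share falls below a chosen threshold --
    a crude warning sign that the training data may be skewed."""
    shares = class_balance(labels)
    return [label for label, share in shares.items() if share < min_share]

# Hypothetical claims-adjudication training labels
training_labels = ["approved"] * 90 + ["denied"] * 8 + ["escalated"] * 2
print(flag_underrepresented(training_labels))
```

A model trained on this set would see "escalated" cases only 2% of the time, so even a simple count like this can prompt the data collection work needed before training begins.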
Rise of Connected Systems 
The rise of connected systems did a lot to create the "big data" we have all heard about, but a large amount of data—in and of itself—is nothing. Organizations need to understand which "data features" matter to each particular process and then go about locating, curating and quality checking that data. This process is what data science is all about, but it is one that few organizations are adequately staffed to manage. Without data science, the outcome will reflect the old axiom "garbage in, garbage out." With black box-like machine learning, it may not be immediately apparent that the output is indeed garbage. The more complex the business process, the more complex the data requirements become.  
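The "locating, curating and quality checking" step can start with automated sanity checks before data ever reaches a model. Below is a minimal sketch of that idea; the field names and rejection rules are hypothetical, and production pipelines would apply far richer validation.

```python
def quality_check(records, required=("id", "amount")):
    """Split records into clean rows and rejects, recording why each
    row was rejected -- a tiny slice of the data-curation work that
    consumes so much of a data scientist's time."""
    clean, rejects = [], []
    seen_ids = set()
    for row in records:
        if any(row.get(field) is None for field in required):
            rejects.append((row, "missing field"))
        elif row["id"] in seen_ids:
            rejects.append((row, "duplicate id"))
        elif row["amount"] < 0:
            rejects.append((row, "negative amount"))
        else:
            seen_ids.add(row["id"])
            clean.append(row)
    return clean, rejects

rows = [
    {"id": 1, "amount": 20.0},
    {"id": 1, "amount": 20.0},   # duplicate
    {"id": 2, "amount": -5.0},   # out of range
    {"id": 3, "amount": None},   # missing value
]
clean, rejects = quality_check(rows)
print(len(clean), len(rejects))
```

Keeping the rejection reasons, rather than silently dropping rows, is what makes the curated set auditable—without it, "garbage in, garbage out" failures are much harder to trace.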
In 2017, a survey of data scientists revealed that they spend a significant amount of their work (51%) simply collecting, cleaning and organizing data sets, which leaves far less time to actually mine the data and refine the algorithms. In a follow-up survey in 2018, a majority viewed the curation part as the least enjoyable aspect of their work. Additionally, 55% reported that the quality and quantity of data was their biggest challenge. Echoing the “inputs and outputs” cited by Everest Group, there is a lot of room for improving the “input side” of things even with the curation of data sets. If your organization is not planning for this now, you may end up on the side of the “data poor.” 
Future AI Bifurcation 
Overall, there is no doubt that AI will have a significant impact on many parts of our daily lives, in both personal and professional activities. However, it is clear that while there is exuberance around applying AI, many—if not most—organizations lack the necessary experience and skill to benefit from it directly, leaving the real progress to forward-thinking enterprises that are leading the charge.  
The relative complexity of applying AI to processes of all types may lead to a Faustian bargain of sorts. It may come down to a decision of whether to completely outsource and entrust use of AI to companies like Google and Amazon rather than build the necessary capabilities in-house. Other alternatives are sure to enter the market in addition to the options of engaging with consultancies and service providers to aid with implementations. Since AI will have a significant impact, it is up to organizations to start preparing for that eventual decision sooner rather than later.

About The Author


Greg Council is Vice President of Product Management at Parascript, responsible for market vision and product strategy. Greg has over 20 years of experience in solution development and marketing within the information management market. This includes search, content management and data capture for both on premise solutions and SaaS. To contact Greg and Parascript, please email: