Companies need to start looking beyond packaged software from vendors and redesign their own applications to incorporate machine learning algorithms if they want to make artificial intelligence a viable business tool, a Forrester analyst warned the Enterprise World conference this week.
Speaking in a breakout session at the annual OpenText customer and partner event, Forrester VP and principal analyst covering artificial intelligence (AI) Mike Gualtieri said 58 per cent of enterprises are still researching AI tools and techniques. Only 14 per cent, according to Forrester's surveys, are actually training AI tools to work in a specific business context. That gap won't close, he said, unless more organizations recognize the difference between what he called "pure AI," which mimics human thought, and "pragmatic AI," which is narrower in scope but solves specific problems — the way IBM taught its Watson technology to compete on the TV game show Jeopardy.
Pragmatic AI can incorporate everything from image analytics and natural language processing to robotics, but Gualtieri emphasized the role of machine learning. He defined this as algorithms that analyze data to find models that predict outcomes and understand context with significant accuracy. Machine learning is essential in the enterprise because it improves as more data is available, he said.
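Gualtieri's definition — algorithms that analyze data to find models that predict outcomes, improving as more data becomes available — can be illustrated with a minimal, self-contained sketch. The model, data, and noise levels below are invented for illustration only; they stand in for the far larger feature sets an enterprise model would use.

```python
# Illustrative sketch: a one-variable least-squares model whose slope
# estimate gets closer to the truth as more training data arrives.
import random

def fit(points):
    """Ordinary least squares for y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

random.seed(0)
true_slope = 2.0
# Noisy observations of y = 2x + 1 (hypothetical training data).
noisy = [(x, true_slope * x + 1.0 + random.gauss(0, 1.0))
         for x in range(1000)]

a_small, _ = fit(noisy[:5])   # model trained on only 5 examples
a_large, _ = fit(noisy)       # model trained on all 1,000 examples

# With 1,000 examples the estimated slope sits very close to the true 2.0;
# the 5-example estimate is typically much noisier.
print(round(a_large, 2))
```

The point of the sketch is Gualtieri's: the same algorithm, fed more data, yields a measurably better model.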
“The time is coming when almost every application, such as Oracle’s, will have machine learning. That’s what SAP is doing. That’s what OpenText is doing,” he said. “These building block technologies are readily available.”
Gualtieri pointed to Apache Spark, for example, an open-source, scale-out processing engine that lets data scientists work with machine learning algorithms in their own applications.
Of course, machine learning and pragmatic AI have their limitations, he admitted. The results tend to surface correlations rather than prove causation, which is why companies also need to ensure they are feeding the algorithm quality data. The problem is that while data is generated in real time, the traditional analytics used to gain insights and build models has tended to happen much later, he said. Moving to real-time AI could be a mental shift for many organizations, he suggested.
“It’s not all about the algorithms. It’s only as smart as the data provided,” he said. “The data is not perishable but the insights are.”
Some early examples of companies doing this effectively include Verizon, which Gualtieri said has looked at data across more than 4,000 columns to assess which customers are likely to leave or “churn” in a given time period. Another is InteractiveTel, a Houston-based firm that converted more than 200,000 minutes of voice calls a month into text at a local car dealership to gauge who was most likely to make a purchase.
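The churn-prediction use case Gualtieri described can be sketched in miniature. The feature names, weights, and customer records below are entirely hypothetical — Verizon's actual model reportedly draws on more than 4,000 columns — but they show the basic shape of scoring each customer's likelihood to leave.

```python
# Illustrative churn-scoring sketch. Features and weights are invented
# for demonstration; a production model would learn them from data
# across thousands of columns.
import math

# Hypothetical hand-set weights: more support calls and higher spend push
# the churn score up; longer tenure pulls it down.
WEIGHTS = {"support_calls": 0.8, "months_tenure": -0.05, "monthly_spend": 0.01}
BIAS = -1.0

def churn_probability(customer):
    """Logistic score: higher means more likely to churn in the period."""
    z = BIAS + sum(WEIGHTS[k] * customer[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

loyal = {"support_calls": 0, "months_tenure": 48, "monthly_spend": 60.0}
at_risk = {"support_calls": 5, "months_tenure": 3, "monthly_spend": 90.0}

print(churn_probability(at_risk) > churn_probability(loyal))  # → True
```

Ranking customers by a score like this is what lets a carrier target retention offers at the accounts most likely to leave.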
Customer experience professionals may not initially realize the need to redesign applications because they’re more focused on other things. Gualtieri suggested walking through each step of a customer’s journey and asking: would this experience improve if we could make it conversational, predictive, personalized, or sensory? Then, depending on the answers, go into AI knowing there will be significant experimentation involved.
“You really have to think like a VC,” Gualtieri said. “They make all kinds of investments and they know some will not provide a return but they invest as though they will. This is not a project that you can do an ROI case on. It’s an R&D project.”
Enterprise World runs through Thursday.