

Want AI Weekly for free each Thursday in your inbox? Sign up here.

My artificial intelligence-related news feeds this week were filled with oil painting-style images of fuzzy pandas wearing leather jackets and riding skateboards, courtesy of Google Brain’s new Imagen diffusion model. I admit it: I yearn to pet Imagen’s uber-realistic corgi wearing sunglasses and riding a bicycle through New York City’s Times Square.

I also went down the Twitter rabbit hole of everything from reports that Clearview AI’s controversial facial recognition software is coming to private companies and schools, to more memes, threads and emoji-filled posts around the will-AGI-come-soon-or-won’t-AGI-ever-come debate that I highlighted last week.

However, I decided to focus this issue of AI Weekly on something very of-the-moment and applicable to companies of every shape and size working to implement AI in their organizations: AI’s “last mile” deployment problem.


Oh, and Transform 2022’s agenda is now live.

Let’s dig in.

Sharon Goldman, senior editor and writer
Twitter: @sharongoldman

Can AI’s ‘last mile’ problem be solved?

Since I joined VentureBeat six weeks ago to cover the AI beat, few statistics have been repeated to me more often than some version of “the vast majority of AI and machine learning projects fail.” 

Whether that number is quoted as 80%, 85% or 90%, it seems clear that the biggest issue is getting AI and machine learning projects from the pilot stage into production. This is known as the “last mile” problem. The term hails from the supply chain industry’s famous “last mile” obstacle: the highly complex last leg in the journey of people and packages from transportation hubs to their final destinations.

This week, I asked some AI vendors, leaders and practitioners for their thoughts on how to look at the “last mile” problem and how organizations can rise to the challenge. 

Companies have to develop model deployment fluency

“The challenge is that many companies simply lack the data engineering, data science and MLOps expertise required to properly build a model, place it into a system or environment (cloud or on-prem), deploy and run it. Companies have to develop fluency in how to deploy models in various environments, how to run those models and how to manage models for drift and related issues. This doesn’t mean they need to hire legions of technical talent. Rather, they can simply gain the basic organizational competency and fluency and leverage emerging players who shoulder much of this complexity and are offering MLaaS, or machine-learning-as-a-service.”

– Edward Scott, CEO, ElectrifAi
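A quick illustration of what that fluency can look like in code: below is a minimal sketch of the kind of drift check Scott alludes to. It is not ElectrifAi’s method or any particular MLaaS product, just a comparison of a production feature’s distribution against its training baseline with a two-sample Kolmogorov-Smirnov test; the training_sample and production_sample values are synthetic stand-ins.

```python
# Minimal drift-check sketch: compare a feature's production distribution
# against its training baseline. Data below is synthetic, for illustration only.
import numpy as np
from scipy.stats import ks_2samp

def drift_detected(training_sample, production_sample, alpha=0.05):
    """Flag drift when a two-sample KS test rejects 'same distribution'."""
    result = ks_2samp(training_sample, production_sample)
    return result.pvalue < alpha, result.statistic

rng = np.random.default_rng(42)
training_sample = rng.normal(loc=0.0, scale=1.0, size=5_000)    # baseline feature values
production_sample = rng.normal(loc=0.4, scale=1.0, size=5_000)  # shifted live values

drifted, ks_stat = drift_detected(training_sample, production_sample)
print(f"drift detected: {drifted} (KS statistic = {ks_stat:.3f})")
```

In practice a check like this would run on a schedule against logged production features, with a flagged result kicking off investigation or retraining rather than a print statement.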

We need the ‘bad’ AI models that never get published

“It’s easy to say 80% of models never go to production. But we need that work done so that the 20% that do get published are actually the good ones. A lot of bias elimination, optimization and learning come out of the ‘bad’ ones, but you don’t want to publish them. I think that this pinpoints the interplay between the expectations of leadership and what the actual problems for running a machine learning program are. Is it reasonable that 100% of all models make it into production? Of course not — some problems can’t be solved by machine learning, and the people who are working on them go on to work on something else that does actually go into production. But you also have models that don’t make it because of more structural reasons: data scientists can’t actually get access to the data, engineers aren’t able to get access to the infrastructure, or systems are too fragmented for performing large data jobs.”

– Joe Doliner, cofounder and CEO, Pachyderm

Focusing on outcomes and data strategy is essential

“The ‘last mile problem’ is disengaging data practitioners and costing organizations hundreds of thousands of dollars. Imagine a data science team with access to the data they want, who have built and validated a predictive model that is generating exciting results locally … and then it is left on the shelf. For data science teams, moving away from the traditional ‘data science projects’ approach and focusing on outcomes is essential. At an organizational level, a data strategy that clearly includes provisions for AI is the first step. Bringing data practitioners – engineers and scientists – into cross-functional teams and working groups is the next step. Technical teams need input from commercial users to ensure their models are fit for purpose and deliver against a business need, and commercial users need to be comfortable using the model and fully understand how an application will augment their day-to-day work.”

– Stuart Davie, VP of data science at Peak

Processes, procedures and roles must be clearly defined

“Three weeks ago, a potential customer volunteered that a model running insurance pricing in their organization had not been looked at in 18 months. The data scientist who created the model had left the organization, and the model was undocumented and unattended. The organization had relied on its data science team to both design and maintain their models. While this is a perfectly acceptable way of running a business, processes, procedures and roles must be clearly defined when something gets to production. Data scientists find great meaning in having their AI models make it to production. In fact, after salary, lack of business impact is the number one reason data scientists leave companies. That said, many do not understand or want to maintain the production operation, especially as the number of deployed models grows from two to three to hundreds.”

– Grant Case, APAC head of sales engineering, Dataiku
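Case’s story of the orphaned pricing model is, at bottom, an ownership and documentation gap, and even a small amount of structure helps. As a purely illustrative sketch, not Dataiku functionality, here is what recording an owner, a documentation link and a last-review date alongside each deployed model might look like; the names and fields are hypothetical.

```python
# Illustrative sketch: minimal ownership metadata tracked for each deployed model.
# Names and fields are hypothetical and not tied to any particular platform.
from dataclasses import dataclass
from datetime import date

@dataclass
class DeployedModel:
    name: str
    owner: str           # team or person accountable for the model
    docs_url: str        # where the model's behavior and assumptions are documented
    last_reviewed: date  # when someone last checked it in production

    def review_overdue(self, max_days: int = 90) -> bool:
        """Flag models that nobody has looked at within max_days."""
        return (date.today() - self.last_reviewed).days > max_days

pricing_model = DeployedModel(
    name="insurance-pricing-v2",
    owner="pricing-ml-team",
    docs_url="https://wiki.example.com/models/insurance-pricing-v2",
    last_reviewed=date(2021, 1, 15),
)
print(pricing_model.review_overdue())  # True once a model has gone unattended too long
```

A register like this, surfaced in whatever tooling a team already uses, is less about the code than about making the handoff explicit when a model moves into production.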

AI engineering will be a game-changer

“The real differentiator for enterprises will lie in their ability to continually enhance value through rapid AI change. The ‘AI Engineering Era’ is going to be necessary for enterprises to achieve true client-centric AI. AI engineering is a discipline that enables both business and IT leaders to work together to deploy repeatable patterns for AI solution success. According to Gartner, AI Engineering, one of its top 12 strategic technology trends of 2022, will operationalize the delivery of AI to ensure its ongoing business value. It’s an essential bridge between MLOps and Client-Centric AI, where enterprises will see sustained value because they’ll be able to truly know and serve their customers across the full customer journey.” 

– Akshay Sabhikhi, founder and COO, CognitiveScale

I hope you’ll share your thoughts on AI’s “last mile” problem with me: [email protected].

P.S. I also accept AI-related Twitter memes, emojis and images of teddy bears swimming the 400m butterfly at the Olympics.

