Artificial intelligence predictions for 2020
Artificial Intelligence (AI) has become integral to practically every segment of the technology industry. It’s having an impact on applications, development tools, computing platforms, database management systems, middleware, management and monitoring tools—almost everything in IT. AI is even being used to improve AI.
What changes in core AI uses, tools, techniques, platforms, and standards are in store for the coming year? Here is what we’re likely to see in 2020.
GPUs will continue to dominate AI acceleration
AI hardware accelerators have become a principal competitive battlefront in high tech. Even as rival hardware AI chipset technologies—such as CPUs, FPGAs, and neural network processing units—grab share in edge devices, GPUs will stay in the game thanks to their pivotal role in cloud-to-edge application environments, such as autonomous vehicles and industrial supply chains.
Nvidia’s market-leading GPU-based offerings appear poised for further growth and adoption in 2020 and beyond. However, over the coming decade, various non-GPU technologies—including CPUs, ASICs, FPGAs, and neural network processing units—will widen their performance, cost, and power-efficiency advantages for various edge applications. With each passing year, Nvidia will face more competition.
Industry-standard AI benchmarks will become a competitive battlefront
As the AI market matures and computing platforms vie for the distinction of being fastest, most scalable, and lowest cost in handling these workloads, industry-standard benchmarks will rise in importance. In the past year, the MLPerf benchmarks took on greater competitive significance, as vendors from Nvidia to Google touted their superior results on these tests. In 2020, AI benchmarks will become a critically important go-to-market tool in a segment that will only grow more commoditized over time. As the decade wears on, MLPerf benchmark results will figure into solution providers’ positioning strategies wherever high-performance AI-driven capabilities are essential.
AI modeling frameworks will converge on a two-horse race
AI modeling frameworks are the core environments within which data scientists build and train statistically driven computational graphs. In 2020, most working data scientists will probably use some blend of TensorFlow and PyTorch in most projects, and these two frameworks will be available in most commercial data scientist workbenches.
As the decade proceeds, the differences between these frameworks will diminish as data scientists and other users value feature parity over strong functional differentiation. By the same token, more AI tool vendors will provide framework-agnostic modeling platforms, which may offer a new lease on life for older frameworks in danger of dying out. Accelerating the spread of open AI modeling platforms is industry adoption of several abstraction layers—such as Keras and ONNX—that will enable a model built in one framework’s front-end to be executed in any other supported framework’s back-end.
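For instance, here is a minimal sketch of that interchange pattern, assuming PyTorch and its built-in ONNX exporter; the layer sizes and file name are purely illustrative:

    # Sketch: define and train a model in one framework's front-end (PyTorch),
    # then export it to the ONNX interchange format so any ONNX-compatible
    # back-end or runtime can execute it. Shapes and names are illustrative.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
    model.eval()

    dummy_input = torch.randn(1, 10)  # example input that traces the graph's shape
    torch.onnx.export(model, dummy_input, "model.onnx",
                      input_names=["features"], output_names=["prediction"])

Once serialized to ONNX, the same graph can be loaded by any runtime that supports the format, regardless of which framework was used to author it.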
By the decade’s end, it will become next to irrelevant which front-end modeling tool you use to build and train your machine learning model. No matter where you build your AI, the end-to-end data science pipeline will automatically format, compile, containerize, and otherwise serve it out for optimal execution anywhere from cloud to edge.
SaaS-based AI will reduce enterprise demand for data scientists
This past year saw the maturation of machine-learning-as-a-service offerings from AWS, Microsoft, Google, IBM, and others. As this trend intensifies, more business users will rely on cloud providers such as these to supply more of their AI requirements without the need to maintain in-house data science teams. By the end of 2020, SaaS providers will become the predominant suppliers of natural language processing, predictive analytics, and other AI applications, as well as platform services and devops tooling. Those enterprises that maintain in-house AI initiatives will automate data scientist jobs to a greater degree, thereby reducing the need to hire new machine learning modelers, data engineers, and ancillary positions. Over the decade, most data scientists will find gainful employment primarily with SaaS and other cloud providers.
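As a rough illustration of what consuming AI as a service looks like, the sketch below calls a managed sentiment-analysis API (AWS Comprehend via boto3) rather than training an in-house model; it assumes AWS credentials and a default region are already configured in the environment:

    # Sketch: consume NLP as a cloud service instead of building a model in-house.
    import boto3

    comprehend = boto3.client("comprehend")
    response = comprehend.detect_sentiment(
        Text="The new release fixed every issue we reported.",
        LanguageCode="en",
    )
    print(response["Sentiment"], response["SentimentScore"])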
Enterprise AI will shift toward continual real-world experimentation
Every digital business transformation initiative hinges on leveraging the best-fit machine learning models. This requires real-world experimentation in which AI-based processes test alternative machine learning models and automatically promote those that achieve the desired result. By the end of 2020, most enterprises will implement real-world experiments in every customer-facing and back-end business process. As business users flock to cloud providers for AI tooling, capabilities such as those recently launched by AWS—model-iteration studios, multi-model experiment tracking tools, and model-monitoring leaderboards—will become standard in every 24×7 AI-based business application environment. Over the decade, AI-based automation and devops capabilities will spawn a universal best practice of lights-out AI-based business process optimization.
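A toy sketch of the champion/challenger pattern behind such experimentation appears below; the model objects and the outcome function are placeholders for whatever prediction and business-metric logic an enterprise actually uses, not any particular vendor’s API:

    # Sketch: route a small slice of live traffic to a challenger model, log
    # outcomes, and promote the challenger only if it outperforms the champion.
    import random

    def success_rate(outcomes):
        return sum(outcomes) / len(outcomes) if outcomes else 0.0

    def run_experiment(champion, challenger, requests, outcome_fn, challenger_share=0.1):
        results = {"champion": [], "challenger": []}
        for request in requests:
            if random.random() < challenger_share:
                name, model = "challenger", challenger
            else:
                name, model = "champion", champion
            prediction = model(request)
            # outcome_fn reports whether the desired business result occurred (1.0 or 0.0).
            results[name].append(outcome_fn(request, prediction))
        # Promote the challenger only if it measurably outperforms the champion.
        if success_rate(results["challenger"]) > success_rate(results["champion"]):
            return challenger
        return champion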
AI will automate AI developers’ core modeling function
Neural networks are the heart of modern AI. In 2020, an AI-driven methodology called neural architecture search will come into enterprise data scientists’ workbenches to automate the practice of building and optimizing neural networks for their intended purposes. As neural architecture search gains adoption and improves, it will boost data scientists’ productivity by guiding their decisions on whether to build their models on established machine learning algorithms, such as linear regression and random forests, or on newer, more advanced neural network architectures. As the decade proceeds, this and related approaches will enable continuous AI devops through end-to-end pipeline automation.
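The sketch below conveys the core idea in miniature, using a simple random search over hidden-layer shapes with scikit-learn; production neural architecture search systems explore far larger design spaces with far smarter search strategies:

    # Sketch: randomly sample candidate architectures (here, just hidden-layer
    # shapes), score each by cross-validation, and keep the best-performing one.
    import random
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)

    best_score, best_arch = -1.0, None
    for _ in range(10):
        arch = tuple(random.choice([16, 32, 64]) for _ in range(random.randint(1, 3)))
        model = MLPClassifier(hidden_layer_sizes=arch, max_iter=300, random_state=0)
        score = cross_val_score(model, X, y, cv=3).mean()
        if score > best_score:
            best_score, best_arch = score, arch

    print("best architecture:", best_arch, "accuracy:", round(best_score, 3))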
AI-driven conversational user interfaces will eliminate the need for hands-on interaction in most apps
AI-based natural language understanding has become astonishingly accurate. People are rapidly going hands-free on their mobiles and other devices. As conversational user interfaces gain adoption, users will generate more text through voice inputs. By the end of 2020, more user texts, tweets, and other verbal inputs will be rendered through AI-driven voice assistants embedded in devices of every sort. Throughout the decade, voice assistants and conversational UIs will become a standard feature of products in every segment of the global economy, with keyboards, keypads, and even on-screen touch interfaces diminishing in usage.
Chief legal officers will mandate end-to-end AI transparency
AI is becoming a more salient risk factor in enterprise applications. As enterprises confront an upsurge in lawsuits over the socioeconomic biases, privacy violations, and other unfortunate impacts of AI-driven applications, chief legal officers will demand a complete audit trail that reveals how the machine learning models used in enterprise apps were built, trained, and governed.
By the end of 2020, chief legal officers in most enterprises will require that their data science teams automatically log every step in the machine learning pipeline while also generating a plain-language explanation of how each model drives automated inferencing. As the decade proceeds, a lack of built-in transparency will become a predominant factor in denying AI project funding.
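In practice, such an audit trail can be as simple as structured logging of every pipeline step alongside a plain-language note for non-technical reviewers; the sketch below is a minimal illustration, and all field names and values are hypothetical:

    # Sketch: record each pipeline step with a timestamp, technical detail, and a
    # plain-language explanation, then persist the log alongside the model artifact.
    import json
    from datetime import datetime, timezone

    audit_log = []

    def log_step(step, detail, explanation):
        audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "step": step,
            "detail": detail,
            "explanation": explanation,  # written for legal and compliance readers
        })

    log_step("data_selection", {"source": "crm_exports_2019", "rows": 120000},
             "Trained only on customers who opted in to analytics.")
    log_step("training", {"algorithm": "gradient_boosting", "auc": 0.87},
             "Model ranks leads by likelihood of conversion; no protected attributes used.")

    with open("model_audit_log.json", "w") as f:
        json.dump(audit_log, f, indent=2)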
Finally, we can safely assume that calls for regulation of AI-based capabilities in all products—especially those that use personally identifiable information—will grow in the coming years. Apart from the growing emphasis on AI devops transparency, it’s too early to say what impact these future mandates will have on the evolution of the underlying platforms, tools, and technologies.
But it appears likely that these regulatory initiatives will only intensify in coming years, regardless of who wins the US presidential election this coming November.