Understanding the "Eternal AI Spring"
By Nick Ross, CIO, Genius Sports
It is broadly accepted that 2012 was the tipping point for machine learning’s transition to the mainstream. The essence of machine learning is that it is often easier to search for mechanisms that can solve problems than it is to handcraft those mechanisms.
Artificial neural network models have proven to be a fruitful area to search for mechanisms that can tackle a broad range of problems, from face recognition to selecting moves in board games. Searches within this space occur on two levels:
1. Searching the space of neural network topologies and their hyper-parameters (transfer function(s), learning rate(s), etc.); these are usually searched manually, but Google and others are beginning to introduce methods to automate the process
2. Searching the space of weights on the edges (lines) that connect the vertices (nodes) that compose the network topology
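The two levels can be illustrated with a toy sketch (not from the article, and far simpler than a real network): the topology and activation below are fixed "level 1" choices, and a crude random search then explores the "level 2" weight space. The task (logical AND) and all function names are hypothetical.

```python
import random

# Fixed "level 1" choices: two inputs -> one output neuron, step activation.
# The search below is over "level 2" only: the weights and bias.
def predict(weights, bias, x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s > 0 else 0

# Target mapping: logical AND, given as input/response examples.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(0)
best = None
for _ in range(1000):  # crude random search of the weight space
    w = [random.uniform(-1, 1) for _ in range(2)]
    b = random.uniform(-1, 1)
    errors = sum(predict(w, b, x) != y for x, y in examples)
    if best is None or errors < best[0]:
        best = (errors, w, b)
    if best[0] == 0:
        break

errors, w, b = best
print(errors)
```

Real training replaces this blind search with gradient-based methods, but the division of labour is the same: humans (or automated tools) pick the topology, and an algorithm searches the weights.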
To date, the majority of successes in machine learning have been achieved using “supervised learning” as the search technique. This approach performs well when it’s difficult to express a rule (or function) that maps from an input to a response but where it is easier to provide lots of examples of these mappings (e.g. from loan applications to the likelihood that the loan will be repaid, from a piece of French text to a piece of English text, from a picture to a count of the number of pedestrians in it).
The supervised learning process sees this body of labeled data (input/response mappings) divided into a training set and a testing set. After using the former to train (search the space of weights for) the network, the latter, unseen data is used to confirm that the resulting solution has successfully generalised to the mapping sought.
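As a minimal sketch of that train/test workflow (the dataset and the threshold "model" here are invented for illustration, standing in for a network and its weight space):

```python
import random

# Hypothetical labeled dataset: inputs are numbers, the response is
# whether they exceed 10 (standing in for loans, translations, etc.).
data = [(x, int(x > 10)) for x in range(40)]

random.seed(1)
random.shuffle(data)

split = int(0.8 * len(data))              # an 80/20 train/test split
train_set, test_set = data[:split], data[split:]

# "Training": search for the threshold that best fits the training set
# (a stand-in for searching a network's weight space).
best_threshold = min(
    range(41),
    key=lambda t: sum(int(x > t) != y for x, y in train_set),
)

# The unseen test set confirms the learned rule generalises.
accuracy = sum(int(x > best_threshold) == y for x, y in test_set) / len(test_set)
print(best_threshold, accuracy)
```

The key point is that the test set plays no part in training; it exists only to measure generalisation to unseen mappings.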
The ideas that led to the development of modern neural networks started in the 1940s, and the key supervised learning algorithm (“back propagation”) required to train networks with more than two layers was in place by the 1980s. However, Geoff Hinton (machine learning pioneer and one of the co-authors of the paper introducing back propagation) has said, “What was wrong in the 80s is that we didn’t have enough (labeled data) and we didn’t have enough compute power.”
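In its simplest possible case, the idea behind back propagation reduces to gradient descent on a single weight; a hypothetical one-neuron sketch (the function name and training data are my own, not from the article):

```python
# Nudge a weight in the direction that reduces the error, using the
# gradient of the error with respect to that weight. Real networks
# chain this rule backwards through every layer, hence the name.
def train_one_weight(examples, lr=0.1, steps=100):
    w = 0.0
    for _ in range(steps):
        for x, target in examples:
            prediction = w * x       # forward pass (one linear "neuron")
            error = prediction - target
            gradient = error * x     # d(error^2 / 2) / dw
            w -= lr * gradient       # gradient-descent update
    return w

# Learn y = 2x from examples; w converges towards 2.
w = train_one_weight([(1, 2), (2, 4), (3, 6)])
print(round(w, 3))  # 2.0
```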
So, what changed in 2012? Two things:
1. Data: the availability of very large labeled datasets, and one in particular, ImageNet
2. Hardware: affordable parallel computation
Data and Hardware
The data came from ImageNet, among the first very large labeled datasets, containing millions of manually labeled images (currently over 14 million). Since 2010 this dataset has been used in the “ImageNet Large-Scale Visual Recognition Challenge” (ILSVRC), where researchers compete in a range of computer vision tasks.
The hardware came from advances in graphics processing units (GPUs) driven by video gamers’ desire for better graphics at higher resolutions and higher frame rates. The parallel compute capability of these cards aligned well with the needs of neural network models. A modern GPU (available for less than $700) may have more than 3,500 cores, contrasted with the four that are present in a typical laptop CPU.
In 2012, a team led by Geoff Hinton used a deep convolutional neural network running on a pair of NVIDIA GPUs to win the ILSVRC competition, classifying 100k images from ImageNet into 1,000 categories after training on 1.2m category-labeled images. (The term “deep” indicates that the network has several layers, eight in this case, and “convolutional” that the network has a specific topology well suited to images and other continuous data like text, audio and video.) This combination of algorithmic advances, a very large dataset and the use of GPUs finally brought together the ingredients required to trigger the current “AI Spring”.
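To illustrate the “convolutional” idea without any framework, here is a hypothetical one-dimensional convolution in plain Python: one small kernel of shared weights slides across the whole input, which is what makes the topology well suited to images, audio and other data where nearby values are related.

```python
# Apply one shared kernel at every position of the input signal.
def conv1d(signal, kernel):
    k = len(kernel)
    return [
        sum(kernel[j] * signal[i + j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

# A difference kernel: responds strongly where the signal jumps,
# the 1-D analogue of an edge detector in image networks.
signal = [0, 0, 0, 1, 1, 1]
kernel = [-1, 1]
print(conv1d(signal, kernel))  # [0, 0, 1, 0, 0]
```

In a convolutional network the kernel weights themselves are learned during training; the weight sharing is what keeps the parameter count manageable for large inputs.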
The Eternal AI Spring
The broad applicability of machine learning approaches has led to its current domination of the field of artificial intelligence (AI), a field of study that has had a number of booms and busts since its inception in 1956. We refer to the busts as “AI Winters”, where funding for academic research dries up after bold claims fail to materialise. The current “AI Spring” is reflected in the appearance of both “machine learning” and “deep learning” at the very peak of the “Peak of Inflated Expectations” in Gartner’s 2017 Emerging Technologies Hype Cycle, just waiting to crash into their “Trough of Disillusionment”.
However, Andrew Ng (adjunct professor at Stanford University, founder of Google’s first AI group, Google Brain, and former lead of Baidu’s AI group) believes that we’ve now entered an “eternal AI Spring”. He bases this both on the impact machine learning is already having outside academia, across a range of industries, and on a clear roadmap for transforming almost all other industries using only current technology.
Genius Sports have established a horizontal Machine Learning function, offering services to all Group business units. This team is currently working on a number of projects using Amazon Web Services cloud infrastructure, which includes GPU support (Microsoft and Google also have strong cloud offerings). Cloud computing is well suited to machine learning work, as the compute requirement during training can be significant, often requiring hours of compute time. Conversely, a trained model can be executed in milliseconds.
The appetite for GPU hardware from machine learning teams, combined with demand from the original video game market and the newer virtual/augmented reality markets, and the applicability of GPUs to blockchain “mining”, bodes well for further performance and capacity gains in this area.
The Genius Sports team are also benefiting from three forms of open source technology:
1. Programming languages, specifically Python and R (and their comprehensive, high-performance numerical libraries)
2. High-quality, interoperable frameworks from major vendors (e.g. Google’s TensorFlow and Facebook’s support for PyTorch)
3. Large datasets and trained models
This mass democratisation of machine learning technology, and the valuable results that can be achieved by skilled practitioners, makes this an area that is well worth investing in for businesses with, or with the capability to acquire, the necessary data assets.