

How Deep Is Your Learning?

By Abhi Arunachalam

When the virtual assistant Siri showed up on iPhones four years ago, the technology felt like the first glimpse of those scary-smart, talking robots you’ve been reading about in science fiction for years. A machine that could answer any question you had, from who won the 1968 World Series to the location of your nearest dry cleaner? And even pick music for you? Amazing (though perhaps not that weighty).

Today, there’s an emerging new class of technologies that could make Siri look as sophisticated as a Magic 8-Ball. These new technologies fall into a category dubbed “deep learning,” an evolution in computer programming that takes machine learning one step closer to true artificial intelligence. Deep learning has been around, in some form, for years. But it’s now gaining prominence because of the volumes of new data being thrown off by Internet-enabled smart devices and drones, as well as the availability of new, more sophisticated tools and algorithms for analyzing that information, many of them tapping computing power through the cloud. I believe deep learning’s impact will be substantial in fields as diverse as medicine, education, retail, agriculture and more, and there is a big opportunity to back potentially transformative companies in many of these areas.

So how does deep learning work? Consider: Right now, computers can already learn to recognize and tag your face in a photo, as Facebook does when you post pictures to your timeline. That is enabled by a multi-layer neural network (think human brain) that studies millions of everyday images, isolates recurring patterns within them and looks for places where those patterns repeat. In more advanced deep-learning scenarios, software could go one step further and actually teach itself to read X-rays or MRIs. It would not only recognize what an image of a brain or a heart looks like, but zero in on images containing an anomaly: a brain with a tumor, for example, or a heart with a defect.
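The "find where a pattern repeats" idea can be caricatured in a few lines of code. The sketch below slides a small hand-written template over a toy binary "image" and records every exact match; a real deep network learns such templates (its filters) automatically from millions of examples rather than having them written by hand, and the image and pattern here are invented purely for illustration.

```python
# Toy illustration of pattern-spotting in an image: slide a small
# template over a binary grid and record where it matches exactly.
# Deep networks learn such templates (filters) from data; this one
# is hand-written for the example.

image = [
    [0, 1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1, 1],
    [0, 0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0, 0],
    [1, 1, 1, 0, 0, 0],
]

pattern = [
    [0, 1, 0],
    [1, 1, 1],
]

def find_pattern(image, pattern):
    ph, pw = len(pattern), len(pattern[0])
    hits = []
    for r in range(len(image) - ph + 1):
        for c in range(len(image[0]) - pw + 1):
            if all(image[r + i][c + j] == pattern[i][j]
                   for i in range(ph) for j in range(pw)):
                hits.append((r, c))
    return hits

print(find_pattern(image, pattern))  # three places where the shape recurs
```

A deep network stacks many such pattern detectors in layers, with early layers spotting edges and later layers combining them into faces, organs or tumors.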

Another potential application of this technology is natural-language processing (helping computers learn to understand speech instead of just searching for keywords). “Sentiment analysis,” also known as opinion mining, is a type of natural-language processing and a way of determining the emotional tone behind an online review of a product or a social-media reference. For example, the word “sick” in an online review could mean awful or, in the case of a hip millennial reviewer, really cool. Deep-learning technologies can parse that context to figure out the true meaning.
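The “sick” example can be sketched concretely. A plain keyword lookup always scores “sick” as negative; the toy below flips the score when nearby words suggest enthusiastic slang. The word lists, cues and scores are invented for illustration; a real deep-learning model would learn such context from data rather than from hand-written rules.

```python
# Toy contrast between keyword-only sentiment and context-aware sentiment.
# Word lists and scores are invented for the example, not a real model.

NEGATIVE = {"sick": -1, "awful": -1, "broken": -1}
SLANG_CUES = {"so", "totally", "!"}  # hypothetical cues for slang usage

def tokenize(review):
    return review.lower().replace("!", " !").split()

def naive_score(review):
    # Keyword lookup only: "sick" is always negative.
    return sum(NEGATIVE.get(w, 0) for w in tokenize(review))

def context_score(review):
    # Flip "sick" to positive when a slang cue appears near it.
    words = tokenize(review)
    score = 0
    for i, w in enumerate(words):
        s = NEGATIVE.get(w, 0)
        if w == "sick" and (set(words[max(0, i - 2):i + 3]) & SLANG_CUES):
            s = +1
        score += s
    return score

print(naive_score("this phone is so sick !"))    # keyword-only: negative
print(context_score("this phone is so sick !"))  # context-aware: positive
```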

So how does all this compare to more-traditional machine learning? One analogy is that machine learning forced computers to “learn” the way a child learns to spell: through repeated drills, with ongoing corrections and new instructions fed to the child by a teacher. The child doesn’t learn from his or her mistakes unless the teacher points them out, and the process continues until the child achieves a certain level of accuracy. Deep learning, on the other hand, removes the need for the teacher’s input and instead empowers the child to self-improve by analyzing large data sets. In effect, it teaches computers to learn iteratively, the way humans do.
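The self-improvement idea can be caricatured with a tiny program. Instead of a “teacher” supplying each correction, the loop below measures its own error against the data and nudges its single parameter to shrink that error, repeating until the estimate settles. The data points and learning rate are invented for the example.

```python
# Caricature of iterative self-improvement: the program measures its
# own error on the data and adjusts itself, with no external teacher.
# Data and learning rate are invented for illustration.

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]  # roughly y = 2x

slope = 0.0   # initial guess for the relationship y = slope * x
lr = 0.01     # how big a nudge to take each round

for _ in range(1000):
    # The error feedback comes from the data itself.
    grad = sum(2 * (slope * x - y) * x for x, y in data) / len(data)
    slope -= lr * grad

print(round(slope, 2))  # converges close to the true slope of ~2
```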

Some pioneering companies are already using deep learning. Natural-language processing and image classification are both areas where deep learning is already creating new opportunities; my firm, Battery Ventures, in 2011 invested in Narrative Science*, for example, a natural-language generation platform that is powered by artificial intelligence.

But future applications could reach into all corners of our lives. Battery recently led a seed-funding round in cybersecurity startup JASK*, which uses deep-learning technology to go beyond identifying computer hacker attacks from outside a company’s network to spot unusual patterns of behavior inside the network, like when your workstation suddenly starts sending huge data files to an IP address in Ukraine.
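A heavily simplified sketch of that kind of inside-the-network detection: learn what "normal" looks like from a workstation's past transfer sizes, then flag transfers that fall far outside it. The numbers and threshold are invented, and a production system like JASK's would model many more signals than file size.

```python
# Toy anomaly detector: flag outbound transfers far larger than a
# workstation's historical norm. Data and threshold are invented;
# real systems model many signals, not just file size.
import statistics

history_mb = [2.1, 0.5, 3.2, 1.8, 2.6, 0.9, 2.2, 1.4]  # past transfer sizes

def is_anomalous(size_mb, history, k=3.0):
    # "Unusual" = more than k standard deviations above the mean.
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return size_mb > mean + k * stdev

print(is_anomalous(850.0, history_mb))  # a huge outbound file
print(is_anomalous(2.0, history_mb))    # a typical one
```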

Applied deep-learning firm MetaMind focuses on understanding natural language and image recognition to solve business problems. Nervana, meanwhile, runs deep-learning algorithms in the cloud and essentially offers deep learning as a service. Orbital Insight mines images to predict retail sales and crop yields. Expect Labs seeks to improve the user experience on mobile devices by providing deep learning and voice recognition for app developers.

Major technology companies are rushing to acquire companies with deep-learning capabilities. Twitter in 2014 bought Madbits, an image-search startup; Yahoo scooped up LookFlow, an image-classification firm, the year before; and Apple just this past fall acquired Perceptio to make iPhones more powerful. Google bought DeepMind in 2014 for $400 million in a deal primarily aimed at acquiring the company’s deep-learning specialists, because specialists in this field are in incredibly short supply.

Right now, deep-learning technology is in its infancy. Maybe 100 people in the world have the true implementation skills to drive it forward. But the possibilities are large. We’re on the brink of an explosion of innovation in this space, both because we’re capturing so much more data about everything in the world, and because we’ve got so much more computing power to process and analyze it all. Expect to see computers get a whole lot smarter in the next couple of years.

Arunachalam is a vice president in Battery Ventures’ Menlo Park, CA office, focusing on investments in areas such as cloud infrastructure, enterprise mobility and big-data analytics.

*For a full list of all Battery investments and exits, please click here.