
From Space Invaders to Curing Cancer: The Rise of GPUs


I just spent more hours than I’d like to admit playing the new Fallout 4, feeling like I was actually inside the game largely because of its incredible, true-to-life graphics. I opened Wikipedia, learned how GPUs power those graphics, and wondered whether any startups were harnessing the technology.

I know this isn't what I usually discuss, but I figured you'd find the topic just as interesting as I did. Here’s what I found out:

In the late 1970s, during what’s now dubbed the Golden Age of Arcade Video Games (think Space Invaders, Pac-Man, and Frogger), game developers started relying on specialized graphics chips, the forerunners of today’s graphics processing units (GPUs), to, well, process their graphics. General-purpose central processing units, or CPUs, simply couldn’t draw the “highly detailed” enemy aliens you and your friends enjoyed shooting on Thursday afternoons at the arcade fast enough. Developers needed a processor that could compute and render graphics quickly enough for the aliens and the laser cannon to move fluidly across the screen.

Now, nearly forty years later, GPUs are in every computer, console, and supercomputer, processing graphics and vectors at such high speeds and with such great capacity that 1980 Pac-Man wouldn’t know which ghost to munch on first. The very computer you’re reading this article on has enough GPU cores to handle all sorts of media-intensive tasks: accelerating Adobe Flash video, transcoding video between formats, recognizing images, matching virus patterns, and more. Rendering modern graphics would be impossibly slow if your computer relied on its CPU alone.

For parallel workloads, GPUs outperform CPUs on the measures that matter most: speed, efficiency, and cost. Architecturally, a CPU is composed of just a few cores and can handle only a few software threads at a time. A GPU, in contrast, is composed of thousands of cores and can handle thousands of threads at once, which is how GPUs can accelerate some software more than 100x over a CPU alone.
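To make that concrete, here’s a toy program in CUDA (the C-like language used to program NVIDIA GPUs); it’s my own illustration rather than anything from a shipping product. It launches roughly a million threads, one per array element, where a CPU loop would walk those elements a few at a time:

// Toy illustration: one GPU thread per array element.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element
    if (i < n)
        data[i] *= factor;
}

int main() {
    const int n = 1 << 20;                        // about a million elements
    float *data;
    cudaMallocManaged(&data, n * sizeof(float));  // memory visible to CPU and GPU
    for (int i = 0; i < n; ++i) data[i] = 1.0f;

    // 4096 blocks of 256 threads each: ~1M threads run the kernel at once.
    scale<<<(n + 255) / 256, 256>>>(data, 2.0f, n);
    cudaDeviceSynchronize();                      // wait for the GPU to finish

    printf("data[0] = %.1f\n", data[0]);          // prints 2.0
    cudaFree(data);
    return 0;
}

Every one of those threads runs the same few lines on its own element, which is exactly the kind of work a GPU’s thousands of cores are built for.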

Recently, developers have become eager to bring the GPU’s raw parallel computing power to mainstream industries beyond gaming. Next on the list? Big Data.

In today’s digital age, when the volume of data generated every day is 250,000 times the size of the printed collection of the US Library of Congress, developers are finding it increasingly expensive and inconvenient to process huge workloads on traditional CPUs. That’s why IBM and big data analytics startups like SQream Technologies are leveraging GPUs to run major data warehousing workloads on hardware that is not only hundreds of times faster but also a fraction of the physical size of what is normally used.

IBM Power Systems is a hardware line that enables better GPU acceleration for big data, machine learning, and more, while SQream is a software solution that uses GPUs to crunch massive data sets with SQL, the standard language of today’s database management systems. To optimize performance, cost, and storage, SQream can also be installed in data warehouses that already run on CPU-powered databases like Teradata and Oracle Exadata.
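SQream’s engine itself is proprietary, so the sketch below is purely illustrative of the general idea behind GPU databases, not its actual code: assign one thread per table row, and a filter that a CPU would apply row by row gets applied to enormous batches of rows simultaneously. Here’s what the GPU side of a query like SELECT COUNT(*) FROM sales WHERE amount > 90 might look like in CUDA:

// Illustrative sketch only, not SQream's code: one GPU thread per row,
// tallying the rows that satisfy a WHERE clause.
#include <cstdio>
#include <cuda_runtime.h>

// GPU analogue of: SELECT COUNT(*) FROM sales WHERE amount > threshold
__global__ void countMatches(const float *amount, int n,
                             float threshold, unsigned int *count) {
    int row = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per row
    if (row < n && amount[row] > threshold)
        atomicAdd(count, 1u);  // safely tally matches across threads
}

int main() {
    const int n = 1 << 20;  // about a million fake "rows"
    float *amount;
    unsigned int *count;
    cudaMallocManaged(&amount, n * sizeof(float));
    cudaMallocManaged(&count, sizeof(unsigned int));
    for (int i = 0; i < n; ++i) amount[i] = (float)(i % 100);  // amounts 0..99
    *count = 0;

    countMatches<<<(n + 255) / 256, 256>>>(amount, n, 90.0f, count);
    cudaDeviceSynchronize();

    printf("rows over threshold: %u\n", *count);  // roughly 9% of the rows
    cudaFree(amount);
    cudaFree(count);
    return 0;
}

A real engine would layer compression, smarter aggregation than a single atomic counter, and a full SQL planner on top, but the one-thread-per-row pattern is the heart of the speedup.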

Essentially, it’s the small size (and I mean small: a standard 2U server, or even something shoebox-sized) that makes GPU-powered databases so cost-effective. Instead of dedicating room after room to the CPU-powered database hardware it would take to process that much data, you’d just need a corner of one room to stash your GPU-powered servers. That means smaller companies on a budget can tackle big data problems too, not just the AT&Ts and Verizons of the world.

What’s more, there are plenty of use cases that need databases to run search queries at supersonic speeds: genome research, financial services, telecommunications, cybersecurity, and the Internet of Things, to name a few. Even Google and Yahoo are using the high compute capability of GPUs to run the deep learning algorithms that currently power their image recognition, ad targeting, abuse detection, and more.

And what GPUs are doing for genome research is something out of a science fiction film. It takes scientists months to manually segment billions of rows of sequenced data from various chromosomes; a GPU-powered database could do the same thing in seconds. Imagine how much closer that brings us to cancer cures and other disease treatments. It would be super impractical for a medical or research center to pay tens of millions of dollars for a CPU-based database that wouldn’t even perform as well as a GPU-based one.

If I haven’t made my point already, then the prospect of curing cancer should convince you: it’s time for GPUs to become the norm in data processing.