
How to Select the Right GPU for Deep Learning

12 June 2017

Deep learning is a subset of machine learning based on neural networks. With deep learning, more data generally means better results, which in turn demands more computing power. That computing power comes from graphics processing units (GPUs), whose architecture is best suited for the job. Typically the GPU is needed in the training stage of machine learning. At this stage, more cores and faster GPUs mean you can train the system faster, while the amount of memory determines how much data the card can hold at once.
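To see why memory matters, here is a rough back-of-envelope sketch in Python; the parameter count, activation count and batch size are hypothetical examples, not figures from the article, and real frameworks add overhead on top of this.

```python
# Rough, illustrative estimate of the GPU memory needed to train a model.
# The model size, activation count and batch size below are hypothetical.

BYTES_PER_FLOAT32 = 4

def training_memory_gb(num_params, activations_per_sample, batch_size):
    """Back-of-envelope GPU memory estimate in gigabytes.

    Counts weights, gradients and one optimizer copy of the weights,
    plus activations that scale with batch size. Real frameworks add
    overhead, so treat this as a lower bound.
    """
    weights = num_params * BYTES_PER_FLOAT32
    gradients = num_params * BYTES_PER_FLOAT32
    optimizer_state = num_params * BYTES_PER_FLOAT32
    activations = activations_per_sample * batch_size * BYTES_PER_FLOAT32
    return (weights + gradients + optimizer_state + activations) / 1024**3

# Example: a 25-million-parameter network, ~2 million activation values
# per image, and batches of 64 images -> roughly 0.8 GB before overhead.
print(round(training_memory_gb(25e6, 2e6, 64), 2), "GB")
```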

The first thing to keep in mind is which software library you plan to use for deep learning. Many different software libraries are available, and depending on what you are trying to do, one may be better than another. Examples include Theano, TensorFlow and Caffe, although many more exist. Once you start looking into the software, you will find that many libraries only work on certain GPUs, and the GPUs most often used are Nvidia cards built around CUDA cores. This is because Nvidia supported neural network work early on, so those code bases are well established. Competitors such as AMD, whose cards use OpenCL, have far less code that will work with them. For this reason Nvidia cards are the most common choice for deep learning. Some of these libraries have been adapted to run on OpenCL, so you might be able to buy a cheaper GPU in that case, but it will limit which software you can use.
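Whichever library you pick, it is worth confirming that it actually sees your CUDA GPU before you start training. A minimal sketch for a TensorFlow 1.x-era install (one of the libraries mentioned above; it assumes a GPU-enabled build) might look like this:

```python
# Minimal check that a TensorFlow 1.x GPU build can see a CUDA device.
# Other libraries such as Theano or Caffe have their own equivalent checks.
from tensorflow.python.client import device_lib

devices = device_lib.list_local_devices()
gpus = [d for d in devices if d.device_type == "GPU"]

if gpus:
    for gpu in gpus:
        # physical_device_desc includes the card name, e.g. "GeForce GTX 1080 Ti"
        print(gpu.name, "-", gpu.physical_device_desc)
else:
    print("No CUDA GPU visible; training will fall back to the CPU.")
```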

How much money you have to spend will have a large impact on which GPU is right for you. There are three main factors to consider when selecting a GPU video card for deep learning: the number of cores, the speed of the cores and the amount of onboard memory. In all cases more is better. It is possible to use multiple GPUs in parallel, although most seem to agree that this approach is more difficult; users are usually better off purchasing a single GPU that fits their needs.
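Part of the reason multi-GPU training is harder is that, in many libraries of this era, work has to be placed on each card explicitly. A sketch in TensorFlow 1.x style, assuming two visible GPUs, might look like the following; it distributes only a toy computation, not a full training loop.

```python
# Sketch of manual multi-GPU placement in TensorFlow 1.x (hypothetical
# two-GPU setup). Each "tower" runs on its own card and the results are
# combined on the CPU, illustrating why multi-GPU setups take extra work.
import tensorflow as tf

towers = []
for i in range(2):  # assumes two visible CUDA GPUs
    with tf.device("/gpu:%d" % i):
        x = tf.random_normal([1024, 1024])
        towers.append(tf.matmul(x, x))

with tf.device("/cpu:0"):
    combined = tf.add_n(towers)  # merge the per-GPU results

with tf.Session() as sess:
    print(sess.run(tf.reduce_sum(combined)))
```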

The best card currently available for deep learning is the Titan Xp. This card has 3840 CUDA cores running at around 1.6 GHz with 12 GB of memory, and it is relatively costly at about $1,200. Just below that is the GTX Titan X, with 3072 CUDA cores running at about 1 GHz and 12 GB of memory, at a cost of about $1,000. It is also a decent card, but the lower price doesn't save you much.

More cost-effective GPUs than the two cards above can be had that will still perform quite well. Performance will vary depending on the software you use, but generally more cores and higher clock speeds are faster. The amount of memory you need depends more on what you are doing; for many applications you may be fine with a card that has less memory.

The GTX 1080 Ti, with 3584 CUDA cores running at about 1.6 GHz and 11 GB of memory for a little over $700, is a good value, and for many this card is more than adequate. The GTX 980 Ti, with 2816 CUDA cores running at about 1 GHz and 6 GB of memory for a little under $800, is also a good value.

If you are more budget conscious, the GTX 1070, with 1920 CUDA cores running at about 1.6 GHz and 8 GB of memory for a little over $400, might be a good choice. This card has significantly fewer cores and less memory but will still work for a lot of applications. For just a little more, the GTX 1080, with 2560 CUDA cores running at about 1.7 GHz and 8 GB of memory, costs about $550.

At the lower end of the price spectrum is the GTX 1060, with 1280 CUDA cores running at about 1.7 GHz and 6 GB of memory for around $200. The GTX 1050 Ti, with 768 CUDA cores running at about 1.4 GHz and 4 GB of memory for about $140, might also be an option. These cards are slower but can still be used for some applications or just to experiment with deep learning. If you are serious about deep learning, however, they are probably not going to cut it.

There are many other GPUs to consider, although the ones listed above should give you an idea of what you will get for your money. New GPUs come out all the time, so you will have to look at what is current. If your application runs on a library that supports OpenCL, you may be able to get more for your money that way, though with limits on which software you can use. Fortunately, there are now video cards with enough GPU power to do deep learning at home, something that was formerly achievable only by renting time on a remote computer. While many of these cards were designed with gamers in mind, they have the raw computing power necessary to do some amazing things.

To contact the author of this article, email daniel.franklin@ieeeglobalspec.com

