Run experiments by building your own Deep Learning Machine - Great Learning


If you are taking a Deep Learning or Machine Learning course right now, the best way to learn is by running your own experiments. To run those experiments and train models on large data sets, you'll need a significant amount of processing power. Theoretically, you could use your laptop for some of these experiments, but it would be a long and arduous process. So how do you get all the processing power needed to run these resource-hungry experiments? By building your own machine.

We can start by making a list of all the components you'll need to build a functional Deep Learning setup. We are not considering cloud options here because unreliable internet speeds make consistent performance hard to guarantee. Here we go.


The GPU

GPUs were originally built to render graphics in video games. Graphics rendering is an extremely resource-intensive process, and GPUs are designed to perform huge numbers of computations in parallel. This makes them ideal for a variety of workloads, including cryptocurrency mining and, of course, running deep learning experiments.

The GPU is the linchpin of a deep learning machine, so it's customary for it to take up over 50% of your overall budget. The more powerful the GPU, the less time you will spend waiting for your models to finish training. This lets you quickly adjust the parameters of your deep learning models if you're not getting the results you expected.

Opinions will differ here, but the most powerful consumer GPU you can get right now is the GeForce RTX 2080 Ti. It costs a pretty penny, but it's the absolute best that money can buy at the moment. For a mid-range option, Nvidia has the RTX 2070 at roughly half the price of the 2080 Ti. For a basic GPU, consider the GTX 1050 Ti. To run really large computations, you'll need a dual- or multi-GPU build, so plan according to your budget.
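Once the machine is assembled, it's worth confirming that the operating system can actually see the GPU before you start training. A minimal sketch using only the Python standard library (it assumes an Nvidia card, and shells out to the `nvidia-smi` tool that ships with Nvidia's driver; at the framework level, PyTorch's `torch.cuda.is_available()` does a similar check):

```python
import shutil
import subprocess

def nvidia_gpu_visible() -> bool:
    """Return True if the Nvidia driver tools are on PATH and report a GPU."""
    if shutil.which("nvidia-smi") is None:  # driver tools not installed
        return False
    # `nvidia-smi -L` lists detected GPUs; it exits non-zero if none are reachable
    result = subprocess.run(["nvidia-smi", "-L"], capture_output=True)
    return result.returncode == 0

print("GPU visible:", nvidia_gpu_visible())
```

If this prints `False` on a machine with a card installed, the driver is usually the culprit rather than the hardware.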

The Motherboard

The motherboard is the foundation for all your other components, and it needs to be compatible with the GPU you select. It should have enough slots to support a dual-GPU build, with 16 PCI-e lanes to fully harness the GPUs' power. The motherboard will also need to support your processor (an Intel Core i7, for example) and DDR4 RAM. You can't go wrong with the Z series from Gigabyte, or you can choose from the H or B series – just make sure it's compatible with everything else.


The Processor

When choosing the processor, you'll need to pay attention to the number of cores and the threads per core. The more cores, the better your parallel data computations will be, so choose at least a quad-core processor. The CPU is the conduit between the GPU and the rest of your rig, so we would again recommend a gaming-oriented processor to ensure it can handle high loads.


RAM

Since the amount of RAM dictates how much data can be held in memory, you'll need a minimum of 16GB. Get a 2x8GB kit instead of 4x4GB, so that free slots remain for upgrading your RAM later. The Corsair 16GB (2x8GB) kit is a good choice. It also runs at a 3200MHz clock speed, which keeps RAM access times suitably low.
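A quick back-of-the-envelope check can tell you whether a dataset will fit in 16GB before you buy. The numbers below are illustrative assumptions, not figures from this article – swap in your own dataset's shape:

```python
# Will this dataset fit in 16 GB of RAM? (Illustrative numbers.)
num_samples = 1_000_000
features_per_sample = 1_000
bytes_per_value = 4          # float32

dataset_bytes = num_samples * features_per_sample * bytes_per_value
dataset_gb = dataset_bytes / 1024**3
print(f"Dataset size in memory: {dataset_gb:.1f} GB")  # ~3.7 GB

ram_gb = 16
# Leave roughly half of RAM for the OS, framework, and intermediate copies.
fits = dataset_gb <= ram_gb / 2
print("Fits comfortably in 16 GB:", fits)
```

The halving is deliberately conservative: frameworks often hold transformed copies of batches alongside the raw data.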


Storage

The large datasets you operate on will have to live somewhere, so you need a big enough drive to house all of them. SSD prices are dropping dramatically, but they are still more expensive than regular hard drives. A good compromise is to keep your static data on the hard drive and put the datasets your Deep Learning programs access frequently on the SSD. The only real constraint here is size, so choose according to your requirements.
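Before downloading a large dataset, it helps to check how much room is left on the target drive. A small sketch using the standard library (the `"/"` path is an assumption – point it at whichever drive holds your datasets):

```python
import shutil

# Free space on the drive that will hold your datasets.
total, used, free = shutil.disk_usage("/")   # replace "/" with your data drive
print(f"Total: {total / 1024**3:.1f} GB, free: {free / 1024**3:.1f} GB")
```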

Power supply

The power supply unit delivers power to all your components. The GPU is a very power-hungry component, so you'll need to ensure the wattage is right. An underpowered PSU can seriously jeopardize the entire rig, so make sure you have enough wattage headroom for all the components you have chosen.
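Sizing a PSU comes down to simple arithmetic: sum the peak draw of every component, then add headroom for power spikes. The wattages and the 40% headroom factor below are illustrative assumptions, not measured values:

```python
# Rough PSU sizing: sum peak draws, then add ~40% headroom for spikes.
# Wattages are illustrative figures - check your components' spec sheets.
component_watts = {
    "GPU (RTX 2080 Ti)": 250,
    "CPU": 95,
    "motherboard + RAM": 60,
    "drives + fans": 30,
}
peak_draw = sum(component_watts.values())
recommended_psu = peak_draw * 1.4
print(f"Peak draw: {peak_draw} W")                      # 435 W
print(f"Choose a PSU of at least {recommended_psu:.0f} W")  # 609 W
```

For a dual-GPU build, double the GPU entry and the recommendation grows accordingly.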


The Operating System

You can run your Deep Learning experiments on Windows, but most deep learning tooling targets Linux, and you can save the cost of a licence by opting for a free distro such as Ubuntu.

Computer Peripherals

The quality, size or type of your computer monitor, keyboard and mouse are not going to affect the outcomes of your Deep Learning experiments, so you are free to choose whatever you like depending on your requirements and preferences.

To check for compatibility between the components you choose, you can use an online build-compatibility tool, which will also suggest the right components for you.

Now that you know everything you need to build your machine, it's time to start setting your rig up. Happy building!


