[HN Gopher] Which GPUs to get for deep learning
       ___________________________________________________________________
        
       Which GPUs to get for deep learning
        
       Author : dsr12
       Score  : 57 points
       Date   : 2020-09-07 16:40 UTC (6 hours ago)
        
 (HTM) web link (timdettmers.com)
 (TXT) w3m dump (timdettmers.com)
        
       | fxtentacle wrote:
        | Sadly, this is mostly purchase advice.
       | 
       | In short: You need lots of RAM.
       | 
       | And stay away from overclocked (founders edition) and from
       | datacenter models due to heat or price problems.
        
         | timdettmers wrote:
         | What else would you like to see?
        
       | nsriv wrote:
       | Site down apparently
        
         | rcarmo wrote:
         | Yep. Hugged to death, can't seem to find a recent mirror/cache
         | either.
        
           | bserge wrote:
           | From just Hacker News traffic? How is that possible, even a
           | (properly configured) Wordpress installation on a Nanode can
           | handle a spike of 10,000+ connections :/
        
             | timdettmers wrote:
              | It's up again and stable, it seems. I shot my cache while
              | trying to "tune" it. Well, I learned my lesson.
        
         | PNWChris wrote:
         | Same situation for me, luckily it looks like archive.org
         | snagged a backup while it was up!
         | 
         | https://web.archive.org/web/20200907164516/https://timdettme...
         | 
         | This post looks very thorough, and came just in time for me.
         | I'm looking to snag an upgrade from my GTX 970 for a mix of
         | flight sim 2020 and digging into Fast.ai's course part 2.
         | 
          | The 970 has been my big hold-up; right now even simple
          | models take a really long time to work with.
        
           | g42gregory wrote:
            | Thank you for posting the archived link. I read it off the
            | archive page, and it's great. This article is so good that
            | it made it to the HN front page without even being
            | available!
        
           | dgellow wrote:
           | Have you tried using Google Colab and other online platforms?
           | I started the course a few days ago and so far Colab works
            | well (I don't really like Jupyter but that's a detail...).
           | It's free and you have the choice between a CPU, a GPU, and a
           | TPU.
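For anyone unsure which accelerator their Colab runtime actually exposes, a minimal check is sketched below. It assumes PyTorch (which Colab preinstalls); the try/except lets it fall back gracefully where PyTorch is absent.

```python
# Sketch: report which accelerator the current runtime exposes.
# Assumes PyTorch is installed (as on Colab); falls back to "cpu"
# if it is not available at all.
try:
    import torch
    device = "cuda" if torch.cuda.is_available() else "cpu"
except ImportError:
    device = "cpu"  # no PyTorch in this environment
print("Using device:", device)
```

On a Colab GPU runtime this should print `cuda`; on the free CPU runtime, `cpu`. (TPU detection goes through a separate library, `torch_xla`, and is not covered here.)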
        
             | PNWChris wrote:
             | Disclosure (since Colab is a Google product): I work at
             | Google, but everything I say is my personal opinion and
             | experience.
             | 
             | I really dig the overall idea of cloud notebooks. Back when
             | I did fast.ai part 1, I used Paperspace Gradient. It was a
             | pretty good experience, but moving files around was a bit
             | of a hassle. For example, getting the images for the Planet
             | Labs exercise took a round trip of downloading from Kaggle
             | to my computer and re-uploading into Jupyter to do
             | analysis.
             | 
              | Because of all those moving parts, I decided to give
              | running things locally a try. To my surprise, setup was super
             | easy and I was quickly productive! I really dig how
             | customizable a local Jupyter server is, too.
             | 
             | I do use Colab, it's particularly great for
             | collaboration/sharing notebooks, but my past experience has
             | me hooked on the idea of a capable ML machine at home.
             | 
             | Plus: I can pitch it to myself and my spouse as an
             | investment in personal development that happens to be able
             | to game :D
        
               | fxtentacle wrote:
               | Go for it! I have my own Jupyter docker image that I run
               | on a server in the basement. PyCharm can even do code
               | completion for TensorFlow inside a remote docker
               | container. So it's instant, reproducible, and I don't
               | hear a thing in my office :)
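A minimal sketch of that kind of setup (the base image tag and options below are illustrative assumptions, not fxtentacle's actual configuration):

```dockerfile
# Hypothetical Jupyter-in-Docker image for remote deep learning work.
# The base image tag is an assumption; pick one matching your driver.
FROM tensorflow/tensorflow:2.3.0-gpu-jupyter

# Expose the default Jupyter port so an IDE or browser on another
# machine can connect over the LAN.
EXPOSE 8888

# Bind to all interfaces so the server is reachable from outside
# the container.
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--no-browser", "--allow-root"]
```

Run with something like `docker run --gpus all -p 8888:8888 <image>`; PyCharm Professional can then attach a remote interpreter inside the container, which is what gives the code completion mentioned above.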
        
       | coredog64 wrote:
       | I'm confused:
       | 
       | > Do not buy GTX 16s series cards. These cards do not have tensor
       | cores and, as such, provide relatively poor deep learning
       | performance. I would choose a used RTX 2070 / RTX 2060 / RTX 2060
       | Super any day over a GTX 16s series card
       | 
       | ...a few paragraphs later...
       | 
       | > If that is too expensive, a used GTX 980 Ti (6GB $150) or a
       | used GTX 1650 Super ($190).
        
         | KSS42 wrote:
          | Having a GPU is better than not having one, even if it
          | doesn't have tensor cores.
         | 
         | If you are on a tight budget, his advice is to pick in this
         | order:
         | 
         | > I have little money: Buy used cards. Hierarchy: RTX 2070
         | ($400), RTX 2060 ($300), GTX 1070 ($220), GTX 1070 Ti ($230),
         | GTX 1650 Super ($190), GTX 980 Ti (6GB $150).
        
         | FridgeSeal wrote:
         | I guess it should be taken as: * Strongly prefer cards with
         | Tensor cores -> suggested cards * if you need a card because
         | you're currently CPU only, and you're strapped for cash, these
         | are your best bets, but only if you have no other option.
        
         | timdettmers wrote:
         | This is good feedback. Will note this down and incorporate it
         | in a small update.
        
       ___________________________________________________________________
       (page generated 2020-09-07 23:00 UTC)