Do I Need A Powerful Video Card for Lightroom Classic CC?

[This article was originally posted in April of 2015. As of January 2019, this advice still holds true. For example, I have an Nvidia GTX 1070 video card. While editing hundreds of images in the Develop module over nearly 1.5 hours, I could not get the GPU usage to go much higher than 10%. If you’re building a computer and wondering whether you should spend big bucks on your video card for Lightroom, as of this update, my advice is still “no.” (If you’re a gamer, spend the money, but understand it’s not LR that will benefit from it.) It’s conceivable that Adobe will add some feature in the future that really leverages a big GPU, but at this stage it’s a gamble to spend big money on the hope that this happens.]

No.

Why?

The work Lightroom is doing just isn’t that hard.  Any relatively recent solution will work great.  There is no difference between using Lightroom with GPU acceleration on my desktop, which has a monstrous discrete Nvidia gaming card, and using it on my Late 2013 Retina MacBook Pro, which only has integrated Intel Iris graphics.

While the Intel Iris graphics is remarkable considering it’s an integrated solution, it’s still light years behind the power provided by my big gaming card (benchmark comparisons show my Nvidia card outpowering my Intel Iris graphics by anywhere from 10x to 50x).  And in spite of that, in Lightroom, both provide identical user experiences.

This is not a comment like “one is faster, but not by much.”  This is a definite, “there’s no difference, don’t waste your money.”

Consider the horsepower involved in rendering an entire real-time 3D universe complete with models, textures, advanced lighting, physics simulations, etc., and having to output that to a display fast enough for it to feel like a real, living world.  That’s gaming.  Now consider what Lightroom has to do – put one static picture on your screen and make it brighter, dimmer, warmer, cooler… whatever.  Even the lightest of modern video solutions can do this with one hand behind its back, so please don’t waste your money on a huge graphics card just for Lightroom.  Spend that money on extra RAM or a larger SSD.

Lightroom CC 2015’s (AKA Lightroom 6’s) new GPU acceleration only accelerates things you do in the Develop module.  All the sliders, for example, and the brush and filter tools will feel buttery-smooth and responsive.  Lightroom’s GPU acceleration does not accelerate the Library module, preview building, exports (some performance improvements have been added for this version, but they do not take advantage of the GPU), DNG conversions, etc.

Further, if you are a dual-screen user, Lightroom CC 2015 will accelerate the main develop module window, but it will not accelerate views in your secondary monitor window.  This is because that window still uses the Library / Preview rendering pipeline, not the Develop rendering pipeline.

For a more visual sample of the new GPU acceleration, check out this video.

While it doesn’t matter that your GPU be powerful, it does matter that it be fairly recent.  Cards around four years old (or more) seem to have spotty support: some will work, some won’t.  The technical requirement is that the card support OpenGL 3.3 and, on Windows, DirectX 10.
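To make the requirement concrete, the compatibility check boils down to a simple version comparison. Here is a minimal sketch of that idea; the version strings below are hypothetical examples of what diagnostic tools like glxinfo or GPU-Z might report, and this is not Lightroom’s actual detection code:

```python
# Sketch: decide whether a reported OpenGL version meets Lightroom's
# stated minimum of OpenGL 3.3. The sample strings are hypothetical.

def meets_opengl_minimum(version_string, minimum=(3, 3)):
    """Parse the leading 'major.minor' out of an OpenGL version string
    and compare it against the required minimum."""
    first_token = version_string.split()[0]            # e.g. "4.1"
    major, minor = (int(p) for p in first_token.split(".")[:2])
    return (major, minor) >= minimum

print(meets_opengl_minimum("4.1 INTEL-10.0.86"))       # recent integrated GPU -> True
print(meets_opengl_minimum("2.1 NVIDIA-8.24.11"))      # ~2008-era card -> False
```

Note that tuple comparison handles the version ordering correctly (for example, 3.10 would compare greater than 3.3 only if parsed as integers, which this does).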

11 thoughts on “Do I Need A Powerful Video Card for Lightroom Classic CC?”

    1. Laptop or desktop? What card do you have, and do you have the latest drivers? How much video RAM does the card have? When you say it does not work, do you mean LR tries to enable support but you get video glitches? Or do you mean that it just won’t recognize the card at all, and forces you to remain unaccelerated?

    1. That’s been my experience, yes. Performance in PS using the liquify tool (for example) is fantastic, even on my Late 2013 MBP with Intel Iris. Having a more recent card trumps having a more powerful card in terms of avoiding potential glitches.

  1. Thanks for the helpful video. I was wondering if you could explain: how does one speed up JPEG rendering, or exporting TIFFs, or DNG conversions? What’s the bottleneck in those types of processes?

    1. Hi Arif, great question.

      First off, when exporting using the Lightroom Export interface, JPEG output has been sped up quite a bit in LR6. Where it would only work on one file at a time up through LR5, now in LR6 it processes three images at once. This appears to be true for TIFF as well, though I only did a quick-n-dirty test just now. If you run a CPU meter while doing output like this, you should see all your cores light up to nearly 100%. This is good news.

      Unfortunately for DNG conversion, there’s nothing you can do. The bottleneck isn’t in your system, it’s in the way LR’s DNG conversion process is tuned. The engineers at Adobe have decided that it’s more important to keep the UI of LR performing well, and so they throttle the CPU usage of the DNG conversion process back quite a bit. If you’re absolutely hell bent on getting those DNGs converted as quickly as possible, you might download the free Adobe DNG Converter. This tool does the exact same thing LR does, but the CPU tuning is a tad more aggressive. It’s still not as aggressive as I’d like, but it will save you a little time if you’re doing thousands of files. Of course the drawback is that you have to manage that process outside of LR, and then re-import the files after the conversion is complete.

      Hope this helps.
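The three-at-a-time export behavior described in the reply above is plain task parallelism. A toy sketch of the difference between one-at-a-time output (LR5 style) and three-in-flight output (LR6 style) follows; the export function here is a stand-in that just sleeps, not Lightroom’s actual encoding code:

```python
# Sketch: why processing three exports at once beats one at a time.
# fake_export is a stand-in for real JPEG/TIFF encoding work.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_export(image_name):
    time.sleep(0.1)                 # pretend this is encoding work
    return image_name + ".jpg"

images = ["IMG_%04d" % i for i in range(6)]

# LR5 style: one file at a time.
start = time.perf_counter()
serial = [fake_export(img) for img in images]
serial_time = time.perf_counter() - start

# LR6 style: up to three files in flight at once.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=3) as pool:
    parallel = list(pool.map(fake_export, images))
parallel_time = time.perf_counter() - start

print("serial:   %.2fs" % serial_time)     # ~0.6s
print("parallel: %.2fs" % parallel_time)   # ~0.2s
print(serial == parallel)                  # True: same results, less wall time
```

This also matches the CPU-meter observation above: with several exports in flight, multiple cores light up at once instead of one core doing all the work serially.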

  2. Hi Gavin,

    Thanks for the useful video. I’m particularly interested because I also have a late-2013 15″ MBP with Iris graphics. Have you used LR6 on a 4K monitor with your MBP? If so, do you find the performance acceptable?

    Thanks

  3. “Further, if you are a dual-screen user, Lightroom CC 2015 will accelerate the main develop module window, but it will not accelerate views in your secondary monitor window. This is because that window still uses the Library / Preview rendering pipeline, not the Develop rendering pipeline.”

    Does this mean that the quality of the image on the secondary monitor is inferior to that on the primary monitor? That is, can I judge results of editing on the secondary or must I use the primary?

  4. I know the question will sound lame... still... I have an i3-4130 processor and intend to install 8 GB of DDR3 RAM, so will an Asus GT 710 DDR3 2 GB GPU enhance my Lightroom experience, or will it remain the same?
