[This article was originally posted in April 2015. As of January 2019, this advice still holds true. For example, I have an Nvidia GTX 1070 video card. While editing hundreds of images in the Develop module over nearly 1.5 hours, I could not get the GPU usage to go much higher than 10%. If you’re building a computer, and you’re wondering if you should spend big bucks on your video card for Lightroom, as of this update, my advice is still “no.” (If you’re a gamer, spend the money, but understand it’s not LR that will benefit from it.) It’s conceivable that Adobe will add some feature in the future that really leverages a big GPU, but at this stage it’s a gamble to spend big money on the hope that this happens.]
The work Lightroom is doing just isn’t that hard. Any relatively recent solution will work great. There is no difference between using Lightroom with GPU acceleration on my desktop, which has a monstrous discrete Nvidia gaming card, and using it on my Late 2013 Retina MacBook Pro, which has only integrated Intel Iris graphics.
While the Intel Iris graphics are remarkable for an integrated solution, they’re still light years from the power provided by my big gaming card (benchmark comparisons show my Nvidia card outpowers the Intel Iris graphics by anywhere from 10x to 50x). And in spite of that, in Lightroom, both provide identical user experiences.
This is not a comment like “one is faster, but not by much.” This is a definite, “there’s no difference, don’t waste your money.”
Consider the horsepower involved in rendering an entire real-time 3D universe complete with models, textures, advanced lighting, physics simulations, etc., and having to output that to a display fast enough for it to feel like a real, living world. That’s gaming. Now consider what Lightroom has to do – put one static picture on your screen and make it brighter, dimmer, warmer, cooler… whatever. Even the lightest of modern video solutions can do this with one hand behind its back, so please don’t waste your money on a huge graphics card just for Lightroom. Spend that money on extra RAM or a larger SSD.
Lightroom CC 2015’s (AKA Lightroom 6’s) new GPU acceleration only accelerates things you do in the Develop module. The sliders, for example, and the brush and filter tools will feel buttery-smooth and responsive. Lightroom’s GPU acceleration does not accelerate the Library module, preview building, exports (some export performance improvements were added in this version, but they do not use the GPU), DNG conversions, etc.
Further, if you are a dual-screen user, Lightroom CC 2015 will accelerate the main Develop module window, but it will not accelerate views in your secondary monitor window. This is because the secondary window still uses the Library/preview rendering pipeline, not the Develop rendering pipeline.
For a more visual sample of the new GPU acceleration, check out this video.
While your GPU doesn’t need to be powerful, it does need to be fairly recent. Cards around four years old (or more) seem to have spotty support: some will work, some won’t. The technical requirement is that the card support OpenGL 3.3 and, on Windows, DirectX 10.
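If you want to check your card before upgrading, driver utilities report the OpenGL version as a string (on Linux, `glxinfo` prints an “OpenGL version string,” and tools like GPU-Z show it on Windows). As a quick sketch of how you might compare such a string against the 3.3 minimum — the helper functions here are my own illustration, not anything Lightroom or Adobe ships:

```python
def parse_opengl_version(version_string):
    """Extract the (major, minor) OpenGL version from a driver-reported
    string such as "4.6.0 NVIDIA 535.154.05" or "2.1 INTEL-14.7.28"."""
    core = version_string.strip().split()[0]   # e.g. "4.6.0"
    major, minor = core.split(".")[:2]
    return int(major), int(minor)

def meets_lightroom_minimum(version_string, required=(3, 3)):
    """Return True if the reported OpenGL version is at least 3.3,
    the minimum Lightroom CC 2015 needs for GPU acceleration.
    Tuple comparison handles major/minor ordering for us."""
    return parse_opengl_version(version_string) >= required
```

For example, `meets_lightroom_minimum("4.6.0 NVIDIA 535.154.05")` returns `True`, while an older integrated chip reporting `"2.1 INTEL-14.7.28"` returns `False`.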