[This article was originally posted in April 2015. As of January 2019, this advice still holds true. For example, I have an Nvidia GTX 1070 video card. While editing hundreds of images in the Develop module over nearly 1.5 hours, I could not get the GPU usage to go much higher than 10%. If you’re building a computer and wondering whether you should spend big bucks on your video card for Lightroom, as of this update, my advice is still “no.” (If you’re a gamer, spend the money, but understand it’s not Lightroom that will benefit from it.) It’s conceivable that Adobe will add some feature in the future that really leverages a big GPU, but at this stage it’s a gamble to spend big money on the hope that that happens.]
The work Lightroom is doing just isn’t that hard. Any relatively recent GPU will work great. There is no difference between using Lightroom with GPU acceleration on my desktop, which has a monstrous discrete Nvidia gaming card, and using it on my Late 2013 Retina MacBook Pro, which only has integrated Intel Iris graphics.
While the Intel Iris graphics are remarkable considering they’re an integrated solution, they’re still light-years behind the power provided by my big gaming card (benchmark comparisons show my Nvidia card outpowering my Intel Iris graphics by anywhere from 10x to 50x). And in spite of that, in Lightroom, both provide identical user experiences.
This is not a comment like “one is faster, but not by much.” This is a definite “there’s no difference, don’t waste your money.”