Eric Chan on GPU
See Adobe engineer Eric Chan’s post GPU notes for Lightroom CC (2015) for a behind-the-scenes explanation of how Adobe are adding GPU support:
Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15″ is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win….
let’s be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we’re trying to respond to that. Please understand this is a big step in that direction, but it’s just the first step. The rest of it will take some time.
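Those megapixel figures are easy to sanity-check. Here is a quick sketch in Python – the resolutions are my own assumptions for the usual panel in each display class (the Retina figure is the 15″ MacBook Pro at 2880×1800), not anything from Eric’s post:

```python
# Back-of-the-envelope check of the megapixel figures quoted above.
# Resolutions are assumed typical values for each display class.
displays = {
    "HD (1920 x 1080)": (1920, 1080),
    "MacBook Pro Retina 15-inch (2880 x 1800)": (2880, 1800),
    "4K UHD (3840 x 2160)": (3840, 2160),
    "5K (5120 x 2880)": (5120, 2880),
}

hd_pixels = 1920 * 1080
for name, (w, h) in displays.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP ({pixels / hd_pixels:.1f}x HD)")
```

That works out at roughly 2, 5, 8 and 15 MP respectively, and confirms the 4x jump in pixels to render when moving from a standard HD screen to a 4K display.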
Also see Eric’s comments here, and though I’ve already linked to it, Adobe’s GPU Troubleshooting & FAQ is important too.
Annoyingly, the current generation of MacBook Pro/Retina laptops is all-AMD. Are people with “acceptable” GPUs really getting an LR performance boost? I am soldiering on with an elderly (mid-2010) MBP and LR4, somewhat afraid that if I update to a Retina screen, LR, especially any LR beyond 4, w i l l  c r a w l.
It seems the AMD problems are mainly on Windows, but I am not sure anyone is getting obvious benefits yet. It’s not an overall speed improvement – it only applies to specific tools in Develop – and I certainly find it hard to point at something I’m doing and say it’s faster. Maybe it is, but I just don’t feel it.
It appears the GPU support is a bit problematic, particularly for those of us using AMD cards. Although mine has enough RAM and supports OpenGL 4.1, Lightroom says it can’t use it because of ‘an error’ that occurred. A quick glance at Adobe’s forums shows that a lot of people have the same problem – all with AMD cards.
In the early days it’s bound to be problematic – using the GPU opens a can of worms. Updating the driver looks like one solution, though I’ve also seen cases where going back to an earlier driver has helped.
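If you want to see for yourself what OpenGL version and renderer your driver actually reports, independently of Lightroom’s own check, a small probe script can help narrow down whether the card or the driver is the issue. This is only a rough sketch, not anything Adobe provides, and it assumes the third-party Python packages glfw and PyOpenGL are installed (pip install glfw PyOpenGL):

```python
# Rough sketch: query the OpenGL version and renderer the installed driver exposes.
import glfw
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")

# Create a small hidden window purely to obtain an OpenGL context to query.
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
window = glfw.create_window(64, 64, "gl-probe", None, None)
if window is None:
    glfw.terminate()
    raise RuntimeError("Could not create an OpenGL context")
glfw.make_context_current(window)

print("Vendor:  ", glGetString(GL_VENDOR).decode())
print("Renderer:", glGetString(GL_RENDERER).decode())
print("Version: ", glGetString(GL_VERSION).decode())

glfw.terminate()
```

If the version reported here is well below what the card is supposed to support, that usually points at the driver rather than Lightroom – which is consistent with the advice above about updating (or occasionally rolling back) drivers.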
I had to replace my card with an AMD Radeon R9 200 with 3 GB and haven’t had any problems since. It looks like the best advice is to go for a recent card with at least 2 GB of memory, such as the AMD R9 2xx series or Nvidia’s GeForce 700 or 900 series.
Adobe is aware of the problem, which affects a lot of AMD chips. I too have a rather powerful card, but it does not work as it should. Here is the link to the Adobe Help entry on this issue: https://helpx.adobe.com/lightroom/kb/lightroom-amd-graphics-cards.html
Adding GPU support opens a huge can of worms! You just have to hope that once things settle down we’ll really see the benefits – and that Adobe will choose more wisely where to deliver them….