In preparation for hopefully testing the GeForce GTX 1070/1080 "Pascal" graphics cards under Linux in the days ahead, I’ve been re-testing my collection of available NVIDIA GeForce graphics cards, going back from the GeForce 9800GTX up through the Maxwell-based GeForce GTX 980 Ti and GTX TITAN X. Besides looking at OpenGL performance at 1080p and 4K, I’ve also been recording power metrics and performance-per-Watt data.
To make for a really interesting Pascal comparison once I’m able to get my hands on the GTX 1070 and GTX 1080, I’ve been re-testing this slew of cards atop my current main test system, using Ubuntu 16.04 LTS x86_64 with the latest NVIDIA Linux drivers. As the data from testing 16 graphics cards is already interesting in its own right, here are the results going from the G92 GPU (9800GTX) through the entire Maxwell-based GeForce 900 family. Stay tuned for the Pascal data once I have my hands on the hardware and any embargo is lifted.
All of the tests happened on an Intel Xeon E3-1280 v5 Skylake system with an MSI C236A Workstation motherboard, 16GB of RAM, and a 120GB Samsung 850 EVO SSD, running Ubuntu 16.04 LTS. Ubuntu 16.04 ships with the Linux 4.4 kernel, Unity 7.4 desktop, and X.Org Server 1.18.3 as the key components for graphics testing. The Fermi and newer hardware was tested with the NVIDIA 364.19 driver, the latest currently available for Linux users/gamers. For the GeForce 9800 GTX testing I had to use the NVIDIA 340.96 driver, as that was the last release stream supporting the GeForce 9 series.
With the sixteen graphics cards, first up I ran a variety of Linux OpenGL tests at 1080p in order to provide results comparable across GPUs going back to the 9800 GTX. Following the 1920 x 1080 results, I conducted 4K benchmarks on the newer (Kepler and Maxwell) higher-end graphics cards capable of playing Linux games at 3840 x 2160. I also ran some OpenCL compute benchmarks in a similar manner, which will be saved for a follow-up article on Phoronix.
During all of the graphics card testing, the AC system power consumption was monitored using a WattsUp Pro power meter that’s automatically polled via our open-source Phoronix Test Suite benchmarking software. In addition to monitoring the power usage on a per-test basis, the Phoronix Test Suite was also logging the reported GPU temperature as well as calculating the performance-per-Watt for each of the benchmarks.
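To illustrate the kind of calculation involved, here is a minimal sketch of how a performance-per-Watt figure can be derived from a benchmark score and a series of polled power-meter readings. This is not the Phoronix Test Suite's actual implementation; the function name and the sample values are purely illustrative.

```python
# Hypothetical sketch: derive performance-per-Watt from a benchmark
# score and AC power samples polled during the run. The numbers below
# are illustrative placeholders, not measured values.

def performance_per_watt(score, power_samples_w):
    """Average the power samples recorded during a benchmark run and
    divide the result score (e.g. frames per second) by that average."""
    avg_power = sum(power_samples_w) / len(power_samples_w)
    return score / avg_power

# Example: a run scoring 120 FPS while the meter reported these watts.
samples = [210.0, 225.5, 218.2, 230.1]
print(round(performance_per_watt(120.0, samples), 3))  # ~0.543 FPS/Watt
```

A higher value means more frames delivered per Watt of wall power, which is why efficiency-focused cards can rank well here even when their raw frame rates are modest.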
Raw performance and performance-per-Watt has improved a lot with NVIDIA GPUs since the 9800GTX launch in 2008…
With all of that said, the graphics cards used for this fun comparison included the GeForce 9800GTX, GTX 460, GTX 550 Ti, GTX 650, GTX 680, GT 710, GTX 750, GTX 750 Ti, GTX 760, GTX 780 Ti, GTX 950, GTX 960, GTX 970, GTX 980, GTX 980 Ti, and GTX TITAN X. The selection was limited to the cards I had on hand and working. Once I’m able to get my hands on Pascal hardware, I’ll add those results from this same system software/hardware configuration to provide an interesting perspective on NVIDIA’s evolution under Linux going back to the 9800GTX.
On the following pages are all of these initial results, starting with all of the data for 1080p followed by the newer, higher-end cards at 4K. If you find this large comparison insightful ahead of Pascal’s hard launch, please consider subscribing to Phoronix Premium or making a PayPal tip to make more of our Linux hardware tests possible, especially since I haven’t heard from NVIDIA yet whether they’ll in fact be sending out Pascal samples for Linux testing or if I’ll have to buy them at retail. Your support, and those viewing this site without ad-blockers, is what makes the continued existence of Phoronix possible; thanks. A big benefit for premium subscribers on long articles like this is that all articles are rendered ad-free on a single page.