ATI JUST RELEASED a technology called Eyefinity, basically the ability to support up to six 30″ monitors from a single video card. That is the shiny part; the much more important thing is how it breaks a fundamental barrier to future GPU adoption.
Flight Sim on 24 monitors
What you see above is a flight simulator running on 24 monitors from one PC, the holy grail for most flight sim addicts. What was once the domain of hacks, registry tweaks, or amazingly expensive proprietary solutions is now easy. Very soon, you will be able to get that with off-the-shelf consumer products. This is what AMD calls Eyefinity.
If you are thinking that Linux, and sometimes even Windows, can already do multiple monitors, there is a very large problem with how some implementations work. In Windows, each monitor is its own separate workspace, and this has some hugely important implications. If you put two monitors on Windows and drag a video, or anything else that uses overlays, so that it spans both screens, half of the image or movie will not display.
Separate and non-contiguous workspaces can be made to look like a single workspace, but the devil is in the details. Add in the crippling DRM infection baked into Windows Me II (Vista) and Me II SP7, and you push things farther and farther away from having a seamless experience. As a side note, most current Linux distros do not have this problem; you can span monitors mostly without problems, and the OS is also not a slave to the content MAFIAA.
Eyefinity does a bit to take back your computer from those who do not have your best interests at heart. While it is still fully DRM infected, the card presents itself to Windows as a single workspace. You can span multiple screens with almost any workload that would run on a single screen. ATI unbroke Windows.
This means games, videos, or just about anything else can run on multiple screens seamlessly and transparently. It just works.
WoW at 7680×3200
A few days ago, AMD demo’d WoW running seamlessly on six Dell 30″ monitors at 7680×3200 resolution, at playable frame rates, from a single Eyefinity card. Dirt 2 was also running, in full DX11, again at playable frame rates, from the same card. Left 4 Dead stuttered a bit here and there, but was very playable. At 7K+ resolution from one card, that is not bad at all.
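For a sense of scale, here is a quick sketch of the pixel math behind that demo, assuming the six panels are arranged three wide by two high (which the 7680×3200 figure implies):

```python
# Back-of-the-envelope pixel math for the demo: six 2560x1600 panels
# arranged three wide by two high, presented to Windows as one surface.
cols, rows = 3, 2
panel_w, panel_h = 2560, 1600

virtual_w = cols * panel_w          # 7680
virtual_h = rows * panel_h          # 3200
pixels = virtual_w * virtual_h      # 24,576,000

print(f"{virtual_w}x{virtual_h} = {pixels / 1e6:.1f} megapixels per frame")
# 7680x3200 = 24.6 megapixels, roughly 12x a single 1920x1080 screen
```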
The take-home message is that Eyefinity does one thing that no one else can: simulate a single monitor across multiple displays, transparently to Windows. It should just work, and from what we saw, it does. This simple thing breaks through a fundamental brick wall for GPU adoption: limited-resolution monitors.
Monitor resolution has gone from CGA’s 320×200 to the 2560×1600 that debuted with the Dell 30″ monitor a few years ago. That is not even two orders of magnitude of growth in 30 years. Meanwhile, GPU performance has gone up many, many times more. The Evergreen cards have 2.15 billion transistors and push 2.5+ teraflops, more than the fastest supercomputer in 2000 (well, that is SP flops, but it is still a nice number).
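To put a number on that, a quick sketch of the pixel counts:

```python
import math

# Monitor growth from CGA to the 30-inch Dell panel, in raw pixel counts.
cga = 320 * 200          # 64,000 pixels
dell_30 = 2560 * 1600    # 4,096,000 pixels

growth = dell_30 / cga
print(f"{growth:.0f}x more pixels, {math.log10(growth):.1f} orders of magnitude")
# 64x more pixels, about 1.8 orders of magnitude, over roughly 30 years
```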
With the 770, ATI had a single card that could push most games at the highest resolutions any consumer monitor could show, with a fair bit of headroom. When performance goes up again with Evergreen but monitor resolution does not, what is the use case for all that extra performance?
Allowing the single-digit percentage of people with 30″ monitors to turn the AA/AF slider up one notch, or smoothing out the most pathological and poorly written games, is nice, but not a sane thing to architect a card or a business model around. Fanbois are nice, but there are not that many of them compared to the general audience of computer buyers.
If monitors are not going to grow, the ability to push nearly any game on any existing monitor already exists, multi-monitor setups are broken by Microsoft incompetence, and Microsoft sold your PC out to the content MAFIAA, why should anyone buy these cards? Or the next generation? The simple answer is that they won’t, not without a reason to do so. Waiting for Microsoft to do the right thing is pointless, changing the consumer electronics industry is impossible, and game complexity is dictated by consoles now. Every path is a dead end, and that is a big problem for future GPU adoption.
The magic of Eyefinity is that it takes a sharp left turn and solves a fundamental problem limiting demand for GPU power: screen size. By making multiple monitors appear to the PC as one, regardless of the OS, you can just run a single ‘virtual’ monitor of nearly unlimited resolution. Even better, three 1920×1200 monitors cost a lot less than a single 2560×1600 monitor, so you essentially get many more pixels per dollar.
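As a rough sketch of that pixels-per-dollar point, using assumed, era-appropriate street prices purely for illustration (the prices are not from this article):

```python
# Pixels-per-dollar sketch: three 1920x1200 panels versus one 2560x1600
# panel. The prices below are assumptions used only for illustration;
# substitute whatever you actually pay.
options = [
    ("3x 1920x1200", 3 * 1920 * 1200, 3 * 300),   # assumed ~$300 per 24-inch panel
    ("1x 2560x1600", 2560 * 1600,     1200),      # assumed ~$1,200 for a 30-inch panel
]

for name, pixels, price in options:
    print(f"{name}: {pixels / price:,.0f} pixels per dollar")
# 3x 1920x1200: 7,680 pixels per dollar
# 1x 2560x1600: 3,413 pixels per dollar
```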
ATI was limited by the consumer electronics industry’s ability to make bigger sheets of glass. No longer. Those shackles of screen size are gone. If you want higher resolution, buy another monitor or two. Or five. If six isn’t enough, buy three more cards and 18 more monitors. If you need more still, talk to the chipset guys and have them add more PCIe x16 lanes.
Eyefinity fundamentally changes the way multiple monitors are done. The industry was not going to change, so ATI went around it. In doing so, ATI justified the existence of this generation and the next few. It not only changes the game, but also how games are played, a clear win for consumers. Brick wall avoided, just in the nick of time. S|A
Charlie Demerjian