I’ve been thinking of my next PC. It will be a beast all around, and totally more than I need, but on par with what I want. One of the components on this mythical system is the Dell 3007WFP, a 30″ LCD monitor.
Everyone knows bigger is better. OK, yes, except for penis size (you heard Shannen Doherty in Mallrats: a good size means small). But for monitors, 30″ has to be better than 24″, right? Well, maybe.
The problem is something called “native resolution”. That’s tech talk for, “if you use a different resolution, it will look shitty”, at least in comparison. And you didn’t shell out all that money to look at something with subpar clarity. You can usually switch to a lower resolution that divides evenly into the native one. In other words, if the native resolution is 1280 x 1024, you can cut that in half to 640 x 512 and still have it look good. You’re now using a 2 x 2 block of four physical pixels to represent each logical one, so every pixel is still a square.
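If you want a quick way to sanity-check whether a lower resolution maps cleanly onto a panel’s native grid, the arithmetic is trivial. Here’s a rough Python sketch of the idea; the integer_scale helper is just something made up for illustration, and the resolutions are the examples from above:

```python
# Rough sketch: does a lower resolution divide evenly into the native one,
# so each logical pixel covers an exact NxN block of physical pixels?
def integer_scale(native, target):
    nw, nh = native
    tw, th = target
    # Both axes must divide evenly, and by the same factor, or the
    # scaled pixels won't stay square and the panel has to interpolate.
    if nw % tw or nh % th or nw // tw != nh // th:
        return None
    return nw // tw  # e.g. 2 means one logical pixel -> a 2x2 block

print(integer_scale((1280, 1024), (640, 512)))   # 2 -> clean 2x2 scaling
print(integer_scale((1280, 1024), (1024, 768)))  # None -> interpolated, looks soft
```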
You might be asking yourself, “Why would I want fewer pixels? We just established that more is better!” And if you live in the 2D world, you’re probably right. Watching movies, surfing the web, reading email – no big deal.
But most games create a 3D world, and that world is computationally intensive, and that intensity is resolution-dependent. The higher the resolution, the harder your graphics card has to work. A couple of years ago I bought a Radeon X800 Pro graphics card, which was definitely high-end at the time ($400 retail [1]). I bought it to play World of Warcraft at high resolutions. The game played fine for a while, but eventually it would sputter and stall and crash, maybe 15-20 minutes in [2]. Then I upgraded damn near everything else, but it still crashed. Obviously, the good folks at Blizzard wouldn’t provide a resolution choice that was impossible to play with state-of-the-art equipment. And if they did, well, their tech support people would tell you right away: “Lower your resolution! That setting is for hardware that hasn’t been invented yet!” Well, I never heard that, but maybe that kind of honesty only comes after you’ve fetched all their rocks. I finally broke down and lowered the resolution and some effects, and voilà, it ran smoothly for hours. Same thing happened with Oblivion.
Oh, when I say high resolution, we’re talking 1280 x 1024, not 1600 x 1200. That’s 1.3 megapixels. My card maxed out at about 1024 x 768, less than 0.8 megapixels.
In other words, nowhere near the 2560 x 1600 native resolution of the Dell 3007WFP. Simple math puts that at 4.1 megapixels, more than 5 times the max resolution of the X800. The card is now 2 years old, understood, but it’s still not bottom of the barrel. Even so, let’s discount it. Let’s look at the awesomest card on the market, the GeForce 7900 GTX SLI. $470 on PriceGrabber.com. According to the VGA charts at Tom’s Hardware, that card only gets 20 FPS on the Oblivion benchmark run at 1600×1200 with everything on, in an outdoor scene (very common in that game). I don’t know if that’s a single or dual-card setup, but they also say an extra card only buys you a 30-40% boost. And remember, while 1600×1200 may sound high (and it is!), it’s less than half the pixels of 2560×1600! Assuming the complexity scales linearly with resolution [3], that 20 FPS drops to under 10 at native resolution, so to claw back to a playable 30 you’d need a card roughly 3 times more powerful than the best card available. And probably a stronger CPU, too. And that’s on a game released months ago. What happens when you try to run future games, using DirectX 10? You’re probably stuck running at 1280×800, if the game even offers that resolution. Most likely, you’re in a crappy-looking non-native resolution, just so you can get the game to run. What a waste!
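For the curious, here’s that back-of-the-envelope math as a rough Python sketch. The 20 FPS figure is the Tom’s Hardware Oblivion number quoted above, the 30 FPS target is just my guess at “playable”, and the linear scaling is the same assumption from footnote [3]:

```python
# Back-of-the-envelope: extrapolate the Oblivion frame rate from the
# Tom's Hardware 1600x1200 benchmark to the 3007WFP's native resolution,
# assuming (see [3]) performance scales linearly with pixel count.
def megapixels(w, h):
    return w * h / 1_000_000

bench_res  = (1600, 1200)   # Tom's Hardware benchmark resolution
bench_fps  = 20             # GeForce 7900 GTX result quoted above
native_res = (2560, 1600)   # Dell 3007WFP native resolution
target_fps = 30             # my guess at "playable"

scale = megapixels(*native_res) / megapixels(*bench_res)  # ~2.1x the pixels
est_fps = bench_fps / scale                               # ~9.4 FPS at native
needed_boost = target_fps / est_fps                       # ~3.2x more GPU power

print(f"{megapixels(*native_res):.1f} MP native vs {megapixels(*bench_res):.1f} MP benchmark")
print(f"Estimated {est_fps:.1f} FPS at native; need ~{needed_boost:.1f}x the horsepower")
```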
Luckily, there aren’t any games driving me to build this beast, not even the upcoming Neverwinter Nights 2. Alas, the original plan of waiting for Diablo III is still in effect. I’ll have to find something else to waste stupid amounts of money on. Perhaps a bigger penis…
Update:
I thought I was wrong, but I was mistaken. I thought I might not be throwing enough money at the problem. Well, Tom’s Hardware’s $10,000 PC – with quad SLI (that’s four videocards bridged together) – gets less than 18 FPS on Oblivion outdoors. Granted, it does well on all their other game tests, but I don’t want to play those games! Check out Tom’s PC anyway, it’s an interesting read.
[1] Amusing story about that. A friend went to a major electronics store, let’s call them Pommes Frites, or Fri’s for short. He wanted a Radeon 9600 with DVI for his pricey Apple LCD. It’s on sale for $130, but when he goes to buy it, it rings up as $400. He points out the price tag and the sale sign to the cashier, who agrees and charges him the marked price. He goes home. He opens the box. Inside: another box. This one says Radeon X800. Its price tag says $400. Clearly, some shenanigans going on at Fri’s, probably a warehouse guy setting himself up for some extra take-home pay, which my friend inadvertently thwarts. But it’s win/win, as I buy it off him for $250. Huzzah!
[2] If you know the game, it also happened as soon as I took a gryphon ride.
[3] This is a really big assumption, but I have a hunch I’m being lenient.
Speaking from experience, as I am the proud owner of a Dell 2405, the scaling on these monitors is excellent. I like FPSs and play at a measly 1024×768 for F.E.A.R. and Doom 3, and bump up to the 20″ widescreen native of 1680×1050 for Half-Life 2. The problem with FPSs is that you need fast framerates to make the game look good; with slower-paced RTSs I don’t think this is as big of a problem. I’m stuck with AGP for a couple more years, and I thought I bought a good card too, a BFG 6800 GS, but it’s still not good enough to run 1600×1200 with the black bars. One last thing that bugs me is the tearing of the image on LCDs that doesn’t happen on a CRT. A CRT has to wait for the vertical sync signal, while an LCD can draw whenever it gets information, so the “middle” of the screen can update before the “top,” which causes weird tear lines across the screen. Most games have a setting to enable v-sync, which lowers your framerate but makes the game look smoother. TV also looks great on this monitor, and the new ones even support HDCP. Good luck with your decision!
I appreciate the informed feedback! You have a great point; I think I’m just going to have to see the 30″ in person and judge the scaling quality for myself. I have a hunch Apple’s 30″ Cinema Display uses the same LCD, but I’m not absolutely positive. The specs are close to identical. So I might be able to visit an Apple store if I can’t find a Dell one around here.
I haven’t been playing RTSs much, if you mean things like Starcraft, Warcraft, and AOE. Mainly RPGs like WOW, Neverwinter Nights, and Oblivion, which let you play first or third person. WOW and NWN are almost always played third person, but Oblivion is first. Either way, there is a lot going on onscreen, as most of it is outdoors, which seems to make a huge difference (esp. in Oblivion). But I usually don’t have problems until I move around the world to different areas. The game-killing gryphon ride I spoke of above is a real-time flyby. I’m wondering if the problem is memory paging, but if so, I still don’t know if it’s texture maps (video memory) or other world data (system memory). I spent time on the forums and found others in this boat, but no fixes other than lowering the resolution. I don’t know if a drastic memory upgrade (say, 4GB) would do it, but I’m not going to bother with this system. I’m just going to stick with what I have until there’s a compelling reason to upgrade.
I think (at least on the Xbox) you can choose a different POV in Oblivion, though in that one I found the first-person view to be freakin’ great for most fights.
Me, I am dying to land another contract so I can justify buying a 30″ monitor for playing WOW. My 17″ MacBook Pro will drive that beast, I’m told, but then again, I haven’t tried it. I did try hooking it up to my big flat-panel TV with the computer input. Took me about 2 minutes before I took those stolen cables back to work. YUK.