You behave ............ lol. My EVGA 9800GTX+ was a fantastic card; I was upset I sold it before trying the ATi 4890 OC. From my experience, nearly all of my clan had the 8800 GTX and we loved them, then a few went to the 9800 and were so disappointed, hence I decided to switch brands. And lolz, that 680 is a joke. Like someone else stated, I believe Nvidia will release a better beast near the end of the year. It reminds me of last year, when a few people went from the ATI 5870 to the 6870 and there was no comparison. I may, for the first time in years, skip a generation and maybe get another 6970 and CrossFire them, although I've never been keen on dual cards due to the issues. Has anyone here got CrossFire 6970s? What worries me is that the second slot on my mobo is only x4, and surely that is going to bottleneck the cards.
Sounds like a contradiction to me, seeing that the 9800 was just a respin of the same architecture the GeForce 8800 series used, lol. As mentioned earlier, the only better beast mentioned so far has been a dual-GPU card based on the GTX 680. The 6000 series GPUs from AMD/ATI use the same architecture as the 5000 series with a die shrink, so again, essentially the same GPU. It also suffers from the same poor image quality (transparency texture shimmer) in older DirectX 9 games that the 5000 series cards displayed.
The Bottom Line

NVIDIA has also surprised us by providing an efficient GPU. Efficiency is out of character for NVIDIA, but we certainly welcome this cool-running and quiet GTX 680. NVIDIA has delivered better performance than the Radeon HD 7970 with a TDP and power envelope well below that of the Radeon HD 7970. NVIDIA has made a huge leap in efficiency, and a very large step up from what we saw with the GeForce GTX 580.

NVIDIA has raised the performance metric at the $499 price point. This is what we expect out of next-generation video cards: moving efficiency forward as well as performance at a given price point. The $500 segment just became a lot more interesting, and will give you more performance now than ever before.

We've given many awards to Radeon HD 7970 video cards, and those were well deserved. Now that the GeForce GTX 680 is here, the game changes again, and there is no doubt in our minds that this video card has earned HardOCP's Editor's Choice Gold Award. NVIDIA's GeForce GTX 680 has delivered a more efficient GPU, lower in TDP, that is better than or competitive with the 7970 in performance, at a lower price.

The GeForce GTX 680 truly is a win right out of the gate. It has been a long time since we've said that about a new GPU from NVIDIA, and it is about time the company got something right the first time! Perhaps the stigma of power-hungry, hot, inefficient GPUs is gone thanks to the GeForce GTX 680? NVIDIA needed to build its "green" reputation back up with hardware enthusiasts and gamers, and the GeForce GTX 680 is an excellent start. Let's just hope we see NVIDIA's next flagship, GK110, do the same.
Just ordered an Asus 680 to try out, should arrive tomo so will benchmark it against my 580s and report back...
The GTX680 is not slower than the GTX580. Reviews show that it is faster. https://www.youtube.com/watch?v=Y8RZDPjMttY&list=UU_SN80_V2GymyCWM2oTYTeg&index=6&feature=plcp
I'm waiting for your review, Spyder, as we won't get one professionally from the big guys; they won't test on rFactor 2 as it is a minority title next to games like DiRT etc. We can only base performance on DX9 games like Mass Effect.
I'm hoping it'll be good, will know tomorrow. Was actually all set to go for the 7970 until I worked out I'd have to buy two active Mini DisplayPort to dual-link DVI adapters in order to use my monitors' 120Hz, and those adapters are not cheap! Got away with buying just one adapter at £70 for the 680's DisplayPort, which will hopefully be money well spent...
The 680 can come with a variety of outputs, I read somewhere; I think 2x DVI, 1x HDMI and 1x DP. Do you need to use the DP for Surround?
I think I also read that HDMI 1.4 should support 120Hz, otherwise how would you get 3D on consumer TVs? 3D can only be on a 120Hz signal. So an HDMI 1.4 cable should be enough. Here, this may be an interesting site for you: Hardware Secrets.
I have a Panasonic 3D TV and know for a fact that it runs at 60 fps in 3D, with 30 fps going to each eye, using HDMI 1.4 cables for Sky and my PS3... I did a lot of research before spending £500 today, but as you don't believe what I'm saying, Bart, I thought you might appreciate this excerpt from Nvidia's site regarding the 680 and Surround gaming:

"With the NVIDIA GeForce 400 series we introduced NVIDIA Surround, which let you game across three monitors at once, an incredibly immersive gaming experience. In the past, two NVIDIA graphics cards in SLI were required to operate a Surround setup, but now, with the launch of the new GeForce GTX 680, users can create a four-monitor setup using just the one GPU.

GeForce GTX 680 Single-GPU Surround: The GTX 680 allows three Surround monitors to run simultaneously through the GTX 680's two DVI connectors, single HDMI connector and single DisplayPort connector. Any combination of connectors and monitors is A-OK in 2D Surround, though in 3D Vision Surround the two DVI connectors and single DisplayPort connector must be utilized, and users must have three matching 3D Vision monitors. Furthermore, the 3D Vision monitor connected to the DisplayPort requires a DisplayPort to Dual-Link DVI adapter if the monitor lacks native DisplayPort compatibility. You can purchase adapters from the NVIDIA Online Store and Club 3D."

... Hope that helps.

Edit: Seems like monitors only offer an HDMI 1.4 input with a view to letting the user watch 3D TV. 3D Vision with the 680 offers 60 fps per eye, something HDMI 1.4 can't match, as its output at the moment is 24 frames per eye at 1080p; yes, Avatar and PS3 games are only 48 fps total. Nvidia's website: "Will 3D Vision Pro work with consumer HDMI 1.4 3D TVs? No, the current generation of 3D TVs use their own glasses and are not compatible with 3D Vision Pro."
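The per-eye frame rates being argued over here can be sanity-checked with some back-of-the-envelope bandwidth arithmetic. This is a rough sketch, assuming CVT-RB-style blanking overheads (about 8% horizontal, 3% vertical) and HDMI 1.4's 340 MHz single-link TMDS clock ceiling; real HDMI timings differ in the details:

```python
# Rough pixel-clock estimate for a video mode, including blanking overhead.
# Overheads are assumed CVT-RB-like approximations, not exact HDMI timings.

def pixel_clock_mhz(width, height, refresh_hz, h_overhead=1.08, v_overhead=1.03):
    """Approximate pixel clock (MHz) needed for a given mode."""
    return width * h_overhead * height * v_overhead * refresh_hz / 1e6

HDMI_14_TMDS_LIMIT_MHZ = 340  # single-link TMDS ceiling carried over into HDMI 1.4

# 1080p at 120 Hz: what 3D Vision wants (60 frames per eye)
clk_120 = pixel_clock_mhz(1920, 1080, 120)

# Frame-packed 3D 1080p at 24 Hz per eye: two stacked 1080-line images per frame
clk_3d24 = pixel_clock_mhz(1920, 1080 * 2, 24)

print(f"1080p @ 120 Hz needs ~{clk_120:.0f} MHz (limit {HDMI_14_TMDS_LIMIT_MHZ} MHz)")
print(f"1080p 3D @ 24 Hz/eye needs ~{clk_3d24:.0f} MHz")
```

By this rough estimate, 1080p at 120 Hz actually fits under the raw TMDS ceiling, so the 24-frames-per-eye limit being quoted above would come from which 3D formats the HDMI 1.4 spec actually defines (frame-packed 1080p24 and 720p50/60), not from cable bandwidth alone.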
I didn't intend to offend you in any way about believing you; I am inexperienced with 3D, just like I was with Eyefinity before I started. I jumped the gun and got a 5870 the first day they came out, only to find I needed an adapter which wasn't included, so I was waiting two weeks for one to arrive from another country, and it cost £50. And then there was all the confusion of active vs. passive... OMG, what a hassle. Most of my information is just speed-read to learn what I've got to know. Something I read today on the Nvidia site: a 2GB GTX 680 is fine for 32-bit Windows 7, but 4GB is recommended for a 64-bit system, here: http://www.nvidia.com/object/3d-vision-surround-system-requirements.html I just like to know all about things before I commit to buy and waste money, so sometimes I may come across the wrong way, but it's my way of learning about stuff. I know it's going to cost a fortune: three 3D monitors, that's about £900, a 4GB GTX 680 £500, and then the adapter £70; total = £1470, call it £1500, OMG. I also know I'm probably going to require a CPU upgrade for this.
Lol Bart, you sound a lot like me! I've learnt the hard way too since getting into PC sim racing last year... Unfortunately you weren't wrong about your speed reading: they're talking about "system RAM", not "GPU RAM", in that link!
Make sure the new glasses are compatible with 120Hz though. Anyway, spent most of my night benchmarking my 580s properly in single/SLI/Surround formats, so when my doorbell rings tomo (morning, hopefully) I can chuck the 680 straight in and crack on with some RF2... Can't wait
I'm buzzing for you man, your findings will probably make my mind up for me. I just want to ask: what are the two 580s in SLI not doing right for you to feel the need to switch? Give me all the details of the things you don't like, please. I test-debezeled my BenQs yesterday to see what lay beneath, planning to build a solid steel case and eventually mount them on my motion sim rig. Very pleased to find only a 1cm bezel.
Specific request? A review based on real racing situations instead of synthetic benchmarks such as 3DMark. That kind of benchmark has always been useful for me. My idea of a benchmark: you start from the back of the grid, 20 AI minimum, 20 visible vehicles, all running Méganes, at Monaco. Why? Because the Méganes and Monaco are the most demanding car and track in rF2 so far. The graphics detail setting could be set three ways: low/off, medium, and full/high/max. Running a replay is the best way to benchmark rF2, since the animation will be the same through all iterations. Oh, and don't forget the effect of the anti-aliasing setting on performance as well. Whew, I hope this isn't too much to ask
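The replay method above could be scored with a short frame-time script. A minimal sketch, assuming frame times in milliseconds logged by a tool such as FRAPS; the summary statistics (average, minimum, 1% low FPS) are my own choice, not anything rF2 produces:

```python
# Summarize a list of frame times (ms) from one replay run into FPS figures.

def summarize(frame_times_ms):
    """Return (average FPS, minimum FPS, 1% low FPS) for a benchmark run."""
    fps = [1000.0 / t for t in frame_times_ms]
    # Average FPS over the whole run: total frames / total seconds
    avg = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    # 1% low: mean FPS of the slowest 1% of frames (at least one frame)
    worst = sorted(fps)[: max(1, len(fps) // 100)]
    return avg, min(fps), sum(worst) / len(worst)

# Example: mostly ~16.7 ms frames (~60 FPS) with two stutter spikes
times = [16.7] * 98 + [33.3, 40.0]
avg, lowest, one_pct_low = summarize(times)
print(f"avg {avg:.1f} FPS, min {lowest:.1f} FPS, 1% low {one_pct_low:.1f} FPS")
```

Running the same replay at each detail level (low, medium, max) and comparing these three numbers would catch the stutter that a plain average hides.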
Would be cool if 4096-wide monitors were for mere mortals. The GTX 680 supports 4096, I believe? Meaning they could view screens in pCARS @4096? i.e. using monitor.inf or whatever (I have not been able to work it out on my BenQ 24"). That would be sweet