Not sure if this has been mentioned yet. http://www.anandtech.com/show/7436/nvidias-gsync-attempting-to-revolutionize-gaming-via-smoothness
I'm interested in how universal the G-Sync module is going to be and how difficult it will be to fit to a monitor. I don't really want to be limited to crappy TN panels, tbh.
Really looking forward to this, as I'm about to buy a new graphics card (GTX 780 Ti) which supports G-Sync. Hopefully my current monitor can be upgraded for G-Sync, but we'll see. Anyway, this will be awesome technology and will most likely help lower input lag too. I'll update my little input-lag research hopefully early next year. Will be interesting! The downside could be not being able to use the non-3D LightBoost "hack", as I can't imagine how it would work with a variable refresh rate (would the brightness level go up and down as the refresh rate varies?).
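To put a rough number on that brightness worry: with a strobed backlight like LightBoost, average brightness is proportional to the duty cycle, i.e. the strobe pulse length divided by the frame period. A back-of-the-envelope sketch (the 1.5 ms pulse length is an illustrative assumption, not LightBoost's actual figure):

```python
# Rough sketch: why a fixed-length backlight strobe would change perceived
# brightness under a variable refresh rate. The pulse length is an assumed,
# illustrative number, not LightBoost's real parameter.

STROBE_MS = 1.5  # assumed fixed strobe pulse length per refresh, in ms

def relative_brightness(refresh_hz):
    """Average brightness is proportional to the backlight duty cycle:
    strobe pulse length divided by the frame period."""
    frame_period_ms = 1000.0 / refresh_hz
    return STROBE_MS / frame_period_ms

for hz in (60, 85, 100, 120, 144):
    print(f"{hz:3d} Hz -> duty cycle {relative_brightness(hz):.1%}")
```

With a fixed pulse length the screen would be twice as bright at 120Hz as at 60Hz, so as G-Sync swings the refresh rate around, the brightness would swing with it, which is presumably exactly why strobing and variable refresh don't mix easily.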
Yes, it would still be a great solution for that. There's another good explanation here: http://techreport.com/review/25788/a-first-look-at-nvidia-g-sync-display-tech
http://www.overclockers.co.uk/showproduct.php?prodid=MO-004-EO There are some so-called 240Hz monitors entering the market now as well. It will be interesting to watch them develop and see which one prevails; the price of G-Sync and 240Hz is a big obstacle at the moment, though.
That's OK for 60Hz, but with a 120Hz monitor and a high-end PC that always gives you 120+ fps, it makes no sense.
Reading through that, the monitor is actually a 120Hz monitor that does something, either in software or in hardware inside the monitor, to supposedly double the refresh rate. How they achieve this is not discussed; my guess is interleaving the frames, though I could be wrong on that point. Regardless of how they do it, I would be really surprised if it looks any better than a native 120Hz monitor.
Apparently it is a software thing, and according to a customer review on Overclockers you get a slight ghosting effect at 240Hz. Scan are doing the Asus pre-modified with G-Sync for £441, or £269 unmodified, but you cannot get the board separately in the UK as far as I can tell, which is about typical. If you live in Puerto Rico, no problem. The board should only cost around £125 if they are $199, which would be a good saving over 3 monitors. I would want to modify my own anyway and remove the bezels. What is it with manufacturers making bezels so chunky? Where is the frameless 144Hz G-Sync monitor?
Insane price, I know, but as always new tech is very expensive. I'm curious whether the price will come down within the first year of release.
They are 27" though, not 24", so that price isn't through the roof when you compare it to what else is available; about $600 of it is the G-Sync modules. That's more like it anyway, and that's what I'm going to hold out for.

I asked in the other triple-screen thread, and maybe it's a moot point with G-Sync anyway, but when you run triple 144Hz monitors, do you have to run at a reduced resolution to get 144Hz, or will you still get that at maximum resolution? For instance, on my Dells I have to run 3786x800 to get 75Hz; anything above that defaults to 60Hz, and you can really tell the difference (I prefer the higher refresh rate). I could only ever get 60Hz when I was running the same monitors with an ATI card, no idea why.

Nvidia sure are creaming it with their new tech. I don't think Mantle is going to compare to what Nvidia have going on, unfortunately.
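On the resolution-versus-refresh question: in a Surround setup each monitor normally gets its own cable, so whether 144Hz works at full resolution mostly comes down to per-link bandwidth. A rough check against DisplayPort 1.2's ~17.28 Gbit/s effective data rate, assuming 24 bits/pixel and ~20% blanking overhead (real timings vary):

```python
# Rough link-bandwidth check: does a given video mode fit on one
# DisplayPort 1.2 connection? 17.28 Gbit/s is DP 1.2's effective data rate
# after 8b/10b encoding; 24 bits/pixel and a 20% blanking overhead are
# simplifying assumptions (real timings vary per monitor).

DP12_GBPS = 17.28
BLANKING = 1.20
BPP = 24

def mode_gbps(width, height, refresh_hz):
    """Approximate data rate a mode needs, in Gbit/s."""
    return width * height * refresh_hz * BPP * BLANKING / 1e9

def fits_dp12(width, height, refresh_hz):
    return mode_gbps(width, height, refresh_hz) <= DP12_GBPS

print(f"1920x1080 @ 144 Hz: {mode_gbps(1920, 1080, 144):.1f} Gbit/s, "
      f"fits: {fits_dp12(1920, 1080, 144)}")
print(f"2560x1440 @ 144 Hz: {mode_gbps(2560, 1440, 144):.1f} Gbit/s, "
      f"fits: {fits_dp12(2560, 1440, 144)}")
```

By this arithmetic a single DP 1.2 cable has headroom for 1080p or even 1440p at 144Hz, so with one cable per monitor the refresh cap on older setups is more likely the older link (e.g. DVI pixel-clock limits) or driver mode tables than anything fundamental.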
I've got a watercooled 780 Ti Classified; you can get those sorts of frame rates at quite reasonable quality, but as I say, I don't run at max resolution. I'm thinking maybe you have to run multiview to get the maximum refresh rate your monitor can handle at all resolutions. There must be some other benefit to it besides not hiding bits of HUD behind bezels?
I really can't wait for this new technology. But there seems to be a problem with a 3-screen setup. The Asus monitor presentation says: "Nvidia’s G-Sync input currently requires DisplayPort in order to function, so this is the only input available." AFAIK there's only 1 DisplayPort output on a graphics card... Maybe we'll need 3-way SLI??
If that were the case it would sure be expensive, but this is the way I think it should work anyway: 1 card per screen. No input lag, no dropped frames, x16 from each card, etc. Sure, you could get converters, or isn't there some way of daisy-chaining them or something? Or just wait for the V2 when they realise they made a mistake.
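On daisy-chaining: DisplayPort 1.2 does allow it via Multi-Stream Transport (MST), though whether the G-Sync module supports MST is an open question here, and every display in the chain shares one link's bandwidth. A rough sketch under the same assumptions as before (24 bits/pixel, ~20% blanking overhead):

```python
# Sketch: aggregate bandwidth of a DisplayPort 1.2 MST daisy chain.
# Whether G-Sync monitors can actually be chained is NOT confirmed here;
# this only shows the raw bandwidth arithmetic for a shared link.

DP12_GBPS = 17.28  # DP 1.2 effective data rate after 8b/10b encoding
BLANKING = 1.20    # assumed blanking overhead
BPP = 24           # assumed bits per pixel

def mode_gbps(width, height, refresh_hz):
    return width * height * refresh_hz * BPP * BLANKING / 1e9

def chain_fits(n_displays, width, height, refresh_hz):
    """All displays in an MST chain share the single upstream link."""
    return n_displays * mode_gbps(width, height, refresh_hz) <= DP12_GBPS

print("3x 1920x1080 @ 60 Hz on one link:", chain_fits(3, 1920, 1080, 60))
print("3x 1920x1080 @ 144 Hz on one link:", chain_fits(3, 1920, 1080, 144))
```

So even if chaining worked, triple 1080p at 144Hz wouldn't fit down one DP 1.2 cable (triple 60Hz would); for high refresh rates, one DisplayPort output per monitor, whether from one card or several, looks unavoidable.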