I'm just talking about 3D in general. I'm saying that this time 3D will finally not only "stick", but be mainstream, or maybe not mainstream, but not super-niche either. I can just sense it from all the great experiences people are getting with devices like the Rift, all the companies backing it and continuing to push and improve the tech at such a fast rate, etc. etc.
Anyone with a bit of foresight knows that this time VR is going to hit big time. It will take a few years of course, but the moment you can buy a standalone VR headset that lets you watch movies and talk to people in VR for a low price, it's going to spread like wildfire. While the mobile versions improve, it will stay relatively niche on the PC, because the average consumer won't spend around $1000 on a VR-capable rig. Whoever says VR is a fad either hasn't tried this new generation or just can't see past a dev kit and its flaws. All you have to do is imagine what it will be like once they improve FOV and resolution, which I think are the biggest hurdles at the moment compared to what it will become in the future.
Ya, and they need to make sure they have super low input lag like some of the top hardcore gaming monitors (e.g. BenQ XL2720Z in "instant mode", ASUS PG278Q, ASUS VG248QE, etc.). I haven't seen any tests on that yet. 3D and VR are definitely sticking this time around. I hope it has a knock-on effect for non-VR 3D users, since at the moment I prefer triple 1080p screens (and soon triple 1440p screens) with Nvidia 3D Vision 2. Maybe some of Nvidia's VR 3D tech will be easily implemented into their 3D Vision tech. I know their VR 3D stuff already uses lots of tech from, or similar to, 3D Vision, so hopefully as they further develop 3D VR, the opposite will happen too and 3D Vision gets improvements. A "5K" screen (5120x2880) would be needed to have the same resolution as a regular 1440p (2560x1440) monitor, since the Oculus cuts the effective resolution in half to achieve 3D. I doubt even the CV1 will be 5K.
1440p is not half but a quarter of the resolution of a 5K monitor, right? A 5K monitor has quadruple the pixel count, and therefore quadruple the pixel density for the same size display. You'd only need a ~3620 x 2036 panel to match the perceived pixel count. But the pixels will appear much larger of course, since the display is stretched to fill 90 degrees of your field of view, whereas most people set their monitors up in the 30-45 degree FOV range. edit: I think you might be right calling 2880p double the resolution, but you don't need 2880p to get the same overall pixel count as a 1440p non-VR display. I have a problem with this terminology though: relative resolution statements are based on a 1-dimensional comparison between monitors (of the same aspect ratio) in either the horizontal or vertical direction (i.e. 2880p is double the "resolution" of 1440p), but it doesn't make sense to me to define perceived quality by that metric. Why? Because pixels are 2-dimensional, and perceived quality is based on how many pixels there are overall (or on pixel density, if both displays have the same physical dimensions).
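The arithmetic here is easy to mix up, so here's a quick sketch of the pixel-count comparison, assuming the Rift splits a 16:9 panel in half horizontally, one half per eye (the ~3620 x 2036 figure is the roughly-double-pixel-count panel mentioned above):

```python
# Per-eye pixel count comparison (assumption: the headset gives each
# eye half of a 16:9 panel).
def pixels(w, h):
    return w * h

monitor_1440p = pixels(2560, 1440)   # 3,686,400 px
panel_5k      = pixels(5120, 2880)   # 4x the 1440p pixel count
per_eye_5k    = panel_5k // 2        # each eye sees half the panel

# A ~3620 x 2036 panel has roughly double the 1440p pixel count,
# so per eye it delivers roughly one 1440p monitor's worth of pixels.
panel_2x      = pixels(3620, 2036)

print(panel_5k / monitor_1440p)      # 4.0  (quadruple pixel count)
print(per_eye_5k / monitor_1440p)    # 2.0  (per eye, still double)
print(panel_2x / monitor_1440p)      # ~2.0 (double overall, 1x per eye)
```

So doubling the 1-dimensional "resolution" (1440p to 2880p) quadruples the pixel count, which is the point being made about the two metrics diverging.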
IMHO DK2 allows a different experience of the physics, just like sight offers a different experience of Monet. If Assetto Corsa were to improve their OR support I would even drive Assetto Corsa. SW
To experience Monet you need to have sight as his works are paintings. To experience rF2 you need a PC with FFB wheel. The word physics and its usage get thrown around way, way too much everywhere.
How can that be a problem if the main advantage of the Rift is the low latency of its OLED screen? A 90Hz CV1 will have less input lag than anything on the monitor market, probably even less than the DK2, but I don't have the data for it.
OLED doesn't necessarily have low input lag; you're confusing pixel response time with input lag. Not to mention, my monitor has a "low persistence" (stroboscopic, aka Lightboost) mode at 120Hz, let alone 70 or 90Hz, and some monitors can even do it at 144Hz (such as BenQ with their version, which they call "Blur Reduction Mode"). Again though, that's about pixel response time and persistence, not input lag.
I wouldn't worry about input lag being a problem with the Rift, all the big players at Oculus have talked about how important input lag is to good VR and John Carmack is super super into having as little lag as possible.
The OLED was used primarily for the motion blur problem (although that's not the only advantage/benefit of OLED over LCD). Motion blur is mostly caused by each pixel staying lit for the entire refresh (sample-and-hold) while your eye tracks a moving object, plus the significant time it takes for pixels to transition from one colour to the next between frames. This problem did not exist on CRT monitors, where each phosphor is lit only for a brief flash, which is why motion blur does not exist on CRT displays. The solution for modern LCDs is to keep the backlight off for most of the refresh and flash it only briefly once the pixels have settled, thereby removing the persistence that causes the blurring of quickly moving objects across the display. This backlight-strobing solution is used by Nvidia in their 3D Vision 2 monitors, where the strobing tech is called "Lightboost". I don't know for sure, but I think Nvidia's patent, if they hold one, only extends to LCD monitors; I could be wrong. OLED has no dedicated backlight, since each pixel is its own light source, so they can simply turn each pixel off individually, and I think this may be how they get around the Nvidia patent issue.
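To put rough numbers on why strobing/low persistence helps, here's a back-of-the-envelope sketch. It uses the common rule of thumb that perceived blur width is roughly persistence multiplied by on-screen motion speed; the 1000 px/s tracking speed and 1.5 ms strobe length are illustrative assumptions, not measured figures for any specific display:

```python
# Rule of thumb: blur width (px) ~= persistence (s) * eye-tracked motion speed (px/s).
def blur_px(persistence_ms, speed_px_per_s):
    return persistence_ms / 1000 * speed_px_per_s

SPEED = 1000                              # assumed motion speed, px/s
full_persistence_120hz = 1000 / 120       # ~8.3 ms: pixel lit for the whole refresh
strobed_persistence    = 1.5              # assumed ~1.5 ms backlight flash

blur_sample_and_hold = blur_px(full_persistence_120hz, SPEED)  # ~8.3 px of smear
blur_strobed         = blur_px(strobed_persistence, SPEED)     # 1.5 px of smear
```

Under these assumptions the strobed display smears a fast-moving object over roughly 1.5 pixels instead of ~8, which is the same effect a CRT's brief phosphor flash gives you for free.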
I agree, he seems very committed to this. However, what worries me is that every time they mention lag, they're talking about the head-sensor tracking rather than the input lag we're talking about. I'm not sure about patents, but apart from Nvidia's "Lightboost", BenQ also has their "Blur Reduction Mode", ASUS has "ULMB" ("Ultra Low Motion Blur"), and EIZO has "Turbo 240". I believe Acer, LG, and Philips have this on their new top-line gaming monitors as well (Philips and Acer just came out with some real high-end stuff pretty much identical in specs to the top-end gaming BenQ and ASUS monitors, and LG may have too). I don't know if they license the tech or do it themselves, no idea. Actual "Lightboost" is Nvidia though, for sure.
Oh, maybe I have it completely wrong then. I was sure I remembered seeing it mentioned that it was patented (if you can even patent pre-existing tech). Maybe I was wrong about Nvidia owning it.
Yup, but they need to shave off time in both directions: tracker->PC and PC->display. The tracker/PC side isn't of much interest to non-Rift users, but everyone should benefit from the work they're doing with GPU vendors on the other direction.
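For a sense of why both directions matter, here's a toy motion-to-photon budget. Every figure below is illustrative, not a measured DK2/CV1 number; the only sourced value is Carmack's oft-cited ~20 ms target for comfortable VR:

```python
# Hypothetical motion-to-photon latency budget, in milliseconds.
# All component figures are made-up round numbers for illustration.
budget = {
    "sensor sample + USB transfer": 2.0,   # tracker -> PC
    "game simulation":              5.0,
    "render + GPU queue":           8.0,
    "scanout + pixel switch":       4.0,   # PC -> display
}

total_ms = sum(budget.values())   # 19.0 ms in this toy example
print(total_ms, "<= 20 ms target:", total_ms <= 20)
```

The point: shaving 2 ms off any single stage barely registers on its own, but the stages add up linearly, so improvements on the PC->display side (driver/GPU work) help monitor users just as much as headset users.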
[video]http://video.cnbc.com/gallery/?video=3000333524[/video] Oculus @oculus CEO @brendaniribe discusses Oculus with CNBC
Is the HDMI cable going to cap higher resolutions at high refresh rates, like happens with 3D-ready beamers and TVs?
Yes, it's pretty much already at the limit. DisplayPort 1.3 is going to be the way forward for the consumer version, as HDMI is too slow to do 1440p @ 90Hz, which is what the latest prototype runs at.
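A rough bandwidth check backs this up. Assumptions: 24 bits per pixel, ~20% blanking overhead (the real figure depends on the timing standard used), and effective link rates after 8b/10b encoding of about 8.16 Gbit/s for HDMI 1.4 (10.2 Gbit/s raw) and about 25.92 Gbit/s for DisplayPort 1.3 (32.4 Gbit/s raw):

```python
# Approximate video bandwidth needed for a given mode, in Gbit/s.
# blanking=1.20 assumes ~20% overhead for horizontal/vertical blanking.
def link_bandwidth_gbps(h, v, hz, bpp=24, blanking=1.20):
    return h * v * hz * bpp * blanking / 1e9

needed = link_bandwidth_gbps(2560, 1440, 90)   # ~9.6 Gbit/s for 1440p @ 90Hz

hdmi14_effective = 8.16    # HDMI 1.4: 10.2 Gbit/s raw minus 8b/10b overhead
dp13_effective   = 25.92   # DP 1.3:  32.4 Gbit/s raw minus 8b/10b overhead

print(f"needed: {needed:.2f} Gbit/s")
print("fits HDMI 1.4:", needed <= hdmi14_effective)   # False
print("fits DP 1.3:  ", needed <= dp13_effective)     # True
```

So under these assumptions 1440p @ 90Hz overshoots HDMI 1.4 by roughly 1.5 Gbit/s while fitting comfortably within DisplayPort 1.3, which matches the post above.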