
Could someone explain the point to me? I read the post and still don't quite understand. I remember CRTs looking smoother back when pixels were still noticeable on (O)LED displays. Is the idea to effectively lower the frame rate?


It's to reduce sample-and-hold blur. Modern displays typically produce a static image that stays visible for the whole frame time, which means the image formed on your retina is blurred when you move your eyes. CRTs instead produce a brief impulse of light that decays exponentially, so you get a sharp image on your retina. Blur Busters has a good explanation:

https://blurbusters.com/faq/oled-motion-blur/
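A rough back-of-the-envelope sketch of that effect (my own simplification, not code from the linked page; the pan speed is a made-up example value): while your eye tracks motion, every extra millisecond a frame's light stays on screen smears it across more retinal positions.

    def retinal_blur_px(speed_px_per_s, persistence_s):
        # Width of the smear on the retina while eye-tracking motion.
        # persistence_s is the frame time for a sample-and-hold display,
        # or the pulse width for a CRT/strobed display.
        return speed_px_per_s * persistence_s

    speed = 960.0  # hypothetical pan speed in px/s

    # 60Hz sample-and-hold: light persists for the full 1/60 s frame
    print(retinal_blur_px(speed, 1 / 60))  # -> 16.0 px of smear

    # CRT-like 1 ms impulse at the same 60Hz refresh rate
    print(retinal_blur_px(speed, 0.001))   # -> ~1 px of smear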


You need more Hz to reduce display motion blur (relative to a 60Hz baseline; a quick arithmetic check follows the list):

- 120Hz = can reduce motion blur by up to 50%

- 240Hz = can reduce motion blur by up to 75%

- 480Hz = can reduce motion blur by up to 87.5%
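Those percentages fall out of blur scaling with frame time (1/Hz), taking 60Hz as the baseline. A quick check:

    baseline_hz = 60
    for hz in (120, 240, 480):
        # Blur is proportional to frame time, so the reduction vs 60Hz
        # is 1 - (60/Hz).
        reduction = 1 - baseline_hz / hz
        print(f"{hz}Hz: up to {reduction:.1%} less motion blur than 60Hz")

    # 120Hz: up to 50.0% less motion blur than 60Hz
    # 240Hz: up to 75.0% less motion blur than 60Hz
    # 480Hz: up to 87.5% less motion blur than 60Hz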

There's a new article on Blur Busters showing that 120Hz-vs-480Hz on OLED is more visible to humans than 60Hz-vs-120Hz, and easier to see than 720p-vs-1080p. It also explains why pixel response (GtG) needs to be 0 rather than 1ms: GtG is like a camera shutter slowly opening and closing, while MPRT is equivalent to the time the shutter stays fully open. The science and physics are fascinating, including links to TestUFO animations that teach about display motion blur and framerate physics.

Motion blur of a flickered display = pulse width

Motion blur of a flickerless display = frame time

So you need either tons of framerate, or a short pulse width (BFI/CRT/strobing/etc.).
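A tiny sketch of that equivalence (my own framing, assuming the proportionality above holds): a strobed display's pulse width buys the same motion clarity as a flickerless display whose frame time equals that pulse width.

    def equivalent_flickerless_hz(pulse_width_s):
        # Refresh rate a flickerless (sample-and-hold) display would need
        # to match the motion clarity of a strobe with this pulse width,
        # since blur tracks light persistence in both cases.
        return 1 / pulse_width_s

    print(equivalent_flickerless_hz(0.001))   # 1 ms CRT/BFI pulse -> 1000.0 Hz
    print(equivalent_flickerless_hz(0.0005))  # 0.5 ms strobe -> 2000.0 Hz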



