OLEDs are much better at almost everything, but it all comes crashing down because of burn-in.
Having a screen with a guaranteed expiry date is a huge dealbreaker.
It does matter, but there are drawbacks and advantages each way.
My current monitor is LCD. When I bought it, that was because OLED prices were significantly higher.
I like the look of the inky blacks on OLEDs. I really love using the things in the dark.
If you’re using a portable device, OLED can save a fair bit of power if you tend to have darker pixels on the screen, since OLED power consumption varies much more with what’s onscreen. I use dark-mode interfaces, so I’m generally better off from a pure power-consumption standpoint with OLED.
https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays
OLED displays use 40% of the power of an LCD displaying an image that is primarily black as they lack the need for a backlight,[35] while OLED can use more than three times as much power to display a mostly white image compared to an LCD.
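As a rough back-of-the-envelope reading of those numbers (the 0.4x and 3x endpoints are from the quote; the linear interpolation between them and the example picture levels are my assumptions, since real panels aren’t that simple):

```python
# Toy model: OLED power relative to an LCD showing the same image.
# Endpoints come from the Wikipedia quote above (~0.4x for a mostly
# black image, ~3x for a mostly white one); linear interpolation
# between them is an illustrative assumption, not a measurement.

def oled_power_vs_lcd(avg_picture_level: float) -> float:
    """avg_picture_level: 0.0 (all black) .. 1.0 (all white).
    Returns estimated OLED power as a multiple of LCD power."""
    black_ratio, white_ratio = 0.4, 3.0
    return black_ratio + (white_ratio - black_ratio) * avg_picture_level

# Guessed picture levels: a dark-mode UI ~15%, a mostly white page ~80%.
print(f"dark UI:  {oled_power_vs_lcd(0.15):.2f}x LCD power")  # ~0.79x
print(f"white UI: {oled_power_vs_lcd(0.80):.2f}x LCD power")  # ~2.48x
```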
OLEDs are more prone to burn-in than LCDs, but my understanding is that newer OLEDs have improved significantly on this, and even then burn-in takes a long time to develop.
Aside from price, I’d mostly come down on the side of OLED. However, there is one significant issue that I was not aware of at the time I was picking a monitor that I think people should be aware of. As far as I can tell from what I’ve read, present-day OLED displays have controllers that don’t deal well with VRR (variable refresh rate, like Freesync or Gsync). That is, if you’re using VRR on your OLED monitor and the frame rate is shifting around, you will see some level of brightness fluctuation. For people who don’t make use of VRR, that may not matter. I don’t really care about VRR in video games, but I do care about it to get precise frame timings when watching movies, so I’d rather, all else held equal, have a monitor that doesn’t have VRR issues, since I have VRR enabled. If I didn’t care about that, I’d probably just turn VRR off and not worry about it.
EDIT:
https://www.displayninja.com/what-is-vrr-brightness-flickering/
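For anyone wondering why VRR matters for movies specifically: 24 fps content on a fixed 60 Hz display has to alternate frames across 3 and 2 refresh cycles (3:2 pulldown), so frame durations are uneven, while VRR lets the panel match the content’s native cadence. A quick sketch of the arithmetic (60 Hz and 24 fps are just the standard example numbers):

```python
# Frame pacing for 24 fps content: fixed 60 Hz (3:2 pulldown) vs. VRR.
REFRESH_HZ = 60
CONTENT_FPS = 24

cycle_ms = 1000 / REFRESH_HZ
# 3:2 pulldown: frames are alternately held for 3 and 2 refresh cycles.
pulldown_ms = (3 * cycle_ms, 2 * cycle_ms)  # 50.0 ms, 33.3 ms -- judder
vrr_ms = 1000 / CONTENT_FPS                 # uniform 41.7 ms

print(f"fixed 60 Hz: frames held {pulldown_ms[0]:.1f} / {pulldown_ms[1]:.1f} ms")
print(f"VRR:         every frame held {vrr_ms:.1f} ms")
```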
This was a great comment! Where does QLED fit into all of this?
I’ve never really wanted to get a QLED monitor, so I haven’t spent time looking at their VRR behavior; sorry. I imagine that there’s material out there about it, though.
No brightness issues here with LG QLED.
There are, of course, the limitations of VRR itself and of the implementation: Freesync only works within a frame-rate range.
I’ve seen strobe/flicker when it’s too low.
I cap my GPU at 108 FPS to prevent tearing, and I leave VRR always on.
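The usual rule of thumb behind a cap like that is to stay a few FPS under the panel’s maximum refresh so frame delivery never leaves the VRR window at the top end. A sketch; the 48-120 Hz range is an assumption I’ve picked so the result matches the 108 FPS cap above:

```python
# Pick an FPS cap that stays safely inside a VRR (Freesync/G-Sync) window.
def vrr_fps_cap(vrr_min_hz: int, vrr_max_hz: int, margin_pct: float = 10.0) -> int:
    """Cap a few percent under max refresh so frame-time spikes don't
    push frame delivery past the top of the VRR window (tearing/V-sync)."""
    cap = int(vrr_max_hz * (1 - margin_pct / 100))
    return max(cap, vrr_min_hz)  # never cap below the window's floor

print(vrr_fps_cap(48, 120))  # 108 -- a 10% margin on an assumed 120 Hz panel
```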
I do like the OLED on my phone, however certain colors do show that the keyboard down at the bottom has now burned in. :(
Edit: Bottom of my OLED phone on a white/gray background, no keyboard present:
I’m always worried about burn-in, so I prefer the kind of screen where that can’t happen.
On the other hand, I like that OLED screens can be near paper-thin. There are some applications of that which I would really like to see eventually. Namely: animated T-shirts.
I would say OLED, though I had an old TV from 2017 that showed significant burn-in after 4-5 years of usage. Anything that was red left a permanent shadow, like the Netflix logo and some TV channel logos. As someone with hearing loss I also always have subtitles on, which caused shadows there too.
It’s apparently much better nowadays, though it still happens with excessive usage, and I actually ended up buying an LCD TV this time. I do miss the OLED’s razor-sharp contrast though.
Edit: I just checked my Pixel 5 phone with an OLED screen; it does indeed also have burn-in on the top row when I put a red or green picture full screen. I wouldn’t notice it unless I really looked, though.
Definitely! OLED is unusable for me because it has really bad PWM flickering. The majority of people can’t see the screen flashing on and off like a strobe light, but many of us have eyes that do see the flashing, and it’s awful.
I can’t wait until a new display technology gets popular that doesn’t use pulse-width modulation.
I can see flicker in some 60 Hz LED lighting out of the corner of my eye, using the rods at the periphery of my retina, when I can’t see it with the cones at the center. Stick the light in the middle of my vision, and the flicker vanishes. It drove me nuts with some inexpensive, high-power corncob LED light bulbs that didn’t have an electronic ballast, just fed wall power directly to an array of LEDs.
Wikipedia says that the cones are more time-sensitive than the rods, which is the opposite of what I’d expect if that’s what’s happening. But that’s what I experience. Maybe it’s the result of the thing getting a sine wave (which is what wall power would feed an LED) rather than a square wave, which is (roughly) what I’d expect a system controlling LED brightness with PWM to output. I don’t know what else would be unusual about that situation.
https://en.wikipedia.org/wiki/Flicker_fusion_threshold
Different points in the visual system have very different critical flicker fusion rate (CFF) sensitivities; the overall threshold frequency for perception cannot exceed the slowest of these for a given modulation amplitude. Each cell type integrates signals differently. For example, rod photoreceptor cells, which are exquisitely sensitive and capable of single-photon detection, are very sluggish, with time constants in mammals of about 200 ms. Cones, in contrast, while having much lower intensity sensitivity, have much better time resolution than rods do. For both rod- and cone-mediated vision, the fusion frequency increases as a function of illumination intensity, until it reaches a plateau corresponding to the maximal time resolution for each type of vision. The maximal fusion frequency for rod-mediated vision reaches a plateau at about 15 hertz (Hz), whereas cones reach a plateau, observable only at very high illumination intensities, of about 60 Hz.[3][4]
Passing an open hand with fingers extended in front of the light tends to make any flicker more visible, as it makes the moving fingers “judder”, as with a strobe light.
The flicker fusion threshold does not prevent indirect detection of a high frame rate, such as the phantom array effect or wagon-wheel effect, as human-visible side effects of a finite frame rate were still seen on an experimental 480 Hz display.[6]
It’s totally normal to perceive flickering at 60 Hz.
According to the above Wikipedia article, I don’t believe it should be possible to see it with the rods at 60 Hz.
Tell that to my eyes lol. It’s easy to see 60 Hz flickering on a CRT displaying a white screen.
With the rods of the eye. Your eye doesn’t consist entirely of rods.
The Wikipedia article says that your cones should be more sensitive to flashing at higher frequencies than the rods. The rods are what pick up light when you’re viewing something out of the corner of your eye. What I experience with these bulbs is the opposite of what I’d expect from that: flashing is noticeable and annoying in my peripheral vision, but gone (well, or on the edge of noticeability) in the center of my vision.
EDIT: Well, to be fair, I guess I don’t actually know that they don’t have some sort of power-control circuitry; I haven’t pulled one apart, so I shouldn’t say that they’re 60 Hz. But unlike typical LED bulbs, they’re narrow; these corncob bulbs don’t have the bulge that makes space for an electronic ballast. If they don’t have the ballast, I’d expect them to run off the wall power directly.
I wonder if I can go dig up a datasheet somewhere.
EDIT2: None of the technical material mentions any operating frequency for the bulb, but you might be right. There’s one other power-control thing that you can stick in a bulb that might take up space, and that’s a dimmable power supply. Like, if the wall-power voltage drops, those will detect it and reduce brightness. This one’s non-dimmable. Maybe that’s where the bulge at the base of LED bulbs comes from (dimmer electronics), and there’s enough space to fit non-dimmable electronics up inside the body of the bulb.
EDIT3: No, it’s not dimmability that determines the bulge. I see corncob lights with no bulge that are dimmable and corncob lights with a bulge that are not. But that also invalidates my reasoning above: you have to have power-regulation circuitry to make dimmable LED lights work, because dimming requires a variable-PWM source, and if it’s possible to build dimmable bulbs with no bulge at the base, then I can’t assume that no bulge means a bulb is being driven straight off wall power without power regulation, which was my original assumption. Sorry, this is probably my error, then. These are probably being driven by an electronic ballast at some frequency higher than 60 Hz, just still low enough to be within the range I can see.
Yeah, that’s still normal (unless we’re both just special). When looking at the center of a 60 Hz CRT, the flickering is seen around the edges of the screen where I am not focusing, or across the whole screen if I look to the side of it. I also perceive LEDs flickering the same way you describe.
I’d guess the fact that we don’t see it in our focal vision probably has less to do with the physical attributes of the eye and more to do with the way our brains construct our perception of vision. There’s a lot going on there. Our eyes are constantly making rapid movements that we’re not conscious of and don’t perceive, there are two blind spots in our vision where the optic nerves connect to our retinas that we don’t perceive, and our brains invent the color we perceive in our peripheral vision, which the eye can’t physically detect there. Vision is weird and complicated.
Damn, so basically you can’t use any high-end smartphones due to a biological reason outside your control? 😥 (because all high-end smartphones use OLED)
Yup. It really sucks. Being able to see things that are very quick is like a superpower, but as far as I can tell there are only downsides.
Most modern OLED panels on TVs and monitors don’t actually use classic PWM for dimming; they never turn off completely and instead fluctuate between something like 100% and 95% brightness at the refresh rate.
Did you ever test if you can see that as well at different refresh rates?
rtings always tests this under “Image Flicker”. https://www.rtings.com/monitor/tests/motion/image-flicker
It’s not considered flicker-free but the OLED panels listed with 0 Hz PWM frequency (most of them) should look fine.
However, there are two other elements that might cause issues:
- VRR flicker
- ABL dimming in HDR
Both can cause an unpleasant experience if you are sensitive to it.
Phones still commonly use PWM because it uses less energy. There are some that have a DC dimming option but it’s rare.
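One way to put numbers on why that 100%-to-95% fluctuation is so much gentler than classic PWM is the standard “percent flicker” (modulation depth) metric, (Lmax − Lmin) / (Lmax + Lmin). The brightness figures below are just the ones mentioned in this thread, not measurements:

```python
# Percent flicker (modulation depth): (Lmax - Lmin) / (Lmax + Lmin) * 100.
def percent_flicker(l_max: float, l_min: float) -> float:
    return (l_max - l_min) / (l_max + l_min) * 100

# Classic PWM dims by switching fully off, so the depth is 100%.
print(f"classic PWM (full on/off): {percent_flicker(1.00, 0.00):.1f}%")
# The 100%..95% OLED fluctuation described above is only ~2.6% deep.
print(f"OLED at 100%..95%:         {percent_flicker(1.00, 0.95):.1f}%")
```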
This sort of flickering can be really noticeable, especially at low brightness, with the always-on display for example (although still nowhere near as bad as 60 Hz CRT flicker, *shudders*).
But I honestly do not believe that you’re able to see 4000+ Hz flickering. If you genuinely can, I’m sure you could get a world record for that.
I’ve used OLED on phones (my current one was free because of a cracked screen) and like the idea,* but considering I have a super-budget desktop (old stuff, unlikely to upgrade) and keep mostly to free/old content, I’ll stick to whatever low-tier 1080p displays are already in my home.
Maybe OLED multi-touch, if it weren’t an upsell in a niche market. Realistically, once you add in burn-in fear, either I get some second-hand laptop/tablet that has it (with a bad or missing battery) or some new manufacturing tech solves burn-in; either way, probably not for me in the next 10 years.
It might make more sense for VR immersion, though again, between cost and specs (cost again) plus whatever lock-in nonsense (which I already saw with the Oculus stuff via a family member who will likely never unlock dev mode), probably not for me.
* particularly for the contrast ratio (off pixels), though unless you’re into horror stuff this seems like a bit of a gimmick (even space content is not a guaranteed fit). It’s either that or making my own OLED edits of movies, which I find unlikely to work well via a blind edit (as I don’t expect a script to be perfect).
My WOLED monitor vs my old main.
It’s amazing. With my black theme, a black background, and the mouse off the monitor, you can’t even tell the thing is on.
I have a solid black color as background and a hidden task bar on my OLED monitor.
It’s just a mouse cursor floating in nothingness.
I just don’t understand why I find it so cool that the only lights that are on are the lights giving you information.
I’m glad there are other people like you who stop and appreciate it!
Goddamn, you just sold me.
Be careful though: with the contrast between my old IPS and my new monitor, all it took was a Best Buy gift card bonus for Christmas for me to justify an open-box return for a second one.
It does matter. But all my big displays are still LCD, because of cost.
It’s about black point. With an LED-backlit LCD, pixels that are black are still backlit. This makes them a kind of grey.
With OLED, the pixels themselves emit light. This means that black pixels are unlit.
The difference is obvious in a dimly lit room looking at dark content.
That said, while I would love OLEDs all around, they’re expensive. I’m willing to give up having true blacks for the cost difference. It may be different as costs on OLED come down.
I do have an OLED phone, because Samsung is pumping out OLEDs on everything.
OLED also matters more on phones because such a large fraction of their power use goes to the display (apparently up to 80% at max brightness on a task that doesn’t require much computing power). A desktop would need one hell of a multi-monitor setup to get remotely close, plus you aren’t as concerned about power usage when there’s no battery to deplete.
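Rough arithmetic on why that matters (all numbers here are illustrative assumptions: the ~80% display share from the comment above, plus a guess that a mostly dark UI takes the panel to roughly a quarter of its white-screen draw):

```python
# Ballpark: effect of dark content on an OLED phone's battery life.
# Assumptions (illustrative, not measured): the display is 80% of total
# draw at max brightness on light content, and OLED panel power scales
# roughly with emitted light, so a dark UI draws ~25% of the
# white-screen panel power.
display_share = 0.80
dark_panel_factor = 0.25

total_dark = (1 - display_share) + display_share * dark_panel_factor
print(f"relative total power: {total_dark:.2f}x")      # 0.40x
print(f"battery life:        ~{1 / total_dark:.1f}x")  # ~2.5x, idealized
```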
It does matter, but does it justify spending 50% more on a product? You’d only notice the difference in side-by-side testing anyway. Ignorance truly is bliss.
I said it matters: “the difference is obvious”.
But for me, it does not justify the cost difference at the current time.
You can definitely tell LED from OLED easily. No need for a side-by-side test.
Reposting from a past comment of mine. TL;DR: I took the plunge on an OLED TV in 2021 as a primary monitor and it’s been incredible.
I’ve been using an LG C1 48" OLED TV as my sole monitor for my full-time job, my photography, and gaming since the start of 2021. I think it’s at around 4,500 hours of screen time (up from the ~3,000 I estimated when I first posted this). It averages over 10 hours of on time per weekday. It typically stays around 40 brightness because that’s all I need, being fairly close to my face given the size. All of the burn-in protection features are on (auto dimming, burn-in protection, pixel rotation) but I have Windows set to never sleep for work reasons.
Burn-in has not been a thing. Sometimes I leave it on overnight with a spreadsheet open or a photo being edited because I’m dumb. High-brightness, high-contrast areas might leave a spot visible in certain greys, but by then the TV will ask me to “refresh pixels” and it’ll be gone when I next turn it on. The taskbar has not burned in.
Experience for work, reading, dev: 8/10
Pros: screen real estate. One 48" monitor is roughly four 1080p 22" monitors tiled (see the quick PPI check below). The ergonomics are great. Text readability is very good, especially in dark mode.
Cons: sharing my full screen is annoying to others because it’s so big, and the video camera has to be placed a bit higher than ideal, so I’m at a slightly-too-high angle for video conferences.
This is categorically a better working monitor than my previous cheap dual-4K setup, but text sharpness is not as good as a high-end LCD with retina-like density, because of 1) the lower pixel density and 2) the OLED subpixel configuration, which is not as good for text rendering. This has never been an issue for my working life.
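A quick sanity check of those size and density claims; pixels per inch follows from resolution and diagonal. The 27" 5K line is my addition as a reference point for “retina-like” density:

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f'48" 4K:    {ppi(3840, 2160, 48):5.1f} PPI')  # ~91.8
print(f'22" 1080p: {ppi(1920, 1080, 22):5.1f} PPI')  # ~100.1, close to the 4K
print(f'27" 5K:    {ppi(5120, 2880, 27):5.1f} PPI')  # ~217.6, "retina-like"
```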
Experience with photo and video editing: 10/10
Outside of dedicated professional monitors which are extremely expensive, there is no better option for color reproduction and contrast. From what I’ve seen in the consumer sector, maybe Apple monitors are at this level but the price is 4 or 5x.
Gaming: 10/10
2160p at 120 Hz HDR with 3 ms lag, perfect contrast, and extremely good color reproduction.
FPSs feel really good. Anything dark/horror pops. There’s a lot of real estate for RTSs. Maybe flight sim would have benefited from a dual-monitor setup?
I’ve never had anything but a good gaming experience. I did have a 144 Hz monitor before, and going to 120 is marginally noticeable for me, but I don’t think it’s detrimental at the level I play (I suck).
Reviewers had mentioned that it’s good for consoles too, though I never bothered.
Movies and TV: 10/10. 4K HDR in a dark room is better than theaters’ picture quality. Everything I’ve thrown at it has been great.
Final notes/recommendations: This is my third LG OLED and I’ve seen the picture quality dramatically increase over the years. Burn-in used to be a real issue; greys were trashed on my first OLED after about 1,000 hours.
Unfortunately, I have to turn the TV on with the remote every time. It does automatically turn off on no signal after the computer’s screen-sleep timer fires, which is a good feature. There are open-source programs which get around this; Bazzite and Macs seem to handle it too. This TV has never been connected to the Internet… I’ve learned my lesson with previous LG TVs. They spy, they get ads, they have horrendous privacy policies, and they push updates which kill performance or features… Just don’t. Get a streaming box.
You need space for it, width- and depth-wise. The price is high (around 1k USD on sale when I bought it; prices are even lower now) but not compared with gaming monitors, and especially not compared with two gaming monitors. Pixel rotation is noticeable when the entire screen shifts over a pixel or two. It will also mess with you if you have reference pixels at the edge of the screen. This can be turned off.
Burn-in protection is also noticeable on mostly static images. I wiggle my window if it gets in my way. This can also be turned off.
You can block all the ads/spying with pihole and only let the streaming apps through. Then you can still use the native apps but have zero ads.
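For anyone curious what that looks like mechanically: Pi-hole is essentially a DNS sinkhole, answering queries for blocklisted domains with a dead address so the TV’s telemetry never connects, while everything else resolves normally. A toy illustration of the idea (not Pi-hole’s actual code; the domains are made-up placeholders):

```python
# Toy DNS-sinkhole logic in the spirit of Pi-hole: blocklisted names get
# a dead address, everything else resolves normally. The domains below
# are invented placeholders, not real TV-vendor or streaming endpoints.
BLOCKLIST = {"ads.example-tv-vendor.com", "telemetry.example-tv-vendor.com"}
ALLOWLIST = {"app.example-streaming.com"}  # explicitly let streaming through

def resolve(domain: str) -> str:
    if domain in ALLOWLIST:
        return "forward to real upstream DNS"
    if domain in BLOCKLIST:
        return "0.0.0.0"  # sinkholed: the connection goes nowhere
    return "forward to real upstream DNS"

for d in ("telemetry.example-tv-vendor.com", "app.example-streaming.com"):
    print(d, "->", resolve(d))
```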
I really want to try pihole or AdGuard.
I don’t have experience with adguard but I’ve been using pihole for years. I’d never want to have a home network without one.
I’ve had a mid-tier OLED tv the last few months. The colors and contrast look phenomenal to me. You get true black on OLED since each pixel is individually lit.
I watch a lot of horror films, which have many dimly lit or nighttime scenes. OLED makes those scenes much easier to see because of the increased contrast between dark and light.
OLEDs look great, but I’m severely allergic to the concept of burn-in. I’m not interested in technology with such a comparatively short life span, and I don’t want to think about auto-hiding UI elements either.
Modern ones have anti-burn-in features, so when they detect fixed elements they turn them down in intensity. For what it’s worth, mine is going on 4 years of regularly playing a game with a fixed UI and no issues so far, touch wood.
I like to read in the dark, and a black OLED screen with white text is so much more comfortable than even an e-ink screen for me.
LCDs are good for price, I guess. All my big monitors are LCD, but my phone has to be OLED.
I noticed that every Samsung phone is now OLED (branded as “AMOLED”), even the lowest of the budget A series.
Probably cheaper to just do OLED for the mobile lineup than to split manufacturing.
The strange nature of mass producing parts…
OLED every time. The original PS Vita was far superior to the remake with the LCD, and the OLED Steam Deck is way better than the first model. Also the Switch OLED screen is very nice, but the Switch is garbage in general so screw it.
Even the most subtle burn-in bothers me, but grey instead of black doesn’t. LED-backlit LCD is better than OLED for me.
As soon as there is a technology with the same colours as OLED and absolutely no burn-in (and my existing displays get too old), I’ll consider it.