AI Television

LG's 2024 OLED TVs Put a Bigger Focus on AI Processing Than Ever Before (theverge.com) 37

LG touts AI for its 2024 OLED TVs, but don't expect AI assistants onscreen. The Alpha 11 processor in LG's new G4 and M4 series aims to sharpen clarity, color and image quality. The G4 features LG's Micro Lens Array technology for enhanced brightness. The M4 adopts 2023's wireless connectivity to eliminate unsightly cables. The Verge adds: So the AI supposedly now understands creative intent, according to LG, and can adjust your TV's image settings accordingly. Picture purists can always ignore and disable these AI modes, but many people inevitably leave them on -- so if the upgrades are noticeable, they'll be a difference maker for those customers.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • How many microphones are included with the TV? I'm sure they think it would be a shame to miss out on all that juicy monetization... and it's not like any of these companies come right out and say what they're doing.

    Alphabet and Meta are the lords of over-collection of user data, though. If you have a mic or camera near them, rest assured your conversations are being recorded and sifted through.

    It's okay though, they aren't breaking any laws! We're not foolish enough to pass any sort of data protection laws.

    • by xeoron ( 639412 )
      YES, very important question... although it matters not if you do not connect it to WiFi :) In my experience setting up LG TVs/projectors, they will demand WiFi access to upgrade the firmware, yet will let you skip it. Personally, I will upgrade the firmware and then tell the TV to forget the wireless network. My current LG screen has 4K upscaling it calls something like "AI 4K upscaling", and it helps image quality, yet it delays all audio, so it is unusable.
  • ...for end consumers is that the menu response, app selection, and general usage experience are fast.

    Everything else is fancy buzzwords for making their TVs better from a tech-spec point of view.

    The two things that irritate consumers the most:

    1) Ads and bloatware forced upon them with new TVs.
    2) Smart TV slowness, and unsupported apps.

    • by Calydor ( 739835 )

      After spending an hour adjusting a new TV this Christmas, I'd put "obscure settings to make the sound not sound like you're at a heavy metal concert" at a very close 3).

      • That includes atrocious eARC support. But that seems to be a problem for TV makers along with receiver manufacturers.
  • by MobyDisk ( 75490 ) on Wednesday January 03, 2024 @11:15AM (#64127801) Homepage

    A TV's job is to display the image as provided. If you want to make AI enhancement tools, make plug-ins for color grading software like DaVinci Resolve [wikipedia.org], AutoDesk Lustre [wikipedia.org], etc. so that cinematographers can use it when appropriate. Anything that changes the image at the TV level, especially in a subjective way, has degraded it.

    • by pr0nbot ( 313417 )

      The one and only thing I would like, in terms of altering the broadcast image, is some kind of AI upscaling of standard-definition content. On a 65-inch TV, broadcast SD can look pretty awful, and DVDs, while free of the heavy broadcast compression artifacts, look soft-focus. You'd have to turn every 1 pixel of information into 16, and perhaps AI (or AIs, selected depending on content type) could do a better job than standard upscaling techniques.
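[Ed. note] The 1-pixel-into-16 arithmetic above is just a 4× linear upscale. A minimal baseline sketch (nearest-neighbour copying, which is what an AI upscaler would try to beat by predicting the 15 new pixels from context instead of cloning):

```python
import numpy as np

def upscale_4x_nearest(img: np.ndarray) -> np.ndarray:
    """Baseline 4x nearest-neighbour upscale: each source pixel
    becomes a 4x4 block (1 pixel of information -> 16 pixels).
    A learned upscaler would predict the new pixels instead."""
    return np.repeat(np.repeat(img, 4, axis=0), 4, axis=1)

# A 480x640 SD luma frame becomes 1920x2560 pixels.
sd = np.zeros((480, 640), dtype=np.uint8)
uhd = upscale_4x_nearest(sd)
print(uhd.shape)  # (1920, 2560)
```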

      • by NFN_NLN ( 633283 )

        If you want an authentic viewing experience... get a smaller TV or sit further back.

        • by pr0nbot ( 313417 )

          HD looks fantastic. (My eyes can't really tell the difference between UHD and HD.)

          I'd prefer not to shuffle the sofa back and forth depending on the definition of what I'm watching :)

          • So true... I'm pretty sure (at 71 years of age) my eyes are barely HD let alone UHD. In fact, sometimes I think SD looks "good enough".

            Age is a bitch... except that sometimes it allows you to use cheaper/older gear without noticing the difference :-)

    • by Saffaya ( 702234 )

      I disagree with that thought because you already alter the image with brightness and contrast settings on your display.
      Not to mention the differences induced by the type of screen itself, from dull TN LCD to OLED.

      • by MobyDisk ( 75490 )

        Hmmm... There is a key difference here.

        Brightness and contrast are objectively measurable values that one adjusts to match the viewing conditions of the room to the panel technology, such as TN LCD or OLED. I would not object to the TV having a calibration mode that automatically adjusts the image brightness and contrast to match the room brightness or the viewing angle according to some objective criteria.

        If this was merely compensating for capabilities of the screen, I would understand. Like, if the screen use

      • by MobyDisk ( 75490 )

        I'm still thinking about your comment, and I can think of another case where I would find this acceptable: suppose a TV could display a wider color gamut than the source. In that case, the manufacturers have 2 choices: waste the wider color gamut capability, or stretch the colors. The first is acceptable, but the second could result in oversaturation or a wrong white balance. So the only way to exploit that gamut is to stretch it intelligently, and AI is probably the only thing that can do that reasonably.

        And, s
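[Ed. note] The naive version of the gamut stretch described above, and the oversaturation risk it carries, can be sketched in a few lines. This is an illustrative toy (not LG's algorithm): it scales saturation uniformly in HSV space, which is exactly the "dumb" stretch that a smarter mapper would avoid by protecting near-neutrals and skin tones:

```python
import colorsys

def stretch_saturation(rgb, factor=1.2):
    """Naive gamut stretch: scale HSV saturation by `factor`.
    Neutral pixels (s == 0) are unaffected; already-saturated
    pixels are clamped, which is where oversaturation and
    white-balance errors creep in."""
    r, g, b = rgb  # each channel in [0.0, 1.0]
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb(h, min(s * factor, 1.0), v)
```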

    • It's already degraded from what the cinematographers intended, because you're looking at lossy compressed video rather than the 422 ProRes print or whatever they work with in the edit bay.

      There's an argument to be made that an AI filter might actually get rid of some of the compression artifacting and color dithering that remains present after compression / encoding, transmitting, and decompressing / decoding the video stream to make what is being shown more accurate to the cinematographer's vision.

      • by MobyDisk ( 75490 )

        Fair enough, and some of these are gray areas, but removing compression artifacts is 100% the job of the player not the TV. The player knows what the macroblock size is, and what compression algorithm was used. This is similar to the issue with motion interpolation: the compressed stream contains the motion vectors that the compressor wrote into the file, so the only way to add motion interpolation without introducing delays is to do it at the point of decompression. By the time the signal goes to the TV

    • A TV's job is to display the image as provided.

      You're right. Unfortunately the image as provided was graded for viewing in a dimly lit room with a certain expectation for not just visual performance but also the environment. While many people are capable of meeting the former, very VERY few are capable of meeting the latter requirement and therefore TVs need to compensate.

      My TV also has a colour accurate mode which perfectly meets the cinema standard used by the industry. And when it's on I can't see shit during the day, and no I'm not going to close bl

  • Sharper than my aging eyes

  • Buzzwords (Score:4, Funny)

    by dysmal ( 3361085 ) on Wednesday January 03, 2024 @11:26AM (#64127817)

    AI - Great! There's no blockchain integration? No Crypto transactions? I'll pass!

  • by 0xG ( 712423 ) on Wednesday January 03, 2024 @11:27AM (#64127821)

    'Smart' TVs that call home with my viewing habits can FOAD.

    • Historically you could completely shut off the smart stuff on the C models, including the menu at launch. I have a C8 that I really love, and it hasn't been connected for several years. I did connect it for the first six months to keep up on firmware as the model stabilized. But now it behaves exactly as a dumb monitor should.

  • by Artem S. Tashkinov ( 764309 ) on Wednesday January 03, 2024 @11:28AM (#64127829) Homepage

    aims to sharpen clarity, color and image quality

    Even setting aside that this sentence is rubbish in terms of English (sharpen color? sharpen image quality? what is that?), is this tech even right for the consumer? I mean it sounds like you will be watching something completely different than what the camera/operator shot (or rendered) in the first place.

    • All that sharpness sounds painful.
    • While I agree with what you're saying, there is some added value to having some filtering smarts on the receiving end to clean up potential compression artifacting and color dithering.

      Some compressed streams look like absolute shit in minute color gradients, such as the sky or underwater - you can see lines between the shades of blue where the compression took a shit all over the color blending. If it fixes that, it's probably worth doing, as you are actually returning to what the director wanted you to se
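[Ed. note] The banding described above comes from quantized steps in smooth gradients. One crude debanding approach, sketched here as a simplified illustration (not any TV's actual filter), is to add low-amplitude dither noise so the quantization steps are no longer visible as hard lines:

```python
import numpy as np

def deband_with_dither(img: np.ndarray, strength: float = 1.5,
                       seed: int = 0) -> np.ndarray:
    """Crude debanding: add low-amplitude uniform noise to an
    8-bit image so quantized steps in smooth gradients (sky,
    underwater) are broken up. Real debanding filters also
    detect and smooth across the band edges themselves."""
    rng = np.random.default_rng(seed)
    noise = rng.uniform(-strength, strength, img.shape)
    return np.clip(img.astype(np.float64) + noise, 0, 255).astype(np.uint8)
```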

    • Re: (Score:2, Interesting)

      aims to sharpen clarity, color and image quality

      Even setting aside that this sentence is rubbish in terms of English (sharpen color? sharpen image quality? what is that?), is this tech even right for the consumer? I mean it sounds like you will be watching something completely different than what the camera/operator shot (or rendered) in the first place.

      That's completely beside the point. The point is "AI = good" therefore "more AI = more good."

      I can't imagine why a consumer level television needs AI, unless it's just being used to cover up "collect all user data imaginable and aggregate into useful advertising info before sending home." Anything else? Color correction? Audio adjustment? All that shit's been covered by automated systems for ages if you're a professional creator, or recommendation systems if you're an amateur creator that do just as well if

    • by teg ( 97890 )

      aims to sharpen clarity, color and image quality

      Even setting aside that this sentence is rubbish in terms of English (sharpen color? sharpen image quality? what is that?), is this tech even right for the consumer? I mean it sounds like you will be watching something completely different than what the camera/operator shot (or rendered) in the first place.

      AI (or rather, ML) upscaling can do wonders for old, low resolution material - some examples: Babylon 5 [youtube.com], Star Trek Voyager [youtu.be], and Married with children [youtu.be]. It's an often superior alternative to various pixel interpolation techniques.

    • Technically you absolutely can sharpen colour. You do so by applying edge contrast to the U and V channel while leaving the Y' channel untouched. This is something actively done to clean up images shot with 4:1:1 or 4:2:0 subsampling.

      Also technically you are always watching something different than the camera operator shot or rendered in the first place. When a movie is produced it is colour graded with expectations not only for your equipment, but also for your environment. Very few movies are watched with
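[Ed. note] The chroma-sharpening technique described above (edge contrast applied to U and V while Y' stays untouched) can be sketched as follows. The BT.601-style matrix and the 3x3 box blur are illustrative simplifications, not a production filter:

```python
import numpy as np

# Full-range BT.601-style RGB -> Y'UV matrix (illustrative values).
RGB2YUV = np.array([[ 0.299,  0.587,  0.114],
                    [-0.169, -0.331,  0.500],
                    [ 0.500, -0.419, -0.081]])

def box_blur(ch: np.ndarray) -> np.ndarray:
    """3x3 box blur of a single plane, with edge padding."""
    p = np.pad(ch, 1, mode='edge')
    h, w = ch.shape
    return sum(p[i:i + h, j:j + w]
               for i in range(3) for j in range(3)) / 9.0

def sharpen_chroma(rgb: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Unsharp-mask only the U and V planes; Y' is untouched,
    so luma detail is preserved while colour edges tighten."""
    yuv = rgb @ RGB2YUV.T
    for c in (1, 2):  # chroma planes only
        yuv[..., c] += amount * (yuv[..., c] - box_blur(yuv[..., c]))
    return yuv @ np.linalg.inv(RGB2YUV).T
```

A neutral gray image has zero chroma, so it passes through unchanged, which is a quick sanity check that the luma path really is untouched.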

  • ...It's got AI.

    But why has it got AI? Have you ever stopped to think?

    Because that's what TVs crave.

    Yeah, it's got AI.

    TVs need AI because reasons.
  • "the ingenious AI processor adeptly refines colors by analyzing frequently used shades that best convey the mood and emotional elements intended by filmmakers and content creators" Ingenious, adeptly? I want to see an A/B comparison on some films I'm familiar with. The 97" screen starts at $25k - https://www.lg.com/us/tvs/lg-o... [lg.com] ouch! I'll have to sit a little closer to get a reasonably priced TV. Currently I have a 109" screen at maybe 17' distance.
  • Now only if I didn't have to deal with their garbage "smart" WebOS bullshit.

    Does this magic AI crap work while blocked from accessing the Internet, on HDMI video streams from external devices? Because otherwise it will make absolutely no difference in my home if I was to buy an LG TV, which I will not.

    • Yes it does, at least for the CX series (2020 version).
      It's quite a good screen IMO. Just don't use anything except the HDMI ports.

  • Remember 60fps+ "soap-opera mode" on current TVs, which enraged filmmakers and made film-based content look unrealistic and hard to watch. Buddy, just wait for...

    “precise pixel-level image analysis, to effectively sharpen objects and backgrounds that may appear blurry.”

    ... and ...

    "fine-tunes brightness and contrast by analyzing variations in brightness where light enters the scene, creating images that look more three-dimensional."

    Imagine all those carefully shot shallow depth-of-fi
  • The hardware sucks so you distort reality to make it look good.

  • by Fly Swatter ( 30498 ) on Wednesday January 03, 2024 @12:28PM (#64128051) Homepage
    Is AI really going to fix what is no longer there? I think not. Even original analog TV was lossless, and until digital gets to that point, smoothing and interpolating pixels won't fix stupid compression.
    • This isn't about fixing what isn't there, it's about making sure what is there can actually be seen. So many movies and modern TVs are only able to be viewed in the dark in a colour correct way. This is about making sure they can also be seen in the day or with the living room lights on.

  • No, larding up a television with more spyware isn't going to fix a screen.
  • I picked up a 65" G3 a few months ago and the brightness boost without color washout is real. Great TV.
