The thing always missing from this discussion is that modern TVs _need_ powerful SoCs to look good. They haven't been "just take a signal from this input and display it" kind of devices for a long time, and _definitely_ not since HDR became a thing.
Your TV does _tons_ of processing to display SDR content (3:2 pulldown for 24fps content if the panel isn't 120Hz, motion/judder compensation, "sport mode", and a bajillion other things), and these get orders of magnitude more complicated when HDR gets into the picture (tone mapping is a Whole Thing!). And that's before we've even started talking about compensating for the physical quirks of different panel types.
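To give a flavor of what "tone mapping is a Whole Thing" means, here's a minimal sketch of one classic curve (extended Reinhard) squeezing HDR luminance into SDR range. Real TV pipelines are vastly more elaborate (per-scene metadata, color-volume mapping, panel-specific tweaks), so the function and parameter names here are purely illustrative:

```python
import numpy as np

def reinhard_tonemap(hdr_nits, hdr_peak_nits=1000.0, sdr_peak_nits=100.0):
    """Toy HDR->SDR tone mapping with the extended Reinhard curve:
    luminance at hdr_peak_nits lands exactly at sdr_peak_nits,
    shadows barely change, highlights get squeezed the hardest."""
    l = np.asarray(hdr_nits, dtype=float) / sdr_peak_nits  # luminance relative to SDR peak
    l_white = hdr_peak_nits / sdr_peak_nits                # input level that should map to 1.0
    mapped = l * (1.0 + l / l_white**2) / (1.0 + l)
    return np.clip(mapped, 0.0, 1.0) * sdr_peak_nits

# A 1000-nit highlight ends up at 100 nits; a 5-nit shadow stays ~4.8 nits.
print(reinhard_tonemap([5, 50, 200, 1000]))
```

Even this toy version shows the core tension: you can't keep both the highlights and the midtones where the master put them, so every vendor picks its own trade-offs, and that choice needs real compute to apply per pixel, per frame.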
And this isn't just the "motion smoothing" kind of processing (though it does that too!) that purists will want to disable. Some of it is configurable and can (should) be turned off, but a lot of it is simply what's required to make the image look good without messing with the "creative intent" or whatever.
OLED displays need another layer of processing to make sure they don't burn in, and another layer of logic when displaying near-black content (turning a pixel off and back on takes more time than switching its color; sometimes you don't want to turn a pixel off for a single frame if it's gonna be lit up again in the next one, etc., etc.). There are other considerations for other panel types, and you also have to worry about heat and a bajillion other things.
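As a toy illustration of that near-black point (invented logic for illustration, not anyone's actual firmware): with one frame of lookahead you can avoid cycling a pixel fully off and back on for a single frame:

```python
def stabilize_near_black(frames, floor_level=1, off_threshold=0):
    """Toy near-black stabilization for one pixel: if the pixel would be
    fully off for a single frame but lit again in the next, hold it at a
    dim floor level instead of cycling it off and on, since the off/on
    transition is slower than a brightness change.
    `frames` is a list of per-frame brightness values for one pixel."""
    out = list(frames)
    for i in range(1, len(frames) - 1):
        single_frame_off = (
            frames[i] <= off_threshold
            and frames[i - 1] > off_threshold
            and frames[i + 1] > off_threshold
        )
        if single_frame_off:
            out[i] = floor_level  # keep the pixel faintly lit for this one frame
    return out

# A one-frame dip to zero gets held at the floor; a sustained black stays black.
print(stabilize_near_black([12, 0, 14, 0, 0, 0]))  # -> [12, 1, 14, 0, 0, 0]
```

Now imagine making that decision for millions of pixels, every frame, alongside burn-in compensation and heat management, and the need for serious silicon becomes obvious.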
This is why you won't get a high-quality dumb TV.
You just physically can't make a high-end TV without putting a powerful SoC in it. And if you're doing that anyway, you might as well install Android on it for a nice OOTB experience.
The intersection of people who want the best picture quality you can get and people who are very... particular about what software runs on their TVs (and aren't satisfied with "just don't connect it to Wi-Fi") is not big enough to build a product for, to put it very delicately.
People who don't care about picture quality _might_ buy "dumb" TVs, but margins on those are so much smaller that nobody bothers.
When people say they want a dumb TV, they usually don't mean "without advanced silicon". They mean without the built-in media streamers, operating system, data collection, ads, and bloatware.
Those image-improvement algorithms don't run on a general-purpose CPU; there are specialized chips for that.
As I understand how those TVs are built, this is not true.
They are definitely running on the same _chip_, but you're right that (some of) those operations are not running on the generic ARM CPU cores. There is dedicated silicon for some of them for sure, but it's all part of the same SoC.
Take a look at something like the MediaTek Pentonic 1000: the product page talks about the included ARM CPU and GPU cores, but that SoC is responsible for _so much_. It dictates how many inputs your TV can have, it contains all the media-decoding blocks (AIUI, including things like ATSC!), and _it_ handles Dolby Vision, ALLM, and half the niceties of modern high-end TVs.
So everyone who doesn't have the engineering bandwidth/competency/money to build their own silicon for things like this (which includes, oh, say, Sony) is buying off the shelf, and if you're buying off the shelf, you're getting those cores!
That's actually an interesting angle. Sure, the low-margin price pressure has led to increased consolidation and to saving a few cents by having one chip instead of two - unremarkable. But that this exerts a kind of pressure to use the hardware you got anyway is not immediately obvious, and probably true. Why let it go to waste? Why not collect a bit of data here and there? Show a few ads where they're not that annoying; really, who would mind?
> TV does _tons_ of processing to display SDR content (24fps pulldown
I keep hearing this. How did we do it in 2005? Because I was definitely watching HD SDR content on large displays in 2005, and we didn’t have the monster SoCs back then that we do today.
In fact, it wasn't until modern "smart" displays started coming out maybe 10 years ago, cable switched over to HD content, etc., that I started noticing major issues with judder, lag, etc.