I'm surprised AV1 usage is only at 30%. Is AV1 so demanding that Netflix clients without AV1 hardware acceleration capabilities would be overwhelmed by it?
Thanks to libdav1d's [1] lovingly hand-crafted SIMD assembly it's actually possible to play back AV1 reasonably well without hardware acceleration, but basically yes: hardware AV1 decoding only shows up from Snapdragon 8 onwards, Google Tensor G3 onwards, and the NVIDIA RTX 3000 series onwards. All relatively new.
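A quick way to sanity-check software decode speed on your own CPU (a minimal sketch, assuming an ffmpeg build that includes the libdav1d decoder; the input filename is a placeholder):

    # Force software AV1 decoding via dav1d, discard the output, and print timing stats.
    ffmpeg -hide_banner -benchmark -c:v libdav1d -i sample_av1.mkv -f null -

If the reported decode time stays comfortably below the clip's duration, software playback is feasible on that machine.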
There are a lot of 10 year old TVs/fire sticks still in use that have a CPU that maxes out running the UI and rely exclusively on hardware decoding for all codecs (e.g. they couldn't software-decode h264 either). Imagine a super budget phone from ~2012 and you'll have some idea of the hardware capability we're dealing with.
Compression gains will mostly be for the benefit of the streaming platform’s bills/infra unless you’re trying to stream 4K 60fps on hotel wifi (or if you can’t hardware-decode last-gen codecs either). Apparently streaming platforms still value user experience enough not to make users' devices heat the room for no observable improvement. Also, a TV CPU can barely decode a PNG still in software; video decoding of any kind is simply impossible.
If you are on a mobile device, decoding without hardware assistance might not overwhelm the processors directly, but it might drain your battery unnecessarily fast?
I'd love to watch Netflix AV1 streams but they just straight up don't serve it to my smart TV or my Windows computers despite hardware acceleration support.
The only way I can get them to serve me an AV1 stream is if I block "protected content IDs" through browser site settings. Otherwise they're giving me an H.264 stream... It's really silly, to say the least
From mobile or tablet? Currently hosting (and audio sharing) is only possible from desktop, unfortunately.
Anyways, thank you for your comment. We'll investigate. If you want us to follow up with you, please drop us your email using the FEEDBACK form on the left side, referring to this message.
If it was a few large files as opposed to many small ones, this is totally believable. iPhones have Wi-Fi 6E chips, and an ad hoc network where the devices are right next to each other can actually reach the theoretical max speed of the protocol (as opposed to real-world connections to a base station, which never do). I've never measured it precisely but I've transferred ~1 GB disk images over AirDrop in a couple seconds.
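Back-of-envelope on those numbers (assuming a 1 GB file and the "couple seconds" mentioned above; the timings are just illustrative):

    # rough effective throughput for a 1 GB transfer
    echo "2 s: $(echo "scale=1; 1*8/2" | bc) Gbit/s"
    echo "3 s: $(echo "scale=1; 1*8/3" | bc) Gbit/s"

That works out to a few gigabits per second, i.e. multi-gigabit territory rather than the hundreds of megabits typical of a real-world base-station connection.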
That's all the most basic bookkeeping; I can't take the argument "$x/user × every employee adds up" seriously.
Also, by 20 employees or computers at the latest, someone in charge of IT (sysadmin, IT department) would decide to use a software asset management tool (aka software inventory system) to automatically track, roll out, uninstall, and monitor vetted software. Anything else is just unprofessional.
Well, all the downstream distros have their own installers (apt, dnf, pacman, etc.). If you're compiling from source, then "make install" [0] should work as expected, and if you're downloading the pre-built binaries from GitHub [1], you just need to copy a single statically-linked binary into "/usr/local/bin".
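For the GitHub-release route, a minimal sketch (the release URL, version, and binary name are placeholders, not the project's actual artifact names):

    # download a prebuilt, statically linked binary and install it (placeholder names)
    curl -LO https://github.com/example/tool/releases/download/v1.2.3/tool-linux-amd64
    sudo install -m 0755 tool-linux-amd64 /usr/local/bin/tool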
Why not have Debian's guix package closely follow vanilla guix's releases? Is it because Debian wants to guarantee that a Debian release (such as trixie) only provides packages that stick to at most bugfix versions, such that there are no breaking changes introduced?
It might have been possible to upload something like a 1.4.0+git2025mmdd package, if not for the timing of the CVE announcement relative to Debian's release freeze.