The fact that Jellyfin lacks an Apple TV/tvOS app seems like it continues to be a dealbreaker... at least for my setup.
I hear people recommending clients like Infuse, but it feels odd to swap out Plex at this point if I can't go all in on the open source side of things.
Am I missing something here wrt Jellyfin clients? I guess I could try running it side-by-side with Plex and see how it goes.
I really like the Infuse + Jellyfin setup.
Only two things bug me:
1: Choosing a movie and then a cast member won't show all shows/movies for that cast member, only the cached ones. No big deal, but a bit of a pet peeve.
2: And I think this might not be solvable from Jellyfin, but more than one version/quality of a TV show episode shows up as a separate episode rather than as a version of the same episode.
Might not be a Jellyfin issue, since Infuse can't handle that standalone either. Haven't tried the Jellyfin clients to see the difference there.
There's Swiftfin, Jellyfin Mobile, and Streamyfin, at least. My forthcoming iOS-only music player has first-class Jellyfin support (beta sign-up: https://forms.gle/AGLePh9RtaYEfDH6A) if you're looking for a dedicated, offline-capable music app.
> Am I missing something here wrt Jellyfin clients?
Unfortunately, I don't think so. I had many issues with playback on ATV using Swiftfin. Infuse works very well, so it is worth the ~$15 yearly to me. I am hopeful that Swiftfin will improve over time; they have a few dedicated developers working on it.
I have Kodi running on a raspberry pi plugged into my Google TV. The Jellyfin plugin for Kodi works flawlessly so far for me. It’s just great! Sure if I could put Jellyfin directly on the TV, that would save me the RPi. But not a big deal for me.
Jellyfin has Swiftfin; I've been using it for a few years now.
There are some small bugs that you can work around. The rework to the new version has been in progress for about two years but it works just fine right now.
Small bugs? Maybe. But there's a lot of missing functionality and stability. I'd recommend Infuse to anyone hitting those problems. If it has been running fine for you, then there's no need to switch.
The problem is related to the source codec. Depending on that, you'll have a different experience. That's why the experience varies: there are vast differences in source formats.
A good client handles not just some sources well, but many if not all.
I think you’re after something other than immutability then.
You're allowed to rebind a var defined within a loop; that doesn't mean you can't hang on to the old value if you need to.
With mutability, you actively can’t hang on to the old value, it’ll change under your feet.
Maybe it makes more sense if you think about it like tail recursion: you call a function and do some calculations, and then you call the same function again, but with new args.
This is allowed, and not the same as hammering a variable in place.
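A minimal sketch of that analogy in Zig (countUp is just an illustrative name, nothing from the thread; each call binds i fresh rather than mutating anything in place):

const std = @import("std");

fn countUp(i: u32, limit: u32) void {
    if (i >= limit) return;
    std.debug.print("foo {}\n", .{i});
    // Call the same function again, but with new args; the old `i` is untouched.
    countUp(i + 1, limit);
}

pub fn main() void {
    countUp(0, 5); // prints foo 0 through foo 4
}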
for (0..5) |i| {
    i = i + 1;
    std.debug.print("foo {}\n", .{i});
}
In this loop in Zig, the reassignment to i fails, because i is a constant. However, i is a new constant bound to a different value each iteration.
To make it clearer that this is not mutation of a constant between iterations: technically, &i could change between iterations and the program would still be correct. That is not true of a C-style for loop using explicit mutation.
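For comparison, here is a version of that loop that should compile (assuming a recent Zig with the 0.11+ range-for syntax): instead of reassigning the capture, you bind a new constant each iteration.

const std = @import("std");

pub fn main() void {
    for (0..5) |i| {
        // `i` is a fresh constant each iteration; to use a derived value,
        // bind a new constant instead of reassigning the capture.
        const next = i + 1;
        std.debug.print("foo {}\n", .{next});
    }
}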
I argue in your example there are 6 constants, not 1 constant with 6 different values, though this could be semantics, i.e. we could both be right in some way.
I'm curious how/what you use the screens via AR for if you're not using environment locked screens. Particularly in a productivity/work environment.
Unless I'm misunderstanding the feature, it seems like environment-locked screens allow for more natural usage and interactions with the screens in the virtual space?
My experience with VR/AR products like Oculus has been mostly with environment-locked AR information.
I suppose they could? I prefer having my posture decoupled from what I'm looking at though.
It's like having a very nice monitor that uses ~1 watt of power and happens to be positioned exactly wherever is most comfortable without even having to think about it. It's way better than a normal monitor if you don't have to do e.g. pair programming.
How are you finding the focus? I use the Xreal Air 2, but the edges are blurry, and I can't get the glasses close enough to my face to see the entire screen in focus, even if the top of the glasses is touching my forehead.
That's interesting. I think mine are the original Xreal glasses? I had to slightly adjust the nose part, but otherwise the focus has been great for me. It would be a shame if they're regressing.
I think the use-case for these is more VR focused, with the AR just being a "being able to notice when something needs your attention" feature (where you would respond to such an interrupt by taking the glasses off, not by trying to look at the interrupting thing through the glasses.)
I've heard people propose that these "screen in glasses" devices (like the Xreal Air) are useful for situations where you want a lot of visual real-estate but don't have the physical room for it — like in a dorm room, or on a plane. (Or at a library/coffee shop if you're not afraid of looking weird.)
---
Tangent: this use-case could likely just as well be solved today with zero-passthrough pure-VR glasses, with a small, low-quality outward-facing camera+microphone on the front, connected only to an internal background AI model† running on its own core, that monitors your surroundings in order to nudge you within the VR view if something "interesting" happens in the real world. That'd be both a fair bit simpler/cheaper to implement than camera-based synced-reality AR, and higher-fidelity for the screen than passthrough-based AR.
† Which wouldn't even need to be a novel model — you could use the same one that cloud-recording security cameras use in the cloud to decide which footage is interesting enough to clip/preserve/remote-notify you about as an "event".
I always like these new compile-time features getting into the C++ spec.
I'm actually looking forward to the related reflection features that I think are currently in scope for C++26. I've run into a number of places where the combination of reflection and constexpr could be really valuable... the current workarounds often involve macros, runtime tricks, or both.
I'm surprised by this, is Comcast super regional with its restrictions? I have a Comcast 1 gig plan in the Bay Area, and last I checked I get a small ($5?) discount for using my own modem. I've been on the plan for at least a few years now... so alternatively maybe I'm grandfathered in or something? Or maybe some Comcast salesperson was lying to you about your options?
My experience in the Bay Area - if you rent the gateway from Comcast ($25/mo) then you have no data cap. If you use your own modem and want to remove the data cap it costs $30/mo, more than renting the gateway. The data cap is 1.2TB per month in my area.
I think that is what the commenter meant: "...unless I pay a whole bunch of extra fees or accept a stupidly low monthly data cap"
(edit: I initially thought it was $15/mo for the gateway + no data cap but just checked and it is $25/mo. They are called "Xfinity Gateway" vs "xFi Complete").
Tell them it's a home office and get Comcast Business. There are no data caps on any of the tiers, and they allow use of any modem on their approved list.
My current residential price is $65/mo for 500Mbps/20Mbps. Business is $120/mo for 500Mbps/200Mbps ($105 for the first 24 months). I wouldn't mind getting a bit more sweet sweet upload. Maybe I will!
There is also "Gigabit Pro"/"Gigabit X10", where they run fiber to your house. That is $350/month for symmetric 10Gbps. Lots of limitations on availability and a big install fee, though. Gotta get the other half on board with that ;-)
I always wonder what are some ways to put 10 Gbps (well, even 1 Gbps) to good use in a home setting, besides marginally lower ping times. I'm not saying such uses don't exist, I'm just curious to know.
For me, the big win is everything being snappy and never having contention on my Internet connection. Maybe I could do with 500Mbps up and down just fine instead of a gigabit but I almost never hit the limits of my connection and that’s an amazing place to be. When I do hit the limits, it’s when I’m downloading a huge file and I’m very grateful for the speeds I have.
I’m not the first one to say this, but often it seems that faster Internet speeds have enabled completely new use-cases and applications that sometimes weren’t even obvious until a critical mass of people had the faster speeds.
Competition matters. Comcast/Xfinity was my only "choice" in Cambridge, MA. It cost about $70 per month for 100Mbps service.
My building in Oakland, CA has multiple options, including fiber. The Comcast folks set up tables at least once per quarter to help customers/residents. The cost was much lower. I now have gigabit fiber from Wave, and pay less than I did back in MA.
iOS isn’t killing OOM apps. It’s killing inactive apps. It’s something that wouldn’t fly on a desktop or server OS under general use, but works reasonably well in the mobile space.