catapart's comments

Since you seem like you have practical knowledge here, I hope you don't mind me asking:

Would it change the equation, meaningfully, if you didn't offer any transcoding on the server and required users to run any transcoding they needed on their own hardware? I'm thinking of a wasm implementation of ffmpeg on the instance website, rather than requiring users to use a separate application, for instance.
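For concreteness, here's roughly what I'm imagining, sketched against the @ffmpeg/ffmpeg 0.12 API as I understand it (the file names and the 720p filter are just illustrative):

    import { FFmpeg } from '@ffmpeg/ffmpeg';
    import { fetchFile } from '@ffmpeg/util';

    // Transcode a user-selected File down to 720p, entirely in the browser.
    async function transcodeTo720p(file: File): Promise<Uint8Array> {
      const ffmpeg = new FFmpeg();
      await ffmpeg.load(); // downloads the wasm core on first use
      await ffmpeg.writeFile('input.mp4', await fetchFile(file));
      // scale=-2:720 keeps the aspect ratio and an even width for the encoder
      await ffmpeg.exec(['-i', 'input.mp4', '-vf', 'scale=-2:720', 'output.mp4']);
      return (await ffmpeg.readFile('output.mp4')) as Uint8Array;
    }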

Would you think a general user couldn't handle the workload (mobile processing, battery, etc), or would that be fairly reasonable for a modern device and only onerous in the high traffic server environment?


That's very much not what transcoding is for. You don't want transcoding so a client can render the video at a comfortable resolution. You need transcoding to save bandwidth. If you want the client to do the transcoding, you must send it the full raw video file, and either end of the connection may not have enough free bandwidth for that. The client may also not be able to transcode at all, depending on size and format.

You can, of course, do this anyway. PeerTube allows you to completely disable transcoding. But again, that means you're streaming the full resolution. Your client may not like this.

If realtime performance is your concern, I think PeerTube allows you to pre-transcode to disk. If there is a transcoded copy matching the client's request, the server streams it directly with no extra transcode step.

To answer your question: shifting transcode onto the client won't improve performance and will greatly increase bandwidth requirements in exchange for less compute on the server. You almost certainly do not want this.


Yep. As OP said: I meant the user could transcode the various versions on their machine and then upload each to the server. Sorry about the wording; I can see that it's vague.

I think GP meant making the user perform transcoding at upload time

> Would it change the equation, meaningfully, if you didn't offer any transcoding on the server and required users to run any transcoding they needed on their own hardware?

I think the user experience would be quite poor, enough that nobody would use the instance. As an example, a 4k video will be transcoded at least 2 times, to 1080p and 720p, and depending on server config often several more times. Each transcode job takes a long time, even with substantial hardware acceleration on a desktop.

Very high bitrate video is quite common now, since most phones, action cameras, etc. are capable of 4k30 and often 4k60.

> Would you think a general user couldn't handle the workload (mobile processing, battery, etc), or would that be fairly reasonable for a modern device and only onerous in the high traffic server environment?

If I had to guess, I would expect it to be a poor experience. Say I take a 5 minute video; that's probably around 3-5 GB. I upload it, then need to wait - in the foreground - for this video to be transcoded on a phone chip and then uploaded to object storage 3 times. People won't do it.
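For a sanity check on that size, the back-of-envelope math (assuming ~100 Mbps, which is in the typical range for 4k60 phone footage):

    // Back-of-envelope check on the "3-5 GB for 5 minutes" figure.
    // The ~100 Mbps bitrate is an assumption, typical of 4k60 phone video.
    const seconds = 5 * 60;
    const megabitsPerSecond = 100;
    const gigabytes = (seconds * megabitsPerSecond) / 8 / 1000;
    console.log(gigabytes); // 3.75 -> squarely in the 3-5 GB range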

I do like the idea of offloading transcode to users. I wonder if it might be suited for something like https://rendernetwork.com/ where users contribute idle compute to a transcode pool in exchange for upload & storage rights, and still get fire-and-forget uploads?


Right on. Thanks for the consideration!

I really appreciate you walking through that; it's an eye-opener! It seems like you not only deal with a considerable number of five-minute-or-longer videos, but at much higher quality than I was expecting, too.

I also like the idea of user-transcoding because, honestly, I think it's better for everyone? I would love it if every place I uploaded video or audio content offered an option to "include lower-quality variants" or something. Broadly, it's my product; I should have the final say on (and take responsibility for) the end result. And for high-quality stuff, the people who make it tend to have systems equipped to do that work better anyway, so they could probably get faster transcoding times by using their own systems rather than letting the server do it. Seems like a win-win, even outside of the obvious benefit of "make a whole lot of computers do only the work they each need done, instead of making a few computers do the work that everyone needs done". The only slight downside is the "average user" getting some extra options they don't understand, using them wrong, and then everyone hating your product. Yay, app development.


I think offering client side transcode as an option, with server side transcode available for those who don't want to do it client side, is compelling. I would probably do it, as I have a powerful home system that can transcode much faster than my cloud host (I do use the remote transcoding feature in PeerTube, though).

Very neat! And I completely respect the skill. I respect the effort even more!

That said, it's not 'hands down, one of the coolest 3D websites', at least that I've seen. It's all "technical", very little "design". For example, why is it 'isometric overhead'? There's no particular benefit in the view, and it's specifically harder to control than it would be with a 'chase'/'third-person' camera. It's not like this is an RTS or a city-builder-ish thing, where having an overhead layout works to your benefit. Rather, it's just easier to program a camera that never changes angles and input controls that never have to re-interpret camera position/rotation (lookat vector) to function correctly. And there's a kind of symmetry between a flat page and the "ground" that the truck drives on, so some parts of the web forms have been ported over to that.

Again, none of that is bad and especially none of it is wrong. It's very cool that it works and works so well (technical)! It's just that the design feels more "portfolio" than it does "best ux for interacting with the environment I've created and the paradigms I've invoked (vehicle control)".


> For example, why is it 'isometric overhead'?

That's exactly what design is. There's no technical obstacle to making it over-the-shoulder instead, but it changes the aesthetic. The animations focus on what the jeep does to things, so a racing view that helps you avoid running into things wouldn't be appropriate. It also changes how you see the assets. And you'd lose that 'RC Pro-Am' feel.

> Rather, it's just easier to program a camera that never changes angles and input controls that never have to re-interpret camera position/rotation (lookat vector) to function correctly.

Not really, you just put the camera on a spring arm attached to the vehicle. Vehicle movement isn't harder either. You get this stuff practically for free with any game engine.
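For example, a spring-arm chase camera is only a few lines in three.js terms; a minimal sketch, where the offset and damping constant are arbitrary assumptions:

    import * as THREE from 'three';

    // Chase camera: keep the camera at a fixed offset behind/above the
    // vehicle, easing toward that spot each frame like a damped spring.
    const offset = new THREE.Vector3(0, 4, -8); // behind and above (assumption)

    function updateChaseCamera(
      camera: THREE.PerspectiveCamera,
      vehicle: THREE.Object3D,
      dt: number,
    ): void {
      const desired = offset
        .clone()
        .applyQuaternion(vehicle.quaternion) // rotate the offset with the vehicle
        .add(vehicle.position);
      camera.position.lerp(desired, 1 - Math.exp(-5 * dt)); // frame-rate independent easing
      camera.lookAt(vehicle.position);
    }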


What do game engines have to do with this?

You're welcome to your counter-opinion about the design, but you haven't convinced me. I've played plenty of games with third-person views where the gameplay was quite conducive to running into things. I can also appreciate that the design is faux-retro, but that's kind of my whole issue with it. Sticking to a design because it is nostalgic is not user-focused; it's demographically limiting, by design. It's specifically niche-targeting. That's the opposite of trying to make the best kind of thing for the most kinds of people, which is a business interest of a portfolio site. Building a little game for people who like those types of games? Sweet! More power to you. But if you're showcasing a demo for wide audiences, a critique of the niche-targeting is valid. Not nearly as important as the people claiming they can't even play the game, for sure! But if you bounce one person because they press up on the keyboard and the truck moves "forward", and they don't like that - it's a marked negative for the site's intent.

You can't worry about pleasing everyone, and you especially can't worry about broad, overall, two-paragraph critiques on literal months of dedicated work. But neither of those make the critiques, themselves, improper or even wrong.


> What do game engines have to do with this?

You seemed to imply that the developer chose isometric to make development easier. I'm rebutting that: it's unlikely, since both are equally easy with an engine (and if you're not using an engine, you're skilled enough that they're still equally easy).

> But neither of those make the critiques, themselves, improper or even wrong.

Are you referring to my critique of your critique of razzmatak's critique ("Handsdown one of the coolest 3D websites")? Surely if you're allowed to disagree with them, I am with you.


ah, easy enough then: mistaken inference on your part.

> Are you referring to[...]

I'm referring to critique, in general, for the former, and my specific two paragraphs of critique on the project - not the commentary - for the latter. Your being "allowed" to disagree with me is what is meant by the sentence "You're welcome to your counter-opinion about the design, but you haven't convinced me."


> That said, it's not 'hands down, one of the coolest 3D websites', at least that I've seen.

Would love to see those websites.


Right? I'd love for some of them to still be around. Unfortunately, portfolio sites are the kind I find are most often lost to time.

Reminds me of those insanely intricate flash demo websites

I can't see Bruno's site and I assume it's because of the HN hug of death, but an impressive 3D website that always comes to mind is acko.net, with its 3D rendered tubular logo. He even describes how it was done in a blog post.

https://acko.net/blog/zero-to-sixty-in-one-second/


OK, I take it back now that the HN hug has died off a bit. Bruno's site is a ridiculously neat thing to see in a web browser.

acko.net is one I thought of immediately too. The front page for Three.js [1] usually has some nice examples as well.

Of course, with WebGL and WebGPU support becoming ever more ubiquitous I'm not sure when 'impressive 3D website' just becomes either 'impressive website' or 'impressive 3D'.

[1] https://threejs.org/


I agree with you: it's not that it isn't impressive, but it functions poorly as a website. The design innovation I'd expect from the HN title is something where the 3D enhances the user experience of the website itself, navigation interfaces feel natural, and so on.

This is a very well made little game that also showcases some of their work. I was hoping for something that would make me think, "now I wish all websites were like this."


- navigate to the dist directory

- run pnpm dlx http-serve

- navigate to one of the provided IP addresses

- [optional, for access via the internet] run ssh -t -R 80:[provided ip address including port] proxy.tunnl.gg (this uses the tunnl.gg service and is not necessary for local network access)


That gave me some other errors. I am giving up. Thanks for helping!

Fwiw, if you want a simple pastebin, I've been running pinnwand for a couple years without any issues off of a single short docker compose file; I think running it on the host also shouldn't be complicated.

I appreciate that this writeup takes care to call out use cases when they help with understanding!

I do have a semi-unrelated question though: does using the recursive approach prevent it from being calculated efficiently on the GPU/compute shaders? Not that it matters; plenty of value in a CPU-bound version of a solution and especially one that is easy to understand when recursive. I was just wondering why the prominent examples used a non-recursive approach, but then I was like "oh, because they expect you to use them on the GPU". ...and then I was like "wait, is that why?"


> I do have a semi-unrelated question though: does using the recursive approach prevent it from being calculated efficiently on the GPU/compute shaders?

Historically speaking, the use of recursion in shaders and GPGPU kernels (e.g. OpenCL "C" prior to 1.2) was effectively prohibited. The shader/kernel compiler would attempt to inline function calls, since traditional GPU models had little in the way of supporting call stacks like normal CPU programs have, and thus recursion would be simply banned in the shader/kernel language.


Non-recursive approaches can help on the CPU as well. It's easy and elegant to use function recursion, but not necessarily faster.
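E.g., the usual conversion is from recursion to an explicit stack you manage yourself, which is also the shape you'd reach for where call stacks are unavailable (as in older shader models). A minimal sketch, with a made-up tree type:

    // Hypothetical tree node, purely for illustration.
    interface TreeNode {
      value: number;
      children: TreeNode[];
    }

    // Recursive version: elegant, but relies on the call stack.
    function sumRecursive(node: TreeNode): number {
      return node.value + node.children.reduce((acc, c) => acc + sumRecursive(c), 0);
    }

    // Iterative version: the same traversal with an explicit stack.
    function sumIterative(root: TreeNode): number {
      const stack: TreeNode[] = [root];
      let total = 0;
      while (stack.length > 0) {
        const node = stack.pop()!;
        total += node.value;
        for (const child of node.children) stack.push(child);
      }
      return total;
    }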

lol

imagine thinking you can discuss a hopped up ket-head into "aligning with your ideals".


It's the town square: it's not about the two people talking, but about everyone reading what they said, instead of controlling the narrative by only speaking to the people you want, in the places you want. They don't even have to answer to everyone, so the only benefit of giving up access to thousands/millions of people is to make an article like this for Pride.

It's hard to tell whether it's still really a town square, since the only information we have about X's usage is tweets from Musk.

I just mean that in the way that this site is a town square, or all of the RFCs: anyone can (for free) see all of the discussions and participate. Just like a town square, not every comment has to be addressed or promoted equally, but I don't know of a better way to release information and allow anyone to discuss it with others.

You only need a placeholder if you think the platform matters enough to hold space for. For example: they don't have a placeholder on MySpace.

But if your goal is to prevent other people from having the name altogether, the move I personally enjoyed engaging in was getting my account blocked. That forces them to hold your account only to prevent anyone from using it, lest you might sneak back in and say something "harmful" like "stonetoss is hans kristian graebener".


> the move I personally enjoyed engaging in was getting my account blocked.

So you think the FSF should've used the account representing them to troll?


Haven't tested lately, but at least for a while you could get your account blocked by publicly suggesting people follow you some other place.

PG got suspended for directing his followers to Mastodon: https://x.com/alexisohanian/status/1604604968677392386

They did walk that policy back due to the backlash, but they really wanted to do it.


I made my account private and put my bsky address in my profile, so that doesn't appear to be an insta ban.

IIRC, it was for a while and then the decision was reverted.

It received so much backlash it didn’t even last one day.

https://edition.cnn.com/2022/12/19/tech/twitter-elon-musk-de...


No, I described a way of getting blocked that I personally enjoyed. Haters (me) gonna hate, after all.

I think the FSF should not be on Twitter at all. Sorry if I was unclear about that in my previous comment, but the first paragraph was meant to contradict OP's suggestion.


could change birthday to underage and get banned that way

> the move I personally enjoyed engaging in was getting my account blocked.

Interesting idea. What did you do?

> say something "harmful" like "stonetoss is hans kristian graebener".

Was that it?


Stonetoss is a well-known comic by an alt-Right Neo-Nazi. He kept his identity secret for years (for obvious reasons), but was outed a few years back. He received a lot of hate over this, and got fired from his tech job over it.

The comic was anti-trans, antisemitic (with full-on Holocaust denial), racist, and sexist... but Graebener himself is a Latino, so he gets hated on by both the Left and the Right.

Websites that cater to the alt-Right ban users for saying his real name and ban people who make Stonetoss memes that shit on Graebener for being a Nazi.

And you know why HN is actually a great place? dang isn't going to ban me for repeating verifiable facts.


HN is the safe space. People argue and disagree here, but somehow in a pleasant way I haven't seen anywhere else. So dang has an easy job :))

And since you can downvote a comment, HN is self-regulating.


> People argue and disagree here but somehow in pleasant way I haven’t seen anywhere else.

Turn on `showdead` in your settings (or don’t, probably for the best) and be prepared to read some nasty comments. No substance, only hate. There are a few on this very submission.


I have showdead enabled and I haven't seen anything particularly vitriolic on this post. One troll comment maybe, but that's it.

I think it's a mistake to imply that just because a comment is dead because it was flagged that it is hateful. The vouch button exists for a reason.


> I think it's a mistake to imply that just because a comment is dead because it was flagged that it is hateful.

I wish people would stop inventing arguments and “reading between the lines” when interpreting comments from people they don’t know. There was no implication. Whatever you think you read is only in your head.

Of course not every flagged and dead comment is hateful. But hateful comments do get flagged so that’s where you’ll find them.


I'm about to do what you just asked people not to do. Perhaps we're so used to dishonest interlocutors online that we search for intentions in people's statements?

Sorry, I had to.


It's a sign of someone gifted that they make everything look easy

That was the thing that got me blocked enough for it to stick. But it was right during the height of that meme, so I doubt it would go far now. Unless a bunch of people all started doing it or something.

If you're looking for something that might actually work right now, though, I think there's still some weird libertarian-ish "principle" they're pretending makes it wrong to post elon's (or others') flight information. At least that would be where I would start, because I don't like to bother people that don't deserve it, so general abusiveness is out, and it's funny to throw their free speech bullshit back in their face.


oh well. it was cool while it lasted! I guess I'll figure out how to make deno do what I want, now.

Same. I had a little library I wrote to wrap indexedDB, and deno wouldn't even compile it because it referenced those browser APIs. I'm sure it's a simple flag or config file property, or x, or y, or z, but the simple fact is, bun didn't fail to compile.
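Think something along these lines (an illustrative sketch, not my actual code) - nothing exotic, just references to browser globals:

    // A minimal promise wrapper around the browser's indexedDB global.
    // This is exactly the sort of reference a server-first runtime may reject.
    function openDb(name: string, version = 1): Promise<IDBDatabase> {
      return new Promise((resolve, reject) => {
        const request = indexedDB.open(name, version);
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
      });
    }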

Between that and the Discord, I have gotten the distinct impression that deno is for "server javascript" first, rather than just "javascript" first. Which is understandable, but not very accommodating to me, a frontend-first dev.


Even for server ~~java~~typescript, I almost always reach for Bun nowadays. It used to be because of type stripping, which node now has too, but it's very convenient to write a quick script, import libraries, and not have to worry about what format they're in.

fantastic

Am I misunderstanding what a parquet file is, or are all of the HN posts along with the embedding metadata a total of 55GB?

I imagine that's mostly embeddings actually. My database has all the posts and comments from Hacker News, and the table takes up 17.68 GB uncompressed and 5.67 GB compressed.

Wow! That's a really great point of reference. I always knew text-based social media(ish) stuff should be "small", but I never had any idea if that meant a site like HN could store its content in 1-2 TB, or if it was more like a few hundred gigs, or what. To learn that it's really only tens of gigs is very surprising!

Scraped reddit text archives (~23B items according to their corporate info page) are ~4 TB of compressed json, which includes metadata and not just the actual comment text.

I suspect the text alone would be a lot smaller. Embeddings add a lot - 4K or more regardless of the size of the text.
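That 4K figure is just the vector math; e.g., assuming a 1024-dimension float32 embedding (a common size - many models use anywhere from 768 to 3072 dimensions):

    // Why each embedding costs ~4 KB no matter how short the text is.
    // The 1024 dimensions here are an assumption.
    const dimensions = 1024;
    const bytesPerFloat32 = 4;
    console.log(dimensions * bytesPerFloat32); // 4096 bytes per item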

That’s crazy small. So is it fair to say that words are actually the best compression algorithm we have? You can explain complex ideas in just a few hundred words.

Yes, a picture is worth a thousand words, but imagine how much information is in those 17GB of text.


I don’t think I would really consider it compression if it’s not very reversible. Whatever people “uncompress” from my words isn’t necessarily what I was imagining or thinking about when I encoded them. I guess it’s more like a symbolic shorthand for meaning which relies on the second party to build their own internal model out of their own (shared public interface, but internal implementation is relatively unique…) symbols.

It is compression, but it is lossy. Just like digital counterparts such as mp3 and jpeg, in some cases the final message contains all the information you need.

But what’s getting reproduced in your head when you read what I’ve written isn’t what’s in my head at all. You have your own entire context, associations, and language.

how much?

Thanks, that's really helpful to guys like me wanting to start up our "own database". BTW, what database did you choose for it?

It's on my personal ClickHouse server.

You'd be surprised. I have a lot of text data, and Parquet files with brotli compression can achieve impressively small file sizes.

Around 4 million web pages as markdown is like 1-2 GB.


based on the table they show, that would be my inclination

Wanted to do this for my own upvotes so I can see the kinds of things I like, or find them again more easily when relevant.


Compressed, pretty believable.
