yesco's comments | Hacker News

I get that maintainers have their own issues to deal with, and I respect that they are trying to keep the project clean. At work there have been many times when I spent more of my day reviewing MRs than actually writing code, and sometimes my cold, blunt replies can unintentionally rub people the wrong way.

Still, I feel like they were pretty rude to this guy for no real reason. I don't think I'd want to work with them.


If we focus only on the impact on linguistics, I predict things will go something like this:

As LLM use normalizes for essay writing (email, documentation, social media, etc), a pattern emerges where everyone uses an LLM as an editor. People only create rough drafts and then have their "editor" make it coherent.

Interestingly, people might start using said editor prompts to express themselves, causing an increased range in distinct writing styles. Despite this, vocabulary and semantics as a whole become more uniform. Spelling errors and typos become increasingly rare.

In parallel, people start using LLMs to summarize content in a style they prefer.

Both sides of this gradually converge. Content gets explicitly written in a way that is optimized for consumption by an LLM, perhaps a return to something like the semantic web. Authors write certain sections in a way that encourages a summarizing LLM to summarize them exactly as the author intends.

Human languages start to evolve in a direction that could be considered more coherent than before, and perhaps less ambiguous. Language is the primary interface an LLM uses with humans, so even if LLM use becomes baseline for many things, if information is not being communicated effectively then an LLM would be failing at its job. I'm personifying LLMs a bit here but I just mean it in a game theory / incentive structure way.


> people might start using said editor prompts to express themselves, causing an increased range in distinct writing styles

We're already seeing people use AI to express themselves in several contexts, but it doesn't lead to an increased range of styles. It leads to one style, the now-ubiquitous upbeat LinkedIn tone.

Theoretically we could see diversification here, with different tools prompting towards different voices, but at the moment the trend is the opposite.


>Human languages start to evolve in a direction that could be considered more coherent than before

Guttural vocalizations accompanied by frantic gesturing towards a mobile device, or just silence and showing of LLM output to others?


I was primarily discussing written language in my post, as that's easier to speculate on.

That said, if most people turn into hermits and start living in pods around this period, then I think you'd be on the right track.


Eventually the spacers will depopulate the planet and we'll live alone. The robotics aren't quite there yet though.

>People only create rough drafts and then have their "editor" make it coherent.

While I do sometimes dump a bunch of scratch work and ask for it to be transformed into organized thought, more often I find that I use LLM output the opposite way.

Give a prompt. Save the text. Reroll. Save the text. Change the prompt, reroll. Then go through the heap of vomit to find the diamonds. It's sort of a modern version of "write drunk, edit sober", with the LLM being the alcohol in the drunk half of me. It can work as a brainstorming step to turn fragments of thought into a bunch of drafts of thought, to then be edited down into elegant thought. Asking the LLM to synthesize its own drafts usually discards the best nuggets for lesser variants.


That's a strange example. An unauthenticated server on a LAN wouldn't be exposed to the Internet any more than one on a network using NAT would be. You would need to explicitly configure your router's firewall to expose a local node, the same way you would need to explicitly configure port forwarding on a NAT-based network.

I've seen some argue that a hypothetically buggy router would somehow be less likely to fail if NAT was used, but really, the same could be said about bad port forwarding defaults, which have in fact happened. At the end of the day, complexity is what increases the likelihood of bugs.

NAT is just an addressing hack, a weirdly complex way of indirectly routing to local addresses. It only influences what is written on the envelope, not how that envelope is processed at the post office.
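
To make the envelope analogy concrete, here's a toy Python sketch (the addresses and names are made up, and real routers obviously aren't implemented like this): translation and filtering are independent steps, and with or without NAT a LAN host is only reachable if you explicitly add a rule.

    # Toy model: a "packet" is just the envelope, a dict of addresses.
    def nat_rewrite(packet, port_forwards):
        # DNAT only rewrites what's written on the envelope: it maps an
        # externally visible port to an internal (address, port) pair.
        if packet["dst_port"] in port_forwards:
            packet["dst_ip"], packet["dst_port"] = port_forwards[packet["dst_port"]]
        return packet

    def firewall_allows(packet, allowed):
        # Filtering is a separate decision about whether to forward at all.
        return (packet["dst_ip"], packet["dst_port"]) in allowed

    # With or without NAT, the default is "drop"; exposure requires an
    # explicit port forward and/or an explicit allow rule.
    packet = {"dst_ip": "203.0.113.1", "dst_port": 8080}
    packet = nat_rewrite(packet, {8080: ("192.168.1.10", 80)})
    print(firewall_allows(packet, {("192.168.1.10", 80)}))  # True only because we configured it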


> You are not bypassing the router, the devices need to get their packets from somewhere, and it is only like a forever-open port if the router/firewall decides it is.

This trips up a lot of people, and I think it's because NAT was probably their first real exposure to networking. When that happens, you end up building all your mental models around NAT as the baseline, even though NAT itself is really just a workaround for address space limitations.

What's interesting is that someone with no networking background who thinks of it like a postal system (packets are letters that get forwarded through various routing centers from source to destination) would actually have a more accurate mental model of how IP networking fundamentally works. The NAT-centric view we all learned first can actually make the basics harder to understand, not easier.


Don't these data centers have pretty elaborate cooling setups that use large volumes of water?

So they're sitting on real estate with access to massive amounts of water, electricity, and high bandwidth network connections. Seems like that combination of resources could be useful for a lot of other things beyond just data centers.

Like you could probably run desalination plants, large scale hydroponic farms, semiconductor manufacturing, or chemical processing facilities. Anything that needs the trifecta of heavy power, water infrastructure, and fiber connectivity could slot right in.


> Don't these data centers have pretty elaborate cooling setups that use large volumes of water?

Depending on where, and (more importantly) when you last read about this, there have been some developments. The original book that started this had a unit conversion error, and the reported numbers were off by a factor of roughly 4,500: the author claimed 1,000 times more water than an entire city's consumption, while in reality it was estimated at ~22% of that city's usage (1000 / 0.22 ≈ 4,500).

The problem is that we're living in the era of rage reporting, and corrections rarely get the same coverage as the initial shock claim.

On top of this, DCs don't make water "disappear", in the same way farming doesn't make it disappear. It re-enters the cycle via evaporation. (also, on the topic of farming, don't look up how much water it takes to grow nuts or avocados. That's an unpopular topic, apparently)

And thirdly, DCs use evaporative cooling because it's more efficient. If push came to shove, they could do without it, and when placed in areas without an adequate water supply, they do use regular cooling.


I always find the water-use-and-farming argument weird. I live in a part of the planet where water for farms mostly, if not entirely, rains down from the sky, so its use on farmland is inconsequential one way or another.

Still, I do feel there must be some difference between farming and evaporative cooling, since at least part of the water used in farming runs off back into rivers and then seeps back into groundwater. Again, this depends largely on location.


I have no idea what book you're talking about, and I never claimed water "disappears" or made any argument about consumption statistics. Why would you assume I think water vanishes from existence? That's absurd.

My point is simple: the utility infrastructure is the hard part. The silicon sitting on raised floors is disposable and will be obsolete in a few years. But the power substations, fiber connections, and water infrastructure? That takes years to permit and build, and that's where the real value is.

Building that infrastructure (trenches for water lines, electrical substations, laying fiber) is the actual constraint and where the long-term value lies. Whether they're running GPUs or something else entirely, industries will pay for access to that utility infrastructure long after today's AI hardware is obsolete.

You're lecturing me about evaporative cooling efficiency while completely missing the point.


Sorry if it came out that way, that was not my intention. I just thought you had asked, so I provided some info that I'd recently read.


Water usage of the DC itself can vary a lot. If they're in an area where clean water is cheap, they might use evaporative cooling, which probably has the most significant water consumption (both by volume and because the water has been processed to be safe to drink). In other areas they may use non-potable water, or just a closed-loop water system where the water usage is pretty negligible. Electricity is going to be the much larger consideration at scale (though still affected by local grid capacity). Also, the capital cost is a very significant part of these systems: there's a pretty big gap in pricing between 'worth building' and 'worth keeping running'.

(I recommend this video by Hank Green on the subject: https://www.youtube.com/watch?v=H_c6MWk7PQc . Water usage of data centers is a complex and quite localized concern, not something that's going to be a constant across every deployment)


Semiconductor manufacturing might make sense here, but I doubt it, simply because it would require a lot of expertise, knowledge, and complex machinery, along with experience in that industry, all of which I assume would be very hard to gather even for these datacenters.

I don't see any reasonable path forward for these datacenters given the amount of money that has been invested.


Semiconductor manufacturing needs a supply chain a lot more than it needs fast internet: wafers, fine chemicals, gases, consumable parts. A lot of this comes from petroleum refining, so it helps to be near refineries, although not enough to be decisive in site selection.


Agreed. Your point is true, and for that reason too I don't really think they could be used for the semiconductor industry.

Other industries also don't seem to me to have much overlap with the datacenter industry, aside from needing water access, land, and electricity, and I doubt they would be used enough to justify their costs, especially the cost of the overpriced GPUs, RAM, and other components.

In my opinion, these large datacenters are mostly a lost cause if the AI bubble bursts, since they were created with such a strong focus on GPUs and their whole model of demand is tied to AI.

If the bubble bursts, I think server hardware might get auctioned off, but I'm not sure how much of that would be non-GPU / pure-compute servers, or GPUs that are actually useful to the average consumer.


I'll just make note here for anyone else confused that Groq and Grok are distinct entities. They just have similar names.

Groq is more of a hardware focused company.


Completely agree. Modern computers are basically just web terminals for most people, so a basic Linux distro + browser is all they need.

Windows is actually terrible for non-technical users now. The constant pop-ups, nagging messages, and decision prompts create genuine anxiety. People don't know what they're clicking on half the time. Yet somehow most technical people I talk to haven't caught on to this.

Look at what younger generations are actually using: Chromebooks in schools, Google Drive instead of Microsoft Office. Even people who legitimately need Office aren't on Windows anymore; they're on MacBooks. That's the case at my company, anyway.

At this point Windows is really just gamers, engineers who need CAD, and office workers stuck on it from inertia. There's nothing inherently attracting new users to the platform anymore. I honestly don't know who their primary audience even is at this point.


Then why is Google killing the ChromeOS/Chromebook? Also Windows is increasing in its share again. Maybe that is due to companies that want AI in their systems.


> Then why is Google killing the ChromeOS/Chromebook?

They're not? They're combining it with Android, which honestly seems like a decent bet for what Chromebooks are meant to be. The end result will have a different name, but it will still be a cheap laptop to do school work and simple computing, and that isn't a Windows machine.

> Also Windows is increasing in its share again.

Is it? And is that pie even getting any bigger?


> Then why is Google killing the ChromeOS/Chromebook?

They're not killing it, they're merging it into Android. Makes sense. Android already does everything ChromeOS does, it just needs better desktop input support. Google said this was to compete with iPads, which only reinforces my point.

> Also Windows is increasing in its share again.

Short-term fluctuations don't change the long-term trend. We're talking about where things are headed over the next decade versus where they once were.

> Maybe that is due to companies that want AI in their systems.

My company went all-in on Copilot, but I'm not seeing this translate to more Windows usage. Copilot works fine on MacBooks, and that's what most people here use. When management gets excited about it, they talk about Outlook and Teams integration. Nobody cares about Windows-specific features. What does OS integration even buy you? Access to local files that are already in the cloud anyway? I'm using Copilot on my company-issued Ubuntu laptop right now. And honestly, the fact that IT at a massive, conservative corporation even started offering Ubuntu as an option says a lot about where things are headed.

Microsoft will be fine, but I'd bet on Windows declining over the next 10 years, not growing.


Yeah, following the OP's logic, if I think this obsession with purity tests and politicizing every tool choice is more toxic than an LLM could ever be, then I should actively undermine that norm.

So I guess I'm morally obligated to use LLMs specifically to reject this framework? Works for me.


I like to think of LLMs as the internet's Librarian. They've read nearly all the books in the library, can't always cite the exact page, but can point you in the right direction most of the time.


Are you aware of any new TVs that have a displayport? Genuinely asking since I haven't been very successful finding any.

