At what scale are you looking? We used the Miyawaki method to create a microclimate around our house, perhaps 120 m², in a tier-3 town in Andhra. I have heard it said that the minimum is around 10 m², in order to have room for proper diversity.
I hope there will be enforced regulation around this kind of thing in Indian cities. In residential areas, it's common for your house to take up almost all of your available plot space, and on top of that it's mostly constructed from concrete.
Air temperature is already high (e.g. 36 °C at my location just now), and radiant heat from the sun and the concrete can push the felt temperature closer to 60 °C.
It's sad that new real-estate layouts continue to be approved that will only produce more of this kind of dense concrete hell.
The recent deterioration of language has completely devalued the word "treaty", combined of course with certain parties' reckless disregard for treaties; and thanks to the valorization of idiots and deal-makers, "deal" has become the preferred term of art.
I'm sticking with LaTeX, not as a fetish, but because journals and conferences still don't accept e.g. Typst. Will they ever? I don't know; it depends on their willingness to integrate it into their toolchains, I guess.
There are already at least two publishers that accept Typst, so the "ever" part is already covered. But most still don't accept Typst, and LaTeX is usually mandatory when sources are required.
Yeah, that was my first thought. And it's not just about them accepting Typst, but also whether they would provide a Typst template, like they currently do for LaTeX. Using the conference or journal template to write the article saves a lot of time for both submitters and editors (who have to deal with hundreds, if not thousands, of submissions).
That is for sure my biggest concern with Typst. I wrote a tool that can convert from Typst to LaTeX for final submissions, but it is a bit sketchy and at the moment doesn't handle math very well. https://gitlab.com/theZoq2/ttt
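To give a feel for what such a conversion involves, here is a toy, regex-based sketch handling only a tiny subset of Typst markup (headings, bold, italic). This is not how ttt works, just an illustration of the mapping; a real converter needs a proper parser, and math is where naive approaches break down, as noted above.

```python
import re

def typst_to_latex(src: str) -> str:
    """Toy converter for a tiny subset of Typst markup.
    Handles only headings, bold, and italic; a real tool needs a parser."""
    out = src
    # Typst "== Heading" -> LaTeX \subsection{...} (deeper level first,
    # so the single "=" pattern below doesn't swallow it)
    out = re.sub(r"(?m)^== (.+)$", r"\\subsection{\1}", out)
    # Typst "= Heading" -> LaTeX \section{...}
    out = re.sub(r"(?m)^= (.+)$", r"\\section{\1}", out)
    # Typst *bold* -> \textbf{...}
    out = re.sub(r"\*(.+?)\*", r"\\textbf{\1}", out)
    # Typst _italic_ -> \emph{...} (naive: would also hit underscores in code)
    out = re.sub(r"_(.+?)_", r"\\emph{\1}", out)
    return out
```

Even this fragment shows why the approach gets sketchy: the italic rule would mangle any literal underscore, and nothing here understands Typst's math mode at all.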
You normally submit a LaTeX or Word document, and the publisher does the final typesetting. Even in computer science, where people often spend a lot of time tweaking the typesetting, the PDF generated by the authors is essentially a preview. There are often visible differences between it and the publisher's version.
Yeah this is one of the craziest things about the scientific publishing industry.
Journals justify their fees by claiming it's for typesetting, but all they are really doing is adding extra work to nitpick bibliography formats and so on (see the comments in this article about sentence case). Nobody cares about that. I don't think anyone even reads "journals" any more (except maybe Nature/Science etc.). They mostly just read individual papers, and then there's no consistency to maintain.
In a sane world journals would accept PDFs. They would check that the format roughly matches what they expect but not insist on doing the typesetting themselves.
I would note arXiv requires the source as well, and having the source is what is enabling the HTML experiments they're doing.
On consistency: what the journals provide is some level of QA (how much is a function of the field and the journal, rather than what is charged), and the template is the journal's brand, so both the authors and the journals benefit from the style (I can tell the different, similar-quality journals in my field apart at a glance by their style).
It's also worth noting that there's a whole bunch of metadata that needs to be collected (whether you agree with it or not, funders require it), so a PDF isn't going to cut it here either.
Citation and bibliography guidelines are by far the things that authors neglect the most, and they are absolutely essential to ensure quality.
Using PDF as an input format would make editing and typesetting practically impossible. Not that I haven't seen volumes where publishers did that, but the results are abysmal, and in my experience it only occurred with local "grey literature" like really crappy conference proceedings edited within an institute.
> Using PDF as an input format would make editing and typesetting practically impossible.
But they don't do any editing or typesetting. They say "use our template" and "the author's second initial needs to be italic in citations". That's my whole point.
> Journals justify their fees by claiming its for typesetting [..]
Because they used to actually be doing that. Historically, science journals were pay-to-play because the journal had to typeset your document and print it. But with the advent of computers, they had to pivot while still retaining their revenue streams.
Every sci-fi is a contemporary author's idea of the future. The further out, the more unlikely the idea. I would love the site to be extended to any sci-fi; the percentages could be automated. How close to Dune are we?
I don't think that's true; sci-fi rarely tries to be an accurate prediction of the future.
Rather it's talking about the present, by accentuating some trends of the present to make people think about it. Black Mirror in particular is all about that, it's exaggerating some trends from the present to make us aware of them and the potential consequences if pushed too far.
I'm on a relatively large Indian ISP, and my home network gets an IPv6 prefix assigned that is directly routable. I didn't think about it until Tailscale told me it was connecting over a direct IPv6 connection and I wondered how that was possible. Sounds like 90s network rampage may be back here.
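A quick way to see why Tailscale could connect directly is to check which kind of IPv6 address your machine got: global unicast addresses (2000::/3) are routable from the internet, while unique local (fc00::/7) and link-local (fe80::/10) ones are not. A minimal sketch using Python's standard ipaddress module, with made-up sample addresses:

```python
import ipaddress

def classify(addr: str) -> str:
    """Classify an IPv6 address by routability, the way a
    direct-connection check would."""
    ip = ipaddress.IPv6Address(addr)
    if ip.is_link_local:
        return "link-local (fe80::/10, not routable)"
    if ip.is_private:
        return "unique local (fc00::/7, not globally routable)"
    if ip.is_global:
        return "global unicast (directly reachable)"
    return "other"

# Hypothetical sample addresses:
print(classify("2401:4900::1"))  # a global unicast address
print(classify("fd00::1"))       # a unique local address
print(classify("fe80::1"))       # a link-local address
```

If the address your ISP hands your LAN classifies as global unicast and your router isn't filtering inbound traffic, peers can reach you directly, which is exactly the "every host reachable" situation from the 90s.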
I like to use an LLM to produce code for known problems that I don't have memorized.
I memorize very little and tend to spend time reinventing algorithms or looking them up in documentation. Verifying is easy, except in the few cases where the LLM produces something really weird; then I fall back to the docs or to reinventing.
I once worked at a place where the marketing team explicitly removed dates from corporate blog posts for SEO purposes.
The idea was that some content is, or benefits from being perceived as, "evergreen". Always relevant.
Maybe -- but it's still deceptive, I think.
On the other hand, even here on HN we see people talking about "unmaintained" GitHub repos where the last commit was more than a few months ago. So recency bias is a real thing, and marketers certainly don't want to be penalized for honesty. :)
This was a while ago. Nowadays, the obvious opposite extreme is common. Blogspam that is "updated" with a current date, but no changes to content.
And the more date-specific "best thing 2025" articles also update their titles to match the current year. Email is already a wasteland of notifications and things I haven't unsubscribed from after buying something. The web is now a wasteland of SEO-optimized content. I'm bummed it's gone this way. Looking forward to AI becoming monetized in a similar way by inserting product placement etc.
It's not an OS update, it's a Google Play Services update, so if those still apply you would get it.
I found it strange that things like "prettier settings screens" and "improved connection with cars and watches" would be included in Google Play Services. Surely those things are part of the OS, not part of a thing that helps you access the Play Store?
I've been using a LineageOS (previously CyanogenMod) phone for years and have never installed any Google stuff, so I don't get these updates anyway.
One possible option would be to install Netguard (open source Android firewall that doesn't require root) and block Play Services.
I have that on a spare unrooted Android phone. Seems to be working so far. But I'm sure Google could bypass it if they really wanted to. I don't know if they've ever made an effort to bypass Netguard (or similar) in the past.