GNU Emacs now depends on Tree-sitter, which is an MIT-licensed project, and LSP, which is a Microsoft project. It also has built-in support for installing non-GNU packages. Soon it will be a non-GNU project entirely. I think it's a bit sad that the ideological basis is beginning to be abandoned, but there are not enough believers in the ideology anymore.
I would say most modern editors (Helix, Neovim) do Tree-sitter and LSP better than Emacs today, and probably will for many years to come.
Emacs' LSP client Eglot and many of the LSP servers have nothing to do with Microsoft. Honestly, LSP is one project that I'm thankful to MS for. Personally, I value open standards like LSP more than any single FOSS project.
Eh, I've been looking and haven't found anything for other editors that actually tries to use TreeSitter for anything beyond highlighting. The Emacs structural editing packages are still very WIP but at least they exist.
(And some have been based on the out-of-tree implementation that's been around for a while now.)
LSP I can probably understand, mostly for performance reasons. Native JSON parsing and native compilation go a long way, but clients written in Elisp seem susceptible to edge cases where they're not performant enough, because UI and IO run in the same thread in Emacs. That's not insurmountable even without multithreading: some newer clients that use better IPC or the dynamic module system are not constrained by the Elisp requirement and seem to be doing fine in terms of performance.
The dynamic module system is generally a win for pragmatism over ideology, and it has been around for seven years already. You can't do everything, like, say, extending the core graphics of Emacs, but you can do a lot in any language of your choice if you feel constrained by Elisp. The Tree-sitter feature is built on top of that, so it's not clear to me why you think Emacs can't do better than, say, Neovim. I use Neovim and tree-sitter daily and generally don't think tree-sitter itself is rock solid yet; I run into indentation and slow-query issues semi-routinely. But I am much more impressed with the work happening in the Emacs community that leverages tree-sitter for advanced tooling [1].
Why is it a problem that LSP was originally invented by Microsoft? It's an open protocol with many free-software implementations. You don't have to use any Microsoft code if you don't want to.
Besides, even if that wasn't the case, Emacs has long had a policy of interoperating with non-free software. It runs on versions of Windows from 98 to 11. That's not because its developers don't value free software, but because they realize that this is a more effective way of convincing people to use free software than insisting on absolute purity.
Not really; only when it can be shown that the project is derivative of the GPL code (for example, a hard dependency on a GNU-only library for some of its features, vs. "one of the multiple possible implementations we can use is GPLed").
Google Translate got a lot worse after the AI version was introduced, maybe not for English-centric translations, but for all the others. The previous deductive translator was much better. Same with Siri and Google Assistant: they are really bad at languages other than English.
This mislabeling is quite common in popsci outside the AI field, so sorry in advance, but I've got to rant as this is my pet peeve. Like, I get the joke you're making, but it's based on a horrible misuse of the word. All the major dictionaries (https://www.merriam-webster.com/dictionary/artificial%20inte... or https://www.oed.com/viewdictionaryentry/Entry/271625 etc.) have only the uncountable or adjective senses, and none lists a plural "artificial intelligences" as a valid option referring to anything, ever; that's simply not a word in English.
M-W has (IMHO rightly) these two senses:
artificial intelligence
noun
1 : a branch of computer science dealing with the simulation of intelligent behavior in computers
2 : the capability of a machine to imitate intelligent human behavior
And that's it. There's no plural "artificial intelligences" or singular "an AI", because this term never refers to a specific system; it may refer to the field or to the property, but not to the specific machines which (perhaps) possess some artificial intelligence as an attribute or capability. Even if you had a system with fully superhuman capabilities, it wouldn't be "an artificial intelligence", because you simply don't (or at least shouldn't) call things or systems "artificial intelligences", just as you don't call people "natural intelligences".
Emphatic disagreement, at least when it comes to Indo-European languages. The previous translator was effectively unusable. Then suddenly Google Translate became something that would work most of the time, at least for average users dealing with English, Spanish, French, German, Russian, etc.
Despite the documented failure modes (and they were many), suddenly it was possible to read articles in other languages, and it was likewise possible to make yourself understood in other languages using it. I personally know a lot of people who speak multiple of those languages. And they all agreed that it was a giant improvement. And the fact that it WAS a giant improvement was why they got rid of the previous translator.
I understand that it was terrible with Chinese. But I never used it for that.
I know that they focused on Chinese as a specific problem and have improved. I would expect it to be much better today than it was in 2006.
Part of the problem was that there is a lot less grammar in Chinese than in Indo-European languages. So there are many ways to translate a given Indo-European sentence into Chinese, and you need to understand the context of a Chinese sentence to properly translate it into an Indo-European one.
The many ways to translate into Chinese are a problem because Chinese flexibility in word order means that there are many reasonable choices of next word, and they didn't have enough data to tell a reasonable next word from an unreasonable one.
Going the other way, Chinese may not care whether you have one apple or ten, or whether Xi is a man or a woman. But Indo-European languages generally do care, so Google Translate has to guess, and it often guesses wrong.
This is not true; it varies by personality. Some personalities retain high neuroplasticity even as they get older, depending on individual differences in the dopaminergic system.
What is "personality"? How much of that is choice?
And for the neuroplasticity and dopaminergic-system differences, how much is cause and how much is effect? Our actions affect our brains and the rest of our bodies, and vice versa. If I chose to lift heavy weights every day, after ten years you wouldn't say, 'they do that because they have extra muscle'.
tree-sitter is a bit better than regexps, but it is not an actual parser of the languages' grammars. A fast, true parser of every language for syntax coloring is the future, I think; tree-sitter is a pragmatic middle ground while we wait for the prime solution.
I would say most people don't know 90% of the iOS interface, but people in general are not interested in learning it and wouldn't read a manual if the box included one, or even watch videos about it. People just want Apple to read their minds and get their stuff done with as little effort as possible, and they're willing to pay a lot of money for that experience.
I will never buy an Apple laptop again, because Apple stops supporting it after a while and you can't install new versions of the OS, or new software, on it. This is a deal breaker for me: a complete waste of perfectly fine hardware. Fortunately, installing Ubuntu on it solved all of those problems. Same thing with the Apple AirPort Time Capsule: such a waste to buy one when Apple stops supporting it after a couple of years. You might as well buy products from Apple and have them delivered directly to the garbage dump.
Lisp is inspiring at first glance, but when you need to solve complex problems like references, pointers, macros, byte-compilation, and native compilation, it just isn't expressive enough, unlike C, C++, or Rust. Neither is JavaScript. Lisp could not replace all other languages.
I seriously can't downvote your comment hard enough.
> references
Everything in Lisp is passed by copied references (when talking in C++ terms), so that's kind of a solved problem.
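A minimal sketch of what that means in practice (MUTATE is a made-up name, just for illustration):

    ;; The callee receives a copy of the reference: it can mutate the
    ;; shared object, but rebinding its own parameter has no effect on
    ;; the caller's binding.
    (defun mutate (items)
      (setf (first items) :changed)  ; visible to the caller
      (setq items nil))              ; invisible to the caller

    (let ((data (list 1 2 3)))
      (mutate data)
      data)                          ; => (:CHANGED 2 3)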
> pointers
You don't have raw pointers in CL since it's a memory-managed language, unless you're dealing with foreign code, and then there's support for handling these in a meaningful way (see CFFI).
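For illustration, a minimal CFFI sketch (assuming the library has been loaded, e.g. with (ql:quickload "cffi")):

    ;; Allocate a foreign int, write it, read it back. The pointer is
    ;; only valid inside the WITH-FOREIGN-OBJECT body, which frees it.
    (cffi:with-foreign-object (ptr :int)
      (setf (cffi:mem-ref ptr :int) 42)
      (cffi:mem-ref ptr :int))       ; => 42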
> macros
Seriously? The CL macro system, with quote-unquote and the full language available at macro-expansion time, has been good enough to inspire the syntactic macro systems of multiple other languages, such as Rust or Elixir.
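As a minimal sketch of that quote-unquote style (WITH-TIMING is a made-up macro, not part of the standard library):

    ;; Expands into code that runs BODY and prints the elapsed time.
    ;; A production version would GENSYM the START binding to avoid
    ;; accidental variable capture.
    (defmacro with-timing (&body body)
      `(let ((start (get-internal-real-time)))
         (multiple-value-prog1 (progn ,@body)
           (format t "~&Elapsed: ~,3f s~%"
                   (/ (- (get-internal-real-time) start)
                      internal-time-units-per-second)))))

    ;; (with-timing (sleep 0.25)) => prints roughly "Elapsed: 0.250 s"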
> byte-compilation
You only compile to bytecode if you cannot compile to native code. You can see it in implementations like ECL (if GCC is not present) and CLISP (if not built to use GNU Lightning).
> native compilation
SBCL and CCL and ECL and Clasp and LispWorks and ACL all do that; it's a solved problem. Or do you mean that writing compilers in Lisp is impossible because the language is not expressive enough? In that case, have a look at the source code of Lisp compilers, which is frequently written in Lisp.
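A quick way to check from the REPL (this shows SBCL's behavior, where DEFUN compiles to machine code immediately; other implementations may interpret until asked to compile):

    (defun square (x)
      (* x x))

    ;; In SBCL, definitions are natively compiled by default:
    (compiled-function-p #'square)   ; => T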
> Neither is Javascript.
That's the only part of your comment that I think makes any sense.
> Lisp could not replace all other languages.
Who even posed the statement that it should? Are you trolling?
This is entirely inaccurate and a fundamental misunderstanding of Lisp, both as a language and as an ecosystem. Have you seen [0], for example? It shows how to leverage the native assembler of a Common Lisp compiler to build a small JIT-like compiler in Common Lisp.
Did you know Lisp has a disassembler built in, via the standard-library function DISASSEMBLE [1]?
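A minimal example (the printed output is implementation- and platform-specific; SBCL shows annotated native machine code):

    (defun add2 (x)
      (declare (type fixnum x))
      (+ x 2))

    ;; Prints the compiled code the implementation generated for ADD2.
    (disassemble 'add2)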
Why would things like "native compilation" supposedly be out of scope for Lisp?
Never mind that Lisp solved references, pointers, macros, byte-compilation and native compilation before the calendar flipped from 1969.
Lisp was compiled by around 1960.
Peter Landin, in 1964, presented the SECD machine, a kind of bytecode based on Lisp-like lists and directly usable for Lisp implementations, in "The Mechanical Evaluation of Expressions". Numerous Lisps have historically had some kind of "Lisp Assembly Language" (LAP) as an alternative or in addition to native compilation.
ZetaLisp, the variant of MacLisp running on Lisp machines, featured locatives in 1981; they allow the address of a location to be passed around, simulating reference parameters or members.
What do you mean? Many lisps have these features, in fact lisps in general are well-regarded for their macro systems. I don't think any programming language can replace all other languages but of all options, I feel lisps which allow you to safely and cleanly extend the language to better reflect the problem domain cast an unusually wide shadow.
I'm saying that it's hard to build a basic lego block with another lego block. You have to use plastic, heat and a mold.
That's why we have a wide range of programming languages at different levels, and why it's common for people to say "high level" or "low level" languages.
It's unclear to me what you're suggesting, or really what question you're asking.
Thank you for the context. This also explains why so many UIs get broken or stuck at times, like spinners that never stop or buttons that stay disabled even though they should be enabled. Those kinds of issues happen every day even in Apple, Google, and Meta products (with plenty of economic power behind them), so there must be a deeper structural cause, and I think it boils down to the UI frameworks in use being inherently fragile, with built-in edge-case issues.
I wouldn’t say it is inherent; I think it has more to do with state management. Some state is inherent to the component itself, but the separation is problematic: all too often business logic gets stored in the component and vice versa.