Rusty revenant Servo returns to render once more (theregister.com)
190 points by lproven on Sept 27, 2023 | 39 comments


> Servo has been around for about a decade, so as experimental software projects go, it's a mature one. Igalia developer Manuel Rego presented a talk which reports that the project is back under active development

The talk: https://www.youtube.com/live/e3Y1C695CIw?si=dD8_tbyEezwei8As...


Just to clarify for others who might be confused by the "about a decade": to the best of my knowledge it started in 2009. Not a criticism of the understatement, just saving others the look-up.


I think Servo was 2011; Rust itself was 2009.


29 March 2012, to be specific, for Servo.

> Initial commit

> @brson - committed Mar 29, 2012 - 0 parents commit 4825529 - Showing 1 changed file with 1,522 additions and 0 deletions.

https://github.com/jdm/servo/commit/48255297b8958fa559d11704...


I could not find a source from 2009 but the following is from 2010 and it already mentions Servo.

http://venge.net/graydon/talks/intro-talk-2.pdf

The Wikipedia article about Rust says:

"Mozilla began sponsoring the project in 2009 as a part of the ongoing development of an experimental browser engine called Servo [..]"


[Author here]

I did embed a link to it, but it looks like the start time is lost. I will try to get that put back in.


> As well as being independent of any browser vendor, it is designed to be embeddable, memory-safe, modular, and parallel.

How does Gecko (the engine used in Firefox) fare on being embedded? I thought it wasn't as easy or simple, and hence one of the reasons why Chromium dominated in things like Electron.

I can't square these two statements in the article:

> Servo has been around for about a decade, so as experimental software projects go, it's a mature one.

> It still can't pass the Web Standards Project ACID tests, which way back in 2008 WebKit was the first browser to successfully handle, so there's clearly some way to go yet.

I thought the goal of Servo, while it was incubated and paid for by Mozilla, was to absorb parts of it into Firefox over several years (or across decades), since this is a humongous undertaking. It's still painful to see Mozilla's revenues (mostly from Google) and to think how keeping Servo around could have helped make a better browser sooner (compared to current Firefox development).


Do you really see a clear link from "better/safer/faster browser rendering component" to "better revenues" or "better business situation" for Mozilla? Because I don't think I do, and I suspect Mozilla didn't either, really.

As a Rust dev, and a nerd generally, I love the idea of Servo, and if I were more excited about browsers and the web I would even love to contribute to it. But unfortunately stuff like this isn't what makes or breaks a project's success in the short or even medium term.

I use Firefox as my main browser everywhere now, but I didn't make the jump on any technical merits in rendering. Most days I barely notice which browser I'm on. I use Firefox because of the principles it represents and because Google finally crossed the line with me, ethically.

I don't know what the future holds for Firefox and Mozilla, but I don't think it's fundamentally altered by the success of Servo.

In any case, as a software engineering community we need to find ways to fund big picture projects like Servo etc. without having them tied to the fortunes and whims of various corporations. We lucked out with Linux filling a niche where it could explode in popularity and be securely funded for dev, but that hasn't been the case with lots of other things.


> Do you really see a clear link from "better/safer/faster browser rendering component" to "better revenues" or "better business situation" for Mozilla? Because I don't think I do, and I suspect Mozilla didn't either, really.

Most people don't choose a browser based on principles or mission. Even if Firefox captured 100% of the market that did, it would barely move the needle in browser share.

Most people choose browsers based on fairly straightforward things, such as the default (many people never change this, which is why changing the default browser via bundled downloads has remained such an effective marketing strategy for decades), performance, the feeling of performance (which is different from actual performance), whether the websites they're visiting are supported, etc.


Yeah, absolutely, which is why I think it's always going to be tricky for Firefox as long as there are corporate entities like Google and MS that have identified "having our own browser" as a competitive advantage.

When Chrome came on the scene it was a huge improvement, and Google seemed mostly-benevolent then, and the core of it was open source. And I worked in that codebase and met some of the people involved; they were all good, very smart(er-than-me) people. The core engineering project itself has good foundations.

The problem is those folks gotta get paid. And so ultimately in the long run, Chrome has become the problem we always dreaded it would become.

Which is why, yeah, "we" (for some definition of we) need to come up with a way that these kinds of projects can get paid for outside of places like Google etc.

Mozilla almost feels like that, but from talking to the people I know who worked there... it's not.


> When Chrome came on the scene it was a huge improvement

I don't think people in general realise how big a jump it was. Process isolation demonstrated that a browser with a single event loop is not a given, and the idea of completely separating page loads from page renders was, for want of a better term, a true paradigm shift.
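
To make the idea concrete, here is a minimal sketch of that architecture in Rust (illustrative only, not Chrome's actual code): the "UI process" spawns one renderer process per tab and talks to it over pipes, so a crashing or hanging page can only take down its own process, never the UI event loop.

    // Illustrative sketch of per-tab process isolation (not Chrome's real code).
    use std::env;
    use std::io::{BufRead, BufReader, Write};
    use std::process::{Command, Stdio};

    fn main() {
        // When re-launched with "--renderer", act as an isolated per-tab process.
        if env::args().any(|a| a == "--renderer") {
            let stdin = std::io::stdin();
            for line in stdin.lock().lines() {
                // A real renderer would parse, lay out and paint here.
                println!("rendered: {}", line.unwrap());
            }
            return;
        }

        // "UI process": spawn a renderer for one tab and talk to it over pipes.
        let mut tab = Command::new(env::current_exe().unwrap())
            .arg("--renderer")
            .stdin(Stdio::piped())
            .stdout(Stdio::piped())
            .spawn()
            .expect("failed to spawn renderer");

        writeln!(tab.stdin.as_mut().unwrap(), "https://example.com").unwrap();
        let mut reply = String::new();
        BufReader::new(tab.stdout.take().unwrap()).read_line(&mut reply).unwrap();
        print!("UI process got: {}", reply);

        drop(tab.stdin.take()); // close the pipe so the renderer exits
        let _ = tab.wait();
    }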

V8 showed that JavaScript didn't have to be disgustingly slow. I still remember the "Chrome Comic" booklet. Google made a big deal out of having hired some of the best compiler optimisation experts money could buy and pointed them towards a goal. V8 came out with two decades of hard-core, bleeding edge research behind it.

It took Mozilla nearly a decade to catch up with multi-process Firefox (ironically, the project was called "Electron"). Chrome had the benefit of being a pure greenfield project. Firefox had to retrofit a completely different processing and rendering architecture that broke pretty much all previous assumptions.

I recall writing a note to an internal Nokia-infused thread that "Chrome's process isolation is an idea too good not to steal". At the time I didn't know just how damn hard that would be to pull off without introducing multiple flag days. In the end Mozilla managed with one.


The multi-process Firefox project was called Electrolysis: https://wiki.mozilla.org/Electrolysis


Whoops, my bad. Thanks for correcting the record.


Yeah, back in 2008, Chrome was really a breath of fresh air, both in terms of performance and in terms of stability. And also for web developers, being the first browser to actually integrate devtools...


It depends on what you mean by "integrate".

Mozilla/Firefox had an extensions ecosystem that produced a bunch of interesting tools, which eventually crystallized as Firebug:

http://flailingmonkey.com/the-history-of-firebug


As someone who worked at Mozilla, the best way I heard it described is that Netscape designed the embedding to be monolithic: something you built your app on top of. You'd best use all of Gecko, NSS, SpiderMonkey, etc. Any path that meant your app needed to make changes to Gecko etc. was (and is) here-be-dragons territory. If there was some part your app did not need, disabling that part was unsupported and the effort was the embedding app's responsibility. Whereas Chromium was designed to be modular.


> How does Gecko (the engine used in Firefox) fare on being embedded? I thought it wasn't as easy or simple, and hence one of the reasons why Chromium dominated in things like Electron.

I haven't checked for a while, but it's pretty painful, especially because Mozilla kept changing their mind about how they wanted it to be embedded.


It's remarkable that Mozilla managed to be worse than Google in the "changing their mind" department.


Quite a few parts of Servo were actually integrated into Firefox/Gecko (such as Stylo).


Firefox had an Electron-like solution from 2006 to 2015 called XULRunner. Firefox and Thunderbird were built on top of the very same tech (XUL UI, the XPCOM component object model, SQLite databases, XPI extensions...)

Then Electron came along around 2013. You get a backend using NodeJS (same V8!) and you don't need to learn XUL, XPCOM etc.


If this engine could manage all of htmx and most of tailwind it could be a very attractive local application platform.


> most of tailwind

Also known as CSS.


It just needs a stable embedding interface.

I'm watching the talk to see if they have a stable, easy-to-compile release with a documented embedding interface. That's all they actually need to make Servo usable, but as of the last time I looked, they didn't plan on doing it.
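
To illustrate what I mean, here is a purely hypothetical Rust sketch (none of these names are Servo's real API) of the shape such an embedding interface could take: the embedding app owns the window, input and the event loop, and the engine only exposes a small, stable surface for loading and rendering.

    // Hypothetical names only -- not Servo's actual embedding API, just the
    // rough shape a stable, documented interface could take.
    struct Engine;  // hypothetical engine handle
    struct Frame;   // hypothetical rendered frame

    impl Engine {
        fn new() -> Engine { Engine }
        fn resize(&mut self, _width: u32, _height: u32) {}
        fn load_url(&mut self, _url: &str) { /* kick off a page load */ }
        fn pump(&mut self) -> Option<Frame> { Some(Frame) } // advance, maybe yield a frame
    }

    fn main() {
        // The embedder owns windowing, input and the event loop;
        // the engine only loads pages and produces frames.
        let mut engine = Engine::new();
        engine.resize(1280, 720);
        engine.load_url("https://example.com");
        if let Some(_frame) = engine.pump() {
            // present the frame with whatever GPU / windowing stack the app uses
        }
    }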


I can't edit my comment anymore, but in the talk he goes into each one of those points and says they are immediate goals.


I've still never used it but I've long been curious about Sciter:

https://sciter.com


Yes it would be interesting to have a Sciter-like framework built on top of Servo.


Make sure to read the license carefully...


Isn't Firefox Reality (now Wolvic) running on Servo?


According to https://wolvic.com/en/faq/ it uses GeckoView.

Wolvic currently uses GeckoView. That said, as part of our experimentation in 2021, we showed a version of Firefox Reality backed with WPE WebKit through an experimental abstraction layer. Igalia has since developed a robust abstraction layer which will allow it to be somewhat more decoupled from the underlying engine, thus allowing different engine backends. We are currently in the process of integrating a Chromium backend.


Maybe that’s what this is about? It’s Igalia.


[Article author/submitter here]

I can only tell you that it is not what this is about, inasmuch as I was at the talk and there was not a single mention of Firefox Reality or Wolvic.

Wolvic might use Servo – but I think if it did they would mention it, right?

The talk didn't, and the word "Wolvic" does not occur anywhere on https://servo.org

So I am guessing not, no.

Igalia has -- or rather is because it's a co-op -- about 100 developers. They are not all working on the same thing.


Hi, Igalian here. I can tell you that Wolvic does not use Servo, nor are there any immediate (or even long-term, at this point) plans for it to. That's not to say it couldn't happen someday or that it wouldn't be interesting. We are major contributors to a lot of major projects, and many of them have something to do with web engines (see https://bkardell.com/blog/2023-Mid-Season-Power-Rankings.htm...), so it's easy enough to see connections, but the connection is often more generally just "we have a lot of interest in web stuff, and a great reputation for contributing".


Mozilla realized it was actually doing something useful and that such things would upset its unofficial corporate overlords, so it shut everything down.

Hopefully one day we’ll have a fast and light web browser again :(


[Author/submitter here]

> Mozilla realized it was actually doing something useful and that such things would upset its unofficial corporate overlords, so it shut everything down.

I don't think that is the case, no. If you have any kind of evidence, I would be very, very interested to see it.

> Hopefully one day we’ll have a fast and light web browser again :(

There are fast, light browsers around. I like NetSurf, which originated on RISC OS but now runs on lots of things. There's also Dillo.

But they are small, fast and light because they don't do JavaScript, and that makes them useless for a lot of stuff.

I welcome suggestions on how to improve that situation. I do not think that Gemini is an answer.

Midori used to be small, light and fast, but not any more. It's another Blink browser now, AFAIK.

Epiphany is still quite snappy, and it's the only WebKit browser for Linux AFAIK. (This also means it's a fully native browser for Linux on Arm, because there's no Chrome for ordinary Arm Linux, or for Windows on Arm64. There is a native Arm64 Windows Firefox but it's well hidden. Chromium exists of course but it is quite limited.)

Epiphany is also the best browser on Haiku.


In terms of new renderers, there's Ladybird: https://ladybird.dev/

For innovative new browsers, there's Nyxt: https://nyxt.atlas.engineer/

Both are looking for funding and sponsors.


Interesting - I had never seen Nyxt - but it doesn't seem to be a new renderer as far as I can tell? I could be wrong! I'm curious if it is, though! It seems to say in the FAQ that it is engine-agnostic and currently supports a few engines as the renderer: https://nyxt.atlas.engineer/faq


I don't know if this counts, but there is also https://www.qutebrowser.org/ which uses Qt WebKit IIUC


qutebrowser doesn't really use WebKit anymore. It was available as an alternative backend, but even the developers say that it isn't recommended.

It now uses QtWebEngine, which is Chromium-based:

https://github.com/qutebrowser/qutebrowser#requirements


QtWebKit is not recommended, but it is still available if you want it: https://qutebrowser.org/doc/help/settings.html#backend. There is a ticket to remove it completely, which is now targeted for v4 (https://github.com/qutebrowser/qutebrowser/issues/4039), but in reality you could still use it.



