This one runs Linux on the ARM cores rather than the RISC-V cores in the same package, so it's apples and oranges, but still pretty neat for something that essentially comes with a companion MCU and is in the same form factor:
In embedded systems it's very common to have a bigger SoC and a satellite MCU to handle e.g. network comm and power lifecycle. I've still not really tried out my Milk-V Duo, but it's interesting to get a combo like this in a hobbyist board form factor.
They also claim desktop-class performance for this RISC-V-based mini-ITX board, though it's a bit odd that it doesn't claim RVA23 compliance:
I'm building Fostrom (https://fostrom.io), an IoT Cloud Platform. We have Device SDKs to simplify integrating devices, powered by a small Device Agent written in Rust.
I wanted to support RISC-V boards too, so I went with the Milk-V Duo S as the test device. I have managed to get Tailscale working, and our Device SDK works too, with the bundled Python.
The experience of using the Milk-V Duo is definitely not as straightforward as the Pi Zero, but it does work, and is easily available in most places, unlike some of their other products. The Linux distro they provide is quite barebones, and I wasn't able to get Debian working. The docs for the device are pretty decent. I hope we get better support for Debian/Alpine/Arch for these kinds of boards soon.
Interesting. I sometimes get similar behaviour on KDE/Wayland; usually it is "2" or "3", and it seems to affect only Electron apps. I always thought it had something to do with a dodgy PS/2-to-USB converter I use to attach my old mechanical keyboards. I think it does not happen if Electron apps are started with "--ozone-platform=wayland", but I'm not completely sure, and I have no reliable way to reproduce or trigger that behaviour.
Aren't they talking about the C++ dialect the compiler expects without any further -std=... arguments? How does that affect the bootstrapping process? This https://gcc.gnu.org/codingconventions.html should define what C/C++ standard is acceptable within GCC itself.
The way I read withzombies's comment (and I could be wrong) was that they were talking about the language version of the compiler's source. I assumed that from the "dogfooding" portion of the comment.
Correct, this is a discussion of which language version the compiler should follow if the programmer doesn’t specify one. It’s not about which features are acceptable when implementing the compiler.
Nobody is constantly charging their completely empty EV.
A typical commute of 50 km/day at 20 kWh/100 km means you have to put 10 kWh into your car per day. A 230 V outlet can deliver 3.7 kW at 16 A, so your car would be topped up again after about 3 h.
Tesla Supercharger prices at 20 kWh/100 km are in the same ballpark as Diesel at 5 l/100 km. Charging at home should approach half that, and charging with PV will amount to <2 €/100 km.
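A quick sanity check of those figures (a rough sketch; the electricity and Diesel prices in the comments below are my own assumptions, not numbers from the comment):

    # Charging time for the 50 km/day commute described above
    daily_kwh = 50 * 20 / 100            # 10 kWh needed per day
    outlet_kw = 230 * 16 / 1000          # ~3.7 kW from a 230 V / 16 A outlet
    print(daily_kwh / outlet_kw)         # ~2.7 h, i.e. "about 3 h"

    # Cost per 100 km at 20 kWh/100 km (assumed prices: Supercharger 0.50 €/kWh,
    # home 0.30 €/kWh, PV ~0.08 €/kWh; Diesel 1.70 €/l at 5 l/100 km)
    print(20 * 0.50)                     # ~10 €/100 km at a Supercharger
    print(20 * 0.30)                     # ~6 €/100 km charging at home
    print(20 * 0.08)                     # ~1.6 €/100 km from PV
    print(5 * 1.70)                      # ~8.5 €/100 km on Diesel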
We don't want the CO2 that is created (and other consequences of oil production/consumption), the money can be spent on something better, and we don't want to depend on oil-exporting countries more than necessary.
Europe is not the US: we have somewhat functional public transport in most parts of the continent, so you are not _that_ dependent on a car. Also, EVs will become cheaper than ICEs, with or without subsidies or tax incentives; it's only a matter of time. Battery prices at the cell level approach €40/kWh. A new drivetrain incl. battery will be < 2000 €.
Also, given that polluted air affects poor people the most, getting rid of all the exhaust from old, worn-out ICE cars will be a good thing in any case.
>Europe is not the US: we have somewhat functional public transport in most parts of the continent, so you are not _that_ dependent on a car.
That's a case-by-case thing, not a blanket truth for everyone in every city on the whole continent. Outside the HN bubble, not everyone lives in a big city with high-speed rail and underground metros, or works remotely from a small village with amazing bicycle paths.
A lot of tier-2 cities are heavily underdeveloped in that regard, and you need a car to commute to work outside or inside the city unless you want to spend 1-2+ hours each way on public transit, switching and waiting for buses. Such cities have sprawled and grown a lot, but the public transit infrastructure is still stuck in the 90s, with slow buses and no trains. Car ownership is still the only way to have some free time between work, sleep and commuting.
US-made cars had the reputation of being low quality, too big, too heavy and too inefficient for European cities.
Tesla was somewhat different. People bought Teslas not for their promised "self driving" capabilities (I know no Tesla driver who took those promises at face value or got the FSD option, FWIW), but one motivation was to "stick it" to snobbish, arrogant European manufacturers wanting to develop "clean" ICEs with "green fuels" or other nonsensical crimes against thermodynamics like H2-cars.
Now, Tesla (and the US in general) has a brand toxicity problem, and it is worsening. People I know who would have considered a Tesla some years ago now drive electric VWs or BMWs or KIAs, oftentimes much more expensive cars than the comparable Tesla Model 3/Y.
This trend will probably continue over the next few years, and I don't see a way for Tesla to repair the brand image.
For what it's worth, Tesla is by a gigantic margin the lowest quality-for-dollar American vehicle you can buy. The EV thing was unique until it wasn't.
Go sit in a $90,000 Tesla then go sit in a $90,000 literally-anything-else.
Ford and Chevy trucks or SUVs in that price range, for example, are outrageously luxurious by comparison (not even considering their additional utility).
To start, you are comparing the wrong segments (premium vs luxury), the wrong platform (ICE vs EV) and the wrong generations (plush shitbox vs self-driving sports spacemobile).
Compare to Rivian or Lucid and Tesla is actually the cheapest (and yes, has the worst interior).
You don't need to be car literate to know that a $90,000 Tesla interior feels on par with a bottom of the line Nissan Sentra. I'm not sure you can even buy another American-made car that feels so cheap? Curious if people have an idea of what American make/model feels worse to sit in.
Nobody cared that the build quality was "a little worse" all around, because it doesn't meaningfully affect the vehicle's fitness for purpose the way internet comment sections pretend it does.
As long as the vehicles were meaningfully different in other ways, those other ways were the dominating variables in the equation that makes or breaks the purchase decision. Only when all else is within spitting distance of equal do Nth-order variables like "muh door feel", upholstery texture and speculative comments about reliability long after the car is projected to be replaced (gotta throw that one in there for the Toyota fanboys) start mattering... because they don't actually result in a seriously different ownership experience for the average user, and the average user knows this.
What you originally said is that there is little variance along any dimension.
What you're saying now is there is variance along different dimensions and different people care about different dimensions.
This is also what the original comment that you replied to said: build quality is bad (Dimension A), people were willing to accept it due to being an EV (Dimension B).
My point is that "bad" build quality is basically a non-difference. It was never a problem, or it's a manufactured problem in people's minds. Sure, Tesla is probably "worse" from a statistical perspective, but the average buyer could never see this. You almost have to be looking to see it, and so you're not gonna see it unless all your other problems are solved.
Like if every OEM sets out to build a car of the same specs, they're gonna all be within spitting distance of each other. You'll have to scrape the bottom of the barrel (i.e. "muh build quality") to find differences.
Tesla was winning before because they were the only ones who set out to build a car of that nature; build quality was a non-issue because it simply isn't an issue. It only became a meaningful one after the fact, when more cars of the same sort arrived on the scene and people went looking for minutiae.
I'm not arguing from a statistical perspective, and neither are buyers.
When Tesla came out, its build quality was awful but it succeeded because people wanted EVs.
Now there are EVs that don’t feel like Mattel toys, and Tesla is doing very very badly, in part because its build quality is still very bad which is now a glaring problem in a more competitive field.
I agree pretty much with your entire thread, but if Elon were Warren Buffett, people who think Tesla is a premium brand would still buy them. It wasn't the lies or the quality that turned people off; it's the cost, the lack of customer support and, most impactfully, that Elon is a whiney little man-child and performs racist actions.
- First they went "camera only", alienating people who know the tech.
- Then they mocked the car industry for so long. It was a necessary poke at first, but they didn't prepare themselves, and the elephant proved that it can run.
- Then Elon's Trump affair and the whole shebang happened.
The broken FSD promises, using non-automotive-rated parts (and the related failures), being negligent about their own errors and acting deaf to the criticism are the cement between the layers.
Tesla's "vision only" approach, with its phantom-braking suicide experiments, reached court in Germany last year, and for the first time a court established that the phantom braking events are real and the cars are dangerous. This will be interesting to watch. I often tried the free Autopilot in a Model Y, and it slammed on the brakes on an empty road every other time; after that I stopped using it completely. The car is nice, but without working assistance systems. Lane keeping also does not work reliably. The Model Y is a nice electric car for people like me without many requirements: it's spacious and the electric range is acceptable.
They had camera-only tech employing multiple 4K cameras running at over 2000 fps. Not your grandma's 480p/25fps webcam that many car manufacturers use as a parking camera. 2000 fps gives you an enormous safety margin even in case of individual frame misdetection. The long-tail issues they hit are present on LiDAR vehicles as well but LiDAR is much slower, more difficult to process and sensor fusion adds its own errors.
> The long-tail issues they hit are present on LiDAR vehicles as well but LiDAR is much slower, more difficult to process and sensor fusion adds its own errors.
The long tail is long no matter what. Which is why the most robust solutions deploy sensors with orthogonal sensing modalities that can complement one another. By relying on only one sensor type, Tesla has made its system overly brittle, which has resulted in avoidable deaths and destruction.
> LiDAR is much slower, more difficult to process
LiDAR in my experience is much easier to process, as the sensor stream is just an array of distances. Camera in my experience is much harder to process, as the sensor stream is an array of RGB values from which you have to infer distances. So by what metric are you alleging LiDAR is more difficult to process?
> sensor fusion adds its own errors.
You'll have to do a degree of sensor fusion across all the camera sensors anyway, so going camera-only doesn't absolve you of having to fuse sensor streams and come up with a belief. Sensor fusion in general tends to decrease overall system error as more sensors are added.
Everything is possible. They also might have used some creative metric to arrive at 2000+ fps. I don't know. Or they might have found some neat trick nobody thought about before.
That would be something like 371 Gbps of raw data to process (under some assumptions), per camera. I would assume a lot of shortcuts to get that down, but it's still an unreasonably huge amount to process in "real time" in a car.
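For a rough sense of where a number like that comes from (the resolution and bit depths here are my own assumptions; the comment doesn't state its own):

    # Raw data rate of a single 4K camera at 2000 fps, for a few bit depths
    pixels = 3840 * 2160                 # assumed 4K UHD resolution
    fps = 2000
    for bits_per_pixel in (10, 12, 24):
        gbps = pixels * bits_per_pixel * fps / 1e9
        print(bits_per_pixel, round(gbps))   # ~166, ~199, ~398 Gbps

So hundreds of Gbps per camera is plausible; the exact figure depends on the bit depth and resolution you assume.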
Running at 2000 fps in low light (and getting meaningful data at that sensor size) is also impossible to begin with. Even if you can do a constant 60, you're in good shape.
2000 fps can be good for doing multi-exposure and maybe for detecting fine movement, but running everything at 2000 fps (and processing 16,000 frames/sec in total) is not a simple thing, especially if you're running in an uncontrolled and chaotic environment.
I was about to comment the same: 2k fps means a maximum exposure time of 1/2000 s, and you need a lot of light to capture an image that quickly. In low-light conditions it's simply impossible to gather enough light, even with very high-end optics and sensors.
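To put a rough number on that (taking 1/40 s as a typical low-light exposure, the value used further down the thread for the night-shot comparison):

    # How much less light each frame can gather at 2000 fps
    exposure_2000fps = 1 / 2000          # maximum exposure per frame, seconds
    exposure_night = 1 / 40              # a typical low-light exposure
    print(exposure_night / exposure_2000fps)   # 50x less light per frame

Roughly 50x less light per frame, which would have to be made up with sensor size, aperture or gain (i.e. noise).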
I don't know the specifics; maybe they are timing individual cameras so that each one gets a crisp image and the merged stream reaches 2000 fps. Or maybe they are using some MIT tech that can capture in super-low-light conditions.
Being able to capture in super-low-light conditions depends on two things: 1) your sensor's noise floor, and 2) the number of photons you can collect per unit time.
The first depends on the manufacturing process, and the second on your sensor size.
Currently, the leading sensor manufacturers (namely Sony Semiconductor and Canon) are making very-low-noise sensors. However, getting both these low noise levels and convincing images needs full-frame sensors, at least. APS-C can get somewhat close, but it can't be there (because physics).
Even in that case, you can't do 2000 fps and get meaningful images from every frame.
There's no way that a Tesla car camera sports full-frame or APS-C sensors.
AFAIK Sony and Canon are still using some ancient manufacturing process for sensors, as logic chips have priority, and if Tesla had access to e.g. a 5 nm process for manufacturing sensors, that would drastically expand the possibilities. Also, you bypassed the possibility of timing multiple sensors separately to achieve 2000 fps.
The reason sensor manufacturers use "seemingly ancient" processes (i.e. huge feature sizes) in their sensors is that you really don't need a more advanced process the way you do in processors.
When you manufacture something that computes, improved manufacturing processes bring drastic gains in power consumption and internal noise. When you are measuring something, you don't need or want overly small pixels or features to begin with.
So having a small gigapixel sensor just because your process allows it is a disadvantage, from a light-capturing standpoint, compared to a sensor of the same size with lower resolution. In other words, low-light sensitivity and resolution are a trade-off.
The leap came from the back-illuminated sensors used by all contemporary cameras, not from reducing feature size via improved processes. You already pack the sensor as densely as possible (you don't want gaps, or "smaller" pixels without increasing resolution, either), and moving the data/power planes away from the pixels is the biggest contributor to reducing noise in the sensor.
See the link [0]. The top-left image is full frame, the top right is APS-C, the bottom left is M4/3, and the bottom right is a high-resolution (60+ MP) full-frame sensor.
When you look at the images, the smaller the sensor, the worse the noise performance. When you compare the full-size images of the top left and the bottom right, the top-left image is better in terms of noise. I selected RAW to surface "what the sensor sees". The selected spot is the darkest point in that scene.
You can select JPEG to see what in-camera image processing does to these images. The shutter speed is around 1/40 s and ISO is fixed at 12800, since that's the de-facto standard for night photography.
> Also, you bypassed the possibility of timing multiple sensors separately to achieve 2000 fps.
Working on an image which doesn't reflect the real world is a bit dangerous, isn't it?
You don't need to give me lessons in photography. I remember that around the time of the D750, Sony upgraded their sensor manufacturing process from some ancient 100-200 nm node to something newer, which improved night performance tenfold. Quantum efficiency got substantially better on the better process. Nobody is telling you to shrink pixels to get to 1000 MPx; the point is making a better 30 MPx sensor of the same size. Yet they aren't using the latest (2-5 nm) processes for sensors, as at sensor sizes that would be too expensive (I'd guess H100 chip-level prices for a medium-format sensor).
Why 2k fps? I'm not being facetious; the human eye apparently sees at around 25 fps, which is why that's what TVs and cinemas used to use. At that rate, and at 144 km/h, say, the car moves about 1.6 m between frames.
Fine, so maybe you think this is too much. But 10x that still gives you 16 cm between frames, at what is already speeding in most jurisdictions I know of.
2000 FPS seems to my untrained eye like a problem, not a feature.
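The distance-per-frame arithmetic for a few frame rates, at the 144 km/h used above (just a sketch):

    # Metres travelled between frames at 144 km/h
    speed = 144 / 3.6                    # 40 m/s
    for fps in (25, 60, 250, 2000):
        print(fps, round(speed / fps, 3))   # 1.6 m, 0.667 m, 0.16 m, 0.02 m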
What do you mean by "sees"? I'll bet you that you can't walk around wearing a VR headset running at 25 FPS for more than 30 seconds without violently emptying your stomach. Trying to watch a movie on a display that doesn't exhibit motion blur also makes me motion sick.
The human brain doesn't see in terms of frames at all. There's a limit beyond which an increase in fps likely becomes imperceptible to most people, but that limit is at least 10 times higher (from personal experience), likely more.
Because you are processing a fixed-length sequence in deep-learning models, and the more frames you have, the more accurate your FSD output is. Covering 1.6 m between frames with a single-frame accuracy of 80% is quite risky, and the input correction is quite discrete; 16 cm is still risky for proper trajectory planning. Now make it millimetres and suddenly your trajectory is nearly perfect, with only a little noise.
Not detecting overturned semis and road debris, and swerving into road dividers, is even more impressive with that tech.
All of which a relatively simple radar can prevent, without running a slow-motion camera rig and a wannabe supercomputing cluster in the car.
To be frank, I'm not against 2000 fps cameras, but I can't come to terms with not adding a simple radar to detect that something unknown is dangerously close and the land missile needs to stop.
I think it's more that, before, only Tesla had a working lane-following system on highways that allowed mostly hands-free driving during long drives, but nowadays even the cheapest Kia has it working well, so there's no need to spend extra money on a Tesla. I know working-class EU folks driving Teslas who couldn't care less about any perceived toxicity of the brand; that's more typical of German Green-party-voting snobs.
Individual owners may or may not care about brand toxicity, but it affects how much Tesla can charge for new vehicles and the resale value of used ones. Of course this analysis is just my humble opinion; maybe the real reason the EV market is expanding while Tesla is stagnating is something completely different, like the missing Android Auto (resp. Apple CarPlay) functionality.
Modern ICEVs are super clean [1]. Teslas were bought because of their software advantage, and I don't mean "self driving". I'd argue that Tesla at its core is a software company. The old brands quickly caught up on the software part, and that is why you are going to see a shift away from Tesla. Yes, sure, there is going to be some political factor, but I don't think that percentage is high compared to better/improved software, a more slick UI and overall better build quality.
I see quite the opposite trend tho.
Hybrids are great; this is where the push should have been. Dacia is doing really well in Europe. The old manufacturers are again not in the loop. Dacia's rebranding is quite something [2]; their new Duster/Bigster line looks super cool and modern. The market is already starting to slowly shift: less digital, more analogue [3]. The whole TV-screen cockpit, piano-black plastic, AI-everywhere thing is a monstrosity; it's atrocious. This is not luxury, it's grotesque.
> US-made cars had the reputation of being low quality, too big, too heavy and too inefficient for European cities. Tesla was somewhat different.
How so? Tesla doesn't produce a compact car by any European standard. Their smallest car, the Model 3, is the same size as a VW Caddy (a utilitarian 7-seat family van), and bigger than the more refined VW Touran (another 7-seat family van) or the popular VW Tiguan, a large (by euro standards) SUV.
> typical middle class sedan size IMHO, like VW Passat, Audi A4, BMW 3, i.e. one size up from VW Golf/Tiguan/Touran.
Which have inflated one size/class beyond what they were 25 years ago.
The people who drive these kinds of cars today used to drive the Audi A6, BMW 5 Series and Mercedes-Benz E-Class. Car classes/segments have slid both in size and luxury over a few decades.
If you look at car sales numbers you will see that the cars that sell the most are in the small and compact segments. Here is the top 10 for Q1 in Europe:
Rank | Model | Units Sold | Manufacturer | Segment
1 | Dacia Sandero | 42,913 | Renault Group | Supermini
2 | Citroën C3 | 34,064 | Stellantis | Subcompact
3 | Peugeot 208 | 33,821 | Stellantis | Supermini
4 | Volkswagen Golf | 33,663 | Volkswagen Group | Compact
5 | Renault Clio | 31,754 | Renault Group | Supermini
6 | Dacia Duster | 31,217 | Renault Group | Compact SUV
7 | Volkswagen T-Roc | 30,949 | Volkswagen Group | Crossover
8 | Volkswagen Tiguan | 29,733 | Volkswagen Group | SUV
9 | Toyota Yaris Cross | 29,226 | Toyota Motor Europe | Crossover SUV
The BMW i3 was an interesting car; it's a pity they cancelled it and don't offer it with a current drivetrain/battery. There are third-party battery upgrades available, though, if you get a used one.
> but one motivation was to "stick it" to snobbish, arrogant European manufacturers wanting to develop "clean" ICEs with "green fuels" or other nonsensical crimes against thermodynamics like H2-cars
Eh? Most European manufacturers (maybe not Stellantis) had at least one BEV by the time any Tesla was available in Europe. I'm not sure any European manufacturer has ever released a production hydrogen car? That's mostly Toyota.
There was also the VW eGolf/eUp, and the pre-Zoe Renault (which IIRC was a bit of a disaster). The first Tesla didn't become available in Europe until after the Zoe came out.
EDIT: Actually, looks like the eGolf was a few months after the Tesla Model S.
I remember needing a patch to the DSDT on a Dell Latitude X300 for Linux to work properly (~20 years ago?). It was attached to the initrd. IIRC one problem was that the Microsoft ACPI table compiler produced code that was illegal under some interpretation of the standard, and the Intel tools on Linux didn't like that.
I don't know. Salt (NaCl) is corrosive. The specific heat capacity is not that high (about 1/5 that of water by weight). Suppose you have a cubic metre of molten salt at 800°C in a Dewar: how do you get the heat out again?
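As a rough illustration of how much heat that is (the density and the usable temperature drop are my own approximations, the specific heat is taken as ~1/5 of water's per the comment, and this ignores that NaCl would actually start to solidify below ~801 °C):

    # Sensible heat in 1 m^3 of salt cooled by 300 K
    density = 1600                       # kg/m^3, roughly, for molten NaCl
    cp = 4.18 / 5                        # ~0.84 kJ/(kg*K), per the comment
    delta_t = 300                        # K, e.g. 800 °C down to 500 °C
    energy_kj = density * cp * delta_t
    print(energy_kj / 3600)              # ~110 kWh of heat

So on the order of 100 kWh of heat per cubic metre; as the comment suggests, the hard part is the corrosion and getting that heat back out at a useful rate.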