Muromec's comments

Motivated terrorists pivoted to driving cars into crowds and shootings.

Don't forget strapping knives to their hands and slashing into crowds.

>Other places can only afford universal healthcare to begin with because their healthcare sector is not nearly as corrupt or shackled by a huge amount of government regulation that was only put in place here for self-serving reasons.

coughs in Ukrainian


Apple decided it's better not to enable stalkers and get bad press for it. From the point of view of the tracker, anti-theft and stalking are kinda the same. This mirrors yesterday's one about efuses btw

It's there on all phones since forever lol. Apple can ship an update that adds "update without asking for confirmation" tomorrow and then ship another one that shows nothing but a middle finger on boot and you would not be able to do anything, including downgrading back.

Fuses have been there on all phones since 25+ years ago, on the real phone CPU side. With trusted boot and shit. Otherwise you could change the IMEI left and right, and that's a big no-no. What you interact with runs on the secondary CPU -- the fancy user interface with shiny buttons -- but that firmware only starts if the main one lets it.

> Otherwise you could change the IMEI left and right, and that's a big no-no.

You can still change the IMEI on many phones if you know how to.


>What exactly is it comparing? What is the “firmware embedded version number”? With an unlocked bootloader you can flash boot and super (system, vendor, etc) partitions, but I must be missing something because it seems like this would be bypassable.

This doesn't make sense unless the secondary boot is signed and there is a version somewhere in the signed metadata. The primary boot checks the signature, reads the version of the secondary boot, and loads it only if that version is not lower than what the write-once memory (fuse) requires.

If you can self-sign or disable signature verification, then you can boot whatever you want, as long as its metadata satisfies the version check.
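To make the check concrete, here's a minimal Python sketch of the scheme described above. Everything here is hypothetical and simplified; in particular, an HMAC stands in for the vendor's asymmetric signature, which a real primary boot stage would verify with a public key baked into ROM:

  import hashlib, hmac
  from dataclasses import dataclass

  @dataclass
  class SignedMeta:
      version: int       # version carried in the signed metadata
      image_hash: bytes  # hash of the secondary boot image

  def verify(vendor_key: bytes, meta: SignedMeta, sig: bytes) -> bool:
      # Stand-in for signature verification over the metadata.
      blob = meta.version.to_bytes(4, "big") + meta.image_hash
      return hmac.compare_digest(
          hmac.new(vendor_key, blob, hashlib.sha256).digest(), sig)

  def load_secondary(image: bytes, meta: SignedMeta, sig: bytes,
                     vendor_key: bytes, fuse_min_version: int):
      if not verify(vendor_key, meta, sig):
          return None  # unsigned or tampered metadata
      if hashlib.sha256(image).digest() != meta.image_hash:
          return None  # image doesn't match what was signed
      if meta.version < fuse_min_version:
          return None  # anti-rollback: older than the fuses allow
      return image     # OK to hand off control

Since the fuse value can only ever increase, any image whose signed version falls below it is rejected forever, no matter who flashed it.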


People need to re-sign their releases and include the newer version of the bootloader, more or less.

Yes, though note that since the anti-rollback is apparently implemented by the bootloader itself on this Qualcomm SoC, installing the new version will blow the fuse. So the unofficial EDL-mode tools that the community seems most concerned about will still be unavailable, and users will still be unable to downgrade from newer to older custom ROM builds.

> unofficial EDL-mode tools

The linked page seems to indicate that the EDL image is also vendor signed. Wouldn't that mean they're official?

Unless I've misunderstood, the EDL image is tied to the same set of fuses as the XBL image, so it's only useful for recovery if the fuses don't get updated. That seems like an outlandish design choice to me: flashing a new XBL leaves you in a state where you lack the fallback tooling (hence the reports of people forced to replace the motherboard), and if there's anything wrong with the new XBL that doesn't manifest until after the stage where it blows the fuses, then the vendor will have managed to irreversibly brick their own devices via an only slightly broken update.


EDL itself is a huge hack anyway, so who knows. The underlying issue is that OS suppliers are forced to bundle what is effectively the equivalent of a BIOS (low-level firmware) with their image (because of the assumption that this is an embedded system with no third-party OS suppliers), and the "BIOS" update has to be made a one-way street when the older firmware has vulnerabilities. Newer EDL tools ought to become available, but they might not have the exact same capabilities as the older ones, though they'll most likely be usable for basic recovery.

Not being able to downgrade or use the debug tools was the exact point of doing this, as far as I understand.

> Not for technical reasons, but because of an arbitrary lock (achieved through signature).

There is a good reason to prevent downgrades -- older versions have CVEs and some are actually exploitable.


And? Should this prevent you from deciding your own level of risk, or even from installing forks of that OS (whose maintainers can also ship fixes, even without source code, by patching binaries)?

This kind of thing is generally used to disallow downgrading the bootloader once there is a bug in the bootloader's chain-of-trust handling. Otherwise, once broken is forever broken. It makes sense from the trusted computing perspective to have this. It's not even new; it was already there on P2K Motorolas 25 years ago.

As a consumer, you may not want trusted computing and may prefer to root/jailbreak everything, but building such a system is not inherently evil.


Trusted computing means trusted by the vendor and content providers, not trusted by the user. In that sense I consider it very evil.

If the user doesn't trust an operating system, why would they use it? The operating system can steal sensitive information. Trusted computing is trusted by the user to the extent that they use the device. For example if they don't trust it, they may avoid logging in to their bank on it.

> If the user doesn't trust an operating system, why would they use it?

Because in the case of smartphones, there is realistically no other option.

> For example if they don't trust it, they may avoid logging in to their bank on it.

Except when the bank trusts the system that I don't (smartphone with Google Services or equivalent Apple junk installed), and doesn't trust the system that I do (desktop computer or degoogled smartphone), which is a very common scenario.


To trust an Android device, I need to have ultimate authority over it. That means freedom to remove functionality I don't like and make changes apps don't like. Otherwise, there are parts of practically every Android that I don't approve of, like the carrier app installer, any tracking/telemetry, most preinstalled apps, etc.

I recently moved to Apple devices because they use trusted computing differently; namely, to protect against platform abuse, but mostly not to protect corporate interests. They also publish detailed first-party documentation on how their platforms work and how certain features are implemented.

Apple jailbreaking has historically also had a better UX than Android rooting, because Apple platforms are more trusted than Android platforms, meaning that DRM protection, banking apps and such will often still work with a jailbroken iOS device, unlike most rooted Android devices. With that said though, I don't particularly expect to ever have a jailbroken iOS device again, unfortunately.

Apple implements many more protections than Android at the OS level to prevent abuse of trusted computing by third-party apps, and to give the user control. (Though some Android builds, like, say, GrapheneOS, implement a lot that Apple does not.)

But of course all this only matters if you trust Apple. I trust them less than I did, but to me they are still the most trustworthy.


>to protect against platform abuse, but mostly not to protect corporate interests

What do you mean by this? On both Android and iOS app developers can have a backend that checks the status of app attestation.


"App attestation" means different things for Android than for iOS. On iOS, it verifies the app was installed from the right place. On Android, it tries to check if the device is tampered with, or hasn't been fully certified by Google, or etc... Android's far more finicky because Google uses this process to crack down on OEMs and hobbyists, while Apple implicitly trusts itself.

Also, "checking the status of app attestation" is the wrong approach. If you want to use app attestation that way, then you should sign/encrypt communications (requests and responses) with hardware-backed keys; that way, you can't replay or proxy an attestation result to authorize modified requests.

(I believe Apple attestation doesn't directly support encryption itself, only signing, but that is enough to use it as part of a key exchange process with hardware-backed keys - you can sign a public key you're sending to the server, which can verify your signature and then use your public key to encrypt a server-side public key, that then you can decrypt and use to encrypt your future communications to the server, and the server can encrypt its responses with your public key, etc.)
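As a rough illustration of that dance, here is a sketch using the pyca/cryptography package. It is not Apple's or Google's actual API: plain software RSA stands in for both the hardware-backed keys and the attestation signature, and (to keep OAEP message sizes manageable) a symmetric session key stands in for the server-side public key:

  import os
  from cryptography.hazmat.primitives import hashes, serialization
  from cryptography.hazmat.primitives.asymmetric import rsa, padding
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM

  OAEP = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                      algorithm=hashes.SHA256(), label=None)
  PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH)

  # Device: generate a fresh session key pair and have the (here
  # simulated) hardware-backed attestation key vouch for its public half.
  attest_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  session_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
  session_pub = session_key.public_key().public_bytes(
      serialization.Encoding.PEM,
      serialization.PublicFormat.SubjectPublicKeyInfo)
  proof = attest_key.sign(session_pub, PSS, hashes.SHA256())

  # Server: check the proof against the known attestation public key,
  # then answer with a secret only the attested device can unwrap.
  attest_key.public_key().verify(proof, session_pub, PSS, hashes.SHA256())
  shared = AESGCM.generate_key(bit_length=256)
  wrapped = serialization.load_pem_public_key(session_pub).encrypt(shared, OAEP)

  # Device: unwrap the secret; from here on, requests and responses are
  # encrypted and authenticated with it, so replaying or proxying the
  # attestation result gains an attacker nothing.
  channel = AESGCM(session_key.decrypt(wrapped, OAEP))
  nonce = os.urandom(12)
  request = channel.encrypt(nonce, b'{"action":"pay"}', None)

In a real deployment the attestation public key would be vouched for by the platform vendor's certificate chain rather than known to the server in advance.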


Do you actually, bottom-of-your-heart believe that ordinary consumers think like this? They use TikTok and WhatsApp and Facebook and the Wal-Mart coupon app as a product of deep consideration on the web of trust they're building?

Users don't have a choice, and they don't care. Bitlocker is cracked by the feds, iOS and Android devices can get unlocked or hacked with commercially-available grey-market exploits. Push Notifications are bugged, apparently. Your logic hinges on an idyllic philosophy that doesn't even exist in security focused communities.


Yes, I do believe from the bottom of my heart that users trust the operating systems they use. Apple and Google have done a great job at security and privacy, which is why it seems like users don't care. It's like asking why you have a system administrator if the servers are never down. When things are run well, the average person seems ignorant of the problems.

Google certainly hasn't done a great job on privacy. Android devices leak so much information.

https://arstechnica.com/information-technology/2024/10/phone...

https://peabee.substack.com/p/everyone-knows-what-apps-you-u...

About Apple I just don't know enough, because I haven't seriously used their devices in years.


Yet, in the big picture Google is doing a good enough job that those information leaks have not caused them harm. When you really zoom in you can find some issues, but the real world impact of them is not big enough to influence most consumers.

What sort of hypothetical harm are you imagining here? Suppose the information leaks were a serious issue to me - what are my options? Switch to Apple? I doubt most consumers are going to consider something like postmarketos.

The carriers in the US were caught selling e911 location data to pretty much whoever was willing to pay. Did that hurt them? Not as far as I can tell, largely because there is no alternative and (bizarrely) such behavior isn't considered by our current legislation to be a criminal act. Consumers are forced to accept that they are simply along for the ride.


Let's say that Google let anyone visit google.com/photos?u=username to see all of the images from their camera roll, and left this online not caring about the privacy implications.

People would stop taking photos with their camera that they didn't want to be public.


People would presumably switch away from gcam and the associated gallery app. Or they would simply remove their google account from the phone. They have realistic options in that case (albeit somewhat downgraded in most cases).

If Google did something egregious enough legislation might actually get passed because realistically, if public outcry doesn't convince them to change direction, what other option is available? At present it's that or switch to the only other major player in town.


Those issues I quoted may not have done harm to Google but they certainly did to their users. It's just not something that is widely known.

They used Windows XP when it was a security nightmare, and many used it long after EOL. I just talked to someone who's had 4 bank cards compromised in as many months and is almost certainly doing something wrong.

I'm talking about people's feelings. People can feel like a Master Lock padlock is secure even if it may be trivial to get past.

> which is why it seems like users don't care.

...and not because, in truth, they don't care?

How would we even know if people distrusted a company like Microsoft or Meta? Both companies are so deeply-entrenched that you can't avoid them no matter how you feel about their privacy stance. The same goes for Apple and Google, there is no "greener grass" alternative to protest the surveillance of Push Notifications or vulnerability to Pegasus malware.


They would stop using them, or reduce what kinds of things they do on them, if they didn't trust them. No one is forcing you to document your life on these platforms.

> They would stop using them

Would they? Nobody that I know would.


Pre-TC mobile/embedded security was catastrophic:

  Persistent bootkits trivial to install
  No verified boot chain
  Firmware implants survived OS reinstalls
  No hardware-backed key storage
  Encryption keys extractable via JTAG/flash dump
Modern Secure Boot + hardware-backed keystore + eFuse anti-rollback eliminated entire attack classes. The median user's security posture improved by orders of magnitude.
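To illustrate the hardware-backed keystore point specifically: what killed the JTAG/flash-dump extraction attacks is that the API boundary has no export operation for private keys at all. A pure-software toy simulation (hypothetical names, not any real keystore API):

  from cryptography.hazmat.primitives import hashes
  from cryptography.hazmat.primitives.asymmetric import ec

  class ToyKeystore:
      """Simulates a secure element: callers only ever see key handles,
      public halves, and signatures -- never private key material."""

      def __init__(self):
          self._keys = {}  # handle -> private key, never exposed

      def generate(self, handle: str):
          self._keys[handle] = ec.generate_private_key(ec.SECP256R1())
          return self._keys[handle].public_key()  # only the public half leaves

      def sign(self, handle: str, message: bytes) -> bytes:
          # Signing happens "inside" the boundary; there is no export call.
          return self._keys[handle].sign(message, ec.ECDSA(hashes.SHA256()))

A real secure element enforces this boundary in silicon rather than by convention, which is what makes the keys resistant to JTAG probing and flash dumps.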

Did this ever affect real users?

Arguably yes. By preventing entire classes of attack real users are never exposed to certain risks in the first place. If it were possible it would be abused at some rate (even if that rate were low).

It's not that trusted computing is inherently bad. I actually think it's a very good thing. The problem is that the manufacturer maintains control of the keys when they sell you a device.

Imagine selling someone a house that had smart locks but not turning over control of the locks to the new "owner". And every time the "owner" wants to add a new guest to the lock you insist on "reviewing" the guest before agreeing to add him. You insist that this is important for "security" because otherwise the "owner" might throw a party or invite a drug dealer over or something else you don't approve of. But don't worry, you are protecting the "owner" from malicious third parties hiding in plain sight. You run thorough background checks on all applicants after all!


I get your point, but I think your example doesn't make sense. I could get drugs delivered to a psych ward if I wanted to. Dealers don't need a key.

Yes. See attacks like Pegasus.

A discussion you don't see nearly enough of is that there is a fundamental tradeoff with hardware security features — every feature that you can use to secure your device can also be used by an adversary to keep control once they compromise you.

In this case, the "adversary" evaluates to the manufacturer, and "once they compromise you" evaluates to "already". This is the case with most smartphones and similar devices that treat the user as a guest rather than the owner.

See also:

https://github.com/zenfyrdev/bootloader-unlock-wall-of-shame


Not only can, but inevitably is. Security folks - especially in mobile - are commonly useful idiots for introducing measures that are almost immediately co-opted to take away users' ability to control their devices and modify them to serve them better. Every single time.

We just had the Google sideloading article here.


Fair enough, but so does your front door. Neither thing is smart enough to judge the legitimacy of ownership transitions.

Yeah, not disagreeing with you. It's just that, every time we have this discussion, we see comments like GP's rebutted by comments like yours, and vice versa.

All I'm saying is that we have to acknowledge that both are true. And, if both are true, we need to have a serious conversation about who gets to choose the core used in our front door locks.


I’d like to think I’m buying the device, not a seat to use the device, at least if I do not want to use their software.

You can't have that with phones. You are always at the mercy of the hardware supplier and their trusted boot chain, which starts with the actual phone processor (the one running the GSM stuff, not the user interface stuff). That one is always locked down, and it decides whether to boot your fancy Android stuff.

The fact that it's locked down and remotely killable is a feature that people pay for and regulators enforce from their side too.

At the very best, the supplier plays nice and allows you to run your own applications, remove whatever crap they preinstalled, and change the font face. If you are really lucky, you can choose to run a practically useless Linux distribution instead of the practically useful one, with their blessing. The blessing is a transient thing that can be revoked at any time.


Not true on the PinePhone: the modem is a peripheral module, so the boot chain does not start with it.

Nor on the MediaTek platforms as far as I know (I'm very familiar with the MT65xx and MT67xx series; not sure about anything newer or older, except the MT62xx, which also boots the AP first --- from NOR flash).

> You can't have that with phones.

Why not?

Obviously we don't have that. But what stops an open firmware (or even open hardware) GSM modem being built?


>Obviously we don't have that. But what stops an open firmware (or even open hardware) GSM modem being built?

The same thing that stops you from living on a sea platform as a sovereign citizen or paying for your groceries with bitcoin. Technically you can, but practically you don't.

If you want to sell it commercially, you can open-source all you want, but the debug interface and bootloader integrity would have to be closed shut for the production batch.

At best, you can do what the other comment refers to -- instead of using the baseband as the root of trust, make it work like wifi modules do. This of course comes at the cost of a separate SoC. Early Motorola smartphones (the EZX series) did that -- the Linux part talked to the GSM part literally over USB. It came with all kinds of fun, including sound being, ahem... complicated. I don't remember whether they shared the RAM though. You don't want to share your RAM with a funny blob without reading the fine print about who sets up the mappings, right?

Figuring out all of that costs money, and money has to come from somewhere, which means you also have to resist the pressure to become part of the problem. And then the product that comes out is 5 years behind the spec and 1.5 times too expensive, for the vague promise of "trust me bro, I will only blow the e-fuse to fix actual CVEs".


There are some open-firmware, or partially open-firmware, projects, but they're more proofs of concept and not popular or widely used. The problem is that the FCC, or the corresponding local organization, requires cell phones to get regulatory approval, and open firmware (where anybody could just download the source and modify a couple of numbers to violate regulations) doesn't jibe with that.

https://hackaday.com/2022/07/12/open-firmware-for-pinephone-...


The GSM processor is often a separate chip. You may have read an article about the super spooky NSA backdoor processor that really controls your phone, but it's just a GSM processor. Connecting via PCIe may allow it to compromise the application processor if it is itself compromised, but so can a Broadcom WiFi chip.

>The GSM processor is often a separate chip

Is it? I remember the Moto Ming of the EZX years actually being separate, and maybe the latest failed attempts at a Linux phone had one, but I'm under the impression that the most common way to do it is a SoC where one core does the baseband and the other(s) do Linux, and they also share the physical RAM that is part of the same SoC. I don't follow the happenings closely enough to say it's 100% of all phones, and people may call me out saying MediaTek is totally halal in this department. It's not like I'm going to touch anything with MTK ever to check.


Of course you can have that.

Governments can ban this feature and bar companies from selling devices that have it.


> It's not even new; it was already there on P2K Motorolas 25 years ago.

I’m sure the CIA was not founded after COVID :-)


Uhh…Wut?

Let me remind you of the gist of the parent comment:

> So that’s how in an event of war US adversaries will be relieved of their devices


>I could have gone pro. I hate my life..

proceeds to write a 75-page diss and bill taxpayers for it

