> Smartphones already do this. They essentially measure the resistance of the cable by measuring the voltage and correlating it with the current drawn, and regulate the current accordingly.
Not sufficient. It’s easy to have a cable where the long conductors can safely carry a lot more current than the terminations. Or one could have a 1 foot thin cable connected in series with a 10 foot thick cable.
For that matter, a short low-ampacity cable would look just like a long high-ampacity cable.
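To put rough numbers on that (a Python sketch with AWG cross-sections from memory, so treat them as approximate): a short thin conductor and a much longer thick one can present essentially the same end-to-end resistance, so a V/I measurement can't tell them apart.

```python
# Hypothetical numbers: resistance of a single copper conductor, R = rho * L / A.
RHO_CU = 1.68e-8  # ohm·m, copper resistivity at ~20 °C

def conductor_resistance(length_m: float, area_mm2: float) -> float:
    """End-to-end resistance of one conductor."""
    return RHO_CU * length_m / (area_mm2 * 1e-6)

# Short, thin cable (approx. 28 AWG) vs. long, thick cable (approx. 20 AWG):
r_short_thin = conductor_resistance(0.3, 0.081)   # ~0.062 ohm
r_long_thick = conductor_resistance(1.9, 0.518)   # ~0.062 ohm

# The endpoint measuring V/I sees essentially the same value for both,
# yet the thin cable can safely carry far less current.
print(f"{r_short_thin:.3f} ohm vs {r_long_thick:.3f} ohm")
```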
It's not resistance per se that you care about, it's how much the cable is heating up, which mostly depends on resistance per meter.
Imagine 2 cables with the same resistance, one 0.5m long and the other 2m. At a given current, the 0.5m cable would need to safely dissipate 4 times as much heat per unit length as the 2m cable. And you don't know how long the cable actually is (they make 10cm USB cables, after all), so you can't make any decision based on resistance that doesn't risk some cables going up in flames.
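Here's a quick back-of-the-envelope sketch of that, with made-up numbers:

```python
# Two cables with the same total resistance dissipate the same total power
# at a given current, but the short one concentrates it in 1/4 of the length.
I = 3.0          # amps
R_TOTAL = 0.2    # ohms, same for both cables

for length_m in (0.5, 2.0):
    total_power = I ** 2 * R_TOTAL          # watts, identical for both
    power_per_m = total_power / length_m    # watts per metre of cable
    print(f"{length_m} m cable: {total_power:.2f} W total, {power_per_m:.2f} W/m")
# 0.5 m cable: 1.80 W total, 3.60 W/m
# 2.0 m cable: 1.80 W total, 0.90 W/m
```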
Fortunately copper and aluminium, the two most common metals for cables, have resistance that increases with temperature.
Ultimately what I'm saying is that the endpoints have, or can measure, all the information they need to adapt, and that this is even more accurate than requiring a cable to self-identify, which is worthless if it's lying (and that can happen unintentionally if it's damaged or the connector is worn).
Yes, resistance goes up the warmer the cable gets, but you know neither the cable's resistance at 20°C nor its current temperature. Simple example: say the user is charging their phone; you detect the cable being plugged in and measure the resistance at, say, 1 Ohm to make the numbers nice. Cool, now at what resistance do you decide the cable is too hot and reduce the current? Copper's temperature coefficient is about 0.4% per °C, so the resistance should be 1.12 Ohm at a safe 30°C increase, right?
Wrong! The cable resistance you measured could already have been taken at 50°C, and now you're pushing the cable into fire hazard territory. This isn't theoretical either; people plug in different devices one after another all the time (which also rules out any sort of memorization, if you were thinking of that). So what, are you just going to wait 5 minutes at a safe charging current and see whether the temperature goes down? That's not going to fly in a market where some devices charge 50% in like 20 minutes.
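To make the ambiguity concrete, here's a little sketch with made-up numbers: two cables that both read 1 Ohm at plug-in, one genuinely at 20°C and one already at 50°C. The naive "1.12 Ohm = +30°C" cut-off lets the warm one run to roughly 84°C before the charger backs off.

```python
# Sketch of the ambiguity described above (hypothetical numbers).
ALPHA_CU = 0.004   # copper temperature coefficient, ~0.4 %/°C

def resistance_at(r_20c: float, temp_c: float) -> float:
    """Resistance of a copper conductor at temp_c, given its 20 °C value."""
    return r_20c * (1 + ALPHA_CU * (temp_c - 20))

# Two cables that both read ~1.00 ohm at plug-in:
r_a = resistance_at(1.000, 20)   # cable A: really is at 20 °C
r_b = resistance_at(0.893, 50)   # cable B: already warm, at 50 °C

# Charger logic: "back off when resistance hits 1.12 ohm (= +30 °C)".
# For cable B, 1.12 ohm isn't reached until roughly:
temp_b_limit = 20 + (1.12 / 0.893 - 1) / ALPHA_CU   # ≈ 84 °C
print(round(r_a, 3), round(r_b, 3), round(temp_b_limit))
```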
And all of this is ignoring another important design goal of USB-C: dumb charge-only devices exist and should continue to function with a minimum of extra components. USB-C allows this with the addition of two simple resistors that set the current you want. Measuring resistance, on the other hand, requires either an accurate voltage source on the device side (expensive) or somehow connecting the power line to some other line (how that would work without additional control electronics, I have no idea).
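For reference, here's roughly how those CC-line resistors work, with values as I remember them from the Type-C spec (so double-check before relying on them): the source pulls CC up through Rp, the sink pulls it down through Rd, and the resulting divider voltage tells the sink how much current it may draw.

```python
# Rough sketch of how the CC-line resistors advertise current (values as I
# recall them from the USB Type-C spec -- treat them as approximate).
RD = 5.1e3          # sink-side pull-down on each CC pin
RP_5V = {           # source-side pull-up to 5 V -> advertised current
    56e3: "default USB (0.5/0.9 A)",
    22e3: "1.5 A",
    10e3: "3.0 A",
}

def cc_voltage(rp: float, vpullup: float = 5.0) -> float:
    """Voltage the sink sees on CC: a simple Rp/Rd divider."""
    return vpullup * RD / (rp + RD)

for rp, advertised in RP_5V.items():
    v = cc_voltage(rp)
    print(f"Rp = {rp/1e3:.0f} kΩ -> CC ≈ {v:.2f} V -> {advertised}")
# Rp = 56 kΩ -> CC ≈ 0.42 V -> default USB (0.5/0.9 A)
# Rp = 22 kΩ -> CC ≈ 0.94 V -> 1.5 A
# Rp = 10 kΩ -> CC ≈ 1.69 V -> 3.0 A
```

No measurement at all on either side, which is why a dumb charge-only device really does only need the resistors.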
The copper and aluminum will not become resistive enough to make a difference at temperatures low enough to prevent the rest of the cable from becoming crispy.