
> Smartphones already do this. They essentially measure the resistance of the cable by measuring the voltage and correlating it with the current drawn, and regulate the current accordingly.

Not sufficient. It’s easy to have a cable where the long conductors can safely carry a lot more current than the terminations. Or one could have a thin 1-foot cable connected in series with a thick 10-foot cable.

For that matter, a short low-ampacity cable would look just like a long high-ampacity cable.
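To put rough numbers on that (the copper resistivity and wire cross-sections are standard values; the lengths and gauges are just picked for illustration):

    # Two cables with very different current ratings can present nearly the
    # same end-to-end resistance. Copper resistivity ~1.68e-8 ohm*m at 20 C;
    # 28 AWG ~0.081 mm^2 and 20 AWG ~0.518 mm^2 cross-section.
    RHO = 1.68e-8  # ohm*m

    def round_trip_resistance(length_m, area_mm2):
        # out on VBUS and back on GND, so twice the single-conductor length
        return 2 * RHO * length_m / (area_mm2 * 1e-6)

    print(round_trip_resistance(0.3, 0.081))  # ~0.124 ohm: short, thin cable
    print(round_trip_resistance(2.0, 0.518))  # ~0.130 ohm: long, thick cable

From the charger's point of view both look like roughly a 0.13 Ω cable, even though only one of them is safe at high current.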



On top of that, a slightly imprecise voltage source could also easily look like a very low resistance cable.


It's differential impedance that they measure, not the absolute voltage.


What would they measure that for?


To determine how much current they can draw.


They can only measure combined voltage sag of the source and resistive losses in the cable though that way, no?

Doesn’t help with the concern of high-resistance spots somewhere in a low-current-rated cable.
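A hedged sketch of what the sink actually sees when it probes with a load step (all the component values here are made up for illustration):

    # The measured sag lumps source regulation error, connector contact
    # resistance and conductor resistance into one number.
    I = 3.0              # load step, amps
    v_no_load = 5.05     # source setpoint; could itself be off by tens of mV
    r_source = 0.02      # source output impedance, ohms
    r_contacts = 0.03    # worn or dirty connector contacts, ohms
    r_wire = 0.08        # distributed conductor resistance, ohms

    v_loaded = v_no_load - I * (r_source + r_contacts + r_wire)
    r_apparent = (v_no_load - v_loaded) / I
    print(r_apparent)    # 0.13 ohm

The sink only ever learns r_apparent; it can't tell how much of that 0.13 Ω is a hot spot in one connector versus resistance spread along the whole cable.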


They aren’t trying to deal with poorly manufactured out of spec cables.


But resistance in series is additive - the resistance of the combination is dominated by the resistance of the worse cable: 0Ω + 1Ω = 1Ω

Not sure what problem you're thinking of.


It's not resistance per se that you care about; it's how much the cable heats up, which depends mostly on resistance per meter.

Imagine 2 cables with the same resistance, one 0.5m long and the other 2m. At a given current, the 0.5m cable has to safely dissipate four times as much heat per unit length as the 2m cable. And you don't know how long the cable actually is (they make 10cm USB cables, after all), so you can't make any decision based on resistance alone that doesn't risk some cables going up in flames.
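A minimal sketch of that arithmetic (the resistance and current are illustrative):

    # Same end-to-end resistance, same current: total heat (I^2 * R) is
    # identical, but the short cable must shed it over a quarter of the length.
    I = 3.0   # amps
    R = 0.2   # ohms, measured end to end for both cables
    for length_m in (0.5, 2.0):
        p_total = I ** 2 * R            # watts, same for both cables
        p_per_m = p_total / length_m    # watts per metre
        print(length_m, p_total, p_per_m)
    # 0.5 m cable: 1.8 W total, 3.6 W/m
    # 2.0 m cable: 1.8 W total, 0.9 W/m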


Fortunately copper and aluminium, the two most common metals for cables, have resistance that increases with temperature.

Ultimately what I'm saying is that the endpoints have, or can measure, all the information they need to adapt, and that's even more accurate than requiring a cable to self-identify, which is worthless if it's lying (and that can happen unintentionally if the cable is damaged or the connector is worn).


You're not thinking this through.

Yes, resistance goes up the warmer the cable gets, but you know neither the resistance of the cable at 20°C nor the temperature of the cable. Simple example: say the user is charging their phone, you detect the cable getting plugged in, and you measure the resistance at 1 Ohm to make the numbers nice. Cool, now at what resistance do you decide the cable is too hot and reduce the current? Copper's temperature coefficient is about 0.4% per °C, so the resistance should be allowed to rise to 1.12 Ohm for a safe 30°C increase, right?

Wrong! The cable resistance you measured could already have been taken at 50°C, and you're now pushing the cable into fire-hazard territory. This isn't theoretical either; people plug in one device after another all the time (which also rules out any sort of memorization, if you were thinking of that). So what, are you just going to wait 5 minutes at a safe charging current and see if the temperature comes down? That's not going to fly in a market where some devices charge to 50% in about 20 minutes.
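To make the ambiguity concrete (alpha and the 30°C margin are from the example above; the rest is illustrative):

    # The same 1 ohm plug-in reading implies very different trip temperatures
    # depending on how warm the cable already was when you measured it.
    alpha = 0.004        # ~0.4% per degree C for copper
    r_measured = 1.0     # ohms at plug-in
    r_limit = r_measured * (1 + alpha * 30)   # "allow +30 C over plug-in"

    for t_at_plugin in (20, 50):
        r_at_20 = r_measured / (1 + alpha * (t_at_plugin - 20))
        t_trip = 20 + (r_limit / r_at_20 - 1) / alpha
        print(t_at_plugin, round(t_trip))
    # plugged in at 20 C -> limit trips near 50 C
    # plugged in at 50 C -> limit trips near 84 C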

And all of this ignores another important design goal of USB-C: dumb charge-only devices exist and should continue to function with a minimum of extra components. USB-C allows this with the addition of two simple resistors that set the current you want. Measuring resistance, on the other hand, either requires an accurate voltage source on the device side (expensive) or somehow connecting the power line to some other line (how that would work without additional control electronics, I have no idea).
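For reference, a rough sketch of how the two-resistor scheme works; the resistor values and detection thresholds below are my recollection of the Type-C spec, so treat them as approximate:

    # A dumb sink just puts 5.1 kOhm (Rd) from each CC pin to ground.
    # The source advertises its capability with a pull-up (Rp) to 5 V and the
    # sink reads the resulting divider voltage on CC.
    RD = 5.1e3
    RP_OPTIONS = {56e3: "default USB", 22e3: "1.5 A", 10e3: "3.0 A"}

    def sink_decodes(v_cc):
        # approximate detection thresholds
        if v_cc > 1.23:
            return "3.0 A"
        if v_cc > 0.66:
            return "1.5 A"
        return "default USB"

    for rp, label in RP_OPTIONS.items():
        v_cc = 5.0 * RD / (RD + rp)
        print(rp, round(v_cc, 2), sink_decodes(v_cc), label)

The sink only needs coarse voltage thresholds, not a calibrated source, which is part of what keeps dumb devices cheap.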


> a safe 30°C increase

That's far higher than what the implementations assume.


The copper and aluminum will not become resistive enough to make a difference at temperatures low enough to prevent the rest of the cable from becoming crispy.


Most of the resistance occurring in a small part of the entire cable, potentially causing a fire, seems possible, no?


No. You could also suppose a dead short in the cable that causes a fire, or just a cable with a length of detcord in the middle.



