Understanding PD Output

Let’s imagine the following charger:
Output: 5V ⎓ 3A/9V ⎓ 3A/15V ⎓ 2A/20V ⎓ 1.5A

I read that as it outputs at 15 watts, 27 watts, and 30 watts. At the same time, I know that once you get over 80% or so on the iPhone, the charging speed slows significantly. As such, I highly doubt at 95% on your iPhone this sucker is still pumping out 15 watts. Then again, it doesn’t list anything lower than 15 watts output. Can someone please explain what I’m missing? Lastly, if the Omnia has, let’s say, a 30-watt PD output, will it charge the iPhone 12 Pro at 20 watts even if there is no “5V/4A” output, for instance?
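For reference, the wattages in the question come straight from multiplying each advertised profile’s voltage by its current. A quick sketch (the profiles are the ones from the label above):

```python
# Each PD profile's max power is simply volts x amps.
profiles = [(5, 3), (9, 3), (15, 2), (20, 1.5)]  # (volts, amps) from the label
for volts, amps in profiles:
    print(f"{volts}V x {amps}A = {volts * amps:.0f}W")
# -> 15W, 27W, 30W, 30W
```

Note the 15V and 20V profiles both top out at 30W, which is why the label reads like three wattages instead of four.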

I may be wrong but I think the device can regulate how much is drawn. Even if the lowest watt on the charger is 15, I think the phone can pull less. Hopefully someone will correct me if I am wrong.

Chargers list their highest possible power output. But they most often don’t operate at those levels. Devices, not the charger, regulate the power flow and that regulation is based on how lithium battery chemistry works. It is like your car’s speedometer. It can reach 120 MPH, but it spends most of its time a lot lower because you are regulating the speed and your “specs” (road conditions, laws, other cars) don’t allow for 120 MPH.

When you plug in a USB-C device (iPhones can use USB-C PD, even though their connector is Lightning) it negotiates with the charger. It asks the charger what it can offer, the charger responds, and the device tells it what voltage and current (amps) it wants. Devices draw power from chargers; chargers do not push power to devices. As the battery charges, the power draw changes. Specific numbers vary, but generally from 0-50% the voltage stays the same and the current goes down over time. At some point between 50-80% charged, fast charging is disabled. At this point the current really drops, until it is little more than a trickle as it nears 100%. Voltage generally stays the same, but if you plugged in a non-fast charger at this point the charge time would be about the same.
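That negotiation can be sketched as a toy model (this is illustrative only, not the actual PD protocol messages): the charger advertises its fixed-voltage profiles, and the device requests one voltage plus a current no higher than that profile’s maximum.

```python
# Toy model of a USB-C PD negotiation (illustrative, not the real wire protocol).
# The charger advertises (volts, max_amps) profiles; the device picks a voltage
# and requests a current capped at what the charger offers at that voltage.

def negotiate(charger_profiles, wanted_volts, wanted_amps):
    """Return the (volts, amps) 'contract' the device ends up with."""
    for volts, max_amps in charger_profiles:
        if volts == wanted_volts:
            # The device draws up to its own wanted current,
            # but never more than the charger offers.
            return volts, min(wanted_amps, max_amps)
    raise ValueError("charger does not offer that voltage")

charger = [(5, 3), (9, 3), (15, 2), (20, 1.5)]
volts, amps = negotiate(charger, wanted_volts=9, wanted_amps=2.2)
print(f"contract: {volts}V @ {amps}A = {volts * amps:.1f}W")  # 9V @ 2.2A = 19.8W
```

The key point the model captures: the charger only sets ceilings; the device decides what it actually pulls, moment to moment, under those ceilings.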

iPhones have a 15W max power draw. They also default to 9V when available, even if higher voltages (which they can use) are offered. iPhones are not a good example of how most USB-C PD devices behave when charging. It is rare to get close to a 15W power draw from an iPhone. It would need to be near 0% and active with high power demands (100% brightness, network activity, gaming, etc). Most of the time from 0-50% it draws closer to 7-9W. After 50% it drops off and soon is 5W or less. That’s why iPhone fast chargers advertise 0-50% in 30 minutes and not their 0-100% time (which is 3-4 times as long).
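Put as a rough piecewise sketch (the watt figures are the ballpark numbers from the paragraph above, not measured values, and the exact breakpoints vary by model and conditions):

```python
# Rough model of iPhone power draw vs. battery level,
# using the approximate figures described in the text.
def iphone_draw_watts(battery_pct):
    if battery_pct < 50:      # fast-charge region: typically 7-9W,
        return 9.0            # peaking near 15W only when busy and near 0%
    elif battery_pct < 80:    # taper region after fast charging ends
        return 5.0
    else:                     # trickle as it approaches 100%
        return 2.0

for pct in (10, 60, 95):
    print(f"{pct}% battery -> ~{iphone_draw_watts(pct)}W")
```

This is why the 15W-and-up profiles on the charger label never come into play at 95%: the phone simply stops asking for that much.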

Most other USB-C devices go with the highest voltage that they support and the charger offers. They then draw up to either their own max current (based on battery level and power requirements) or the charger’s max current, whichever is lower. So a 60W MacBook Pro might draw 55-60W with a low battery and under usage stress, but when sitting at 100% battery with typical usage it is closer to 30W.
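That “highest common voltage, lowest common current” rule can be sketched as follows (illustrative only; the device limits here are made-up numbers):

```python
# Sketch of how a typical (non-iPhone) USB-C PD device picks a profile:
# highest voltage both sides support, then the lower of the two current limits.

def pick_profile(charger_profiles, device_max_volts, device_max_amps):
    # Keep only the voltages the device can accept...
    usable = [(v, a) for v, a in charger_profiles if v <= device_max_volts]
    volts, charger_amps = max(usable)  # ...and take the highest of them.
    # Actual draw is capped by whichever side's current limit is lower.
    return volts, min(charger_amps, device_max_amps)

charger = [(5, 3), (9, 3), (15, 2), (20, 1.5)]
volts, amps = pick_profile(charger, device_max_volts=20, device_max_amps=3)
print(f"{volts}V @ {amps}A = {volts * amps}W")  # 20V @ 1.5A = 30.0W
```

Against the charger from the question, a 20V-capable laptop lands on the 20V/1.5A profile: the charger’s 1.5A limit, not the laptop’s appetite, caps the draw at 30W.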

Chargers with specs like 5V/4A are not using the USB-C PD standard, but something else. USB-C PD devices like the iPhone will either refuse to charge at all or treat them like a 5V/3A charger and not perform much better than an old Apple 5W charger.

If you want to dive into the different fast charging standards out there, I have a write-up here.


Thank you for that.