Benefits Of Forced Slow Charging? Ideal Charger Voltage Range?
Aside from keeping a lithium battery between a 40%-80% charge and minimizing the time it spends below 30% or above 90%, I have also been researching and experimenting with the actual rate and voltage the battery is charged at.
In the automotive world (lead-acid), the advice is to slow charge a battery, and likely the slower the better if you have the time. I believe this is mostly down to heat, but the current itself may also play a role in physical degradation?
With that being said, heat is certainly a factor for lithium batteries too, which is why wireless charging is throttled by a temperature threshold. The industry seems to have moved from 1A chargers to a 2.4A standard, increasing the charge rate. I have always used 1A chargers when possible and disabled fast charging, letting the 1A charger's maximum output slow the charge down. I spend a lot of time near outlets or at a desk, so my phone is plugged in most of the time, avoiding discharge cycles and certainly never allowing a deep discharge, except perhaps intentionally once every two months for battery health, which can be controversial.
In my quest to remain above 40% and below 90%, I started using an app that monitors the battery level and sends a notification when it crosses those thresholds, so I can get to an outlet in time or unplug the charger before a full 100% charge. However, this does not stop the charge from passing 90%; it still requires my intervention, which does not help while I am sleeping. So I found a device called "Chargie", a piece of hardware between the cable and charger that works with an app to stop further charging above 90% while you sleep or sit at your desk with the phone plugged in. It is frequently used for sales-floor display devices as well.
So I began collecting data with a USB multimeter on approximately 10 different chargers (Apple, Anker, and knockoffs) rated from 1A to 2.4A. The bullets below are from a quality 2.4A charger:
- Below 10% charge, the iPhone 6S pulls only 0.50A-0.97A
- Above 20% charge, the iPhone 6S pulls 0.50A-1.30A
- Above 50% charge, the iPhone 6S pulls 0.50A-1.70A
- Above 90% charge, the iPhone 6S pulls 0.01A-0.40A
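One way to put these currents in perspective is as a C-rate (current divided by pack capacity). This is a minimal sketch; the 1715 mAh capacity is a commonly cited figure for the iPhone 6S that I am assuming here, not something measured above.

```python
# Rough C-rate sketch: charging current as a fraction of pack capacity.
# BATTERY_CAPACITY_AH is an assumption (commonly cited for the iPhone 6S),
# not a measured value.
BATTERY_CAPACITY_AH = 1.715  # assumed iPhone 6S pack capacity, in amp-hours

def c_rate(current_a: float, capacity_ah: float = BATTERY_CAPACITY_AH) -> float:
    """1C would (ideally) fill the pack in one hour; 0.5C in two hours."""
    return current_a / capacity_ah

# The measured currents from the bullets above, plus charger ratings:
for amps in (0.33, 0.50, 1.0, 1.7, 2.4):
    print(f"{amps:.2f} A -> {c_rate(amps):.2f}C")
```

By this estimate, a 2.4A charger pushes the pack above 1C, while a PC port's 0.33A is a very gentle ~0.2C.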
The most interesting finding was that my PC and laptop USB 3.0 ports supplied a consistent maximum of 0.33A whether the battery was at 5%, 50%, or 80%.
In conclusion, the iPhone 6S kept the charge rate under 1A when below 10%, charged fastest between 50%-80%, then throttled down below 0.50A after reaching 90%; I have even observed it dropping to 0.00A after reaching 93%, except when using the phone spikes the CPU usage. Since I am never in a rush to charge my phone, it only makes sense to charge it at the minimum current possible. But if I use a 1A charger to limit the current with the battery between 50%-80%, am I going to put a heavy load on the charger and burn it out prematurely? Furthermore, I have been leaving my phone connected to my desktop computer's USB port all day for many years, unfortunately keeping it at 100% for most of its life, but charging at a rate of only 0.33A. So moving forward, combining the "Chargie" device with my computer's USB port seems like a good practice in theory: slow charging at 0.33A up to a maximum of 90% capacity. It makes me want to find a 0.33A wall charger to use by my bed at night, as long as I am not risking overloading an undersized wall charger.
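To sanity-check whether 0.33A overnight is practical, here is a back-of-the-envelope charge-time estimate. Both the pack capacity and the 85% charge efficiency are assumptions for illustration, not measurements.

```python
# Sketch: estimated hours to charge between two battery percentages at a
# constant current. CAPACITY_AH and EFFICIENCY are assumed values, and real
# charging tapers near full, so treat these as rough lower/upper bounds.
CAPACITY_AH = 1.715   # assumed iPhone 6S pack capacity
EFFICIENCY = 0.85     # assumed losses in conversion/charging

def hours_to_charge(start_pct: float, end_pct: float, current_a: float) -> float:
    ah_needed = CAPACITY_AH * (end_pct - start_pct) / 100.0
    return ah_needed / (current_a * EFFICIENCY)

# Going from 40% to 90% at the three currents discussed above:
for amps in (0.33, 1.0, 2.4):
    print(f"{amps:.2f} A: ~{hours_to_charge(40, 90, amps):.1f} h")
```

Roughly three hours from 40% to 90% at 0.33A, which is easily covered by a night's sleep or a workday at the desk.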
The only other way I have found to forcibly limit the charging current to some degree is an off-brand/knockoff cable by Bolatus that is meant for charging only (no data); the two middle data pins in the USB-A connector are completely absent. It limits the charge rate to a consistent 0.85A, unlike a good cable, where the current draw fluctuates all over the place while charging. I know non-certified cables are a risk without the regulation chip, but when connected to a quality Anker or Apple transformer/charger brick, it is already being fed a well-regulated supply. I thought the behavior was due to the missing data pins, but it must be another variable, because this USB multimeter only passes the outer two power pins, so no data passes between the charger and the phone regardless of the cable used; yet with a certified cable, the current still fluctuates as stated in the bullet points above. Honestly, this method is not preferred, because it only limits the charge to 0.85A, which is not far off from just using a certified cable with a 1A charger... which actually fluctuates below 0.85A, compared to a stable 0.85A.
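For background on why the data pins matter: USB current negotiation happens over D+/D-, so what the phone sees on those pins changes how much it will draw. The sketch below is a reference table from the USB 2.0/3.0 and Battery Charging 1.2 specifications, not from my measurements; the Apple entry refers to Apple's proprietary D+/D- voltage signature, and the exact value is my assumption based on common charger labeling.

```python
# Reference sketch: nominal USB current limits the data pins advertise.
# SDP/DCP values are from the USB 2.0/3.0 and BC 1.2 specs; the Apple
# entry is a proprietary D+/D- signature (2.4A is assumed from labeling).
PORT_CURRENT_LIMITS_A = {
    "USB 2.0 SDP (enumerated data port)": 0.5,
    "USB 3.0 SDP (enumerated data port)": 0.9,
    "BC 1.2 DCP (D+/D- shorted, charge-only)": 1.5,
    "Apple signature (proprietary D+/D- voltages)": 2.4,
}

for port, amps in PORT_CURRENT_LIMITS_A.items():
    print(f"{port}: up to {amps} A")
```

This would explain the 0.33A cap on a PC port (the phone staying near the SDP limit) and why a cable with no data pins at all leaves the phone falling back to a conservative draw.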
Now regarding charger voltage: using the USB multimeter with no cable or load connected, I have found chargers typically hold within a 0.03v range (some are more stable and do not move at all). After testing about 10 different chargers, I found voltages as low as 4.85v and as high as 5.12v, with one instance reaching 5.18v that was not observed again. With that being said, what IS the most ideal voltage? 5.0v is the baseline, right? If it rises above 5.0v, the charging circuit in the phone can regulate it, but if it drops below 5.0v, is there any downside to an under-volted charging session? I have a "Quick Charge 3.0" brick (model HZQDLN) with two 2.4A ports plus a "lightning-bolt 3" smart port rated at 3.6v-6.5v @ 3A, 6.5v-9.0v @ 2A, and 9.0v-12v @ 1.5A, but when connected to an iPhone 6S it puts out 4.9v @ 1A max. Under a load greater than 1A, all chargers dropped 0.01v-0.14v, with the exception of an Anker IQ charger, which did not drop but actually rose 0.09v.
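The voltage sag numbers above can be turned into delivered power and a rough source resistance with Ohm's law. This is a minimal sketch using my worst-case 0.14v sag at 1A; the "source resistance" lumps together charger regulation, cable, and contact resistance, so it is only an estimate.

```python
# Sketch: delivered power and the implied charger+cable resistance from
# a voltage sag under load. Input numbers are the measurements above.
def power_w(volts: float, amps: float) -> float:
    """Power delivered to the phone: P = V * I."""
    return volts * amps

def implied_source_resistance(v_noload: float, v_load: float, amps: float) -> float:
    """Ohm's law estimate: R = (V_open - V_loaded) / I."""
    return (v_noload - v_load) / amps

print(power_w(5.00, 1.0))                           # nominal case, 5 W
print(power_w(4.86, 1.0))                           # worst sag seen, still ~4.9 W
print(implied_source_resistance(5.00, 4.86, 1.0))   # ~0.14 ohm lumped resistance
```

The takeaway is that a 0.14v sag costs only a few percent of power; the Anker IQ rising 0.09v under load suggests it compensates for that drop on purpose.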
I posted the same topic to an Apple community forum and it got removed... it's almost as if they do not want anyone to know or have free speech on the topic?