NiMH technology has progressed greatly since the early 2000s. There used to be no such thing as LSD, low self-discharge (that sounds trippy ;O ), and now we have the Japanese FDK factory in Takasaki (maker of Eneloop, Fujitsu, probably AmazonBasics, and many other LSD battery brands) consistently improving NiMH technology generation after battery generation.
Why do we even have an UltraSmart charger (aside from the fact that we're probably geeky) except that we want to get the most usage out of our rechargeable batteries? 10,000 cycles, as in Mark's testing? Sounds good to us!! You do all realize that if we figure out how to fully optimize battery life, we will probably kill the industry? People would buy a few sets of rechargeables and an UltraSmart charger, and never buy batteries again.
1. Leaving all that aside: why do NiMH batteries, and the crystals inside them, go bad?
2. Can we change how we recharge a battery to reduce that effect?
3. Once a rock or big crystal has formed can we bring it back to the light side?
On the first question: Mark has shown that we can get an Eneloop (FDK) to 10,000 cycles in a test environment, namely the charger itself. But how does that compare with how we will actually use our batteries in real life? And in that "real" world, how do we keep our batteries strong, healthy, and long-lived?
In general use, a user runs a battery or set of batteries in a device until that device's performance decreases (enough to matter) or it stops altogether. In a high-drain device (a camera flash), there will probably be some charge left in each battery, likely above 1.0 volt. In a lower-drain device, one or more of the batteries might end up below 1.0 volt. Typically this is due to using batteries in series: the cells with more charge compensate for those at or below 1.0 volt, until the whole set falls under the threshold of the device's minimum requirements. Some users would then remove the drained batteries, replace them with full(er) batteries, and continue on. After some delay (end of day, later that week) the drained batteries are put into a charger and on we go.
How does that affect the chemistry of the batteries? Others have produced lengthy, learned articles about this, and I'm not a chemist, so I will leave that job to them. What we care about is how to reduce the effect on our batteries' cycle lifetime. As I understand it, larger crystals form, and these are reluctant to release their charge in the future (when the battery is being used), so their portion of the total NiMH capacity is essentially on the sidelines for the remainder of the game. A deep discharge (refresh) can sometimes cause these larger crystals to break down and give up their energy, such that when the battery is charged again they re-form as smaller crystals.
Mark's testing seems to imply that never fully draining the battery, stopping at 1.1 volts during discharge, reduces damage over time. We have not heard back about his results with deep-drain and high-cutoff testing (0.9 volts on drain and 1.44 volts cutoff during charge), but that end of the cycle might also come into play. However, if in real use we already often drain the batteries below 1.0 volts, how do we keep these batteries healthier and extend their life?
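To make the cutoff idea concrete, here is a minimal sketch of a discharge loop with a configurable cutoff (1.1 V gentle vs. 0.9 V deep). This is illustrative only, not the UltraSmart firmware: `discharge_until`, `read_mv`, and `set_load` are invented stand-ins for the charger's ADC reads and load control, and the "battery" is a toy model that simply loses 1 mV per sample.

```python
# Hypothetical sketch of a discharge routine with a configurable cutoff.
# In real firmware the loop would sleep between ADC samples and the load
# would be an actual constant-current sink.

def discharge_until(read_mv, set_load, cutoff_mv=1100, drain_ma=500):
    """Drain the cell at drain_ma until its voltage falls to cutoff_mv."""
    set_load(drain_ma)            # apply the discharge load
    samples = 0
    while read_mv() > cutoff_mv:  # keep draining while above the cutoff
        samples += 1
    set_load(0)                   # remove the load at cutoff
    return samples

# Toy battery model: voltage falls 1 mV per reading, starting at 1400 mV.
state = {"mv": 1400}
def read_mv():
    state["mv"] -= 1
    return state["mv"]
def set_load(ma):
    pass  # stand-in for the charger's load-control hardware

print(discharge_until(read_mv, set_load, cutoff_mv=1100))  # → 299
```

The same loop with `cutoff_mv=900` would implement the deep drain; only the parameter changes, which is the point of exposing it as an option.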
It would seem, then, that we have to improve on the existing REFRESH techniques in order to break down those crystals and defeat the dark side.
Perhaps many of you, if you are already here on this website, UltraSmartCharger.com, have found other websites and studied up on who is doing what in the industry. The most obsessive site I've found is a Danish site (don't worry, all the articles are in English): Lygte-Info.DK. This fellow has apparently tested huge numbers of chargers and batteries in exhaustive detail. For example, here is an article about NiMH charging. Wow! On his charger review pages, you can explore the several methods other chargers use (behind the scenes) to cycle batteries, and the techniques we might experiment with.
We can't change the chemistry of the batteries (except by choosing what we buy), and we can't change their mechanical characteristics. With a charger, even an UltraSmart charger, all we can do is vary the charge and discharge currents and voltages, and our timing and cutoff values. We can vary them at the beginning, middle, and end of the charge or discharge process. Any one of these might affect the ultimate lifetime of our batteries; of course, that is why Mark carries out all that battery and cycle testing. What some other chargers have chosen to do can be categorized in the following ways.
- Lower current near the end of charge. Note that NEAR might be determined by resistance, temperature, or voltage. This charger halves the current about 25% before delta detection.
- Stair-step current near the end of charge. This other charger, about 20% before the end, lowers its current in a staircase manner.
- Further still, this charger uses a declining current curve; or perhaps, as lygte points out, it simply might not be able to keep 1400 mA going near the end of charge.
- This charger, from the graph, appears to vary the voltage during charging. If I'm reading the graph correctly, the pattern seems to be charging from -20% of the set voltage to +20% in a saw-tooth pattern over 10-minute periods.
- Lower drain, during discharge, in a curve or stair-step fashion. Note that I have not seen an example of this, but reason would dictate that if you can do it at the end, you can also do it at the beginning.
- Delay time, either after charging or after discharging. When a battery is full (the inflection point reached), there is still a fair amount of chemistry going on inside it. The voltage is equalizing among the various crystals, by a kind of osmosis, and settling down (thus the temperature drop). The same happens after a battery has been drained. If you look at the various discharge graphs at lygte-info, you can see that after cutoff, the voltage of the drained battery rises again (when not under load). This probably represents the bigger crystals releasing their energy through osmosis while the cell tries to reach equilibrium.
- Changing what happens after a battery is either full or empty. You might delay a while with a full battery, and then try to put more into it once some equilibrium has been achieved. You might also try extracting more energy at the other end, after a suitable delay, at some very low drain rate, to further break up crystals. Would either of these improve battery life, or reduce it (depending on how you performed them)?
- Vary voltage during the middle of charge process. Would this help equilibrium spread out faster and equalize the size of crystals?
- Here is some heresy... perhaps some small periods of discharge during the charge process. Again, to normalize crystals.
- Likewise, during discharge you could pulse some energy into the battery: during, or perhaps after, reaching 1.0 or 0.9 volts, trickle some charge in before continuing to drain.
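The end-of-charge tapers in the list above (halving, staircase) can be illustrated with a small sketch. This is a minimal illustration, assuming the firmware keeps a rough state-of-charge estimate in mAh; the function name, the 80% taper-start point, and the four-step count are all invented for the example, not taken from any real charger.

```python
# Hypothetical stair-step current taper near the end of charge.
# Below taper_start (e.g. 80% of expected capacity) we charge at full
# current; past it, the current is halved at each of `steps` stages,
# mimicking the "staircase" behavior lygte-info documents in some chargers.

def staircase_current(charged_mah, capacity_mah, full_ma=1000,
                      taper_start=0.80, steps=4):
    """Return the charge current (mA) for the present state of charge."""
    if charged_mah < taper_start * capacity_mah:
        return full_ma  # bulk phase: full current
    # How far through the taper region we are, 0.0 .. 1.0
    frac = (charged_mah - taper_start * capacity_mah) / (
        (1.0 - taper_start) * capacity_mah)
    step = min(int(frac * steps), steps - 1)
    return full_ma >> (step + 1)  # halve the current at each step

# Example: a 2000 mAh cell charged at 1000 mA, tapering over the last 20%.
for mah in (1000, 1600, 1700, 1800, 1900):
    print(mah, staircase_current(mah, 2000))
```

Swapping the shift for a smooth formula would give the "declining curve" variant instead; the delta-detection cutoff would still run as the backstop in either case.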
All of these would be straightforward to code into the firmware as options with parameters.
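To make "options with parameters" concrete, here is one hypothetical way those knobs could be grouped. Every field name and default below is invented for illustration; the real firmware may organize and name things quite differently.

```python
# Illustrative sketch only: the strategies above expressed as a bundle of
# firmware options. Field names and defaults are invented for this example.
from dataclasses import dataclass

@dataclass
class CycleOptions:
    charge_cutoff_mv: int = 1440      # hard voltage ceiling backing up delta detection
    discharge_cutoff_mv: int = 1100   # gentle (1100) vs. deep (900) drain
    taper_start_pct: int = 75         # where end-of-charge current reduction begins
    taper_mode: str = "halve"         # "halve", "staircase", or "curve"
    rest_after_charge_s: int = 600    # settling delay before the next phase
    pulse_discharge_ma: int = 0       # the "heresy" option: discharge pulses while charging

# A deep-drain, staircase-taper experiment would just be a different config:
opts = CycleOptions(discharge_cutoff_mv=900, taper_mode="staircase")
print(opts)
```

Each experiment Mark runs then becomes a saved configuration rather than a firmware change, which is exactly what makes long cycle-testing campaigns practical.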
I'm sure some of you all have various other suggestions to add.