Better charging formula ?

Started by Magnum, October 26, 2014, 11:01:50 AM

Magnum

I have a "dumb" NiMh/NiCd charger.

Using it for a 9.0 V NiMH battery with 280 mAh capacity.

For 9 volt batteries, it specifies 25 milliamps as the charging current.

The charger manual gives this charge formula.

Charging time (h) = 1.2 x battery capacity (mAh) / charging current (mA)

For 280 mAh at 25 mA that works out to 13.44 hrs.
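
As a quick check of that arithmetic, here is a minimal C sketch of the manual's formula using the 280 mAh and 25 mA figures above (the 1.2 factor comes straight from the manual's formula):

#include <stdio.h>

/* charging time (h) = 1.2 * capacity (mAh) / charging current (mA)
   the values below are the ones from this post: 280 mAh pack, 25 mA charger */
int main(void)
{
    double capacity_mah = 280.0;  /* rated capacity of the 9 V NiMH pack */
    double current_ma   = 25.0;   /* charging current from the manual    */

    double hours = 1.2 * capacity_mah / current_ma;
    printf("charging time: %.2f h\n", hours);   /* prints 13.44 */
    return 0;
}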

Is this formula useful, or is there something better that factors in the starting voltage ?

I want to also verify the charging current.

I have the leads in the correct sockets.

The manual says to measure the + and - connections directly, and it shows 0.11 amps.

Am I supposed to put the leads in parallel to get the charging current ?

The manual shows 25 milliamps is used for charging 9 volt batteries.

A run down battery had a starting voltage of 8.62 volts.

The TENS unit lasted about 60 minutes.

Do you think I may need a 600 mAh 9 V Li-ion battery ?

The wall power adapter outputs 800 mA to the TENS unit.
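
One rough way to frame that question is to scale runtime with capacity. The C sketch below assumes the same TENS load and that the pack started full (both assumptions, not figures from the manual); the 60 minutes and 280 mAh come from above, 600 mAh is the candidate pack.

#include <stdio.h>

/* back-of-envelope runtime scaling: assumes the TENS load stays the same
   and runtime scales roughly in proportion to pack capacity              */
int main(void)
{
    double runtime_min  = 60.0;   /* observed runtime on the old pack */
    double capacity_mah = 280.0;  /* old pack capacity                */
    double new_cap_mah  = 600.0;  /* candidate Li-ion pack capacity   */

    double implied_load_ma = capacity_mah * 60.0 / runtime_min;     /* ~280 mA */
    double new_runtime_min = runtime_min * new_cap_mah / capacity_mah;

    printf("implied average load: about %.0f mA\n", implied_load_ma);
    printf("estimated runtime on %.0f mAh: about %.0f min\n",
           new_cap_mah, new_runtime_min);
    return 0;
}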



Take care,
                   Andy

Ubuntu-mate-18.04-desktop-amd64

http://www.goodnewsnetwork.org

dedndave

a volt meter is used "across" a voltage source or load

to measure current, the meter goes in series with source and load
ideally, an amp meter is 0 ohms, itself (an ideal volt meter is infinite ohms)
so, if you put it across a battery, you are shorting it out - possibly damaging the meter and/or the battery



as for a better formula....
the best way to determine charge level is to measure the battery temperature rise above ambient
there may be some equation (that i'm not aware of) for partially charged batteries
you might look to some specification sheets from manufacturers
they generally provide graphs - from which an equation might be extracted

batteries made by different manufacturers use different combinations of chemicals/minerals
so - there is not likely to be one equation that works for all batteries

dedndave

as for measuring temperature, i have many times used el-cheapo 1N4148B diodes (modernized 1N914's)
you force a constant current on the diode, then measure the forward voltage drop, VF

the junction VF of most diodes is very linear over a limited range of temperatures
it should be noted that these diodes are encapsulated in glass - and are light-sensitive
so, some steps should be taken to prevent light from affecting the measurement

many encased battery packs have such diodes, internally mounted onto the battery cell
then, the charger has another similar diode, used to measure ambient temperature
when the temperature differential reaches a preset point, charging ceases
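
To make the idea concrete, here is a small C sketch of the VF-to-temperature conversion, assuming a typical silicon junction coefficient of roughly -2 mV per degree C and a forward drop calibrated at a known ambient; the numbers are illustrative placeholders, not 1N4148 datasheet values.

#include <stdio.h>

/* rough sketch of the diode-as-thermometer scheme described above        */
#define VF_AT_CAL_MV    600.0   /* VF measured at the calibration temperature */
#define CAL_TEMP_C       25.0   /* temperature at which that VF was taken     */
#define TEMPCO_MV_PER_C  -2.0   /* assumed junction coefficient               */

static double vf_to_temp_c(double vf_mv)
{
    /* VF falls as temperature rises, so solve the linear model for T */
    return CAL_TEMP_C + (vf_mv - VF_AT_CAL_MV) / TEMPCO_MV_PER_C;
}

int main(void)
{
    double battery_vf_mv = 580.0;  /* diode mounted on the cell (example reading) */
    double ambient_vf_mv = 598.0;  /* reference diode in the charger              */

    double rise = vf_to_temp_c(battery_vf_mv) - vf_to_temp_c(ambient_vf_mv);
    printf("temperature rise above ambient: %.1f C\n", rise);

    /* a charger using this scheme stops when 'rise' passes a preset point */
    return 0;
}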

Magnum

I was told that 25 mA of charging current was too small to raise the battery temperature.

How would a thermistor compare to those diodes ?


Take care,
                   Andy

Ubuntu-mate-18.04-desktop-amd64

http://www.goodnewsnetwork.org

dedndave

that may be true - i suggest a normal charging rate
C = mAh capacity
rate = C/10
so, if you have a 1000 mAh battery, charge it at 100 mA
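
Applied to the 280 mAh pack from the opening post, a small C sketch of that rule of thumb (the 1.2 factor is carried over from the charger manual's formula earlier in the thread):

#include <stdio.h>

/* C/10 rule of thumb: charge at one tenth of the mAh capacity,
   using the 280 mAh pack from the first post as the example    */
int main(void)
{
    double capacity_mah = 280.0;
    double rate_ma      = capacity_mah / 10.0;           /* C/10 -> 28 mA   */
    double hours        = 1.2 * capacity_mah / rate_ma;  /* manual's factor */

    printf("charge at %.0f mA for about %.1f h\n", rate_ma, hours);
    return 0;
}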

thermistors are typically more expensive than diodes
probably not as linear, either (over a range of 0 to 50 degrees C)
the circuit is about the same (as far as cost)
with a thermistor, you can use constant-current or constant-voltage
with a diode, constant current only

there is another way to tell if it's charged up all the way.....