Questions about power supply build.
Previously I asked: Is this a linear power supply? and What is the third switch terminal for? I suspect it is better form to combine questions for a single project under one topic.
I am starting with the 24VDC supply I mentioned previously and adding a cheap buck board to get variable voltage and constant current capabilities.
My panel meter requires 4-30VDC but 5-12VDC is recommended. I thought a resistance voltage divider would be the simplest way to power the meter. It only draws 15mA and, in an experiment that verified my calculations, my 1/4 watt resistors did not get hot. Then I read an admonition against using a resistance voltage divider to reduce voltage for a load. Does that apply in this case? What am I missing?
Because the current drawn by the load adds to the current passing through the top resistor of the divider network that establishes the voltage. Since voltage = current x resistance, the extra current increases the voltage dropped across that resistor and lowers the voltage delivered to the load.
Note that the extra current does NOT flow through the lower resistor, so the voltage drop across it is not affected in the same way.
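The effect is easy to put numbers on. A minimal sketch, with illustrative divider values (not the actual ones used, which weren't stated) and the 15 mA meter treated as a fixed resistance:

```python
# Hypothetical divider: R1 (top) and R2 (bottom) chosen to give ~9 V
# from 24 V when unloaded. Values are illustrative, not a recommendation.
V_IN = 24.0
R1 = 1000.0   # ohms, top resistor
R2 = 600.0    # ohms, bottom resistor

# Unloaded output: the plain divider ratio.
v_unloaded = V_IN * R2 / (R1 + R2)

# Loaded output: a 15 mA load at ~9 V looks like roughly 600 ohms;
# it sits in parallel with R2 and pulls the tap voltage down.
R_LOAD = 600.0
r2_parallel = R2 * R_LOAD / (R2 + R_LOAD)
v_loaded = V_IN * r2_parallel / (R1 + r2_parallel)

print(f"unloaded: {v_unloaded:.2f} V, loaded: {v_loaded:.2f} V")
```

With these values the tap drops from 9 V unloaded to about 5.5 V loaded, which is exactly the problem being described.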
I had a psychic girlfriend but she left me before we met.
Personally, I would prefer 5 separate forum threads for 5 different questions, even if they all relate to the same project. Sometimes all 5 questions might get answered in one simple reply, but all too often 'simple questions' have complicated answers, and the tangle that happens when 5 different people try to answer 5 different questions in the same thread makes it look like an ocean liner's mooring rope!
This is one of those 'tricky cases' where judgement, rather than fixed rules, is appropriate.
I'll start by trying to explain why potential dividers are not normally used as a "power supply" for a load.
Using a potential divider means some of the current is flowing through both resistors, and doing nothing useful for the load. It works fine for something like a radio volume control, in which the power is minute, and the potentiometer is tapping off a small proportion of the voltage signal to feed to an amplifier, but it is generally a wasteful approach when dealing with significant power levels.
In electronic circuits there are many similar situations in which the load has a much higher resistance than the resistors that make up the potential divider. Note the word 'potential': it means 'voltage divider', not 'power divider', and a power source is effectively what you need when you are reducing the voltage to supply a load.
When you attempt to use a potential divider to supply a lower voltage to a load, the load's resistance sits in parallel with the lower resistor of the divider. So unless you know the resistance of the load, and compensate by changing the divider values to allow for its influence, the voltage with the load attached will be lower than without it.
So far I have assumed the load always draws the same current ... if the current changes, so will the voltage at the potential divider's output point.
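To illustrate that second point with numbers (divider values again assumed purely for illustration), here is a small sketch that solves for the tap voltage at different load currents:

```python
# Sketch: how a loaded divider's output moves as the load current varies.
# Assumed values: 24 V supply, 1 k / 600 ohm divider (illustrative only).
V_IN, R1, R2 = 24.0, 1000.0, 600.0

def divider_out(i_load):
    """Tap voltage when the load draws i_load amps from the tap.
    KCL at the tap: (V_IN - V)/R1 = V/R2 + i_load, solved for V."""
    return (V_IN / R1 - i_load) / (1 / R1 + 1 / R2)

for ma in (5, 10, 15, 20):
    print(f"{ma:2d} mA load -> {divider_out(ma / 1000):.2f} V at the tap")
```

With these values the tap voltage falls from about 7.1 V at 5 mA to 1.5 V at 20 mA, so even modest current swings move the output a lot.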
So as you can see, there are reasons why 'potential dividers' typically make very bad 'power dividers'.
I'll assume your panel meter is based on "7 segment LEDs", i.e. each digit consists of 7 line- or bar-shaped LEDs arranged in an '8' shape, which can display any one of the digits 0 to 9 by switching on the appropriate LEDs.
You may realise that, at least as far as the current to illuminate the LEDs is concerned, showing a '1' powers two LEDs, whilst showing an '8' powers all 7. Thus I would expect the current demand to change markedly as the displayed value changes.
Given the very wide 'acceptable' input range of 4-30V, you could consider using just a single series resistor, chosen so that some of the 'surplus' volts are dropped, which might reduce the heat dissipation in the display. You might pick a resistor that still provides an adequate voltage when the display is showing something like "888", to ensure there is always enough voltage to cope with any situation, then make sure the voltage is not excessive when showing something like "111".
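As a sketch with made-up numbers (the real meter currents aren't known, so these are pure assumptions), the sizing sums might look like:

```python
# Single-series-resistor sizing sketch. Assumed (made-up) numbers:
# meter draws 10 mA showing "111" and 20 mA showing "888",
# and we want at least ~5 V at the meter in the worst case from 24 V.
V_IN = 24.0
I_MIN, I_MAX = 0.010, 0.020   # assumed current range, amps
V_METER_MIN = 5.0             # lowest voltage we want at the meter

# Largest series resistor that still leaves V_METER_MIN at full current:
r_max = (V_IN - V_METER_MIN) / I_MAX          # 950 ohms
r = 820.0                                     # nearest common value below

v_at_888 = V_IN - I_MAX * r   # meter voltage at maximum current
v_at_111 = V_IN - I_MIN * r   # meter voltage at minimum current
p_resistor = I_MAX**2 * r     # worst-case dissipation in the resistor

print(f"meter sees {v_at_888:.1f} V to {v_at_111:.1f} V; "
      f"resistor dissipates up to {p_resistor:.2f} W")
```

Note how wide the resulting voltage swing is (roughly 7.6 V to 15.8 V with these assumed numbers), which is exactly why the voltage-varies-with-reading caveat below matters, and the worst-case dissipation of about a third of a watt means a 1/4 W resistor would not be enough here.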
I wouldn't generally recommend resistors as a solution for this case, as it means the voltage to the meter changes with the reading, and that might upset the calibration in a weird way. However, sometimes a simple but crude approach is 'good enough'. Only you, as the user, can make that judgement.
Personally, I would consider adding a small voltage regulator just for the meter. As the current demand is modest, either a '1 chip' linear regulator (with input and output capacitors as per the data sheet) or one of the very small buck regulator boards could be used. Either can be purchased from the likes of AliExpress, etc. for much less than a US dollar each, though you might have to buy them in packs of 5 or even 10, which might seem a bit excessive.
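A quick feasibility check on the linear-regulator option (the 24 V input and ~15 mA load come from the thread; the 5 V output is an assumed choice):

```python
# Heat in a '1 chip' linear regulator feeding the meter.
# A linear regulator burns off the input-output difference as heat.
V_IN, V_OUT, I_LOAD = 24.0, 5.0, 0.015   # volts, volts, amps (assumed)
p_dissipated = (V_IN - V_OUT) * I_LOAD   # watts of heat in the regulator

print(f"linear regulator dissipates about {p_dissipated:.2f} W")
```

At roughly 0.3 W this is fine for a typical TO-220 regulator without a heatsink; a small buck board would waste far less, but at this power level the difference hardly matters.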
Sorry, such a simple question has a more complicated answer, but I hope it helps you make informed decisions in the future.
Best wishes, Dave
I hadn't considered that the meter would vary its current requirements. I am going to try to measure that and see if it is significant. Also, it is a combination volt/ammeter and I didn't wire the ammeter when I measured the current requirement. I need to check that too.
I do have a buck board I could use to power the meter. But it requires a maximum 23V input so I would have to drop the voltage a bit to use it. A couple of diodes in series would probably do.
But I have to admit I see the elegance of a voltage regulator IC.
All this assumes I was right to reject running the meter on 24V in the first place.