The short answer is "don't do that." The voltage dropped by a resistor is given by Ohm's Law: V = I R. So if you know exactly how much current your device will draw, you could choose a resistor that drops exactly 7.5 V at that current, leaving 4.5 V for your device. But if the current through your device changes, the voltage dropped by the resistor changes with it, and your device will no longer see 4.5 V.
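For concreteness, here is a minimal sketch of that calculation, assuming a 12 V supply, a 4.5 V device, and a steady 20 mA load (the 20 mA figure is an assumed example, not from the original post):

```cpp
// Hypothetical worked example: sizing a dropping resistor with Ohm's Law.
// The 20 mA load current is an assumed value for illustration only.
#include <cstdio>

int main() {
    const double supplyVolts = 12.0;   // regulated supply
    const double deviceVolts = 4.5;    // voltage the device wants
    const double loadAmps    = 0.020;  // assumed steady load current (20 mA)

    double dropVolts  = supplyVolts - deviceVolts;  // 7.5 V across the resistor
    double resistance = dropVolts / loadAmps;       // R = V / I
    double powerWatts = dropVolts * loadAmps;       // P = V * I, dissipated as heat

    std::printf("R = %.0f ohm, dissipating %.3f W\n", resistance, powerWatts);
    // Prints: R = 375 ohm, dissipating 0.150 W
    // ...but only while the load really draws 20 mA.
    return 0;
}
```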
If you have a 12 V source and 3 V LEDs, why not put three or four of them in series, instead of wiring them all in parallel with big power-wasting resistors dropping the rest of the voltage? And if your maximum current is 1 amp but you're only driving the LED at 1 milliamp, why not use a smaller, cheaper LED?
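As a rough comparison under assumed example numbers (3 V LEDs run at 20 mA), this sketch shows how much less power the series arrangement wastes:

```cpp
// Comparing the two arrangements from the question: three 3 V LEDs on a 12 V
// supply, either each in parallel with its own resistor, or all three in
// series behind a single resistor. The 20 mA LED current is an assumed value.
#include <cstdio>

int main() {
    const double vSupply = 12.0, vLed = 3.0, iLed = 0.020;

    // Parallel: every LED needs its own resistor dropping (12 - 3) = 9 V.
    double rParallel      = (vSupply - vLed) / iLed;          // 450 ohm each
    double wastedParallel = 3 * (vSupply - vLed) * iLed;      // heat in 3 resistors

    // Series: one string of three LEDs (9 V) plus one resistor dropping 3 V.
    double rSeries      = (vSupply - 3 * vLed) / iLed;        // 150 ohm
    double wastedSeries = (vSupply - 3 * vLed) * iLed;        // heat in 1 resistor

    std::printf("parallel: 3 x %.0f ohm, %.2f W wasted\n", rParallel, wastedParallel);
    std::printf("series:   1 x %.0f ohm, %.2f W wasted\n", rSeries, wastedSeries);
    return 0;
}
```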
You can drive a 10 W RGB LED from a regulated 12 V supply the cheap way (at the cost of a bit of heat). Use three TIP120 Darlington transistors (using MOSFETs would just mean hotter resistors). Emitters to ground. Bases via three 1 k resistors to three PWM outputs. LED anode to 12 V. LED cathodes via three current-limiting resistors to the three collectors.
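A minimal sketch of the control side, assuming an Arduino-class board with hardware PWM; the pin numbers and duty cycles are arbitrary placeholders, not taken from the original post:

```cpp
// Three PWM outputs, each driving the base of one TIP120 through a 1k resistor.
// Duty cycle sets the average current through that LED channel.
const int redPin   = 9;   // PWM output -> 1k -> base of red-channel TIP120
const int greenPin = 10;  // PWM output -> 1k -> base of green-channel TIP120
const int bluePin  = 11;  // PWM output -> 1k -> base of blue-channel TIP120

void setup() {
  pinMode(redPin, OUTPUT);
  pinMode(greenPin, OUTPUT);
  pinMode(bluePin, OUTPUT);
}

void loop() {
  // 0 = channel off, 255 = channel fully on; example mix only.
  analogWrite(redPin, 200);
  analogWrite(greenPin, 64);
  analogWrite(bluePin, 0);
}
```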
If V1 were 100 V, for example, then R1 would need to be a large and cumbersome power resistor (dropping 97 V at even 20 mA turns nearly 2 W into heat), and you might need a fan or heatsink to keep things cool. Another approach is to use some type of current source. In that case it is pointless to add a resistor: all the resistor will do is dissipate power without changing the LED current.