04-20-2004, 09:05 AM   #5
kazoo
Psycho
 
Location: Where the night things are
Quote:
Originally posted by MrSelfDestruct
If you can do it safely, use a voltmeter and test the voltage running through the fixture. There's a chance that a voltage drop is causing the bulb to draw more amperage from the circuit than it should, still coming to 100 watts, but burning bulbs with the increased current.
Wait a minute! Were that postulate true, I could connect a bulb to a variac and, as I reduced the voltage from 120, its brightness would stay constant, and only as I neared zero would the bulb fail, since you're claiming that on a 10-volt supply a 100-watt bulb would draw 10 amps?! A filament behaves roughly as a resistance, so as the supply voltage drops the current drops with it, and the power falls too; the bulb does not somehow hold itself at 100 watts. Furthermore, if your point were accurate, how would one explain a simple rotary dimmer? It reduces the effective voltage delivered to the fixture (the classic ones are rheostats, i.e. variable resistors), and the bulb simply dims rather than drawing ever more current. :highly skeptical smiley:
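
If you want to see the two models side by side, here's a quick back-of-the-envelope sketch in Python. It treats a nominal 100 W / 120 V bulb as a fixed hot resistance of roughly 144 ohms, which is an approximation since a real filament's resistance drops as it cools, and compares that against the claim that the bulb always pulls 100 watts:

Code:
# Rough comparison for a nominal 100 W / 120 V incandescent bulb.
# Assumption: the filament is modeled as a fixed "hot" resistance
# R = V_rated**2 / P_rated (~144 ohms); real filaments vary with temperature.

V_RATED = 120.0   # volts
P_RATED = 100.0   # watts
R_HOT = V_RATED ** 2 / P_RATED  # ~144 ohms

def resistive_model(volts):
    """Ohm's-law behavior: current and power both fall as voltage falls."""
    current = volts / R_HOT
    power = volts * current          # = volts**2 / R_HOT
    return current, power

def constant_power_claim(volts):
    """The claim being rebutted: the bulb always draws 100 W regardless of voltage."""
    current = P_RATED / volts        # blows up as volts approach zero
    return current, P_RATED

for v in (120, 90, 60, 30, 10):
    i_r, p_r = resistive_model(v)
    i_c, p_c = constant_power_claim(v)
    print(f"{v:>5.0f} V | resistive: {i_r:5.2f} A, {p_r:6.1f} W "
          f"| constant-power claim: {i_c:5.2f} A, {p_c:6.1f} W")

At 10 volts the resistive model gives about 0.07 amps and well under a watt, while the constant-power claim would demand a full 10 amps. That's the absurdity I'm pointing at.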
__________________
There ain't nothin' more powerful than the odor of mendacity -Big Daddy
 
