# Power Consumption Question



## BleedingStar (Feb 3, 2008)

I have just been curious about the power consumption of stereo amplifiers. When an amp is on, is the power consumption determined solely by the gain, or does it depend on the actual sound level being sent through it? Say my sub amp is on, but I am listening to a song or movie soundtrack with very little bass... is it consuming less power than when it is working much harder? This seems to make the most sense to me, but I have also heard that it is based solely on the gain settings regardless of the signal, and that it takes the same power to amplify the sound regardless of the signal level.

My case in point is this... my system is roughly 2000 watts max output. I would assume that I would only actually be consuming 2000 watts of power if all the gains and volumes were maxed out and the audio was peaking? At a lower, reasonable level while watching an average movie soundtrack, it should be consuming much, much less power than that, only drawing extra power for peaks as needed, correct?

Just curious.


----------



## Wayne A. Pflughaupt (Apr 13, 2006)

Well, you’re kinda confusing power consumption with output. Consumption would be what it’s demanding from the electrical circuit it’s plugged into. I think what you’re really asking about, though, is the amp’s power output, but they do kinda go together.

At idle – i.e. if it’s on, but with no signal present – the amp will draw a limited amount of power from the wall, but will put out no significant power (i.e. wattage) at all. The amp will only deliver power (to the speakers) and consume power (from the wall) with the presence of an incoming signal. The amount of power output will depend on the level of the input signal.

Any amp will deliver its maximum power output when it gets the maximum input signal that it can handle. Note that this is independent of the amp’s gain settings, which should have been previously adjusted to compensate for a hot or weak input signal.
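To put rough numbers on that last point, here is a toy Python sketch (all figures are made up for illustration, not any real amp’s specs). A weak signal with the gain set high and a hot signal with the gain set low land at the same output, and no gain setting can push output past the amp’s rated maximum:

```python
# Toy model of amp output vs. input level and gain.
# Numbers are illustrative only -- not a real amplifier's specs.

MAX_OUTPUT_W = 200.0   # hypothetical rated output
LOAD_OHMS = 8.0        # hypothetical speaker load

def output_power(input_volts: float, gain: float) -> float:
    """Output power into the load, clipped at the amp's rated maximum."""
    v_out = input_volts * gain
    p_out = v_out ** 2 / LOAD_OHMS
    return min(p_out, MAX_OUTPUT_W)

print(output_power(0.5, 20))   # weak input, gain set high -> 12.5 W
print(output_power(2.0, 5))    # hot input, gain set low   -> 12.5 W
print(output_power(10.0, 20))  # overdriven: clips at the 200 W maximum
```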

Make sense?

Regards,
Wayne


----------



## BleedingStar (Feb 3, 2008)

I am actually talking about the power that is taken from the wall. I spoke a bit with a friend majoring in audio engineering and he said that the two were closely related. He said a 600 watt amp would use roughly 700 watts of power from the wall. He had some crazy calculation for it, if I remember correctly. But I wasn't sure how gain affected that.
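That calculation was most likely just an efficiency ratio: wall draw is roughly output power divided by the amp’s efficiency. A minimal sketch, assuming a hypothetical amp that is about 85% efficient (the efficiency figure is an assumption, not a measurement):

```python
# Rough wall-draw estimate from output power and amplifier efficiency.
# The efficiency figure is a ballpark assumption, not a measured value.

def wall_draw_watts(output_w: float, efficiency: float) -> float:
    """Approximate power drawn from the wall for a given output power."""
    return output_w / efficiency

# ~600 W of output at ~85% efficiency lands near the "700 W from the
# wall" figure mentioned above.
print(round(wall_draw_watts(600, 0.85)))  # -> 706
```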


----------



## tonyvdb (Sep 5, 2007)

Wayne is right, but I should add that it also depends on the class of the amp. Class "A" amps use almost as much power sitting idle as they do running at full power, because their output devices conduct continuously even when there is no signal.
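A quick sketch of how much class matters, using typical ballpark efficiency figures (assumptions for illustration, not specs from any particular product):

```python
# Ballpark efficiency by amplifier class -- typical figures, not from
# any specific product. Class A burns most of its draw as heat.

TYPICAL_EFFICIENCY = {
    "A": 0.25,   # output devices conduct full bias current at all times
    "AB": 0.60,  # real-world figure, below the ~78.5% theoretical max
    "D": 0.90,   # switching ("digital") designs
}

def wall_draw(output_w: float, amp_class: str) -> float:
    """Approximate wall draw for a given output power and amp class."""
    return output_w / TYPICAL_EFFICIENCY[amp_class]

for cls in ("A", "AB", "D"):
    # wall watts needed for 100 W of output
    print(cls, round(wall_draw(100, cls)))
```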


----------



## Wayne A. Pflughaupt (Apr 13, 2006)

Thanks Tony, I was going to mention that. I don’t know how a formula could predict how much power every amp will draw, as some designs are more efficient than others. For instance, I recall back in the 80s or 90s that Bob Carver designed an amp that would just about put out a watt for every one used from the wall. Pretty unusual. And I understand that late-model digital amps are extremely efficient.

And again, the only thing gain does is adjust for the incoming signal level. If an amp clips at so many volts of input then that’s all it’s going to do. Doesn’t matter if the gains are set low for a hot input signal, or high for a weak signal.

Regards,
Wayne


----------



## eugovector (Sep 4, 2006)

Don't know if you've seen this yet: http://www.hometheatershack.com/for...ment-how-much-does-cost-run-these-things.html

I found that the louder the movie, the more power was consumed, but I sure didn't get anywhere near the 1000 W rating of my AVR (that's toaster-oven levels).


----------



## BleedingStar (Feb 3, 2008)

Just read through it, and it was very interesting... Makes me feel a lot less guilty about the addition of some recent power-consuming items. I was starting to feel a bit guilty about running a tri-sub, tri-amp system, but it seems like, on average, it might actually take less power to run my theater in the dark than to light the whole room up with the overhead light... hehe.

P.S. Where did you get the Kill-A-Watt meter?


----------



## Mike Cason (Mar 17, 2007)

Here is a tip most overlooked by many audio folks who keep adding equipment to their system.

Make sure you have ample voltage to your equipment. The standard supply voltage rating on most equipment is 120 volts. If your voltage drops below that, down to 110 to 115 volts or in some cases even lower, your equipment will draw higher amps.

The electrician's rule is "the lower the voltage on a given load, the higher the amperage." I have 4 amplifiers in my home theater system, so I've added two extra circuits to maintain proper voltage. (I'm a licensed electrician.)
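The rule above follows directly from I = P / V for equipment that draws roughly constant power. A minimal sketch with illustrative numbers:

```python
# For a load drawing roughly constant power, current rises as the
# supply voltage sags: I = P / V. Numbers are illustrative only.

def current_amps(power_w: float, volts: float) -> float:
    """Current drawn by a constant-power load at a given voltage."""
    return power_w / volts

# The same 1200 W load pulls noticeably more current at 110 V than 120 V.
print(current_amps(1200, 120))  # 10.0 A
print(current_amps(1200, 110))  # ~10.9 A
```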

A lower voltage will cause premature equipment failure because the equipment works harder and hotter for a given output, especially with high-current amplifiers.

Most public utility commissions allow your electrical supplier (your utility company) to remain within a few volts, or a few percent, of the 120/240 single-phase supply at your meter. If your supply voltage to your home is 118 V per line, this may be allowable by the commission. If it is lower, call the utility company and have them check your service drop, because it may be too small or have a bad connection at your weatherhead, especially if you are living in an older home. Not much current was required in the old days, as electrical gadgets were minimal. The transformer at the pole may also be too small if other homes were built later and you have more than 4 homes on a 25 kVA transformer. The number, "25" or "50", should be on the side of the pot (transformer).

If you have low voltage to your home, expect a further voltage drop by the time the wiring finally reaches the outlet at your equipment receptacle. Most utility company transformers will supply 120 to 124 volts to your home, which is fine. (They make 130 volt light bulbs, and I recommend them because they last longer, especially when your power supplier is giving you 124 volts and your standard light bulb is only rated at 120 volts.)

For those of you with a lot of equipment, I recommend powering everything up close to its peak current draw and checking the voltage at the receptacle or power strip. If it is low, get a power supply or add an extra circuit. Without powering up everything under full load, you won't know what your actual supply voltage really is.

You can purchase a 240 volt power supply (stepped down to 120 volts) or a Monster Power Center to keep the voltage consistent with the nameplate rating of your equipment. Proper voltage to your equipment can be compared to good gasoline for your automobile engine.

Mike


----------



## eugovector (Sep 4, 2006)

BleedingStar said:


> P.S. Where did you get the Kill-A-Watt meter?


Borrowed it from a friend, but a Google search will turn up a number of places. Newegg has them on promotion for under $20 shipped from time to time. I'll probably be picking one up for reviews the next time they go on sale.


----------



## tonyvdb (Sep 5, 2007)

Wayne A. Pflughaupt said:


> For instance, I recall back in the 80s or 90s that Bob Carver designed an amp that would just about put out a watt for every one used from the wall. Pretty unusual. And I understand that late-model digital amps are extremely efficient.


Yup, I had just such a receiver: the Carver Receiver 6250 "Magnetic Field" with Sonic Hologram built in. It worked really well, but I had to sell it when I upgraded my receiver (the wife said it had to go).


----------

