
Is The Master Volume Control Needed In Power Reduced Amps?

By Sid of Stone Marmot

April 26, 2009

A lot of guitar and bass amplifiers, particularly tube amps, have master volume controls. A master volume lets you set your preamp volumes for the desired preamp overdrive, or for the desired gain ratio between preamps when channel switching, and then simply adjust the master volume to suit wherever you are playing.

Many players don't like the master volume control. They claim it reduces the power amp overdrive, an overdrive that produces a sound many players like, and that it takes something away from the tone even when you think you are not overdriving the output stage. These claims can be valid. But with a properly designed master volume control, the control is completely out of the circuit when the master volume is turned all the way up. With a properly designed master volume, you have the best of both worlds.

A lot of people, including myself, are using various techniques to reduce the power supply voltages in their tube amplifiers. Kevin O'Connor is one of the best known advocates of this with his “Power Scaling” circuits, which are well thought out and recommended by many. But there are many different techniques used by many people to accomplish this same end. Reducing the power supply voltages reduces the output power of your amp, allowing you to get that desired power amp overdrive sound at more reasonable power levels.

The question is: What power supply voltages do you reduce?

The most obvious answer (and probably cheapest and easiest to implement) is to reduce all the power supply voltages proportionately. But my experience is that this does not usually give the best sound, though this is subjective and you may feel otherwise. The reason is that an electric guitar has a very wide dynamic range. Though the average level out of a typical guitar pickup may be 100 mV peak about a second after hitting a note or strumming a chord, the maximum level could easily be over 500 mV peak in the first 20 milliseconds after the chord or note is picked. This voltage out of a guitar does not change with amplifier power supply voltages but only with the way you play and how the controls on your guitar are set.
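That spread between the initial attack and the settled level is a factor of five, or roughly 14 dB of headroom the amp has to pass cleanly. A quick check with the example figures above (the exact numbers will vary with your pickups and playing style):

```python
import math

avg_pk = 0.100     # ~100 mV peak, about a second after the attack
attack_pk = 0.500  # 500+ mV peak in the first ~20 ms

ratio = attack_pk / avg_pk
db = 20 * math.log10(ratio)  # voltage ratio expressed in decibels
print(f"attack is {ratio:.0f}x the sustained level, about {db:.1f} dB")
```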

My observation is that when you reduce the power supply voltage to a vacuum tube, the gain does not change that much. But the dynamic range, that is, how big the output signal can get before saturating, does change dramatically with supply voltage.

For example, Table 1 shows how the gain and maximum output voltage change with power supply voltage for a 12AX7A. These numbers are taken from the RCA resistance-coupled amplifier tables. The resistances were chosen because they are typical of a black or silver face Fender preamp stage. Note that when the supply voltage is reduced from 300 to 180 Vdc, the gain drops less than 10 %, but the maximum output voltage drops over 56 %. Going from a 300 Vdc supply to 90 Vdc is even worse, as the gain drops less than 33 % but the maximum output voltage drops almost 90%.

Table 1. 12AX7A gain and maximum output vs. supply voltage (100 kohm plate
resistor and 220 kohm grid load resistance in all cases)

Supply voltage   Max output   Gain   Max output change   Gain change
                                     from 300 Vdc        from 300 Vdc
300 Vdc          57 V         52     -                   -
180 Vdc          25 V         47     56.1 %              9.6 %
90 Vdc           6 V          35     89.5 %              32.7 %
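The percentage drops in Table 1 follow directly from the listed gains and maximum output voltages. A quick check in Python (table values copied from the RCA charts quoted above):

```python
# Table 1 values: supply voltage -> (max output Vpk, gain), from the
# RCA resistance-coupled amplifier tables for the 12AX7A.
TABLE_1 = {300: (57, 52), 180: (25, 47), 90: (6, 35)}

def drops_from_300(supply):
    """Percent drop in max output and gain relative to the 300 Vdc row."""
    ref_vmax, ref_gain = TABLE_1[300]
    vmax, gain = TABLE_1[supply]
    return (round(100 * (ref_vmax - vmax) / ref_vmax, 1),
            round(100 * (ref_gain - gain) / ref_gain, 1))

for supply in (180, 90):
    vmax_drop, gain_drop = drops_from_300(supply)
    print(f"{supply} Vdc: max output down {vmax_drop} %, gain down {gain_drop} %")
```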

What does this mean? Well, if you have a 500 mV peak signal from your guitar pickup, with a 300 Vdc supply it will be amplified by 52 V/V, giving a 26 V peak signal at the output of this stage. There is plenty of margin between this desired 26 V output and the 57 V max output of this stage with a 300 Vdc supply, so the stage doesn't saturate. But with a 180 Vdc supply, the 500 mVpk signal is now amplified by 47, giving 23.5 Vpk. This is very close to clipping with the 25 Vpk max output at 180 Vdc. At a 90 Vdc supply, the 500 mV input signal is amplified by 35, giving a 17.5 Vpk desired output. But with a 90 Vdc supply the stage can only output a maximum of 6 Vpk, so this stage is now very far into saturation.
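The arithmetic above can be verified with a short sketch (same Table 1 values; the 500 mV peak input is the example figure from the text):

```python
V_IN_PK = 0.5  # 500 mV peak pickup signal, the example figure used above

# supply voltage -> (gain, max output Vpk), from Table 1
STAGES = {300: (52, 57), 180: (47, 25), 90: (35, 6)}

def stage_output(supply):
    """Return (desired output Vpk, True if the stage saturates)."""
    gain, vmax = STAGES[supply]
    desired = gain * V_IN_PK
    return desired, desired > vmax

for supply in (300, 180, 90):
    desired, clipped = stage_output(supply)
    vmax = STAGES[supply][1]
    print(f"{supply} Vdc: wants {desired:.1f} Vpk, max {vmax} Vpk -> "
          f"{'saturates' if clipped else 'clean'}")
```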

So if you are trying to get just power amp distortion at lower volume levels without altering any of the characteristics of the previous stages, you really need to decrease only the output stage power supply voltages and leave all the other stages alone. Decreasing the power supplies for any other stage will probably result in that stage saturating sooner than desired and probably before the power amp starts to saturate.

But if you don't reduce the supply voltages for any other stage, these stages will still be outputting the same maximum signal levels for a given input signal level. Reducing the power supply voltages for the power amp will result in the power amp starting to saturate with lower input signal levels to the power amp. So, to keep your amp sounding close to what it does when you turn it up most of the way, but only at a lower volume level by reducing the power amp supply voltages, you also need to reduce the signal input to the power amplifier.

A master volume control located between the phase inverter/splitter and the power amp provides the capability to reduce the input to the power amp as the power amp supply voltages are reduced. You have to manually adjust this master volume by ear to get the sound close to what it is with the amp turned most of the way up.
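One rough starting point before you tune by ear (my own rule of thumb here, not something specified by any of the power scaling designs) is to cut the power amp drive by about the same factor as the supply reduction, since the output swing shrinks roughly in proportion:

```python
def master_volume_guess(full_supply_v, reduced_supply_v):
    """Rough first guess for the master volume setting: scale the power
    amp input by the same factor as the supply reduction, so the stage
    saturates at roughly the same point in its now-smaller swing.
    Fine-tune by ear afterward; this is only a starting point."""
    return reduced_supply_v / full_supply_v

# e.g., a hypothetical amp whose power amp supply drops from 450 to 225 Vdc
print(f"start near {master_volume_guess(450, 225):.0%} of full drive")
```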

Note that running with reduced power amp supply voltages will never sound exactly the same as running the amp at close to full volume for a number of reasons:

1) You aren't driving the speakers or output transformer as hard at reduced supply voltages, so you are missing any distortions they may introduce.

2) You aren't taxing the power supply as much, so you will not get the sag and other peculiarities that occur at higher current levels.

3) Your ears hear things differently at lower volume levels, particularly in the bass and high frequency regions.


© 2007-2009 Stone Marmot Enterprises, all rights reserved.