"The input sensitivity of an amplifier simply means how many volts are required to bring the amplifier to full power. Any amount of voltage beyond this figure will make your amplifier try to put out more power than it actually has, the result is called "clipping"." *
So we know that most CD Players and DACs put out a signal of roughly 2V over the single-ended line out.
Some output even more, but 2V (or a little higher) seems to be fairly common.
So you'd think that a modern amplifier would be designed to match this output, roughly speaking.
Not so.
Turns out modern amplifiers are incredibly sensitive (haha) and need much less than this to play at full power.
How much less?
For the purposes of my post I have only looked at integrated amplifiers, and only at line-level/RCA specs.
The separate pre/power approach, balanced transmission and vinyl introduce even more questions, and I'm trying to keep it simple.
(I've also ignored impedance for now.)
Here are three current amplifiers, chosen for no specific reason other than they shared their line input sensitivity specs online.
(Some brands don't.)
Marantz PM7005.
"Input Sensitivity: High level - 200mV / 20kOhm"
Rotel RA-1592.
"Line Level Inputs (RCA): 340mV"
Sugden A21 Signature.
"Line Input Sensitivity - 170mV for max. out"
What this means, as far as I can tell, is that you would never be able to turn the volume of any of these amps all the way up when connected to a source with a 2V output. At a few hundred millivolts these amps would already be at their full power output, and beyond that, I would imagine, bad things would happen.
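To put rough numbers on this (my own back-of-envelope sketch; the only real figures are the sensitivities quoted above, and the 60W into 8 ohms rating is just an assumed placeholder):

    import math

    # Quoted line input sensitivities (volts at the input for full rated output).
    sensitivities = {
        "Marantz PM7005": 0.200,
        "Rotel RA-1592": 0.340,
        "Sugden A21 Signature": 0.170,
    }

    source_v = 2.0        # typical single-ended CD player / DAC output
    rated_power_w = 60.0  # assumed rating, for illustration only
    load_ohms = 8.0
    v_out_full = math.sqrt(rated_power_w * load_ohms)  # ~21.9V at full power

    for name, sens in sensitivities.items():
        gain_db = 20 * math.log10(v_out_full / sens)   # overall voltage gain
        excess_db = 20 * math.log10(source_v / sens)   # how much hotter 2V is than needed
        print(f"{name}: gain {gain_db:.1f}dB, a 2V source needs at least "
              f"{excess_db:.1f}dB of attenuation before full power is reached")

In other words, if those assumptions are anywhere near right, the first 15-20dB of the volume control's travel is spent just getting back down below the clipping point.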
I just don't understand this.
If an integrated amp was designed with an input sensitivity of 2V, or even 1.5V, wouldn't it be much better matched to a modern source?
You'd have more range in the volume control, and much less of a risk of anything going poof.
So why do manufacturers do this?
Because 'louder' sells better on the showroom floor?
So that you can't really turn up the volume control very far, implying that it's got 'massive power reserves'?
To ensure broad compatibility with a wide variety of sources, e.g. phones and iPods that might provide lower voltage signals?
Or is there a good technical reason for this that I'm not aware of?
"With very high input sensitivity an amplifier will jump up to full volume very quickly as volume is turned up, giving a perception of being powerful, irrespective of true power output. For this reason, as well as broadening compatibility with sources, sensitivity is increasing in modern amplifiers." +
"An amplifier with high input sensitivity will also deliver a worse noise figure under measurement, because output noise is in most amplifiers determined by noise from the first stage, multiplied up by subsequent gain. When gain is turned down however, this noise falls accordingly and it isn?t in practice audible. A broadly useful input sensitivity is 200mV. It will cope with most sources. A lower value of 400mV suits silver disc players and modern tuners that typically give 1V output. It is too low for many external phono stages however. An input sensitivity of 90mV such as that of Naim amplifiers is very high, meaning volume will have to be kept low from CD." +
Sources:
http://www.hi-fiworld.co.uk/amplifiers/75-amp-tests/150-sensitivity.html +
http://www.decware.com/paper55.htm *