Can AVR DC Output at Preamp Outs be Increased?


Can the DC voltage output at a typical AVR be increased from a specified 1.0V output to 10V or more output Volts?

If so, would this be done by upgrading the op amps in the preamp-out section of the receiver? Or is there another way this could be accomplished?

And has anyone ever done this?
vinyl_rules
Let's start with the basics: musical signals are AC.

DC is a speaker killer, which is why most amps filter DC from the input.

If you want a bit of gain juice, run a tape loop, a Schiit Loki, or something similar.
@vinyl_rules Yes it does, and to answer your question, you can. I would have to look at the design to see how to make the change and whether it would make sense to do.

Happy Listening.
I understand you are now talking about dynamic range, a completely new subject. What I do not understand is how you think more voltage is going to increase dynamic range.
I am confused by the lack of comprehension regarding my question.

Well first you asked about DC, which is nothing to do with anything here. 

Then you asked about voltage, with no reason, at least none anyone can understand. 

The AR preamp is capable of higher voltage. That does not mean it is ever used. If you were saying your AVR volume is maxed out and still not enough to drive the amp, that would at least make sense. But no. There seems to be no reason for any of your questions except as an exercise in what-ifs.

I would highly recommend the excellent The Complete Guide to High-End Audio by Robert Harley, now in a new 7th edition. In it you will learn a lot about power supplies, which will answer a lot of your questions and, if nothing else, help you understand the subject well enough that your questions are less perplexing.
Perhaps some clarification would enable more understanding of my post.

My intended outcome is to increase the dynamic range of the preamp’s output into an external power amp, not to pump more DC volts into an external power amp with either a 0.75 V DC or 2.5 V DC input sensitivity. I believe electronics with greater dynamic range generally sound better than electronics with lesser dynamic range.

As always, YMMV 😎
vinyl_rules
I am confused by the lack of comprehension regarding my question. ... why would I want 0 Volts DC at my output? ...
You want as little DC voltage on your preamp output as possible.
As previously stated, “I wish to increase the PREAMP output of my AVR from 1.0V RMS to at least 10V RMS at all PREAMP outputs.”
As has been explained to you, there's no advantage to doing this, and you'll likely overload your amplifier input if you do.
I am confused by the lack of comprehension regarding my question.

1. jasonbourne52, my question said NOTHING about speaker outputs or the number of channels in my AVR. And FYI, voltage DOES NOT “drive” speakers, watts drive speakers. Watts = amps TIMES volts; volts = watts DIVIDED by amps.

2. kinjaki, to state “Your power amp can handle 0.75V or 2.5V at the input” is incorrect. I presume you meant to state my power amp is RATED at 0.75V or 2.5V at the input.

3. creeds, why would I want 0 Volts DC at my output? 0 Volts DC at a PREAMP output renders it incapable of driving an external power amplifier.

To reiterate, my question involved increasing the voltage output at my AVR’S PREAMP outputs. As previously stated, “I wish to increase the PREAMP output of my AVR from 1.0V RMS to at least 10V RMS at all PREAMP outputs.”
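For anyone keeping score on the watts-vs-volts point above, the relations quoted (watts = amps times volts, plus Ohm's law) can be sanity-checked in a few lines of Python. This is an illustrative sketch only; the 8-ohm load is an assumed value, not a spec from anyone's gear in this thread.

```python
import math

def power_from_voltage(v_rms: float, load_ohms: float) -> float:
    """P = V^2 / R, from combining P = V * I with Ohm's law I = V / R."""
    return v_rms ** 2 / load_ohms

# A 1.0 V RMS preamp-level signal into a hypothetical 8-ohm load is tiny:
p_preamp = power_from_voltage(1.0, 8.0)      # 0.125 W

# Conversely, a 100 W rating into 8 ohms implies roughly 28 V RMS,
# matching the "(28 volts AC)" figure mentioned elsewhere in the thread:
v_for_100w = math.sqrt(100 * 8)              # about 28.3 V
```

So volts and watts are two views of the same signal once the load impedance is fixed; neither "drives" a speaker without the other.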
vinyl_rules
Can the DC voltage output at a typical AVR be increased from a specified 1.0V output to 10V or more output Volts?
Ideally, you want your DC voltage at the output to be as close to 0 VDC as you can get.

Your power amp can handle 0.75V or 2.5V at the input. Why would you make your preamp output >10V? It wouldn't work with your amp.

In general, it is better to put more gain in the cleaner environment (preamp), but both pre and amp have to be designed for that (like my Benchmark DAC3 and AHB2) - otherwise it doesn't make sense.

In addition, it is not a matter of a different op amp: op amps don't set the gain, the circuitry around them does, and changing that circuitry would amount to redesigning the stage.
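To put a number on "the circuitry around them sets the gain": for the common non-inverting op-amp stage, the closed-loop gain is 1 + Rf/Rg, fixed entirely by two resistors in the feedback network. A minimal sketch, with hypothetical resistor values (not taken from any actual AVR schematic):

```python
def noninverting_gain(rf_ohms: float, rg_ohms: float) -> float:
    """Closed-loop gain of a non-inverting op-amp stage: 1 + Rf / Rg."""
    return 1.0 + rf_ohms / rg_ohms

# Hypothetical feedback network: Rf = 9 kOhm, Rg = 1 kOhm -> gain of 10.
g = noninverting_gain(9_000.0, 1_000.0)   # 10.0
```

Swapping the op amp itself leaves Rf and Rg, and therefore the gain, unchanged; getting 10x the output would mean reworking this network and confirming the supply rails can actually swing that far.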
Stop trying to second guess the engineering team that designed your AVR (Denon, Pioneer, Technics ...). Don't be an I***T!
Why do you want to do this? Only the AVR's speaker outputs need more voltage to drive the five or more speakers in your home theater setup. I assume your AVR already has at least 5 channels rated at 100 watts (28 volts AC) each! That is adequate for most speakers! 
This is the second stupid thread I have read this morning! The first was by dbefus85!
Let me simplify.

For example, the Audio Research SP3 preamp is rated for 25V RMS @ 1 kHz at all outputs (https://www.arcdb.ws/model/SP3).

My AVR is rated for 1.0V RMS @ 1kHz at all PREAMP outputs.

I wish to increase the PREAMP output of my AVR from 1.0V RMS to at least 10V RMS at all PREAMP outputs.

In theory I would think I could accomplish this by changing out a couple of op amps.
How easy and/or complicated would it be to accomplish this?
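As a back-of-envelope check on the request above: going from 1.0 V RMS to 10 V RMS is a 20 dB increase in gain, which the feedback networks and supply rails of the existing preamp section would all have to accommodate. A quick sketch of the arithmetic:

```python
import math

def db_gain(v_out: float, v_in: float) -> float:
    """Voltage ratio expressed in decibels: 20 * log10(Vout / Vin)."""
    return 20.0 * math.log10(v_out / v_in)

extra_db = db_gain(10.0, 1.0)   # 20.0 dB of additional gain required
```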