I completely understand why many listeners prefer and seek out the rich
harmonic distortion of tube amps and similar gear that doesn’t perform
so well on a test bench.
The problem here is that most solid state amps produce harmonic distortion too, but since it consists of higher orders, the ear interprets it as brightness and harshness.
This is why tubes are still around. But the real problem isn't transistors so much as it's a lack of enough feedback. Feedback sits on a sort of bell curve, with 17.5dB of feedback being right in the middle of where you don't want to be! You need about 35dB of feedback to get rid of not only the innate distortion of the circuit but also the distortion caused by the application of feedback itself. The semiconductors needed to do that really didn't exist in the 1970s and 1980s. The **will** to design such an amp has been a bit lacking as well.
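The idea that feedback both reduces distortion and creates new, higher-order products can be sketched with a toy numerical model. This is not any real amplifier: the gain, the cubic nonlinearity, and the feedback fractions are all made-up illustrative numbers. The forward path is a linear gain into a weak cubic "amplifier" nonlinearity, and the closed loop is solved per-sample by bisection.

```python
import math

# Toy feedback-amplifier model. All numbers are illustrative assumptions,
# not taken from any real design: gain A into a weak cubic nonlinearity.
A = 100.0   # open-loop gain (40dB), hypothetical
C = 0.05    # cubic nonlinearity coefficient, hypothetical
N = 256     # samples per cycle of the test sine

def nl(v):
    """Memoryless 'amplifier' nonlinearity: soft cubic compression."""
    return v - C * v ** 3

def closed_loop_output(beta):
    """One cycle of output y satisfying y = nl(A*(x - beta*y))."""
    amp_in = (1.0 + A * beta) / A        # keep output near unit amplitude
    out = []
    for n in range(N):
        x = amp_in * math.sin(2 * math.pi * n / N)
        lo, hi = -3.0, 3.0               # bracket for v = A*(x - beta*y)
        for _ in range(60):              # bisect x = v/A + beta*nl(v)
            mid = 0.5 * (lo + hi)
            if mid / A + beta * nl(mid) < x:
                lo = mid
            else:
                hi = mid
        out.append(nl(0.5 * (lo + hi)))
    return out

def harmonic(sig, k):
    """Amplitude of the k-th harmonic via DFT projection."""
    re = sum(s * math.cos(2 * math.pi * k * n / N) for n, s in enumerate(sig))
    im = sum(s * math.sin(2 * math.pi * k * n / N) for n, s in enumerate(sig))
    return 2.0 * math.hypot(re, im) / N

for beta, label in [(0.0, "no feedback"), (0.05, "~15dB"), (0.56, "~35dB")]:
    y = closed_loop_output(beta)
    print(f"{label:>12}: 3rd = {harmonic(y, 3):.2e}  5th = {harmonic(y, 5):.2e}")
```

With no feedback the cubic makes only a 3rd harmonic; moderate feedback shrinks the 3rd but the loop regenerates a 5th that wasn't there before; heavy feedback knocks both down, which is the point of the bell-curve argument above.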
You know you have a problem if the distortion measured at 100Hz isn't the same at 1kHz and 10kHz. Often it isn't, because the amplifier under test lacks the Gain Bandwidth Product to stay linear, so the distortion increases with frequency as a result. Essentially the feedback is decreasing with frequency. This is why the industry usually only does its harmonic distortion measurement at 100Hz...
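How feedback falls off with frequency drops straight out of a single-pole open-loop response. The numbers below (80dB open-loop gain, 100Hz open-loop pole, closed-loop gain of 100) are hypothetical, picked only to show the shape of the problem:

```python
import math

# Hypothetical single-pole amplifier; the numbers are illustrative only.
A0 = 10_000.0   # open-loop gain (80dB)
fp = 100.0      # open-loop pole frequency, Hz
beta = 0.01     # feedback fraction (closed-loop gain near 1/beta = 100)

def loop_feedback_db(f):
    """Feedback in dB at frequency f: how much distortion gets reduced."""
    A = A0 / math.sqrt(1.0 + (f / fp) ** 2)   # open-loop gain rolls off past fp
    return 20.0 * math.log10(1.0 + A * beta)

for f in (100.0, 1_000.0, 10_000.0):
    print(f"{f:>7.0f} Hz: {loop_feedback_db(f):5.1f} dB of feedback")
```

For this toy amp the feedback drops from roughly 37dB at 100Hz to about 6dB at 10kHz, so distortion measured at 10kHz would be far worse than the 100Hz figure on the spec sheet.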
The other part of the distortion signature that might be more important is that there be enough of the 2nd and 3rd harmonic (both treated by the ear the same way, which is to say that the ear is insensitive to them) to mask the presence of the higher orders. This might actually be more important than how much distortion is present overall, although it's best to keep it low as it can mask detail. If the higher orders are masked the amp will sound smooth (which is why tube amps often sound smooth despite having higher distortion than solid state amps).