From what I have seen and heard, a tube circuit can be tuned to work without transformers, but it would be very hard to mass produce because you would have to very carefully match components and test and re-test each unit by hand. They must have come up with a way to bypass the tolerance issue.
No--that's not why. The issue with tube circuits is one of impedance. Tubes have a very high output impedance, and they really hate to see low impedance loads. Remember the impedance rule--low into high? Well, consider the following things:
1) The more gain you ask a stage to deliver, the higher its output impedance tends to be.
2) The most gain you can get out of a single 12AX7-type stage is about 35dB, give or take. That means you can configure one 12AX7 (a dual triode, so it has two stages) for about 70dB of gain, provided that you completely ignore how linear its response is. Your output impedance from that one 12AX7 is going to be about 1.5k or so.
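To put plain numbers on the dB arithmetic above (the 35dB and 70dB figures are the estimates from this post), here's a quick Python sketch -- dB adds when you cascade stages, because the underlying voltage gains multiply:

```python
import math

def db_to_voltage_gain(db):
    """Convert a voltage gain in dB to a plain ratio: Av = 10^(dB/20)."""
    return 10 ** (db / 20)

stage_db = 35                 # rough gain of one 12AX7 triode stage
total_db = 2 * stage_db       # cascading the two triodes: dB simply adds

print(db_to_voltage_gain(stage_db))  # one stage: a voltage gain of about 56x
print(db_to_voltage_gain(total_db))  # both triodes: about 3162x
```

So "70dB from one bottle" really means multiplying the signal voltage by a few thousand, which is why linearity goes out the window.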
Obviously 1.5k is far too high an output impedance. The traditional way to drop it is to use either an output transformer or a buffer stage.
Since a transformer is pretty expensive, less expensive designs will use an output buffer instead. A good example of this is the Bellari RP520. It's a full-plate voltage design that uses an opamp as a buffer to drop the impedance.
Another method would be to follow the gain stage with a tube circuit known as a constant-current grounded cathode circuit. It has an output impedance of about 300 ohms, which is low enough to reasonably interface with another device without impedance problems.
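The "low into high" rule is really just a voltage divider: the source's output impedance and the load's input impedance split the signal, and you only keep Zin / (Zout + Zin) of it. A sketch of that, using the 1.5k and 300-ohm figures from this post (the load impedances are illustrative assumptions, not from the post):

```python
def fraction_delivered(z_out, z_in):
    """Voltage divider: fraction of the open-circuit signal that reaches the load."""
    return z_in / (z_out + z_in)

print(fraction_delivered(1500, 600))        # 1.5k into a 600-ohm input: ~0.29, most of the signal lost
print(fraction_delivered(1500, 1_000_000))  # 1.5k into a 1M grid: ~0.999, essentially no loss
print(fraction_delivered(300, 600))         # buffered down to 300 ohms, same 600-ohm load: ~0.67
```

That's the whole game: either keep the load impedance much higher than the source (low into high), or buffer the source down until it is.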
Now, this is by no means a complete explanation--that's kind of beyond the scope of this post. But hopefully it gives you a good idea of what sort of problem this poses, and how one might choose to solve it.
Also, Randy, I saw your question about the LA2 comp/mic preamp thing in the new issue. I think I can solve your problem if you provide a schematic to me. You can PM me if you like.