One of the major flaws in your maths is the assumption that a naturally aspirated production engine, like an old 5-litre Holden or Ford V8, fills each cylinder with a volume of air equal to its per-cylinder displacement on every induction stroke. It doesn't. To do that it would need 100% volumetric efficiency (VE) at the revs in question.
The pumping losses through the filter, inlet tract, ports and past the valves bring it down to around 80-90% (don't quote me on the exact figure), which is why you have vacuum in the intake on atmo road cars. Race V8s with short velocity stacks/ram tubes and multiple throttles, combined with big ports/valves and huge-duration camshafts, can get up to 110% VE or more, but that's hardly relevant to road-use engines (shocking economy, undriveable below 3000-odd rpm, terrible emissions).
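To put rough numbers on it, here's a quick sketch of what VE does to the air actually trapped each intake stroke. The 5.0 L / 8-cylinder split and the VE values are illustrative assumptions only, not measured figures:

```python
# How volumetric efficiency (VE) changes the air actually trapped per
# intake stroke. Displacement and VE values are illustrative assumptions.

AIR_DENSITY = 1.204      # kg/m^3, dry air at ~20 degC and 101.325 kPa
DISPLACEMENT_L = 5.0     # total swept volume
CYLINDERS = 8

swept_per_cyl_m3 = (DISPLACEMENT_L / CYLINDERS) / 1000.0  # litres -> m^3

for ve in (1.00, 0.90, 0.80):
    air_mass_g = swept_per_cyl_m3 * ve * AIR_DENSITY * 1000.0
    print(f"VE {ve:.0%}: ~{air_mass_g:.2f} g of air per cylinder per intake stroke")

# VE 100%: ~0.75 g, VE 90%: ~0.68 g, VE 80%: ~0.60 g -- i.e. the
# "one full cylinder of air per stroke" assumption overstates the airflow
# of a typical atmo road engine by 10-25%.
```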
Also, and most importantly, the whole point of a turbocharger is to harness exhaust heat energy that would otherwise be wasted and convert it back into extra charge density for the intake air, raising the efficiency of the system again. Of course you are right that adding an extra atmosphere of pressure won't give you twice the charge density, because compressing the air also heats it, and that heat costs you density. But with good intercooling you retain at least 90% of the theoretical charge density, or even 95% in the best examples.
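A back-of-envelope ideal-gas calculation shows where those 90-95% figures come from. The 70% compressor efficiency and the intercooler effectiveness values below are assumed for illustration; real numbers depend on the hardware:

```python
# Ideal-gas sketch of charge density at ~1 atm of boost. Compressor
# efficiency and intercooler effectiveness are assumed, not measured.

GAMMA = 1.4              # ratio of specific heats for air
T_AMBIENT_K = 293.0      # ~20 degC
P_AMBIENT_KPA = 101.325
BOOST_KPA = 101.325      # one extra atmosphere (~14.7 psi)

pr = (P_AMBIENT_KPA + BOOST_KPA) / P_AMBIENT_KPA           # pressure ratio = 2.0
t_ideal_rise = T_AMBIENT_K * (pr ** ((GAMMA - 1) / GAMMA) - 1)
t_out_k = T_AMBIENT_K + t_ideal_rise / 0.70                # 70% efficient compressor

for effectiveness in (0.0, 0.7, 0.9):                      # no cooler, decent, very good
    t_manifold_k = t_out_k - effectiveness * (t_out_k - T_AMBIENT_K)
    density_ratio = pr * (T_AMBIENT_K / t_manifold_k)      # rho ~ P/T (ideal gas)
    print(f"intercooler effectiveness {effectiveness:.0%}: "
          f"charge density ~{density_ratio:.2f}x ambient "
          f"({density_ratio / pr:.0%} of the theoretical 2x)")

# No intercooler: ~1.5x ambient (about 76% of the theoretical doubling).
# A decent intercooler: ~1.8x (about 91%); a very good one: ~1.9x (about 97%).
```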
And the final factor is the ability of a turbocharged engine (especially one with a larger-than-standard turbo) to hold on to more torque at high revs, thanks to much, much better effective volumetric efficiency, and make more power that way.
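Since power is just torque times engine speed, hanging on to torque at higher rpm is worth a lot of top-end power. The torque and rpm figures in this sketch are made up purely to show the arithmetic:

```python
# Power = torque x angular speed; holding torque to higher rpm is what
# makes top-end power. All numbers here are hypothetical.
import math

def power_kw(torque_nm: float, rpm: float) -> float:
    """P = T * omega, with omega converted from rpm to rad/s."""
    return torque_nm * rpm * 2 * math.pi / 60_000

print(f"400 Nm @ 4500 rpm -> {power_kw(400, 4500):.0f} kW")   # ~188 kW
print(f"320 Nm @ 6000 rpm -> {power_kw(320, 6000):.0f} kW")   # ~201 kW (torque tails off)
print(f"400 Nm @ 6000 rpm -> {power_kw(400, 6000):.0f} kW")   # ~251 kW (torque held to redline)
```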
As you would know, there are plenty of Skylines that make heaps more power on pump fuel than 5L Commodore OHV motors, with nowhere near 33 psi.