given that the impedance of the load is the same.
Well, there's your problem then.
A better assumption would be that the power dissipated in the load stays the same. Since the voltage has increased 10x and the current has dropped to 1/10th, the impedance of the load has to rise by a factor of 100, the square of the turns ratio.
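If it helps to see the arithmetic, here's a quick Python sketch of the ideal-transformer impedance relation (the function name is mine, not anything standard):

```python
# Sketch: how an ideal transformer reflects a secondary-side load to the primary.
def reflected_impedance(z_secondary, turns_ratio):
    """Impedance on the secondary as seen from the primary.
    turns_ratio = N_secondary / N_primary (10 for a 1:10 step-up)."""
    return z_secondary / turns_ratio**2

# Constant power: 10x the voltage at 1/10th the current means 100x the impedance.
v1, i1 = 12.0, 12.0          # 12 V at 12 A: a 1 ohm load
v2, i2 = v1 * 10, i1 / 10    # 120 V at 1.2 A: a 100 ohm load
print(v1 / i1, v2 / i2)      # 1.0 100.0
```

Reflect the 100 ohm load back through the 1:10 transformer and you get 1 ohm at the primary, which is why the supply can't tell the difference.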
If you load a step-up transformer with the load you suggest, that's as good as applying a short circuit to it, and an ideal transformer in that situation would draw 100x as much current. A fuse will blow, or something will smoke.
To put some figures on it, using the numbers from your question: say you have 12 V AC with a 1 Ω load. That draws 12 A of current and dissipates 144 watts, to heat your tropical fishtank perhaps.
You still want to heat the tank, but the only heater you have available is rated for 144 watts at 120 V (you have misplaced the 12 V one in your recent move). It will draw 1.2 A at 120 V, meaning its resistance is 100 Ω. If you connect that directly to 12 V, it will draw only 0.12 A, generating a feeble 1.44 watts. So you get a 1:10 step-up transformer and transform your 12 V AC supply up to 120 V to power the high voltage heater.
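The heater numbers above check out; here's the same arithmetic in a few lines of Python (the variable names are just for this sketch):

```python
# The hypothetical 144 W / 120 V heater from the example.
p_rated, v_rated = 144.0, 120.0
i_rated = p_rated / v_rated       # 1.2 A at rated voltage
r_heater = v_rated / i_rated      # 100 ohm element

# The same heater connected straight to the 12 V supply:
i_direct = 12.0 / r_heater        # only 0.12 A
p_direct = 12.0**2 / r_heater     # a mere 1.44 W
print(r_heater, i_direct, p_direct)
```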
Now you suddenly find the 12 V heater again, and connect it to the 120 V output of your transformer.
With an ideal transformer, the heater's 1 Ω load would draw 120 A, and it would generate 14400 watts. Obviously, not for long. Which would die first? The power supply, the transformer, the heater? Hopefully, the fuse protecting the circuit would do its job and open.
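For the fault case, the ideal-transformer numbers look like this (again just a sketch, with the 1:10 ratio from the example):

```python
# The 1 ohm heater wrongly connected across the 120 V secondary.
v_sec, r_load = 120.0, 1.0
i_sec = v_sec / r_load        # 120 A in the secondary
p_fault = v_sec * i_sec       # 14400 W, briefly
i_pri = i_sec * 10            # 1200 A demanded from the 12 V supply,
                              # 100x the 12 A the circuit was designed for
print(i_sec, p_fault, i_pri)
```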
With real components, your fishtank heater transformer might only be rated for a few hundred watts, with losses at that rating of a few watts, so it stays cool in normal use. Its internal I²R heating rises as the square of the current, so it would overheat very fast if you connected it to a 1 Ω load, while the output voltage sagged to near zero across the transformer's internal resistance.