Your informant is correct, but with caveats.
The statement is that a power supply able to provide 3W of power can supply any load that demands 3W or less. This is absolutely true.
The caveat is that this power source has conditions attached. One of those conditions is that the voltage used to deliver the power is 12V. The other condition is that you can draw up to, but not exceeding, 250mA.
The question, then, isn't whether the power supply can provide the power (it can), it's whether the load can use that power given those conditions.
Take for example a load which is a 3W, 6V incandescent lamp, which (at 6V) would draw a current:
$$I=\frac{P}{V}=\frac{3W}{6V}=0.5A$$
The lamp requires 3W, so at first glance it seems that the power supply is suitable. Indeed, the supply can provide all the power necessary to operate this lamp, as your informant suggests. However, the question is not whether the supply can provide the power, it's whether the load can use that power at 12V. The answer is no, it can't. For starters, 0.5A is more current than the supply can provide. On top of that, given 12V, that lamp would try to draw way above 0.5A!
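The arithmetic above can be sketched in a few lines of Python. This is a simplification which treats the filament as a fixed resistance; a real incandescent filament has a much lower resistance when cold, so the actual current at 12V would differ, but the conclusion (far too much current) is the same:

```python
# Current demands of a 3W, 6V lamp, modelled (for simplicity) as a
# fixed resistance equal to its hot resistance at rated voltage.
P_RATED = 3.0         # W, lamp power rating
V_RATED = 6.0         # V, lamp voltage rating
V_SUPPLY = 12.0       # V, the supply's fixed output
I_SUPPLY_MAX = 0.250  # A, the supply's current limit

i_rated = P_RATED / V_RATED       # current at rated 6V: 0.5A
r_lamp = V_RATED ** 2 / P_RATED   # effective resistance: 12 ohms
i_at_12v = V_SUPPLY / r_lamp      # if connected directly to 12V: 1.0A

print(f"Current at 6V:  {i_rated:.2f} A (supply limit: {I_SUPPLY_MAX} A)")
print(f"Current at 12V: {i_at_12v:.2f} A (even worse)")
```

Both figures exceed the supply's 250mA limit, which is the whole problem.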
You can overcome this with a DC-to-DC converter. An ideal DC-DC converter is a device that can take any voltage as an input, and produce a different voltage output, with 100% power efficiency. In other words, a DC-DC converter can take power from a supply with a voltage condition attached (a fixed 12V output, in this case), and provide a different voltage source. However, since the converter is 100% efficient, it transforms not only the source voltage but also the current.
In our example of a 12V power supply, limited to 250mA (for a maximum output power of 3W) and a lamp requiring 6V at 500mA (also 3W), you could use a "buck converter", otherwise known as a "step-down DC-DC converter", in the following configuration:

(Schematic: the 12V, 250mA supply feeding a buck converter, which powers the 6V, 500mA lamp. Created using CircuitLab.)
Here you can see that the power supply is providing 250mA, which it is quite capable of, but the lamp is drawing twice that, 500mA! Crucially, the power being provided by the power supply, and power being delivered to the lamp are both 3W:
$$
\begin{aligned}
P_{SOURCE} &= V_{SOURCE} \times I_{SOURCE} = 12V \times 250mA = 3W \\
P_{LAMP} &= V_{LAMP} \times I_{LAMP} = 6V \times 500mA = 3W
\end{aligned}
$$
We can conclude that this power supply is able to operate the lamp, but we will have to use an intermediary to bridge the different conditions of supply and demand.
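That power balance is easy to verify numerically. The sketch below assumes the same ideal (100% efficient) converter as above: input power equals output power, so the available output current follows directly:

```python
# Power balance across an ideal (100% efficient) buck converter.
V_IN, I_IN = 12.0, 0.250   # supply side: 12V at the 250mA limit
V_OUT = 6.0                # converter output voltage

p_in = V_IN * I_IN         # power drawn from the supply: 3.0W
i_out = p_in / V_OUT       # current available at 6V: 0.5A

print(f"P_in = {p_in} W, so I_out = {i_out} A at {V_OUT} V")
```

The converter hands the lamp twice the supply's current, at half the supply's voltage; the power on each side is identical.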
If you are wondering how it's possible to get more current than 250mA, from a source limited to 250mA, I remind you that the only physical law that needs to be obeyed here is the law of conservation of energy. There's no law called "conservation of current". The means of achieving this current-boost and voltage-buck behaviour are complex, and I encourage you to study "DC-DC converters" and their counterparts "linear regulators", to understand this better.
In this example I have assumed a 100% efficient buck converter, which is of course impossible. Nothing is 100% efficient in real life, but the principles here should be clearer now.
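To see how a real converter changes the picture, here is the same calculation with an assumed (hypothetical) 90% efficiency. The supply must now provide more power than the lamp consumes, and the required input current creeps past the 250mA limit:

```python
# Same buck-converter sizing, with an assumed 90% efficiency.
ETA = 0.90       # assumed converter efficiency (not from a datasheet)
P_LOAD = 3.0     # W, lamp demand
V_IN = 12.0      # V, supply voltage
I_IN_MAX = 0.250 # A, supply current limit

p_in_required = P_LOAD / ETA          # ~3.33W must come from the supply
i_in_required = p_in_required / V_IN  # ~0.278A, over the 250mA limit

print(f"Required input current: {i_in_required * 1000:.0f} mA "
      f"(limit: {I_IN_MAX * 1000:.0f} mA)")
```

In other words, with realistic losses this particular 3W supply falls just short of running a full 3W load through a converter; you would need either a more efficient converter or a supply with some headroom.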
Edit, for clarification: When I say there's no law of "conservation of current", I am immediately reminded of Kirchhoff's Current Law (KCL), which seems to embody the very principle whose existence I deny. However, the operation of a DC-DC converter depends on either magnetic coupling or the switching of a capacitor's effective location, both of which involve discontinuities in the current path. So, while KCL is true, it's not appropriate to apply it to the currents in and out of a DC-DC converter or charge pump.