You mentioned that you're using a laser and photodiode. Fiber optic cables will have essentially zero propagation delay jitter, so an optical delay line is a very good solution. A general rule of thumb for typical networking-grade optical fibers is a propagation delay of 5ns/m, or 200m/µs. Based on this, for 30µs you'd need around 6000m of optical fiber.
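If you want a more precise figure than the rule of thumb, the length follows directly from the fiber's group index, which you can pull from its datasheet. A minimal sketch of the calculation, assuming a group index of ~1.468 (a typical value for standard singlemode fiber; yours will differ slightly):

```python
# Fiber length needed for a given delay. The group index is an assumed
# typical value for standard singlemode fiber - check your datasheet.
C = 299_792_458        # speed of light in vacuum, m/s
GROUP_INDEX = 1.468    # typical group index for singlemode fiber

def fiber_length_for_delay(delay_s: float) -> float:
    """Length of fiber (metres) that produces a given delay (seconds)."""
    velocity_m_per_s = C / GROUP_INDEX   # ~2.04e8 m/s, i.e. ~4.9ns/m
    return delay_s * velocity_m_per_s

print(f"{fiber_length_for_delay(30e-6):.0f} m")   # ~6127 m for 30µs
```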
You could buy a 10km reel and get it cut and terminated, but it's easier to just buy singlemode fiber in 5km and 1km spools and couple them together. With a decent coupler you should see very little additional insertion loss. The reels are not physically large.
You can buy these fiber reels for far less than the cost of the test equipment you'd typically need to achieve that jitter spec. If you went with cheaper brands you could probably pull this off for just a few hundred USD, although you might want higher grade fiber to minimise losses if your source laser isn't particularly powerful. Even then, I wouldn't expect the cost of fiber to exceed 1000USD.
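If you're worried about whether your laser has enough power budget, a rough loss estimate is easy to do. The figures below are typical datasheet values, not measurements (roughly 0.35dB/km for singlemode at 1310nm and ~0.5dB for a mated connector pair; substitute your own fiber and coupler specs):

```python
# Rough loss budget for the 5km + 1km spool approach. All figures are
# assumed typical values - check the datasheets for your fiber/coupler.
FIBER_LOSS_DB_PER_KM = 0.35   # typical singlemode attenuation at 1310nm
COUPLER_LOSS_DB = 0.5         # one mated connector pair, pessimistic

spans_km = [5.0, 1.0]
fiber_loss_db = FIBER_LOSS_DB_PER_KM * sum(spans_km)
coupler_loss_db = COUPLER_LOSS_DB * (len(spans_km) - 1)

print(f"Estimated end-to-end loss: {fiber_loss_db + coupler_loss_db:.2f} dB")  # 2.60 dB
```

At 1550nm the attenuation is typically even lower (around 0.2dB/km), so a couple of dB of loss is all you'd expect across the whole run.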
If the delay needs to be fairly exact, rather than just roughly 30µs, you might want to buy a 10km reel, measure its total propagation delay, then calculate the required fiber length and get it cut and terminated to match. Given that 1cm of fiber represents approximately 50 picoseconds of delay, if you give yourself some margin and take a few passes at cutting and re-terminating the fiber, you should be able to dial in that 30µs delay to within a couple hundred picoseconds by hand.
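The trim calculation itself is simple; here's a sketch with an illustrative measured delay (the 49.07µs figure is made up - substitute your own measurement):

```python
# Work out the cut length from a measured reel delay. The measured
# delay below is illustrative - use your own measurement.
TARGET_DELAY_S = 30e-6

measured_delay_s = 49.07e-6   # hypothetical measured delay of the reel
reel_length_m = 10_000.0      # nominal reel length

delay_per_m_s = measured_delay_s / reel_length_m   # ~4.9ns/m
required_length_m = TARGET_DELAY_S / delay_per_m_s

print(f"Cut to {required_length_m:.2f} m")                     # ~6113.71 m
print(f"1cm of fiber = {delay_per_m_s * 0.01 * 1e12:.1f} ps")  # ~49.1 ps
```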
You can also buy off-the-shelf fiber delay lines for this exact purpose. They're used by networking equipment manufacturers to test long distance fiber runs in the lab. Far more expensive than just buying a reel and DIY'ing it, but they do come in a nice rackmount chassis and you might be able to find something inexpensive second hand.
If your laser isn't powerful enough to propagate through this much fiber, you could look at hooking your drive pulse up to a network transceiver module. The interface is differential and fairly simple to use. On paper, the worst-case delay jitter in such a setup is the sum of the maximum transmitter jitter (typically 0.35UI or less) and the maximum receiver jitter (again typically 0.35UI or less); I originally also counted one full unit interval (UI) at the transmitter, on the assumption that a pulse arriving just after the start of an interval would have to wait for the next one, but that was wrong - see the note below. For a 10GBASE-LR transceiver (capable of 10Gbps over 10km) a unit interval is 100ps, so based on the specs you would expect to see a worst case of 70ps of jitter. However, I've never actually tried to measure the real-world jitter of a transmitter and receiver pair, so you'd have to try this out yourself. Lucky for you, 10GBASE-LR transceivers are ubiquitous and cheap on the second hand market, so it's not an expensive experiment to try.
note: I got the jitter wrong initially (thanks to @dotwaffle on Twitter for pointing this out) - 10G modules are asynchronous and the SERDES is on the linecard, not the module, so the total jitter is simply the sum of the two modules' jitter numbers. It's also worth noting that clock & data recovery and dispersion compensation are also on the linecard with SFP+, which might complicate things a little; you'll need to experiment with the modules to see if you can use them as a transparent interface.
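Putting the corrected numbers together, the jitter arithmetic looks like this (the 0.35UI figures are typical datasheet limits, assumed here - check the specs of the modules you actually buy):

```python
# Worst-case jitter for a transceiver pair, per the corrected reasoning
# above: with the SERDES on the linecard, total jitter is just the sum
# of the TX and RX modules' jitter specs. 0.35UI is an assumed typical
# datasheet limit.
LINE_RATE_BPS = 10e9    # nominal; the actual 10GBASE-R line rate is
                        # 10.3125Gbd after 64b/66b coding (UI ~97ps)
TX_JITTER_UI = 0.35
RX_JITTER_UI = 0.35

ui_s = 1.0 / LINE_RATE_BPS
worst_case_s = (TX_JITTER_UI + RX_JITTER_UI) * ui_s

print(f"UI = {ui_s * 1e12:.0f} ps")                         # 100 ps
print(f"Worst-case jitter = {worst_case_s * 1e12:.0f} ps")  # 70 ps
```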