Here's an idea that might make both groups happy, and fit in well with moving towards an Agile approach:
Automate your user acceptance checks, and screencast them.
http://pragprog.com/magazines/2009-12/automating-screencasts
It sounds like part of the problem you're having is that the test plans you're writing are very repetitive and purely confirmatory. To be honest, I wouldn't call what you're writing testing at all - if it's just confirming the requirements, it's checking.

Automating this and screencasting it lets you package up a neat demo for your customers regularly (you could even send them a short one daily). They'll be more likely to click on a demo and watch it than to open a test plan and start working through it, so you should get faster feedback (very important if you're moving towards a more Agile approach). You'll also be able to re-use components, which reduces your workload - and developers usually enjoy writing code a lot more than writing documents.
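The automation itself doesn't need to be elaborate to start with. Here's a minimal sketch of a confirmatory check written as reusable code - `RegistrationForm` and its validation rules are hypothetical stand-ins for your real app; in practice you'd drive the real UI with a browser-automation tool and record the run as your screencast:

```python
# Hedged sketch: a confirmatory acceptance check as reusable code.
# RegistrationForm is a hypothetical in-memory stand-in for the real page;
# with a UI-automation tool you'd drive the actual form and record the run.

class RegistrationForm:
    """Stand-in for the real registration page."""
    def __init__(self):
        self.errors = []

    def submit(self, name, email):
        # Hypothetical validation rules for illustration only.
        self.errors = []
        if not name.strip():
            self.errors.append("Name is required")
        if "@" not in email:
            self.errors.append("Email address looks invalid")
        return not self.errors

def check_valid_registration():
    """Confirmatory check: the happy path the requirements describe."""
    form = RegistrationForm()
    assert form.submit("Ada Lovelace", "ada@example.com"), form.errors

def check_missing_name_is_rejected():
    """Confirmatory check: the error path the requirements describe."""
    form = RegistrationForm()
    assert not form.submit("", "ada@example.com")
    assert "Name is required" in form.errors

if __name__ == "__main__":
    check_valid_registration()
    check_missing_name_is_rejected()
    print("All confirmatory checks passed")
```

Each check is small and reusable, so adding the next screen's checks is cheap - which is where the workload saving comes from.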
It also provides a way of actually executing the requirements - have you come across Gojko Adzic's executable specifications? Take a look here:
http://gojko.net/2010/08/04/lets-change-the-tune/
If you're thinking of this as a way to get the requirements into an executable form to demo to your customers, then it suddenly seems a lot less pointless.
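To give a flavour of the executable-specifications idea: normally you'd use a tool like FitNesse or Cucumber, but even plain code can make the examples *be* the requirement. Everything below (`is_valid_postcode`, the postcode rule itself) is a hypothetical illustration, not your actual requirements:

```python
# Hedged sketch of "requirements as executable examples", in the spirit of
# Adzic's executable specifications. The postcode rule is hypothetical.

import re

def is_valid_postcode(postcode):
    """Hypothetical requirement: UK-style postcode, e.g. 'SW1A 1AA'."""
    return re.fullmatch(r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}", postcode) is not None

# These examples ARE the specification - customers can read and review them,
# and the build fails if the app and the examples ever disagree.
EXAMPLES = [
    ("SW1A 1AA", True),   # typical postcode
    ("sw1a 1aa", False),  # requirement says uppercase only
    ("12345",    False),  # US-style ZIP is out of scope
]

for postcode, expected in EXAMPLES:
    assert is_valid_postcode(postcode) == expected, postcode
```

The table of examples is the part customers actually read; the code around it just keeps the table honest.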
Now, putting my tester hat on, I'm honour-bound to point out that if the screencast approach takes off, it will free you and your stakeholders up to do some proper testing - trying edge cases and tests that actually challenge the app, rather than just confirming requirements. I'd suggest you send the screencasts along with short questions or suggestions for areas you'd like more feedback on, for example:
1) Here's our new registration form - watch this screencast to see how it works!

What we'd like feedback on: We've added a lot of extra checking on this form to make sure customers aren't able to enter the wrong data - we'd really like you to look at the error messages customers get when they put in the wrong thing and tell us whether our customers will find them easy to understand.

We'd also like to know whether we've been too strict in some cases - if you've got any particularly unusual customer data (maybe a really long name, or a really short one, or someone with unusual characters in their name, or an address that doesn't have a street name, or something else we didn't think of), then perhaps you could spend a few minutes trying those out?
In other words: you present a nice screencast, then ask for feedback, framing the questions without being too specific - you want them thinking about potential issues, not just clicking blindly through a test plan. You're basically writing an exploratory test charter for them. (If you look at the Agile Testing Quadrants, these would be tests in Quadrant 3.)
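And the two halves feed each other: once the "unusual data" feedback comes back, those edge cases can be folded into the automated checks so they're confirmed forever after. A minimal sketch - `validate_name` and its rules are hypothetical stand-ins for whatever your form actually does:

```python
# Hedged sketch: edge cases from the charter above, captured as boundary
# checks once feedback arrives. validate_name is a hypothetical rule.

def validate_name(name, max_len=70):
    """Hypothetical rule: non-empty after trimming, within a length limit."""
    return 0 < len(name.strip()) <= max_len

EDGE_CASES = [
    ("A", True),                  # a really short name
    ("X" * 200, False),           # a really long name
    ("Zoë O'Brien-Núñez", True),  # unusual characters in the name
    ("   ", False),               # whitespace only
]

for name, expected in EDGE_CASES:
    assert validate_name(name) == expected, repr(name)
```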