I'm guessing that the server's total upload per second exceeds what your users' connections can handle... at which point packets fill the outgoing queue faster than they can be sent, so you get huge queuing delays and eventually severe packet loss -- which you're detecting as massive ping times.
You need to tell your users how many players they can support on their upload bandwidth, and/or optimize your game to use less bandwidth.
Most residential DSL-type plans will have ~10x more download bandwidth than upload bandwidth -- e.g. 20Mb/s down and 1Mb/s up, or 5Mb/s down and 0.25Mb/s up.
This kind of connection is fine when you're acting as a client, but is not ideal for a server.
1Mb/s is roughly 122KiB/s (1,000,000 bits ÷ 8 ÷ 1024). So if a user only has a connection with 1Mb/s of upload bandwidth, and you want them to be able to host 30 players, then you need to design your game so that each client only requires the server to send it <4KiB/s of data (122 ÷ 30 ≈ 4).
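To make that arithmetic concrete, here's a quick back-of-the-envelope calculation -- assuming the example 1Mb/s plan and 30 players from above (both are just the illustrative figures, not fixed requirements):

```cpp
#include <cstdio>

int main() {
    const double upload_bits_per_sec = 1000000.0;  // assumed 1Mb/s upload plan
    const double upload_kib_per_sec  = upload_bits_per_sec / 8.0 / 1024.0;
    const int    players             = 30;         // assumed player count

    std::printf("total upload: %.1f KiB/s\n", upload_kib_per_sec);            // ~122.1
    std::printf("per player:   %.2f KiB/s\n", upload_kib_per_sec / players);  // ~4.07
    return 0;
}
```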
Most games that are designed for residential user hosting simply don't support 30 players...
Diagnose the problem:
* Ask your users what speeds they're promised from their ISPs.
* Ask them to use a site like http://www.speedtest.net/ to test their actual upload speeds.
* Add code to your game to measure the amount of data that you're sending in each direction per second (see the sketch after this list).
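For that last point, a minimal sketch of the measuring code might look something like this -- the class and method names here are hypothetical, not from any particular engine; hook the counters in wherever packets cross your socket layer:

```cpp
#include <atomic>
#include <cstdio>

// Hypothetical byte counters. Call OnSent()/OnReceived() from wherever your
// game hands packets to the socket, then sample once per second.
class NetStats {
public:
    void OnSent(std::size_t bytes)     { sent_ += bytes; }
    void OnReceived(std::size_t bytes) { received_ += bytes; }

    // Logs throughput for the last second and resets the counters.
    // Call this once per second, e.g. from the main loop.
    void SamplePerSecond() {
        std::printf("out: %.2f KiB/s  in: %.2f KiB/s\n",
                    sent_.exchange(0) / 1024.0,
                    received_.exchange(0) / 1024.0);
    }

private:
    std::atomic<std::size_t> sent_{0};
    std::atomic<std::size_t> received_{0};
};

int main() {
    NetStats stats;
    stats.OnSent(3100);      // e.g. one snapshot sent to each of a few clients
    stats.OnReceived(800);   // e.g. input packets received from clients
    stats.SamplePerSecond(); // prints: out: 3.03 KiB/s  in: 0.78 KiB/s
    return 0;
}
```

Tagging the counters per message type (snapshots, chat, voice, etc.) instead of one global total makes the "which system is eating the bandwidth" question below much easier to answer.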
You should be able to come up with some guidelines for hosting -- e.g. the server requires 3KiB/s of upload bandwidth per player.
Also, once you've added this measuring code and know how much network traffic you're generating, you might find that certain systems are using a disproportionate amount of data -- which gives you some good targets for optimisation work...
What kind of network synchronisation model are you using for your FPS? Have you based it on another FPS game's model, like Unreal, Half-Life or Quake 3? How often do you send out updates? Can you tweak this, so that people with worse connections update their clients at 15Hz, while better servers can update at 30Hz? (A sketch of that idea is below.)
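A minimal sketch of a per-client configurable update rate, assuming an accumulator-driven server loop -- all names here are illustrative, not from any engine:

```cpp
// A client on a weak link could be given snapshot_hz = 15, a good one 30.
struct ClientConnection {
    double snapshot_hz   = 30.0; // negotiated at connect time, or adapted at runtime
    double accumulator_s = 0.0;  // seconds since the last snapshot went out
};

// Call once per server frame with the frame's delta time. Returns true
// when it's time to send this client a fresh world snapshot.
bool ShouldSendSnapshot(ClientConnection& c, double dt_s) {
    c.accumulator_s += dt_s;
    const double interval_s = 1.0 / c.snapshot_hz;
    if (c.accumulator_s >= interval_s) {
        c.accumulator_s -= interval_s; // keep the remainder so the rate stays accurate
        return true;
    }
    return false;
}
```

Halving the snapshot rate roughly halves the per-client upload cost, so on the 1Mb/s example above, dropping from 30Hz to 15Hz could roughly double the player count the host can support (at the cost of choppier remote players, which client-side interpolation can partly hide).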