Assuming the machine is otherwise correctly provisioned and only this one component changes, there is a clear winner:
3- Super Memory: 64 terabytes of main memory, with access timing similar to that of high-end DDR4.
Since the machine is otherwise competent, my first use would be as a RAM disk. Nearly all serious computing ends up touching main memory constantly, so everything I do would benefit from the higher speed with no additional effort. The gains are immediate and accessible.
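As a small sketch of the RAM-disk idea: on most Linux systems `/dev/shm` is a tmpfs mount, meaning files written there live in main memory rather than on disk. The directory name and fallback below are assumptions for illustration, not part of the original point.

```python
import os
import tempfile

def fast_scratch_dir() -> str:
    """Return a RAM-backed scratch directory when one is available.

    /dev/shm is tmpfs (memory-backed) on typical Linux installs; the
    fallback to the ordinary temp directory keeps the sketch portable.
    """
    ram_dir = "/dev/shm"
    if os.path.isdir(ram_dir) and os.access(ram_dir, os.W_OK):
        return ram_dir
    return tempfile.gettempdir()

# Write and discard a 1 MiB scratch file in the fastest available location.
scratch = fast_scratch_dir()
path = os.path.join(scratch, "ramdisk_demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 1024 * 1024)
size = os.path.getsize(path)
os.remove(path)
```

With 64 TB of main memory, essentially every working set, cache, and scratch file could live in such a directory for free.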
The other items have some problems:
Super CPU doesn't help much within these constraints. Since no other hardware can change, it would be quite difficult to keep the CPU fed with today's supporting components. Unless you spend your time on raw number crunching (perhaps Bitcoin mining in the background), it is hard to supply a steady stream of instructions and data to a modern processor. That is why the Hyper-Threading model works so well for so many applications: bolting on a second instruction decoder (rather than additional execution horsepower) often yields a noticeable throughput gain precisely because the execution units would otherwise sit idle waiting for work. The actual number-crunching is rarely the bottleneck. Even the enormous number-crunching supercomputers are often constrained more by memory speeds and data interconnects than by CPU horsepower.
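Some back-of-envelope arithmetic makes the feeding problem concrete. Assume (purely for illustration) a Super CPU sustaining 1 TFLOP/s on a streaming triad kernel, `a[i] = b[i] + s*c[i]`, which moves 24 bytes of 8-byte doubles for every 2 floating-point operations, and assume roughly 40 GB/s for dual-channel DDR4:

```python
# Illustrative numbers only -- how much bandwidth would today's memory
# need in order to keep a very fast CPU busy on a streaming workload?

FLOPS = 1.0e12           # hypothetical sustained 1 TFLOP/s
BYTES_PER_FLOP = 12.0    # triad: 24 bytes moved per 2 flops

required_bw = FLOPS * BYTES_PER_FLOP   # bytes/s needed to feed the CPU
ddr4_bw = 40.0e9                       # ~dual-channel DDR4, bytes/s

shortfall = required_bw / ddr4_bw
print(f"needs {required_bw / 1e12:.0f} TB/s, memory delivers "
      f"~{ddr4_bw / 1e9:.0f} GB/s -> {shortfall:.0f}x short")
```

Under those assumptions the memory system is a few hundred times too slow, so the Super CPU mostly waits.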
Super GPU provides additional eye candy, of course, and GPGPU programming can help with your Bitcoin mining efforts. But you still need to develop for it, and that takes work. Having recently bought a new machine with a nice new graphics card (GeForce GTX 970), the graphics in Dragon Age are beautiful. But since you hypothetically cranked it up to "about 100 petaflops", you've really created exactly the same problem as with the Super CPU: how do you intend to provide enough data and instructions to keep the beast busy? Sure, you can have a rock-solid rendering framerate, but with the other components stuck at today's level it would be difficult to leverage. Maybe I'd take it if I were a digital currency miner, but not for much else.
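The same feeding arithmetic applies here. Assuming (as an illustration) that the host connection stays at roughly PCIe 3.0 x16 speeds, about 16 GB/s, the GPU would need an absurd amount of arithmetic per transferred byte just to stay busy:

```python
# How many floating-point operations must the hypothetical 100-petaflop
# GPU perform per byte it receives over today's bus? Assumed numbers.

gpu_flops = 100e15    # "about 100 petaflops", from the question
pcie_bw = 16e9        # ~PCIe 3.0 x16 host link, bytes/s (assumption)

flops_per_byte = gpu_flops / pcie_bw
print(f"{flops_per_byte:.2e} flops per transferred byte required")
```

Millions of operations per byte is far beyond the arithmetic intensity of any real rendering or GPGPU workload, so the card would idle on today's bus.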
Super Disk would be fun but not particularly practical. For me personally, with a photography hobby, I keep multiple copies of everything; with a decade of raw files I'm approaching a terabyte of that data. One contract I'm working with is a moderately sized data warehouse with around 11 TB of data records. But your fictional 25 petabytes, or 25 thousand terabytes, is far more space than most people could practically fill. And for storage problems that would actually be cool to solve, like storing a solved game of chess, a 25 petabyte collection falls short by a double-digit number of orders of magnitude. Some commercial uses could benefit: the Internet Archive just passed 15 PB of data, and all of Facebook combined is estimated at 300 PB. Government agencies would love your 25 PB drives, since that is reportedly about the total Google transmits globally in a day. So a few data-centric businesses would love it, but it doesn't give me much individually.
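To put the chess claim in numbers: taking the commonly cited rough estimate of about 5 × 10^44 legal chess positions (an order-of-magnitude assumption, not a precise figure), even storing a single bit per position dwarfs 25 PB:

```python
import math

bytes_25pb = 25e15              # 25 PB in bytes (decimal petabytes)
bits_25pb = bytes_25pb * 8      # one bit per position is the floor

chess_positions = 4.8e44        # rough estimate of legal positions
                                # (order-of-magnitude assumption)

gap = math.log10(chess_positions / bits_25pb)
print(f"~{gap:.0f} orders of magnitude short at one bit per position")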
Super Internet would be interesting. Not so much useful for anything I work with, but interesting because of the "10ms stable round trip latency at all times". The financial stock-trading world would kill for this one. Currently it takes slightly over 14ms for the New York to Chicago connection and a surprisingly huge number of financial businesses use that. The speed of light for that distance is about 13.3ms. If you could guarantee a round trip of 10ms to all the financial hubs of the world not only would physicists want to have a chat but you could be a very rich individual. For most software it would be convenient but not really transformative. Games get a better connection, but some players already have great connections, especially for LAN play. For common consumers it might mean videos load a little faster, but since all the rest of the global infrastructure would remain the same it would quickly max out connections and you'd have bottlenecks of memory and CPU at every processing hub.
So if I had to pick one only one of them to help improve my computer experience and to help humanity generally, I'd have to pick the Super Memory.