
Worldwide ransomware cyber attack

Started by
34 comments, last by samoth 7 years, 1 month ago

it is difficult (if not impossible) to write software that has been bug-tested against every use-case scenario, considering how fast third-party software can evolve

The main problem with that statement is that we're not talking about 3rd party software, but about operating system components, in this case a network protocol. That should have been in a better state.

Secondly, if we can't develop bug free software at the rate we're developing it, then we're developing too fast. Time we focused the industry on respecting customers and delivering them working, safe software, instead of throwing out features as fast as we can to grab the market and blaming users when they don't keep on top of our flood of patches later.

It is quite possible to write bug-free, safe software. We just choose not to because it costs more, takes longer, and often runs slower.


it's still foolish for any government to rely on _closed-source_ software.

So, it is better for a government to go with an open source option, where foreign operatives have easy access to not only review the systems in use, but even have access to attempting to slip malicious code right into source?

The line of "But we can read and review stuff, so you can't just slip malicious code into an open source project!" is kind of blown out of the water by the existence of something like Shellshock, which only took how long for anyone to notice such a bug existed? - From a code review standpoint there isn't much of a difference between something that was overlooked in design, and something that was designed and coded in such a way as to allow future exploitation.

Getting an agent into a position somewhere like Microsoft isn't impossible, but it is still more difficult than getting a bunch of people 'being supportive and writing good code' into an open-source project.

Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

I think it's more about that if you have open source software, you can directly pay someone to address any known bug. That is probably easier than compelling a company like Microsoft to do so. There are obviously pros and cons of the inherent security of each type of software, but not being reliant on private corporations for fixes is useful.


It's funny that so many vulnerabilities are caused by programmers using string functions that take no buffer size limit (strcpy, strcat, sprintf, etc.).

So, it is better for a government to go with an open source option, where foreign operatives have easy access to not only review the systems in use, but even have access to attempting to slip malicious code right into source?

The line of "But we can read and review stuff, so you can't just slip malicious code into an open source project!" is kind of blown out of the water by the existence of something like Shellshock, which only took how long for anyone to notice such a bug existed? - From a code review standpoint there isn't much of a difference between something that was overlooked in design, and something that was designed and coded in such a way as to allow future exploitation.

Getting an agent into a position somewhere like Microsoft isn't impossible, but it is still more difficult than getting a bunch of people 'being supportive and writing good code' into an open-source project.

It's actually better than having a single US company, with an OS that is literally spyware, responsible for the information of your (non-US) government. That's what I meant by stupidity. One is an accident or an unspotted malicious patch; the other is intentional on the part of the developers.


...while an OS like Windows simply has weak security by default, for "friendliness". Replacing a dynamic library on Windows with a malicious version is pretty easy, and files and folders have a weak permission system.

You're going to need to show some evidence that you actually know what you're talking about (rather than, e.g., parroting Slashdot statements from 1998) if you're going to say this kind of thing.

Well, I'm not going to write a tutorial about it on a gamedev forum, and I'm parroting like it's 1998 because these are very well known problems: kernel configuration in a single flat database (registry), a bad permission system, and weak separation between user space and kernel space. What is so difficult about replacing a dynamic library or getting access to database files?

The permission system on Windows was designed with single-user access in mind and migrated to multi-user later, while Unix systems were designed for multi-user access from the start. On Windows, it's just *comfortable* to give a "normal" user admin permissions (a.k.a. the ability to modify anything in folders like /Program Files/ and /System32/) rather than switch to the root user (or run a command with root ~sudo~ permission) and be done with it.

The weak permission system makes it fairly easy to get access to the files where a database is stored (usually somewhere in /Program Files/) and encrypt them, or to replace a DLL with a malicious version - even a system DLL, since the kernel is composed of dynamic libraries - or to inject a new process into one of those DLLs. And it isn't so hard to encrypt when so many libraries are available; even a simple library like minizip with a sha256 algorithm can do the job very well.

Normal user doesn't have admin permissions? Choose one of the PowerPoint vulnerabilities to deliver your malware as an OLE object and gain admin access. And all those NHS databases wouldn't have been damaged if it weren't simply because those normal users had admin access by default.

It's actually better than having a single US company, with an OS that is literally spyware, responsible for the information of your (non-US) government. That's what I meant by stupidity. One is an accident or an unspotted malicious patch; the other is intentional on the part of the developers.

What complete and utter nonsense. Take your conspiracy-mongering garbage somewhere else. This was a system bug like any other, including Heartbleed.

...while an OS like Windows simply has weak security by default, for "friendliness". Replacing a dynamic library on Windows with a malicious version is pretty easy, and files and folders have a weak permission system.

Entirely false. System components are now checksummed, admin privileges are not assigned by default, and there aren't significant configuration problems. While these things CAN be circumvented, the circumvention approaches are equally effective on other operating systems. We live in a world where it's likely possible, at any given time, to attack a Linux server running in a VM, jump the Xen hypervisor, and take over the host. We have SEEN these bugs being sold in the wild.

kernel configuration in a single flat database (registry)

The registry does not work that way.

bad permission system

I said it already, but the permission system is a perfectly robust ACL-based design shared by many other systems. I'm more concerned that you might think the old owner/group/other octal permission system is a good thing, which would be a shockingly lax approach to security.

weak separation between user space and kernel space

In what way, exactly? You don't know, do you. Come back when you can explain why it's somehow less separated than Linux or OSX.

What is so difficult about replacing a dynamic library or getting access to database files?

Are you just making shit up now?

The permission system on Windows was designed with single-user access in mind and migrated to multi-user later

No, it wasn't. It assigned admin access to all users by default, which was bad. That's no longer the case, and the exploits we see in the wild are privilege escalations that exist in some form or another on all operating systems.

On Windows, it's just *comfortable* to give a "normal" user admin permissions (a.k.a. the ability to modify anything in folders like /Program Files/ and /System32/) rather than switch to the root user (or run a command with root ~sudo~ permission) and be done with it.

Not only is this wrong, it's also not how the exploits today work. Because it's wrong.

The weak permission system makes it fairly easy to get access to the files where a database is stored (usually somewhere in /Program Files/) and encrypt them

That is not happening, save for a few cases where program installers deliberately assign bad permissions to their own files. I've seen that all the time on Linux boxes.

since the kernel is composed of dynamic libraries

No, it's not. The kernel is one file and it loads dynamic drivers pretty much just like Linux loads drivers.

Normal user doesn't have admin permissions? Choose one of the PowerPoint vulnerabilities to deliver your malware as an OLE object and gain admin access.

Yeah, that's called a privilege escalation exploit. They happen to every OS. Yes, they're bad. No, they're not at all the same thing as users having admin permissions.

And all those NHS databases wouldn't have been damaged if it weren't simply because those normal users had admin access by default.

You don't know the first fucking thing about how NHS databases are configured. You don't know the first thing about how medical systems are configured. Frankly, a lot of the companies that put these systems together don't understand security in the first place and no OS could save them from their own idiocy. These are frequently people who would have a chmod -R 777 in their install script if it were a unix style platform.

Go back to Slashdot or whatever random hole you crawled out of to waste our time.
SlimDX | Ventspace Blog | Twitter | Diverse teams make better games. I am currently hiring capable C++ engine developers in Baltimore, MD.

It's actually better than having a single US company, with an OS that is literally spyware, responsible for the information of your (non-US) government. That's what I meant by stupidity. One is an accident or an unspotted malicious patch; the other is intentional on the part of the developers.

What complete and utter nonsense. Take your conspiracy-mongering garbage somewhere else. This was a system bug like any other, including Heartbleed.

Technically, this was an NSA cyberweapon that has fallen into the hands of criminals, so, not speaking for the rest of his post, these conspiracies aren't garbage any more.

There are about three levels of conspiracy you get to pick from here. A decade ago they'd all have been tinfoil-hat territory, but we now know that one of them is almost certainly true:

  • Employees at Microsoft were complicit in the creation of the malicious flaw (possible but unlikely, as MS became complicit with NSA spying around 2007, and this bug is present in XP, which was already being replaced by Vista in 2007).
  • Employees at Microsoft were aware of the flaw but were instructed not to fix it, as the NSA was using it and, as of yet, there was no evidence that anyone else was.
  • The NSA was aware of the flaw and did not notify Microsoft, as they wished to use it for their own purposes - despite a pledge from Obama that the intelligence services would not hoard vulnerabilities in this way.

The nicest world-view that we can take is the last one. That MS simply wrote bad code, the NSA found it and developed an attack vector, and they chose not to inform MS about it.

Also up for speculation is whether the NSA has access to the source code for Windows, which would make their job a whole lot easier, or whether they have to reverse engineer everything like every other group of security researchers... It would not be a stretch to imagine that, when it makes their job of "national security" easier, they would be able to procure the source code, giving them a good edge over every other hacking group out there. You then get to speculate as to whether Microsoft is unaware (the source was "stolen") or complicit (ordered and gagged about it). The latter is certainly possible, especially since we know that much of the communications-interception infrastructure has been created with the forced co-operation and gagging of big US corporations...

We also know that the US routinely spies on foreign leaders, including their own allies and UN staff, and uses such intelligence to interfere in foreign elections and political processes. Given all of these facts, yes, it is entirely advisable for anyone in these positions not to use any software or hardware made in the US to handle any kind of secret communications -- China does exactly this. Likewise, the US does not use any hardware/software manufactured in China/Russia to handle secret communications. Your CPU has a hardware back-door in it, containing a completely separate CPU that runs even when the computer is off-but-powered, with full DMA access to every connected device, including encrypted hard-drives once your real OS boots up and decrypts them. That's the world we live in now. Let's just hope that the keys to those back-doors aren't leaked to criminals any time soon...

The nicest world-view that we can take is the last one. That MS simply wrote bad code, the NSA found it and developed an attack vector, and they chose not to inform MS about it.

There's some evidence supporting this view, while the other two are purely speculation.

Also up for speculation is whether the NSA has access to the source code for Windows, which would make their job a whole lot easier, or whether they have to reverse engineer everything like every other group of security researchers... It would not be a stretch to imagine that, when it makes their job of "national security" easier, they would be able to procure the source code, giving them a good edge over every other hacking group out there. You then get to speculate as to whether Microsoft is unaware (the source was "stolen") or complicit (ordered and gagged about it). The latter is certainly possible, especially since we know that much of the communications-interception infrastructure has been created with the forced co-operation and gagging of big US corporations...

There is no need to speculate on this point, as MS has a well established source code access program which goes out to many different organizations. For that matter, I personally had full Windows source access.

We also know that the US routinely spies on foreign leaders, including their own allies and UN staff, and uses such intelligence to interfere in foreign elections and political processes. Given all of these facts, yes, it is entirely advisable for anyone in these positions not to use any software or hardware made in the US to handle any kind of secret communications -- China does exactly this. Likewise, the US does not use any hardware/software manufactured in China/Russia to handle secret communications.

Of course China is simply using hardware and software where they added the backdoors themselves, so it's not particularly helpful to those who would like neither China nor the US to have access to their systems. And no, before someone invariably brings it up, going to open source doesn't even remotely address the problem.

Your CPU has a hardware back-door in it, containing a completely separate CPU that runs even when the computer is off-but-powered, with full DMA access to every connected device, including encrypted hard-drives once your real OS boots up and decrypts them. That's the world we live in now. Let's just hope that the keys to those back-doors aren't leaked to criminals any time soon...

All backdoors are broken eventually...

At the end of the day, the US government requires significant visibility into systems running all kinds of operating systems and software, whether the parties responsible for that software are cooperative or not. That includes a variety of foreign and non-consumer equipment. This means that they have to have a major program to penetrate all of those systems, and we know factually that they do exactly that. Once you invest in all of that infrastructure, there is essentially no need to coerce Microsoft into adding or protecting vulnerabilities (which weren't present in W10 in the WCry case, by the way). You already have everything, and you have it on your own terms. The conspiracy theory adds a bunch of extra idiotic steps for no reason. Spooks are nothing if not ruthlessly efficient.


I could care, but I don't. Government reliance on a proprietary operating system, legacy or updated, is a stupidity people warned about for years, one that was bound to result in something like this.

(cough) Heartbleed (cough)

Proprietary's nothing to do with it.

The difference is that Heartbleed was a bug, while an OS like Windows simply has weak security by default, for "friendliness". Replacing a dynamic library on Windows with a malicious version is pretty easy, and files and folders have a weak permission system. The protocol that this current virus exploits is for network transfer, while there's nothing special about accessing files or folders and then modifying them. Not to mention that even if all that were fine, it's still foolish for any government to rely on closed-source software.

You do realize that for every Windows exploit that got leaked from the NSA, there are like 5 leaks for *nix OSes, right?

Linux has had some extremely bad exploits:

  1. Heartbleed
  2. Shellshock
  3. Debian Fiasco
  4. X11 makes it impossible to implement a secure lock screen or screensaver. This is not fixed as of today, unless you use Wayland... and when is Wayland adoption going to become widespread? I'm tired of waiting...
  5. OpenGL drivers (including Mesa) return GPU memory without zero-initializing it first (which is a MASSIVE security hole). This is not fixed as of today.

Just today, a patch was released for a lightdm bug that allows guest users to access any file.

I agree that basic infrastructure should be running on FOSS software and not proprietary software. But asserting that FOSS software is more secure than proprietary software just because it's open source is blatantly wrong. Stop trolling.

Regarding security and single- vs. multi-user, a charitable reading is that he's talking about Windows 95/98 but isn't aware that he's doing so.

This is a bad habit I sometimes see in Unix/Linux communities: people form opinions based on older (in some cases, much older) versions of Windows, assume that the behaviour of current versions is unchanged, and then embarrass themselves publicly by making pronouncements based on those opinions.

Worst case I recall was someone who attempted to claim that Windows still used co-operative multitasking in 2003.

@felipefsdev - you may wish to learn a little about Windows NT, which was designed as a secure multi-user OS from the outset, which was a separate code base from the 95/98 line, and from which all modern versions of Windows inherit; the 95/98 line was killed off 15 years ago - that's how out of date you're being.

The registry stuff is likewise nonsense; first hit on Google will tell you exactly what kind of database the registry is (hint: it's not a flat one). Zero marks for not even trying here.

By all means criticise where it's warranted, but please inform yourself first.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Oh ho, seems things are getting a bit more interesting still.

Fearing Shadow Brokers leak, NSA reported critical flaw to Microsoft


