Should developers be sued if they introduce security holes in software?
To the above post:
1. We have to draw a big distinction between commercial and non-commercial software.
You can build machines, you can "heal" (maybe you can even do architecture) or whatever for yourself, and if someone is hurt or dies, you go to jail; but selling a product is different. In other fields you have to get the product certified in some way, and the party that issues the certification takes on the responsibility.
No one cares if you write software for yourself or your friends. Besides, you generally need a degree if you want a job in the software industry anyway. Maybe you can get a job without one, but as far as I know you can get a job in other fields without the paper too. Maybe not...
2. In fields where this question is "solved", it's unlikely that the individual developer is held responsible. So I don't see why individual software developers should be held responsible either, unless it's a one-man team.
I'm a bit skeptical about the effectiveness of a certification authority / accreditation scheme for software.
1) It's not going to prevent errors from popping up in the software.
2) The accrediting authority would probably have to get a good look at the code before they certify it. For large software projects, that can be millions of lines of code. All it takes is one buffer overflow vulnerability to make it insecure. What are the chances of a certifying authority knowing what they're looking at and finding that needle in the haystack?
At best, it would prevent some of the more obvious and blatant errors from making it into the software, but it would be a bad case of security theater to believe certification makes software completely secure. I mean, we recently saw a critical vulnerability in Microsoft's JPEG-parsing code (GDI+) that had been sitting there for years (http://technet.microsoft.com/en-us/security/bulletin/ms04-028). Could you really sue someone over that and have a case to stand on? *Should* we let lawsuits like that happen? I'm afraid it would have a chilling effect on the software industry and on innovation. It's a cause for concern to watch closely and participate in as an "expert".
Eric Nevala
You mention buffer overruns. I remember those issues from the Win 3.1 - 98 days, mostly because of C and hand-rolled string manipulation. But isn't that a solved problem at this point? In other words, isn't it just people not following best practices and rolling their own flawed solutions? I'm curious.
Are there international standards for software testing? Is there a CE mark of some sort?
I thought that would be a nice middle ground. This seems to stem from people's frustration with downloading and using buggy applications, which in some cases has resulted in data loss. An independent software testing company could be established to certify applications. Not every application would have to be certified, but those that are would get to display a certification logo (similar to VeriSign, ESRB, BBB, etc.), which should hopefully increase consumer confidence in the quality of the app.
I don't know how you would certify software, but maybe it's doable. Maybe certification should only cover obvious bugs, or a growing list of known bugs, exploits, and issues (standards could codify these), and exploiting anything beyond that list would not be the software side's responsibility. Hackers will always hack. Of course, development should care about those non-standard cases too, but that part could stay as it is today.
The biggest difference between software and other products, as someone pointed out, is that the tools and environments are free and easily available. Architects and engineers have to deal with security holes and corner cases at nowhere near the level software does; if a bridge is not TNT-proof or meteorite-proof, that is not the architect's responsibility. If the tools were as free and easily available (everyone running around with screwdrivers, angle grinders, and TNT), architecture and engineering would be the same nightmare, I guess.
And I'd guess only a few kinds of software need to be that secure: operating systems, company management systems, banking systems, etc.
Games don't have to be that secure, IMHO; maybe the EULA is enough for them. So hopefully we gamedevs don't have to worry.
Aaah, this is a topic that makes me squirm as a freelance web developer implementing eCommerce websites. Of course I try to be secure: I use tested solutions (i.e. Drupal + Ubercart), make sure the SSL certificates are in place, and set the payment gateway's fraud detection to its strictest settings. But still, there is always a chance that someone hacks the server and pulls user data, or does some other XSS hacking magic. That's why I always advise my clients to consult an attorney and have strong Terms & Conditions and a Privacy Policy to guard against that, and I give them a disclaimer that I am not personally responsible for any fraud, hacks, stolen data, etc. Not sure how useful that would be if they sued me...
I understand a business's desire to protect itself against sloppy developers, particularly after having worked with some sloppy developers and inherited some really poorly written or insecure code. But being a developer myself, I'm also concerned that I could end up getting sued for all my pennies because some hacker used a Chrome vulnerability to steal someone's account information and gave our web service bad press.
Which I guess is another issue: if a building crumbles you can usually find a specific reason (i.e. poor material choice, unaccounted-for weather, etc.) and maybe the person responsible. But with software? You have a multitude of third-party libraries, compilers, web browser holes, web hosts, SSL certificate strength, ISPs... When a hack happens it isn't always clear which part is the culprit, and often it's a very specific combination of many. Who do you pin it on? Should you sue me for shoddy coding, or Microsoft for a flaw in their .NET framework that enabled the exploit? Or maybe Google for the security hole in their browser that revealed session IDs?
Well, the "attack" on GitHub a few months back happened because the developers did not put the proper security restrictions in their code. It was a known problem with a known solution, yet for some reason they did not apply those restrictions. So on a forward-facing, popular site built for other developers, they did something extremely sloppy. Shouldn't they be blamed for allowing that vulnerability in the first place? Isn't there testing software, or aren't there procedures, to prevent that sort of thing? That was definitely not an edge case; from everything I've read, it was a failure to use best practices. So why can't we prevent issues like that?
Should developers be sued if they introduce security holes in software?
In general, no, because "security holes" are almost always introduced not by the developer but by some clever person bending the code in a completely new way. Sure, if a developer goes out of his way to deliberately introduce buffer overflows or other known vulnerabilities into a codebase, he should be called out on it (perhaps not sued, but it depends, I guess). But if a new vulnerability is discovered in code that was previously thought secure, or at least "not insecure", then no, the developer should be left alone, because he had no way to know. Anything else would be unfair.
Suing an employee is farcical. As an employer, you are responsible for carrying liability insurance to protect yourself against mistakes by your employees.
As a contractor or freelancer, you might need your own such cover.