http://www.techrepub...109?tag=nl.e101
Most developers would say, resoundingly, NO! However, it's an interesting question, for me anyway, because it shows the level of maturity in our industry and points to flaws in how the industry and developers in general are approached. For instance, who would be to blame? The software developer, software engineer, software architect, or QA? Are these titles/positions really clearly defined? Do companies even care about such a distinction?
I believe in Canada an engineer has to be certified before working on anything, and if a catastrophe occurs because of shoddy engineering, the engineer himself can be sued or worse. (If I'm wrong, please do correct me.)
Of course, such a thing would send costs rising and slow the rate at which software is released. It could definitely hurt indie developers. But from what I've seen, we have the tools to mitigate these issues. Also, such a thing would really force the industry and companies to clearly define roles in the development process, IMO.
I don't consider myself a veteran and I have not had the benefit of working on a variety of projects. So the opinions of others are needed, because I may not know what I'm talking about, lol.
NOTE: I wanted to put this in the General Programming forum, but wasn't quite sure that was the place for it, since it's a discussion about general programming as opposed to a particular code problem. Any Mod or Staff, please feel free to move it there if you think it's appropriate.
Should developers be sued if they introduce security holes in software?
Liability issues are complex.
The nature of software development today is such that errors are inevitable. Software errors are pervasive and considered normal behavior. It would be extremely difficult to prove liability for damages for most general-purpose software.
A few industries, such as medicine, are becoming exceptions to that rule. In cases where software is used for life-critical applications, it should be given a higher level of scrutiny. How much liability is a matter of debate; I'd say it is rather low, but a litigant would want to force developers to take full liability.
Note that standards are already in place for many industries. A friend who works on chemical analysis equipment has a series of government-mandated tests that the software and hardware must pass before getting certified. Once certified, they only need to show that random sample equipment from the line still meets the certification standards; if something goes wrong in the real world, they can simply point to the government certification and show that they are meeting all relevant standards.
As for the games sub-industry of software development, I don't think we'll ever be at the point where liability should be assumed. The value bar is much too low for it to be otherwise.
IANAL, but Canadian engineers don't work on their own. They work as part of companies or other legal entities that are contracted to do the work. If the work is substandard, then the companies can be held liable for damages/negligence.
It's my understanding that software companies follow the same pattern, though there's some wiggle room around the whole "We didn't make you use our software, we just sold you a license!" garbage. There's also the standard "By using this software, you agree not to sue us since it sucks" EULA text, which is of dubious standing. Nobody wants to be the one to make EULAs worthless.
The other issue that arises is culpability. Engineers aren't responsible if a hurricane comes along and knocks down their bridge. They're not responsible if someone takes their 500 yard pre-fab bridge and uses it in a 1000 yard wide river.
And there's the whole 'substandard' thing. Bridges will eventually fall, and there will be some variance between their lifetimes. The engineers cannot be held liable for that variance, only for making choices that they should reasonably have known would lead to failure. That becomes really dicey to prove, especially in an industry where security vulnerabilities are commonplace.
I don't think you can compare medicine to computer programming/IT.
For that comparison to hold, humans would need to be evolving on a monthly basis, making all current medicines redundant or in need of retesting.
You can't blame the delivery when what was requested mostly didn't cover a smidgen of what you actually did. Technology companies have to assume a lot when they take on a project, and typically try to foresee what the client did not.
What if a Windows update is the cause of a serious error? How would Microsoft have known it would cause your software to malfunction? How can developers be constantly aware of changes elsewhere in the code, or provide 100% code coverage, when the code is constantly changing?
Cars break down.
Eh, that's going to open up a lot of legal loopholes, and a brand-new cash-making-machine-for-lawyers industry, if people are ever allowed to sue developers for security holes.
First, what's the definition of a security hole? Somebody hijacking your bank account through an ATM? Some kid accessing just your name and address in Valve's DB? Some kid posting a fake high score online?
Second, it takes a certain amount of knowledge to understand and use computers, and telling the difference between a security hole and a normal app crash/mishap requires software development knowledge that the average Joe, or even a typical FB/email/web/Word user, won't have. You can totally imagine someone in the suburbs who has never owned a computer starting to use an iPhone, having it crash, and claiming there's a security hole.
You want to trust lawyers to write this definition for you? Talk about potential scams all over the country.
First, what's the definition of a security hole?
They define that in the article that I linked.
In some cases I would definitely say yes, but it would be very hard to define when. In most industries this is quite complicated to define. For example, if a building collapses because of a heavy earthquake, it's unlikely that the builder gets sued; but if a mild earthquake causes a building to collapse because the builder used cheap concrete, even though the contract stated otherwise, he will be held responsible. In the case of software, if a company decides to outsource development to a number of interns without a senior developer as supervisor in order to save money on development costs (which happens over here), I don't really think you can get away with your "no warranty" EULA.
[quote name='alnite' timestamp='1345834117' post='4973072']
First, what's the definition of a security hole?
They define that in the article that I linked.
[/quote]
A layman's definition is totally different from a legal definition.
[quote]
Clayton thinks that developers should be held accountable in cases where avoidable security holes in their software are exploited to infect a user with malware, and that user suffers some form of material loss - for instance the theft of money.
[/quote]
What counts as damage? Material losses? Does your personal information count as a material loss? What about people's WoW items? Virtual currencies and properties? What about losses caused by other users exploiting the loopholes? For example, does time or virtual money lost in an online game because you're losing to some kid with a cheating app count as material damage from a security flaw?
"Avoidable security flaws." What counts as avoidable? That's pretty much a gray area. Are overlooked bugs avoidable? What about untested corner cases that nobody in the software development house had thought about?
There would be script-kiddie shops everywhere if this were ever implemented: a whole new industry of hackers, out to prove that an avoidable security flaw exists, just so they can extract money from software developers who built the software in good faith.
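To make "avoidable" slightly more concrete: the textbook case is SQL injection, where the vulnerable version and the fix differ by a single line. Here's a minimal sketch in Python using the built-in sqlite3 module (the table and queries are invented purely for illustration):
[code]
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hunter2')")

def lookup_vulnerable(name):
    # The "avoidable" flaw: attacker-controlled input is spliced into the
    # SQL text, so name = "x' OR '1'='1" dumps every row in the table.
    return conn.execute(
        "SELECT * FROM users WHERE name = '%s'" % name).fetchall()

def lookup_fixed(name):
    # The well-known fix: let the driver bind the parameter instead.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(lookup_vulnerable("x' OR '1'='1"))  # leaks alice's row
print(lookup_fixed("x' OR '1'='1"))       # returns []
[/code]
Even with a case that clear-cut, a plaintiff would still have to argue the developer "should have known" better, which is exactly the gray area above.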
In engineering, before something is used it goes through many different (and, by law, some independent) parties, and between the major steps and milestones there is quality control (a very simple version: design -> manufacturing -> redesign -> manufacturing -> testing -> redesign + manufacturing -> testing -> redesign + manufacturing -> field testing -> redesign + manufacturing -> testing + field testing -> use, even with not-at-all-innovative machines). And in engineering there is very strict and heavy standardization (international standards). It's hard to explain since I'm still at the bottom of the engineering chain and don't really see it all, but one thing is certain: I won't be held responsible. The guys who make the final decision will.
Another example: if my design is flammable but the fire service's experts give their permission, and someone dies in a fire because of my design, the fire service will be held responsible. I'd probably only get fired from the company. There are quality assurance companies, like TÜV. If they say the product is safe, they take on the responsibility. Products aren't even allowed to be sold without these certificates (CE marks, for example).
Software development doesn't seem mature at all in that regard, even though it may be more complex than engineering. But I may be wrong.
Are there international standards for software testing? Is there a CE mark of some sort?
A few years ago, one of my university professors made a similar point: Doctors, lawyers, architects, civil engineers, etc. all have to go through a process which validates them as being a professional in the field. The last thing you want is an unqualified professional working in a field in which their negligence can cause catastrophes.
"Shouldn't the same standard apply to software developers?" he asks.
My gut response is a righteous "No!" and then I'm sent off trying to find ways to validate my answer. So, after a few years of chewing over this question, here are several points I've come up with:
1) The nature of software development is iterative. Every project is an evolving research and development project, with each version fixing flaws (and hopefully not introducing new ones) and adding features. If you're going to legally hold every software developer liable for bugs in their software, then the smart developers will never release their software, as part of their risk mitigation strategy. The losers become the consumers and the smaller entities which can't afford to hire their own in-house developers.
2) Software development tools are so ubiquitous that anyone can create software. Creating software is like authoring a book, blog or news article. And, with a wide-open internet community, anyone can be an author. If you're going to hold software developers liable for flaws in their software, then you ought to also hold every writer in the world equally liable for flaws in their writing, whether it's flaws in reasoning, grammatical errors, or blatant lies. If you aren't willing to hold writers accountable for what they write, then I'll just say that my code is protected under free speech rights.
3) Software is fucking complicated! It's a series of logical and mathematical instructions working together to create an interconnected system. It's just fucking amazing that it even works, let alone expecting it to work perfectly. That's like asking every mathematician in the world to never make a mathematical mistake. One of the hardest branches of mathematics (IMHO) is cryptography, and I believe the nature of cryptography is very similar to software development. A cryptographer can release an encryption algorithm (a set of mathematical instructions to obscure information) in 1995 and have it vetted by other cryptographers as being "secure". Yet, fifteen years later, that secure encryption algorithm can be found to be flawed and insecure because of some obscure fact, such as a slight imperfection in random number generation, poorly chosen prime numbers interacting with each other, changes in technology, etc. (there's a small sketch of the random-number pitfall after this list). Should we sue the cryptographers of fifteen years ago for releasing insecure crypto? How could they know that their crypto was insecure if it had been vetted by peers and thought to be good? It could be stated that EVERY encryption algorithm is flawed and it's merely a matter of time until its vulnerabilities are discovered. Likewise with software.
4) Since anyone can create software and release it online, there's a matter of enforcement. Unlike in the other professions, there is no barrier to entry into the world of software development. All it takes is a software developer to say "fuck you and your regulations, I'm going to write software damn it, and I'm going to release it anonymously and for free on my own website!" And that's a good thing! How many valuable software applications started off as a project hacked together by an individual?
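On the random-number point in 3), the canonical pitfall is generating key material from a non-cryptographic generator. A minimal sketch in Python (the timestamp seed is hypothetical, purely for illustration):
[code]
import random   # Mersenne Twister: fine for simulations, not for secrets
import secrets  # CSPRNG interface: designed for security-sensitive use

# Looks random, but random.Random is fully determined by its seed. If an
# attacker can guess or recover the seed (say, a launch timestamp), every
# "key" ever produced from it can be reproduced offline.
rng = random.Random(1345834117)  # hypothetical timestamp seed
weak_key = rng.getrandbits(128).to_bytes(16, "big")

# The fix: draw key material from the OS's cryptographic source instead.
strong_key = secrets.token_bytes(16)
[/code]
Code like the first version shipped in plenty of software before the distinction was widely appreciated, which is exactly the "how could they know?" problem.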
The better principle is to assume that no software is perfectly secure and flawless (regardless of author), so your better course of action is to distrust it and add additional layers of security into your business structure. Where are your double checks? How are things vetted? Is there auditing? Etc. Strong security depends on people, processes and technology. If you're relying on only one of those dimensions for security and ignoring the other two, then you deserve to have a security breach and you deserve to be roasted appropriately for it.
Eric Nevala
Indie Developer | Spellbound | Dev blog | Twitter | Unreal Engine 4