Security Software Linux

Will Security Task Force Affect OSS Acceptance?

An anonymous reader writes "An interesting article published by SD Times, 'Application Security Goes National,' discusses some of the talking points generated by a federal task force that will make recommendations to the Department of Homeland Security. One of these talking points is to license software developers and make them accountable for security breaches. Licensed developers would get paid more as well. The article also mentions that 'Executives' might not wish to work with smaller, undisciplined partners, and a little further down that 'Hobbyists create Web services [and] professionals create them' and that 'companies relying on critical infrastructure Web services need confidence.' Would OSS have to be written entirely by licensed developers to be considered secure? Yahoo Finance has another article on the subject." The SD Times article is current, despite the incorrect date on it.
This discussion has been archived. No new comments can be posted.


  • by mikeyrb ( 686396 ) on Wednesday December 31, 2003 @08:45PM (#7850024)
    But programs are only as secure as the platform they run on, and as the people who use them. If people don't run their systems properly, I'd say that's worse. Not to mention that people would use trusted vendors anyway, so I don't see what this adds.
    • Systems are software too. The article talks about having different levels of programmers. If you want to be working on an OS (system), you'll need a certain sort of licence to do so and you will be held accountable for any problems that occur.

      Your statement "programs are only as secure as the platform they run on" may or may not be true, but if it is, wouldn't insisting that the systems are built by licensed professionals who are held accountable be preferable?
      • Let's not start the paid != better debate.

        My 2 cents: I don't think this will fly. I doubt that Microsoft wants to have to find only licensed developers. I also don't think Microsoft wants to pay them more. But most importantly, I have no doubt whatsoever that Microsoft doesn't want to have to do a complete rewrite of its OS just so it will be certified. That would throw them a few more years off schedule. I think Microsoft will throw some weight against that.

      • If this does get pushed through, and I doubt it will, the end result is the important thing. The US government should get to see the Windows source code and analyze it. It can also analyze the source code of any GPL'd OS just as easily.

        May the best code win.
  • by roninmagus ( 721889 ) on Wednesday December 31, 2003 @08:48PM (#7850038)
    Do they really believe that licensing software developers will lead to more secure software?

    I'm not following their train of thought. Software development is an industry which constantly has to defend itself from **NEW** hack attacks. The best we can do is protect ourselves from known attacks, and try our best to foresee future ones.

    It puts yet another industry under undue government control, and yet again shifts the focus away from the people actually doing harm--the hackers.
    • by vegetablespork ( 575101 ) <vegetablespork@gmail.com> on Wednesday December 31, 2003 @08:49PM (#7850046) Homepage
      On the plus side, since we're licensing for "homeland security" reasons, there's no reason non-citizens should be writing any software used in the U.S.' critical infrastructure. Right?
    • by aheath ( 628369 ) * <[adam.heath] [at] [comcast.net]> on Wednesday December 31, 2003 @08:53PM (#7850079)
      Neither article explicitly touched on the issue of software quality assurance. The development of processes and procedures for writing secure software should go hand in hand with the development of processes and procedures for testing secure software. SQA methodology has to expand beyond usability and functional testing to incorporate security testing.

      It's my understanding that there are procedures for developing and testing software that is used in medical products and aviation products. Perhaps the rigor that is applied to developing software to control an airplane could be applied to the development and testing of secure software.
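      A minimal sketch of the distinction, using a hypothetical sanitize_filename function (nothing from either article): functional QA asks whether valid input works, while security QA asks whether hostile input fails safely.

```python
import re

def sanitize_filename(name: str) -> str:
    """Allow only plain names; reject anything that could escape a directory."""
    if not re.fullmatch(r"[A-Za-z0-9._-]+", name) or name.startswith("."):
        raise ValueError(f"unsafe filename: {name!r}")
    return name

# Functional test: the normal case works.
assert sanitize_filename("report.txt") == "report.txt"

# Security tests: hostile inputs must fail closed, not slip through.
for hostile in ("../etc/passwd", "a/../../b", "..", ".hidden", "a; rm -rf /"):
    try:
        sanitize_filename(hostile)
        raise AssertionError(f"accepted hostile input {hostile!r}")
    except ValueError:
        pass
```

      The security cases exercise exactly the inputs a purely functional test suite would never think to include.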

      • by the_2nd_coming ( 444906 ) on Wednesday December 31, 2003 @09:15PM (#7850240) Homepage
        Yeah... it is called Software Engineering.

        Very few commercial software applications use correct software engineering techniques, which is why so many bugs are in the software. Medical equipment, aircraft equipment, and car equipment is tested, re-tested, and run through all the engineering processes in order to make it bulletproof.

        Real software engineering is not profitable without making software cost a boatload more than it does.
        • One big problem as I see it is that most of the ideas we associate with Software Engineering fit closer to Software or Engineering Management. Metrics like Halstead's "Software Science" we teach students these days are about measuring time to completion, rather than any sort of quality. Students at my university are not taught to use any tools to assist in the development process, like lint or doxygen. We do not teach performance measurement outside of a theoretical Big O guesstimate. And nor do we teach st
      • by Jerf ( 17166 ) on Wednesday December 31, 2003 @09:52PM (#7850412) Journal
        It's my understanding that there are procedures for developing and testing software that is used in medical products and aviation products. Perhaps the rigor that is applied to developing software to control an airplane could be applied to the development and testing of secure software.

        It's a good idea on paper, which is why people like me are well-nigh terrified when this idea comes up.

        The problem is one of expectations. Yes, we could apply that rigor to all software. But,
        1. No more garage startups... and all new technology tends to start there. Innovation, true innovation, takes a huge hit under these schemes and we lose huge advantages to any country that doesn't enforce these rules.
        2. Expense. Those methodologies eat manpower for lunch. Are you going to pay for it? For every piece of software you use? Even "ls" or "echo"? No, and neither will anyone else. It only makes sense for certain things, and different levels of rigor make sense for different kinds of programs... even different levels of rigor for different guarantees. Good luck even figuring out which of these is right, let alone getting the government to mandate the correct levels! We are far from a consensus on what is appropriate; we're not even sure where it makes economic sense to use what we know, and we certainly don't know what we don't know.
        3. Freedom of choice. The converse of the above; we should be able to choose how secure our software is, because it's not free. Mandating any security level (and since other people's time is always free, you can be sure the government will mandate a very high level) means that I am forced to buy these high-security products. What if I don't care? My game console is free to crash, and even if it's 0wz3r3d, who cares? On the next power cycle, it'll return to normal. (At least on modern architectures.)
        In the real world, it is, to put it bluntly, a shitty idea.

        It's not time for government mandate, it's time for the market to start demanding security. The proven method for balancing cost vs. performance is the invisible hand of the market.

        The root cause here is a monopoly, training people not to be concerned about security. The correct solution is a healthy market.

        Best of all, we won't find ourselves in 2015 shackled by government mandate to 2005 engineering techniques. It's an act of shocking hubris to think we've got this figured out enough yet to mandate any solution.
      • The real solution to the quality/security problem already exists. Sue the bastards. A software problem causes a plane to crash (people die.) Sue the bastards. Medical equipment, cars, etc.. screw up AND PEOPLE ARE HURT, the manufacturer gets hauled into court. Seems to work.

        Now the problem is to determine the $ damages if software fails & someone is not hurt. It would need to be something like net income per hour averaged from the past 12/18 months and applied to the recorded downtime plus a pre-deter
        • An interesting point that always seems to come up for me when eulas are discussed is this:
          if software embodies patented concepts, doesn't that sort of make it an actual capital "P" PRODUCT that can be held to standards of fitness for use, safety, performance, implied warranties, etc., regardless of what the eula says?
          Otherwise, why should it deserve patent protection at all?
          It seems to me that for software to enjoy patent protection, it would have to be considered just another product like, say, a to
    • by elrond2003 ( 675701 ) on Wednesday December 31, 2003 @09:00PM (#7850117)
      >>>>Do they really believe that licensing software developers will lead to more secure software?


      You have missed the point: nobody on the committee cares about improving security. The worse it is, the more money they make. Only MS (and perhaps a few other huge contributors) will be able to generate certified software engineers, so only MS software will be usable. Thus LINUX will either die from lack of use or die from being commercialized by MS. There will be two beneficiaries: MS by making money, and selected congresspeople who will get brib^h^h^h^h campaign contributions. Meanwhile NSA software will be generated in China, rather than by US programmers.
      If there were any interest in having secure software the committee recommendation would be to ONLY allow open software.
    • Right now there is no way to prevent incapable programmers from writing critical code. If a license was required, then programmers who cannot meet a minimum level of demonstrable competency wouldn't be allowed to get started writing critical code. A programmer who manages to get certified but who then writes sloppy code could have his license revoked (like disbarment for a lawyer) thereby preventing that programmer from writing any more critical code. By having various licensing levels you could regulate

      • by ergo98 ( 9391 )
        If a license was required, then programmers who cannot meet a minimum level of demonstrable competency wouldn't be allowed to get started writing critical code.

        What organization is this developer working for that they threw an incompetent programmer onto critical code? Do they do appropriate code audits given that it's critical code? Do they have external code audits if it's for a critical medical or government system?

        A programmer who manages to get certified but who then writes sloppy code could have h
        • Would you consider banking software to be fairly important? I have seen banking software, to be used by the national banks of brittle developing economies being worked on by high school students with no engineering techniques being used at all. This software was being sold by a very large computer company with over 175,000 employees in over 100 countries, not a "fly by night" basement operation.

          As to organizations being sued because their critical software failed, that is rarer than disbarments. Even t

          • by ergo98 ( 9391 )
            I have seen banking software, to be used by the national banks of brittle developing economies being worked on by high school students with no engineering techniques being used at all.

            This tells me a lot about the software development firm, and little about the software developers. I've worked with many levels of software developers -- from self-taught high school dropouts to professional certified engineers -- and I have noticed that there is incredibly little correlation between those "classic" indicato
            • I don't think we are disagreeing but you don't seem to quite track what I am saying so let me try another approach. Ever notice how you can talk yourself blue in the face explaining to your boss how something is illegal but he ignores you? Ever notice how there's not a peep out of him once a lawyer from legal utters one sentence saying exactly the same thing you've been saying? That's because the lawyer speaks with the weight of his license behind him. Your boss knows every competent lawyer will tell hi

      • You seem to think that poor programming is the problem. Poor management is the real problem. I can, in the right environment, write code with a defect density of 1 per 10 KSLOC. I've actually done that. But I can't do it on the project I'm on now, because the schedule is way too short to fit all the features that we committed to deliver. So quality goes down.

        Before we hold programmers responsible for defects, let's hold program managers responsible. I wonder how many jobs that are currently bid at half
        • Yes, absolutely poor programming is the problem. The question is "why do we get poor programming?". I agree with you that it is largely the fault of people, other than the programmers, making unreasonable demands. Licensing developers would provide those developers with a weapon for self defense: "I am licensed. You are required to use licensed developers. All licensed developers will tell you the same thing. You cannot fire me in favor of someone who will cut corners as no developer will sacrifice hi

    • by ergo98 ( 9391 )
      Do they really believe that licensing software developers will lead to more secure software?

      Most licensing advocates propose licensing as some sort of magical solution that will do everything from improving security, speeding development, improving estimates, lowering bug counts, etc. The trouble is that they never provide any metrics or actual examples to back this up. It'll just happen, apparently.

      I say this with interest as I'm currently reading the book "Professional Software Development", a book by
    • Do they really believe that licensing software developers will lead to more secure software?

      I'm not sure they think at all. I volunteered as a reviewer of the initial SWEBOK (Software Engineering Body of Knowledge) a few years ago. Basing licensure on the SWEBOK would have been a disaster. No design patterns. No agile methodologies. Nothing newer than the late 80s.

      You can't have licenses without tests. And you can't have tests on things that are still evolving. So licensed software engineers will

    • No one wants to require a license to program. What is under discussion is the possibility of making software development a profession.

      There are many parallels that already exist. Medicine for example: you can treat yourself with home remedies, pick up an over the counter drug, or go to a doctor. Building construction has the same sort of range, with the added complication that even for a do it yourself project there are certain safety standards you must meet and liabilities that you must assume for constr

    • >> ...and yet again shifts the focus away from the people actually doing harm--the hackers.

      No, if I figure them right, they're going to want to licence all the hackers too... At least we'd get rid of the script kiddies...

  • OSS Acceptance (Score:2, Interesting)

    by Anonymous Coward
    For commonly used software this provision of jobs increasingly depends on artificial barriers to the acceptance of free alternatives. Now that millions of people are programmers with supercomputers on their desks and an itch to scratch, and now that the cost of software distribution is approximately zero, the unconstrained market value of a line of code for a commonly used application is rapidly converging to zero.

    The anti-FOSS lobbying is merely an example of the artificial barriers that prop up the price
    • Indeed, just as the New Deal introduced mandatory schooling and mandatory retirement ages not, principally, out of any ideas of children's rights or the rights of the elderly, but as a way of reducing unemployment and keeping wages high by artificially reducing the number of people who could be legally employed.

      Many trade licenses fulfill the same function, such as that needed to be a plumber or electrician, where licensing is typically handled not by the state but directly through the unions.

      This proposal
    • Though I doubt that there are actually that many people earning their living by programming operating systems, Web browsers, and word processors these days. In the future the way to make money as a programmer will be to implement special-purpose applications that only scratch the itch of some company's shareholders

      AC, you just described the state of the world today.

  • Licensing again huh? (Score:5, Interesting)

    by DroidBiker ( 715045 ) on Wednesday December 31, 2003 @08:49PM (#7850048)
    I suspect we'll have some sort of meaningful licensing scheme someday. It'll probably take a while tho. There will be a lot of pain and probably more than a few witch hunts before it happens.

    One problem (of many) is of course that if you make programmers legally responsible for security failures you also need to give them the authority to say "No! You can't do it that way! I don't care WHAT Marketeering says!"

    Texas has had licensing for a few years. Anyone know how it's worked out?

    • by Alan Cox ( 27532 ) on Wednesday December 31, 2003 @09:36PM (#7850349) Homepage
      There are two reasons to license software developers in the USA. Neither is good. The first is so that you can forbid compilers, debuggers, and other tools "dangerous" to the RIAA/MPAA from being in the hands of the masses. The second is to stop all the computing jobs leaving the US by having a US certification required but inaccessible to the competition.

      I'm all for formal open standards for security. And I am very much for formal accredited qualifications in safety critical systems. I'd love to see an MSc in computer security and similar university qualifications - but it has to be a proper and open thing, not some government office of computer programmer licensing.

      As to accountability - there is a simple solution. Do something about the ability of companies to use software licensing as a get-around for product liability in most countries. Make it like other products. If it's sold then it should be suitable for purpose. (Note here sold - paid money for. I see no reason why *paying* for open or closed source ought to be different.)

      It will also improve computer security no end the day a company gets sued for harming others by being negligent in applying security patches to its systems.

      • I agree with most of what you say. I do, however, see a couple good things about PEs:
        1. EITs have to prove themselves; it's not enough to know the theory
        2. EITs must endure a period of on-the-job training where a mentor oversees their work
        Where I started after college it was assumed that college was enough to completely equip me with everything I needed to be a successful software developer. Not only can I say from experience that this was not true for myself, but through observation I can say that it was
    • One problem (of many) is of course that if you make programmers legally responsible for security failures you also need to give them the authority to say "No! You can't do it that way! I don't care WHAT Marketeering says!"

      From my understanding this is exactly what happens today in areas where a PE has to sign off on a design making himself legally liable for any design flaws. The PE doesn't like the design for safety reasons, the PE refuses to sign, the design gets changed. At least in an ideal world that
  • by civilengineer ( 669209 ) on Wednesday December 31, 2003 @08:51PM (#7850064) Homepage Journal
    The idea was to give licenses to only those who can actually drive safely. But if they really implement that, there will be very few people with licenses and car companies will go bankrupt (no more wars maybe??). So they give this easy test for the license and every TD&H can drive. Of course, we have had over 40,000 fatalities and 2 million crashes every year in the US for the past 20 years.
    Similarly, the licensing scheme will again create a dearth of licensed software professionals, leading to high salaries for the licensed initially, and then the bubble will burst. Everyone will have a license eventually, and we will be back to square one. So the solution is to come up with better error prevention and correction methods for existing software professionals (drivers) rather than try to create licensed professionals. So, as of now, OSS still rocks and it will be good to see more OSS testing volunteers rather than just OSS developers.
    • Yes that is it!!! Cars cause war. You sir are a dumb ass!
    • The analogy is with a Commercial Driver's License (CDL) versus standard license. Truckers have to have a CDL, and they can lose it pretty quickly by bad driving.
    • The idea was to give licenses to only those who can actually drive safely. But if they really implement that, there will be very few people with licenses and car companies will go bankrupt (no more wars maybe??). So they give this easy test for the license and every TD&H can drive.

      It doesn't help that a bunch of jokers decided that what was originally intended as a document to indicate someone could operate a motor vehicle should also (in some cases primarily) be used as an identification document.
  • by spearway ( 169040 ) on Wednesday December 31, 2003 @08:52PM (#7850068) Homepage
    Maybe the SD Times should hire a "licensed developer" to fix the date. It appears to be one year late: "January 1, 2003".
  • I got annoyed at the slashdot comments last time there was security hole in OpenSSH and wrote this page [irccrew.org] (copy pasted below). I count OpenSSL as insecure software - we need a secure replacement. GNUTLS [gnutls.org] looks somewhat better, but I don't trust it too much either.

    Why is some software more secure than others?

    How do you measure software security?

    Here's my definition of what secure software is.

    Intro

    I get really tired of seeing these kinds of comments every time some widely used software has security holes

  • This is the grant of government license to do a specific type of work. That's akin to the government granting the title of Lord, and is technically illegal.

    That said, the idea itself is good -- but let ACM *and* IEEE *and* Sun *and* whatever other institution do certifications... That avoids the government regulation, and allows potential employers to select "qualified" individuals.
  • Pointing Fingers (Score:5, Insightful)

    by RetroGeek ( 206522 ) on Wednesday December 31, 2003 @08:54PM (#7850081) Homepage
    All this does is create a person who can be targeted if Something Goes Wrong(tm).

    With OSS there is no "someone". With a licenced developer you have someone to blame.
    • However, I don't know if developers would actually want to be liable for their work. The pay increase would have to be substantially higher than their current pay so they could afford some kind of liability insurance.

      With the possible amount of damage a company can claim for an intrusion (remember the K. Mitnick case), I'm pretty sure that insurance costs will be very high.
  • by Nate B. ( 2907 ) on Wednesday December 31, 2003 @08:58PM (#7850107) Homepage Journal
    I recall a quote from John Milton that went something like this, "None can love freedom but good men. Others love not freedom, but license."

    How much would licensing developers much like doctors, lawyers, architects, etc. affect development? It would likely mean more than, say, an MCSE or RHCE, or NCE. Would developers need to be licensed for a specialty?

    Most likely there would be some sort of age and education requirement which would prevent some of the younger and perhaps self-taught developers from contributing to certain projects. Also, what about code developed outside the USA? One would have to be rather naive to assume that all the software in use was written in the USA, but sadly, I think that perception is all too common.

    Happy 2004, everyone!

    - Nate
    • by breadbot ( 147896 ) on Wednesday December 31, 2003 @09:31PM (#7850325) Homepage

      I believe the word license in this sense is:

      3 a : freedom that allows or is used with irresponsibility b : disregard for standards of personal conduct : LICENTIOUSNESS
      (from Webster's [webster.com])

      Implying that non-good men love the opportunity to act irresponsibly, which is what freedom offers them.

    • "None can love freedom but good men. Others love not freedom, but license."

      It's a nice quote, but it doesn't apply here. Milton was contrasting freedom to license, a noun meaning "an excess of liberty; freedom abused; also, licentiousness." (Webster's New Collegiate Dictionary - 1959)
  • Would OSS have to be writen entirely by licensed developers to be considered secure?

    I'm sure glad the DHS steps in and prevents all those 1ee7 uncontrolled hackers from creating [apache.org] evil [kernel.org] unlicensed [gpg.org], software [openbsd.org] that [freebsd.org] aren't [debian.org] secure [openssh.org].

    Why do I always picture half-drunken bar patrons reinventing the world in front of a beer when I hear about the DHS talking about things they don't have much of a clue about?
  • by mrkurt ( 613936 ) on Wednesday December 31, 2003 @09:00PM (#7850125) Journal
    Quite honestly, the SD Times article told me nothing about what they're really going to do about improving security in applications. You could substitute "licensing" in that article for "certification", as in some vendor's certification of developers. Then, it looks like a useless measure of what that person knows about security. If, however, it is more of a civil service exam, and they're going to test for knowledge of how to write secure code, then it would make a lot more sense.
  • Trends are fun (Score:5, Interesting)

    by DroidBiker ( 715045 ) on Wednesday December 31, 2003 @09:03PM (#7850144)
    In the near term, if they adopt a licensing scheme, the first iteration at least will go something like the programming language Ada did.

    The US military brass decided at one point that it would be great if all of their software was written in one language. They formed a committee to design what they wanted. Ada was created and various military agencies started insisting on its use.

    The problem was that what they designed wasn't flexible enough and over time Ada became less and less important.

    Licensing will go a similar route. The government will spend millions on a committee to come up with requirements for a standard software engineer license. Then they'll find out that their licensed folks STILL screw up and eventually it'll become less of a big deal.

    That being said, if software engineering licenses come into existence at the federal level, you can bet I'm going to get one.

  • Two questions (Score:5, Insightful)

    by hdparm ( 575302 ) on Wednesday December 31, 2003 @09:03PM (#7850145) Homepage
    Does it mean that software created by those same developers, now licensed, in the past is now cleared? Are they going to hold developers and engineers accountable even if they're forced to produce code based on inherently flawed design, driven solely by profit and questionable business practices?
  • when the next dozen Microsoft "critical vulnerabilities" come out.

    Who wants to bet that Microsoft gets some kind of exemption from the revocation of licenses due to poor design and coding?
  • If this is for homeland security, does that mean the only people who can be licensed are US citizens native to this country? If so, that may help with our outsourcing epidemic.
    • Don't bet on it.

      One problem with this scheme is that since programmers are now accountable then they and their companies are likely open to lawsuits. Which means developing software in the US becomes very very expensive.

      Even if there are no lawsuits the sudden reduction in available programmers (just how quickly can all those current developers be licensed anyway?) means salaries go through the roof and many developers are unemployed and suddenly a lot of software becomes vapourware for the next 5 years.

    • Two words: Malpractice insurance.

      This would drive the costs of coding way, way up, and presumably speed the movement of jobs towards India.
    • You're right : in my opinion licenses should require all of the following:
      - US citizen
      - born in the US
      - currently living in the US
      - registered Republican
      - $500 donation to "Jeb Bush for America 2008"

      Regular church going would be a plus, but optional. The license granting ceremony would involve an oath on the Bible though.
  • by rice_burners_suck ( 243660 ) on Wednesday December 31, 2003 @09:12PM (#7850212)
    Would OSS have to be [written] entirely by licensed developers to be considered secure?

    As the past owner of two different businesses and the present manager of a mid-size company, I can confidently say that the answer is no.

    This is very simple. Over the years, I have hired a wide range of different people to work as programmers. I had everything from master's degree programmers with 20 years of experience to kids out of school who do it as a hobby. In all cases, what determined the success or failure of the project was not the qualifications of the programmer. I had master's degree programmers write such gibberish that multi-hundred-thousand dollar projects were cancelled. I had master's degree programmers who did a marvelous job. I had some kids code up another product that worked so beautifully that it only made the company money. I also had kids who did a crappy job, and the project failed. In other words, success or failure is determined by results, and nothing else.

    Returning to the above question, software is considered secure if it is tested for vulnerabilities and is found to be strong against attempts to break in. If the programmer has a Ph.D., that's all nice and pretty, but it means exactly Jack Schitt. The results are the only thing that matter.

    Therefore, I think this committee should not waste its time with issues like licensing, because that will only create more bureaucracy, more fees, and entire administrative efforts... and it provides no guarantees of success. They should figure out a way to measure the reliability of a piece of software (reliability is the parent category of security, because an insecurity reduces reliability). They should make up some guidelines for how mission critical systems should be judged and tested. Perhaps they should recommend that the government should hire its own crackers to constantly look for and help fix vulnerabilities. Because security isn't a one-time thing. "Let's license programmers and the problems will go away." It doesn't work like that. Like everything else related to management, in security, the only constant is change.
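    A crude sketch of that results-only stance, with an invented parse_config function as the target (nothing from the articles): judge the code by feeding it random hostile input and seeing whether it misbehaves, regardless of who wrote it.

```python
import random
import string

def parse_config(text: str) -> dict:
    """Toy key=value parser standing in for real code under test."""
    out = {}
    for line in text.splitlines():
        if "=" in line:
            key, _, val = line.partition("=")
            out[key.strip()] = val.strip()
    return out

# Sanity check on well-formed input.
assert parse_config("host = example.com\nport=80") == {"host": "example.com", "port": "80"}

# Crude fuzzing: 1000 rounds of random printable junk must never crash it.
random.seed(0)
for _ in range(1000):
    junk = "".join(random.choice(string.printable) for _ in range(40))
    parse_config(junk)  # any exception here would fail the run
```

    A real vulnerability hunt would be far more sophisticated, but the principle is the same: the program is measured by how it survives abuse, not by the credentials of its author.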

  • by RealProgrammer ( 723725 ) on Wednesday December 31, 2003 @09:19PM (#7850263) Homepage Journal

    ... syndrome. Lawmakers always want something that sounds good, looks good, and will make them appear to be addressing the problem.

    The conceptual framework they're working under is wrong. They assume that a single person is the author of a program. Maybe some programs have just one author, but most have several. The main, lead programmer, who is typically the copyright holder, may not even look at every line of code in a program.

    The bit about a culture shift is valuable. Projects should be built with security in mind, using basic principles (least privilege, minimize scope, check your loop bounds, etc.) that are, coincidentally, good programming practice.

    But the culture shift that's needed is away from blame-based analysis of security failures and toward cooperative assistance. That shift is assisted by opening source code. Licensing programmers will tend to accentuate the blame attacks when bugs are found, and will provide incentive to hide them.

    No program is bug-free. No committee of Licensed Gurus can eyeball-scan a program and find all its bugs. It takes running the program in real-world situations to find some (most) bugs. Licensing the programmer will not decrease the number of bugs in a given program.

    Lawmakers would do better to simply stay out of the matter entirely than to introduce bureaucracy for the sake of appearance.

    • The main, lead programmer, who is typically the copyright holder

      Often not - if it's a commercial piece of software, then the company will be copyright holder.

      Licensing programmers will tend to accentuate the blame attacks when bugs are found, and will provide incentive to hide them.

      Agreed. A licencing scheme is all but meaningless if a licence, once obtained, cannot be lost. But if it can be lost, it will be in a licensed programmer's best interests to hide any such bugs, for fear of losing their licence.
    • The conceptual framework they're working under is wrong. They assume that a single person is the author of a program.

      The framework they're working under is that of Engineering practice. When a PE signs off on a job (s)he was no more the sole author of the work product than a software development lead or architect is the sole author.

      Is software practice generally anywhere near ready to be held to the kinds of standards applied in structural/civil/electrical/chemical/... engineering practice?

      Hell no, it

  • This is EXACTLY what we need, a government bureaucracy created to step in and solve all our problems, just like they have in every other area.

    Lord knows when you hire a licensed contractor, nothing will go wrong.

    Instead of those "Licensed contractors build confidence" bumper stickers the union thugs put on their trucks, they should put:

    "Licensed contractors build artificial barriers to competition and inflate prices unnecessarily, while slowing everything down jumping through government red tape."
  • Isn't taking as long as I expected for the HSD to get involved. :(

    I guess my prediction of 5 years out before all software is controlled, licensed and restricted may have been a bit optimistic.

    Don't forget, hardware will go this route too in order to "be secure"... ( I.E. mandatory DRM )

    First get 'corporate' acceptance of the concept by snowing them enough, then put it into law
  • by phliar ( 87116 ) on Wednesday December 31, 2003 @09:40PM (#7850364) Homepage
    I can only speak for myself: but why should I believe that some yutz who took a Kaplan's or "ITT Tech" course and passed a US government approved class is going to write decent code? I think the odds that Theo is going to take a licensing exam of a different country are exactly zero. Will that magically make OpenBSD less secure?

    The proof of the pudding is in the eating, and free software has done pretty damn well on the security front. If some pinhead executive wants to pay for "confidence" -- well, I'm sure someone will be happy to take that money off him.

    And getting paid more for jumping through silly hoops when you're writing for free? How much more? 10% more than zero is -- zero. The whole thing is silly.

  • by Crypto Gnome ( 651401 ) on Wednesday December 31, 2003 @09:52PM (#7850413) Homepage Journal
    Here's a summary of the plan.
    • A software developer (ie a programmer) gets licensed
    • works on a project for (name some large company)
    • company management provides direction for the programming efforts (as they do)
    • software is insecure by design, due to management decisions (happens now, and the plan changes nothing here)
    • software is finished
    • ....marketed
    • ....purchased
    • ....deployed
    • ....ends up killing over 10 thousand people for some trivial reason
    • programmer takes 100% of the blame; firing squad at dawn
    • company/management who made the decisions which introduced the lack of security get off scot-free; zero legal consequences of their stupidity
    Or am I misunderstanding the whole point of the exercise?
  • EAL Certification (Score:3, Informative)

    by omnirealm ( 244599 ) on Wednesday December 31, 2003 @09:53PM (#7850414) Homepage
    Let us not forget that the IBM Linux Technology Center has certified a Linux distribution (SLES 8) under the Common Criteria Evaluation Assurance Level 2, and they are currently working on EAL 3. This qualifies a Linux distro, composed largely of Open Source software, to take part in bids on certain security-sensitive government contracts. This sounds just like the kind of assurance that this security task force is looking for.
  • the journalistic integrity of the host of this article. If they are proposing, or even carrying the message, that programmers be "licenced" and held accountable, should they not be held to the high standard of having accurate dates on their articles?

    Note that this sounds fairly familiar, in that I think we have heard suggestions quite similar coming from the northwest coast of the US. I also note that the vast majority of exploitable code comes from that region of the US as well. (Ok, the vast majority of c
  • Licensing Software professionals and holding them accountable for software security is the Palladium concept applied to people. Once you have to license "software engineers" in general, you will have them digitally signing their code and then only software duly signed will run on your Palladium Computer. Otherwise, your computer might run (gasp!) pirate code!
    I am assuming the compiler will digitally brand your code with your signature, in order to find out who wrote the "unsafe" code that was breached.
  • Moritz suggested that a sort of class system of programmers might emerge, with those creating the mission-critical applications needing to be licensed and perhaps even bonded, but also more highly paid. Those licensed professionally would be held accountable for their work, such as for security breaches to critical systems.

    "We license civil engineers to have confidence their bridges will support a certain amount of weight over a certain period of time. But is it bomb-proof? We need to define software i
    • There's no technical separation between design and realization of software. What you call implementation is just a further refinement of what you call design. Even what the compiler and linker do automatically is just a further refinement of that, in a continuous process.
      • Ahem! Meanwhile, back to the point I was making, which you so blithely ignored.

        You can sue the Engineer because he's wholly and solely responsible for the design, said design being a thing entirely and completely separate from the implementation.
        That's essentially the definition of his position: the person who is personally accountable for all aspects of the design.

        A programmer is (more often than not) neither wholly nor solely responsible for the design (management tells him at least some of the design di
  • Why would OSS have to change? OSS is what it is, proof that collaboration, cooperation, and openness will eventually lead to a much better product. It doesn't matter a single iota what anybody legislates or says, if we keep building software better than everybody else eventually everybody else WILL buy into it.

    Bryan
  • The blame game (Score:4, Insightful)

    by k12linux ( 627320 ) on Wednesday December 31, 2003 @10:10PM (#7850461)
    license software developers and make them accountable for security breaches
    How will these licensed developers be held accountable? Lose their license? Have points awarded against them (as is done with driver's licenses in many places)? Will they face fines? Jail time?

    Exactly who will be willing to take personal responsibility for a security breach? How many new legal cases will come up trying to prove that a breach is the "other guy's" fault? "We'll show, your honor, that it was the 'evil bit' hidden in the compiler that caused the security hole!" I suppose we'll see programmer malpractice insurance not long after too.

    Would this mean we could go after MS for monetary damages? Somehow I doubt it. Would MS's recourse be to say "Don't worry, that developer has had his license revoked."?

    This whole thing seems like a big CYA bid. Just make sure someone else is available to blame. Seems like they are saying, "We can't blame the hackers because we can't find them. But don't worry, you can blame the programmer now."

    Regardless of the intent, I don't see this doing a bit of good for security. People with real talent, but who want a reliable income, will shy away from a system in which they could easily be held responsible for damages, or could lose the license to practice their trade. I have a wife and kids... no matter what I think of my skills, if I'm at the mercy of every hacker out there I'll find another field.

    So, the result will be that it will become very HARD to hold someone responsible. Action, if ever taken, will be only in events of gross negligence. Security *may* improve short term. But, if we drive out all but the risk-takers I suspect that security will go down and the quality will go down too.

    In the end I just see an institutionalized profession which hasn't given us any real benefits.

    This seems like just another knee-jerk-silver-bullet attempt to fix an embarrassing problem. Why do I picture a meeting somewhere running late and somebody jumps up saying, "Hey, I know! We'll license programmers and hold them responsible for breaches." Followed by, "Yeah, and licensed programmers will get higher pay, so there is an incentive right there!" Then "Discussion? None? All in favor..." And whispers of "Great.. I'll be home in time for dinner tonight!"

    • Exactly who will be willing to take personal responsibility for a security breach?

      Slashdotites, esp. those of you young'uns developing a budding anti-corporate stance, this is one reason why you should consider being more nuanced in your opinions. Huge multi-nationals may be evil, but consider the flip side. Would you be willing to work somewhere where you are going to be held personally financially liable for your mistakes? Which, since you are a programmer and the first person in line to blame for multi
  • I write some code for a particular application. It works perfectly there for that purpose. Then, some manager decides he wants it for an unintended application and runs it there. It fails miserably in that environment and I'm on the hook because somebody misused some software I wrote. NOT!

    Also, if that same software is intended for a specific computing environment, and that environment isn't configured per the requirements of the software, it will have holes all through it, but I'm the one on the hook. NOT!

    T
  • States' rights? (Score:3, Interesting)

    by Burnon ( 19653 ) on Wednesday December 31, 2003 @10:37PM (#7850575)
    So, leaving aside issues of whether or not this is a good idea, are states' rights being encroached upon with this idea? States currently license engineers as they feel it's necessary - why would software require federal licensing? Engineering is engineering, whether you're twiddling bytes or blocks.

  • This is incompatible with both the Microsoft business model and the design of Linux. But it's not impossible.

    We could, in theory, have secure message-passing microkernels enforcing a mandatory security model running on secure machines with machine-checked proofs of correctness of both the code and the hardware at the VHDL level.

    But every project to build such a thing has produced only a toy OS. All the verification projects are dead. C and C++ are hopeless for code verification. Java isn't really sui

  • by TheBigx00FF00 ( 732027 ) on Wednesday December 31, 2003 @10:40PM (#7850597)
    This goes back to the digital sigs for website shop front ends, "signed" ActiveX controls, etc. First off, just because something is licensed doesn't make it trustworthy. More problems will arise from people naively trusting applications that have the "It's secure" sticker on them, instead of doing what they can to understand the application and its proper implementation. Secondly, it would destroy the market for developers who refuse to conform to, or cannot afford, "licensing". MANY useful and integral applications, especially for non-M$ platforms, rely on people making improvements and fixes in their spare time. Who's going to be willing to submit a quick hack to fix a problem if they might be liable for the result? Hell, who's going to code anything for free?? I'm certainly not willing to make myself personally liable without any monetary compensation, for legal fees if nothing else. How am I going to know that when my obscure software is compiled on the 2.9.4 kernel years from now, it creates an exploitable condition?? Going back to the first reply, the platform the software is running on makes a HUGE impact on its security. How am I going to develop an application on a platform with an inherently flawed API subject to hijacking, etc.? How about physical security issues? What if a compromise occurs on a machine that resulted from, say, a hardware keylogger ($40 from ThinkGeek) or a disgruntled employee? Must I bear the burden of proof that it was not my application, but one of these or a host of other issues, that caused a compromise in a system running my software? It's just a plain bad idea, poorly formulated and not very well thought out. It's the "higher ups" deciding to place the blame on the developers, and remove personal liability from themselves.
  • Programmers or designers?

    You could argue that buffer overflow hacks are partly the fault of the CPU, since Intel and PPC chips can't fully protect against buffer overflow attacks (even when running OpenBSD).
  • Immature discipline (Score:5, Interesting)

    by sjames ( 1099 ) on Wednesday December 31, 2003 @11:53PM (#7850909) Homepage Journal

    Consider the many centuries we were building buildings before we had anything beyond a few guesstimated best practices to assure that they wouldn't fall down. Eventually, the field matured and we figured out how to calculate the strength of a building in advance. Even then, it is only relatively recently that we could do dynamic simulations. In spite of that, we still have mishaps.

    Furthermore, we STILL are not at the point where we can guarantee that a building will hold up under attack. In fact, we are certain that ANY building can be destroyed using explosives. Indeed, any device we invent can be destroyed, and in turn cause destruction, when deliberately used contrary to its design.

    At the same time, there are levels of vulnerability that are clearly substandard. Buildings must not simply fall down in a light breeze and cars must not explode when you start them.

    On the basis of that, licensing and liability will need to be restricted to a very small subset of applications, and they will be very expensive. For the same reason that most of us don't have bomb proof cars, most software will not be built to that standard.

    The other case would be grievously stupid design decisions, such as having email from anonymous strangers be executable, or using gets for a publicly accessible interface.

    • Consider the many centuries we were building buildings before we had anything beyond a few guesstimated best practices to assure that they wouldn't fall down. Eventually, the field matured and we figured out how to calculate the strength of a building in advance. Even then, it is only relatively recently that we could do dynamic simulations. In spite of that, we still have mishaps.

      In the past buildings were massively over-engineered, because the engineer wasn't sure what the tolerances were. Witness Isenbar
  • We will have good software the day software companies face the same kind of liability that, say, Ford faced for the Pinto's exploding gas tank.
  • Employers say that people will be paid more but watch everyone end up being paid LESS for not having the license, while those that get the qualification will get the same salaries as now.

    How much do you want to bet that salaries will remain the same while employers only hire those that are licensed? Those that are not licensed will probably get paid less (becoming an underclass). There is nothing better to a capitalist than shifting risks to the worker (software developers are now liable) while not want
  • Type "program verification" into Google, and you get a "Work at Google" paid ad.
  • And we all know that with the Business Software Alliance [bsa.org] as one of the participants, there is no possible way this could lead to the effective outlawing of Open Source (not allowed to contribute if you don't hold a license).

    In all likelihood, any government regulation of development, or licensing scheme for developers, will only protect the high profits of a few of the largest vendors and hurt everyone else in the industry.

  • by Todd Knarr ( 15451 ) on Thursday January 01, 2004 @02:22AM (#7851352) Homepage

    That condition comes from the licensing of civil engineers, too. You have to be licensed to be a civil engineer, pass some fairly effective exams and all that. You can be held personally and professionally liable for screw-ups in your designs. But there's another aspect: you have control. If you're the civil engineer on a project and you specify that it needs X grade of concrete, that's it. If management tries to say "That's too expensive, build it using a cheaper concrete.", you get to say "No can do." and they can't argue. If they do, you make a phone call and the next day some gentlemen with badges show up to discuss the fines and penalties management is going to pay. If management fires you and uses the cheaper concrete anyway, the discussion will be about criminal charges on top of their liability, not yours, for any damages done because of their illegal substitution.

    If licensing of software engineers includes everything that licensing of civil engineers does, including the "those who don't have the license do not get to overrule you on how the job gets done" provisions, it's IMHO a good deal. We ought to press for exactly that in licensing, because while companies would be highly allergic to it, it'll play very well with the public. Think about public reaction when a structural failure turns out to have been caused by someone substituting shoddy materials for what was originally specified, or otherwise not doing things the way the engineers said to do them.

    • OSS has no problem with professional certification: you get the source, review it, test it, and certify it to a grade. The professional would do this or sign off. For closed source the process is the same, except you don't have the source, so you rely on the vendor's professional certification.

      I worked summers in an Architectural/Engineering firm before I got my degree in Computer Engineering in 1979. The real way these firms worked at that time is that the Professionals (Registered Architects and Proffessi
  • Another boondoggle (Score:3, Insightful)

    by ScrewMaster ( 602015 ) on Thursday January 01, 2004 @04:06AM (#7851598)
    This is an attempt to divert attention from the real problem with software development, and for that matter business processes in general. Programmers and software engineers are, point-blank, not responsible for the quality or reliability of shipping code. Period. That responsibility lies with management, and the resources it chooses to devote to the initial design process (very important ... Microsoft didn't pay enough attention to this and is now paying the price in spades) and, just as significantly, to the quality-assurance cycle. Attempting to lay the blame for poor quality design and implementation solely upon the shoulders of the actual programmer simply ignores the root causes of poor software. The people that design software, and those that test it, are even more important to releasing a quality product than the programmer. However, the biggest problem that I've experienced in a quarter century as a software engineer is that management simply refuses to allocate sufficient time to initial design and prototyping. They want coding to begin as soon as possible after inception, and that often doesn't allow a good foundation to be laid before the design is frozen.

    I'm tired of hearing how architectural and structural engineers are "certified", and the insipid comparisons made between this status and that of software engineering. The penalties for a bridge or building collapsing are extreme, of course, and no one would want an incompetent engineer designing such a structure. But what is lost in all this talk is the design review process that occurs long before anything is actually approved for construction. Yes, perhaps the design engineer is technically accountable for a flaw in his work (I don't know, I am not a lawyer), but in any major undertaking there are dozens of others responsible for validating and double-checking the design, and there is no way in Hell that that engineer would be considered solely responsible for a serious failure when a whole review team approved his efforts. Besides, that's what we have insurance for, anyway.

    Given that corporate America has proven to be even less reliable and trustworthy than Microsoft Windows 98, I think we should start by certifying the business ethics exhibited by corporate executives and middle managers. Then let them pass tests that indicate an understanding of the software development process, and once that is done make it illegal for anyone in a marketing or sales department to influence software release dates. The programmers aren't the problem. Corporate America is the problem, and until the market decides that it is willing and able to pay for quality software no amount of legislation or governmental interference will improve matters one whit. Believing otherwise is naive or disingenuous.

    Of course, it won't matter if the current trend in outsourcing continues, since there won't be any software engineers or programmers left to be certified anyway.
  • I can definitely see the advantages of being licensed and having a professional organization:

    1) We get paid more.

    2) Others not accredited cannot do our jobs. For example, at a company I used to work for, an Engineer was programming. Now could a computer programmer do an engineer's job? No way!
  • by Angst Badger ( 8636 ) on Thursday January 01, 2004 @05:05AM (#7851751)
    I was reading the first volume of Alexander Solzhenitsyn's The Gulag Archipelago the other night. (For those of you too young to remember either Solzhenitsyn or the Soviet Union it describes, go read it. Along with 1984, it ought to be required reading for citizens of putatively free countries.) The section I was reading dealt with the purges of competing socialist movements once Lenin's party had consolidated its power. Political dissenters were at this time -- 1924 -- being tried by special tribunals, denied counsel and contact with the outside world, and executed, first by tens, and ultimately by the hundreds of thousands.

    The official explanation for all this was that the accused were terrorists who threatened the security of the motherland. It was Guantanamo Bay writ large, and once it picked up steam, it did not stop until, after somewhere between 20 million and 40 million state murders, the Soviet Union collapsed under its own sheer inefficiency in the early 1990s.

    In the Soviet era, the most improbable things were tied to the idea of, as we say today, homeland security. If you twist the logic far enough, and people are either stupid enough or frightened enough, you can get away with claiming that the manufacture of cheese is a matter of homeland security. (And why not? It is a fungal product susceptible to both accidental and intentional contamination with biotoxins; an economic resource vulnerable to sabotage; it is produced by wealthy companies whose political allegiances might not be entirely healthy; and worst of all, it is a national emblem of the hated French.)

    This programmer licensing is a ruse. Like the bulk of the Department of Homeland Security, it is a crock of shit designed to convince the public that the government is "doing something" against a threat of dubious reality but great electoral usefulness, and it will serve only to centralize more power and money in the hands of large software companies.

    Even if it weren't part of a fairly nefarious political trend, does anyone really believe this will make any damn difference? Commercial programmers don't make the important quality decisions -- they are handed down by management to suit marketing needs and the bottom line. If there's any professional programmer here who hasn't written inferior code to satisfy arbitrary time and resource requirements imposed from above, speak now and be counted with your five or six other brethren.

    If you want to improve the quality of software, hold companies and their shareholders financially responsible. In other words, put pressure on the people who actually make the decisions, and they will select those programmers -- licensed or not -- who write quality software and give them the resources to do it.

    Of course, the big software houses (read: Microsoft) will never go for that because neither they nor the subversives at the Department of Homeland Security give half a rat's ass about the well-being of the public. What they do care about is enhancing their own prestige, power, and wealth.
  • One of these talking points is to license software developers and make them accountable for security breaches.

    It seems that to really prevent all possible security breaches, you need to prove that the program is correct [uottawa.ca] first - I don't know of many entities that even try to prove their programs. I have heard of a few telecom infrastructure programs, but remember the big SS7 outage caused by one tech some years ago? The SS7 code is probably better "audited" than most code but would that outage have been constr
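    For flavor, here's what a machine-checked correctness property looks like in a modern prover; this toy Lean 4 sketch (the names and tactic details are illustrative, and may vary by Lean version) proves that an absolute-value function never returns a negative number:

```lean
-- Toy "verified" function: absolute value over the integers,
-- plus a machine-checked proof that the result is never negative.
def myAbs (n : Int) : Int := if n < 0 then -n else n

-- The obligation is discharged once and for all by the proof checker;
-- no amount of testing gives the same guarantee.
theorem myAbs_nonneg (n : Int) : 0 ≤ myAbs n := by
  unfold myAbs
  split <;> omega
```

    Scaling this from a five-line function up to something like an SS7 stack is exactly the part nobody has managed on a commercial schedule, which is why such proofs stay rare.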
  • That's the best way to make sure that applications are secure. Believe it or not, most developers care if there is a security problem in their software.

    The problem with this whole idea is this assumption: programs depend on libraries, libraries depend on language runtimes/libraries, and those depend on the OS. Of course you're also depending on the compiler not to produce buggy assembler code.

    So with all of these layers, can we truly say who is liable when something goes wrong?

    GJC
