In this 2-part blog series, we’ll look at security from a developer’s point of view. (You can read part 1 here if you haven’t already.)

Fixing security bugs might be simpler than expected

At least once or twice a year, every programmer encounters a particular class of bug: the subtle, almost invisible problem that appears only when certain unlikely conditions arise. These bugs are hard to spot and tricky to reproduce, yet in hindsight very simple to fix.

Security issues are a subset of these ‘edge case’ bugs, with the main difference being that they can be exploited for malicious purposes. While the outcome of a hack can be disastrous, the fix is often simpler than one might imagine, and far cheaper to a CTO than the cost of a breach.

In this area, our Secure Coding Workshop adds value in the following ways:

  • The clear, hands-on approach to fixing security bugs removes the aura of mystery surrounding those ‘creatures’ and roots them in reality
  • The many practical examples given on the course emphasise the mindset needed to approach the problem; your team will benefit from this eye-opening experience during the next design phase of its core product

Don’t blame the developer

Everybody makes mistakes. Developers embrace this fact: rather than aiming for a perfect codebase on the first try and punishing every mistake, a healthy team encourages and appreciates code that can be maintained and fixed with ease – because there will always be a bug or a mistake that needs redressing.

Security bugs, however, are often frowned upon, especially by upper management, as they can cause public embarrassment and compromise user data and personal information. We have seen calls for bug bounties to be paid by taking money off developers’ bonuses: a sure recipe for disaster and the collapse of team morale within the business.

The main reasons for undetected security bugs are:

  • Cutting corners due to managerial or sales pressure, and a rush to deliver
  • Genuine ignorance on the matter, perhaps stemming from inexperience
  • Disenfranchisement of the individual programmer
  • A team culture that rewards quick fixes over ‘doing the right thing’
  • The ‘not my problem’ syndrome: when a developer places blind trust in third-party components without challenging their own security assumptions and abilities

The Secure Coding Workshop will show how seemingly small errors can lead to disastrous results, and why the right solution is in fact the only solution. The course will also show examples of partial fixes and how they can be bypassed, present former and current best practices, discuss strategies for dealing with unknown circumstances (e.g. unexpected user input), and give each developer a basic toolkit so they can understand and address security with the right frame of mind.
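As a taste of what a partial fix looks like, here is a minimal sketch (not taken from the workshop material) of a classic SQL injection, a ‘fix’ that can be bypassed, and the right solution. It uses Python’s built-in sqlite3 module; the table, column and function names are invented purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, is_admin INTEGER)")
conn.executemany("INSERT INTO users VALUES (?, ?, ?)",
                 [(1, "alice", 0), (2, "bob", 1)])

def find_user_naive(name):
    # Vulnerable: user input is concatenated straight into the SQL string.
    return conn.execute(
        f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_partial_fix(user_id):
    # 'Partial fix': stripping quotes blocks the textbook payload, but a
    # numeric context needs no quotes at all, so injection still works.
    cleaned = str(user_id).replace("'", "")
    return conn.execute(
        f"SELECT * FROM users WHERE id = {cleaned}").fetchall()

def find_user_fixed(name):
    # The right solution: a parameterised query keeps data and SQL separate.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_naive("' OR '1'='1"))     # dumps every row, admins included
print(find_user_partial_fix("1 OR 1=1"))  # the 'fix' is bypassed: every row again
print(find_user_fixed("' OR '1'='1"))     # payload treated as a literal name: []
```

Running the script shows the naive query and the ‘fixed’ numeric query both returning every row, while the parameterised query treats the payload as nothing more than an unusual name.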

Even basic, practical security training can yield immediate benefits

You need trained eyes and experience to deliver on time while avoiding traps and pitfalls. This is often why younger and inexperienced developers have longer lead times: they have to overcome the novelty of the toolkit, acquire familiarity with the codebase and the product, climb a steep learning curve, and navigate wrong or misleading paths—all of this under pressure and with growing responsibility.

When security is thrown into this picture, it risks becoming a negligible factor, especially when the only experienced or senior developer is too busy with other tasks to perform adequate code reviews, or to provide feedback and training to the rest of their team.

Our workshops show how to exploit common security vulnerabilities through a series of practical exercises, giving each candidate, regardless of their level of ability, time to investigate the issue and develop their own solution.

Our experience shows that once there is even a basic familiarity with exploiting the most common security vulnerabilities, they become far easier to spot in an existing codebase. In other words, they are no longer just another problem to keep in mind, but are given the priority they deserve in a modern development team.
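To illustrate what ‘easier to spot’ can mean in practice, here is a small hypothetical sketch of a path traversal pattern that jumps out once you have tried exploiting one yourself; the directory, function and error names are assumptions for illustration, not code from any real product.

```python
from pathlib import Path

BASE_DIR = Path("/var/app/uploads")

def read_upload_unsafe(filename: str) -> bytes:
    # Looks harmless, but a filename such as "../../etc/passwd"
    # walks straight out of the uploads directory.
    return (BASE_DIR / filename).read_bytes()

def read_upload_safe(filename: str) -> bytes:
    # Resolve the final path and confirm it still sits inside BASE_DIR
    # before touching the filesystem (is_relative_to needs Python 3.9+).
    target = (BASE_DIR / filename).resolve()
    if not target.is_relative_to(BASE_DIR.resolve()):
        raise ValueError("path escapes the uploads directory")
    return target.read_bytes()
```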

Good process creates good software

An example of software that was almost entirely bug-free was the Space Shuttle. In fact, some say that the Space Shuttle software group’s most important creation was not the software they wrote, but the process they invented that allowed software of such good quality to emerge.

As security bugs are a subset of software bugs, if good process creates good software, then it follows that a good development process must also yield more secure software.

However, opinions differ on what a good process is: a manager, for example, would prefer something that gives them good estimates and clear accountability, helping to reduce costs. A developer, on the other hand, might consider process a nuisance because it often gets in the way of getting things done.

The truth almost certainly lies somewhere in the middle: good process must be cared for and adapted to evolving teams and business strategies, while bad process tends to rot and become a checklist.

A good topic for discussion within your own development team would be how to design good process, how to spot the tell-tale signs of a bad process, and how to adapt quickly to new challenges.

“Ten years ago [this was written in 2001] the shuttle group was considered world-class. Since then, it has cut its own error rate by 90%.”[source]

Legacy code does not necessarily need replacing

It is common to see grand attempts at rewriting large chunks of working legacy code once its complexity reaches a certain threshold. While legacy applications can be difficult to maintain, their fundamental role in any business makes them very difficult to replace. For this reason, learning to spot security bugs in existing code offers an excellent return on investment.

However, for those times when a rewrite is in order, developers often forget that software does not age, and that every quirk or oddity may be there for a good reason, perhaps as the result of a painful bugfix.

In a sense, a codebase is like a genome: an encoded history of all previous generations. Discarding years of iterations and refinements would be unwise.

This concludes our mini-series on security from a developer’s point of view. We’d love to hear your thoughts, so feel free to drop us a note on Twitter.
