Christopher Douce

Software Engineering Radio: Security and secure coding


Digital security is an important specialism in computing. The OU offers a BSc (Hons) in Cyber Security, which features TM359 Systems penetration testing. Security is clearly important within software engineering: the extent to which it is required should be made explicit within non-functional requirements, and any software product should be developed and deployed with security in mind.

There are a number of podcasts in Software Engineering Radio that address security from different perspectives, such as SE Radio 640: Jonathan Horvath on Physical Security and SE Radio 467: Kim Carter on Dynamic Application Security Testing.

One of the podcasts that caught my attention was about secure coding.

Secure coding

SE Radio 658: Tanya Janca on Secure Coding discusses secure coding from a number of perspectives: code, tools and processes to help create robust software systems. She begins at 1:50 (until 2:11) by introducing the principle of least privilege, which leads to a discussion of user security and the significance of trust. The CIA triad (Confidentiality, Integrity and Availability) is introduced between 10:00 and 11:45.

A notable section of this podcast is the discussion about secure coding guidelines, between 27:00 and 32:12. Some of the principles shared include the need to validate and sanitise all input, to sense-check data, and to always use parameterised [database] queries, since how and where you write your database queries is important. The software development lifecycle is discussed between 41:32 and 50:18, which leads to a discussion about different types of testing tools and approaches (static and dynamic testing, pen testing and QA testing).
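To make the parameterised query point concrete, here is a minimal Python sketch of my own (not taken from the podcast) using the standard library's sqlite3 module; the table and the injection payload are invented purely for illustration.

    import sqlite3

    # An in-memory database with a single table, purely for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")

    user_supplied = "alice' OR '1'='1"  # a classic injection payload

    # Risky: building the SQL string by hand lets the payload rewrite the query.
    # query = "SELECT id, name FROM users WHERE name = '" + user_supplied + "'"

    # Safer: a parameterised query treats the input as data, never as SQL.
    rows = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (user_supplied,)
    ).fetchall()
    print(rows)  # [] -- the payload matches nothing, rather than matching everyone

The same idea applies whatever the database driver: the placeholder syntax varies, but keeping the query and the data separate is what defeats injection.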

A memorable quote I noted down was the reflection that “software ages very poorly”. There are simple reasons for this: requirements change, and they change because of changes to the social and technical contexts in which software is used.

Reflections

The podcast scratches the surface of a much bigger topic. One thing that I have picked up from other podcasts is that it is possible to embed code checking within a CI/CD software deployment process. Having a quick look around, I discovered an article by OWASP called the OWASP DevSecOps Guideline, which discusses ‘linting code’.

The concept of ‘lint’ and ‘linting’ deserves a little explanation. The ‘lint’ in software engineering is, of course, a metaphor. Lint (Wikipedia) is the fluff or stray fibres that accumulate on your jumper or trousers. You can get rid of lint using a lint roller, or Sellotape.

There used to be a program called ‘lint’ which ‘went over’ any source code that you had written. Although your source code might compile and run without any problems, this extra program would identify additional bits of code that might potentially be problematic. Think of these bits of code as pieces of white tissue paper sitting on your black trousers. Your ‘lint’ software (also called static analysis software) will highlight potential problems that you might want to have a look at.
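As an illustration (my own, and the exact warnings depend on which tool you run), here is a short Python fragment that runs without complaint, yet which a typical linter such as flake8 or pylint would flag in several places:

    import os  # flagged: imported but never used

    def append_item(item, items=[]):  # flagged: mutable default argument
        items.append(item)
        return items

    def check(value):
        if value == None:  # runs fine, but linters suggest 'value is None'
            return False
        unused_total = 42  # flagged: assigned but never used
        return True

None of these stop the program from working today; they are the tissue paper on the trousers, small signs of code that may misbehave later.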

Continuing with OWASP, I was recently alerted to the OWASP Top Ten list, which is described as “a standard awareness document for developers and web application security. It represents a broad consensus about the most critical security risks to web applications”. It presents a summary of common security issues that software engineers need to be aware of.

Each of these items is described in a lot of detail and goes a lot further than my simplistic knowledge of secure coding. A personal reflection: software engineers need to know how to read these summaries. This also means that I need to know how to read these summaries.

Python is going to be used in TM113. I’ve been made aware of Six Python security best practices for developers, which comes from an organisation called Black Duck (a name thoroughly in keeping with the yellow rubber duck theme of this new module).
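Without reproducing Black Duck’s list here, the following is a small sketch of my own showing two practices that Python security guides commonly recommend: use the secrets module rather than random for anything security sensitive, and never pass untrusted input to eval (ast.literal_eval accepts only plain literals):

    import ast
    import secrets

    # random is predictable; secrets draws from the OS's cryptographic source.
    reset_token = secrets.token_urlsafe(32)
    print(reset_token)

    # eval() would execute arbitrary code; literal_eval rejects anything that
    # is not a plain literal, so malicious input raises an error instead.
    print(ast.literal_eval("[1, 2, 3]"))  # fine: a list literal
    try:
        ast.literal_eval("__import__('os').system('rm -rf /')")
    except (ValueError, SyntaxError):
        print("rejected: not a literal")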

A bit more searching took me to the National Cyber Security Centre (NCSC) and the 8 principles of its Secure development and deployment guidance (2018). This set of principles takes a broad perspective, ranging from individual responsibility and learning, through effective and maintainable code, to the creation of a software deployment pipeline.

A final reflection is that none of this discussion about security is new. Just as there are classic papers on modular decomposition within software engineering, I’ve been made aware of a 1975 paper entitled The Protection of Information in Computer Systems. I haven’t seen this paper before, and I’ve not read it yet; it requires a whole lot of dedicated reading that I need to find time for.

The geek in me is quite excited by the references to old (and influential) operating systems of times gone by. The set of eight principles (a bit like the NCSC guidance) contains one of the most important principles in security I know of, namely, the principle of “Least privilege: Every program and every user of the system should operate using the least set of privileges necessary to complete the job. Primarily, this principle limits the damage that can result from an accident or error”.
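As a tiny modern illustration of the same principle (my own sketch, assuming a POSIX system; Windows handles file modes differently), a program that writes a secret should grant access only to its owner, rather than to every user on the machine:

    import os

    # Create a secrets file with mode 0o600: readable and writable by the
    # owner only -- the least set of permissions the program actually needs.
    fd = os.open("api_key.txt", os.O_WRONLY | os.O_CREAT | os.O_EXCL, 0o600)
    with os.fdopen(fd, "w") as f:
        f.write("not-a-real-key")

    print(oct(os.stat("api_key.txt").st_mode & 0o777))  # 0o600 on POSIX
    os.remove("api_key.txt")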

I have some reading to do.

The abstract of this article mentions “architectural structures - whether hardware or software - that are necessary to support information protection”. This takes me directly onto the next blog post, which is all about software architecture.

Acknowledgements

Thank you to Lee Campbell for sharing those additional resources.
