An agile approach to AI-supported coding

According to McKinsey, developers using AI are twice as productive as those without it. Although some enterprises will see this as a potential quick win for their developer teams, it’s important to realize that the benefits of AI can only be unlocked safely with the right upskilling in security best practices.

Blindly trusting AI to do work by itself, even with good prompt engineering, doesn’t ensure code quality. For instance, take an AI-generated image of a person: it may look convincing at first glance, but look closer and you might notice that there are slightly more fingers or ears than there really should be. AI-generated code has similar issues. It may work, and even stand up to surface-level scrutiny… but look a little closer and the cracks start to emerge, potentially revealing vulnerabilities.
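
To make that concrete, below is a minimal, hypothetical sketch of the kind of flaw that can hide in plausible-looking generated code. The table, column, and function names are invented for illustration; the point is that the insecure version runs happily on normal input and only reveals itself under closer scrutiny.

```python
import sqlite3

def get_user(conn: sqlite3.Connection, username: str):
    # Looks reasonable and works for well-behaved input, but concatenating
    # user input into the query string leaves it open to SQL injection --
    # exactly the kind of crack that survives a quick glance.
    query = "SELECT id, email FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchone()

def get_user_safely(conn: sqlite3.Connection, username: str):
    # The fix: a parameterized query keeps user data out of the SQL text.
    return conn.execute(
        "SELECT id, email FROM users WHERE username = ?", (username,)
    ).fetchone()
```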

Security training is something developers must engage with throughout their careers, with ideal solutions offering the continuous development necessary to keep pace with change. Traditional training isn’t the answer when it comes to secure code development using AI: finding relevant, up-to-date courses won’t be easy in an area moving this quickly. Developers instead need to upskill in a flexible way, ideally with relevant material and scenarios that mirror the work they do in their day jobs.

Pieter Danhieux

Co-Founder and CEO at Secure Code Warrior.

Understanding AI and the risks involved

Regular training is necessary because threats that use AI are developing as quickly as the technology itself. One example is “hallucination squatting,” where an AI’s incorrect answers can be exploited for malicious purposes. AI has a tendency to “hallucinate” incorrect answers with confidence rather than admitting it doesn’t know. That alone raises the potential for critical errors, and it becomes far more dangerous when a hallucination can be used to subvert a piece of code. If an AI tool is known to generate calls to a non-existent library when creating code, an attacker can register that name and publish malware disguised as the library. Previously, the generated code would simply fail; now it works, but calls malicious code in doing so.
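
One practical safeguard is to treat any dependency an AI assistant suggests as unverified until a human has checked it. Below is a minimal sketch, in Python, of flagging suggested package names that are not on a vetted allowlist; the allowlist and the package names are hypothetical stand-ins for whatever an organization actually approves.

```python
# Hypothetical allowlist of packages the organization has already vetted.
VETTED_PACKAGES = {"requests", "numpy", "sqlalchemy"}

def flag_unvetted(suggested: list[str]) -> list[str]:
    """Return AI-suggested package names that need manual review before they
    are installed -- they may be real packages, or hallucinated names that an
    attacker has squatted with malware."""
    return [name for name in suggested if name.lower() not in VETTED_PACKAGES]

if __name__ == "__main__":
    # "quickjson-utils" stands in for a dependency the AI hallucinated.
    for name in flag_unvetted(["requests", "quickjson-utils"]):
        print(f"Review before installing: {name}")
```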

Unless a developer knows about this kind of hallucination and can clearly identify the signs of insecure code, attackers can take advantage of that naivety. Developers must be enabled to hone their secure coding skills and awareness. After all, they are the first line of defense in protecting organizations from the introduction of code-level security bugs and misconfigurations – especially as the adoption of AI coding tools increases. Traditional upskilling efforts tend to fail because they are too rigid and based on irrelevant information and context. In today’s age of AI, developer upskilling must be tailored to the requirements of individuals, with techniques that address the latest vulnerability and attack trends.

Enter agile AI learning

Agile learning has emerged as an approach that helps developers hone their skills and assists them on the path to becoming security-skilled, more advanced software engineers. It favors flexibility, giving developers multiple pathways to upskill on the topics most relevant to them. Just-in-time “micro-burst” teaching sessions allow teams to learn and apply knowledge quickly within the context of their actual work.

Teams that implement agile learning in secure coding, alongside safe deployment of AI-assistive tooling, gain hands-on experience with the tools while achieving security at speed. This has been notoriously difficult, especially when developers have little prior security awareness and training. The just-in-time approach ties directly into what developers are doing on a day-to-day basis, with context that allows them to anticipate and solve the very vulnerability problems that AI has introduced.

Individuals will always have a preferred learning style. As organizations shift to offer greater flexibility in education, developers can be given a more curated curriculum based on their needs, workday, and learning preferences. The adaptive nature of machine learning and large language models will make that learning experience even more individual and tailored for developers.

AI has shown great potential to improve the way people work when it is used with skill, discretion, and critical thinking. Organizations know this and will be tempted to see how far they can push the limits – however, overreliance will cause critical mistakes in the long run, and any use of AI without proper training or guidance will be a costlier mistake still.

In the short term, companies that depend on AI, even without security training and focus, will produce software faster and grow quickly. But this lackluster approach to security will eventually catch up with them, resulting in major problems and significant user and customer issues down the road. Smart enterprises that want to take full advantage of AI need to invest in agile learning for their development cohort, with a security-first approach that allows for cautious adoption of the technology.

