Collaborative Coding with AI

BY Jessica Montgomery

10 January 2024

Software Development


Over-reliance on AI

Many organisations are turning to generative AI to create text, images, code, and other types of content. Whilst using AI can be extremely efficient, a rising concern is the potential for major digital service outages caused by poorly supervised AI-generated code, exposing organisations to risks that directly affect the customer experience.

The challenge extends to maintaining AI-generated code, much like understanding code left behind by departed developers. Team members may struggle to fully understand the code, which makes problems such as bugs harder to resolve quickly. Furthermore, relying too heavily on generative AI can erode fundamental knowledge and the ability to review and resolve issues in code.

Another limitation of AI coding tools is that they cannot replace human intuition and creativity, and they frequently produce errors. Even efficient AI solutions such as ChatGPT deliver little value on their own. Organisations should adopt a composite approach, pairing AI tools with skilled people to empower their teams.

Drawbacks

Below are some of the drawbacks WorkingMouse has identified when a developer relies too heavily on AI for coding.

  • Lack of understanding: Generative AI can produce code but may lack a deep understanding of the broader context and user requirements.

  • Debugging & maintenance: There is a risk that the code may not be optimised, secure, or easily maintainable. Debugging and maintaining code generated by AI can be challenging, especially if the developers don’t have a clear understanding of how it works.

  • Quality assurance: Relying solely on generative AI might compromise the quality assurance process as the automated code may not adhere to industry standards or best practices.

  • Skill loss: Relying too heavily on AI-generated code could potentially erode the skill set of human developers over time. It’s important for developers to continuously learn and adapt to new technologies.

Do's & Don'ts

Using Gen AI for generating entire files or sections poses several risks. Firstly, there is the potential for copyright issues, as the produced content may unintentionally infringe on existing intellectual property. Additionally, unknown security risks could arise, jeopardising the integrity of the generated material. It’s crucial to prioritise quality, as quickly generated content might lack accuracy or consistency. The produced output also requires continuous maintenance to guarantee its ongoing relevance and accuracy. Moving forward without adequate documentation or knowledge may result in accumulating technical debt, posing challenges for future development and understanding. Therefore, careful consideration is required before using generative AI to produce entire files or sections of code.

At this stage, generative AI should only be used as a collaborative tool and not relied on to produce large amounts of unchecked code. While AI coding tools like ChatGPT and GitHub Copilot are beneficial at generating code snippets and providing suggestions, they lack the contextual understanding of project requirements that human developers have. Relying solely on generative AI may lead to inaccuracies or security vulnerabilities in the code.

Generative AI tools are best utilised in collaboration with developers who can validate, review, and integrate the generated code into the project. Developers bring experience, the ability to understand project requirements, and the capability to ensure the generated code aligns with best practices and security standards. Using generative AI as a collaborative tool allows for a more controlled and reliable integration of AI-generated content into the development process, ensuring that the end code meets the project’s needs and quality standards.
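
To make that review step concrete, here is a minimal sketch of the kind of check a developer might perform on an AI-suggested helper before integrating it. The function names and business rules are invented for illustration; the point is simply that the reviewed version adds the input validation the generated one missed.

```python
# A minimal sketch of the review step, assuming a hypothetical pricing helper.
# The function names and business rules are invented for illustration only.

def apply_discount_suggested(price: float, percent: float) -> float:
    # As generated: no input validation, so a negative price or a percent
    # outside 0-100 silently produces a nonsensical result.
    return price * (1 - percent / 100)


def apply_discount_reviewed(price: float, percent: float) -> float:
    # After human review: reject inputs that break the assumed business rules
    # and round to cents before returning.
    if price < 0:
        raise ValueError("price must be non-negative")
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


print(apply_discount_reviewed(80.0, 25))  # 60.0
```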

How does WM use AI?


The developers at WorkingMouse have the option of using ChatGPT or GitHub Copilot as a collaborative tool. ChatGPT is a language model that assists our developers by generating code snippets, providing explanations, and offering advice. Our developers also use GitHub Copilot, an AI-powered code completion tool that can speed up coding by as much as 55%. It is used as a collaborative tool that accelerates the coding process by suggesting lines or blocks based on the context of the code. The difference between the two is that ChatGPT facilitates conversations and quick prototyping, whereas GitHub Copilot focuses on enhancing coding efficiency through context-aware code suggestions within an integrated development environment.
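
As an illustration of what a context-aware suggestion can look like, the sketch below shows the kind of completion a tool like Copilot might propose from a signature and docstring. The helper and data are hypothetical, and any such suggestion still needs to be read and tested by the developer.

```python
# Illustrative only: the kind of completion a context-aware tool might propose
# from the signature and docstring below. The helper and data are hypothetical,
# and the suggestion still needs to be reviewed by the developer.

def average_order_value(order_totals: list[float]) -> float:
    """Return the mean order value, or 0.0 for an empty list."""
    if not order_totals:
        return 0.0
    return sum(order_totals) / len(order_totals)


print(average_order_value([10.0, 20.0, 30.0]))  # 20.0
```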

Both tools are considered safe for coding tasks as developers remain responsible for reviewing and understanding the suggested content. GitHub Copilot also applies an AI-based vulnerability prevention system that blocks insecure coding patterns in real time to make its suggestions more secure. The system leverages large language models (LLMs) to approximate the behaviour of static analysis tools, so insecure patterns can be quickly detected and replaced with alternative suggestions. Additionally, developers are required to validate and review the output for security considerations and to ensure adherence to best coding practices.
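
As a hedged illustration of the kind of insecure pattern such a filter targets, the sketch below contrasts a string-built SQL query with the parameterised alternative a developer should accept. The table, column, and function names are hypothetical.

```python
# A sketch of an insecure pattern and its safer replacement. Table, column,
# and function names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES ('dev@example.com')")


def find_user_insecure(email: str):
    # String-built SQL: open to injection if the email comes from user input.
    return conn.execute(
        f"SELECT id, email FROM users WHERE email = '{email}'"
    ).fetchone()


def find_user_safe(email: str):
    # Parameterised query: the database driver handles escaping.
    return conn.execute(
        "SELECT id, email FROM users WHERE email = ?", (email,)
    ).fetchone()


print(find_user_safe("dev@example.com"))  # (1, 'dev@example.com')
```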

Wrapping it up

While generative AI can be a valuable tool in the development process, it should be used as a collaborative tool rather than a replacement for human developers. Although generative AI is proficient at generating code snippets and suggestions, it should not be used to generate whole applications. A collaborative approach ensures developers validate, review, and integrate generated code, preventing inaccuracies or security vulnerabilities. This controlled integration ensures that the end code aligns with project needs and quality standards.


ABOUT THE AUTHOR

Jessica Montgomery

Junior Marketer and Formula 1 lover

