Shadow Coding: what, so what, now what?
By Adesh Gairola

Shadow coding—developers using unauthorized AI tools or code—is creating significant security and compliance risks. Organizations need balanced governance that enables innovation while maintaining security.

In the enterprise world, "shadow IT" describes the use of unapproved apps and tools by employees to get their jobs done faster. Now, a similar phenomenon is emerging in software development that we're calling shadow coding.

We define shadow coding as developers writing code or using AI coding tools without formal approval or tracking – essentially the coding equivalent of shadow IT. This practice often flies under management's radar, yet it can have far-reaching implications for AI security and compliance. C-level executives and governance stakeholders are increasingly concerned as more developers experiment with AI-generated code and quick side projects outside the usual controls.

Key Risk Insight

According to a 2024 GitHub survey, 97% of engineers have used AI coding tools, yet only about 40% of their employers officially encourage AI adoption.

To address this emerging challenge, organizations must understand what drives shadow coding, what risks it poses, and how to govern it without stifling innovation.

What Is Shadow Coding?

Shadow coding is defined as the untracked use of code or AI tools in development, done without the IT department's approval or oversight. In practice, this might involve developers incorporating code from external libraries or AI assistants into enterprise projects without proper vetting.

The term comes from "shadow IT," highlighting the parallel: just as shadow IT means unapproved tech resources in use, shadow coding means unapproved code contributions in the codebase. Such code may originate from internal side projects, open-source snippets, or AI-generated suggestions, and it bypasses the normal checks and documentation.

Unvetted shadow code can be dangerous because it hasn't been confirmed to be secure, compliant, or compatible with the rest of the system. In other words, it's code operating in the shadows of official process. An IBM security report on shadow IT noted that 80% of employees use unsanctioned tools for convenience and productivity. By analogy, shadow coding is driven by developers' desire to solve problems quickly, even if it means sidestepping formal governance.

How Shadow Coding Emerges

Several common developer behaviors give rise to shadow coding. These usually stem from pressures to move fast or from gaps in policies. Key contributors to shadow coding include:

"Vibe coding" with AI

Using AI tools to generate code from natural language prompts

Many developers are embracing AI tools (like ChatGPT or GitHub Copilot) to generate code from plain language prompts – a practice informally dubbed "vibe coding." This approach lets a programmer describe a feature in natural language and have the AI produce the base code. It's great for rapid prototyping and staying "in the flow."

However, if an enterprise hasn't explicitly sanctioned such AI usage, developers may do it quietly. The drive here is speed and convenience: AI can churn out code in seconds that might take a human much longer.

Primary driver: speed.

Side projects and off-platform code

Developers bringing personal code solutions into company projects

Developers are creative problem-solvers, and many build side projects or utilities to improve their workflow. Sometimes a dev working on a personal project will find the solution applicable to their day job and bring that code into the company's codebase without proper review.

In one incident, a Microsoft engineer's personal GitHub contained a script with an internal Azure access token, creating a severe security exposure. This exemplifies how well-meaning side projects can become "shadow IT" nightmares.

Primary driver: innovation.

Skipping reviews and fast-tracking changes

Bypassing standard code reviews or testing procedures

In the pressure to deliver software quickly, some developers may bypass standard code reviews or testing procedures. For instance, a developer might directly commit code to a shared branch or push a hotfix to production without peer review.

These skipped reviews create shadow code because the changes weren't seen or approved by anyone else. Often the rationale is to save time – perhaps the change was "minor" or an urgent patch.

Without the checks and balances of a review, unsanctioned changes can introduce defects or security flaws that others only discover later (if at all). Research shows it's common for developers to skip code reviews to meet deadlines, but doing so allows bugs and quality issues to slip through. This "move fast, skip oversight" mindset is classic shadow coding born from a desire for speed and autonomy.

Primary driver: deadlines.

The drivers behind these behaviors mirror those of shadow IT: the need for speed and agility, personal autonomy in choosing tools, and unclear or cumbersome policies. Developers turn to shadow coding when official processes feel too slow ("I can fix this in an hour, why wait a week for approval?") or when policies on AI/code use are not yet defined.

Chart: AI coding tools usage by country (2024) – percentage of software engineers using AI tools. Source: GitHub Developer Survey 2024.

Nearly all developers have experimented with AI coding tools at work, even when formal company adoption lags. In a 2024 survey spanning the US, Brazil, Germany, and India, 97–99% of software engineers reported using AI coding aids. Yet fewer than half work at organizations that actively encourage AI usage. This gap suggests much of the AI-based coding is happening informally, as a form of shadow coding.

Governance, Safety and Security Implications

While shadow coding may boost short-term productivity, it carries significant governance and security risks. Since these coding activities occur outside the sanctioned processes, they can lead to:

Non-compliance and legal exposure

  • Unintentional use of open-source code with restrictive licensing
  • Federal security experts label unvetted library use as "illegal code" or shadow coding
  • Data protection risks if proprietary code is pasted into public AI services
  • Non-compliance with industry-specific security standards and audit requirements

Security vulnerabilities and malware

Unvetted code – whether copied from an outside source or generated by an AI assistant – can carry exploitable flaws, outdated dependencies, or even malicious packages. Because it bypasses the normal security gates, such weaknesses may sit unnoticed in production until an incident brings them to light.

Degradation of code trust and accountability

A subtle but important impact of shadow coding is the erosion of trust in the software and the development process. In well-run DevOps teams, every change is tracked – you can trace who wrote what code and why. Shadow coding breaks this transparency.

Accountability suffers because when a bug or incident surfaces, it's harder to assign responsibility – was it an officially reviewed change or something implemented off-process? This lack of clear ownership can undermine team morale and culture, as developers might feel others are "playing outside the rules" or worry that unknown landmines lie in the code.

Lessons from Shadow IT's History

The rise of shadow coding strongly mirrors the trajectory of shadow IT in past decades. Business leaders can draw valuable lessons from how shadow IT took root and was managed over time:

  1. It grows out of unmet needs: Shadow IT persisted because employees had jobs to do and the official tools weren't meeting their needs. Likewise, shadow coding is a symptom that developers crave faster or more flexible ways to meet requirements.

  2. Short-term gains vs long-term risk: Shadow IT often delivered quick wins – immediate productivity boosts, faster collaboration – which is why management often looked the other way initially. Over time, however, the accumulated risks became apparent, sometimes through costly incidents.

  3. Why banning it doesn't work by itself: Many organizations initially reacted to shadow IT by trying to lock things down – strict policies, blocks on installing software, network filters for cloud services, etc. These measures had limited success if they didn't address the underlying driver: people still needed solutions.

  4. Embrace the good, manage the bad: Some shadow IT ended up becoming officially adopted once it proved its value. Similarly, some shadow coding outcomes might be positive innovations that should be harnessed.

Recommendations for Managing Shadow Coding

To effectively address shadow coding, enterprises should take a proactive, balanced approach – tightening governance where needed, while enabling innovation. Here are some forward-looking recommendations:

Improve Detection and Visibility

You can't govern what you can't see. Invest in tools and processes to detect shadow code in your environment:

  • Employ security scanners and software composition analysis to flag unknown libraries
  • Integrate automated code auditing tools such as static and dynamic application security testing (SAST and DAST)
  • Set up alerts for unusual commits or new third-party packages – a minimal scripted example follows this list
  • Monitor API usage patterns to understand unofficial AI coding activity
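
One way to make the "new third-party packages" alert concrete is a lightweight CI gate that compares declared dependencies against an approved allowlist. The sketch below is illustrative only: the file names (requirements.txt, approved_packages.txt) and the plain-text allowlist format are assumptions, and a real rollout would more likely lean on a software composition analysis tool. The idea, though, is the same – fail the build the moment an unvetted package shows up.

    # shadow_deps_check.py – illustrative sketch; file names and the allowlist
    # format are assumptions, not part of any specific product or standard.
    # Compares packages declared in requirements.txt against an approved
    # allowlist and reports anything that has not been vetted.
    from pathlib import Path

    APPROVED = Path("approved_packages.txt")   # hypothetical allowlist, one name per line
    DECLARED = Path("requirements.txt")        # the project's dependency manifest


    def package_names(path: Path) -> set[str]:
        """Extract bare package names, ignoring comments, blanks, and version pins."""
        names = set()
        for line in path.read_text().splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Strip version specifiers such as "requests==2.31.0" or "pydantic>=2".
            for sep in ("==", ">=", "<=", "~=", ">", "<"):
                line = line.split(sep)[0]
            names.add(line.strip().lower())
        return names


    def main() -> int:
        unapproved = sorted(package_names(DECLARED) - package_names(APPROVED))
        if unapproved:
            print("Unapproved dependencies found (possible shadow code):")
            for name in unapproved:
                print(f"  - {name}")
            return 1  # non-zero exit fails the CI job
        print("All declared dependencies are on the approved list.")
        return 0


    if __name__ == "__main__":
        raise SystemExit(main())

Wired into the CI pipeline, a check like this turns the approval list into an enforced gate rather than a document nobody reads, and its failure message points reviewers straight at the package that needs vetting.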

Establish Clear Policies

A lack of policy is often the green light for shadow coding. Develop explicit guidelines around:

  • Generative AI usage with conditions (no sensitive data, human review required) – one way to back the no-sensitive-data rule with tooling is sketched after this list
  • Process for introducing new libraries or code from personal projects
  • Clear ownership and documentation requirements for all production code
  • Communication of the "why" behind policies to increase developer buy-in
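
To give the "no sensitive data" condition some teeth, here is a minimal sketch of a pre-commit check that scans staged changes for credential-like strings before they can be committed or pasted into an external AI service. The wiring as a git hook and the regular expressions below are assumptions for illustration – they are deliberately crude, and a purpose-built secret scanner would normally be preferred – but the sketch shows how a written policy can be backed by an automated guardrail.

    # secret_precommit_check.py – illustrative sketch; the patterns are rough
    # placeholders, not a complete secret scanner.
    import re
    import subprocess
    import sys

    # Rough heuristics for common credential shapes; real scanners use far
    # richer rule sets (entropy checks, provider-specific token formats, etc.).
    PATTERNS = {
        "AWS access key": re.compile(r"AKIA[0-9A-Z]{16}"),
        "Generic API key": re.compile(r"(?i)(api[_-]?key|secret|token)\s*[:=]\s*['\"][^'\"]{16,}['\"]"),
        "Private key header": re.compile(r"-----BEGIN (RSA |EC )?PRIVATE KEY-----"),
    }


    def staged_diff() -> str:
        """Return the text of the currently staged changes."""
        return subprocess.run(
            ["git", "diff", "--cached", "--unified=0"],
            capture_output=True, text=True, check=True,
        ).stdout


    def main() -> int:
        findings = []
        for line in staged_diff().splitlines():
            # Only inspect added lines; skip diff file headers like "+++ b/...".
            if not line.startswith("+") or line.startswith("+++"):
                continue
            for label, pattern in PATTERNS.items():
                if pattern.search(line):
                    findings.append((label, line.strip()[:80]))
        if findings:
            print("Possible secrets detected in staged changes:")
            for label, snippet in findings:
                print(f"  [{label}] {snippet}")
            print("Commit blocked; move the credential to an approved secrets store.")
            return 1
        return 0


    if __name__ == "__main__":
        sys.exit(main())

Run as a git pre-commit hook or as a CI step, a check like this catches the kind of hard-coded token that turned a personal side project into the Azure exposure described earlier, without asking developers to change how they work.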

Foster an Open, Blameless Culture

A culture shift may be needed to bring shadow coding into the daylight:

  • Encourage sharing of productivity hacks and side projects without fear of punishment
  • Create forums for devs to share techniques with security and ops input
  • Provide developer-focused security awareness training
  • Treat developers as partners in governance, not subjects of it

Provide Safe Sandboxes for Innovation

Balance governance with creativity through sanctioned experimentation environments:

  • Create innovation sandboxes with dummy data for trying new tools and AI
  • Establish pathways to bring successful sandbox projects into production properly
  • Consider adopting a "20% time" concept for innovative projects
  • Keep innovation visible to management while allowing creative freedom

Adopt Tools and AI Officially (When Justified)

If you notice developers gravitating towards particular solutions unofficially, it's a strong signal that the tool provides value. Rather than fight the tide, evaluate it from a top-down perspective:

  • Consider deploying enterprise-vetted AI coding tools with proper security configurations
  • Evaluate and approve popular open-source components for general use
  • Partner with security and compliance teams to safely configure and monitor these tools
  • As one Forbes Tech Council piece noted, building added diligence into the implementation process can balance speed and security

raxIT AI Perspective

What we're calling "shadow coding" is emerging as a governance challenge of our times, born from the incredible new capabilities (and temptations) that AI and open ecosystems offer developers. It echoes the rise of shadow IT in how it surfaces – from the bottom-up, out of impatience or creativity – and likewise demands a thoughtful response.

At raxIT AI, we understand this challenge from both sides. Our AI Governance platform is designed to bring shadow coding practices into the light without stifling the innovation they represent. Our approach includes:

  1. Automated discovery tools that identify AI usage and shadow code across the enterprise
  2. Risk assessment agents that evaluate the security implications of unvetted code
  3. Governance workflows that streamline approval processes, reducing the incentive for shadow coding
  4. Security guardrails that allow innovation while preventing dangerous practices

The aim is to maintain the spirit of innovation that shadow coding reflects, while mitigating the attendant risks of non-compliance, vulnerabilities, and chaos. Senior leaders and boards should view this as an opportunity: engage your developers about the tools and methods they're finding useful, update your governance to accommodate beneficial new practices, and clamp down on the truly dangerous behaviors.

With the right balance, you can harness the productivity gains of AI and modern development practices without letting your software environment slip into the shadows. By shining a light on shadow coding now, organizations will ensure that fast code doesn't become "dark code," and that innovation and security go hand in hand.