You Have More AI Than You Think
When we ask businesses, "How many AI systems do you use?" the answer is usually 2-3. When we help them conduct a proper inventory, the real number is 10-25.
The gap is **shadow AI** — artificial intelligence features embedded in everyday software tools that businesses use without realizing they're "using AI." Your CRM scores leads. Your email platform optimizes send times. Your ATS ranks resumes. Your chatbot routes customers. Your accounting software flags anomalies.
Every one of these is an AI system. Under SB 24-205, every one that makes or assists in consequential decisions requires compliance. And you can't comply with what you don't know you have.
The Most Common Shadow AI Systems
**HR and Hiring:**
- ATS resume screening and ranking (Greenhouse, Lever, Workday, iCIMS)
- AI-powered interview scheduling and candidate matching
- Performance review AI (15Five, Lattice, Culture Amp)
- Employee sentiment analysis
**Sales and Marketing:**
- CRM lead scoring (Salesforce Einstein, HubSpot predictive scoring)
- Email send-time optimization (Mailchimp, Marketo)
- Ad targeting algorithms (Google Ads, Meta Ads)
- Customer segmentation and personalization
**Customer Service:**
- Chatbot routing and response (Intercom, Zendesk, Drift)
- Automated ticket classification
- Sentiment analysis on support interactions
- Customer churn prediction
**Finance:**
- Fraud detection (Stripe Radar, PayPal risk scoring)
- Automated underwriting/credit decisions
- Invoice matching and anomaly detection
- Revenue forecasting
**Operations:**
- Predictive maintenance scheduling
- Inventory optimization
- Dynamic pricing
- Supply chain risk scoring
If any of these tools affect consequential decisions about people — and many do — they require SB 24-205 compliance.
Why Shadow AI Is Dangerous Under SB 24-205
**1. You can't audit what you don't inventory.**
Bias auditing requires knowing which AI systems exist. Shadow AI bypasses your governance program entirely.
**2. No disclosure means automatic non-compliance.**
If consumers aren't notified that AI is used in decisions affecting them, every decision is a violation — regardless of whether the AI is biased.
**3. Shadow AI accumulates liability silently.**
A CRM lead-scoring algorithm that has been running for two years without compliance documentation represents two years of potential violations, one for every lead it scored.
**4. "I didn't know" is not a defense.**
SB 24-205 requires reasonable care. Failing to inventory your AI systems falls short of reasonable care by definition. The Colorado Attorney General won't accept "we didn't know our ATS had AI."
**5. Shadow AI often has the worst bias.**
AI features added to SaaS tools as afterthoughts receive less bias testing than purpose-built AI systems. The tools you don't know about are often the ones with the most bias.
How to Find Your Shadow AI
**Step 1: Software audit** — List every SaaS tool and software platform your organization uses. Every department. Every team. Every individual subscription. Include free tools and browser extensions.
**Step 2: AI feature scan** — For each tool, check: Does it score, rank, classify, predict, recommend, or automate decisions about people? If yes, it likely uses AI.
**Step 3: Vendor inquiry** — Contact each vendor and ask directly: "Does your product use AI, machine learning, or automated decision-making in any features we use?" Request documentation of AI features.
**Step 4: Consequential decision mapping** — For each AI feature identified, determine whether it makes or assists in consequential decisions (employment, financial, healthcare, housing, education, legal, insurance).
**Step 5: Risk classification** — Classify each discovered AI system as high-risk (consequential decisions) or standard. High-risk systems require full SB 24-205 governance.
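As a rough illustration, steps 4 and 5 can be reduced to a simple rule: a tool is high-risk if it has any AI feature *and* that feature touches one of the consequential-decision domains. The sketch below is hypothetical; the `Tool` record, field names, and classification rule are illustrative assumptions, not part of SB 24-205 or any vendor's API.

```python
from dataclasses import dataclass

# Consequential-decision domains listed in Step 4 of the discovery process.
CONSEQUENTIAL_DOMAINS = {
    "employment", "financial", "healthcare", "housing",
    "education", "legal", "insurance",
}

@dataclass
class Tool:
    name: str
    department: str
    ai_features: list    # e.g. ["lead scoring", "churn prediction"]
    decision_domains: set  # consequential domains the tool touches, if any

def classify(tool: Tool) -> str:
    """Step 5: high-risk if any AI feature touches a consequential domain."""
    if tool.ai_features and tool.decision_domains & CONSEQUENTIAL_DOMAINS:
        return "high-risk"
    return "standard"

inventory = [
    Tool("CRM", "Sales", ["lead scoring"], {"financial"}),
    Tool("ATS", "HR", ["resume ranking"], {"employment"}),
    Tool("Analytics", "Marketing", ["audience insights"], set()),
]

for tool in inventory:
    print(f"{tool.name}: {classify(tool)}")
# CRM: high-risk
# ATS: high-risk
# Analytics: standard
```

Note that the analytics tool still uses AI but lands in the "standard" bucket because it doesn't feed consequential decisions; the inventory should record it anyway, since a future feature change can flip its classification.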
**Common discovery surprises:**
- Salesforce Einstein is AI (most Salesforce users don't realize this)
- Grammarly's tone suggestions are AI
- Your email platform's "best time to send" is AI
- Google Analytics' audience insights use AI
- Your ATS has had AI screening for years
The Shadow AI Inventory Checklist
**Ask every department head these questions:**
1. What software tools does your team use daily?
2. Do any of these tools automatically score, rank, or classify people (customers, applicants, patients, tenants)?
3. Do any tools make recommendations about who to contact, hire, approve, or deny?
4. Have you adopted any new tools in the past 12 months with AI or "smart" features?
5. Do any team members use personal AI tools (ChatGPT, Claude, Copilot) for work decisions?
6. Are there any automated workflows that make decisions without human review?
**Red flag departments** (highest shadow AI risk):
- Sales (CRM scoring, lead routing)
- HR/Recruiting (ATS screening, performance AI)
- Customer Support (chatbots, ticket routing)
- Marketing (ad targeting, personalization)
- Finance (fraud detection, underwriting)
- IT (security AI, access management)
From Shadow to Governed
Once you've found your shadow AI, bring it into your governance program:
1. **Register** each discovered AI system in your compliance platform
2. **Classify** its risk level (consequential decision involvement)
3. **Audit** high-risk systems for bias immediately
4. **Disclose** AI involvement to affected consumers
5. **Monitor** continuously for bias and performance drift
6. **Document** everything in your evidence trail
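One way to keep discovered systems from stalling partway through this workflow is to treat the six steps as an ordered lifecycle and refuse out-of-order transitions. The sketch below is only an illustration of that ordering; the class and stage names are assumptions, not a real CO-AIMS API.

```python
# The stage names mirror the six governance steps above.
STAGES = ["registered", "classified", "audited",
          "disclosed", "monitored", "documented"]

class GovernedSystem:
    """Tracks a discovered AI system through the governance lifecycle."""

    def __init__(self, name: str):
        self.name = name
        self.completed: list = []

    def advance(self, stage: str) -> None:
        # Enforce order: a system cannot be audited before it is
        # classified, disclosed before it is audited, and so on.
        expected = STAGES[len(self.completed)]
        if stage != expected:
            raise ValueError(f"expected '{expected}' next, got '{stage}'")
        self.completed.append(stage)

crm = GovernedSystem("CRM lead scoring")
crm.advance("registered")
crm.advance("classified")
print(crm.completed)
# ['registered', 'classified']
```

A guard like this makes gaps visible: any system whose `completed` list stops short of "documented" is still accumulating un-evidenced decisions.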
CO-AIMS makes this transition straightforward. Register systems in the dashboard, configure automated auditing, and generate evidence bundles — even for AI you just discovered yesterday. The platform doesn't judge when you found the AI; it helps you govern it from today forward.
The June 30, 2026 deadline applies to all your AI — including the systems you haven't found yet. Start the inventory now.
Automate Your Colorado AI Compliance
CO-AIMS handles bias audits, impact assessments, consumer disclosures, and evidence bundles — so you can focus on your business.