How to Conduct an AI Bias Audit: Step-by-Step Guide
What Is an AI Bias Audit?
An AI bias audit is a structured statistical analysis of an AI system's outputs to detect disparate impact across protected classes. Under Colorado SB 24-205, deployers must actively monitor for "algorithmic discrimination" — when AI systems produce outcomes that disproportionately harm people based on race, color, ethnicity, sex, religion, age, disability, or other protected characteristics.
This isn't an academic exercise. It's a regulatory requirement with teeth. The audit must be documented, repeatable, and defensible — because it may need to be produced for the Attorney General.
The 5-Step Bias Audit Process
Step 1: Define the Decision and Outcome
Clearly document what decision the AI system makes, what a "positive" outcome is (approved, hired, recommended), and what a "negative" outcome is (denied, rejected, excluded). This framing determines everything downstream.
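One way to pin that framing down is as a record schema. This is a hypothetical sketch — the field names are not from SB 24-205 or any particular tool:

```python
from dataclasses import dataclass

@dataclass
class DecisionRecord:
    """One AI-assisted decision, framed per Step 1 (hypothetical schema)."""
    decision_id: str
    outcome: str                # "approved"/"hired" = positive; "denied" = negative
    attributes: dict[str, str]  # protected-class attributes, e.g. {"sex": "female"}
```

Settling on a schema like this up front keeps the positive/negative framing consistent across every later audit run.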
Step 2: Identify Protected Classes
Colorado law protects against discrimination based on race, color, national origin, sex, religion, age (40+), disability, sexual orientation, and gender identity. Your audit must analyze outcomes across each applicable class.
Step 3: Collect Outcome Data
Gather decision data segmented by protected class. You need a sufficient sample size — as a rule of thumb, 100 or more decisions per group — for the statistical tests in Step 4 to be reliable. If sample sizes are small, document the limitation and consider aggregation strategies.
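A minimal sketch of the sample-size check, assuming decisions arrive as (group, outcome) pairs; the 100-decision floor is the rule of thumb above, not a statutory number:

```python
from collections import Counter

MIN_GROUP_SIZE = 100  # rule-of-thumb floor from Step 3, not a legal requirement

def undersized_groups(decisions: list[tuple[str, str]]) -> dict[str, int]:
    """Return protected-class groups with too few decisions to test reliably."""
    counts = Counter(group for group, _outcome in decisions)
    return {g: n for g, n in counts.items() if n < MIN_GROUP_SIZE}
```

Any group this returns should be documented as a limitation or folded into an aggregation strategy before you move to Step 4.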
Step 4: Apply Statistical Tests
Three standard methodologies, illustrated in the code sketch after this list:
- Disparate Impact Ratio — The 4/5ths (80%) rule: if the selection rate for any protected group is less than 80% of the rate for the group with the highest selection rate, disparate impact is indicated.
- Statistical Significance Testing — Chi-squared or Fisher's exact tests to determine whether outcome differences are statistically meaningful (p < 0.05).
- Demographic Parity — Comparing positive-outcome rates across groups, with an allowable deviation threshold (typically 5-10 percentage points).
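Here is a minimal sketch of all three tests on a made-up two-group example. It assumes scipy is installed; the group names, counts, and thresholds are illustrative only:

```python
from scipy.stats import chi2_contingency, fisher_exact

# Illustrative counts only: group -> (positive outcomes, total decisions).
counts = {"group_a": (300, 500), "group_b": (180, 450)}
rates = {g: pos / total for g, (pos, total) in counts.items()}

# 1. Disparate impact ratio: each group's selection rate vs. the highest rate.
highest = max(rates.values())
below_fourfifths = {g: r / highest for g, r in rates.items() if r / highest < 0.80}

# 2. Statistical significance on the positive/negative contingency table.
table = [[pos, total - pos] for pos, total in counts.values()]
chi2, p, dof, expected = chi2_contingency(table)
odds_ratio, p_exact = fisher_exact(table)  # better suited to small 2x2 tables
significant = p < 0.05

# 3. Demographic parity: spread of positive-outcome rates, in percentage points.
parity_gap = (max(rates.values()) - min(rates.values())) * 100
parity_flag = parity_gap > 10  # example threshold from the bullet above
```

In practice you would run this for each protected class and keep the intermediate numbers, since the written report in Step 5 must show the methodology, not just the verdict.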
Step 5: Document and Remediate
Every audit must produce a written report documenting methodology, data sources, findings, and remediation plan. If disparate impact is found, you have 90 days to remediate and report to the AG.
Common Pitfalls to Avoid
- Insufficient sample sizes — Small datasets produce unreliable results. Document minimum thresholds and confidence intervals.
- Missing intersectionality — Auditing race and gender separately can miss discrimination that affects, for example, Black women specifically. Consider intersectional analysis (see the sketch after this list).
- One-time audits — AI systems drift over time as data changes. Monthly or quarterly audits catch drift before it becomes discrimination.
- Proxy variables — AI can discriminate via proxy (ZIP code for race, name for ethnicity). Include proxy analysis in your methodology.
- Ignoring the training data — If your training data reflects historical discrimination, the model will reproduce it. Audit inputs, not just outputs.
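For the intersectionality point above, a minimal sketch of compound grouping; the attribute keys and records are hypothetical:

```python
from collections import Counter

# Hypothetical records carrying the attributes identified in Step 2.
decisions = [
    {"race": "Black", "sex": "female", "outcome": "approved"},
    {"race": "Black", "sex": "female", "outcome": "denied"},
    # ... more records
]

# Group on (race, sex) jointly so that discrimination invisible in
# single-axis audits (e.g. against Black women specifically) surfaces.
totals, positives = Counter(), Counter()
for d in decisions:
    key = (d["race"], d["sex"])
    totals[key] += 1
    positives[key] += d["outcome"] == "approved"

intersectional_rates = {k: positives[k] / totals[k] for k in totals}
```

Each compound group then goes through the same Step 4 tests, with the sample-size caveat from Step 3 applying even more strongly, since intersectional groups are smaller by construction.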
Automating Bias Audits with CO-AIMS
Manual bias audits work for one-time assessments but don't scale. CO-AIMS automates the entire process:
- Continuous monitoring — Automated statistical analysis runs on a configurable schedule (monthly default)
- Multi-metric analysis — Disparate impact ratio, statistical significance, and demographic parity calculated simultaneously
- Intersectional analysis — Automatically tests combinations of protected classes
- Alert thresholds — Configurable triggers when metrics exceed acceptable bounds
- Audit trail — Every audit is timestamped, methodology is documented, and results are stored for three years
When an audit flags an issue, CO-AIMS generates a remediation plan with specific steps mapped to the finding, tracks implementation, and verifies the fix in the next audit cycle.
Frequently Asked Questions
How often should I conduct AI bias audits?
Colorado SB 24-205 requires ongoing monitoring without specifying a frequency. Best practice is monthly bias audits for high-risk systems. At minimum, conduct quarterly audits and always re-audit after system changes or data updates.
What is the 4/5ths rule for disparate impact?
The 4/5ths (or 80%) rule states that if the selection rate for any protected group is less than 80% of the rate for the group with the highest selection rate, disparate impact is indicated. For example, if 60% of men are approved but only 40% of women, the ratio is 0.40/0.60 ≈ 0.67, below the 0.80 threshold.
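The example's arithmetic, checked in code:

```python
men_rate, women_rate = 0.60, 0.40
impact_ratio = women_rate / men_rate  # ≈ 0.67
assert impact_ratio < 0.80            # below the 4/5ths threshold
```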
What happens when a bias audit finds discrimination?
Under SB 24-205, discovery of algorithmic discrimination triggers a 90-day remediation and Attorney General notification obligation. You must document the finding, implement corrective measures, verify the fix, and report the incident to the AG within 90 days.
Automate Your Colorado AI Compliance
CO-AIMS handles bias audits, impact assessments, consumer disclosures, and evidence bundles — so you can focus on your business.
AI Solutionist and founder of CO-AIMS. Building compliance infrastructure for Colorado's AI Act. Helping law firms, healthcare providers, and enterprises navigate SB 24-205 with automated governance.