AI Acceptable Use Policy
We are an AI company. Using AI tools is expected. The default answer is yes.
One hard rule: never put patient data into a tool without a signed Business Associate Agreement (BAA). Everything else is good judgment. Use whatever helps you move fast.
Purpose and Philosophy
We build with AI and ship with AI. This policy answers one question: what data can go where?
We operate "Default Open." Any AI tool is permitted for non-PHI work. No approval process, no waiting. The one non-negotiable: PHI only goes into tools covered by a signed BAA.
Company IP flowing through AI tools is an accepted risk. Leadership has chosen speed over lockdown.
Data Classification
PHI (Protected Health Information)
Any data that identifies a patient and relates to their health, treatment, or payment: names, DOBs, MRNs, diagnosis codes, clinical notes, claims data, or any combination that could identify an individual.
Hard rule: PHI only goes into Tier 1 (BAA-covered) tools. If it came from a patient record, it's PHI.
Sensitive Business Data
Financials, M&A strategy, employee PII, salary data, legal docs, investor materials. Prefer company-managed accounts. No hard block, but think twice before putting the cap table into a free chatbot.
General Business Data
Code, docs, research, specs, ideation, marketing, presentations, emails. Fully open. Use anything.
Data Residency & Provider Awareness
Be intentional about what data goes into any external service. Not all AI providers are equal when it comes to where your data ends up and what legal protections apply.
⚠️ Non-US Jurisdiction Providers
Cloud-hosted models under non-US jurisdiction deserve extra scrutiny. If a provider operates under a foreign government's data access laws, you have limited or no legal recourse for how your data is handled.
This applies specifically to cloud-hosted models under Chinese jurisdiction (e.g., DeepSeek cloud, Qwen cloud), where data is subject to China's national security laws with no US legal recourse.
✅ Local Models Are Fine
Running open-source models locally (DeepSeek, Qwen, Llama, etc. via Ollama, LM Studio, or self-hosted) is perfectly fine. Your data never leaves your machine. No data residency concern applies.
Extend general caution to all external providers when working with sensitive internal data, client information, pricing, or deal terms. This is a data residency awareness principle, not a blacklist. Match the sensitivity of the data to the trust level of the destination.
The One Rule: PHI
This is a HIPAA requirement: PHI goes only into tools covered by a signed BAA. No workaround, no exception, no "it's just a test." Violating it creates regulatory exposure for the company and for our patients.
Note: Most employees will never need Tier 1 access because their work does not involve PHI. The list of BAA-covered tools will be published once confirmed. If your role requires handling PHI, contact [email protected]. If you're unsure whether something is PHI, default to keeping it out and ask.
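For engineers who want a quick sanity check before pasting text into a non-Tier-1 tool, a pattern screen like the sketch below can catch obvious identifiers. The pattern names and regexes are illustrative assumptions, not an official control: a regex match list is never proof text is PHI-free, and it does not replace the "when unsure, keep it out and ask" rule.

```python
import re

# Illustrative-only screen for obvious PHI-like patterns. The patterns
# below are assumptions for this sketch, not a compliance control.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date_of_birth": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{6,10}\b", re.IGNORECASE),
}

def flag_possible_phi(text: str) -> list[str]:
    """Return names of PHI-like patterns found in text.

    An empty list means no pattern matched; it does NOT mean the
    text is safe to paste into a non-BAA tool.
    """
    return [name for name, rx in PHI_PATTERNS.items() if rx.search(text)]
```

Treat any non-empty result as a hard stop; treat an empty result as "still use judgment."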
Approved Tool Tiers
Nothing is blocked. Tiers define what data you can put into a given tool.
Tier 1 (BAA on file; the only tier permitted for PHI):
- AWS (BAA on file)

Tier 2 (company-managed accounts; suitable for sensitive business data):
- Lovable (company team account)
- Genspark (company account)
- Fireflies (company account)

Tier 3 (personal and public accounts; non-PHI work):
- ChatGPT, Claude, Gemini, Perplexity, Grok, Mistral (personal accounts)
- Any other publicly available AI tool
- Local models (Ollama, LM Studio, etc.)
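The tier rule reduces to a simple lookup: match the data classification to the tool's tier. The sketch below illustrates the logic; the tool-to-tier mapping and the `can_use` helper are hypothetical examples, not the official AI Tool Inventory.

```python
# Which tiers each data classification may flow into, per policy:
# PHI -> Tier 1 only; sensitive -> prefer company-managed (Tiers 1-2);
# general -> any tier.
ALLOWED_TIERS = {
    "phi": {1},
    "sensitive": {1, 2},
    "general": {1, 2, 3},
}

# Illustrative mapping only; the real inventory is maintained separately.
TOOL_TIERS = {
    "aws": 1,        # BAA on file
    "lovable": 2,    # company team account
    "fireflies": 2,  # company account
    "chatgpt": 3,    # personal account
}

def can_use(tool: str, data_class: str) -> bool:
    """Return True if the data classification may go into the tool."""
    tier = TOOL_TIERS.get(tool.lower())
    if tier is None:
        # Unknown tools default to general business data only.
        return data_class == "general"
    return tier in ALLOWED_TIERS[data_class]
```

The useful property of this framing is that the PHI rule falls out automatically: no tool outside Tier 1 can ever return `True` for PHI.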
AI Tool Budget
Every employee gets a monthly budget for AI and productivity tools, charged to the company card (RAMP).
Account Guidelines
- Use company-managed accounts when they exist. They provide audit trails and centralized offboarding.
- Personal accounts are fine for non-PHI, non-sensitive work.
- Don't bypass SSO. If a tool has company SSO, use it. Don't create separate personal accounts.
- Don't share credentials. Provision access through IT, not by sharing logins.
What You Must NOT Do
- Do not input PHI into any tool without a signed BAA.
- Do not disable security features or audit logging on company-managed tools.
- Do not treat AI output as a final clinical decision; human review is required.
- Do not share company-managed account credentials.
Compliance Alignment
HIPAA
BAA requirement for any vendor handling PHI. Minimum necessary principle applies: only share PHI actually needed for the task, even in Tier 1 tools.
SOC 2
Company-managed accounts provide the audit trail and vendor management controls required for Type II evidence. The AI Tool Inventory serves as part of our vendor management program.
HITRUST CSF
- AI Tool Inventory: Maintained list of tools in use, data classifications, BAA status, and risk tier.
- Annual BIA: AI systems touching PHI or critical processes are included in the annual Business Impact Analysis.
- Risk Documentation: This policy, the tool inventory, and incident logs together form our HITRUST AI risk management documentation.
Reporting and Questions
Policy or tool questions
Nick Brueggeman, AI Operations: [email protected]
Potential PHI exposure
Contact Security immediately at [email protected]. Do not wait.
This policy is a living document. If something creates unnecessary friction, say so.
Version and Review
| Version | Date | Author | Notes |
|---|---|---|---|
| 0.1 | March 2026 | Nick Brueggeman | Initial draft for ops review |
| 0.2 | March 2026 | Nick Brueggeman | Added tool budget, data residency guidance, corrected tool tiers |
This policy is reviewed quarterly or when significant changes occur: new tools adopted company-wide, regulatory updates, or incidents that surface a gap.
Next scheduled review: June 2026