
Privacy Compliance Guide

Bill C-27 and the CPPA:
What Every Healthcare Agency
Needs to Know

Canada's most significant privacy overhaul in 20 years has been proposed but not yet enacted. Bill C-27 died when Parliament was prorogued in January 2025, but the direction of reform is clear. If your agency uses AI, stores patient data, or has staff using tools like ChatGPT, you need to prepare now.

Last updated: March 14, 2026

1. What is the CPPA?

The Consumer Privacy Protection Act (CPPA) was the centrepiece of Bill C-27, also known as the Digital Charter Implementation Act, 2022. It was designed to replace Part 1 of PIPEDA (the Personal Information Protection and Electronic Documents Act), which has governed private-sector privacy in Canada since 2001. Bill C-27 died on the Order Paper when Parliament was prorogued in January 2025, but federal privacy reform remains a priority and a new bill is expected.

PIPEDA was written before social media, cloud computing, and artificial intelligence became part of daily operations. The CPPA was designed to modernize Canadian privacy law to address how organizations actually collect, use, and process personal information today. While the bill itself has not passed, the proposed changes signal the direction of future regulation, and PIPEDA still applies.

Bill C-27 contains three parts:

  • Part 1: Consumer Privacy Protection Act (CPPA) replaces PIPEDA's commercial privacy rules with stronger consent requirements, new rights for individuals, and real enforcement teeth.
  • Part 2: Personal Information and Data Protection Tribunal Act creates a new tribunal to hear appeals and impose penalties.
  • Part 3: Artificial Intelligence and Data Act (AIDA) establishes Canada's first regulatory framework for AI systems, including requirements for "high-impact" AI.

Key difference from PIPEDA

The CPPA introduces administrative monetary penalties of up to $10 million or 3% of global revenue (whichever is greater) for non-compliance, and up to $25 million or 5% of global revenue for the most serious offences. PIPEDA had no such penalties.

2. PIPEDA vs. CPPA: What's Changing

For organizations already complying with PIPEDA, the CPPA introduces several important changes:

  • Consent: PIPEDA has broad consent requirements with some ambiguity around implied consent; the CPPA requires explicit, meaningful consent in plain language, with purposes that are specific and understandable.
  • Right to delete: PIPEDA allows only limited withdrawal of consent; the CPPA grants a full right to disposal (deletion) of personal information upon request.
  • Data portability: not addressed under PIPEDA; the CPPA creates a new right to data mobility, allowing personal information to be transferred between organizations.
  • Automated decisions: not addressed under PIPEDA; the CPPA grants a right to an explanation when automated decisions significantly affect individuals.
  • Penalties: PIPEDA relies on complaints to the Privacy Commissioner with limited enforcement; the CPPA provides penalties of up to $25 million or 5% of global revenue, with a new tribunal for enforcement.
  • De-identified data: minimal guidance under PIPEDA; the CPPA sets formal rules for de-identification and imposes obligations on organizations that hold de-identified data.
  • Service providers: under PIPEDA, accountability transfers with the data but stays contractual; under the CPPA, service providers have direct obligations under the law, not just contractual ones.

3. The Artificial Intelligence and Data Act (AIDA)

Part 3 of Bill C-27 introduced AIDA, Canada's first dedicated AI regulation. Although the bill did not pass, the direction is clear: organizations using AI systems that make or influence decisions about individuals will face new requirements.

What qualifies as "high-impact" AI?

The exact criteria will be defined by regulation, but the bill targets AI systems used in contexts where decisions can significantly affect individuals. Healthcare is explicitly cited as a high-impact domain. If your AI system influences:

  • Client care decisions or recommendations
  • Staff scheduling or resource allocation for care delivery
  • Medication administration workflows
  • Incident reporting or escalation
  • Eligibility determinations for services

...then it would likely be classified as high-impact under AIDA or a successor framework.

What AIDA requires for high-impact AI

  • Impact assessments before deploying AI systems
  • Mitigation measures for risks of harm or bias
  • Transparency, including plain-language descriptions of how the system works
  • Human oversight mechanisms for automated decisions
  • Record keeping to demonstrate compliance
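The record-keeping and human-oversight requirements above can be made concrete. Below is a minimal sketch, under our own assumptions, of a decision log an agency might keep for AI-assisted recommendations; the field names and file format are illustrative, not drawn from AIDA itself:

```python
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class AIDecisionRecord:
    """One auditable entry for an AI-assisted decision (illustrative only)."""
    system_name: str      # which AI tool produced the output
    purpose: str          # plain-language purpose, per transparency duties
    input_summary: str    # description of inputs, NOT the raw PHI itself
    output_summary: str   # what the system recommended
    human_reviewer: str   # who exercised oversight
    human_decision: str   # "accepted", "modified", or "rejected"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

def append_record(record: AIDecisionRecord, path: str = "ai_decision_log.jsonl") -> None:
    # Append-only JSON Lines log: one record per line, easy to audit later.
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")
```

An append-only log like this gives you both the record-keeping evidence and a place to capture the human reviewer's decision, which is the core of an oversight mechanism.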

Why this matters for your agency

Even if you are not building AI systems yourself, if your staff use AI tools (ChatGPT, Copilot, Gemini) in their work with clients, your agency may be considered an operator of a high-impact AI system under AIDA.

4. The ChatGPT Problem in Healthcare

This is happening right now

Staff at healthcare and social services agencies across Ontario are using ChatGPT, Google Gemini, and other consumer AI tools to draft incident reports, summarize client notes, compose emails about clients, and generate care plans. Every one of these actions sends personal health information to servers outside of Canada, outside of the agency's control, and outside of any compliance framework.

This is not a hypothetical risk. It is the single largest unaddressed privacy exposure in the developmental services sector today. Here is what happens when a staff member pastes a client's behavioural incident into ChatGPT:

1. Data leaves Canada

ChatGPT processes data on US servers. This is a cross-border transfer of personal health information without the individual's knowledge or consent, which creates compliance problems under both PIPEDA and PHIPA.

2. No data processing agreement exists

Your agency has no contract with OpenAI governing how that data is used, retained, or protected. Under the proposed CPPA, service providers would have direct legal obligations, but without a formal agreement there is no accountability chain.

3. Training data exposure

Unless enterprise settings are explicitly configured, input data may be used to train future model versions. Your client's personal health information could influence responses given to other users.

4. No audit trail

There is no record of what was sent, when, or by whom. When the Privacy Commissioner investigates, your agency cannot demonstrate what personal information was exposed or to what extent.

5. Regulatory exposure is growing

Under PIPEDA, the Privacy Commissioner can already investigate and publicly name non-compliant organizations. The proposed CPPA would add administrative monetary penalties of up to $10 million or 3% of global revenue. The defence of "we didn't know staff were doing it" does not hold when the organization failed to implement adequate safeguards.
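Adequate safeguards can include technical controls, not just policy. Here is a minimal sketch of a pre-send check that flags obviously identifying text before it reaches any external AI tool; the patterns are illustrative assumptions, and real PHI detection requires far more than a few regexes:

```python
import re

# Illustrative patterns only; real PHI detection is much harder than this.
PHI_PATTERNS = {
    "health card number": re.compile(r"\b\d{4}[- ]?\d{3}[- ]?\d{3}\b"),
    "date of birth": re.compile(r"\b(19|20)\d{2}-\d{2}-\d{2}\b"),
    "phone number": re.compile(r"\b\d{3}[- .]\d{3}[- .]\d{4}\b"),
}

def check_before_send(text: str) -> list:
    """Return the names of PHI patterns found in text; empty means no match."""
    return [name for name, pat in PHI_PATTERNS.items() if pat.search(text)]
```

A wrapper around any outbound AI request could refuse to send (or require sign-off) whenever this check returns a non-empty list, giving you at least a first line of defence and a hook for audit logging.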

The scale of the problem

A 2024 survey by the Canadian Internet Registration Authority found that 30% of Canadian organizations had employees using generative AI tools without formal policies in place. In sectors like developmental services, where staff are stretched thin and documentation demands are high, the incentive to use AI shortcuts is enormous.

The solution is not to ban AI. Banning it drives usage underground, where it is even harder to monitor. The solution is to give your staff AI tools that are safe to use, tools that keep data in Canada, on infrastructure you control, with audit trails that satisfy regulators.

5. What This Means for Ontario DS Agencies

Ontario's developmental services agencies operate under a unique regulatory stack. You are already accountable to the Quality Assurance Measures (QAM) under the Services and Supports to Promote the Social Inclusion of Persons with Developmental Disabilities Act, to PHIPA for health information, and to PIPEDA for commercial activities. The CPPA adds another layer.

Three areas of immediate impact

Consent for AI use

If your agency uses any AI tools, whether for documentation, scheduling, incident analysis, or chatbots, the CPPA requires meaningful consent. This means explaining to clients and their substitute decision-makers, in plain language, that AI is being used, what data it processes, and how it affects them. Blanket consent buried in intake forms will not meet the new standard.

Vendor accountability

Under PIPEDA, when you share data with a service provider, accountability stays with your organization. The CPPA would change this: service providers would have direct obligations of their own, meaning your software vendors, including AI tool providers, would be directly liable. But it also means you need to verify that your vendors are actually compliant. "We use a big brand name" is not a due diligence defence.

Data residency becomes critical

The CPPA, combined with PHIPA requirements, makes the case for Canadian data residency stronger than ever. Personal health information processed by AI on US servers creates a regulatory exposure that no privacy policy can paper over. When the Privacy Commissioner asks where your clients' data lives, "on OpenAI's servers in the United States" is not an answer you want to give.

6. Where the CPPA Meets PHIPA

Ontario's Personal Health Information Protection Act (PHIPA) governs "health information custodians" and their handling of personal health information (PHI). Many DS agencies are custodians under PHIPA, or act as agents of custodians.

The CPPA does not replace PHIPA. Instead, the two laws will overlap, and organizations need to comply with both. Where they intersect:

  • PHIPA governs collection and use of personal health information by health information custodians. CPPA governs commercial activities involving personal information more broadly.
  • AI-specific provisions are new. PHIPA does not address automated decision-making or AI. The CPPA and AIDA fill this gap.
  • Penalties stack. A single incident involving AI and personal health information could trigger enforcement under both PHIPA and the CPPA. The combined exposure is substantial.
  • Data breach notification is already required under PHIPA. The CPPA adds its own breach notification requirements with potentially different timelines and thresholds.

Practical implication

Your privacy program needs to address both frameworks simultaneously. A policy that satisfies PHIPA alone will not be sufficient once the CPPA is in force. Organizations that are already proactive about compliance will have a much easier transition.

7. Your CPPA Compliance Checklist

While the final regulations are still being developed, these are the steps every Ontario DS agency should be taking now:

1. Audit your AI usage

Identify every AI tool your staff use, whether sanctioned or not. This includes ChatGPT, Copilot, Gemini, Grammarly, and any tool that processes text through a cloud API. Document what data is being sent and where it goes.
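If your agency keeps DNS or web-proxy logs, a first pass at this audit can be automated. The sketch below flags log lines that contact known consumer AI services; the domain list and log format are illustrative assumptions, not an exhaustive inventory:

```python
# Map of domains to the AI service they indicate.
# The domain list and the log format assumed below are illustrative only.
AI_SERVICE_DOMAINS = {
    "chat.openai.com": "ChatGPT",
    "chatgpt.com": "ChatGPT",
    "gemini.google.com": "Google Gemini",
    "copilot.microsoft.com": "Microsoft Copilot",
    "claude.ai": "Claude",
}

def flag_ai_traffic(log_lines):
    """Return (line_number, service) pairs for lines mentioning an AI domain."""
    hits = []
    for i, line in enumerate(log_lines, start=1):
        for domain, service in AI_SERVICE_DOMAINS.items():
            if domain in line:
                hits.append((i, service))
                break  # one hit per line is enough for the audit
    return hits
```

Run against each day's log export, the output tells you which workstations and which services to follow up on; it does not tell you what data was sent, which is exactly the audit-trail gap described above.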

2. Create an AI acceptable use policy

Define which tools are approved, what types of data can and cannot be processed through them, and what the consequences are for violations. Make this part of onboarding and annual training.

3. Update your consent framework

Review your intake forms, privacy notices, and consent documents. If you use or plan to use AI in any part of client service delivery, this needs to be disclosed in plain language with specific, meaningful consent.

4. Review vendor data residency

For every software vendor that handles personal information, confirm where data is processed and stored. Prioritize Canadian data residency. If a vendor cannot confirm Canadian processing, evaluate alternatives.

5. Implement data breach response procedures

Update your breach response plan to cover CPPA requirements alongside existing PHIPA obligations. Include AI-specific scenarios (e.g., "staff member pasted client PHI into a consumer AI tool").

6. Provide safe AI alternatives

If staff are using ChatGPT because documentation takes too long, the answer is not a ban. It is providing them with AI tools that are safe to use, tools that process data on Canadian infrastructure with full audit trails and no cross-border transfers.

7. Conduct a Privacy Impact Assessment for AI

Before deploying any AI system, even one provided by a vendor, conduct a PIA. Under AIDA, high-impact AI in healthcare would require formal impact assessments. Getting ahead of this now reduces your compliance burden later.

8. How Merakey Helps

We built Merakey specifically for organizations that cannot afford to get privacy wrong. Both of our products are designed from the ground up for Canadian regulatory requirements.

Meridian

QAM Compliance Scanning

  • Automated scans of your Home Portal data for QAM compliance gaps
  • Read-only access; we never modify your data
  • PDF compliance reports ready for ministry submissions
  • All processing on Canadian infrastructure (AWS ca-central-1)
  • Audit trail for every scan, satisfying both QAM and CPPA documentation requirements
Learn more about Meridian →

Sentinel

Self-Hosted AI Agent Platform

  • AI agents that run on infrastructure you control
  • No data leaves your network, ever
  • Full conversation audit logs for compliance
  • PIPEDA- and CPPA-ready by design
  • The safe alternative to staff using ChatGPT with patient data
Learn more about Sentinel →

Ready to get ahead of the CPPA?

Whether you need compliance scanning, safe AI tools for your staff, or both, we can help you build a privacy posture that satisfies PIPEDA, PHIPA, and the incoming CPPA.

Talk to Us About Compliance

Disclaimer: This page is for educational purposes only and does not constitute legal advice. Bill C-27 died on the Order Paper when Parliament was prorogued in January 2025. A new privacy reform bill is expected but has not yet been introduced. PIPEDA remains the governing federal privacy law. Organizations should consult qualified legal counsel for advice specific to their circumstances. Last updated: March 2026.