Thank you. It is a real pleasure to appear before you today.

My name is Felix Proulx-Giraldeau, and I'm joining you today as the Interim Executive Director at Evidence for Democracy.

We are Canada's leading non-partisan, not-for-profit organization championing the use of evidence in government, working to close the gap between decision-makers, like yourselves, and the best available science and evidence. We do this through original research, skills training and issues-based campaigns, because we believe Canadians benefit when governments make decisions informed by the best available science and evidence, leading to better outcomes for all.

We are pleased to be here today to discuss the impact of artificial intelligence in Canada. This impact is no longer theoretical. AI is rapidly reshaping how governments gather evidence, analyze information and make decisions. Its use across the federal public service is expanding quickly.

This new technology creates incredible opportunities for all Canadians, but it also creates serious risks for transparency, accountability, privacy, fairness and public trust.

Canadians are navigating one of the most turbulent periods in our history, one marked by rising cost pressures, geopolitical instability, climate-related disasters and intensifying disinformation threats. These are exactly the kinds of challenges where strong, evidence-informed policy matters most. AI technologies have the potential to be a game changer in this space, but only if they are governed with clear rules, robust oversight and meaningful public accountability.

First and foremost, we recommend that the federal government create an independent regulatory authority, the Digital Safety Commission, with enforcement powers and a clear mandate to coordinate national digital governance, including public-facing AI and social media, through a new online harms bill. This institution should be able to mandate data access, conduct algorithmic audits, and issue corrective orders and penalties, all while requiring stronger safety and design standards, including high-privacy defaults and child-impact assessments.

We also recommend introducing a statutory duty of care for AI developers, inspired by international best practices such as the UK Online Safety Act, the EU Digital Services Act and the EU AI Act.

Our research also highlights the risks of bias and discrimination that these AI systems can introduce. Poor data quality, opaque systems and weak oversight can reproduce or worsen harms already faced by marginalized communities, especially racialized Canadians. Those risks are particularly serious in high-stakes settings, such as immigration, health care and social services, where automated errors have had real and lasting consequences, including wrongful arrests, unfairly denied social service claims, and citizens unjustly stripped of their legal status.

This is precisely why transparency must also be strengthened. Canadians should be able to tell when AI is being used by their government, what data it relies on, and where that data came from. That is why we recommend visible labelling, digital watermarking, and source transparency requirements for AI systems and tools.

Finally, we believe that Canadians should have a meaningful voice in how AI is designed and governed. We recommend investigating standing citizens' assemblies or deliberative panels, as well as structured partnerships between civil society and academic researchers, and regular advisory councils with attention to youth and racialized stakeholders.

Democratic input is essential if AI systems are to serve the public interest.

This means doubling down on transparency, accountability, strong privacy and safety protections, robust oversight, and meaningful public participation. If we get these foundations right today, Canada can genuinely reduce harm, build public trust, and use science and evidence to better serve everyone.

Thank you for your time, and I look forward to your questions.
