Information and Privacy Commissioner of Ontario (IPC), https://www.ipc.on.ca

AI on campus: Balancing innovation and privacy in Ontario universities
https://www.ipc.on.ca/ai-on-campus-balancing-innovation-and-privacy-in-ontario-universities/
March 7, 2024

As in every other aspect of our lives, artificial intelligence (AI) is finding its way onto university and college campuses across Ontario. Given the rapid advancement of AI, this shouldn’t surprise us. Still, it does, time and again. It’s a telling reflection of the opacity that often surrounds these new technologies: we don’t see them coming until they’re already at work.

Last week, the discovery of facial recognition-enabled vending machines by a University of Waterloo student sparked widespread discussion and concern about the use of potentially privacy-invasive technologies at Ontario’s post-secondary institutions. Because my office is currently looking into complaints about this incident, I am not at liberty to speak about it. We will let our investigation reveal the facts as they are and carefully consider whether the use of these vending machines complies with Ontario’s public sector privacy law as it currently stands.

One case I can speak freely about is my recent investigation into the use of AI-enabled proctoring software at McMaster University. Following a complaint by a student who wished to remain anonymous, my office opened an investigation into McMaster’s use of Respondus exam proctoring software that began during the pandemic and has continued since.

This software includes two programs. Respondus LockDown Browser locks down certain functions of a student’s computer during online exams so they can’t conduct an internet search, access other files on their computer, use the copy-paste function, message, or screen share with others. Respondus Monitor collects sensitive biometric information and monitors students’ movements and behaviours through audio-video recordings, using AI to detect instances of potential cheating.

Our investigation examined the university’s use of the software in the context of the Freedom of Information and Protection of Privacy Act (FIPPA). We found that the university is lawfully authorized to conduct and proctor exams to ensure their academic integrity. Nothing legally prevented it from doing so online, both during and after the pandemic. This is a legitimate aim in a context of growing remote learning and the heightened risks of cheating associated with modern digital tools. But our analysis did not end there.

While the Respondus LockDown Browser collects minimal personal information, Respondus Monitor gathers more sensitive data, including biometric information, and makes consequential inferences about students’ movements and behavioural conduct through AI technology, raising significant privacy concerns.

Our investigation found that the university’s collection of students’ personal information was technically necessary for this exam proctoring software to function properly and, in this respect, complied with FIPPA. However, the use of students’ personal information by Respondus Monitor, the university’s inadequate notice to students about the purposes of this data collection, and the university’s insufficient safeguards to protect students’ personal information through its contractual arrangements with the company were found to contravene the act. Most concerning was the company’s non-consensual use of students’ audio and video recordings, including by third-party researchers, to improve system performance and enhance its services.

In the investigation report, I recommend that McMaster University introduce stronger measures to protect students’ personal information in the context of online exam proctoring and ensure an approach that balances academic integrity and student privacy rights. I also went on to make additional recommendations to address the broader privacy and ethical risks associated with the use of AI. I asked the university to report back to my office within six months on the implementation of these recommendations.

Recognizing the realities of my mandate, and in the absence of broader compliance tools under current law, issuing recommendations is all I could do. To be clear, McMaster University has been cooperative during the investigation. I have no reason to doubt the university will take my recommendations seriously. However, the fact that I cannot assure Ontarians that my recommendations will be acted upon and must rely on the goodwill of public institutions is simply not adequate. In today’s environment of significantly heightened risks associated with technology, including growing threats to cybersecurity and potentially invasive uses of AI, Ontario students — and all Ontarians — deserve better assurances than that.

Once again, I urge the Ontario government to modernize FIPPA and its municipal counterpart, MFIPPA. Our province needs a more robust privacy protection regime with stronger measures to address the increased risks of a rapidly evolving digital world. I also urge the government to finalize its Trustworthy AI Framework and make it binding on Ontario’s public sector, including the broader public sector, to ensure that technological innovation unfolds in an ethically responsible manner for the benefit of all Ontarians.

Educators and policymakers can foster an environment where technology enhances the quality and integrity of the educational experience without compromising the foundational principles Ontarians hold dear. We must ensure that the legacy of Ontario’s education system continues to be one of innovation, integrity, and inclusivity.

It’s been nearly 200 years since Ontario founded its first university. Much has changed since then. That change must continue if we are to prepare future leaders to embrace and improve the world that awaits them. We owe it to all students — particularly those who dare to question and hold their institutions accountable — to ensure their education does not come at the cost of their privacy or other human rights.

In the words of Dr. Martin Luther King Jr: “The function of education is to teach one to think intensively and to think critically. Intelligence plus character — that is the goal of true education.”

Artificial Intelligence in the public sector: Building trust now and for the future
https://www.ipc.on.ca/artificial-intelligence-in-the-public-sector-building-trust-now-and-for-the-future/
February 1, 2024

On January 24, 2024, the IPC had the pleasure of hosting Ontarians at a public event in celebration of Data Privacy Day. The theme was Modern Government: Artificial Intelligence in the Public Sector. If you weren’t able to attend in person or online, the webcast is available on our YouTube channel.

Here are a few highlights and key takeaways from the event.

Exhilarating promises of AI

AI technologies offer tremendous opportunities to improve public services. They can be used to fast track the processing and delivery of government benefits, inform decision-making by policymakers, and improve communications and engagement with citizens.

There is also a growing use of AI technologies to enable earlier diagnosis of complex health conditions, improve public safety, and respond to global emergencies.

Simply put, AI has the potential to transform the world as we know it today.

A 2023 survey by Global Government Forum found that more than one in ten Canadian public servants say they have used artificial intelligence tools such as ChatGPT in their work. This figure is likely to continue rising throughout 2024 as these technologies rapidly advance and become more commonly integrated into day-to-day work.

Associated risks and potential harms

While the opportunities of AI are promising, we know that there are risks. AI is not infallible and can lead to costly mistakes and unsafe outcomes for people.

Flawed algorithms can perpetuate biases embedded in the data used to train them, exacerbating the adverse impacts experienced by vulnerable and historically disadvantaged groups.

AI often relies on very large volumes of personal information or data sets that may not be properly protected and may not always be lawfully collected at source. The lack of transparency around the use of AI, and the inexplicability of the decisions it produces, can lead to unfair outcomes for individuals and chip away at public trust.

Ever since generative AI tools, like ChatGPT, were publicly released and became readily accessible at mass scale, concerns have been growing about how consumers can use them to create and spread misinformation. Sometimes spoofs can be funny and quite benign. Other times, not so. Cyber thieves are already simulating CEOs’ voices and using them to dupe employees into transferring money through increasingly sophisticated phishing attacks. “Deepfakes” are being used to mislead the public by fabricating false statements from political leaders, undermining our democratic processes. Deepfakes can also wreak havoc with financial markets, and gravely harm individuals by ruining their reputations or creating false sexual images of them.

Where the magic really happened

We were very privileged to discuss these opportunities and risks with a blue-ribbon panel of experts from fields including philosophy, history, political science, economics, law, social psychology, and technology. Each of them brought a unique perspective to the table based on their deep knowledge and experience.

But hearing them in discussion with one another is where the real magic happened! Their combined contributions were particularly rich, insightful, and engaging, and helped advance the dialogue around responsible use of AI in the public sector.

What is your word cloud when it comes to AI?

As a conversation starter, we asked each panelist the following question:

Considering each of you spend much of your day thinking and talking about AI in your respective roles, if we were to create a word cloud above your head, what would be your top three words?

For Melissa Kittmer, Assistant Deputy Minister, Ministry of Public and Business Service Delivery, those were: trustworthy, transparent and accountable. She spoke about the Ontario government’s Trustworthy AI Framework that has been under development since 2021 as part of Ontario’s Data and Digital Strategy. This risk-based framework is grounded in three principles: 1) No AI in secret; 2) AI use that Ontarians can trust; and 3) AI that serves all the people of Ontario.

Melissa highlighted the importance of identifying and managing AI risks. These include potential discrimination and violation of human rights, privacy infringements, misuse of intellectual property, and spread of misinformation. She stressed the responsibility of public servants to mitigate those risks when leveraging the benefits of AI in their work.

Stephen Toope’s three words were: excitement, worry and complexity. As President & CEO of the Canadian Institute for Advanced Research (CIFAR), Stephen spoke about CIFAR’s pan-Canadian AI Strategy. The strategy was launched in 2017 to build AI research capacity here in Canada, while ensuring responsibility, safety, equity, and inclusion. Today, Canada has become a powerhouse in terms of talent. We rank first among G7 countries in the growth and concentration of AI talent, and first in the world in the percentage increase of female AI talent! Canada is also first in AI publications per capita. Canada used to rank fourth on ‘AI readiness’ in terms of our investment, innovation, and implementation, but we’ve dropped to fifth spot, partially due to our lack of access to supercomputing power. Whereas other countries are building major computing platforms, Canada lags in comparison. So, while Canada’s story is one of success, it’s a contingent success that requires continued investment in infrastructure and an improved ability to protect our intellectual property.

Stephen added that as we deepen our understanding of AI, we also need appropriate guardrails in place to address discrimination, among other risks. Although some have called for a global AI pact, he thinks that is unlikely to happen. Rather, we should be looking to local and national frameworks — maybe even regulatory coalitions — to ensure harmonization of high standards and avoid a race to the bottom.

The IPC’s own Manager of Technology Policy and Analysis, Christopher Parsons, chose fast-paced, nuanced and noisy. Chris spoke about how AI is being used to enhance national security and law enforcement. He noted the rapid growth of surveillance technologies, the plummeting cost of computing power, and enhanced access to the analytical capabilities used to extract insights from data, all of which are now being leveraged for public security purposes. While this can be positive in some respects, for cybersecurity and automated defence systems, for example, there can also be significant impacts on our privacy and human rights, and ultimately public trust.

Chris emphasized concerns about the obscurity of these practices, many of which happen in secret, and the mass collection of personal information, sometimes from unlawful sources. Inferences derived from these data are largely invisible and may not always be accurate, yet they can feed into life-impacting decisions. This can lead to people being wrongfully identified and accused without the ability to understand how they are being drawn into the criminal justice system. This could further exacerbate bias and discrimination, undermine the right to due process and fair trial, and cause a chill on people’s freedom of expression and association.

Interestingly, Colin McKay, former Head of Public Policy at Google, chose similar words to Chris. Colin took a historical and contextual look back at technology development over the past 25 years. Back then, technology companies had neither the internal teams to clearly communicate to the public or to regulators how they were collecting and using personal information, nor accountability frameworks within which to operate. This created a legacy of mistrust in the use of technologies that naturally frames the context in which we consider consumer applications of AI today.

Colin highlighted the opportunity for companies, large and small, to leverage their past experience with technology development generally. He suggested they could do this by broadening their teams of specialized experts, including technologists, privacy lawyers, data security specialists, and ethicists, to explain and communicate publicly about the complexities of AI in a more nuanced manner. The private sector can play a key role in advancing the debate around data cleanliness and process optimization to reduce bias and improve outcomes. He also urged the development of sustainable AI governance frameworks, supported by key investments across industry, to ensure clear, focused, and ethically responsible use of AI technology.

For Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa, risk, regulation and governance were top of mind. She pointed out that legislative and policy frameworks could be aligned across the country, following the lead of the federal government’s Artificial Intelligence and Data Act. Nonetheless, there are still normative spaces for provinces to fill given Canada’s federal reality. One of these important spaces is the provincial public sector, including health care and law enforcement.

There are fundamental governance questions Ontario needs to ask itself before deploying AI: What kinds of problems are we trying to address? Is AI the appropriate tool to solve those problems? If so, what kind of AI, designed by whom, fed with what data, and who will benefit from it?

In filling their regulatory role, provinces should strive for alignment with the laws and policies of other jurisdictions, both nationally and internationally, and draw from their practical experience implementing them. Teresa also emphasized the need to empower and resource existing regulators, like privacy and human rights regulators, to address AI issues that arise in their respective areas of competence.

The three words for Jeni Tennison, Founder and Executive Director of Connected by Data in the U.K., were power, community and vision. Jeni discussed some of the challenges and opportunities around transparency of AI. She spoke about the need for AI developers to be transparent at different levels and for different purposes. This includes transparency to the public to enhance public trust; to those procuring AI systems so they can do their due diligence; to the intended users of AI so they can carry out their professional obligations with confidence; and to regulators for audit and accountability purposes. A certain level of transparency is also needed to enable fair competition in the market, which is particularly important in a public context to avoid government getting locked into a relationship with a single vendor.

Jeni also stressed how important it is to explain how an AI-based system comes up with a given result, so that affected individuals and their representatives can understand what is happening behind closed doors. This knowledge can help challenge any biases, inaccuracies, and unfairness.

Jeni described why transparency is needed not only in respect of algorithmic models and the development process, but also of the results of impact assessments and the number and outcomes of complaints received. These insights help communities understand when and where things may go wrong — a key point for rebalancing relationships of power and remedying the public trust deficit.

Finally, Jeni emphasized the need to enhance capacity and computing power. This is important not only for innovators and developers, but also for civil society, academia, regulators, and other watchdog organizations whose role it is to hold developers to account for their use and deployment of AI.

Need for guardrails and limits

Governments in countries around the world are developing laws to address these and other issues associated with AI.

The Council of the European Union and the European Parliament have reached a provisional agreement after lengthy negotiations over the EU’s proposed AI Act. The act takes a risk-based approach to regulating AI and supporting innovation, but with greater transparency, accountability, and several backstops. These include prohibitions against cognitive behavioural manipulation, the scraping of facial images from the internet, and the use of social scoring and biometric categorization to infer sensitive data.

In California, the AI Accountability Act has been introduced with the aim of creating a roadmap, guardrails, and regulations for the use of AI technologies by state agencies. This includes requiring notice to the public when they are interacting with AI.

In Canada, the Artificial Intelligence and Data Act, part of Bill C-27, would require having measures in place to identify and mitigate the risks of harm or biased output, and to monitor compliance.

However, this federal legislation would not cover the public sector in Ontario, which is why it is so essential for us to develop our own framework here.

The Ontario government has already taken some positive steps by building various components of a Trustworthy Artificial Intelligence Framework.  But Ontario can and must do more.

Moving forward with AI: Initiatives from the IPC

Raising awareness and bringing to light the critical need for strong governance on AI has been at the forefront of the IPC’s initiatives in recent years.

Last May, the IPC issued a joint statement with the Ontario Human Rights Commission. We urged the Ontario government to establish a more robust and granular set of binding rules governing public sector use of AI that respects human rights, including privacy, and upholds human dignity as a fundamental value.

My office also joined our federal, provincial, and territorial counterparts in releasing Principles for Responsible, Trustworthy, and Privacy-Protective Generative AI Technologies. These principles are intended to help organizations build privacy protection right into the design of generative AI tools, and throughout their development, provision, and downstream use. They’re devised to mitigate risks, particularly for vulnerable and historically marginalized groups, and to ensure that generative content, which could have significant impact on individuals, is identified as having been created by generative AI.

On the international front, the IPC co-sponsored two resolutions at the 45th Global Privacy Assembly that were unanimously adopted by data protection authorities around the world: one on Generative Artificial Intelligence Systems and the other on Artificial Intelligence and Employment. Both closely align with, and resonate with, the kinds of things we’ve been saying and calling for here at home.

The future of AI

We should be proud to know that Canada and Ontario are clearly punching above their weight globally when it comes to AI innovation. Algorithmic systems are powerful tools of measurement, management, and optimization that can help spur the economy, diagnose and treat disease, keep us safe, and perhaps even save our planet.

Ultimately, however, the successful adoption of AI tools by public institutions can only be achieved with the public’s trust that these tools are being effectively governed. To gain that trust, we need to ensure they are being used in a safe, privacy-protective, and ethically responsible manner, with fair outcomes and benefits for all citizens.

— Patricia

Keeping it real in 2023
https://www.ipc.on.ca/keeping-it-real-in-2023/
December 19, 2023

With the rise of ChatGPT, personalized real-time chatbots, and other rapid-fire advancements in artificial intelligence (AI), it is increasingly hard to tell the difference between what’s real and what’s not.

As AI becomes more common, it appears many of us are searching for meaning when it comes to understanding authenticity. In fact, Merriam-Webster named “authentic” as its word of the year for 2023, defining it as “not false or imitation,” but “real, actual,” and “true to one’s own personality, spirit, or character.”

One way to inspire authenticity is by being very transparent with the use of these new technologies. On December 7, 2023, I joined my federal, provincial and territorial (FPT) counterparts in launching a set of principles to guide the responsible and trustworthy development and use of generative AI technologies in Canada. The best practices we call for include: being open and transparent about the way information is used and the privacy risks involved; taking reasonable steps to ensure the accuracy of input and output data; and making AI tools explainable to users. It’s also important to ensure that generative content, which could have a significant impact on an individual, is identified as being created by a generative AI tool.

Authenticity can also be nurtured through the principle of open and transparent government, something my office has long advocated for. By making government-held information more readily accessible to the public, institutions can combat some of the misinformation and disinformation out there with reliable and trustworthy sources of information that help restore public trust.

In May, the IPC unveiled its first-ever Transparency Showcase. It’s an online 3D gallery highlighting some great examples of how Ontario institutions have been proactive in releasing data to the public in a meaningful and accessible way. We hope this showcase inspires others to adopt similar open data initiatives and look forward to launching our next Transparency Challenge in 2024.

In October, we took further strides with our FPT colleagues, by calling for the modernization of access to information legislation, policies and practices, with critical investments in resources and technological innovations. Access to Canada’s documentary heritage is an important step to understanding where we’ve been and where we’re going. We need to be able to tell the real stories of Canada’s past to support public trust, healing, and reconciliation.

Being authentic and transparent is important for an organization that strives to be modern and effective. Following a period of public consultation, we are modernizing our code of procedure for appeals under the Freedom of Information and Protection of Privacy Act (FIPPA) and the Municipal Freedom of Information and Protection of Privacy Act (MFIPPA). Our goal is to publish the final revised code early in the new year to enhance the transparency of the IPC’s adjudication process and related policies. These processes and policies have evolved over time to become more streamlined, efficient, and digitally friendly, providing more timely resolution of appeals and complaints.

We’ve also updated our Manual for the Review and Approval of Prescribed Persons and Prescribed Entities under the Personal Health Information Protection Act (PHIPA). After almost two years of consultation with affected organizations, we have modernized the manual to better reflect their reality on the ground and consider the evolving security risks they’re facing in a world of sharply rising cyberthreats. The new manual takes a risk-based approach that involves a more focused and in-depth review of key high-risk areas. It emphasizes the ultimate outcome we are trying to achieve — enhanced privacy protection for Ontarians — rather than a theoretical checklist for compliance. It’s coming soon and will be available on our website in the new year.

You may have also heard that as of January 1, 2024, our office will have the authority to issue administrative monetary penalties (AMPs) for violations of PHIPA. Our goal is to maintain a fair and proportionate approach, while promoting confidence in the digital health-care system and supporting an environment for continuous learning and improvement. Watch for details coming soon in our new guidance. It sets out the way we intend to use this new enforcement tool, the kinds of circumstances in which these AMPs may apply, and a transparent list of factors we will be taking into consideration when determining their amounts.

Another way we try to keep it real at the IPC is by providing very concrete and practical resources to individuals and institutions who look to us for information. To that end, we’ve codified some of our decisions into short and actionable Interpretation Bulletins to explain in simple terms how we’ve interpreted complex legal provisions in FIPPA/MFIPPA, so parties know what to expect right up front at the access request stage. We’ve published our first batch of bulletins and look forward to releasing the next batch very soon.

Indeed, keeping it real means keeping our ear to the ground. Following last year’s release of a joint statement and privacy guidance related to the use of facial recognition technology by law enforcement, we received important feedback. Police and other groups in Ontario told us that while they appreciated the FPT guidance, they needed more practical and concrete regulatory guidance in the form of specific use cases in Ontario. Well, we listened and will soon be releasing specific guidance on the use of facial recognition technology in connection with mugshot databases. It’s on its way, so look for it in the new year!

Also, part of being authentic means that our office is open to having real and meaningful conversations on topics that matter to Ontarians. In a recent Info Matters episode, I spoke with Betty-Lou Kristy, Chair of the Minister’s Patient and Family Advisory Council. Her mission is to improve patient care in Ontario by putting patients and families at the center of policymaking and actively involving them as co-designers of the digital health system. That involvement, she says, must be real, “not a tokenistic thing.” We’re hoping to continue that conversation with Betty-Lou and the patient and family advisory council in the new year. We would like to hear their views on how to address privacy and transparency concerns that they experience within the health-care system.

I heard a similar message during my conversation with Jane Bailey and Valerie Steeves, co-leads of the eQuality Project. When it comes to developing digital education and policy, it is critical to really listen to the voices of young people. The views of children and youth are essential to creating a networked environment where they can participate equally, free from surveillance and identity-based harassment. But that means more than just getting their input on things we intend to do to solve problems we think they are facing. It means having them articulate for themselves what their concerns are and how it really feels to navigate the digital environment they live in, and then taking responsibility as parents, educators, and policymakers to address them.

We’re taking that advice to heart by engaging with our Youth Advisory Council and broad-based Strategic Advisory Council members in genuine and meaningful ways that encourage creativity and an open dialogue when it comes to addressing both the access and privacy challenges that Ontarians are faced with today.

As we look ahead to the holiday season, it’s an important time to reflect on and take stock of the activities of the past year, set objectives for the year to come, and resolve to make things even better. I am filled with excitement and optimism, and look forward to continuing this work, alongside the incredibly talented and dedicated team at the IPC.

Author Brené Brown once said, “Authenticity is a collection of choices that we have to make every day. It’s about the choice to show up and be real. The choice to be honest. The choice to let our true selves be seen.”

Despite the hustle and bustle of the holiday season, I hope you’ll take quality time to connect with the people you care about in a real and authentic way and stay grounded in the things that matter most.

Best wishes for a wonderful holiday season and a happy, healthy, and peaceful new year.

— Patricia

Media Literacy Week: Protecting and empowering students in the digital age
https://www.ipc.on.ca/media-literacy-week-protecting-and-empowering-students-in-the-digital-age/
October 20, 2023

Guest blog by Commissioner Kosseim for MediaSmarts in recognition of Media Literacy Week

Young people today love going online. Whether it’s for educational purposes, social networking or gaming — there is always something new and exciting to see and do. With every click, they explore new horizons but also, inadvertently, navigate through a sea of potential digital threats.

The online world is fraught with fake content that looks real, creating confusion between what’s true and what’s false. Cybercriminals and cyberbullies use the internet in ways that can seriously harm others, and advertisers attempt by all means to attract attention and nudge buying behaviour.

Canada’s annual Media Literacy Week highlights how critical it is that we all know how to use and engage with digital media.

RTKW 2023: Why access to information matters more than ever!
https://www.ipc.on.ca/rtkw-2023-why-access-to-information-matters-more-than-ever/
September 20, 2023

For a topic that doesn’t often get as much media attention as its privacy counterpart, access to information has been making a lot more headlines this year. Many are urging the government to improve access to information legislation, and some are even taking it a step further — calling for a complete overhaul of the freedom of information (FOI) system.

Recently, the Globe and Mail ran a series, Secret Canada, highlighting many of the barriers to access to information and the many challenges facing FOI offices in ministries and departments across the country. Commendably, the Globe also developed a database of hundreds of thousands of FOI request summaries filed in Canada, as well as a detailed guide on how to file requests and navigate the system.

The Secret Canada series homed in on the critically important reasons why access to information and government transparency matter and why we need to fiercely protect and uphold access rights as a central tenet of our democracy. As part of the series, the reporters interviewed former Chief Justice of the Supreme Court of Canada, Beverley McLachlin. In her words:

“… a democracy just can’t work without the people having information. That is key to making decisions around how you vote. It’s key to making informed decisions. We’re in this age of social media where people are substituting opinions for facts. Facts are absolutely basic to good democratic governance and accountability.”

Her quote captures the very essence of why, in our modern digital world, having timely access to accurate facts is critical. Providing information from reliable sources is an effective antidote to all of the misinformation out there — and even disinformation — especially in the age of generative artificial intelligence, when it is becoming so much more difficult to distinguish legitimate sources of information from fabricated stories or lies.

My office has long been advocating for updates to Ontario’s access to information legislation, and as I mentioned in my recent appearance on The Agenda with Steve Paikin, the FOI system can certainly use some legislative improvements. That said, there is so much that governments can do non-legislatively as well. For example, institutions can greatly advance public transparency and trust by:

  • allocating additional resources to support over-strained FOI offices;
  • streamlining processes and gaining greater efficiencies by leveraging new automation tools and technology;
  • proactively disclosing more meaningful information Ontarians care about, without waiting to be asked; and
  • strengthening a culture, and courage, of openness among Ontario’s institutions, where transparency is normalized and disclosure of information becomes the default.

But as a modern and effective regulator, the IPC has to do its part too. We need to renew our own commitment to the cause and speak in a united voice with our counterparts across the country and internationally, which you’ll be hearing more about during RTKW and in the weeks to follow.

As an office, we also need to streamline our appeals processes, facilitate the participation of the parties before our Tribunal, and render more timely access decisions. To this end, we’ve made it easier for people to file and pay for access appeals using our convenient and secure online service. As we head into RTKW 2023, you’ll also learn more about the amendments we’ve made to our Freedom of Information and Protection of Privacy Act (FIPPA) and Municipal Freedom of Information and Protection of Privacy Act (MFIPPA) Code of Procedure to reflect updates to our tribunal processes and procedures and enhance our capacity to provide timely resolution to access appeals. And you’ll hear about the work we’ve undertaken to codify our past decisions into practical and actionable Interpretation Bulletins to help FOI coordinators on the ground when they receive an access to information request.

As a modern and effective regulator, our role is not only to call out non-compliant behaviour when we see it but to promote and encourage good transparency practices, too. Over the past year, we curated some great examples of how several Ontario institutions have succeeded in releasing data to the public in a way that is meaningful, readily accessible, and free of charge. We displayed these in a Transparency Showcase, a virtual exhibit of open government and open data initiatives. In case you missed it, take time during RTKW 2023 to visit the showcase and have a look around for some ideas and inspiration on how your institution can become more transparent, too!

You may also want to carve out some time to listen to a new Info Matters podcast being released as part of RTKW 2023. In this episode, my guest Laura Neuman of the Carter Center talks about how access to information — or rather, the lack thereof — can greatly exacerbate the inequities of a significant gender divide that continues to afflict women’s rights in developing and developed countries alike. Laura also describes the Center’s Inform Women, Transform Lives campaign that aims to empower women, helping them access essential information from their local governments to receive benefits or services, help support their families, and engage in civic life.

In fact, while you’re at it, you might want to make yourself a whole FOI playlist in celebration of Right to Know Week. You’ll definitely want to add this recent episode to your line-up, Trust and Truth: Navigating the Age of Misinformation, where I speak with Dr. Alex Himelfarb, chair of the Council of Canadian Academies’ Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation, about how important it is for governments to provide legitimate sources of information to fill the gaps that otherwise get too readily filled with so-called facts and theories that aren’t true and can in fact be harmful. Misinformation and disinformation not only adversely affect individuals but can destroy social cohesion in communities, with disproportionately negative impacts on marginalized groups and vulnerable populations.

And you might wish to round out your FOI playlist with this earlier Info Matters episode featuring best-selling author and community activist Dave Meslin in Power to the People! Access, privacy and civic engagement. When it comes to open data and access to information, Dave says transparency is everything. Access to information is one of those fundamental building blocks in this great arena we call democracy, where every citizen should have an active voice and a role to play in bringing about societal change for the better, starting with their own school or neighbourhood.

As we head into Right to Know Week, I encourage you to reflect on the importance of access to information and how it contributes to the well-being of our communities and to the health of our democracy.

My office has some interesting things planned to put the spotlight on access and transparency throughout #RTK2023. Follow the hashtag and our Instagram, LinkedIn, and X (formerly Twitter) accounts for the latest access initiatives from across Canada and around the world.

It’s going to be a great week, and I encourage you to join in the celebration of information rights! Access to information matters. It underpins the very foundations of our democracy and our fundamental freedoms. Let’s not take it for granted.

— Patricia

IPC’s Back-to-School Lesson Plans: Helping kids learn about online privacy
https://www.ipc.on.ca/ipcs-back-to-school-lesson-plans-helping-kids-learn-about-online-privacy/ | Wed, 16 Aug 2023 13:45:45 +0000

The digital landscape has become an inseparable part of children and youth’s lives, offering boundless opportunities for learning, connecting, and having fun. However, amid this abundance of technology, young people don’t always see the far-reaching implications of their online activities on their privacy, both present and future.

Young people may not always be aware of risks like cybercrime, cyberbullying or even how companies can use ads that attempt to nudge their behaviour. As educators, parents, and regulators, it’s our job to teach children the necessary skills to stay safe online and empower them to navigate their digital environment in an informed manner.

That’s why we’re thrilled to announce the release of four new classroom-ready lesson plans to help educators teach students in grades two through eight about privacy rights, digital literacy, and online safety, just in time for back-to-school!

Prepared by MediaSmarts, in collaboration with the IPC, these free lesson plans include privacy-protective skills that every student needs to develop. These include teaching kids how to identify and mitigate risks, make strategic choices to protect their privacy online, show empathy for others’ online reputations, and respect others’ privacy rights.

Each lesson plan is based on the IPC’s Privacy Pursuit! activity booklet, filled with fun activities for learning about privacy. The lesson plans include student handouts you can print or complete as a fillable PDF.

In its general comment on the rights of the child in relation to the digital environment, the United Nations Committee on the Rights of the Child stressed the fundamental role privacy plays in protecting children’s online dignity and safety, while supporting their sense of agency, empowerment, and growing autonomy. This universally recognized statement reminds all of us how important it is to equip children with the problem-solving, critical thinking and digital literacy skills they need to stay safe online. These IPC lesson plans provide an empowering platform for active participation and discussion about why privacy is important and how to protect it in the digital world.

We want to hear from educators about their experiences in using the lesson plans and invite them to share their feedback by completing our short survey. It’s important to know how the lesson plans are being used, what areas worked well with students, and potential topics or areas for future development.

We also engaged the IPC’s Youth Advisory Council to ask for their feedback about how we can best reach out to teachers and young people to spread the word about these new lesson plans and the importance of protecting privacy rights. And we’re putting their great ideas into practice. Keep an eye on our Instagram account for updates, as well as upcoming initiatives for youth, tips about privacy and access, comic characters, and whiteboard videos that make learning fun and easy.

As part of our strategic focus on Children and Youth in a Digital World, we are working to champion the access and privacy rights of Ontario’s children and youth by promoting their digital literacy and the expansion of their digital rights. Over the past two years, we’ve continued to make considerable progress in this area. You can read more about our progress in our 2022 annual report.

Albert Einstein once said, “Education is not the learning of facts, it’s rather the training of the mind to think.” Educators today play an essential role in teaching kids about privacy. These lesson plans are sure to spark active participation, foster lively discussions, and instill a profound understanding of why privacy matters and how to protect it online.

I am very excited about this initiative and immensely grateful to all those who helped make it happen — it’s an important step towards ensuring that children have the knowledge and tools they need to participate safely in the digital world.

Wishing all Ontario kids and teachers a wonderful new school year!

— Patricia

Bookending my year as the IPC’s first Scholar-in-Residence
https://www.ipc.on.ca/bookending-my-year-as-the-ipcs-first-scholar-in-residence/ | Thu, 29 Jun 2023 13:45:00 +0000

Guest blog by Teresa Scassa

From September 2022 until the end of June 2023, I had the privilege of being the first Scholar-in-Residence at the Office of the Information and Privacy Commissioner of Ontario (IPC). My goal was to learn more about privacy and access issues from a regulator’s perspective while being exposed to the many complex and challenging issues faced by the IPC. I was not disappointed. Through this unique in-house experience, the commissioner and her teams brought me into key internal and external meetings to seek my views and perspectives. They involved me in strategic planning sessions, asked me to critique and peer-review draft papers on leading-edge privacy and transparency topics, invited me to mentor junior staff as they developed legal and policy research skills, and more.

I end my term having learned a great deal that will enrich my thinking on many issues about which I research and write. The IPC’s mandate is broad and diverse, covering privacy and access to information issues in Ontario’s provincial, municipal, health, and child and family services sectors. I have had the opportunity to witness some of the strategic thinking and planning that goes into realizing a complex mandate with limited resources in a dramatically evolving technological landscape. I’ve also had the pleasure of building strong working relationships with members of the talented and committed team at the IPC, and I hope this will lead to ongoing cooperation and collaboration.

The idea to have a scholar in residence was part of Commissioner Kosseim’s ambitious vision for a “modern and effective regulator” articulated in the IPC’s just-published 2022 annual report. A regulator’s resources are always limited and must be deployed wisely to achieve maximum positive benefit. Although investigations, tribunal decisions and orders are usually the most highly publicized aspects of the work of any information and privacy commissioner, they are not the exclusive focus of attention. Similarly, while addressing privacy breaches is essential, it is far preferable if breaches never occur. To this end, a modern and effective regulator should direct significant efforts toward ensuring that regulated parties adopt careful and compliant practices. There are many ways to do this, including through timely guidance, engagement, and the creation of incentives that go beyond box-checking and substantively improve practices. On the access side, while addressing complaints over denials of access to information remains a vital part of the regulator’s role, the regulator can also encourage and support efforts to be transparent on a more proactive and systematic basis. One interesting initiative from the IPC this year was the Transparency Challenge that encouraged entities subject to Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA), and its municipal counterpart, MFIPPA, to submit their best efforts at proactive transparency, which the IPC curated as part of a virtual showcase.

A modern and effective regulator is open to advice and input. The IPC has a 25-member Strategic Advisory Council, and a new Youth Advisory Council offering input and advice on issues of particular interest to youth in Ontario. A modern and effective regulator can also develop tools and guidance to help organizations make better choices about the technologies they adopt, and can educate and empower citizens to understand and exercise their rights. This year, the IPC developed and published new lesson plans for Ontario schools to teach children about privacy. Commissioner Kosseim also reaches out to the public with regular blog posts on important privacy and access issues, and a great podcast series, Info Matters, featuring engaging interviews with experts.

The rapid pace of technological change — and the dramatic nature of some of these changes — has become a major challenge for both legislators and regulators. For the IPC, these changes may require developing new knowledge or skills through training and hiring; planning and strategically forecasting to anticipate changes and their impacts; conducting research to develop policy and guidance for emerging technologies; working with government to provide input on evolving law and policy; and liaising with other regulators — not just in the access/privacy area but also in related fields. The recent joint statement by the IPC and the Ontario Human Rights Commission on the need to develop appropriate AI governance for Ontario is an example of the IPC’s ongoing work to collaborate with other regulators on issues of mutual concern. Another example is the joint resolution on Digital ID issued by the federal, provincial and territorial access and privacy commissioners, which called on governments to ensure “that privacy and transparency rights are respected throughout the design, operation, and evolution of digital identity ecosystems in Canada.” The IPC is also involved in developing recommendations and guidance that can shape how public bodies address privacy and transparency when they adopt new technologies or approaches.

This brief account can only capture a small amount of the day-to-day work of the IPC to meet its multi-faceted mandate in a responsive and accountable way during a time of significant technological change. I step away from my term as scholar-in-residence with new knowledge and insights, with ideas for new collaborative research in the future — and with a great deal of respect for an outstanding team with a genuine commitment to privacy and access in the public interest.

Dr. Teresa Scassa is the Canada Research Chair in Information Law and Policy at the University of Ottawa, Faculty of Law. She served as IPC’s first Scholar-in-Residence from September 1, 2022 until June 30, 2023.

Let the sunshine in! Showcasing the benefits of government transparency
https://www.ipc.on.ca/let-the-sunshine-in-showcasing-the-benefits-of-government-transparency/ | Thu, 18 May 2023 15:00:10 +0000

Last fall, my office issued a Transparency Challenge. We called on Ontario’s public institutions to share their best and brightest projects or programs that demonstrate a real commitment to government transparency and show how open data can have tangible benefits for the day-to-day lives of Ontarians. Lo and behold, public institutions rose to the challenge!

My office received and curated a series of compelling examples of transparency initiatives, and is exhibiting them in our Transparency Showcase.

The showcase is an online 3D gallery featuring a range of projects from municipal and provincial institutions from across the province. If you haven’t had a chance to take a look, I encourage you to visit! You can browse through the exhibits from your desktop or phone and learn more through graphics and videos. The projects are illustrated with stunning original digital art pieces by a very talented artist, Aedán Crooke (be sure to click on “about the art” to learn more about each piece).

With this showcase, we wanted to put some great initiatives into the spotlight to inspire other government institutions to be more proactive in releasing information to the public and to remind everyone of the tremendous benefits of transparency and open data.

Part of being a modern and effective regulator means encouraging compliance with Ontario’s access and privacy laws through positive encouragement, not just enforcing the laws through orders and sanctions. Government transparency is essential to democracy, which is why Privacy and Transparency in a Modern Government is one of my office’s four strategic priorities.

Transparency and access to trustworthy information are more important than ever in today’s digital age, where misleading facts and mistruths spread like wildfire online. Misinformation, and how to fight it, was the topic of a recent episode of the Info Matters podcast, Trust and truth: Navigating the age of misinformation.

I spoke with Dr. Alex Himelfarb, chair of the Council of Canadian Academies’ expert panel on the socioeconomic impacts of science and health misinformation. As a professor of sociology, a senior federal public servant, and a former clerk of the Privy Council serving three prime ministers, Dr. Himelfarb certainly had a lot to say about this critical topic. If you have the time, I encourage you to read the CCA’s report Fault Lines (it’s a long one, but very eye-opening!). Or, for a briefer version of the report’s main takeaways, have a listen to the podcast.

Alex and I talked about just how destructive misinformation can be, negatively affecting not only individuals, but also ripping apart the very fabric of our communities, with the most adverse impacts for marginalized groups and vulnerable populations. Left unchecked, misinformation spawns confusion and mistrust, chipping away not only at our trust in each other, but also in government institutions, science and academia, media, and other key pillars of our society.

Throughout our conversation, it became clear how vital a role government organizations must play in helping curb the spread of misinformation by filling in knowledge gaps with reliable, evidence-based data, and how making quality information based on facts freely available to the public, in a readily accessible form, can help stem the tide of falsehoods out there.

Modern regulators need to use innovative tools to support and encourage the behaviours they’d like to see. My hope is that through the Transparency Showcase, we will encourage other institutions toward greater openness and help increase awareness and understanding of the benefits of open data to equip the public with the information they need to make better decisions and improve their lives. Providing access to information is certainly part of it, but even more important in a modern digital world is for governments to proactively release information that will help debunk mistruths and help re-establish the foundation of trust needed to close the growing “fault lines” in our society.

I want to sincerely thank Dr. Himelfarb and the other members of the CCA panel for their excellent assessment of this timely and relevant issue that should be a concern to all of us who care about truth and transparency.

I also want to thank everyone who took up our Transparency Challenge, and to congratulate our exhibitors for their impactful work. We expect the Transparency Challenge will become an annual initiative, so if your organization is interested in participating next year, stay tuned.

For now, we welcome you to our Transparency Showcase. Come on in, take a look around and most of all, enjoy your visit!

— Patricia
Privacy Day 2023: A positive diagnosis for privacy
https://www.ipc.on.ca/privacy-day-2023-a-positive-diagnosis-for-privacy/ | Wed, 01 Feb 2023 14:45:38 +0000

On the occasion of Data Privacy Day 2023, our office hosted an event on the theme of Building Trust in Digital Health Care.

Nearly three years of pandemic conditions, overburdened emergency rooms, sparse access to primary care, and an exhausted healthcare workforce have worn the country thin. Ontarians — and Canadians — have spoken their minds and want to see improvements to health care services. Sustaining our publicly funded health care system will require innovative approaches and new digital solutions.

First Ministers are preparing to negotiate a new funding transfer agreement with conditions that will drive fundamental changes to health care delivery — including data sharing. One can feel transformation in the air.

The changes we may see coming out of the new funding agreement will add to the many changes already afoot in Ontario, including the creation of Ontario Health Teams.

To be successful, these changes will require a strong foundation based on public trust, especially trust that health providers will respect patient confidentiality and keep personal health information secure.

Without trust, patients will not be forthcoming about their symptoms or be truthful about following treatment plans. Worse, they may avoid seeking help altogether. They may be hesitant to adopt new digital solutions, participate in research, or allow their personal health information to be shared for broader public health purposes, particularly if they fear that information may be used to stigmatize the community to which they belong.

As the old adage goes, “Trust takes years to build, only seconds to break, and forever to repair.”

The key topics we discussed last Friday — eliminating fax machines, stopping employee snooping, and defending against cyberattacks, while also building a transparent and privacy-respectful culture — are some of the fundamental conditions for earning trust in digital health care.

Axe the Fax

According to privacy breach statistics that all health institutions must send to my office annually, misdirected faxes continue to be the leading cause of unauthorized disclosure of personal health information in Ontario.

In 2021, 51% of the unauthorized disclosures reported to the IPC were due to misdirected faxes, mercifully down from 58.5% the year before, but still far too high.

My office recently released a report about the high number of privacy breaches at a regional hospital due to misdirected faxes. This report not only provides important insights for health care providers about the risks of using fax machines, it also shows the significant efforts that can be made to reduce misdirected faxes and the use of fax altogether.

It’s a good news story about how stakeholders can work together to replace faxes with more secure digital forms of communication. Last Friday, we got to hear about that first-hand from Wendy Lawrence, Chief Risk, Legal & Privacy Officer at St. Joseph’s Healthcare Hamilton.

This theme aligns with a joint resolution that my office, along with Canada’s federal, provincial, and territorial privacy commissioners issued last September on Securing Public Trust in Digital Healthcare.

The resolution outlines measures for adoption by governments, health institutions, and providers, including a coordinated plan to phase out fax machines and unencrypted email. It also promotes the adoption of more secure digital technologies and responsible data governance frameworks.

Last Friday, I publicly reiterated our office’s standing offer to work with governments, regulatory colleges, health institutions, providers and others to put an end to fax machines that unnecessarily expose individuals to potentially harmful privacy risks and undermine trust in the system as a whole.

That being said, we recognize that axing the fax is not so easy.

Michael Hillmer from the Ministry of Health explained some of the practical challenges across the sector. This is particularly the case among community health providers, as Ariane Siegel from Ontario MD explained. Sylvie Gaskin from Ontario Health described concrete steps that have been taken so far to move health providers towards more secure and interoperable forms of digital communication, incrementally and over time.

Employee Snooping – AMPs

Another persistent, trust-breaking issue is employees snooping into health records.

Whether out of malice, personal gain, mere curiosity or well-meaning concern about the health of friends and family, snooping through medical records can have devastating consequences for patients, health professionals, and the health care system as a whole.

While the reasons for snooping may vary, the result is the same — it undermines patient trust.

Reports to our office for 2022 reveal that snooping by health care workers accounted for 29% of self-reported health privacy breaches. This is up from 21% the year before, reflecting a disturbing and persistent trend we’ve seen over the past few years.

We need to work together to stamp out snooping once and for all, through awareness training to prevent inappropriate access — however well-meaning — and effective disciplinary measures when the intent is not so well-meaning.

To help deter snooping, Ontario’s health privacy law, PHIPA, was amended in 2020 to give my office power to impose administrative monetary penalties (AMPs) on those who seriously breach the law. Michael Hillmer described the policy intent and objectives behind the AMP regime in Ontario.

While we await regulations before administrative penalties can take effect, our panelists discussed how this new enforcement tool can be used most effectively to curb bad behaviour, like snooping, while still encouraging good behaviour.

The health care sector has become familiar with the concept of just culture, which takes a calibrated approach to addressing medical errors.

From time to time, health professionals make mistakes. Nyranne Martin from The Ottawa Hospital explained how a just culture approach uses a spectrum of responses, from consoling and coaching to systemic course correction, with disciplinary action and sanctions reserved only for the most egregious cases.

Just like medical errors, privacy breaches also range in motivation and severity. Wendy Lawrence described some of the practical interventions that can be taken to curb snooping behaviour, including stepped-up education and accountability mechanisms, to help employees learn from their mistakes, correct them and prevent them from happening again, consistent with a just culture approach.

Cyberattacks

Our third theme was cyberattacks, which, unfortunately, have become an increasingly dangerous and pervasive threat to the security of personal information in all sectors, including the health sector.

This is part of a rising global trend in cyberattacks, particularly since the onset of COVID-19, that increasingly targets public sector institutions, critical infrastructure, and essential services.

In 2021, the number of health privacy breaches reported to our office due to cyberattacks was double the number in the previous year.

Nyranne Martin and Wendy Lawrence discussed how large institutions not only have to make risk-based investments in technological safeguards but also invest in people by ensuring staff are aware of the threats, how to avoid them and what to do if a breach occurs.

Cyberattacks are particularly daunting for smaller health care organizations that process large volumes of sensitive personal data but lack the financial resources to mount a thorough cyber defence, let alone pay ransoms when they do get hacked by cybercriminals.

Sylvie Gaskin described a recent partnership between Ontario Health and the Ministry of Health to develop operational centres that can help smaller players build their cyber resiliency. Ariane Siegel described the practical training Ontario MD offers health practitioners in the community on how to protect themselves against cyber threats, including questions of insurance.

Many of the technical tips offered by the panelists dovetail well with the IPC’s recently updated fact sheet on protecting against ransomware attacks.

As Benjamin Franklin once said, “An ounce of prevention is worth a pound of cure.”

Building a transparent and privacy respectful culture

Finally, the panel concluded with a discussion on how an organization’s transparent and privacy-respectful culture can help build and sustain public trust.

Nyranne Martin discussed how data privacy and security must stay top of mind and be addressed throughout the organization, starting with the board of directors and C-suite executives. Also, privacy and IT security should be integrated cross-functionally and as part of a broader enterprise risk management framework to ensure that risks are mitigated accordingly. She described how staff at all levels of the organization can see their role in protecting patient privacy not only as an obligation but as a mark of pride.

Other panelists discussed the key role education plays in raising awareness about privacy and security issues and instilling a sense of respect for patient privacy as part of organizational culture.

And finally, Michael Hillmer told us how a transparent and privacy-respectful culture is the necessary condition for building trust — the sine qua non — for implementing Ontario’s ambitious plans to future-proof its health care delivery model in a more financially sustainable way.

Modernizing our health care system through transformative digital tools and enhanced data sharing will require appropriate governance structures and processes to sustain the foundation of public trust on which the entire enterprise critically rests.

Just as “trust is the glue of life,” according to author Stephen Covey, trust is the glue that will hold our health care system together in whatever shape it takes so that it’s there, standing solid and ready, to help our loved ones when they need it the most.

If you could not attend our event or watch the webcast, we’ve posted it on our YouTube channel and I encourage you to share it with your friends, colleagues and networks.

— Patricia

Giving Ontario’s youth a seat at the table https://www.ipc.on.ca/giving-ontarios-youth-a-seat-at-the-table/ Thu, 12 Jan 2023 15:00:01 +0000 https://www.ipc.on.ca/?p=20168

Every January 1, my family and I take turns going around the dinner table revealing our new year’s resolutions. Typically, these are well-intended aspirations to try and improve ourselves and the world around us in some small way. We write them out on a piece of paper and discuss how we will hold each other accountable for achieving them.

Well, this year, the IPC has made a new year’s resolution too. We’re aiming to improve our privacy and access outreach to children and youth, with a view to empowering Ontario’s next generation of digital citizens.

In committing to this resolution, the IPC is starting 2023 with a new Youth Advisory Council! This group of highly engaged youth will help guide my office in developing education and outreach materials that are more relevant to young people, and will hold us accountable for it!

I’m very excited to announce the members of our new Youth Advisory Council. I look forward to convening this unique forum where they can share their views about digital literacy, access, and privacy rights in Ontario. We’ve brought together ten incredibly impressive young people, ranging in age from 15 to 24, from different communities across the province, and from diverse backgrounds, experiences, and outlooks.

At school, their academic pursuits range from science and technology to business, law, and social justice. Outside of school, they all find time to give back to their communities and participate in sports and other activities. Among our council members, we count a violinist with the Hamilton Philharmonic Youth Orchestra and a founder of a model United Nations club. Others are involved with their local Indigenous Friendship Centre, participate in their high school’s Black student union association, or serve in a mentor role for 2SLGBTQ+ students.

Reading their applications, what really struck me, though, was the common thread connecting them all. They each expressed the desire to work with others to make the world a better place, including a desire to contribute to the development of tomorrow’s digital citizens.

Children and Youth in a Digital World is one of four strategic priorities guiding the work of the IPC. It’s the foundation of my office’s commitment to championing the digital literacy and digital rights of young people, while holding public institutions accountable for protecting the children and youth they serve.

We’ve done a lot recently by engaging with youth advocacy groups, developing materials like Privacy Pursuit! Games and Activities for Kids, and launching a new youth-focused Instagram page. We’ve dedicated several episodes of our Info Matters podcast to privacy and access issues among children and youth, and held our 2022 Privacy Day event on the theme of Children and Youth in a Digital World. But through all of these initiatives, a key ingredient was missing. The voices of young people! Today’s youth have grown up online, and no one knows or understands their needs, desires, fears and challenges better than they do. Their fresh perspectives are an invaluable resource for supporting the IPC’s efforts to promote digital literacy and expand digital privacy and access rights for youth in a way that is relevant and meaningful to them.

The breadth of experience and viewpoints from our youth council members will serve to inform and complement our work with stakeholders and our Strategic Advisory Council. They will help shape and “test run” new educational materials we’re developing for children and youth this year. Keep an eye on our website and our social media channels, especially our Instagram page. Great stuff is coming!

With the input of the IPC’s Youth Advisory Council, I feel more confident that we can make a real difference in the lives of Ontario’s children and youth. By equipping them with the right tools and resources, we can help them better protect their privacy, safety, and dignity online as they grow, learn, and mature into adulthood.

Someday when we look back, I want to say that we’ve succeeded in guiding this next generation through the opportunities and risks of the online world. By broadening the conversation and engaging young people with a seat at the table, we can learn from them. Hopefully, by listening and learning, we can achieve our new year’s resolution to make the digital world a better place for our children, and for future generations to come.

— Patricia
