Panel explores new technologies and public discourse
Top left to bottom right: Gayle Nathanson, Sam Andrey, Dr. Geoffrey Cameron, Dr. Borna Noureddin and Akaash Maharaj.

As more of our public discourse takes place on online platforms, new questions have emerged about the relationship between these technologies and democratic processes.

In Canada, this conversation has focused on the problems of online hate and disinformation, and how they can be addressed by democratic oversight and government regulation. During an online event hosted by the Baha’i Community of Canada’s Office of Public Affairs, panelists explored how we should think about the purpose of social media in a democracy and what kinds of values and principles should inform efforts to reform and regulate current practices. The event, held on January 21, was co-sponsored by the Centre for Israel and Jewish Affairs, the Mosaic Institute, and the Ryerson Leadership Lab.

The Baha’i community’s Office of Public Affairs has been following this conversation and policy discourse for a number of years, including as a member of the Canadian Coalition to End Online Hate. In 2019, its Director, Geoffrey Cameron, was invited to testify before the House of Commons Standing Committee on Justice and Human Rights as part of its study of online hate. Last year, the Office of Public Affairs made a submission on Virtual Spaces and Democratic Processes to the Canadian Commission on Democratic Expression led by the Public Policy Forum.

“It is becoming increasingly clear that these online platforms are not value-neutral spaces of public conversation,” said Dr. Cameron. “In addition to helping to connect people in unprecedented ways, they can also amplify lies and hatred with little public recourse. We wanted to host a discussion about how we should view the purpose of these platforms, and how public oversight can reform them to serve these purposes.”

The panelists spoke from a range of backgrounds, including advocacy, community-based research, and higher education. Sam Andrey, Director of Policy and Research at the Ryerson Leadership Lab, spoke about recent public opinion research by the Lab, which found very low levels of public trust in social media platforms, but also uncertainty about how they should be regulated to address pervasive challenges of hateful and misleading content.

Borna Noureddin, a professor at the British Columbia Institute of Technology, observed that the unease many people feel about these platforms reflects the rapid pace of technological innovation. With digital technologies, the unintended consequences of innovation appear so quickly that we are often slow to address them.

He emphasized that we also have to think about the values that inform the design of new technologies. We need to promote digital technologies whose design is not primarily focused on addiction and distraction, he said.

Elaborating on this point, Akaash Maharaj, the CEO of the Mosaic Institute, added that social media platforms themselves are not value-neutral. Their algorithms are rules for content promotion and amplification that have adverse consequences for the health of our public discourse. He warned, however, that government regulation has to follow social change: society needs to demand changes to how these platforms operate.

We are already seeing the ways in which civil society can mobilize to call for change, observed Gayle Nathanson, Associate Director of External Affairs at the Centre for Israel and Jewish Affairs. She noted that the Canadian Coalition to End Online Hate has focused on making the connection between online hate and real-world violence, and on advocating for the creation of an independent government regulator.

The conversation concluded by considering the role of digital literacy and citizen education in addressing some of the challenges posed by social media. Dr. Noureddin reflected that we can think of educational programs as helping people to exercise agency. “Part of protecting yourself is being well-informed; it is about having agency, knowing what to use, and how to use it.” He noted that educational programs need to go beyond a few “dos and don’ts” to consider the social context in which people find themselves, and the habits they create around their online activity.

“We have to create conditions and environments that allow us to be more thoughtful about how we encourage the use of social media, to set things up that make it easier and not harder to understand how to use it in the proper way,” he concluded.