When it comes to social media platforms, angst can eclipse reality
It all sounded so shocking at first. A whistleblower emerged with tales of a conspiracy between shady political actors and Facebook. With words like “personal data,” “privacy,” “Trump” and “mind-reading” tossed into the mix, public outrage and fear ensued.
Was this an overreaction? The activity and impact of social media, after all, remain challenging for even the savviest users to assess.
When we hear that political operatives mined Facebook or used our personal data to learn more about our motivations and intentions in the context of an election, the immediate reaction is to get angry. Anything that involves the use of “personal data” sounds like it must be an invasion of our privacy.
However, before we panic and call for policy changes, we should take a careful look at how we use sites like Facebook and what crumbs we, in fact, voluntarily leave behind.
- We “like” pages that signal our interests.
- We check into restaurants and stores.
- We disclose our relationship status.
- We respond to content in our feed.
- We share photos.
- We share our birthday, or, even worse, our full birth date.
- We talk about our favourite Netflix show.
We choose to disclose all of this information on a public platform. It’s a feature of the platform, not a bug. The decision to hand over this information is ours. In effect, we choose to make ourselves vulnerable.
That said, no amount of data is going to give any platform mind-reading and mind-shaping abilities. U.S. President Donald Trump did not win because he manipulated Americans into voting for him using Cambridge Analytica ad targeting. Anyone making such a claim is grossly exaggerating.
The utility of this particular data has significant limits. It can help advertisers paint a broad picture of a group of people. It can help political operatives target people who fit a certain demographic, geographic or interest profile. But they can do so only at the group level.
Our choices are still ours and we can reject things we don’t agree with. In reality, if we don’t want our data shared, or if we are convinced we could get brainwashed into voting for Trump because an advertiser knows our favourite TV personality, we have a clear choice in the matter: get off the platform.
However, no one wants to hear that. No one wants to look in the mirror. Instead, we expect Facebook to live up to a higher moral code. It’s the nature of becoming a platform billions of people “depend” on. And when you get too big, and when government gets a whiff that you have lost the moral high ground, it sees an opportunity no government can resist: regulation.
With Congress contemplating whether to regulate Facebook, we have suddenly placed social media on the same pedestal as industries of national importance like drugs, health care, agriculture, forestry, fishing, energy and mining. If public opinion tilts in the government’s favour, public scrutiny of Facebook and other social platforms will intensify, and an entire industry could change. Facebook is not the only platform tracking user data, and governments around the globe will feel compelled to act.