These are your friends you didn’t really know you had. They listen very well. They’re keenly aware of your interests and where you’ve been. They know where you work and they know where you live. What you tell them and what you do, they really never forget. Well, unless you tell them to. And what they don’t know, they don’t simply leave alone. No, they make an informed guess.
These friends are the devices and services you use every day. And depending on your privacy settings, what they know about you is both fascinating and creepy.
If the product is free, you’re probably the product.
Search engines, social media sites and streaming services all collect more information than we think. Not only does Netflix know our show preferences, it knows what scene or episode caused us to pause, rewind, or abandon a series completely. Unlike traditional television rating systems, the sample size is every user and the dataset is complete. Analytics has become an intrinsic part of its strategic programming decisions.
When Netflix ventured into original content, it was confident its $100 million investment in House of Cards would pay off. While TV networks and movie studios have long used focus groups before green-lighting pilots or projects, Netflix had the viewing data of 33 million subscribers to review. That data showed that the original British House of Cards, director David Fincher, and actor Kevin Spacey were all incredibly popular with its users. As someone who has devoured many hours of House of Cards, I’d like to thank the fine people at Netflix for their informed gambit.
Most people are comfortable having their viewing habits studied by Netflix if it means Netflix returns with more addictive original programming and user-specific recommendations. The user’s reward is access to an ever-growing library of binge-worthy hits. However, some of these same users are more inclined to change their privacy settings on larger platforms that also use data to improve the user experience. For example, Gmail analyzes your email and serves you ads accordingly, while Facebook can use your phone’s microphone to identify the music or TV show playing while you compose a status update. How well do these tech giants know you? Who does Google think you are? Based on your search and YouTube history, the company estimates your age, gender and interests. Your work-related search history may skew the results, but Google can guess that I’m a male between the ages of 25 and 34 (it doesn’t know my interests, as my privacy settings were too restrictive).
While Google may not be able to predict the future yet, the location-based tracking information that fuels Google Maps’ traffic congestion feature also knows what bars plied us with drinks on a Saturday night, what greasy diner cured us on Sunday morning and every stop and mode of transportation in between. Try it yourself – depending on your privacy settings and Android phone use, you can track years of daily movement and see where Google thinks you live and work.
These techniques have advanced to the point where your name, location and behaviour, compared against government census data, can signal your age, gender and race. For example, social listening software can infer that an American Twitter user named Britney who follows Taylor Swift is likely to be a female under thirty, while a Canadian user named Todd who follows the Financial Post is an older male.
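The kind of inference described above can be sketched in a few lines of code. This is a toy illustration only: the name-cohort and audience-skew tables below are hypothetical stand-ins for the real census name distributions and follower-audience data that commercial social listening tools would draw on.

```python
# Toy sketch of demographic inference from public profile signals.
# All numbers below are invented for illustration; real tools would use
# census name/birth-cohort data and measured audience skews.

# Hypothetical: share of bearers of each first name by age cohort.
NAME_COHORTS = {
    "britney": {"under_30": 0.8, "over_30": 0.2},
    "todd": {"under_30": 0.2, "over_30": 0.8},
}

# Hypothetical: gender skew of each followed account's audience.
ACCOUNT_AUDIENCE = {
    "taylorswift13": {"female": 0.7, "male": 0.3},
    "financialpost": {"female": 0.35, "male": 0.65},
}

def infer_profile(name, follows):
    """Combine name-based age odds with follow-based gender odds."""
    cohort = NAME_COHORTS.get(name.lower(), {"under_30": 0.5, "over_30": 0.5})
    # Start from even odds and multiply in each followed account's skew.
    gender = {"female": 1.0, "male": 1.0}
    for account in follows:
        skew = ACCOUNT_AUDIENCE.get(account.lower())
        if skew:
            for g in gender:
                gender[g] *= skew[g]
    total = sum(gender.values())
    gender = {g: v / total for g, v in gender.items()}
    return {
        "likely_age": max(cohort, key=cohort.get),
        "likely_gender": max(gender, key=gender.get),
    }

print(infer_profile("Britney", ["taylorswift13"]))
# {'likely_age': 'under_30', 'likely_gender': 'female'}
print(infer_profile("Todd", ["financialpost"]))
# {'likely_age': 'over_30', 'likely_gender': 'male'}
```

Real systems combine many more signals (language, location, posting times) and use proper statistical models rather than simple lookups, but the principle is the same: each public signal nudges the probability of a demographic guess.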
Facebook has gone one step further, leveraging data on your beliefs, friends, language, organizations and taste in music for its ‘Multicultural Affinity Targeting’. While currently only available in the United States, it gives advertisers the ability to run campaigns that specifically target users who fall under four demographic categories: African American, Asian American, Hispanic and non-multicultural (a delicate way of saying white). While using a diverse selection of talent to reflect your targeted audience is arguably noble and savvy, how marketers use this feature poses major ethical challenges.
When marketing the movie Straight Outta Compton, advertisers used wildly different trailers for white and black users. The version for white audiences sensationalized the gang elements of the film, showcasing police chases and guns. In contrast, the trailer for black audiences played up the biographical, racial discrimination and protest elements. The notion that advertisers use stereotypes is not new, but the advent of ‘smart’ racial targeting will certainly be studied by businesses and sociologists alike.
On a smaller scale, these advanced data-capturing methods will continue to trickle down for popular use. A band may decide to rejig their tour schedule based on the listener data provided by Spotify, or a local restaurant may use its Wi-Fi to track its customers and promote menu items to them. Together, these practices will continue to evolve as far as technology and public sensitivity allow, with smart marketers able to find the right nuances to tap into user attitudes and desires.
Analytics will always be limited by what has already happened – it can’t predict the future. Events still matter, as do innovators like Steve Jobs or Walt Disney, who produce what we didn’t even know we wanted. But Big Data gets us closer: it may prevent big investment losses and ensure that the services we use and the products and media we consume are tailored to our behaviours better than a marketer’s gut feeling could manage. There is a trade-off in receiving tailored products, but the allure of cheap or free services will help gloss over consumer doubts about privacy or security risks.
Ultimately, the tracking and profiling of our online behaviour may create two classes of users: those who believe in the practice’s benevolence and the convenience of more attentive, efficient products and services; and those concerned about the Big Brother aspect, and their loss of privacy and autonomy. Looking at the analytics of it all, marketers may well decide to create more personalized privacy settings and applications to suit our individual tastes – but that behaviour will certainly be tracked too.