“Hey Siri!” “Alexa, call Mom.” “Hey Google, set my alarm for 10:00 a.m.” These are three commands that you, along with the rest of Generation Z, have most likely used in one way or another. Investing in a machine that, as advertisements suggest, can do nearly anything for you sounds like an amazing idea. Who would not like a device that remembers your passwords, tells you how much of an ingredient to use in a recipe, sets your alarms, and gives traffic advisories, among several other things? All of that sounds great, does it not? Yet have you ever stopped to think about where this information is stored? Or how these machines actually know the direction of your commute? Is it just a coincidence that Siri knows the exact time you set your alarm for every night and reminds you to set it? Or that Alexa knows whose phone number to dial when you tell it to call Mom? The amount of information a virtual assistant stores about its user should concern us far more than it does. Although these machines are designed to make our lives easier, they pose a major risk to the privacy of our generation and the generations to come. 

Virtual assistants collect huge amounts of information on their users. In the Berkeley Technology Law Journal article “How Digital Assistants Can Harm Our Economy, Privacy, and Democracy,” authors Maurice E. Stucke and Ariel Ezrachi state, “Digital assistants (and the smart technologies connected with them) aim to collect even more personal data [than smartphones]” (Ezrachi and Stucke 1279). These machines do indeed collect immense amounts of information. You may be wondering how they collect it if all users do is ask them questions. This is an interesting thing to examine. As users, we ask virtual assistants questions about everything from the weather to the length of one’s commute to information about a specific singer, and we ask them to play our favorite playlists. These may sound like simple commands, but as each question is asked, the information is stored on the internal servers of the company that owns the machine, which in turn provides that company with information on each of the machine’s users. The same article references a situation from 2017, when the company VIZIO was caught tracking the television shows its users watched, without their consent, in order to tailor advertisements to its consumers. Just as a company like VIZIO could do this, so could the makers of virtual assistants. The only difference is that virtual assistants collect a greater amount of data, because more people interact with the machine, including children, guests, and anyone else who visits a home that owns one (Ezrachi and Stucke 1283). These machines can gain access to a user’s location, as well as his or her likes and dislikes, from interpretations of the commands they are given. 

In addition to the issue of large corporations knowing too much about us, the Berkeley Technology Law Journal article argues that government surveillance should be another concern. After providing an example of how a government agent can hack into smartphones, the authors go on to say, “Governments would have similar… abilities to hack digital assistants to monitor and gather evidence” (Ezrachi and Stucke 1282). It is evident, then, that the information shared with virtual assistants can be used for far more than answering a question or carrying out a simple task. 

Users of virtual assistants often confide in their machines as if they were a partner or significant other (Woods). Because they feel such an intimate relationship with their devices, users tend to share as much information with them as they would with a close friend or relative, or more. As a result, the machines get to know the user inside and out, storing detailed and personal information about him or her. In an interview with Forbes.com, an Amazon Echo spokesperson said that the company collects data on its users to “improve the customer’s experience” (O’Flaherty). Amazon also told Forbes.com that the Echo Dot does indeed record its users, and that these recordings are listened to by select Amazon employees. The company claims this is done to “train its speech recognition and natural language understanding systems” (O’Flaherty). Although that may be true, and the employees cannot identify the people they are listening to, as O’Flaherty states, “Amazon never explicitly tells users that a human could be listening to [them].” 

Owning and using a virtual assistant is becoming increasingly popular. According to the Pew Research Center, as of 2017, 46% of American adults frequently used a virtual assistant (Madden and Rainie). As a result, the privacy issues raised by how these machines store and interpret our daily lives should concern a greater number of people. Furthermore, virtual assistants are not used only by adults; they have also become intriguing to children. Since these machines speak to you as if they were your friend, children are easily entertained by asking the virtual assistant silly questions and waiting to hear the response that has been curated by the company that created the device. 

Works Cited

Ezrachi, Ariel, and Maurice E. Stucke. “How Digital Assistants Can Harm Our Economy, Privacy and Democracy.” Berkeley Technology Law Journal, vol. 32, no. 3, 2017.

Madden, Mary, and Lee Rainie. “Americans’ Attitudes About Privacy, Security and Surveillance.” Pew Research Center.

O’Flaherty, Kate. “Amazon Staff Are Listening to Alexa Conversations—Here’s What to Do.” Forbes.com, https://www.forbes.com/sites/kateoflahertyuk/2019/04/12/amazon-staff-are-listening-to-alexa-conversations-heres-what-to-do/#42eaa6b71a22. Accessed 16 November 2019.

Woods, Heather Suzanne. “Asking More of Siri and Alexa: Feminine Persona in Service of Surveillance Capitalism.” Critical Studies in Media Communication, vol. 35, no. 4, 2018, https://doi.org/10.1080/15295036.2018.1488082. Accessed 4 November 2019.
