University of Wisconsin–Madison

Another Short Assignment in a Course about Rhetoric & Power on the Internet

Kathleen Daly (Communication Arts 478: Rhetoric and Power on the Internet)

The final short assignment asks you to analyze and interrogate the privacy and data collection policies and practices that inform our everyday digital lives. Specifically, you will explore data collection and use in two contexts: 1) your personal, everyday data footprint and 2) the data policies and practices of a particular online platform.

Part 1: Taking an Inventory of Your Personal Data Footprint

DUE THURSDAY, APRIL 26th

Terms-of-use policies describing data collection and use are required by law, but they are lengthy and difficult to understand when they are read at all. Even more problematic, everyday users are often led to believe that the data they contribute is a vital and even beneficial component of the services they seek from a given platform. To complicate these shared assumptions about the data collection practices and policies that undergird our everyday digital lives, you will each keep a log of your digital movement/activity over the course of 12 hours. We will then work in class to unpack the various layers and types of data generated and collected during this time period.

Your internet use log may be submitted as a bulleted list, a spreadsheet, a table, or some other readable format. Because the volume of internet/internet-connected device use will vary from person to person, there is no set minimum or maximum requirement for how many entries you include in your log. However, you should strive to be as detailed and accurate as possible as you log your internet footprint.

Part 2: Analyzing a Platform’s Data Collection Policies and Practices

Select a platform (Facebook, Google, Amazon, etc.) to analyze. Write an informal explanation (~300 words) of that platform’s data collection policies and practices, addressing the following questions:

• What are the default settings of this system?

• What modifications to these settings are possible?

• How easy are these settings to manage?

• What are the terms of using this system?

• What are users agreeing to when they use this service?

• How are these terms (and any changes to them) communicated to users?

Part 3: Discussion

In an age when participation in so many life activities—including commerce, education, civic discourse, and personal communication—requires users to relinquish rights to their personal data and content, norms regarding responsible and ethical collection, management, circulation, and use of content and data need to change. Although internet users have some degree of agency in choosing among available settings and services, these choices are quite limited and are ultimately controlled by technology developers and providers. Therefore, the most pressing change that needs to happen is in our shared expectations for how such systems are designed, and in the accountability we expect from software companies and service providers to offer ethical systems.

Based on your work from the first two parts of this assignment, write a short (~500 words) reflective essay that addresses the following questions: How might we, as a culture, hold software developers and companies responsible and accountable for designing systems that enact a different ethic, one that considers users’ privacy and ownership rights? What broader structural issues would we need to address in order to enact ethical data practices? What systemic changes might you propose to help users become more responsible and active participants rather than passive consumers of platforms and services?

Evaluation Criteria

You will be evaluated for Parts 1 and 2 of this assignment based on completion.

Part 3 Evaluation Criteria

• Does your discussion/reflection address key structural issues involved in the current standard for platforms’ user agreements?

• Do you use examples to back up your claims about the issues underlying most user agreements?

• Does your essay propose systemic/structural changes that adequately respond to those issues?

• Do you provide a clear explanation of how these systemic/structural changes would work to foster more ethical software/tech design and practices?