Privacy and data analytics
Speaking with Department of Philosophy professor Chelsea Rosenthal, it’s clear that ethics, and privacy issues in particular, are essential considerations in social data analytics.
“We need to ask ourselves,” says Rosenthal, explaining the overall emphasis behind SDA 270, “does our analysis focus on what’s going to help us achieve valuable goals, and are we achieving those goals in a way that overlooks something of ethical importance?”
Rosenthal will be examining data, ethics and society. Among other topics, she’ll be looking at privacy—why it’s valuable and why it matters. Covering informed consent and disclosure, this topic will examine how safeguarding individual privacy interacts with innovation. She will also ask students to consider what protecting privacy accomplishes in the context of social data analytics.
“Traditionally, ethical research requires that participants give informed consent for the use of their data, but with large-scale, population-level data, meaningful, informed consent may not be possible – and some scholars argue that it’s also not enough to prevent abuse. If that’s the case, what does adequately protecting privacy look like?” she asks.
Rosenthal will also be covering algorithmic bias and transparency, asking students to consider whether algorithms are as objective as intended. Alongside privacy, bias and discrimination are important concerns; they can be introduced, often unintentionally, as technology is developed. Without transparency and ethical oversight, and without knowing which issues are important to address, algorithms can reflect and perpetuate existing biases.
What matters?
She’ll also be challenging students to ask if what’s being measured is actually what matters.
“Is GDP really telling us what matters in an economy?” she suggests. “And are Quality Adjusted Life Years such a good measure for health outcomes?”