Joseph Reagle, Northeastern University
2018-10
In 2017, globalism, sexist culture, and Trumpian nepotism prompted Dictionary.com to recognize complicity as their word of the year (Dictionary.com (2017)).
I usually lose a lot of things like my keys… [the implant] will give me access and help me. (Sandra Haglof in Salles (2017))
[I want to be] part of the future.
These questions are about problematic technology: tech-related artifacts, ideology, and techniques that have potential to be broadly harmful.
Digital technology—and its assessment—are especially problematic.
Who beyond the thief is obliged to compensate the victim of the theft?
Aquinas said we must consider those who contribute “by command, by counsel, by consent, by flattery, by receiving, by participation, by silence, by not preventing, [and] by not denouncing” (1917, II.II.7.6).
- BF (Badness Factor) of the principal wrongdoing
- RF (Responsibility Factor)
  - V: voluntariness
  - Kc: knowledge of contribution
  - Kw: knowledge of wrongness
- CF (Contribution Factor)
  - C: centrality, including essentiality
  - Prox: proximity
  - Rvse: reversibility
  - Temp: temporality
  - Pr: planning role
  - Resp: responsiveness
- SP (Shared Purpose) with wrong-doers
| Shady Non-Contributors | Non-Blameworthy Complicit Contributors | Blameworthy Complicit Contributors | Participating Co-principals |
|---|---|---|---|
| Kc=0 or Kw=0 or CF=0 | Kc=1 and Kw=1 and CF > 0 but V=0 | Kc=1 and Kw=1 but (0<CF<1 or 0<V<1 or 0<SP<max) | Kc=1 and Kw=1 and V=1 and CF=1 and SP=max |
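Read as a decision procedure, these conditions can be sketched in Python. This is an illustrative reading, not an implementation from the text: each factor is assumed to be scaled to [0, 1] (so SP's maximum is 1), and the function name is mine.

```python
def classify(Kc: float, Kw: float, V: float, CF: float, SP: float,
             SP_max: float = 1.0) -> str:
    """Map factor values (assumed in [0, 1]) to one of the four categories."""
    # No knowledge of the contribution or wrong, or no causal contribution.
    if Kc == 0 or Kw == 0 or CF == 0:
        return "Shady Non-Contributor"
    if Kc == 1 and Kw == 1:
        # Causal but involuntary, hence not responsible.
        if CF > 0 and V == 0:
            return "Non-Blameworthy Complicit Contributor"
        # Full, voluntary, purposive participation.
        if V == 1 and CF == 1 and SP == SP_max:
            return "Participating Co-principal"
        # Partial contribution, voluntariness, or shared purpose.
        if 0 < CF < 1 or 0 < V < 1 or 0 < SP < SP_max:
            return "Blameworthy Complicit Contributor"
    # The conditions in the table are not exhaustive over all factor values.
    return "Unclassified"
```

Coding the table also exposes its gaps: combinations it does not cover (e.g., partial knowledge) fall through to "Unclassified".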
Whereas complicit was Dictionary.com’s word of 2017, fake news took the honor at the Collins dictionary (Flood (2017)).
I feel tremendous guilt…. The short-term dopamine driven feedback loops that we have created are destroying how society works: no civil discourse, no cooperation, misinformation, mis-truth… (Palihapitiya (2017), min. 22:30)
because [of] the unintended consequences of a network when it grows to a billion or two billion people … it literally changes your relationship with society, with each other…. (Parker in Allen (2018))
CB = (responsibility × badness × contribution) + (responsibility × shared_purpose)
Let’s assume shared_purpose = 0
CB = (responsibility × badness × contribution)
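The two-term calculus can be written out directly. A minimal sketch, assuming each factor is scaled to [0, 1]; the function name and scaling are illustrative, not from the text:

```python
def complicit_blame(responsibility: float, badness: float,
                    contribution: float, shared_purpose: float = 0.0) -> float:
    """CB = (RF × BF × CF) + (RF × SP); factors assumed in [0, 1]."""
    return (responsibility * badness * contribution
            + responsibility * shared_purpose)
```

With shared_purpose = 0, the second term drops out, which is the simplification used in what follows.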
- V: voluntariness
- Kc: knowledge of contribution
- Kw: knowledge of wrongness

Absolutely.
The inventors, creators—it’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom on Instagram, it’s all of these people—understood this consciously. And we did it anyway. (Parker in Allen (2018))
I think in the back deep deep recesses of our minds we kind of knew something bad could happen, but I think the way we defined it was not like this. (Palihapitiya (2017))
“The Unanticipated Consequences of Purposive Social Action” (Merton (1936)) distinguishes between:
It was not as if propaganda was inconceivable (unknowable), only that they were largely ignorant of the possibility.
It’s clear now that we didn’t do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections, and hate speech, as well as developers and data privacy. We didn’t take a broad enough view of our responsibility, and that was a big mistake. (Zuckerberg in Timberg and Romm (2018))
RF (Responsibility Factor) = f(V, Kc, Kw)
CB = (RF=0.5 × BF × CF)
Insight: Unintended and unanticipated consequences complicate the assessment of responsibility, especially with respect to knowledge and types of ignorance.
It’s literally at a point now where I think we have created tools that are ripping apart the social fabric of how society works. (Chamath Palihapitiya (2017), min. 21:38)
Palihapitiya knew “something bad could happen” but it “was not like this.”
CB = (RF=0.5 × BF=0.7 × CF)
Insight: Again, contingency and scale complicate the assessment of badness, which raises other issues to which we will return.
Facebook contributed to the spread of false information. They could’ve done better.
CB = (RF=0.5 × BF=0.7 × CF=0.7)
CB ≈ 0.25
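A quick arithmetic check on the illustrative values (the 0.5 and 0.7 figures are the text's hypotheticals, not measurements):

```python
# RF × BF × CF with the values assumed above.
CB = 0.5 * 0.7 * 0.7
assert abs(CB - 0.245) < 1e-9  # i.e., 0.245, which rounds to 0.25
```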
If FB hadn’t succeeded with its platform, someone else would have.
Life hackers’ voluntary embrace of technology (1) eases coercive imposition and (2) increases normative pressure.
We believe that modern technology, driven by science, has an incredible (and largely unrealized) potential to support psychological, emotional, and spiritual wellbeing. (Consciousness Hacking (2016))
There is no law or regulation to limit the use of this kind of equipment in China. The employer may have a strong incentive to use the technology for higher profit, and the employees are usually in too weak a position to say no. (Qiao in Chen (2018)).
Insight: Whereas technologies of the self are performed by the self for the self’s benefit, those of power see the self dominated and objectivized by another (Foucault (1982/1997)).
Embracing, adopting, and testing problematic tech facilitates its deployment.
It adds grease to a slippery slope.
Little (1998) refers to the culpability of participating in cosmetic surgery as cultural complicity, which is “when one endorses, promotes, or unduly benefits from norms and practices that are morally suspect.”
Significant complicity. And those who encourage harmful norms and practices are “crassly complicit” (Little (1998), p. 170).
Negligible complicity.
Negligible complicity. (Though gurus are more so.)
Unlike many of the harms discussed in the literature, problematic tech is equivocal. Consequences differ with respect to:
e.g., Pavlok wrist zapper
Beyond the (limited) economic effect of such distancing, protests have an important role: by making a concern a topic of public discussion, protesters remove the potential for others to claim they were ignorant of their responsibility (i.e., “consciousness raising”).
(responsibility × badness × contribution)
| Shady Non-Contributors (non-causal) | Non-Blameworthy Complicit Contributors (causal but not responsible) | Blameworthy Complicit Contributors (causal & responsible) | Participating Co-principals (constitutive) |
|---|---|---|---|
| (not voluntary or knowledgeable of harm/role) | | | |
technologies of the self … permit individuals to effect … operations on their own bodies and souls, thoughts, conduct, and way of being, so as to transform themselves in order to attain a certain state of happiness, purity, wisdom, perfection, or immortality.
technologies of power … determine the conduct of individuals and submit them to certain ends or domination, an objectivizing of the subject; (Foucault (1982/1997), p. 225)
Imprisoning a class of people in a cage is a technology of hard power; shaming those who leave home without a chaperon is a technology of soft power.