surveillance culture


Below are the slides, audio, and transcripts for my talk “‘Any Sufficiently Advanced Neglect is Indistinguishable from Malice’: Assumptions and Bias in Algorithmic Systems,” given at the 21st Conference of the Society for Philosophy and Technology, back in May 2019.

(Cite as: Williams, Damien P. “‘Any Sufficiently Advanced Neglect is Indistinguishable from Malice’: Assumptions and Bias in Algorithmic Systems”; talk given at the 21st Conference of the Society for Philosophy and Technology; May 2019)

Now, I’ve got a chapter coming out about this, soon, which I can provide as a preprint draft if you ask, and which can be cited as “Constructing Situated and Social Knowledge: Ethical, Sociological, and Phenomenological Factors in Technological Design,” appearing in Philosophy And Engineering: Reimagining Technology And Social Progress. Guru Madhavan, Zachary Pirtle, and David Tomblin, eds. Forthcoming from Springer, 2019. But I wanted to get the words I said in this talk up onto some platforms where people can read them, as soon as possible, for a couple of reasons.

First, the Current Occupants of the Oval Office have very recently taken the policy position that algorithms can’t be racist, something which they’ve done in direct response to things like Google’s Hate Speech-Detecting AI being biased against black people, and Amazon claiming that its facial recognition can identify fear, without ever accounting for, I dunno, cultural and individual differences in fear expression?

[Free vector image of a white, female-presenting person, from head to torso, with biometric facial recognition patterns on her face; incidentally, go try finding images—even illustrations—of a non-white person in a facial recognition context.]


All these things taken together are what made me finally go ahead and get the transcript of that talk done, and posted, because these are events and policy decisions about which I a) have been speaking and writing for years, and b) have specific inputs and recommendations, and which are c) frankly wrongheaded, and outright hateful.

And I want to spend time on it because I think what doesn’t get through in many of our discussions is that it’s not just about how Artificial Intelligence, Machine Learning, or Algorithmic instances get trained, but about the processes by which, and the cultural environments in which, HUMANS are increasingly taught/shown/environmentally encouraged/socialized into what they come to think is the “right way” to build and train said systems.

That includes classes and instruction, it includes the institutional culture of the companies, it includes the policy landscape in which decisions about funding get made, because that drives how people have to talk and write and think about the work they’re doing, and that constrains what they will even attempt to do or understand.

All of this is cumulative, accreting into institutional epistemologies of algorithm creation. It is a structural and institutional problem.

So here are the Slides:

The Audio:

[Direct Link to Mp3]

And the Transcript is here below the cut:


“Any Sufficiently Advanced Police State…”
“…Is indistinguishable from a technocratic priestly caste?”
Ingrid Burrington and Me, 04/17/15

As I said the other day, I’ve been thinking a lot about death, lately, because when two members of your immediate family die within weeks of each other, it gets into the mind. And when that’s woven through with more high-profile American police shootings, and then capped by an extremely suspicious death while in the custody of police, even more so, right? I’m talking about things like Walter Scott and Freddie Gray, and the decision in the Rekia Boyd case, all in a span of a few weeks.

So I’m thinking about the fact that everyone’s on the police bodycam trip, these days, especially in the USA–which, by the way, will be the main realm of my discussion; I’m not yet familiar enough with their usage and proliferation in other countries to feel comfortable discussing them, so if any of you has more experience with and references to that, please feel free to present them in the comments, below. But, for now, here, more and more people are realizing that this is another instance of thinking a new technology will save us all, by the mere virtue of its existing. But as many people noted at Theorizing The Web, last week, when those in control of the systems of power start to vie for a thing just as much as those who wanted to use that thing to Disrupt power? Maybe it’s not as disruptive a panacea as you thought.

We’ve previously discussed the nature of the Thick Blue Wall–the interconnected perspectives and epistemological foundations of those working on the prosecutorial side of the law, leading to lower likelihoods of any members of those groups being charged with wrongdoing, at all, let alone convicted. With that in mind, we might quickly come to a conclusion that wide proliferation of bodycams will only work if we, the public, have unfettered access to the datastream. But this position raises all of the known issues of that process inherently violating the privacy of the people being recorded. So maybe it’s better to say that bodycams absolutely will not work if the people in control of the distribution and usage of the recordings are the police, or any governing body allied with the police.

If those authorities in charge of maintaining the status quo are given the job of self-oversight, then all we’ll have on our hands is a recapitulation of the same old problem–a Blue Firewall Of Silence. There’ll be a data embargo, with cops, prosecutors, judges, and union reps getting to decide how much of which angles of whose videos are “pertinent” to any particular investigation, and yeah, maybe you can make the “rest” of the tape available through some kind of Freedom Of Information Act-esque mechanism, but we have a clear vision of what that tends to look like, and exactly how long that process will take. We’re not exactly talking about Expedient Justice™, here.

So perhaps the real best bet, here, is to provide a completely disconnected, non-partisan oversight body, composed of people from every facet of society, and every perspective on the law–at least those who still Believe that a properly-leveraged system of laws can render justice. So you get, say, a prosecutor, a defense attorney, a PUBLIC defender, an exonerated formerly accused individual, a convicted felon, someone whose family member was wrongfully killed by the police, a judge, a cop. Different ethnicities, genders, sexualities, perceived disabilities. Run the full gamut, and create this body whose job it is to review these tapes and to decide by consensus what we get to see of them. Do this city by city. Make it a part of the infrastructure. Make sure we all know who they are, but never the exact details of their decision-making processes.

This of course gets immediately more complicated the more data we have to work with, and the more real-time analysis of it can be independently done, or intercepted by outside actors, and we of course have to worry about those people being influenced by those bad faith actors who would try to subvert our attempts at crafting justice… But the more police know that everything they do in every encounter they have with the public will be recorded, and that those recordings will be reviewed by an external review board, the closer we get to having consistent systems of accountability for those who have gotten Very used to being in positions of unquestioned, privileged, protected authority.

Either that, or we just create a conscious algorithmic system to do it, and hope for the best. But it seems like I might have heard, somewhere, that that was a sticky idea… One that people get really freaked out about, all the time. Hm.

All that being said, this is not to say that we ought not proliferate body cameras. It is to say that we must be constantly aware of the implications of our choices, and of the mechanisms by which we implement them. Because, if we’re not, then we run the risk of being at the mercy of a vastly interconnected and authoritarian technocracy, one which has the motive, means, and opportunity to actively hide anything it thinks we ought not concern ourselves with.

Maybe that sounds paranoid, but the possibility of that kind of closed-ranks overreach and our tendency toward supporting it–especially if it’s done in the name of “order”–are definitely there, and we’ll need to curtail them, if we want to consistently see anything like Justice.