A place to cache linked articles (think of it as a custom, personal Wayback Machine)
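
Each cached entry below stores a title, a url, and a hash_url. A minimal sketch of how such a cache key might be derived, assuming the 32-hex-character hash_url is simply an MD5 digest of the article URL (the helper name is illustrative, not something defined by this repo):

```python
# Sketch of deriving a cache key for a linked article.
# Assumption: hash_url is the MD5 hex digest of the URL; this is inferred
# from the 32-hex-character value stored below, not confirmed by the repo.
import hashlib

def hash_url(url: str) -> str:
    """Return the hex digest used to index a cached article."""
    return hashlib.md5(url.encode("utf-8")).hexdigest()

if __name__ == "__main__":
    # Would print the key for the entry cached below, if the MD5 assumption holds.
    print(hash_url("https://theconvivialsociety.substack.com/p/the-questions-concerning-technology"))
```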

  1. title: The Questions Concerning Technology
  2. url: https://theconvivialsociety.substack.com/p/the-questions-concerning-technology
  3. hash_url: b404382125c07935b98295a801049097
  4. <p>A few days ago, a handful of similar stories or anecdotes about technology came to my attention. While they came from different sectors and were of varying degrees of seriousness, they shared a common characteristic. In each case, there was either an expressed bewilderment or admission of obliviousness about the possibility that a given technology would be put to destructive or nefarious purposes. Naturally, I tweeted about it … like one does. </p><p>I subsequently clarified that I was not subtweeting anyone in particular, just everything in general. Of course, naiveté, hubris, and recklessness don’t quite cover all the possibilities—nor are they mutually exclusive. </p><p>In response, someone noted that “people find it hard to ‘think like an *-hole’, in <a href="https://twitter.com/mathbabedotorg" rel="">@mathbabedotorg</a>'s phrase, because most aren’t.” That handle belongs to Cathy O’Neil, best known for her 2016 book, <em><a href="https://crownpublishing.com/archives/feature/big-data-increases-inequality-threatens-democracy" rel="">Weapons of Math Destruction: How Big Data Increases Inequality And Threatens Democracy</a></em>. </p><p>There’s something to this, of course, and, as I mentioned in my reply, I truly do appreciate the generosity of this sentiment. I suggested that the witness of history is helpful on this score, correcting and informing our own limited perspectives. But I was also reminded of a set of questions that I had put together back in 2016 in a moment of similar frustration. </p><p>The occasion then was the following <a href="https://om.co/2014/11/26/technology-and-the-moral-dimension/" rel="">observation</a> from Om Malik: </p><blockquote><p>“I can safely say that we in tech don’t understand the emotional aspect of our work, just as we don’t understand the moral imperative of what we do. It is not that all players are bad; it is just not part of the thinking process the way, say, ‘minimum viable product’ or ‘growth hacking’ are.”</p></blockquote><p>Malik went on to write that “it is time to add an emotional and moral dimension to products,” by which he seems to have meant that tech companies should use data responsibly and make their terms of service more transparent. In my response at the time, I took the opportunity to suggest that we needn’t add an emotional and moral dimension to tech; it was already there. The only question was as to its nature. As Langdon Winner had famously inquired, “Do artifacts have politics?” and answered in the affirmative, I likewise argued that artifacts have ethics. I then went on to produce a set of 41 questions that I drafted with a view to helping us draw out the moral or ethical implications of our tools. The post proved popular at the time and I received a few notes from developers and programmers who had found the questions useful enough to print out and post in their workspaces. </p><p>This was all before the subsequent boom in “tech ethics,” and, frankly, while my concerns obviously overlap to some degree with the most vocal and popular representatives of that movement, I’ve generally come at the matter from a different place and have expressed my own <a href="https://thefrailestthing.com/2017/11/06/one-does-not-simply-add-ethics-to-technology/" rel="">reservations</a> with the shape more recent tech ethics advocacy has taken. 
Nonetheless, I have defended the need to think about the moral dimensions of technology against the notion that all that matters are the underlying dynamics of political economy (e.g., <a href="https://thefrailestthing.com/2018/07/07/political-economy-or-ethics-of-technology/" rel="">here</a> and <a href="https://thefrailestthing.com/2018/10/24/in-defense-of-technology-ethics-properly-understood/" rel="">here</a>). </p><p>I won’t cover that ground again, but I did think it might be worthwhile to repost the questions I drafted then. It’s been more than six years since I first posted them, and, while some of you reading this have been following along since then, most of you picked up on my work in just the last couple of years. And, recalling where we began, trying to think like a malevolent actor might yield some useful insights, but I’d say that we probably need a better way to prompt our thinking about technology’s moral dimensions. Besides, worst-case malevolent uses are not the only kinds of morally significant aspects of our technology worth our consideration, as I hope some of these questions will make clear. </p><p>This is not, of course, an exhaustive set of questions, nor do I claim any unique profundity for them. I do hope, however, that they are useful, wherever we happen to find ourselves in relation to technological artifacts and systems. At one point, I had considered doing something a bit more with these, possibly expanding on each briefly to explain the underlying logic and providing some concrete illustrative examples or cases. Who knows, maybe that would be a good occasional series for the newsletter. Feel free to let me know what you think about that. </p><p>Anyway, without further ado, here they are: </p><ol><li><p>What sort of person will the use of this technology make of me?</p></li><li><p>What habits will the use of this technology instill?</p></li><li><p>How will the use of this technology affect my experience of time?</p></li><li><p>How will the use of this technology affect my experience of place?</p></li><li><p>How will the use of this technology affect how I relate to other people?</p></li><li><p>How will the use of this technology affect how I relate to the world around me?</p></li><li><p>What practices will the use of this technology cultivate?</p></li><li><p>What practices will the use of this technology displace?</p></li><li><p>What will the use of this technology encourage me to notice?</p></li><li><p>What will the use of this technology encourage me to ignore?</p></li><li><p>What was required of other human beings so that I might be able to use this technology?</p></li><li><p>What was required of other creatures so that I might be able to use this technology?</p></li><li><p>What was required of the earth so that I might be able to use this technology?</p></li><li><p>Does the use of this technology bring me joy? [N.B. This was years before I even heard of Marie Kondo!]</p></li><li><p>Does the use of this technology arouse anxiety?</p></li><li><p>How does this technology empower me? At whose expense?</p></li><li><p>What feelings does the use of this technology generate in me toward others?</p></li><li><p>Can I imagine living without this technology? 
Why, or why not?</p></li><li><p>How does this technology encourage me to allocate my time?</p></li><li><p>Could the resources used to acquire and use this technology be better deployed?</p></li><li><p>Does this technology automate or outsource labor or responsibilities that are morally essential?</p></li><li><p>What desires does the use of this technology generate?</p></li><li><p>What desires does the use of this technology dissipate?</p></li><li><p>What possibilities for action does this technology present? Is it good that these actions are now possible?</p></li><li><p>What possibilities for action does this technology foreclose? Is it good that these actions are no longer possible?</p></li><li><p>How does the use of this technology shape my vision of a good life?</p></li><li><p>What limits does the use of this technology impose upon me?</p></li><li><p>What limits does my use of this technology impose upon others?</p></li><li><p>What does my use of this technology require of others who would (or must) interact with me?</p></li><li><p>What assumptions about the world does the use of this technology tacitly encourage?</p></li><li><p>What knowledge has the use of this technology disclosed to me about myself?</p></li><li><p>What knowledge has the use of this technology disclosed to me about others? Is it good to have this knowledge?</p></li><li><p>What are the potential harms to myself, others, or the world that might result from my use of this technology?</p></li><li><p>Upon what systems, technical or human, does my use of this technology depend? Are these systems just?</p></li><li><p>Does my use of this technology encourage me to view others as a means to an end?</p></li><li><p>Does using this technology require me to think more or less?</p></li><li><p>What would the world be like if everyone used this technology exactly as I use it?</p></li><li><p>What risks will my use of this technology entail for others? Have they consented?</p></li><li><p>Can the consequences of my use of this technology be undone? Can I live with those consequences?</p></li><li><p>Does my use of this technology make it easier to live as if I had no responsibilities toward my neighbor?</p></li><li><p>Can I be held responsible for the actions which this technology empowers? Would I feel better if I couldn’t?</p></li></ol>