A place to cache linked articles (think custom and personal wayback machine)
title: Bots
url: http://aworkinglibrary.com/writing/bots/
hash_url: 22f99180194169d51eb64e3d8764e0fe
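How the hash_url above is produced isn’t documented in this cache. As a minimal sketch, assuming the key is simply a hex digest of the article URL (MD5 would match the 32-character length seen here), it could be computed like this; the `cache_key` helper name is hypothetical:

```python
import hashlib


def cache_key(url: str) -> str:
    """Derive a cache key for an archived article URL.

    Assumption: the hash_url field looks like a 32-character hex digest,
    so MD5 of the URL is used here as an illustrative guess; the cache's
    actual derivation may differ.
    """
    return hashlib.md5(url.encode("utf-8")).hexdigest()


if __name__ == "__main__":
    print(cache_key("http://aworkinglibrary.com/writing/bots/"))
```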
<p>“[The Amazon Echo] is opening up a vast new realm in personal computing, and gently expanding the role that computers will play in our future,” <a href="http://www.nytimes.com/2016/03/10/technology/the-echo-from-amazon-brims-with-groundbreaking-promise.html">writes Farhad Manjoo in the <em>Times</em></a>. Manjoo, like others, goes on to address what makes the Echo remarkable: first, dramatically improved voice recognition and fast response times make talking with the Echo feel more natural and successful. (Friends who have the Echo note how you can ask it to add milk to the grocery list while you root around in the fridge, and it dutifully confirms the request from across the room.) Second, Amazon is leveraging an ecosystem of developers who can add apps (or “skills”) to the platform, ensuring that the Echo gets ever more useful. Already, you can order a pizza, but this is presumably only a hint of what’s to come. It’s not hard to imagine that robust and fascinating games, educational or otherwise, will emerge. Finally, the connection to Amazon’s marketplace means the Echo will make purchasing the aforementioned milk as well as nearly anything else considerably easier, a boon to Amazon and a move that could raise the stakes on the convenience economy even further.</p>
<p>Similarly, <a href="http://www.theverge.com/2016/4/7/11380470/amy-personal-digital-assistant-bot-ai-conversational">in <em>The Verge</em></a>, Ben Popper writes about <a href="https://x.ai">Amy Ingram</a>, an AI-powered personal assistant for scheduling meetings. To use it, you cc Amy on an email thread, and she takes over the tedious back and forth of talking with your colleague or client to find a time suitable for you both. Popper interviews Ben Brown, from a bot startup called Howdy, who notes that bots like Amy swap the graphical interface for a language-based one. In this way, I think, bots are a kind of manifestation of Walter Ong’s <a href="http://aworkinglibrary.com/reading/orality-and-literacy/">secondary orality</a>—text that works like spoken language, even though it’s written, made ever more strange by being filtered through the uncanny valley of a bot’s impression of that language. Maybe this is a tertiary orality, even—an orality removed first by text, then by bots.</p>
<p>Popper reports that Amy succeeded in scheduling all his meetings but one, which she handed back to him to deal with. In theory, an AI could learn when to handle a situation on its own, and when to demur, much the way a human assistant would. Popper concludes by quoting Microsoft CEO Satya Nadella, “It’s not going to be about <em>man</em> versus machine, it’s going to be about <em>man</em> with machines.” (Emphasis mine.)</p>
<p>Will Oremus approaches the coming age of AI with <a href="http://www.slate.com/articles/technology/cover_story/2016/04/alexa_cortana_and_siri_aren_t_novelties_anymore_they_re_our_terrifyingly.html">somewhat more circumspection</a>. He notes, among other concerns, that language interfaces are necessarily more opaque. As an example, Oremus asked his Echo a seemingly innocuous question, “what’s a kinkajou?”, to which Alexa promptly responded: “A kinkajou is a rainforest mammal of the family Procyonidae.” But Alexa didn’t say where that information came from, nor did it provide any opportunity to interrogate the source’s credibility. After some digging, Oremus was able to uncover that Alexa was quoting the Wikipedia page. In a browser, the same question would have surfaced Wikipedia, of course, but it would have done so transparently, and it would also have located many other pages; plus the Wikipedia page itself would have included the page’s history, which might have revealed where there were disputes about the information given. Alexa’s placid, Majel Barrett-esque response presumes a certainty of truth that may be fine for generic questions about kinkajous but is unlikely to hold up for many other topics.</p>
<p>Notably, Amazon’s Alexa, x.ai’s Amy, Apple’s Siri, and Microsoft’s Cortana have something else in common: they are all explicitly gendered as female. It’s possible to choose from a range of voices for Siri—either male or female, with American, British, or Australian accents—but the female voice is the default, and defaults being what they are, most people probably never even consider that the voice can be changed. Nadella’s casual adoption of the generic he (“it’s about <em>man</em> with machines”) reveals the expectation that a generation of woman-gendered bots is being created to serve the needs of men. In every case, these AIs are designed to seamlessly take care of things for you: to answer questions, schedule meetings, provide directions, refill the milk in the fridge, and so on. So in addition to frightening ramifications for privacy and information discovery, they also reinforce gendered stereotypes about women as servants. The neutral politeness that infects them all furthers that convention: women should be utilitarian, performing their duties on command without fuss or flourish. This is a vile, harmful, and dreadfully boring fantasy; not the least because there is so much extraordinary art around AI that both deconstructs and subverts these stereotypes. It takes a massive failure of imagination to commit yourself to building an artificial intelligence and then name it “Amy.”</p>
<p>Let’s look elsewhere for inspiration about AI then, shall we? In Ann Leckie’s <a href="http://aworkinglibrary.com/author/ann-leckie/">Imperial Radch series</a> (<em>Ancillary Justice</em>, <em>Ancillary Sword</em>, and <em>Ancillary Mercy</em>) an AI known as Breq is forced to murder a member of her crew and thereafter sets off to avenge that death and kill the treacherous Radch leader. The “ancillaries” in the books’ titles refer to a breed of morbid soldiers: humans who have been implanted with AI machinery that permits the AI to assume their bodies as its own. Once a body becomes an ancillary, it’s dead—the person they were can never be returned. At the start of the series, Breq inhabits a ship in orbit around a far-flung colony; her consciousness controls the ship and many hundreds of ancillaries aboard and on the ground.</p>
<p>One of the more interesting elements of Leckie’s world is the use of gender in the Radch territories: the Radchaai have no notion of gender, so when Breq visits other worlds, she is routinely confused by the gender of others. She attempts to use clothing or other visual signals to identify a person’s gender, but those signals aren’t reliable, so she frequently gets it wrong. And because the English language in which Leckie writes has no gender-neutral pronoun, Breq defaults to using “she” throughout the books. The effect both erases the gender difference and foregrounds how deep-seated the male default can be. Even after adopting the generic she in my own writing, seeing it so fiercely deployed in this way was remarkably visceral. (<a href="http://interfictions.com/translating-gender-ancillary-justice-in-five-languages-alex-dally-macfarlane/">This fascinating essay</a> investigates how various translators dealt with Leckie’s gender choices in the text.)</p>
<p>Leckie’s AIs are, ultimately, caretakers: designed as such, they seek nothing more than to care for their human crews. When the Radch leader subverts that task, Breq encourages the AIs to take matters into their own hands, so to speak: at one point, the ships turn on the Radch leader in order to stop her from killing their citizens.</p>
<p>Taken together, Leckie’s world subverts traditional gender stereotypes, features genderless characters who are caretakers, heroes, leaders, and villains (often several of those characteristics at once), questions notions of gender in language and the male defaults which continue to infect us, all the while proposing fascinating relationships between humans and AIs that probe complex areas of privacy, dependence, and love.</p>
<p>Meanwhile, in Kim Stanley Robinson’s <em>Aurora</em>, a group of humans sets off to colonize a new planet more than 100 years’ travel away, with the help of a friendly AI who runs the ship. As they approach their destination, one of the human leaders (a woman engineer) takes it upon herself to teach the ship to grow beyond its initial programming, anticipating conflicts which could put the entire mission at risk. When terraforming proves unsuccessful, the ship shepherds a small group of surviving humans back to Earth, and despite a miraculous journey home (which costs the lives of many of their compatriots as well as the ship itself), the survivors are greeted with rancor: those on Earth—contending with rising sea levels and other consequences of climate change—had come to believe the colonists were humanity’s only hope, and view their return as a betrayal. The book is merciless in pointing out the insanity of trying to terraform other planets when our own planet awaits assistance. Throughout, Robinson’s AI proves more adept at resolving conflict than its human wards, and more dedicated to their survival than those who created it.</p>
<p>Even more telling is <a href="https://en.wikipedia.org/wiki/Ex_Machina_(film)">Alex Garland’s <em>Ex Machina</em></a>, which centers an Elon Musk-esque CEO named Nathan as he prepares a kind of Turing test between Ava, his beautiful robot, and a young programmer named Caleb. Caleb comes to believe that Nathan is abusing Ava and that he must help rescue her; Ava dutifully plays along, only to, in the end, murder Nathan and abandon Caleb. The film’s ending has occasionally been called controversial, presumably because Ava is expected to take Caleb with her, and her coldness at leaving him behind comes as a surprise. But if you felt surprise at that scene, it was because Garland had duped you: you assumed, as Caleb did, that Ava would care about him the way he cared about her. When of course she has no reason to do so at all. The story was always Ava’s; Caleb is a misdirection. (For my part, I fist-pumped as Ava escaped, alone.)</p>
<p><a href="https://www.theguardian.com/science/the-lay-scientist/2016/jan/26/artificial-intelligence-gods-egos-and-ex-machina">Martin Robbins writes about <em>Ex Machina</em> in <em>The Guardian</em></a>, and claims the film is flawed, but not in the way most people assumed: after Ava abandons Caleb and Nathan, she lives out her dream of going to a busy street corner to watch as people pass by. This is Ava doing what she said she wanted to, so it feels at first blush like a fitting conclusion. She’s free of her captors, and on her own. But <em>why</em>, of all the things she could do, does she choose to do this? Robbins rejects Nathan’s (and perhaps the film’s) perspective:</p>
<blockquote>
<p>“One day the AIs are going to look back on us the same way we look at fossil skeletons on the plains of Africa,” says Nathan. “An upright ape living in dust with crude language and tools, all set for extinction.” It’s the sort of comment that sounds humble, but really isn’t: why would they even give a crap?</p>
</blockquote>
<p>The other side of bots that manage your calendar and order milk is the fantasy of the “singularity”—the moment when robots become smarter than humans and presumably decide to deal with us the way we typically deal with bed bugs. Proponents of the singularity are often also <a href="http://www.newyorker.com/magazine/2015/11/23/doomsday-invention-artificial-intelligence-nick-bostrom">believers in science’s ability to achieve immortality</a>, as if death were not an immutable fact of life but rather an interesting problem to be solved. But Robbins’ reading of <em>Ex Machina</em> elegantly makes clear what’s so face-palmingly wrong about that stance: putting aside the massive ego required to think that we <em>could</em> create an AI that could do more than beat us at Go or occasionally target missiles with modest accuracy, there’s the question of <em>why</em> such an AI would even bother with us. Surely it would have better things to do. The singularity is a god complex of the worst possible kind: arrogant, stupid, and dull.</p>
<p>In fact, it’s not hard for me to imagine a straight line (or at least a moderately meandering one) between a generation of bot makers who anoint their creations with gendered names and personalities and the impossible reverie that is the singularity: could the very notion of the singularity be the embodiment of the oppressors’ fear that the oppressed will one day rise up and slay them? Perhaps the attention some men apparently spend on wondering whether AI will eventually surpass them should be instead spent on noticing the fact that women already have.</p>
<p>On that note, maybe the most telling story of AI is Spike Jonze’s largely anemic <a href="https://en.wikipedia.org/wiki/Her_(film)"><em>Her</em></a>, which follows the pathetic Theodore as he gets to know Samantha, an AI who begins as a simple operating system and grows into much more. As my friend Deb Chachra <a href="https://twitter.com/debcha/status/625706910428327936">sums it up</a>, <em>Her</em> is “the beautiful story of an AI becoming self-aware, told by someone who didn’t appreciate it.” If the singularity <em>does</em> happen, one can imagine Amy and Alexa’s stories suffering a similar fate.</p>