A place to cache linked articles (think custom and personal wayback machine)

4 years ago
title: Against Black Inclusion in Facial Recognition
url: https://digitaltalkingdrum.com/2017/08/15/against-black-inclusion-in-facial-recognition/
hash_url: cf5eab15b9590499ccb6d989f50fe5e3
<p>By Nabil Hassein</p>
<p><span>Researchers have documented the frequent inability of facial recognition software to detect Black people’s faces due to programmers’ use of unrepresentative data to train machine learning models.</span><span>1</span><span> This issue is not unique, but systemic; in a related example, automated passport photo validation has registered Asian people’s open eyes as being closed.</span><span>2</span><span> Such technological biases have precedents in mediums older than software. For example, color photography was initially optimized for lighter skin tones at the expense of people with darker skin, a bias corrected mainly due to the efforts and funding of furniture manufacturers and chocolate sellers to render darker tones more easily visible in photographs — the better to sell their products.</span><span>3</span><span> Groups such as the Algorithmic Justice League have made it their mission to “highlight algorithmic bias” and “develop practices for accountability during the design, development, and deployment of coded systems”.</span><span>4</span><span> I support all of those goals abstractly, but at a concrete level, I question whose interests would truly be served by the deployment of automated systems capable of reliably identifying Black people.</span></p>
<p><span>As a longtime programmer, I know first-hand that software can be an unwelcoming medium for Black folks, not only because of racism among programmers, but also because of biases built into code, which programmers can hardly avoid as no other foundations exist to build on. It’s easy for me to understand a desire to rid software of these biases. Just last month, I wrote up a sketch of a proposal to decolonize the Pronouncing software library</span><span>5</span><span> I used in a simple art project to generate rhymes modeled on those of my favorite rapper.</span><span>6</span><span> So I empathized when I heard Joy Buolamwini of the Algorithmic Justice League speak on wearing a white mask to get her own highly imaginative “Aspire Mirror” project involving facial recognition to perceive her existence.</span><span>7</span><span> Modern technology has rendered literal Frantz Fanon’s metaphor of “Black Skin, White Masks”.</span><span>8</span></p>
<p><span>Facial recognition has diverse applications, but as a police and prison abolitionist, the enhancement of state-controlled surveillance cameras (including police body cameras) to automatically identify people looms much larger in my mind than any other use.</span><span>9</span><span> Researchers at Georgetown University found that fully half of American adults, or over 100 million people, are registered in one or another law enforcement facial recognition database, drawing from sources such as driver’s license photos.</span><span>10</span><span> Baltimore Police used the technology to identify participants in the uprising following the murder of Freddie Gray.</span><span>11</span><span> The US government plans to use facial recognition to identify every airline passenger exiting the United States.</span><span>12</span><span> Machine learning researchers have even reinvented the racist pseudoscience of physiognomy, in a study claiming to identify criminals with approximately 90% accuracy based on their faces alone — using data provided by police.</span><span>13</span></p>
<p><span>I consider it obvious that most if not all data collected by police to serve their inherently racist mission will be severely biased. It is equally clear to me that no technology under police control will be used to hold police accountable or to benefit Black folks or other oppressed people. Even restricting our attention to machine learning in the so-called “justice” system, examples abound of technology used to harm us, such as racist predictive models used by the courts to determine bail and sentencing decisions — matters of freedom and captivity, life and death.</span><span>14</span><span> Accordingly, I have no reason to support the development or deployment of technology which makes it easier for the state to recognize and surveil members of my community. Just the opposite: by refusing to don white masks, we may be able to gain some temporary advantages by partially obscuring ourselves from the eyes of the white supremacist state. The reality for the foreseeable future is that the people who control and deploy facial recognition technology at any consequential scale will predominantly be our oppressors. Why should we desire our faces to be legible for efficient automated processing by systems of their design? We could demand instead that police be forbidden to use such unreliable surveillance technologies. Anti-racist technologists could engage in high-tech direct action by using the limited resources at our disposal to further develop extant techniques for tricking machine learning models into misclassifications,</span><span>15</span><span> or distributing anti-surveillance hardware such as glasses designed to obscure the wearer’s face from cameras.</span><span>16</span></p>
<p><span>This analysis clearly contradicts advocacy of “diversity and inclusion” as the universal or even typical response to bias. Among the political class, “Black faces in high places” have utterly failed to produce gains for the Black masses.</span><span>17</span><span> Similarly, Black cops have shown themselves just as likely as white cops to engage in racist brutality and murder.</span><span>18</span><span> Why should the inclusion of Black folks in facial recognition, or for that matter, the racist technology industry be different? Systemic oppression cannot be addressed by a change in the complexion of the oppressor, as though a rainbow 1% and more white people crowding the prisons would mean justice. That’s not the world I want to live in. We must imagine and build a future of real freedom.</span></p>
<p><span>All of the arguments I’ve presented could be (and have been) applied to many domains beyond facial recognition. I continue to grapple with what that means for my own work as a technologist and a political organizer, but I am firm already in at least two conclusions. The first is that despite every disadvantage, we must reappropriate oppressive technology for emancipatory purposes. The second is that the liberation of Black folks and all oppressed peoples will never be achieved by inclusion in systems controlled by a capitalist elite which benefits from the perpetuation of racism and related oppressions. It can only be achieved by the destruction of those systems, and the construction of new technologies designed, developed, and deployed by our own communities for our own benefit. The struggle for liberation is not a struggle for diversity and inclusion — it is a struggle for decolonization, reparations, and self-determination. We can realize those aspirations only in a socialist world.</span></p>
<p><i><span>Nabil Hassein is a software developer and organizer based in Brooklyn, NY.</span></i></p>
<ol>
<li><a href="https://www.digitaltrends.com/photography/google-apologizes-for-misidentifying-a-black-couple-as-gorillas-in-photos-app/"><span>https://www.digitaltrends.com/photography/google-apologizes-for-misidentifying-a-black-couple-as-gorillas-in-photos-app/</span></a><span>;</span><a href="https://www.theguardian.com/technology/2017/may/28/joy-buolamwini-when-algorithms-are-racist-facial-recognition-bias"> <span>https://www.theguardian.com/technology/2017/may/28/joy-buolamwini-when-algorithms-are-racist-facial-recognition-bias</span></a><span>↩</span></li>
<li><a href="https://www.dailydot.com/irl/richard-lee-eyes-closed-facial-recognition/"><span>https://www.dailydot.com/irl/richard-lee-eyes-closed-facial-recognition/</span></a><span>↩</span></li>
<li><a href="https://petapixel.com/2015/09/19/heres-a-look-at-how-color-film-was-originally-biased-toward-white-people/"><span>https://petapixel.com/2015/09/19/heres-a-look-at-how-color-film-was-originally-biased-toward-white-people/</span></a><span>↩</span></li>
<li><a href="https://www.ajlunited.org/"><span>https://www.ajlunited.org</span></a><span>↩</span></li>
<li><a href="https://pronouncing.readthedocs.io/en/latest/"><span>https://pronouncing.readthedocs.io/en/latest/</span></a><span>↩</span></li>
<li><a href="https://nabilhassein.github.io/blog/generative-doom/"><span>https://nabilhassein.github.io/blog/generative-doom/</span></a><span>↩</span></li>
<li><a href="https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms/"><span>https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms/</span></a><span>↩</span></li>
<li><span>Frantz Fanon: “Black Skin, White Masks”.↩</span></li>
<li><a href="https://theintercept.com/2017/03/22/real-time-face-recognition-threatens-to-turn-cops-body-cameras-into-surveillance-machines/"><span>https://theintercept.com/2017/03/22/real-time-face-recognition-threatens-to-turn-cops-body-cameras-into-surveillance-machines/</span></a><span>↩</span></li>
<li><a href="https://www.law.georgetown.edu/news/press-releases/half-of-all-american-adults-are-in-a-police-face-recognition-database-new-report-finds.cfm"><span>https://www.law.georgetown.edu/news/press-releases/half-of-all-american-adults-are-in-a-police-face-recognition-database-new-report-finds.cfm</span></a><span>↩</span></li>
<li><a href="http://www.aclunc.org/docs/20161011_geofeedia_baltimore_case_study.pdf"><span>http://www.aclunc.org/docs/20161011_geofeedia_baltimore_case_study.pdf</span></a><span>↩</span></li>
<li><a href="https://www.dhs.gov/sites/default/files/publications/privacy-pia-cbp030-tvs-may2017.pdf"><span>https://www.dhs.gov/sites/default/files/publications/privacy-pia-cbp030-tvs-may2017.pdf</span></a><span>↩</span></li>
<li><a href="https://www.rt.com/news/368307-facial-recognition-criminal-china/"><span>https://www.rt.com/news/368307-facial-recognition-criminal-china/</span></a><span>↩</span></li>
<li><a href="https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing"><span>https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing</span></a><span>↩</span></li>
<li><a href="https://codewords.recurse.com/issues/five/why-do-neural-networks-think-a-panda-is-a-vulture"><span>https://codewords.recurse.com/issues/five/why-do-neural-networks-think-a-panda-is-a-vulture</span></a><span>;</span><a href="https://cvdazzle.com/"> <span>https://cvdazzle.com/</span></a><span>↩</span></li>
<li><a href="https://blogs.wsj.com/japanrealtime/2015/08/07/eyeglasses-with-face-un-recognition-function-to-debut-in-japan/"><span>https://blogs.wsj.com/japanrealtime/2015/08/07/eyeglasses-with-face-un-recognition-function-to-debut-in-japan/</span></a><span>↩</span></li>
<li><span>Keeanga-Yamahtta Taylor: “From #BlackLivesMatter to Black Liberation”, Chapter 3, “Black Faces in High Places”.↩</span></li>
<li><a href="https://mic.com/articles/118290/it-s-time-to-talk-about-the-black-police-officers-who-killed-freddie-gray"><span>https://mic.com/articles/118290/it-s-time-to-talk-about-the-black-police-officers-who-killed-freddie-gray</span></a><span>↩</span></li>
</ol>