## Ban technology
> “I’ve come to the conclusion that because information constantly increases, there’s never going to be privacy,” Mr. Scalzo said. “Laws have to determine what’s legal, but you can’t ban technology. Sure, that might lead to a dystopian future or something, but you can’t ban it.”
>
> <cite>*[The Secretive Company That Might End Privacy as We Know It](https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html)* ([cache](/david/cache/2020/1d190443e06aa99b44dd2a4d55b1b58e/))</cite>
Did you have fun with all those apps that make you look younger/older/ridiculous/whatever? Well, those images are now being used to recognize you with precision. You may have nothing to hide, but you have still endangered the people who do have something to hide. That is the foundation of “artificial” intelligence: a dataset large enough to learn from with relevance.
> The reality for the foreseeable future is that the people who control and deploy facial recognition technology at any consequential scale will predominantly be our oppressors. Why should we desire our faces to be legible for efficient automated processing by systems of their design?
>
> <cite>*[Against Black Inclusion in Facial Recognition](https://digitaltalkingdrum.com/2017/08/15/against-black-inclusion-in-facial-recognition/)* ([cache](/david/cache/2020/cf5eab15b9590499ccb6d989f50fe5e3/))</cite>
There is no shortage of examples; [we have even come to talk](http://bostonreview.net/science-nature-politics/annette-zimmermann-elena-di-rosa-hochan-kim-technology-cant-fix-algorithmic) ([cache](/david/cache/2020/d562b547dc4833f0eb84a67ec2a8465d/)) about *algorithmic fairness*. Which shows just how biased we currently are.
> The New York Daily News reported on Wednesday that a staffing agency hired by Google had sent its contractors to numerous American cities to target black people for facial scans. One unnamed former worker told the newspaper that in Atlanta, the effort included finding those who were homeless because they were less likely to speak to the media.
>
> <cite>*[Atlanta Asks Google Whether It Targeted Black Homeless People](https://www.nytimes.com/2019/10/04/technology/google-facial-recognition-atlanta-homeless.html)* ([cache](/david/cache/2020/384b330b3de6f4f2bac8c81f0f04c404/))</cite>
There is no shortage of facepalms either.