User:Angeliki/Reading notes

Latest revision as of 13:44, 23 March 2023

== Simone, A. (2018) Improvised Lives: Rhythms of Endurance in an Urban South ==

Notes and abstract from Chapter 1, 'The Uninhabitable'.

=== Abstract ===

It refers to an urban life that is improvised and impossible to police because it cannot be repeated a second time; there is no easily recognisable pattern. This is an uninhabitable world. People intervene in each other's lives, and there is a lot of information but no way to resolve it. Unemployed men wait, and women live in domestic spaces divided by walls which "are not just porous sieves of information but marks of complex geographies where bonds and cuts in webs of lateral relations are made".

Even improvised lives need a place to be held and supported. This book supports the practice of districting. Taking the example of Black urbanisation and the work of Sun Ra, exo-planetary efforts are made to become part of the centre of the city. "For Sun Ra, then, districting referred to an incessantly inventive practice of operating in the discontinuities between having a location in which one is identified and from which one can identify and speak to others"


(Paths to be constantly crossed.

Constantly changing paths are difficult to police (that is a form of resistance).

The people don't trust the big people behind the scenes, and they know all of them.)



=== Notes ===

The non-repeatability is a form of resistance because it is impossible to police it.


== Burgess, M. (no date) ‘This Algorithm Could Ruin Your Life’, Wired. Available at: https://www.wired.com/story/welfare-algorithms-discrimination/ (Accessed: 14 March 2023). ==

=== Some highlights ===

"Imane’s background and personal history meant the system ranked her as 'high risk.'"

"These include its machine learning model, training data, and user operation manuals. The disclosures provide an unprecedented view into the inner workings of a system that has been used to classify and rank tens of thousands of people. With this data, we were able to reconstruct Rotterdam’s welfare algorithm and see how it scores people"
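To make the quote above concrete, here is a purely illustrative sketch of how a risk-scoring model can rank welfare recipients. This is not Rotterdam's actual model; every feature name and weight below is hypothetical, and a real system would learn its weights from historical welfare data, which is precisely how discriminatory proxies can enter the rankings.

```python
# Illustrative only: a toy logistic risk-scoring model, not Rotterdam's system.
# All feature names and weights are hypothetical.
import math

# A trained model would learn these weights from historical case data.
WEIGHTS = {
    "months_on_welfare": 0.02,
    "missed_appointments": 0.5,
    "dutch_language_skill": -0.4,  # proxy features like this can encode discrimination
}
BIAS = -1.0

def risk_score(person: dict) -> float:
    """Logistic score in (0, 1): higher means flagged as higher 'fraud risk'."""
    z = BIAS + sum(WEIGHTS[k] * person.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

def rank(people: dict) -> list:
    """Rank people from highest to lowest risk score."""
    return sorted(people, key=lambda name: risk_score(people[name]), reverse=True)

people = {
    "A": {"months_on_welfare": 6, "missed_appointments": 0, "dutch_language_skill": 3},
    "B": {"months_on_welfare": 24, "missed_appointments": 2, "dutch_language_skill": 1},
}
print(rank(people))  # person B is ranked as higher risk
```

The point of the sketch is that the ranking is entirely determined by which features and weights the model was given; whoever controls those choices controls who ends up "high risk".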

"Experts who reviewed our findings expressed serious concerns that the system may have discriminated against people"

"More than 20,000 families [in Rotterdam] were wrongly accused of childcare benefit fraud after a machine learning system was used to try to spot wrongdoing"

"Each week, she meets with a group of mostly single mothers, many of whom have a Moroccan background, to talk, share food, and offer each other support."

"De Rotte, the director of the city’s income department, says these changes include adding a “human dimension” to its welfare processes."


=== Notes ===

"The pattern of local and national governments turning to machine learning algorithms is being repeated around the world." This reminds me of the article about voice recognition software used to screen refugees: https://gizmodo.com/experts-worry-as-germany-tests-voice-recognition-softwa-1793424680. The software's inaccuracy affected the lives of many asylum seekers. The government was trying to replace the work of linguistic experts entirely with this algorithm.

The relationship between tech companies and governments in developing tools for public use is opaque to the public: "The system, which was originally developed by consulting firm Accenture before the city took over development in 2018, is trained on data collected by Rotterdam’s welfare department."

There is a lack of communication and coordination between tech developers and social workers: "The government auditor found there was “insufficient coordination” between the developers of the algorithms and city workers who use them, which could lead to ethical considerations being neglected."

Platforms like Facebook become spaces for creating support groups: "Throughout her investigations, she has heard other people’s stories, turning to a Facebook support group set up for people having problems with the Netherlands’ welfare system."