Inhumane Filtering

In his presentation last night, Thomas discussed the psychology of programmers as outlined in the “Hello World” chapter of Eli Pariser’s The Filter Bubble. Thomas’s basic argument was that programmers are so focused on creating great technology that they have a “willful blindness” to how their inventions impact the world economically, culturally, and socially. As I mentioned in class, whenever a Facebook engineer adds a new feature or changes a policy, it affects the stored data of a billion people. Moreover, since Facebook represents a filtered version of the Internet, that same engineer’s “tweak” to an algorithm can filter the world in a way that has real effects.

Evgeny Morozov, author of The Net Delusion, whom we will read at the end of the semester, lashes out against the techno-utopian argument in Parag and Ayesha Khanna’s recent ebook Hybrid Reality: Thriving in the Emerging Human-Technology Civilization:

For the Khannas, technology is an autonomous force with its own logic that does not bend under the wicked pressure of politics or capitalism or tribalism; all that we humans can do is find a way to harness its logic for our own purposes. Technology is the magic wand that lifts nations from poverty, cures diseases, redistributes power, and promises immortality to the human race. Nations, firms, and cities that develop the smartest and most flexible way of doing this are said to possess Technik—a German term with a substantial intellectual pedigree that, in the Khannas’ hands, can mean just about anything—and a high “technology quotient.”

In both Pariser’s and Morozov’s estimations, programmers and the Khannas, respectively, see technology as an autonomous and neutral force without politics or prejudice. But Pariser and Morozov argue otherwise: deployed technology is an inherently powerful force when it is exercised on a large scale. In Pariser’s case, the filter bubble is deployed to Google’s and Facebook’s billion-plus users, who are in effect consuming a version of the Internet that an intentionally designed algorithm has mediated for them. To argue that this mediation is neutral is to ignore the human intervention a programmer or an engineer has made, and the responsibility that he or she has refused to accept.

Granted, a single human being does not filter the Internet for each and every one of us. That would be terribly inefficient. Instead, a set of algorithms filters the Internet for each and every one of us. That does not make said filtering neutral, but it does make it inhumane.
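Pariser’s point is easier to see in miniature. Here is a toy sketch, entirely hypothetical and not any real platform’s algorithm, of how an interest-based filter quietly hands different users different slices of the same content pool:

```python
# Hypothetical sketch of the kind of personalization Pariser describes:
# one pool of stories, ranked differently for each user by past behavior.

def personalized_feed(stories, user_interests, top_n=2):
    """Score each story by overlap with the user's interest tags,
    then return only the top_n; everything else is filtered out."""
    def score(story):
        return len(set(story["tags"]) & set(user_interests))
    ranked = sorted(stories, key=score, reverse=True)
    return [s["title"] for s in ranked[:top_n]]

stories = [
    {"title": "Election results", "tags": ["politics", "news"]},
    {"title": "New phone review", "tags": ["tech", "gadgets"]},
    {"title": "Local charity drive", "tags": ["community", "news"]},
]

# Two users, two different versions of the "same" Internet.
print(personalized_feed(stories, ["tech", "gadgets"]))   # tech user's feed
print(personalized_feed(stories, ["politics"]))          # politics user's feed
```

The scoring rule is trivial here, but the structure is the point: every choice inside `score` is a human decision with consequences for what a billion users never see.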
