Author: Shaka McGlotten
Title: “Black Data.” In No Tea, No Shade: New Writings in Black Queer Studies
Other bibliographic details: ed. E. Patrick Johnson, 262-286. Durham, NC: Duke University Press.
- Where or what in time-space is the study’s object? What is the work’s spatial scale and scope?
The study is situated within the realm of the internet, particularly in the United States. It focuses on the effects of the internet, technology, surveillance, and big data on queer black communities.
- What is/are the work’s key question(s)?
How does data turn black lives into commodities? How can we think of race as technology? Why doesn’t the government want us to be able to hide behind anonymity and opacity online?
- Who is the announced and/or implied audience for the work?
I think the implied audience for the work is people interested in privacy and surveillance, as well as people who identify as black and queer (who are disproportionately affected by these issues compared to others).
- What are the work’s structure and style?
The work is structured with several clear sections that explain the existence of black data and the complicated interplay between race and technology. McGlotten introduces the subject by explaining black data, black Twitter, and their roles in the Black Lives Matter movement. They then go on to explain the “masks” that people such as Obama use to hide truths and themselves from everyone else, particularly in the political realm.
- What method(s) does the researcher use, if noted?
The researcher largely makes their argument through cultural analysis of the implications of “reading,” “throwing shade,” and performative masks on black data and privacy/opacity. They also analyze a music video, “Google Google Apps Apps,” which explains the effects of gentrification on the black queer community in San Francisco.
- What problems and issues are posed?
The text addresses the issue of commodification of black lives through the use of technology and big data. It also speaks about the issues faced by the black and black queer communities, as well as the rest of the country, surrounding the problem of governmental and personal transparency/opacity.
- What are the arguments? In other words, how does the writer use the theory, method, and evidence to propose answers (or make claims)? (List 3-5)
- Assigning financial and numerical value to black lives is nothing new, but technology is making it easier to commodify them. This is possible in part because race is itself a form of technology.
- People's performative masks do not necessarily mean that they are hiding; masks can also serve as a means of protection and armor.
- Companies in the data business have further marginalized queer black people by gentrifying them out of spaces they have already claimed.
- What evidence does the writer use? Why do these examples (stories, visuals, graphs) stand out above others?
The writer uses anecdotes about author Baratunde Thurston and his explanation of black life online (particularly on Twitter). Evidence also includes a music video by a black queer group, DADDIE$ PLA$TIK, and their experience with gentrification due to big tech companies, as well as a video called Fag Face Mask that responds to the effects of technological profiling on queer men.
- What ideas and/or assumptions serve as the writer’s guide to action?
The writer assumes that black queer practices, or “black data,” can provide valuable insight into the effects of surveillance and technology on people and places.
- What is the role of the external actors such as the state or institutions, and how are they defined?
The author examines the external actor of tech giants such as Google and Twitter and the effects of both their technologies and gentrification on black queer lives and spaces. The tech giants are defined as threats to black queers due to their heavy roles in gentrification and surveillance.
- What works for you? What does not? Why?
I agree with McGlotten that surveillance is becoming scarily and increasingly present in our lives. McGlotten also suggests that the effects of this are seen far more easily by people of color, people who identify as queer, and people who are both. I agree with this as well, and I think that more privileged groups of people (both the ones who are causing these issues and the ones who are affected by them to a lesser degree) need to become aware of the issues in order to move ahead and solve them (or at the very least lessen them).
|Term|Definition (in your own words)|
|---|---|
|Digital divide|The gap between demographics and regions that have access to modern information and communications technology and those that don’t, or that have only restricted access|
|Black data|A response to the call of big data that offers analytic and political orientations grounded in black queer studies, attending to the effects of network culture and surveillance on black queer culture and life|
|Reading|An artfully delivered insult|
|Throwing shade|Disrespectful behaviors or gestures, communicated either subtly or unsubtly|
Significant Authors or Texts mentioned (list significant authors or texts discussed)
|Author(s)|Key idea|
|---|---|
|Giorgio Agamben and Emmanuel Levinas|People’s ethical encounters with one another are conditioned by our faces.|
|Gilles Deleuze and Félix Guattari|Faces perform a racist regulating function, and “faciality” determines which faces can be recognized or tolerated by society.|
|Shoshana Magnet and Simone Browne|Biometric technologies reproduce social stereotypes and inequalities by relying on false ideas about race (such as the association of particular facial features with particular races).|
Black Boxes (sections you do not yet understand)
Description Page number(s)
Questions (That occur to you as you read):
- What is the definition of race? What is the definition of technology?
- What does transparency/opacity mean for the general public? What does it mean for queer blacks specifically?
- With the scientific basis for biometrics of queerness, how difficult might passing become? What are these biometrics based on and how do they work?
- How can digitally logging our movements and virtually every technologically mediated interaction be an advantage for the government? Why doesn’t the government want us to hide behind anonymity and opacity?
- How does facial recognition software reinforce the concept of race as technology? Is there a better way to do it than one that reinforces racial stereotypes?
One sentence summary of reading:
Race and technology are heavily linked in a way that is detrimental, particularly for queer blacks.
Notes on other readings:
Size Matters to Lesbians Too: Queer Feminist Interventions Into the Scale of Big Data
Jen Jack Gieseking
- Lots of labor is necessary for lesbians to confront their invisibilization
- Big data is made valid through “masculinist, racist, and heteronormative structural oppressions”; it also creates a “false norm” to which marginalized groups can’t measure up
- Most data collected on LGBTQ people in history has been used to “pathologize and stigmatize”
- we saw this in the previous reading and in our conversations about AIDS in the 80s
- Professor Gieseking’s data from the Lesbian Herstory Archives takes up 789 KB data
- that’s 0.000789 GB, and depending on the iPhone you have, you have anywhere from 16 to 128 GB of storage
- It’s also .789 MB and the smallest app on my phone, Apple Wallet, that I don’t use is 1.4 MB. Facebook is 833.3 MB
- “Big data must instead be sized up through its mythos, measurements, and pace of accumulation”
- Scale of data must be read within context of the time it was produced, which makes sense
- how was it produced and which groups have the largest amount of data?
- “Scale is socially constructed through political and economic processes that contribute to the processes of uneven geographical development”
- What does this mean? How can scale be socially constructed?
- Maybe socially constructed in the meanings of points on the scale, like local being smaller than national or global?
- “The global and the intimate” -> intimate is simultaneously global and local, global is also intimate
- “data is both political and personal”
- Assembly and examination reinforce racism, heteronormativity, sexism, ableism, and ageism– how?
- Big data is not new–it has always been collected in archives. The difference now is that it’s digital, easily accessible, and easily shareable
- Datasets must be read with historical contexts in mind
- 17 types of organizations in the LHA, and reading all of the text associated with them allowed comprehensive reading that mere text analysis of “big data” would have missed
- “If this is the largest amount of archival material on the history of LGBTQ organizing in the history of the global city of NYC, is LGBTQ history really that big?”
- I think yes, but much is lost and not well documented due to fear and marginalization
- LGBTQ cyberculture and new media play a big role in LGBTQ people’s lives
- their contributions to new media are important
- more attention is paid to conceptualizations of “the digital” than to actual interfaces
- technology and race often discussed, sometimes in conjunction with queer theory, but never just queer media studies
- Queer OS: “taking historical, sociocultural, conceptual phenomena… to be mutually constitutive with sexuality/media/information tech, making it impossible to think about them in isolation.”
- Thinking about “queer” as an operating system larger than ones in computers
- How does US facial recognition influence Unix, an operating system??
- Everyone in class had difficulty comprehending the content of the article