Critical AI

Events

Critical AI’s main focus for Fall 2021 is our Ethics of Data Curation workshop (to be held over Zoom), the product of an international collaboration between Rutgers and the Australian National University sponsored by the National Endowment for the Humanities and Rutgers Global. The lead organizers for the series are Katherine Bode and Baden Pailthorpe at ANU and Lauren M.E. Goodlad at Rutgers. All of the workshops and associated talks are free and open to the public, but space is limited, so please register well in advance (see schedule and registration links below).

“Artificial Intelligence” (AI) today centers on the technological affordances of data-centric machine learning. While talk of making AI ethical, democratic, human-centered, and inclusive abounds, these efforts suffer from a lack of interdisciplinary collaboration and public understanding.
At the heart of AI’s social impact is the determinative power of data:
the leading technologies derive their “intelligence” from mining huge troves of data (often the product of unconsented surveillance) through opaque and resource-intensive computation.
The Big Tech tendency to favor ever-larger models that use data “scraped” from the internet creates complications of many kinds including
the under-representation of women, people of color, and people in the developing world;
the mistaken belief that stochastic text-generating software like GPT-3 truly “understands” natural language;
the misguided haste to uphold this technology as the “foundation” on which the future of all AI will be built;
and the environmental and social impact of privileging ever-larger models that emit tons of carbon and cost millions of dollars to train.

Our Ethics of Data Curation workshop invites you to join a network of cross-disciplinary scholars including leading thinkers on the question of data curation and data-centric machine learning technologies. Please join the discussion, or if the time doesn’t work for you, watch the recordings of our workshop meetings and join us on Critical AI’s blog for asynchronous conversations.

Note: we are still finalizing the details of several sessions, including the readings. If you register in advance, we will email you as soon as the links to readings are live!

* * * * * * * * * * * * * * * * * * * * * * 

SCHEDULE AND REGISTRATION LINKS

Meeting 1: STOCHASTIC PARROTS: 

A comprehensive discussion of the social and technological dimensions of large language models (LLMs).

Th Oct. 7, 5:30 PM EDT (Oct. 8, 8:30 AM AEDT)

Suggested Further Readings: 

 

* * * * * * * * * * * * * * * * * * * * * *  

Meeting 2: DATA JOURNALISM:

A talk and discussion with Meredith Broussard, Research Director at the NYU Alliance for Public Interest Technology and author of the award-winning book, Artificial Unintelligence: How Computers Misunderstand the World (MIT, 2018).

Th Oct. 14, 5:30 PM EDT (Oct. 15, 8:30 AM AEDT)

  • Professor Broussard will be introduced by Caitlin Petre (Journalism and Media Studies, Rutgers).
  • Readings: Chapter 4 and Chapter 6 of Artificial Unintelligence: How Computers Misunderstand the World (MIT, 2018).
  • Check out the video and Rutgers undergraduate Nidhi Salian’s blog post about this event!

Suggested Further Readings: 

  • Chapters 1-3 of Artificial Unintelligence: How Computers Misunderstand the World (MIT, 2018).

 

* * * * * * * * * * * * * * * * * * * * * *  

Meeting 3: BIG DATA:

A workshop discussion about two recent publications of importance to data curation and its discontents. 

Th Oct. 28, 5:30 PM EDT (Oct. 29, 8:30 AM AEDT)

Suggested Further Readings: 

 

* * * * * * * * * * * * * * * * * * * * * *  

Meeting 4: DATA RELATIONALITIES:

A talk and discussion with Salomé Viljoen (Columbia Law) on her pioneering work on the relationality of data.

Th Dec. 2, 5:30 PM EST (Fri Dec. 3, 9:30 AM AEDT)

Suggested Further Readings: 

 

* * * * * * * * * * * * * * * * * * * * * *  

Meeting 5: DATA JUSTICE:

An interview and open discussion with Sasha Costanza-Chock (Director of Research & Design, Algorithmic Justice League; author of Design Justice: Community-Led Practices to Build the Worlds We Need), with guest interviewers Kate Henne (School of Regulation and Global Governance, ANU), Sabelo Mhlambi (Berkman-Klein Center for Internet & Society), and Anand Sarwate (Electrical & Computer Engineering, Rutgers).

Th Dec. 2, 5:30 PM EST (Dec. 3, 9:30 AM AEDT)

Registration for Sasha Costanza-Chock’s talk at 5:30

Registration for the workshop discussion to follow

Primary Readings:

 

* * * * * * * * * * * * * * * * * * * * * *  

Meeting 6: IMAGE DATASETS:

A special event on AI & the Arts with Katrina Sluis (Photography and Media Arts, ANU) and Nicolas Malevé (Visual Artist and Researcher, CSNI). Both will be introduced by Baden Pailthorpe (School of Art & Design, ANU).

Th Dec. 16, 5:30 PM EST (Dec. 17, 9:30 AM AEDT)

Registration for the talk at 5:30

Registration for the workshop discussion to follow

Primary Readings:

Suggested Further Readings: 

 

* * * * * * * * * * * * * * * * * * * * * *   