We advocate for consideration of harms to communities (including online communities) as they respond to, are used for, and incorporate generative AI algorithms. One such risk is what we call ‘Social Model Collapse’: the notion that changes to community dynamics — social and technical — can disrupt the fundamental processes communities rely on to sustain themselves and flourish. We see this as a clear point of shared concern among STS, HCI, Sociology, and Communication scholars, and are hosting an open panel at the upcoming meeting of the Society for Social Studies of Science (4S), September 3-7 in Seattle. This panel is being coordinated by CDSC members Kaylea Champion (University of Washington) and Sohyeon Hwang (Princeton University).
From our call for submissions:
Model collapse in machine learning refers to the deterioration a model suffers when it is repeatedly trained on its own output, losing variation and producing degraded results; in this panel, we extend this notion to ask in what ways the use of algorithmic output in place of human participation places social systems at risk. Recent research findings in the generation of synthetic text using large language models have fed and been fed by a rush to extract value from, and engage with, online communities. Such communities include the discussion forum Reddit, the software development communities producing open source, the question-and-answer forum StackExchange, and the contributors to the online knowledge base Wikipedia.
The success of these communities depends on a range of social phenomena threatened by the adoption of synthetic text generation as a replacement for human authors. Newcomers who ask naive questions are a source of members and leaders, but they may shift their inquiries to LLMs and never join the community as contributors. Software communities rely to some extent on a sense of generalized reciprocity to turn users into contributors; such appreciation may falter if the apparent benefactor is a tireless bot. Knowledge communities depend on human curation, inquiry, and effort to create new knowledge, which may be systemically diluted by the presence of purported participants who are only algorithms echoing back reconstructions of others' contributions. Meanwhile, extractive technology firms profit from anyone still engaging in a genuine manner or pursuing their own inquiries.
In this panel, we invite consideration of current forms of social model collapse driven by a rush of scientific-industrial activity, as well as reflection on past examples of social model collapse to better contextualize and understand our present moment.
Submissions are 250-word abstracts, due January 31st; our panel is #223, “Risks of ‘Social Model Collapse’ in the Face of Scientific and Technological Advances” [Submission site link].