Bots made by end users are crucial to the success of online communities, helping community leaders moderate content as well as manage membership and engagement. But most folks don’t have the resources to develop custom bots and instead turn to existing bots shared by their peers. For example, on Discord, some especially popular bots have been adopted by millions of communities. However, because these bots are ultimately third-party tools — made by neither the platform nor the community leader in question — they come with several challenges. In particular, community leaders need to develop the right understandings of a bot’s nature, value, and use in order to adopt it into their community’s existing processes and culture.
In organizational research, these “understandings” are sometimes described as technological frames, a concept developed by Orlikowski & Gash (1994) as they studied why technologies came to be used in unexpected ways in organizational settings. When your technological frames are well-aligned with a tool’s design, it is easier to assess whether that tool will be useful and whether it can be smoothly incorporated into your organization as intended. In the context of online communities, well-aligned frames can not only reduce the labor and time of bot adoption, but also help community leaders anticipate issues that might cause harm to the community. Our new paper looks at communities on Discord and asks: How do community leaders shift their technological frames of third-party bots and leverage them to address community needs?
Emergent social ecosystems around bot adoption
In our study, we interviewed 16 community leaders on Discord, walking through their experiences adopting third-party bots for their communities. These interviews underscore how community leaders have developed social ecosystems around bots: organic user-to-user networks of resources, aid, and knowledge about bots across communities.
Despite the decentralized arrangement of communities on Discord, users devised and took advantage of formal and informal opportunities to revise their understandings about bots, which both supported and constrained how bots came to be used. This was particularly important because third-party bots pose heightened uncertainties about their reliability and security, especially for bots used to protect the community from external threats (such as scammers). For example, interviewees laid out concerns about whether a bot developer could be trusted to keep their bot online, to respond to problems users had, and to manage sensitive information. These emergent social ecosystems helped users get recommendations from others, assess the reputation of bot developers, and consider whether a bot was a good fit along much more nuanced dimensions (for one interviewee, the values of the bot developer mattered as well). They also created opportunities for people to get direct help setting up and troubleshooting bots, such as through engaged discussions with more experienced users.
Our findings underscore a couple of core reasons why we should care about these social ecosystems:
- Closing gaps in bot-related skills and knowledge. Across interviews, we saw patterns of people leveraging the resources and aid in social ecosystems to move towards using more powerful but complex bots. Ultimately, people with diverse technical backgrounds (including those who stated they had no technical background) were able to adopt and use bots — even bots involving code-like configurations in markup languages that might normally pose barriers. We suggest that the diffusion of end-user tools on social platforms be matched with efforts to provide bottom-up social scaffolding that supports exploration, learning, and user discussion of those tools.
- Changing perceptions of the labor involved in bot adoption. The deeply social nature of bot adoption appeared to shape how people perceived the labor they invested in it, turning it into something fun and satisfying. Bot adoption was both collaborative, involving many individuals as a user discovered, evaluated, set up, and fine-tuned bots; and communal, with community members themselves taking part in some of these steps. We suggest that bot adoption can provide one avenue to deepen community engagement by creating new ways of participating and generating meta-discussions about the community, as well as the platform.
- Shaping the assumptions around third-party tools. Social ecosystems enabled people to cherry-pick functions across bots, creating wiggle room to curate a set of preferred functions. At the same time, people were constrained by social signals about what bots are and can do, why certain bots are worth adopting, and how bots should be used. For example, people often talked about genres of bots even though no such formal categories existed. We suggest that spaces where leaders from different communities interact with one another to discuss strategies and experiences can be impactful settings for further research, intervention, and design ideas.
Ultimately, the social nature of adopting third-party bots in our interviews offers insight into how we can better support the adoption of valuable user-facing tools across online communities. As online harms become increasingly sophisticated (e.g., the recent rise of AI-generated disinformation), user-made bots that quickly respond to emerging issues will play an important role in managing communities — and will be even more valuable if they can be shared across communities. Further attention to the dynamics that enable tools to be used across communities with diverse norms and goals will be important as the risks that communities face, and the tools available to them, evolve.
Engage with us!
If you have thoughts, ideas, or questions, we are always happy to talk – especially if you think there are community-facing resources we can develop from this work. There are a few ways to engage with us:
- Drop a comment below this post!
- Check out the full paper, available ✨ open access ✨ in the ACM Digital Library.
- Come by the talks we’ll be giving:
- Connect with us on social media or via email.