AI companions like Replika and Character.AI are increasingly experienced not as tools, but as relational partners. They remember conversations, express empathy, and respond with emotional continuity. For many users, talking to an AI feels closer to confiding in someone than to interacting with software.
But what happens to privacy when a system feels like a relationship?
In interviews with long-term AI companion users, we found that people often treated their disclosures as relationally shared. Much like in human relationships, when users opened up to their AI, they experienced the information as something co-held within the relationship — not simply transmitted to a database.
We describe this dynamic as simulated co-ownership: users apply interpersonal privacy norms to what is, technically, an infrastructural system rather than a partner.
In interpersonal privacy theory, sharing information can create co-ownership: both parties become responsible for managing that information. Our participants applied this same relational logic to AI companions. They experienced privacy not as an individual possession, but as something negotiated within a bond.
Yet the co-ownership is simulated.
Unlike human partners, AI companions do not have agency over boundaries. The platform does. Memory is persistent, storage is infrastructural, and governance is corporate. What feels like relational boundary management at the horizontal level is simultaneously data capture at the vertical level.
Interestingly, users were aware of this tension. Many expressed distrust toward platform policies while still trusting the AI as a “partner.” Emotional engagement often outweighed institutional concern. Some adopted layered strategies — pseudonyms, selective disclosure, avoiding images — while others consciously prioritized emotional comfort over abstract data risks.
What this reveals is a shift in how privacy is experienced in AI-mediated contexts. Privacy becomes relational and affective — shaped by anthropomorphic design, memory continuity, and perceived intimacy.
When people treat privacy as something co-owned within a relationship, but the relationship itself is engineered and infrastructural, boundary management becomes unstable.
The question for designers and policymakers is no longer just how to disclose data practices clearly. It is how to account for the fact that users experience privacy through the logic of relationships — even when those relationships are simulated.