In August, University of California, Berkeley researchers Vivek Nair and Dawn Song, and Technical University of Munich’s Gonzalo Munilla Garrido, unveiled MetaGuard. The system is an open-source plug-in for Unity, a game engine widely used for creating virtual reality content.
Like web browsers’ private modes, MetaGuard adds digital noise to obscure sensitive user data. Using an anonymisation method called differential privacy, it perturbs tracked attributes before they leave the headset, making it difficult for companies to link data back to a particular user. “The ultimate goal of MetaGuard is the same as incognito mode on the web: to prevent users from being tracked from one session to another,” Nair explained to tech publication The Register.
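The principle behind this kind of noise can be sketched with the Laplace mechanism, a standard differential-privacy technique. The sketch below is a general illustration, not MetaGuard’s actual code; the height attribute, the sensitivity bound, and the epsilon value are all assumptions chosen for the example.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution (inverse-CDF method)."""
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def privatize(value: float, sensitivity: float, epsilon: float) -> float:
    """Release `value` with epsilon-differential privacy via the Laplace mechanism.

    `sensitivity` bounds how far the true value can plausibly range across
    users; a smaller `epsilon` means more noise and stronger privacy.
    """
    return value + laplace_noise(sensitivity / epsilon)


# Hypothetical example: mask a VR user's tracked height (in cm) before it is
# shared with a platform. The numbers here are illustrative only.
true_height_cm = 175.0
reported_height_cm = privatize(true_height_cm, sensitivity=40.0, epsilon=1.0)
```

Each session draws fresh noise, so the value a platform observes varies from one visit to the next, which is what makes cross-session linking unreliable.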
In the team’s experiments, MetaGuard reduced attack accuracy by over 90% for several private data attributes and cut deanonymisation success by 95%. But this doesn’t entirely solve the metaverse’s privacy concerns – data-hungry companies can still decide whether to let users run such protections at all. Popular metaverse platform VRChat announced it would be banning all client modifications just days after the MetaGuard team released its prototype. Virtual platforms may also worry that anonymity makes flagging abusive users more challenging.
As we venture deeper into the metaverse, data privacy issues will remain at the forefront of the field.