In a study highlighted at a recent conference on human factors in computing systems, researchers from the University of Michigan School of Information presented work on extended reality (XR) advertising, a combination of augmented and virtual reality, warning of the format's potential for manipulation, deception, and harm by advertisers.
Current examples of XR advertising include 360-degree videos on YouTube, Snapchat filters, and try-before-you-buy experiences from furniture retailers that use a customer's phone camera to show, in real time, how a piece of furniture would look in their room.
According to Grandview Research Inc., investment in the XR market is expected to grow rapidly in the coming years. The augmented reality market is projected to grow from $10 billion in 2019 to $62 billion in 2027, and the virtual reality market from $8 billion in 2020 to $100 billion in 2024.
The team identified five mechanisms of manipulative XR advertising: misleading experience marketing, inducing artificial emotions, targeting people when they are vulnerable, emotional manipulation through hyper-personalization, and distortion of reality.
The researchers said the format is not inherently detrimental to people, and many XR advertising experiences can be helpful. But they urge vigilance against bad actors who intend to manipulate, mislead, or otherwise harm consumers.
“There are many ways this can manifest,” says lead author Abraham Mhaidli, a doctoral student at the U-M School of Information. “One way could be distorting a consumer’s sense of reality by overlaying graphics on someone’s AR glasses to change what they are seeing. For example, a political ad may try to paint a picture of a booming economy and release ads on AR glasses that subtly overlay graphics which hide or erase evidence of poverty.”
The two-step process started with the researchers becoming well-versed in XR devices, XR advertising, and manipulative advertising. They then created scenarios and use cases that demonstrate how future manipulative techniques might manifest.
“A key aspect here is that augmented and extended reality technologies are inherently designed to alter reality,” says co-author Florian Schaub, assistant professor of information. “In contrast, advertising in other media might also be tailored to you or masquerade as content — think infomercials, sponsored articles in news media/magazines, product placement in movies — but they do not inherently change what you perceive as reality.”
The team says most of the harms they examined are still far off due to the current limitations of XR technology. The scenario Mhaidli describes is possible in principle, but the technology to render an overlay that convincingly passes as reality does not yet exist. Similarly, a consumer protection framework for this type of threat has not been developed, nor is public literacy on the topic high.
“I think we need a broader public debate and policy discussion about what kinds of manipulation of people’s reality are acceptable and whether and how XR experiences should come with disclosures about how they affect reality (similar to food nutrition labels or warnings on cigarette packs),” Schaub says.