Consumers fear giving up control
Consumers fear the repercussions of yielding control to AI and allowing it to make purchases on their behalf. They're concerned, for instance, that AI might make incorrect or expensive decisions that are difficult to correct or undo, such as ordering more items than desired. Even consumers who are comfortable with AI automatically selecting which coffee brand to order would still want to be in charge of the seemingly trivial act of clicking “buy.”
In our qualitative interviews, consumers voiced these fears, frequently citing errors and inaccuracy. "Voice assistants often misunderstand commands. They don't have a screen for me to validate that I've done the correct thing," one respondent complained. Data security is another concern. “I don’t know if I feel comfortable letting AI use my information and storing it to make payments easier,” another said.
Distrust of what motivates AI decisions is also prevalent. "I have a feeling that the algorithm will follow instructions from the highest bidder and not handle the purchase in my interest,” said one consumer.
Comfort levels plummet, even for popular AI tools
Even AI’s most enthusiastic users are wary of relinquishing decision-making power during the Buy phase. Compared with the Learn phase, Comfort Quotients drop from 50 to 33 for Accelerators and from 58 to 34 for Early Adopters. More specifically, while 41% of Accelerators and 52% of Early Adopters are comfortable using conversational AI during the Learn phase, comfort drops to just 33% and 46%, respectively, in the Buy phase (see Figure 4). A similar dynamic holds for voice assistants: comfort drops from 38% among Accelerators and 53% among Early Adopters in the Learn phase to 30% and 42%, respectively, in the Buy phase.
Interestingly, technologies that add security verification, such as facial recognition, compete closely with other technologies in the Buy phase, especially among Anchors. One consumer called facial recognition “an extra layer of verification and protection.”