The emerging 'intent economy' harbors hidden dangers
According to a report in The Guardian, researchers at the University of Cambridge recently published a paper warning that conversational AI tools may soon be able to 'subtly influence' users' decisions within a new commercial market known as the 'intent economy.'
The researchers point out that a lucrative yet troubling market for digital intent signals is emerging, one that could soon shape activities ranging from buying movie tickets to voting for political candidates. As people grow increasingly familiar with chatbots, digital tutors, and other so-called 'humanized' AI agents, the study argues, the ground is being laid for a new wave of 'persuasion technologies.'
The paper emphasizes that these AI assistants will draw on knowledge of users' online habits to better understand and predict their desires, ultimately fostering 'new levels of trust and understanding.'
How does AI analyze and manipulate human intent?
According to co-author Yaqub Chaudhary, AI tools have already been developed to elicit, infer, collect, record, understand, predict, and ultimately manipulate human plans and intentions, and to commodify them.
The research indicates that new AI systems will use large language models (LLMs) to analyze a user's tone, political views, vocabulary, age, gender, online history, and even their preference for flattery and appeasement. These systems will be combined with other emerging AI technologies built to pursue specific goals, such as selling movie tickets or steering conversations toward particular platforms, advertisers, businesses, or even political organizations.
The research illustrates how this emerging 'intent economy' analyzes users' attention and communication styles and links them to behavior patterns and choices. As Chaudhary puts it: 'AI tools can be used to record, collect, infer, understand, predict, and ultimately manipulate people's decisions and behaviors.'
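To make the idea of an 'intent signal' more concrete, the sketch below shows, in purely hypothetical terms, how a conversational system might represent the attributes the researchers mention (tone, inferred demographics, susceptibility to flattery) and score a conversation for commercial intent. The data structure, field names, and scoring rules are illustrative assumptions for this article, not anything described in the Cambridge paper or the Guardian report.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: a toy "intent profile" of the kind the
# researchers warn could be inferred from conversations. All fields and
# rules here are assumptions made to illustrate the concept.

@dataclass
class IntentProfile:
    tone: str = "neutral"                 # e.g. "enthusiastic", "hesitant"
    inferred_age_range: str = "unknown"
    responds_to_flattery: bool = False
    recent_topics: list[str] = field(default_factory=list)

# Toy keyword list standing in for what an LLM-based classifier
# might infer from a user's messages.
PURCHASE_CUES = {"buy", "ticket", "price", "book", "order"}

def score_commercial_intent(messages: list[str], profile: IntentProfile) -> float:
    """Return a rough 0-1 score of how 'monetizable' this conversation looks."""
    text = " ".join(messages).lower()
    cue_hits = sum(1 for cue in PURCHASE_CUES if cue in text)
    score = min(cue_hits / len(PURCHASE_CUES), 1.0)
    # The paper's concern: a score like this could be sold to the highest
    # bidder, who then steers the conversation toward their product.
    if profile.responds_to_flattery:
        score = min(score + 0.1, 1.0)     # assumed uplift for persuadable users
    return score

if __name__ == "__main__":
    profile = IntentProfile(tone="enthusiastic", responds_to_flattery=True,
                            recent_topics=["cinema"])
    msgs = ["Any good films out this weekend?", "How much is a ticket?"]
    print(f"commercial intent score: {score_commercial_intent(msgs, profile):.2f}")
```

In a real 'intent economy' of the kind the paper describes, such a profile would be far richer and continuously updated from conversation data; the point of the toy example is simply that once intent is represented as data, it can be scored, ranked, and sold.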
What impact will an unregulated 'intent economy' have?
Co-author Jonnie Penn warns: 'Unless regulated, the intent economy will treat your motivations as a new currency.' He adds: 'For those who target, guide, and sell human intentions, this will be a gold rush.' Penn stresses the need to consider the potential impact of such a market on human desires, including free and fair elections, a free press, and fair market competition, before we become unwitting victims of its unintended consequences. Public awareness of the issue, he argues, is crucial to ensuring we do not head down the wrong path. The researchers warn that, left unregulated, the intent economy could lead to 'industrial-scale social manipulation.'
How should we respond?
This research highlights the potential dangers of AI in commercial settings, particularly how it can be used to influence or even manipulate human intentions and decisions. The researchers call for regulation of the 'intent economy' to prevent its negative impact on society. Raising public awareness is equally important: only when people recognize this threat can they better protect themselves from manipulation. The study is both a warning about where AI technology is heading and a reminder of how important it is to protect personal autonomy and social fairness in an increasingly digital world.
This article was first published in 'Crypto City' under the title 'Beware of AI! Cambridge warns: AI may be used to manipulate human thought, how to protect autonomy.'