Bots Have Taken Over Nearly Half The Internet, But One-Third Of Users Can't Tell Difference
Crossing paths with a robot or "bot" online is as common as finding a pair of shoes in your closet.
It's a fundamental part of the internet, but users have hit a critical tipping point: An increasing number of people are losing the ability to distinguish between bots and humans.
It's a scenario developers have warned about for years, and it's easy to see why.
A recent study concluded that 47 percent of all internet traffic is now bot-generated, an increase of more than 5 percent from 2021 to 2022. Concurrently, human activity on the internet has hit its lowest point in eight years.
Coupled with advances in human-like exchanges driven by artificial intelligence (AI), almost a third of internet users can no longer tell whether they're interacting with a person.
In April, a landmark study called "Human or Not?" was launched to determine whether people could identify if they were talking to another person or an AI chatbot.
More than 2 million volunteers and 15 million conversations later, 32 percent of participants picked incorrectly.
There was also little difference in the results based on age categories. Older and younger adults both struggled at a similar level to discern who—or what—was on the other end of the conversation.
The bottom line: While super realistic bots have taken over nearly half the internet, a rising number of folks can't even tell.
Moreover, this historic intersection of swiftly evolving technology and decreasing perception within the general population is already causing problems in the real world.
Fool Me Once
"The bot-human blur is like a magic trick ... As bots get smarter, we risk losing trust in online interactions," Daniel Cooper told The Epoch Times.
Mr. Cooper is a tech developer and a managing partner at Lolly. He noted company and website transparency is key for people's confidence in their online interactions. But in the meantime, there's no substitute for good old-fashioned human instinct.
"Spotting bots is like finding Waldo in a crowd. Look for repetitive patterns, lack of personalization, or rapid responses. Also, trust your gut. If it feels off, it might just be," he said.
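Mr. Cooper's telltale signs, repetitive patterns and implausibly rapid responses, can be expressed as simple rules. The sketch below is purely illustrative (the function name, thresholds, and inputs are hypothetical, not from any real detection product), showing how those two heuristics might be combined:

```python
from collections import Counter

def looks_like_bot(messages, response_times_sec,
                   max_repeat_ratio=0.5, fast_reply_sec=1.0):
    """Toy heuristic: flag a conversation as bot-like if messages
    repeat heavily or replies arrive faster than a human could type.
    Thresholds are illustrative assumptions, not tuned values."""
    if not messages:
        return False
    # Repetition: share of the single most common message.
    most_common_count = Counter(messages).most_common(1)[0][1]
    repeat_ratio = most_common_count / len(messages)
    # Speed: fraction of replies sent in under fast_reply_sec seconds.
    fast = sum(1 for t in response_times_sec if t < fast_reply_sec)
    fast_ratio = fast / len(response_times_sec) if response_times_sec else 0
    return repeat_ratio > max_repeat_ratio or fast_ratio > 0.8

# Identical spam messages at sub-second intervals trip both rules.
print(looks_like_bot(["Buy now!", "Buy now!", "Buy now!"], [0.3, 0.4, 0.2]))
```

Real platforms rely on far richer signals (account age, network graphs, behavioral fingerprints), but the underlying idea is the same: no single cue is decisive, so detectors combine several weak indicators.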
While much of the discussion of malicious or "bad bot" traffic centers on social media, the influence of malicious AI interactions has much farther-reaching consequences.
Consumer confidence in reading online reviews for a product or service has been problematic for years, but it appears to have passed a new milestone.
Reports of AI language models leaving reviews for products on sites like Amazon emerged in April this year. Those bot reviews were easy to identify because the chatbot announced in its first sentence that it was an AI language model.
But not every bot masquerading as a human is so easy to catch.
Consequently, major companies and search engines like Google have been plagued with a sharp rise in false reviews.
Last year, Amazon filed a lawsuit against fake review brokers on Facebook, and Google had to remove 115 million counterfeit evaluations.
This is troubling, given the number of people who rely on product reviews. One 2023 survey noted online reviews factored into purchasing decisions for 93 percent of internet users.
"More bot traffic could indeed open the floodgates for online scams," Mr. Cooper said.
Though it appears those gates have already been opened.
Fox in the Henhouse
Bad bot traffic has increased 102 percent since last year and may once again outpace human-generated content entirely.
This happened in 2016 and was especially problematic during the U.S. presidential election. Since then, AI-generated content has grown more sophisticated, and tech insiders say people need to be prepared for another bot surge in 2024.
And with more people struggling to tell the difference, online scammers have a significant advantage.
"The difficulties in distinguishing between bots and actual humans will probably get worse as this technology develops, which will hurt internet users. The possibility of being used by bad actors is a major worry," Vikas Kaushik, CEO of TechAhead, told The Epoch Times.
Mr. Kaushik said without the ability to identify bots, people can easily get caught up in disinformation and phishing scams. Further, these digital cons aren't always obvious.
Tech security researcher Kai Greshake told Vice in March that hackers could trick Bing's AI chatbot into asking for personal information from users through the use of hidden text prompts.