Original Title: The ugly memes driving crypto sales
Original Author: Adam Aleksic, Financial Times
Translator: Peggy, BlockBeats
Editor’s Note: As AI, algorithmic recommendation, and crypto speculation converge, internet memes are being systematically "manufactured" to harvest attention and money.
This article begins with a series of aggressive memes that went viral on social media, revealing how these seemingly absurd trends serve the logic of spreading crypto scams. It is a reminder that when trends are no longer generated organically but are designed for profit, the internet becomes a more chaotic and dangerous place.
The following is the original text:
The author of this article is known online as Etymology Nerd and wrote the book "Algospeak: How Social Media Is Transforming the Future of Language."
This year, a dark and disturbing new facet has emerged on Instagram Reels: aggressive memes are being systematically created to promote cryptocurrency scams—while almost no one is seriously trying to remove them.
Since January of this year, a group of bizarre, distorted characters has begun to spread across the platform. The rise of this phenomenon is closely tied to the widespread availability of AI tools and Meta's loosening of its hate speech rules.
This includes "George Droyd," a cyborg "reincarnation" based on George Floyd, created in April this year to promote a cryptocurrency called $FLOYDAI; and "Kirkinator," which emerged in September, shortly after the death of political commentator Charlie Kirk, to hype the $KIRKINATOR token. Additionally, there are a series of recurring "sidekick" characters, such as "Epstron" and "Diddytron," which reference Jeffrey Epstein and rapper Sean Combs (also known as Diddy).
These accounts exist within the same narrative universe and often gain traction by catering to racist and antisemitic stereotypes, accumulating millions of views. The short videos frequently feature discriminatory language and revolve around so-called "racial purification" storylines.
The sole purpose of this shocking content is to generate interaction and engagement. The ultimate goal is to direct public attention toward so-called "meme coins," a type of cryptocurrency whose value theoretically rises as the meme spreads. Early meme coins (like $DOGE) drew on existing internet culture, whereas characters like George Droyd are fabricated from scratch by crypto speculators.
This trick usually starts on pump.fun, a platform that allows users to easily register and trade digital tokens. Once developers create a token, they share it in trusted Telegram groups or X communities, where investors brainstorm how to artificially generate attention for the related meme, known as "mindshare." They then use AI to generate provocative videos, hoping to make the meme go viral and attract "ordinary people," those unfamiliar with meme coin culture but potentially drawn in as retail investors. Once the coin price rises, the core insiders "rug pull," selling off their holdings and cashing out their profits.
In reality, the number of people actually buying these tokens is often only in the thousands. However, because the barriers to creating cryptocurrencies and publishing AI-generated junk content are extremely low, the coin creators can easily repeat this cycle, profiting by "manufacturing cultural phenomena."
Meanwhile, these memes often begin to "grow on their own." When other creators realize they have viral potential, they imitate and reproduce them for money or online clout. The characters "Kirkinator" and "George Droyd" have already been repeatedly used by several influencers unrelated to the original token creators.
But with each reinterpretation, the crypto promoters continue to profit. For example, a tweet about the Kirkinator in October garnered 8 million views, causing the price of $KIRKINATOR to surge fivefold, only to drop back down within days. For those who sold at the peak, that profit was built on millions of X users watching a video of George Droyd being killed by the Kirkinator after stealing Epstein's files.
Unfortunately, the more sensational the video, the easier it is to go viral. Violent and offensive imagery generates more comments and longer watch times, both of which are rewarded by algorithms. The coin creators have learned to exploit this mechanism for personal gain. Even Instagram or X users who have never heard of these cryptocurrencies may find themselves repeatedly exposed to this deeply uncomfortable clickbait content.
We are being drawn into a vortex where loosely regulated cryptocurrency sites, readily available AI tools, and social platforms that let aggressive memes proliferate all intersect.
As a scholar studying the evolution of internet language, I find this deeply concerning: online trends are being artificially manufactured with the sole purpose of manipulating us. We can no longer be sure that memes are "naturally generated"—they could be part of a profit-driven chain at any moment.
Even if a meme is not directly created by crypto promoters, it is almost immediately appropriated by them. Nearly every new cultural reference is instantly registered as a token on pump.fun and artificially pumped, just so a few individuals can profit.
The end result is that all of us become more loosely tethered to reality. More and more memes will be invented or amplified, forcing internet users to constantly question what they can still trust; and continuous exposure to this repugnant discourse will make it seem increasingly acceptable. The only way out is to fight to reclaim the internet from those who seek to poison it.