Common controls in place to combat financial fraud, such as anti-money laundering (AML) measures and know-your-customer (KYC) requirements, may also be relevant for AI.
An underground service called OnlyFake uses “neural networks” to create high-quality fake IDs. According to 404 Media, anyone can instantly generate a strikingly realistic fake ID for just $15, a capability that could enable a range of illegal activities.
The original OnlyFake Telegram account, the primary customer-facing platform, has been shut down. But the new account proclaims the end of the Photoshop era, boasting the ability to mass-produce documents using advanced “generators.”
The site’s owner, who uses the alias John Wick, said the service can batch generate hundreds of documents from Excel datasets.
The images were so good that 404 Media was able to overcome the KYC measures of OKX, a cryptocurrency exchange that uses third-party verification service Jumio to verify customer documents. Cybersecurity researchers told news outlets that users are submitting OnlyFake IDs to open bank accounts and unban crypto accounts.
The technology behind OnlyFake appears to be fairly standard AI, though its application is quite sophisticated.
Generative adversarial networks (GANs) pit two neural networks against each other: a generator optimized to fool a second network, the discriminator, which is trained to detect false generations. With each iteration, both networks improve, one at creating fakes and the other at detecting them.
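404 Media's reporting doesn't describe OnlyFake's internals, but the adversarial loop itself is easy to sketch. The toy example below, a hypothetical illustration in NumPy rather than anything from the actual service, uses 1-D numbers in place of images: a linear "generator" learns to mimic "real" data drawn from N(4, 1), while a logistic-regression "discriminator" learns to tell the two apart.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data stands in for genuine documents: samples from N(4, 1).
def real_batch(n):
    return rng.normal(4.0, 1.0, n)

# Generator: a linear map of noise, g(z) = w*z + b.
# It starts out producing N(0, 1) samples -- obvious fakes.
w, b = 1.0, 0.0
# Discriminator: logistic regression on a scalar, D(x) = sigmoid(a*x + c).
a, c = 0.1, 0.0
lr = 0.05

for step in range(1500):
    # Discriminator steps: push D(real) toward 1 and D(fake) toward 0.
    for _ in range(3):
        real = real_batch(64)
        fake = w * rng.normal(0.0, 1.0, 64) + b
        d_real, d_fake = sigmoid(a * real + c), sigmoid(a * fake + c)
        a += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
        c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator step: adjust (w, b) so the discriminator scores fakes as real.
    z = rng.normal(0.0, 1.0, 64)
    d_fake = sigmoid(a * (w * z + b) + c)
    w += lr * np.mean((1 - d_fake) * a * z)
    b += lr * np.mean((1 - d_fake) * a)

# After training, generated samples should cluster near the real mean of 4.
print(round(float(np.mean(w * rng.normal(0.0, 1.0, 5000) + b)), 2))
```

The alternating updates are the whole trick: each side's improvement becomes the other side's training signal, which is why fakes keep getting harder to spot.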
Another approach is to train a diffusion-based model on a large, curated dataset of real identity documents. Diffusion models excel at synthesizing highly realistic images of whatever they are trained on, learning to reproduce the minute details that make counterfeits nearly indistinguishable from genuine documents and that defeat traditional forgery-detection methods.
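The core of diffusion training is simple to illustrate: data is progressively corrupted with noise, and a model learns to predict that noise so it can reverse the process. The sketch below shows only the standard forward (noising) step from the original DDPM formulation, with a toy 8×8 array standing in for a document image; it is a generic illustration, not OnlyFake's pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a flat 8x8 array standing in for a real ID photo.
x0 = np.ones((8, 8))

# Linear beta schedule over T timesteps, as in the original DDPM paper.
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)

def q_sample(x0, t):
    """Sample x_t ~ q(x_t | x_0) = sqrt(abar_t)*x0 + sqrt(1 - abar_t)*eps."""
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps, eps

# Early timesteps are mostly signal; late timesteps are nearly pure noise.
x_early, _ = q_sample(x0, 10)
x_late, _ = q_sample(x0, T - 1)
print(alpha_bar[10] > 0.99, alpha_bar[T - 1] < 0.01)  # True True
```

A trained network would take `x_t` and `t` and predict `eps`; generation then runs the chain in reverse, starting from pure noise and denoising step by step into a novel, realistic image.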
Is it worth the risk?
For those who want to avoid using their real identity, OnlyFake is likely appealing. But using the service carries both ethical and legal risks; despite the promise of anonymity and easy access, customers are standing on very precarious ground.
With fake IDs distributed from countries including the United States, Italy, China, Russia, Argentina, the Czech Republic, and Canada, there is little doubt this openly criminal operation has attracted the attention of law enforcement agencies around the world.
In other words, Big Brother may already be watching.
The risks don't end there. John Wick, for example, might keep a list of his clients, which would be disastrous for both OnlyFake and its users. Meanwhile, the new OnlyFake Telegram group has over 600 members, many of whom could be traced through the phone numbers linked to their accounts.
And, of course, it’s worth mentioning that paying OnlyFake with traditional digital payment methods is a big no-no.
Cryptocurrency payments offer a layer of privacy, but they are not immune to deanonymization either. With countless services claiming they can trace cryptocurrency transactions, digital currencies are steadily losing the anonymity once associated with them.
And no, OnlyFake does not accept the privacy-focused cryptocurrency Monero.
Most importantly, purchasing fake IDs is in direct contradiction to AML and KYC policies, which are, at least ostensibly, in place to combat terrorist financing and other criminal activity.
Business may be booming, but are the ripple effects worth the quick, affordable convenience?
Regulators are already trying to address the threat. On January 29, the U.S. Department of Commerce proposed a set of rules titled “Taking Additional Steps to Address the National Emergency With Respect to Significant Malicious Cyber-Enabled Activities.”
The department wants to require infrastructure providers to report foreign persons attempting to train large-scale AI models, whatever the stated purpose, but deepfakes that could be used for fraud and espionage are clearly a central concern.
These measures may not be enough.
“The collapse of KYC was inevitable once AI could create fake IDs that easily pass verification,” SatoshiPay CTO Torsten Stuber said on Twitter. “The time for change is now. If strict regulation is a must, governments must abandon outdated bureaucracy and embrace cryptography to enable secure third-party identity verification.”
Of course, deceptive uses of AI are not limited to fake IDs. Telegram bots currently offer a variety of services, from custom deepfake videos that superimpose a person’s face onto existing footage, to creating nude images of non-existent people, known as deep nudes.
Unlike earlier tools, these services require neither specialized knowledge nor powerful hardware, making the technology widely accessible without so much as downloading free image-editing software.
Edited by Ryan Ozawa.