Instagram Porn Bots Evolve Methods for Peddling Adult Dating Spam
Incentivized by affiliate programs, scammers are evolving how they use fake Instagram accounts to target users on the popular social media platform.
As social networking services rose to prominence in the early part of this century, the services themselves and all manner of other businesses saw the revenue potential that came with targeted advertisements tailored to individual interests. At the same time, scammers, who until this point had relied on email as their vehicle to promote adult dating and webcam-based scams, were quick to capitalize on the burgeoning platforms — albeit in shadier ways — in order to earn money from affiliate sign-ups.
In the years since, an entire cottage industry of scammers has cropped up, using bots to redirect social media users to fake accounts in order to game the lead-generation system. Indeed, since 2016, Instagram users have been subjected to a variety of scammers peddling adult dating and webcam spam via porn bots. The activities of the porn bots range from simply following Instagram account holders to liking and commenting on their photos to, more recently, exchanging direct messages with them.
To its credit, Instagram — which attained 1 billion monthly active users (MAU) in 2018 — has worked to try to thwart the efforts of the operators of these porn bot accounts, but, as you can imagine, it is a cat-and-mouse game. As someone who has been researching this space for many years, the cat-and-mouse game fascinates me. This post aims to highlight some of the notable trends I’ve recently observed with Instagram porn bots, such as the use of intermediary accounts and bots using literary quotes in their photo captions, and discusses the driving force behind their presence as part of my continued effort to educate Instagram users.
Instagram Porn Bots
Historically, Instagram porn bots were self-contained, performing activities such as liking photos and following users, with a link directly in their bio alongside suggestive text, as seen in the example above. These porn bots have since made some simple changes, for instance, adding Story rings around their profile images to make it seem as though they’ve posted an Instagram Story, and removing the suggestive text.
However, in an effort to bypass some of the mechanisms in place to detect this type of activity, porn bot operators began to leverage what I’m referring to as intermediary accounts.
How Porn Bots Use Intermediary Accounts
In this example, the intermediary account, “kayla,” follows a user. Visiting this profile shows there are no photographs on the account. However, the bio contains emojis and the words “My Nude Pics Here” spaced out with periods in between. The added punctuation is an attempt to bypass some of the automated measures Instagram may have in place to detect such activity.
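As a rough illustration of why this trick can slip past naive keyword filters, and how a defender might counter it, here is a minimal Python sketch of my own (not Instagram’s actual detection logic) that normalizes the filler punctuation in a bio before checking it against a blocklist. The blocklist phrase is simply the one observed in these bios.

```python
import re

# Phrase observed in the "kayla"-style intermediary bios; a naive filter
# would look for it verbatim and miss the punctuated variant.
BLOCKED_PHRASES = ["my nude pics here"]

def normalize_bio(bio: str) -> str:
    """Replace filler punctuation with spaces and collapse whitespace so
    'My . Nude . Pics . Here' reduces back to plain words."""
    cleaned = re.sub(r"[.·•_-]+", " ", bio)
    return re.sub(r"\s+", " ", cleaned).strip().lower()

def looks_like_spam(bio: str) -> bool:
    normalized = normalize_bio(bio)
    return any(phrase in normalized for phrase in BLOCKED_PHRASES)

print(looks_like_spam("My . Nude . Pics . Here"))   # True after normalization
print(looks_like_spam("Travel photos and coffee"))  # False
```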
This is considered an intermediary account because it instructs users to visit a different profile. In this case, the “kayla” intermediary account links to a “babe” account.
Similar to the intermediary account, the “babe” account also doesn’t contain any photos. However, this bio contains no obfuscation of the text, directly stating “All nude pics posted on website, look” with a link to a Bitly-shortened URL.
Not having any sort of activity associated with the “babe” accounts allows them to persist on the service without getting flagged by automated means. Based on intelligence from some of the domains used in the “babe” campaign, it appears the person behind that particular campaign has been actively pursuing Instagram porn bot spam since at least the middle of 2016. They’ve registered close to 1,300 domains since 2016, nearly 100 of them in the last six months alone.
Prevalence of “Babe” and Similar Instagram Accounts
There are quite a few similarly named “babe” accounts on Instagram. They all have the phrase “ALL NUDE PICS POSTED ON WEBSITE, LOOK” along with emojis in their bios, but only a handful of accounts have Bitly-shortened URLs as well, indicating these are actively being used. It is unclear if the accounts without Bitly-shortened URLs have been abandoned after they served their purpose or if they are spare accounts ready to be used once the active accounts have been removed by Instagram.
In addition to the “babe” accounts, there are other accounts with a different naming convention that are essentially identical. The same Bitly-shortened URL was used by several “babe” accounts, as well as an “n_” account, indicating that each batch of accounts was generated by the same person.
Use of “Novel” Porn Bot Accounts
Even as we see an uptick in the use of intermediary accounts, some porn bot accounts on Instagram still follow users directly to capture their attention. I recently observed a new batch of accounts that are slightly different from typical porn bot accounts. These accounts aren’t blank; they contain up to three photographs. Their names include two random emojis, one at the beginning and one at the end. For instance, one account named “Carolyn Jones” has the vulcan salute emoji followed by a smiling face with horns emoji.
What’s peculiar about the photos on this account is their seemingly random nature, which is an intentional effort to avoid suspicion in three ways:
- Unlike most porn bot accounts, they don’t promote sexually suggestive imagery on their profiles.
- The women in the images don’t appear to be the same person.
- The bio contains no tagline and no short URL.
The random images themselves don’t contain links or any suggestive commentary either. Instead, they include some text that appears to be truncated. In the example above, the image contains a quote from The Count of Monte Cristo by Alexandre Dumas.
Similarly, another porn bot account named “Pamela Turner” included another truncated Dumas quote from The Count of Monte Cristo, albeit from a different source.
Another porn bot account named “Denise Sanders” had very little text on each image, save for one image that included a shorter, truncated quote.
This account wasn’t quoting any of Dumas’ novels, opting instead to use a truncated quote from George R.R. Martin’s A Game of Thrones.
These accounts are novel in their approach and, fittingly, they also use quotes from novels, which is why I’m referring to them as “Novel Accounts.”
“Conversing” With A Porn Bot in Direct Messages
Since Novel Accounts and other porn bot accounts with nothing in their bios aren’t promoting their adult dating spam in public, they do so privately in direct messages. Following one of these accounts and initiating a conversation leads to "conversations" in broken English, such as this one with “Carolyn Jones” from earlier.
A similar “conversation” occurred with "Pamela Turner" as well.
What is interesting about these “conversations” is the delay between responses. The “Carolyn Jones” porn bot account took an hour to respond to the initial message, while the “Pamela Turner” porn bot account took five hours to respond. A subsequent message did not receive a response for nearly 22 hours. The reason for the delay is unclear. It could be a feature in the bot configuration to attempt to evade automated mechanisms looking for bot-related behavior within Instagram Direct Messages.
In both “conversations,” the same domain was used in the initial message with a different name in the path (Alison, Amy) despite their account names being entirely different (Carolyn, Pamela). Interestingly enough, in the latter exchange the second link used a different URL but with the same path (Amy).
One thing to note is that, while these Novel Accounts appear to be distinct and may be operated by a single spammer, engaging with Instagram users via direct messages to peddle spam links isn’t unique to them.
Fake “Safe” Instagram URL Message
Another Instagram porn bot tactic I’ve observed involves faking an Instagram page that claims a URL has been deemed as safe by Instagram.
The porn bot in this case links a user to a website via the short URL service TinyURL. The “Leaving Instagram” page is hosted on a .xyz domain and merely acts as an obfuscation layer to convince the user that the link they’re browsing to is indeed safe.
Non-Mobile Users Redirected to Benign Pages
In some instances, if a link is visited from a computer, users are redirected to a non-adult-themed page. For instance, one of the campaigns I’ve observed, when browsed from a desktop, serves up a saved copy of an old article from the Planetary Society with broken images and stylesheets.
Visiting this same link from a mobile device results in a 302 redirect to the scammer’s intended website. While this might be viewed as an effort to thwart examination by researchers on a computer, there are ways around it for research purposes. However, the real intention behind the redirects is likely to ensure that the “lead” is coming from a mobile device and not a computer, in compliance with the adult dating affiliate program guidelines.
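For anyone who wants to verify this behavior for themselves, a simple approach is to request the same link with a desktop and a mobile User-Agent header and compare the redirect chains. The sketch below is a minimal Python example; the URL is a placeholder, not one of the actual campaign links.

```python
import requests

# Placeholder standing in for a scammer's short link; not a real campaign URL.
SPAM_LINK = "https://example.com/spam-link"

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 12_0 like Mac OS X) "
             "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/16A366")

def trace_redirects(url: str, user_agent: str) -> list:
    """Follow the redirect chain and return every URL visited along the way."""
    response = requests.get(url, headers={"User-Agent": user_agent},
                            allow_redirects=True, timeout=10)
    return [hop.url for hop in response.history] + [response.url]

# Per the behavior described above, the desktop chain should end on a benign
# page, while the mobile chain should include a 302 to the adult prelander.
print("Desktop:", trace_redirects(SPAM_LINK, DESKTOP_UA))
print("Mobile: ", trace_redirects(SPAM_LINK, MOBILE_UA))
```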
Group Instagram Direct Messaging
Outside of intermediary or Novel Accounts, some scammers opt to take a more direct approach when pushing adult dating spam: sending a group direct message to a large number of users.
In the case above, a porn bot account named “Dorothy” added 25 users to an Instagram Direct message chat. According to Instagram, users can add up to 32 users to an Instagram Direct message thread.
While anyone can send an Instagram Direct message, messages from accounts a user doesn’t follow are filtered into a separate “Message Requests” section. The scammers normally don’t change the group name, but occasionally they give a group a name like “my very hot photos.”
A mass Instagram Direct message from one of these porn bots asks the user if they want to let “Dorothy” message them; the link and image thumbnail aren’t displayed to the recipient.
Once the message request is accepted, it reveals a link and thumbnail claiming to direct the user to the pin-up model community site, SuicideGirls.
In another example, the porn bots include links claiming to direct users to OnlyFans, a social networking service with a less-restrictive content policy that’s used by models and porn actors to offer content via subscriptions.
These links do not lead users to the SuicideGirls or OnlyFans websites after all. Just like with the other porn bot accounts above, the links lead to a hookup site intermediary page.
Intermediary Pages for Adult Dating and Webcam Sites
While I’ve noted the presence of intermediary accounts, Instagram porn bot operators also leverage intermediary sites (referred to as “prelander” pages) designed to serve up varying campaigns that direct users to different adult-themed dating and webcam sites.
The user is asked to fill out a “survey” about their sexual preferences, which leads to the intended adult dating or webcam website. In these instances, they lead to websites called Snapcheat and Sinder, a play on the popular social networking and dating apps Snapchat and Tinder. Included in these URLs are query strings containing parameters about campaign identifiers and, most importantly, affiliate identifiers.
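To give a sense of what those query strings look like, here is a small Python sketch that pulls campaign and affiliate parameters out of a prelander URL. Both the URL and the parameter names (campaign_id, affiliate_id) are hypothetical stand-ins; the real campaigns use their own naming.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical prelander URL; the parameter names are illustrative only.
prelander_url = ("https://prelander.example/dating"
                 "?campaign_id=ig_babe_07&affiliate_id=12345&source=instagram")

params = parse_qs(urlparse(prelander_url).query)

# The affiliate identifier is what ties a completed sign-up back to the
# spammer's payout, which is why it appears in every link in the chain.
print("Campaign: ", params.get("campaign_id", ["unknown"])[0])
print("Affiliate:", params.get("affiliate_id", ["unknown"])[0])
```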
Affiliates and Bots: Like Peanut Butter and Jelly
As discussed in a VICE piece about the money trail behind Instagram porn bots, the goal of the intermediary pages is to get male Instagram users to sign up for adult dating and webcam services like Snapcheat and Sinder. The services themselves rely on affiliate programs to bring in new users. Affiliate programs are quite common and used by many e-commerce sites. In the world of adult dating and webcam sites, these affiliate programs are not so stringent when it comes to cracking down on fraudulent activity. After all, the goal is to get more users to sign up to their websites.
In most cases, the affiliate can earn a lead simply by convincing the user to sign up for one of these adult dating or adult webcam websites. The sign-up process is usually defined in the affiliate offers as the “flow.” When a user completes the “free user registration” flow, it qualifies as a converted lead, usually worth anywhere between $2 and $5.
The Holy Grail of leads is when an affiliate offer includes verbiage like “CC submit,” which requires the affiliate to convince the user to submit their credit card details to sign up for a free trial of a service. If the user doesn’t cancel the supposedly “free” trial, they are often billed between $40 and $100, which earns the affiliate a higher payout than a free user registration lead.
In the case of most Instagram porn bot spam, the affiliates are leveraging free user registration offers. Therefore, we can surmise that those responsible for Instagram porn bot spam are focused on generating a large quantity of leads via simple sign-ups, rather than pursuing the more lucrative offers that require the user to submit a credit card. The latter tactic has a higher barrier to entry, which is reflected in its higher affiliate payout. Despite the intermediary pages asking users if they are over the age of 18, users are still directed to the adult dating and webcam sites, making it likely that even underage teens are clicking on the links and signing up for the websites.
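To put rough numbers on the free-registration model these spammers favor, here is a back-of-the-envelope estimate in Python. Only the $2 to $5 per-lead range comes from the affiliate offers described above; the click count and conversion rate are assumptions made purely for illustration.

```python
# Back-of-the-envelope estimate of free-registration lead revenue per link.
clicks_per_link = 285        # roughly the average clicks per short URL (see the stats below)
assumed_conversion = 0.05    # assumption: 5% of clicks complete a free sign-up
payout_per_lead = 3.50       # midpoint of the $2-$5 per-lead range

leads = clicks_per_link * assumed_conversion
print(f"Estimated leads per link:  {leads:.0f}")
print(f"Estimated payout per link: ${leads * payout_per_lead:.2f}")
```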
We reached out to Bitly and Instagram to provide them with information about the scam activity. Bitly confirmed it has suspended the account and removed the URLs generated by the scammer. Instagram had not responded as of the time of publication.
Link Activity from Instagram Spam
The URLs used in Instagram porn bot spam vary between direct links to intermediary sites and short URLs that mask the actual destination. Based on the short URL statistics we were able to obtain from a limited number of campaigns, the average number of clicks per link is roughly 285. This average is skewed by the wide variation in clicks per link, which ranged from nine at the lower bound to over 1,000 at the upper bound.
Bitly provides a breakdown of the clicks for each short URL. For instance, below is a breakdown of one of the larger-volume short URLs used in one of the “babe” campaigns.
When we pulled the statistics for this particular Bitly link on June 21, it showed over 1,000 clicks, 97% of which originated from Instagram, with a smaller subset coming from Facebook and a more generic bucket.
Geographic distribution of interaction with the Bitly link shows it is highly concentrated in the United States, but its reach spreads across 80 locations worldwide.
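As a side note, click breakdowns like the ones above can be pulled programmatically. The sketch below assumes access to Bitly’s v4 API with an access token that has visibility into the link’s metrics, which generally means the link belongs to or is shared with your account, so treat it as illustrative rather than a way to inspect arbitrary third-party links.

```python
import requests

API_BASE = "https://api-ssl.bitly.com/v4"
ACCESS_TOKEN = "YOUR_BITLY_ACCESS_TOKEN"   # placeholder token
BITLINK = "bit.ly/example"                 # placeholder bitlink in domain/hash form

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Total clicks over the last 30 days.
summary = requests.get(f"{API_BASE}/bitlinks/{BITLINK}/clicks/summary",
                       headers=headers, params={"unit": "day", "units": 30},
                       timeout=10).json()
print("Total clicks:", summary.get("total_clicks"))

# Referrer breakdown, e.g. how much of the traffic came from Instagram.
referrers = requests.get(f"{API_BASE}/bitlinks/{BITLINK}/referrers",
                         headers=headers, timeout=10).json()
print("Referrers:", referrers.get("metrics"))
```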
Conclusion
As long as Instagram has such a high volume of active users, it will continue to be a haven for porn bot scammers. After all, just as advertisers flock to social networking services like Instagram looking to capitalize on all of the eyeballs affixed to their screens, one should expect scammers won’t be far behind.
However, the only thing constant is change, so we anticipate these tactics will shift over time as the cat-and-mouse game continues to be played. For these scammers, one particular Dumas quote accurately depicts their efforts: “All human wisdom is summed up in two words: wait and hope.”
Learn more:
- Read the Vice story: The Secret Trail of Money Behind Those Instagram Porn Bots
- Read the Daily Mail story: Hackers are taking over YOUR Instagram profiles with pornographic images and adult dating spam
- Visit https://www.tenable.com/research for the latest news and views from Tenable Research