In traffic arbitrage, some niches are restricted by most advertising networks. Gambling, crypto, nutra, adult content, betting, and certain financial offers are far harder to promote than conventional ones.

The issue goes beyond the ad text. Google Ads, Meta, TikTok, and other networks analyze everything: creative elements, landing pages, website topics, technical components, domains, accounts, and advertising metrics. If the system detects prohibited or controversial content, the ad is rejected. In some cases, the ad account may face restrictions or even get banned.

In such circumstances, an arbitrageur must solve two problems. First, they need to present a page to the advertising platform that complies with the rules. Second, they must direct the target audience to the offer. If everything that sells is removed from the landing page, it will stop converting. If left as is, it will be flagged by moderation bots.

This is where cloaking comes in. It helps separate incoming traffic and show different pages to different types of visitors.

What Is Cloaking in Arbitrage?

Cloaking is the filtering of traffic, where advertising bots, moderators, and other suspicious visitors remain on a safe page, while the target audience lands on the page with the offer.

Typically, the setup revolves around two pages:

  • White page – a safe page that complies with the advertising network's rules
  • Black page – a page with the advertising offer for the target audience

The white page must be relevant to the ad, appear as a full-fledged website, and contain no prohibited content.

The black page is designed for the users for whom the advertising campaign is launched. It contains the offer, form, landing page, showcase, or other commercial content.

To avoid losing traffic unnecessarily, the cloaking service must accurately determine who has accessed the page: a real user, an advertising network bot, a crawler, a moderator, a spy-service bot, or another unwanted visitor.

Why Traffic Filtering Is Important in Gray Niches

In sensitive verticals, direct advertising often fails moderation. Advertising systems automatically scan sites looking for signs of forbidden or controversial content.

The following elements undergo scrutiny:

  • Text on the landing page
  • Images and videos
  • Keywords
  • Topic of the page
  • Forms and buttons
  • Links to other sites
  • Technical parameters (IP address of the white page, domain history, etc.)
  • Relevance of the site to the ad

If a site contains casinos, slots, bets, crypto offers, nutra, adult content, or unverified medical claims, the system may reject the ad.

Even if the campaign receives initial approval, the platform may rescan the page later, send the campaign for manual review, or limit the account after the first complaints. Therefore, cloaking is not only essential for launch but also for the campaign's ongoing delivery.

How a Cloaking Service Works

A cloaking service analyzes the visitor before showing them the appropriate page. Based on a set of parameters, the system decides whether to keep them on the white page or redirect them to the black page.

The filtering takes into account:

  • IP address
  • Country
  • Device type
  • Operating system
  • Browser
  • User-Agent
  • Time zone
  • Browser language
  • Source of the visit
  • Click ID (gclid, fbclid, ttclid)
  • And other browser and technical parameters

For example, if a visitor comes from a data center IP address, uses a suspicious user-agent, or does not match the allowed country in the stream, they will stay on the white page. If the parameters align with those of a target user, they are redirected to the black page.

The main goal of filtering is not to cut off as much traffic as possible. The point is to keep bots and advertising network moderators off the offer page without losing real users. Filtering that is too lax puts the account at risk, while filtering that is too strict reduces conversion rates.

Why PHP Filtering Is Often Insufficient

In the past, standard server-side PHP filtering was enough for most advertising networks. It checked IP addresses, countries, user-agents, referrers, and a few other parameters.

Now, this is no longer adequate. Advertising platforms use more complex checks. Bots can access from residential IPs, use real browsers, and look much more plausible than they did a few years ago.

The problem with PHP filtering is that the server sees a limited set of data. It can't always understand what's happening in the visitor's browser.

Why JS Fingerprinting Is Important in Cloaking

JS fingerprinting allows for deeper visitor analysis. Unlike regular server-side checks, it gathers parameters directly from the user's browser and device.

This provides more data and helps more accurately determine who is on the page.

JS fingerprinting considers:

  • Screen resolution
  • Operating system parameters
  • Canvas and WebGL
  • Time zone
  • Signs of automation
  • Behavior when loading the page
  • And other browser parameters

This approach is especially important for Google Ads, Bing Ads, and other advertising platforms, where moderation systems are continually evolving.

JS fingerprinting does not replace IP filtering and other analysis methods; it complements them. The more parameters a cloaking service analyzes, the higher the chance of accurately identifying a bot.

What a Typical Cloaking Scheme Looks Like

Typically, setting up cloaking looks like this:

  • The arbitrageur prepares a white page relevant to the offer
  • A flow is created in the cloaking service
  • Rules for filtering (allowed countries, device types, and other parameters) are set in the flow, as well as a link to the black page
  • An integration file is downloaded from the flow and installed on the white page
  • Advertising traffic is directed to the white page
  • Bots, moderators, and other suspicious traffic stay on the white page
  • The target audience is redirected to the black page

This scheme is used across different verticals and traffic sources. Rules, GEO, device parameters, integration types, and methods of transitioning to the black page vary.

In practice, testing matters most. Before launching, it's essential to check where an ordinary user lands, what suspicious traffic sees, how logs are recorded, and whether a significant share of the target audience is being cut off.

What Impacts Cloaking Quality

Cloaking services differ not only in their interface. The main difference lies in how accurately they can identify bots and moderators.

Several factors affect filtering quality.

Up-to-date IP Databases

Lists of data centers, VPNs, proxies, bots, moderators, and other suspicious sources must be regularly updated. Outdated IP databases may allow unwanted traffic to slip through.

JS Fingerprinting

Browser fingerprinting helps identify discrepancies that are not visible during standard server checks.

Flexibility of Settings

Different ad sources require different configurations. What works for one traffic source may be too strict or, conversely, too lenient for another.

Logs and Transparency

The arbitrageur must understand why a specific visit ended up on the white page or black page. Without logs, it's challenging to test the setup and quickly find errors.

Why Many Arbitrageurs Choose hoax.tech

One of the popular cloaking services frequently used in the arbitrage community is hoax.tech.

Screenshot of hoax.tech website

The service has been operational since 2020, focusing on traffic filtering in gray verticals and most advertising networks. Unlike solutions that primarily rely on basic PHP filtering, hoax.tech employs a more in-depth analysis of visitor parameters, JS fingerprinting, and an integrated neural network called Matchex.

What the service offers:

  • The ability to work with gambling, crypto, adult content, nutra, and other complex verticals
  • Filtering of advertising bots, moderators, and other unwanted traffic
  • JS Fingerprinting for analyzing browser parameters
  • Updated databases of unwanted IPs
  • API for automating work with flows
  • PHP and JavaScript integration
  • Configuration understandable even without deep technical knowledge
  • The ability to handle large volumes of traffic

Methods of Transitioning to the Black Page

When setting up the flow, it's crucial to choose how the target audience will land on the black page. The specific option depends on the traffic source and objectives.

In hoax.tech, the following types of actions are available:

  • JavaScript Redirect
  • iframe
  • 301, 302, and 303 redirects
  • HTML Meta Refresh
  • Set Cookie
  • Insert HTML Code

Of particular note is the Insert HTML Code action type. This option allows for content loading without redirects, often referred to as zero redirect cloaking.

Instead of standard redirection, the service directly loads the content of the black page while keeping the white page's domain in the browser's address bar. For some campaigns, this is a more discreet way to display the offer without traditional redirects.

What Else Is Needed for a Stable Launch

Cloaking is essential, but for a successful launch, the rest of the infrastructure is equally critical. If the account is weak, the white page is poorly constructed, the creatives are too aggressive, or the domain has a bad history, even good bot filtering may not save the account from being blocked or the ad campaign from stalling.

For stable operation, the following are necessary:

  • Warm advertising accounts
  • Cautious creatives
  • High-quality white page
  • Domain without a bad history
  • Reliable and stable hosting
  • A converting, relevant offer
  • High-quality proxies for working with accounts
  • Reliable anti-detect browser for multi-accounting

High-quality proxies help manage accounts, check links from the required countries, and build trust with the anti-fraud systems of advertising platforms. They should therefore be viewed as a separate yet crucial element of the setup.

Common Mistakes When Setting Up Cloaking

Even a quality and correctly configured cloaking service cannot compensate for serious mistakes in the setup of other components.

Most issues arise from the following mistakes:

  • The white page looks weak, unfinished, or its content does not match the ad
  • A copied third-party site is used as the white page while still containing links to the original resource and its analytics tags
  • Allowed countries or device types are incorrectly specified, leading to unnecessary traffic losses
  • Too strict filtering rules are enabled
  • The redirect to the black page is not tested before launching
  • The white and black pages are too heavy and load slowly
  • The creatives contain obvious triggers
  • And other mistakes

Before launching, it’s important to verify the correctness of integration across different devices, IP addresses, and GEO.

Conclusion

Cloaking in arbitrage is one of the key elements of infrastructure when working in gray niches. Its main goal is to display different content to different types of visitors. Because of this, arbitrageurs can launch ads in verticals where a direct display of the offer to the advertising platform almost always leads to campaign and account bans.

A quality cloaking service goes beyond simple redirects and deeply analyzes traffic at various levels.

Moreover, cloaking works most effectively as part of an integrated infrastructure: with quality accounts, a white page, domains, proxies, an anti-detect browser, creatives, and a tested offer. This approach helps preserve accounts and gives campaigns a better chance of achieving stable performance.