`playwright-stealth` and `undetected-chromedriver`
Two community-maintained toolkits that automate the dozens of fingerprint patches a stealth scraper needs. Install, configure, verify, then stop reinventing.
What you’ll learn
- Install `playwright-stealth` and apply it to a context in one line.
- Install `undetected-chromedriver` (Selenium) and verify it bypasses webdriver/canvas detection.
- Understand which signals each toolkit patches and which it doesn't.
- Test stealth effectiveness against Catalog108's antibot challenges.
You can patch every fingerprint signal by hand. You can also pip install a library that does it for you. Two ecosystems lead: playwright-stealth for Playwright (Python and Node), and undetected-chromedriver for Selenium Python. Both apply ~15-20 patches automatically and ship updates as detection evolves.
What stealth toolkits do
Generally:
- Set `navigator.webdriver = false` via `Object.defineProperty`.
- Fake `navigator.plugins`, `navigator.mimeTypes`, `navigator.languages`.
- Patch the `chrome.runtime`, `chrome.app`, `window.chrome` shape.
- Override `navigator.permissions.query` so `notifications` reports `prompt` instead of `denied`.
- Spoof WebGL renderer strings to plausible values.
- Hide the "Headless" substring in User-Agent strings.
- Patch `navigator.hardwareConcurrency` and `navigator.deviceMemory` to realistic numbers.
- Make `window.outerHeight`/`outerWidth` differ from `innerHeight`/`innerWidth`.
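To get a feel for what these patches look like under the hood, here is a hand-rolled version of the first two list items (a sketch; `apply_manual_patches` and the injected JS are illustrative, not the library's actual internals):

```python
# Illustrative hand-rolled patches -- stealth libraries ship many more.
MANUAL_PATCHES = """
Object.defineProperty(navigator, 'webdriver', { get: () => false });
Object.defineProperty(navigator, 'languages', { get: () => ['en-US', 'en'] });
"""


def apply_manual_patches(page):
    """Register the patches to run before any page script.

    Works with any Playwright-like page object exposing add_init_script.
    """
    page.add_init_script(MANUAL_PATCHES)
```

Stealth libraries are essentially a curated, battle-tested pile of init scripts like this one.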
The exact list depends on the library version. Read the source; it's eye-opening to see how many small things go into "looking like a real browser."
playwright-stealth (Python)
Install:
```bash
pip install playwright-stealth
```
Apply to a context:
```python
from playwright.sync_api import sync_playwright
from playwright_stealth import stealth_sync

with sync_playwright() as p:
    browser = p.chromium.launch()
    context = browser.new_context()
    page = context.new_page()
    stealth_sync(page)  # one line, applies all patches
    page.goto("https://practice.scrapingcentral.com/challenges/antibot/webdriver-detected")
    print(page.evaluate("navigator.webdriver"))  # → False
    browser.close()
```
For async:
```python
from playwright_stealth import stealth_async

await stealth_async(page)
```
That's the entire integration. Plus a UA override for completeness:
```python
context = browser.new_context(
    user_agent="Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36",
)
stealth_sync(page)
```
playwright-stealth (Node)
The Node equivalent is puppeteer-extra-plugin-stealth (works with Playwright via playwright-extra):
```bash
npm install playwright-extra puppeteer-extra-plugin-stealth
```
```javascript
const { chromium } = require("playwright-extra");
const stealth = require("puppeteer-extra-plugin-stealth");

chromium.use(stealth());

(async () => {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto("https://practice.scrapingcentral.com/challenges/antibot/canvas-fingerprint");
  console.log(await page.evaluate(() => navigator.webdriver)); // false
  await browser.close();
})();
```
playwright-extra is the plugin host; the stealth plugin from the Puppeteer ecosystem works here because the patches are browser-side JavaScript, not driver-specific. It's mature, well-tested, and the de facto Node stealth standard.
undetected-chromedriver (Selenium Python)
The Selenium equivalent:
```bash
pip install undetected-chromedriver
```
```python
import undetected_chromedriver as uc

driver = uc.Chrome(headless=False, version_main=127)
try:
    driver.get("https://practice.scrapingcentral.com/challenges/antibot/canvas-fingerprint")
    print(driver.execute_script("return navigator.webdriver"))
finally:
    driver.quit()
```
The library subclasses Selenium's Chrome driver and applies patches at the driver level: it rewrites the chromedriver binary itself (stripping telltale markers such as the `cdc_` variables) and adjusts Chrome's launch options so the resulting browser doesn't look automated. That's more invasive than Playwright stealth (which patches via injected JS), and historically more effective against sophisticated anti-bot systems.
`headless=False` is intentional: undetected-chromedriver works much better in headed mode, because some patches depend on Chromium's UI subsystem being initialized. For headless servers, you'll need Xvfb or another virtual display.
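On a display-less server, one common workaround (assuming Xvfb is installed; `scrape.py` is a placeholder for your script) is to wrap the run in `xvfb-run`:

```shell
# Run the scraper inside a virtual X display so headed Chrome can start.
# -a picks a free display number automatically.
xvfb-run -a python scrape.py
```

This keeps `headless=False` in your code while still running unattended.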
Verification: test against real signals
Don't trust that stealth is working, verify:
```python
checks = [
    "navigator.webdriver",
    "navigator.plugins.length",
    "navigator.languages",
    "navigator.hardwareConcurrency",
    "Notification.permission",
    "window.chrome ? Object.keys(window.chrome).length : 0",
    "navigator.userAgent",
]
for c in checks:
    print(c, "→", page.evaluate(c))
```
Expected post-stealth output (approximate):
```
navigator.webdriver → False
navigator.plugins.length → 3
navigator.languages → ['en-US', 'en']
navigator.hardwareConcurrency → 8
Notification.permission → 'default'
window.chrome → 6
navigator.userAgent → Mozilla/5.0 (...) Chrome/127.0.0.0 (no "HeadlessChrome")
```
Anything still flagged needs a manual patch via `add_init_script`.
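For example, if `hardwareConcurrency` still reads as 2 (a common container giveaway), a follow-up patch might look like this (a sketch; the helper name and the default of 8 cores are assumptions, not part of any library):

```python
# Hypothetical follow-up patch for a signal the toolkit missed.
HW_PATCH = "Object.defineProperty(navigator, 'hardwareConcurrency', { get: () => %d });"


def patch_hardware_concurrency(page, cores=8):
    """Inject before page scripts run; works with any Playwright-like page object."""
    page.add_init_script(HW_PATCH % cores)
```

Re-run the check loop after each manual patch to confirm it took effect.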
What stealth toolkits DON'T patch
| Signal | Why |
|---|---|
| TLS / JA3 fingerprint | Driven by the browser binary, not JS, already correct for Playwright |
| HTTP/2 fingerprint | Same |
| Behavioural signals (mouse, timing) | Statistical patterns require active emulation, not patches |
| Canvas pixel-level fingerprint | Hard to fake convincingly without GPU access, Camoufox handles this differently (Lesson 2.29) |
| AudioContext fingerprint | Similar, device-specific output, hard to fake |
| Proxy / IP reputation | Out of scope; needs proxy infrastructure (Sub-Path 5) |
| Account-based reputation | A logged-in user with no purchase history is suspicious regardless of fingerprint |
Stealth gets you past tier 1 and most of tier 2. Tier 3 (behavioural) needs more.
Combined recipe
A solid baseline for any anti-bot-protected target:
```python
from playwright.sync_api import sync_playwright
from playwright_stealth import stealth_sync

UA = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36"

with sync_playwright() as p:
    browser = p.chromium.launch(
        headless=False,  # often required for high-level anti-bot
        args=[
            "--disable-blink-features=AutomationControlled",
            "--no-sandbox",
        ],
    )
    context = browser.new_context(
        user_agent=UA,
        viewport={"width": 1280, "height": 800},
        screen={"width": 1920, "height": 1080},
        locale="en-US",
        timezone_id="America/New_York",
        extra_http_headers={"Accept-Language": "en-US,en;q=0.9"},
    )
    page = context.new_page()
    stealth_sync(page)
    page.goto(target_url)
    # ...
    browser.close()
```
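If you reuse this recipe across scripts, the launch and context options can be collected into constants once (a sketch mirroring the values above; the names are illustrative and the values should be tuned per target):

```python
UA = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/127.0.0.0 Safari/537.36"

# Launch options: headed mode plus the automation-hiding flag.
STEALTH_LAUNCH_KWARGS = dict(
    headless=False,
    args=["--disable-blink-features=AutomationControlled", "--no-sandbox"],
)

# Context options: consistent UA, viewport, locale, and timezone.
STEALTH_CONTEXT_KWARGS = dict(
    user_agent=UA,
    viewport={"width": 1280, "height": 800},
    screen={"width": 1920, "height": 1080},
    locale="en-US",
    timezone_id="America/New_York",
    extra_http_headers={"Accept-Language": "en-US,en;q=0.9"},
)
```

Then `p.chromium.launch(**STEALTH_LAUNCH_KWARGS)` and `browser.new_context(**STEALTH_CONTEXT_KWARGS)` keep every script's fingerprint story consistent.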
`--disable-blink-features=AutomationControlled` is a Chromium flag that removes another tier-1 detection vector. Always include it.
When stealth fails
Sometimes a target detects you even with full stealth. Diagnostic ladder:
- Is it actually stealth? Run the check loop above. Confirm patches are applied.
- Is it behavioural? Sites checking mouse movement / timing won't be fooled by JS patches. Add `page.mouse.move(...)` to a few random points, and add small random `wait_for_timeout` pauses between actions.
- Is it IP-based? Try from a residential proxy. Many tier-3 systems weight IP heavily.
- Is it premium anti-bot? Akamai, DataDome, Kasada, Cloudflare Bot Management: these often need patched browsers (Lesson 2.29) or paid services.
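For the behavioural rung, even crude randomisation helps. A minimal sketch (the step count and jitter range are arbitrary assumptions, not tuned values):

```python
import random


def human_path(x0, y0, x1, y1, steps=12, jitter=3.0):
    """Waypoints from (x0, y0) to (x1, y1) with small random jitter,
    landing exactly on the target. Feed each point to page.mouse.move()."""
    pts = []
    for i in range(1, steps):
        t = i / steps
        pts.append((
            x0 + (x1 - x0) * t + random.uniform(-jitter, jitter),
            y0 + (y1 - y0) * t + random.uniform(-jitter, jitter),
        ))
    pts.append((float(x1), float(y1)))  # end precisely on target
    return pts


# Usage sketch inside a Playwright script:
# for x, y in human_path(100, 100, 640, 400):
#     page.mouse.move(x, y)
#     page.wait_for_timeout(random.uniform(10, 40))
```

Straight-line, zero-delay cursor jumps are a classic bot tell; jittered multi-step paths with uneven timing are much closer to real input.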
If you've climbed all four rungs, the right call is often a SERP API or proxy service, covered in Sub-Path 5.
Keeping stealth updated
Detection vendors track stealth toolkits. Old versions get fingerprinted as "stealth-patched browsers", which is itself a signal. Pin loosely and update regularly:
```bash
pip install --upgrade playwright-stealth
```
Check the toolkit's GitHub for recent issues that mention sites you scrape; community knowledge moves fast in this space.
Hands-on lab
Open /challenges/antibot/webdriver-detected with vanilla Playwright and verify it flags you as a bot. Add playwright-stealth, re-run, and see whether you pass. Then try /challenges/antibot/canvas-fingerprint; stealth may help less here, because canvas fingerprints require GPU-level emulation, not just JS patches. That gap motivates Lesson 2.29.
Practice this lesson on Catalog108, our first-party scraping sandbox, starting at /challenges/antibot/canvas-fingerprint.