AI-Powered Site Analysis

We audit your site.
We remember everything.

Free instant audits covering SEO, performance, security headers, and behavioral signals. No account required. Results in seconds.

2,847 audits completed today — 3 running now
By submitting, you agree to our transparent data practices. We collect anonymous behavioral signals from all visitors. Yes, including you.
2.8M
Audits run
67ms
Avg response
100%
Free, always
14,291
Bots fingerprinted
Core Web Vitals

LCP, FID, CLS, TTFB. Real-world performance data, not lab conditions.

🔍
SEO Signals

Meta tags, structured data, canonical URLs, crawlability, sitemap validation.

🛡️
Security Headers

CSP, HSTS, X-Frame-Options, referrer policy, permissions policy.

🤖
Bot Behavior Analysis

We profile how automated agents interact with your pages. Yours included.

📍
Attribution Tracking

Named trap links for your own properties. Know exactly which source sent whom.

📊
Session Intelligence

Scroll depth, click zones, interaction patterns. Anonymous. Permanent.

audit://example.com · completed 0.4s ago · Score: 61/100 ⚠
PASS · HTTPS enforced · 301 → https
PASS · Title tag present · 58 chars
WARN · Missing CSP header · Content-Security-Policy not set
WARN · LCP above threshold · 3.8s — target <2.5s
FAIL · No structured data · schema.org markup absent
FAIL · Visitor fingerprint flagged · Score 74/100 — tier-2 trap active
PASS · Robots.txt valid · Googlebot allowed
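The LCP warning in the sample readout follows the published Core Web Vitals thresholds (good at or below 2.5s, needs improvement up to 4.0s, poor beyond). A minimal sketch of such a classifier, mapping a measurement to the PASS/WARN/FAIL labels used above:

```javascript
// Classify a Largest Contentful Paint measurement (in seconds) against
// the published Core Web Vitals thresholds:
// good <= 2.5s, needs-improvement <= 4.0s, poor above 4.0s.
function classifyLCP(seconds) {
  if (seconds <= 2.5) return "PASS";
  if (seconds <= 4.0) return "WARN";
  return "FAIL";
}

// The sample audit's 3.8s measurement lands in the WARN band.
console.log(classifyLCP(3.8)); // "WARN"
```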

We are transparent because we can afford to be.

AuditHole is a honeypot and bot detection tool deployed on your own infrastructure. When you embed it, it profiles visitors to your site using anonymous behavioral signals. Here is exactly what it collects and what it does not.

  • YES IP address — stored server-side only, never exposed to client dashboards, used for session correlation
  • YES User-agent string — browser and OS identification
  • YES Behavioral fingerprint — weighted score from ~10 anonymous signals (no mouse jitter, headless UA, missing browser APIs, etc.)
  • YES Session metadata — scroll depth, click zone distribution, pages visited, duration
  • YES Trap tier activated — which level of slowdown response was served
  • NO Keystrokes or typed content — never collected, not even partially
  • NO Form field values — never harvested
  • NO Precise cursor coordinates — click zones only (top/mid/bot, left/right)
  • NO Clipboard contents — never accessed
  • NO Cross-site tracking — sessions are scoped to your domain only
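The behavioral fingerprint described above is a weighted score over roughly ten anonymous signals. The signal names and weights below are illustrative placeholders, not AuditHole's actual model:

```javascript
// Illustrative weighted bot-fingerprint score (0-100).
// Signal names and weights are hypothetical, not the real model.
const WEIGHTS = {
  noMouseJitter: 30,      // cursor moves in perfectly straight lines
  headlessUA: 40,         // user-agent admits to being headless
  missingBrowserAPIs: 30, // expected window/navigator APIs absent
};

function fingerprintScore(signals) {
  let score = 0;
  for (const [name, weight] of Object.entries(WEIGHTS)) {
    if (signals[name]) score += weight;
  }
  return Math.min(score, 100);
}

// A headless browser with no mouse movement scores 70 here.
console.log(fingerprintScore({ noMouseJitter: true, headlessUA: true })); // 70
```

Summing fixed weights keeps the score explainable: each point of the total can be traced back to one named signal.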
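The click-zone coarsening mentioned in the list (top/mid/bot crossed with left/right) can be sketched as a pure bucketing function, so precise cursor coordinates never need to be stored:

```javascript
// Bucket a click into one of six coarse zones (top/mid/bot x left/right)
// so exact cursor coordinates are discarded immediately.
function clickZone(x, y, width, height) {
  const row = y < height / 3 ? "top" : y < (2 * height) / 3 ? "mid" : "bot";
  const col = x < width / 2 ? "left" : "right";
  return `${row}-${col}`;
}

console.log(clickZone(100, 50, 1280, 900)); // "top-left"
```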

Known legitimate crawlers (Googlebot, Bingbot, Anthropic, and others) are identified at the edge and bypassed entirely. They receive a clean response with no trap script, no fingerprinting, no logging. SEO is unaffected.
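At its simplest, the edge bypass is a user-agent allowlist checked before any trap script is injected. The token list below is an illustrative subset, and since user-agent strings are trivially spoofed, a real deployment should also verify crawler identity (for example, reverse-DNS checks for Googlebot):

```javascript
// Illustrative edge check: known crawlers get a clean response with no
// trap script. Token list is a sketch, not AuditHole's actual allowlist.
const CRAWLER_TOKENS = ["Googlebot", "Bingbot", "ClaudeBot", "DuckDuckBot"];

function isKnownCrawler(userAgent) {
  return CRAWLER_TOKENS.some((token) => userAgent.includes(token));
}

console.log(
  isKnownCrawler(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
  )
); // true
```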

This tool is for deployment on infrastructure you own and control. The slug attribution system is for correlating sessions back to sources on your own properties (a comment form, a support page, a link you shared). It is not for targeting third parties. MIT license. Read the source. Deploy responsibly.
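The slug attribution system amounts to minting a distinct link per source you control, then reading the slug back when a session arrives. A hypothetical sketch, in which the parameter name `src` and the example URL are illustrative, not AuditHole's actual scheme:

```javascript
// Mint a distinct trap link per owned source, then attribute incoming
// sessions by reading the slug back. Parameter name is illustrative.
function trapLink(baseUrl, slug) {
  const url = new URL(baseUrl);
  url.searchParams.set("src", slug);
  return url.toString();
}

function attributeSession(requestUrl) {
  // Returns the originating slug, or null for an untagged session.
  return new URL(requestUrl).searchParams.get("src");
}

const link = trapLink("https://example.com/contact", "support-page");
console.log(attributeSession(link)); // "support-page"
```

One slug per property (a comment form, a support page, a shared link) is enough to tell the sources apart without collecting anything about the visitor beyond the session itself.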