Skip Social Crawler Tracking
Overview
The "Skip social crawler tracking" feature allows you to control whether visits from social media crawlers are recorded in your analytics. When enabled, social media platforms can still access your links (unless blocked), but their visits won't appear in your click statistics.
This helps keep your analytics focused on real human visitors while still allowing social platforms to generate link previews and share cards.
What Are Social Media Crawlers?
Social media crawlers are automated bots used by platforms to:
- Generate link previews when you share a URL
- Extract Open Graph metadata and images
- Create rich social share cards
- Verify link safety and content
Detected Social Media Platforms
The following platforms are recognized as social media crawlers, identified by user agent signature or ISP name:
- Facebook (user agents: FacebookExternalHit, Facebot)
- Twitter (user agent: Twitterbot)
- LinkedIn (ISP: LinkedIn Corporation)
- Google (user agents: Googlebot, APIs-Google, AdsBot-Google)
- YouTube (ISP: Youtube, LLC)
How It Works
When Skip Social Crawler Tracking is Disabled (Default)
Social media crawler visits link
↓
Link forwards to destination
↓
Click is recorded in database
↓
Appears in your analytics dashboard
↓
Counts toward total clicks
↓
Visible in "Robots" tab
Result:
- ✅ Social media crawlers can access the link
- ✅ Clicks are recorded in analytics
- ✅ You can see all social crawler activity
- ⚠️ Analytics include both human and bot traffic
When Skip Social Crawler Tracking is Enabled
Social media crawler visits link
↓
Link forwards to destination
↓
Click is NOT recorded in database
↓
Does not appear in analytics
↓
Does not count toward total clicks
Result:
- ✅ Social media crawlers can still access the link
- ✅ Link previews work on social platforms
- ✅ Analytics show only human visitors
- ✅ Reduced database usage
- ❌ No visibility into social crawler activity
Configuration Options
Option 1: Track All Clicks (Default)
- Skip social crawler tracking: ❌ Disabled
Use case: You want complete visibility into all traffic, including when social platforms fetch your links.
Benefits:
- See when Facebook/Twitter/LinkedIn fetch your links
- Verify that social previews are working
- Track all traffic sources including bots
- Useful for debugging social sharing issues
Drawbacks:
- Analytics include non-human traffic
- Total clicks include bot visits
- Larger database usage
Option 2: Skip Social Crawler Tracking
- Skip social crawler tracking: ✅ Enabled
Use case: You want analytics focused only on real human visitors.
Benefits:
- Cleaner analytics showing only human clicks
- More accurate conversion metrics
- Reduced database storage
- Faster analytics queries
- No need to filter out bot traffic
Drawbacks:
- No visibility when social platforms fetch links
- Can't verify social preview generation
- Can't track which platforms are sharing your content
Interaction with Bot Blocking
The "Skip social crawler tracking" feature works together with "Block bots & spiders":
Scenario 1: Neither Feature Enabled
- Block bots: ❌ Disabled
- Skip social crawler tracking: ❌ Disabled
Result:
- ✅ Social crawlers can access links
- ✅ Clicks are recorded in analytics
Scenario 2: Skip Tracking Only
- Block bots: ❌ Disabled
- Skip social crawler tracking: ✅ Enabled
Result:
- ✅ Social crawlers can access links
- ❌ Clicks are NOT recorded in analytics
Scenario 3: Block Bots Only
- Block bots: ✅ Enabled
- Skip social crawler tracking: ❌ Disabled
Result:
- ❌ Social crawlers are blocked (403 error)
- ❌ No clicks to record
Scenario 4: Both Features Enabled
- Block bots: ✅ Enabled
- Skip social crawler tracking: ✅ Enabled
Result:
- ✅ Social crawlers can access links (exception to bot blocking)
- ❌ Clicks are NOT recorded in analytics
- ❌ Other bots are still blocked
Use case: Block search engines and spam bots, allow social sharing with previews, but keep analytics clean.
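The four scenarios above reduce to a simple decision table. A minimal sketch in Python (the function and flag names are illustrative, not the product's API):

```python
def crawler_outcome(block_bots: bool, skip_social_tracking: bool) -> tuple[bool, bool]:
    """Return (access_allowed, click_recorded) for a social media crawler.

    Social crawlers are blocked only when bot blocking is on AND the
    skip-tracking exception is off; their clicks are recorded only
    when skip tracking is off and they were allowed through.
    """
    if block_bots and not skip_social_tracking:
        return (False, False)  # Scenario 3: blocked with 403, nothing to record
    allowed = True                        # Scenarios 1, 2, 4: crawler reaches the link
    recorded = not skip_social_tracking   # Scenarios 2 and 4 skip recording
    return (allowed, recorded)

# Scenario 1: neither enabled -> allowed, recorded
assert crawler_outcome(False, False) == (True, True)
# Scenario 2: skip only       -> allowed, not recorded
assert crawler_outcome(False, True) == (True, False)
# Scenario 3: block only      -> blocked
assert crawler_outcome(True, False) == (False, False)
# Scenario 4: both enabled    -> allowed, not recorded
assert crawler_outcome(True, True) == (True, False)
```

Note that the function only describes social crawlers; other bots never get the skip-tracking exception and are simply blocked whenever bot blocking is on.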
Technical Details
Detection Method
Social media crawlers are identified by:
- User Agent String: Matching against known crawler signatures
  - Example: facebookexternalhit/1.1
  - Example: Twitterbot/1.0
- ISP Name: Matching against known platform ISPs (via GeoIP lookup)
  - Example: ISP = "Facebook"
  - Example: ISP = "LinkedIn Corporation"
If either the user agent OR the ISP matches a known social media platform, the visit is treated as a social crawler.
Request Flow
1. Request arrives at your short link
2. Link validation (spam check, expiry, etc.)
3. Bot blocking check (if enabled)
├─ Is it a social crawler?
│ ├─ Yes + skip_social_crawler_tracking enabled → Allow through
│ └─ Yes + skip_social_crawler_tracking disabled → Block (403)
└─ Is it another bot? → Block (403)
4. Click recording decision
├─ Is skip_social_crawler_tracking enabled AND is social crawler?
│ └─ Yes → Skip recording
└─ Otherwise → Record click
5. Redirect to destination URL
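Steps 3-5 above can be sketched as a single handler. This is a simplified model under stated assumptions: `record_click` stands in for the real database/BigQuery write, and the returned status codes stand in for the HTTP layer:

```python
clicks: list[dict] = []  # stand-in for the clicks table / analytics sink

def record_click(link: dict) -> None:
    clicks.append({"link": link["id"]})  # stand-in for the DB INSERT / BigQuery write

def handle_request(link: dict, is_social: bool, is_other_bot: bool) -> int:
    """Sketch of steps 3-5: bot blocking, click-recording decision, redirect."""
    # Step 3: bot blocking check (only runs when the link blocks bots)
    if link["block_bots"]:
        if is_social and not link["skip_social_crawler_tracking"]:
            return 403  # social crawler with no skip-tracking exception
        if is_other_bot:
            return 403  # other bots are always blocked
    # Step 4: click recording decision
    if not (is_social and link["skip_social_crawler_tracking"]):
        record_click(link)  # humans, and social crawlers that are being tracked
    # Step 5: redirect to the destination URL
    return 302
```

With both flags on, a social crawler gets a 302 redirect but leaves no click record, which matches Scenario 4 above.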
Performance Impact
With tracking disabled:
- No database INSERT operation for social crawler visits
- No BigQuery write for analytics
- Reduced server load on high-traffic links
- Faster response time (by ~10-20ms per social crawler visit)
Database savings example:
- Link shared on Facebook with 10,000 followers
- Facebook may fetch the link 5-10 times for preview generation
- Without skip tracking: 5-10 database records
- With skip tracking: 0 database records
Analytics Impact
Total Clicks Count
- Disabled: Includes social crawler visits
- Enabled: Excludes social crawler visits
Click Timeline Graph
- Disabled: Shows spikes when social platforms fetch links
- Enabled: Shows only human visitor patterns
Robots Tab
- Disabled: Lists all social crawler visits with platform names
- Enabled: Social crawlers don't appear (not recorded)
Filter Robots Button
When "Skip social crawler tracking" is enabled, the "Filter Robots" button becomes less necessary since social crawler traffic is already excluded from your analytics.
Common Use Cases
Marketing Campaigns
Scenario: You're running a social media marketing campaign and want accurate ROI metrics.
Recommendation: Enable "Skip social crawler tracking"
Why:
- Focus analytics on actual customer clicks
- Accurate conversion tracking
- Clean funnel metrics
- Better understanding of campaign performance
Content Sharing
Scenario: You regularly share links on Facebook, Twitter, and LinkedIn.
Recommendation: Enable "Skip social crawler tracking"
Why:
- Social platforms can still generate previews
- Your click counts reflect real people, not bots
- Easier to measure engagement
- No need to manually filter bot traffic
Debugging Social Sharing
Scenario: You're troubleshooting why social previews aren't working correctly.
Recommendation: Disable "Skip social crawler tracking" temporarily
Why:
- See when Facebook/Twitter fetch your links
- Verify that crawlers can access your content
- Check timing and frequency of crawler visits
- Diagnose Open Graph metadata issues
High-Traffic Links
Scenario: You have a viral link with millions of clicks.
Recommendation: Enable "Skip social crawler tracking"
Why:
- Reduce database load
- Lower BigQuery costs
- Faster analytics queries
- Focus on actual user engagement
Private/Internal Links
Scenario: Sharing links within your organization or to a small private group.
Recommendation: Enable both "Block bots" and "Skip social crawler tracking"
Why:
- Block search engines from indexing
- Allow social previews for team members
- Keep analytics clean
- Maintain link privacy
Best Practices
- Enable for Most Links: For typical use cases, enabling skip tracking provides cleaner analytics with few downsides
- Disable for Debugging: Only disable when actively troubleshooting social sharing issues
- Consider Your Goals:
  - Want to track everything? → Disable skip tracking
  - Want clean analytics? → Enable skip tracking
- Use with Bot Blocking: Combine both features for maximum control over bot access and analytics
- Monitor Your Robots Tab: Periodically check which bots are visiting to understand your traffic
Related Features
- Block Bots & Spiders - Prevent bots from accessing your links
- Blocking Social Media Crawlers - Detailed guide on bot blocking and social crawler exceptions
- Analytics Filtering - Filter bot traffic from analytics views
Support
Need help configuring social crawler tracking?
- Visit our support documentation
- Contact support at [email protected]
- View the list of detected social media crawlers
Frequently Asked Questions
Will this affect link previews on social media?
No. Social media platforms can still access your links and generate previews. This only affects whether those visits are recorded in your analytics.
Can I still see which platforms are sharing my links?
Not directly. With skip tracking enabled, social crawler visits aren't recorded, so you won't see platform-specific data in your analytics. You'll only see actual human clicks from those platforms.
Does this save me money?
Yes, if you're on a usage-based plan. Fewer database writes means lower storage costs and potentially lower BigQuery costs for analytics queries.
Will this make my analytics more accurate?
Yes. By excluding automated crawler traffic, your click counts and engagement metrics will better reflect actual human visitors and potential customers.
Can I enable this for some links but not others?
Yes. The setting is per-link, so you can customize it based on each link's purpose.
What happens if I enable this after a link has already been shared?
The setting only affects future visits. Past social crawler clicks that were already recorded will remain in your analytics.
Does this work with workspace-level defaults?
Yes. You can set a workspace default for all new links, and override it on individual links as needed.
How is this different from the "Filter Robots" button?
- Filter Robots button: Hides bot traffic from the analytics view, but the data is still recorded
- Skip social crawler tracking: Prevents recording altogether, so no data is stored
Can sophisticated bots bypass this?
This feature relies on user agent and ISP detection. Bots that perfectly mimic human browsers may be recorded as regular clicks, but legitimate social media platforms are reliably detected.
