Crawler Record logs the last time specific user agents (for search engines and AI chat/LLMs) accessed your content, including:
You can view this information:
Robots-aware: The plugin checks your robots.txt and evaluates Allow/Disallow rules for a given path. If Settings → Reading → “Discourage search engines” is enabled, all agents are shown as blocked with a prominent warning.
Performance-friendly by design: Write-throttling (default: 10 minutes) and an auxiliary “last post ID per agent” record avoid heavy admin queries on large sites.
Privacy-friendly: Saves only bot visit timestamps and last URLs crawled — no personal data.
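To make the robots-aware behavior concrete, here is a minimal sketch of prefix-based Allow/Disallow evaluation. This is illustrative Python, not the plugin's actual (PHP) code; the function name `is_allowed` and the longest-matching-prefix tie-break are assumptions for the example, and wildcards are deliberately not interpreted, matching the prefix-only behavior described below.

```python
# Illustrative sketch of prefix-based robots.txt matching (not the
# plugin's actual code). Wildcards (*) and end-of-line ($) are NOT
# interpreted; the longest matching literal prefix wins (assumed).

def is_allowed(rules, path):
    """rules: list of (directive, prefix) tuples for one agent,
    e.g. [("Disallow", "/private/"), ("Allow", "/private/public/")].
    Returns True if the path is allowed for this agent."""
    best_len = -1
    best_allow = True  # no matching rule means the path is allowed
    for directive, prefix in rules:
        if prefix and path.startswith(prefix) and len(prefix) > best_len:
            best_len = len(prefix)
            best_allow = (directive == "Allow")
    return best_allow

rules = [("Disallow", "/private/"), ("Allow", "/private/public/")]
print(is_allowed(rules, "/private/secret.html"))    # False
print(is_allowed(rules, "/private/public/a.html"))  # True
print(is_allowed(rules, "/blog/post"))              # True
```

A more specific rule (the longer prefix) overrides a broader one, which is why `/private/public/a.html` is allowed even though `/private/` is disallowed.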
Learn how to use this plugin.
Wildcards (*) and the end-of-line marker ($) are not interpreted; matching is prefix-based only. Future versions may add full spec support.

This plugin stores:
– Timestamps of crawler visits (float, with microseconds)
– Last URL seen per crawler (per-URL records)
– Last post ID per crawler (for admin performance)
It does not collect or store personal data about site visitors. No data is transmitted to third parties.
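The write-throttling and the stored fields above can be sketched together. This is a hypothetical Python illustration under assumed names (`record_hit`, a dict-based `store`); the real plugin persists these records in WordPress, but the throttle logic is the same idea: a crawler hit only triggers a write if the throttle window (default 10 minutes) has elapsed.

```python
# Illustrative sketch (hypothetical names, not the plugin's API) of
# write-throttling: skip the write if the last record is recent.

import time

THROTTLE_SECONDS = 10 * 60  # default 10-minute throttle window

def record_hit(store, agent, url, post_id, now=None):
    """store: dict mapping agent -> record. Writes only if the last
    record is older than the throttle window; returns True on write."""
    now = time.time() if now is None else now
    rec = store.get(agent)
    if rec and now - rec["timestamp"] < THROTTLE_SECONDS:
        return False  # within throttle window: skip the write
    store[agent] = {
        "timestamp": now,         # float, with microseconds
        "last_url": url,          # last URL seen for this agent
        "last_post_id": post_id,  # auxiliary record for fast admin queries
    }
    return True

store = {}
print(record_hit(store, "GPTBot", "/blog/post-1", 101, now=1000.0))  # True
print(record_hit(store, "GPTBot", "/blog/post-2", 102, now=1300.0))  # False
print(record_hit(store, "GPTBot", "/blog/post-2", 102, now=1700.0))  # True
```

Note that only the timestamp, last URL, and last post ID are kept per agent, consistent with the privacy statement above: no visitor data enters the record.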
GPLv2 or later. See LICENSE file.