+4 votes
by (1.3k points)
Just ran a Screaming Frog crawl on a client site that was developed by someone else. The site is WordPress and uses the All In One Event Calendar plugin, which is generating tens of thousands of unique URLs that Screaming Frog picks up. This is making my crawl audit a much tougher task than it should be. Thoughts? Should I have the developer simply use an embedded calendar instead, or is there some other solution?

2 Answers

0 votes
by (2k points)
In your Configuration, before running the crawl, use the Exclude feature to filter out the set of parameters or URL syntax the calendar is generating. The calendar URLs should, in theory, share a common syntax or directory, so excluding that pattern will make your crawl much cleaner. Hope this helps!
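Screaming Frog's Exclude feature takes regular expressions matched against the full URL. The small sketch below shows how such patterns work; the `/event/` path and `ai1ec` query parameter are assumptions about how this plugin structures its URLs, so verify them against your own crawl before relying on them.

```python
import re

# Hypothetical patterns for Screaming Frog's Configuration > Exclude.
# All In One Event Calendar URLs are assumed here to live under /event/
# or to carry an ai1ec query parameter -- check your actual crawl data.
EXCLUDE_PATTERNS = [
    r".*/event/.*",
    r".*\?ai1ec.*",
]

def is_excluded(url):
    """Return True if a URL matches any exclude pattern.

    Screaming Frog matches the regex against the whole URL,
    so re.fullmatch mirrors its behaviour.
    """
    return any(re.fullmatch(p, url) for p in EXCLUDE_PATTERNS)

print(is_excluded("https://example.com/event/2024-05-01/launch-party/"))  # True
print(is_excluded("https://example.com/blog/seo-tips/"))                  # False
```

Pasting the raw patterns (one per line) into Configuration > Exclude should drop the matching URLs from the crawl entirely.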
by (1.3k points)
@consolata35 Yeah, I ended up doing that for our purposes. I was wondering if it would affect other crawlers, though - Google specifically.
by (11.4k points)
Use your standard techniques to block crawling of those URLs (robots.txt disallow rules, noindex, or parameter handling).
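For Google specifically, a robots.txt rule is the usual way to keep the calendar URLs from being crawled. This is only a sketch: the `/event/` path and `ai1ec` parameter are assumptions about the plugin's URL structure, so confirm the real patterns from your crawl first.

```
# robots.txt -- illustrative only; confirm the calendar's actual
# URL patterns before deploying
User-agent: *
Disallow: /event/
Disallow: /*?ai1ec
```

Note that robots.txt blocks crawling, not indexing; URLs Google has already indexed may additionally need a noindex directive before being blocked.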
+3 votes
by (6.8k points)
Easy as that, as described above.
