SharePoint 2013 – Export Index a la Crawl Log

I ran into this while replacing a competitor's search service: I needed to validate that SharePoint had indexed the same data, and the crawl log turned out to be an easy way to get a list of everything SharePoint has crawled. Here's the code:

# Get the Search service application and wrap its crawl log
$ssa = Get-SPEnterpriseSearchServiceApplication
$cl = New-Object Microsoft.Office.Server.Search.Administration.CrawlLog $ssa

# GetCrawledUrls(getCountOnly, maxRows, urlQueryString, isLike, contentSourceID, errorLevel, errorID, startDateTime, endDateTime)
# -1 for contentSourceID/errorID means "don't filter"; errorLevel 0 = success, 1 = warning, 2 = error

$cl.GetCrawledUrls($false,1000000,"",$false,-1,0,-1,[datetime]::MinValue,[datetime]::MaxValue) | Export-Csv -NoTypeInformation successes.csv  # This will likely be huge

$cl.GetCrawledUrls($false,1000000,"",$false,-1,1,-1,[datetime]::MinValue,[datetime]::MaxValue) | Export-Csv -NoTypeInformation warnings.csv

$cl.GetCrawledUrls($false,1000000,"",$false,-1,2,-1,[datetime]::MinValue,[datetime]::MaxValue) | Export-Csv -NoTypeInformation errors.csv

Then it's just a quick ETL of both URL lists into SQL and a full outer join: any row with a NULL on either side is an item that one index has and the other doesn't.
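If you want to skip the SQL round trip, the same diff can be sketched directly in PowerShell with Compare-Object. This is just a sketch under assumptions: competitor.csv and its Url column are hypothetical stand-ins for the other product's export, and the crawl-log CSV's URL column name should be checked against the actual header of successes.csv before running.

```powershell
# Load the two URL lists. Column names are assumptions -- check the CSV headers.
$spUrls    = Import-Csv successes.csv  | ForEach-Object { $_.FullUrl }
$otherUrls = Import-Csv competitor.csv | ForEach-Object { $_.Url }

# Compare-Object reports only the mismatches:
#   SideIndicator "<=" : crawled by SharePoint, missing from the competitor's index
#   SideIndicator "=>" : in the competitor's index, missing from SharePoint's crawl log
Compare-Object -ReferenceObject $spUrls -DifferenceObject $otherUrls |
    Export-Csv -NoTypeInformation differences.csv
```

An empty differences.csv means the two indexes cover the same set of URLs, which is the validation the full outer join was giving us.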
