# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file
#
# To ban all spiders from the entire site, uncomment the next two lines:
# User-agent: *
# Disallow: /

User-agent: *
Allow: /
Disallow: /checkout/cart/
Disallow: /checkout/onestepcheckout/
# Crawl-delay: 4

# The grub distributed search engine behaves very badly.
User-agent: grub-client
Disallow: /

# They totally overwhelm servers with traffic.
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# Proprietary German backlinks service.
User-agent: SEOkicks-Robot
Disallow: /

# wget run recursively just wrecks server capacity.
User-agent: wget
Disallow: /

# This spider's output isn't public.
User-agent: 008
Disallow: /

# Entireweb
User-agent: Speedy
Disallow: /

# Foreign-language bot
User-agent: Sogou web spider
Disallow: /

# Poorly behaved bot
User-agent: Ezooms
Disallow: /

# Brandwatch
User-agent: magpie-crawler
Disallow: /

# Majestic-12
User-agent: MJ12bot
Disallow: /

# Youdao
User-agent: YodaoBot
Disallow: /

# NerdByNature.Net
User-agent: NerdByNature.Bot
Disallow: /

# Discovery Engine
User-agent: discobot
Disallow: /

# No idea.
User-agent: knowaboutBot
Disallow: /

# Gnip
User-agent: UnwindFetchor
Disallow: /

# These bots are designed to duplicate entire sites.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: WebReaper
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /