Scans the tags used by search engines and the robots.txt file. Displays the redirect path and IP address.
A free extension that lets the user view data collected by search engines and receive alerts when pages are blocked. The product is easy to use and requires no prior SEO knowledge.

Robots.txt file scanning:
Search engines read the robots.txt file to determine which pages not to crawl.
• An alert when a page is blocked
• Last scan data

Meta robots tags:
• An alert when a tag blocks indexing (noindex) or tells search engines not to follow the links on a page (nofollow)

Redirect path:
Checking and displaying the redirect path includes:
• The full route
• HTTP status
• Redirecting URL

Website IP address:
The extension shows the IP address of the website's server.

Nofollow links:
This attribute tells search engines not to follow the link.
• The extension scans the page and highlights in yellow all links that contain rel="nofollow" (pictures and other non-textual elements are supported as well)
• Highlighting of those links can be turned off

Title and meta description tags:
These tags are displayed in Google search results.
• Displays the tags' contents
• An alert when a tag appears more than once

Canonical tag:
This tag prevents the same content from being indexed twice. When two or more URLs display identical or similar content, we do not want the search engine to index all of the duplicated pages, so we choose one URL to declare as canonical on all of them. For example, one URL sorts products by price from lowest to highest, and another sorts from highest to lowest; the content is almost identical, so to prevent both from being indexed we declare the same single canonical URL on both pages.

Supported languages: English, Hebrew, Russian

Itay Arye
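The robots.txt check the extension performs can be sketched with Python's standard library. This is a minimal illustration, not the extension's actual code; the rules string and URLs are invented examples.

```python
# Minimal sketch of a robots.txt blocking check, using only the Python
# standard library. The rules and URLs below are made-up examples.
from urllib.robotparser import RobotFileParser

robots_txt = """
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Pages under /private/ are blocked for all crawlers; others are allowed.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

A real checker would fetch the live robots.txt (e.g. via `RobotFileParser.set_url` and `read`) rather than parse a hard-coded string.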
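The on-page checks (meta robots directives, the canonical URL, and rel="nofollow" links) can likewise be sketched with a small standard-library HTML parser. This is an illustrative assumption of how such a scan might work, not the extension's implementation; the HTML document and class name `SEOTagScanner` are invented for the example.

```python
# Sketch of scanning a page for meta robots directives, the canonical URL,
# and rel="nofollow" links. The HTML below is an invented example page.
from html.parser import HTMLParser

html_doc = """
<html><head>
<meta name="robots" content="noindex, nofollow">
<link rel="canonical" href="https://example.com/products">
</head><body>
<a href="https://example.com/a" rel="nofollow">sponsored link</a>
<a href="https://example.com/b">normal link</a>
</body></html>
"""

class SEOTagScanner(HTMLParser):
    """Collects meta robots directives, the canonical URL, and nofollow links."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []
        self.canonical = None
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives = [d.strip() for d in attrs.get("content", "").split(",")]
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")
        elif tag == "a" and "nofollow" in attrs.get("rel", "").lower():
            self.nofollow_links.append(attrs.get("href"))

scanner = SEOTagScanner()
scanner.feed(html_doc)
print(scanner.robots_directives)  # ['noindex', 'nofollow']
print(scanner.canonical)          # https://example.com/products
print(scanner.nofollow_links)     # ['https://example.com/a']
```

The extension would then alert on `noindex`/`nofollow` directives and highlight the collected nofollow links on the page.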
- (2020-12-12) hnu siyak: best tool for beginners
- (2019-02-23) E. O.: This is the only extension on the market that displays in real time whether a certain URL is blocked by the domain's robots.txt. The developer is very active - bugs are closed right after notification.
- (2017-10-29) Digital Ashish: coolest tools
- (2017-05-07) Jenna Louise: Simplest tool, recommended
- (2017-05-05) dan James: Best tool
- (2017-04-23) Michael Offengenden: Very helpful!!!