A spider simulator behaves much like the spiders of a search engine. When the real bots crawl your web pages, you have no way of knowing which parts of a page they skipped or ignored. The best way to find out is to run a tool that crawls your pages the same way an actual search engine spider does.
If you search online, you will find a large number of websites offering this tool free of cost. For convenience, many of these online tools display similar reports of how a spider crawls your pages.
All of these tools can be accessed directly through a website, with no installation required. Each one mimics the behavior of an actual search engine crawler. The main purpose of running such a simulator is to view your website from the perspective of a search engine.
Keep in mind that there is a basic difference between the way end users see a website and the way a crawler or search engine interprets it. Bots cannot access everything that is visible to an end user; for example, content injected by JavaScript after the page loads may never be seen by a crawler. To identify these blind spots, you need a simulator that shows you exactly which data on the page is visible to a crawler.
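To make the difference concrete, here is a rough sketch of how such a simulator can work, using only Python's standard `html.parser` module. The function name `simulate_spider` and the sample page are hypothetical illustrations, not any specific tool's implementation: the parser collects the text and links present in the raw HTML, while anything a script would inject at render time never appears.

```python
from html.parser import HTMLParser


class SpiderSimulator(HTMLParser):
    """Collects the text and links a basic crawler would see in raw HTML.

    Content injected later by JavaScript never shows up here, which is
    exactly the gap a spider simulator is meant to expose.
    """

    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []
        self._skip = 0  # depth inside <script>/<style>, invisible to crawlers

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())


def simulate_spider(html):
    """Return (visible_text, links) as a simple crawler would extract them."""
    parser = SpiderSimulator()
    parser.feed(html)
    return " ".join(parser.text), parser.links


# A hypothetical sample page: the string written by JavaScript exists only
# at render time, so the crawl view below will not contain it.
page = """
<html><head><title>Demo</title>
<script>document.write('Added by JavaScript');</script></head>
<body><h1>Welcome</h1><a href="/about">About us</a></body></html>
"""

text, links = simulate_spider(page)
print(text)   # Demo Welcome About us
print(links)  # ['/about']
```

Running this shows the crawl view contains the title, heading, and link, but not the JavaScript-injected string — the kind of gap that online spider simulators report for real pages.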