The utility is a lightweight command-line application designed to fetch all the links referenced at a given URL. Put simply, it crawls the pages within the same domain and collects as many external links as possible. The scraping is performed down to a given depth, which is controlled by the Max parameter and defaults to Max=1, meaning only one level of URLs is followed.
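The idea behind the tool can be sketched as a depth-limited, same-domain crawl that records every external link it encounters. The snippet below is a minimal illustration of that technique, not the tool's actual implementation: the site contents, URLs, and function names are assumptions, and network fetching is stubbed with an in-memory dictionary so the example is self-contained.

```python
# Sketch of a depth-limited crawl: follow same-domain links up to max_depth
# (analogous to the Max parameter) and collect every external link seen.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

# Hypothetical site used as a stand-in for real HTTP fetches.
PAGES = {
    "https://example.com/": '<a href="/about">About</a> <a href="https://partner.org/">P</a>',
    "https://example.com/about": '<a href="https://cdn.example.net/lib.js">CDN</a>',
}

class LinkParser(HTMLParser):
    """Collects every href found in <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, max_depth=1):
    domain = urlparse(start).netloc
    external, seen = set(), {start}
    queue = deque([(start, 0)])           # breadth-first crawl with depth tracking
    while queue:
        url, depth = queue.popleft()
        parser = LinkParser()
        parser.feed(PAGES.get(url, ""))   # stand-in for fetching the page over HTTP
        for href in parser.links:
            absolute = urljoin(url, href)
            if urlparse(absolute).netloc == domain:
                # Same-domain link: follow it only if the depth limit allows.
                if depth + 1 <= max_depth and absolute not in seen:
                    seen.add(absolute)
                    queue.append((absolute, depth + 1))
            else:
                external.add(absolute)    # External link: record it.
    return external

print(sorted(crawl("https://example.com/", max_depth=1)))
# → ['https://cdn.example.net/lib.js', 'https://partner.org/']
```

With max_depth=0 only the start page is scanned, so raising the limit is what lets the crawler discover external links on deeper pages.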
Generally speaking, extracting and inspecting data from websites enables developers to spot broken links and correct them. The operation can also come in handy for security purposes, as it can help identify poorly maintained web apps and pages. Beyond that, the console application is useful in a variety of situations, common ones including news monitoring, lead generation, price tracking across multiple marketplaces, contact information extraction, and data collection for market research.
While this can be done manually, keep in mind that it can turn out to be tedious work, so using a specialized tool can considerably speed up the process. Note that the tool does not provide any method of exporting or saving its results. Consequently, users need to run Command Prompt or PowerShell as Administrator and redirect the command's output to a text file in a convenient location.
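Since the tool only prints to standard output, saving results comes down to shell redirection. The executable name and arguments below are placeholders (the source does not state the exact syntax); the redirection operator itself works the same way in cmd.exe, PowerShell, and Unix shells, as demonstrated with a universally available command.

```shell
# Hypothetical invocation, redirecting the crawl results to a text file:
#
#   crawler.exe https://example.com > C:\reports\links.txt
#
# The same redirection mechanism, shown with a command available everywhere:
echo "https://partner.org/" > links.txt
cat links.txt
```

The > operator overwrites the target file on each run; appending across multiple runs would use >> instead.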