Checkbot is a tool that verifies links on a set of HTML pages. It can check a single document or a set of documents spread across one or more servers, and it produces a report summarizing every link that caused a warning or error.
Tags: Internet, Web Site Management, Link Checking
Release Notes: Documents are now decoded before parsing to avoid problems with UTF-8 content, eliminating the "Parsing of undecoded UTF-8 will give garbage when decoding entities" messages. Regular expressions are allowed in the suppression file, and the program complains if the suppression file is not a proper file. Handling of HTTP and FTP servers that have trouble responding to HEAD requests is more robust. The original URL is used when reporting problems. XHTML compliance of the output is ensured.
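As a rough illustration of the suppression feature mentioned above, a suppression file might pair status codes with URL patterns, with regular expressions written between slashes. The exact syntax is defined by Checkbot's own documentation; this layout and these entries are assumptions for illustration only.

```
# Hypothetical suppression file (layout is an assumption, not verified syntax).
# Suppress 404 errors for anything under a retired section of the site:
404 /\/old-site\//
# Suppress 403 errors for one specific private URL:
403 http://www.example.com/private/
```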
Release Notes: This release doesn't throw errors for links that cannot be expected to be valid all the time (e.g. the classid attribute of an object element). It has better fallbacks for some cases where the HEAD request does not work. More classes and IDs have been added to allow more styling of results pages (including an example CSS file). XHTML compliance is ensured. There are better checks for optional dependencies.
Release Notes: A silly build-related problem that prevented Checkbot 1.76 from running at all was fixed. The presence of a robots meta tag is now checked and acted upon.
Release Notes: A --cookies option that allows cookies to be set while checking was added, along with a --noproxy option for indicating which domains should not be passed through the proxy. A new error code is generated for unknown schemes. Minor bug fixes and documentation updates were applied.
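The new options above might be combined on the command line roughly as follows. The option names come from the release notes; the URL and domain list are placeholder values, and the overall invocation is a sketch rather than verified usage.

```shell
# Check a site with cookies enabled, bypassing the proxy for the
# listed domains (URL and domains are placeholders for illustration).
checkbot --cookies --noproxy example.com,localhost http://www.example.com/
```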