HTTrack is an easy-to-use offline browser utility. It allows you to download a Web site from the Internet to a local directory, recursively building all directories and getting HTML, images, and other files from the server to your computer. HTTrack preserves the original site's relative link structure. Simply open a page of the mirrored Web site in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site and resume interrupted downloads. WebHTTrack is a Web-based GUI for HTTrack.
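A typical session might look like the sketch below. The URL and output path are placeholders; the filter pattern and flags follow HTTrack's documented command-line conventions, but check `httrack --help` for your version.

```shell
# Mirror a site into a local directory.
# -O sets the output directory; the "+" filter keeps the crawl on-site;
# -v prints verbose progress.
httrack "https://www.example.com/" -O /home/user/mirrors/example "+*.example.com/*" -v

# Later, refresh the existing mirror in place (run from the mirror directory):
cd /home/user/mirrors/example && httrack --update
```

Once the mirror finishes, opening `index.html` in the output directory lets you browse the copy offline.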
ProxyTrack is a simple proxy server that delivers content archived by HTTrack sessions. It can aggregate multiple download caches, for direct use (through any browser) or as an upstream cache slave server. This proxy can handle HTTP/1.1 proxy connections, and is able to reply to ICPv2 requests for efficient integration with other cache servers, such as Squid. It can also handle transparent HTTP requests to allow cached live connections inside an offline network.
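As a rough sketch of such a setup: the command below starts ProxyTrack serving two HTTrack caches over a local port. The port, cache paths, and exact flag spelling are assumptions for illustration; consult ProxyTrack's own usage output before relying on them.

```shell
# Serve archived HTTrack caches through a local HTTP proxy on port 8080
# (paths are placeholders; flags may differ by version, see `proxytrack --help`).
proxytrack -p 8080 /home/user/mirrors/site1/hts-cache /home/user/mirrors/site2/hts-cache
```

A browser pointed at `localhost:8080` as its HTTP proxy would then see the archived sites as if they were live.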
Wow, at last we have an efficient memory leak debugger on Linux/i386! Thanks for this tool, which is the only way to clean up complex code and detect leaks, buffer overflows, and uninitialized areas.
For reference, similar tools (such as Purify) are rare and generally cost thousands of dollars.
Thanks for rz/sz!
I consider rz and sz as vital as ls, vi, or fdisk. I was once stuck on a remote server and had to transfer a fairly large (2-3 MB) binary to fix it, while connected over a single modem line (an "emergency" serial connection). All server interfaces were down, so no scp/ftp was possible. I was able to transfer the binaries using rz through my Tera Term (telnet) terminal, and successfully brought the server back up.
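The recovery described above can be sketched roughly as follows. The file path is a placeholder, and the exact menu for starting a ZMODEM send depends on the terminal emulator (in Tera Term it is under File > Transfer > ZMODEM > Send).

```shell
# On the stricken remote host, inside the serial/telnet session,
# start the ZMODEM receiver; it waits for the terminal emulator
# on the local machine to initiate the send.
rz

# After the transfer completes, sanity-check the received binary
# before running it (path is a placeholder).
md5sum ./recovered-binary
chmod +x ./recovered-binary
```

Because ZMODEM runs over the same byte stream as the interactive session, no working network service on the server is needed, which is exactly what makes rz/sz useful in this situation.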