im2html walks a directory hierarchy filled with categorised images and makes HTML indices and thumbnails for everything. It's easy to customise the titles of each page, add descriptive HTML for images or the page as a whole, add references to related web pages, and so on. It's incremental, so you can run it nightly in a cron job to keep things up to date, reducing most maintenance to "copy the image into the right subdirectory and wait overnight". It's careful not to rewrite index files unless their contents change, making maintenance of a remote gallery from your master copy with a tool like rsync efficient.
Runmaint is a shell script for invoking regular jobs. It delivers the output (if any) via mail with a meaningful subject line to the people of your choice. It moves all the complexity from the crontab into an easily managed collection of scripts. Runmaint expects to be run out of cron.
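For example, a crontab can shrink to one line per schedule, with the job's details living in a script runmaint finds by name. The entry below is a hypothetical sketch, not runmaint's documented interface:

    # hypothetical crontab entry: runmaint runs the named job and mails
    # any output, with a meaningful subject line, to the configured people
    17 3 * * *    /usr/local/bin/runmaint nightly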
squid_redirect uses a list of patterns to zap annoying ad banners from Web pages, inserting a placeholder image. It lives in a Web proxy and so requires no special browser facilities. It's readily customisable, small, fast, and easy to install. (In Debian/Ubuntu, this program is named "adzapper".)
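Hooking it into squid is just a matter of pointing squid's redirector directive at the script. Assuming it's installed as /usr/local/bin/squid_redirect, squid.conf would contain something like:

    # squid 2.x directives; newer squids call these
    # url_rewrite_program / url_rewrite_children
    redirect_program /usr/local/bin/squid_redirect
    redirect_children 10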
A simple, trivial-to-use utility for keeping various third-party application packages installed on multiple machines. It leaves you the flexibility to make packages local to a host or serve them remotely from a central server. It is not the same sort of tool as RPM; rather, it serves a related but different purpose. In particular, unlike most package systems, it can run independently of your main system (RPM, pkgadd, etc.) and lets you install multiple versions of an application at the same time.
Myke is a replacement for make, written several years ago. It's an almost complete superset with very few syntactic and semantic differences, so if you like it, moving your Makefiles to Mykefiles is very easy. Its primary difference is a richer macro syntax, permitting extremely succinct Mykefiles.
cats2procmailrc is a filter to generate a .procmailrc file from a much terser and friendlier file consisting mostly of single-line rules. These suffice for the most common mail filing, making maintenance of a procmailrc very easy for someone with more than a few filing rules. Each rule normally names the folder, an optional subject tag, and the pattern; raw procmail rules may also be included for those rare complex filings.
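The exact column layout is cats2procmailrc's own, and the lines below are only a guess at its shape, but the idea is one folder / optional-tag / pattern rule per line:

    # hypothetical rules file: folder, optional subject tag, pattern
    lists/procmail    [pm]    ^TO_.*procmail
    work/boss                 ^From:.*boss@example\.com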
info2man converts GNU info files to pod or -man formats. GNU info can be a pain: it demands its own special pager, it's a binary format, it's cruder than HTML and less documented, and most GNU-authored manual entries basically say "we like info so we don't maintain this manual entry, thus it is probably wrong". info2man converts info files so that they can be read with ordinary tools.
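The invocation below is hypothetical (check the script's own documentation), but assuming it reads an info file and writes pod on standard output, a conversion pipeline might look like:

    info2man grep.info > grep.pod    # hypothetical invocation
    pod2man grep.pod > grep.1        # pod2man ships with perl
    man ./grep.1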
histbackup makes incremental backups of a directory tree in a set of directories named by date, or date-time if the date already exists. Its companion script histbackup-prune controls the archive's growth. The new backup is prepopulated with hardlinks to each file from the previous backup and then updated with rsync(1). This arranges that the only new content in each backup is fresh copies of the changed files. In this way each backup directory is a full copy of the source directory, but the disk space cost is only that of an incremental backup. Because it uses rsync, it can efficiently back up remote directories in this manner.
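The underlying trick is generic and easy to sketch in shell. This is an illustration of the hardlink-then-rsync technique, not histbackup's actual code, and the paths are examples:

    #!/bin/sh
    src=/home/me/work
    backups=/backups/work
    prev=$(ls "$backups" | tail -1)       # most recent existing backup
    new=$backups/$(date +%Y-%m-%d)
    [ -d "$new" ] && new=$backups/$(date +%Y-%m-%d-%H%M%S)
    if [ -n "$prev" ]
    then cp -al "$backups/$prev" "$new"   # hardlink copy of last backup
    else mkdir -p "$new"
    fi
    # rsync writes each changed file to a temp copy and renames it into
    # place, breaking its hardlink; unchanged files stay shared on disk
    rsync -a --delete "$src/" "$new/"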
Re: trusted source
> % SuprNova rocked because it
> % was moderated, they verified the files
> % were good and then posted them.
> Right, so what I propose is to have
> someone like SN do the verified process
> before releasing them to a distributed
Then your moderator pool (consisting of interested people who
will fetch a file for experimental purposes) posts detached signatures indicating "I fetched this and it was good".
You need a way to link signatures to the posted files, but that's
easy because the signature can sign a text file that mentions
the source object.
Then you just decide whom you trust and limit (or better, rank) your searches according to who signed what.
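For example, with GnuPG a moderator could sign a small text file that names the object by checksum (filenames here are illustrative):

    # moderator: describe the object, then sign the description
    sha256sum release.iso > release.txt
    gpg --armor --detach-sign release.txt      # writes release.txt.asc

    # downloader: check the signature, then the object against it
    gpg --verify release.txt.asc release.txt
    sha256sum -c release.txt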
Re: Blocking Specific Websites
Is it possible to block the zapping of ads on some selected websites without having squid bypass them?
Certainly - see the Customisation Section of the web page.
But please report such sites to me at email@example.com as well, so that I can fix these misclassifications.
What you do is: fetch the wrapzap script from the web page, set the $ZAP_PREMATCH variable to your list of site exceptions, and install it as per the directions.
You want PASS lines like this:
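(The exact syntax is described in the wrapzap script's comments; the lines below are only an illustrative guess, with example.com standing in for your exempted sites.)

    ZAP_PREMATCH='
    PASS    ^http://www\.example\.com/
    PASS    ^http://games\.example\.org/
    '
    export ZAP_PREMATCH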
Cheers, - Cameron Simpson