28 March 2006

grabbing HTTP headers

Sometimes it's useful to inspect the HTTP response headers from a Webserver (for example, to know if the Webserver is running Apache, IIS, or something else). Most Web browsers have some sort of 'Page Info' feature which will display the response headers. But it's often more convenient to do it from the command line.

The text-only browser lynx has a nice feature for this. Typing the following command will dump the HTTP response headers to the screen without displaying the content of the www.example.com homepage:

lynx -dump -head http://www.example.com/

wget can also do this. Ordinarily, wget downloads the Web content to a local file without displaying response headers. The following will show the headers and discard the content (-S displays the headers, and -O redirects the output, in this case to /dev/null):

wget -S -O /dev/null http://www.example.com/

The curl utility can do much the same thing (note that this is a lower-case o specifying the output destination, and that the bare hyphen after -D tells curl to write the headers to stdout):

curl -D - -o /dev/null http://www.example.com/
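If all you want is the headers, curl can also issue a true HEAD request with -I, skipping the body entirely (a minor variation on the command above):

```shell
# -I (or --head) sends a HEAD request and prints only the response headers
curl -I http://www.example.com/
```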

netcat offers a fourth way of getting the headers by allowing you to hurl a custom HTTP request at port 80 on the Webserver (HTTP strictly requires CRLF line endings, hence the \r\n sequences, though many servers will tolerate a bare \n):

printf "HEAD / HTTP/1.0\r\n\r\n" | nc www.example.com 80

The previous example only works for plain HTTP. For HTTPS, you can do a similar trick using the s_client mode of the openssl utility (this example uses an HTTP/1.1 request, which requires a Host request header):

printf "HEAD / HTTP/1.1\r\nHost: www.example.com\r\n\r\n" \
| openssl s_client -ign_eof -connect www.example.com:443

The -ign_eof option stops s_client from closing the connection as soon as printf's output runs out, giving the server time to respond; it also means you'll have to close the connection manually (Control-C should do it). Additionally, you may get certificate verification errors from openssl. If so, try specifying your system's certificate authority bundle (which contains the public keys of a list of trusted certificate authorities, and which may be in a different location than in this example):

printf "HEAD / HTTP/1.1\r\nHost: www.example.com\r\n\r\n" \
| openssl s_client -ign_eof \
-CAfile /etc/pki/tls/certs/ca-bundle.crt \
-connect www.example.com:443

24 March 2006

sorting files by modification date

At times I've found that I need to sort the contents of a directory by modification date. If all the files are in a single directory, 'ls -lt' will do the trick. But it's not so easy if the files are scattered through an arbitrarily complicated directory structure. So I've saved the following shell script in ~/bin/file_epoch.sh (and made it executable w/ 'chmod 700 ~/bin/file_epoch.sh'):

#!/bin/sh
file_name="$1"
mod_time="`/bin/ls --full-time \"${file_name}\" \
| awk '{ print $6,$7; }' | cut -d'.' -f1`"
epoch="`date -d \"${mod_time}\" +%Y%m%d%H%M%S`"
echo "${epoch} ${file_name}"

The script takes a single filename as its argument, and the output is something like this:

20060324152639 /path/to/some/file

Then when I need to sort the files, I just do this:

find /path/to/directory/structure -type f \
-exec ~/bin/file_epoch.sh {} \; | sort -rn

This runs the script on every file in /path/to/directory/structure, and piping the output to 'sort -rn' sorts the files in reverse chronological order (newest first).
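On systems with GNU findutils, you can skip the helper script entirely: find's -printf directive can emit each file's modification time itself (a sketch; %T@ is GNU-specific and prints the mtime as seconds since the epoch):

```shell
# %T@ = modification time in epoch seconds, %p = path; sort -rn puts newest first
find /path/to/directory/structure -type f -printf '%T@ %p\n' | sort -rn
```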

22 March 2006

current year (e.g., for copyright) in TT2 templates

If you are using the Perl Template Toolkit and want the template always to display a copyright notice giving the current year, just include the following in your template:

© [% USE date %][% date.format(date.now, '%Y') %]

It uses the Template::Plugin::Date plugin.

using GPG as a password wallet

I have too many passwords. In fact, I make many of my less-used passwords by grabbing the first 10 or 12 characters of output from something like this:

dd if=/dev/urandom bs=1k count=1 | md5sum

So I get passwords like '1758dbed4331'--no way I'll remember that.
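If openssl is installed, it can also cut out the middleman and emit a random password directly (a sketch; note the 12 is the number of random bytes, not the length of the output):

```shell
# 12 random bytes, base64-encoded: a 16-character password
openssl rand -base64 12
```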

For a while I tried using a nice text-based password wallet program called the Password Management System (PMS--yes, it's a very unfortunate acronym). I learned about this from a Linux Journal article by Marcel Gagne. I built it from source and used it on FC3 for months.

Then I upgraded to FC4. I rebuilt PMS from source and tried reading the previous PMS data files (two data files in my home directory, as I recall). I kept getting segmentation faults. PMS on FC4 worked OK on new data, but it wouldn't read the old data. Some weird incompatibility in the libraries, I guess. So I had to build PMS on another FC3 box, run it, and copy-and-paste all my passwords out of PMS and into a text file. Lame. Hella lame.

So now I just stick with that text file, but it's encrypted by gpg:

gpg -c passwords.txt

When prompted, I gave it a good password. Now, when I need to look up one of my passwords, I just do this:

cat passwords.txt.gpg | gpg | less

Because the output is in 'less', I can even do text searches (each username/password pair is accompanied by some text describing where the password is used). When I've looked up whatever password I need, I just hit 'q', and I'm done.
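Updating the file means briefly having a plaintext copy on disk. Here's one way I might do it (a sketch; shred is GNU-specific, and its scrubbing isn't guaranteed on journaling filesystems):

```shell
gpg -o passwords.txt -d passwords.txt.gpg  # decrypt to a plaintext working copy
vi passwords.txt                           # edit with your editor of choice
gpg -c passwords.txt                       # re-encrypt (confirms before overwriting the .gpg)
shred -u passwords.txt                     # scrub and delete the plaintext
```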

trick for hosting large DVD ISOs on Apache

I recently needed to make a large file (a DVD ISO) available from a Webserver running Apache. The ISO was around 3.1GB, and Apache wouldn't serve it (the file wouldn't even show up in the mod_autoindex listing). Presumably that build of Apache lacked large-file support and couldn't handle files over 2GB.

So I split up the original file into smaller chunks:

split -b 1073741824 huge_DVD.iso

This created 4 files called xaa, xab, xac, and xad. The first three were 1073741824 bytes each, and xad was around 31MB. I renamed the files (in order) to huge_DVD.chunk1, huge_DVD.chunk2, huge_DVD.chunk3, and huge_DVD.chunk4.
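Newer GNU versions of split can save you the renaming step: -b understands size suffixes like 1G, -d asks for numeric suffixes, and a prefix argument names the chunks up front (a sketch):

```shell
# produces huge_DVD.chunk00, huge_DVD.chunk01, ... in 1GiB pieces
split -b 1G -d huge_DVD.iso huge_DVD.chunk
```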

I posted the 'chunks' to the Webserver along with a text file containing the SHA1 checksum of the original file. The original file can be reassembled and its checksum computed in a single read by running the following command (after downloading the chunks):

cat huge_DVD.chunk* | tee huge_DVD.iso | openssl dgst -sha1

17 March 2006

bad blogger, no cookie

I haven't been very good at updating this silly thing, have I?