Config file craziness on Linux

I have access to a (relatively large) number of Linux systems, and I have a number of files which I share between all of them, the main one being my .bashrc, which is deliberately written to work on all of these systems. (My .vimrc, .screenrc, .ssh/config and some custom scripts are also shared.) Previously I just had to remember whenever I changed one of the files on a system, and copy the changed file to all of the others.

This became unmanageable as the number of machines increased (8 machines at the last count – molotov, bong, joshua, primrose, defiant, galaxy, sovereign and ds9), so I decided I needed an automated way of syncing the files and keeping track of the versions. The obvious solution was some sort of version-control software like CVS or Subversion.

In fact, Subversion would have been ideal; the problem, however, is that the repository has to reside on a system with Subversion installed. A number of the boxes are behind a router (ds9 is the router; galaxy, sovereign and occasionally defiant reside behind it), which rules them out, since the other machines (molotov, bong, joshua, primrose, and defiant when not behind ds9) would not be able to access the repository without an ssh tunnel through ds9, which would be extremely inconvenient. The other machines (molotov, bong, joshua, primrose) are out as I do not have root access to any of them, and none have Subversion (or CVS) installed. That leaves ds9 (I hope you’re keeping up with all these names), our house router, which will disappear at the end of the academic year when we move out. I want this to be a more permanent solution, so I do not want to use a machine which I know is not going to be around for more than a few months.

Having ruled out CVS and Subversion as viable options, I then looked at DARCS. On the face of it, this seemed like a good (but not ideal) solution. Due to the nature of DARCS I would still have to remember which machine had the latest version on it, but I could ‘push’ updates to the machines and DARCS would keep track of the versions, so I could fetch updates from multiple machines and end up with the latest version. Another approach would be to nominate a core machine and push all updates to it; all machines could then pull from this core machine and know they were getting the latest version (this method is suggested as an approach for syncing multiple machines using the Unison File Synchronizer, which I also considered and dismissed as it is not available by default on any of the machines). The huge problem with the DARCS idea was that, when trying it, I discovered I had to enter my password multiple times to check out the files (8 times, every time I perform a checkout!). This is not only really, really irritating, it is totally unusable (imho).

Right, so by this point I had successfully dismissed CVS, Subversion, DARCS and Unison (hope you caught the half-sentence about that one ;) ) as viable options. I thought to myself ‘someone must have written some half-decent version-control software in PHP, for use on a web server’. Some googling later, I found PHP Version Control, which looked quite promising – promising right up to the point where it turned out there are no files available to download, nor any code available there. There was a link to another site which had some code, but it only allowed checking out a file, whereafter it was locked until it was checked in again or the lock was released. You could only edit a file if it were locked; this is not what I was looking for. I wanted a system like SVN or DARCS where I could edit the files on any machine and then commit them to the others, without needing to lock them first.

At this point I gave up on finding anything suitable and set about creating my own solution.

I created a ‘~/scripts’ directory to put all the files I want to synchronise in (not really a good name; I may well change it to something else like .global, and create a .local directory containing files specific to each system, although I try to keep each file working on all systems). The original files are then replaced with symlinks to the shared files (e.g. ‘$ ln -s ~/scripts/bashrc ~/.bashrc’). The latest version is then stored on my website, where it can be updated via scp or downloaded over http.
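
For example, migrating the bashrc (the other files follow the same pattern):

$ mkdir -p ~/scripts
$ mv ~/.bashrc ~/scripts/bashrc
$ ln -s ~/scripts/bashrc ~/.bashrc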

I then wrote two scripts: ‘update.sh’, which fetches the version on the web if it is newer than the local version, and ‘upload.sh’, which uploads the local version if it is newer than the one on the web. Here is a copy of the scripts, together with ‘config.sh’, which holds the settings they both source:

config.sh:

scriptdir=$HOME/scripts
scriptfile=scripts.tar.bz2
versionfile=scriptver
sourceserver=omitted for security
sourcedir=/scripts
sourcessh=omitted for security
sourcescpdir='~/public_html/scripts'
username=omitted for security
password=omitted for security
tempfile=`tempfile`
decomptaroptions=jxf
comptaroptions=jcf

update.sh:

#!/bin/bash

# Import configuration settings
. ./config.sh

echo "Checking for updated scripts:"
echo -n "fetching $sourceserver/$sourcedir/$versionfile..."
wget -q -T 2 --user=$username --password=$password -O $tempfile $sourceserver/$sourcedir/$versionfile
if [ $? -eq 0 ]
then
    echo "OK."
    echo -n "Checking versions..."
    localversion=`cat $scriptdir/$versionfile`
    remoteversion=`cat $tempfile`
    if [ $remoteversion -eq $localversion ]
    then
        echo " $remoteversion = $localversion"
    elif [ $remoteversion -lt $localversion ]
    then
        echo " $remoteversion < $localversion"
        echo -e ">>>\033[1;31mI suggest you upload the new scripts!!!\033[0;0m"
    else # must be greater than
        echo " $remoteversion > $localversion"
        echo -n "Remote version newer. Downloading..."
        wget -q -T 2 --user=$username --password=$password -O $tempfile $sourceserver/$sourcedir/$scriptfile
        echo -en "OK.\nExtracting new scripts..."
        pushd $scriptdir > /dev/null
        tar $decomptaroptions $tempfile
        popd > /dev/null
        echo -en "OK.\nVerifying new version file..."
        if [ $remoteversion -eq `cat $scriptdir/$versionfile` ]
        then
            echo "OK."
        else
            echo "Failed."
            echo -e ">>>\033[1;31mSomething seems to have gone wrong. The version in the script archive does not agree with the version on the server.\033[0;0m"
        fi
    fi
else
    echo "Failed."
fi
echo "Local scripts at version `cat $scriptdir/$versionfile`"
rm $tempfile

upload.sh:

#!/bin/bash

# Import configuration options
. ./config.sh

echo -n "fetching $sourceserver/$sourcedir/$versionfile..."
wget -q -T 2 --user=$username --password=$password -O $tempfile $sourceserver/$sourcedir/$versionfile
if [ $? -eq 0 ]
then
    echo "OK."
    echo -n "Checking versions..."
    localversion=`cat $scriptdir/$versionfile`
    remoteversion=`cat $tempfile`
    if [ $remoteversion -eq $localversion ]
    then
        echo " $remoteversion = $localversion"
        echo "No update necessary."
    elif [ $remoteversion -lt $localversion ]
    then
        echo " $remoteversion < $localversion"
        if [ $remoteversion -lt $(($localversion - 1)) ]
        then
            echo -e ">>>\033[1;31mVersion conflict detected!!!\033[0;0m"
            echo "There seems to be a version problem - the local version is more than one version newer than the remote one!"
            rm $tempfile
            exit 1
        fi
        # Local version is exactly one ahead of the remote: package it up and upload
        echo "Compressing new version..."
        pushd $scriptdir > /dev/null
        tar $comptaroptions $tempfile *
        popd > /dev/null
        echo "OK. Uploading new version..."
        scp $tempfile $sourcessh:$sourcescpdir/$scriptfile
        echo "OK. Uploading new version file..."
        echo $localversion > $tempfile
        scp $tempfile $sourcessh:$sourcescpdir/$versionfile
        echo "OK."
    else # must be greater than
        echo " $remoteversion > $localversion"
        echo "Remote version newer."
        echo -e ">>>\033[1;31mVersion conflict detected!!!\033[0;0m"
    fi
else
    echo "Failed."
fi
echo "Local scripts at version `cat $scriptdir/$versionfile`"
rm $tempfile
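
The intended workflow, for the record: edit one of the shared files, bump the version file by hand (upload.sh deliberately refuses to run if the local version is more than one ahead of the remote one), then upload:

$ vim ~/scripts/bashrc
$ cd ~/scripts
$ echo $((`cat scriptver` + 1)) > scriptver
$ ./upload.sh

Running update.sh on each of the other machines then pulls the new version down.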

Some interesting links

Here are some random URLs I thought might be interesting:

Browse Happy
A website explaining why Internet Explorer is unsafe for use on the web. Unlike most other websites of its kind it does not favour any particular ‘alternative’ (read: broadly safe) browser, but instead provides a list of alternatives and a positive description of each.

sorttable: Make all your tables sortable
This site has a nifty-looking piece of JavaScript which instantly allows any table on a web page to be sorted by any column, simply by declaring the table to be of the class ‘sortable’. Since this is JavaScript the sorting is done client-side, so there is no need to resubmit the page for re-ordering, nor will it hammer the server it is running on with lots of (at least as far as serving web pages is concerned) needless sorts.

apt-get.org: Unofficial APT repositories
A place to share useful (unofficial) APT repositories for Debian.

Simple PHP Blog
The software my original blog was using – no SQL needed, it’s all stored as text files. Easy to configure and update – just decompress and go. Fantastic! My only gripe is that most of the themes are fixed-width, and the only non-fixed-width theme is not configurable wrt colours. Creating themes doesn’t appear to be very straightforward either, unfortunately. Maybe I’ll have to write my own blog software which only uses CSS for theming, so creating a new theme simply means modifying a CSS file… hmm, yet another project I’ll probably never finish.

Weighted wired & wireless network (and ifplugd)

I was looking for a way to avoid waiting for DHCP to time out when booting with no network cable attached (actually I was looking for the correct parameter to make the timeout much shorter – but the solution I eventually found was much neater). Most of this comes from an article on the CLUG Wiki about roaming between wireless and wired networks.

First of all I installed ifplugd. Under Debian this was easy: I configured eth0 (my built-in wired card) as the only static interface, and ath0 (my built-in wireless card, using madwifi) as a dynamic interface, so that I can turn the wireless on and off using ifup and ifdown and it won’t connect to a network that happens to be in range without my permission (I’m not paranoid, I know they’re coming to get me ;) ).

# apt-get install ifplugd

I also changed -d10 to -d1 so that the interface goes down 1 second after the cable comes out instead of 10, as suggested on the CLUG Wiki (link above).
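
For reference, the resulting ‘/etc/default/ifplugd’ looked something like this (reconstructed from memory, so treat the exact values as approximate):

# Interfaces ifplugd should watch; ath0 is deliberately left out so the
# wireless is only ever brought up by hand with ifup
INTERFACES="eth0"
HOTPLUG_INTERFACES=""
# -d1: take the interface down 1 second after the cable is unplugged
ARGS="-q -f -u0 -d1 -w -I"
SUSPEND_ACTION="stop"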

I then edited ‘/etc/network/interfaces’ to ensure that no ‘auto’ lines pointed to eth0 or ath0.

Starting ifplugd was then just a case of:

# ifdown eth0
# /etc/init.d/ifplugd restart

The install was tested by unplugging and re-plugging the network cable and listening for the beeps ifplugd emits when it detects these changes.

I continued following the instructions on the CLUG Wiki to set up the priorities of the interfaces, so that if both a wired and a wireless connection were available it would use the wired one in preference to the wireless.

First I installed iproute:

# apt-get install iproute

Next I modified ‘/etc/network/interfaces’ to look like this:

# The loopback network interface
auto lo
iface lo inet loopback

# The primary network interface
noauto eth0
iface eth0 inet dhcp
    up /usr/local/sbin/route-prios $IFACE 1

# Wireless
noauto ath0
iface ath0 inet dhcp
    wireless-essid omitted for security
    wireless-key omitted for security
    up /usr/local/sbin/route-prios $IFACE 10
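
The ‘route-prios’ script itself comes from the CLUG Wiki article, so I won’t reproduce it verbatim, but the gist is that it re-installs the interface’s default route with the metric given as the second argument, so that when both interfaces are up the kernel prefers the lower-metric (i.e. wired) route. A minimal sketch of the idea (not the real script):

#!/bin/sh
# route-prios IFACE METRIC - sketch only; the real script is on the CLUG Wiki
IFACE=$1
METRIC=$2
# Find the gateway DHCP configured for this interface
GATEWAY=`ip route list dev $IFACE | awk '/^default/ { print $3 }'`
if [ -n "$GATEWAY" ]
then
    # Re-add the default route at the requested metric
    ip route del default via $GATEWAY dev $IFACE
    ip route add default via $GATEWAY dev $IFACE metric $METRIC
fi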

Debian, madwifi-ng and module-assistant

I installed Debian GNU/Linux on my laptop (over Arch Linux) last week, and used module-assistant to install the madwifi driver for my Atheros-based wireless card using The Debian Way(TM).

Here is just a quick note of the commands needed to install madwifi using module-assistant under Debian GNU/Linux:

# apt-get install madwifi-source madwifi-tools module-assistant
# m-a update
# m-a prepare
# m-a a-i madwifi
# modprobe ath_pci

…and that’s all there is to it. Not quite as easy as ‘emerge madwifi-driver’ or ‘pacman -S madwifi-ng’, but still fairly straightforward.

Hmm, Blog

I recently (re-)discovered I’d actually installed a blog script, and never written anything down! Oh well, no time like the present to start – I wonder how long I will be able to keep writing entries before I:

  1. get bored with the whole ‘blog’ idea
  2. simply forget about or neglect the blog to the point that it disappears from my mind (again!)
  3. get distracted by some project or other.

I recently managed to set up a Debian-based mail server. I originally searched Google and came up with a number of guides which looked quite good, albeit long, but it’s a project I’ve been planning for a while so I decided to bite the bullet and have a go. After installing Debian and playing around with various different approaches for a bit, I discovered an entry on another blog, The Tech Terminal, explaining how its author had set up a Debian mail server. This said that all I had to do was enter this:

# apt-get install courier-imap
# apt-get install postfix
# postconf -e 'home_mailbox = Maildir/'
# postconf -e 'mailbox_command ='
# /etc/init.d/postfix restart

at the command line. This was certainly a lot easier than the 8-page guide I had been following previously, and it worked :).
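
A quick way to convince yourself it is working (this assumes a package providing the ‘mail’ command, e.g. mailx, is installed) is to send a message to a local user and check that it lands in their Maildir:

# echo "Test message" | mail -s "Maildir test" someuser
# ls ~someuser/Maildir/new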

Using other guides I installed spamassassin and squirrelmail, and it was all working very nicely. Fetchmail and gotmail were easy to install and configure using the man pages, so I didn’t need to enlist Google’s help with them. I now have a single server with 2x40GB HDDs (configured for RAID 1 using a PCI PATA RAID card) which goes and fetches emails from my 2 POP accounts and my Hotmail account and delivers them to my local user on the machine (for my purposes I decided LDAP was overkill and that dropping the mail into a local user’s Maildir made more sense). This means I can now access my mail using an IMAP client on either my desktop or laptop, or use a web browser from any other location.
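
For anyone doing the same, the ~/.fetchmailrc entries only take a couple of lines per POP account; something along these lines (the hostname and usernames here are made up):

# Poll this POP3 server and deliver to the local user 'dave'
poll pop.example.com protocol pop3
    username "dave1234" password "secret" is "dave" here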

One small snag I did run into is that Maildir creates a directory for each folder on the server (as you’d expect) but doesn’t nest them. I was expecting them to nest, and it took a while (and some head-banging) for me to discover that Maildir actually uses a ‘.’ to represent sub-directories.
e.g. this structure:

Inbox
:-New
:-Badgers
: :-Mushroom
: :-Snake
:-Llama

becomes this Maildir structure:

/Inbox
/Inbox.New
/Inbox.Badgers
/Inbox.Badgers.Mushroom
/Inbox.Badgers.Snake
/Inbox.Llama
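
(Courier ships a maildirmake tool for creating these by hand; if I remember rightly, something like ‘maildirmake -f Badgers.Mushroom ~/Maildir’ creates the ‘.Badgers.Mushroom’ sub-folder in one go.)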