Category Archives: computing

Script to change keyboard on Windows 7

I’m currently working in a client’s office, on an extremely locked-down Windows 7 PC. As usual, I want to change to the Dvorak keyboard layout, which is my standard. However, the environment on the computer is reset every couple of days, which wipes out my keyboard setting.

So I started looking for a way to change it through a PowerShell script. Unfortunately, Windows 7 only has an extremely limited version of PowerShell. But I was able to find a way to change the layout using an XML script, pieced together from two sites.

Combining them both, I put together the following XML file:

<gs:GlobalizationServices xmlns:gs="urn:longhornGlobalizationUnattend">

    <gs:UserList>
        <gs:User UserID="Current"/>
    </gs:UserList>

    <gs:InputPreferences>
        <!-- Add Dvorak -->
        <gs:InputLanguageID Action="add" ID="0409:00010409"/>

        <!-- Remove US default -->
        <gs:InputLanguageID Action="remove" ID="0409:00000409"/>
    </gs:InputPreferences>

</gs:GlobalizationServices>


The trickiest part was finding the code for the Dvorak keyboard, which is 0409:00010409 (the leading 0409 is the US English language ID; 00010409 is the Dvorak keyboard layout ID).

The command to apply the XML file is:

control intl.cpl,, /f:"Desktop\changekeyboard.xml"

I then put this into a .bat file which I keep on the desktop. So whenever the keyboard setting changes, I just double-click the .bat file and the keyboard is fixed again.
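The .bat file itself is just a thin wrapper around that command. Here’s a minimal sketch – the path assumes the XML file sits on the desktop, as above:

```bat
@echo off
rem Apply the keyboard layout defined in the XML answer file
control intl.cpl,, /f:"Desktop\changekeyboard.xml"
```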

Permanent redirect to https

This is a bit of a follow-on from my previous post, in which I was setting up https access on my website.

Once you’ve got https set up correctly, you might, like I did, want to make sure that all traffic to your website now goes over the SSL connection, rather than through an unencrypted connection.

On Linux hosting, like I have with Quadra Hosting, this can easily be done by creating a ‘.htaccess’ file. Create one in the root level of your hosted directory (the one where you have your index.html file). In the .htaccess file, put in the following lines:

RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

This will redirect all traffic to the SSL connection.
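If you have shell access, you can also create the file from the command line instead of an FTP client. A minimal sketch – the document-root path here is an assumption, so adjust it for your own hosting:

```shell
# Write the HTTPS redirect rules into .htaccess in the document root.
# DOCROOT is an assumed location - change it to match your host.
DOCROOT="${DOCROOT:-$HOME/public_html}"
mkdir -p "$DOCROOT"
cat > "$DOCROOT/.htaccess" <<'EOF'
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
EOF
```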

Setting up Let’s Encrypt with Quadra Hosting

For the last few years, I’ve been using Quadra Hosting for my web hosting needs. Their servers are great, with good speed and connection, and they’re reasonably priced. They also have the best customer and technical support that I’ve ever seen, bar none. I highly recommend them if you’re looking for a new web host.

Let’s Encrypt is a project that has set out to ensure that all web traffic is encrypted. To that end, they have changed the traditional process of obtaining SSL keys to a much simpler one, and provide them for free.

Also, Chrome will soon be warning users anytime they visit an unencrypted website. With these two factors in mind, I set about trying to install a Let’s Encrypt key onto my Quadra hosting account. As the green lock in the address bar for this site shows, I was successful!

Let’s Encrypt’s preferred method of installation is via the Certbot client, which runs on the server and installs and renews the keys. However, after a quick check with the technical support team at Quadra, I realised that this wasn’t going to work on shared hosting, where I don’t have low-enough-level access to the server.

Fortunately, Let’s Encrypt and others provide a variety of other ways to install the keys, with scripts in a number of different languages. One that I came across was a bash script that can run on shared hosting without root access. It’s simply called, and it worked really well.

Here’s the process that I went through to set up Let’s Encrypt with the script.

  1. Go to your Quadra Hosting control panel and turn on SSL connections, by going to ‘Domain Settings’, then ‘SSL’, and then clicking the ‘Generate self signed SSL certificate’ button.


Click ‘submit’ on the next page, then just ignore the page of keys that comes up after that. You won’t be using those keys, but it has activated the SSL functionality for the server for your domain.

2. Next you’ll need to install the script. To do this, and most of the rest of the steps, you’ll need shell access. It’s not turned on by default, so you’ll need to fill out the shell access request form and send it through to the Quadra Hosting support team. Then, when you’re logged on, enter

curl | sh

This will download and install the acme script. During installation it tries to add an alias for to your .bashrc, but that step failed for me, so I had to add the alias line manually. Assuming installed into its default ~/ directory, this can be done by entering the following line:

printf 'alias"~/"\n' >> ~/.bashrc

then reload your bashrc by either logging out / in, or by typing

source ~/.bashrc

3. With the script set up, you can now generate and install your SSL keys. The first step is done with the ‘issue’ command:

 --issue -d -d -w ~/

This contacts the Let’s Encrypt servers, generates the SSL keys, and copies them to the acme install directory. It’ll print the locations of the new certificates:

[Thu Nov 10 01:46:53 GMT 2016] Your cert is in  /hsphere/local/home/xxxx/
[Thu Nov 10 01:46:53 GMT 2016] Your cert key is in  /hsphere/local/home/xxxx/
[Thu Nov 10 01:46:53 GMT 2016] The intermediate CA cert is in  /hsphere/local/home/xxxx/
[Thu Nov 10 01:46:53 GMT 2016] And the full chain certs is there:  /hsphere/local/home/xxxx/

4. Next up, you need to install the newly created certificates. This is done in two ways: firstly through the control panel, then through the command line. This is so that both Quadra and the script know about the certificates.

First, log onto the server with an FTP client and download (or view) the four certificate files listed in the key-creation message above.

On the control panel, go to ‘web options’ then select ‘edit SSL support’.


This will bring up a page with some areas in which you can paste in the keys:

  1. Paste yoursite.key into ‘Certificate key’
  2. Paste yoursite.cer into ‘Certificate file’
  3. Paste ca.cer into ‘Certificate Chain file’


The Certificate Key and Certificate file can be uploaded at the same time, but the Certificate Chain File will need to be uploaded separately.

Once that is done, go back to the command line. You’ll need to change the permissions on the ssl key folder:

chmod 0600 ~/ssl.conf/*

Install the keys again using the acme script: --installcert -d --certpath ~/ssl.conf/ --keypath ~/ssl.conf/ --capath ~/ssl.conf/


Note that this will throw a few errors: the script will also try to install some backup keys, which it won’t be able to do. I don’t consider this a problem, as the keys’ primary location is the acme install directory anyway.

5. Restart your Apache instance to load the new keys. There is no direct way to do this, but the Quadra support team suggested a simple workaround. Go to Web Options for your domain in the control panel, then click on the ‘Mime Type’ button.
On the window that pops up, enter some dummy data and hit ‘submit’.


Then click on Save / Apply in the Web Options window. Wait a few minutes, then click on the red ‘X’ next to the new Mime type. This will delete the new entry.


Click Save / Apply again. This will reboot the Apache instance, and load up the new keys.

6. Test the installed certificate. Wait five minutes for the server to reboot, then visit your site over https. If all the steps above have worked, then it should be working nicely. Next, check that the cron job to renew the certificates has been installed correctly; this should have happened when the script was installed.

This can be done through either the control panel (go home -> tools -> cron) or through the command line:

crontab -l
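For reference, the renewal entry that installs typically looks something like the following – the time and paths here are illustrative assumptions (the installer randomises the minute, and the path depends on your home directory):

```
0 0 * * * "/home/user/"/ --cron --home "/home/user/" > /dev/null
```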

You can also go to SSL Labs’ SSL test to see that everything is working correctly.

7. Celebrate! You’ve now got SSL working on your site, and you’re doing your small bit to make the whole web a bit more secure.


After I posted this article, I sent a link through to Quadra support with a note along the lines of ‘here’s a guide I made up, please feel free to send it to any other customers who might be trying to do the same thing.’

I shortly got a response from the support team. The team member who’d been helping me with this setup not only responded and corrected some mistakes I’d made, but he also wrote: “You could also mention that we would be happy to do all this for them if they asked us to, e.g. if they are not used to shell commands.”

This surprised me no end. The Quadra Hosting team sell a one-click SSL implementation, which only costs $30 and requires no technical skills at all. Rather than suggesting “You could also mention that we have a cheap one-click SSL implementation if they are having difficulty with Let’s Encrypt”, they offered to forgo that money and instead spend time helping you implement a no-cost alternative.

If you’re not with Quadra yet for your hosting, change. This is just a small example of how good they are.

Quad-boot Lenovo X220

After getting Mac OSX working fine on the Lenovo, I decided to reach higher and go for a real multi-boot setup – five OS’s! I wanted to put on Windows 8, so that I could play Steam games, and I wanted Linux to see how Linux has progressed over the last few years, and BSD just to see if I could.

My first step – upgrading the hard drive. The 320GB Seagate that I was using was fine for two partitions, but wouldn’t really cut it for more. So I bought a Seagate 7200rpm 750 GB Hybrid HD. With that done, I had to partition it up, which was quite a feat:

Disk Utility Screenshot

So that’s

  • 170GB for the original Windows Partition
  • 120GB for Mac OS X
  • 200GB for Windows 8 (pretty much only for Steam)
  • 60 GB for Linux
  • 50 GB for BSD and
  • 180 GB for general storage (formatted FAT, so that all the OS’s can share it.)

I used Mac OS’s Disk Utility for the partitioning – it does a good job, and makes it a lot harder to make a fatal mistake.

Once that was done, it was a pretty straightforward, though time-consuming task to put all the OS’s on it. The Mac OS and Win 7 partitions I could copy straight across from the 320 GB drive. Windows 8 installed without too much hassle, but only once Windows 7 was on. It wouldn’t install into the non-first partition without another boot partition being in place.

For my Linux partition, I chose Ubuntu, which is definitely ‘flavour of the moment’. I’ve installed Linux in the past – Red Hat 3 on my old Pentium II 300MHz desktop computer, and Gentoo on a Sony Vaio Picturebook. Compared to those old installs, Ubuntu installed like a dream – I’d say fractionally easier than the Windows 8 install, and getting close to that of OSX.

I wasn’t completely thrilled with Ubuntu. It’s probably the Gnome-based Unity desktop, but I found a few of the OS choices to be quite annoying, particularly the dock on the left. There was also a surprising lack of configuration options for changing the user experience.

Two things did impress me, though. The first was that sleep / wake worked just fine without any setting or configuration changes. The second was the wireless connectivity. With my last Linux install – Gentoo on a Picturebook – getting wireless networking running required a lot of hard work, and a lot of editing of configuration files. Ubuntu, however, made the experience as easy and fast as Windows or OSX does.

I was trying to go for five OS’s, with BSD being the fifth. Unfortunately, I wasn’t able to get it to work, which was very frustrating. The first time I tried to install it, I discovered that it would only install into a primary partition, not into an extended partition. After much grinding of teeth, I re-partitioned the hard drive and started from scratch.

I then moved Windows 8 to an extended partition, so that I could install BSD on a primary. Then, for some reason, I just could not create a bootable USB installer with BSD on it. The X220 doesn’t have a DVD drive, so I had to go with plan B – move the hard drive across to another laptop which has a DVD drive, install there, then move the HD back for first boot.

Using this method, I was able to get PC-BSD installed and working, at least while it had full control of the bootloader. Once I installed another bootloader to boot the other OS’s, BSD would no longer work. I tried both the FreeBSD and the PC-BSD variants, but neither would boot. I eventually gave up. The BSD partition was only meant for ‘fun’ – a bit of icing on the cake – but it was proving to be more hassle than the rest of the OS’s put together.

Only after I had gone through all of that drama with installing on another computer did I realise that I had another install option: I could have booted the installer in VirtualBox, and installed onto one of the other partitions.

Getting the boot loader to work is generally a pain with these multi-boot installs, but I was able to get that sorted pretty easily. I used my Windows-to-go bootable USB to boot into Windows, and then I used EasyBCD to configure the boot loader on the computer. So my boot loader now looks like this:


After I had got it all working, I went back and had another look at Ubuntu. I wasn’t completely happy with the Gnome environment. Fortunately, there’s an officially supported branch of Ubuntu called Kubuntu, which uses the KDE environment. I found this to be much more to my liking, and a lot more configurable. The only downside is that the wireless networking isn’t quite as easy or smooth as Ubuntu’s, and takes a bit longer to re-initialise itself after sleep.

All up, I’m now very happy with my work-issued Lenovo X220. It does absolutely everything I want, and can run any program I want. It’s got a good size and is extremely robust. I think that when it’s time to hand this computer back, I’ll go and pick up an X220 for myself.

RAM Disk on Mac OS X

One of the things that really annoys me with modern computers is that they don’t use their RAM as much as they should – particularly with video and music playing.

Case in point: I was watching a movie on my new Hackintosh the other day, and the hard disk light was flashing continuously. The computer wasn’t doing anything except playing the video, and I had more than 6GB of free RAM. However, the video player (VLC in this instance, but they all do it) wasn’t using that RAM; it was continuously reading from the HD. That’s okay for a desktop computer or a laptop with an SSD, but for a laptop with a regular HD, that’s a real waste of battery power. The 800MB video file could easily be read into RAM once, then accessed from there by the video player.

One of the best ways to get around this problem is to create a RAM disk, something that used to be a regular feature on Apple computers. The obvious ways of doing it were removed with the introduction of Mac OS X, but the underlying capability is still there.

I was able to find out how to do this pretty quickly, through a Terminal command:

diskutil erasevolume HFS+ "ramdisk" `hdiutil attach -nomount ram://3165430`

This creates a RAM disk that’s about 1.6 GB in size. Modify the last number to make the disk larger or smaller.
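The number after ram:// is the size in 512-byte sectors (3165430 sectors × 512 bytes ≈ 1.6 GB). A quick sketch for working out the sector count for any size you want:

```shell
# ram://N is measured in 512-byte sectors, so a disk of
# size_mib mebibytes needs size_mib * 1024 * 1024 / 512 sectors.
size_mib=1600
sectors=$((size_mib * 1024 * 1024 / 512))
echo "ram://$sectors"   # for a 1600 MiB RAM disk
```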

One thing that the articles I found didn’t mention is how to easily turn a Terminal command into an ‘executable’ that you can click on to run. So I thought I’d post a quick ‘how to’.

  1. Open up a text editor (I use TextWrangler) and paste the command into it.
  2. Save it as a text file to the drive.

Ramdisk script

Once this is done, do a ‘get info’ on the file. Go to the selection ‘open this file with’, and select Terminal.

Ramdisk script Info

Once this is done, when the script is double-clicked, it will open up in Terminal and run:

ramdisk initialisation

This will create the ram disk, and put it on the desktop, like any other type of network or local drive.

ramdisk info

You can then copy your video or other files to it, and get extremely fast access to those files. It’s great as well if you want to do some video editing / photo manipulation, but you have to remember it’s volatile memory. If you lose power for any reason, you’ve totally lost the contents of the drive.

Edup USB wifi dongle

As I mentioned in my last post, one of the remaining issues is that the Lenovo X220’s wifi card isn’t recognised by Mac OS X. One of the ways around this is to replace the wifi card with one that’s recognised by OS X, and this is the path most take, as the new card only costs about $15, and works seamlessly with the computer.

I didn’t want to do this though, since the laptop belongs to my employer, and I didn’t want to open up the case and void the warranty. Instead, I went to eBay and bought the smallest USB wifi dongle I could find: an Edup RTL8192C 802.11n dongle.


The miniaturisation they’ve achieved with this card is remarkable. The fact that they could pack the electronics and an aerial into this tiny package is amazing. It’s only 150N networking, and the range isn’t great, but that’s to be expected.


Naturally, the drivers that came with the dongle don’t include Mac drivers. It took a bit of hunting to find some, but I eventually found them. I’ve included them below in case anyone finds this page when looking for drivers.

EDUP Wifi Card drivers

When installing on Mountain Lion, the installer says that it fails, but it actually works. To initiate the wireless connection, you have to use the ‘Wireless Network Utility’ app which is installed. This app replaces all the built-in wireless connectivity.

Wireless Setup Window

An interesting side-effect is that the wireless dongle also works in Windows, so you can connect to two wireless networks simultaneously.

Lenovo x220 Hackintosh

For the last few weekends, I’ve been doing a little project: trying to get Mac OS X running on my work-issued Lenovo x220 laptop. It’s quite a good laptop, with an i7 processor and 8 gigs of RAM.

However, I can’t really use it. It’s got a pretty locked-down copy of Windows 7 running on it, and I don’t have admin rights. That’s fine, my workplace pays for this computer, they can set it up how they want. So I decided to experiment, to see how I could use the computer in other ways without affecting the ‘work’ nature of the computer at all.

The first thing I tried was to swap out the hard drive. The Lenovo makes this pretty easy, with just one screw on the outside case. With one of my spare laptop drives in, I then installed Windows 8. Naturally, that worked just fine, but switching out the HD every time I wanted to do something different is a bit cumbersome.

So then I tried a new feature of Windows 8 called Windows To Go. It’s basically a full install of Windows 8 that can run from a USB stick or external hard drive. I then created a WTG external drive. This worked fine, and allowed me to play Steam games with no hassle, and without affecting the work hard drive at all.

After a few weeks, I thought that I’d see what other possibilities might be achievable. I’ve really missed my MacBook Pro since the motherboard died about a year ago now, but I was never quite able to justify to myself buying a new one. I’ve always been an Apple and Mac fan, so one day, I did a search to see if it was possible to create a ‘Hackintosh’ with the Lenovo.

I found a few pages that had some methods on how to do it. The main one that was very useful was from the ThinkPad Forums.

The main issues I had to work through were:

  • Creating a boot USB stick.
  • Installing Mac OS X
  • Copying across the Windows 7 partition and fixing the bootloader.

Creating the USB boot disk gave me a bit of grief at first. I tried creating one following instructions I found on various pages, which mostly involved copying files across from the Mountain Lion installer DMG. This method copied the files across, but didn’t leave me with a drive that the Lenovo would see as bootable. The next method I tried was using the Lion Disk Maker tool. However, while this tool works really well with genuine Macs, it doesn’t work so well for Hackintoshes.

The method which ended up working for me was using TonyMac’s UniBeast tool. This worked well, and got me to the installer window. Then I got stuck at the next step. The installer only works with GPT-formatted hard drives, and the Lenovo (since it uses Windows 7) only allows MBR. (Windows 8 now works with GPT.) Fortunately, the main ThinkPad Forums page came to the rescue again, pointing to this page on the site.

On that page are some hacked installers, which allow you to install onto MBR partitions. Once these were copied across onto the USB installer, it then allowed me to install, and get to first boot.

At first boot, things got a little tricky. At this point, you need to install various extensions to get the computer to recognise all the hardware, change a few configuration files, set up the boot manager and patch the DSDT, so that control of the processor is correct. At this point, I got a lot of kernel panics. Much fun.

The forum post lists two methods for first-boot setup, one by ‘Fraidos125’, the other by ‘Superkhung’. Some people record success using one method, some using the other. Unfortunately, neither worked for me on its own. I kept reading through the thread, trying to figure out why I wasn’t having any success. Then, toward the end of the thread, I saw that someone suggested using both methods in conjunction, one after the other. I gave this a try and had success!

Here’s a photo of Mac OS running on the Lenovo.


There are only three residual issues with the ‘Hack OS X’ install – the wifi doesn’t work, the bluetooth keyboard installer pops up every time you boot, and the third USB port doesn’t work.

Wifi not working is a recognised issue with the X220. Mac OS just doesn’t have a driver for this wifi card. Most people just buy a new wifi card and install it into their laptop, since a new card only costs about $15. I don’t want to do this with a work-issued laptop though, so I’ll find a USB alternative.

The second issue – the bluetooth installer – is slightly annoying. The keyboard and trackpad are PS/2, and work fine with the right extension. However, on bootup, the computer doesn’t quite seem to recognise that a keyboard is connected, and starts up the bluetooth keyboard installer every time. It’s easy to quit out of it, but it’d be better if it didn’t show up at all.

The third problem is strange. The X220 has three USB ports: two USB 2.0, and one USB 3.0. The USB 3.0 port doesn’t work at all. I suspect that there’s something wrong with my configuration files. Most X220’s only have two USB ports, with the i7 models having three. I think that one of the files I downloaded from the ThinkPad forums is causing the issue. It hasn’t bugged me enough yet that I want to fix it, but I’ll probably add it to my list for later.

MacOS Info

Once I had Mac OS installed, I decided to try and make the computer dual-boot, so I wouldn’t have to swap out the hard drive every time I went to work. Doing this was almost as tricky as the MacOS installation. The main problem is that I don’t have administrator rights on the work partition (Windows 7), so I had to work completely ‘hands off’ from it.

One of the things which helped this was the WTG boot drive, and this came in very handy for the dual-boot set up. It enabled me to boot up the 220 without having to touch the internal drive, allowing me to manipulate it as I needed.

The first step was to copy an image of the Win 7 partition from one HD to the other. I tried a couple of methods with no success, but then I found one that worked, using a tool called DriveImage XML. It works like the image copy-and-move capabilities of Mac OS’s Disk Utility, and allowed me to copy the Windows 7 partition from one drive to the other with no changes.

Once the partition was copied across, I then re-installed MacOS into the other partition. Then I just had to get a bootloader working so that I could select which partition to start. The Hackintosh installer uses a boot-loader called Chameleon, but I couldn’t get this to work with the Win 7 partition. I tried creating a Win 7 rescue USB and fixing the Win 7 partition, but that didn’t work either.

The easy way to get both to work is to use a program called EasyBCD on Windows. It works very well, but I don’t have admin rights on the Win 7 partition. Fortunately, the WTG drive came to the rescue again. Running EasyBCD on it, I was able to fix the bootloader on the laptop’s drive. It boots into the Windows Boot Loader (the finest of text-based user interfaces), which then enables me to select which partition I want to boot from. If I select MacOS, it then goes to the Chameleon boot loader, which allows me to select boot-options for the MacOS partition.

Now both partitions are working great. I’ve got my pristine work partition, and a Mac OS partition for fun. It makes me want to install a few other OS’s, just to see if I can. Maybe a Quin-boot Work / MacOS / Win 8 / BSD / Linux setup.

Project for another weekend.

Backing up Godaddy with Rsync

Like many, I’ve got my web hosting with Godaddy. I quite like them: they’re cheap, you can host multiple websites very easily, and they provide you with a really good level of control. My only issue with them is that you don’t have rsync functionality for backups. I much prefer rsync for all of my backups, as the incremental process is so much faster than a full FTP backup.

I was having a look at this problem this morning when I found this page. It outlines a way to get a copy of rsync onto Godaddy for use. Unfortunately, it was a bit thin on details, so I thought I’d expand on them a bit more here, so that people from the future may be able to learn from what I’ve found.

Godaddy’s hosting is currently using CentOS 5.5, so you’ll need to find a copy of rsync which is compatible with that OS. Version 2.6.8 currently works. You can find it here.

This is an RPM package, so it’s not easy to open. Grab the 7-Zip (freeware) program, which can read it just fine. Go to the ‘bin’ directory inside the package, and copy out the ‘rsync’ executable. Then upload it to Godaddy.

I had trouble when I copied it. For some reason, my FTP program transferred it in ASCII mode, which corrupted it. I had to manually set my FTP program to transfer in binary mode. Make sure the file size comes out at 313688 bytes.

Once it’s on Godaddy, log in with SSH. SSH isn’t automatic with Godaddy. You need to go to your hosting control panel and turn it on.

home$ ssh

Create a new directory called ‘bin’.

example$ mkdir bin

Move the rsync executable to the bin directory

example$ mv ~/html/rsync bin/

Make the program executable.

example$ chmod 755 bin/rsync

That’s all you need to do! This, then, is the command I use to back up my websites:

home$ rsync -aviPh --progress --delete-after -e ssh --rsync-path=bin/rsync /mnt/Backups/Websites/

Offsite data backups

One of the big problems for small companies and homes is off-site backup. Previously, I’ve done what I’m sure many people do: I have a big HD that I take to work. Once a month, I bring the big drive home, sync it with my home drive, then take it back to work and put it in the filing cabinet. I’m not concerned about anyone from work stealing the drive, or the data off it.

That’s good enough for people like me, for whom a month’s lost data is more of an inconvenience than a disaster. But a small business needs a much better option. My Dad’s business is like that. It’s only a very small business, but he needs an almost complete record of his email history – only the loss of a few days’ data is acceptable.

The solution I originally had was to use Apple’s Time Machine to back up onto an external disk, which he would swap with one at home every month. This worked fairly well, but I kept thinking it could be done better – in particular, without requiring him to shuffle drives.

What led me to change my thinking was Dodo‘s new unlimited broadband plans. They’re reasonably priced, and have completely unlimited data. They’re significantly more expensive for rural areas than urban ones, but still pretty good (only $20 / month more than my 10GB plan). My main concern with Dodo was that their network speed wouldn’t be up to scratch. Most of the time it’s excellent; it’s usually only on weekends that I notice it’s hard to get above a couple of hundred kb/s.

So an unlimited data plan covers getting the data offsite, but where to store it? At the moment, there’s several providers who do data backup ‘to the cloud’. These are typically fairly limited in the amount of data that they can store, though prices range from good to ordinary.

The answer I came up with was network drives. A friend suggested the Drobo network drive, and after having a look at them, I thought they would be ideal. The DroboFS has five drive slots, so you’ve got a lot of storage with good redundancy. With 2TB drives and dual redundancy (any two drives can die), you can still store about 5.5 TB of data.

Being network-enabled, they can also talk to each other across networks. The Drobos can be installed with a variety of software, including OpenSSH, rsync, FTP software, and a bunch of others. So I bought two of them: one to set up at the office, another at home.

So the system I’ve got now goes like this:

  • Local computer uses Time Machine to back up hourly to the network drive.
  • Local computer also uses rsync to back up daily to the network drive.
  • The network drive backs up the rsync backup to its twin.

The beauty is that it’s completely automated. Using rsync keeps the size of each incremental backup to a minimum.

Security is the largest concern at this point. A randomly-generated 30-character password should take care of a lot of the scanner-bots out there. The only port which is open to the outside world is SSH. Fortunately you can do everything that needs doing (remote login, FTP and rsync) all through SSH. For additional security, I also set SSH to a non-standard port.
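Moving SSH off the standard port comes down to a one-line change in the OpenSSH daemon config. An illustrative fragment – the port number matches the one used in the backup command below, and the file’s location will depend on where the Drobo’s OpenSSH package keeps its config:

```
# sshd_config - listen on a non-standard port to cut down on scanner-bots
Port 2345
```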

Wondering what the pros do, I had a chat with an IT expert. He said that the only real way to do offsite backups is with tape. I can see the point: if the computer’s filesystem gets corrupted, or data is deleted, then that missing data can carry through to the local, then the offsite backup. The primary aim of my backup system is disaster recovery – coming back from total data loss. Using Time Machine on the backup drive covers individual file loss, but at the moment there’s no way to recover from both issues at once.

So in the future I might add a tape system alongside the network drives, for better point-in-time recovery. But at the moment, I’m pretty happy with how the system is set up. For reference, here’s the (rather unwieldy) command to back up one drive to the other:

rsync -aviPh --progress --delete-after -e 'ssh -p 2345' --rsync-path=/mnt/Drobo-FS/Shares/DroboApps/rsync/rsync /mnt/Drobo-FS/Shares/Backup/

New Website

I’ve been doing a bit of work on a few websites lately. One thing I realised during this work is that my hosting provider (GoDaddy) lets you host multiple sites, with multiple domain names, all under one hosting account. I hadn’t realised that before; I thought that you could only set up individual sites within folders, but not with domain names pointing to those folders. That allowed me to consolidate my hosting all under one account. I kept the GoDaddy one – even though it’s not the cheapest, it does provide you with a lot of control.

During the process, I investigated the WordPress blog system, and was pretty impressed by it. It’s got all the advantages of Blogger’s system, but it’s a lot more flexible, and being installed on your own host, it gives you total control of every aspect of the site. Plus, I like the aesthetics of the content-management system, and the innumerable themes which are available. I only found the Blogger themes to be ‘okay’ at best.

One thing I did was move my church’s website across to WordPress. This makes for much easier posting of the weekly sermons. Before that, the church’s website was just created in Apple’s iWeb, and adding a new sermon each week involved a fair amount of messing around and re-uploading the whole ‘sermon’ directory to the server.

Having a full content-management system with user control also means that I can create pages that other people can edit without messing up the whole site – e.g. the youth leader can safely and securely edit the ‘youth’ page. This should allow for a much more dynamic website, as the individual group leaders can control and edit their own pages.

Since the church website was up and running nicely, I thought that I should think about moving this blog across. After thinking about it for a bit, I realised that there was no good reason to stay on blogger. I set up WordPress and imported across all my old posts. The biggest hassle was changing the DNS entries.

The main downside is that I’ve lost the comments that were already in Blogger’s system. Despite the import plug-in saying that it was able to import comments, it turned out not to be the case.

Of course, the other downside is that I need to do my own backups, but that’s not too onerous.