To set the scene - I'm hosting the Dutch PHP Conference this week and there's a LOT of information to keep track of. Between tweets using the hashtag, people tweeting either my username or any of the conference Twitter accounts, the schedule itself (which will change, as we're having a no-prior-scheduling uncon), and the event and talk comments coming in via joind.in, it's quite a bit of stuff to track. Add into the mix the fact that my iPhone's data package will be too expensive to use in NL, and although I will have a phone with a data package, it won't be mine and there's no guarantee exactly what it will be. Oh, and conference wireless, which last year wasn't bad at DPC, but you have to assume there's a bottleneck.
So I figured that since I have a fast production server with plenty of spare resources, it can do the hard work of processing the feeds and just serve me a basic page with everything on it and some internal hyperlinks to get around. So I wrote this monster, which runs a bunch of Twitter searches and pulls in the schedule and comments from joind.in, and that was good.
Except it takes quite a long time to run (well, a few seconds, but that's too long in my book), so I thought about caching each result set in memcache and writing a cron job to repopulate those regularly, so that every time I hit the script the cache would be warm. Then I realised I was overcomplicating matters, and simply wrote a cron job to run my PHP script and pipe the output to a static HTML file every minute! Whenever I hit the page, I get the latest version. It doesn't scale, but it doesn't need to; it's only for me to use for a few days! Here's the cron job:
* * * * * php -f /path/to/generate.php > /path/to/index.html
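One caveat with redirecting straight into index.html: a visitor who hits the page mid-write could see a half-finished file. Writing to a temporary file and renaming it into place avoids that, since a rename within one filesystem is atomic. Here's a sketch of that pattern; the `generate` function is just a stand-in so the snippet runs anywhere, where the real cron job would call `php -f /path/to/generate.php`, and the paths are illustrative:

```shell
#!/bin/sh
# Stand-in for the real generator; the actual job would run
# `php -f /path/to/generate.php` here instead.
generate() { printf '<html><body>DPC feeds</body></html>\n'; }

out=/tmp/index.html
# Write to a temp file first, then rename: mv within a single
# filesystem is atomic, so readers always get either the old
# page or the new one, never a partial write.
generate > "$out.tmp" && mv "$out.tmp" "$out"
cat "$out"
```

In the crontab that would look like `* * * * * php -f /path/to/generate.php > /path/to/index.html.tmp && mv /path/to/index.html.tmp /path/to/index.html` - probably overkill for a page only one person is reading, but cheap insurance.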
Job done! (The actual page I made is here, if anyone is inquisitive.)