This is part of my blog, which I have long since stopped maintaining. The page has been preserved in case its content is of any interest. Please go back to the homepage to see the current contents of this site.
One of the reasons why I’ve recently been reducing the number of things I run on the web is the difficulty of it all—not that I’m incapable of running mail servers and Minecraft servers and a dozen websites, but that I can achieve 99% of the benefit with only 10% of the effort by using other services instead.
One of those changes I made was to ditch the Juvia comments system for my blog and replace it with the third-party Disqus. Juvia is unmaintained—in fact, despite my being a Ruby on Rails beginner, several of the most recent commits on the project are mine—and having to fight with Passenger and Rails and RVM every time the server felt like updating a package took its toll.
Though it has simplified my life, Disqus was clearly the wrong choice.
So, I’m moving back to a self-hosted comment system for my blog. I don’t fancy another trip around Ruby Version Hell, so I haven’t moved back to Juvia—instead I’ve settled on a much simpler PHP-based commenting system known as HashOver. Being PHP it runs just about anywhere with minimal hassle, and uses files rather than a database for storage. I’ve tweaked it to look just like the old Juvia comment section, and you can see it in action below.
One up-side of all this is that I’m now getting pretty adept at migrating comments from one system to another, despite there being no official support for most migrations. I fixed Juvia’s “import from Wordpress” function a few years ago, and I wrote a script to dump Juvia’s comments into Disqus’ special WXR format. HashOver doesn’t support automated import from anything, so here we go again: I wrote a program to convert Disqus’ export format into a HashOver file structure. It’s imaginatively called “Disqus to HashOver”, and it’s free to use if you ever need to do the same migration yourself.
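For the curious, the core of such a migration is simple: read the exported comments, group them by the page they belong to, and write them back out as one file per comment. Here’s a minimal sketch of the idea in Python—the element and field names are simplified assumptions for illustration, not Disqus’ exact export schema or HashOver’s exact file layout (the real tool is the “Disqus to HashOver” program linked above):

```python
# Sketch of a Disqus-style-export to flat-file converter.
# Element names here (<thread>, <post>, <author>, ...) are simplified
# assumptions, not the real Disqus or HashOver schemas.
import xml.etree.ElementTree as ET
from pathlib import Path

SAMPLE_EXPORT = """<export>
  <thread id="t1"><link>/blog/post-1/</link></thread>
  <post thread="t1">
    <author>Alice</author>
    <date>2014-01-01T12:00:00</date>
    <message>First!</message>
  </post>
  <post thread="t1">
    <author>Bob</author>
    <date>2014-01-02T09:30:00</date>
    <message>Nice post.</message>
  </post>
</export>"""

def convert(export_xml: str, out_dir: Path) -> list:
    """Write one XML file per comment into a directory per page/thread."""
    root = ET.fromstring(export_xml)
    # Map thread IDs to a filesystem-safe name derived from the page URL.
    threads = {t.get("id"): t.findtext("link").strip("/").replace("/", "-")
               for t in root.iter("thread")}
    written, counters = [], {}
    for post in root.iter("post"):
        page = threads[post.get("thread")]
        counters[page] = counters.get(page, 0) + 1
        # Translate each comment into a small standalone XML document.
        comment = ET.Element("comment")
        for src, dst in (("author", "name"), ("date", "date"), ("message", "body")):
            ET.SubElement(comment, dst).text = post.findtext(src)
        path = out_dir / page / f"{counters[page]}.xml"
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(ET.tostring(comment))
        written.append(path)
    return written
```

Running `convert(SAMPLE_EXPORT, Path("comments"))` would produce `comments/blog-post-1/1.xml` and `2.xml`, which is the general shape of thing a file-based comment system can pick up.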
My blogging history has not been lacking in posts where I consider deleting my Facebook profile. It’s been a common thread throughout that time that Facebook has its advantages (having become my sole practical means of contacting many old friends) and disadvantages (that it is a privacy-devouring monster). In the main, we have been willing to make a deal with the Devil in order to use the vast network of communication possibilities it opens up for us.
After holding a Facebook account for 10 years, and apparently struggling with whether that was a good thing for at least six of them, today I deactivated my account—a temporary measure to see how it goes, before a potential full deletion in the near future.
The last straw for Facebook was, as with the last straw for LinkedIn, a matter of privacy.
Facebook’s “People You May Know” feature is often handy for finding friends-of-friends that you know. Occasionally it recommends someone you kind of know despite not having any common friends, and makes you wonder how the algorithm put two and two together. A little concerning sometimes, but not the end of the world.
Today, Facebook’s recommendations went from “a little creepy” to “compromising friends’ private medical data”.
I shan’t name any names, for obvious reasons, but I have a friend who is currently suffering from a particular medical problem. As part of their treatment, that friend has regular appointments with a medical professional, and in supporting my friend, I’ve previously been in contact with that person as well. The friend isn’t on Facebook at all, citing privacy concerns. I have not mentioned anything about them nor their problem on Facebook. And yet today, Facebook’s blissfully context-free recommendation algorithm started suggesting that I add that medical professional as a friend.
As far as we can tell, what happened is this:
- I exchanged an email with the person, via my GMail address.
- I have GMail set to remember people I email by automatically adding them to my address book.
- I have an Android phone, which automatically syncs my Google contacts, so their email address became stored on my phone.
- I have had, in the past, the Facebook and Messenger apps on my phone. I had granted access to my contacts, so they could set contacts’ photos to their Facebook profile picture.
- The Facebook and/or Messenger apps hoovered up my contacts and sent them to their server.
- The person’s email address matched the one they used for their Facebook account, and so Facebook knew that we had some kind of connection.
Now, it’s not entirely Facebook’s fault. Some of the fault lies with Google, and a not inconsiderable portion of the fault lies with me for not checking apps’ permissions and privacy policies. However, it’s the Facebook part of the puzzle that made the whole thing creepy, and so, that’s the part that has to go.
So farewell, Facebook. You’ve made staying in touch with a lot of friends much easier over the last decade, and for that I’m grateful. And I’ve always known that the price for that was that you’d play fast and loose with my own privacy. But when you start to infringe on the privacy—and potentially the confidential medical information—of one of my most vulnerable friends, you crossed the line.
It’s been five years now since, full of enthusiasm and convinced that SuccessWhale might make it big, I bought myself a server in London somewhere, and moved my web presence over from its previous shared hosting.
I’ve learnt a lot in those years.
I’ve learned that despite the “S” in “SMTP”, running a mail server these days is anything but simple. Fighting viruses and spam is challenging, and in particular DKIM is a hassle to get right. I’ve learned that PGP is a nice idea, but I never did find one person to exchange encrypted mail with, or any real reason to do so.
I’ve learned that LetsEncrypt makes setting up SSL certificates much easier and cheaper, but that configuring Apache to serve 23 HTTPS domains from a single machine with SNI is still difficult.
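(The mechanics, at least, boil down to one `VirtualHost` per domain on port 443, each pointing at its own certificate—SNI lets Apache pick the right one from the hostname the client asks for. The domains and paths below are illustrative, not my real configuration:)

```apache
# Two HTTPS sites on one IP, distinguished by SNI (the ServerName).
# Certificate paths are illustrative, in the layout Let's Encrypt uses.
<VirtualHost *:443>
    ServerName example.com
    DocumentRoot /var/www/example.com
    SSLEngine on
    SSLCertificateFile    /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
</VirtualHost>

<VirtualHost *:443>
    ServerName another-site.example
    DocumentRoot /var/www/another-site.example
    SSLEngine on
    SSLCertificateFile    /etc/letsencrypt/live/another-site.example/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/another-site.example/privkey.pem
</VirtualHost>
```

Multiply that by 23 domains, each with its own renewal schedule, and the difficulty is less any single block than keeping them all straight.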
I’ve learned that Linux Version Hell will bite you at some point, and Ruby Version Hell doubly so. Today the server runs three separate versions of Ruby, because not all my Ruby/Rails sites can be run with the same one. The www-data user has a home directory just for RVM.
I’ve learned that drive-by hackers and DoS attackers will find your server, often within hours of it going online, and that fail2ban and mod-blacklist are the sysadmin’s friends.
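(As an illustration of how little configuration that takes: a minimal fail2ban jail that bans an IP after repeated failed SSH logins looks something like this—the thresholds are just example values, not my real ones:)

```ini
# /etc/fail2ban/jail.local — ban an IP for an hour
# after 5 failed SSH logins within 10 minutes.
[sshd]
enabled  = true
port     = ssh
maxretry = 5
findtime = 600
bantime  = 3600
```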
I’ve learned that Open Source community drama can break stuff you depend on, sometimes without a decent way to recover, and every time you do a package update you risk something not working.
All in all, it’s been a good experience. There have been some frustrating times and late nights fighting with configuration that should just work, but I’ve learned a lot in the process and gained a much greater understanding of how the internet works.
However, I think it might be time to start winding it down. It’s no longer an interesting new experience, it’s just another thing that I have to keep an eye on and ensure it doesn’t break. Life, I find, is better when it’s simpler.
As such I am working up a plan to migrate important things to other services over the coming weeks or months.
- Email, contacts, calendar, file storage and photos I’ve already moved from my server to Google products. I’m no great lover of Google, but having struggled so long with email servers and clients’ spotty DAV support and OwnCloud updates breaking everything, I have to admit it’s so much easier.
- This blog is based on Jekyll and can relatively easily be moved to GitHub Pages. We’d lose HTTPS support, but I don’t think that’s a major issue. ~500MB of images used by it currently sit on files.ianrenton.com but could be brought into the repository. The challenge is the comments, which use my own Juvia server. There’s no migration path from that to anything, and I do want to keep the comments—particularly on the Raspberry Tank pages—as there’s a lot of helpful information there. Disqus is the logical non-self-hosted equivalent, but there’s no automated way of bringing the comments in. Maybe it’s time to migrate the whole site back to WordPress, since hosted solutions are easier to come by.
- Software executables can use GitHub’s “releases” feature and be hosted there next to the source code.
- Everything else on files.ianrenton.com is pretty much junk; bits of it can be shared ad-hoc from Google Drive.
- SuccessWhale, Can I Call It, Daily Promise, A Thousand Words and Westminster Hubble are effectively dead at this point. They can be taken down and replaced with portfolio pages saying “this is what I once made”.
- Rimbaud’s House hasn’t been in use for a while anyway; it was a nice project, but the camera has not proved useful and I do not trust the sensors enough to introduce full automation based on their readings.
A few potential sticking points are:
- I host three WordPress sites for other people. Two are potentially old enough that they can be deleted or replaced with static pages; the other I could migrate to WordPress.com although that would involve paying extra.
- I do use the server as a VPN occasionally when on unsecured WiFi, but almost every critical app and website uses HTTPS these days, so it’s not really essential any more.
“Sea Battle” was a casual 2D real-time strategy game that I put together in a few days back in 2010, and documented in a series of blog posts at the time. It’s lain dormant ever since, but I picked it up again today while bored and made a couple of tweaks.
Six years on, it’s obvious how much my coding style has changed—not only is the formatting dubious and commenting sparse, there’s also a lot of inefficient loops and abuse of global variables. I may change all that in a big refactor at a later date, but for now all I’ve done is a few minimal changes on top of the existing code.
If you played Sea Battle ages ago and fancy trying it again, here’s what to expect:
1) Islands! You now get some randomly-generated islands to break up the wide expanse of blue sea. They’ll be different each time you run the game. Collision detection is based on the old code for detecting collisions with other ships, which is not great, but your ships shouldn’t get stuck behind islands too much. Islands only affect movement, not the firing arcs of weapons.
2) Death list! I originally wanted to give ships randomly-generated names in this update, so you could see something like “Bismarck sank HMS Hood”. However, I couldn’t find a nice way to display them on the play field without adding loads of clutter—maybe one to save for the full-screen 3D version 2. :) The implemented list instead shows which equipment the ships had, e.g. “1.2.3.4” = 1st hull, 2nd weapon, 3rd engine, 4th radar, to give you an idea what your enemy’s current tech level is and what’s working well against what.
3) Equipment changes! I’ve simplified some of the abbreviations for different equipment types so they’re less confusing. Submarine types (SSK and SSN hulls) have been dropped, as it never really made sense to have submarines with 15-inch cannons anyway.
HMS M1, a submarine with a 12-inch cannon. It could only fire one shot before being reloaded, which required it to stay surfaced. Needless to say, it did not see operational service. (Image: Wikimedia)
4) Removed dodgy monotype fonts! Not sure what I was thinking with these really. All fonts have been removed from the source package, the whole UI now just uses your system sans-serif font.
5) Build time rebalancing! Build times used to be dependent on hull size alone. This made it (spoilers!) preferable to research only weapons and radar, and flood the field with quick-to-build ships that did high damage and outranged the enemy so they could get a couple of shots off before dying. (The AI prefers this approach on higher levels too.) Now, although hull still dominates, the other equipment affects the build time too. For example, Hull 1 Weapon 10 Engine 1 Radar 10 used to take 4 seconds to build; it now takes 13.
6) An extra bug fix—playing a new game after a win or loss now resets the world properly.
I’ve held onto my Galaxy S5 for 2½ years, until at last the battery has stopped holding more than 8 hours’ charge, the compass no longer works, and the “metal” paint is starting to peel. I had two requirements for a replacement: it must be the same size or smaller, and its battery must last significantly longer. Ideally also: cheap. Unfortunately, most popular manufacturers seem to have stabilised on 5.5 inches as the ideal screen size, having long forgotten how the tech media mocked the Dell Streak as a “phablet” for its ridiculously huge 5-inch screen a lifetime ago, in 2010. Most smaller phones fit into major manufacturers’ “budget” lines, with poor specifications, including the all-important battery size.
After some investigation, it turned out that plenty of companies are churning out 5-inch and smaller phones with big batteries, mostly for less than £150—here are 309 of them—the only problem is, most of the world has never heard of them.
So, I took the plunge.
Xiaomi, a complete unknown in the western hemisphere, is in fact the world’s 4th largest smartphone manufacturer, so presumably their phones must be of reasonable quality compared to the lucky dip of buying a phone from “vernee” or “Nomu” or “Doogee”. A week or so later, I had in front of me a Xiaomi Redmi 4… something. It’s either called “Prime”, “Pro” or “High Ed.” depending on where you look for it.
It’s… surprisingly nice.
Although it’s a Chinese import, I bought it via Xiaomi’s UK shop, and luckily it came with the “Global ROM” with Google Play Store installed, rather than the Chinese ROM that omits Google services.
Except… there is no Global ROM for this phone.
Additionally, it looked like the default Weather app had been replaced by “SC Weather”—a decently functional replacement, with only a couple of downsides, like consistently using >20% of the battery despite never being opened, and having access to write system settings, install apps, make phone calls…
Ah. I had a fake ROM.
After years of trusting Samsung and Google with all my stuff, I can at least extend the same trust to Xiaomi. But I sure as hell don’t trust whoever installed that ROM on my phone.
And so, the fun began.
Xiaomi helpfully offers ROM downloads for all their phones—and many other people’s phones—on a nice English-language website, and it already had a more recent version of the software that I could upgrade to! Unfortunately, my fake ROM denies all knowledge of over-the-air updates, and even refuses manual updating as well. Bollocks. It’s time for the flashing tool.
Xiaomi’s “Mi PC Suite” comes in two versions: a Chinese one, which actually works, and a helpfully translated English one, which fails to see my phone at all. Why would it? My phone doesn’t exist in English-speaking regions yet. So, here we go.
The internet helpfully recommends downloading the ZIP of the ROM you want, booting your phone into Recovery mode, connecting to the PC, and shift-clicking the flash button in Mi PC Suite, which should let you select the ZIP to flash.
Except, oh no it doesn’t. In the latest version of Mi PC Suite, that opens up a menu of more buttons. And not a word of English among them, nor help from the forums, which don’t seem to have encountered this new version yet. I was on my own.
At long last, I had a phone that runs a real version of its software. Additionally, I can now recognise the Chinese characters for “flash”, “file”, “OK”, “cancel”, and “could not locate…”.
More fun ensues because, of course, I now have a legitimate Chinese ROM. With no Google services. Fortunately, a number of websites helpfully show you how to download “Google Installer” from the Xiaomi Store, and use that to install Google Play Services, Play Store, etc. Unfortunately, Xiaomi (being the good citizens they are) removed the app. So it was off to dubious forums and third-party download sites in search of a copy of Google Installer.
At last I went to bed, now with an up-to-date official ROM, Google services and all, happily installed. The next day, since the phone was on an official software version, a nightly update arrived that bumped the software version significantly and conveniently fixed the majority of the issues I had with the software all in one go.
I’m still waiting on a real Global ROM to appear eventually, so I can have Google Play Services and friends installed as real system apps, and fewer strange Chinese video streaming apps installed. But for now, I have a nice looking and feeling phone, regular software updates, and most importantly of all a battery that’s currently sitting pretty on 90% after 7 hours’ use.
Bye bye Galaxy S5, with which I’d now be on 30% battery and hunting for a charger. I’m keeping my weird Chinese phone.
Three years ago, Google shut down its popular RSS reader web application. The decision angered many users, and I penned a long rant about how horrible proprietary services are as they can be taken away from the users at any time without their consent.
I found the News app for OwnCloud, installed it on my own server, and never looked back.
Updating the version of OwnCloud on my server, to get the latest security patches, has broken the News app permanently.
It turns out that some time ago the OwnCloud development team split acrimoniously and started a rival fork called “NextCloud”. The maintainer of the News app jumped ship, leaving OwnCloud News unmaintained until it eventually broke.
It looks like I now have three options:
- Take over development of an abandoned project, which I am (in terms of both time and experience) ill-equipped to deal with
- Migrate from OwnCloud to NextCloud, a complex process which also involves changing the software I use for file, contacts and calendar synchronisation
- Use a proprietary service like Feedly instead.
As you might imagine, I picked option 3. I was up and running again within five minutes.
It’s enough of a frustrating experience to have me considering the reverse of a post I made years back, considering which proprietary services I should stop using in favour of doing my own thing. Since then I started running my own mail server, as well as OwnCloud, to meet my online needs; I migrated all my websites from Heroku to my own server as well. I learnt a lot—that fighting spam is hard, SPF is hard, maintaining SSL certificates is hard, few clients support CalDav and CardDav properly, and so on.
It’s been an experience, certainly—mostly a good one, or at least an interesting one. But I do wonder, over the years, how much frustration and wasted time I’ve had that could have been saved by dropping my ideological preference for open source software and “DIY”, and accepting that even if they can shut down unexpectedly, some proprietary services are just so much easier.
It’s fifteen years today since I first posted something—specifically, terrible teenage poetry—on what would become my blog. Back then my website was a purple-and-black exhibition of my poor teenage sense of humour, and I started posting snippets of poetry to it under the category of “Thoughts”.
In 2002 I was invited to an up-and-coming site called “LiveJournal”, a perfect platform for sharing my young adult angst and drama for the world to see. At university it became central to our social lives, a foreshadowing of the social network generation that was yet to come.
LiveJournal came and went. By 2009 I was blogging on my own WordPress site and syndicating the posts to LJ, and by 2011 I was reminiscing about what we had lost. In 2013, beset by buggy plugins and security problems with WordPress, I moved to about the nerdiest blog platform imaginable, the static site generator Octopress.
Editing a site this way has its advantages—the end result is fast and secure, and appeals to my geekier tendencies by allowing me to keep it all under version control. But it has its disadvantages too, principally the fact that the site needs a “compile” step before the results can be seen. In recent months the combination of my old PC, 3000+ pages to render, and a few poorly-implemented plugins has resulted in compile times in excess of three minutes. That’s not too bad for a one-off post, but it’s particularly grating when we do Film Review by the Numbers on a Saturday night. Writing the reviews is something of a spontaneous group activity, and when it takes three minutes to see what a change will look like, those minutes feel a lot longer.
A fifteenth anniversary seems like as good a time as any to make some changes, so I’ve been working on some ways to speed up the writing and generating process.
Firstly, I have started doing the simpler editing tasks, like writing a new post, directly on GitHub where the source code lives. Its Markdown editor has a preview function that renders instantly, meaning that for Film Review by the Numbers (and everything else) we can get an approximately-correct rendered page with inline images straight away. I can also commit directly to the repository from there once everything is looking right.
I’ve contemplated using GitHub Pages to host the site directly as well, though its lack of support for SSL certificates and Jekyll plugins rules it out for now. I have, however, started using GitHub’s “webhooks” to trigger an automatic rebuild of the site—on every commit, GitHub pings a script on my server based on marcomarcovic’s simple-php-git-deploy, which updates its local copy of the site, rebuilds it using jekyll, and deploys it to the public directory on the server.
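The chain of steps the webhook triggers—pull, build, deploy—can be sketched like so. This is a hypothetical Python stand-in for the PHP script, not the real thing; the paths and the exact jekyll invocation are assumptions, and the command runner is injectable so the sequence is easy to exercise without touching git or jekyll:

```python
# Sketch of a webhook-triggered deploy: pull the repo, rebuild the site,
# publish it. A hypothetical stand-in for the PHP deploy script described
# above; REPO_DIR, PUBLIC_DIR and the commands are illustrative assumptions.
import subprocess

REPO_DIR = "/srv/blog"        # local clone of the site's repository
PUBLIC_DIR = "/var/www/blog"  # directory the web server actually serves

DEPLOY_STEPS = [
    ["git", "-C", REPO_DIR, "pull", "--ff-only"],
    ["jekyll", "build", "--source", REPO_DIR, "--destination", PUBLIC_DIR],
]

def deploy(run=subprocess.run):
    """Run each deploy step in order, stopping at the first failure.

    `run` defaults to subprocess.run but can be swapped for a fake in
    tests. Returns the list of commands that were attempted.
    """
    attempted = []
    for step in DEPLOY_STEPS:
        attempted.append(step)
        if run(step).returncode != 0:
            break  # don't publish a half-built site
    return attempted
```

A webhook endpoint then only needs to call `deploy()` whenever GitHub POSTs a push event, which is essentially what the PHP script does.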
With it all configured, I can now keep my fast and secure static site, while also regaining some of the ease of a web-based editor that I miss from the WordPress days. I can also sensibly blog on the move from my phone or tablet, without having to open up a command-line console every time.
This is my first test, and if you’re reading this, I guess it works!
Bournemouth Gardens is packed on a sunny Sunday afternoon. Shoppers bustle past, teenagers play on the grass, but today more than usual their gaze is directed downwards at their phones. Kids, adults, old and young; cyclists, bus drivers and big hairy bikers all alike. In a parallel universe, the place is dotted with spinning cubes and buried under a thick drift of cherry petals.
“That’s a lot of lures, I wonder if the Gardens will be… yep.” Everyone in this picture is playing Pokemon.
So far today we have covered six miles in a morning and an afternoon walk, and the child is still keen to go out again. We have levelled up twice and caught more than a hundred Pokemon. We’ve heard the strange ripple of people talking about a Squirtle when one showed up, like the chatter about a PvE boss spawn weirdly transposed into real life.
We’ve had all kinds of conversations that result from fully half of people you pass doing exactly the same as you—the two kids whose gym we took and then felt guilty; the dad whose girls were just picking their team while they walked the dog; and the 53-year-old woman whom we taught to battle, while she showed us pictures of the Ninetales she saw but couldn’t catch.
The mighty Duckbot 2000 held the Pier for ten minutes against the horde of rival team players, and by the time we headed home, our battery pack was coming in very handy.
More than anything, in increasing order of importance: we have had a good day; all nearby gyms now belong to Team Valor; and we have taken the hopes of the local children and MERCILESSLY CRUSHED THEM.
A successful day all round, I think you’ll agree.
When I was born, thirty-one years ago, the UK was in the middle of some tough economic times. The value of the pound was low, and interest rates were high. But in the intervening years, despite the recession at the end of the 2000s, the overall trend was up. People were getting richer, quality of life was increasing, and the country was cooperating ever more with its neighbours.
Over the last eight hours, much of that was undone.
“Hackerspaces”, or “Makerspaces”, are very much an idea whose time has come. The analogy I liked to use most was that of a “community garden shed”—they are places run by the community, where any member can come along and work on their personal projects and collaborate with others.
This is the story of the Dorset Constructorium, a hackerspace that never quite made it.
Note: This is our story as I remember it, published in case others interested in starting a hackerspace of their own find some use in it. I welcome additions and corrections from other members of the group with a better memory of what happened when than mine. I’ve also left out people’s names for now; let me know if you are happy for me to use your name.
Our group began in the Spring of 2013, the way things do—a couple of friends sat around a table and decided they should start a hackerspace. We came up with a goofy but catchy name, “The Dorset Constructorium”. We started a mailing list using Google Groups, which slowly grew to 40 or 50 subscribers, and an IRC channel that hit around a dozen. Throughout that year we used them to organise some meetings in pubs in Bournemouth and Wimborne, where we chatted and drank and discussed how we could move on from our current arrangements to become a real hackerspace.
Evenings down the pub were all well and good, but we couldn’t be a hackerspace without a space.
We started looking, making calls and sending emails to likely groups: companies, community centres, halls for hire. We filled up a spreadsheet with 20 or 30 possibilities, listing the advantages and disadvantages of each, but none were perfect. Many were simply too expensive for a small group to afford. Others were within our price range, but offered no permanent storage for our tools and equipment. Others still were too far away, or too concerned about the safety of the work we’d be doing.
We knew we’d have to get more organised, manage money, and get insurance. Our original (somewhat naïve) plan was to be somewhat of a free-for-all in terms of structure, where we were all equals and did everything as a group; but we were moving into a world of bank accounts and insurers who would want names and signatures on their forms. We formed a committee—President, Treasurer and myself as Secretary.
Around that time, one of our members arranged for us to team up with AdidoSrc, a group given space, pizza and booze by web development company Adido. We joined forces for three mid-week evenings in late 2013, before Adido suddenly pulled the plug—we were without a home almost as soon as we had found one.
We carried on looking. Soon we had free web hosting provided for us by Bitfolk and developed our new Wiki into a place for us to share ideas and coordinate. We wrote a Constitution and a Code of Conduct—we were getting serious.
2013 rolled into 2014, and the Constructorium strengthened its ties with the local Rep Rap Users’ Group, by then known as MakeBournemouth.
Local café and “creative hub” Makers Inc opened around this time, and MakeBournemouth started running some themed evenings there, where people came along to build a certain kit together. The Dorset Constructorium joined in for four events… before the cafe closed, and we were homeless once more.
In the ensuing downtime we expanded our web presence again, putting up a nicer-looking Wordpress site to show visitors what we were all about, along with an online calendar for scheduling events, a Facebook page and a Twitter account.
Still holding out for a real space to call home, one of our members offered the services of their garden shed. After shifting out a decade’s worth of junk, we moved some of our tools in, and christened it the “Hack Shack”. Although it was and still is the closest we’d come to a hackerspace of our own, and we offered it four days a week for free, it wasn’t what our members had in mind; it never saw much use.
2014 became 2015, and our last hope came in the form of our local library. While MakeBournemouth contemplated going the big-budget route—allowing members to work on commercial projects, charging more, and affording a space at full commercial rates—the Constructorium tried our luck with the opposite, declaring ourselves strictly non-commercial and aiming for a discounted or free space by pushing the community/charity angle.
The library allowed us use of a back room to get started, and we had some excited conversations with the head librarian about the library getting 3D printers and our group running soldering and Raspberry Pi coding courses.
We had big plans, but by then attendance at our events was waning. Our meetings in the back room of the library averaged only four of us, and we never found the time or the confidence to offer courses. Our Facebook page attracted some interest, but we were never able to provide the organised experience that new visitors were expecting. Before long, we stopped meeting there, and for the third time in three years, we were without a place to meet.
And that, as they say, was that.
The IRC channel became abandoned, and mailing list posts dropped off to zero. The President moved to another town, and we haven’t heard from the Treasurer in some time. My job has got busier, and what little time and energy I had has dwindled further. Unless any member of the group wants to take it on from here, I think it’s about time to call the Dorset Constructorium to a close.
To all the members that made our group great over the years: thanks for the memories.