Blog

    Another Shenzhen Special:
    15 Minutes with the MAFAM M11

    This is a post from my blog, which is (mostly) no longer available online. This page has been preserved because it was linked to from somewhere, or got regular search hits, and therefore may be useful to somebody.

    Thanks to an excitingly mis-sold internet purchase, we are now in possession of the latest in exciting mobile technology – the MAFAM M11. (If you’re mentally incapacitated, you could have one of your very own for 24 whole US dollars!)

    Let’s jump in and see what this technological marvel has in store.

    Four (4) SIM cards! Four! Ideal for the drug dealer in your life. Utterly bizarre for everyone else!

    Also: WhatsApp. Really? OK, say I believe you. Let’s try this. Unboxing!

    Pretty standard, except those four SIM slots! Two are mini-SIM, two are micro-SIM. Now I normally wouldn’t even bother with these, especially not trying to align my nano-SIM card in the slot, but it looks like nowhere on this phone’s spec sheet does it mention WiFi, so if we’re going to WhatsApp on this thing we’re going to need to get our GPRS on.

    OK, let’s get started. No startup jingle, so that’s a bonus. The company logo isn’t centered on the screen though, which, as we’ll see, is emblematic of the quality assurance standards throughout this phone’s design.

    And before we carry on, take a moment to enjoy this menu. “Wha…”!

    OK, time to get this thing online. Helpfully, you can set it to dial up to the internet automatically. Advanced! But first, the default APN settings – usefully named using characters it can’t display – need to be replaced with my real ones.

    Typing text on a numeric keypad was just great, wasn’t it? Remember that? Remember how nice it was to not do that anymore?

    Right, let’s see how the internet experience looks on this phone.

    Google from the browser? Kind of… early 2000s feel here, but still usable.

    That Facebook icon on the home screen? It opens a browser window just like you expected. It’s Facebook’s mobile site, and it just about works.

    And finally, WhatsApp. That service that doesn’t work at all unless you have an Android or Apple phone. Let’s see how they managed this!

    It… opens in a web browser. With helpful download links for your Android or Apple phone. No functionality whatsoever. Not… really… deserving the “Whatsapp” logo on the box, and the “Wha…” icon with pride of place in the menu.

    Alright! Let’s try out the camera. Its quality settings menu includes “Low”, “Medium” and “Advanced”, so naturally we’ll see where “Advanced” gets us.

    Embedded below are a daylight and a low-light shot for comparison, both at the camera’s native, er, 320 by 240 resolution. As you can see, better quality pictures can be obtained from the average potato, though astoundingly the low-light shot actually looks better.

    And to finish up, let’s check some of the other exciting features this phone has.

    First up: Filearray! What is a filearray? We’ll never know, because it’s empty and no option on the phone seems to actually save a file here.

    Next: Magic sound! I can select the Magic Sound of a man, woman, young, old, child or cartoon!

    The phone produces no sound when any of these options are selected.

    We get games though! Check out Sokoban, and this file named Bubble_Bobble_320x240.jar which can’t be run because the phone doesn’t have any storage.

    And that’s a wrap! Don’t buy this.

    MAFAM? Nah, fam.

    SuccessWhale.com Discontinued as of Today

    As far as I know, SuccessWhale is not being actively used by anyone any more, so I have chosen not to renew the domain name successwhale.com when it expires today. Like most of my past web-based projects, it will continue to live on at an onlydreaming.net subdomain, in this case sw.onlydreaming.net, but will not be actively maintained there.  As well as its graphical web interface, SuccessWhale also has a back-end API that used to run on a SuccessWhale subdomain. This has now moved to https://successwhale-api.herokuapp.com/. The OnoSendai Android client already uses this address for the API as of update 479, so you may need to update.

    Thank you to all the SuccessWhale users over the years!

    Automating the Roast Dinner Timing Chart

    Against all my expectations, the most popular page on this website (at least, the most visited) turns out to be “The Great Roast Dinner Timing Chart”, which was my attempt to help newbies at the revered British art of the Roast Dinner get their timings right. I first posted it over eight years ago—in the intervening time I have cooked a lot of roasts and tweaked my timings a bit, so it was in need of an update.

    I’ve also cooked roast dinners to serve at all sorts of different times, with variations on the ingredients, so I thought the old list to cook certain things for a fixed 7pm dishing-up time could use some improvements too. To that end, I’ve converted it from a static list to an automatically generated one that users can play with. Visitors to the page can now choose their meat and its weight, the accompaniments and the serving time, and the page will do some JavaScript magic to generate a chart just for them.
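The generated chart boils down to working backwards from the serving time: each item's start time is the serving time minus its cooking duration, and the results are sorted into order. A minimal sketch of that back-calculation (item names and durations below are illustrative placeholders, not the chart's real data):

```javascript
// Sketch of the back-calculation behind an auto-generated timing chart.
// Durations here are illustrative placeholders, not the chart's real values.
function timingChart(servingTime, items) {
  // servingTime: minutes since midnight, e.g. 7pm -> 19 * 60 = 1140
  return items
    .map(({ name, minutes }) => ({ name, start: servingTime - minutes }))
    .sort((a, b) => a.start - b.start)
    .map(({ name, start }) => {
      const h = Math.floor(start / 60);
      const m = String(start % 60).padStart(2, '0');
      return `${h}:${m} - put on the ${name}`;
    });
}

const chart = timingChart(19 * 60, [
  { name: 'chicken (1.5 kg)', minutes: 90 },
  { name: 'roast potatoes', minutes: 60 },
  { name: 'carrots', minutes: 20 },
]);
// chart[0] is the earliest step: "17:30 - put on the chicken (1.5 kg)"
```

The real page also scales the meat's duration by its weight, but the weight lookup is just another input to the same subtraction.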

    If you’re interested, you can check out the source code on GitHub. It’s not much and not very complicated, but free for anyone to use and modify.

    If you prefer the original non-automated version of the Roast Dinner Timing Chart, or don’t have JavaScript available, you can still get to the old version here.

    Hopefully it will help someone on the way to cooking their best roast dinner yet!

    Migrating from Jekyll to WordPress

    The final, and most difficult, part of the plan to wind down some of the more complex stuff I do on the internet was the migration of this site from Jekyll and Hashover to WordPress. It’s a decision I took with some trepidation, as I well remember ditching my old WordPress site for Jekyll (via Octopress) four years ago and enjoying the speed and security it brought.

    However, the workflow is what killed it. The typical “By the Numbers” film review is a shared activity with friends around the TV, which doesn’t lend itself to being sat at a desk at the only computer of mine that can reasonably compile the Jekyll site. I switched to hosting the site on GitHub Pages and just editing the pages myself in a browser window, but uploading and linking images was still a multi-step, non-WYSIWYG game of making sure the URLs were all right, followed by a three-minute compile stage where everyone waits to read the finished article and I have to explain why it isn’t ready yet.

    Comments were worse still. I staggered between Juvia, a discontinued Rails application that I was out of my depth maintaining by myself, Disqus which “just worked” but put visitors off commenting, and HashOver where fighting spam involved finding offending new comments via ssh.

    Back on WordPress, things may be slower and security more of a concern, but comments are natively supported, I can drag images in, and preview posts on the fly. No compile stage!

    All in all the process took about 10 hours. If you’re contemplating a similar step, here are some useful hints, as, unlike the reverse move, migrating in this direction seems to be a rare activity:

    1. This post was really useful, and RSS export/import does seem to be the best way of moving the main post data across. I moved my pages (20 of them) manually, and used the RSS method for my ~1000 posts.
    2. My Jekyll site had three levels of taxonomy – Collection, Category, and Tag. I believe it’s possible to create the same taxonomies in WordPress, but I didn’t bother. I moved one collection at a time, merged Jekyll’s categories and tags into WordPress’ tags, then Jekyll collections became WordPress categories.
    3. The RSS importer can’t import tags, only categories, so I imported everything as categories and used a “Categories to Tags converter” plugin to sort the mess out.
    4. The Disqus comment importer script from the page linked above worked well for importing a Disqus export, with the exception that Disqus comment and thread IDs are now greater than 2^32. The importer uses these as keys into maps, so I had to subtract arbitrary large numbers from the IDs (which PHP is perfectly happy to do) in order to make them usable as integer keys.
    5. I had a lot of files such as pictures in arbitrary locations which Jekyll is happy to deal with, but WordPress is not. I moved everything into “wp-content/uploads/” with some .htaccess redirects so that they can still be found.
    6. Recreating the Jekyll theme wasn’t too hard, although it took around half the total time. When I first moved away from WordPress its themes were a messy mystery to me, but with four years’ more experience, I can see the parallels between my Jekyll templates and WordPress’ ones, and the transition went very smoothly.
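The ID workaround in step 4 is just a fixed offset applied before the values are used as map keys. The post did this in PHP; the same idea sketched in JavaScript (the offset value below is illustrative, as the original was arbitrary anyway):

```javascript
// Disqus comment and thread IDs can exceed 2^32; the importer used them as
// integer map keys, so the fix was to subtract a large arbitrary offset
// first. The post did this in PHP; this offset is illustrative only.
const OFFSET = 5000000000n;

function toKey(disqusId) {
  // Work in BigInt for the subtraction, then convert back to Number:
  // the shifted value fits comfortably within the 2^53 safe-integer range.
  return Number(BigInt(disqusId) - OFFSET);
}

const key = toKey('5000001234'); // 1234
```

The only requirement is that the offset leaves the shifted IDs positive and unique, which any constant smaller than the lowest ID satisfies.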

    Although it’s been a few days of working late into the night, I’m happy to say it’s now done. Hopefully the blog will be easier to manage from here on.

    No compile stage!

    Farewell to Facebook

    My blogging history has not been lacking in posts where I consider deleting my Facebook profile. It’s been a common thread throughout that time that Facebook has its advantages (having become my sole practical means of contacting many old friends) and disadvantages (that it is a privacy-devouring monster). In the main, we have been willing to make a deal with the Devil in order to use the vast network of communication possibilities it opens up for us.

    After holding a Facebook account for 10 years, and apparently struggling with whether that was a good thing for at least six of them, today I deactivated my account—a temporary measure to see how it goes, before a potential full deletion in the near future.

    The last straw for Facebook was, as with the last straw for LinkedIn, a matter of privacy.

    Facebook’s “People You May Know” feature is often handy for finding friends-of-friends that you know. Occasionally it recommends someone you kind of know despite not having any common friends, and makes you wonder how the algorithm put two and two together. A little concerning sometimes, but not the end of the world.

    Today, Facebook’s recommendations went from “a little creepy” to “compromising friends’ private medical data”.

    I shan’t name any names, for obvious reasons, but I have a friend who is currently suffering from a particular medical problem. As part of their treatment, that friend has regular appointments with a medical professional, and in supporting my friend, I’ve previously been in contact with that person as well. The friend isn’t on Facebook at all, citing privacy concerns. I have not mentioned anything about them nor their problem on Facebook. And yet today, Facebook’s blissfully context-free recommendation algorithm started suggesting that I add that medical professional as a friend.

    NOPE

    As far as we can tell, what happened is this:

    1. I exchanged an email with the person, via my GMail address.
    2. I have GMail set to remember people I email by automatically adding them to my address book.
    3. I have an Android phone, which automatically syncs my Google contacts, so their email address became stored on my phone.
    4. I have had, in the past, the Facebook and Messenger apps on my phone. I had granted them access to my contacts, so they could set contacts’ photos to their Facebook profile pictures.
    5. The Facebook and/or Messenger apps hoovered up my contacts and sent them to their server.
    6. The person’s email address matched the one they used for their Facebook account, and so Facebook knew that we had some kind of connection.

    Now, it’s not entirely Facebook’s fault. Some of the fault lies with Google, and a not inconsiderable portion of the fault lies with me for not checking apps’ permissions and privacy policies. However, it’s the Facebook part of the puzzle that made the whole thing creepy, and so, that’s the part that has to go.

    So farewell, Facebook. You’ve made staying in touch with a lot of friends much easier over the last decade, and for that I’m grateful. And I’ve always known that the price for that was that you’d play fast and loose with my own privacy. But when you started to infringe on the privacy—and potentially the confidential medical information—of one of my most vulnerable friends, you crossed the line.

    I’m done.

    Planning the Wind-Down

    It’s been five years now since, full of enthusiasm and convinced that SuccessWhale might make it big, I bought myself a server in London somewhere, and moved my web presence over from its previous shared hosting.

    I’ve learned a lot in those years.

    I’ve learned that despite the “S” in “SMTP”, running a mail server these days is anything but simple. Fighting viruses and spam is challenging, and DKIM in particular is a hassle to get right. I’ve learned that PGP is a nice idea, but I never did find one person to exchange encrypted mail with, or any real reason to do so.

    I’ve learned that Let’s Encrypt makes setting up SSL certificates much easier and cheaper, but that configuring Apache to serve 23 HTTPS domains from a single machine with SNI is still difficult.

    I’ve learned that Linux Version Hell will bite you at some point, and Ruby Version Hell doubly so. Today the server runs three separate versions of Ruby, because not all my Ruby/Rails sites can be run with the same one. The www-data user has a home directory just for .rvm.

    I’ve learned that drive-by hackers and DoS attackers will find your server, often within hours of it going online, and that fail2ban and mod-blacklist are the sysadmin’s friends.

    I’ve learned that Open Source community drama can break stuff you depend on, sometimes without a decent way to recover, and every time you do a package update you risk something not working.

    All in all, it’s been a good experience. There have been some frustrating times and late nights fighting with configuration that should just work, but I’ve learned a lot in the process and gained a much greater understanding of how the internet works.

    However, I think it might be time to start winding it down. It’s no longer an interesting new experience, it’s just another thing that I have to keep an eye on and ensure it doesn’t break. Life, I find, is better when it’s simpler.

    As such I am working up a plan to migrate important things to other services over the coming weeks or months.

    • Email, contacts, calendar, file storage and photos I’ve already moved from my server to Google products. I’m no great lover of Google, but having struggled so long with email servers and clients’ spotty DAV support and OwnCloud updates breaking everything, I have to admit it’s so much easier.
    • This blog is based on Jekyll and can relatively easily be moved to GitHub Pages. We’d lose HTTPS support, but I don’t think that’s a major issue. ~500MB of images used by it currently sit on files.ianrenton.com but could be brought into the repository. The challenge is the comments, which use my own Juvia server. There’s no migration path from that to anything, and I do want to keep the comments—particularly on the Raspberry Tank pages—as there’s a lot of helpful information there. Disqus is the logical non-self-hosted equivalent, but there’s no automated way of bringing the comments in. Maybe it’s time to migrate the whole site back to WordPress, since hosted solutions are easier to come by.
    • Software executables can use GitHub’s “releases” feature and be hosted there next to the source code.
    • Everything else on files.ianrenton.com is pretty much junk; bits of it can be shared ad hoc from Google Drive.
    • SuccessWhale, Can I Call It, Daily Promise, A Thousand Words and Westminster Hubble are effectively dead at this point. They can be taken down and replaced with portfolio pages saying “this is what I once made”.
    • Rimbaud’s House hasn’t been in use for a while anyway; it was a nice project, but the camera has not proved useful and I do not trust the sensors enough to introduce full automation based on their readings.
    • Random little things like Marvellator, Terrible Fanfiction Idea Generator and the Business Processes Wiki can be converted to static pages, with JavaScript instead of PHP providing the active content, and hosted somewhere like GitHub Pages.

    A few potential sticking points are:

    • I host three WordPress sites for other people. Two are potentially old enough that they can be deleted or replaced with static pages; the other I could migrate to WordPress.com, although that would involve paying extra.
    • My son’s Minecraft server is turned off 99% of the time, but if we do want to bring it back at some point, losing the server would mean losing that capability.
    • I do use the server as a VPN occasionally when on unsecured WiFi, but almost every critical app and website uses HTTPS these days, so it’s not really essential any more.

    A Sea Battle Update?!

    “Sea Battle” was a casual 2D real-time strategy game that I put together in a few days back in 2010, and documented in a series of blog posts at the time. It’s lain dormant ever since, but I picked it up again today while bored and made a couple of tweaks.

    Six years on, it’s obvious how much my coding style has changed—not only is the formatting dubious and the commenting sparse, there are also a lot of inefficient loops and plenty of abuse of global variables. I may change all that in a big refactor at a later date, but for now all I’ve done is a few minimal changes on top of the existing code.

    If you played Sea Battle ages ago and fancy trying it again, here’s what to expect:

    Annotated screenshot showing what's new

    1) Islands! You now get some randomly-generated islands to break up the wide expanse of blue sea. They’ll be different each time you run the game. Collision detection is based on the old code for detecting collisions with other ships, which is not great, but your ships shouldn’t get stuck behind islands too much. Islands only affect movement, not the firing arcs of weapons.
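A ship-vs-ship check of the kind being reused here is typically a simple bounding-circle overlap test. A minimal sketch of that idea (the function and field names are illustrative, not Sea Battle's actual code):

```javascript
// Simple bounding-circle overlap test, the kind of ship-vs-ship check the
// post describes reusing for islands. Illustrative, not the game's code.
function collides(a, b) {
  // a, b: { x, y, radius }
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  return Math.hypot(dx, dy) < a.radius + b.radius;
}

collides({ x: 0, y: 0, radius: 5 }, { x: 3, y: 4, radius: 1 }); // true: distance 5 < 6
```

Treating an irregular island as one big circle is exactly why such a check is "not great": ships steer around a circle that only loosely matches the coastline.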

    2) Death list! I originally wanted to give ships randomly-generated names in this update, so you could see something like “Bismarck sank HMS Hood”. However, I couldn’t find a nice way to display them on the play field without adding loads of clutter—maybe one to save for the full-screen 3D version 2. 🙂 The implemented list instead shows which equipment the ships had, e.g. “1.2.3.4” = 1st hull, 2nd weapon, 3rd engine, 4th radar, to give you an idea what your enemy’s current tech level is and what’s working well against what.

    3) Equipment changes! I’ve simplified some of the abbreviations for different equipment types so they’re less confusing. Submarine types (SSK and SSN hulls) have been dropped, as it never really made sense to have submarines with 15-inch cannons anyway.

    HMS M1

    HMS M1, a submarine with a 12-inch cannon. It could only fire one shot before being reloaded, which required it to stay surfaced. Needless to say, it did not see operational service. (Image: Wikimedia)

    4) Removed dodgy monotype fonts! Not sure what I was thinking with these really. All fonts have been removed from the source package, the whole UI now just uses your system sans-serif font.

    5) Build time rebalancing! Build times used to be dependent on hull size alone. This made it (spoilers!) preferable to research only weapons and radar, and flood the field with quick-to-build ships that did high damage and outranged the enemy so they could get a couple of shots off before dying. (The AI prefers this approach on higher levels too.) Now, although hull still dominates, the other equipment affects the build time too. For example, Hull 1 Weapon 10 Engine 1 Radar 10 used to take 4 seconds to build; it now takes 13.
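For illustration only, here is one formula consistent with the two quoted times. The post doesn't give the game's actual formula, so this is a guess at its shape, not the real thing:

```javascript
// A guessed build-time formula matching the quoted numbers: hull dominates
// at 4 seconds per level, and every other equipment level above the minimum
// adds half a second. Illustrative only, not Sea Battle's real formula.
function buildTime(hull, weapon, engine, radar) {
  return 4 * hull + 0.5 * (weapon + engine + radar - 3);
}

buildTime(1, 1, 1, 1);   // 4 seconds: matches the old hull-only time
buildTime(1, 10, 1, 10); // 13 seconds, as quoted above
```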

    6) An extra bug fix—playing a new game after a win or loss now resets the world properly.

    If you fancy a go, head to the game’s page where you’ll find instructions and download links. As always, the source is on GitHub.

    Have fun!

    The Open Source Disadvantage

    Three years ago, Google shut down its popular RSS reader web application. The decision angered many users, and I penned a long rant about how horrible proprietary services are as they can be taken away from the users at any time without their consent.

    I found the News app for OwnCloud, installed it on my own server and never looked back.

    Until today.

    Updating the version of OwnCloud on my server, to get the latest security patches, has broken the News app permanently.

    It turns out that some time ago the OwnCloud development team split acrimoniously and started a rival fork called “NextCloud”. The maintainer of the News app jumped ship, leaving OwnCloud News unmaintained until it eventually broke.

    It looks like I now have three options:

    1. Take over development of an abandoned project, which I am (in terms of both time and experience) ill-equipped to deal with.
    2. Migrate from OwnCloud to NextCloud, a complex process which also involves changing the software I use for file, contacts and calendar synchronisation.
    3. Use a proprietary service like Feedly instead.

    As you might imagine, I picked option 3. I was up and running again within five minutes.

    It’s enough of a frustrating experience to have me considering the reverse of a post I made years back, in which I weighed up which proprietary services I should stop using in favour of doing my own thing. Since then I started running my own mail server, as well as OwnCloud, to meet my online needs; I migrated all my websites from Heroku to my own server as well. I learnt a lot—that fighting spam is hard, SPF is hard, maintaining SSL certificates is hard, few clients support CalDAV and CardDAV properly, and so on.

    It’s been an experience, certainly—mostly a good one, or at least an interesting one. But I do wonder, over the years, how much frustration and wasted time I’ve had that could have been saved by dropping my ideological preference for open source software and “DIY”, and accepting that even if they can shut down unexpectedly, some proprietary services are just so much easier.

    How I Blog Now

    It’s fifteen years today since I first posted something—specifically, terrible teenage poetry—on what would become my blog. Back then my website was a purple-and-black exhibition of my poor teenage sense of humour, and I started posting snippets of poetry to it under the category of “Thoughts”.

    Mad Marmablue Web Portal, circa 2001

    In 2002 I was invited to an up-and-coming site called “LiveJournal”, a perfect platform for sharing my young adult angst and drama for the world to see. At university it became central to our social lives, a foreshadowing of the social network generation that was yet to come.

    LiveJournal came and went. By 2009 I was blogging on my own WordPress site and syndicating the posts to LJ, and by 2011 I was reminiscing about what we had lost. In 2013, beset by buggy plugins and security problems with WordPress, I moved to about the nerdiest blog platform imaginable, the static site generator Octopress.

    My Octopress blog, circa 2013

    Editing a site this way has its advantages—the end result is fast and secure, and appeals to my geekier tendencies by allowing me to keep it all under version control. But it has its disadvantages too, principally the fact that the site needs a “compile” step before the results can be seen. In recent months the combination of my old PC, 3000+ pages to render, and a few poorly-implemented plugins has resulted in compile times in excess of three minutes. That’s not too bad for a one-off post, but it’s particularly grating when we do Film Review by the Numbers on a Saturday night. Writing the reviews is something of a spontaneous group activity, and when it takes three minutes to see what a change will look like, those minutes feel a lot longer.

    A fifteenth anniversary seems like as good a time as any to make some changes, so I’ve been working on some ways to speed up the writing and generating process.

    Firstly, I have started doing the simpler editing tasks, like writing a new post, directly on GitHub where the source code lives. Its Markdown editor has a preview function that renders instantly, meaning that for Film Review by the Numbers (and everything else) we can get an approximately-correct rendered page with inline images straight away. I can also commit directly to the repository from there once everything is looking right.

    I’ve contemplated using GitHub Pages to host the site directly as well, though its lack of support for SSL certificates and Jekyll plugins rules it out for now. I have, however, started using GitHub’s “webhooks” to trigger an automatic rebuild of the site—on every commit, GitHub pings a script on my server based on markomarkovic’s simple-php-git-deploy, which updates its local copy of the site, rebuilds it using jekyll, and deploys it to the public directory on the server.
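The rebuild described here reduces to a fixed sequence of shell commands: pull the repo, then build into the public directory. A sketch of that sequence (the paths are illustrative, and the real script was PHP, based on simple-php-git-deploy):

```javascript
// The webhook-triggered rebuild described above, reduced to its command
// sequence: pull the repository, then rebuild with jekyll straight into
// the public directory. Paths are illustrative; the real script was PHP.
function rebuildCommands(siteDir, publicDir) {
  return [
    `git -C ${siteDir} pull`,
    `jekyll build --source ${siteDir} --destination ${publicDir}`,
  ];
}

// A handler would verify the webhook request first, then run each command
// in order (e.g. with child_process.execSync), aborting on the first failure.
const cmds = rebuildCommands('/srv/blog', '/var/www/blog');
```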

    With it all configured, I can now keep my fast and secure static site, while also regaining some of the ease of a web-based editor that I miss from the WordPress days. I can also sensibly blog on the move from my phone or tablet, without having to open up a command-line console every time.

    This is my first test, and if you’re reading this, I guess it works!

    The Constructorium Story

    “Hackerspaces”, or “makerspaces”, are very much an idea whose time has come. The analogy I liked to use most was that of a “community garden shed”—they are places run by the community, where any member can come along and work on their personal projects and collaborate with others.

    This is the story of the Dorset Constructorium, a hackerspace that never quite made it.

    Note: This is our story as I remember it, published in case others interested in starting a hackerspace of their own find some use in it. I welcome additions and corrections from other members of the group with a better memory of what happened when than mine. I’ve also left out people’s names for now; let me know if you are happy for me to use your name.

    Our group began in the Spring of 2013, the way things do—a couple of friends sat around a table and decided they should start a hackerspace. We came up with a goofy but catchy name, “The Dorset Constructorium”. We started a mailing list using Google Groups, which slowly grew to 40 or 50 subscribers, and an IRC channel that hit around a dozen. Throughout that year we used them to organise some meetings in pubs in Bournemouth and Wimborne, where we chatted and drank and discussed how we could move on from our current arrangements to become a real hackerspace.

    Evenings down the pub were all well and good, but we couldn’t be a hackerspace without a space.

    We started looking, making calls and sending emails to likely groups: companies, community centres, halls for hire. We filled up a spreadsheet with 20 or 30 possibilities, listing the advantages and disadvantages of each, but none were perfect. Many were simply too expensive for a small group to afford. Others were within our price range, but offered no permanent storage for our tools and equipment. Others still were too far away, or too concerned about the safety of the work we’d be doing.

    We knew we’d have to get more organised, manage money, and get insurance. Our original (somewhat naïve) plan was to be something of a free-for-all in terms of structure, where we were all equals and did everything as a group; but we were moving into a world of bank accounts and insurers who would want names and signatures on their forms. We formed a committee—President, Treasurer and myself as Secretary.

    The Dorset Constructorium at AdidoSrc

    Around that time, one of our members offered us the chance to team up with AdidoSrc, a group given space, pizza and booze by web development company Adido. We joined forces for three mid-week evenings in late 2013, before Adido suddenly pulled the plug—we were without a home almost as soon as we had found one.

    We carried on looking. Soon we had free web hosting provided for us by Bitfolk and developed our new Wiki into a place for us to share ideas and coordinate. We wrote a Constitution and a Code of Conduct—we were getting serious.

    2013 rolled into 2014, and the Constructorium strengthened its ties with the local RepRap Users’ Group, by then known as MakeBournemouth.

    Constructorium and MakeBournemouth at Makers Inc.

    Local café and “creative hub” Makers Inc opened around this time, and MakeBournemouth started running some themed evenings there, where people came along to build a certain kit together. The Dorset Constructorium joined in for four events… before the café closed, and we were homeless once more.

    In the ensuing downtime we expanded our web presence again, putting up a nicer-looking WordPress site to show visitors what we were all about, along with an online calendar for scheduling events, a Facebook page and a Twitter account.

    Hack Shack Interior

    While we were still holding out for a real space to call home, one of our members offered the services of their garden shed. After shifting out a decade’s worth of junk, we moved some of our tools in and christened it the “Hack Shack”. Although it was, and still is, the closest we’d come to a hackerspace of our own, and we offered it four days a week for free, it wasn’t what our members had in mind; it never saw much use.

    2014 became 2015, and our last hope came in the form of our local library. While MakeBournemouth contemplated going the big-budget route—allowing members to work on commercial projects, charging more, and affording a space at full commercial rates—the Constructorium tried our luck with the opposite, declaring ourselves strictly non-commercial and aiming for a discounted or free space by pushing the community/charity angle.

    The library allowed us use of a back room to get started, and we had some excited conversations with the head librarian about the library getting 3D printers and our group running soldering and Raspberry Pi coding courses.

    Joseph at the Library

    We had big plans, but by then attendance at our events was waning. Our meetings in the back room of the library averaged only four of us, and we never found the time or the confidence to offer courses. Our Facebook page attracted some interest, but we were never able to provide the organised experience that new visitors were expecting. Before long, we stopped meeting there, and for the third time in three years, we were without a place to meet.

    And that, as they say, was that.

    The IRC channel was abandoned, and mailing list posts dropped off to zero. The President moved to another town, and we haven’t heard from the Treasurer in some time. My job has got busier, and what little time and energy I had has dwindled further. Unless any member of the group wants to take it on from here, I think it’s about time to call the Dorset Constructorium to a close.

    To all the members that made our group great over the years: thanks for the memories.