Some Notes on Migrations

This post will be as much about thinking through account migrations for Reclaim Hosting as about capturing some of the technical aspects of moving sites to Reclaim Cloud. In fact, it promises to be all over the place, but that is the prerogative of this blog and that’s why I love it so.

Migrations: the act of moving people’s shit from one server to another.

That’s the vernacular for what I am talking about here when I say migrations: not pretty, but true.

Domains migrations: This can be a very straightforward process, for example when we have someone on one of our school cPanel accounts who wants to move to our shared hosting. cPanel has a transfer tool baked in, and we can move accounts between servers within seconds (assuming they are under 10 GB or so); after that we just make sure all the details in our client management software WHMCS are aligned and we are good to go. What’s more, migrations like this are easy enough that there is no charge for anyone migrating from a Domain of One’s Own school to our shared hosting.

Third-party free site migrations: There are too many of these to list, but a few popular ones are, Wix, Weebly, and Squarespace. Interestingly enough, the only one of these listed with anything resembling a migration option is you can export and import the posts, pages, media, and author data, but you have to re-build the site design with appropriate themes and plugins. All the other services would be a straight-up copy and paste of page content, which should tell you everything you need to know. No HTML files to download, no easily accessible media, no database … nothing. Say what you will about WordPress, but at least it’s an ethos. migrations are fairly straightforward, you just need to prepare folks that some plugins and themes on may not be readily available for free outside that space (I still hate the plugin and theme marketplace and always will). These migrations usually cost $25.

Everything else: Pretty much everything after those two categories is a crap shoot. We have done a fair amount of migrations from just about every host imaginable: Bluehost, HostGator, GoDaddy, DreamHost, Webfaction, 1and1, etc. And while a few of these use cPanel (Bluehost, HostGator, and sometimes GoDaddy), they’re by no means similar. It’s next to impossible to get a full backup from Bluehost without an upsell, GoDaddy’s interface is as confusing as they come, and good luck making it through the advertisements in HostGator. What’s more, if you live and die by the command line (which I don’t but should), getting SSH access is often another level of hell. Services like Webfaction (soon to be gone) and DreamHost are better in that regard, but given they run their own hosting software there is no straightforward migration path, so the migrations are often manual, and if you have an account with 5-10 sites, that is 5-10x the work of one cPanel migration, which wraps everything up into one neat package.

So, long story short, these migrations are by definition more time intensive and as a result more expensive. As a rule of thumb we charge $25 per site migrated in these cases, but as we have learned, some of these services allow folks to run beefy sites on their shared hosting, which is not something we can afford to do. For example, we limit our shared hosting accounts to no more than 100 GB of storage and no more than 1 GB of total server resources. For some sites that want to come over to our shared hosting these limitations will be a hard stop given the amount of storage and CPU resources needed, so that raises two crucial questions before a migration like this even starts: 1) how much data? 2) how many resources? A few other questions are what PHP version they are running and whether or not they are running the latest version of the application (folks needing to run older apps on older versions of PHP is always a red flag).
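Those pre-flight questions can be answered quickly from a shell on the source host. A minimal sketch, assuming nothing about the host beyond a POSIX shell (SITE_DIR is a placeholder for the account’s web root, and php may not be in the PATH on every host):

```shell
# Rough pre-migration audit, run on the *source* server.
# SITE_DIR is a placeholder; point it at the account's web root.
SITE_DIR="${SITE_DIR:-$HOME}"

# 1) How much data?
SITE_SIZE=$(du -sh "$SITE_DIR" 2>/dev/null | cut -f1)
echo "site size: $SITE_SIZE"

# 2) What PHP version? (an end-of-life version like 5.6 is a red flag)
if command -v php >/dev/null 2>&1; then
  php -v | head -n 1
else
  echo "php not found in PATH on this host"
fi
```

The database size question still needs credentials, so that check usually waits until you have the mysqldump in hand.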

I’m sure there are other variations, but for the sake of memory and not dragging this post out I’ll leave it at these three categories, and take the last as an example: a site previously run on Webfaction’s shared hosting that needed to be migrated to Reclaim Cloud. The site in question had 170 GB of data, a 2 GB database, and was running Drupal 7 on PHP 5.6. The storage was an immediate flag for our shared hosting, and while previously we would point folks to managed hosting (which can run as much as $400 per month), Reclaim Cloud offers a much more affordable, albeit unmanaged, option. Storage is quite cheap at 8¢ per GB per month, or less than $1 per 10 GB per month. Also, for large sites with regular traffic and a long history, Reclaim Cloud provides dedicated resources wherein you can reserve up to 2 GB of CPU but allow your instance to expand to 4 GB or more if need be, while only paying for those resources if and when needed.

On the Cloud we are able to install a container-based full-stack LiteSpeed server, also known as LLSMP, that is optimized for a PHP app running LiteSpeed (a drop-in replacement for Apache) that also gives the user root access to only that container. So, the client gets more storage, more resources, root access, and an overall more secure experience for roughly $50 per month (this is based on using 10 cloudlets, 150GB of storage, a dedicated IP address, and the LiteSpeed license). What’s more, you have the option to scale instantly should that be of concern.*

So that’s the argument for the Cloud in this case, and it really is a good solution when it comes to speed and experience. The reason I even took on this migration was it would force me to get more familiar with Reclaim Cloud, in particular creating a LLSMP environment and importing a large Drupal instance. As I predicted, these migrations are never simple, and one of the trickiest pieces, beyond understanding what environment you are coming from and where it is going to, is making sure the DNS points from one server to the other cleanly. More tears have been shed over DNS in the previous 8 years than I care to acknowledge in this post.

That said, here comes the notes part of this post because I’ve learned a few things here that I will be referencing in the future, cue blog as outboard brain.

LLSMP was dead simple to set up on Reclaim Cloud: I installed the 6.0.2 release with PHP 7.3.27, and once that was done I was able to log in via the web-based SSH and start migrating the files to /var/www/webroot/ROOT.

I ultimately had to enable root access on the container, and thankfully Webfaction provides SSH access to their servers, so most of this migration was done at the command line thanks to rsync, which is amazing. Logged into the Reclaim Cloud container, I ran the following command to sync the files over from Webfaction:

rsync -avzh <webfaction-user>@<webfaction-host>:<path-to-site>/ /var/www/webroot/ROOT

That worked cleanly; then I needed to grab a dump of the database on Webfaction, which this command handled:

mysqldump -u db_username -p db_name > database.sql

After that I rsynced the dump over to the Reclaim Cloud instance:

rsync -avzh <webfaction-user>@<webfaction-host>:<path-to-dump>/database.sql /var/www/webroot/

After that I had to create the database user, the database, and grant privileges via the command line, cause I am kind of a big deal. Well, I thought I was until I hit my first snag:

ERROR 1045 (28000): Access denied for user 'db_user'@'' (using password: YES)

This is where I reached out to help from my Reclaim Hosting colleagues, and the always awesome Chris Blankenship bailed me out with some detailed instructions on how to fix this in Reclaim Cloud:

MySQL actually sees db_user@localhost and db_user@ as two separate accounts, which can cause problems. cPanel handles this automatically by creating both for all db users, but you’ll have to manually create both in Jelastic containers, like this:

CREATE USER 'db_user'@'localhost' IDENTIFIED BY 'securepassword';
GRANT ALL PRIVILEGES ON db_name.* TO 'db_user'@'localhost';
CREATE USER 'db_user'@'' IDENTIFIED BY 'securepassword';
GRANT ALL PRIVILEGES ON db_name.* TO 'db_user'@'';

To make it simple I usually add skip-grant-tables under the mysqld section of /etc/my.cnf, restart MySQL (systemctl restart mysql), and log in as root without the password. From there I run the commands above. Then I comment out skip-grant-tables in /etc/my.cnf and restart MySQL (systemctl restart mysql) again.
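Pulling those steps together, here’s a hedged sketch of the same fix (db_user, db_name, and the password are placeholders). It just writes the SQL to a file you can feed to mysql once you are in as root; note that when running under skip-grant-tables you need FLUSH PRIVILEGES first, since CREATE USER won’t work until the grant tables are re-loaded:

```shell
# Placeholders only: swap in the real user, database, and password.
cat > fix-grants.sql <<'SQL'
FLUSH PRIVILEGES;  -- re-enable the grant tables for this session
CREATE USER 'db_user'@'localhost' IDENTIFIED BY 'securepassword';
GRANT ALL PRIVILEGES ON db_name.* TO 'db_user'@'localhost';
CREATE USER 'db_user'@'' IDENTIFIED BY 'securepassword';
GRANT ALL PRIVILEGES ON db_name.* TO 'db_user'@'';
FLUSH PRIVILEGES;
SQL

# With skip-grant-tables enabled and MySQL restarted:
# mysql -u root < fix-grants.sql
# ...then remove skip-grant-tables from /etc/my.cnf and restart again.
```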

Once I figured out the permissions I was able to import the database.sql file using the following command from the /var/www/webroot/ directory:

mysql -u db_user -p db_name < database.sql

Once that was imported I did a final rsync of files using the following command, with the -u flag to skip files that are newer on the destination:

rsync -avzhu <webfaction-user>@<webfaction-host>:<path-to-site>/ /var/www/webroot/ROOT

There was also the bit where Chris updated ‘localhost’ to ‘’ in the settings.php file for the Drupal instance, given Reclaim Cloud containers are particular about how the app connects to MySQL.
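For a Drupal 7 site that host value lives in the $databases array in sites/default/settings.php, so the swap can be scripted with sed. A hedged sketch, demonstrated on a scratch file; when doing it live, point SETTINGS at the real path (something like /var/www/webroot/ROOT/sites/default/settings.php) and keep the backup:

```shell
# Demo on a scratch copy; set SETTINGS to the real settings.php when live.
SETTINGS="${SETTINGS:-/tmp/settings-demo.php}"
[ -f "$SETTINGS" ] || printf "'host' => 'localhost',\n" > "$SETTINGS"

cp "$SETTINGS" "$SETTINGS.bak"   # always keep a backup before editing
sed -i "s/'host' => 'localhost'/'host' => ''/" "$SETTINGS"
grep "'host'" "$SETTINGS"
```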

So those are very specific notes for this migration of a larger PHP application to a Reclaim Cloud instance. What’s more, I had to do it again a week later, given this first pass was just to test the instance before moving the production site (this is where rsync is very useful, although the SQL dump had to be re-done). As you can tell by now, this is not a $25 migration; it requires spinning up a server, syncing files between servers, and providing a testing environment. Luckily, Reclaim Cloud environments automatically get a unique test URL that the mapped domain eventually overwrites, which makes testing the environment before pointing DNS quite convenient—even better than editing a local hosts file.

Anyway, this is a long post about migrations and Reclaim Cloud, as much a series of notes as a way of narrating what I hope will be a deeper dive into the possibilities of Reclaim Cloud over the next 12 months or so given I have been freed up from other responsibilities, but more on that in my next post.

*The hard part of the Cloud to wrap your head around is the variable pricing. I know from personal experience it remains fairly consistent, but the need for predictability is something Digital Ocean understood and seems to have figured out, which I admire.

Domains21: Jelastic – a Look at the Technology Behind Reclaim Cloud

Keeping up with my OERxDomains21 syndication series, here is another great session featuring Jelastic founder and CEO Ruslan Synytsky, who chats with Tim Owens about all things Cloud.

In the Summer of 2020 Reclaim Hosting rolled out Reclaim Cloud, a next-generation hosting platform that allows faculty, students, and staff at educational institutions to run complex technology stacks with the click of a button. It’s a brave new world of virtualized, containerized infrastructure that in many ways changes what’s possible for ed tech and higher ed IT groups around the world.

Tim Owens chats with Jelastic founder and CEO Ruslan Synytsky about their cloud platform software and how it has enabled hosting companies like Reclaim Hosting to provide its customers a sophisticated and elegant cloud solution that provides them access to a whole suite of next-generation applications.

Reclaim Cloud’s Got GLAM

I’ve been following Australian historian and hacker Tim Sherratt on Twitter for a while now, and his work with the GLAM Workbench is inspiring. GLAM is an acronym for galleries, libraries, archives, and museums, and the workbench provides a series of tools Tim has stitched together to enable research across numerous collections in Australia and New Zealand so that scholars and students can do things with data.

I saw a mention of this work a few weeks back that piqued my interest, and the following tweet spurred me to follow up on installing GLAM Workbench in Reclaim Cloud, so I gave it a shot.

I have to say it was quite easy to get up and running, and the documentation around this project is so robust that it also helped me finally get my head around how applications like Jupyter Lab, Datasette, and Voyant Tools might work together, which is huge for me.

It really was that simple, I created a new environment and used this script to import the YAML file with all the instructions for getting the custom Jupyter Lab notebook spun up. Literally one-click, which he has since integrated into the documentation so you can do this right from Github into Reclaim Cloud, which is so slick.

And as I noted, the JupyterLab was all set up and ready to go (it was behind a password given that was part of the customizations he built into his container):

The thing about the GLAM Workbench that pushed me beyond the straight install into exploring the app was the amazing documentation they created that seems to just be getting better.

I was able to wrap my head a bit around using Jupyter to run the Trove Harvester, which is essentially the tool that searches across collections, brings back results, and then allows you to harvest text, images, and even PDF versions of the articles. All this is spelled out within the Jupyter Lab, and it allowed me to start digging in.

I did a search across the collections for references to home video rentals and got a solid 8000 hits. I’ll try and do a follow-up post about some of the awesome articles about home video in Australia, but let this page (and a few pull-out ads) suffice for now:

The article on silicon is pretty fascinating, but the ads tell a compelling story of the rise of the mom & pop video store. And this is just one of thousands of hits; tools like Datasette (which you can work with right from the GLAM Workbench) put the search results into a database format, and something like Voyant Tools would let you visualize them, so I actually started to wrap my head around this suite of tools.

This is amazing, but even cooler is that Tim has been hard at work and has updated the GLAM Workbench documentation to include a Launch in Reclaim Cloud link, so that the script runs and you are up and running with GLAM Workbench in Reclaim Cloud, so cool.

And in all his copious spare time, he posted the details of his work creating an installer for GLAM Workbench on Reclaim Hosting’s Community forum, which provides all the details, so thanks, Tim—this is above and beyond!

PeerTube, Sonic Outlaws, and UbuWeb

Over a week ago Tim got a one-click installer working on Reclaim Cloud for PeerTube. He got the details up on the Reclaim Hosting Community site already, so you can read more there.

PeerTube in Marketplace


Getting one-click installers working for a wide variety of apps is a big bonus of Reclaim Cloud, and between Azuracast and PeerTube we have the vertical and horizontal pretty well locked-in. I wrote a bit about my explorations with PeerTube already on this blog, so feel free to follow that linked rabbit hole for more. But the long and short of this application is that you can upload videos to your own instance of a fairly robust YouTube-like interface. It has a growing peer-to-peer network, and one killer feature is that it can upload and archive just about any video on the web with a URL. I use it regularly to archive videos I watch online, given the broken web copyright creates as a result of YouTube take-downs, which highlights the worst of the service-centralized internet.

In fact, while Tim and I were working through the PeerTube installer I was watching the 1995 documentary Sonic Outlaws by Craig Baldwin. The copyright bugbear has been with us well before YouTube, and Sonic Outlaws focuses on the fallout of Negativland‘s  decision to parody U2.

Within days after the release of Negativland’s clever parody of U2 and Casey Kasem, recording industry giant Island Records descended upon the band with a battery of lawyers intent on erasing the piece from the history of rock music.

Craig “Tribulation 99” Baldwin follows this and other intellectual property controversies across the contemporary arts scene. Playful and ironic, his cut-and-paste collage-essay surveys the prospects for an “electronic folk culture” in the midst of an increasingly commodified corporate media landscape.

So, long story short, I wanted to see if PeerTube could use the youtube-dl code to grab and upload the copy of Sonic Outlaws on UbuWeb, and it turns out it can; the only catch was the metadata was not included, but that was fairly easy to fill in.

After that I got to thinking about the initial Tweet of this post from UbuWeb about downloading videos and not trusting the cloud.

I wonder if an application like PeerTube might help bridge that gap a bit by re-decentralizing the cloud so that folks could download and share collections like UbuWeb across numerous servers and local machines in order to not only build their own collections, but share them, and hopefully circumvent the copyright trolls that come with the territory of a centralized video service such as YouTube.

I recently said something along the lines of….

The nice part about a smaller platform, as Tim reminds me, is there are no copyright trolls. Small can be very good for a video community. Plus, I have already spent my time dealing with the YouTube copyright crap, and I have no interest in going back there. Everything I have on Vimeo is backed up locally (and remotely); I basically have my bug-out bag by the door ready to go at any time. If they delete my videos, it would just mean finding another home for them, and maybe that is exactly what we’ll do with ds106tv?

And that is exactly what Tim did when he got the open-source video platform PeerTube up and running. I took one look at it and immediately knew it was the alternative I’ve been looking for for some time now. What’s more, you gotta love their motto:

Our aim is not to replace them [YouTube, Vimeo, etc.], but rather to simultaneously offer something else, with different values.

One of the coolest things about PeerTube, other than it being free and open, is that it’s premised on a decentralized, federated network of instances (not unlike Mastodon). So, for example, I can federate my own instance with Tim’s, and folks who come to either can explore what’s on both. Even better, there’s the ability to provide redundancy so we can back up each other’s videos in the event of server issues, take-downs, etc. It’s everything ad-revenue and premium video sharing services are not.

But it gets better: PeerTube also has the open-source youtube-dl Python library built in so that you can migrate your videos off services like YouTube and Vimeo. But as the above Tweet from the Electronic Frontier Foundation makes clear, the youtube-dl code is currently under attack by the MPAA and RIAA for enabling copyright circumvention. In fact, trying to take down open source code is an already established tactic; back in May the MPA did the same thing to the open source software Popcorn Time (a Netflix clone). But in that case the developers of Popcorn Time appealed to GitHub to re-instate their repo:

The developers submitted a DMCA counternotice explaining that the MPA’s request is not legitimate. The code is owned by Popcorn Time, not the MPA, and Popcorn Time asked GitHub to restore access.

“The code is 100 % ours and do not contain any copyright [sic] material please check again,” the developer wrote.

The app’s developers made a good point here. The identified code (not the built app) is not directly copyright infringing and it contains no direct links to copyright-infringing material either. This means that a DMCA notice may not be the right tool here.

Faced with both requests, GitHub has now decided to restore full access to the Popcorn Time repository.

Let’s hope youtube-dl gets as lucky as Popcorn Time did back in May, but at the same time you begin to understand that in many ways GitHub is just as arbitrary and liable as YouTube to remove and block access to our culture, in this case in the form of code, based on power plays by monied interests. It’s the same mistake of consolidating resources, and by extension power, in the hands of a few monolithic sites (rather than federating across many) that gets us back in the hole.

In fact, youtube-dl makes archiving videos you want to save from around the web unbelievably convenient, copying them in seconds.
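For the record, a typical archive-friendly youtube-dl invocation looks something like the sketch below (the URL is a placeholder, and the echo makes it a dry run; drop it to actually download). The --write-info-json flag keeps the metadata alongside the video, which is exactly what gets lost when you only grab the file:

```shell
# Dry run of an archive-friendly youtube-dl call; remove the echo to run it.
URL='https://example.com/watch?v=VIDEO_ID'   # placeholder URL
CMD="youtube-dl -o '%(title)s.%(ext)s' --write-info-json --write-thumbnail $URL"
echo "$CMD"
```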

It’s been over 8 years since I lost all my videos on YouTube thanks to copyright claims and the unilateral arbitration at the hands of for-profit platforms, so it is nice to finally have a really tight alternative. I have been playing with PeerTube for over a week given I wanted to make sure the Docker installation works on Reclaim Cloud (it does!), along with the CLI tools that make migrating an entire Vimeo or YouTube channel to PeerTube absolutely painless. I did this yesterday and brought over more than 275 videos, and all the accompanying metadata—so good.

I think the thing I appreciate the most about PeerTube is the way it lets you explore your own and others’ videos. Tim has been uploading all of his videos to his instance, and we are working on federating my site with his (it is actually simple to federate instances, but I deleted my previous instance so there have been some caching issues), and I have been able to discover so many of the old gold DTLT Today episodes, not to mention ds106 gold, and more.

I think the larger plan is to give people accounts to upload videos for the course we are designing, or even better, help them spin up their own PeerTube instance to see what it’s all about. To that end I need to work on a one-click install for PeerTube on Reclaim Cloud, which should be very doable, as well as a more in-depth how-to for the PeerTube CLI, given wrapping your head around that really makes this tool amazing for migrating a large amount of content in a short period of time. The course is already paying dividends and it is still months away from starting. #4life

Reclaim Cloud Case Study: Containing TEI Publisher in the Cloud

It started out as an innocent enough ticket into Reclaim Hosting from Dr. Laura Morreale, whose work involves transcribing and translating texts from medieval manuscripts using online digital facsimiles. She asked if we could run eXist-db on her cPanel account in shared hosting. In particular she needed to run TEI Publisher, an open source application that is described as follows in its documentation:

The motivation behind TEI Publisher was to provide a tool which enables scholars and editors to publish their materials without becoming programmers, but also does not force them into a one-size-fits-all framework. Experienced developers will benefit as well by writing less code, avoiding redundancy, improve maintenance and interoperability – to just name a few. TEI Publisher is all about standards, modularity, reusability and sustainability!

A quick look at the basic installation documentation for eXist-db told me it was a Java app, which is a hard no for cPanel. But avoiding hard NOs when someone comes asking for help is one of the main reasons we started Reclaim Cloud. A cursory search for a Docker container for this application led me to one that seemed outdated. I responded suggesting we could try installing it on the Cloud if they had a current Docker instance, which I was not finding. Turns out I wasn’t looking hard enough: it was linked from the eXist-db homepage right in front of my eyes. I was wrong, and Dr. Morreale responded suggesting she was becoming increasingly frustrated trying to get this application running online, saying, and I misquote for comic effect: “Dammit Jim, I am a Medievalist, not a server admin!” She was right, and this was why we started the Cloud in the first place; I needed to try harder. What’s more, I appreciated the fact she was so determined to make this work. So much so that soon after the last email I sent to try and get this working, she sent me a link to the right Docker container on the recommendation of the folks at eXist-db:

That was all we needed: I simply searched for this container in the Docker area when creating a new environment in Reclaim Cloud:

Click “Next,” add the subdomain for the test environment (my example has since been deleted), and then click “Create.”
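If you want to kick the tires locally before spinning up a Cloud environment, the rough Docker equivalent looks something like the dry-run sketch below. The image name is the official eXist-db image as best I recall, so double-check it against their docs before running for real:

```shell
# Dry run: remove the echo to actually pull and start the container.
# existdb/existdb is, as I understand it, the official eXist-db image.
CMD="docker run -d -p 8080:8080 --name existdb existdb/existdb"
echo "$CMD"
# Once running, eXist-db should answer on http://localhost:8080
```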

And within moments I was able to access the site at that subdomain:

The eXist-db splash page redirects to a suite of tools, including TEI Publisher!

A click on that icon brings us into that application:

While there are still a few things to work out in regards to user management for the application, it seems like we may have a winner with this Docker container. In fact, Dr. Morreale’s struggle highlights a pain point for many humanities PhDs who need to run an application that demands a bespoke server environment. This is where the value of containers is extremely evident: running a Java server environment that provides a stable and citable publication venue for a Medievalist’s transcriptions and translations is a perfect case in point. In fact, Dr. Morreale was kind enough to furnish me with some insight into her work, process, and challenges for this post:

Like a growing number of humanities PhDs, I am an independent scholar who maintains relationships with several programs and institutions. I am currently affiliated in an official capacity with Fordham, Georgetown, and Harvard Universities, and am also engaged in ongoing projects with partners at Stanford and Princeton Universities.  My medievalist practice has always been characterized by a physical distance from both the repositories that hold sources which I study, and the institutions where my scholarly work finds its home. For this reason, digital methods have offered me a solution for my scholarly work when I had few others.

Some of the most rewarding efforts, which have in turn informed much of my traditional analytical work, involve transcribing and translating texts found in medieval manuscripts using online digital facsimiles. Using a tool called FromThePage combined with IIIF image technology, I can now easily choose digitized manuscript images from any online repository, upload them, then immediately begin to transcribe the text from the medieval source. I can also translate my own transcription after it is complete, and I have undertaken both individual and collaborative translation projects using this method. Right now my projects include a corpus of early 13th-century aristocratic legal codes from Crusader Cyprus, a rarely-cited history of Florence that was buried in a late 14th-century letter from a father to his son, and a little-known work by the Renaissance Florentine Leon Battista Alberti, found in a larger manuscript that has been broken up, with parts of it now housed at Harvard’s Houghton Library.

The one difficulty has been to find a stable and citable publication venue for these transcriptions and translations. I have tried several different programs over the years, but could never easily publish all the work I had done to bring more attention to these texts and manuscripts. Using Reclaim Hosting and a program called TEI Publisher allows me to create the kind of edition I would like, and to integrate images, notes, and other explanatory materials into my online editions.

In the end, the fact that we could help Dr. Morreale get what she needed fairly seamlessly is a thrill, and it highlights everything we hoped Reclaim Cloud would be. I am planning on turning this Docker container into a one-click application for the Reclaim Cloud marketplace so that other folks can hopefully scratch a similar itch. And special thanks to Dr. Morreale for so generously sharing her process and work to complete this post. Avanti!

IndieWebCamp: Domain of One’s Own Meetup

This past Tuesday I attended the second IndieWebCamp meetup focused on Domain of One’s Own, generously hosted by Chris Aldrich. The format is a focused 10-15 minute talk around a specific technology (in this meeting Tim gave folks a walk-through of Reclaim Cloud), after which it opens up to the 21 attendees for anyone to share something they are working on. Not only was I thrilled to see Jon Udell in attendance, but it’s always nice when one of your tech heroes tweets some love for your new project. Even better when you know they’re not one to offer empty interest and/or praise. Thanks, Jon!

It was also very cool to read Will Monroe’s write-up of the session, and like him I found it a “very friendly group.” I realized while attending that this kind of low-key chatting and sharing is one of the things I have missed these days. Folks like Will who want to explore what’s possible in their classroom with Domains and beyond are a big part of what I miss about the day-to-day work of an edtech in an institution. And while I’m not necessarily chomping at the bit to jump back into that game given the current circumstances, the ability to share and chat with folks who are interested in Domains is always a welcome opportunity.

During the sharing portion of the meetup Jean Macdonald, community manager at, turned me on to the Sunlit project while I was bemoaning the dearth of open source alternatives to photo sharing apps like Instagram. Soon after, I finally took the leap and signed up for a account to explore that platform, which has been an indieweb cornerstone for many folks I respect, like John Johnston, Kathleen Fitzpatrick, and Dan Cohen, to name just a few. So I wrote my first post:

What was even cooler was the fact that while writing this post I logged back into and discovered a few folks had welcomed me to the community, including Jean Macdonald and Dan Cohen—that makes all the difference.

I’m sold, so the IndieWeb meetup was a total win for me, and I look forward to the one next month. I am going to start getting serious about headless WordPress development for my new website, inspired by Tom Woodward’s talk for #HeyPresstoConf20.

So, I’ll have something to share in my journey to learn headless WordPress, which will mean learning JavaScript, CSS, and some other insanity I am not entirely ready for. I have to give a special thanks to Chris Aldrich for putting this together and working to create a space to talk Domain of One’s Own within the IndieWeb community, and I know Greg McVerry has been pushing hard on this for a while now as well, so it is very much appreciated!

Reclaim Cloud Art, Bryan Mathers, and Gettin’ Air

It occurred to me yesterday after finally listening to Terry Greene‘s interview with Bryan Mathers for the Gettin Air podcast that I never blogged about our Reclaim Cloud artwork. That needs to be rectified, and I will share the awesome below, but before I do I just wanted to say how much I enjoyed the interview between these two. Possibly the coolest part was when Bryan started interviewing Terry in order to see if he could “draw” out of him some ideas that he could refactor as a visual for the podcast, and voilà Gettin Air has a new logo!

I dig it, especially given I have returned to snowboarding these last few years, but even better was Bryan getting Terry to talk about his idea behind the name, his articulation of what he’s doing and why—it was all so effortless and real. It was a beautiful demonstration of how the interview can become the thing it wants to share. So genius, well worth a listen if you have some time.

Anyway, that whole process reminded me I have not yet shared the work Reclaim Hosting did with Bryan this summer to get started on the Reclaim Cloud aesthetic. Given Reclaim Cloud is premised on a container-based architecture, we initially explored if we wanted to go down the road of shipping containers, and we have some initial sketches from Bryan that I absolutely love.

The containers are actually VHS tapes! A point made clearer in the heavy lifting image that follows:

It really is brilliant; it captures the idea of Reclaim Cloud as both container-based and industrial-strength, which it is! But ultimately, after talking with Bryan, we realized the hard limits of the nautical/container metaphor. So we moved on to Cloud City, an idea Martha Burtis and I fleshed out for Domain of One’s Own back in the day.

I still love that poster, in fact I have a stamped copy of it framed and hanging on the wall behind me as I write this. So we got to talking a bit about it, although Tim was a bit reluctant given he is not a Star Wars fan, but through conversation the idea of a retro-futurism aesthetic began to emerge a la The Jetsons.

And Bryan’s rough sketches had us very intrigued:

The idea of scaling your domain was fun, and the way Bryan mapped that onto retro-futuristic housing was brilliant. In the final image the beginnings of a logo/cloudlet already take shape. This was our aesthetic, and we kind of knew it during the discussion, but the seeds of the sketches sealed it.

The final option was to stick with the music/video metaphor we already have and push it further with mixed tapes. But it just felt forced, and I think Tim and I both wanted the freedom to jump out of that metaphor and explore something new, and I am really glad we did.

The next conversation after deciding on Cloud City was to scout the internet for ideas, and that is when Tim landed on industrial designer Arthur Radebaugh’s Closer Than We Think comic strip, which ran from the late 1950s through 1963. The way the strip pairs an explanatory panel with art full of explicit arrows illustrating the future gels nicely with our idea of introducing Reclaim Cloud as a way of highlighting for higher ed what’s possible in this new space. So, we got to talking, and the first round of art was amazing:

I really love the industrial logo for Reclaim Cloud, which is itself an encapsulated container, a cloudlet if you will, and this idea of self-contained cities became a big part of our aesthetic. And the fact that Bryan Ollendyke said it reminded him of Bioshock on Twitter just sealed it for me 🙂

We were sold after this image, a kind of brochure for Cloud City, which enabled us to start exploring what it would mean to create a series of vignettes of the different options for anyone interested in moving to the Cloud. It was just too fun, so the follow-up discussion was to explore the Closer Than We Think comic strips to highlight some of the one-click applications we have for courses, organizations, and digital scholarship:

Pure magic! The way in which the container has become an organic part of these images is just so awesome. I love the one outside the window of the home classroom. This idea that it is all connected yet separate is one way to understand the cloud, and Bryan really brought it home. And as amazing as all the art is, I think his breakdown of the various elements of a Reclaim Cloud container that could incur costs is a full-blown masterpiece:

This sphere is everything, literally. I just love the way the aesthetic has evolved, and the final bit is thinking through how we’re going to highlight what is happening within each cloud. This led us to the idea of “What’s in your Cloud?” wherein we talk to folks who provide us a peek into their Cloud: what are they running, how, etc. The following image is a placeholder, but we are thinking through ways of trying to capture the individual nature of folks’ clouds for each episode, and Bryan mentioned some kind of comic-like avatar, like my Cotton Mather avatar in a spacesuit holding my Cloud sphere, which would be awesome!

Anyway, I think that brings us up to date, and to be clear this has only just begun. We are thinking of Reclaim Cloud as a long game. We know it will not replace cPanel hosting; we have plenty of time to experiment with the possibilities; and we can slowly start moving our existing infrastructure over as we become increasingly comfortable with the environment. Not to mention it has forced us to dig in and learn a lot more as a company, and as much as I was kicking myself given I was just starting to feel a bit liberated from the day-to-day, in the end I love it. We’ve been dreaming of this kind of infrastructure since we started Reclaim Hosting, and in 3 short months we went from nothing to a pretty full-blown product that provides some concrete solutions for academics wanting to host something outside of the LAMP stack. And this retro-future aesthetic is our way to start experimenting in this space without pretending there aren’t also real problems baked into every solution—we’re here to explore right alongside you.

Timmy Explores the Wondrous World of Windows 3.1

You begin the game as Timmy, a young boy visiting a crumbling amusement park known as Midway. But Timmy doesn’t see a pathetic locale where everything is falling apart, but rather a world of wonder, with his thoughts appearing in written form at the bottom of the screen.

The above quote is taken from a now-gone review on Hardcore Gaming 101 describing The Residents’ 1995 CD-ROM game Bad Day on the Midway. I know this because I copied that description for a post I wrote on this blog in 2014 talking about this game, which made an indelible impression on my memory when I first played it on Windows 3.1 in the AVS offices at UCLA. In fact, the description of little Timmy above is perfect to describe another Timmy I know who found himself in the wondrously retro world of archival emulation thanks to the EaaSI project, or Emulation as a Service Infrastructure. What is EaaSI? Well, Tim covers that nicely:

The Eaasi platform allows you to start with basic images of operating systems, and then layer on software as well as “objects”. So, for example, you might have an object that is a Word Document a professor wrote in 1998. Instead of rendering it in a PDF, here we can actually take a Windows 98 computer, add Office 97 to it, and then have the document load at boot. A true native environment that is destroyed and rebuilt each time you go to view it in a matter of seconds and renders the object exactly as it was intended to be viewed.
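Tim’s description of layering maps onto a simple mental model: an environment is a recipe of base OS plus software plus an object, and each viewing session is a disposable copy built from that recipe. Here’s a rough sketch of that idea in Python — to be clear, these class and field names are made up for illustration and are not EaaSI’s actual API:

```python
# Conceptual sketch of EaaSI-style environment layering.
# All names here are illustrative, NOT EaaSI's real interface.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass(frozen=True)
class Environment:
    base_os: str                       # e.g. "Windows 98"
    software: Tuple[str, ...] = ()     # applications layered on top
    obj: Optional[str] = None          # the archival object to render

    def with_software(self, app: str) -> "Environment":
        # Layering returns a NEW environment; the base image is never mutated.
        return Environment(self.base_os, self.software + (app,), self.obj)

    def with_object(self, obj: str) -> "Environment":
        return Environment(self.base_os, self.software, obj)


# Tim's example: a Windows 98 machine, Office 97 layered on,
# and a professor's 1998 Word doc loaded at boot.
env = (Environment("Windows 98")
       .with_software("Office 97")
       .with_object("thesis-1998.doc"))

# Each session boots a fresh copy of the recipe and throws it away after,
# so the environment is "destroyed and rebuilt each time" you view it.
session = env
del session
```

The frozen dataclass is doing the conceptual work here: the recipe itself is immutable, so every session starts from the same pristine state.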

What’s beautiful for us is that EaaSI is a container-based environment for emulation-based archiving that Tim got running on Reclaim Cloud, so now he can play Solitaire as it was meant to be played on Windows 3.1:

All of which led us to jump on a video call and see if we could get the ISO of the Bad Day on the Midway CD-ROM to run in the Cloud, and it turns out it is very possible, even if you have to fix a few issues like mounting your virtual CD drive and fixing the monitor colors:

“Wow!” indeed. Running a 1995 CD-ROM game on Windows 3.1 via the web on Reclaim Cloud is a new level of hosting inception I can dig on. It seems similar in spirit to the remarkable work the folks at the Internet Archive have been doing for years to emulate various games in the browser. It’s exciting stuff, and the fact we could host something like this is mind blowing.

DomainMOD: Getting my domains house in order

I have been having fun watching Tim blog through his recent application experiments on the Reclaim Cloud. What I love about his experiments is that they are honest: when he tries out an app he is really not sure if it will run. In fact, I am on the edge of my seat to see if it worked when reading posts like this and this. 🙂 So, inspired by Tim as I often am, I looked through the list of awesome self-hosted apps he linked to in his penultimate post to continue my experimentation in the Cloud. The application I landed on was DomainMOD, a tool for managing domains you have registered across different registrars, hosting companies, etc. It’s a custom tool for folks like me who have a domain hoarding problem, and it comes at a perfect time given I am continuing to try and get my digital house in order; with 31 domains registered all over the place, this is an app I can actually use.

So, the first step was installing, and while it is a pretty straightforward PHP/MySQL app, I noticed there was a Docker container, so I tried that out and it was dead simple. I spun up a Docker Engine instance in Reclaim Cloud.

After that I created a domainmod directory in the /home directory via command line:

mkdir /home/domainmod

And then from the /home/domainmod directory I ran the following two commands:

git clone
docker-compose up -d

And that was it, DomainMOD was up and running and after that I spent the morning adding my domains to the interface so that I could track them more accurately. The app has the option to integrate with the APIs from the various registrars I currently use, i.e. eNom, Logicboxes, and OpenSRS, which is nice. I did a manual import to begin, but I was quickly able to get an overview of all my domains, annual cost, what’s private, where DNS lives, associated registrar, as well as a category (right now I have 3: personal, ds106, and Reclaim).

I am clocking just about $600 a year on domains, which works out to $50 a month. The custom domains really killed me 🙂 I may have to do some pruning, having all the jimgroom TLDs may not be all that necessary, although the, and are absolutely essential 🙂 I’ll have to continue to play with DomainMOD given I have a fairly involved blog post in mind where I want to track the registration of each of the domains over the years as a kind of personal history of my personal web since 2003 or so. But until then I am winning on the Cloud!