Some Conference Thoughts from Digital Ocean’s Deploy

Back in November Tim, Lauren, and I presented alongside Kaysi Holman and Inés Vañó García from the CUNY Graduate Center about work we’re doing in higher education using Digital Ocean. The presentation was under 30 minutes long, pre-recorded in Streamyard, and aired two months later as part of the Digital Ocean Deploy conference. It can be easily found online via their Deploy Conference page on YouTube as well. The moderator, Erin Glass, was kind enough to bring us all together to make it happen, and her introduction is in many ways a short preamble to her brilliant article on Ethical EdTech published recently. I really enjoyed presenting alongside the folks from the CUNY GC (my alma mater of sorts), but I agree with Tim that when planning what we would talk about I missed the mark a bit. Rather than talking about Reclaim Cloud and the Emulation as a Service idea, we should have talked about our work with the CUNY Commons folks to make a one-click installer for CUNY’s C-Box. That’s on me, and I will try and avoid letting my excitement with the latest cool thing happening at Reclaim Hosting “cloud” my judgement.

I appreciate Erin inviting us, and as a participant I also really benefited from seeing how they organized Deploy, given Reclaim Hosting will be joining forces with the OER21 folks to put on an OER21/Domains online conference in late April, and we’re still very much imagining the possibilities for making this as compelling and accessible as possible. One of the elements of Deploy I really appreciated was that the session was pre-recorded almost two months in advance, which allowed us to attend the online conference and actually participate in the Discord discussion not only during our session, but for almost a month before that. This also made getting subtitles done seamless, ensuring everything was accessible out of the gate.

What’s more, the way the conference was presented there were multiple channels going at once via a video player hosted on a single page that also had the schedule. It could not have been easier to access what was happening across the conference at any given time in one fell swoop. I also loved the way they kept all sessions to less than 30 minutes, and had awesome preface art, TV-like bumpers, and highlights between sessions, not unlike the transitions between TV shows. And this kept me watching, which I think testifies to something quite powerful: by pre-recording and cleaning up transitions and announcements you have a much better chance of making sure everything is accessible and that folks will stay tuned-in.

And getting back to my lament about not talking about CUNY’s C-Box installer, I believe that having two groups (CUNY GC educators and Reclaim Hosting folks) in conversation makes that 30-minute time-frame that much more compelling. Everyone spent a few minutes sharing their ideas (which made it move well), but I think having a conversation between groups around topics like ethical edtech would be absolutely brilliant. I think this worked really well during the Against Surveillance session with Maha Bali, Chris Gilliard, sava saheli singh, and Benjamin Doxtdator. It was a compelling discussion that balanced rehearsed points, shared media, and extemporaneous discussion in a near perfect combination. They were even braver in that it was live, but I think managing live across several channels for two days is a lot of work, so I would like to see if there is a balance there between pre-recorded and live.

As I start thinking about the Domains sessions in the conference I am wondering how we can connect folks running various projects across different schools with one another to chat, while balancing structured “formal presentation” (i.e. rehearsed talking points) with new ideas that emerge as part of conversations in the moment. That will be key for me because I think that’s what makes these sessions compelling and memorable.

Digital Ocean Deploy Conference swag

On a slightly different note, their swag game was pretty tight. Not only did they send a nice sweatshirt that arrived the day before the conference, but they also sent a cable bag, microphone port blocker, and webcam cover. Pretty interesting how those final two suggest a kind of anti-surveillance mentality for their participants and presenters, which was cool.

Anyway, these are all post facto notes about Digital Ocean’s Deploy as we begin digging in to prepare for a fully online OERxDomains 2021. It’s a fun challenge to work through, and the first step is stealing the things that worked from other conferences and learning from your own mistakes to make the next time around better.

brought to you by PeerTube

I recently said something along the lines of….

The nice part about a small instance, as Tim reminds me, is there are no copyright trolls. Small can be very good for a video community. Plus, I have already spent my time dealing with the YouTube copyright crap, and I have no interest in going back there. Everything I have on Vimeo is backed up locally (and remotely); I basically have my bug-out bag by the door ready to go at any time. If they delete my videos, it would just mean finding another home for them, and maybe that is exactly what we’ll do with ds106tv?

And that is exactly what Tim did when he got the open-source video platform PeerTube up and running. I took one look at it and immediately knew it was the alternative I’ve been looking for for some time now. What’s more, you gotta love their motto:

Our aim is not to replace them [YouTube, Vimeo, etc.], but rather to simultaneously offer something else, with different values.

One of the coolest things about PeerTube, other than it being free and open, is that it’s premised on a decentralized, federated network of a variety of instances (not unlike Mastodon). So, for example, I can federate my own instance with another, and folks who come to either can explore what’s on both. Even better, there’s the ability to provide redundancy so we can back up each other’s videos in the event of server issues, take-downs, etc. It’s everything ad-revenue and premium video sharing services are not.

But it gets better: it also has the open source youtube-dl Python library built in so that you can migrate your videos off services like YouTube and Vimeo. But as the above tweet from the Electronic Frontier Foundation makes clear, the youtube-dl code is currently under attack by the MPAA and RIAA for enabling copyright circumvention. In fact, trying to take down open source code is an already established tactic; back in May the MPA did the same thing to the open source software Popcorn Time (a Netflix clone). But in that case the developers of Popcorn Time appealed to GitHub to reinstate their repo:

The developers submitted a DMCA counternotice explaining that the MPA’s request is not legitimate. The code is owned by Popcorn Time, not the MPA, and Popcorn Time asked GitHub to restore access.

“The code is 100 % ours and do not contain any copyright [sic] material please check again,” the developer wrote.

The app’s developers made a good point here. The identified code (not the built app) is not directly copyright infringing and it contains no direct links to copyright-infringing material either. This means that a DMCA notice may not be the right tool here.

Faced with both requests, GitHub has now decided to restore full access to the Popcorn Time repository.

Let’s hope youtube-dl gets as lucky as Popcorn Time did back in May, but at the same time you begin to understand that in many ways GitHub is just as arbitrary and liable as YouTube to remove and block access to our culture, in this case in the form of code, based on power plays by monied interests. It’s the same mistake of consolidating resources, and by extension power, in the hands of a few monolithic sites (rather than federating across many) that gets us back in the hole.

In fact, youtube-dl makes archiving videos you want to save from around the web unbelievably convenient; you can copy a video in seconds.
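To give a sense of just how convenient, here is a rough sketch of the sort of commands involved. The URLs and output templates are placeholders, and flags can vary a bit between versions, so treat this as illustrative rather than gospel:

```shell
# Grab a single video at best quality, keeping its metadata and subtitles alongside
youtube-dl -f best \
  --write-info-json --write-thumbnail --all-subs \
  -o '%(title)s.%(ext)s' \
  'https://www.youtube.com/watch?v=VIDEO_ID'

# Or mirror a whole channel/playlist into date-stamped files
youtube-dl -o '%(upload_date)s - %(title)s.%(ext)s' \
  'https://www.youtube.com/playlist?list=PLAYLIST_ID'
```

That `--write-info-json` flag is the quiet hero here, since it preserves titles, descriptions, and dates you can re-import elsewhere.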

It’s been over 8 years since I lost all my videos on YouTube thanks to copyright claims and the unilateral arbitration at the hands of for-profit platforms, so it is nice to finally have a really tight alternative. I have been playing with it for over a week given I wanted to make sure the Docker installation works on Reclaim Cloud (it does!), along with the CLI tools that make migrating an entire Vimeo or YouTube channel to PeerTube absolutely painless. I did this yesterday and brought over more than 275 videos, and all the accompanying metadata—so good.
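For the curious, the migration magic is PeerTube’s bundled import tool, which wraps youtube-dl under the hood. Something along these lines, run from the PeerTube install directory on the server; the script name and flags have shifted between releases, and the URLs and credentials below are placeholders, so check the docs for your version:

```shell
# Import an entire channel (or a single video) into a PeerTube instance;
# the tool fetches each video with youtube-dl and re-uploads it with its metadata
node dist/server/tools/peertube-import-videos.js \
  -u 'https://my-peertube.example.com' \
  -U 'myaccount' --password 'mypassword' \
  --target-url 'https://vimeo.com/channels/CHANNEL_NAME'
```

Point `--target-url` at a channel and it chews through the whole back catalog, which is how a 275-video migration becomes an afternoon job.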

I think the thing I appreciate the most about PeerTube is the way it lets you explore your own and others’ videos. Tim has been uploading all of his videos to his instance, and we are working on federating my site with his (it is actually simple to federate instances, but I deleted my previous instance so there have been some caching issues), and I have been able to discover so many of the old gold DTLT Today episodes, not to mention ds106 gold, and more.

I think the larger plan is to give people accounts to upload videos for the course we are designing, or even better, help them spin up their own PeerTube instance to see what it’s all about. To that end I need to work on a one-click install for PeerTube on Reclaim Cloud, which should be very doable, as well as a more in-depth how-to for the PeerTube CLI, given wrapping your head around that really makes this tool amazing for migrating a large amount of content in a short period of time. The course is already paying dividends and it is still months away from starting. #4life


I’m gonna try and make this post short and sweet because the conversation above says it all. In episode 24 of Reclaim Today Tim and I were once again joined by Andy Rush to brainstorm the design and structure of an open course we plan on running in the new year (#ds206video). Yeah, we are piggybacking on the venerable ds106 community, and figured it might be a good time to create a special topics course for folks to learn and share work they are doing around producing, creating, and streaming video. If the course is half as fun as the above discussion, it is going to be a blast, and it has been quite a few years since I have had the time and energy to work with folks to build out an ecosystem. This post comes a few days late because I spent much of the last 3 days playing with Wiki.js, PeerTube, and Discord, which will be at least 3 facets of this open course. I have a lot more to write about this, but I am still knee-deep in PeerTube, a brilliant open source P2P YouTube alternative that I have been looking for for the last 8 years 🙂 There will be much more to come on the experience, but if you are interested join the Discord chat and get ready to co-create a community TV station! #4life

I think the time is nigh, and if nothing else, Tim’s 20 second intro to the above video is not to be missed, click play and FEEL THE RUSH!

Talking Digital Identity with GO-GN

There is no better feeling than when some of your plodding experimentation starts to come together after several months of work.

Yesterday I had the distinct pleasure of presenting to the GO-GN network about digital identity. I asked Martin Weller and Beck Pitt if I could experiment a bit for this talk with some various video shots via OBS, and they were more than willing to let me run wild. In fact, they were even accommodating 🙂 They gave me a test account for their video conferencing application, ClickMeeting, which quite frankly was one of the best I’ve used yet.* I did a preliminary run at this setup with my “5 Questions about EDUPUNK” video last weekend that  I posted about earlier this week. The primary difference was that this was in front of a real-time audience and if I messed up there were no do-overs. Dear reader, I nailed it!

My discussion of digital identity can be broken into two parts: 1) I played the hits and talked about my blog, narrating your work, and the now venerable ds106, but part 2) was a bit of a departure wherein I discussed the possibilities of streaming video for new ways of building and imagining presence. I was lucky that both Meredith Fierro and Katie Hartraft did the heavy lifting by modeling Tik Tok-style narratives and being far more insightful and thoughtful than I could ever be! It was a lot of fun for me, and I was more nervous about this presentation than I had been about one in a long while, which for me is always a good sign I am stepping out of my lane and trying something new. For me the form was the message: I was spending 45 minutes presenting my story via dynamic, streaming video. It wasn’t The Wire or anything, but it was mine in ways the typical video conferencing video box never could be. I didn’t stream it anywhere else out of respect for the GO-GN network, given they invited me to their space, so no need to push folks elsewhere, but I could have quite easily. What’s more, I could record a high-quality version of the video that I can then use for my own purposes. Something like this….

One drawback of this version of the talk is that I didn’t capture the ongoing chat (though I could have), and I didn’t capture the audio questions Martin asked towards the end. I edited the above video to account for that, but GO-GN has the more complete version up on their Youtube account, embedded below:

Now riddle me this: which is the official document? Textual historians are gonna be working overtime for the next few centuries 🙂 A couple of interesting notes about the version on YouTube: with it came copyright claims that made monetizing it impossible. Fine by GO-GN and me, but it does get to an issue that came up in the talk regarding the lack of de-commodified green spaces for video on the web. The nice part about a small instance, as Tim reminds me, is there are no copyright trolls. Small can be very good for a video community. Plus, I have already spent my time dealing with the YouTube copyright crap, and I have no interest in going back there. Everything I have on Vimeo is backed up locally (and remotely); I basically have my bug-out bag by the door ready to go at any time. If they delete my videos, it would just mean finding another home for them, and maybe that is exactly what we’ll do with ds106tv 🙂

Anyway, I am getting off topic here, this entire process has gotten me even more excited about working with Tim and Andy Rush on a special topics ds206† course, namely #ds206video. And while we still have to iron out a bunch of details, I did mention this development in the above video, and Andy and Tim are officially on-board. It will be a multi-week open course around working with OBS, streaming video servers, hardware, video editing software, etc. with the idea of helping interested folks bolster their video game. I’ll hopefully have a lot more to say and blog about this anon. But for now let me try and document what I did for the “Like <3 and Subscribe to Your Digital !dentity in the Time of Corona” presentation.

In fact, it was quite similar to the process I blogged about for the EDUPUNK Q&A, so I will try and keep this a bit briefer with the understanding you can refer back to my previous post for details (at some point I’ll try and come up with a more cogent tutorial).

Like with the EDUPUNK Q&A I had 3 main scenes:

1) The Console Living Room:

2) Reclaim Arcade

3) Slide presentation mode

Each of these scenes is composed of pretty much the same inputs/shots as the OBS screenshots in the EDUPUNK Q&A post. The only difference is I deigned to change my shirt, or at least the unbuttoned button-up. What’s more, I added five video shots this time versus the four I used in the EDUPUNK video. Katie and Meredith’s Tik Tok videos were Media Source inputs of video files I had in a folder on my desktop. Nothing else special, other than making sure in Advanced Audio Properties that monitoring was enabled for the videos so I could hear them as well as the audience (see the “Brian on Crack” example in the EDUPUNK Q&A post). For the Buggles’ “Video Killed the Radio Star” video I included my webcam in that shot so I could be goofy and sing/dance along. This was as easy as adding another video capture device, my webcam, but I made sure there was no audio for me on this.

The coolest bit was that Katie filmed her two-minute discussion of the story around her viral Star Wars Tik Tok against a green screen, so I could add the Yoda image cleanly behind her using the Chroma Key filter on her video. That was a new process for me, and remarkably easy in OBS.

Finally, I mapped my Stream Deck with the 3 primary scenes: LR (console living room), Full Screen (Reclaim Arcade shot), and Vinylcam (which is actually the presentation mode). The other five video scenes are in yellow; I can order them to my liking and label them to make sense for me.

The last bit that differs from the EDUPUNK Q&A video is that I used the OBS Virtual Camera plugin for the Mac; it has been around for a while for Windows but is a new development for Mac this spring. This plugin lets me choose everything I output from OBS as a camera input for an application like ClickMeeting or Zoom or any other video conferencing tool that supports it (this is why I needed to test ClickMeeting well in advance). It is super slick. When I chose the OBS Virtual Cam as my camera for ClickMeeting, everyone could see my OBS output. That worked brilliantly.

In fact, I tested everything again that morning so that by 2:30 PM when I was ready to test in the room I was sure everything worked—did I tell you I was nervous? Well, while the virtual video output was fine I had a bit of a scare when Paco, who was awesome, let me know the audio from the videos was not coming through. Oh no! I was racking my brain until I remembered I needed to use the virtual audio output from the Loopback application I use to mix sound together from various applications. Once I switched my audio from just my mic to the virtual sound output I named OBS Audio I was cooking with gas with only 5 minutes to spare, whew! After that, it was show time and I am pretty stoked we could pull it all off! Let me know if you need some comedy relief at your next online event 🙂


*The big test ClickMeeting passed with flying colors was working with the OBS Virtual Camera plugin, but more on that in the post above.

†A few years ago Alan Levine and I toyed with the idea of doing a ds206 course wherein we started thinking more intensely about various aspects of managing your digital identity and resulting work online. I think the idea of doing it as a course through a university (which was on the table at the time) was a bit more than either of us wanted to commit to given other demands, so we shelved it.

Reclaim Today: Tunamelts and Telepresence

024: Tunamelts and Telepresence

On Thursday Tim and I recorded yet another Reclaim Today episode, and I have to say this may be my favorite to date. Not only because we are beginning to see some of the fun possibilities manifest with the Reclaim TV Studio in this production, but it might mark the beginning of a truly awesome project. Tim and I have no shortage of good ideas when we get going, but Tim has really hit on some gold in his recent quest to bridge time and space to make sure Reclaim Arcade stays weird. He’s a genius, and I love the madness. I might be getting ahead of myself here a bit, but the short version is he discovered this very cool site called Telemelt by Andrew Reitano, which is a way to play emulated NES games (amongst others) latency-free online with friends. With the simple click of the spacebar you can switch who controls the game, and it is remarkably seamless, totally free, and a by-product of our current locked-down reality.

And to this equation Tim added another dimension: the two of us playing games together in the proverbial and very real console living room in Fredericksburg, with him in person and me on the robot. The combination of playing seamlessly via the browser and then “being” in the same space as a robot was quite remarkable. Which led him to this idea: what if we could replicate this latency-free game play for the Reclaim Arcade cabinets and have folks come in via robot and play with others in the physical space? A fleet of robots occupied by folks all over the world playing games in Reclaim Arcade….CAN YOU DIG IT!

I am sure I’ll have more to say about this, but it is also worth noting that this was our first stream using multiple-scenes with green screens and a little OBS Ninja action. I’m not gonna lie, I am loving our new streaming overlords 🙂

Reclaim Today: Dialing in Reclaim’s TV Studio

022: More Reclaim Studio Improvements

This is another episode of Reclaim Today focused on our playing around in the Reclaim TV Studio. In this episode Tim does a pretty impressive show and tell of the work he has been doing with Elgato’s Stream Deck for making a seamless streaming broadcast, as well as demoing how he made a Raspberry Pi 4 into a streaming bridge based on Aaron Parecki’s YouTube video that demonstrates this brilliantly. In short, this allows me to stream directly to that Raspberry Pi, which becomes yet another input for the streaming setup, super cool! What’s more, Tim also figured out a way to get shortcut keys working in Streamyard (which are not built in) using the hotkey listener VICREO in tandem with the Chrome extension Tampermonkey.

Two Jim Grooms? One was more than enough?

It was a fun episode chock-full of cool stuff, and what’s awesome is that Reclaim Today is starting to find its groove. I’m finding the episodes are tighter and more focused on our experimentation. What’s more, they are proving a whole lotta fun! It helps that we have a dedicated TV studio now—which was an investment—but it is quickly proving quite useful, not to mention really fun to play with. As I was telling Tim after this episode, these days I wake up more excited about broadcasting to the radio or figuring out another angle of the streaming video puzzle than just about anything else. I have a talk coming up in a couple of weeks where I want to try and apply some of what we are playing with to see if we can make the virtual presentation experience more fun, engaging, and interactive using a few of these tools. I guess we’ll see if all this fun has a real purpose or not 🙂

Reclaim Cloud Case Study: Containing TEI Publisher in the Cloud

It started with an innocent enough ticket into Reclaim Hosting from Dr. Laura Morreale, whose work involves transcribing and translating texts from medieval manuscripts using online digital facsimiles. She asked if we could run eXist-db on her cPanel account in shared hosting. In particular she needed to run TEI Publisher, an open source application described as follows in its documentation:

The motivation behind TEI Publisher was to provide a tool which enables scholars and editors to publish their materials without becoming programmers, but also does not force them into a one-size-fits-all framework. Experienced developers will benefit as well by writing less code, avoiding redundancy, improve maintenance and interoperability – to just name a few. TEI Publisher is all about standards, modularity, reusability and sustainability!

A quick look at the basic installation documentation for eXist-db told me it was a Java app, which is a hard no for cPanel. But avoiding hard NOs when someone comes asking for help is one of the main reasons we started Reclaim Cloud. A cursory search for a Docker container for this application led me to one that seemed outdated. I responded suggesting we could try installing it on the Cloud if there was a current Docker image, which I was not finding. Turns out I wasn’t looking hard enough; it was linked from the eXist-db homepage right in front of my eyes. I was wrong, and Dr. Morreale responded suggesting she was becoming increasingly frustrated trying to get this application running online, saying, and I misquote for comic effect: “Dammit Jim, I am a Medievalist, not a server admin!” She was right, and this was why we started the Cloud in the first place; I needed to try harder. What’s more, I appreciated the fact she was so determined to make this work. So much so that soon after my last email trying to get this working, she sent me a link to the right Docker container on the recommendation of the folks at eXist-db:

That was all we needed, I simply searched for this container in the Docker area when creating a new environment in Reclaim Cloud:

Click “Next,” add the subdomain of this test environment (my example has since been deleted), and then click “Create.”

And within moments I was able to access the site at that subdomain:

The eXist-db splash page redirects to a suite of tools, including TEI Publisher!

A click on that icon brings us into that application:
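For anyone who wants to kick the tires outside of Reclaim Cloud, the same approach works anywhere Docker is available. As a sketch, here is a minimal run of the official existdb/existdb image (the exact image the eXist-db folks recommended isn’t named above, and tags and bundled apps may differ by version):

```shell
# Run eXist-db locally in a container (TEI Publisher may ship in the image
# or can be installed afterwards from the eXist-db dashboard)
docker run -dit --name exist \
  -p 8080:8080 -p 8443:8443 \
  existdb/existdb:latest

# The dashboard should then be reachable at http://localhost:8080
```

The one-click environment in Reclaim Cloud is essentially doing this for you, plus wiring up the subdomain and SSL.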

While there are still a few things to work out in regards to user management for the application, it seems like we may have a winner with this Docker container. In fact, Dr. Morreale’s struggle highlights a pain point for many humanities PhDs who need to run an application that demands a bespoke server environment. This is when the value of containers is extremely evident: a Java server environment running an application that provides a stable and citable publication venue for a Medievalist’s transcriptions and translations is a perfect case in point. In fact, Dr. Morreale was kind enough to furnish me with some insight into her work, process, and challenges for this post:

Like a growing number of humanities PhDs, I am an independent scholar who maintains relationships with several programs and institutions. I am currently affiliated in an official capacity with Fordham, Georgetown, and Harvard Universities, and am also engaged in ongoing projects with partners at Stanford and Princeton Universities.  My medievalist practice has always been characterized by a physical distance from both the repositories that hold sources which I study, and the institutions where my scholarly work finds its home. For this reason, digital methods have offered me a solution for my scholarly work when I had few others.

Some of the most rewarding efforts, which have in turn informed much of my traditional analytical work, involve transcribing and translating texts found in medieval manuscripts using online digital facsimiles. Using a tool called FromThePage combined with IIIF image technology, I can now easily choose digitized manuscript images from any online repository, upload them, then immediately begin to transcribe the text from the medieval source. I can also translate my own transcription after it is complete, and I have undertaken both individual and collaborative translation projects using this method. Right now my projects include a corpus of early 13th-century aristocratic legal codes from Crusader Cyprus, a rarely-cited history of Florence that was buried in a late 14th-century letter from a father to his son, and a little-known work by the Renaissance Florentine Leon Battista Alberti, found in a larger manuscript that has been broken up, with parts of it now housed at Harvard’s Houghton Library.

The one difficulty has been to find a stable and citable publication venue for these transcriptions and translations. I have tried several different programs over the years, but could never easily publish all the work I had done to bring more attention to these texts and manuscripts. Using Reclaim Hosting and a program called TEI Publisher allows me to create the kind of edition I would like, and to integrate images, notes, and other explanatory materials into my online editions.

In the end, the fact that we could help Dr. Morreale get what she needed fairly seamlessly is a thrill, and it highlights everything we hoped Reclaim Cloud would be. I am planning on turning this Docker container into a one-click application for the Reclaim Cloud marketplace so that other folks can hopefully scratch a similar itch. And special thanks to Dr. Morreale for so generously sharing her process and work to complete this post. Avanti!

IndieWebCamp: Domain of One’s Own Meetup

This past Tuesday I attended the second IndieWebCamp Domain of One’s Own meetup, generously hosted by Chris Aldrich. The format is a focused 10-15 minute talk around a specific technology (in this meeting Tim gave folks a walk-through of Reclaim Cloud), and then opens up for any of the 21 attendees to share something they are working on. Not only was I thrilled to see Jon Udell in attendance, but it’s always nice when one of your tech heroes tweets some love for your new project. Even better when you know they’re not one to offer empty interest and/or praise. Thanks Jon!

It was also very cool to read Will Monroe’s write-up of the session, and like him I found it a “very friendly group.” I realized while attending that this kind of low-key chatting and sharing is one of the things I have missed these days. Folks like Will who want to explore what’s possible in their classroom with Domains and beyond are a big part of what I miss about the day-to-day work of an edtech in an institution. And while I’m not necessarily champing at the bit to jump back into that game given the current circumstances, the ability to share and chat with folks who are interested in Domains is always a welcome opportunity.

During the sharing portion of the meetup Jean MacDonald, community manager at, turned me on to the Sunlit project while I was bemoaning the dearth of open source alternatives to photo sharing apps like Instagram. Soon after, I finally took the leap and signed up for a account to explore the platform, which has been an IndieWeb cornerstone for many folks I respect like John Johnston, Kathleen Fitzpatrick, and Dan Cohen, to name just a few. So I wrote my first post:

What was even cooler was the fact that while writing this post I logged back in and discovered a few folks had welcomed me to the community, including Jean MacDonald and Dan Cohen—that makes all the difference.

I’m sold; the IndieWeb meetup was a total win for me, and I look forward to the one next month. I am going to start getting serious about headless WordPress development for my new website, inspired by Tom Woodward’s talk for #HeyPresstoConf20.

So I’ll have something to share in my journey to learn headless WordPress, which will mean learning JavaScript, CSS, and some other insanity I am not entirely ready for. I have to give a special thanks to Chris Aldrich for putting this together and working to create a space to talk Domain of One’s Own within the IndieWeb community, and I know Greg McVerry has been pushing hard on this for a while now as well, so it is very much appreciated!

Reclaim Cloud Art, Bryan Mathers, and Gettin’ Air

It occurred to me yesterday, after finally listening to Terry Greene’s interview with Bryan Mathers for the Gettin’ Air podcast, that I never blogged about our Reclaim Cloud artwork. That needs to be rectified, and I will share the awesome below, but before I do I just wanted to say how much I enjoyed the interview between these two. Possibly the coolest part was when Bryan started interviewing Terry in order to see if he could “draw” out of him some ideas that he could refactor as a visual for the podcast, and voilà, Gettin’ Air has a new logo!

I dig it, especially given I have returned to snowboarding these last few years, but even better was Bryan getting Terry to talk about his idea behind the name, his articulation of what he’s doing and why—it was all so effortless and real. It was a beautiful demonstration of how the interview can become the thing it wants to share. So genius, well worth a listen if you have some time.

Anyway, that whole process reminded me I have not yet shared the work Reclaim Hosting did with Bryan this summer to get started on the Reclaim Cloud aesthetic. Given Reclaim Cloud is premised on a container-based architecture, we initially explored if we wanted to go down the road of shipping containers, and we have some initial sketches from Bryan that I absolutely love.

The containers are actually VHS tapes! A point made clearer in the heavy lifting image that follows:

It really is brilliant: it captures the idea of Reclaim Cloud as both container-based and industrial-strength, which it is! But ultimately, after talking with Bryan, we realized the hard limits of the nautical/container metaphor. So we moved on to Cloud City, an idea Martha Burtis and I fleshed out for Domain of One’s Own back in the day.

I still love that poster; in fact, I have a stamped copy of it framed and hanging on the wall behind me as I write this. So we got to talking a bit about it, although Tim was a bit reluctant given he is not a Star Wars fan, but through conversation the idea of a retro-futurism aesthetic began to emerge, à la The Jetsons.

And Bryan’s rough sketches had us very intrigued:

The idea of scaling your domain was fun, and the way Bryan mapped that onto retro-futuristic housing was brilliant. In the final image the beginnings of a logo/cloudlet already take shape. This was our aesthetic, and we kind of knew it during the discussion, but the seeds of the sketches sealed it.

The final option was to stick with the music/video metaphor we already have and push it further with mixed tapes. But it just felt forced, and I think Tim and I both wanted the freedom to jump out of that metaphor and explore something new, and I am really glad we did.

The next conversation after deciding on Cloud City was to scout the internet for ideas ahead of our next discussions, and that is when Tim landed on industrial designer Arthur Radebaugh’s Closer Than We Think comic strip from the late 1950s through 1963. The way the strip pairs an explanatory panel with art full of explicit arrows illustrating the future gels nicely with our idea of introducing Reclaim Cloud as a way of highlighting for higher ed what’s possible in this new space. So, we got to talking, and the first round of art was amazing:

I really love the industrial logo for Reclaim Cloud, which is itself an encapsulated container, a cloudlet if you will, and this idea of self-contained cities became a big part of our aesthetic. And the fact that Bryan Ollendyke said it reminded him of Bioshock on Twitter just sealed it for me 🙂

We were sold after this image, a kind of brochure for Cloud City that let us start exploring what it would mean to create a series of vignettes of the different options for anyone interested in moving to the Cloud. It was just too fun, so the follow-up discussion was to explore the Closer Than We Think comic strips to highlight some of the one-click applications we have for courses, organizations, and digital scholarship:

Pure magic! The way the container has become an organic part of these images is just so awesome. I love the one outside the window of the home classroom. This idea that it is all connected yet separate is one way to understand the cloud, and Bryan really brought it home. And as amazing as all the art is, I think his breakdown of the various elements of a Reclaim Cloud container that could incur costs is a full-blown masterpiece:

This sphere is everything, literally. I just love the way the aesthetic has evolved, and the final bit is thinking through how we’re going to highlight what is happening within each cloud. This led us to the idea of “What’s in your Cloud?” wherein we talk to folks who give us a peek into their Cloud: what they are running, how, etc. The following image is a placeholder, but we are thinking through ways of capturing the individual nature of folks’ clouds for each episode, and Bryan mentioned some kind of comic-like avatar, like my Cotton Mather avatar in a spacesuit holding my Cloud sphere, which would be awesome!

Anyway, I think that brings us up to date, and to be clear this has only just begun. We are thinking of Reclaim Cloud as a long game. We know it will not replace cPanel hosting; we have plenty of time to experiment with the possibilities; and we can slowly start moving our existing infrastructure over as we become increasingly comfortable with the environment. Not to mention it has forced us to dig in and learn a lot more as a company, and as much as I was kicking myself given I was just starting to feel a bit liberated from the day-to-day, in the end I love it. We’ve been dreaming of this kind of infrastructure since we started Reclaim Hosting, and in 3 short months we went from nothing to a pretty full-blown product that provides some concrete solutions for academics wanting to host something outside of the LAMP stack. And this retro-future aesthetic is our way to start experimenting in this space without pretending there aren’t also real problems baked into every solution—we’re here to explore right alongside you.

Reclaim Today: Taking the Studio on the Road

Click image for video

We’re currently building out Reclaim Hosting HQ’s TV studio, and as a result we’ve been doing more Reclaim Today episodes—which is a welcome change. In episode 21 we discuss what a video kit would look like for remote workers like Lauren and me. The idea is that the mothership, Reclaim Hosting’s office studio, would be where all the heavy lifting happens, but Lauren and I would need tight video setups that allow us to seamlessly integrate into a distributed stream, not to mention the importance of having a solid rig as more and more events and trainings go fully online.

And we even had a view or two, thanks Simon! The discussion delineates what a remote kit would look like, and below is the list of equipment I got for my setup (Lauren’s differs a bit based on availability). There was more Elgato equipment available in Italy than in the US (the company is headquartered just up the valley in Munich, Germany), as demand for webcams, portable green screens, microphones, etc., is still peaking given the US is experiencing the never-ending lockdown. So, here is my annotated list:

Elgato Key Light Air (2x): Lighting, lighting, lighting! One of the big takeaways from our discussion with Andy Rush a couple of weeks back was that good lighting is everything. So I got two portable, adjustable desktop lights that I can link and control via my phone. They were $130 each and sit on either side of my computer (as pictured above), and they do make all the difference, but the app is a bit wonky at controlling both seamlessly, so that is something to consider. Still, I love how seamlessly they work on the desk, behind my monitor on the left and next to the one on the right.

Elgato Wave Microphone: Next up is sound. I currently have a Yeti mic that has worked pretty well for me, but one of the drawbacks is I tend to keep it off to the side, and I find my levels are consistently low and it picks up everything. That said, the Yeti may be more than enough for folks, but I wanted to try the Elgato Wave 1 to see if it was different. It just came this morning, so I will have to follow up after playing around more, but a potential benefit of the Wave mic is that it comes with mixing software.

Logitech C920 Webcam: This is the camera I bought after mistakenly getting the Logitech C615, which sucks. While only a $15-20 difference, the C920 is far superior. And while I think this will be a good solution for most, I am still planning on mounting a Canon DSLR behind and above my main monitor and bringing it in as an input for OBS using Elgato’s Cam Link 4K video capture card. More on this experiment anon, but at $115 the Logitech C920 (which is $20 cheaper than the Cam Link video capture card, and $1000+ cheaper than a DSLR) is a very solid and affordable camera for a remote kit.

Elgato Portable Greenscreen: Finally, the portable greenscreen from Elgato officially makes me an Elgato brand boy, doesn’t it? I can live with that. I had to pay a few extra bucks for this from a third-party vendor in Italy given it was sold out here, but nothing like the price gouging by vendors in the US right now. It has yet to arrive, so I will write more once I get it and can play with it, which will invite more posts actually exploring the possibilities of using a greenscreen when streaming, some of which Tim highlighted in this video, and they are so fun!