I'm very excited to announce that starting today it is possible to build and run Node.js applications on Reclaim Hosting. Similar to the Python and Ruby features that make running Django and Jekyll possible, Node.js is a third-party plugin integrated into the Software area of cPanel. We have the latest versions of Node 6.x, 8.x, and 9.x available to build on, and creating a Node application gives you shell access to npm for integrating packages. Through the use of Passenger, applications can be built and run directly over Apache, allowing you to run your application proxied to a top-level domain or subdomain without port numbers.
*Node.js support is available to all shared hosting plans as well as Domain of One's Own servers that currently utilize CloudLinux. If you are a program manager and would like access to this, please reach out to discuss adding this feature.*
Last month I forgot to call my dad on his birthday. It's certainly not the first time I have forgotten something important. I'm not wired in the necessary way to piece together all the important bits of information that would make me "thoughtful". Maybe that's a red herring and I'm just not a thoughtful person, but I'd like to think that's not the case and that life just gets in the way, something I'm sure many reading this can sympathize with.
Today I came across a really interesting application called Monica, and while I'm certainly reticent to throw software at all my problems, I can't deny the possibility that having a smartphone or email or even *shudder* Facebook has helped me stay in touch with people. Monica bills itself as a "Personal Relationship Manager", and if you're familiar with the CRM acronym (Customer Relationship Management) in the context of running a business, you can start to see where this could be useful. In Monica, you can add family, friends, and others you meet in your walk through life along with any information you have on them. Monica will remind you when birthdays or other important events are coming up. If you save gift ideas for people, it keeps track of those too. You can see relationship connections to know who is married to whom. If you were really masochistic you could log all your interactions with all of these people there (spoiler alert: I won't be doing anything quite that crazy).
Best of all, Monica is open source, so I was able to install it on my own domain and start playing with it. All the data is private to me and not shared with any other social media platforms. Like any relationship manager, you likely get out of it what you put in, so I will have to regularly add information to find real use there. But even if it's nothing more than birthdays, that could be useful to someone like me. I hope to see more integrations that take advantage of platforms like Facebook, which already have a lot of information, to avoid all the tedious data entry, and perhaps better notification options beyond getting emails. But the core idea that I have space on my domain where I can keep track of the relationships in my life is interesting to me and gets at the heart of what I think it means to have a domain of your own. It's not all WordPress blogs out there.
One of the biggest challenges for folks new to building a website with WordPress is that it feels very much like writing/blogging software out of the box. Yes, you can create pages, but as soon as you want to structure information in columns or do anything more complex than images and text, you'll quickly discover you need a theme or a handful of plugins to get the job done (and if you don't know what you don't know, that's a huge hurdle). "Site builder" plugins are becoming more and more popular, and you even see more themes integrating them into their frameworks these days. Some are pretty good, some suck really badly. One I really like, and wanted to demonstrate as a way to quickly get up and running with a WordPress site, is Elementor, which is both free (there's a Pro version with more features I'll discuss in a bit) and incredibly user-friendly with a lot of great options.
Rather than just talk about the various features, let's go through the process of building out a demo site to see what Elementor has to offer. To start we'll fire up a WordPress install on Reclaim Hosting. Nothing too crazy, your typical Twenty Seventeen theme with the big ass succulent.
Since this isn't going to be a blog I'm going to go to Settings > Reading and change the homepage to be the About page that WordPress creates by default.
If we were to look at our site at this point nothing much has changed. We have the About page showing by default on the homepage. Let's click that Edit Page button to start making some changes.
In addition to your standard Post/Page editor functions we see a big Edit with Elementor button at the top. But before we hit that I want to also draw your attention to the Page Templates in the righthand sidebar. Elementor offers templates to go full width instead of being limited to the layout of the theme we're working with (if you look above at the previous screenshot all the content is pushed to the right which we don't want). They also have a Canvas option which is awesome because it essentially nukes the whole theme content and gives us an empty workspace to build from. I'll be using that so I select it and click Update.
Now let's hit that Edit with Elementor button and start building out the site. You'll see we're taken to an interface that looks a lot more like the WordPress Customizer than the traditional page editor. We can add and edit items via the lefthand sidebar and interact directly with them in the righthand preview, seeing the site exactly as it will look to visitors.
WordPress added some demo content to the About page that I don't need, so I hover over that and click the X to remove that text block.
Now below that we can add a new section for content. Let's start by looking at the template options Elementor comes with which is a great way to kickstart building a site rather than starting from scratch. Click Add Template and you're taken to a library of different full page layouts complete with demo content to choose from. Fair warning here, you will find templates that are "Pro Only" meaning this is where you'd need a paid version of the plugin to use those (and hey, if you like the plugin maybe it's worth throwing some money at the developers!), but there are a lot of great free options.
I chose a simple "About Page" template and now my editor interface has a variety of content that I can start modifying to put in my own information.
You'll find you can simply click in the various boxes and edit text directly as well as modify things in more detail using the lefthand sidebar which dynamically changes when an element of the site is selected.
Elementor also has options in the lower left to see how the site will display on tablets and mobile devices.
Going back quickly to the same area where we added Templates you'll notice there's a tab for Blocks as well. Templates are full page designs whereas Blocks are just small snippets of content (an FAQ section, a Call to Action, etc).
You can also build from scratch and add specific elements to your page to create your own layout. You start by creating a new section, choosing how many columns you'd like.
Then you can drag elements over from the left sidebar. Elementor has a large library of widgets to choose from, with everything from text, images, videos, maps, and buttons to dynamic content like blog posts or the tags and categories from your site.
When you drag an item out to a block you can then edit the content of that element in the sidebar.
There is quite a bit more to Elementor, but that covers the basics of the plugin and how you can use it with WordPress to quickly build out dynamic, responsive websites that feel less like a blog and more like a full-fledged website. If you start on WordPress and immediately feel out of your element trying to get what's in your head onto the screen, I'd encourage you to give this method a shot.
When we started Reclaim Hosting it was with a lot of hopes and dreams and a single server. Hard to believe we only had a single server to manage for the first ~8 months of Reclaim's existence (and Hippie Hosting was so small that it never needed more than one shared server for all customers). Those days however are long gone; today Reclaim Hosting manages a fleet of over 100 servers for customers and institutions across the globe. As you can likely imagine, it's been a hell of a learning curve to get to a point where we are comfortable with such a large infrastructure.
Unlike a high-availability setup where you might have a bunch of servers spinning up and down but would never log into a particular one, in web hosting each server is managed individually. SSH keys and a shared password management system go a long way toward alleviating headaches with access. But the biggest hurdle has been configuration management, and I finally feel we're starting to get to a place where I'm comfortable with it (I think it will always be a process). By configuration management I mean keeping track of and managing everything from the versions of PHP installed, to which system binaries like git are installed and available, to upgrades of software over time, and things of that nature. With an infrastructure this large, currently split across 3 different companies, many different datacenters, and even a few different operating systems and versions, it is inevitable to find that what you thought was running on one server is actually out of date or didn't get installed at the time of provisioning.
There are two things thus far I have done to tackle this issue. The first was to take a hard look at our provisioning process. In the past this was completely manual: fire up a server and start working through a list of things that needed to be installed. If I normally installed something but forgot to mention it to Jim, there's a great chance it wouldn't get installed. And if the coffee hasn't kicked in, the whole system by its very nature is prone to user error resulting in an incomplete setup, not to mention the process could take anywhere from a few days to a week if other stuff got in the way. It may not sound sophisticated, but I wrote a bash script to automate the deploy process. There are a few prerequisites that have to be installed before running it (namely git, since the script lives in a private repo, and screen, so we can run it with the session closed), but what used to be a process measured in days can now complete the majority of the work in about an hour and a half. Here's everything the script does thus far:
Install git, screen, nano, and ImageMagick
Run through some first-time setup options for cPanel configuration
Compile PHP with all extensions and Apache modules we need
Install Installatron and configure settings
Install Configserver scripts (firewall, mail management, exploit manager) and configure with settings
Update php.ini with approved values
Install Let's Encrypt
Install Bitninja (a distributed firewall product we use)
Set up the custom cPanel plugin Reclaim Hosting uses for application icons
Configure automatic SSL certificates
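The steps above can be sketched as a bash skeleton. To be clear, every function body below is just a logging placeholder standing in for the real installer commands, which aren't part of the post:

```shell
#!/usr/bin/env bash
# Skeleton of a provisioning script along these lines. Each step is a
# placeholder (echo only) -- the actual cPanel installation commands
# are not shown here.
set -euo pipefail

log() { echo "[provision] $*"; }

step_base_packages() { log "installing git, screen, nano, and ImageMagick"; }
step_cpanel_setup()  { log "running first-time cPanel configuration"; }
step_php_build()     { log "compiling PHP with required extensions and Apache modules"; }
step_php_ini()       { log "updating php.ini with approved values"; }
step_lets_encrypt()  { log "installing the Let's Encrypt plugin"; }
step_firewall()      { log "installing and configuring firewall/exploit tooling"; }

main() {
  step_base_packages
  step_cpanel_setup
  step_php_build
  step_php_ini
  step_lets_encrypt
  step_firewall
  log "provisioning complete"
}

main "$@"
```

One nice property of structuring it as functions: a run that fails partway can be resumed by commenting out completed steps, and `set -e` stops the script at the first error instead of barreling ahead with a half-configured server.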
Mostly my process has been working backward from the things we used to install and configure manually and finding commands to do the same work (installation is always possible via the CLI, but configuration is often done through the GUI, so finding the right way to configure things was a bit more time consuming and sometimes involved editing files directly with sed). We have more that could be done, and again I treat this all as a process we can continue to refine, but it has gone a long way toward making the initial hurdle of setting up servers a "set it and forget it" affair.
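For what it's worth, a sed edit of that sort looks something like this. The file path and directive values below are made up for illustration, not our actual settings:

```shell
# Create an example php.ini fragment to work against (illustrative values only).
PHP_INI=/tmp/php.ini.example
printf 'memory_limit = 128M\nupload_max_filesize = 2M\n' > "$PHP_INI"

# Non-interactively raise memory_limit; -i rewrites the file in place,
# which is what makes sed handy in a provisioning script with no GUI.
sed -i 's/^memory_limit = .*/memory_limit = 256M/' "$PHP_INI"

cat "$PHP_INI"
```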
So with our deployment process becoming more streamlined, the second piece of the puzzle was to get a handle on the long-term configuration of servers, the various changes and settings that we have to manage. If Apache is getting an upgrade, it should be tested on a dev server and, with approval, pushed to all servers to avoid any mismatch. In the past that meant opening up a bunch of tabs and getting to work, but that's not a scalable approach. I've taken to learning more about Ansible as a configuration management tool and it has already saved me countless hours.
Ansible doesn't require any agent to be installed on our machines; it uses SSH access to run its commands. Commands are put together in "playbooks", which are nothing more than .yml text files. There are various "roles" and modules that can handle everything from installing software with yum and moving files back and forth to more complex tasks, and people can write their own, so there is a lot out there already for popular approaches to configuring and managing servers. At this point you might be thinking "Well if it can do all that, why did you write a bash script instead of using Ansible for deploying servers?" and you're not wrong, Walter. Long term it probably does make sense to do that, but Ansible playbooks have a very specific way of doing things that is replicable across a lot of servers, and frankly it would require a lot of work to rewrite the deployment method that way, so it's a goal but not a major issue in my eyes.
Now with Ansible if I decide I want to roll out HTTP2 support to our servers I can write a small playbook that installs it via yum and then run that against our entire fleet. If a server already has the support for it then it doesn't have to make a change so there's no harm in running playbooks on a variety of servers that may or may not have common configurations to get them all up-to-date. If anything the biggest challenge is not writing the playbooks (which I actually enjoy), it's keeping our inventory file that holds data for all of our servers up-to-date. A dream would be to use the Digital Ocean API to dynamically update our inventory in Ansible so if a new server is added there it's automatically added to our inventory.
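As a sketch of what that looks like, a minimal playbook for the HTTP/2 example might be something like the following. The host group and package name here are illustrative, not our actual inventory:

```yaml
# http2.yml -- run with: ansible-playbook -i inventory http2.yml
---
- hosts: cpanel_servers        # illustrative group name from the inventory file
  become: true
  tasks:
    - name: Ensure the Apache HTTP/2 module is installed
      yum:
        name: ea-apache24-mod_http2   # example package name
        state: present                # no change if already installed
```

Because the `yum` module only acts when the package is missing, running this against the whole fleet is safe: servers that already have the module report "ok" and are left alone.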
I'm confident that developing these configuration management processes will help us ensure our customers benefit from the latest software and better support with a standard environment they can count on from one server to the next. And if it saves us time in the process to devote to new development work even better.
It’s almost hard to believe it’s only been 2 years since Let’s Encrypt came out of beta and began providing SSL certificates to the general public. I wrote a post at the time calling it a turning point for the web, but cPanel support was pretty much non-existent. Since then much has changed. Just 2 months after that post was written we began using a plugin that offered Let’s Encrypt support directly in cPanel for all users on Reclaim Hosting and announced general support for free SSL certificates. In August of 2016 we began scripting automatic certificate provisioning for domains using the plugin and hooks from our billing system, and I wrote a post aptly titled SSL Everywhere where I wrote:
After testing over the past 2 weeks I’m pleased to announce that going forward every domain hosted by Reclaim Hosting will automatically be provisioned with a free and renewable SSL certificate by default.
Around that same time cPanel had also made strides toward their own support for automatic certificate provisioning with a feature called AutoSSL. Initially AutoSSL only supported cPanel’s own certificates issued through Comodo, but later Let’s Encrypt support was added. Rate limits imposed by both certificate providers made it difficult to truly promise SSL everywhere, and one issue we found was that notifications were a real problem.
Normally, receiving a notification that your domain is secure would be a good thing. However, we have often found it can confuse customers who think they might have been charged for something, or that the email is spam, especially if they didn’t specifically issue a certificate themselves (and remember, we were attempting to issue certificates for all users, so that would often be the case). Our ideal scenario is one in which all domains have certificates but no one gets needless emails regarding the provisioning of them (success or failure). Our plugin offered granular notification settings and at the time AutoSSL did not, so given the conflict we decided to double down on the Let’s Encrypt plugin and disable the AutoSSL feature across the board to streamline things.
We have more recently found a key difference between what the AutoSSL feature can accomplish and what the plugin we use can: AutoSSL can (and in many cases has) replaced and renewed certificates for expired domains. That is a good thing in that even if you had a self-signed certificate, or had previously paid for one that expired, you’d get a new free one. What we didn’t know was that our plugin was not able to do this, so when we disabled cPanel’s AutoSSL feature we suddenly had a large number of domains with cPanel-issued certificates that the Let’s Encrypt plugin could not renew or replace, leading to confusion as folks woke up and found their sites didn’t work over https.
In the past we have pointed folks to our documentation on installing a Let’s Encrypt certificate, but remember, our goal was that no one should have to do that. SSL Everywhere was and still is the goal. We needed to fix this. I’ve reached out to the plugin developers, who are now aware of the issue and have committed to working on a fix that could be released along with wildcard support in the next 2-3 months. But that’s a long time to continue fielding issues of certificates not renewing, which can render a site inaccessible.
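One quick way to see why an affected site breaks is to check the certificate's validity window yourself. The sketch below generates a throwaway self-signed certificate just to demonstrate the command; against a live site you'd point `openssl s_client -connect yourdomain:443` at the domain instead:

```shell
# Generate a short-lived self-signed certificate purely for demonstration.
openssl req -x509 -newkey rsa:2048 -keyout /tmp/demo.key -out /tmp/demo.crt \
  -days 90 -nodes -subj "/CN=example.com" 2>/dev/null

# Print the notBefore/notAfter dates -- if notAfter is in the past, browsers
# will refuse the https connection.
openssl x509 -in /tmp/demo.crt -noout -dates
```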
We decided this week that a better short-term solution was to turn the AutoSSL feature back on and have it issue certificates for any domains that did not have them or whose certificates had expired. The Let’s Encrypt plugin will continue to exist alongside it, the goal being that users have a certificate from one or the other automatically, and either way it renews automatically. Unfortunately, an attempt to ensure that users didn’t receive a bunch of notifications about this failed. cPanel provides an API call to change the notification setting, and it returned the correct response, so I didn’t think to check whether the setting was actually changed. It wasn’t. Long story short, many users got emails for every certificate provisioned. We’ve fixed that now so the emails won’t be sent in the future, and meanwhile the good news overall is that I think we’re much closer to the goal of SSL Everywhere: provisioned by default and renewed automatically with no work on the part of users.
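The lesson learned there generalizes: after any API write, read the value back rather than trusting the success response. A tiny sketch of the pattern, with `api_set`/`api_get` as stand-ins for the real (unspecified here) cPanel calls:

```shell
# Stand-in "API" backed by a temp file -- the real calls would hit cPanel's API.
api_set() { echo "$2" > "/tmp/demo_setting_$1"; }
api_get() { cat "/tmp/demo_setting_$1" 2>/dev/null; }

api_set autossl_notifications disabled

# Verify the write actually took effect instead of trusting the return value.
if [ "$(api_get autossl_notifications)" = "disabled" ]; then
  echo "setting verified"
else
  echo "setting did not stick -- investigate before moving on" >&2
fi
```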
We’ll continue to keep an eye on this in case the landscape changes (with technology it always does) and as always reach out if you have any questions or concerns!
It seems like every 1-2 years we get a major security scare in the form of a global exploit that affects server infrastructure in some fashion and requires a response. We’ve had Heartbleed, POODLE, and Shellshock (who comes up with these names anyway?). 2018 didn’t wait long to bring us that gift in the form of Meltdown and Spectre. https://meltdownattack.com/ has a lot of great information about these two exploits, but the short story is that rather than taking advantage of any particular software configuration, these exploits take advantage of vulnerabilities in pretty much all modern CPUs. That means not only does this require patching by server admins like me at Reclaim Hosting and across the web, but every operating system on every device, including mobile devices and personal computers, is vulnerable. The attacks abuse speculative execution in the hardware to leak data from protected memory so that it can be read by the attacker. It’s not a question of whether or not you are affected: you are affected.
Antivirus can’t block it either; only patching the underlying systems will resolve it, and thankfully companies have been hard at work getting these patches developed since long before the news became public. Intel became aware of the exploit last fall, and many major companies have been under an NDA as they developed patches to secure their systems. Due to the complexity of this exploit, however, we are still awaiting patches for some systems, and now it is public (which will hopefully light a fire under certain groups to get these patches out).
Thankfully, when we at Reclaim became aware of the issue last week, CentOS, the distribution of Linux that powers over 90% of our server infrastructure and the only supported distribution for cPanel, was already releasing patches. We had to do some testing as well as await patches from CloudLinux, the third party we use for our kernel software, but by Monday we felt confident the patches were safe and we set to work patching our entire fleet. Normally with maintenance that involves downtime we like to give customers a heads up, and with this kernel update requiring a reboot, sites would indeed be offline for a few minutes. However, we made the judgment call to rip the bandaid off and favor getting these patches in place as soon as possible rather than risk data being exposed as a result of the vulnerability. By 6PM Monday our entire cPanel infrastructure and all CentOS servers were patched for these exploits with minimal downtime across the majority of our servers.
We have a small number of Ubuntu servers for which we are still awaiting a production patch, and we hope to receive it sometime this week. If you want to make sure you are secure, the best thing you can do is run all updates for your operating system and browser so you’re on the absolute latest versions. Due to the nature of the exploit there is no way to trace whether the vulnerability has been taken advantage of (it does not log any of its actions), so it’s particularly important to be proactive. I’m proud of the capacity of Reclaim Hosting, as a small operation, to remain aware of these events and stay on top of them in a timely manner.
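If you run your own Linux box and are curious whether it's patched, kernels that include the fixes report mitigation status under sysfs. This is a hedged check: on an older, unpatched kernel the directory simply won't exist, which is itself a sign you need to update:

```shell
# Kernels with the Meltdown/Spectre fixes expose mitigation status in sysfs.
VULN_DIR=/sys/devices/system/cpu/vulnerabilities
if [ -d "$VULN_DIR" ]; then
  # One line per vulnerability, e.g. "meltdown: Mitigation: PTI".
  grep . "$VULN_DIR"/* 2>/dev/null
else
  echo "No $VULN_DIR -- this kernel predates mitigation reporting; update it."
fi
```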
Now can we take a nice long vacation from these major exploits? My spidey sense tells me that’s likely not to be the case as we come to rely more and more on computers and specifically internet-connected devices in our lives. It’s the new normal and the best security we can hope to have is proactive patching and awareness.
Well, #domains17 is done! We wrapped up on Tuesday and are all home by now. I definitely needed a few days to gather my thoughts for this post. I’m so grateful for this chance to experience what the Ed Tech world is like before I even start my job with Reclaim Hosting. It was a great way to meet tons of new people I will interact with.
I wanted to talk a bit about what I wanted to get out of the conference, how the conference actually was, and what I’m doing after it.
So I knew that Reclaim was planning a conference in OKC back when I was an intern. Lauren would send updates on her planning over Slack, and it was very cool to follow how she planned out the entire thing. I was looking forward to hearing about all the fun once I started at Reclaim. Then about 2 weeks ago, I received an email from Tim asking me if I could come along with them to OKC. I was on board immediately! I definitely didn’t want to miss the chance to hang with the full team before I started and to see what the Ed Tech world was all about.
Now flash forward to Saturday. I was so nervous, anxious, but mostly excited. I was nervous because it was my first exposure to a business conference. I was anxious because I really only knew the Reclaim and UMW crew out of the 80 people that attended. But I was mostly excited for this wonderful opportunity to really jump into my career with both feet before it even begins.
I arrived at the hotel by the early afternoon and met the whole team to get things set up for the conference. Although the conference really started on Monday, we used Saturday and Sunday to get acclimated to the space and have everything ready for when people got in on Sunday. We all got dinner together along with Adam Croom, the University of Oklahoma liaison for the conference, who worked closely with Lauren to plan it. It helped to have someone on the ground who knew the surrounding area and was able to provide awesome recommendations.
Sunday rolled around and it was a great day full of awesome conversations. Lauren and I started the morning by walking to a local coffee shop called Coffee Slingers. And when I say walked, I mean like 30-40 minutes through the city. OKC is a weird mix: lots of open space, but you also get into the city quickly. It was such a nice morning, despite the rain, that I totally didn’t mind walking. I enjoyed the time to get to know Lauren a little more beyond our Internet class with Jim in 2014. It was a great girl-bonding morning. Afterward, we met Jim and Tom Woodward for lunch and another meal full of awesome conversation. Tom talked about his work with Georgetown University; he gave a presentation on it during the conference, which you can read here. He took some awesome photos as well.
This was my first experience with a conference like this where I was actually involved. I’ve been to other conferences before, but never on my own and in this capacity, with people who are now colleagues. Honestly, I couldn’t think of a better way to be introduced to the new professional world than through this conference. Jim, Tim, and Lauren all helped make me feel very welcome by introducing me to people and asking me to be involved with a lot of the conference.
We kicked off the conference with a Domain Fair, where participants had numerous booths talking about the different projects they were working on. It was a great chance for people to catch up. For me, it was a great experience to see for the first time what people were working on. I was also recognized from Twitter, which was insane; I hadn’t thought that my profile would be recognizable!
Then it was time for Martha’s keynote! Martha Burtis was my boss at UMW as director of the DKC, and I knew she was going to talk at the conference, but I had no idea that I was going to see it. It was totally awesome. At UMW, the Domain of One’s Own project has been around for 4 years. I was a student there when the program started and I’ve seen it grow so much over the years. Martha talked about the DoOO program being at a point of “inflection,” as she called it, shifting the focus from getting the program set up to deeper thinking about what DoOO really is. Martha said:
“I want to spend my time here dwelling on the inextricable, in this case, why we in higher education must teach our communities to grapple with the Web in these deep and discerning ways — how the Web, and our culture, and our systems of education are bound up with each other and why they demand a particular responsibility of us.”
For me, this quote really stuck. I have noticed a lot of times that people don’t really understand how to navigate the web, and not just the students I’ve encountered as a DKC tutor either. Other students and friends take the web for granted very often. It’s important that we teach others how to use the web and what the web represents in our society today, and that we promote digital citizenship. Martha continued to talk about DoOO and provided some thought-provoking points. Towards the end of her talk, she mentioned my name. I was totally surprised! She talked about the individual study I did last semester and one of the questions I asked during the interview process: if the web were a concrete space, what would it be? Martha put together all of the answers to that question. It was such a cool video, take a look:
She challenged us to think about what the web would look like if it were a concrete space to us. After I interviewed everyone for the project, I had an idea of what everyone else was saying, but I never really put it together the way Martha did. I thought about it and thought about it, and then it hit me: the web is a shipping container. You can do a ton of things with shipping containers: build shelters and buildings, or ship things in them. But when I talk about the web as a shipping container, I don’t mean just one of them; there are thousands of them around the world, and they can be transported anywhere. There’s not just one item in each, either; there can be a bunch of different products in one container. Just like websites, where there are tons of different things within a single site.
I thought this example was perfect for what the web represents in my life. My dad works on ships, piloting them from the mouth of the Chesapeake Bay up to Baltimore Harbor in Maryland. He travels on all types of ships: car, container, and tanker ships. So to illustrate my example of the web as a concrete space, I thought I would use some of the photos he’s taken from the ship’s perspective:
After Martha’s keynote, the group broke into sessions for the remainder of the conference. This was another great opportunity to see what others are working on. It was all very new to me, which was very exciting, but there was a chance for me to step out of my comfort zone as well. I’m used to being behind the scenes of events, not presenting in front of other people, and during a few of the sessions I was introducing the speaker. It may seem like a small thing, introducing someone, but for me, especially since I’m so new to the field, it was pretty daunting. Luckily I got to introduce some of the UMW DTLT crew, so that took a little bit of the nerves away.
The first talk I introduced was Sean Morris and Jesse Stommel’s “If bell hooks Made an LMS: Grades, Radical Openness, and Domain of One’s Own.” Their talk was awesome, diving into what a learning management system might look like if it were built on bell hooks’s ideas. Here are a few quotes from the talk:
"The LMS is not a cage to put student in" #domains17
I also introduced Jordan Noyes and Lora Taub, who examined archiving protests. This was something I’d never really thought about. I haven’t participated in a protest before, but after their talk I was intrigued. So I’ve set a new goal: start archiving protests, or participating in them for that matter.
One of the other talks I went to was from Jess Reingold and Jenna Azar. Jenna is an Instructional Designer at Muhlenberg College who also runs the Digital Learning Lab, which is just like the DKC. She brought along her son, Jarrett, who is a Digital Learning Assistant and helps students with their digital projects. It was really interesting to see how the Digital Learning Lab is run compared to the DKC, and really cool to see the concept the DKC started continue to grow.
Things to take away from #domains17: your mom may have great ideas but you don't know it until someone else recommends it
Overall this trip was one of the best things I could have done to kick off my career. It still hasn’t hit me that I start at Reclaim Hosting this week. I feel refreshed, excited, and motivated to get a start and jump into my work at Reclaim. Thank you, Tim, Jim, and Lauren for this opportunity!
Well, as this semester wraps up, I just wanted to write a reflection on my time at Reclaim Hosting. It’s really weird to talk about as the last weeks of my undergraduate career finish up. It still hasn’t officially hit me that I’m graduating in May. But things are finalizing all over the place: I just had my last classes and I’ve got a job lined up! I’ve officially accepted a position to stay on the team at Reclaim! I start in June, so I have a little time to enjoy my summer, and it will be exciting to move back up to Fredericksburg.
Enough chatter though! Time to get into it. Not too much has been going on. I was still working on documentation for the new company the team has started, called Rockaway Hosting (more on that in a bit). I created a style guide to use with the new articles. Style guides are vital in technical writing: these articles are what clients will look at when they have a problem, and a style guide keeps each article flowing in a precise way so that readers don’t spend too much time on it. The goal of each article is to solve the client’s problem efficiently. The tricky thing is that you also want clients to stay on your website, because they are more likely to click on more pages and posts. Luckily I didn’t have to write the articles; if I had, I would have had no idea what to do. Most of them covered topics I never really thought about until I started with Reclaim. There was a major learning curve, and I actually used the articles to figure out what I needed to do. As I combed through each article I learned a lot about the different topics too, because I would go through the steps to make sure they were accurate. Unfortunately, I did not finish all of the articles, but that’s okay! I can continue going through them when I start in June.
I also wanted to talk about what I've learned throughout my time as an intern. For starters, I got to see what it was really like to be in a workplace. I've had summer lifeguarding jobs and, most recently, my job at the Digital Knowledge Center, and those helped me gain skills I can put toward my career. They were professional environments to some extent, but they are nothing like a real office space. Even though Reclaim is as casual as it gets, there is still a professional feeling to it that I hadn't experienced at my other jobs. I was given projects and I would work on my own. A lot of the time at my other jobs, I would work on projects with other people, so it was a real change to start working by myself. I learned a lot about time management and staying away from online distractions.
I learned a ton about web hosting and the many components that go into it. It is really such an interesting field in technology. I never realized how intense web hosting is. There are a ton of moving parts: you have what the clients see and what the administrators see. There is a community forum used by clients to search for help when they run into trouble. Of course, there's the support side of things, which, I have to say, is the best part (but I'm biased). It's so fascinating to explore the ins and outs of web hosting; it's really a field I've come to enjoy.
So let's talk about Rockaway Hosting! Jim wrote about it on his blog here (he explains it way better than I will). But Rockaway Hosting is the non-educational counterpart to Reclaim Hosting. Reclaim Hosting is mainly about Domain of One's Own, which I've been involved with through Mary Washington since I started there in 2013. I've been working on a project for the program for my individual study, which you can read about here. So Reclaim has been all about educational web hosting. Rockaway is different from Reclaim in that it provides different hosting plans and support features for an additional fee. The company was still being built when I first started at Reclaim, but it has grown enough to start a 'soft launch,' and hopefully it will be fully operational soon! I'm really excited to see Reclaim growing!
But that's it for me as an intern! I'm really excited to join the Reclaim/Rockaway team in June. I'm so fortunate to begin my career with them!
Well, I've been at Reclaim for 6 weeks now! I can't even believe it; it feels like I've been there for months. I'm really enjoying my work and I'm definitely getting the hang of things when answering support tickets. I still ask Tim a ton of questions throughout the day because there is just so much to learn. But in this post I wanted to talk about some new projects I'm tackling right now: one I started before my spring break and another just this week.
Just before break (so two weeks ago) I started learning a network protocol called Secure Shell, or SSH. SSH is a cryptographic network protocol used to navigate through servers remotely, which means you don't have to log into your account in your browser like you normally would. You use a separate program entirely; on the Mac it's called Terminal. It's a very quick, secure, and efficient way to view files and error logs. This comes in very handy for numerous tickets. One ticket came in where the user couldn't access their site at all and was getting an HTTP 500 error. That usually means some aspect of the site is broken, which takes down the entire site. Using SSH we can go into the site and clear out the bad file to fix it. Another ticket came in where the user was having trouble with their storage quota. With SSH we can read the error log right in the terminal, which lets us figure out what went wrong. I'm actually struggling with SSH quite a bit, however. My brain and code do not mix at all, so it's difficult for me to wrap my head around this new type of navigation. But I think I'll get the hang of it slowly but surely.
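To give a feel for the kind of commands involved, here's a rough sketch of that troubleshooting routine. Since I obviously can't point at a real client's server, this runs against a throwaway `demo_site` folder standing in for a cPanel account's `public_html`; the log contents and file names are made up for illustration.

```shell
# Stand-in for a hosting account's public_html directory
mkdir -p demo_site
echo "PHP Fatal error: something broke in wp-config.php" > demo_site/error_log
echo "RewriteEngine On" > demo_site/.htaccess

# Step 1: read the tail of the error log to see what went wrong
tail -n 5 demo_site/error_log

# Step 2: a broken .htaccess is a common cause of a 500 error;
# moving it aside (rather than deleting it) often brings the site back
mv demo_site/.htaccess demo_site/.htaccess.bak

# Step 3: check how much space the site is using (handy for quota tickets)
du -sh demo_site
```

On a real ticket, the only extra step is connecting first with something like `ssh username@server` and working inside `~/public_html` instead of a local folder.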
I started the second project this past week. I am now tackling documentation on the community pages. I'm going through all of the articles to update screenshots and rewrite a few if the process has completely changed. I'm enjoying that as well! Back in the summer of 2015 I tackled that same sort of project for the Digital Knowledge Center. The DKC was in the process of migrating its documentation to another site, and I was in charge of creating a style guide and rewriting the documentation accordingly. So I felt like an old pro going through the documentation at Reclaim. It's very different, though. There is a lot more information to understand and a ton of different topics. It's actually helping me learn a lot about topics I didn't encounter at the DKC, like nameservers, domain management, and other open-source platforms.
I also wrote a new article for the community page on installing themes on an Omeka site. That was a lot of fun. I had never used Omeka before, so I had to experiment with it before I could write the article. Omeka is relatively intuitive, so I was able to write the article very quickly.
On another note, I continually run into a problem when I'm answering tickets. At the DKC we tutor WordPress, which means we can help students edit their websites, but at Reclaim the support we provide stops when it comes to actually editing the client's website. I've had a few tickets where clients want help actually editing their site and I've had to tell them I can't. I want to help, but it's outside Reclaim's wheelhouse. I guess I'm still getting used to the fact that the DKC and Reclaim are two completely separate organizations.
But other than that I’m still having a ton of fun and I’m learning every time I step into the office. Stay tuned for more posts!
When I started at Reclaim, I realized I needed to learn more open-source web platforms than I expected. At Mary Washington, I mainly work with students on WordPress, which makes up the majority of the domains. That's a different story at Reclaim. There are multiple applications that access the file manager (which is like the file manager on your computer). I've had a couple of tickets where clients needed help with two of those applications: Omeka and Drupal. So I decided I would set up my own subdomain for each application and learn as much as I could. I figured this would help me when clients need support on those applications.
Drupal is a content management system similar to WordPress. The interface looks very similar to how you would navigate WordPress, and even add content. But it definitely is not WordPress. Drupal looks a little rudimentary compared to WordPress, but it gets the job done. I spent some time adding test content, pages, themes, and plugins as well. Drupal mainly operates through the interface itself, so it does not rely on the file manager, but it's still very useful to learn since people still use the platform to create content.
Problems I ran into: I struggled when trying to install some themes. There is a specific file type you need to use when installing a theme. Drupal's main website has tons of themes, and I found it hard to pick just one. When it was time to install the theme, I had to download the file to my computer and then upload it to my Drupal install. Pro tip: don't use the .zip form of the theme; use the .tar.gz version of the file. That's where I hit a roadblock. For a while, I wasn't able to install a theme and I couldn't figure out why. Now it seems obvious that I needed to use that specific file type, but at least now I know.
Omeka is another content management platform where you can create posts for specific items to document them. The items can range from historical artifacts to pieces of artwork to really any item you'd like to document. At Mary Washington, the history department utilizes this tool more than any other department. Omeka mainly uses its own interface to create content through the back end of that specific install, but it uses the file manager to install and manage themes and plugins. This was a little different than I expected, but it was very easy to get the hang of. Reclaim has a great documentation website where I was able to look at how to add themes and plugins through the file manager. I had one support ticket where a client needed help with the file manager. It takes just a little bit of getting used to, but it's useful to have the themes and plugins held in a separate area of the file manager. Using Omeka is very intuitive; the interface lays out all of the options you need when posting an item. When you customize the space, Omeka gives you all the options for customizing the theme on one page.
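For anyone picturing what "held in a separate area of the file manager" means, here's a rough sketch of the folder layout: each theme and plugin gets its own folder directly under the install's `themes` and `plugins` directories. The install path and the theme/plugin names here (`omeka`, `seasons`, `ExhibitBuilder`) are made up for the example, and `theme.ini` just stands in for a theme's config file.

```shell
# A stand-in Omeka install directory with a theme and a plugin dropped in,
# the way you'd see them laid out in cPanel's File Manager
mkdir -p omeka/themes/seasons omeka/plugins/ExhibitBuilder
echo "title = Seasons" > omeka/themes/seasons/theme.ini

# Each folder name under themes/ is what shows up as an option
# in Omeka's admin interface
ls omeka/themes
ls omeka/plugins
```

The nice part is that adding a theme is just unzipping it into `themes/`; once the folder is there, the admin interface picks it up on its own.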
All in all, both applications are good options for content management. But if it were up to me, I'd definitely recommend WordPress over anything else.