Multi-Discipline: The Problems and Solutions of a Generalist Maker

Bump

Dusting off the old weblog, after a five-year absence 😅

The Jekyll site has been chugging along maintenance-free this whole time. All I’ve done is keep my DigitalOcean invoices paid.

I built this site and set up hosting two dev machines ago, though, so I had completely forgotten how to publish. The Multi-Discipline code had never even been on my current machine (to say nothing of the appropriate git remotes, SSH keys, et cetera).

In trying to figure out how past-me had set up the publishing flow, my brain (clouded, unreliable 👎) was nudging me in the direction of popular Jekyll-based hosting option GitHub Pages, and I spent a few hours trying to resurrect a publishing flow that never was.

But then I took a few minutes to read my own first post on this blog, The Making of Multi-Discipline (clear, faithful 👍), and I was put back on the right track. Documentation, FTW 🎉 (thanks, past-me)!

I set up a local git remote for the DigitalOcean droplet, did a test push (which failed, so I wrangled some SSH keys and pushed again), and was back in business.
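For anyone retracing those steps, the shape of it was roughly this (the user, host, and repo path here are placeholders, not my actual values):

    # Point a new remote (named droplet) at the repo on the VPS
    git remote add droplet deploy@multi-discipline.com:multi-discipline.git

    # If the test push fails, it's usually keys: generate one locally
    # and install it on the droplet
    ssh-keygen -t rsa -b 4096
    ssh-copy-id deploy@multi-discipline.com

    # Then push for real
    git push droplet master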

Since my attention was momentarily back on Multi-Discipline, I took the time to upgrade my trusty DigitalOcean droplet to the latest version of Ubuntu (18.04, up from 14.04) along with some other packages, and did some content/template housekeeping, for good measure.
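For the record (and with the caveat that exact steps vary by setup), the upgrade path looked roughly like this; note that do-release-upgrade moves one LTS release at a time, so getting from 14.04 to 18.04 means passing through 16.04:

    # Bring existing packages fully up to date first
    sudo apt-get update && sudo apt-get dist-upgrade

    # Upgrade to the next LTS release; run a second time to complete
    # the hop from 14.04 through 16.04 to 18.04
    sudo do-release-upgrade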

A quick tune-up, and everything seems ready to roll 😎👏

Timber-Framed Philosophy

“That frame — in a timber frame — defines the building. Everything else is a facade. That frame is what will allow it to stand up tall and strong and rugged for the next 200 years. […] The roof will come and go, side walls may come and go, windows will come and go; but that frame — that will be there for a long time.”
-Tedd Benson, timber framer

Be sure to also watch parts II and III.

For anyone unfamiliar with my personal history, I used to work as a carpenter for a local home builder. As one of only two to three (including the owner) on the crew, I was involved with nearly every stage of the homebuilding process — from rough framing (spanning floor joists and laying down subfloor, putting up walls, setting roof trusses and getting the jack rafters just right, sheathing the exterior walls and roof, building stair structures, etc.) to the more polished work of finish carpentry (putting down hardwood floors, fitting base and window moulding, hanging doors, installing stair and balcony rails, etc.).

I’m fortunate (and proud) to have had the opportunity to experience that level of hands-on work with structures which will stand for decades; it makes for a nice professional contrast with the relative impermanence and perpetual iteration of the Web Things™ I build in my current field.

So when I saw Kevin O’Connor (of This Old House) interviewing documentary filmmaker Ken Burns and builder Tedd Benson about their approach to building an authentic-to-traditional-spirit timber-framed barn, I was instantly engaged.

Ken Burns is, himself, a craft-focused personality (evident from his body of work in documentary films), so I’m always interested to hear him talk about process.

What I didn’t expect was the sense of history, philosophy, and professional purpose with which timber framer Tedd Benson spoke about his work, including morsels such as this:

“Barns are agrarian cathedrals, and they were built by people who were a part of an endless chain of knowledge, skill, and an attitude that ‘when we build, we should build forever’. So — though they sacrificed something about the ornate qualities of building — when they built barns, they didn’t sacrifice anything else. They just reduced materials to simplicity, and then brought all of their skills, all of their knowledge, all of that ancient wisdom to create these wonderful buildings that are part of that sacred tradition.”
-Tedd Benson

I won’t pretend that my approach to building homes was as deeply rooted as Benson’s (few could possibly be) — but the ability to speak with ease about the history and philosophy of one’s chosen profession is a quality for which I have profound respect in craft-focused practitioners of all types.

In what I’ve seen from the people I most admire, that depth of awareness of purpose and context seems to require a mindful balancing of qualities as seemingly disparate as hyper-involvement in one’s craft, and a degree of separation (proactive removal?) from the work directly at hand.

If I ever figure it out, I’ll be sure to pass along the magic mix.

Podcast Discovery & Recommendation

If you, like me, are widely curious and always trying to expose yourself to new ideas, you ought to consider the rise of podcasting as a medium to be nothing short of revolutionary. For someone who carries an intense interest in a few subjects — and a passing interest in many, many more — the right kind of podcast can offer the caliber of insight that used to require tuition and a lecture hall (and often does so through hosts and guests with engaging personalities and an infectious passion for the subject matter).

I’ve posted the “Maker” subset of my podcast subscriptions. That grid shows about a third of the total list of feeds through which my podcatcher of choice iterates every time it checks for new episodes (most of these podcasts post on a weekly schedule).

If not for the fact that writing code (or prose) requires focus from the same part(s) of the brain which process verbal language, I’d be actively keeping up with even more. As it is, my podcast consumption is limited — quite literally — by the number of waking non-working hours in a day.

“So, ladies and gentlemen, if I say I’m a podcast fan, you’ll agree.”
-Me

The Problem

There is a vast (and growing) landscape of podcasts available to any listener with the time and inclination to seek them out. Most listeners, however, are not inclined to spend time perusing podcast directories for new must-listen shows; they have a few which suit their daily commute, and that’s as far into the landscape as they need to explore. (Filthy casuals.)

Or, approached from the “recommender” side: when one listener finds a show or episode they really like, they have limited outlets through which they can recommend it to other would-be listeners.

Speaking for myself: I almost always tweet a link to the feed and/or episode URL of a show which I find interesting, entertaining, or useful. But I wouldn’t be surprised if those recommendation tweets (given, as they are, in a stream-of-posts context which is impermanent by design and affords no immediate action for podcast listening or subscription) are lost on many who would have taken a listen, but weren’t in a position to immediately:

  1. Stop what they were doing
  2. Open their podcatcher
  3. Find the show (or, if the podcatcher has no directory feature, copy/paste the show’s feed URL)
  4. Subscribe to the show (or download the single episode)

The (Proposed) Solution

A service which allows users to recommend single episodes of their favorite podcasts, and to “follow” the episode recommendations of other users, which the system compiles into a single appropriately formatted RSS/XML personal “followed” feed for each user.
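Concretely (and hypothetically: the service, show, and URLs below are all invented for illustration), a user’s “followed” feed would just be standard RSS, with one item per recommended episode pointing back at the original show’s audio enclosure:

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Episodes recommended by people you follow</title>
        <link>https://example-recommender.com/you/followed</link>
        <description>Socially curated podcast episodes</description>
        <!-- One item per recommended episode; the enclosure points at the
             original show's audio file, so any podcatcher can play it -->
        <item>
          <title>Some Maker Show, Ep. 42: Timber Framing</title>
          <enclosure url="https://example-show.com/episodes/42.mp3"
                     type="audio/mpeg" length="48000000"/>
          <guid>https://example-show.com/episodes/42</guid>
        </item>
      </channel>
    </rss>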

As far as I can see, the best-fitting model already in existence is the humble retweet.

But, as with the retweet, in order to enable that behavior you first need a system to handle the basic re-shareable units of content (in our case, podcast episodes).

Desired Behavior:

  • As a user, I “follow” other users whose podcast recommendations I think I’d find interesting (probably people whose interests I already know mirror my own, thanks to their presence on other social sites/services).
  • Individual episodes which have been recommended by users who I follow will be added to my “followed” feed.
  • Like retweets, even if more than one user who I follow recommends an episode, that episode is only added once to my “followed” feed.
  • When I want to expand my podcast horizons through a set of socially curated episode recommendations, I should be able to simply open my “followed” feed (which has all of the recommended individual episodes compiled into a single feed) in my podcatcher, and play an episode.
  • The rest of my subscriptions in that podcatcher should not be affected.
  • If I find an episode interesting enough that I want to subscribe to the podcast directly, there should be a button for doing just that on the individual episode’s “now playing” or “info” view.
  • If someone I follow adds a recommendation for a podcast to which I’m already subscribed, that episode should not appear in my “followed” feed.

The Tools

  • Web Site (acts as hub for creating an account, finding users to follow, recommending podcast episodes, etc.)
  • API for Web Service (which the site and developers of podcatchers can use to authenticate service users, pass along episode [un]recommendations, add a user’s “followed” feed as a subscription, etc.)
  • Cooperation of Developers (much in the way that third-party apps have integrated with useful “personal collection” services like Pinboard and Instapaper)

The (Proposed) Process

This remains to be seen, but I believe the first step would be to build an MVP of the website and service. Obviously, it wouldn’t be as feature-rich on its own as it could eventually be once third-party podcatcher developers build integrations; but it would be a useful start for the steadfast core of Podcast Superfans™ who want to push and collect episode recommendations right away. (There are dozens of us… DOZENS!)

Until third-party developers adopted it, service users could simply copy/paste the URL of their personal “followed” feed (LIKE AN ANIMAL) to subscribe in any podcatcher, just like any other podcast feed. They could also manually add/remove podcasts to/from an “I’m already subscribed” list on the system, to prevent recommended episodes of podcasts to which they’re already subscribed from appearing in their “followed” feed.

These temporary workarounds would be tedious, but not frequent enough to make using the service a chore.

The (Expected) Outcome

Since sharing recommendations is something done without any real expectation of return or interaction, it shouldn’t matter, at the outset, whether anyone is actually following you (meaning the inherent chicken-and-egg problem of social-dependent services should be avoidable). As it stands now, this is something I do with zero feedback about whether my recommendations are appreciated, or even noticed (let alone acted upon).

And since podcasts, by their nature, require a time commitment to consume, a single-digit queue of recommendations is enough to keep a user busy for hours (unlike, say, Twitter — where a user must follow at least a few dozen other users to be assured of a consistent stream of fresh content). This means a user could conceivably fill their available podcast listening time by following just two or three recommenders.

Given the right circumstances (and eventual third-party developer integration), a service such as this could be expected to keep podcast listeners continually returning to the recommendation well for another quality dip of socially filtered podcast goodness.

It would be interesting to see how a recommendation machine constantly churning out appropriately recommended content fares when it bumps up against the inherent limit to a human’s available daily (weekly, monthly) listening hours, and how that ultimately affects the decision users make about what to consume in their preciously limited time.

If it could help casual listeners to “level up” their podcast consumption (in terms of quality, if not hours), I feel like it would be a solid win for the medium.

The Making of Multi-Discipline

Despite having worked on and around the web since 2006, it’s been a while since I’ve had my own personal space which I controlled soup to nuts.

Twitter has satisfied most of my practical publishing needs (low friction, low maintenance, publicly accessible, able to be individually hyperlinked), but the notion that I don’t “own” my tweets is aggravating to my future-proofing sensibilities.

While I do have systems in place to capture, for example, my tweets (first via an iPhone app called Momento; more recently by running a custom install of Brett Terpstra’s Slogger to add my tweets to a personal Day One journal), it’s not the same as having posts under my complete control.

The Problem

The fate of my posts to Twitter, Tumblr, or any other advertising-supported “post here and we’ll probably keep your stuff around for a few years” platform is, ultimately, out of my hands. But I would like most (if not all) of my day-to-day public writing to have the best possible chance of being around for my kids to discover when they’re older.

The Solution

I’m building out my own little corner of the web where I control the presentation, availability, and ultimate fate of my content, and which I can expand to include more forms of media (long-form posts, image galleries, video, audio, etc.) as I find uses for them.

The Tools

  • Jekyll (static site generator, customizable with Ruby and Liquid)
  • Git (version control, and the push-to-publish deployment flow)
  • DigitalOcean droplet running Ubuntu (the VPS hosting it all)
  • nginx (the web server answering requests for the compiled pages)

The Process

Tinkering With Jekyll Locally

Since I had never before used Jekyll, I limited myself to installing and running it locally on my development machine (a.k.a. “dev machine”, a.k.a. “laptop”) for a few days. This gave me an opportunity to tinker with it (tweak a template, find the limits/hackability of the posting system, determine what else I could build into this) without investing a bunch of time spinning up a VPS. Jekyll, I soon learned, could even be deployed to GitHub Pages with a few setup conformities, so that became my plan.
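If you want to reproduce that kind of local sandbox, stock Jekyll needs only a few commands (assuming a working Ruby install; the site name here is arbitrary):

    # Install Jekyll, scaffold a site, and serve it locally
    gem install jekyll
    jekyll new sandbox
    cd sandbox
    jekyll serve   # builds to _site/, serves at http://localhost:4000, rebuilds on change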

After tinkering with Jekyll, it was clear that while this was a solid piece of software out of the box, the real value it offered was its customizability and extensibility (given familiarity with Ruby and Liquid). I work with Ruby professionally, so that was a good fit; and despite this being my first experiment with Liquid, it’s a sufficiently intuitive and well-documented system that I picked it up quickly.

The Jekyll features I wanted to build, though, required that I change my deployment plan from GitHub Pages (which offers a free and quick way to run a stock Jekyll site) to deploying via Git to my own VPS.

Setting Up My VPS

Configuring the DigitalOcean Droplet (their term for a VPS “unit”) was straightforward; DigitalOcean’s own system had me up and running with a virtual Linux (Ubuntu) “box” in no time, and I had the necessary user accounts, SSH Keys, etc. set up shortly thereafter (aided by DigitalOcean’s excellent documentation & tutorials).

Context: Part of setting up this droplet involved directing web traffic from my domain name, multi-discipline.com, to my droplet’s IP address. In addition to being necessary for general web traffic, having that DNS A Record in place allowed me to use that hostname for other services (ssh, git remote, etc.) and have a single point of update should I move the site to a different server.
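(A quick sanity check once the record is in place: dig should answer with the droplet’s IP address.)

    # Should print the droplet's IP address
    dig +short multi-discipline.com A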

Next up, I installed Ruby and Jekyll on the VPS, installed Git (git-core, in my case), and created the Git repository that would receive my entire, un-compiled Jekyll site (pushed via Git from the repository on my laptop, where I’m building the site and writing posts).
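On Ubuntu, that boils down to a handful of commands (exact package names and the repo’s name/location will vary by setup; these are representative):

    # Install Git and Ruby, then Jekyll itself
    sudo apt-get install git-core ruby ruby-dev
    sudo gem install jekyll

    # Create a bare repository to receive pushes from the laptop
    git init --bare ~/multi-discipline.git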

Context: Jekyll is a static site generator, meaning that the “work” of building each and every page in the site (setting the permalink address of each page; including header, footer and other page layout partials; replacing variables with the appropriate text, etc.) is done once whenever I push a change to the Git repository, instead of every time someone on the web requests a page.

The driving force of all that server-side processing is a Git post-receive hook on the site’s repository (a.k.a. “repo”) on my VPS. You can think of this as a shell script which fires when Git receives a change to that repo (a sketch of such a hook follows the list below). In basic terms (neglecting a few technical steps), that script:

  1. Tells Jekyll to turn my site’s un-compiled content files, layout partials, and assets into a functioning and inter-linked set of web pages (in other words, makes the disparate parts into a proper web site).
  2. Copies the generated _site folder to a particular directory on the VPS (where, in a future step, we’ll tell the web server to look for the page files as they’re requested).
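Here’s a minimal sketch of what that hook can look like. The specific paths are illustrative rather than my exact values, but the pattern (check the pushed tree out to a scratch directory, run jekyll build, publish the output to the web root) is the standard one:

    #!/bin/sh
    # hooks/post-receive: runs whenever the bare repo receives a push

    GIT_REPO=$HOME/multi-discipline.git    # the bare repo receiving pushes
    TMP_CLONE=/tmp/multi-discipline        # scratch checkout for building
    PUBLIC_WWW=/var/www/multi-discipline   # directory the web server serves

    # Check the pushed content out into the scratch directory
    git clone "$GIT_REPO" "$TMP_CLONE"

    # Compile the static site straight into the web root
    jekyll build --source "$TMP_CLONE" --destination "$PUBLIC_WWW"

    # Tidy up
    rm -rf "$TMP_CLONE"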

Setting Up nginx As My Web Server

While I’ve done some basic system administration work in the past, I’ve never actually installed and configured my own web server (nginx, in this case) on a clean system.

nginx does the work of listening for requests which come in, and serving the appropriate files from my VPS back to the web browser which requested them.

As such, I needed to let nginx know the following (a minimal config sketch appears after the list):

  • The port on which it should be listening for web requests (standard is 80)
  • The IP address from which it is serving (the IP address of my DigitalOcean Droplet)
  • Which site(s) I’m serving, and which host name(s) I’m using for them (multi-discipline.com and www.multi-discipline.com)
  • Where (on the VPS) it should look for each site’s “root” directory (e.g. /var/www)
  • Which file to use as the site’s (or any subdirectory’s) index (typically index.html)
  • Which pages to use for given errors (e.g. /404.html for a 404)
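Taken together, those requirements translate into a server block along these lines (a sketch, not my exact file; the root path matches the hook example above):

    server {
        listen 80;
        server_name multi-discipline.com www.multi-discipline.com;

        # Where the compiled Jekyll output lives
        root /var/www/multi-discipline;
        index index.html;

        # Serve the custom error page for missing URLs
        error_page 404 /404.html;
    }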

I also needed to be sure the site configuration I did in /sites-available/multi-discipline.com was mirrored in that site’s file in the /sites-enabled directory. Since my setup is pretty straightforward (i.e. any site I have “available” should also be “enabled”), I symlinked the file in /sites-enabled to its counterpart in /sites-available; since the “enabled” entry is just a pointer to the real file, any future change to the configuration is automatically reflected in both places.
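The link itself is a one-liner (full paths shown here; the prose above abbreviates them), followed by a config check and reload so nginx picks up the change:

    # Enable the site by linking it into sites-enabled
    sudo ln -s /etc/nginx/sites-available/multi-discipline.com \
               /etc/nginx/sites-enabled/multi-discipline.com

    # Validate the configuration, then reload nginx
    sudo nginx -t && sudo service nginx reload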

Temporary Hangups

At this point, I ran into some system-permissions-related problems.

First off, I’d forgotten to make the post-receive hook file executable (i.e. “able to be run”), so nothing was happening when my Git repository detected changes. I fixed this with a chmod.

Secondly, the post-receive hook script couldn’t write to the destination folder. (More precisely: a script run by my user account couldn’t, but needed to, since the git remote named droplet set up on my laptop pushes code to the VPS using my VPS user account’s SSH key.) So I made my user account a member of the user group which has control over /var/www and its contents; in my case (and, I believe, by default), this group was www-data.
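Both fixes are one-liners (the hook path matches the sketch above; substitute your own username):

    # Make the post-receive hook executable
    chmod +x ~/multi-discipline.git/hooks/post-receive

    # Add my day-to-day user to the group that owns /var/www
    # (group membership takes effect at next login)
    sudo usermod -a -G www-data myusername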

Context: For security and peace-of-mind reasons, I set up a user account on my VPS which I would use for day-to-day system administration, but which was not the system’s default root account, since root is basically god as far as the system is concerned, and can do many harmful things if accidentally or maliciously misused.

The Outcome

The first time I successfully ran git push droplet master on my development machine and found the properly compiled site being served at http://multi-discipline.com, I threw my arms up in victory. Success!

From this point on, I have the ability (if not the time) to post lengthy screeds at will and trust they’ll be around until the day I no longer want them to be (or until my DigitalOcean payment method fails — whichever comes first).