2019-12-13
Dusting off the old weblog, after a five-year absence.
The Jekyll site has been chugging along maintenance-free this whole time. All I’ve done is keep my DigitalOcean invoices paid.
I built this site and set up hosting two dev machines ago, though, so I had completely forgotten how to publish. The Multi-Discipline code had never even been on my current machine (let alone appropriate git remotes and SSH keys set up, et cetera).
In trying to figure out how past-me had set up the publishing flow, my brain (clouded, unreliable) was nudging me in the direction of popular Jekyll-based hosting option GitHub Pages, and I spent a few hours trying to resurrect a publishing flow that never was.
But then I took a few minutes to read my own first post on this blog, The Making of Multi-Discipline (clear, faithful), and I was put back on the right track. Documentation, FTW (thanks, past-me)!
I set up a local git remote for the DigitalOcean droplet, did a test push (which failed, so I wrangled some SSH keys and pushed again), and was back in business.
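The key-wrangling amounted to a couple of commands (a sketch; the key type, user name, and paths are assumptions rather than my exact setup, though the remote name droplet comes up again below):

```bash
# Generate a fresh key pair on the new machine, copy the public half
# to the droplet, then retry the push
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519
ssh-copy-id deploy@multi-discipline.com
git push droplet master
```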
Since my attention was momentarily back on Multi-Discipline, I took the time to upgrade my trusty DigitalOcean droplet to the latest version of Ubuntu (18.04, up from 14.04) along with some other packages, and did some content/template housekeeping for good measure.
A quick tune-up, and everything seems ready to roll.
2014-11-24
“That frame – in a timber frame – defines the building. Everything else is a facade. That frame is what will allow it to stand up tall and strong and rugged for the next 200 years. […] The roof will come and go, side walls may come and go, windows will come and go; but that frame – that will be there for a long time.”
-Tedd Benson, timber framer
Be sure to also watch parts II and III.
For anyone unfamiliar with my personal history, I used to work as a carpenter for a local home builder. As one of only two to three (including the owner) on the crew, I was involved with nearly every stage of the homebuilding process – from rough framing (spanning floor joists and laying down subfloor, putting up walls, setting roof trusses and getting the jack rafters just right, sheathing the exterior walls and roof, building stair structures, etc.) to the more polished work of finish carpentry (putting down hardwood floors, fitting base and window moulding, hanging doors, installing stair and balcony rails, etc.).
I’m fortunate (and proud) to have had the opportunity to experience that level of hands-on work with structures which will stand for decades; it makes for a nice professional contrast with the relative impermanence and perpetual iteration of the Web Things™ I build in my current field.
So when I saw Kevin O’Connor (of This Old House) interviewing documentary filmmaker Ken Burns and builder Tedd Benson about their approach to building an authentic-to-traditional-spirit timber-framed barn, I was instantly engaged.
Ken Burns is, himself, a craft-focused personality (evident from his body of work in documentary films), so I’m always interested to hear him talk about process.
What I didn’t expect was the sense of history, philosophy, and professional purpose with which timber framer Tedd Benson spoke about his work, including morsels such as this:
“Barns are agrarian cathedrals, and they were built by people who were a part of an endless chain of knowledge, skill, and an attitude that ‘when we build, we should build forever’. So – though they sacrificed something about the ornate qualities of building – when they built barns, they didn’t sacrifice anything else. They just reduced materials to simplicity, and then brought all of their skills, all of their knowledge, all of that ancient wisdom to create these wonderful buildings that are part of that sacred tradition.”
-Tedd Benson
I won’t pretend that my approach to building homes was as deeply rooted as Benson’s (few could possibly be) – but the ability to speak with ease about the history and philosophy of one’s chosen profession is a quality for which I have profound respect in craft-focused practitioners of all types.
In what I’ve seen from the people I most admire, that depth of awareness of purpose and context seems to require a mindful balancing of qualities as seemingly disparate as hyper-involvement in one’s craft and a degree of separation (proactive removal?) from the work directly at hand.
If I ever figure it out, I’ll be sure to pass along the magic mix.
2014-11-10
If you, like me, are widely curious and always trying to expose yourself to new ideas, you ought to consider the rise of podcasting as a medium to be nothing short of revolutionary. For someone who carries an intense interest in a few subjects – and a passing interest in many, many more – the right kind of podcast can offer the caliber of insight that used to require tuition and a lecture hall (and often does so through hosts and guests with engaging personalities and an infectious passion for the subject matter).
I’ve posted the “Maker” subset of my podcast subscriptions. That grid shows about a third of the total list of feeds through which my podcatcher of choice iterates every time it checks for new episodes (most of these podcasts post on a weekly schedule).
If not for the fact that writing code (or prose) requires focus from the same part(s) of the brain that process verbal language, I’d be actively keeping up with even more. As it is, my podcast consumption is limited – quite literally – by the number of waking, non-working hours in a day.
“So, ladies and gentlemen, if I say I’m a podcast fan, you’ll agree.”
-Me
The Problem
There is a vast (and growing) landscape of podcasts available to any listener with the time and inclination to seek them out. Most listeners, however, are not inclined to spend time perusing podcast directories for new must-listen shows; they have a few which suit their daily commute, and that’s as far into the landscape as they need to explore. (Filthy casuals.)
Or, approached from the “recommender” side: when one listener finds a show or episode they really like, they have limited outlets through which they can recommend it to other would-be listeners.
Speaking for myself: I almost always tweet a link to the feed and/or episode URL of a show I find interesting, entertaining, or useful. But I wouldn’t be surprised if those recommendation tweets (given, as they are, in a stream-of-posts context which is impermanent by design and affords no immediate action for podcast listening or subscription) are lost on many who would have taken a listen, but weren’t in a position to immediately:
- Stop what they were doing
- Open their podcatcher
- Find the show (or, if the podcatcher has no directory feature, copy/paste the show’s feed URL)
- Subscribe to the show (or download the single episode)
The (Proposed) Solution
A service which allows users to recommend single episodes of their favorite podcasts and to “follow” the episode recommendations of other users; the system compiles each user’s followed recommendations into a single, appropriately formatted RSS/XML “followed” feed.
As far as I can see, the best-fitting model already in existence is the humble retweet.
But, like the retweet, enabling that behavior first requires a system to handle the basic re-shareable units of content (in our case, podcast episodes).
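To make the idea concrete, here’s a minimal sketch of what one user’s compiled “followed” feed might look like (the service URL, show, and episode details are all invented for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Episodes recommended by people I follow</title>
    <link>https://example-recommender.com/users/jane/followed</link>
    <description>Socially curated podcast episodes</description>
    <!-- One item per recommended episode, copied from the source show's own feed -->
    <item>
      <title>Example Show: Episode 42</title>
      <enclosure url="https://example-show.com/episodes/42.mp3"
                 type="audio/mpeg" length="12345678"/>
      <guid>https://example-show.com/episodes/42</guid>
    </item>
  </channel>
</rss>
```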
Desired Behavior:
- As a user, I “follow” other users whose podcast recommendations I think I’d find interesting (probably people whose interests I already know mirror my own, thanks to their presence on other social sites/services).
- Individual episodes which have been recommended by users I follow will be added to my “followed” feed.
- Like retweets, even if more than one user I follow recommends an episode, that episode is only added to my “followed” feed once.
- When I want to expand my podcast horizons through a set of socially curated episode recommendations, I should be able to simply open my “followed” feed (which has all of the recommended individual episodes compiled into a single feed) in my podcatcher, and play an episode.
- The rest of my subscriptions in that podcatcher should not be affected.
- If I find an episode interesting enough that I want to subscribe to the podcast directly, there should be a button for doing just that on the individual episode’s “now playing” or “info” view.
- If someone I follow adds a recommendation for a podcast to which I’m already subscribed, that episode should not appear in my “followed” feed.
Necessary Pieces:
- Web Site (acts as hub for creating an account, finding users to follow, recommending podcast episodes, etc.)
- API for Web Service (which the site and developers of podcatchers can use to authenticate service users, pass along episode [un]recommendations, add a user’s “followed” feed as a subscription, etc.)
- Cooperation of Developers (much in the way that third-party apps have integrated with useful “personal collection” services like Pinboard and Instapaper)
The (Proposed) Process
This remains to be seen, but I believe the first step would be to build an MVP of the website and service. Obviously, it wouldn’t be as feature-rich on its own as it could eventually be once third-party podcatcher developers build integrations; but it would be a useful start for the steadfast core of Podcast Superfans™ who want to push and collect episode recommendations right away. (There are dozens of us… DOZENS!)
Until third-party developers adopted it, service users could simply copy/paste the URL of their personal “followed” feed (LIKE AN ANIMAL) to subscribe in any podcatcher, just like any other podcast feed. They could also manually add/remove podcasts to/from an “I’m already subscribed” list on the system, to prevent recommended episodes of podcasts to which they’re already subscribed from appearing in their “followed” feed.
These temporary workarounds would be tedious, but not sufficiently frequent to turn use of the service into a chore.
The (Expected) Outcome
Since sharing recommendations is something done without any real expectation of return or interaction, it shouldn’t matter, at the outset, whether anyone is actually following you (meaning the inherent chicken-and-egg problem of social-dependent services should be avoidable). As it stands now, this is something I do with zero feedback about whether my recommendations are appreciated, or even noticed (let alone acted upon).
And since podcasts, by their nature, require a time commitment to consume, a single-digit queue of recommendations is enough to keep a user busy for hours (unlike, say, Twitter, where a user must follow at least a few dozen other users to be assured of consistently fresh content). This means a user could conceivably fill their available podcast listening time by following just two or three recommenders.
Given the right circumstances (and eventual third-party developer integration), a service such as this could be expected to keep podcast listeners continually returning to the recommendation well for another quality dip of socially filtered podcast goodness.
It would be interesting to see how a recommendation machine constantly churning out appropriately recommended content fares when it bumps up against the inherent limit to a human’s available daily (weekly, monthly) listening hours, and how that ultimately affects the decisions users make about what to consume in their preciously limited time.
If it could help casual listeners “level up” their podcast consumption (in terms of quality, if not hours), I feel like it would be a solid win for the medium.
2014-10-16
Despite having worked on and around the web since 2006, it’s been a while since I’ve had my own personal space which I controlled soup to nuts.
Twitter has satisfied most of my practical publishing needs (low friction, low maintenance, publicly accessible, able to be individually hyperlinked), but the notion that I don’t “own” my tweets is aggravating to my future-proofing sensibilities.
While I do have systems in place to capture, for example, my tweets (first via an iPhone app called Momento; more recently by running a custom install of Brett Terpstra’s Slogger to add my tweets to a personal Day One journal), it’s not the same as having posts under my complete control.
The Problem
The fate of my posts to Twitter, Tumblr, or any other advertising-supported “post here and we’ll probably keep your stuff around for a few years” platform is, ultimately, out of my hands. But I would like most (if not all) of my day-to-day public writing to have the best possible chance of being around for my kids to discover when they’re older.
The Solution
I’m building out my own little corner of the web where I control the presentation, availability, and ultimate fate of my content, and which I can expand to include more forms of media (long-form posts, image galleries, video, audio, etc.) as I find uses for them.
The Process
Tinkering With Jekyll Locally
Since I had never before used Jekyll, I limited myself to installing and running it locally on my development machine (a.k.a. “dev machine”, a.k.a. “laptop”) for a few days. This gave me an opportunity to tinker with it (tweak a template, find the limits/hackability of the posting system, determine what else I could build into this) without investing a bunch of time spinning up a VPS. Jekyll, I soon learned, could even be deployed to GitHub Pages with a few setup conformities, so that became my plan.
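Getting a local Jekyll playground running takes just a few commands (a sketch, assuming a working Ruby install; the site name here is a placeholder):

```bash
# Install the Jekyll gem, scaffold a new site, and preview it locally
gem install jekyll
jekyll new my-site
cd my-site
jekyll serve --watch   # browse the generated site at http://localhost:4000
```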
After tinkering with Jekyll, it was clear that while this was a solid piece of software out of the box, the real value it offered was its customizability and extensibility (given familiarity with Ruby and Liquid). I work with Ruby professionally, so that was a good fit; and despite this being my first experiment with Liquid, it’s a sufficiently intuitive and well-documented system that I picked it up quickly.
The Jekyll features I wanted to build, though, required that I change my deployment plan from GitHub Pages (which offers a free and quick way to run a stock Jekyll site) to deploying via Git to my own VPS.
Setting Up My VPS
Configuring the DigitalOcean Droplet (their term for a VPS “unit”) was straightforward; DigitalOcean’s own system had me up and running with a virtual Linux (Ubuntu) “box” in no time, and I had the necessary user accounts, SSH keys, etc. set up shortly thereafter (aided by DigitalOcean’s excellent documentation & tutorials).
Context: Part of setting up this droplet involved directing web traffic from my domain name, multi-discipline.com, to my droplet’s IP address. In addition to being necessary for general web traffic, having that DNS A record in place allowed me to use that hostname for other services (ssh, git remote, etc.) and to have a single point of update should I move the site to a different server.
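The records themselves look something like this (the IP address below is a documentation placeholder, not my droplet’s actual address):

```
multi-discipline.com.      A    203.0.113.10
www.multi-discipline.com.  A    203.0.113.10
```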
Next up, I installed Ruby and Jekyll on the VPS, installed Git (git-core, in my case), and created the Git repository at which I’d be receiving my entire, un-compiled Jekyll site (pushed via Git from the repository on my laptop, where I’m building the site and writing posts).
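The receiving repository is a bare repo on the VPS, with a matching remote on the laptop (a sketch; the paths and user name are assumptions, though the remote name droplet is the one I describe below):

```bash
# On the VPS: a bare repository to receive pushes
git init --bare ~/multi-discipline.git

# On the laptop: point a remote named "droplet" at it
git remote add droplet deploy@multi-discipline.com:multi-discipline.git
```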
Context: Jekyll is a static site generator, meaning that the “work” of building each and every page in the site (setting the permalink address of each page; including header, footer, and other page layout partials; replacing variables with the appropriate text; etc.) is done once, whenever I push a change to the Git repository, instead of every time someone on the web requests a page.
The driving force of all that server-side processing is a Git post-receive hook on the site’s repository (a.k.a. “repo”) on my VPS. You can think of this as a shell script which fires when Git receives a change to that repo. In basic terms (neglecting a few technical steps; see the sketch after this list), that script:
- Tells Jekyll to turn my site’s un-compiled content files, layout partials, and assets into a functioning and inter-linked set of web pages (in other words, makes the disparate parts into a proper web site).
- Clones the generated _site folder to a particular directory on the VPS (where, in a future step, we’ll tell the web server to look for the page files as they’re requested).
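A minimal version of such a hook might look like this (the work-tree and web-root paths are assumptions; my actual script handles a few more details):

```bash
#!/bin/bash
# Git post-receive hook: rebuild and publish the Jekyll site on every push

WORK_TREE=/tmp/multi-discipline       # scratch checkout of the un-compiled site
PUBLIC=/var/www/multi-discipline.com  # where the web server will look for pages

# Check the freshly pushed files out into the scratch work tree
mkdir -p "$WORK_TREE"
GIT_WORK_TREE="$WORK_TREE" git checkout -f master

# Build the site, then clone the generated _site folder to the web root
cd "$WORK_TREE"
jekyll build
cp -R _site/. "$PUBLIC"/
```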
Setting Up nginx As My Web Server
While I’ve done some basic system administration work in the past, I’ve never actually installed and configured my own web server (nginx, in this case) on a clean system.
nginx does the work of listening for incoming requests and serving the appropriate files from my VPS back to the web browser that requested them.
As such, I needed to let nginx know the following (a sketch of the resulting configuration appears after this list):
- The port on which it should be listening for web requests (standard is 80)
- The IP address from which it is serving (the IP address of my DigitalOcean Droplet)
- Which site(s) I’m serving, and which host name(s) I’m using for them (multi-discipline.com and www.multi-discipline.com)
- Where (on the VPS) it should look for each site’s “root” directory (e.g. /var/www)
- Which file to use as the site’s (or any subdirectory’s) index (typically index.html)
- Which pages to use for given errors (e.g. 404 /404.html)
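Those settings translate into a server block roughly like this (a sketch; my real configuration may differ in the details):

```nginx
server {
    listen 80;                              # standard port for web requests
    server_name multi-discipline.com www.multi-discipline.com;

    root /var/www/multi-discipline.com;     # where the compiled site lives
    index index.html;                       # default file for directories

    error_page 404 /404.html;               # custom page for missing URLs
}
```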
I also needed to be sure the site configuration I did in /sites-available/multi-discipline.com was also expressed in that site’s file in the /sites-enabled directory. Since my setup is pretty straightforward (i.e. any site I have “available” should also be “enabled”), I symlinked the site file in /sites-available to its counterpart in /sites-enabled, meaning any future change to one of those files is carried through to the other.
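The symlink is a single command (assuming nginx’s standard /etc/nginx layout):

```bash
sudo ln -s /etc/nginx/sites-available/multi-discipline.com \
           /etc/nginx/sites-enabled/multi-discipline.com
```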
Temporary Hangups
At this point, I ran into some system-permissions-related problems.
First off, I’d forgotten to make the post-receive hook file executable (i.e. “able to be run”), so nothing was happening when my Git repository detected changes. I fixed this with a chmod.
Secondly, the post-receive hook script couldn’t write to the destination folder (or, rather, a script run by my user account couldn’t, but needed to, since the git remote named droplet set up on my laptop to push code to the VPS did so using my VPS user account’s SSH key), so I made my user account a member of the user group which has control over /var/www and its contents. In my case (and, I believe, by default), this group was www-data.
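Both fixes were one-liners (a sketch; the hook path and account name are assumptions):

```bash
# Make the post-receive hook executable
chmod +x ~/multi-discipline.git/hooks/post-receive

# Add my day-to-day account to the group that controls /var/www
sudo usermod -aG www-data deploy
```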
Context: For security and peace-of-mind reasons, I set up a user account on my VPS which I would use for day-to-day system administration, but which was not the system’s default root account, since root is basically god as far as the system is concerned, and can do many harmful things if accidentally or maliciously misused.
The Outcome
The first time I successfully ran git push droplet master on my development machine and found the properly compiled site being served at http://multi-discipline.com, I threw my arms up in victory. Success!
From this point on, I have the ability (if not the time) to post lengthy screeds at will and trust they’ll be around until the day I no longer want them to be (or until my DigitalOcean payment method fails – whichever comes first).