Tour 2015 – Days 1-4: Polson, Montana

Home, sweet home in the woods, although slightly uninhabitable after being vacant for more than 18 months. A good sweeping and vacuuming will take care of most of the issues.

To start off our grand auto/bike tour of 2015, we collected Judy’s brother-in-law, Ben, from his niece’s in Bremerton, for the second leg of his annual northward migration to the Mission Valley in Montana. After Delia the cat was safely delivered to Just Cats Hotel for the duration, we packed the car for the 60-day excursion, then arose early on Friday for the Montana trip, a run we have made many times over the last 28 years (in different directions during our 10-year residency in Montana). This trip was a bit more cramped than most: three people in the car, the tandem on top, family items for “early inheritance” to the New Mexico and Texas clans, and clothing suitable for biking, driving, and the social events we have scheduled in Iowa and Minnesota, in various climates.

The 12-hour transit was uneventful, with good weather and light traffic; we arrived before dark, but plenty stiff and tired from the long journey. Not having been to our cabin in more than a year and a half (last year, we didn’t go in because of deep snow), we decided to impose on nephew Rick’s hospitality for the first few days so we have time to clean up the cabin for our return trip in June. We also found that the storage shelter we had rigged from our greenhouse frame covered with tarps a couple of years ago had suffered from ice buildup during the harsh 2014-2015 winter and collapsed, bending many of the steel tubes, breaking the welded connectors, and damaging some of the items stored within. Rick had salvaged most of the repairable items, but we are still faced with sorting out which of the shelter materials are usable. The cabin, meanwhile, had become a fly and wasp trap during last summer, so we set off insecticide bombs inside and plan to sweep up the insect carcasses and tidy up the place before moving on to the southward leg of our tour in mid-week.

What was once a 10’x20′ greenhouse frame covered with tarps is now a pile of rope, wire, shredded plastic, crushed sawhorses, and bent steel tubing, swimming in pools of water left behind from the ice blocks that brought it down.

Meanwhile, work goes on, as Judy handles last-minute coordination for weaving guild business and Larye manages updates and status on several web sites. Fortunately, high-speed Internet has reached the rural mountainside, so our trips to town for Internet access are curtailed for now.

Practice, Practice, Practice: An Amateur Filmmaker’s Journey

The old saw holds true: “How do you get to Carnegie Hall?” “Practice, practice, practice.” All of us are impatient: we just want to “do artistic stuff” and have it turn out like the examples that inspired us in the first place. However, no matter how refined our tastes, our talents take time to develop. How long depends on how much help or critique we get along the way, plus a lot of hard work. What follows is a narrative example of, and informal tutorial on, making videos on a budget, with inexpensive equipment and open source software.

I’ve always wanted to learn to make video presentations. I imagined I might want to record test flights in the homebuilt airplane project that has languished, unfinished, in my cluttered and sometimes soggy workshop. Another project is documenting our bicycle travels. One obstacle was gear: quality video equipment is expensive. However, all modern digital cameras have a video mode. I started practicing several years ago, strapping my Fuji pocket camera and small tripod to the handlebars of our tandem bicycle to document rides on the bike trails. The result was pretty terrible: the footage amplified the bumps and roots on the trail and the clicking of the gear shift, and the camera, not very well attached, flopped around from time to time.

The next year, I got a GoPro Hero 3 (Silver, the mid-range model) point-of-view sports camera, and a handlebar mount made for it, a modest investment. The GoPro web site has daily videos sent in by users, showing off amazing feats of surfing, bicycling, motorcycling, scuba diving, parachuting, wing suit plunges, and all manner of dangerous sport, seen from a camera strapped to the forehead, chest, or wrist of the participant. Some were exciting, some just plain scary, but all very professional-looking.

The Mean Green Machine on tour in Michigan’s Upper Peninsula, with the GoPro mounted above the handlebars (and the headlight) in its waterproof case. While at an ideal Point of View height, the mount tends to vibrate a lot. I have a helmet mount, but my riding style involves a lot of head movement, which is distracting as well.


At first, I strapped the new camera on the handlebars of our Bike Friday Tandem Traveler “Q,” turned it on, and set off on a 35-km ride. The result was better than the first attempt with the Fuji, but still shaky, vibrating, and endless. OK, a bit of editing to show some particularly interesting parts, or at least cut out the really boring and really shaky parts. But that meant a lot of time sifting through gigabytes of footage. I eventually pared the hour and a half of “film” (I only recorded one way of the out-and-back ride) down to 11 minutes of not-very-exciting or informative views of lake and woods drifting by at 15 km/hr bicycle speed, plus a few moments of 30 km/hr downhill bouncing and shaking. The sound track was a muffled one-sided conversation between me and my stoker, Judy, on the tandem, plus a lot of road noise transmitted through the frame, the frequent clicking of the shifters, and the hissing of the brake shoes on muddy metal rims. A roundabout way of saying, “We went for a really satisfying bike ride up the south shore of the lake, and came to nice waterfalls about once an hour. Wait for it.” Fast forward two years of trial and error…

After watching a lot of other people’s videos, and the progression in skill over the years of some of my favorites, like Dutch cycle tourists and videographers Blanche and Douwe, I have possibly picked up some hints of what makes a good video presentation. I mostly publish on Vimeo, which offers a set of short tutorials on making videos, but I also recently viewed some good tips by Derek Muller, a science educator who makes a living filming short YouTube videos on various topics in science, and Ira Glass, host of National Public Radio’s “This American Life,” who published a series of four short talks on storytelling on YouTube. Both agree that getting good takes practice. Lots of practice. Probably not as much as Malcolm Gladwell’s tale of 10,000 hours of solid practice (in “Outliers”), but a lot nevertheless.

The main point of Derek and Ira’s stories is: video is storytelling. As we found, it is not enough to simply record the world as it goes by on your adventure. The result has to tell a story: why you did it, where you went, what it was like, and what you learned, in a concise way that holds the interest of the viewer. I know that most of my efforts failed; my viewer numbers on Vimeo say so. Sometimes both of my loyal viewers watch a particular video; sometimes neither of them does. Obviously something needs work. Submissions to video contests garner a couple hundred views (compared with thousands for the winning entries, and millions for the viral baby, cat, and stupid-human-tricks videos on YouTube and Facebook), with no idea how many viewers actually watched all the way through. So, we evolved over time, failure after failure.

First, I got rid of the “native sound,” because what the camera mic picks up isn’t what I focus on or even consciously hear while riding.  Instead, I find a piece of music that I think reflects the sense of motion and emotion in the ride, or one that at least fits the length of the film, or that I can cut the film to fit without making the visual too short or too long.  A fast ride needs a beat reflective of the cadence; beautiful scenery or glorious weather deserves a stirring orchestration or piano number, a matter of taste.  The next step is to trim the video clips to match the phrasing of the music, if possible.

I realized that, though I find looking forward to what is around the next bend exciting while on the bike, watching endless scenery flow by on the small screen isn’t particularly engaging. Most of the videos I enjoy from other people have clips (scenes) of 7-10 seconds each. Mine generally ran 20 seconds to several minutes. Boring. Furthermore, long takes don’t necessarily advance the story line, which is just as important to film as to the page, unless there is some interesting progression unfolding in the clip. Much as a detailed sex scene in a novel is only necessary to define a key point in the development of the relationship between the characters (mostly, it is sufficient for the characters to retreat to the privacy of the bedroom behind a line of asterisks, as a transition between scenes), a video fade on a long stretch of empty road to the next bend suffices just as well. We’re not promoting “bike porn” here, no matter how much we personally enjoyed the ride. I’m beginning to appreciate the need for storytelling that doesn’t fall into the “shaggy dog” genre, i.e., drawn-out and pointless, suspense to boredom without a satisfying punch line.

Picking music is another issue. At first, I shuffled through my library of ripped CDs (no piracy here, just a convenient way to carry your music library with you, on the hard drive of your computer instead of a case full of plastic in a hot car). However, even if the audience is small (i.e., myself and others in the same room), such usage violates the copyright on commercial recordings, especially in a public post on the ’Net. I’ve recently started re-editing some of the early videos I made this way, substituting from my new library of royalty-free music published under a Creative Commons license and downloadable from several sites on the Internet, where musicians leave selections of their work as a calling card or audio resume, hoping for commission work or performance gigs, or to sell physical CDs in uncompressed, high-fidelity audio instead of the lower-quality, lossy-compressed MP3 download.

This is the wave of the future in a world where digital copies are easy: whether you buy a copy or get one free, play it, listen to it, use it to enhance your art; just don’t resell it whole. That’s the idea behind Creative Commons. Unfortunately, much of music publishing is still in “for personal use only” mode: if the pressed recording gets scratchy, buy another one; no “backup” copies allowed, and no sharing with friends. If you want them to hear a song, invite them to your house or take your iPod over to theirs: you can’t stream it or email it or share a copy on the cloud. ASCAP blanket licensing for broadcast or use in video is still on a corporate price scale, intended for production studios and well beyond the reach of a PC user who just wants to add her favorite pop tune to a video of her and her friends having fun. So it is that Kirby Erickson’s ballad of driving up US93 through the Bitterroot as background to a bike ride up US93 through the Bitterroot is gone, and viewers who aren’t familiar with his work won’t be tempted to buy the album the song came from, because they won’t ever hear of it. Restrictive licensing can actually reduce sales in the Internet age. By now, you can’t even upload videos if they have copyrighted music audible in them: Facebook, for one, matches audio signatures from video against a sound library and blocks them.

Although I see some improvement in quality in my amateur videos, I still have a long way to go.  For one, the handlebar mount for the GoPro camera introduces too much vibration, so the picture is hard to watch, and doesn’t reflect the experience of riding.  Some damping is needed.  We did get some better results with the camera mounted on our trailer, but we only use that when touring.  Some sort of counterweight to produce a “steady-cam” effect might work here, as the “real thing” is expensive and a bit bulky.

The story is more interesting when it shows the participants, which, for us, means using the trailer mount or some sort of “selfie stick” to put the camera to the side or front, or, as I did in one clip, turn the camera around briefly.  During my convalescence from heart surgery last summer, we did a lot of hiking, where I devised a selfie-stick approach to give the impression of the viewer being with us instead of sharing our point of view.   I’m a bit happier with some of those, particularly the ones where the camera boom isn’t in the view.  More practice, and experimentation.  I’ve been more satisfied with ones where we’re in the shot only when necessary to tell the story (an essential point, when the story was that I was OK, and getting better), and the scenery out in front when it was the story.

Now, the issue is to trim the scenes to the essential elements (who, what, where, when, why, and how).  To that effect, part of the re-editing process to replace the audio tracks involves cutting the video to synchronize with the sound track phrasing, as well as reducing the length to the minimum necessary.  To paraphrase E.B. White’s dictum on writing, “Omit needless frames.”

This still frame says it all: who is reflected in the window, how is the bicycle, where is “Firefly Coffeehouse” in Oregon, Wisconsin, what is “bike tour,” and why is, well, that we’re having a good time.


One of the issues with being the director, cameraman, and actor all at once is to keep the bicycle safe while planning the shot and operating the camera, as well as keeping the mission (travel) moving along.  We miss some good shots that way, but it is inevitable.  One popular technique is to set up the camera along the route and show the bicycle or hikers approaching or receding across or into the frame, which involves stopping and staging the shoot.  This is less intrusive where there are two or more cyclists, so it is a matter of setting up the shot ahead of or behind the other rider(s), but that isn’t an option with the tandem, and we’ve used it in limited fashion by propping up the monopod/selfie-stick along the trail.  We do have several sizes of tripods, but they aren’t convenient to carry when the photography is incidental to the main purpose of travel.  I’ve long since taken to filming short takes “on the fly” rather than just leaving the camera on to pick up everything, which involves anticipating some scenery reveals or events, and, of course, missing some.  But,  editing “on the fly” to limit scenes does shorten the editing process and save battery life on the camera.  We work with what we have.

Recently, we entered a video contest for a short travel documentary on the Newberry National Volcanic Monument,  in central Oregon, which seemed to demand some dialogue in addition to the usual soundtrack and titles, so we experimented with voice-over to add a short narration where appropriate.  This also wasn’t the best, since our microphone is the headset-attached variety, suitable for making Skype phone calls and video chats, but little else.  Good quality condenser microphones for the computer and lapel microphones compatible with the GoPro are simply not in the budget, along with professional video cameras with microphone jacks or built-in directional microphones.  Drones are all the rage, now, but one suitable for carrying a GoPro as a payload is stretching the budget, also, and presents safety and control issues for use in our primary video subject, i.e., bicycle touring and trail riding.

Besides finding a story in a video clip sequence, getting the story to flow smoothly, and finding an appropriate sound track to evoke the mood of the piece, the skill set also involves learning to use video editing software. Microsoft Windows comes with a decent simple video editor, but we don’t use Windows. We do have iPads, which have apps for making videos, but we haven’t spent a lot of time on those, which also limit one’s ability to import material from multiple sources (the apps work best with the on-board iPad camera). There are a number of contenders in the Linux open source tool bag, some good, some complex. We chose Open Shot, a fairly simple but full-featured non-linear video editor, which gives us the ability to load a bunch of clips, select the parts we want, and set up multiple tracks for fades, transitions, and overlays of sound and titles. We also found that the Audacity audio recording and mixing software can help clean up the sound from less-than-adequate equipment. ImageMagick and the GIMP are still our go-to tools for preparing still photos to add to the video. Open Shot uses Inkscape to edit titles and Blender for animated titles.

Video is memory and CPU-intensive, so it helps to have a fair amount of RAM and a fast multi-core CPU (or several).  Our main working machine, a Zareason custom Linux laptop, has 8GB of RAM, an Nvidia GeForce Graphics Processor Unit, and a quad-core dual-thread CPU, which looks like an 8-processor array to Linux.  This is barely adequate, and often slows down glacially unless I exit from a lot of other processes.  The more clips and the longer the clips, the more RAM the process uses; often the total exceeds the physical memory, so swap space comes into play.  I’m usually running the Google Chrome browser, too, with 40-50 tabs open, which tends to overload the machine all by itself.

This isn’t something you could do at all on a typical low-end Walmart Windows machine meant for browsing the ’Net and watching cat videos on Facebook and YouTube, so investing in a professional-quality workstation is a must. Since we travel a lot and I like to keep our activity reporting current, that means a powerful laptop, running Unix, OS/X, or Linux. Fortunately, our laptop “strata” is in that class, though only in the mid-range, a concession to the budget as well as portability. We purchased the machine when we were developing software to run on the National Institutes of Health high-performance computing clusters, and it is roughly comparable to a single node in one of the handful of refrigerator-sized supercomputers in the laboratories, which have several hundred CPU cores and several dedicated GPU chassis each.

In addition to Open Shot, we also sometimes use avidemux, a package that allows us to crop and resize video clips, so we can shoot in HD 16:9 wide-screen format and publish in “standard” 4:3 screen format if necessary, or crop 4:3 stills and video automatically to 16:9 format to use with other HD footage. In addition to the GoPro, we now have a new Fuji XP pocket camera that can shoot stills and video in 16:9 HD, and a Raspberry Pi camera unit, programmable in Python, that we use for low-res timelapse and security monitoring. The programmable part means we write automated scripts that select the appropriate camera settings and frame timing and assemble a series of still photos into a timelapse movie, using the Linux ffmpeg command-line utility.
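That last step, turning a folder of stills into a movie, is easy to script. Here is a minimal sketch in Python of wrapping the ffmpeg call; the filename pattern, frame rate, and output name are assumptions for illustration, not the exact values our scripts use.

```python
import subprocess

def ffmpeg_command(pattern, output, fps=24):
    """Build the ffmpeg command that turns numbered stills into a movie.

    `pattern` is a printf-style name like 'stills/img_%04d.jpg' (an
    assumed layout); -pix_fmt yuv420p keeps the result playable in
    most browsers and media players.
    """
    return ['ffmpeg', '-y',
            '-framerate', str(fps),   # stills consumed per second of video
            '-i', pattern,
            '-pix_fmt', 'yuv420p',
            output]

def make_timelapse(pattern='stills/img_%04d.jpg', output='timelapse.mp4'):
    # Raises CalledProcessError if ffmpeg fails (e.g., no matching stills).
    subprocess.check_call(ffmpeg_command(pattern, output))
```

Keeping the command-building separate from the call makes the script easy to test without actually rendering video.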

So it goes–gradually, the videos we turn out get slimmer and more to the point, if not technically better quality, something we need to work on constantly with prose as well, as an intended 500-word blog post ended up a 3000-word tutorial instead.

Looking for Conspiracy in All the Wrong Places

The headlines this week are full of new conspiracy theories regarding former First Lady, Senator, and Secretary of State, and current philanthropist and possible presidential candidate, Hillary Rodham Clinton. The conspiracy? Attempting to hide information from public scrutiny by having her very own personal email address, which she used for correspondence while exercising the duties of the office of Secretary of State. Well, I am waiting for the Federal Marshals to arrive and haul me away, too, because I did the same thing while exercising my duties as a federal contractor. Cutting through the hype and hysteria, one can easily see that the conspiracy falls apart in so many ways as to be classic slapstick comedy, though unfunny in its seriousness, stupidity, and mean-spirited attempt to smear yet another political aspirant for no good reason.

According to a release from the Associated Press, which has been widely re-published by various media, Clinton “suspiciously” ran her own “homebrew email system” to handle her email while Secretary of State, rather than using her official State Department .GOV account. OK, the latter would work for a person who only uses email for official business, with no personal or outside interests. But most of us who don’t need help turning on our computers use the Internet for multiple purposes, many of which are at odds with the government’s computer policies about “For Official Use Only,” to the extent that we necessarily use our preferred (and private) systems to comply with those policies.

As an independent contractor, I, of course, like the Clintons, had my own Internet domains, for the purpose of maintaining a web site and email identity for my business, which I used for both government and non-government correspondence, even though I also had a government email account provided with the contract.  The government email account was necessary because government IT policies restrict certain mailing lists to internal email systems only.  Most of those correspondences were in the realm of inter-office memos and “news,” of interest only to persons who work in the building that is the “home office” of the organization.  In my case, this was a compound in Bethesda, Maryland, a city that I have never visited–I worked remotely from my office or at the Montana laboratories.

In Secretary Clinton’s case, by all accounts, she rarely spent time in the Washington, DC State Department offices, having spent much time “on the road,” so no doubt had little interest in notices of parking lot restrictions, “brown bag” lecture schedules in the auditorium, or the like, that make up a large part of the content of official government email boxes.   The “official” mailing lists aside, there is no such internal-address limitation on personal address books for the immediate circle of colleagues that most people deal with 99% of the time.  The other 1% are initial contacts from government employees who had to look up your address in the system’s directory.  After the initial exchange, the correspondent usually will have your outside address.

Mechanics aside, it is a fact that all “official” correspondence within the government will be to or from a government mailbox, and therefore archived within the government email system, whether or not one endpoint is a non-government mailbox.  In the case where correspondence is between two non-government addresses, it is also most certainly copied on CC or BCC to a staffer or colleague, or your own official mailbox, if for no other reason than “FYI” or to comply with record-keeping policies.  Under the new rules, put into place after Secretary Clinton’s tenure, a correspondence that doesn’t include a government address but is pertinent to official business merely has to be forwarded to a government account within 20 days to meet the transparency rule.

It is also a fact that government mobile access to their systems is sometimes outmoded and painful to use, and the hardware provided can legally only be used with the official systems, for official business.  However, personal systems can use the latest or at least preferred hardware and software, while still capable of connecting with government services if necessary.  Carrying multiple hardware systems and accessing multiple accounts on each is not compatible with the kind of rapid-response, on-the-go travel that State Department officials (as well as itinerant consultants) do to accomplish their mission.

The AP release attempts to “spin up” the conspiracy angle by muttering about a “mysterious” individual named on the domain registrations for the Clintons’ businesses, and speculating about the location and security features of the actual physical servers.  Oh, please.  It is normal for the “IT guy” to be named on the domain registration, as it is a technical responsibility, and usually someone who has physical access to the server hardware, i.e., a hosting service.  Whether or not the mystery man is indeed the Clinton house IT guy or simply a “nom de tech” to keep a famous name and contact information out of the public record is immaterial.  The implication that a “homemade” email system is neither reliable nor secure is specious: we are dealing with a multi-million-dollar foundation here, not some preteen wannabe hacker running a web site from his bedroom.

Not that you can’t run a reliable and secure system on a budget–I run a webcam service from a $50 Raspberry Pi that is backed up and secured: the logs show it shrugs off break-in attempts; it gets updated with the latest Linux security patches regularly, and it is available as long as we have power and cable services to the window sill in my office where it sits.  Our main services for web and email are physically located in Montana, where we rent server space for our personal domains.  Those are backed up and/or replicated by the service.  We pay a monthly fee for this service, so expect a certain degree of privacy and security that may not be the case with public free services like gmail or MSN.

This is not rocket science, people.  Anyone as smart as the Clintons or who travels a lot should, and usually does, have their own Internet presence outside of Gmail and Facebook, i.e., something that they control and that has “brand identification,” and people who have business both within and outside the government need to have a non-government account for non-government business: it’s the law.  The speculation in the article about a “homemade email server” is simply specious.  There is no evidence given that the hardware is in the Clintons’ home.  The IP addresses for the mail servers associated with the domains mentioned in the AP article are not related, so they are most likely not colocated on the same hardware.  Domain registrations list the address of the owning entity, not the location of the hardware: it can be assumed that the services are hosted, like those of most small businesses, at a hosting service.  Further analysis might at least reveal the geographic region to which the IP addresses are assigned.  The domains themselves have private registration, so the AP claims cannot be verified from publicly-available information.  Given the wildly speculative tone of the article, there is no reason to believe any allegations or implications in it are true.

Tinkers and Scriveners, 2015

Tinkering is, historically, the repair of tinware, usually by itinerant craftsmen.  Modern usage extends the term to any kind of more or less informal modification or adaptation of something.  If the object is intended to be useful or repurposed, we say it has been “tinkered with.”  If the purpose is to make an art statement, we resort to French, and call it “bricolage.”

A scrivener is traditionally a transcriptionist, someone who writes down what an illiterate person needs to have in print for legal or financial purposes.  Scriveners are still fairly common in Third-World countries, and in the First World, we use professionals to help us with legal and other formal paperwork.  In the modern age, reading and writing is a more or less universal skill, but computer coding is not, so we commonly also employ a scrivener to write down what we want a computer to do.  I think the term “scrivener” is more descriptive than “programmer.”  As people who revise more code than we write from scratch, we often use the combination “Tinkers and Scriveners” as our informal motto.

In our last episode, we described a “just for fun” practice, describing (a word derived from “scribe”)  to our computers how to take pictures of our driveway, turn them into a timelapse video, and make them available on the Web.  To extend and refine this concept, we tinkered with the scripts this week, to automate the process and add an audio track, plus fix a few issues.

Try it out:

As part of the allure of having a video surveillance system, we had included a weather report along with the “real time” view.  The process is simple:  “Go get the weather report, take a picture, and display the picture, time, and current weather report in a web page.”  That worked fine when it was just me running it.  But as soon as I posted the URL (Uniform Resource Locator, or web page name) to Facebook, that World-Wide Eater of Time, I immediately got a “nastygram” from the weather service informing me that I had abused my (free) automated query privilege by asking for many weather reports all at once.  I don’t have all that many “fans” eagerly awaiting my next post, so I presume it was the search-engine and Facebook robots burrowing into the link to see where it went, many, many times.  The search engines do this to score your site/post, and the spammers are also looking for paths that lead to posts with open comment access.

So, the first order of business was to rethink the weather report.  The service only updates its reports a few times an hour, so there is no need to ask for a fresh one every time the page is accessed.  We told the computer, “Remember the weather report, and only get a new one if the one you have is more than 10 minutes old.”  We don’t expect a lot of traffic, but this ensures at least 10 minutes between weather updates, no matter how many visitors we get at once.

Here’s what that revision looks like, in patch form:

> import os.path
< f = urllib2.urlopen('')
< json_string = f.read()
> stale = time.time() - 600
> if ( os.path.getmtime('weather.dat') < stale ):
>     f = urllib2.urlopen('')
>     json_string = f.read()
>     w = open('weather.dat','w')
>     w.write(json_string)
>     w.close()
> w = open('weather.dat','r')
> json_string = w.read()

A minor change, but one that prevents breaking the application programming interface (API) agreement with the weather site. We can’t control how many hits we get on our web site, but we can control how often we hit theirs.
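For readers who would rather see the idea whole than as a diff, the same caching logic can be written as a self-contained function. This is a sketch, not our actual script: it uses Python 3's urllib.request where the original uses urllib2, the `fetch` argument is a hypothetical hook added here so the function can be exercised without hitting the network, and it also guards against the very first run, when the cache file does not yet exist (os.path.getmtime would otherwise raise an error).

```python
import os
import time
import urllib.request  # the post's Python 2 scripts use urllib2 instead

def cached_report(url, cache='weather.dat', max_age=600, fetch=None):
    """Return the weather report from `cache`, refetching only when
    the cache file is missing or older than `max_age` seconds."""
    if fetch is None:
        fetch = lambda u: urllib.request.urlopen(u).read().decode()
    stale = (not os.path.exists(cache) or
             time.time() - os.path.getmtime(cache) > max_age)
    if stale:
        # Cache miss: ask the weather service and remember the answer.
        with open(cache, 'w') as w:
            w.write(fetch(url))
    # Serve every visitor from the file, never directly from the service.
    with open(cache) as r:
        return r.read()
```

The key property is that no matter how many visitors hit the page at once, the weather service sees at most one request per ten minutes.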

Since we are now taking a picture every 15 seconds, and the process runs all day, taking exclusive control of the camera hardware, we can no longer take snapshots in “real time” like we did originally. However, 15 seconds isn’t a long time, so we simply copy the picture into the “current” view every time we take one, and skip the “snap the shutter” part of the weather report page.  That part we had in the last post.  What we did this time, though, was add some code to update the time displayed on the screen and refresh the picture every 15 seconds, which requires some browser code, in JavaScript.  We also added some code to show when the weather report was last updated by the weather service.  The patch file looks something like this (we also cleaned up some other code that is no longer necessary):

< <h1>The view from Chaos Central %s</h1>
> <h1>The view from Chaos Central <span id="nowTime">%s</span></h1>

> updated = parsed_json['current_observation']['observation_time']
< print "<img src=\"/images/webcam.jpg\" />"
< print "<p>The current temperature in %s is: %s F / %s C<br />" % (location, temp_f, temp_c)
> print """
> <table><tr>
> <td>
> <img src="/images/current.png" id="currentImage"/>
> <script language="Javascript">
> setInterval(function() {
>     var currentImageElement = document.getElementById('currentImage');
>     currentImageElement.src = '/images/current.png?rand=' + Math.random();
>     var now = new Date();
>     var h = now.getHours();
>     var m = now.getMinutes();
>     var s = now.getSeconds();
>     formatnum = function( num ) {
>         if ( num < 10 ) { return '0' + num; }
>         else { return '' + num; }
>     }
>     var time = formatnum(h) + ':' + formatnum(m) + ':' + formatnum(s);
>     var nowTimeElement = document.getElementById("nowTime");
>     nowTimeElement.innerHTML = time;
> }, 15000);
> </script>
> """
> print "<p>The temperature in %s is: %s F / %s C<br />" % (location, temp_f, temp_c)
> print "Weather Data %s<br />" % updated 
> print "Reload page to update weather: image updates automatically every 15 seconds<br />"
> print """
> </td><td>
> <ul>
> <li><a href="/timelapse0.html">Timelapse video of Monday</a></li>
> <li><a href="/timelapse1.html">Timelapse video of Tuesday</a></li>
> <li><a href="/timelapse2.html">Timelapse video of Wednesday</a></li>
> <li><a href="/timelapse3.html">Timelapse video of Thursday</a></li>
> <li><a href="/timelapse4.html">Timelapse video of Friday</a></li>
> <li><a href="/timelapse5.html">Timelapse video of Saturday</a></li>
> <li><a href="/timelapse6.html">Timelapse video of Sunday</a></li>
> </ul>
> </td></tr></table>
> """

Those of you familiar with HTML and web page layout will also notice that we converted the page into a table and added a column with a list of links to the timelapse videos, which weren’t really visible before, since we were testing them.

Now, to make all this useful and automated, we told the computers (plural, remember: the video production takes place in another, bigger machine) to make a new video every hour on the hour and add background music to the video. For the first part, making a video of the images captured so far, we simply add the script to a cron job. “cron” is a daemon in the Unix/Linux system that runs programs periodically at specified times. (A daemon is not to be confused with a ‘demon.’ A demon is an evil force; a daemon is a helper or servant. The term comes from the Greek, adopted by geek culture in the 1970s.) Because our little Raspberry Pi runs on Universal Time (UTC, what used to be called Greenwich Mean Time), and daylight on the North American West Coast, eight time zones behind, spans two days under UTC, we have to put in two jobs for each timelapse image set, one for the morning and early afternoon, and one for late afternoon and evening:

0 16,17,18,19,20,21,22,23 * * 0 /home/larye/TIMELAPSE/ 6
0 0,1,2,3 * * 1 /home/larye/TIMELAPSE/ 6
0 16,17,18,19,20,21,22,23 * * 1 /home/larye/TIMELAPSE/ 0
0 0,1,2,3 * * 2 /home/larye/TIMELAPSE/ 0
0 16,17,18,19,20,21,22,23 * * 2 /home/larye/TIMELAPSE/ 1
0 0,1,2,3 * * 3 /home/larye/TIMELAPSE/ 1
0 16,17,18,19,20,21,22,23 * * 3 /home/larye/TIMELAPSE/ 2
0 0,1,2,3 * * 4 /home/larye/TIMELAPSE/ 2
0 16,17,18,19,20,21,22,23 * * 4 /home/larye/TIMELAPSE/ 3
0 0,1,2,3 * * 5 /home/larye/TIMELAPSE/ 3
0 16,17,18,19,20,21,22,23 * * 5 /home/larye/TIMELAPSE/ 4
0 0,1,2,3 * * 6 /home/larye/TIMELAPSE/ 4
0 16,17,18,19,20,21,22,23 * * 6 /home/larye/TIMELAPSE/ 5
0 0,1,2,3 * * 0 /home/larye/TIMELAPSE/ 5

Now, the web server owns the scripts that generate the web page and take the photos, but we share the folder on the external disk drive that holds the videos, and I own the video-building scripts on the CentOS server, so I run this set of cron jobs under my own user account. The timelapse show for the current day starts at 8:00am PST (1600 UTC) and runs until well after dark (i.e., 0300 UTC, or 7:00pm PST). As the days grow longer, if I am still running this experiment, I will need to extend the times, or use the astronomical data to control whether or not the video needs to be generated. At the current display rate, the timelapse video covers an hour every 30 seconds. In mid-winter, daylight lasts less than 8 hours at this latitude, but nearly 18 hours at the summer solstice, so the video will stretch out from the current 4-minute running time to nearly 10 minutes. The camera program starts at 4:00am and waits for daylight to start recording, shutting off automatically at dark. The programs are written so that it would be trivially easy to change the frame rate later in the day, advancing the time scale faster to keep the video shorter as the days get longer.
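The reason each day needs a pair of crontab entries can be sketched in a few lines (PST assumed as a fixed UTC-8, ignoring daylight saving):

```python
UTC_OFFSET = -8  # PST; hours behind UTC

def local_to_utc_hour(local_hour):
    """Map a local wall-clock hour to the UTC hour cron sees."""
    return (local_hour - UTC_OFFSET) % 24

# The 8:00am-7:00pm local recording window wraps past UTC midnight:
for h in (8, 12, 16, 19):
    print('%02d:00 local -> %02d:00 UTC' % (h, local_to_utc_hour(h)))
```

Local 4:00pm is already 0000 UTC of the next day, which is why the second entry of each pair runs on the following UTC weekday.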

The last bit of code added was for an audio track.  For this, I told the computer, “Find a song in the music library that is about the same length as the video, and add a sound track using it.”  I have enough tunes of varying length in the library so that there is a different one every hour, and hopefully enough in between so we get different ones each week as the days get longer.

Part of this exercise was to get some practice writing in the Python language.  However, the song-picker, I wrote in my old standby, Perl, just to get it done quickly.

#!/usr/bin/perl -w

%list = (
 108 => "Sintel/Jan_Morgenstern_-_01_-_Snow_Fight.mp3",
 # lots more songs here, deleted for clarity...
 225 => "Kai_Engel/Kai_Engel_-_09_-_Embracing_The_Sunrise.mp3");

$frames = $ARGV[0];
$rate = $ARGV[1];
$runtime = $frames / $rate;

foreach $key (sort { $a <=> $b } keys %list) {
 print STDERR "check: " . $key . " against " . $runtime . "\n";
 if ( $key >= $runtime ) {
  print $list{$key};
  last;
 }
}
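For comparison (and to keep the Python practice going), the same first-fit lookup is a short sketch in Python; the song table is abbreviated just as in the Perl version:

```python
def pick_song(songs, frames, rate):
    """Return the shortest track at least as long as the video,
    or the longest track if the video outruns them all."""
    runtime = frames / float(rate)
    for length in sorted(songs):
        if length >= runtime:
            return songs[length]
    return songs[max(songs)]

songs = {
    108: "Sintel/Jan_Morgenstern_-_01_-_Snow_Fight.mp3",
    # lots more songs here, deleted for clarity...
    225: "Kai_Engel/Kai_Engel_-_09_-_Embracing_The_Sunrise.mp3",
}
print(pick_song(songs, 960, 8))  # a 120-second video gets the 225-second track
```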

This program is called by the shell script that runs the video:


rate=8  # display frames per second (assumed; matches the ffmpeg command below)
frames=`ls ./images/${1}/img*png | wc -l`
# find a music track that is about the same length as the video
song=`./ $frames $rate`
rm -f surveil${1}.mp4
~/bin/ffmpeg -framerate $rate -i ./images/${1}/img%04d.png -i ./music/${song} \
 -c:a aac -strict experimental -r 30 -pix_fmt yuv420p surveil${1}.mp4
echo $song > surveil${1}.txt

This also records the name of the artist and the song (for the credits) in a file, the contents of which are displayed by the Javascript code on the web page that plays the video clip. This is still crude, using an iframe. The other part of this learning exercise is to hone my Javascript skills and learn to use AJAX (Asynchronous Javascript And XML, though in practice more often JSON than XML) to pull data from the server and format it so the page is well-formatted and updates seamlessly. The time updates on the “real-time” page also need work, to format them the same as the original time statement. One oddity is that the initial timestamp is the server time, but the updates are for the timezone of the browser, so the hour will change if you are in a different timezone. This violates the “least surprise” principle of good web and program design.

Finally, here is one of the video display pages, showing the mechanism to update the current video file and song title.  I should rewrite this as a Python script, so it accepts the index of the video and creates the proper page, instead of repeating the HTML code seven times and having to edit all of them for each code change, which leads to errors and inconsistency.  The random number function added on to the file URL in the Javascript code is a hack to make the browser reload the file–if the URL doesn’t change, the browser will use the cached (old) version of the file, so that the video and song title (and photo in the ‘real-time’ page)  wouldn’t update until the cache expired (usually a couple of weeks–not very effective for an hourly or 15-second update).  That’s always been an issue when fixing customer web sites–they don’t always see the change unless they clear their browser cache.  If they are working through a proxy server, they don’t have control over the proxy caching.  A cache is  a neat way to speed up the Internet for sites you visit a lot, but it can be annoying if the browser or proxy doesn’t check the file modification time against your cached copy.  Inaccurate time-keeping on the user end can also mess up cache coherence and browser cookie management.   Fortunately, most computers now get their time updates from the Internet automatically, so this is less of a problem than it was in the old modem/dial-up days when we couldn’t afford the overhead of a lot of background processing over the link.

<head><title>Webcam Timelapse</title></head>
<table width="500px">
<tr><td align="left">
<a href="timelapse5.html">Saturday</a>
</td><td align="center">
<a href="/cgi-bin/webcam.cgi">Home</a>
</td><td align="right">
<a href="timelapse0.html">Monday</a>
</td></tr></table>
<video width="320" height="240" controls>
<source src="/TIMELAPSE/videos/surveil6.mp4" type="video/mp4" id="vidFile6">
</video>
<p>Music track: <iframe id="audFile6" src="/TIMELAPSE/videos/surveil6.txt" height="55px" width="300px" onload="upDate()"></iframe></p>
<script language="Javascript">
function upDate() {
 var currentVidFile6 = document.getElementById('vidFile6');
 currentVidFile6.src = '/TIMELAPSE/videos/surveil6.mp4?rand=' + Math.random();
}
</script>
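The random query-string trick in that snippet can be expressed as a small helper; a Python 3 sketch for illustration (the live pages do it in browser Javascript):

```python
import random
from urllib.parse import urlparse

def cache_bust(url):
    """Append a throwaway query parameter so browsers and proxies refetch the file."""
    sep = '&' if urlparse(url).query else '?'
    return '%s%srand=%s' % (url, sep, random.random())

print(cache_bust('/TIMELAPSE/videos/surveil6.mp4'))
```

The server ignores the extra parameter; only the browser's cache key changes, so the stale copy is bypassed.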

So, there we have it: a project that is realized in a variety of programming languages and data formats: Python, Bash (Bourne-Again SHell), Perl (Practical Extraction and Report Language, Pathologically Eclectic Rubbish Lister, or just short for ‘Pearl,’ depending on who you ask), HTML5 (HyperText Markup Language version 5), and JSON (JavaScript Object Notation), with MPEG-4 video and MP3 audio, the H.264 video codec and AAC audio coding. It runs on several different Linux distributions, Raspbian (Debian 7 on an ARM CPU) and CentOS7 (Red Hat unlicensed version on a 64-bit Intel CPU, as a virtual machine running on a CentOS6 host), using the Apache2 web server software configured to use CGI (Common Gateway Interface, not Computer-Generated Imagery, as used in the movies) and directories aliased on a shared disk drive. We use Secure Shell (SSH) with a password-caching agent to transfer files between the web server and the video processor, and an API (Application Programming Interface) to retrieve weather and astronomical data from a remote web server. We also use a script that queries an external “whats-my-address” server to track the variable IP address of the web server and link to it by a redirection statement on our public web server, routing the request through a firewall rule and port mapping to avoid a lot of hacker traffic on the public HTTP (HyperText Transport Protocol) port (80). All of the software used is freely available, shared for the benefit of all who use it and contribute to the software community: no proprietary, hidden code, no expensive training classes, just relatively inexpensive books or free on-line documentation and how-to-do-it forums. This is literacy in the 21st century: the ability to express yourself through ubiquitous and free technology, with the freedom to tinker. If you don’t like the way the software works, change it, or write something completely different. Expand your mind. Share your code and write about your experience.

Christmas Vacation At Last

Apologies in advance for the technical detail, but I’ve been trying to get the grandchildren interested in learning to code, if not as a career, then just to make the world less magical and more driven by thoughtful application of disciplined skill, like driving a car or cooking a meal.  And, above all, it can be fun.  You know who you are.  The rest of you, sit down, hang on, and share with the next generation.  The other lesson here is that it is never too late to learn something new, nor too early.

Regular readers may know that I became more or less “full time” retired this fall, with the ending of the latest in a 14-year sequence of contracts supporting the NIH Rocky Mountain Laboratories.  Oh, I have a few commercial web clients left, with the occasional rewrite, addition, or help request when a data entry goes awry, and a few pro bono web sites to manage, but, for the first time in 50 years, I actually have only a few minor updates in this last week of the year.

Typically, the systems support folk perform maintenance on systems when the majority of the staff has time off, which, in the United States, tends to occur only between Christmas and New Year’s Day.  In the pre-Unix days, this often meant traveling from coast to coast to work on U.S. Navy ships in port for the holiday.  In the past 20 years, it meant working late on days when the rest of the staff got early dismissal, or telecommuting on holidays.

So, having a bit of free time, I’ve decided to “play,” which, for the thoroughly addicted Unix Curmudgeon, means writing programs for fun and wiring up systems just to learn something new (or practice rusty skills).  The geek toy of the year, or maybe the decade, is the Raspberry Pi computer, the $35 Linux board (more like $50-$100, depending on what accessories are needed to set up and run it; once they are set up, you can run them “headless,” without keyboard, monitor, or mouse, and access them from your PC, Mac, iPad, or phone with the proper app).  Normally fairly frugal, I have somehow managed to acquire five of these little beasties, the last one because I thought one had died, but it was the SD card (like the one in your camera) that serves as the primary “disk drive,” so I bought an extra SD card and now have an extra box.  I also bought one to get my hands on the new B+ model, the one with a 40-pin I/O header and four USB ports, instead of the old style with 26 pins and two USB jacks.  I did refrain from buying another protective case, instead choosing to bolt the “replacement” board to a piece of scrap hardwood so it won’t short out.

A “crustless” Pi, mounted on a scrap of lumber. This one is a database server, running PostgreSQL.


The purpose of the Raspberry Pi, as conceived by its British designers, is to promote learning and experimenting.  Thus, the I/O port, with a number of digital input and output lines, and a special port for a miniature CMOS camera board the size of a postage stamp.  Of course, I had to have one of those, too, along with a “daughterboard” for another that provides convenient digital input/output lines with screw terminals and a couple of tiny relays.

This Raspberry Pi has a PiFace module plugged into the I/O jack, providing input/output terminals that can be connected to lights, other circuits, and switches. Two relays are provided for switching incompatible-voltage devices. When not wired to sensors, this unit is the head node of a compute cluster and a file server for a 1TB external disk drive.


The Raspberry Pi is, though small, a “real” Linux server, and can run almost any software.  Having been a Unix/Linux system administrator for the past 25 years, I find them  fun to play with and use to recreate small versions of big networks.  I have one that provides network information and internet name service to the local network, another that is a router and cluster head node, connecting two networks together and managing one of them, including disk sharing, one that is a gateway to the Internet (providing secure access into our network) and also a web server, one that is a print server, and another that is a database server.  And, of course, one that is a camera platform.  (In case you are counting, some do several things at once–that’s what Linux and Unix are good at, even in a small machine.)

Lacking interesting ideas, I merely pointed the camera out my office window, to provide a “window on the world,” for which I have started to experiment with different programming.  First, I simply put up a web site (using the Pi as the web server) that allows the user to take a picture of what the camera is looking at “right now.”  The Raspberry Pi promotes the use of the Python programming language (named by its author, Guido van Rossum, after the British comedy troupe Monty Python), a language that I had heretofore avoided, since it uses “white space” as a block delimiter, and white space has been the bane of Unix shell programmers (of which I am one) since the beginning of the epoch (1/1/1970).  Nevertheless, it is a solid, well-constructed language, growing rapidly both in academia as a beginning programming language and in industry and research as a powerful tool for rapid prototyping and robust permanent systems.  Python is, unlike most scripting languages, strongly typed (meaning variables are distinctly numbers, text, or other structures) and naturally object-oriented, which means data can inherit processing methods and structure, and details of structure can be hidden from other parts of the program, making programs easier to extend and debug, and making objects easily imported into other programs.
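A quick illustration of what “strongly typed” means in practice (Python 3 syntax):

```python
# Mixing numbers and text without converting is an error, not a silent coercion:
try:
    '2' + 2
except TypeError as err:
    print('TypeError:', err)

# Explicit conversion works, and values carry their own methods (objects):
print(int('2') + 2)    # 4
print('2'.isdigit())   # True
```

A shell, by contrast, would happily treat everything as a string and let the mistake pass silently.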

The PiCam, pointed out at the driveway, mounted in a case and held by a “third hand” workbench tool. The ribbon cable attaches the camera to the web server, below.


So, with Python as a de facto system programming language, the libraries that operate the external ports and the camera are, of course, written in Python.  Using the devices requires Python programming skills, incentive enough to finally learn the language.

The web camera project quickly evolved into a combined weather station and surveillance system, and, not surprisingly, expanded my Javascript programming skills as well.  Javascript is at the core of most interactive web applications, as it mostly runs in the user’s browser, providing dynamic interaction with the server without the need to reload the current web page, and capable of performing automatic functions.  Since part of the project involves sewing a sequence of still images into a timelapse video, it also involved building a command-line video editor from source code and learning to manipulate video with it.

The web page:  All this code does is display a photo, then update the time and the photo every 15 seconds (the rate at which the camera takes pictures).  The python code runs on the server, the Javascript code runs in the user’s browser.  The image acquisition happens in another program, which follows.

 #!/usr/bin/env python

import cgi
import time
import os
import sys
import urllib2
import json

os.environ['TZ'] = "PST8PDT"
time.tzset()   # make the TZ setting take effect for localtime()

print "Content-type: text/html"
print   # blank line required to end the CGI header block

Now = time.localtime(time.time())
DST = time.tzname[Now.tm_isdst]
Date = time.asctime(Now)

print """
<head><title>Raspberry PiCam</title>
<h1>The view from Chaos Central <span id="nowTime">%s</span></h1>
""" % Date

f = urllib2.urlopen('')
json_string = f.read()
parsed_json = json.loads(json_string)
location = parsed_json['location']['city']
title = parsed_json['current_observation']['image']['title']
link = parsed_json['current_observation']['image']['link']
weather = parsed_json['current_observation']['weather']
temp_f = parsed_json['current_observation']['temp_f']
temp_c = parsed_json['current_observation']['temp_c']
wind = parsed_json['current_observation']['wind_string']
updated = parsed_json['current_observation']['observation_time']

print """
<img src="/images/current.png" id="currentImage"/>
<script language="Javascript">
setInterval(function() {
 var currentImageElement = document.getElementById('currentImage');
 currentImageElement.src = '/images/current.png?rand=' + Math.random();
 var now = new Date();
 var h = now.getHours();
 var m = now.getMinutes();
 var s = now.getSeconds();
 formatnum = function( num ) {
  if ( num < 10 ) { return '0' + num; }
  else { return '' + num; }
 }
 var time = formatnum(h) + ':' + formatnum(m) + ':' + formatnum(s);
 var nowTimeElement = document.getElementById("nowTime");
 nowTimeElement.innerHTML = time;
}, 15000);
</script>
"""
print "<p>The temperature in %s is: %s F / %s C<br />" % (location, temp_f, temp_c)
print "and the weather is %s<br />" % weather
print "Wind: %s<br /><br />" % wind
print "Weather Data %s<br />" % updated
print "Reload page to update weather: image updates automatically every 15 seconds<br />"
print "Weather data provided by <a href=\"%s\">%s</a><br />" % (link, title)
print "Image realized on a <a href=\"\">Raspberry Pi</a></p>"

print "</body></html>"

So, confused yet? Below is the code that actually runs the camera. It takes a picture every 15 seconds, numbering the pictures so that a third program can sew them together into a timelapse video.  The timezone hack in the shebang line (the first line of the script) powers the timing of the script.  This script is started each day by the system before the earliest sunrise, then waits until sunrise (obtained from the weather programming interface), and runs until sunset.  We start 30 minutes early and stop 30 minutes later to start/stop in twilight.

#!/usr/bin/env TZ=PST8PDT python

import time
import picamera
import os
import urllib2
import json

riset = urllib2.urlopen(' key here).../astronomy/q/WA/Shelton.json')
json_string = riset.read()

parsed_json = json.loads(json_string)
sunriseh = int(parsed_json['sun_phase']['sunrise']['hour'])
sunrisem = int(parsed_json['sun_phase']['sunrise']['minute'])
# start 30 minutes before sunrise
if ( sunrisem >= 30 ):
    sunrisem = sunrisem - 30
else:
    sunriseh = sunriseh - 1
    sunrisem = sunrisem + 30

# sleep until the adjusted start time, a minute at a time
while ( time.localtime().tm_hour < sunriseh ):
    time.sleep(60)
while ( time.localtime().tm_hour == sunriseh and time.localtime().tm_min < sunrisem ):
    time.sleep(60)

sunseth = int(parsed_json['sun_phase']['sunset']['hour'])
sunsetm = int(parsed_json['sun_phase']['sunset']['minute'])
# stop 30 minutes after sunset
if ( sunsetm >= 30 ):
    sunseth = sunseth + 1
    sunsetm = sunsetm - 30
else:
    sunsetm = sunsetm + 30

print 'Sunrise: ' + str(sunriseh) + ':' + str(sunrisem) + ', Sunset: ' + str(sunseth) + ':' + str(sunsetm)


wday = time.localtime().tm_wday   # 0 = Monday, matching the timelapse file numbering
logdir = '/mnt/TIMELAPSE/' + str(wday)

# remove last week's files and recreate the day's directory (assumed implementation)
import shutil
if os.path.isdir(logdir):
    shutil.rmtree(logdir)
os.makedirs(logdir)

# grab camera and start recording
with picamera.PiCamera() as camera:
 camera.resolution = (320,240)
# loop: capture method grabs camera device for duration.
 for filename in camera.capture_continuous(logdir + '/img{counter:04d}.png'):
 shutil.copyfile(filename, '/var/www/images/current.png')
 print('Captured %s' % filename)
 if ( time.localtime().tm_hour >= sunseth and time.localtime().tm_min <= sunsetm ):
 shutil.copyfile('/var/www/images/NoImage.png', '/var/www/images/current.png')

So, there it is, a program that finds the hours of daylight, day after day, and records four photos a minute, which should catch at least a glimpse of anyone entering the property. Each picture is copied to the web site as “current.png” so that the Javascript in the web page can update it for anyone currently watching (or at least anyone who has a tab open to the site).

The next evolution is to make a timelapse movie, which, at 8 frames per second, displays an hour of observation every 30 seconds. A faster display rate would make the movie shorter, but moving objects wouldn’t appear long enough for the eye to recognize them, and slower internet connections/browsers may drop frames, missing data altogether.
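The arithmetic behind those numbers, as a quick sketch:

```python
CAPTURE_INTERVAL = 15   # seconds between stills
DISPLAY_RATE = 8        # frames per second in the finished video

frames_per_hour = 3600 // CAPTURE_INTERVAL               # 240 stills per hour
video_seconds_per_hour = frames_per_hour / DISPLAY_RATE  # 30 seconds of video

print(frames_per_hour, video_seconds_per_hour)
```

Doubling the display rate would halve the running time, at the cost of each passing object appearing on screen for half as long.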

The Dell PowerEdge 110, configured as a virtual machine host. This currently looks like about nine different machines to the network. It provides our business with all of the various system configurations we support, including customer systems that may run on older software versions as well as the latest releases. One of the virtual machines assembles the timelapse videos for the PiCam system, because it is very fast, and video rendering is hard work.

This code runs in a virtual server on our main virtual host, which contains system images for the various versions and releases of Linux, Windows, and FreeBSD that we support. This one happens to run on CentOS7, the latest free Red Hat clone.  It could run on the Raspberry Pi, but it requires a lot of time and processing power, so we chose to distribute parts of the process to a faster machine with more memory: the Pi has a 32-bit 700MHz ARM processor and 512MB of memory; the virtual host has a quad-core 64-bit 2.4GHz (3 times faster) Intel Xeon processor and 8GB (16 times as much) memory.


~/bin/ffmpeg -framerate 8 -i ./images/${1}/img%04d.png -r 30 -pix_fmt yuv420p surveil${1}.mp4

Short, eh? A simple, essentially one-line shell script running the ffmpeg command with a lot of options set. But there’s more to this. For one, the files are on an external hard drive attached to the Raspberry Pi. It takes much less processing power and time to copy files, and we can copy them incrementally through the day, so we have a driver script on the Raspberry Pi to send the files to the main server, run the remote video-processing script, and retrieve the resulting video file.  In this case, the included ‘my_agent’ script sets the environment needed to log in to the remote machine using a security agent’s pre-authorized key.


. my_agent
rsync -av /mnt/TIMELAPSE/* centos7:TIMELAPSE/images/
ssh centos7 ./TIMELAPSE/ $1
rsync -av centos7:TIMELAPSE/surveil*.mp4 /mnt/TIMELAPSE/videos/

Lastly, a web interface is needed to display the video on the user’s browser: This is still under development as an integrated part of the webcam application, but relies on a snippet of HTML version 5 code, the latest version of the HyperText Markup Language that Tim Berners-Lee spun off as a subset/variant of the 1986 Standard Generalized Markup Language (SGML) to invent the World Wide Web 25 years ago, in 1989 (it didn’t get built until 1990). HTML 5 provides powerful tags to define multi-media constructs like video and audio that previously required specialized streaming server software and browser plugins to implement.  The code snippet below contains the Javascript hack needed to signal the browser to reload a new version of the file, rather than replay the cached version.  The final version will offer the option of displaying a timelapse video for the current day to current time (within the hour) or for any day in the past week (hence the use of an external disk drive on the Raspberry Pi, in order to store a week’s worth of surveillance video and the still pictures from which it is built).

<video width="320" height="240" controls>
<source src="/TIMELAPSE/videos/surveil0.mp4" type="video/mp4" id="vidFile">
</video>
<script language="Javascript">
 var currentVidFile = document.getElementById('vidFile');
 currentVidFile.src = '/TIMELAPSE/videos/surveil0.mp4?rand=' + Math.random();
</script>

And, so, that’s what old programmers do for fun during Christmas break. Meanwhile, I’ve developed some familiarity and skill with Python programming, honed Javascript skills, and refreshed my skills building software from source packages, and kept up to date on the latest system software from Red Hat. Jobs are out there… On the Internet, no one knows you’re a geezer, or simply doesn’t care, if you have the code skills they need.

If you want to see what’s outside our window, check the webcam page during the day (the camera is off at night).  The picture will update every 15 seconds until you close the browser window or tab (or go to another site from this window). Of course, it is a work in progress, and we have recently made changes to our router, so it might not work at any given time.

