When we last pressed “publish” on Windows Rant #27433 (or thereabouts), we were stuck with a refurbished Windows Vista machine that had been reloaded with Windows 7, and was therefore “theoretically” eligible for the “free” upgrade (is it really an upgrade?) to Windows 10. However, the compatibility check failed on the built-in ATI Radeon display adapter, deemed too old to have a Windows 10 driver written for it.
In fact, after some checking, it appears that Windows 7 itself uses the Vista driver in compatibility mode. So much for buying older machines. But, wait. For just a few dollars more (about 40 of them, now raising the cost of the new system to approximately the price of a discount Windows 10 OEM installation disk, which I could have used to build a virtual machine on our KVM server), I ordered a PNY video card with an Nvidia chipset that I verified: 1) was compatible with the PCIe 2.0 ×16 slot in the HP computer; 2) came with the short connector brackets needed for a slim-case desktop computer; and, most importantly, 3) had an available driver for Windows 10.
So, the box arrived: I swapped out the connector bracket, moved the existing RS-232 connector on the computer back panel, and plugged in the new card. Then, I turned on the computer, which had been offline for a week or so waiting for the new card. The display wouldn’t come up. Oh, maybe the new card disables the on-board adapter, so I moved the cable. Still black, but the disk light was blinking, so the computer was doing something; I just couldn’t see it. Foolishly, I turned off the machine (never do that when the disk is active), and turned it back on again. OK, I got the bootup screen, but then a message that Windows cannot load. Autorecovery didn’t work. The messages on the screen invited me to reinstall Windows. I don’t think so. A person without decades of IT background might be tempted at this point to haul the $80 (now $125, with the new video card) machine off to the nearest PC repair shop for a $150 minimum service charge, but, then, the average user would just have bought a new machine with Windows 10 already on it in the first place, rather than trying to install new hardware in the old one.
I removed the new card, plugged the display back into the old one, and rebooted. Yes! It came up, but needed to fiddle with updates, going through many long minutes of “Do not touch the Power button” warnings and several reboots. I should have made sure the system was stable before opening it up, but I’m not used to the “hands off” policy where your computer is not available to you for hours on end, at the whim of Microsoft and their “reboot often, and, when in doubt, reinstall” system philosophy (yes, I was actually taught that, in a Microsoft system administration class, many years ago–in contrast, Unix and Linux machines run for months or years between reboots, except maybe for kernel updates, for which you can now buy “live” update tools).
Finally, the machine stabilized, and I downloaded the latest Windows 7 Nvidia driver for the next step: reinstalling the new card. This time, the system, having passed through the weekly throes of patch management, booted in Large Print mode, since it had no idea what kind of video hardware was installed, and reverted to the default minimum resolution. Installation of the Nvidia driver went smoothly, and the system rebooted to a nice, crisp, high-resolution screen.
The next step was to ask Microsoft to reconsider. But the upgrade compatibility tool apparently only runs once in a great while, and still said “This machine can’t run Windows 10”–because of the obsolete Radeon display adapter, which is now disabled. Bummer. Well, some quick research on sites where the bloggers make their living fixing other people’s Windows installations showed me how to schedule the compatibility checker to run again “real soon now,” from an administrative command line. It’s been over 15 years since I was a Windows administrator, so I had to research how to “runas” administrator in Windows 7, which is somewhat different from (but, as it turns out, much simpler than) Windows NT, and, of course, completely different from using ‘su’ or ‘sudo’ in Unix. So, I ran the request, which reported “SUCCESS,” but the scan is only “scheduled” and, according to the FAQ, runs only once a month. Removing the Appraisal.JSON report file from the hidden “Telemetry” directory did nothing; the report, which seems to be delivered over the Internet from Microsoft, is still displayed in the appraisal tool, so all we can do is wait. Everything is “wait” in Windows.
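For the record, the incantation circulating on those Windows-admin blogs looked roughly like the following, run from a Command Prompt opened with “Run as administrator” (the scheduled-task path is the one commonly cited for Windows 7, so treat it as an assumption rather than gospel):

```shell
rem Ask Task Scheduler to kick off the compatibility appraiser now,
rem instead of waiting for its normal monthly schedule.
schtasks /run /TN "\Microsoft\Windows\Application Experience\Microsoft Compatibility Appraiser"

rem schtasks prints a SUCCESS message if the task was queued; re-open
rem the upgrade advisor afterward to see whether the verdict changed.
```

Note that “SUCCESS” here only means the task was scheduled, which is exactly the behavior described above: the actual scan still runs on Microsoft’s timetable, not yours.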
Oh, and it seems the browsers are still infested with adware, too, as I can’t seem to click on any link in Firefox without getting a new tab with some advertisement in it. Either that, or all the site links _are_ actually wired to ads for expensive tech support or software of dubious value instead of actual on-line help. I wonder if I can start a campaign to get Garmin and Intuit to port their software to Linux, or at least get it to run under Wine? Then, we wouldn’t ever need to use Windows at all.
Microsoft has finally released Windows 10, to minimal fanfare and with mumbled apologies for the disastrous Windows 8, and an implicit concession that Windows 7 was essentially new paint on the 2001 Windows XP, hastily conceived to back away from the unusable Windows Vista. Why Windows 10? What happened to Windows 9? The answer seems to be, “a name is only a name, and a number is just as good (or better) than a label that is equally irrelevant.” It seems to me that this is a blatant marketing ploy to play “catch-up” with Apple, which has been shipping OS/X (Apple OS version 10) since 2001, the same year Windows XP appeared on the stage.
Note that Apple has not rested on that release, but is now up to 10.10, numbering major revisions in the time-honored tradition of its Unix roots. Indeed, OS/X is architecturally based on the NeXT system pioneered by Steve Jobs during his exile from Apple in the 1980s, which, in turn, was based on the Mach microkernel version of the Berkeley Software Distribution, a.k.a. BSD Unix. Unix, of course, is the breakthrough computing concept of a multi-user/multi-tasking virtual machine operating system that could be easily ported to almost any hardware architecture, preferably one with a robust CPU security model and memory management system. Unix has been the mainstay of academic computing and internetworking since the early 1970s, and still, despite the omnipresent Microsoft Windows desktops and corporate servers, runs the majority of the web servers in the world, though primarily in the form of its close relative, GNU/Linux.
Windows NT, 2000, XP/2003, Vista, 7, 8, and 10 all have their roots in the NT code base, the core principles of which derived from Digital’s minicomputer VMS, but hindered by the legacy Microsoft computing model; Windows 95, 98, and Millennium, for their part, were still hybrids layered over MS-DOS. Prior versions of Windows, 1, 2, and 3, were simply graphical task managers running on top of MS-DOS. MS-DOS was essentially a 16-bit port of the 8-bit CP/M (Control Program for Microcomputers), conceived by Gary Kildall in the 1970s to load and run programs from disk on the Intel 8080 and Z80 microprocessors, an early advance over microcontrollers in embedded computing applications. MS-DOS added some features and concepts “borrowed” from Xenix, a 16-bit port of Unix for which Microsoft owned the distribution rights in the 1980s.
Having “grown up” professionally in an environment of networked real-time command and control systems in the 1960s and multi-processor/multi-tasking computers in the 1970s, and having been introduced to “personal computing” on CP/M in the early 1980s, I found MS-DOS and its parasitic Windows interfaces simply inadequate, and poorly done, in comparison to other graphical systems like GEM (Graphical Environment Manager), which derived from the Xerox PARC user interface studies that also became the basis for the Apple Macintosh. Very soon after loading Windows/286 2.0 on my new 16-bit Sperry IT computer, I dumped it and MS-DOS 4 in favor of a Unix-like OS, Coherent, based on Version 7 Unix; that was subsequently replaced by a 32-bit version running on a 486 computer, which became a Linux computer after the demise of the Mark Williams Company, maker of Coherent. At the same time I was learning Coherent, I was learning BSD and SunOS in graduate school, and writing software on Coherent that ran on SunOS.
But, the business world had adopted Microsoft MS-DOS and, subsequently, Windows, as the OS of choice, so it was necessary to continue to use Microsoft products, despite their obvious shortcomings. However, the split between Microsoft and IBM over IBM’s OS/2 created a temporary way out: the 32-bit OS/2 system published by IBM had the ability to run 16-bit Windows programs, and was, architecturally, much sounder than Windows. So, for a number of years, we ran OS/2 in our home business, for the applications required to support my government clients. Of course, most of the software development continued under Coherent or Linux, and later, Solaris, Sun Microsystems’ Unix, which ran on the Sun SPARC system rather than the Intel processors used by Microsoft. From the mid-1990s until 2012, we maintained SPARC machines in our home network, in addition to machines running Linux and FreeBSD.
This era of détente came to an end when Microsoft released Windows 95, their first 32-bit consumer system, and IBM released OS/2 4. OS/2 “Warp,” as version 4 was called, required a bigger computer than we could afford on our budget, and would not run Windows 95 applications. So, we ventured once more into buying and running Windows. One issue that came up right away was the need to buy a third-party TCP/IP stack to integrate our newly re-minted Windows machine into our Linux/Solaris network. Meanwhile, my work shifted to pure Unix and Solaris support–or so I thought.
After successfully migrating a large government system from SYSV Unix to Solaris, I was tasked with migrating network services from NetWare to Microsoft Windows NT. I had some hopes for a peaceful coexistence with Windows from that, but, as I brought the new systems on-line, mostly using open-source tools “borrowed” from Unix, it was obvious that Windows NT was not up to the task of running a large network: the scheduling algorithms left the “hourglass of death” (the replacement of the pointing-device cursor arrow with an “I’m too busy to talk to you now” symbol) flashing permanently on the screen of a server trying to manage over 100 printers, and configuring network services for 500 workstations through a “point-and-click” interface was tedious at best.
In my next two jobs, Windows NT was there, but I wasn’t responsible for it, so life was good–until Windows 2000 arrived. The concept of “Active Directory,” a thinly-disguised (but mostly incompatible) version of the well-established Kerberos security system developed on Unix, plus some incompatible or undesirable extensions to the Internet domain name system (DNS) and a custom implementation of LDAP (Lightweight Directory Access Protocol), made Windows function more or less securely in a network environment. But it excluded interoperability with the standard Unix implementations unless you added software to handle the differences in file system security models and user profiles in LDAP, and to align the Kerberos versions between Windows and the various Unix releases.
Windows 2000 gave way to Windows XP, which mostly replaced the NT-like user interface with one more suited to the Active Directory network model where it mattered. Then, the reality of the power of Apple’s OS/X set in: Apple had a system that not only incorporated their famously user-friendly graphical interface, but built on top of a rock-solid networked multitasking, multiuser operating system based on over 30 years of incremental refinement. Windows Server 2003 was an upgrade to Windows 2000, and there things languished. Bill Gates left Microsoft to focus on his and Melinda’s Foundation, and Ballmer drove the ship steadily on–toward the reef that Apple had built.
Windows Vista appeared on the scene. Not too many companies were making Linux laptops yet in mid-decade, so the choice was to buy a commodity system, which came with Vista, useful only long enough to download and burn an install disk for the relatively new Ubuntu Linux, a Debian-based system (we had previously used Slackware, Red Hat, and SuSE Linux distros). Business and government continued to use XP, refusing to deploy Vista, for a number of valid reasons–performance, security, etc.
By 2009, we had gone “off the reservation,” completely in the independent consulting mode, off-site rather than on-site, so Windows (XP, naturally) gradually disappeared into our virtual machine hosts, to be used only for the one or two “must have” applications, which did not include the Microsoft Office and Outlook that our government clients preferred. We are, as I never cease to remind them, a Unix/Linux consultancy, and that’s what we use on our desktops. Our frequent road trips called for an additional portable, so we bought a Netbook, which had Windows 7 installed on it, Vista having been quickly replaced as an acknowledged technical failure. Windows 7, like Vista, lasted long enough to download a fresh Linux install image, this time to a flash drive, since the smaller machines no longer come with optical drives. Meanwhile, our OEM copy of Windows XP continued to live on as a virtual system long after the original hardware had been scrapped or converted to BSD Unix or GNU/Linux.
Nothing lasts forever, and Microsoft finally “pulled the plug” on Windows XP, which had been on extended life support long after its planned end-of-life phaseout, due to the failure of Vista to excite anyone and the failure of the business and government world to accept either Vista or Windows 7. Windows 8, touted as the “universal Windows user interface,” turned out to transform your desktop into a huge smart phone that doesn’t make phone calls (although Microsoft did buy Skype to make that possible, at least over the ‘Net). The advantage of a “universal” interface was somehow lost when all you had on the huge screen were giant icons that were as meaningless as the tiny ones on your phone. The devil, it seems, is in the details, and we all wanted more details when we had the space. So, enter Windows 10.
When Windows 10 was first announced (amid cries of “whatever happened to 9?”), I downloaded a preview beta copy, loaded it into a virtual instance, and saw that it looked even more like the old familiar XP than Windows 7 did, so we decided it was time to look into upgrading our Windows platform so we could continue to run TurboTax, Quicken, and Garmin Connect, applications that we need, but which the vendors have not seen fit to make compatible with Wine (the Windows compatibility layer for Linux; the name actually stands for “Wine Is Not an Emulator”) or to run natively under Linux. We do have iPads for convenience, but don’t really need an Apple desktop (other than the narcoleptic iMac that a client gave us after converting her back to Windows–a result of the failure of Apple to address certain quirky problems, like narcolepsy in some iMacs and picky WiFi selection in some iPads).
Microsoft has made it difficult to purchase OEM (Original Equipment Manufacturer) versions of Windows–i.e., a version that you can install on bare hardware, like a computer you have built from spare parts (like most of the Linux machines we have had over the years) or, as we do, in a virtual machine host, like the Dell server running CentOS 6 with KVM. A copy of Windows 7 (which vendors are selling to get people to buy hardware that has the hated Windows 8 on it) runs between $90 and $120, generally, if you can find it.
We found a refurbished Windows 7 machine for $80, which now sits under my desk, awaiting its free upgrade to Windows 10, which I have now been told is a myth, since Windows 10 does not support compatibility with older driver software and the video adapter was designed for XP and Vista. So much for truth in advertising. Microsoft suggests purchasing a new machine. However, Radeon does provide a generic Linux driver, both 32-bit and 64-bit, so the lack of Windows upgrade support is simply a ploy to make more money by selling new hardware, which, of course, includes the new operating system, bought and paid for. Upgrades are discouraged, especially for folks like us who have kept a pair of XP licenses reincarnated through several real and virtual machines over the last dozen years. It appears we are finally stopped in our tracks.
We might be able to clone the Win 7 system into a virtual instance, releasing the hardware for another purpose, but, meanwhile, it shares a monitor with the Raspberry Pi printer/scanner server and the secondary monitor port on my main development machine, a Linux laptop I purchased, along with a desktop, from Zareason, a Linux-specific hardware vendor. For now, it is annoying to have to switch monitor inputs and pick up a second keyboard/mouse, but we use Windows so seldom it isn’t much of an imposition. All of the other machines, being Linux, BSD, or virtualized, can be operated from any keyboard and screen in the office (or out of the office, for that matter–though outside access is restricted to only specific machines).
Of course, every few months, we need to turn on the Windows machine to install bug fixes, security updates, and the latest virus scanner databases. Between the time we first fired up the new machine and when we got a virus-scanner installed, the machine acquired at least two types of malware, after downloading only two third-party applications, both of which have since been destroyed, since they didn’t provide the services needed, anyway. The adware virus caused the browser (Internet Explorer, until we could successfully download a non-infected version of Firefox) to bring up malware sites instead of the site requested, and the trojan virus betrayed itself by taking over the desktop and refusing to be closed, insisting that the user needed to “call for support.”
I am convinced at this point that Windows is, in a large sense, its own virus, since you have to connect it to the Internet to install security patches, which leaves it vulnerable to infection while downloading the patches. In twenty-five years of running Unix and Linux on hundreds of machines, I have only seen three malware infections, two on Solaris and one on Linux, but can’t seem to turn on a Windows machine without it being instantly infected. Is it any wonder that we prefer Unix and GNU/Linux? Also, Windows comes with no productivity software, other than a buggy Internet browser: a full Linux distro comes with every type of software application and software development tool imaginable, all freely available, and which, if not initially installed, can be downloaded and installed at any time, from a repository free of malware. What does this incredible wealth of secure and productive software cost? Whatever we decide to donate to support those who package it: it’s up to us. Open Source belongs to everyone who creates it, uses it, and improves it, for the benefit of all. And, it can be had for free, if you want, with no “gotchas” like the Windows 10 driver fiasco.
The last Friday in July (today!) is the 16th annual System Administrator Appreciation Day, an obscure celebration started in 2000 (by a system administrator, of course) as a response to an HP ad showing users expressing gratitude to their sysadmin for installing the advertiser’s latest printer. To my knowledge, none of us have ever gotten flowers or even donuts on “our” day, but it does remind us in the profession that our job is to keep the users happy, mostly by keeping the machines happy, but also by attending to their needs in a prompt and professional manner.
I was reminded of the event not only by notices in the discussion forums and IT email lists, but by the fact that today, the replacement memory module for our network server came, and I installed it. A simple procedure, but one that takes a fair portion of the sysadmin’s bag of tricks and tools to accomplish. Bigger shops might have a service contract with the hardware vendor, but in many cases, the sysadmin is also the hardware mechanic.
For a few months, the server, a Dell T110, has been crashing every few weeks, fortunately not while we were on our two-month grand tour, but of concern, naturally, especially because it is a virtual machine host, and often has a half-dozen virtual machines running on it, which means, when the server goes down, half of our network goes with it. Virtualization is a great way to run different versions or distributions of operating systems when developing and testing software, so not too many have production roles in the network, but it is still an inconvenience to have to restart all of them in the event of a crash.
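For what it’s worth, restarting the guests after a host crash is a short exercise with libvirt’s command-line tool on a KVM host like ours; a minimal sketch, with hypothetical domain names standing in for our actual guests:

```shell
# List every guest libvirt knows about, running or shut off
virsh list --all

# Start the guests that should be up (domain names here are examples)
for vm in winxp-apps centos-build freebsd-test; do
    virsh start "$vm"
done

# Mark the guests with production roles to come up automatically
# with the host, so the next crash recovery is one step shorter
virsh autostart winxp-apps
```

Setting autostart only on the few guests with production roles keeps the development and test instances from piling onto the host every time it boots.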
A red light appeared on the front panel of the server, indicating an internal hardware condition, so it was time to check it out. First, hardware designed for use as servers (the T110 is aimed at small offices like mine) is a lot more robust than the average tower workstation you might have on your desk. Note, in the photo above, the heavy-duty CPU heat sink (air baffles have been removed for access to the memory modules–the four horizontal strips above the CPU fins). In addition, big computers have little computers inside that keep track of the status of the various components, like memory, CPU, fans, and disk drives, and turn on the light on the panel that indicates the machine needs service. Server memory has error-correction circuitry, as do most server-quality disk arrays, but the correction is limited to single-bit errors: an uncorrectable error will bring the system down.
System administrators depend on these self-correcting circuits and error indications to schedule orderly shutdowns for maintenance, so that the machine doesn’t crash in the middle of the workday. For most offices, this means late evening or weekend work. For 24-hour operations, like web sites, it means shifting the load to one or more redundant systems while the ailing one is repaired, so no data is lost. Companies like Dell supply monitoring software to notify sysadmins of impending problems, which is vital to operations where there is a room full of noisy servers and the admins are in a nice quiet office in the back room. In our case, with just one server, we don’t use the monitoring software regularly, but it is useful for telling us which component the red light is for; then we can look up the location in the service manual and order the right part, and hope the system doesn’t crash before it arrives.
Normally, businesses that are thriving and need to keep competitive in the market replace their machines at least every three years. Others, like ours, that operate on a shoestring and buy whatever resources we need for a project when we need them, tend to run machines five years or more, sometimes until repair parts are no longer available: since we run Linux, we have machines eight years or older that are still useful for running some network services.
Our server is almost five years old, so when I order replacement parts, they don’t always look like the ones we took out, or have the same specifications. For that reason, I usually take replacement as an opportunity to upgrade, replacing all of a group of components with a new set, which I did when a disk drive failed a couple years ago. However, this time, since I’m semi-retired and don’t have a steady cash flow, I only ordered one memory module, to replace the failing one. Memory comes in pairs, so having slightly different configurations in a pair causes the machine to complain on startup, but it still runs. The “upgrade” alternative would have replaced all four modules, or at least the two paired ones, with the larger size, at a cost of $150 to $300 instead of just replacing a $50 module and putting up with having to manually intervene at each reboot.
So, the sysadmin not only needs to keep the machines running, but running within budget, and making sure the operating systems and hardware capabilities can support the software users need to do their jobs. If he or she is doing their job right, there won’t be any red lights in the server room, and the sysadmin will look like they aren’t doing anything…
Our recent Grand Tour 2015 took us by car through 15 states, visiting relatives (some we didn’t know before–3rd and 4th cousins in the Pietz line), old school classmates, national parks and monuments, state capitols, hero monuments, and landmarks.
We also took our bicycle, a Bike Friday Tandem Traveler “Q” model, which spent most of the 14,000 km trip perched on top of the car to catch whatever insects were prevalent where we traveled. But, from time to time, we sought out bicycle trails and rode about 2% of the total (288 km). The tour marked the one-year anniversary of my cardiac bypass surgery and subsequent pulmonary emboli, for the latter of which I was still taking anti-coagulants (warfarin, which we called by its more common name: rat poison). So, we generally followed doctor’s advice not to stray too far from the car, and limited our rides to 15-38 km, though I’m sure that 20 km might be considered “too far.”
For the past 3 years, we’ve been documenting our bike rides (and a few hikes) with a GoPro sports camera mounted on the front (and sometimes the trailer) of our bicycle, and this trip was no exception. We talked about the art of making videos in an earlier post, from a technical standpoint, with some discussion of editing and integrating sound, and the importance of creating a story, rather than just a replay of the ride. From a content standpoint, there are two ways of making a video record of a bike ride: one is to simply turn on the camera and let it run, picking out highlights later in editing, and the other is to film points of interest as they go by. We haven’t yet taken the time and effort to use multiple cameras (a luxury us pensioners can’t justify) or to stage “selfies” by setting up the camera beside the trail and riding past it (which takes extra time, and we’re slow enough as it is), and, since we ride the same bike, we can’t shoot scenes of each other easily.
Our first ride was in Santa Fe, from our condo downtown to my granddaughter’s house 20 km south of the city, intended to be on city trails and frontage roads. However, without a detailed map, we missed turns and ended up on busy highways on the way out and way off course on the way back, depending on the kindness of strangers (with a pickup truck) to ferry us between where we ended up and where we should have been. The distance was a bit ambitious for our level of training and the high altitude (2100 meters, 7000 ft), so it was fortunate that getting lost actually made the return ride about 8 km shorter.
During the first day of my 50-year college reunion, we registered early, then went for a bike ride on the Rolling Prairie Trail, camera running. Regardless of the method, a 25-km ride generally yields between 20 minutes and two hours of video, which needs to be whittled down to a short “story” of impressions of our ride and interesting things we saw along the way (other than endless trees drifting by at 15-20 km/h). Nevertheless, we do get carried away sometimes, so the films tend to have lots of bridge crossings, runners and riders on the trail with us, meeting or passing, and foliage whizzing by, the apparent speed amplified by the narrow (2-3 meter) trail width, for viewers used to auto highways. Still, none of the travelogues have particularly exciting footage or a compelling story, other than the novelty of two old and overweight people rambling along flat trails at less than half the speed of the Tour de France peloton.
This is a one-shot video: we filmed segments along the entire trail, but only kept this long shot, which follows a fast downhill on the Pheasant Branch Creek from U.S. 12 to the end of the paved trail at the nature preserve.
Yes, it says “Part 1,” but we never got around to making Part 2, which essentially covers the route we took two years ago on our excursion through Madison. This ride was with our son and grandson. Hopefully, we taught the young man a few pointers about trail safety (keeping to your lane–about which, more later) and pacing yourself on longer rides: One reason we didn’t make Part 2 was because the younger contingent were far behind us most of the second half.
The Trout Run Trail in Decorah, Iowa, is not a rail trail, but circles the city along the river, up the creek to the trout hatchery, then up through the cliffs south of town. We chose not to tackle the cliff portion this early in our bicycling season, so rode to the first switchback and then back to the city campground.
Most of the videos lie dormant on Vimeo.com’s servers: I consider a video successful if both of my loyal followers watch it (some have zero plays). But, amazingly, one video in this group, “Jackson,” has gotten a lot of airplay, more than 500 viewings in the past month, since I cross-posted the link to a Facebook group of ex-pats and current residents of my home town. The video follows our ride from our B&B in my old neighborhood onto a bicycle trail that follows the river through town and circles the west side. Of course, there is no way to tell who watched it all the way through, or whether they saw the link on Twitter and thought it was a pirated long-lost Michael Jackson music video and clicked on it by mistake. But, 500 (out of the total group membership of 1380) either means it was interesting or that small town folks will watch anything that features their town. The compelling beat of Massimo Ruberti’s frenetic techno “Sabotage” on the sound track probably didn’t hurt, either.
I’ve collected a range of likely soundtracks from one of the internet repositories offering royalty-free music under a Creative Commons licensing policy: most public video streaming services strictly enforce copyright and license rules in submitted work. The trick is finding a backdrop that suits the ride and fits the edited length, then re-editing to match the scenes to the music’s phrasing and actual length. Some results are better than others, and some require truncating the selection to match the film length. In some, two or more shorter works are appropriate.
The Root River winds through the cliffs in the Driftless region in southeastern Minnesota, 50 km north of Decorah, Iowa, where we rode the week before. A large section of the trail was closed in the middle for bridge replacement, but the part between Whalan and Lanesboro is the most scenic, so we rode it two days. We stayed at a large campground on a bend in the river across from the trail, upriver from Whalan.
We drove to northern Minnesota to ride the Paul Bunyan Trail, but the mosquitoes were too dense to camp and Staples, 40 km to the west of Brainerd, had the nearest affordable motels. Staples also had a bike trail from downtown to the regional college and the Legacy Garden north of town. As long as we kept moving, the mosquitoes couldn’t catch us.
When we originally planned this trip, we intended to ride the length of the 200-km Paul Bunyan Trail and return, camping along the way, but a more practical plan called for riding out-and-back short segments from trailheads. The portion we actually rode was from the Northland Arboretum to the village of Merrifield, on North Long Lake, 15 km north.
Some of the videos get a bit long, despite best editing efforts, so this one got split into two segments, one for each direction. Part 2 has a surprise in the middle, the first of several large snapping turtles we came across in our travels. They apparently like to nest under the warm asphalt trails and dig out during the day.
We moved on to Park Rapids, at the western end of the Heartland Trail, which intersects with the Paul Bunyan trail at Walker, 60 km east. This was our longest ride of the trip, a pleasant 19 km run to the town of Nevis for coffee. This video is mercifully short, as we ran out of memory on the camera midway through the outbound leg, and didn’t notice.
Our final midwestern ride was on the hilly Lake Itasca State Park trail, from the Visitor Center 9 km to the Mississippi Headwaters. Shortly after we decided we had enough footage and turned off the camera, we had a scary near miss with a group of cyclists coming uphill around a curve on both lanes of the trail, not expecting a fast tandem coming downhill. We cut between them, down the middle, losing a water bottle in the evasive maneuver. One point for the “film it all and edit later” method, though maybe we don’t want to see the harrowing aspects of our travel mode, where you can be killed or seriously injured at what would be minor fender-bender speeds in a car.
After our tour of the Minnesota trails, we headed back west, stopping for a week in Montana for a family gathering. We took a day to check out the new Skyline Trail in Polson, riding a 14 km loop from the base of Polson Hill to the top of the Skyline, then down through town and onto the rail trail back to our starting point. This video is in several shots leading up to the summit, then three long segments on the trail and road. We kept the drag brake on during the descent to maintain control on the steep grade and the narrow, curving trail, with full speed only on the road, to which we switched after the trail turned into a pedestrian sidewalk.
Coming home after a long trip pulls one quickly back into the routine the trip was designed to break. However, a two-month absence makes reestablishing the routine much more difficult. The inside of the house looks almost exactly as we left it (in somewhat of a hurry, but prepared: empty refrigerator, empty garbage cans, etc.): a shelf fell off the wall, probably overloaded just before we left, and a bicycle tipped over, probably from digging out last-minute supplies behind it. The outside, however, is a profusion of blooming things that were just starting to wake when we left, and we missed most of the rhody season–those blooms are long gone. Fortunately we did have a service maintain the grounds while we were gone, so the place didn’t look quite as abandoned as it would have.
By now, the cat is used to extended stays of a week or two or three at the Just Cats Hotel, but she always clings to us for a few days after we all get home. This time is no exception. We’ve moved downstairs to the guest room to beat the unseasonable heat wave, and the cat has taken that in stride, curling up next to us, though she still thinks we should be upstairs. We’ve been busy finding window screens and hunting down our seldom-used fans to help keep the house cool. Our big oscillating floor fan perished last year and wasn’t replaced, and a brief search for a new one, even a table unit, was in vain: the heat wave caught us in Montana several days before we got home, and local stores quickly sold out of what isn’t usually a big-selling item in the normally mild Pacific Northwest.
Entropy continues to eat away at houses whether they are occupied or not: the upstairs bathroom tank-to-bowl gasket dried out from age, heat, and lack of use, so toilet repair was first on the list after unloading the car. Several days later, the tank bolts continue to seep, despite new bolts and rubber washers; sealing them is a careful juggling act between tight enough and too tight, making a seal without breaking the porcelain. The new bolts were larger in diameter than the old ones, which called for carefully drilling out the holes in the ceramic tank with a masonry bit, not something one expects to have to do. Suitcases were unpacked, laundry done, and finally, camping gear put away, though we intend to do some local overnight trips the rest of the summer. A trip to Costco to replenish supplies was in order, but the bulk items remain stacked in the garage, awaiting time to distribute them into their usual storage places.
We also brought back items from our cabin after staging it for sale as a furnished dwelling, including a small kitchen table and stools we originally used in our Bremerton town house, four houses back, in the 1990s, intending to replace my parents’ old kitchen table, which has been a bit large for the breakfast nook in our Shelton bungalow. The cabin has a set of folding tray tables that is adequate for meals; the table and stools have always been a bit crowded there. So, the 1930s kitchen table, disassembled, has joined the other items in the sewing/craft space in the basement, awaiting further disposition, perhaps as a craft table in place of the precarious tilting drafting table we use now. The plan for the rest of the summer is to unclutter and simplify our current home, whether or not we choose to downsize to a smaller house in the near future. Unfortunately, part of the clutter is the accumulation of two months’ worth of mail. Some progress has been made on reducing that, as I have chosen not to renew my professional society memberships and to let several other paper subscriptions lapse, anticipating being truly retired and traveling more.
Of course, retirement is a gradual process for the software entrepreneur and systems manager: maintenance and upkeep go on for existing clients, and the home network that supports the profession has been left running largely untouched for the past two months, so software patches and upgrades are in order for all the machines as well. Amazingly, the services on which we depend for access to data and security performed well for the entire two months, though a few of the non-essential experimental systems, unstable at best, did go off-line. The essential systems are still susceptible to functional degradation after a restart, and could become inaccessible if the cable company changes the router address after a reboot before we can reset the security tokens. Something to work on–I programmed the devices to require manually starting a password agent after reboot to reset the inter-computer communications between servers and clients, both internal and external to the network. There is a way to “permanently” allow encrypted communication between selected computers, but I’ve been reluctant to use that method.
The main issue is that, to save money, we have a regular residential Internet account, where the provider assigns the address more or less randomly, so our network gateway has to continually monitor its address and push any changes to the external web server. A commercial account can request a permanent internet address and link it to the Internet name service, but that is expensive. Even though our “stealth” web server and secure gateway aren’t registered, we still get bombarded with dozens of break-in attempts daily, as the “bad guys” simply scan the network address space for servers and attack them. In fact, “unlisted” addresses are more likely to be personal computers, which are notoriously insecure, than servers that have professional management and keep security protocols up to date.
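The address watch the gateway performs boils down to a small periodic check. Here is a minimal sketch in Python, assuming a cached-address file on disk; the public-address lookup and the actual update push (dynamic DNS or similar) are left to the caller, since the real setup’s mechanism isn’t described here:

```python
#!/usr/bin/env python3
"""Sketch of a dynamic-address watchdog for a residential gateway.

The ISP reassigns the public address at will, so a periodic job compares
the current address against the last one seen and signals the caller to
push an update (e.g., to the external web server) only when it changes.
The address lookup itself is a placeholder -- pass in whatever your
gateway reports.
"""
from pathlib import Path


def check_address(current: str, cache: Path) -> bool:
    """Return True (and refresh the cache) if the public address changed."""
    last = cache.read_text().strip() if cache.exists() else None
    if current == last:
        return False          # nothing to do; most runs end here
    cache.write_text(current + "\n")
    return True               # caller should push the new address upstream


if __name__ == "__main__":
    import tempfile
    cache = Path(tempfile.mkdtemp()) / "wan-address"
    print(check_address("203.0.113.7", cache))   # first run: True
    print(check_address("203.0.113.7", cache))   # unchanged: False
    print(check_address("198.51.100.4", cache))  # ISP moved us: True
```

Run from cron every few minutes, the common case (address unchanged) touches nothing, and only a genuine reassignment triggers the update path.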