Languages
C++
Smalltalk
Bash scripting (Unix)

Operating Systems
Linux
Windows 9x
Windows 2000
Windows XP

Databases
MySQL
PostgreSQL
MS Access

Text editors & word processors
Emacs
OpenOffice
Word

Some people will want to check out my programming résumé. Others will prefer to read a bit of history...

The Dim, Dark Past
I began programming in 1973 at the tender age of 13 when I joined a computer club called the R.E.S.I.S.T.O.R.S. in Princeton, New Jersey. We were junior high-schoolers running batch FORTRAN jobs using punched cards on Princeton University's IBM 360 mainframe. I remember us scampering through the computer center, weaving around the more sober (and much taller) computer scientists who had serious work to do.

You ran your cards through the card reader, waited up to 20 minutes, and an operator behind a wall placed the printout of your job into a bin. It made for a long turnaround when you were debugging. Through a window we could see the 360 in there with its lights twinkling away. The disk drives were the size of washing machines. We also programmed in SNOBOL, SPITBOL, PL/I and BASIC.

When I was an undergraduate at Harvard (1977-1981), I was exposed to things in a more systematic fashion, and I learned LISP, PDP-11 assembler, EL-1, and, of course, more FORTRAN. Here we were using a VAX/VMS system, and in the Harvard Lab for Computer Graphics, where I took a couple of courses, I had access to such powerful machines as a Tektronix 4014 (it was capable of drawing green lines on the screen!) and an AED 512 (it drew in other colours!).

Ranch To Market
In 1983 I formed a software company with John Fehr in Austin, Texas, called Ranch to Market. We specialized (not surprisingly) in FORTRAN programming on VAX systems using graphics on Tektronix scopes. Our flagship project was PASSAGE, an application that performed drive-time calculations on city street networks. It was commissioned, incredibly, by Federated Department Stores to decide where to site new malls. (I take some comfort in believing it was never actually used.)

PASSAGE would read US Census Bureau map files (called DIME files in 1980, replaced by the now-familiar TIGER files in 1990), assemble the street segments into cohesive networks of arcs and nodes, and compute minimum-path trees from a chosen "root" point to all the other points on the network. However, people don't drive the shortest path to get places: they drive the quickest path. DIME files included no data on the speeds cars could actually travel over those roads, so we included a provision for dedicated department store researchers to go out and drive around the city collecting average speed data for major arteries. Armed with a tape recorder, they would note the streets they drove, where they turned, and the times at which they reached intersections. PASSAGE would digest this raw data and assign the average drive speeds to the right streets.
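
PASSAGE itself was FORTRAN on the VAX, but the heart of the drive-time calculation is what is now commonly called Dijkstra's algorithm, run over travel times rather than distances. Here is a minimal modern sketch of the idea in C++ (the four-node network and its travel times are invented for illustration):

    #include <cstdio>
    #include <functional>
    #include <limits>
    #include <queue>
    #include <vector>

    // A street network as arcs and nodes: each arc carries a travel time
    // (segment length divided by the average speed collected in the field).
    struct Arc { int to; double minutes; };

    // Build the minimum-path tree from `root` by travel time (Dijkstra).
    // Returns, for every node, the driving time in minutes from the root.
    std::vector<double> driveTimes(const std::vector<std::vector<Arc>>& net, int root) {
        const double INF = std::numeric_limits<double>::infinity();
        std::vector<double> time(net.size(), INF);
        using Entry = std::pair<double, int>;      // (time so far, node)
        std::priority_queue<Entry, std::vector<Entry>, std::greater<Entry>> pq;
        time[root] = 0.0;
        pq.push({0.0, root});
        while (!pq.empty()) {
            auto [t, n] = pq.top(); pq.pop();
            if (t > time[n]) continue;             // stale queue entry
            for (const Arc& a : net[n])
                if (t + a.minutes < time[a.to]) {  // found a quicker route
                    time[a.to] = t + a.minutes;
                    pq.push({time[a.to], a.to});
                }
        }
        return time;
    }

    int main() {
        // Toy network: node 0 is the candidate mall site (the "root").
        std::vector<std::vector<Arc>> net = {
            /*0*/ {{1, 4.0}, {2, 9.0}},
            /*1*/ {{0, 4.0}, {2, 3.0}, {3, 7.0}},
            /*2*/ {{0, 9.0}, {1, 3.0}, {3, 2.0}},
            /*3*/ {{1, 7.0}, {2, 2.0}},
        };
        std::vector<double> t = driveTimes(net, 0);
        for (size_t i = 0; i < t.size(); ++i)
            std::printf("node %zu: %.1f minutes from root\n", i, t[i]);
    }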

As you can see if you explore current GIS applications such as ESRI's ArcView, the golden fleece for marketers in the US is the ability to associate locations with Census tracts, because average income and spending data are available for those tracts. Consequently, PASSAGE had a provision for drawing the underlying census tracts and assigning each tract to a drive-time zone from the root of the tree (five to ten minutes away, ten to fifteen minutes away, and so on). Given a certain point in the city, you could group the census tracts into drive-time zones, and therefore estimate the disposable income within, say, 15 minutes of that location. Diabolical. We went into negotiations with ESRI at some point to sell them PASSAGE, but no deal was struck.
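
The zoning step amounts to bucketing each tract by its drive time from the root and totalling the income per bucket. A C++ sketch of that idea (the tract times and income figures here are invented):

    #include <cstdio>
    #include <map>
    #include <vector>

    // A census tract tagged with its drive time (in minutes) from the root.
    struct Tract { double minutes; double disposableIncome; };

    int main() {
        std::vector<Tract> tracts = {
            {3.5, 1.2e6}, {8.0, 2.3e6}, {12.5, 0.9e6}, {21.0, 4.1e6},
        };

        // Assign each tract to a five-minute drive-time zone and total
        // the disposable income per zone.
        std::map<int, double> incomeByZone;
        for (const Tract& t : tracts)
            incomeByZone[static_cast<int>(t.minutes / 5.0)] += t.disposableIncome;

        for (const auto& [zone, income] : incomeByZone)
            std::printf("%2d-%2d minutes: $%.0f\n", zone * 5, zone * 5 + 5, income);
    }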

Using the same technology (FORTRAN/VAX/Tektronix displays), Ranch to Market also produced GARNER, a package to speed data collection by graduate students in the Botany department at Texas A&M University. These students were taking area, length and angle measurements on the leaves and fruits of unknown species (cucurbits, mostly from Mexico), using rulers, graph paper and protractors. GARNER allowed them to place the leaf or fruit on a digitizing table, trace the outline and some major features, and get all of the same measurements quite quickly.
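
GARNER was likewise FORTRAN, but the measurements themselves are simple computational geometry: once the traced outline is a list of digitized points, area falls out of the shoelace formula and length from summed segment distances. A small C++ sketch, with an invented five-point outline standing in for a leaf:

    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Pt { double x, y; };   // a point traced on the digitizing table

    // Area of a closed outline via the shoelace formula.
    double area(const std::vector<Pt>& p) {
        double s = 0.0;
        for (size_t i = 0; i < p.size(); ++i) {
            const Pt& a = p[i];
            const Pt& b = p[(i + 1) % p.size()];   // wrap to close the outline
            s += a.x * b.y - b.x * a.y;
        }
        return std::fabs(s) / 2.0;
    }

    // Perimeter: sum of the segment lengths around the outline.
    double perimeter(const std::vector<Pt>& p) {
        double s = 0.0;
        for (size_t i = 0; i < p.size(); ++i) {
            const Pt& a = p[i];
            const Pt& b = p[(i + 1) % p.size()];
            s += std::hypot(b.x - a.x, b.y - a.y);
        }
        return s;
    }

    int main() {
        // A crude five-point "leaf" outline, in centimetres.
        std::vector<Pt> leaf = {{0,0}, {4,1}, {5,4}, {2,6}, {-1,3}};
        std::printf("area: %.2f cm^2, perimeter: %.2f cm\n",
                    area(leaf), perimeter(leaf));
    }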

Years In The Wilderness
In 1985 I ceased work at Ranch to Market in order to pursue a career in outdoor adventure education. John went on to become a Smalltalk guru. But over the next ten years, when I was not teaching backpacking and related skills in the southwest deserts, the Rockies, or Alaska, I kept picking up small programming jobs.

In Boulder, Colorado, I spent some time in 1987 working with Jon Miller on a database for veterinary offices. This ran under SCO Xenix. At the National Outdoor Leadership School in Lander, Wyoming, I spent the winter of 1995 under the tutelage of Patrick Clark, designing something I'd never heard of, but which he told me was going to be a big deal down the line: a website. I learned HTML and used Mosaic on an Apple LC to build the school's first website. Later, at the school's office in Smithers, British Columbia, I automated the ordering and inventorying of gear (canoes, backpacks, sleeping bags, fleece jackets, etc.) using FileMaker Pro on another Mac LC.

John also introduced me to Dolphin Smalltalk, which I played around with in my spare time. I found the pure-object model very appealing and really enjoyed writing small applications with it.

Linux and C++
After my outdoor adventure career was over, I settled in Smithers and began working for Silvertip Software as a technical writer.

Soon Silvertip decided to migrate RoomTime (a database application that facilitates room and equipment reservations) from Visual FoxPro under Windows to PHP pages served by Apache on a Linux server with a MySQL database. The Windows version was limited to about 50 simultaneously connected users, but the PHP version would be open to thousands. Such an application would ideally run on a turnkey server installed on the intranet of a large corporation managing many rooms in multiple cities.

My job morphed into creating these servers. A key concern was that the servers be secure (so that clients would feel comfortable installing them within their own networks) and require no maintenance, so that clients would not need personnel trained in Linux. I remembered various elements of Unix from years before, but a significant part of my job was acquiring the knowledge of Linux (Red Hat 8) and networking I would need to do the job: just the kind of job I like! For security, I shut down all incoming ports except HTTP, HTTPS and SSH (in case we wanted to send in updates). There was break-in detection via Logwatch and Tripwire, and the server could send out mail if it was in distress. A set of cron scripts cleaned out the logs. So far, these servers have proven able to run for remarkably long stretches (up to a year) without needing any attention, or even a reboot.

In the process of learning Linux and installing the various open-source packages, I began to poke around in the C code. It seemed a fairly easy language to understand, and I was able to make some small hacks without any formal training. In 2002 I decided it was time to repeat history and teach a young person how to program, in much the same low-key way as I had learned back in the '70s. I found a student, and since technology had changed, there was no point in teaching him FORTRAN. I settled on C++, and we would use gcc on a 486 that I had running Linux. The fact that I did not know C++ just added to the attractiveness of the plan. It was the perfect structure for learning: his interests guided where we went, but I was able to stay one step ahead of him for the six months that we pursued this.

GIS Consultant
In 2003 I decided to try my hand at map making, and spent about six months playing with MapInfo 5, GRASS, Global Mapper, and other free software and data. In mid-2004 I was hired as a consultant by GGL Diamond Corp., a small exploration company specializing in searching for diamonds in the Northwest Territories. At present I am managing their database of exploration data, as well as producing hundreds of maps a year. I primarily use MapInfo Professional 8 with Encom's Discover 7 plug-in, but I also use Geosoft's Oasis montaj and Encom's Profile Analyst. I've learned some geophysics and geology along the way. There's scripting to be done in all these applications, so the programming continues. I also support the company's employees in using these pieces of software.
