I have written software in many languages and for many platforms. I have forgotten more languages than I remember, no doubt. My very first experience was in BASIC, connected by a 110 baud modem on a TTY to a computer in the basement of a professor at the local college. Paper tape was the permanent store, with a 1:10 error rate; yeah, those were the days! In this part of my portfolio I give examples of some of the more interesting software I have written, or had major design input on, over the years.
C# and Visual Studio :: SnapPlus (2011-14)
Port nutrient management models from Delphi to C#, working with the lead scientist from University of Wisconsin Soil Science. Work in a team of five as a senior developer/project manager. Build up more knowledge of the science with courses in hydrology at UW-Madison, on top of course work at Clemson in Water Law, Land Use Planning and Urban Design.
Create a .NET managed code interface to the NRCS RUSLE2 unmanaged Win32 API; design and implement distributed processing of RUSLE2 with a 3x speedup. Design and implement C# generic classes for the SnapPlus models, which include RUSLE2, the Phosphorus Index, Compliance and others. Use .NET generics and reflection to build a code generator for the DAL (data access layer); in essence this is an object-relational mapper (ORM). Build reports using MS ReportViewer, working with a newly hired software engineer. Create application logging which inherits from the MS EventLog. Write Visual Studio build scripts for a complex multi-solution application. Design and implement the installer in WiX.
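The SnapPlus DAL generator itself is not something I can reproduce here, but the core reflection idea is easy to sketch. Below is a minimal illustration of generating SQL from a model class's properties; the class, property and method names are hypothetical, not the actual SnapPlus code.

    using System;
    using System.Linq;
    using System.Reflection;

    // Hypothetical model class, standing in for a SnapPlus model record.
    public class FieldRecord
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public double Acres { get; set; }
    }

    public static class SqlGenerator
    {
        // Reflect over the public properties of T and map each one to a
        // column of the same name, producing a parameterized INSERT.
        public static string InsertFor<T>()
        {
            PropertyInfo[] props = typeof(T).GetProperties(
                BindingFlags.Public | BindingFlags.Instance);
            string columns = string.Join(", ", props.Select(p => p.Name));
            string values = string.Join(", ", props.Select(p => "@" + p.Name));
            return $"INSERT INTO {typeof(T).Name} ({columns}) VALUES ({values})";
        }
    }

    public class Program
    {
        public static void Main()
        {
            // Prints: INSERT INTO FieldRecord (Id, Name, Acres) VALUES (@Id, @Name, @Acres)
            Console.WriteLine(SqlGenerator.InsertFor<FieldRecord>());
        }
    }

The real DAL covered far more than a single INSERT, of course; this just shows the pattern of reflecting over properties to generate the data-access plumbing.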
ArcGIS JavaScript API :: SnapPlus, University of Wisconsin Soil Science (2011-14)
Proof-of-concept for the next generation of SnapPlus on the web using the ArcGIS JavaScript API. Pull in web service layers from our partner at DATCP (the Wisconsin Department of Agriculture, Trade and Consumer Protection).
ArcMap ModelBuilder :: SnapPlus (2011) and Land Use Planning, Clemson University (2010)
Get to a more advanced understanding of Arc's ModelBuilder. For me, ModelBuilder is like writing Unix shell scripts, only more fun because it's in a GUI.
Run multiple data analysis models for a course in Land Use Planning (see the map in the GIS Maps section of this portfolio), including sophisticated raster and vector transformations.
Create per-county and per-section shapefiles from any WI layer. This is a generalizable approach that works for most any GIS layer.
Joomla Component :: UU're Home, Asheville NC (2011-2016)
In the process of moving the UU're Home website into Joomla, I needed to create a component that would show the B&B listings from the legacy system, which was in MySQL and PHP. Using Joomla's object-oriented approach, I was able to create a fairly simple component in PHP in a few hours. The one subtlety was that a preview of the city/state for a listing can be seen by anyone, but the listing proper can only be seen by members who have logged into Joomla.
Flash Slideshow :: Western Carolina University, Cullowhee NC (2010)
Wanting to liven up the WCU home page and extend my team's technology into Flash, I gave myself the assignment to build a Flash slideshow. I found an example in AS2 to work from and built one that used an XML control file for things like the slide files and their titles. There were only a few graphic flourishes, like the big forward and backward buttons I had seen at Purdue and an overlay to show the mountain swooshes. This worked so well we reused the code on two other high-profile custom web site projects, which was what I was hoping for when I started.
RSS Feed Outage Detection :: Western Carolina University, Cullowhee NC (2009)
After my team had rolled out a new home page for the campus web site just before the holidays, I got a call on Christmas Day about an error showing up on the home page. An RSS feed used to display campus news had failed, which was causing the entire page to throw a 500 error. Knowing this was over the head of everyone on my team but me, I rolled up my sleeves and wrote a quick HTTP status check in VBScript. It was ready and in production before we had our Christmas meal.
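That VBScript is long gone, but the idea was just a guarded fetch: check the feed's HTTP status before rendering it, and fall back to static content on any failure. A comparable sketch, written here in C# rather than the original VBScript, with a placeholder feed URL:

    using System;
    using System.Net;

    public class FeedCheck
    {
        public static void Main()
        {
            // Placeholder URL; the real check pointed at the campus news RSS feed.
            string feedUrl = "https://www.example.edu/news/rss.xml";
            try
            {
                var request = (HttpWebRequest)WebRequest.Create(feedUrl);
                request.Method = "HEAD";
                request.Timeout = 5000; // fail fast so the page render never hangs
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine(response.StatusCode == HttpStatusCode.OK
                        ? "Feed is up; safe to render."
                        : "Feed responded, but not OK; use the static fallback.");
                }
            }
            catch (WebException)
            {
                // Timeout, DNS failure, or a 500 from the feed: skip it, keep the page alive.
                Console.WriteLine("Feed is down; use the static fallback.");
            }
        }
    }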
Madison SSO in Desire2Learn :: UW-Madison DoIT, Madison WI (2008)
After years of trying to get D2L to support a standards-based SSO, like the Pubcookie system that Madison had adopted, D2L decided to provide this support. Getting SSO to work was the #1 request from the Madison team, based on user surveys they had done, so this was good news. The support was going to come to us through Shibboleth. I had to assemble a team to put this together, the key part being the Madison middleware folks who supported Pubcookie. We determined early on that there was a good handoff between Shibboleth and Pubcookie, so this passed the feasibility test. Shibboleth would be new for campus, but since the IT Architect was one of Shib's creators there was lots of support for doing this. What was hardest about this project was doing an end-to-end test that was NOT in a production environment. We needed D2L, Pubcookie, Shibboleth, uPortal, and SIS/PeopleSoft to all take part in this test dance. There were also several delays in the schedule, since for most everyone on the team this was a #3 or lower priority project; however, the team never gave up. I left Madison before the go-live, but it went quite smoothly without me, which is mostly a testament to their skills.
Moodle customizations for MEPP :: UW-Madison DoIT, Madison WI (2008)
The Master of Engineering in Professional Practice (MEPP) is one of the oldest online degree programs on the Madison campus. They were one of the early adopters of WebCT, but never moved over to D2L, being rather particular about what they needed out of an eLearning platform. With WebCT reaching end of life, they decided to move to Moodle. I was engaged to help out with a number of small things to make sure they were right. There was one medium-sized customization where I used the language feature to change a number of terms Moodle showed in its UI. This is the same feature used for the 100+ languages that Moodle has translations for.
Federated Id Management using Accessibility Profiles :: UW-Madison DoIT, Madison WI (2006)
Our senior IT Architect asked me to come up with a proof-of-concept that would demonstrate how per-user accessibility profiles could work in a Federated Id Management world. This was a project for the feds doing e-Authentication groundwork through an Internet2 middleware initiative for Shibboleth, which our IT Architect was involved with. Since I had been tooting the horn of standards for accessibility profiles out of IMS Access For All and the work coming out of the University of Toronto ATRC, I now needed to put some code where my mouth was! What I came up with was a demo that had all the parts in it: LDAP using OS X OpenLDAP, two Shib domains, two Moodle servers, and ConceptTutor, which my team had developed in Engage. The key piece for me was using Moodle, which had a Shib plug-in that dealt with wiring up the attributes coming out of Shib; it also showed, with a widely used piece of software, that this was real. I did have to lean on some folks from the I2 world to get the Shib instance behaving, but other than that I was able to put all the pieces together.
Presentation @ Internet2 Spring 2006 [PPT]
Moodle Customizations :: Educational Software Company, Austin TX (2006)
After a presentation about the Tools Interoperability work at the Winter Sakai 2005 conference, a local company from Austin approached me about doing consulting work on several Moodle customizations. What they needed was to weave their Flash-based tools into Moodle, which would then be offered through a publisher as an add-on to textbooks. I did two rounds of work for them: the first to get their tool to send quiz results to Moodle, and the second to handle an ecommerce transaction (payment by credit card) before issuing the PIN that turns on the Moodle tools for a textbook. These used a custom theme which reworked the login and status display, a plugin for a content type, and an authentication plugin with an ecommerce hook. Some of the code was object-oriented PHP and some was of the procedural variety, depending on the plugin API and code base.
eTEACH 3.0 :: UW-Madison DoIT, Madison WI (2005)
This version was a rewrite of eTEACH 2.0, which was based on MS technology, so that it would be cross-platform. With several other successes in Flash behind my team, we decided to do this in Flash. The team included Glenn from Clotho and Mike, the original developer of eTEACH. The architecture had a clean separation of client and server tiers, which talked using a simple XML command protocol. The heavy lifting was done on the Flash side, with a timeline-based tool for authoring. The timeline data was stored in an XML format Apple had created for QuickTime. Rather than build a web framework and user accounts, we adapted Moodle. The only piece that remained from eTEACH 2.0 was the PPT-based processing, for which we needed to run a Windows server. My small development piece was to create a new Moodle file type for the eTEACH content; mostly my job was to manage the project.
Tools Interoperability Framework @ Alt-I-Lab 2005 :: UW-Madison DoIT, Madison WI (2005)
IMS Global is an eLearning standards-making organization for which I had been the Wisconsin representative since 2000. A working group formed in 2004 to create a "tools interoperability" specification that would allow third-party tools to be easily integrated into an eLearning platform like Blackboard or WebCT. This group came up with a Web Services specification and did a demo at the annual IMS meeting in 2005 (Alt-I-Lab). My team supplied the content, biochemistry lessons with a quiz using ConceptTutor, and we also did a proof-of-concept in Moodle.
Presentation @ Alt-I-Lab [PPT]
Lecture/Lab Section Data Feed for Learn@UW :: UW-Madison DoIT, Madison WI (2004)
After years of hand-crafted setup to deal with the idiosyncrasies of Lecture/Lab sections at Madison, my management strongly encouraged me to find an automated solution. This was going to save an FTE, which with state funding diminishing meant an FTE that could be used elsewhere where it was really needed. Desire2Learn was releasing a new datafeed API that incorporated techniques our friends at Ohio State had used in their datafeed, techniques similar to what we needed, so I decided to hop on that horse. I found out from the curricular data person in DoIT that it WAS possible to generate queries for all the lecture/lab/discussion scenarios; these had already been rolled out for the data warehouse. So I reassembled the team that had done version 1 of this datafeed. Everyone was up for the challenge, including D2L, and they were hitting on all cylinders. There was an added twist of transforming snapshot (full) data from the SIS into incremental data for the new D2L API. I brought in someone from another project (eTEACH) to do just that part in .NET; it was an exercise in hash tables and XML, and it worked just fine. We had JUST enough time to develop, test and roll this out for the Fall semester.
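The snapshot-to-incremental transform boils down to keeping the prior snapshot in a hash table and diffing today's snapshot against it. A minimal sketch of that idea in C#; the record shape and identifiers are made up, and the real feed worked with SIS enrollment records and emitted the D2L XML format rather than console output:

    using System;
    using System.Collections.Generic;

    public class SnapshotDiff
    {
        // Compare the previous full snapshot with the current one, both keyed
        // by record id, and report only the adds, changes, and drops that the
        // downstream API needs to hear about.
        public static void Main()
        {
            var previous = new Dictionary<string, string>
            {
                ["e1"] = "student42:CHEM103:LEC001",
                ["e2"] = "student57:CHEM103:LAB302",
            };
            var current = new Dictionary<string, string>
            {
                ["e1"] = "student42:CHEM103:LEC002", // section changed
                ["e3"] = "student88:CHEM103:LAB302", // new enrollment
            };

            foreach (var kv in current)
            {
                string old;
                if (!previous.TryGetValue(kv.Key, out old))
                    Console.WriteLine($"ADD    {kv.Key}: {kv.Value}");
                else if (old != kv.Value)
                    Console.WriteLine($"UPDATE {kv.Key}: {kv.Value}");
            }
            foreach (var kv in previous)
            {
                if (!current.ContainsKey(kv.Key))
                    Console.WriteLine($"DELETE {kv.Key}");
            }
        }
    }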
My UW-Madison using Epicentric :: UW-Madison DoIT, Madison WI (1999-2001)
My UW-Madison was a first-generation horizontal portal for higher education, for which I was the project manager and architect. Rather than building this from scratch, I opted for more of a framework approach, with pre-built parts that we would put together in a custom way. Fortunately, a good framework product emerged just when I started looking for it (I did a product search with requirements vetted by a team within DoIT): Epicentric. A Silicon Valley startup, Epicentric was doing exactly what I needed in a new product space being called portals. In a win-win negotiation, I got a good price for Epicentric as an early customer in a new market segment for them. There was a learning curve for my new team, since this was on Java and JSP, but not so steep that it couldn't be mastered quickly. It wasn't so much Java as the Epicentric API that was the learning curve, and Epicentric proved to be a good choice as we dug in. I pushed for a solid development methodology with dev and test environments, along with version control. I got someone reassigned to this team to handle the builds and version control, who quickly took a liking to CVS. We figured out how to get the entire set of Epicentric JSPs into CVS and move changes along using CVS labels. We also deployed a full-featured demo environment early on, which was crucial when my Director sold this new portal concept to campus.
WebCT 1.3 to 3.0 Migration :: UW-Madison DoIT, Madison WI (1999)
Having run the new WebCT for over a year, the team was facing a migration from 1.3 to 3.0. We had passed on the 2.x version as not being stable enough. But the hill for going to 3.0 looked a bit higher. After hearing a number of options and concerns, I decided to lead this particular project. I ran it as a
Internet version of Epic Medical Records :: Epic Systems, Madison WI (1998)
The first entrée into the Internet for Epic Systems, a leading provider of medical records systems for the HMO market, was done in a combination of ASP and M. Epic has a high-speed database built on the object-oriented database Caché, for which the API is in M (MUMPS). I was the senior member of the team putting this product together. I fussed over HTML in pre-CSS times, generation of VB objects from the mainline Epic code base, and the performance of the ASP; I set up dev and test environments populated with test data, and handled packaging and installation of the software.
Internet-Based Automatic Publishing System, US Patent #5727156 :: Dirk Herr-Hoyman and Louis Hubert (1998)
"A simple method and apparatus for posting hypertext documents ... while securing unauthorized modification of the posted hypertext document. ..." This patent is based on SWUP (Simple Web Update Protocol) which rides on top of HTTP. US Patent #5727156 [PDF]
Hotoffice :: Linkstar, Boca Raton FL, Madison WI, Kalamazoo MI (1995-96)
The Hotoffice product was an early SOHO (small office, home office) product in the first wave of the dot-com era. The concept was to create an easy-to-use means of having an Internet web site for a small business. This was an extranet and intranet before those terms were even coined. With a colleague I went to grad school with at WMU, we created a client/server style of system which used a protocol we called SWUP (Simple Web Update Protocol). SWUP made secure connections using a two-phase public/private key cryptographic algorithm, which we licensed from RSA; this was before SSL, mind you. There were complex systems on both the client and server sides: my colleague did the client side for Windows, mostly in VB, and I did the server side, first in Perl and later in ASP. Among the many details, I created a custom full-text search using MS Indexing Service in C++.
Electronic Journal of Extension :: University of Wisconsin Extension, Cooperative Extension (1992-94)
eJOE was one of the first online peer-reviewed academic journals; it started in a text/PS format delivered by email and Gopher, and later moved to HTML on the web. This journal was read both by academics at Land Grant universities and by county extension agents all over the US. The county offices presented a particular challenge, as the bandwidth was typically a 9600 baud modem when we started. This was put together with Perl at the core, plus a C-based system for email commands called Almanac. Lots of Perl regexp tricks were used. There was YACC pre-processing over the email commands and regression-style unit testing. The journal itself was done using a PageMaker layout, which was processed to generate text and PS (PostScript) versions. The metadata for each article was created using a Unix bibliographic format, roffbib, and could be generated for use on the article or in a table of contents. The first version was in text for email and Gopher, with those who had high bandwidth able to get the PS format with graphics using MIME-enabled email. All cutting edge at the time. When the web started to emerge in 1994, I did an HTML version using the same underlying framework.
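roffbib works over refer-style bibliographic records: plain text files where each field is a line starting with a % key. A hypothetical entry of the sort used for an article's metadata (the field values here are made up, not from the actual journal archive):

    %A Jane Q. Author
    %T Reaching County Agents with Electronic Journals
    %J Journal of Extension
    %V 32
    %N 1
    %D 1994
    %K extension, electronic publishing

The same record could then be formatted for the article header or pulled into a table of contents.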
Public Health Research using SAS :: University of Wisconsin Department of Family Medicine (1992)
I would regularly create SAS jobs to run the statistics for large public health studies. For example, one study looked at breast cancer in the Midwest over a 10-year period and used a multivariate factor analysis in a 500+ line SAS program.
Family Medicine Statewide Email :: University of Wisconsin Department of Family Medicine (1991)
DFM wanted to run a statewide network to provide email to clinics all over the state. My part was to set up the Internet email gateway from the UW's email system. This involved writing custom sendmail rules that would route messages and rewrite email addresses. I used Eric Norman's sendmail rules as a start and ended up being a minor sendmail expert by the time I was done.
Internet Email :: UW-Eau Claire, Computer Science (1989)
With a connection to BITNET coming to campus, I found software that would bring it into our CS VAX/VMS system. Soon thereafter, BITNET rolled out gateways to this new thing called the Internet. Voila, we had Internet email first in CS! I then worked with the campus central computing staff to set up this same system for the new campus VAX/VMS system.
Gradebook :: UW-Eau Claire, Computer Science (1988)
I wrote my own online gradebook using VMS DCL and indexed files. Without a database, I created my own fast-lookup substitute. Without a spreadsheet, I put in my own calculations for weighting the final grades. Since I was already setting up per-class space on the VAX/VMS cluster, this was an easy add-on as another service for students to use. Students got to see what grade they were heading for and had the responsibility to correct scores that weren't right. In a real sense, this was a precursor to the eLearning systems like WebCT and Blackboard in the late '90s.
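The weighting itself was simple arithmetic: average each category, then combine the averages using the course's weights. A small sketch of that calculation, with made-up categories and weights, written here in C# rather than the original DCL:

    using System;
    using System.Collections.Generic;

    public class GradeBook
    {
        // Combine per-category averages into a final percentage using weights.
        // The categories, weights, and scores are illustrative only.
        public static void Main()
        {
            var weights = new Dictionary<string, double>
            {
                ["Homework"] = 0.30,
                ["Midterm"]  = 0.30,
                ["Final"]    = 0.40,
            };
            var averages = new Dictionary<string, double>
            {
                ["Homework"] = 88.0,
                ["Midterm"]  = 79.5,
                ["Final"]    = 91.0,
            };

            double finalGrade = 0.0;
            foreach (var kv in weights)
                finalGrade += kv.Value * averages[kv.Key];

            Console.WriteLine($"Weighted final grade: {finalGrade:F1}"); // about 86.7
        }
    }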
Network Flooding Simulation :: Western Michigan University (1986)
For my final project in a course on Computer Simulation, I created a model to simulate various network algorithms. The simulated network load would be increased until it reached a flooding condition. The point was to see how the various network routing algorithms performed. This included Poisson (lambda-rate) random input to simulate real network traffic according to queuing theory. The model was mostly written in Minitab, with some C to generate the pseudo-random input. As I recall, the Ethernet-style contention detection algorithm came out on top (but I don't have the evidence to prove that anymore).
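Poisson traffic means exponentially distributed gaps between arrivals, which are easy to produce by inverse-transform sampling of a uniform random number. The original generator was a bit of C feeding Minitab; the same trick is sketched here in C#, with an illustrative arrival rate:

    using System;

    public class ArrivalGenerator
    {
        public static void Main()
        {
            var rng = new Random(1234);   // fixed seed so runs are repeatable
            double lambda = 2.0;          // mean arrival rate: 2 packets per time unit

            // Inverse-transform sampling: if U ~ Uniform(0,1), then
            // -ln(1 - U) / lambda is exponentially distributed with rate lambda.
            double clock = 0.0;
            for (int i = 0; i < 10; i++)
            {
                double gap = -Math.Log(1.0 - rng.NextDouble()) / lambda;
                clock += gap;
                Console.WriteLine($"arrival {i + 1} at t = {clock:F3}");
            }
        }
    }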
BookReader :: Data Research Associates, St. Louis (1984)
Finished off the software for a bookreader contraption which the president brought back from England. This would scan the new magnetic strip technology on each book for checkout. There was a custom language for the software, which needed to look at the low-level device status of this custom device. For questions, I had to be aware of the six-hour time difference with England when scheduling phone calls. After getting the raw data from the bookreader, I had to get into the DRA bibliographic database system to look up the book and patron information, using the MARC-based bibliographic records format, and display it for the librarian doing the checkout or check-in.
System Performance Reports :: Data Research Associates, St. Louis (1983)
To keep up with the system health of the VAXes used for software development, I created DCL scripts that would generate a daily report on system stats like CPU and memory load. This was emailed to me so I could quickly see if something was going awry. I later gave these scripts to customers of the library automation software as part of the package I left them with.
3D Histological Section :: LSU Neurology, New Orleans (1981)
Using an Apple IIe with 64K of memory, a 10 MB hard drive and an Apple drawing pad, and writing in Apple Pascal, I created a 3D model of brains for research in neurology. This was a precursor to CAT scanning (no pun intended) and used histological sections of cat brains that were stained to show brain structures. These sections, which were cut very thin and mounted on slides, were projected onto the pad and traced. This created the data for the 3D model. I used linear algebra transformations to allow for all the 3D operations one might expect. As you might imagine, this was kinda slow by today's standards: 200 points would take 15 seconds to move and refresh. Not exactly Pixar speed. Still, for 1981 this was cutting-edge research on a modest budget.
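The Pascal source is long gone, but the 3D operations were the standard linear algebra: multiply every traced point by a transformation matrix and redraw. A minimal sketch of one such transform, a rotation about the z axis, written here in C# rather than the original Apple Pascal, with made-up points:

    using System;

    public class Rotate3D
    {
        // Rotate a set of 3D points about the z axis by a given angle,
        // the same kind of per-point transform the Apple II version applied on each redraw.
        public static void Main()
        {
            double[][] points =
            {
                new[] { 1.0, 0.0, 0.0 },
                new[] { 0.0, 1.0, 2.0 },
                new[] { 3.0, 1.0, 1.0 },
            };
            double theta = Math.PI / 6.0; // 30 degrees
            double c = Math.Cos(theta), s = Math.Sin(theta);

            foreach (var p in points)
            {
                double x = c * p[0] - s * p[1];
                double y = s * p[0] + c * p[1];
                double z = p[2];
                Console.WriteLine($"({x:F3}, {y:F3}, {z:F3})");
            }
        }
    }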