I have included various algorithms that show the range of my software skills. My apps and algorithms usually interact with electronics, optics, cameras, and motors.
At Motorola, I developed a system to measure complex current waveforms.
Using an oscilloscope, two or four channels were set to progressively more sensitive vertical scales, with all of the scope probes connected to the same measurement point. A shunt resistor, or a special "zero-ohm" shunt circuit, converted the current to a voltage.
With the channels set at different sensitivities, at least one channel could always capture an unsaturated current waveform, while the more sensitive channels saturated.
I developed C/C++ software to control both the LeCroy oscilloscope and the Unit-Under-Test (typically a mobile pager). The software analyzed the four channels, then normalized and combined the waveforms, discarding the clipped data from the saturated channels. This yielded a high dynamic range, measuring current-waveform artifacts from 1 μA to 200 mA.
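The channel-merging step can be sketched as follows. This is a minimal illustration, not the original code: the struct layout, the 5% guard band, and all names are assumptions. Channels are ordered from most to least sensitive, and each sample takes the reading from the most sensitive channel that is not clipped.

```c
#include <math.h>
#include <stddef.h>

#define NUM_CHANNELS 4

typedef struct {
    const double *samples;   /* raw readings in volts */
    double amps_per_volt;    /* shunt + vertical-scale conversion factor */
    double full_scale_volts; /* clipping threshold for this channel */
} Channel;

/* Merge one sample index across channels into a single current value.
   Channels are ordered most sensitive (index 0) to least sensitive. */
double merge_sample(const Channel ch[NUM_CHANNELS], size_t i)
{
    for (int c = 0; c < NUM_CHANNELS; c++) {
        double v = ch[c].samples[i];
        /* Accept the first channel that is comfortably below full scale. */
        if (fabs(v) < ch[c].full_scale_volts * 0.95)
            return v * ch[c].amps_per_volt;
    }
    /* All channels saturated: fall back to the least sensitive one. */
    return ch[NUM_CHANNELS - 1].samples[i] * ch[NUM_CHANNELS - 1].amps_per_volt;
}
```

Because each channel carries its own conversion factor, the merged waveform comes out in consistent units regardless of which channel supplied each sample.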
This system could identify faulty circuits by placing the electronic device in various modes and comparing the total current drain with individual circuits turned on or off.
By toggling individual circuits on the device under test, the system could detect the change in total device current caused by a single disconnected or shorted Liquid Crystal Display (LCD) segment (typically ±1–5 μA).
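The fault check reduces to comparing a measured current delta against an expected one. The sketch below uses illustrative names and values; the actual limits came from the product's specifications, not from this code.

```c
#include <math.h>
#include <stdbool.h>

typedef struct {
    const char *name;          /* circuit being toggled, e.g. an LCD segment */
    double expected_delta_ua;  /* expected change in total current, microamps */
    double tolerance_ua;       /* allowed deviation before flagging a fault */
} CircuitTest;

/* Pass if the on/off current delta is within tolerance of the expected value.
   A shorted or disconnected circuit shifts the delta outside the window. */
bool circuit_ok(const CircuitTest *t, double current_on_ua, double current_off_ua)
{
    double measured_delta = current_on_ua - current_off_ua;
    return fabs(measured_delta - t->expected_delta_ua) <= t->tolerance_ua;
}
```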
I believe this same measurement system could be used today on a modern smartphone to detect a single dead pixel on a display, or faults in other smartphone components.
Language and Platforms: Windows C, developed on Visual Studio. Pager code: 6809 assembly language.
I inherited, then redesigned, a PID algorithm to control a DC motor that positioned an optical stage. The speed of the stage had to be controlled precisely: a single-column camera was triggered in sync with the motion to grab image "slices", which were stitched together to create a snapshot of four fingers. The DC motor operated in dynamic-braking mode as the user slid the optical stage from left to right.
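A minimal sketch of such a PID velocity loop follows; the gains, names, and anti-windup handling are illustrative, not the original firmware. The loop runs at a fixed rate, and the clamped output drives the motor, with negative values interpreted as braking effort.

```c
typedef struct {
    double kp, ki, kd;         /* proportional, integral, derivative gains */
    double integral;           /* accumulated error */
    double prev_error;         /* error from the previous step */
    double out_min, out_max;   /* actuator limits */
} Pid;

/* One controller step: setpoint and measurement are stage velocities,
   dt is the fixed control period in seconds. */
double pid_step(Pid *pid, double setpoint, double measured, double dt)
{
    double error = setpoint - measured;
    pid->integral += error * dt;
    double derivative = (error - pid->prev_error) / dt;
    pid->prev_error = error;

    double out = pid->kp * error + pid->ki * pid->integral + pid->kd * derivative;

    /* Clamp to actuator limits; undo the accumulation when clamped,
       a simple guard against integral windup. */
    if (out > pid->out_max) { pid->integral -= error * dt; out = pid->out_max; }
    if (out < pid->out_min) { pid->integral -= error * dt; out = pid->out_min; }
    return out;
}
```

Keeping the control period fixed matters here: the camera trigger and the velocity loop share the same timebase, so a stable dt keeps the image slices evenly spaced.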
Language and Platforms: Embedded C, developed with the Keil IDE.
I designed a velocity- and position-sensing algorithm built around a very fast correlator. The correlator used a "packed binary image" (black-and-white pixels) to implement a highly efficient, bit-wise 2D correlation in real time. With this correlator, I implemented trackpad/trackball functionality for an ultrasound fingerprint swipe sensor (192 × 8 pixels). The algorithm analyzes the velocity vectors to identify the current direction of motion, disabling sections of the correlator table to maximize efficiency. Inertia is simulated to mimic the mass of a trackball and its gradual deceleration.
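The packed-binary technique can be illustrated like this. It is a sketch under assumed conventions (one 192-pixel row per three 64-bit words, pixel 0 in the most significant bit), not the shipped correlator: a frame-to-frame distance is just XOR plus popcount, and candidate horizontal displacements are evaluated by shifting packed rows.

```c
#include <stdint.h>

#define WORDS_PER_ROW 3   /* 192 pixels / 64 bits per word */
#define ROWS 8

static int popcount64(uint64_t x)
{
#if defined(__GNUC__)
    return __builtin_popcountll(x);
#else
    int n = 0;
    while (x) { x &= x - 1; n++; }   /* clear lowest set bit */
    return n;
#endif
}

/* Count mismatched pixels between two packed binary frames:
   one XOR + popcount per 64 pixels, instead of 64 per-pixel compares. */
int frame_distance(uint64_t a[ROWS][WORDS_PER_ROW],
                   uint64_t b[ROWS][WORDS_PER_ROW])
{
    int mismatches = 0;
    for (int r = 0; r < ROWS; r++)
        for (int w = 0; w < WORDS_PER_ROW; w++)
            mismatches += popcount64(a[r][w] ^ b[r][w]);
    return mismatches;
}

/* Shift one packed row left by dx pixels (0..63), carrying bits across
   word boundaries, to test a candidate horizontal displacement. */
void shift_row(const uint64_t in[WORDS_PER_ROW],
               uint64_t out[WORDS_PER_ROW], int dx)
{
    for (int w = 0; w < WORDS_PER_ROW; w++) {
        out[w] = in[w] << dx;
        if (dx > 0 && w + 1 < WORDS_PER_ROW)
            out[w] |= in[w + 1] >> (64 - dx);
    }
}
```

The displacement with the smallest mismatch count wins; restricting the search to shifts consistent with the current direction of motion is what the disabled correlator-table sections buy.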
Language and Platforms: C++ Desktop app, Embedded C, and Native C on Android.
Correlator algorithm for ultrasound swipe sensor: implementing trackball functionality
I optimized an existing image-stitching algorithm with a fast correlator, to implement precise fingerprint capture from an ultrasound swipe sensor (192 × 8 pixels).
The footprint of the entire swipe-sensor algorithm, including the image buffers, was less than 64 KB. The stitching correlator used a "packed binary image" (black-and-white pixels) to implement a highly efficient, bit-wise 2D correlation in real time.
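The offset search at the heart of stitching can be sketched as follows. This is illustrative only: the normalization, tie-breaking, and names are assumptions, not the shipped algorithm. Each new slice is compared against the tail of the image stitched so far, and the overlap with the fewest mismatched binary pixels tells you how far the finger moved.

```c
#include <stdint.h>
#include <limits.h>

#define WORDS_PER_ROW 3   /* 192 pixels / 64 bits per word */
#define SLICE_ROWS 8

static int popcount64(uint64_t x)
{
    int n = 0;
    while (x) { x &= x - 1; n++; }
    return n;
}

/* Mismatched pixels between the last `overlap` stitched rows and the
   first `overlap` rows of the new slice. */
static int overlap_score(uint64_t tail[][WORDS_PER_ROW],
                         uint64_t slice[][WORDS_PER_ROW], int overlap)
{
    int mismatches = 0;
    for (int r = 0; r < overlap; r++)
        for (int w = 0; w < WORDS_PER_ROW; w++)
            mismatches += popcount64(tail[SLICE_ROWS - overlap + r][w] ^ slice[r][w]);
    return mismatches;
}

/* Try every candidate overlap; the winner implies the finger moved
   (SLICE_ROWS - overlap) rows since the previous slice. */
int best_overlap(uint64_t tail[][WORDS_PER_ROW],
                 uint64_t slice[][WORDS_PER_ROW])
{
    int best = 1, best_score = INT_MAX;
    for (int overlap = 1; overlap <= SLICE_ROWS; overlap++) {
        int score = overlap_score(tail, slice, overlap);
        /* Normalize per pixel so long overlaps are not penalized
           for simply containing more pixels. */
        int norm = (score * 1000) / (overlap * WORDS_PER_ROW * 64);
        if (norm < best_score) { best_score = norm; best = overlap; }
    }
    return best;
}
```

Because the score is XOR-plus-popcount over packed words, the whole search fits comfortably in the 64 KB budget alongside the image buffers.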
The resulting image had to be free of artifacts and distortion, and it had to meet government specifications for fingerprint image quality.
Language and Platforms: Windows C, ported to embedded C using the Keil IDE.
I developed an algorithm to extract a pulse-rate signal from an ultrasound fingerprint sensor (192 × 8 pixels). The algorithm detected tiny variations in pixel grayscale levels as the user's fingerprint ridges "swelled" with each pulse of blood through the finger's vessels.
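A simplified sketch of the idea: average each frame's pixels down to one brightness sample, then estimate the beat rate from that time series. The threshold-crossing detector here is a stand-in for the actual, patented signal processing, and the names and frame rate are illustrative.

```c
#include <stddef.h>

/* Collapse one sensor frame to a single brightness sample; ridge swelling
   shows up as a tiny periodic ripple in this value. */
double frame_mean(const unsigned char *pixels, size_t n)
{
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) sum += pixels[i];
    return sum / (double)n;
}

/* Estimate pulse rate: count upward crossings of the series mean
   (one crossing per beat) over the capture window. */
double pulse_rate_bpm(const double *series, size_t n, double frame_rate_hz)
{
    double mean = 0.0;
    for (size_t i = 0; i < n; i++) mean += series[i];
    mean /= (double)n;

    int beats = 0;
    for (size_t i = 1; i < n; i++)
        if (series[i - 1] < mean && series[i] >= mean)
            beats++;

    double seconds = (double)n / frame_rate_hz;
    return 60.0 * (double)beats / seconds;
}
```

In practice the raw brightness series would need detrending and band-limiting before the crossing count is trustworthy, since the ripple is small compared to pressure and contact drift.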
Language and Platforms: C running on a Windows Desktop app, developed with Visual Studio.
This algorithm was patented: https://patents.google.com/patent/US8433110B2/en
I used Python with OpenCV to implement a prototype of the Daugman algorithm for locating the iris and pupil. This is a useful first step for an iris-recognition algorithm.
The user manually selects coordinates near the center of the pupil; the algorithm then discovers the true center by observing the brightness of concentric rings.