Andy Bottoni

Digital Transformation Specialist
Full-Stack Developer | Software Engineer

History

My growth as a software engineer has been interesting, to say the least.
Here's an attempt to summarize some of the technologies I've worked with along the way.

In the Beginning...

My programming journey began in the early 1980s, working across a diverse landscape of languages including Assembler, BASIC, Pascal, C, and C++. Those formative years were spent building, refining, and enhancing portfolio management systems at Teleware, Inc., affectionately known as BestWare. Our flagship product, Market Manager, was written in Pascal. After its debut on the Apple II, I took on the substantial task of migrating it to the PC, an experience that sparked my lasting focus on multi-platform development.


As my tenure at Teleware progressed, I spearheaded the creation of financial management solutions like M.Y.O.B., an accounting system initially designed for the Macintosh. As technology evolved, these systems were migrated to the then-new Microsoft Windows platform. The period was defined by moving Teleware's suite of products across multiple platforms, work that sharpened my ability to navigate diverse programming languages and architectures.


From Teleware, my trajectory led me to Avantos Performance Systems, where I contributed to the development of ManagePro. This work demanded careful maintenance of cross-platform compatibility: a single C++ source base serving Mac, Windows, and Unix (X Window System) environments. My code had to handle platform-specific intricacies, which proved especially critical in resolving database challenges. By overloading the data classes, I kept a unified backend accessible from every platform, and handling details such as byte, word, and double-word alignment at run time was instrumental in keeping functionality synchronized across these diverse systems.


That stretch of work cemented my commitment to multi-platform development, honing my ability to navigate cross-platform complexities and to build solutions that keep distinct operating systems interoperating cleanly.


Growing...

A significant portion of my career has been dedicated to leveraging VBA within Microsoft Access and harnessing T-SQL for Microsoft SQL Server.

My deep dive into Microsoft SQL Server began in the late 1990s during my tenure with Information Strategies Group, a smaller enterprise where I overhauled their data processes. I led the effort to automate the import of the various database formats extracted from telephone bills into SQL Server. This automation, a combination of Microsoft Access and SQL Server queries, replaced their manual data entry procedures and let them process far larger datasets, significantly expanding their business reach and operational efficiency. I started by writing custom import routines for telecom data delivered on CDs, then established direct connections to the carriers' back offices over ISDN lines so data could be pulled straight from the distribution centers. Finally, I initiated the conversion to Electronic Data Interchange (EDI), building a large data dictionary tree and the code needed to import the provider's feeds directly into SQL Server, an intricate effort that elevated our data operations.

Since approximately 2006, my role at Team-Systems, Inc. has revolved around catering to client needs, drawing on my prior API experience to build libraries that move data to and from external systems such as Salesforce, Sage Accounting, and e-check/credit card processors. Leaning on my SQL database design background, I have translated VBA code into T-SQL stored procedures for efficient data handling within Microsoft SQL Server.
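
As a minimal sketch of that kind of conversion (the procedure, table, and column names here are invented for illustration, not a client schema), logic that once ran as a row-by-row VBA recordset loop typically became a single set-based statement inside a stored procedure:

    -- Illustrative only: VBA-loop logic rewritten as a set-based T-SQL procedure.
    CREATE PROCEDURE dbo.usp_ApplyLateFees
        @AsOfDate DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- One set-based UPDATE replaces the client-side loop over a recordset.
        UPDATE inv
        SET    inv.LateFee = inv.BalanceDue * 0.015
        FROM   dbo.Invoice AS inv
        WHERE  inv.DueDate    < @AsOfDate
          AND  inv.BalanceDue > 0;
    END;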

In supporting Colonial Surety Company's IT and software engineering needs, a considerable share of my responsibilities involved maintaining a sizable MS Access program, exceeding 100 MB, tailored for back-office management. The work ranged from updating forms, reports, and external processes such as credit card/e-check processing and web server synchronization through API calls from VB, to database administration of the MS SQL Server and writing T-SQL stored procedures. One significant optimization replaced classic linked-table forms with detached CRUD forms that dynamically generate T-SQL pass-through queries based on the changes a user actually makes. This overhaul sharply reduced load times, particularly on large legacy tables that formerly took a long time to open.
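
To illustrate the pattern (with hypothetical table and column names rather than the actual back-office schema), a detached form issues a pair of narrow pass-through statements instead of binding the whole table:

    -- On open: fetch only the keyed row the user asked for, not the whole legacy table.
    SELECT PolicyID, InsuredName, PremiumAmount, RenewalDate
    FROM   dbo.Policy
    WHERE  PolicyID = 48213;

    -- On save: write back only the columns the user actually changed.
    UPDATE dbo.Policy
    SET    PremiumAmount = 1250.00,
           RenewalDate   = '2024-06-30'
    WHERE  PolicyID = 48213;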

In a recent effort, I built T-SQL stored procedure libraries that process credit card and e-check transactions, using JSON and XML for direct communication between SQL Server and the payment processors. The routines issue SOAP requests through the MSXML2.ServerXMLHTTP library; they were first developed and deployed in VBA, then ported to T-SQL so that recurring client payments could be handled automatically.
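
The T-SQL side of that pattern looks roughly like the sketch below, which drives MSXML2.ServerXMLHTTP through SQL Server's OLE Automation procedures (sp_OACreate and friends). It assumes the 'Ole Automation Procedures' server option is enabled, and the endpoint URL and SOAP envelope are placeholders rather than any real processor's interface.

    -- Rough sketch: posting a SOAP request from T-SQL via MSXML2.ServerXMLHTTP.
    -- Assumes 'Ole Automation Procedures' is enabled; URL and envelope are placeholders.
    DECLARE @http INT, @hr INT, @status INT;
    DECLARE @soapBody NVARCHAR(4000) =
        N'<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
            <soap:Body><!-- processor-specific payload goes here --></soap:Body>
          </soap:Envelope>';
    DECLARE @resp TABLE (ResponseText NVARCHAR(MAX));

    EXEC @hr = sp_OACreate 'MSXML2.ServerXMLHTTP.6.0', @http OUT;
    EXEC @hr = sp_OAMethod @http, 'open', NULL, 'POST', 'https://example-processor.invalid/gateway', 'false';
    EXEC @hr = sp_OAMethod @http, 'setRequestHeader', NULL, 'Content-Type', 'text/xml; charset=utf-8';
    EXEC @hr = sp_OAMethod @http, 'send', NULL, @soapBody;
    EXEC @hr = sp_OAGetProperty @http, 'status', @status OUT;

    -- Capture the (possibly long) response body through a table insert.
    INSERT INTO @resp (ResponseText)
    EXEC sp_OAGetProperty @http, 'responseText';

    EXEC sp_OADestroy @http;

    SELECT @status AS HttpStatus, ResponseText FROM @resp;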



Modernizing...

More recently, I spearheaded the development of several new product lines that extended my cross-platform approach into the mobile and web realms.


SmartID™ emerged as an anti-counterfeiting system offering dynamic authentication and embedded intelligence via SmartCodes™. The technology let each product validate its authenticity at the unit level in real time, displacing static methods like holograms and serialization in the fight against counterfeiting and piracy, and giving brands a dynamic means of product validation anytime, anywhere.


FirstWitness™, an integral part of our WitnessNetworks product line, debuted as a guardian for victims of domestic violence. Our close-knit association with the domestic violence community fueled our belief in its swift adoption by special interest groups. This product garnered substantial media attention, amplifying our brand equity and fostering support from international victim rights groups and human rights organizations. Scheduled to follow were LiveWitness™, a family protection network, and SilentWitness™, a human rights protection network, slated for release in 2014. These products were poised to redefine the landscape of safety and protection.


Each of these products was anchored by a web application written in PHP with a MySQL back-end, handling registration, administration, and event monitoring. Operating in a virtual server environment, our infrastructure could spin up additional server images dynamically as demand surged.


My architectural design comprised multiple load balancer servers steering requests toward a pool of PHP application servers. We started with two load balancers and three PHP application servers, which kept performance and load distribution healthy. The backend relied on a high-availability MySQL setup synchronizing data across server farms. Shared asset storage used common NAS drives for regional user uploads, while cloud-based storage handled global asset sharing.


Users could access these products on either Android or iOS. I wrote the native iOS applications in Objective-C using Apple's Xcode IDE, while Java in Android Studio powered the Android applications, giving users a consistent experience across both mobile ecosystems.



Today...

Presently, I'm engaged with Smartflow Data, supporting their clients' needs around Microsoft Access and SQL Server technologies.

At Smartflow Data, our core objective is transforming raw data into actionable knowledge that drives business efficiency. For over 25 years, our forte has been Enterprise Resource Planning (ERP) customizations and system integrations, and we specialize in maintaining and upgrading ERP systems for manufacturing companies. Our mission is to streamline business processes, improve workflows, and apply technology intelligently, so that every recommendation boosts overall business performance and increases the value of an organization's IT systems.

Simultaneously, I collaborate with Team-Systems on a data exchange mechanism bridging Microsoft Access with Samsara, an online trucking management system. The libraries I've built serve as the communication conduit to the Samsara API, keeping the in-house SQL Server and the Samsara backend in step. The JSON payloads are handled by a conversion layer in T-SQL on Microsoft SQL Server: I use OLE server objects in T-SQL to retrieve and synchronize the intricate JSON structures into distinct SQL Server tables, including nested JSON sub-arrays, which are shredded with common table expressions, CROSS APPLY, and similar constructs.
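
As a simplified sketch of that shredding step (using an invented staging table and field names rather than Samsara's actual schema, and assuming SQL Server 2016 or later), nested sub-arrays can be flattened with OPENJSON and CROSS APPLY:

    -- Simplified shredding of a nested JSON payload (illustrative names only).
    SELECT  v.vehicleId,
            v.vehicleName,
            g.latitude,
            g.longitude,
            g.gpsTime
    FROM    dbo.SamsaraRawPayload AS r
    CROSS APPLY OPENJSON(r.payload, '$.data')
            WITH (
                vehicleId   NVARCHAR(50)  '$.id',
                vehicleName NVARCHAR(100) '$.name',
                gps         NVARCHAR(MAX) '$.gps' AS JSON   -- keep the nested sub-array as JSON
            ) AS v
    CROSS APPLY OPENJSON(v.gps)
            WITH (
                latitude  DECIMAL(9,6) '$.latitude',
                longitude DECIMAL(9,6) '$.longitude',
                gpsTime   DATETIME2    '$.time'
            ) AS g;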

These T-SQL stored procedures plug into the Microsoft Access environment through pass-through SQL queries, keeping the data exchange architecture cohesive and unified.
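
For example, a pass-through query in Access contains nothing but the server-side call, which Access ships to SQL Server verbatim (the procedure name and parameter below are hypothetical):

    -- Text of a hypothetical Access pass-through query; sent to SQL Server as-is.
    EXEC dbo.usp_ImportSamsaraVehicles @SinceUtc = '2024-05-01T00:00:00Z';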


And Beyond...

During my free time, I took on a self-driven learning exercise: building a network of Raspberry Pi-based servers running Linux. These servers operated independently, while Rancher, a container management platform, ran on a dedicated VMware host. The setup let me explore the intricacies of orchestrating separate clusters and managing containers effectively.


My exploration into emerging technologies has been a key focus during these pursuits. I've dived into the implementation of Blockchain Nodes, focusing extensively on their security measures and encryption protocols, aiming to comprehend their architecture and functionalities in-depth. Concurrently, I've been deeply engaged in experimental exercises centered around training solutions for generative AI models, pushing the boundaries of AI's creative capabilities and learning mechanisms.


For a visual representation of these endeavors, here's a snapshot of the cluster on LinkedIn, showcasing the finished set of independent servers.


At present, my focus is on migrating the technologies I've developed—spanning PHP, MySQL, iOS Objective-C, and Android Java—towards more current tech paradigms and cloud-based environments. This migration involves deploying Kubernetes clusters engineered for seamless scalability, while also fine-tuning mobile technologies to ensure uninterrupted data synchronization, even in disconnected environments.


In the past, my mobile solutions relied on basic HTTP GET and POST requests to synchronize data and binary assets with PHP web servers. The updated versions use modern data synchronization libraries, which simplify the code and make data sharing more resilient, including when connectivity drops.