Detailed Examples of My Work
Posted on August 03, 2022

I was recently asked to provide details related to times when I —
- Collaborated with development teams to design, develop, maintain, and deploy SQL database systems, Azure SQL/Azure Data Factory, and Power BI.
- Demonstrated exceptional experience developing SQL code, testing for quality assurance, administering RDBMSs, and monitoring databases.
- Data Mapping
- SSIS
- SSRS
The largest portion of my work has been done independently; however, this is an example of one project where I collaborated with another developer. It illustrates the design, development, maintenance, and deployment of a database within SQL Server. Our goal was to automate the intake of non-standard healthcare provider data from many varied sources, delivered to us via email as encrypted Excel workbooks and as PDF scans of provider contracts on SharePoint.
We initially planned to take in the data by developing ETL pipelines using SSIS; however, complications made it clear that SSIS was not well suited to the role: it could not access encrypted Excel workbooks, and successive data releases from the same sources were not well enough defined. So we designed a data lake in SQL Server to store the semi-structured data as JSON within temporal (i.e., system-versioned) tables. Since the vast majority of the provider data arrived as Excel workbooks, we designed an additional workbook to facilitate the manual transcription of scanned provider contracts and to supplement that data with records from the online NPPES provider database.
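A minimal sketch of what such a data-lake table can look like (all names here are hypothetical, not the actual schema): raw workbook data stored as JSON in a system-versioned table, so SQL Server retains every prior version of each row automatically.

```sql
-- Illustrative only: one data-lake table holding raw releases as JSON.
CREATE TABLE dbo.ProviderDataRelease
(
    ReleaseId  int IDENTITY(1,1) NOT NULL
        CONSTRAINT PK_ProviderDataRelease PRIMARY KEY,
    SourceCode varchar(20)   NOT NULL,   -- identifies the data source
    Payload    nvarchar(max) NOT NULL    -- the workbook data, as JSON
        CONSTRAINT CK_Payload_IsJson CHECK (ISJSON(Payload) = 1),
    ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo    datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON
      (HISTORY_TABLE = dbo.ProviderDataReleaseHistory));
```

The `ISJSON` check guards the lake against malformed payloads, while system versioning provides the audit trail without any custom triggers.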
Most data releases from each data source were complete replacements of previous iterations, but some were supplemental. We included metadata about each data release to identify its data source, its dates of production and delivery, and whether it was complete or supplemental. We developed VBA code and SQL stored procedures to validate workbook data, transform it to JSON, and post it to the data lake. We also included the ability to inspect posted data releases and to roll them back in case of problems.
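As an illustration of the metadata and rollback ideas (all names are my assumptions, and I assume a data-lake table `dbo.ProviderDataRelease` keyed by `ReleaseId`), the sketch below shows a release-metadata table and a rollback procedure. Because the data lake is temporal, rows deleted by a rollback remain inspectable in the history table.

```sql
-- Hypothetical release-metadata table.
CREATE TABLE dbo.ReleaseMeta
(
    ReleaseId      int         NOT NULL
        CONSTRAINT PK_ReleaseMeta PRIMARY KEY,
    SourceCode     varchar(20) NOT NULL,
    ProducedOn     date        NOT NULL,
    DeliveredOn    date        NOT NULL,
    IsSupplemental bit         NOT NULL  -- 0 = complete replacement, 1 = supplement
);
GO
-- Hypothetical rollback: removing the release's rows is enough, because
-- system versioning preserves the deleted rows for later inspection.
CREATE PROCEDURE dbo.RollbackRelease
    @ReleaseId int
AS
BEGIN
    SET NOCOUNT ON;
    DELETE FROM dbo.ProviderDataRelease
    WHERE  ReleaseId = @ReleaseId;
END;
```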
We designed tables to enable mapping (using SQL regular expressions) of the non-standard data to a standard format, and stored procedures for materializing views of the data for auditing, analysis, reporting, and production of an interactive ADA-compliant online provider directory which I developed in PHP, Javascript, and CSS.
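T-SQL's `LIKE` operator supports a limited regular-expression-like syntax (`%`, `_`, `[a-z]`, `[^a-z]`), which is one way such pattern-based mapping can be stored in tables; the sketch below (all names are my assumptions) pairs source-specific header patterns with standard column names.

```sql
-- Hypothetical mapping table: one row per (source, pattern) pair.
CREATE TABLE dbo.ColumnMap
(
    SourceCode     varchar(20)   NOT NULL,
    HeaderPattern  nvarchar(200) NOT NULL,  -- LIKE pattern matched against raw headers
    StandardColumn sysname       NOT NULL,  -- target column in the standard format
    CONSTRAINT PK_ColumnMap PRIMARY KEY (SourceCode, HeaderPattern)
);
GO
-- Example lookup: which standard column does a raw workbook header map to?
DECLARE @SourceCode varchar(20)   = 'ACME',          -- illustrative values
        @RawHeader  nvarchar(200) = N'Prov. NPI #';

SELECT m.StandardColumn
FROM   dbo.ColumnMap AS m
WHERE  m.SourceCode = @SourceCode
  AND  @RawHeader LIKE m.HeaderPattern;
```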
The dynamic mapping of data required the use of dynamic SQL which I designed carefully using the best practices defined by Erland Sommarskog. I developed a reliable SQL framework that supported the nesting of dynamic SQL by parameterizing code definitions and linking the nested code in the form of dynamically named CTEs for final execution.
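A minimal sketch of the core idea, in the parameterized style Sommarskog recommends (table and parameter names are illustrative, not the original framework): nested code units are linked into the statement as named CTEs, and values reach the dynamic SQL through `sys.sp_executesql` parameters rather than string concatenation.

```sql
-- Illustrative only: parameterized dynamic SQL with a nested, named CTE.
DECLARE @sql  nvarchar(max),
        @code varchar(20) = 'ACME';   -- hypothetical source code value

SET @sql = N'
WITH src AS (                 -- a nested code unit, linked in as a named CTE
    SELECT ReleaseId, Payload
    FROM   dbo.ProviderDataRelease   -- hypothetical data-lake table
    WHERE  SourceCode = @SourceCode
)
SELECT ReleaseId
FROM   src;';

EXEC sys.sp_executesql
     @sql,
     N'@SourceCode varchar(20)',     -- parameter list, typed explicitly
     @SourceCode = @code;            -- value passed safely, never concatenated
```

Passing values as parameters avoids SQL injection and lets the server reuse the cached plan for the dynamic statement.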
Later, I developed additional features for automating the auditing of the adequacy of our provider network. I developed a process to geocode out-of-state provider location data using the Google Places API. I also developed an ETL pipeline to deliver the data to the Quest Analytics Suite for geocoding of in-state provider locations, and another ETL pipeline to merge geocoded location data back with the main body of data in SQL Server. Quest was then able to produce reports suitable for delivery to auditors.
The final solution was so highly automated that all data downstream of the data lake could be deleted entirely and regenerated within hours. Indeed, the full-time employee who had been manually processing provider network data left the company and was never replaced; I simply operated the solution in addition to my other duties.
I also developed a T-SQL framework that supported nesting of stored procedures, including avoidance of redundant executions of procedures within the same process tree and asynchronous execution of procedures in order to massively improve performance. The result was a highly modular code stack for materializing views that was entirely automated and reliable. The entire code stack was executed nightly using SQL Agent and monitored via email.
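The framework itself is not reproduced here, but the redundant-execution guard can be illustrated with a pattern like the following (all names are hypothetical): each process tree carries a run identifier, and a procedure records itself in a run log and returns early if it has already executed for that run.

```sql
-- Hypothetical run log, one row per procedure execution per process tree.
CREATE TABLE dbo.RunLog
(
    RunId    uniqueidentifier NOT NULL,  -- identifies one process tree
    ProcName sysname          NOT NULL,
    RanAt    datetime2        NOT NULL DEFAULT SYSUTCDATETIME(),
    CONSTRAINT PK_RunLog PRIMARY KEY (RunId, ProcName)
);
GO
CREATE PROCEDURE dbo.usp_RefreshSomeView   -- illustrative procedure
    @RunId uniqueidentifier
AS
BEGIN
    SET NOCOUNT ON;

    -- Skip the body if this procedure already ran in the same process tree.
    IF EXISTS (SELECT 1 FROM dbo.RunLog
               WHERE RunId = @RunId
                 AND ProcName = OBJECT_NAME(@@PROCID))
        RETURN;

    INSERT dbo.RunLog (RunId, ProcName)
    VALUES (@RunId, OBJECT_NAME(@@PROCID));

    -- ...materialize the view here, calling nested procedures
    --    with the same @RunId so they apply the same guard...
END;
```

A caller generates one `NEWID()` at the top of the process tree and passes it down, so each procedure in the stack runs at most once per nightly execution.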
I worked with my supervisor, the CFO (who later became head of special projects), to develop analytics that had a significant impact on the company's bottom-line profitability: they held down payroll during a period of high growth, identified previously missed opportunities for billings, and eventually contributed to our acquisition. In that role, I established an Azure SQL data warehouse to support data analysis by off-premises care navigators. The data warehouse is used to refresh Excel workbooks within seconds of being opened, and it serves as the data source for dashboards I developed in Power BI to analyze return-to-hospital rates, improvements in polypharmacy rates, and many other KPIs at the renderer, provider, facility, and hub levels.
I remain highly productive by automating everything I do. I regularly juggle priorities of several projects simultaneously. I do that by monitoring automated processes via email, and also by automating data profiling. I have developed many tools that I use to streamline the process of auditing data in order to provide support to end users when questions arise. Nearly always, the answers to questions amount to mistakes or omissions by operators when entering or transcribing data. It is extremely rare for me to discover bugs in developed procedures. Nevertheless, bugs and oversights have been isolated occasionally, and when that occurs I am the first to accept responsibility and I work very fast to implement corrections.
Although database administration has not been a significant part of my role, I have managed SQL Server schemas, logins, roles, and rights, as well as Azure networking firewall settings, in order to comply with regulations and requirements regarding data access by personnel and the public.
I have never used Data Factory, but I have used SSIS to develop multiple ETL pipelines.
I have never used SSRS, but I have used Crystal Reports, QuarkXPress, PageMaker, Microsoft Access, various accounting software report writers, and hand-written code to develop paginated reports with multiple levels of totals. My reports have been used by Diageo Four Roses Distillery to make significant productivity improvements in the handling of barreled whisky, by Semonin Realtors (the largest realtor in Louisville) to publish their Homes Magazine for twelve years, by a medical billing company to produce KPIs used to fend off existential threats from competitors, by an oil refinery to produce dynamic multi-scenario budgets, by a dairy processing plant to manage production schedules, by a manufacturer to manage sales territories, by expense aggregation brokers to automate payments to vendors and billing to customers, by service companies to manage inventory control and automate production of purchase orders, and by many other businesses for many other purposes.
Operations Department Work Flow
Posted on January 17, 2006

Design of a work flow for the maintenance department, with internal controls.
- Yellow regions represent staff positions.
- Orange regions represent tasks.
- Red arrows represent transfers of documents between personnel.