Senior Software Engineer | Resume v5
Earned July 2, 2025
In progress.
Software engineer with expertise in architecting and implementing REST APIs in Python, a record of success building SaaS, and DevOps/cloud engineering (with Terraform) supporting that SaaS for hundreds of users in a startup environment. Proven track record of enhancing developer productivity and ensuring bulletproof deployments through automation (Bitbucket Pipelines and GitHub Actions) and containerization and reliability work (Kubernetes). Background in applied mathematics and engineering, with a long-term obsession with software and technology.
April 2025 - Now
To expedite operations for the United States Air Force, lead the modernization and containerization of various legacy applications at the Distributed Mission Operations Center (DMOC) at Kirtland Air Force Base in Albuquerque, New Mexico, as a subcontractor for Northrop Grumman and Serco.
django-ninja, TypeScript, and Docker (a brief API sketch follows these items).
Kubernetes deployments of custom software services and containerization of these services.
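For a flavor of the services involved, here is a minimal django-ninja endpoint; the `MissionOut` schema and `/missions/{mission_id}` route are hypothetical placeholders rather than actual DMOC code.

```python
# Minimal django-ninja sketch; the schema and route are illustrative placeholders,
# and the API would be mounted in a Django project's urls.py.
from ninja import NinjaAPI, Schema

api = NinjaAPI()

class MissionOut(Schema):
    id: int
    name: str

@api.get("/missions/{mission_id}", response=MissionOut)
def get_mission(request, mission_id: int):
    # A real endpoint would query a Django model here.
    return {"id": mission_id, "name": "example"}
```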
January 2024 - Now
Established a technical blog covering topics in data science, data structures and algorithms, and development operations to share insights, expertise, and progress. Built acederberg.io using Quarto and Python to showcase 30+ blog posts, 4 projects, a resume, and a professional portfolio.
Pulumi on Linode and GitHub Actions.
Pandoc filters in Python with Pydantic, supplemented by additional JavaScript and extended SCSS built on Bootstrap (a filter sketch follows this list).
Typer CLI to monitor Pandoc filters and HTTP server logs, utilizing FastAPI, WebSockets, UNIX domain sockets, and MongoDB.
GitHub open-source community by raising and resolving issues, particularly in Quarto and its associated projects.
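To give a sense of what these filters look like, the sketch below writes a Pandoc filter in the panflute style with a Pydantic model validating configuration from document metadata; the `my_filter` metadata key and the header-class behavior are hypothetical, not lifted from the actual acederberg.io filters.

```python
# Hypothetical Pandoc filter sketch: add a Bootstrap class to level-two headers,
# with filter configuration validated by a Pydantic model.
import panflute as pf
from pydantic import BaseModel

class FilterConfig(BaseModel):
    header_class: str = "display-6"  # placeholder default class

def prepare(doc):
    # Read and validate metadata under a hypothetical `my_filter` key.
    raw = doc.get_metadata("my_filter", default={})
    doc.filter_config = FilterConfig(**raw)

def action(elem, doc):
    if isinstance(elem, pf.Header) and elem.level == 2:
        elem.classes.append(doc.filter_config.header_class)
    return elem

def main(doc=None):
    return pf.run_filter(action, prepare=prepare, doc=doc)

if __name__ == "__main__":
    main()
```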
January 2022 - December 2023
Led the design and implementation of a building management and analytics SaaS that gives building management teams and owners clear insight into their utility usage and expenditure and helps them optimize it. Designed, implemented, and tested the Cufflink data API (in Python), SaaS continuous integration and delivery across multiple developers and projects, and infrastructure-as-code projects to reliably power the user dashboard and deliver new releases.
OAuth to safeguard API endpoints, MySQL for secure queries, Traefik for SSL termination, and robust PyTest suites to verify the effectiveness of these measures (a test sketch follows this list).
Docker builds in Bitbucket Pipelines.
99%, platform stability and reproducibility using Terraform infrastructure as code on Azure, and guaranteed rapid software delivery by designing self-testing deployments using Helm.
Next.js.
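The sketch below suggests how such endpoint protection can be exercised with PyTest; it uses FastAPI's OAuth2PasswordBearer and TestClient as a generic stand-in, not the actual Cufflink API code, and the token check is deliberately simplistic.

```python
# Hypothetical sketch: verify that a protected endpoint rejects unauthenticated requests.
# The app, token check, and route are placeholders, not the Cufflink API.
import pytest
from fastapi import Depends, FastAPI, HTTPException, status
from fastapi.security import OAuth2PasswordBearer
from fastapi.testclient import TestClient

app = FastAPI()
oauth2_scheme = OAuth2PasswordBearer(tokenUrl="token")

def current_user(token: str = Depends(oauth2_scheme)) -> str:
    # A real implementation would verify the token's signature and claims.
    if token != "valid-token":
        raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED)
    return "user"

@app.get("/readings")
def readings(user: str = Depends(current_user)):
    return {"user": user, "readings": []}

client = TestClient(app)

@pytest.mark.parametrize(
    "headers,expected",
    [({}, 401), ({"Authorization": "Bearer valid-token"}, 200)],
)
def test_readings_requires_auth(headers, expected):
    assert client.get("/readings", headers=headers).status_code == expected
```

Parametrizing over headers keeps the authenticated and unauthenticated cases in a single test.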
August 2019 - December 2020
Researched air plasma generation for astronautics with Craig Davidson of Dark Sea Industries.
I2C array controlled by a Raspberry Pi, with code developed in Python and C.
NumPy and Matplotlib to document findings.
August 2015 - December 2019
Python and MathWorks' MATLAB.
Python by building and programming I2C sensor arrays using Raspberry Pi and Arduino (a minimal sensor-read sketch follows).
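As an illustration of that kind of sensor work, here is a minimal read over I2C with the smbus2 library; the bus number, device address, and register are placeholders, not a specific sensor I used.

```python
# Hypothetical sketch of reading one register from an I2C sensor with smbus2;
# the bus number, device address, and register are placeholders.
from smbus2 import SMBus

I2C_BUS = 1           # default I2C bus on most Raspberry Pi models
DEVICE_ADDR = 0x48    # placeholder sensor address
DATA_REGISTER = 0x00  # placeholder register

def read_raw_value() -> int:
    with SMBus(I2C_BUS) as bus:
        # Read a 16-bit word from the register; byte order depends on the device.
        return bus.read_word_data(DEVICE_ADDR, DATA_REGISTER)

if __name__ == "__main__":
    print(read_raw_value())
```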
Click on any of the content below to learn more.
Whatever you prefer to call the editing and running of text on screens ('programming', 'coding', 'scripting', etc.), understanding and using the computer at a deep level is an essential skill for any serious engineer or academic today. While I have played with computers my entire life, more than most would be inclined to, I have spent the last 7+ years particularly obsessed with computer science and mathematics.
Here are some other tools I use frequently.
Most of these will be covered in greater detail in subsequent sections.
In the modern world, data is the star of the show, and thus the storage, distribution, and programmatic acquisition of data are paramount.
I have plenty of experience developing the tools for this - for instance, REST APIs for the distribution of data and ETL pipelines for the aggregation of data - using languages and technologies like Python, Node.js, MySQL, MongoDB, and Redis.
Additionally, I am exceptionally familiar with means of consuming data such as httpx, fetch, and curl, and with protocols like HTTP and WebSockets.
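As a trivial illustration of programmatic consumption, the snippet below fetches JSON with httpx; the URL is just a public GitHub API endpoint chosen for the example.

```python
# A generic example of consuming JSON over HTTP with httpx; the URL is a public
# GitHub API endpoint chosen purely for illustration.
import httpx

def fetch_json(url: str) -> dict:
    # Fail loudly on non-2xx responses instead of silently returning bad data.
    response = httpx.get(url, timeout=10.0)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(fetch_json("https://api.github.com/repos/quarto-dev/quarto-cli"))
```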
While APIs and the rest of the back end aggregate, transform, and serve data, that data is of little use in ascertaining the 'big picture' without visualization and interactivity.
For instance, you are probably viewing this resume on my blog - would you read my resume if it were in YAML format? For most of us, the answer is absolutely not.
Development operations, automation, and cloud engineering make it possible to turn code into fully functioning software systems. Without this piece of the puzzle, deployments are inconsistent and hastily cobbled together, wasting innumerable hours of developer time, inevitably introducing bugs, and reducing uptime.