Systems Programming Projects
A systems-focused C++ portfolio item built around two lower-level projects: a custom memory manager and a WAD-backed file system mounted through FUSE with POSIX-style behavior.
Implementation highlights
Custom memory manager
The memory manager project centered on C++ interface design, allocation behavior, the static-library build workflow, and validating correctness with tests and memory tooling such as Valgrind.
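To make the interface-design side concrete, here is a minimal sketch of the kind of fixed-buffer allocator such a project revolves around. This is a hypothetical illustration, not the project's actual API: a real version would track freed blocks and coalesce them, where this one only bumps a pointer.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical fixed-buffer allocator sketch. It hands out 8-byte-aligned
// chunks from one preallocated pool; a real manager also supports free().
class MemoryManager {
public:
    explicit MemoryManager(std::size_t bytes)
        : buffer_(new std::uint8_t[bytes]), capacity_(bytes), used_(0) {}
    ~MemoryManager() { delete[] buffer_; }

    // Bump-pointer allocation: return the next aligned chunk,
    // or nullptr when the pool is exhausted.
    void* allocate(std::size_t n) {
        std::size_t aligned = (used_ + 7) & ~std::size_t{7};  // round up to 8
        if (aligned + n > capacity_) return nullptr;
        void* p = buffer_ + aligned;
        used_ = aligned + n;
        return p;
    }

    std::size_t bytes_used() const { return used_; }

private:
    std::uint8_t* buffer_;
    std::size_t capacity_;
    std::size_t used_;
};
```

Even this toy version shows why Valgrind matters: the destructor must release the pool exactly once, and every returned pointer must stay inside the buffer.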
WAD file system with FUSE
The file-system project treated a WAD archive like a mounted directory structure. FUSE translated normal file operations into custom logic backed by descriptor entries and lump data.
Linux tooling and debugging
Both projects depended on careful command-line workflow, compilation and linking, low-level debugging, and a more precise mental model of how the data was laid out in memory or on disk.
Overview
This combined project entry represents some of the deepest technical work in my portfolio because it pushed me closer to the machine than most of my web projects do.
The two main pieces were a custom memory manager library and a WAD-backed file system implemented in C++ with FUSE and POSIX I/O.
Both required careful reasoning about internal structure, correctness, and debugging. They are good examples of the kind of lower-level problem solving I want to keep doing alongside product work.
Problem / Goal
The memory manager required reasoning about allocation behavior, interface design, compilation and linking, and memory hygiene under test.
The WAD file-system project required mapping familiar file operations onto a very different underlying representation made of headers, descriptor entries, marker conventions, and lump data.
In both cases, the main challenge was not just writing code that compiles. It was building the right mental model for how the system behaves underneath.
Approach / Architecture
For the custom memory manager, I built the implementation as a C++ library and validated its behavior through compilation and linking checks, dedicated tests, and Valgrind runs.
For the WAD file system, I used POSIX I/O and FUSE so a WAD archive could be mounted and interacted with like a normal file hierarchy.
The WAD structure itself mattered: header information, descriptor count and offsets, lump data, directory markers, and map markers all shaped how the mounted hierarchy had to be derived.
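As a sketch of that structure, the classic WAD layout is a small header pointing at a descriptor table, with lump bytes in between. The field names below are illustrative rather than the project's, and the parse assumes a little-endian buffer:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Classic WAD layout sketch: 12-byte header, lump data, descriptor table.
struct WadHeader {
    char magic[4];                  // file type tag, e.g. "PWAD"
    std::uint32_t numDescriptors;   // entries in the descriptor table
    std::uint32_t descriptorOffset; // byte offset of the table
};

struct WadDescriptor {
    std::uint32_t lumpOffset;  // where this lump's bytes live
    std::uint32_t lumpLength;  // lump size; 0 for marker entries
    char name[8];              // not necessarily NUL-terminated
};

// Read the header, then copy out the descriptor table it points to.
std::vector<WadDescriptor> parseDescriptors(const std::uint8_t* data) {
    WadHeader h;
    std::memcpy(&h, data, sizeof h);  // assumes little-endian, packed layout
    std::vector<WadDescriptor> out(h.numDescriptors);
    std::memcpy(out.data(), data + h.descriptorOffset,
                h.numDescriptors * sizeof(WadDescriptor));
    return out;
}
```

Everything the mounted hierarchy shows has to be derived from that table, which is why descriptor count and offsets shaped so much of the design.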
Across both projects, Linux and WSL tooling, repeatable testing, and patient debugging were a major part of the real work.
Engineering details
The memory manager involved library compilation and linking rather than just a single executable, which made interfaces and build behavior part of the implementation challenge.
Valgrind and similar tools mattered because memory correctness determines whether a low-level project is actually working, not just whether it appears to run.
The WAD file system used POSIX I/O, not just high-level streams, because the archive format required precise control over reads, writes, offsets, and descriptor access.
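A small sketch of what that offset-level control looks like, with `readAt` as a hypothetical helper name: `pread` reads at an explicit byte offset without moving the file cursor, which is exactly what jumping between the descriptor table and individual lumps requires.

```cpp
#include <fcntl.h>
#include <unistd.h>
#include <cstddef>
#include <string>

// Hypothetical helper: read exactly `len` bytes starting at `offset`.
std::string readAt(const char* path, off_t offset, std::size_t len) {
    int fd = open(path, O_RDONLY);
    if (fd < 0) return {};
    std::string out(len, '\0');
    // pread takes the offset explicitly, so interleaved descriptor and
    // lump reads never clobber a shared file position.
    ssize_t n = pread(fd, out.data(), len, offset);
    close(fd);
    if (n < 0) return {};
    out.resize(static_cast<std::size_t>(n));
    return out;
}
```

High-level streams hide the cursor; with `pread`/`pwrite` every access names its offset, which makes format-aware code much easier to reason about.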
FUSE acted as the bridge between normal file-system requests and the custom archive-backed operations implemented in the project library.
The file hierarchy in the WAD project is derived from descriptor conventions and markers, which made translating between the stored structure and user-facing paths especially interesting.
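One common WAD convention uses paired `_START`/`_END` marker lumps to delimit directories. The sketch below shows only that case (the project also had to handle map markers) and uses an illustrative function name, not the project's:

```cpp
#include <string>
#include <vector>

// Hypothetical sketch: turn a flat list of lump names into user-facing
// paths by treating "_START"/"_END" lumps as directory open/close markers.
std::vector<std::string> buildPaths(const std::vector<std::string>& names) {
    std::vector<std::string> stack;  // currently open directories
    std::vector<std::string> paths;
    auto prefix = [&stack] {
        std::string p = "/";
        for (const auto& s : stack) p += s + "/";
        return p;
    };
    for (const auto& n : names) {
        auto pos = n.rfind("_START");
        if (pos != std::string::npos && pos + 6 == n.size()) {
            stack.push_back(n.substr(0, pos));  // open a directory
            paths.push_back(prefix());
            continue;
        }
        pos = n.rfind("_END");
        if (pos != std::string::npos && pos + 4 == n.size()) {
            if (!stack.empty()) stack.pop_back();  // close it
            continue;
        }
        paths.push_back(prefix() + n);  // ordinary file lump
    }
    return paths;
}
```

The interesting part is that the archive never stores paths at all: the hierarchy exists only as an ordering convention over a flat table, and every lookup has to reconstruct it.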
Challenges
Systems work is unforgiving. Small mistakes in allocation logic, offset calculations, or persistence handling show up quickly and often far from the original bug.
The WAD project required staying disciplined about the archive format so that directory traversal, file creation, reads, writes, and persistence behaved consistently.
Debugging at this level took patience because the failure mode was often indirect: one bad assumption in the data model could break later operations unexpectedly.
The projects also reinforced how much build systems, toolchains, and validation tooling matter when the code is close to the machine.
What I learned
These projects made me much more comfortable reasoning about lower-level software behavior instead of depending entirely on high-level abstractions.
They also improved my debugging habits, especially around memory correctness, persistence, and format-aware reasoning.
Most importantly, they showed me that I genuinely enjoy the kind of technical depth where structure, correctness, and tooling all matter at once.