The core question: how did HP's scientific calculators actually work at the gate level? That rabbit hole led to building one from scratch.<p>The architectural decision everything else follows from: a decimal calculator should store numbers as BCD — one decimal digit per 4-bit nibble. A standard byte-oriented CPU (Z80, 6502) fights that layout constantly. So I designed a small custom CPU in Verilog where 4 bits is the natural data width and memory is nibble-addressable.<p>What the project covers:<p>- Custom CPU: Harvard architecture, 12-bit ISA, 8-state execution
FSM, hardware stack guard with a FAULT state for microcode debugging<p>- CORDIC for trig functions, verified to 14 significant digits<p>- Two-pass assembler in Python (~700 lines)<p>- Verilator + Qt framework: same Verilog source runs in simulation,
as a desktop GUI debugger, as WebAssembly, and on real hardware<p>- Scripting language on top of the microcode for adding functions
without touching hardware<p>- Custom PCB (EasyEDA/JLCPCB), battery, charging circuit<p>Write-up: <a href="https://baltazarstudios.com" rel="nofollow">https://baltazarstudios.com</a><p>Hackaday: <a href="https://hackaday.com/2026/05/13/build-the-cpu-then-build-the-calculator/" rel="nofollow">https://hackaday.com/2026/05/13/build-the-cpu-then-build-the...</a>
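To make the BCD-per-nibble argument concrete, here is a minimal Python sketch (illustrative only — these function names are mine, not the project's) of why one decimal digit per 4-bit nibble is a natural fit: each digit lives in its own addressable cell, and addition becomes a digit-serial loop with a decimal carry, exactly the kind of operation a 4-bit ALU handles in one pass per nibble.

```python
def to_bcd(n, width):
    """Pack a non-negative integer into a list of BCD nibbles, least
    significant digit first — one digit per 4-bit memory cell."""
    digits = []
    for _ in range(width):
        digits.append(n % 10)
        n //= 10
    return digits

def bcd_add(a, b):
    """Digit-serial BCD addition the way a 4-bit ALU would do it:
    add corresponding nibbles plus carry; if the sum exceeds 9,
    subtract 10 (the hardware equivalent adds 6 mod 16) and
    propagate a decimal carry to the next nibble."""
    out, carry = [], 0
    for da, db in zip(a, b):
        s = da + db + carry
        carry, s = (1, s - 10) if s > 9 else (0, s)
        out.append(s)
    return out, carry

a = to_bcd(4785, 4)           # [5, 8, 7, 4]
b = to_bcd(2359, 4)           # [9, 5, 3, 2]
total, carry = bcd_add(a, b)  # 4785 + 2359 = 7144 -> [4, 4, 1, 7]
```

A byte-oriented CPU has to mask, shift, and decimal-adjust to get the same effect; with a nibble-addressable memory and 4-bit datapath, each loop iteration maps directly onto one memory read, one ALU operation, and one write.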