Welcome to Tech Updates: #4. You can read the updates below, or watch the video version on YouTube. Embedded here for your enjoyment!
Anyways, on to the updates. Links to relevant YouTube videos if you want to dive deeper. Enjoy!
This time you get a double update! I was too busy to record last weekend, so you get the updates for both the week of 10/12 and the week of 10/19!!
Week of 10/12/25
Research and Learning:
Working on a list of solutions to install in Docker containers: Immich, Uptime Kuma, OpenSpeedTest
Future Research to do:
Move n8n and cloudflared Docker containers to filevault
Move Ollama from a local service on drake-arch to a Docker container on file-vault
Research maretta Docker container
Research Postiz Docker container: https://github.com/gitroomhq/postiz-app
Research Home Assistant in Docker - decided not to do it
Research Paperless-ngx
Commands to remember:
Improvements:
Installed OpenSpeedTest in a Docker container
Installed Uptime Kuma and configured monitoring for all 12 Docker containers across 3 machines
Installed Nginx Proxy Manager and set up a reverse proxy, internal DNS names, and SSL certs for all the Docker container apps
Moved TrueNAS off of ports 80 and 443 to 8080 and 8443 so NPM can have 443 for the reverse proxy and SSL
Installed the Immich photo server in a Docker container, then spent the next 4 days uploading pictures. (A rough sketch of these container setups is below, after this list.)
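For reference, here's roughly how those containers get stood up. Treat this as a sketch rather than my exact commands - the image names are the ones I believe are current on Docker Hub, and the host ports and volume paths are placeholders you'd adapt to your own setup.

# Uptime Kuma (monitoring UI on port 3001, data persisted in a named volume)
docker run -d --name uptime-kuma --restart unless-stopped -p 3001:3001 -v uptime-kuma:/app/data louislam/uptime-kuma:1

# OpenSpeedTest (HTTP on port 3000 inside the container; host port is a placeholder)
docker run -d --name openspeedtest --restart unless-stopped -p 3000:3000 openspeedtest/latest

# Nginx Proxy Manager (80/443 for proxied traffic, 81 for the admin UI)
docker run -d --name npm --restart unless-stopped -p 80:80 -p 443:443 -p 81:81 -v ./npm/data:/data -v ./npm/letsencrypt:/etc/letsencrypt jc21/nginx-proxy-manager:latest

With TrueNAS moved off to 8080/8443, NPM can own 80 and 443 and hand out the internal DNS names and SSL certs for everything behind it.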
Mistakes:
Trying to upload over 50k files at once (see the batching sketch below)
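Next time I'd split a bulk import like that into smaller batches. A rough, hypothetical shell sketch - the 1,000-file batch size is arbitrary, the photo path is a placeholder, and the actual upload command is left as a stand-in for whatever tool you're using:

# build a file list and split it into chunks of ~1,000 entries
find /mnt/photos -type f > all_photos.txt
split -l 1000 all_photos.txt batch_

# work through one batch at a time instead of throwing 50k files at the server at once
for b in batch_*; do
  echo "Processing $(wc -l < "$b") files from $b"
  # xargs -a "$b" -d '\n' your-upload-command   # placeholder, not Immich's real CLI
done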
Week of 10/19/25
Research and Learning:
Researching adding a LiteLLM proxy
Still researching PAI
Still researching Claude Code
Future Research to do:
Running the Docker MCP gateway headless or separate from Docker Desktop
Commands to remember:
Improvements:
Added the OpenAI API to my local Open WebUI. I can now switch between local AIs and "ChatGPT" in MY interface.
Set up macOS-style pbcopy and pbpaste with wl-clipboard and aliases for wl-copy and wl-paste (the aliases are below, after this list).
Got the Claude Desktop AUR package installed and working. That was a pain. I've never had library dependencies get so stuck before.
Set up LiteLLM in a Docker container plus a Postgres container for LiteLLM (rough sketch after this list).
Configured APIs for Grok, Claude, ChatGPT, and Gemini in LiteLLM.
Configured a virtual key in LiteLLM and selected 22 models from those 4 providers to show up in my Open WebUI.
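The pbcopy/pbpaste trick is just two shell aliases on top of wl-clipboard (this assumes a Wayland session with the wl-clipboard package installed):

# in ~/.bashrc or ~/.zshrc
alias pbcopy='wl-copy'
alias pbpaste='wl-paste'

# usage: pipe into pbcopy, read back with pbpaste
echo "copied from the terminal" | pbcopy
pbpaste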
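And here's roughly the shape of the LiteLLM + Postgres setup. This is a hedged sketch, not my exact commands: the image tags, password, master key, port mappings, and config path are placeholders/assumptions you'd swap for your own.

# shared network so the two containers can reach each other by name
docker network create llm

# Postgres for LiteLLM (placeholder credentials, data in a named volume)
docker run -d --name litellm-db --network llm --restart unless-stopped \
  -e POSTGRES_USER=llmproxy -e POSTGRES_PASSWORD=changeme -e POSTGRES_DB=litellm \
  -v litellm-db:/var/lib/postgresql/data postgres:16

# LiteLLM proxy on port 4000, pointed at that database and a config.yaml holding the provider API keys
docker run -d --name litellm --network llm --restart unless-stopped -p 4000:4000 \
  -e DATABASE_URL="postgresql://llmproxy:changeme@litellm-db:5432/litellm" \
  -e LITELLM_MASTER_KEY="sk-change-me" \
  -v ./litellm/config.yaml:/app/config.yaml \
  ghcr.io/berriai/litellm:main-latest --config /app/config.yaml

Open WebUI then treats LiteLLM as an OpenAI-compatible endpoint (http://<host>:4000/v1) using the virtual key, which is how those 22 models all show up in one interface.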
Mistakes:
Went to install the LiteLLM Docker container on my small server, which has a 64-bit Atom processor from 2012. It couldn't handle the newer CPU instructions the container needed, and it took me and an AI about 30 minutes to figure that out.
Here's what's happening: CPU instruction sets. Modern software gets compiled to use specific CPU instructions. Newer instructions are faster and more efficient, but older CPUs don't have them.
Your D2550 from 2012 supports: basic x86_64 instructions, SSE, SSE2, SSE3, SSSE3
Modern containers often require: AVX (Advanced Vector Extensions), AVX2, SSE4.1, SSE4.2
Check what your CPU actually has: cat /proc/cpuinfo | grep flags | head -n 1 - look for avx in that list. It's probably not there.
Why exit code 132? When the container starts, it tries to execute an instruction your CPU doesn't understand. The CPU literally says "I don't know what you're asking me to do" and crashes the program immediately with an "illegal instruction" error (code 132).
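If you just want a quick yes/no on AVX instead of reading the whole flags line, something like this works on any Linux box:

# list any AVX-family flags the CPU advertises (no output = no AVX)
grep -o 'avx[0-9a-z_]*' /proc/cpuinfo | sort -u

# one-line verdict
grep -qw avx /proc/cpuinfo && echo "AVX supported" || echo "no AVX - AVX-built binaries will die with illegal instruction (exit 132)"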
That’s it for this week’s Tech Update. -Derek
Check out my Social Media Links, my Book, my Podcast, and my Substack below:


