From the first electronic circuits to Kubernetes orchestration, one constant has guided me: never accepting that a system works without deeply understanding its internal logic. Whether it's hardware or the command line, my goal has always been to master the system, understand its deep mechanisms, and, where possible, push its limits a little further.
Today many talk about Artificial Intelligence and Cloud as absolute novelties, but those who have experienced the evolution of bits “under the hood” are witnessing an extraordinary convergence. In this post, I want to tell you how I went from the screech of the 56k modem to the orchestration of a Kubernetes cluster in my living room.
AI Before Power: The Era of SVMs (1999-2001)
At the end of the 90s, Artificial Intelligence was a challenge against the limits of physics. The available computing power did not allow classical neural networks to converge in a reasonable time.
In my Physics degree thesis, I addressed this limit by using Support Vector Machines (SVMs) for the early diagnosis of tumors in radiographic images. The algorithms were written in C and C++ for maximum performance, with a Java interface. It was an era when AI was built with pure mathematics.
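Just to give a feel for the technique: here is a minimal, modern sketch of SVM classification on a public tumor dataset using scikit-learn. This is purely illustrative, not the original C/C++ code from the thesis, and the dataset and parameters are my assumptions.

```python
# Illustrative sketch only: a scikit-learn SVM on a public breast-tumor
# dataset, standing in for the original hand-written C/C++ implementation.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Load features and benign/malignant labels
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# RBF-kernel SVM; standardizing features first matters a lot for SVMs
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

The whole model fits in a dozen lines today; back then, the kernel computations and the quadratic optimization had to be coded by hand.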
Slackware, RedHat, and the Modem's Screech
In those same years, I discovered Linux. There were no YouTube tutorials; there were manuals and newsgroups. I installed the first versions of RedHat and Slackware, configuring kernels while my connection to the outside world ran over a twisted-pair telephone line, announced by the screech of a 56k modem. There I learned how an operating system "thinks."
From the Semantic Web to Enterprise Consulting
After an IT Master's focused on the internet stack and the Semantic Web (the idea of making data understandable to machines), I arrived in Milan. Those were the years of "heavyweight" consulting in Java, where modern JavaBeans had to talk to COBOL systems and critical SQL databases. It was an incredible training ground for managing complexity at scale.
The Duality: Teaching and Web Development
My path then veered towards teaching Mathematics and Physics. But I never stopped being a developer. During my teaching qualification process, I worked as a JavaScript and PHP programmer, staying connected with the world of the dynamic web.
In these years, I understood that learning a new language (Python, Haskell, Rust, XML) is a very fast process if you understand its paradigms. Once you understand the logic (functional, imperative, object-oriented), the difference is almost only syntax.
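To make that point concrete, here is the same small task written twice in Python, once imperatively and once functionally. The task itself (sum of the squares of the even numbers) is my own toy example.

```python
# One task, two paradigms: sum the squares of the even numbers.
nums = [1, 2, 3, 4, 5, 6]

# Imperative style: explicit state, explicit iteration
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Functional style: a single expression, no mutation
total_fn = sum(n * n for n in nums if n % 2 == 0)

assert total == total_fn
print(total)  # both styles compute the same value
```

Once you see that the two versions express the same logic, switching between languages that favor one paradigm or the other becomes mostly a matter of syntax.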
The Spark: AI as a Study Companion
The circle began to close with the advent of the new LLMs. Initially, I used AI for educational purposes: creating personalized materials for my students and, especially, teaching them how to use AI to improve learning and not to “cheat.”
But AI reawakened my desire to experiment. I needed a working Linux system and, after 15 years, I installed Linux Mint. I was amazed: the Linux world had become simple, elegant, beautiful. Thanks to AI, I didn't have to reread tons of documentation for drivers and services; in two days, I had a perfect workstation.
The Escalation: From the VM to the Mini PC
At that point, curiosity became unstoppable. I wanted services active 24/7. I rented a VM, using it as a training ground with AI as a “Senior Partner” always by my side.
But the services grew: Docker, SQL and NoSQL databases, vector databases for LLMs, n8n for automation, Tailscale for the network… The rented server was no longer enough. Instead of increasing the monthly fee, I made the leap: I bought a Mini PC with enough RAM, a domain, and started building my homelab.
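A stack like that is typically wired together with Docker Compose. The fragment below is a hypothetical sketch of what two of those services might look like; the image names are real public images, but the versions, ports, and volumes are illustrative assumptions, not my actual homelab configuration.

```yaml
# Hypothetical Compose excerpt: service choices and settings are illustrative.
services:
  n8n:                      # workflow automation
    image: n8nio/n8n
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
  qdrant:                   # vector database for LLM embeddings
    image: qdrant/qdrant
    ports:
      - "6333:6333"

volumes:
  n8n_data:
```

Each new service added like this eats a little more RAM, which is exactly how a rented VM stops being enough.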
Beyond Fear: Proxmox and Kubernetes
With AI supporting me, the "fear of complexity" vanished. Why read 300 pages of a manual when AI can extract the exact 10 lines I need at that moment?
So I installed Proxmox. Then I told myself: "Why stop?" I faced the ultimate challenge: Kubernetes. Perhaps it is overkill for a home laboratory, but the subject is too fascinating to ignore. Today my homelab runs on a Kubernetes cluster with Talos Linux on Proxmox.
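For readers unfamiliar with Kubernetes, a workload on such a cluster is declared in a manifest like the one below. This is a minimal generic sketch (name and image are placeholders, not my actual workloads), but it shows the declarative style that makes the subject so fascinating: you state the desired number of replicas, and the cluster keeps them running.

```yaml
# Minimal illustrative Deployment; names and image are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: hello-homelab
spec:
  replicas: 2               # the cluster maintains two copies, restarting them as needed
  selector:
    matchLabels:
      app: hello-homelab
  template:
    metadata:
      labels:
        app: hello-homelab
    spec:
      containers:
        - name: web
          image: nginx:1.27
          ports:
            - containerPort: 80
```

Applied with `kubectl apply -f`, this single file is the whole contract between you and the cluster: no manual process management, no SSH into nodes.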
Conclusion: A New World to Explore
Who am I today? A physicist who saw the birth of AI and who today uses it to break down time barriers. I don’t know where this road made of High Availability, security, and scalability will lead me, but I know that in front of me is a new world to explore.
I can’t wait to discover what’s behind the next node.
Did you enjoy this journey? Follow me on blog.tazlab.net to discover how I am configuring my cluster and what the next experiments on the border between physics and cloud are.

