If you have any questions or feedback, please fill out this form
This post is translated by ChatGPT and originally written in Mandarin, so there may be some inaccuracies or mistakes.
Background
2020 was a chaotic year, but it also prompted me to rethink the essence of computers. Throughout this year, I ventured into many areas outside of my expertise, all revolving around a single theme: re-examining the fundamentals.
In high school (more accurately, vocational school), I studied electronics. During an era where further education was the mainstream path, we had practical classes, but they often felt more like rote exercises, with the majority of our time spent on studying and problem-solving. The only thing I was thankful for was that the coursework wasn't as heavy as in regular high schools; otherwise, I might not have made it into the National Taiwan University of Science and Technology.
One regret from vocational school was my lack of interest in hands-on projects. Much of the time was spent teaching soldering, breadboarding, ensuring wires didn’t overlap, and avoiding shorts—tasks that were challenging for someone like me who wasn't particularly skilled with their hands.
In fact, during my studies in electronics, I wasn't particularly interested in the field and never anticipated that I would later earn a living by programming. Strangely enough, despite that initial lack of interest, I've recently become intrigued by these deeper, foundational concepts.
Returning to the question at hand, the answer depends on how deeply you grasp the workings of a computer. From diodes and transistors to registers, CPU operations, and operating systems, the knowledge in each domain could take a lifetime to fully explore. It's truly profound.
For me, this year has been about thoroughly understanding what a computer is. Although I'm not a professional, I find immense joy in grasping these concepts, and I've also discovered colleagues in my company who share an interest in foundational knowledge.
Overlap with Electronics
When I was studying electronics in vocational school, I never anticipated how useful this knowledge would be later on. Concepts like bridge rectifiers, diodes, filtering, and voltage regulation are commonly used in circuits for chargers. I must have calculated these concepts hundreds of times for exams, and we even had practical sessions where we built bridge rectifier circuits.
When a CPU performs addition, it needs to implement an adder, which is divided into half-adders and full-adders. We learned these step by step in class through truth tables and implemented them on a breadboard using logic gates (though using ICs is an option too).
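Just to check that I still remember how it works, here's the same truth-table logic as a small C sketch; the bitwise operators stand in for the gates, and everything here is my own illustration rather than anything we wired up in class:
#include <stdio.h>

// Half-adder: sum = a XOR b, carry = a AND b
void half_adder(int a, int b, int *sum, int *carry) {
    *sum = a ^ b;
    *carry = a & b;
}

// Full-adder: two half-adders chained, with the carries OR'd together
void full_adder(int a, int b, int cin, int *sum, int *cout) {
    int s1, c1, c2;
    half_adder(a, b, &s1, &c1);
    half_adder(s1, cin, sum, &c2);
    *cout = c1 | c2;
}

int main(void) {
    // Print the full-adder truth table
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            for (int cin = 0; cin <= 1; cin++) {
                int sum, cout;
                full_adder(a, b, cin, &sum, &cout);
                printf("a=%d b=%d cin=%d -> sum=%d carry=%d\n", a, b, cin, sum, cout);
            }
    return 0;
}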
CPUs require a clock signal, typically generated by a quartz oscillator or an external clock generator. Wow, the NE555 circuit I struggled with back then can generate a clock signal for the CPU to operate! (Of course, modern CPUs have this built-in now.)
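For reference, in astable mode the 555's output frequency is roughly f ≈ 1.44 / ((R1 + 2·R2) · C); a quick back-of-the-envelope calculation in C, with component values I made up purely for illustration:
#include <stdio.h>

int main(void) {
    // NE555 astable mode: f ~= 1.44 / ((R1 + 2*R2) * C)
    double r1 = 1000.0;   // ohms (made-up example value)
    double r2 = 10000.0;  // ohms (made-up example value)
    double c  = 100e-9;   // farads, i.e. 100 nF (made-up example value)
    printf("approx. frequency: %.1f Hz\n", 1.44 / ((r1 + 2.0 * r2) * c));
    return 0;
}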
What once seemed baffling—the flip-flop—is actually the foundation for implementing registers. As I began to understand more and more of these fundamentals, I realized I had already encountered this knowledge during vocational school; I just hadn’t connected the dots at the time.
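One way I like to picture it: a register is just a row of D flip-flops sharing a clock. A rough simulation sketch (the structure and names are mine, purely for illustration):
#include <stdint.h>
#include <stdio.h>

// A D flip-flop only latches its input on the rising edge of the clock
typedef struct {
    int q;          // stored bit
    int prev_clk;   // previous clock level, used to detect the edge
} dff_t;

void dff_tick(dff_t *ff, int clk, int d) {
    if (clk && !ff->prev_clk)   // rising edge
        ff->q = d;
    ff->prev_clk = clk;
}

int main(void) {
    // An 8-bit register: eight flip-flops driven by the same clock
    dff_t reg[8] = {0};
    uint8_t input = 0xA5;
    for (int clk = 0; clk <= 1; clk++)      // one clock pulse: low, then high
        for (int i = 0; i < 8; i++)
            dff_tick(&reg[i], clk, (input >> i) & 1);
    uint8_t value = 0;
    for (int i = 0; i < 8; i++)
        value |= (uint8_t)(reg[i].q << i);
    printf("register now holds 0x%02X\n", value);   // prints 0xA5
    return 0;
}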
This foundational knowledge carries a sense of mystery—on one hand, the entry barrier is relatively high; after all, not many people buy chips for breadboards every day, and a basic understanding of each concept is necessary. On the other hand, there’s a severe lack of information online. You might spend a long time searching for the answer buried in some forum's x-th comment, and often, you have to rely on yourself to solve problems.
Returning to Basics
Given this, one of my goals in 2020 was to get as close to the fundamentals as possible, whether that meant hardware, understanding operating systems, or familiarizing myself with programming languages. In short, anything that could help me understand foundational concepts was on the table.
At the beginning of this year, I bought an Arduino. Some might think: isn't an Arduino just something others have already put together? True, but I figured I could start from there. Whenever I needed a sensor, I decided not to reach for someone else's library right away; instead, I would read through the datasheet to see how to drive it. Only if I really couldn't figure it out would I check the library's implementation.
In July of this year, I implemented an air quality monitoring application using Arduino and ESP32, which utilized MQTT, DHT11, and MH-Z14A, with UART for data communication. Instead of relying solely on libraries, I diligently read through the MH-Z14A datasheet and implemented the functionality myself. While I did use libraries for the Wi-Fi and MQTT components, I still learned a great deal.
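The core of it is really just building the 9-byte command frame and validating the response checksum. Here's a stripped-down sketch in plain C of that framing, with the UART plumbing left out; the command bytes and checksum rule are from my reading of the datasheet, so double-check them against your own copy:
#include <stdint.h>
#include <stdio.h>

// Checksum as I understood it from the datasheet:
// invert the sum of bytes 1..7, then add 1.
static uint8_t mhz14a_checksum(const uint8_t *frame) {
    uint8_t sum = 0;
    for (int i = 1; i < 8; i++) sum += frame[i];
    return (uint8_t)(0xFF - sum + 1);
}

// Parse a 9-byte response to the "read CO2" (0x86) command.
// Returns the concentration in ppm, or -1 if the frame looks invalid.
static int mhz14a_parse(const uint8_t *resp) {
    if (resp[0] != 0xFF || resp[1] != 0x86) return -1;
    if (mhz14a_checksum(resp) != resp[8]) return -1;
    return resp[2] * 256 + resp[3];   // high byte * 256 + low byte
}

int main(void) {
    // The command you would write to the sensor over UART (9600 8N1):
    const uint8_t read_cmd[9] = {0xFF, 0x01, 0x86, 0, 0, 0, 0, 0, 0x79};
    (void)read_cmd;
    // A made-up response frame, standing in for what the UART would return:
    uint8_t resp[9] = {0xFF, 0x86, 0x01, 0xA4, 0, 0, 0, 0, 0};
    resp[8] = mhz14a_checksum(resp);
    printf("CO2: %d ppm\n", mhz14a_parse(resp));   // 0x01A4 = 420 ppm
    return 0;
}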
This wasn't enough for me; I didn't want to stay wrapped inside the Arduino's shell. So I searched Amazon for the legendary MOS 6502.
The MOS 6502 is considered classic because it is affordable yet performs relatively well, which is why it was used in devices like the Famicom and Apple II. Additionally, compared to modern CPUs, its instruction set and design are much simpler, making it easier to understand CPU operations.
I was initially excited to place my order and awaited its arrival, only to later realize that the MOS 6502 does not have built-in EEPROM. You have to write programs to EEPROM and feed it to the MOS 6502 for it to run. However, I currently don’t have a good way to buy the EEPROM I need here in Japan, so I had to put that plan on hold.
Instead, I turned to AVR, since there's an electronics store near where I live that sells various ATmega-series microcontrollers. I grabbed a few more or less at random (among them the ATmega328, which happens to be the MCU used in the Arduino UNO).
The advantage of the ATmega328 is its size: it fits perfectly on a breadboard. Its instruction set is also relatively simple, and it has a few distinctive advantages:
- 32 general-purpose registers, significantly more than other architectures (x86: 8, ARM: 16)
- Most instructions only require one clock cycle
- Most AVR chips come with built-in flash and EEPROM, eliminating the need for separate EEPROM and making reading more efficient
Understanding Assembly Language and Hardware Logic
In the world of hardware, everything becomes straightforward yet inconvenient. For example, if you want a specific pin to output a high signal using pure assembly language (AVR), you might write:
ldi r16, 0x01
out DDRB, r16
out PORTB, r16
This first loads 0x01 into register r16, sets the DDRB (data direction) register to 0x01, and then sets PORTB to 0x01, which is quite similar to what digitalWrite does.
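For comparison, roughly the same thing in C with avr-libc looks like this; I'm assuming pin PB0 here, and avr-gcc ends up emitting essentially the same instructions as above:
#include <avr/io.h>

int main(void) {
    DDRB  |= (1 << PB0);   // configure PB0 as an output (data direction register)
    PORTB |= (1 << PB0);   // drive PB0 high
    for (;;) { }           // loop forever so the pin stays high
}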
Surprisingly, assembly language isn’t as difficult as I imagined; however, writing industry-level assembly code might still be quite a challenge.
A Deeper Understanding of Interrupts
I still haven't fully grasped how the interrupt mechanism works under the hood (the hardware circuitry and so on), but I have developed a deeper understanding of interrupts.
In a typical CPU, an interrupt vector table defines what the CPU should do when each interrupt occurs, and whether interrupts are enabled at all is generally controlled by the Global Interrupt Enable bit in SREG. Working with this takes a solid understanding of register operations and bit shifting, and in AVR there are a few functions that correspond to it (from avr/interrupt.h).
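As a concrete (and purely illustrative) example, the usual avr-libc pattern looks something like this; I'm using the external interrupt INT0 on an ATmega328 as the trigger:
#include <avr/io.h>
#include <avr/interrupt.h>

// The ISR() macro places this handler at the INT0 slot of the interrupt vector table
ISR(INT0_vect) {
    PORTB ^= (1 << PB0);    // toggle PB0 whenever INT0 fires
}

int main(void) {
    DDRB  |= (1 << PB0);    // PB0 as output
    EICRA |= (1 << ISC01);  // trigger INT0 on a falling edge
    EIMSK |= (1 << INT0);   // enable the INT0 interrupt source
    sei();                  // set the Global Interrupt Enable bit in SREG
    for (;;) { }            // everything else happens in the ISR
}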
This part is fascinating but can also lead to issues; if it's a toy project, it's manageable, but I wonder how people manage and debug interrupt mechanisms in production environments.
Getting to Know Operating Systems
From around March to April this year, I sporadically watched a Coursera course on operating systems, covering topics like PC (program counter), IR (Instruction Register), kernel mode, user mode, atomic operations, and threads, all the way to semaphores. I still feel completely lost regarding implementation, but at least I have a "better" understanding now. I recently came across a book on building an OS in 30 days. While it looks quite basic and probably omits many details, I find it intriguing; perhaps one day, when I have the time, I’ll give it a try. I still prefer testing directly on actual hardware over running a simulator!
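Of those concepts, the semaphore was the one that felt most concrete to me. As a minimal illustration with POSIX semaphores in C (my own toy example, nothing to do with the course material):
#include <pthread.h>
#include <semaphore.h>
#include <stdio.h>

sem_t slots;                      // counting semaphore: how many threads may enter

void *worker(void *arg) {
    sem_wait(&slots);             // P(): block until a slot is free
    printf("thread %ld in the critical region\n", (long)arg);
    sem_post(&slots);             // V(): release the slot
    return NULL;
}

int main(void) {
    pthread_t t[4];
    sem_init(&slots, 0, 2);       // allow at most 2 threads inside at once
    for (long i = 0; i < 4; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);
    for (int i = 0; i < 4; i++)
        pthread_join(t[i], NULL);
    sem_destroy(&slots);
    return 0;
}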
Programming Languages
Earlier this year, I stumbled upon a book by Yukihiro Matsumoto titled "まつもとゆきひろ 言語のしくみ," in which he implemented a programming language from scratch (using yacc/lex) and explained the principles and implementation methods in detail. I was quite impressed with its completeness; it wasn’t just about basic arithmetic but included strings, time handling, numbers, randomness, arrays, atomic operations, and more—essentially covering the foundational elements of a programming language.
Though I’d like to undertake something similar, I truly find the yacc/lex documentation hard to understand. I’ll tackle that challenge once I have a firmer grasp of C.
Another experience was the IT Ironman competition, where I attempted to implement a simple version of Svelte (without the reactive features) and wrote a basic parser. It was a light brush with programming languages! XD
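It really was only the first step of parsing: splitting the template into plain text and {expression} chunks, something along these lines (a toy sketch in C, not the actual code from the series):
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *tpl = "<p>Hello {name}, you have {count} messages</p>";
    const char *p = tpl;
    while (*p) {
        if (*p == '{') {
            const char *end = strchr(p, '}');       // find the matching brace
            if (!end) break;                        // unterminated tag: give up
            printf("EXPR: %.*s\n", (int)(end - p - 1), p + 1);
            p = end + 1;
        } else {
            size_t len = strcspn(p, "{");           // run of plain text
            printf("TEXT: %.*s\n", (int)len, p);
            p += len;
        }
    }
    return 0;
}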
There’s still so much to learn ahead! Let's all strive to navigate through this chaotic year together!
If you found this article helpful, please consider buying me a coffee ☕ It'll make my ordinary day shine ✨
☕Buy me a coffee