This will be a series of posts about the things I wish I had known sooner, things that might have made some difference in my path, though it's never clear whether it would have been for better or worse. Anyway, I'll mostly be discussing professional life and academia, not much about personal life.
If you're going to work with computers, learn to read between the lines.
Computers are a revolutionary invention. They evolved over centuries, but the Universal Turing Machine was the breakthrough that made computation as we know it today possible. Even though they have an incredible effect on our daily lives, most people simply can't see what's beneath this technology: all the mathematics, physics, chemistry and other sciences that make this invention work.
I'm not asking everyone to be a mathematician with a science degree, but you need to know at least the minimal details of this technology. It's based on electricity, so you should know the relation between voltage, current and power, and how it applies to a computer. Most people don't even know the minimum necessary to buy a power supply for their laptop: it's simply a matter of having the same voltage and the same or greater current rating (also the same connector), and it will work the same way. Why do these matter? A computer is an electrical power-consuming machine, so you need to be sure electricity will pass through the machinery with the same intensity (voltage) and that the supply can deliver enough power to make all its parts work (voltage times current equals power).
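The power-supply rule above is simple enough to write down as code. This is just an illustrative sketch (the function names and the 19.5 V / 3.33 A example brick are mine, not from any standard), but it captures the two checks: same voltage, equal or greater current rating.

```python
def power_watts(voltage, current):
    """Power (W) = voltage (V) times current (A)."""
    return voltage * current

def charger_is_compatible(laptop_voltage, laptop_current, psu_voltage, psu_current):
    """Can this power supply safely replace the original one?

    Assumes the connector already matches; this checks only the
    electrical side: identical voltage, and a current rating at
    least as high as what the laptop needs.
    """
    return psu_voltage == laptop_voltage and psu_current >= laptop_current

# A typical 19.5 V / 3.33 A laptop brick delivers roughly 65 W:
print(power_watts(19.5, 3.33))
print(charger_is_compatible(19.5, 3.33, 19.5, 4.62))  # same V, more A: fine
print(charger_is_compatible(19.5, 3.33, 12.0, 5.0))   # wrong voltage: no
```

Note that a higher current *rating* on the supply is safe: the laptop only draws what it needs, while the voltage must match exactly.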
Also, you don't need to know binary arithmetic by heart, but you should at least understand how it works. In our decimal system, we count from 0 to 9, adding 1 at every step, as if we were walking along a line and every step increased our digit by 1. Once we reach the base's value, we exchange those steps for a 1 in the next, higher-valued digit (a ten) and go back to 0 in the first digit. Binary arithmetic works the same way, but each digit can only be 0 or 1, so every time we take two steps we already carry into the next digit. That's why 1 plus 1 is 10 in binary: when we reach 2, we set the current digit back to 0 and add 1 to the next digit. Likewise, 11 plus 1 is 100, because we exchange the units digit for a twos digit, then the twos digit for a fours digit, always leaving a 0 in place of the exchanged digit. This kind of arithmetic is what made smaller computers possible (at least ones not the size of a whole building), because we can use voltage to represent the digits 0 and 1 (say, 3 V for a 1 and 0 V for a 0, or simply "on" and "off"). This binary logic is what makes computers possible.
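You can check the carry behaviour described above with a few lines of Python, since its built-in `bin` prints numbers in base 2 (the small helper function is my own wrapper, just to strip the `0b` prefix):

```python
def to_binary(n):
    """Binary representation of a non-negative integer, as a string."""
    return bin(n)[2:]

# 1 + 1 = 10 in binary: we hit the base (2), reset to 0 and carry 1.
print(to_binary(1 + 1))     # '10'

# 11 + 1 = 100: the carry ripples through two digits.
print(to_binary(0b11 + 1))  # '100'
```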
You don't even need to know what an electron or a semiconductor is, but you should at least know that the computer's electrical circuits are where electrons run as a current, and that some components of these circuits use physics to "trap", "interrupt", "delay" or "pull" electrons through a circuit. With those components in place, we can know when a component is "on" or "off" (that is, holds a 1 or a 0) and combine them to make logical decisions (if the first component is 1 and the second component is 1, then the next component will also be 1). This trick lets us build computers as switching machines, with billions of switches turning "on" and "off" every nanosecond, performing the logical processing we do in processors.
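The "if both inputs are 1, the output is 1" rule in that parenthesis is exactly an AND gate. As a rough sketch (modelling switches as 0/1 integers, which is my simplification, not how hardware is actually described), we can simulate the basic gates and even wire them into a half adder, the circuit that computes one digit of the binary addition from the previous paragraph:

```python
def AND(a, b):
    """Output is 1 only when both inputs are 1."""
    return 1 if a == 1 and b == 1 else 0

def OR(a, b):
    """Output is 1 when at least one input is 1."""
    return 1 if a == 1 or b == 1 else 0

def NOT(a):
    """Inverts the input."""
    return 0 if a == 1 else 1

def half_adder(a, b):
    """One binary digit of addition: returns (sum_bit, carry_bit).

    The sum bit is XOR, built here from AND/OR/NOT; the carry is AND.
    """
    sum_bit = OR(AND(a, NOT(b)), AND(NOT(a), b))
    return (sum_bit, AND(a, b))

print(half_adder(1, 1))  # (0, 1): 1 + 1 = binary 10, as above
```

Real processors are, at heart, enormous networks of gates like these, switched by transistors instead of Python functions.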
It doesn’t matter if you’re a Dev guy or an Ops guy, you’ll need both.
Information Technology can't simply be divided into two big areas from which you pick a path, yet most formal education in the field focuses on one of two tracks: either you'll be the one who writes programs, designs algorithms and deals with databases, or you'll be the one who fixes and builds computers, interconnects them and deals with servers and protocols. Those are, respectively, the Developer and the Operations person.
Nowadays, we have even more divisions inside each of these: you may be a Database Administrator (DBA), a Front-end Developer, a Back-end Developer, a Fullstack Developer (that's both front and back-end, but getting paid the same as either one alone), a Data Scientist, an AI Engineer, and many others; or you may be a Network Administrator, an Infrastructure Manager, a System Administrator, a Cloud Architect, a Security Engineer, a Pentester, and many others. There are other jobs in the area too, like Support Analyst or IT Manager; don't get me wrong, I'm only giving examples.
It's good to have a specialization in one of these areas, but you can't stick to one and simply forget the other, especially if you work in Dev without any knowledge of Ops, or in Ops without any knowledge of Dev: you need to know at least the minimum of both. If you're, like me, an Ops person, you'll need to know at least one or two good programming languages (Python, Java, Rust, C/C++, among others) to automate tasks, build management tools and do other things that help you in your job. Likewise, if you're in Dev, you'll need to know something about servers, connections and protocols, since you'll probably be working on network-connected services and systems that must be deployed in a complex environment with correct scaling, so your work can be as efficient and cost-effective as possible.
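To make the "automating tasks" point concrete, here is the kind of tiny script an Ops person writes all the time: check how full a filesystem is and complain past a threshold. The 90% cutoff and the `/` path are purely illustrative choices; the script itself uses only Python's standard library (`shutil.disk_usage`).

```python
import shutil

def disk_usage_percent(path="/"):
    """Percentage of the filesystem at `path` that is in use."""
    usage = shutil.disk_usage(path)
    return 100 * usage.used / usage.total

if __name__ == "__main__":
    pct = disk_usage_percent("/")
    if pct > 90:  # illustrative threshold
        print(f"WARNING: / is {pct:.1f}% full")
    else:
        print(f"OK: / is {pct:.1f}% full")
```

Drop something like this in a cron job and you've replaced a manual daily check, which is exactly why even "pure Ops" people benefit from a programming language.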
Of course, no one will ever be a specialist in everything, but there's a minimum threshold everyone should meet in every area: at least enough to deal with the challenges that appear, instead of just saying "I don't know how it works, so I'm not going to deal with that".
That’s all for now, folks!
I've focused a lot on the so-called "hard skills" in this post. I'll try to talk a little more about soft skills in a later one.