From my enfeebled understanding, computer systems are inherently complex but not entirely impossible to understand, at least at a basic level. Several distinct operations of a computer have simple, real-world analogues for how we interact with them. Take a file: in essence, it is a document, very similar to a piece of paper with writing on it. If one were to scribe upon said paper and then put it down, there should be minimal confusion as to where it has been set down in the physical world.
The same can be said of the oven in your kitchen – a pivotal piece of equipment in almost everyone’s life, it sits in one place and there is absolutely no confusion as to its location.
You may hazard a guess as to where I’m going with this line of thought – I have worked in the tech field for a not-insignificant amount of time, in relative terms (Moore’s Law, etc., ad nauseam), and I feel I’ve gathered enough experience to form an opinion.
A file is stored on a system and stays in the place you have specified. Across my professional career I have seen people who have worked with computers full-time, five days a week, for over 20 years, and they still show absolutely no curiosity about ‘how did I lose that file I just saved?’, nor any interest in finding where it went; instead they simply save it again in a different spot, one which is potentially still incorrect.
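For the record, the answer to ‘where did it go?’ is usually a single command away. A minimal sketch of the hunt, with an entirely made-up filename:

```sh
# The most recently modified things under my home directory, newest first
ls -lt ~ | head

# Or hunt for a (hypothetical) report saved within the last half hour
find ~ -iname 'quarterly_report*' -mmin -30 2>/dev/null
```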
Within a corporate environment, I can appreciate the hassle of juggling various network drives, VPN pathing and virtualised or cloud applications that aren’t run locally; however, I have noticed that a general grasp of how computers do stuff is largely written off as useless knowledge that bears no relevance to one’s work.
The same can be said of engineering professionals whose workload revolves primarily around programming. More often than not they work in multiple languages as well, meaning organisation of one’s codebase is relatively important.
Einstein once quipped about how people organise their desks – something along the lines of ‘a messy desk reflects how one’s mind works; what does a desk with nothing on it suggest?’. I am inclined to agree that you are welcome to organise information in whatever manner works for you. However, the basic concepts of virtual environments, database drivers and terminal layers should be easy to understand and, more importantly, easy to leverage for professionals in a tech-related role. All of the aforementioned attributes of a system do indeed have a location; it’s just that many people ignore it, or at the very least assume it’s too complicated to learn.
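To underline that ‘everything has a location’ point, here is a minimal sketch of how I’d ask a Linux box where things actually live (the database-driver line assumes psycopg2 happens to be installed and is purely illustrative):

```sh
# Which Python am I actually running, and out of which environment?
which python3
python3 -c 'import sys; print(sys.prefix)'

# Where does an installed database driver physically live?
python3 -c 'import psycopg2; print(psycopg2.__file__)'

# Which shell am I in, and where does it look when I type a command?
echo "$SHELL"
echo "$PATH"
```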
The basic underlying concepts of how computers work are really useful for those of us who interact with them every day. Completely ignoring those mechanisms is not only frustrating for the people you work with, it’s also a missed opportunity to learn something cool! (Well, at least I think it’s cool – and my Mum said I’m the coolest guy in the world, so that’s gotta count for something.)
In reference to the title of this article, I’d like to drive home (also, evidently, a physical location – heh) the point that knowing where you are in a computer system is analogous to orientation in the real world; you drive down a street in the same way you change directories in a file system.
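Or, in terminal terms, a minimal sketch of the driving involved (the project path is made up):

```sh
pwd                        # where am I right now?
cd ~/projects/made-up-app  # drive over to a (hypothetical) project
ls                         # have a look around at the intersection
cd ..                      # back up one block
cd -                       # U-turn to wherever I just was
```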
I realise as I write this that the kind of person who would actually care to learn these things will never see these words, so instead I’ll refocus on some of my favourite tooling for navigating and running software in a Linux environment:
- oh-my-zsh – terminal highlighting and convenience features such as auto-completion and visual cues for Git repositories.
- jump – lets you mark locations and easily ‘jump’ between them.
- virtualenvwrapper – a fantastic way to simply manage your Python environments (a rough shell setup for these first three is sketched just after this list).
- neovim – technically not a navigation tool, but good lord it is good at it, subject to having the following plugins:
  - harpoon – similar to ‘jump’ above, just inside Vim.
  - telescope – fantastic at fuzzy-finding files, and really fast too!
  - treesitter – syntax highlighting, which just makes it easier to see where you are.
  - vim-airline – ditto, via a status line that shows which file you’re in and where.
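As promised above, here is a minimal ~/.zshrc-flavoured sketch of how the first three hang together on my machines. The virtualenvwrapper path varies by distro, and the mark/jumpto helpers are a tiny hand-rolled stand-in for the idea rather than the actual tool’s interface:

```sh
# oh-my-zsh: theme, plugins and the usual sourcing
export ZSH="$HOME/.oh-my-zsh"
ZSH_THEME="robbyrussell"
plugins=(git)
source "$ZSH/oh-my-zsh.sh"

# virtualenvwrapper: keep every Python environment in one known location
export WORKON_HOME="$HOME/.virtualenvs"
source /usr/local/bin/virtualenvwrapper.sh   # path is distro-dependent
# usage: mkvirtualenv myproject / workon myproject / deactivate

# a hand-rolled 'mark and jump' (symlinks in ~/.marks), standing in for jump
export MARKPATH="$HOME/.marks"
mark()   { mkdir -p "$MARKPATH" && ln -sfn "$(pwd)" "$MARKPATH/$1"; }
jumpto() { cd -P "$MARKPATH/$1" 2>/dev/null || echo "no such mark: $1"; }
```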
I don’t wish to write this in the guise of a man who is cardinally aligned with the stars themselves whenever he’s inside a system. On the contrary, just the other day I was forced to nest my ssh connections after connecting to the corporate VPN, piggy-backing from my standard-issue laptop to another physical machine and then on to a server (three layers deep!); all of this just so I could configure iptables on the server to let my normal connection back in. Although I suppose, in actuality, the person with no interest in how those connections work at a basic level would simply have opted for “I’m blocked, okay! It’s the design of the backend”.
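The nesting itself isn’t magic, for what it’s worth. A rough sketch of that hop-through using OpenSSH’s ProxyJump – the hostnames, subnet and port here are invented stand-ins, not our actual setup:

```sh
# From the laptop: hop via the intermediate machine straight to the server.
# (More hops can be chained after -J with commas, or kept in ~/.ssh/config
# as a ProxyJump entry.)
ssh -J me@middle-machine me@target-server

# Once on the server: let my normal connection back in (rule is illustrative)
sudo iptables -I INPUT -p tcp -s 10.0.0.0/8 --dport 22 -j ACCEPT
```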
Interestingly, we can extend these basic principles to something as large-scale as an AWS instance. It amuses me that the field of DataOps is considered quite prestigious, with various certifications on offer for how ‘the cloud’ works. I’ve long been a fan of converting complex topics into what I refer to as ‘cave-man’ English, in which I explain to my colleagues in bare-bones terms how a thing works; more or less, this approach works quite nicely.
Regarding AWS, however, the ability to instantly scale as needed is predicated on the fact that one guy managed to scoop up what feels like over 60% of the world’s silicon and dump it into various highly performant data centres around the globe – all while leveraging otherwise basic concepts of how computers distribute tasks. A fantastic example of this is an S3 bucket, which can hold a huge number of files and is essentially just that – a file system, with the exception that it can provision more storage at a moment’s notice; just like the file system where the Senior Data Scientist forgot where he’d saved the Excel file he had spent three hours wrangling.
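And because S3 really is just ‘files with a location’, here is a minimal sketch with the AWS CLI – the bucket name and paths are entirely made up:

```sh
# Put the freshly wrangled spreadsheet somewhere with an unambiguous address
aws s3 cp quarterly_wrangle.xlsx s3://hypothetical-team-bucket/reports/

# ...and find it again later, because it stayed exactly where it was put
aws s3 ls s3://hypothetical-team-bucket/reports/
```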
I suppose the AWS systems trade on the fact that no-one really knows how they work under the covers, which plays nicely with what I’ve been babbling about above.
On a side note, I have worked with talented engineers in the past who worshipped the machine god’s spirit more devoutly than I do. We were once provisioned an EC2 instance with a vanilla Ubuntu install and no access to the internet.
I understand that modifying those properties may have been ill-advised, but nonetheless, we persevered. Our team lead at the time simply suggested ‘why don’t you just reverse ssh to one of our local machines on the network and use the internet that way?’ – lo and behold, it worked! Totally not standard protocol, but it was fun to go rogue for the day against the almighty judgement of IT Services.
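For the curious, here is roughly what that trick can look like, assuming a proxy (say, Squid on port 3128) is already running on the local machine that does have internet; the hostnames, user and ports are illustrative rather than what we actually used:

```sh
# From the local machine (which has internet): open a reverse tunnel so the
# EC2 box can reach the local proxy via its own port 3128
ssh -R 3128:localhost:3128 ubuntu@isolated-ec2-instance

# Then, on the EC2 instance: point tooling at the tunnelled proxy
export http_proxy=http://localhost:3128
export https_proxy=http://localhost:3128
sudo -E apt-get update
```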
Additionally, I once accidentally deleted a production crontab by hitting -r instead of -e in the terminal. I initially thought I had goofed pretty badly (which I had!), but knowing how the OS worked and where it logged information, I was able to reconstruct the crontab in a few minutes by checking the syslog files. Plus, why the heck IS the removal flag a single character away from the edit flag, with no interactive prompt by default?!
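The recovery itself is nothing exotic; on an Ubuntu/Debian-style box it looks roughly like this (the paths and the recovered job line are illustrative):

```sh
# cron logs every job it runs to syslog, so the old schedule is recoverable
grep CRON /var/log/syslog | less   # older days live in syslog.1, syslog.2.gz, ...

# Rebuild the schedule from what was actually being executed, then reinstall it
crontab -e
# e.g. a recovered entry might look like:
# 0 2 * * * /opt/scripts/nightly_export.sh
```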
Anyway, as a closing note, I do hope that in future people will consider looking at the path upon which they walk, rather than only the fancy clouds above them, every once in a while; if only for a moment.