Many people today, especially those graduating with a computer science or other computing degree, think: “I know enough now to get a job doing ‘X’, and that’s me set up.” That is not quite the case, however, for at least two reasons:
- Jobs in IT are seldom permanently the same ad infinitum because, like a virus, they mutate or morph into a different form. Many people put the half-life of a job at 24 months or so; over an IT career, one would expect to drift across the computing spectrum of jobs to keep pace with the evolution of computing.
- Imagine moving into a specialist subject like artificial intelligence (AI). Great – but can you imagine writing AI algorithms from graduation age, say 21, to somewhere in your 60s? A frightening thought, being a specialist frozen in time.
The way to avoid this state of affairs is lifelong learning, whereby a person keeps as up to date as possible to ride the job mutation wave and still have a satisfying job they are capable of doing.
Acquiring IT (or digital) skills does not mean there is a single standard level of competence to aspire to, which you either have or haven’t. IT skill is not a binary entity in that sense, and I envisage at least four skill levels for IT-oriented people, from beginner to old timer:
- Awareness of the basic use of IT at the level of computer, data and connections between computers.
- Acquaintance with these elements and the ability to enter a discussion about the use of computing. This, I feel, is the level that non-IT managers and even executives in enterprises undergoing digital transformation should possess.
- Overall IT knowledge, analogous to the know-how acquired by students at medical school, but not at specialist level. It is a mandatory precursor to any IT specialisation.
- Specialist knowledge in a particular area, but only once the person has traversed level three above. Ideally, level four should be provided by the employer, since its requirements of any specialisation will vary from some perceived standard to the organisation’s particular needs.
How can you acquire skills at the level appropriate to your needs? Not by reading weighty tomes at various levels – I have tried that, and often understand every paragraph I read but still fail to grasp the subject.
Experience taught me that it was better to read a few small articles on the subject, maybe more than once, and eventually you should hit that Eureka moment when the topic slips into place.
What follows is what I have learned about learning, over many decades in IT, both at the coalface and later as an author and researcher.
Learn by ‘little, often and varied’
The “little, often and varied” method allows you to get a consensus on the importance, future and usefulness of a topic or product, thereby eliminating bias and self-praise by a topic fanatic. Not only that, this approach allows people to pick up a topic, be it hardware, software or techniques, at various levels of difficulty since the nature of the topic is rarely fully explained in a single article.
Scanning several brief sources very often puts the theme together like the pieces of a jigsaw and the subject becomes clear since you will subconsciously fill in the understanding blanks as you read. If it doesn’t, perhaps you are in the wrong field of endeavour.
Some years ago, I planned to write a book and wrote a glossary for it. The book never happened, but the glossary lived on, was kept current and the result was an Amazon Kindle eBook, though with topics in the original alphabetical order. It was structured with an overview of each topic followed by reference links to articles, books and videos at various levels of difficulty and marked accordingly.
This allows the novice to pick their way through at their own pace and the expert to flex their IT muscles on the heavy stuff. In truth, they will take a peek at the easy stuff and still learn something; I’m sure they will. I learned a lot in compiling it. The eBook is one way of keeping up to date and even learning a topic from scratch.
Remember, learning, like breathing, is a lifelong task. If you think you know it all already, remember this poignant quotation from baseball manager Earl Weaver: “It’s what you learn after you know it all that counts.”
Here is an excerpt on blockchains from my book:
Blockchains are immutable digital ledger systems implemented in a distributed fashion (i.e. without a central repository) and usually without a central authority. At their most basic level, they enable a community of users to record transactions in a ledger that is public to that community, such that no transaction can be changed once published.
This technology became widely known starting in 2008 when it was applied to enable the emergence of electronic currencies where digital transfers of money take place in distributed systems. It has enabled the success of e-commerce systems such as bitcoin, ethereum, ripple and litecoin. Because of this, blockchains are often viewed as bound to bitcoin or possibly e-currency solutions in general. However, the technology is more broadly useful and is available for a variety of applications. [Blockchain Technology Overview, National Institute of Standards and Technology (NIST), p12]
Another informative definition:
Blockchain is a distributed ledger used to maintain a continuously growing list of records, called blocks. Each block contains a timestamp and a link to a previous block. By definition, blockchains are inherently resistant to modification of the data. Once recorded, the data in any given block cannot be altered retroactively without the alteration of all subsequent blocks and a collusion of the network majority.
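The “link to a previous block” in the definitions above is a cryptographic hash, and that linkage is what makes retroactive alteration detectable. A minimal sketch in Python can illustrate the idea; the structure and names here (`new_block`, `block_hash`, `is_valid`) are my own illustration, not taken from the book or the NIST document, and a real blockchain adds consensus, signatures and distribution on top of this:

```python
import hashlib
import json
import time

def block_hash(block):
    # Deterministically serialise the block's contents and hash them
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(data, prev_hash):
    # Each block carries a timestamp, some data and the previous block's hash
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a tiny chain: every block links to the hash of the one before it
chain = [new_block("genesis", "0" * 64)]
for payment in ["Alice pays Bob 5", "Bob pays Carol 2"]:
    chain.append(new_block(payment, block_hash(chain[-1])))

def is_valid(chain):
    # Every stored prev_hash must match the recomputed hash of the
    # preceding block; editing any earlier block breaks all later links.
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(is_valid(chain))                   # True for the untampered chain
chain[1]["data"] = "Alice pays Bob 500"  # retroactive alteration
print(is_valid(chain))                   # False: the link to block 2 no longer matches
```

This is why, as the definition says, changing one recorded block would require re-computing every subsequent block as well, and, in a distributed setting, persuading a majority of the network to accept the rewritten history.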
Terry Critchley’s book, “Modern IT concepts and technology: An IT study guide for beginners and practitioners”, Kindle Edition, is available here.