It's 2021 and the digital native is a myth.

** rewind **

During the 80s and 90s, computers were seriously entering the mainstream. Along with them came the idea that the current generations had better catch up, and fast, lest they be left behind by the next generations, who'd become "digital natives": people born and raised along with computers, who'd be natural experts by osmosis alone. That didn't happen, as anyone who's met those born in the 00s can attest. Sure, lots of people born earlier don't know a thing either, but those prophesied "digital natives" never materialized.

** sidetrack **

Obviously I'm generalizing; there are awesome hackers out there born after 00, but I'm speaking broadly. Even if those could pull up the average, the typical level of proficiency is astonishingly low.

** preload **

When talking about tech proficiency, I mean general understanding, not specific skill with some product. Sure, people can get by _USING_ some products, like smartphones and their apps, such as facebook, twitter, tiktok and snapchat. But the underlying concepts of data storage, privacy and networking are entirely and utterly lost on most. To the point of detriment, I will argue. Having no clue at all as to how stuff works on any abstract level leaves you helpless, vulnerable to scams and exploitation, and just generally way too dependent on others.

** back on "track" **

"When I was young" was during a transition from "computers are simple enough to be understood" to "appliances you just use", but it happened gradually and never entirely. Turning on most computers would literally present you with a programming language. Even if you wanted to use an already existing program, you had to learn at least two commands (there's a short example below), which was GOOD, because it forced you to think about the computer as a machine you could program, even if the only program statements you ever wrote were "load this other program and run it". It instilled the idea that computers were to be programmed as well as used.

** sidenote **

I'm not making the argument that everyone should be a programmer, and even though I think everyone should at least have some idea of how to do it, that's a topic for another time.

**

Using any earlier computer required you to know something about it. You had to learn stuff to be able to do much, and at first, some of that stuff seemed dense and incomprehensible, but the more you learned, the more you saw connections between things, and the more you started understanding. One may say that computers are more user-friendly these days, but I don't entirely agree. Websites and end-user programs are maybe somewhat agreeable and user-friendly, sometimes (but not always, looking at you: Microsoft Office). But try comparing the documentation for a MOS 6502 with that of a modern Intel Core CPU. Have a look at the startup sequence of a Commodore 64 and a modern x86 PC. Try debugging a website with plain HTML and some tables on an Apache server, compared to an HTML+JS+CSS+Angular frontend on a node/mongo/docker stack at some cloud provider. Sure, things get done faster, and more of them, and they generally look better and are easier to use, when they work. But a computer user is just that, a user of a computer, not an app user or appliance user, and a digital native cannot be strictly a consumer; they must be, at some level of competency, able to create, maintain and troubleshoot their computers and software.
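To make those "two commands" concrete: here's, give or take, the whole ceremony for loading and running a program from disk on a Commodore 64, which drops you straight into BASIC V2 at power-on ("GAME" is just a placeholder name; device 8 is the disk drive):

```
LOAD "GAME",8
SEARCHING FOR GAME
LOADING
READY.
RUN
```

Two statements in a programming language, typed at the same prompt you'd use to write programs of your own. The line between using the machine and programming it was exactly one keystroke wide.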
** futile attempt at getting to the point **

So, when you're thrust into the computing experience of 2005 and beyond, you have, first of all, little chance of actually understanding all the stuff that there is.. Furthermore, lots of the ideas and technologies, and why things work the way they do, stem from history that you weren't part of; names and words came to be under technologies long dead and deprecated.. There's a reason some languages have a "print" function, and that function usually outputs text onto the screen, not the printer.. It used to go to the printer (see the footnote at the bottom), and knowing that gives meaning and understanding, instead of simply having to accept things as they are.

When the Internet (capitals, please, show some respect!) started becoming mainstream, problems slowly became apparent, and their solutions were formulated. Anyone remember these RULES for using the Internet?

- Never write your real name ANYWHERE and NEVER tell ANYONE.
- Never tell anyone where you live, or your address.
- Never tell anyone your phone number.
- Never put your picture on the Internet.
- The person you talk to is probably a pedophile, even if their name is Liz and they love ponies too.

This is why you're reading this on DusteD.dk; that's my nickname, which I came up with many years ago when my English grammar was even worse(tm). Everyone had a nick. Everyone kept their identity secret. Everyone was safe. Imagine snapstagram with those rules applied, it'd actually be a better world, wouldn't it? Maybe that's not my point..

** maybe this is **

Back in the bad old days, things were simple for multiple reasons, and their simplicity made them maintainable and understandable; you could learn how they worked to a high degree, because you HAD to. And that same simplicity made it possible to become proficient. New things are made to "just work", and when they do, it feels like (eerie, unpleasant) magic. But when they don't, the causes and solutions are totally opaque.

** admission of failure **

I failed to get my point across. Maybe I don't have a point; maybe I'm just a grumpy old man who thinks people in general should be better with computers.
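** footnote **

Since I brought up "print": this isn't just folklore, and you can still demonstrate it on the same old hardware. In Commodore BASIC, the statement that writes to the screen will happily put ink on paper instead if you point it at device 4, the printer (a minimal sketch, assuming a printer is actually attached):

```
10 PRINT "HELLO"    : REM TO THE SCREEN
20 OPEN 4,4         : REM DEVICE 4 IS THE PRINTER
30 PRINT#4, "HELLO" : REM SAME VERB, NOW ON PAPER
40 CLOSE 4
```

And back when terminals WERE printers (teletypes), "print" meant exactly what it said; the screen came later and kept the name.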