Simplification's Effect on Geekdom

  This is an informal piece I've been working on for a while, whenever I've had some down-time. It's basically a disjointed collection of thoughts on why technology seems less exciting to me these days. Yes, all of the evidence is anecdotal, but I'm not looking to get published in Nature and I'm not necessarily looking to convert anyone to my viewpoint. It's just how I think things do be doing.

  There's been a phenomenon I've been thinking about for a while now. Back when I was a wee lad in school, teachers had Windows XP machines which would (shocker!) constantly bomb out one way or another. Back in those days, if you were "good with computers" you'd offer to help cycle the machine and get kudos that you could then leverage when your dog ate your homework. From talking with teachers I know, three things seem to have happened since I left school:

0. The machines were phased out in favor of Windows 7 machines, and then Windows 10 machines, proving that school IT guys are wise to the suck-good-suck pattern Windows follows (apparently Windows 10 is considered good now?)
1. A lot of teachers have cultivated a bare-level competency with computers; it was described to me as "giving up" and realizing computers are here to stay, for better or worse.
2. The impetus for (1) was that students stopped helping out teachers, and the hold-outs who haven't gotten on board now have to wait for the IT guy to come and unfuck their machine.

  (2) is most interesting to me, as it tracks with a lot of what I've personally seen with zoomers: they have no knowledge of computers and, frankly, appear to be a bit afraid of them. They've grown up in the walled garden, where everything is an app and interfaces are built around a touchscreen. This wouldn't be a bad thing (I also use a smartphone) except that there appears to be an increasing reliance on third-party services to keep the facade going. This isn't because the kids these days are stupid or uncaring; it's just that they came along after UX engineers dropped onto the scene, and companies know that if users are kept ignorant, they'll pay for said third-party services (e.g. paying for Google Drive every month, for eternity). This isn't meant to be a "kids these days" rant, and it isn't to say that there are no geeks in the younger generation, just that there seems to be less impetus for them to grow.

  Note that making the choice to engage in the facade is very different from not knowing anything different. I use a flagship Android phone because It Just Works™ and I don't know where app data is stored; but if I were pressed, I could find out. As a case study, university professors have noticed that even older zoomers no longer understand the concept of a file hierarchy.* I want to distinguish this from, say, late millennials (zillennials) and younger not knowing what a floppy disk is. Not knowing about floppies is not really a big deal, as they aren't used anymore (unless you're the Japanese government) and a floppy was nothing more than a medium for data. Programs changing their design language to make the "save button" an icon of a cloud or whatever instead of a floppy is just a good design choice. The difference is that files still exist, and unless you're using a giant Google Drive cauldron of porn and homework and cat pictures (which I guess is what people these days are doing) you still need to have at least a concept of a file. Without that, you are necessarily tied to a Fisher-Price (remember when people called Windows XP's Luna theme "Fisher-Price?" If only they knew!) level of interaction with a computer, and you cannot escape.
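
  Since I just leaned on "the concept of a file hierarchy," here's the whole mental model in a handful of lines. This is a toy POSIX C sketch of my own (not from the article in question): a hierarchy is just a tree of directories containing files and other directories, and you can walk it recursively.

    #include <dirent.h>
    #include <stdio.h>
    #include <string.h>

    /* Print a directory tree, indenting by depth: the hierarchy
       is literally a tree of directories inside directories. */
    static void walk(const char *path, int depth) {
        DIR *dir = opendir(path);
        if (!dir) return;   /* plain files fail opendir; recursion stops here */
        struct dirent *entry;
        while ((entry = readdir(dir)) != NULL) {
            if (strcmp(entry->d_name, ".") == 0 || strcmp(entry->d_name, "..") == 0)
                continue;   /* skip the self and parent links */
            printf("%*s%s\n", depth * 2, "", entry->d_name);
            char child[4096];
            snprintf(child, sizeof child, "%s/%s", path, entry->d_name);
            walk(child, depth + 1);
        }
        closedir(dir);
    }

    int main(void) {
        walk(".", 0);   /* dump the tree under the current directory */
        return 0;
    }

  Nothing fancy, and that's the point: if the tree itself is invisible to you, even this much is magic.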

  I remember when I got my first electronic device. It was a Nokia 3310 (that was its official name; everyone just knew it as the indestructible phone) that I could play Snake on. A few years later I got an iPod touch, which was my first "modern" device. It ran iOS 4, to give you an idea of when this was. Back then, the Internet was still a place you went to instead of something that you coexist with constantly. A side effect of that is that a lot of the niceties we have become accustomed to required some real work to get, since this wasn't a device you were using 24/7/365.25. If you wanted a blue light filter or a battery percentage next to the battery icon, you had to jailbreak. The main tools were limera1n and redsn0w, and if you messed up you could hard-brick your device. That meant that if you wanted those features, you had to actually learn how to use those tools, and as a side effect you learned a bit about how the hardware authenticated its firmware. This was a great learning experience and probably minted a lot of nerds back in the day. Unfortunately, those days appear to be over. Now, machines of all varieties are so damn easy to use that people see no reason to jailbreak. This is good for accessibility and getting more people online, but a disaster for geekdom.
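
  For flavor, here's roughly the shape of what you absorbed about firmware authentication. This is a toy model I cooked up for illustration, not Apple's actual scheme: the boot ROM holds a known-good digest and refuses to run an image that doesn't match, with FNV-1a below standing in for a real public-key signature check. Jailbreaks like limera1n worked by exploiting bugs to get around exactly this kind of gate.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* FNV-1a: a placeholder for a real cryptographic hash/signature. */
    static uint64_t digest(const uint8_t *data, size_t len) {
        uint64_t h = 0xcbf29ce484222325ULL;
        for (size_t i = 0; i < len; i++) {
            h ^= data[i];
            h *= 0x100000001b3ULL;
        }
        return h;
    }

    /* Toy boot ROM: hash the image you're handed and refuse to jump
       into anything that doesn't match the digest "burned in" at the
       factory. */
    static int boot(const uint8_t *image, size_t len, uint64_t expected) {
        if (digest(image, len) != expected) {
            puts("digest mismatch: refusing to boot");
            return -1;
        }
        puts("firmware accepted, jumping to entry point");
        return 0;
    }

    int main(void) {
        const uint8_t stock[] = "stock firmware image";
        uint64_t expected = digest(stock, sizeof stock);

        boot(stock, sizeof stock, expected);      /* accepted */

        const uint8_t patched[] = "jailbroken image!!!!";
        boot(patched, sizeof patched, expected);  /* rejected */
        return 0;
    }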

  This is the latest in a string of disasters for geeks. While I was too young to experience it, I do believe people when they say the jump from 2D, to 2D sprites in a 3D world, to full 3D was a massive deal in the gaming community. Similarly, even smaller jumps were still exciting to see. Witness the beginning of the 7th generation of game consoles (Halo 3) and compare the graphics to the end of the 7th generation (Halo 4/The Last of Us). Part of the reason we were able to get such amazing improvement intra-gen was because the two main couch gaming consoles were weird. The Xbox 360 had a triple-core 3.2 GHz PowerPC CPU (think old Apple machines), and the PS3 had a Cell Broadband Engine-based CPU, which was positively bizarre (I don't have a source for this, but I have heard that Sony would fly Cell gurus out to studios to educate devs). Devs for both consoles, especially the PS3, had to learn their way around the architectures, and so there was an upward curve in both graphical fidelity and performance which kept people "wowed" for the entire generation. Nowadays, both the PS5 and Xbox Series X/S are based on bog-standard x86 AMD Ryzen CPUs, which simplifies development considerably. Of course, it's now assumed that games will be released in a completely fucked state, so I'm not so sure that simplification was worth it in the end.
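
  To give a feel for just how weird the Cell was, here's a conceptual sketch in plain C (not the actual Cell SDK, whose API I won't pretend to reproduce from memory). Each SPE could only touch its own tiny 256 KB local store, so instead of looping over memory like on x86, you shuttled chunks in and out yourself; memcpy stands in for the DMA transfers.

    #include <stddef.h>
    #include <string.h>

    #define LOCAL_FLOATS 256   /* toy stand-in for an SPE's 256 KB local store */

    /* On a flat x86 memory model you just loop over the whole array. */
    void scale_flat(float *data, size_t n, float k) {
        for (size_t i = 0; i < n; i++)
            data[i] *= k;
    }

    /* On Cell, an SPE couldn't see main memory directly: the programmer
       DMA'd a chunk into local store, worked on it, and DMA'd it back. */
    void scale_chunked(float *data, size_t n, float k) {
        float local[LOCAL_FLOATS];
        for (size_t off = 0; off < n; off += LOCAL_FLOATS) {
            size_t chunk = (n - off < LOCAL_FLOATS) ? n - off : LOCAL_FLOATS;
            memcpy(local, data + off, chunk * sizeof *local);   /* "DMA in"  */
            for (size_t i = 0; i < chunk; i++)
                local[i] *= k;
            memcpy(data + off, local, chunk * sizeof *local);   /* "DMA out" */
        }
    }

  And the real thing was meaner than this: the DMA was asynchronous, alignment mattered, and you double-buffered transfers to hide latency. Hence, presumably, the flown-in gurus.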



* I've seen some discussion that says this article is bullshit. Far be it from me to defend journalists but it does track with my experiences. Call it confirmation bias but I think it's at least very interesting.