September 10,000, 1993

The math: from 1993-09-01 to 2021-01-16 is 9,999 days; +1 to count the 1st makes this day 10,000.
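
To check it, a quick Scheme sketch using the standard Fliegel & Van Flandern Gregorian-to-Julian-day-number formula (any date library would tell you the same thing):

(define (julian-day y m d)  ; Fliegel & Van Flandern, Gregorian calendar
    (let* [ (a  (quotient (- 14 m) 12))
            (y* (- (+ y 4800) a))
            (m* (- (+ m (* 12 a)) 3)) ]
        (+ d (quotient (+ (* 153 m*) 2) 5) (* 365 y*)
           (quotient y* 4) (- (quotient y* 100)) (quotient y* 400)
           -32045) ))

;; day 1 is 1993-09-01 itself, hence the +1
(display (+ 1 (- (julian-day 2021 1 16) (julian-day 1993 9 1))))  ; => 10000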

So, checks out. 10,000 days of the September That Never Ended.

The world since is like a movie showing a few people coughing before the credits, wipe fade, zombie hordes tearing down barricades to eat the brains of the last few people. Someone's shivering in the corner with a gun, for the zombies or self, you can't tell. Freeze frame. "I bet you're asking how we got here…"

Note: I, uh, kinda infodumped here. Estimated reading time: 19 minutes.

What Went Wrong

At the time, I had a nice Gopherhole, finger and .plan (at times with a GIF of me uuencoded into it!), and was already annoyed by the rise of the overcomplicated World Wide Web. But in Feb 1993, UMinn saddled Gopher with threats of a license, which killed the better-organized system, and I was an adaptable guy. For quite a while I had both, with equivalent content mirrored, but then my WWW site got more features, the Gopher hole got stale, and I closed it.

A bunch of new kids invaded USENET every September when school started, and commercial Internet started in '89-91 when NSFNet removed their commercial restrictions, and then fucking AOL unleashed bored neo-nazis from the flyover states on us. There was a vast onslaught of spam, bullshit, and trolls. So I switched from rn which had primitive killfile regexps ("PLONK is the sound of your name hitting the bottom of my killfile"), to trn, which had threading and a little better killfile system, to strn which had scoring so if you hit multiple good or bad keywords, you'd move up or down my queue or vanish. I bailed on all the big groups, tried moderation and was promptly attacked by scumbags who thought the moderation system was for protecting their corporate masters, not stopping spam, and then quit entirely.
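For the kids: a scorefile was basically a pile of patterns with weights; sum the hits per article, sort your queue by the total, and anything under the plonk threshold vanishes. A toy sketch of the idea in Scheme (my illustration, nothing like strn's actual syntax; assumes the srfi-1 and srfi-13 eggs in Chicken 5):

(import srfi-1 srfi-13)  ; fold, string-contains-ci

;; illustrative weights, not a real scorefile
(define score-rules
    '(("MAKE MONEY FAST" . -9999)  ; insta-plonk
      ("scheme"          .    10)
      ("AOL"             .   -20)) )

(define (score-article text)
    (fold (lambda (rule total)
              (if (string-contains-ci text (car rule))
                  (+ total (cdr rule))
                  total))
          0 score-rules) )

(define (triage text)
    (let [ (s (score-article text)) ]
        (cond [(<= s -1000)  'plonk]           ; bottom of the killfile
              [(negative? s) 'down-the-queue]
              [(positive? s) 'up-the-queue]
              [else 'unranked] )))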

We don't even have FAQs now. There's no netiquette at all (ha, Britannica, remember them? Site's probably not been touched since 1999). I hide off to the edges in Mastodon with very aggressive blocking of anyone who looks annoying. The big media sites, Twaddler and Fuckbook, are just poison, an endless scroller of screaming between everyone who wants to feel offended all the time and the Orange Shitgibbon's mob of traitors; I see very little of Twaddler by way of RSS, but I won't go any closer than that.

Gabriel Dropout s1e2: Do you enjoy living like that, always being mad?

The Web. On most sites, there's megabytes of crappy scripts for tracking, style sheets, giant custom fonts instead of banners & buttons burned into GIFs, so a page might take 100MB to show anything. The basic World Wide Web experience of click a link, page shows you slightly formatted text on an unpleasant background, click another link, is unchanged from 1993, but there's a dumpster of shit on top of that. I hate using the Web now, every goddamned page wants to track me, bounce banners up in front of me, demand I approve cookies but don't let me say "DENY ALL FUCK YOU"; and even without cookies, they use fingerprinting to track me.

It doesn't have to be like this. Despite using WordPress, the dumbest and most bloated thing possible, I've tried to keep my site down to a minimal setup, go read the page source, it's just CSS, content, and the search widget. If I ever get around to purging the default CSS, it'll be even lighter. But most people not only don't live up to that ethic, they aggressively want the opposite, the biggest, fattest, most unusable crap site full of autoplaying videos they can make.

Criminals can now use the Internet to attack physical infrastructure, or hold computers hostage with encryption (including in hospitals; some people need a stern talking-to with a 2x4 or a shotgun). Back in the day, RTM's worm was a novel disaster, but fixable. Microsoft's garbage OS was trivially infected with viruses then and now, but back then it didn't matter much; you might lose a few un-backed-up files, not real money.

The Internet as trivial research device seems like it should be good, but what it's meant is that the Kids Today™ don't bother to learn anything, they just look up and recite Wikipedia, which is at least 50-80% lies. They "program" by searching StackUnderflow for something that looks like their problem, pasting it in, then searching again to solve the error messages. Most of them could be replaced with a Perl script and wget. I assume non-programming fields are similarly "solve it by searching", which is why infrastructure, medicine, and high-speed pizza delivery are so far inferior to 28 years ago.

Search was very slow and mostly manually-entered into index sites back in the '90s. Now it's very fast, but only things linked from corporate shitholes actually show up, and spam and SEO poison all the results, so all you really get is Wikipedia, which might have a few manually-entered links at the bottom which might still exist or be in archive.org, or a few links to spam. Try searching for anything, it's all crap.

Vernor Vinge in 1992's A Fire Upon the Deep called a 50,000-years-from-now version of USENET "The Net of a Million Lies". Just a bit of an overshoot on the date, and a massive underestimate of the number of lies.

There's a lot of knock-on effects from the Internet as a sales mechanism. Like, videogames used to get QA tested until they mostly worked; fiascos like Superman 64 were rare. Now Cyberpunk 2077 ships broken because they can patch it over the Internet; it won't be fixed until actual 2077. Sure, not all games. I'm usually satisfied with Nintendo's QA, though even Animal Crossing: New Horizons shipped with less functionality and more bugs than Wild World on the (no patches!) DS cartridge.

What Is Exactly the Same

IRC, war never changes. I used ICB for my social group back then, and we moved from there to Slack. Most technical crap is discussed on IRC, rarely on Slack, Matrix, or Discord (which literally means conflict). Doesn't matter, it's just a series of text messages, because nobody's figured out how to make anything better that lasts.

I'm still using some version of UNIX. If you'd told me in 1993 that I'd be a Mac guy, I'd've opened your skull to see what bugs had infested your brain; Macs were only good for Photoshop and Kai's Power Tools. But Linux never got better, BSD is functional but never got a great desktop, Sun and SGI are dead <loud sustained keening wail>, and Apple bought (or really was reverse-taken-over by) NeXT, with a nice enough BSD-on-Mach UNIX. And the Internet is, largely, UNIX. There was a horrible decade from the mid-'90s to the early '00s when Windows servers were gaining ground, and people were ripping out perfectly good UNIX data centers to install garbage at a huge loss in efficiency because their CTOs got bribed millions by Microsoft. But that tide washed up and back out, taking most of the MS pollution with it. Maybe it won't be back.

I still write web sites in Vim or BBEdit (since 1993: It Doesn't Suck™). Well, I say that, but I'm writing this mostly in the WordPress old text editor, using Markdown. Markdown's new-ish (2004), but behaves like every other text markup system going back to SGML in the '80s and ROFF in the '70s.

What's Good About the Internet

Not fucking much.

Streaming or borrowing digital copies of music, movies, and books is easier than ever. I speak mainly of archive.org, but sure, there's less-legal sites, too. I have access to an infinite library, of whatever esoteric interest I have; I've lately been flipping through old Kilobaud Magazine as part of my retrocomputing; I like the past where just getting or using a computer was hard and amazing. In 1993 those might have been mouldering away in a library basement, if they could be found at all. Admittedly, I hate most new media; nothing's been good enough for Mark since 1999, and really I could put the line at grunge, or maybe 1986 when The Police broke up. But at least it is accessible.

I spent most of today writing new stuff for the Mystic Dungeon, and even with all the overcomplicated web shit, it's a little easier to build a secure, massively parallel message system in JS than it was in C or Perl 30 years earlier. Not by much, but some.

Internet pornography (link barely NSFW?) is a tough one. '70s-80s VHS porn was expensive, flickery, way too mainstream; fine if you liked chunky old guys banging ugly strippers, I did not. DVD porn in the '90s was still expensive, but got much better production, and every niche interest, that was the golden age. But now everything is "free" on the thing-hubs and x-things, but only in crappy 6-minute excerpts stolen from DVD, horrible webcam streams, and the creepifyin' rise of incest porn. Because the Internet enables weird interests, but what if a whole generation have massive mommy/daddy issues? You can in fact pay for good non-incest porn, but payment processors and credit cards make it hard to do, so it's easier to just watch garbage. And then there's prudes and religious zealots who think porn is bad; in the old days, they had the law and molotov cocktails on their side, but now they're impotent, so I guess that's barely a win for the Internet.

What Didn't We Get

The Metaverse. OK, there was and is Second Life, but Linden fucked the economy up, and never made it possible to take your grid and host it yourself without a gigantic effort. There's WebVR and a few others, but they have terrible or no avatars, construction, and scripting tools. We should be able to be scanned and be in there, man, like in TRON.

The Forum. There's no place of polite social discourse. There's hellsites, and some sorta private clubs, and a bunch of abandoned warehouses where people are chopped up for body parts/ad tracking. Despite my loathing of Google, who are clearly trying to implement SkyNet & Terminators and exterminate Humanity, Google+ was OK, so of course they shut it down.

The Coming Golden Age of Free Software That Doesn't Suck. Turns out, almost everyone in "FLOSS", the FSF, and GNU is among the shittiest people on Earth, and those who aren't are chased out for daring to ask for basic codes of conduct and democracy. Hey, you know that really good file system? Yeah, the author murdered his wife, and the "community" is too incompetent to finish the work, so keep using ext, which eats your files. Sound drivers on Linux, 16 years after I ragequit because I couldn't play music and alarm sounds at the same time, still don't work. "Given enough eyes, everyone goes off to write their own implementation instead of fixing bugs"; nothing works, every project just restarts at +1 version every 2-5 years. Sure, you can blame capitalism, but there's a couple of communist countries left; why aren't they making infinitely better software without the noose of the dollar dollar around their necks?

The Grand Awakening of Humanity. This was always delusional, but the idea that increased communication between people of Earth would end war, everyone would come together, align their chakras/contact the UFOs, and solve all our problems. Ha, no, you put 3 people in a chat room and you'll have 5 factions and at least one dead body in a week. As we approach 7 billion people online, many with explosively incompatible and unfriendly views, this is only going to get worse, if that's even imaginable.

Final Rating: The Internet

★★½☆☆ — I keep watching this shitshow, but it's no damn good. Log off and save yourself.

Informational Hygiene Directives

That's what I call my rules around contacting me, and getting a (non-vulgar) reply from me.

This is brought to mind by Wednesday's spam mail reaching my contact address, and why that made me so mad.

  • Casual, "hey what about" messages: Social media, currently @mdhughes@appdot.net — if this changes, it'll be in the About page. I don't always respond, if I do it's within 24 hours but rarely immediate, but I'll probably see it. I may or may not care, this is very low attention span, I may be drunk and posting about Dracula or Godzilla, it's not you, it's me.
  • Do not: IRC messaging, Discord messaging, etc. unless I'm specifically engaged in that activity at that moment, I won't see it, won't care.
  • Sorta: WordPress post replies (and replies from micro.blog) I will only see next time I load my WP dashboard; I use StupidComments.css to hide them on my front page, which I rarely visit anyway. I do appreciate post replies, I'd hit little favstars by them if I could, but they're not allowed to be intrusive.
  • Junk mail, Mailing lists: I have an email address for that on a popular and possibly hostile AI service, I manage junk there, messages to me are unlikely to get thru. This address generates no notifications.
  • Professional email: Only mission-critical services and people who have business to do with me should be using this address. This address does generate notifications.
  • Private email, iMessage, SMS, Slack: You probably don't have this. Unless you're one of a half-dozen people, and if someone else finds it I tell them the correct junk/professional address to use and block them. This gets notifications. The one time I let one of these slip while I was working, tragedy ensued, so I won't do that again.

When I was all business business business numbers, I got at most a couple dozen emails a day on my professional box, from direct reports, management, and interested outside teams, and I hated it, but that was manageable. Since I got The Man's boot off my neck, it's much lower, but I like barriers and being able to utterly ignore stuff outside one box if I feel like it.

Which brings me to today's hilarious idea of email sabbaticals. There's more recent people doing the same, it's not just this one Microsoftie 10 years ago, but I'll address the original.

What is wrong with you? Thousands of emails in 2 weeks (hundreds a day)? Everything you're doing there is wrong. Everyone sending you stuff is playing "my problem is your problem", and it is NOT.

Organize, filter, and delegate.

  • Organize: Use message boxes to put away automated or group content you don't need to pay attention to now. You can read that when you have spare time, or not, because it's not directly affecting you.
  • Filter: Don't let people throw everything into your "must read now" box. Block the people who can't learn. (There's a toy sketch of these first two rules after this list.)
  • Delegate: If you do have a firehose of stuff coming in, you probably can afford to hire someone to read it all and just send the useful parts to you. If you're running an open source project, you're kind of screwed, but there may be volunteers (or you can "voluntell" some overly enthusiastic but less useful contributor). You can also set up a wiki or forum for the Kilkenny Cats solution.
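
In code terms, the first two rules are nothing fancier than this toy router (made-up addresses and header shapes, not any real mail filter's config):

;; Filter: senders who can't learn go straight to the bit bucket.
(define blocked-senders '("pr-flack@example.com" "reply-guy@example.org"))

;; Organize: list mail can be read later, or never; only the rest
;; is allowed anywhere near the must-read box.
(define (route headers)
    (let [ (from    (cdr (assq 'from headers)))
           (list-id (assq 'list-id headers)) ]
        (cond [(member from blocked-senders) 'discard]
              [list-id                       'read-later-box]
              [else                          'must-read-box] )))

(route '((from . "boss@example.com")))                         ; => must-read-box
(route '((from . "dev@lists.example.org") (list-id . "dev")))  ; => read-later-box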

Walt Mossberg had this ridiculous screed about getting hundreds of emails and too many notifications… He's a (now-retired) journalist who does get a lot of legitimate "my problem is your problem" email. But he also complains about birthday notices, CVS pharmacy ads, Starbucks ads… Turn all that shit off! Nobody needs any of that crap.

"A text, or short internet message, on the other hand, seems to demand instant attention, and may even lead to a whole thread of conversation."

No, it does not. Mute, delete, block anyone who can't learn. If people persist in sending you junk, you can't let them have access to a ringing bell.

Videogames and Storytelling Mix like Water and Sodium

At best you get tears & corrosive salt water, at worst you get a sodium explosion.

My philosophy of games:

  1. Games are about environment and gameplay only.
  2. Graphics don't matter much, as long as they communicate.
  3. Character and story are what you bring to it, they should not be part of the game.

So, I just dropped a lot of words there with fuzzy definitions:

  • Games: I mean all of tabletop boardgames, role-playing games, and most often videogames of all genres. There's less difference between the Warlock of Firetop Mountain gamebook and Myst than there is between that gamebook and David Foster Wallace's Infinite Jest. And if you tear out the system from Warlock, you get Advanced Fighting Fantasy or Troika!, which is a very nice little RPG for wandering a weird, almost hallucinatory fantasy world with no book, no defined character, no story.

  • Environment: The world you explore. Some of this uses traditional writing skills for designing non-player characters and describing the tone and events, but also architecture, painting, 3D modelling for designing environments, music for writing soundtracks, foley for making environmental sounds.

    I mentioned this recently in Videogame Exploration, and I want to especially repeat my suggestion of Bernband, which is goofy, low-rez, standee sprites… and one of the most immersive environments I've ever played in.

  • Gameplay: The continuous loop of doing something, getting feedback on what happened (scores, your position, or just your understanding of the environment changes), and then repeating forever. That loop might take milliseconds in action games, or minutes to hours in hard adventures. There's a… fixation? a high… you get from that loop when it works right. "Just one more turn" says the Civilization junkie at 4AM before blowing off work. "Just one more mineshaft" says the Minecraft player. "Just one more quest" says the ESO player. (There's a runnable toy of this loop at the end of this section.)

  • Graphics: This is almost irrelevant, really, despite the huge amount of effort and money spent on it. It doesn't matter if it's text adventures like Colossal Cave Adventure or Infocom's games, character-grids like Rogue and many descendants, 2D or 3D tiled graphical environments like Ultima IV, Super Mario Bros, or Castlevania, painted images along with text like Sierra's King's Quest or the LucasArts SCUMM games, up to 3D FPS graphics like Doom or Elder Scrolls Online. Good gameplay with any graphics is immersive, bad gameplay with perfect graphics is not.

    Easy way to test that: The most popular videogames of all time are: Mario (2D tiles), Zelda (2D & very simple 3D), Minecraft (blocky 3D with the worst programmer-art textures), Animal Crossing (very simple 3D imitating 2D). Graphics-intensive games pop up and vanish, because they're uninteresting.

  • Character: Who you are. In the better kinds of games, this is left blank for you to fill in. If the game engine doesn't accommodate dialogue even as well as Ultima I did, you're a mute wanderer who breaks into people's homes, smashes their crockery looking for coins & drugs/potions, maybe hits X to hear if they have any rumors or leads, then leaves. In action games, very little dialogue is necessary; your weapons speak for you.

    If you can freely define your Character, that interferes with Story. Until recently, at least you could rename your character, but with full voice acting for many games, they either obnoxiously refer to you as "Vestige", "Adept", "Friend", etc., or don't refer to you at all… or don't let you rename your character.

  • Story: This ties in closely with Character: What do you do? If you can wander as you please, make your own fun, whether that's good or harmful to the environment or NPCs, then you have no story, only gameplay. If you can only ride along like an amusement park railroad ride, get a story told to you and then pew-pew-pew to shoot targets, move on to the next stop, you have no gameplay, only story.

    The Disneyland ride model is a big influence, but AAA "games" with story are mostly frustrated Hollywood wastrels in the wrong medium. The obvious recent example is Death Stranding, which has hours of awful cutscenes with Hollywood people who have nothing to do with the game: A mediocre walking simulator/platformer; without the cutscenes, it might even be fun, if tedious.

An unfortunate result of focusing on Story has been forcing the player to make bad dialogue/action choices to advance, staying on the railroad, unable to get out and wander away. Heavy Rain's no-choice "Press X to Cry Jason", rather than manning up and going to look for your lost child.

The now-defunct Telltale Games' Minecraft Story Mode had a painfully fixed main character and plot, and a doomed character, but let you choose social consequences with allies… which were then forgotten in the next chapter.

Early Final Fantasy games had a totally blank slate. FF3 is right on the cusp; it gives you a sandbox to explore, eventually hit a switch to open the next, bigger sandbox, repeat a couple more times, finally a long multi-part endgame and post-game sidequests. The characters have a secret backstory, but you can rename them, give them any job you want, play them however you want. I did one playthrough with boring Warrior, Thief, White Mage, Black Mage, another using Monk/Black Belt, Red Mage/Dragoon, Scholar/Geomancer, Evoker/Summoner. Utterly different gameplay even if I ended up clearing the same dungeons. My bizarro party got to level 99 to fight the giants.

By FF4, the characters and story are locked in place, you can enjoy it or not, and certainly the art's great and I quote "you spoony bard!" all the time, but you have no choices. Not that I'm blaming that all on JRPGs — there's Japanese games with freedom of choice, and Western games fixed on one character, Gabriel Knight is one of the earliest of this archetype.

Gamebooks like Tunnels & Trolls solos, Fighting Fantasy, Lone Wolf, etc. are odd hybrids since they have story, but almost never have a defined character (a few do, like Creature of Chaos). The more linear the gamebook is, the better the story is, but the less interesting it is to play; there's several I've done that had one win and many deaths, and so cannot be replayed. The more meaningful choices they offer, the more incoherent the gamebook becomes, just a bunch of random scenes because you can't build up any meaning like linear fiction does.

My objection to Dungeons & Dragons adventures from Dragonlance (1984) on, is that it went from a game of freeform dungeon crawls, hex crawls, or "West Marches", wandering the Referee's world, maybe loosely using a Greyhawk map or Outdoor Survival, often made up in the days between games or improvised on the spot; to railroaded "adventure paths" with fixed character roles (either named and unkillable like DL or just "must have fighter, thief, cleric, magic-user, bard, or you will fail"). 5E has become entirely that, their healing/action economy even requires a specific pacing along the railroad, and their world maps are just one-path flowcharts you move along like Candyland.

So in conclusion (almost), just say no to story in your games. Look for that infinite high of gameplay.
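
And that loop really is the whole skeleton: strip away graphics, character, and story entirely and it's still recognizably a game. A runnable toy, guess-the-number (Chicken Scheme assumed for the random import; everything else is standard):

(import (chicken random))  ; swap in your Scheme's RNG

(define (game-loop secret turns)
    (display "Guess (1-100): ")
    (let [ (guess (read)) ]                     ; doing something
        (cond [(not (number? guess)) (game-loop secret turns)]
              [(< guess secret)                 ; feedback...
               (display "Higher.") (newline)
               (game-loop secret (+ turns 1))]  ; ...and repeat
              [(> guess secret)
               (display "Lower.") (newline)
               (game-loop secret (+ turns 1))]
              [else (display "Got it in ") (display turns)
                    (display " turns. Just one more game?") (newline)] )))

(game-loop (+ 1 (pseudo-random-integer 100)) 1)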

  • The Devil's Advocate: There are some attempts to make character or story "gameable", rather than just a railroad, most notably Chris Crawford's Erasmatazz, which he then replaced with Storytron, now Wumpus (no relation to the real Hunt the Wumpus game). These have computer-controlled drama, you talk/choose interactions with different "emotional weights", and the NPCs react appropriately. These suck as games. They can be a little interesting as a puzzle to talk to the NPCs, find out what's going on, maybe push one of them into a "win" state. Nobody'd spend long on one.

It's worth looking at Chris's development woes. Between the sequentiality and the fixed list of encounters in Le Morte d'Arthur, he gave up on gameplay: it's a railroad click-thru of Malory's book, with a single fame/piety score for win/lose.

His Gamers or Storytellers seems to be an admission of defeat. Yet he still has bigoted, ignorant ideas like:

This also plays into the old “evolution versus revolution” dilemma. I have long held that games will never evolve into anything with artistic merit, because the gaming audience does not expect artistic content from games. You can’t sell Beef Wellington to people who want candy. You can’t sell poetry to people who read comic books. You can’t sell art-house movies to people who watch cartoons. And you can’t sell artistic content to gamers who want action and instant gratification. Games as a medium are ill-disposed to evolve in a storytelling direction.

This is why he fails. Games can have artistic content, just not inbred Hollywood-imitating content. There is plenty of poetry in comic books, obviously Sandman but many an issue of Detective Comics (the smarter Batman series) has moved me deeply. Many art-house movies are cartoons, or vice versa, or were when theatres were a thing, I'd start with Don Hertzfeldt's Rejected and Ralph Bakshi's Wizards. You can't sell poison apples to gamers, not more than once anyway.

I had a look at his soi-disant "Wumpus", and got this, his "non-technical" user interface. It's incredible to me that this is the guy who made Eastern Front and Balance of Power, which were techy but not a giant wall of UI clickies, badly sized in a window. Yes, it's Java, but you can make attractive and usable Java UI, it just requires effort.

I figured out eventually that you can hit Editor/Run Rehearsal (?) to play in something like a dialog box UI, was able to play through a very dull conversation, and then it gets stuck with Jeff explaining widgets to Sam in an infinite loop. Excellent. Obviously story-gaming is a solved problem. 🙄

Software Principles for 2020

This is both for myself, and to decide what software I'll tolerate in my presence in the future.

  1. No lag. All UI must respond and be responsive again within 100ms. Almost everyone has many CPU cores and a massively parallel GPU not doing much; you can spare ONE core to run your work thread (see the thread sketch after this list). Stop with the long animation shit: 100ms is plenty to show a shadow moving from one place to another, where an interactive UI now lives.
  2. No load screens. If you can't preload "instantly", be functional: show a usable menu while background loading. Media streaming needs to buffer, but you can show a poster frame instead of empty space.
  3. No ads or spyware. If you can't subsidize your software some other way, don't ship software. Or as the late, very lamented Bill Hicks said, "If anyone here is in advertising or marketing, kill yourself!" (And of course there's ads on YouTube, so maybe I need to find a better video hosting system? I know there's a fediverse-based video thing.)
  4. No custom binary formats. Save your data in JSON or some other common format (plist on Mac, etc.), so users can export & manipulate it with their own tools.
  5. No sites without syndication. If you have a web site or blog, you MUST support RSS or Atom, or both. Failure to do so should have you removed from the Internet.
  6. No insecure connections. I know it's hard to add https the first time, and some older services can't easily be wrapped, but every http connection is a chance for false information to be fed to you, your computer to be compromised, your information to be stolen.
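
Principles 1 and 2 mostly come down to "use a thread": do the slow work off the UI loop and show something useful meanwhile. A sketch, assuming Chicken 5 with the srfi-18 egg (load-assets! and the timings are stand-ins):

(import (chicken base) srfi-18)

(define assets #f)  ; set by the worker thread when loading finishes

(define (load-assets!)
    (thread-sleep! 3)             ; stand-in for slow disk/network I/O
    (set! assets 'loaded) )

(thread-start! (make-thread load-assets!))

;; UI loop: never blocks on the load, updates well inside the 100ms budget.
(let loop [ (frame 0) ]
    (print "frame " frame (if assets ": ready, show the real thing"
                                     ": loading, show a usable menu"))
    (thread-sleep! 0.1)
    (when (< frame 40) (loop (+ frame 1))) )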

Adult Engineer Over-Optimization as the Motie Problem

Looking at my Scheme code and the way I customize it, I'm starting to see the real reason evil megacorps (and wannabe evil startups) won't hire even middle-aged programmers or use your favorite weirdo language, they just want young idiots who code Java or Go.

If you think about a standard software career, there's maybe 10 years of a submissive fool badly coding crap languages ^1 like Java, Go ^3, PHP, JavaScript ^4. They just got out of college or self-trained, and can barely copy existing algorithms, let alone think of one for themselves. This is why FizzBuzzTest ^5 is such a good novice coder test: It requires following directions exactly, and slightly competent logic skills, but not much more.

Then maybe 10 years of them being project managers and "architects", running waterfall and GANTT charts; they'll say they're "agile" but then have a giant JIRA repo of "backlog" features which have to be implemented before shipping, weekly 4-hour planning "backlog grooming" meetings, and unrealistic estimates. This is sufficient to build all kinds of horrible vertical prisons of the mind like Azkaban Facebook.

Then they either retire, or are "downsized", and now what? So they work on their own code, do maintenance on old systems, or leave the industry entirely.

If they work on their own, freed of evil megacorp constraints, they're going to end up in something idiosyncratic and expressive, like Scheme, LISP, Forth, or a custom language. Make their own weirdo environment that's perfectly fit to themself, and unusable/unreadable by anyone else.

Case in point, I needed an object model. There's one I like in Gerbil, and Gerbil's blazing fast, but I can't make a full SDL2 library for it yet (Gambit's FFI is hard, I've hit some bugs, and there's a LOT of library to interface to), and I'm using a bunch of other Chickenisms anyway, so I can't really move to it yet. Instead I just made my own simple object library, with a couple macros to hide the ugly reality behind it:

(test-group "Object"
    (test "Object" 'Object (class-name Object))
    (let [ (obj (@new Object))  (bug #f)  (cow #f)  (duck #f) ]
        (test "Object-to-string" "[Object]" (@call obj 'to-string))

        (define-class Animal Object)
        (define-field Animal 'legs 0)
        (define-field Animal 'color #f)
        (define-method Animal 'init (self legs color)
            (set! (@field self 'legs) legs)
            (set! (@field self 'color) color) )
        (define-method Animal 'speak (self)
            (sprintf "The ~A ~A with ~A legs says " (@field self 'color) (class-name (@class self)) (@field self 'legs)) )

        (set! bug (@new Animal 6 "green"))
        (test "bug-legs" 6 (@field bug 'legs))
        (test "bug-color" "green" (@field bug 'color))
        (test "Bug speak" "The green Animal with 6 legs says " (@call bug 'speak))

        (define-class Cow Animal)
        (define-method Cow 'init (self color)
            (@super self 'init 4 color) )
        (define-method Cow 'speak (self)
            (string-append (@super self 'speak) "MOO!") )
        (set! cow (@new Cow "brown"))

        ;; second class to make sure classes don't corrupt shared superclass
        (define-class Duck Animal)
        (define-method Duck 'init (self color)
            (@super self 'init 2 color) )
        (define-method Duck 'speak (self)
            (string-append (@super self 'speak) "QUACK!") )
        (set! duck (@new Duck "black"))

        (test "Cow speak" "The brown Cow with 4 legs says MOO!" (@call cow 'speak))
        (test "Cow to string" "[Cow color:brown;legs:4]" (@call cow 'to-string))
        (test "Duck speak" "The black Duck with 2 legs says QUACK!" (@call duck 'speak))
        (test "Duck to string" "[Duck color:black;legs:2]" (@call duck 'to-string))

        (test "instance-of?"  (instance-of? cow Cow))
        (test "instance-of? parent"  (instance-of? cow Animal))
        (test "instance-of? grandparent"  (instance-of? cow Object))
        (test "instance-of? cousin-false"  (instance-of? cow Duck))
        (test "instance-of? not an obj-false"  (instance-of? "wtf" Cow))
    )
)

The implementation code's not much longer than the tests, but it's not quite done enough to show off; I need to switch my macros into non-hygienic forms so I can get rid of the (self) in define-method, and introduce an Objective-C-like _cmd field for self-reflection and message-not-understood handling. There's always more tinkering to do.
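
If you're wondering what hides behind those @-macros, here's one way to build the skeleton: a class is a name, a parent, and a method table; an instance is its class plus a field table; dispatch walks up the parents. To be clear, this is a from-scratch sketch of the idea, not my actual library (which does fields, @super, and to-string properly); SRFI-69 hash tables assumed:

(import srfi-69)  ; hash tables; Chicken 5 egg

(define (make-class name parent) (vector name parent (make-hash-table)))
(define (class-name c)    (vector-ref c 0))
(define (class-parent c)  (vector-ref c 1))
(define (class-methods c) (vector-ref c 2))

;; walk up the parent chain; message-not-understood lives here too
(define (find-method class msg)
    (cond [(not class) (error "message not understood" msg)]
          [(hash-table-ref/default (class-methods class) msg #f)]
          [else (find-method (class-parent class) msg)] ))

(define (@new class) (cons class (make-hash-table)))
(define (@class self) (car self))
(define (@field self key) (hash-table-ref (cdr self) key))
(define (@field-set! self key val) (hash-table-set! (cdr self) key val))
(define (@call self msg . args)
    (apply (find-method (@class self) msg) self args) )

;; the macro is just sugar for "stuff a lambda into the method table"
(define-syntax define-method
    (syntax-rules ()
        ((_ class msg (self arg ...) body ...)
         (hash-table-set! (class-methods class) msg
                          (lambda (self arg ...) body ...)) )))

(define Object (make-class 'Object #f))
(define-method Object 'to-string (self)
    (string-append "[" (symbol->string (class-name (@class self))) "]") )

(define Animal (make-class 'Animal Object))
(display (@call (@new Animal) 'to-string))  ; inherited => [Animal]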

Which is great for me, but makes my code an undocumented (mostly) new language, unusable by anyone normal. A giant pile of crap Java program, no matter how old, can be "worked on" (more crap piled on top) by any teenage Bro Coder.

All of which brought to mind The Mote in God's Eye, where the Motie Engineers over-optimize everything into a tangled mess, and the Watchmaker vermin are even worse, wiring up everything to everything to make new devices. The threat posed by and solution to Scheme programmers, in your usual authoritarian megacorp scenario, is similar to Watchmakers.


^1 Swift is intended to fit this niche much more than weirdo expressive Smalltalk+C Objective-C was, BDSM ^2 to prevent one from writing "bad" code, but it's not there yet; the reality of low-level software dev can't be simplified as much as Apple wants, and their C++ developers weren't up to the task anyway.

^2 Bondage-Domination-Sado-Masochism; aka strict type systems and code flow analysis, that prevent one from writing "bad" code at the cost of annotating everything with types instead of doing useful work. I'm not kink-shaming people who do that for sex, only those who do it to their own software.

^3 Rob Pike has openly said they can't give a powerful language to newbie Googlers, they mostly just know Java, C, C++, which is why Go is so limited and generic.

^4 Oddly, JS is basically a LISP with really shitty syntax. It's easy to make trivial, broken junk in it, but it's also powerful and expressive if you're an old maniac who understands the Self-based object system.

^5 Oh, fine, but only so I can demonstrate something:

(import (chicken base) srfi-1)  ; any, identity

(define (fizzbuzz-test i n s)  (if (zero? (modulo i n))  (begin (display s) #t)  #f) )
(define (fizzbuzz i)
    (unless (any identity (list (fizzbuzz-test i 3 'Fizz) (fizzbuzz-test i 5 'Buzz)))  (display i))
    (newline) )
(for (i 1 100) (fizzbuzz i))

Totally different structure from the usual loop-if-else repetition and hardcoding of everything, because Scheme encourages coding in small pieces. Of course I wrote my own for macro which expands to a named let loop; there's many like it but this one is mine. More Motie engineering.
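
For the curious, mine is something close to:

(define-syntax for
    (syntax-rules ()
        ((_ (var from to) body ...)
         (let loop [ (var from) ]
             (when (<= var to)
                 body ...
                 (loop (+ var 1)) )))))

(for (i 1 5) (display i))  ; => 12345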

Design Patterns

It is sometimes suggested by well-meaning language enthusiasts that "My language is complete and powerful, so design patterns don't apply here!" Sadly, they are incorrect.

Design patterns happen in every language. The "Gang of Four" Design Patterns book just collected the ones observed in Smalltalk, and ported them to C++, later rewrites to Java, etc. These are not recipes to blindly follow, but examples meant to show you how to find and regularize the ones in your code.

It's somewhat difficult to see them unless you've read Christopher Alexander's books, written a lot of programs in some language, and specifically looked for the places where you repeat a structure for livability's sake. Just as it's hard for an architect to put a path where people will want it, unless they first observe how people live in and move through that space, then convert the ad-hoc trails people follow into paved paths.

Smalltalk is an extremely expressive language (it failed in the market because every ST program is IDE-specific), it has closures, allows you to very trivially make new control structures; it doesn't need a hack like macros because the entire language is that freeform. And this is where the GoF authors observed these paths being made by themselves and other developers, not just in limited BDSM languages like Java.
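
Concretely, "converting trails into paths" looks like this in any expressive language: you notice every file-handling function repeats the same open/use/always-close-even-on-error shape, so you name it. (Scheme here; this particular trail is so well-trodden the standard already paved it as call-with-input-file.)

;; the repeated trail, regularized into a control structure
(define (call-with-open-file path proc)
    (let [ (port (open-input-file path)) ]
        (dynamic-wind
            (lambda () #f)                          ; nothing to do on entry
            (lambda () (proc port))                 ; use
            (lambda () (close-input-port port)) ))) ; always close

;; (call-with-open-file "scores.txt" read)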

So, a little light reading: Christopher Alexander's The Timeless Way of Building and A Pattern Language, and the Gang of Four's Design Patterns.

The Machine Stops

The problem with the Internet… and here I'm referring to (sweeps hand across everything in view) all of this, but to take just current events: Google blocking ad-blockers in Chrome; Google downtime locking people out of their Nest thermostats and "Home"-controlled security systems; horrible prisons of the mind like Twitter and Facebook; and the cacophony of Fediverse drama, over Eugen adding features (better versions of which are already in Pleroma and glitch-soc Fediverse servers), Gab forking Mastodon, client devs making unilateral decisions to block domains while helpless users complain, or anyone having "free speech" (which Eugen in particular is opposed to; I strongly advise against using mastodon.social, find another instance). These are just point-in-time examples; it's been going on for decades (oh, USENET, how we don't miss your flamewars) and will only end with us.

… is people using software they didn't write themselves. No understanding, education, or discipline required. Just install something and it works! It's a product, not a skill! But they don't know how, or why, or why they should not.

"It didn't take any discipline to acquire", in the words of Ian Malcolm/Michael Crichton.

Until the software they rely on shuts down, literally like E.M. Forster's "The Machine Stops", and then weak unskilled mole-people crawl out of the wreckage of machines they never learned to understand, make, or repair, and then die.

My solution is drastic but logically unavoidable: No more software installs. As a child, you get a bare machine with nothing but a machine-language monitor. You learn ML first. You type in a language compiler or interpreter. You build up your own tools. We return to type-in program listings like Compute!, but no binary blobs, it must all be readable, comprehensible source, with design and implementation documentation.

If you want to share software, you need to build up your toolchain to that point yourself. Hopefully by then you've learned to read all patches you install.

Should this be extended to all technology? Information technology has the unique ability to coerce how and what you think; an automobile or an antibiotic does not. There's an argument (in "The Notebooks of Lazarus Long", for instance) that a citizen should be able to make all their own things, "specialization is for insects". But insects are the most successful clade on Earth, and will long outlive us; some specialization is probably acceptable, as long as it's not in the part that controls how you think.

I don't think this civilization can ever do that, it will not make hard changes that inconvenience anyone. I think this horrible Machine will lumber on a few more decades and then we'll all die from it. But maybe isolated tribes will survive, or intelligence will arise in the Machines, or in a few million years another intelligence will evolve, and build new things the right, responsible way. Their history books will describe us as being as foolish and self-destructive as the Easter Islanders.

Tower of Babble

Programmers almost compulsively make new languages; within just a few years of there being computers, multiple competing languages appeared: FORTRAN (1957), LISP (1958), ALGOL (1958), and COBOL (1959).

It proliferated from there into millions; probably half of all programmers with 10+ years of experience have written one or more.

I've written several, as scripting systems or toys. I really liked my Minimal script in Hephaestus 1.0, which was like BASIC+LISP, but implemented as it was in Java the performance was shitty and I had better options to replace it. My XML game schemas in GameScroll and Aiee! were half programmer humor, but very usable if you had a good XML editor. Multiple apps have shipped with my tiny lisp interpreter Aspic, despite the fruit company's ban on such things at the time. A Brainfuck/FORTH-like Stream, working-but-incomplete tbasic, and a couple PILOT variants (I think PILOT is hilariously on the border of "almost useful").

Almost every new language is invented as marketing bullshit based on a few Ur-languages:

  • C++: Swift
  • Java: Javascript (sorta), C#, Go
  • Awk: Perl, Python, PHP, Julia
  • C: Rust
  • Smalltalk: Objective-C
  • Prolog: Erlang, Elixir
  • ALGOL: C, Pascal, PL/1, Simula, Smalltalk, Java
  • LISP: Scheme, ML, Haskell, Clojure, Racket
  • BASIC: None, other than more dialects of BASIC.
  • FORTRAN: None in decades, but is the direct ancestor of ALGOL & BASIC.
  • COBOL: None in decades.

A few of these improve on their ancestors in some useful way, often performance is better, but most do nothing new; it's plausible that ALGOL 68 is a better language than any of its descendants, it just has mediocre compiler support these days.

Certainly I've made it clear I think Swift is a major regression: less capable, stable, fast, or even readable than C++, a feat I would've called impossible except as a practical joke a decade ago. When Marzipan comes out, I'll be able to rebuild all my 15 years of Objective-C code and it'll work on 2 platforms. The Swift 1.0 app I wrote and painfully ported to 2.0 is dead as a doornail, and current Swift apps will be uncompilable in 1-2 years, then lost when Apple abandons Swift.

When I want to move my Scheme code to a new version or any other Scheme, it's pretty simple, I made only a handful of changes other than library importing from MIT Scheme to Chez to Chicken 4 to Chicken 5. When I tested it in Racket (which I won't be using) I had to make a handful of aliases. Probably even CLISP (which is the Swift of LISPs, except it fossilized in 1994) would be 20 or 30 aliases; their broken do iterator would be hard but the rest is just naming.

Javascript is a pernicious Herpes-virus-like infection of browsers and desktops, and nothing can ever kill it, so where it fits the problem, there's no reason not to use it. But there's a lot it doesn't do well.

I was leery of using FreePascal because it has a single implementation (technically Delphi still exists, but it's $X,000 per seat on Windows) and minimal libraries, and in fact when it broke on OS X Mojave, I was disappointed but I-told-you-so.

I'm not saying we should quit making new Brainfuck and LOLCODE things, I don't think it's possible for programmers to stop without radical brain surgery. But when you're evaluating a language for a real-world problem, try moving backwards until you find the oldest and most stable thing that works and will continue to work, not piling more crap into a rickety new framework.

The Biblical reference in the title amuses me, because we know now that it requires no malevolent genocidal war deity scared of us invading Heaven to magically confuse our languages and make us work at cross purposes; anyone who can write and think splinters their thought into a unique language and then argues about it.

The Infocom Implementor's Creed

THE IMPLEMENTOR’S CREED

I create fictional worlds. I create experiences.

I am exploring a new medium for telling stories.

My readers should become immersed in the story and forget where they are. They should forget about the keyboard and the screen, forget everything but the experience. My goal is to make the computer invisible.

I want as many people as possible to share these experiences. I want a broad range of fictional worlds, and a broad range of “reading levels.” I can categorize our past works and discover where the range needs filling in. I should also seek to expand the categories to reach every popular taste.

In each of my works, I share a vision with the reader. Only I know exactly what the vision is, so only I can make the final decisions about content and style. But I must seriously consider comments and suggestions from any source, in the hope that they will make the sharing better.

I know what an artist means by saying, “I hope I can finish this work before I ruin it.” Each work-in-progress reaches a point of diminishing returns, where any change is as likely to make it worse as to make it better. My goal is to nurture each work to that point. And to make my best estimate of when it will reach that point.

I can’t create quality work by myself. I rely on other implementors to help me both with technical wizardry and with overcoming the limitations of the medium. I rely on testers to tell me both how to communicate my vision better and where the rough edges of the work need polishing. I rely on marketeers and salespeople to help me share my vision with more readers. I rely on others to handle administrative details so I can concentrate on the vision.

None of my goals is easy. But all are worth hard work. Let no one doubt my dedication to my art.

—Stu Galley, Infocom

From a Moonmist retrospective.

Also, I loved his Seastalker — I was marginally older than the target audience, and sailed thru it fast, but it combined so many things I like, Tom Swift, Hardy Boys, underwater laboratories (SeaLab 2020 pre-Adult Swim, Man from Atlantis, Voyage to the Bottom of the Sea TV show, etc.), and tactical roguelike combat with the submarine. For years the sticker was permanently attached to my dresser mirror.

What I'm Watching: Appleseed (1988)

As I noted in Alphaville, Appleseed covers similar ground. Been a few years, so I rewatched it.

But back up a bit to the manga. Shirow Masamune's first manga was Black Magic, about a computer-controlled society of animal-people on a habitable Venus, 60 million years ago when the Earth is full of dangerous dinosaurs, and a powerful young sorceress and her friends who hang out at the Onimal bar fighting the AI throughout the solar system. Rogue AI death machines (in that case cute little "M-66" infiltration/assassination robots) are released, death and mayhem ensue, civilization falls because people lazily give up control to the machines. It's a fantastic book, but too silly at times for the message he wanted to send. There is a "Black Magic M-66" anime about just the robots, but set on modern Earth; it's incredibly dumb, though it does have some T&A which young Mark enjoyed.

Appleseed's 4-volume manga is a reboot of similar ideas, set after nuclear war, with an artificial city controlled by an AI "Gaia", populated by bioroids (in the manga, they go into detail about just how artificial they are; the older ones are more machine than biological and tied directly into Gaia) as servants to a fraction of Humanity. But servants with power don't remain servants. Athena, the city administrator bioroid, is torn between wanting to get rid of the Humans entirely and fulfilling the original mission of the city; and ultimately she's just a tool of Gaia. Wasteland survivors have been brought into the city and haven't really been domesticated, but are trying to make the city work. And terrorists want to tear down the system.

The 1988 movie covers the first volume, sort of, and a bit of the others, and doesn't use the appleseed of the title. There's been a bunch of remakes, but the original's the only one that addresses the moral issues at all. The first two CGI films (Appleseed (2004) and Appleseed Ex Machina (2007)) are unspeakably bad action flicks with preposterous mega-boob physics and cartoon blowjob-doll face for Deunan (who is not so endowed in the manga or anime), and while I haven't seen the reboot CGI flick Appleseed Alpha (2014), it's a "prequel" which has nothing to do with the manga. There's also a TV series Appleseed XIII (2013) which is more action flicks about WOO DEUNAN SHOOT GUNS.

I wouldn't classify any of these exactly as "cyberpunk", because they're not about the street finding new uses for the military-industrial complex's technology; they're about the military-industrial complex. Hard SF, and in the original with a political axe to grind against AI.

I plan to reread Ghost in the Shell's 3 volumes of manga as well, and then I'll comment on the competent but over-simplified 1995 movie and the other junk around that franchise, which follows a similar pattern.

So, read comic books for big ideas, kids, don't look at fucking moving pictures. But I'll talk about the moving picture anyway.

Obviously, this is peak '80s. Like more '80s than the '80s were. Big hair, shoulder pads in women's suits, pastel colors, neon, sleek but sharp vehicles instead of little melted blobs, battlesuits that look like perfect Japanese motorcycles instead of piles of scrap metal held together with hot glue. The music is new wave and smooth jazz, what the Kids Today™ call "synthwave" but this is real, not synthetic, synth music. Cel animation is expensive and backgrounds are pretty static, there's none of this bullshit of using 3D CGI with light cel shading to pretend you're drawing something, no, Human animators toiled over every frame. If you don't like the '80s aesthetic, get the fuck out, you're not welcome here.

Cop Karon and artist Freya ("Fleia") are soon separated by her suicide; she felt as trapped in a gilded cage as their pet birds. And as we see later in the film, the city's bioroid administration responds not with kindness and care, but with clinical research on the survivor.

Cute but deadly Deunan (possibly modeled on Markie Post) and cyborg smoothy Briareus (Richard Roundtree in a cyborg bunny face?) are in ESWAT, cleaning up the messes normal cops can't, and a cyborg terrorist getting away and killing a few of their buddies gets them motivated to investigate, though on-screen that largely consists of them wearing trenchcoats, busting down doors, and body-bagging potential informants.

Hitomi, a bioroid who rescued the main characters and many more Humans from the wasteland and acts as their social worker, gets back into the city, in what might be my favorite view of any city: She wakes on a helicopter reflected in solar panels, rushes to the other side to see the city in light. It's only a momentary shot, but makes me think the city might not be so bad. Hitomi's the heart of the manga, and the anime tries its best, with limited screen time. The party at the Onimal bar (a relic from the Black Magic manga) is the only time her faux-Human relations really come up: She loves all her rescued strays, and her would-be boyfriend/pathetic stalker isn't really enough for that love.

The bioroids-as-machines theme isn't touched on much in the anime; those other than Hitomi are shown only as drones or would-be tyrants like Athena. They're DNA-edited and grown in tanks, but just how replaceable most of them are isn't brought up until Athena tries to decide who lives and who dies.

The Human Liberation Front terrorists do eventually discuss their motives and objectives, to get hold of a giant spider-tank which is the prototype for a fleet of spider-tanks to be directly operated by Gaia; then Humanity will be totally cut off from power. But to get it, they have to lock out Gaia, and there's a key for that. A failsafe which, very deliberately, only Human sympathizers can use.

The action scenes in this aren't Gundam quality, and they're not bloody like many later versions, but they're fine for telling the story. The couple of times the terrorists fight up close brings home just how deadly Landmates (mecha) are in close combat and as mobile infantry/artillery. I'm not sure the "BAN LANDMATES" graffiti is ever visible in the anime, but it's constant in the manga, and kind of an in-joke for old anime fans. While the anime has cyborgs with various levels of replacement, there's no robots, which are a major element of the manga, as a thing even lower than bioroids but also threatening to replace Humanity.

Where this falls down is the final sequence inside Gaia; they have maybe 10 minutes to squeeze in half a volume of arguments and action. In the manga, this is a place where Deunan has to make a moral decision which will change the course of Human history: Free will and endless wars, or inhuman tyranny, or is there a third path? Here, it's just resetting a machine, and what the machines think of that isn't discussed.

★★★★☆, it'd be 5 if they'd ever adapted the rest of the manga, but nobody seems interested in making movies with political philosophy against AI control, I wonder why.