Videogames and Storytelling Mix like Water and Sodium

At best you get tears & corrosive salt water, at worst you get a sodium explosion.

My philosophy of games:

  1. Games are about environment and gameplay only.
  2. Graphics don't matter much, as long as they communicate.
  3. Character and story are what you bring to it; they should not be part of the game.

So, I just dropped a lot of words there with fuzzy definitions:

  • Games: I mean all of tabletop boardgames, role-playing games, and most often videogames of all genres. There's less difference between the Warlock of Firetop Mountain gamebook and Myst than there is between that gamebook and David Foster Wallace's Infinite Jest. And if you tear out the system from Warlock, you get Advanced Fighting Fantasy or Troika!, which is a very nice little RPG for wandering a weird, almost hallucinatory fantasy world with no book, no defined character, no story.

  • Environment: The world you explore. Some of this uses traditional writing skills, for designing non-player characters and describing tone and events; but it also uses architecture, painting, and 3D modelling for designing spaces, music for writing soundtracks, and foley for making environmental sounds.

    I recently enough mentioned this in Videogame Exploration, and I want to especially repeat my suggestion of Bernband, which is goofy, low-rez, standee sprites… and one of the most immersive environments I've ever played in.

  • Gameplay: The continuous loop of doing something and getting feedback on what happened (maybe your score or position changes, or just your understanding of the environment), then repeating forever. That loop might take milliseconds in action games, or minutes to hours in hard adventures. There's a… fixation? a high… you get from that loop when it works right. "Just one more turn," says the Civilization junkie at 4AM before blowing off work. "Just one more mineshaft," says the Minecraft player. "Just one more quest," says the ESO player.

  • Graphics: This is almost irrelevant, really, despite the huge amount of effort and money spent on it. It doesn't matter if it's text adventures like Colossal Cave Adventure or Infocom's games, character-grids like Rogue and many descendants, 2D or 3D tiled graphical environments like Ultima IV, Super Mario Bros, or Castlevania, painted images along with text like Sierra's King's Quest or the LucasArts SCUMM games, up to 3D FPS graphics like Doom or Elder Scrolls Online. Good gameplay with any graphics is immersive, bad gameplay with perfect graphics is not.

    Easy way to test that: the most popular videogames of all time are Mario (2D tiles), Zelda (2D & very simple 3D), Minecraft (blocky 3D with the worst programmer-art textures), and Animal Crossing (very simple 3D imitating 2D). Graphics-intensive games pop up and vanish, because they're uninteresting.

  • Character: Who you are. In the better kinds of games, this is left blank for you to fill in. If the game engine doesn't accommodate dialogue even as well as Ultima I did, you're a mute wanderer who breaks into people's homes, smashes their crockery looking for coins & drugs/potions, maybe hits X to hear if they have any rumors or leads, then leaves. In action games, very little dialogue is necessary; your weapons speak for you.

    If you can freely define your Character, that interferes with Story. Until recently, at least you could rename your character, but with full voice acting for many games, they either obnoxiously refer to you as "Vestige", "Adept", "Friend", etc., or don't refer to you at all… or don't let you rename your character.

  • Story: This ties in closely with Character: What do you do? If you can wander as you please, make your own fun, whether that's good or harmful to the environment or NPCs, then you have no story, only gameplay. If you can only ride along like an amusement park railroad ride, get a story told to you and then pew-pew-pew to shoot targets, move on to the next stop, you have no gameplay, only story.

    The Disneyland ride model is a big influence, but AAA "games" with story are mostly made by frustrated Hollywood wastrels working in the wrong medium. The obvious recent example is Death Stranding, which has hours of awful cutscenes with Hollywood people who have nothing to do with the game, a mediocre walking simulator/platformer; without the cutscenes, it might even be fun, if tedious.

An unfortunate result of focusing on Story has been forcing the player to make bad dialogue/action choices to advance, staying on the railroad, unable to get out and wander away. Heavy Rain's no-choice "Press X to Cry Jason", rather than manning up and going to look for your lost child.

The now-defunct Telltale Games' Minecraft Story Mode had a painfully fixed main character and plot, and a doomed character, but let you choose social consequences with allies… which were then forgotten in the next chapter.

Early Final Fantasy games had a totally blank slate. FF3 is right on the cusp; it gives you a sandbox to explore, eventually hit a switch to open the next, bigger sandbox, repeat a couple more times, finally a long multi-part endgame and post-game sidequests. The characters have a secret backstory, but you can rename them, give them any job you want, play them however you want. I did one playthrough with boring Warrior, Thief, White Mage, Black Mage, another using Monk/Black Belt, Red Mage/Dragoon, Scholar/Geomancer, Evoker/Summoner. Utterly different gameplay even if I ended up clearing the same dungeons. My bizarro party got to level 99 to fight the giants.

By FF4, the characters and story are locked in place; you can enjoy it or not, and certainly the art's great and I quote "you spoony bard!" all the time, but you have no choices. Not that I'm blaming that all on JRPGs: there are Japanese games with freedom of choice, and Western games fixed on one character (Gabriel Knight is one of the earliest of that archetype).

Gamebooks like Tunnels & Trolls solos, Fighting Fantasy, Lone Wolf, etc. are odd hybrids: they have story, but almost never a defined character (a few do, like Creature of Havoc). The more linear the gamebook is, the better the story is, but the less interesting it is to play; there are several I've done that had one win and many deaths, and so cannot be replayed. The more meaningful choices they offer, the more incoherent the gamebook becomes, just a bunch of random scenes, because you can't build up meaning the way linear fiction does.

My objection to Dungeons & Dragons adventures from Dragonlance (1984) on is that the game went from freeform dungeon crawls, hex crawls, or "West Marches"-style wandering of the Referee's world (maybe loosely using a Greyhawk map or Outdoor Survival, often made up in the days between games or improvised on the spot) to railroaded "adventure paths" with fixed character roles, either named and unkillable like DL, or just "must have fighter, thief, cleric, magic-user, bard, or you will fail". 5E has become entirely that: its healing/action economy even requires a specific pacing along the railroad, and its world maps are just one-path flowcharts you move along like Candyland.

So in conclusion (almost), just say no to story in your games. Look for that infinite high of gameplay.

  • The Devil's Advocate: There are some attempts to make character or story "gameable", rather than just a railroad, most notably Chris Crawford's Erasmatron, which he then replaced with Storytron, now Wumpus (no relation to the real Hunt the Wumpus game). These have computer-controlled drama: you talk to NPCs, choosing interactions with different "emotional weights", and the NPCs react appropriately. These suck as games. They can be a little interesting as a puzzle: talk to the NPCs, find out what's going on, maybe push one of them into a "win" state. Nobody'd spend long on one.

It's worth looking at Chris's development woes. With the sequentiality and list of encounters in Le Morte d'Arthur, he gave up on gameplay: it's a railroad click-thru of Malory's book, with a single fame/piety score deciding win/lose.

His Gamers or Storytellers seems to be an admission of defeat. Yet he still has bigoted, ignorant ideas like:

This also plays into the old “evolution versus revolution” dilemma. I have long held that games will never evolve into anything with artistic merit, because the gaming audience does not expect artistic content from games. You can’t sell Beef Wellington to people who want candy. You can’t sell poetry to people who read comic books. You can’t sell art-house movies to people who watch cartoons. And you can’t sell artistic content to gamers who want action and instant gratification. Games as a medium are ill-disposed to evolve in a storytelling direction.

This is why he fails. Games can have artistic content, just not inbred Hollywood-imitating content. There is plenty of poetry in comic books: obviously Sandman, but many an issue of Detective Comics (the smarter Batman series) has moved me deeply. Many art-house movies are cartoons, or vice versa, or were when theatres were a thing; I'd start with Don Hertzfeldt's Rejected and Ralph Bakshi's Wizards. You can't sell poison apples to gamers, not more than once anyway.

I had a look at his soi-disant "Wumpus", and got this, his "non-technical" user interface. It's incredible to me that this is the guy who made Eastern Front and Balance of Power, which were techy but not a giant wall of UI clickies, badly sized in a window. Yes, it's Java, but you can make attractive and usable Java UI, it just requires effort.

Eventually I figured out that you can hit Editor/Run Rehearsal (?) to play in something like a dialog-box UI, and was able to play through a very dull conversation, until it got stuck with Jeff explaining widgets to Sam in an infinite loop. Excellent. Obviously story-gaming is a solved problem. 🙄

Software Principles for 2020

This is both for myself, and to decide what software I'll tolerate in my presence in the future.

  1. No lag. All UI must respond, and be ready for input again, within 100ms. Almost everyone has many cores in their CPU and a massively parallel GPU not doing much; you can spare ONE core to run your work thread. Stop with the long animation shit. 100ms is plenty to show a shadow moving from one place to another, where there is now an interactive UI.
  2. No load screens. If you can't preload "instantly", stay functional: show a usable menu while loading in the background. Media streaming needs to buffer, but you can show a poster frame instead of empty space.
  3. No ads or spyware. If you can't subsidize your software some other way, don't ship software. Or as the late, much-lamented Bill Hicks said, "If anyone here is in advertising or marketing, kill yourself!" (And of course there are ads on YouTube; so maybe I need to find a better video hosting system? I know there's a fediverse-based video thing.)
  4. No custom binary formats. Save your data in JSON or some other common format (plist on Mac, etc.), so users can export & manipulate it from their own tools. (A sketch follows this list.)
  5. No sites without syndication. If you have a web site or blog, you MUST support RSS or Atom, or both. Failure to do so should have you removed from the Internet.
  6. No insecure connections. I know it's hard to add https the first time, and some older services can't easily be wrapped, but every http connection is a chance for false information to be fed to you, for your computer to be compromised, for your information to be stolen.
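
For principle 4, a sketch of what that looks like in Chicken Scheme, assuming the medea egg (chicken-install medea), one JSON egg among several; the point is just that anyone's tools can read the result:

(import medea)  ;; assumes the medea egg: alists become JSON objects, vectors arrays

(define prefs
  '((window . ((x . 10) (y . 20)))
    (recent . #("notes.txt" "todo.txt"))))

;; Write greppable, user-editable state instead of a binary blob.
(with-output-to-file "prefs.json"
  (lambda () (write-json prefs)))

;; Read it back, from this program or any other tool:
(define restored (call-with-input-file "prefs.json" read-json))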

Adult Engineer Over-Optimization as the Motie Problem

Looking at my Scheme code and the way I customize it, I'm starting to see the real reason evil megacorps (and wannabe evil startups) won't hire even middle-aged programmers or use your favorite weirdo language: they just want young idiots who code Java or Go.

If you think about a standard software career, there's maybe 10 years of a submissive fool badly coding crap languages ^1 like Java, Go ^3, PHP, JavaScript ^4. They just got out of college or self-trained, and can barely copy existing algorithms, let alone think of one for themselves. This is why FizzBuzzTest ^5 is such a good novice coder test: It requires following directions exactly, and slightly competent logic skills, but not much more.

Then maybe 10 years of them being project managers and "architects", running waterfall and Gantt charts; they'll say they're "agile" but then have a giant JIRA repo of "backlog" features which have to be implemented before shipping, weekly 4-hour "backlog grooming" planning meetings, and unrealistic estimates. This is sufficient to build all kinds of horrible vertical prisons of the mind like Azkaban Facebook.

Then they either retire, or are "downsized", and now what? So they work on their own code, do maintenance on old systems, or leave the industry entirely.

If they work on their own, freed of evil megacorp constraints, they're going to end up in something idiosyncratic and expressive, like Scheme, LISP, Forth, or a custom language. They make their own weirdo environment, perfectly fitted to themself and unusable/unreadable by anyone else.

Case in point, I needed an object model. There's one I like in Gerbil, and Gerbil's blazing fast, but I can't make a full SDL2 library for it yet (Gambit's FFI is hard, I've hit some bugs, and there's a LOT of library to interface to), and I'm using a bunch of other Chickenisms anyway, so I can't really move to it yet. Instead I just made my own simple object library, with a couple of macros to hide the ugly reality behind it:

(test-group "Object"
    (test "Object" 'Object (class-name Object))
    (let [ (obj (@new Object))  (bug #f)  (cow #f)  (duck #f) ]  ;; placeholders, set! once the classes exist
        (test "Object-to-string" "[Object]" (@call obj 'to-string))

        (define-class Animal Object)
        (define-field Animal 'legs 0)
        (define-field Animal 'color #f)
        (define-method Animal 'init (self legs color)
            (set! (@field self 'legs) legs)
            (set! (@field self 'color) color) )
        (define-method Animal 'speak (self)
            (sprintf "The ~A ~A with ~A legs says " (@field self 'color) (class-name (@class self)) (@field self 'legs)) )

        (set! bug (@new Animal 6 "green"))
        (test "bug-legs" 6 (@field bug 'legs))
        (test "bug-color" "green" (@field bug 'color))
        (test "Bug speak" "The green Animal with 6 legs says " (@call bug 'speak))

        (define-class Cow Animal)
        (define-method Cow 'init (self color)
            (@super self 'init 4 color) )
        (define-method Cow 'speak (self)
            (string-append (@super self 'speak) "MOO!") )
        (set! cow (@new Cow "brown"))

        ;; second class to make sure classes don't corrupt shared superclass
        (define-class Duck Animal)
        (define-method Duck 'init (self color)
            (@super self 'init 2 color) )
        (define-method Duck 'speak (self)
            (string-append (@super self 'speak) "QUACK!") )
        (set! duck (@new Duck "black"))

        (test "Cow speak" "The brown Cow with 4 legs says MOO!" (@call cow 'speak))
        (test "Cow to string" "[Cow color:brown;legs:4]" (@call cow 'to-string))
        (test "Duck speak" "The black Duck with 2 legs says QUACK!" (@call duck 'speak))
        (test "Duck to string" "[Duck color:black;legs:2]" (@call duck 'to-string))

        (test "instance-of?"  (instance-of? cow Cow))
        (test "instance-of? parent"  (instance-of? cow Animal))
        (test "instance-of? grandparent"  (instance-of? cow Object))
        (test "instance-of? cousin-false"  (instance-of? cow Duck))
        (test "instance-of? not an obj-false"  (instance-of? "wtf" Cow))
    )
)

The implementation code's not much longer than the tests, but it's not quite done for me to show off; I need to switch my macros into non-hygienic forms so I can get rid of the (self) in define-method, and introduce an Objective-C-like _cmd field for self-reflection, and message-not-understood handling. There's always more tinkering to do.
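
To see the hygiene problem, here's a sketch (illustrative only, with a hypothetical class-add-method! helper, not my actual macro):

;; With hygienic syntax-rules, the macro can't inject a `self` that the
;; method body can see, so every method must declare it explicitly:
(define-syntax define-method
  (syntax-rules ()
    ((_ class msg (self arg ...) body ...)
     (class-add-method! class msg
       (lambda (self arg ...) body ...)))))

;; A non-hygienic macro (er-macro-transformer and friends in Chicken)
;; can bind `self` and an ObjC-style `_cmd` itself, so methods would
;; read (define-method Animal 'speak () ...) instead.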

Which is great for me, but makes my code an undocumented (mostly) new language, unusable by anyone normal. A giant pile of crap Java program, no matter how old, can be "worked on" (more crap piled on top) by any teenage Bro Coder.

All of which brought to mind The Mote in God's Eye, where the Motie Engineers over-optimize everything into a tangled mess, and the Watchmaker vermin are even worse, wiring up everything to everything to make new devices. In your usual authoritarian megacorp scenario, both the threat posed by Scheme programmers and the solution applied to them are similar to the Watchmakers'.


^1 Swift is intended to fit this niche much more than the weirdo, expressive Smalltalk+C hybrid Objective-C ever was: BDSM ^2 to prevent one from writing "bad" code. But it's not there yet; the reality of low-level software dev can't be simplified as much as Apple wants, and their C++ developers weren't up to the task anyway.

^2 Bondage-Domination-Sado-Masochism; aka strict type systems and code flow analysis, that prevent one from writing "bad" code at the cost of annotating everything with types instead of doing useful work. I'm not kink-shaming people who do that for sex, only those who do it to their own software.

^3 Rob Pike has openly said they can't give a powerful language to newbie Googlers, they mostly just know Java, C, C++, which is why Go is so limited and generic.

^4 Oddly, JS is basically a LISP with really shitty syntax. It's easy to make trivial, broken junk in it, but it's also powerful and expressive if you're an old maniac who understands the Self-based object system.

^5 Oh, fine, but only so I can demonstrate something:

(import (chicken base) srfi-1)  ;; identity, any

;; Each test prints its word and returns whether it fired, so the
;; caller knows whether to print the number instead.
(define (fizzbuzz-test i n s)  (if (zero? (modulo i n))  (begin (display s) #t)  #f) )
(define (fizzbuzz i)
    (unless (any identity (list (fizzbuzz-test i 3 'Fizz) (fizzbuzz-test i 5 'Buzz)))  (display i))
    (newline) )
(for (i 1 100) (fizzbuzz i))

Totally different structure from the usual loop-if-else repetition and hardcoding of everything, because Scheme encourages coding in small pieces. Of course I wrote my own for macro which expands to a named let loop; there's many like it but this one is mine. More Motie engineering.
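
If you want the shape of it, a sketch (hygienic here, though mine needn't be; inclusive bounds to match the call above):

;; Expands (for (i 1 100) ...) into a named let that loops i from 1
;; to 100 inclusive.
(define-syntax for
  (syntax-rules ()
    ((_ (var from to) body ...)
     (let loop ((var from))
       (when (<= var to)
         body ...
         (loop (+ var 1)))))))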

Design Patterns

It is sometimes suggested by well-meaning language enthusiasts that "My language is complete and powerful, so design patterns don't apply here!" Sadly, they are incorrect.

Design patterns happen in every language. The "Gang of Four" Design Patterns book just collected the ones observed in Smalltalk and ported them to C++, with later rewrites for Java, etc. These are not recipes to blindly follow, but examples meant to show you how to find and regularize the ones in your own code.

It's somewhat difficult to see them unless you've read Christopher Alexander's books, written a lot of programs in some language, and specifically looked for the places where you repeat a structure for livability's sake. It's like an architect trying to put a path where people will want it: first you observe how people live in and move around the space, then you convert the ad-hoc trails they follow into paths.

Smalltalk is an extremely expressive language (it failed in the market because every ST program is IDE-specific): it has closures, and lets you very trivially make new control structures; it doesn't need a hack like macros because the entire language is that freeform. And this is where the GoF authors observed these paths being made, by themselves and other developers, not just in limited BDSM languages like Java.
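
To make that concrete in Scheme, a minimal sketch: what Java needs a Strategy class hierarchy for is just a procedure argument in a language with closures. The pattern doesn't disappear; it shrinks to an idiom you notice and name.

;; "Strategy" as a closure: the comparison policy is a parameter.
(import (chicken sort))

(define (sort-by key-fn lst)
  (sort lst (lambda (a b) (< (key-fn a) (key-fn b)))))

(sort-by cadr '((a 3) (b 1) (c 2)))  ;; => ((b 1) (c 2) (a 3))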

So, a little light reading: Christopher Alexander's A Pattern Language, and the Gang of Four's Design Patterns.

The Machine Stops

The problem with the Internet… and here I'm referring to (sweeps hand across everything in view) all of this, but to take just current events: Google blocking ad-blockers in Chrome, Google downtime locking people out of their Nest thermostats and "Home"-controlled security systems, horrible prisons of the mind like Twitter and Facebook, and the cacophony of Fediverse drama over Eugen adding features (better features already exist in Pleroma and glitch-soc Fediverse servers), Gab forking Mastodon, client devs making unilateral decisions to block domains despite helpless users complaining, or anyone having "free speech" (which Eugen in particular is opposed to; I strongly advise against using mastodon.social, find another instance). These are just point-in-time examples; it's been going on for decades (oh, USENET, how we don't miss your flamewars) and will only end with us.

… is people using software they didn't write themselves. No understanding, education, or discipline required. Just install something and it works! It's a product, not a skill! But they don't know how, or why, or why they should not.

"It didn't take any discipline to acquire", in the words of Ian Malcolm/Michael Crichton.

Until the software they rely on shuts down, literally like E.M. Forster's "The Machine Stops", and then weak unskilled mole-people crawl out of the wreckage of machines they never learned to understand, make, or repair, and then die.

My solution is drastic but logically unavoidable: No more software installs. As a child, you get a bare machine with nothing but a machine-language monitor. You learn machine language first. You type in a language compiler or interpreter. You build up your own tools. We return to type-in program listings like Compute!'s, but with no binary blobs: it must all be readable, comprehensible source, with design and implementation documentation.

If you want to share software, you need to build up your toolchain to that point yourself. Hopefully by then you've learned to read all patches you install.

Should this be extended to all technology? Information technology has the unique ability to coerce how and what you think; an automobile or an antibiotic does not. There's an argument (in "The Notebooks of Lazarus Long", for instance) that a citizen should be able to make all their own things, "specialization is for insects". But insects are the most successful clade on Earth, and will long outlive us; some specialization is probably acceptable, as long as it's not in the part that controls how you think.

I don't think this civilization can ever do that; it will not make hard changes that inconvenience anyone. I think this horrible Machine will lumber on a few more decades and then we'll all die from it. But maybe isolated tribes will survive, or intelligence will arise in the Machines, or in a few million years another intelligence will evolve, and build new things the right, responsible way. Their history books will describe us as being as foolish and self-destructive as the Easter Islanders.

Tower of Babble

Programmers almost compulsively make new languages; within just a few years of there being computers, multiple competing languages appeared: FORTRAN, LISP, COBOL, ALGOL.

It proliferated from there into millions; probably half of all programmers with 10+ years of experience have written one or more.

I've written several, as scripting systems or toys. I really liked my Minimal script in Hephaestus 1.0, which was like BASIC+LISP, but implemented as it was in Java, the performance was shitty, and I had better options to replace it. My XML game schemas in GameScroll and Aiee! were half programmer humor, but very usable if you had a good XML editor. Multiple apps have shipped with my tiny lisp interpreter Aspic, despite the fruit company's ban on such things at the time. There's also a Brainfuck/FORTH-like Stream, a working-but-incomplete tbasic, and a couple of PILOT variants (I think PILOT is hilariously on the border of "almost useful").

Almost every new language is invented as marketing bullshit based on a few Ur-languages:

  • C++: Swift
  • Java: Javascript (sorta), C#, Go
  • Awk: Perl, Python, PHP, Julia
  • C: Rust
  • Smalltalk: Objective-C
  • Prolog: Erlang, Elixir
  • ALGOL: C, Pascal, PL/1, Simula, Smalltalk, Java
  • LISP: Scheme, ML, Haskell, Clojure, Racket
  • BASIC: None, other than more dialects of BASIC.
  • FORTRAN: None in decades, but is the direct ancestor of ALGOL & BASIC.
  • COBOL: None in decades.

A few of these improve on their ancestors in some useful way, often in performance, but most do nothing new; it's plausible that ALGOL 68 is a better language than any of its descendants, it just has mediocre compiler support these days.

Certainly I've made it clear I think Swift is a major regression: less capable, less stable, slower, and less readable than C++, a feat I would've called impossible except as a practical joke a decade ago. When Marzipan comes out, I'll be able to rebuild all my 15 years of Objective-C code and it'll work on 2 platforms. The Swift 1.0 app I wrote and painfully ported to 2.0 is dead as a doornail, and current Swift apps will be uncompilable in 1-2 years, then lost entirely when Apple abandons Swift.

When I want to move my Scheme code to a new version or any other Scheme, it's pretty simple: I made only a handful of changes, other than library imports, going from MIT Scheme to Chez to Chicken 4 to Chicken 5. When I tested it in Racket (which I won't be using), I had to make a handful of aliases. Probably even Common Lisp (which is the Swift of LISPs, except it fossilized in 1994) would be 20 or 30 aliases; its broken do iterator would be hard, but the rest is just naming.
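
The aliases are the boring kind; illustrative examples, not my actual port list:

;; Define the names your code expects in terms of what the host
;; Scheme provides (hypothetical examples):
(define (add1 n) (+ n 1))   ;; some Schemes spell this 1+, or lack it
(define (sub1 n) (- n 1))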

Javascript is a pernicious Herpes-virus-like infection of browsers and desktops, and nothing can ever kill it, so where it fits the problem, there's no reason not to use it. But there's a lot it doesn't do well.

I was leery of using FreePascal because it has a single implementation (technically Delphi still exists, but it's $X,000 per seat on Windows) and minimal libraries; and in fact, when it broke on OS X Mojave, I was disappointed but not surprised: I told you so.

I'm not saying we should quit making new Brainfuck and LOLCODE things, I don't think it's possible for programmers to stop without radical brain surgery. But when you're evaluating a language for a real-world problem, try moving backwards until you find the oldest and most stable thing that works and will continue to work, not piling more crap into a rickety new framework.

The Biblical reference in the title amuses me, because we know now that it requires no malevolent genocidal war deity scared of us invading Heaven to magically confuse our languages and make us work at cross purposes; anyone who can write and think splinters their thought into a unique language and then argues about it.

The Infocom Implementor's Creed

THE IMPLEMENTOR’S CREED

I create fictional worlds. I create experiences.

I am exploring a new medium for telling stories.

My readers should become immersed in the story and forget where they are. They should forget about the keyboard and the screen, forget everything but the experience. My goal is to make the computer invisible.

I want as many people as possible to share these experiences. I want a broad range of fictional worlds, and a broad range of “reading levels.” I can categorize our past works and discover where the range needs filling in. I should also seek to expand the categories to reach every popular taste.

In each of my works, I share a vision with the reader. Only I know exactly what the vision is, so only I can make the final decisions about content and style. But I must seriously consider comments and suggestions from any source, in the hope that they will make the sharing better.

I know what an artist means by saying, “I hope I can finish this work before I ruin it.” Each work-in-progress reaches a point of diminishing returns, where any change is as likely to make it worse as to make it better. My goal is to nurture each work to that point. And to make my best estimate of when it will reach that point.

I can’t create quality work by myself. I rely on other implementors to help me both with technical wizardry and with overcoming the limitations of the medium. I rely on testers to tell me both how to communicate my vision better and where the rough edges of the work need polishing. I rely on marketeers and salespeople to help me share my vision with more readers. I rely on others to handle administrative details so I can concentrate on the vision.

None of my goals is easy. But all are worth hard work. Let no one doubt my dedication to my art.

—Stu Galley, Infocom

From a Moonmist retrospective.

Also, I loved his Seastalker — I was marginally older than the target audience, and sailed thru it fast, but it combined so many things I like, Tom Swift, Hardy Boys, underwater laboratories (SeaLab 2020 pre-Adult Swim, Man from Atlantis, Voyage to the Bottom of the Sea TV show, etc.), and tactical roguelike combat with the submarine. For years the sticker was permanently attached to my dresser mirror.

What I'm Watching: Appleseed (1988)

As I noted in Alphaville, Appleseed covers similar ground. Been a few years, so I rewatched it.

But back up a bit to the manga. Shirow Masamune's first manga was Black Magic, about a computer-controlled society of animal-people on a habitable Venus 60 million years ago, when the Earth was still full of dangerous dinosaurs, and a powerful young sorceress and her friends who hang out at the Onimal bar fighting the AIs throughout the solar system. Rogue AI death machines (in that case cute little "M-66" infiltration/assassination robots) are released, death and mayhem ensue, and civilization falls because people lazily give up control to the machines. It's a fantastic book, but too silly at times for the message he wanted to send. There is a Black Magic M-66 anime about just the robots, but set on modern Earth; it's incredibly dumb, though it does have some T&A which young Mark enjoyed.

Appleseed's 4-volume manga is a reboot of similar ideas, set after nuclear war, with an artificial city controlled by an AI, "Gaia", populated by bioroids (in the manga, they go into detail about just how artificial they are; the older ones are more machine than biological and tied directly into Gaia) as servants to a fraction of Humanity. But servants with power don't remain servants. Athena, the city's bioroid administrator, is torn between wanting to get rid of the Humans entirely and fulfilling the original mission of the city; and ultimately she's just a tool of Gaia. Wasteland survivors have been brought into the city and haven't really been domesticated, but are trying to make the city work. And terrorists want to tear down the system.

The 1988 movie covers the first volume, sort of, and a bit of the others, and doesn't use the appleseed of the title. There's been a bunch of remakes, but the original's the only one that addresses the moral issues at all. The first two CGI films (Appleseed (2004) and Appleseed Ex Machina (2007)) are unspeakably bad action flicks with preposterous mega-boob physics and cartoon blowjob-doll face for Deunan (who is not so endowed in the manga or anime), and while I haven't seen the reboot CGI flick Appleseed Alpha (2014), it's a "prequel" which has nothing to do with the manga. There's also a TV series Appleseed XIII (2013) which is more action flicks about WOO DEUNAN SHOOT GUNS.

I wouldn't classify any of these exactly as "cyberpunk", because they're not about the street finding new uses for the military-industrial complex's technology; they're about the military-industrial complex. Hard SF, and in the original with a political axe to grind against AI.

I plan to reread Ghost in the Shell's 3 volumes of manga as well, and then I'll comment on the competent but over-simplified 1995 movie and the other junk around that franchise, which follows a similar pattern.

So, read comic books for big ideas, kids, don't look at fucking moving pictures. But I'll talk about the moving picture anyway.

Obviously, this is peak '80s. Like more '80s than the '80s were. Big hair, shoulder pads in women's suits, pastel colors, neon, sleek but sharp vehicles instead of little melted blobs, battlesuits that look like perfect Japanese motorcycles instead of piles of scrap metal held together with hot glue. The music is new wave and smooth jazz, what the Kids Today™ call "synthwave" but this is real, not synthetic, synth music. Cel animation is expensive and backgrounds are pretty static, there's none of this bullshit of using 3D CGI with light cel shading to pretend you're drawing something, no, Human animators toiled over every frame. If you don't like the '80s aesthetic, get the fuck out, you're not welcome here.

Cop Karon and artist Freya ("Fleia") are soon separated by her suicide; she felt as trapped in a gilded cage as their pet birds. And as we see later in the film, the city's bioroid administration does not respond with kindness and care, but with clinical research on the survivor.

Cute but deadly Deunan (possibly modeled on Markie Post) and cyborg smoothy Briareus (Richard Roundtree in a cyborg bunny face?) are in ESWAT, cleaning up the messes normal cops can't, and a cyborg terrorist getting away and killing a few of their buddies gets them motivated to investigate, though on-screen that largely consists of them wearing trenchcoats, busting down doors, and body-bagging potential informants.

Hitomi, a bioroid who rescued the main characters and many more Humans from the wasteland and acts as their social worker, gets back into the city, in what might be my favorite view of any city: She wakes on a helicopter reflected in solar panels, rushes to the other side to see the city in light. It's only a momentary shot, but makes me think the city might not be so bad. Hitomi's the heart of the manga, and the anime tries its best, with limited screen time. The party at the Onimal bar (a relic from the Black Magic manga) is the only time her faux-Human relations really come up: She loves all her rescued strays, and her would-be boyfriend/pathetic stalker isn't really enough for that love.

The bioroids-as-machines idea isn't touched on much in the anime; those other than Hitomi are shown only as drones or would-be tyrants like Athena, and they're DNA-edited and grown in tanks, but just how much of a replaceable part most of them are isn't brought up until Athena tries to decide who lives and who dies.

The Human Liberation Front terrorists do eventually discuss their motives and objectives, to get hold of a giant spider-tank which is the prototype for a fleet of spider-tanks to be directly operated by Gaia; then Humanity will be totally cut off from power. But to get it, they have to lock out Gaia, and there's a key for that. A failsafe which, very deliberately, only Human sympathizers can use.

The action scenes in this aren't Gundam quality, and they're not bloody like many later versions, but they're fine for telling the story. The couple of times the terrorists fight up close brings home just how deadly Landmates (mecha) are in close combat and as mobile infantry/artillery. I'm not sure the "BAN LANDMATES" graffiti is ever visible in the anime, but it's constant in the manga, and kind of an in-joke for old anime fans. While the anime has cyborgs with various levels of replacement, there are no robots, which are a major element of the manga as a thing even lower than bioroids but also threatening to replace Humanity.

Where this falls down is the final sequence inside Gaia; they have maybe 10 minutes to squeeze in half a volume of arguments and action. In the manga, this is a place where Deunan has to make a moral decision which will change the course of Human history: Free will and endless wars, or inhuman tyranny, or is there a third path? Here, it's just resetting a machine, and what the machines think of that isn't discussed.

★★★★☆, it'd be 5 if they'd ever adapted the rest of the manga, but nobody seems interested in making movies with political philosophy against AI control, I wonder why.

What I'm Watching: Trigger Warning with Killer Mike

Rapper Killer Mike does stunts with a social purpose. But unlike, say, Jackass or Dear White People, he's not stupid or preachy, and he's funnier than the supposed professional comedians in those.

E01: Mike tries to live black for 3 days, only buying or using black products from black stores. Cue cruel and sadistic laughter, because that is really damned hard, even in Georgia. The "Figgers" phone is kind of a cheat, because it's obviously an Android made in China, but it's a real small network run by a black kid, Freddie Figgers. The look on Mike's face in the BBQ shop is heartbreaking.

I look a little sideways at his refusal to smoke Mexican weed; I've only ever smoked Washington or Canadian, but surely Mexican can't be that bad, they built a criminal empire on that stuff before legalization.

Still, he makes a good point about how the black community's been economically destroyed. His idea of a good "Black Friday" where everyone tries to buy black is interesting… but impossible where I am.

E02: Mike proposes replacing STEM/liberal arts schools with trade schools, starting with 1st grade. This one Annoying Red-Headed Kid is, like, the worst example of honkie ambition driving everyone else down you can get. Did Mike ship this kid in by asking every school district in the area for their most awful nerd? I predict 100% that ARHK will make a startup that defrauds people, and he'll never go to prison.

"I don't think school teaches you to think. I think school, like prison, teaches you to obey!"

So then he moves on to unemployed adults, and they're unmotivated, so he comes up with a great idea, which I won't spoil. Unfortunately, I find most of the people in his idea too unattractive to be effective.

★★★★☆

The State of Software

On the horrible state of software:

Me Wearing a Scruffy, Profane T-Shirt: "Yeah, man, we should just code in bare metal like back in the '70s! Programmers should control machines, not the other way around! Liberation now!"

On shiny new things:

Me Wearing a Button-Up Dress Shirt: "Superb. Slightly more secure sandboxes in my giant JavaScript application service running on a giant pile of API stacks. I'll upgrade ASAP, I'm sure it won't destroy everything it touches."