The Best SF Author of All Time

So, I can't actually pick one, or even rank ten, but by decade (when they made the works most important to me) it's down to a short list:

  • 1830s: Edgar Allan Poe
  • … long empty stretch …
  • 1890s: H.G. Wells
  • … shorter empty stretch …
  • 1930s: H.P. Lovecraft
  • 1940s: A.E. Van Vogt
  • 1950s: H. Beam Piper, Clifford Simak
  • 1960s: Brian Aldiss, Robert A. Heinlein, Zenna Henderson, Frank Herbert, Andre Norton
  • 1970s: Marion Zimmer Bradley, Katherine Kurtz, Fritz Leiber, Anne McCaffrey, Michael Moorcock, "James Tiptree, Jr", Roger Zelazny
  • 1980s: Mary Gentle, William Gibson, Elizabeth Moon, Rudy Rucker, John Shirley, Bruce Sterling, Walter Jon Williams
  • 1990s: Pat Cadigan, Greg Egan, Neal Stephenson
  • 2000s: Neal Asher, Peter F. Hamilton, Alastair Reynolds
  • 2010s: Mira Grant (aka Seanan McGuire), Martha Wells

I can make an argument for almost any of them to be "my favorite" depending on mood, but Piper might be the winner in a bracket contest. I suspect I'd get down to something like Piper vs. Egan and my head would explode trying to compare Space Viking with Diaspora.

My first pass at this, there were only 3 female authors (Pat Cadigan, Mira Grant, and Martha Wells). Several write only fantasy, which I think less of but do read; and Leiber's and Moorcock's science fiction is not their best work.

Many only had a few great books or short stories surrounded by a giant midlist of dullness, but that's also why Niven, Pournelle, Steven Barnes, Iain M. Banks, Dan Simmons, and Poul Anderson never made it. Several I do list produced dullness after their peak: Gibson objectively wrote only one short story collection & 2 thin novels worth reading, one should not read McCaffrey's post-trilogy extended sequels, and anything Stephenson wrote after The Diamond Age needed an aggressive editor to cut about 2/3 of the text. And yet many continue to write exactly what I like, decades later.

The '60s-'80s really produced a LOT of SF I liked. Was it objectively better? Or was it just "the golden age of SF is 12" which was 1982, so I read what was still in print?

Terminal Condition

I spend half my time, easily, in a command-line terminal running zsh. So a new one, even one on an OS I don't run, is interesting:

There are some modern, nice conveniences in this. It's a little ways behind Mac Terminal.app (based on the NeXTstep Terminal from 1990), and vastly behind iTerm2, but it's more advanced than the usual Linux terminals like rxvt and urxvt, or the cross-platform Alacritty and Hyper.

Between this and WSL2 being a full Linux, it's plausible that the best Linux dev environment now (well, this summer when it's released) is Windows. The Year of the Linux Desktop is 2019, and it is owned by Microsoft®. Can you hear the tiny, distant screams of the FSF cultists?

Comparison based on code, reviews, and a reddit thread with MS devs involved:

  • Scrollback: The single most important thing a terminal can do. MS does this, but doesn't have logging.

    Surprisingly, a lot of them only support a few pages. I keep mine at 10,000 lines or so, which is probably wasteful but so handy; I don't bother logging since my .zhistory keeps everything I typed, and I have Terminal.app and iTerm2 set to not close tabs automatically.

    Alacritty only just added scrollback last year.

  • Prompt Marking: Nope.

    This is a feature that's hard to live without once you've had it: no more paging up trying to find prompt lines (I have a red ANSI-colored prompt and it's still hard to see). In Terminal.app, it's under Edit, Marks, Automatically Mark Prompt Lines, and then ⌘↑ and ⌘↓ move between them. iTerm2 has it enabled by default, and ⇧⌘↑ / ⇧⌘↓ are the keys, which took me some re-learning.

    Nothing else has this, as far as I've seen.

  • Fonts: MS has programming ligatures and displays emoji, finally. Does not support RTL languages.

    I use Fira Code in all my editors and shells, and it's enormously helpful, more readable, and catches bugs: I look for === as a fat-equals symbol in JS, etc.

    Hyper, urxvt, Alacritty support Unicode fonts. rxvt stopped development almost 20 years ago so it barely shows 8-bit fonts correctly.

  • Tabs: MS has tabs! They're currently invisible until you add a second tab, same shit Terminal.app does, which annoys the hell out of me; I don't like UI that reshapes itself, reminds me of T-1000 Terminators (also makes it hard to tile my windows up correctly when they get resized).

    It's not clear if you can drag Windows Terminal tabs around to different windows.

    In iTerm2, I normally keep: First window with tabs for home shell, ssh into my server (running screen, so that has many open shells). Second window with 2 tabs for REPL, editor/running/compiling tasks, and sometimes a third tab for reading local docs. If I need more shells, I usually open them on the first window. I rarely open a third window for monitoring some long-running task; I just drag a tab out to its own window. All terminal windows are stacked on the left side of my screen, because there's no icons under that side of the Desktop.

    urxvt has tabs, but they're kind of a hack, not fully usable.

    Hyper has tabs, but they replace the title bar. Which is cool but also awful like a lot of things it does.

    rxvt and Alacritty don't do tabs, because they insist you use screen or tmux. Which sucks if you want to move a process from one window to another.

  • Profiles: MS supports multiple profiles, so you can use different ones for each task.

    So does Terminal.app, iTerm2, urxvt (but it's buried in a text file config).

    Alacritty, rxvt, and Hyper have a single profile and no UI for changing anything, hope you like editing text files and reloading.

    As far as I can tell, nothing else does automatic profile switching like iTerm2; when I cd to my ~/Code/CodeScheme folder, iTerm2 switches to my dark red transparent profile, including Scheme-specific triggers and copy/paste filtering.

    You can probably do that in urxvt's Perl(!) scripting, but it's not normal or easy.

  • Copy/Paste Filtering: Nope.

    iTerm2 and urxvt both let you set a bunch of regexps that run over lines so selections match correct boundaries, not just space-delimited words.

  • URL Highlighting: Nope.

    iTerm2, Hyper, and urxvt notice URLs and filenames, and let you click on them. In iTerm2, hold down ⌘ and click on any URL or path (like in an ls or find result!) and it does some useful action: Opens the URL in your browser or file path in your editor, by default, but you can configure that in the profile.

  • Custom Keybindings: Sorta? Doesn't seem complete, no idea if there's UI for it, but it does exist in their config.

    Most terminals can do this, but most can only remap a few actions. I like iTerm2's, as usual, which lets you bind any action, menu item, or external program to any key. I mostly just use it to launch different profiles with starting paths & scripts.

    Terminal.app only lets you send specific text for a key.

  • Images: Sorta? Only if they're embedded in fonts.

    This is a neat trick in iTerm2: images. I use imgls all the time to see a thumbnail of every file with details (protip: I changed ls -ld in the script to ls -1Fskd for a more concise listing), and then ⌘-click to open what I want in Acorn; it's better than opening Finder and trying to read a long filename under a thumbnail.

    I'm unaware of anyone else being able to do this.

Tower of Babble

Programmers almost compulsively make new languages; within just a few years of there being computers, multiple competing languages appeared:

From there they proliferated into millions; probably half of all programmers with 10+ years of experience have written one or more.

I've written several, as scripting systems or toys. I really liked my Minimal script in Hephaestus 1.0, which was like BASIC+LISP, but since it was implemented in Java the performance was shitty and I had better options to replace it. My XML game schemas in GameScroll and Aiee! were half programmer humor, but very usable if you had a good XML editor. Multiple apps have shipped with my tiny lisp interpreter Aspic, despite the fruit company's ban on such things at the time. There's also a Brainfuck/FORTH-like Stream, a working-but-incomplete tbasic, and a couple of PILOT variants (I think PILOT is hilariously on the border of "almost useful").

Almost every new language is invented as marketing bullshit based on a few Ur-languages:

  • C++: Swift
  • Java: Javascript (sorta), C#, Go
  • Awk: Perl, Python, PHP, Julia
  • C: Rust
  • Smalltalk: Objective-C
  • Prolog: Erlang, Elixir
  • ALGOL: C, Pascal, PL/1, Simula, Smalltalk, Java
  • LISP: Scheme, ML, Haskell, Clojure, Racket
  • BASIC: None, other than more dialects of BASIC.
  • FORTRAN: None in decades, but is the direct ancestor of ALGOL & BASIC.
  • COBOL: None in decades.

A few of these improve on their ancestors in some useful way, often performance is better, but most do nothing new; it's plausible that ALGOL 68 is a better language than any of its descendants, it just has mediocre compiler support these days.

Certainly I've made it clear I think Swift is a major regression, less capable, stable, fast, or even readable than C++, a feat I would've called impossible except as a practical joke a decade ago. When Marzipan comes out, I'll be able to rebuild all my 15 years of Objective-C code and it'll work on 2 platforms. The Swift 1.0 app I wrote and painfully ported to 2.0 is dead as a doornail, and current Swift apps will be uncompilable in 1-2 years, and lost when Apple abandons Swift.

When I want to move my Scheme code to a new version or any other Scheme, it's pretty simple, I made only a handful of changes other than library importing from MIT Scheme to Chez to Chicken 4 to Chicken 5. When I tested it in Racket (which I won't be using) I had to make a handful of aliases. Probably even CLISP (which is the Swift of LISPs, except it fossilized in 1994) would be 20 or 30 aliases; their broken do iterator would be hard but the rest is just naming.

Javascript is a pernicious Herpes-virus-like infection of browsers and desktops, and nothing can ever kill it, so where it fits the problem, there's no reason not to use it. But there's a lot it doesn't do well.

I was leery of using FreePascal because it has a single implementation (technically Delphi still exists, but it's $X,000 per seat on Windows) and minimal libraries, and in fact when it broke on OS X Mojave, I was disappointed but I-told-you-so.

I'm not saying we should quit making new Brainfuck and LOLCODE things, I don't think it's possible for programmers to stop without radical brain surgery. But when you're evaluating a language for a real-world problem, try moving backwards until you find the oldest and most stable thing that works and will continue to work, not piling more crap into a rickety new framework.

The Biblical reference in the title amuses me, because we know now that it requires no malevolent genocidal war deity scared of us invading Heaven to magically confuse our languages and make us work at cross purposes; anyone who can write and think splinters their thought into a unique language and then argues about it.

Lost Treasure

In 1979, I learned to program in BASIC on a TRS-80 Model I. Sometime in the next year, I read one of my first programming books:

I played Monster Chase and Lost Treasure, modified them extensively, and combined them, so the cave on the island had a monster chase to reach the exit. I recall having problems getting Starship Alpha and Devil's Dungeon to work, but they joined my software library eventually.

One of my earliest and happiest programming memories was sitting at the dining room table, reading Monster Chase, and writing out a smarter movement system and obstacles in a notebook; at the time the only computers were at school, so I wrote code on paper and typed it in later.

So when I found the book again on archive.org last night, I was very excited, and had to reimplement it. I actually typed this into Pythonista on my phone with the PDF open on an iPad, only moved it to the computer to do some final cleanup and upload it.

The book suggests some modifications, and I did some minor ones: Lowered the movement error to 10%, and risk of shark attack to 10%, rising by 1.5x rather than a flat +50% each time; being anywhere near the island edge killed you too often in the original. I also don't move you out of the water automatically, that should cost a turn.

I realized in converting it that I hate, hate, hate Row,Column coordinates instead of Cartesian X,Y; tons of mainframe-era computing resources used Row,Column, and you can still see it in some APIs like Curses. Note that the original program is 74 lines, mine's 214; BASIC is a terrible language, but it's terse.

I could adapt this into another doorgame for my Mystic Dungeon BBS, but I'm not sure what the multiplayer aspect would be, and it has limited replayability without doing some randomization.

Twitterversary

A day that will live in infamy: Twitter emailed me to make sure I knew I joined Twitter 11 years ago today (really?). And then put a banner in front of my notifications (which I still see even if I don't read my timeline), so I said fuck it and hit post, made the swamp a little shittier.

But. 11 years ago, Twitter was really fun. WWDC lunch & event planning, and other nerds finding our weird Objective-C hobby useful and profitable, and all the weird social events which even antisocial nerds would enjoy because it was software-mediated. The normals hadn't really found their way there yet.

At first there was just a little post form on a page, and you had to reload to get updates. Then nice clients came out, like Twitterrific (great for just reading the stream, invented "tweet" and the bird icon, his name is Ollie), Twittelator (great for lists and filtering), and Tweetie (neat UI design, invented pull-to-refresh). And favrd, which was like a leaderboard for funny Twitter.

Then everything started to go wrong. Normals and their predators got on, and humor took a nosedive as thieves stole jokes and reposted memes. Twitter started making their web app usable, and limiting their API, and telling the client devs to go away. Eventually they bought Tweetie and mangled it and then killed it, because everyone at Twitter is too stupid and tasteless to maintain good software.

I've told before about the time around App.net, and moving on to Mastodon. I'm still there, generally quite happy with it; there's a bunch of App.net refugees around. I'm kinda sad a bunch of people I like are still on Twitter, there's a hell of a good world out here away from all of that.

Return of the Objective-C Jedi

[[[ These ]]] are your father's square brackets, the weapons of a Jedi Knight.
Not as clumsy or random as C++.
Elegant weapons for a more civilized age.

What's Different in Mulle-ObjC

This is like Objective-C circa 2010(?), good but not fully baked. Far better than circa 1986-2009, when it was a very thin translation layer over C.

  • No ARC (Automatic Reference Counting). This is just invisible sugar to hide retain/release/autorelease, and while ARC's convenient, it's trivial if you actually know how reference counting works. Don't really miss it.
  • No dot property syntax. [[myObj name] length] instead of myObj.name.length, and [myObj setName:newName] instead of myObj.name = newName. I can live with it, but I really did like dot syntax, even if it does "overload" the . operator and hide the distinction between methods and variables.
    • When dot syntax came out, Objective-C nerds came close to fistfights over this. You would not believe the venom some people had for it. Most of those nerds died or quit or got old & tired before fucking Swift came around, I guess.
  • No array syntax. [myList objectAtIndex:i] instead of myList[i]. This is a pain in the ass, I'll have to write some shorthand macros (or rather, go dig them out of my very oldest code); something like the first sketch after this list.
  • No blocks. This one hurts, but it's a reasonable pick-your-battles decision. Classic: Write a method, dispatch to it, and call back success somehow. Blocks: create a weakSelf reference, enclose it, search-replace self in your block, pick one of a half-dozen complex GCD methods, get a memory leak because you retained something across the block boundary. This is annoying but logically simpler:
    [self performSelectorInBackground:@selector(computeData:) withObject:inputData];

    - (void)computeData:(id)inputData {
        // create outputData
        [self setOutputData:outputData];
        [[NSNotificationCenter defaultCenter] postNotificationName:NOTI_DataComputed object:self];
    }
    
  • Has object literals: @42 and @(var) create an NSNumber, @[] creates an NSArray, @{} creates an NSDictionary; dicts use key:value order, not the reverse order used in -[NSDictionary dictionaryWithObjectsAndKeys:], and array and dicts don't need a trailing nil, which was a constant source of mystifying bugs back in the day. Big win!
    • Hmn, crashes if you do something janky like [@[] mutableCopy]: mulle_objc_universe 0x1006adef0 fatal: unknown method 5e1b0403 "-getObjects:range:" in class 7aa0d636 "_MulleObjCEmptyArray"
  • Has for (id x in container) loops, using NSFastEnumeration. The 1.0 process of looping enumerations was awful (see the second sketch after this list), so this is very nice.
  • Huh, does have @autoreleasepool, so maybe I should use that instead of NSAutoreleasePool like a caveman? It compiles and seems to work.
  • Properties have the attributes assign/retain, nonatomic/atomic, nonnullable, readonly; the default is assign nonatomic, no "nullable" or "readwrite" flags needed. As it should be.
  • Weird isa define instead of pointer: blog post
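
That shorthand for the missing array syntax doesn't need to be anything fancy, just macros wrapping the long-form messages; a minimal sketch (AT and ATPUT are my placeholder names, not anything mulle-objc ships):

// Hypothetical shorthand for the missing subscript syntax:
//   AT(a, i)       ~  a[i]
//   ATPUT(a, i, x) ~  a[i] = x  (on an NSMutableArray)
#define AT(a, i)        [(a) objectAtIndex:(i)]
#define ATPUT(a, i, x)  [(a) replaceObjectAtIndex:(i) withObject:(x)]

// id first = AT(myList, 0);
// ATPUT(myList, 0, @"new value");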
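
And for the enumeration point: if you never lived through the 1.0 way, the difference looks roughly like this (a sketch; container is whatever collection you're walking):

// The old way: ask the collection for an NSEnumerator and pull objects one at a time.
NSEnumerator *e = [container objectEnumerator];
id x;
while ((x = [e nextObject])) {
    NSLog(@"%@", x);
}

// With NSFastEnumeration, the same loop:
for (id x in container) {
    NSLog(@"%@", x);
}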

TODO

  • I haven't set up an NSRunLoop or the equivalent of NSApplication (which is in AppKit, not Foundation), need to do that and then I'll have a working app template.

Writing Objective-C with Mulle-ObjC

mkdir CLICalc
cd CLICalc
mulle-sde init -m foundation/objc-developer executable

This takes more or less forever.

… Still going. OK, finally done. I hate to think it's gonna do that every new project? Or whenever it updates?

Anyway, bbedit . (fuck Xcode), and add at the bottom of import.h and import-private.h:

#import <Foundation/Foundation.h>

Make src/main.m useful:

// main.m
#import "import-private.h"
#import "CLICalc.h"

int main(int argc, char *argv[]) {
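    // No ARC in mulle-objc: create the autorelease pool by hand and release it at the end.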
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CLICalc *calc = [[CLICalc alloc] init];
    [calc push:42.0];
    [calc push:69.0];
    double a = [calc pop];
    double b = [calc pop];
    NSLog(@"a=%f, b=%f", a, b);

    [pool release];
    return 0;
}

Create an Objective-C class, src/CLICalc.h:

// CLICalc.h
#import "import-private.h"

@interface CLICalc : NSObject

@property (retain) NSMutableArray *stack;

- (void)push:(double)n;
- (double)pop;

@end

and src/CLICalc.m:

// CLICalc.m
#import "CLICalc.h"

@implementation CLICalc

@synthesize stack = _stack;

- (id)init {
    self = [super init];
    if (self) {
        _stack = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)dealloc {
    NSLog(@"CLICalc dealloc");
    [_stack release];
    [super dealloc];
}

- (void)push:(double)n {
    [_stack addObject:@(n)];
}

- (double)pop {
    if ( ! [_stack count]) {
        // ERROR: stack underflow
        return 0.0;
    }
    double n = [[_stack lastObject] doubleValue];
    [_stack removeLastObject];
    return n;
}

@end

Doing that without a template was a little hard on the old memory, and I had to use DDG to look up some method names without autocompletion. But I'm pretty sure that's fine.

In mulle-ide, type update to add the new class to cmake: If you look in cmake/_Sources.cmake you should now see CLICalc.m listed.

Now craft to compile. You'll get a spew of crap, but hopefully no errors.

I am getting this, which I can't resolve:

/Users/mdh/Code/CodeMac/CLICalc/src/main.m:21:55: warning: 'NSAutoreleasePool'
      may not respond to 'init'
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                                   ~~~~~~~~~~~~~~~~~~~~~~~~~ ^
1 warning generated.

But NSAutoreleasePool certainly has init, and it seems to not die?

% ./build/Debug/CLICalc
a=69.000000, b=42.000000

Hooray!

Yeah, this isn't amazing. Except: It's supposedly portable now. I can maybe rebuild this on Linux, or Windows? I dunno.

This is almost classic Objective-C, slightly enhanced from 1.0: We didn't have property/synthesize, or nice object wrappers like @() when I were a lad. I typed so many [NSNumber numberWithInteger:n]. So get used to the retain/release/autorelease dance. There's no dot-syntax for property access, type them [] like old-school. But hey, it's a proper compiled language with a nice object system and no GC pausing.
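
If you've only ever known ARC, the dance goes roughly like this (a generic sketch, not code from the project above; stack and label are just illustration):

// You own what you alloc/new/copy/retain; balance each with a release (or autorelease).
NSMutableArray *stack = [[NSMutableArray alloc] init];   // owned, retain count 1

NSNumber *n = [NSNumber numberWithInteger:42];   // convenience constructor: autoreleased, not owned
[stack addObject:n];                             // the array retains what it stores

NSString *label = [[NSString alloc] initWithFormat:@"pushed %@", n];
// ... use label ...
[label release];    // balance the alloc/init

[stack release];    // releases the array, which releases its contents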

I tried importing Cocoa and got a ludicrous spew of errors, so Mac GUI is gonna be a challenge. But I could import SDL and use that for portable UI, since Objective-C is just C.

Sweet. I'll finish up the calculator's parser in a bit, but then see about doing something useful in it.

Spread of Terrible Programming Languages

Abstract—The English-like business programming language COBOL saw widespread use from its introduction in 1960 well into the 1980s, despite being disdained by computer science academics. This article traces out decisions made during COBOL’s development, and argues that its English-like appearance was a rhetorical move designed to make the concept of code itself more legible to non-programming management at computer-using companies.

I found some of the references much more interesting than the paper, which is a pretty high-level history avoiding the actual boots on the ground details.

COBOL was designed (and fought over very hard on this point) so that unskilled managers could "read" it, but in my view that had little to do with its spread. Middle management where that would matter has no buying power, and executives won't read more than a sentence on a slideshow.

Ubiquity made much more of a difference; no two computer installations were compatible until the late '60s, so the alternatives were COBOL, FORTRAN, LISP, and a hundred weird languages invented at each facility. Given those choices, I'd pick FORTRAN or LISP, but even COBOL would beat rewriting on every machine. A bunch of companies and government agencies ended up clustered on that choice, so it became widespread, not on any merits but because programmers could move code semi-automatically.

I know this because it happened at least five more times that I can think of, and only once with unskilled readability as a goal:

  1. BASIC is a tutorial language for children, very poor for large programs, very slow compared to C or ASM, grossly inferior to Pascal or Logo for any role. BASIC became ubiquitous because it can be implemented in a few K of RAM and worked nearly the same on hundreds of incompatible timesharing and microcomputer systems.
  2. Java is a mediocre Objective-C/Smalltalk replacement, applets turned out to be too heavyweight for the web and insecure, but cross-platform on servers turned out to be very valuable; cross-compiling C++ is a total crapshoot. Developers can have nice Macs and still compile Java code that runs on non-Mac servers.
  3. Linux (not a language, I know, but same pattern) is hot garbage, the product of a drunk, belligerent Finnish student putting a kernel that'd get him a failing grade in an OS class on his 386. But because it's so quarter-assed and has no device driver support, it runs on anything like a virus. So now UNIX is all but dead, killed by a nematode parasite that fills the niche.
  4. PHP is a cruel joke, a gross hack to put server-side script in HTML instead of generating HTML in code or templating. But it was easily installed in Apache, runs everywhere with no setup. So half the web runs on this shit, from WordPress to Facebook.
  5. JavaScript started life as a six-week hack to get LISP & Self-like programming, with C-like syntax for marketing reasons, in a web browser. And until the early 2000s, it wasn't portable enough for anything useful. But when IE died and the other browsers implemented ECMAScript consistently, it became the universal language. It's still weird and fragile; I don't dare write it without eslint. But it may be the language of the century.

There's the similar case of IBM PC/DOS/Windows vs microcomputers and Macintosh, which were better tools but fragmented, but that's more about central authorities imposing Nazi-supporting IBM, and convicted criminal organization Microsoft bribing and extorting to kill competition. Common languages would likely have been enough to keep competition and diversity going if IBM & MS had been burned to the ground and their scatterlings shot as they ran back in the '70s.

The author of the paper sort of slouches in this direction but doesn't quite get it, when pointing out how science and technical culture has standardized on English. We are all incompatible machines, but a common language lets us argue.

I hate when papers list references without URLs:

  1. 10 PRINT CHR$(205.5+RND(1)):GOTO 10: Fun little book, not at all relevant to the paper.
  2. N. Wardrip-Fruin, Expressive Processing
  3. M.C. Marino, Critical Code Studies
  4. B. Shneiderman, "The Relationship Between COBOL and Computer Science"
  5. J. McCarthy, "Memo To P. M. Morse: A Proposal For A Compiler" Memo CC-56
  6. D. Nofre, M. Priestley, and G. Alberts, "When Technology Became Language: The Origins Of The Linguistic Conception Of Computer Programming, 1950–1960"
  7. M.D. Gordin, Scientific Babel: How Science Was Done Before And After Global English

End of 2018

Let's watch Poseidon — Only available on Netflix until tomorrow! Normally I watch Strange Days, but I feel an upside-down sinking ship is a more accurate metaphor for the year than failed love and revolution and pretty Angela Bassett. Maybe for Chinese New Year (Feb 5), Gabriel Dropout's New Year/armageddon episodes.

I don't go super intimate online, but it's been a rough year. I've lost a friend and two of my last few relatives to cancer, my dad's had some close calls, and his dog died. Doing any kind of work under the stress load is… not great. And I'm not a good friend or coworker in this state. My new puppy is a terror, both looks and behavior like a jackal puppy, but the one really good thing.

I touched on the state of my software yesterday. This is the year a new Perilar rises from the ashes, and Learn2JS is moving along nicely; I think that's going to be a big deal, it's a sweet environment.

I goofed off yesterday and started writing tbasic, a Tiny BASIC interpreter in C, because that's a useful thing to do! I've done this before, but made a messy parser. The new one is a tiny single file and much cleaner. Might be published tomorrow morning sometime. While nobody needs BASIC, it's good C programming exercise, and I can link in SDL2 and give it cross-platform graphics and sound, which is actually kinda neat.

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
—Edsger W. Dijkstra, EWD 498: How do we tell truths that might hurt?
[mdh: In case you can't read the paper and get the joke, he's joking. Sort of.]

I got a little writing in on Delvers in Darkness, I'm thinking about more adventures for it, solo gamebooks and Refereed.

Poseidon is really terrible already. Everyone's a ridiculous caricature. Oh, this is gonna be a good shipwreck.

What I'm Watching and Criticizing: The Good Place

So, up front: This is a trashy show in a lot of ways that's trying to be much, much higher and mostly failing. It exists so someone who wasted their college tuition reading philosophers can pick up a paycheck name-dropping Kant in each script; kudos to that guy for finding a way to make philosophy pay. (Disclosure: I also read philosophers in college and since, mostly on my own time, and never tried to extract money for it.) But it is not written as a philosophy treatise, though it occasionally tries; mostly it's just a dumb sitcom.

The main cast are a trashy girl-next-door mean chick, a hot chick with an English accent I hate, a philosophy nerd (irony/shitty writing: black guy, entirely teaching from texts written by old white guys, all but the latest of whom kept slaves; not a single non-honky philosophy is ever discussed), and a moron, trying to survive an afterlife where they don't quite seem to belong, run by well-past-sell-date Ted Danson and a slightly frumpy robot girl (who says she's neither), in standard sitcom cycles (literally: There's mental reboots that happen so episodes can restart at the beginning), though it does change up the formula eventually. I do like the hot chick and the mean chick; they have character. Maybe the robot girl, even limited by her role. Sadly, the nerd is one-note, the moron is barely able to breathe in and out without electric shocks, the ancient stick-figure of Ted Danson is stiff and overacts when he does break being stiff.

The key premise of the show is that you earn "points" by your actions in life, which sorts you into "The Good Place" or "The Bad Place". There's, uh, roughly everything wrong with this.

Obviously first, there's no magical afterlife. It makes no sense: There's no evolutionary advantage to an afterlife, and Humans being the only animals who can rationalize and make up stories to deal with our fear of death is infinitely more likely than that a magic sky fairy suddenly gifted Homo sapiens with an invisible remote backup system. When you die, your brain patterns rot and the program that was you ceases to be recoverable in about 5-10 minutes. There's probably nothing like an Omega Point or Roko's Basilisk for the same reason; that information won't survive from the current hot period of the Universe to the long cold efficient computational period, so no AI can reconstruct you. I'm as sad and angry about this as anyone, but I don't delude myself.

Second, even if we say "YER A WIZARD HARRY" and you have a magical afterlife, it's populated by immortal beings (IB), somehow. Where do they come from? How does that evolve? How do they get magical powers? If Humans can get a half-measure of sanity and wisdom by 40, 60, 80 years, every IB should be perfectly enlightened and know every trick and skill possible by 1000, 100000, 13.5 billion years old. The IBs shown are as stupid and easily-tricked as Humans, when you get to The Actual Plot of this show. To pick the exact opposite of this show, Hellraiser had an internally consistent magical afterlife: "Hell" is an alien universe inhabited by Cenobites with a wide range of power, whose experiences are so powerful that they would seem like torture to a Human; they collect Humans who seek that experience with magical devices, not to reward or punish meaningless behaviors on Earth; good or evil means nothing in Hellraiser.

Every IB in this show is insultingly stupid, repetitive physical tortures by frat boy demons, inferior to Torquemada's work here on Earth; farting evil robot girls; a neutral Judge too silly to be on a daytime TV show who only wants to eat her burrito. Low, low, lowest-fucking-brow comedy quite often.

Third, and most damning (heh), any system of morality with a scoring system then becomes solely about that scoring system. If "God and/or Santa are Watching" as Christians claim, you must act good according to the dictates of the Bible to score high enough to enter Heaven; it doesn't matter what's logically right and wrong, only the specific rules of an eternal sex-obsessed Middle-Eastern tyrant. Everyone who ate shellfish or wore mixed fibers or got a tattoo, forbidden by Leviticus, or failed to commit genocide & slavery when ordered by a prophet of God, as throughout the entire Old Testament, or masturbated to anyone but their lawfully wedded spouse, as forbidden by Jesus in Matthew 5:28, is gonna have a real bad eternity in Hell.

The scoring system for The Good/Bad Place makes it impossible to commit a "selfless" act unless you're a total moron (so, possibly the moron character, but he's unthinkingly rotten as often as nice). They treat this as a feature, as if you can only do good deeds when you can't see the score.

In philosophy without gods, you can choose to do good (try to define "good" in less than 10,000 pages…) instead of evil (same) because your personal or societal reward system is rigged that way (laws, in general), or because you selfishly want to look altruistic (maybe virtue-signalling to attract a mate), or because universalizing your behavior means you should selfishly do right to raise the level for everyone including yourself ("think global, act local"), or purely at random, and you have still done good deeds. While the ancient Stoics (especially my favorite, Marcus Aurelius) respected piety to the immortalized Emperors and gods of the Pantheon, they didn't ask the gods for rules, they found a way to live based on reason, a modicum of compassion, and facing the harsh world that exists.

But once the authorities put in an objective score system in with infinite reward/punishment, you must act to maximize your score; there's no moral debate possible, you would just find the highest reward you can achieve each day and grind on it. Those born with the most wealth and privilege will be much more capable of raising their score instead of attending to life's necessities, so the rich get rewarded, the poor get punished.

This show seems to think Jiminy Cricket sits in your head as a quiet voice without any training, and you just have to listen to it to know good and evil. There's a discussion about Les Miserables re stealing bread (worth exactly -17 points), that's only used for mockery, but in real life that ambiguity is impossibly hard to make rules for.

I liked Eleanor and Tahani, and sometimes Michael, playing off each other enough to keep watching this through S2, but every time Chidi speaks I roll my eyes and wish that just once he'd reference someone not on the Dead Honkys shelf; especially not Prussian Immanuel Kant who wrote some of the earliest texts on "scientific racism", including such gems as "The Negroes of Africa have by nature no feeling that rises above the trifling" (1764, Observations on the Feeling of the Beautiful and the Sublime). Fuck that guy.

★½☆☆☆