Xanadu Hypertext from the Future

In DDJ Jan 1983, there's an article:

"The Xanadu Hypertext System is a real and currently available product which can manage multiple differing versions of a single document."
Roger Gregory, "XANADU Hypertext from the Future"

Here in the actual future, this may come as some surprise, since Ted Nelson and Roger Gregory never managed to ship a usable version of Xanadu, as noted in the Wired postmortem; there's a viewer on Xanadu.com, but no real editor or import/export system.

Computer dreams without substance, without actual working code.

In DDJ Apr 1983, Gregory files a "Mea no culpa" where he throws Chip Morningstar (of Habitat fame; Habitat did ship) under the bus.

This is great: 35-year-old tech industry drama beats current shit any day.

Dr Dobb's Journal of Computer Calisthenics & Orthodontia

DDJ, especially the 1984-to-2000-ish run, is how I learned C and assembly, and where much of my attitude towards software comes from.

These days I read PragPub, edited by Michael Swaine of DDJ infamy. It's more architect/software-manager-oriented than in-the-trenches bit-eating, but it still has some real working code.

Also, I really miss Jolt Cola, like a detoxing junkie misses a needle.

Python 3.7

  • Python 3.7 released: Standard checklist:
    • Run installer
    • Delete the old 3.6 folder from /Applications
    • Run "Install Certificates.command" in the new 3.7 folder (the other one, "Update Shell Profile.command", shits a PATH into my shell profile; don't need it)
    • Run IDLE and verify it's 3.7.0. Happily, no longer have to fight with updating Tcl/Tk.
    • Run "python3" from Terminal and verify it's 3.7.0
    • Run a random Python script to make sure nothing's broken.

Nanosecond-resolution time functions and coercing more of the legacy ASCII/C locale world into UTF-8 are nice improvements, but those are more patching up legacy annoyances than must-haves.
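
The new clocks are easy to check; here's a minimal sketch using the PEP 564 _ns variants (time.time_ns(), time.perf_counter_ns(), etc.), which return integer nanoseconds instead of precision-losing floats:

import time

# New in 3.7: the _ns clock variants return int nanoseconds,
# so there's no float rounding at the bottom end.
start = time.perf_counter_ns()
total = sum(range(1_000_000))
elapsed = time.perf_counter_ns() - start
print(f"summed to {total} in {elapsed} ns")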

I'm mostly interested in dataclasses, which make it much easier to build little struct-type objects instead of random dicts or lists, which have all sorts of problems (no useful equality or hashing, no typo-safety).

I greatly dislike the addition of BDSM typing, but it's mostly optional, EXCEPT that you have to use type annotations in dataclasses:

from dataclasses import dataclass
@dataclass
class Point:
    x: float = 0.0
    y: float = 0.0

>>> p = Point()
>>> p
Point(x=0.0, y=0.0)
>>> q = Point(1.1, 2.2)
>>> q
Point(x=1.1, y=2.2)

If I define Point without the type annoytations [my new favorite typo!], only the default constructor works, and the repr doesn't show the fields, because without annotations the dataclass machinery doesn't see any fields at all, just plain class attributes.

@dataclass
class Pointless:
    x = 0.0
    y = 0.0

>>> f = Pointless()
>>> f
Pointless()
>>> f.x
0.0
>>> f.y
0.0
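
If you want the generated methods without committing to real types, typing.Any satisfies the annotation requirement; a minimal sketch (nothing enforces these at runtime anyway):

from dataclasses import dataclass
from typing import Any

@dataclass
class Pointy:
    # Any placates the dataclass machinery without promising a type.
    x: Any = 0.0
    y: Any = 0.0

>>> Pointy(1.1, 2.2)
Pointy(x=1.1, y=2.2)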

Real examples might be a lot more complex than a point, and by then the cost of building a proper class with __init__ and everything yourself isn't such a big deal, so I can see dataclasses mostly being used for very simple struct-like containers.
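
They do fix the dict problems, though: every dataclass gets field-wise __eq__ for free, and frozen=True adds __hash__, so instances work as dict keys or set members. A minimal sketch:

from dataclasses import dataclass

@dataclass(frozen=True)
class Pixel:
    x: float = 0.0
    y: float = 0.0

>>> Pixel(1.0, 2.0) == Pixel(1.0, 2.0)
True
>>> {Pixel(1.0, 2.0): "red"}
{Pixel(x=1.0, y=2.0): 'red'}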

Cyberpunks or Just Punks?

It's not that I don't like Neuromancer; it might be in my top 10 favorite books (though towards the bottom of that list). But every time I see it mentioned as the "seminal cyberpunk epic", I roll my eyes, because I know those people have never read another cyberpunk book. There were others before Neuromancer, and long after.

So educate yourself, make yourself less eye-rolling to me. Here's a tiny reading list. When you're done with that, hit the KUOI archive on the right, find my Cyberpunk page, and work through that. Or maybe I'll pull it out of the archive and update it by then? A lot has happened in the 10-15 years since I last touched that page.

First:

Then:

What I'm Watching: Low Winter Sun, Marcella, The Staircase, Goliath

  • Low Winter Sun: Terrible train-wreck ending to an otherwise great show. Was it intentional that the show fucked up and choked to death on its own vomit just like Detroit, or ironic coincidence?
  • Marcella: Season 2! I somehow didn't write about S1: Mentally unstable woman detective comes back to work chasing a serial killer or copycat of her old case. She has another breakdown, commits as many crimes as she solves, and then tries to cover up her shit. That was good, if at times heavy on the melodrama. ★★★½☆
    S2 follows her chasing a child murderer. Unfortunately, this is Law & Order: SVU bullshit; the reality is that child rape or murder by strangers is incredibly uncommon, so their approach of looking at randoms instead of family, teachers, or priests is unproductive.
    Marcella's also being pretty high and mighty for as shitty a person as she is. In S1 she lasted whole episodes before melting down. This is all badly written by idiots; everyone spends half their time screaming incoherently at everyone else.
    I bailed on this after S2E1. ☆☆☆☆☆

  • The Staircase: Documentary about Michael Peterson's alleged murder of his wife Kathleen. Between the writer suspect and some very weird lawyers and experts, there are far more interesting people and events here than in most true-crime shows. Still pretty dry, but not stupid.

  • Goliath (Amazon Prime): A man blows up on a boat. Two years later, his sister recruits a wannabe strip-mall lawyer, who recruits a washed-up drunk lawyer (Billy Bob Thornton!!!) to sue. Slightly overwrought legal drama against the evil supervillain lawyer (William Hurt!), and then the crazy conspiracy levels start ramping up; the babykillers keep even their own lawyers in the dark. Good conspiracy show with scruffy protagonists. They ought to play the Imperial March every time the Evil Lawyer Firm is onscreen. The one downside is Amazon Prime's unspeakably shitty video player, which makes me hate watching any serious show where I may need to rewind. Fuck Amazon. But ★★★★½ for the show.

Marzipan and Electron

Chris is missing the point of both technologies. And I'm sure not a brat Millennial.

Marzipan (candy frosting) is a legacy porting technology: Existing iOS apps can cost more to port to AppKit than they're worth, but may be worth something as a cheap Marzipan port. Nobody ports their iOS apps to tvOS or watchOS because it's not profitable, and everyone (in the first world with money) has an iPhone already.

I loved my UNIX® workstation Macs after suffering with Linux for a decade+, but Timmy Cook's Apple abandoned the Mac after Steve's death and Scott Forstall's firing. Anyone making new native Mac apps is in an abusive relationship: Apple does not love you, and does not care about the Mac.

I'd rather eat broken glass than run Linux again, and I have never and will never be a Windows weenie, but I'm not relying on Apple to support desktop developers ever again.

Apple's Mac apps have generally been shit for years now, because they won't spend the resources to develop & support their own stuff. iTunes is a bloated pile of crap, half-broken because it has to run on Windows, too. You don't like hybrid web apps? iTunes is one: everything except your library is web or XML rendering. Pages and Numbers were fast, minimally useful apps that got rewritten based on the iOS versions, and are just about useful for a memo or a chart, but not real work. Mail's a clusterfuck, and not half as useful as when it supported more scripting and addons.

The new Mac commercials, the first in years, show the broken-keyboard laptops and models they no longer make, nobody coding Mac apps, no desktop Macs. Where's that shiny "new" iMac Pro from last winter? Isn't that what a real musician would use? A near-blind photographer squints at a tiny Mac laptop instead of a giant 27" retina display?

This is how technical, developer-oriented Apple ads were in 2002:
[Image: Apple UNIX workstation ad]

Electron and Node are the future (along with Elixir, Go, Rust, maybe others?). It's 100x faster and more fun (attach the FunMeter™ somewhere fun) to code in than Swift, you can use any good editor instead of fucking Xcode, and you can do layout with HTML/CSS instead of the rotting corpse of Interface Builder trapped inside Xcode. And it's cross-platform: 95% of users (even in the first world with money) don't run a Mac, because Apple never updates the Macs; they failed utterly to follow through on Macs behind iPhones. There's 20x the market potential.

The future is certainly not banging your head against the Swift and Xcode walls just to make a pure Mac app nobody will see. You can't make fun of Electron's runtime, which needs Node and Chromium, if you use Swift, which has a giant runtime turd because their amateur-hour C++ compiler nerds can't make a stable ABI. The Mac's only future source of native apps is Marzipan ports.

There can be performance problems in Electron, but Slack's an outlier with a 196MB binary, devouring 1.2GB of RAM (!!!); it's the bloated WalMart-shopping fat-ass of Electron apps. Discord is also Electron; it has a 136MB binary, uses 360MB of RAM, and does more, faster and better than Slack. Atom is the original Electron app, has a 541MB binary, and uses 600MB of RAM, for an entire editor/IDE.

My game currently has a 139MB binary, and uses 200-300MB of RAM when running. Compare a random casual game from my Steam library, Chainsaw Warrior (well, "casual"; I've only beaten it once on Easy): it's based on Unity (another VM!), has a 249MB binary, and uses 200MB of RAM when running, plus Steam itself uses 130MB of RAM (I may yet integrate Steam into mine, so that may even out). Mine doesn't seem excessive.

I can't compare my Swift game prototype from 3 years ago, because it was written in a version of Swift that doesn't compile in any Xcode that runs on current hardware & OS, and Xcode "helpfully" deleted the built binary; who needs working binaries, right? I might have an old Xcode on my old laptop? Maybe I could waste a couple days fixing the code by hand in current Xcode, if I hated myself or loved Timmy Cook's Apple that much?

New languages evolve fast, but I can run 20-year-old JavaScript and it'll run thousands of times faster than it did in the '90s, because the language was improved with forwards-compatibility in mind, hardware caught up, and the newer VMs compile & run it faster. I can compile 30-year-old Objective-C, and it'll run.

We had something similar to web tech 20 years ago with desktop Java, but the convicted criminal organization Microsoft sabotaged it and made a shitty single-platform ripoff called C#. Viruses became a problem for applets, which had nothing to do with desktop Java, but killed Java deployment even before Oracle bought & ruined Sun. Android runs on another Java ripoff, but their dev tools and APIs are even shittier than Xcode or C#, and the users are poor, so why make anything for them? Server-side development in Java, Clojure, or Scala, running on the Java VM, is hidden away in a back room, and made as boring as possible.

So now we have to reinvent the runtime, this time with Node & Chromium. OK with me.