A Flotilla of Shit

Modern software is junk. Almost every program uses vastly more resources than it needs, and does its main task worse than older, more focused programs.

I don't think I have a single "new" program that's as good as the thing it replaced, not a single program as good and light as the stuff we had 30 years ago. So where possible I use 30-40 year old software, and I resent the complex stuff I have to deal with. It's polluting the planet, literally boiling the oceans.

Case 1

This blog is in WordPress, which is in PHP on a giant tower of shitty software, like 20 "plugins" to fix things that are inadequate and wrong in it. I've done what I can to lighten it some, streamline layout, but that's lipstick & yoga pants on a pig. 25 years ago I had a simple blog (uh, actually also in PHP, tho I had another one in Perl, so that's not any better). But that was <1000 LOC, it just needed a tiny local database, and really could've just used flat files. And before the blog, I had just my hierarchical web site, and before that I had Gopher.

Gopher was basically perfect. Just a structured tree of documents, accessed by raw socket connections or manually by telnet. If you wanted to make a journal ("web log" -> "blog" was a decade away), you put links to plain text entries on a Gopher menu.

iMark's Gopher Hole _   _   0
gMugshot    /images/mark.gif    example.com 70
1Games  /games  example.com 70
iJournal    _   _   0
01990-09-01 /journal/1990-09-01.txt example.com 70
01990-08-25 /journal/1990-08-25.txt example.com 70
.

etc. Actually at the time I probably would've done chronological order, not reverse.
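The protocol is about as small as a protocol can get: open a socket, send one selector line, read until the server closes. A minimal sketch in Python (gopher.floodgap.com is a long-running public Gopher server; any host works):

import socket

def gopher_fetch(host, selector="", port=70):
    # The whole client protocol: send the selector line, read to EOF.
    with socket.create_connection((host, port)) as s:
        s.sendall(selector.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("latin-1")

# A menu line is: one type char + display text, then TAB-separated
# selector, host, port; a lone "." ends the listing.
for line in gopher_fetch("gopher.floodgap.com").splitlines():
    if not line or line == ".":
        continue
    kind, fields = line[0], line[1:].split("\t")
    print(kind, fields[0])

Type 0 is a text file, 1 a menu, g a GIF, i an info line; the client knows what every link is before it fetches anything.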

We have Gemini now trying to be like Gopher, but it has TLS, a more complex connection protocol, and error messages (Gopher just responded "3" if something went wrong, possibly followed by a message); and the page you get is presentation, not a menu: it doesn't tell you the content type of any link, and it tries to style content in-line, like a lower-resource WWW. But to run Gemini, you need the same TLS certificate churn as a web server, it won't stay up without constant maintenance, and it uses more resources than just serving a web page.

Case 2

Mastodon is a giant database that constantly messages other databases to tell them about posts… and it still sometimes takes a while to propagate messages, or fails utterly. There's no markup except URLs, and you can attach either a poll or images (not both, and they aren't inline). The only control you have over your experience is blocking people, and crude text-match filters.

30 years ago, we had USENET, email, and IRC/ICB chat. USENET was often slow; some servers would only connect once a day, others every hour, some every 15 minutes or so, and you might need a couple hops to get to someone. But your message length was unlimited, and most clients handled some markup with *bold*, /italic/, _underline_, and <URLs and FTP hostnames>. Images had to be UUEncoded, but most clients could insert them easily; graphical ones could display them inline and download them, and I used text-only strn, so I'd download and run xv to see images. But the power we had in those clients was so much better. strn did scoring: I had thousands of lines of regular expressions and header matches that scored articles up or down. I'd go into a newsgroup, and the best stuff would be at the top, mediocre stuff below it if I cared, junk and spam and assholes deleted.
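If you never saw scoring: the concept fits in a few lines. This is just the idea sketched in Python, not strn's actual scorefile syntax:

import re

# Rules: (header, pattern, delta). Real scorefiles were much richer.
RULES = [
    ("From",    re.compile(r"friend@example\.com"),    +50),
    ("Subject", re.compile(r"make money fast", re.I), -9999),
    ("Subject", re.compile(r"\bscheme\b", re.I),        +10),
]

def score(headers):
    # Sum the deltas of every rule whose regex hits its header.
    return sum(delta for field, pat, delta in RULES
               if pat.search(headers.get(field, "")))

articles = [
    {"From": "friend@example.com", "Subject": "Re: scheme macros"},
    {"From": "spam@junk.biz",      "Subject": "MAKE MONEY FAST"},
]
# Best stuff at the top; junk sinks below a kill threshold and vanishes.
for a in sorted(articles, key=score, reverse=True):
    if score(a) > -1000:
        print(score(a), a["Subject"])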

If you wanted to immediately contact someone, email or chat existed. There's an experimental chat system on Pleroma, but not on Mastodon yet/ever. Or you can use the modern equivalent of that, burn 1GB of RAM and a CPU core running Slack or Discord. Madness.

Case 3

Emacs. Eight-hundred Megs And Constantly Swapping. Is emacs the original sin, or were there flotilla-of-shit programs before it? Back in the day, you could start micro-emacs ("me" on Atari ST, later uemacs) in milliseconds, or emacs in many tens of seconds or even minutes. The emacs people would just leave this giant blob of an interpreter, editor, and half an operating system (but not really) running all day, eating most available RAM and CPU, and load files into it. The me and vi people would instantly open a file, edit, and close, barely a blip on the system resources. 30 years later, uemacs starts in nanoseconds, and emacs starts in seconds, but it's just as obnoxious.

Today I use BBEdit, which is svelte for an IDE, but it's a giant pig compared to what "a text editor" needs to be; I keep trying other IDE-types like Sublime Text or Atom, and they're too heavy for me to tolerate. And in console, I run Vim, which isn't as bloated as emacs, but it's fat. None of these make me happy. STeVIe was much lighter, and I've repeatedly considered going back to it if I can recompile it. I did manage to compile Linus' build of uemacs and it's nice, but I can't get used to it again after 25-ish years off it; my console habits are vi, it seems.

Resolved

The end goal of software is not to put everything in it: a flight simulator in your spreadsheet (fucking Excel!); a computer in your fridge for playing ads; a web server, email client, and text editor in your math program "notebook"; a fucking NFT miner in your MS Paint clone.

The end goal of good software is to do ONE THING. To do it fast, efficiently, and correctly, in the least resources you can.

Re-evaluate your use of flotilla of shit software, and dump it.

BASIC at 57

We had a lingua franca for the first 15 years of personal computing, one that could be taught in a few hours and immediately used practically, and then it vanished almost utterly in the late '90s. Two generations are completely illiterate in the language of their ancestors.

Now, there's a couple of heirs to BASIC.

Python mostly took that role. For a while it looked like every computer would have a good version, but between the Python 2/3 fiasco and OS hardening, you mostly get an old Python shipped with the platform, and have to manually install from Python.org. IDLE (with my IdleStart shortcut) is still a pretty great REPL, editor, and runner. import turtle as T; T.reset() and you can start doing turtle graphics in 10 seconds. The Raspberry Pi Beginner's Guide uses a lot of Python as its starting language (and Scratch, which is more a cruel joke than a language); Python isn't great at graphics on a low-power computer, but it is possible. The problem is it's considerably slower than a good BASIC, can't make use of multiple cores, and relies on Tkinter for most graphics, which is even slower. Python's an evolutionary dead-end.
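The 10 seconds isn't hyperbole. In IDLE, this is the whole on-ramp (a quick 36-petal rose; any numbers work):

import turtle as T

T.reset()             # opens the canvas, pen down, ready to draw
for i in range(36):   # 36 overlapping circles, rotating 10 degrees each
    T.circle(80)
    T.left(10)
T.done()              # hold the window open (not needed inside IDLE)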

JavaScript is everywhere, and trivial to get started with… except you immediately run into a security wall, so you have to run a web server (easiest is, ironically, python3 -m http.server -b 0.0.0.0 8000, which serves all files in the current directory). Edit file, reload browser page. And then doing anything in JS is a big hike up the mountain of JS, HTML, and CSS. But it does run. Teaching everything you need to make interactive programs in it is hard, bordering on a semester's coursework.

Scheme and LISP can be used this way, as seen in the Land of Lisp book, and in Racket; but SBCL is a complex and terrible runtime for an ugly dialect of LISP (IMO, don't throw bombs at me), and Racket has a broken REPL and a focus on language experiments, not so much on being useful. The other Schemes vary from needing serious engineering just to launch, to easy enough but poorly supported, like my favorite Chez Scheme. I've been working on a "Beginner's Scheme" book for a while, and it's hard to compress what you need to know to get anything done into small, fun parts.

It's actually easier to do Programming on Your Phone.

But there's still nice enough BASICs, in particular Chipmunk BASIC, and the old BASIC books & magazines work fine in it:

Just make a BASIC folder, download all those PDFs into it, and read them to see how quickly they go from "here's the BASIC commands" to "write a real program". And how you can make computing fun, not a chore.

I've been spending a little "non-productive" time hacking on my BASIC CRPG that'll run on the SpecNext when it arrives later this year… I do my work for now in Chipmunk; because NextBASIC has some strict limitations, I keep those limitations in my program, like single-letter array names. Graphics will have to be totally rewritten (but most of the UI is text-mode and portable), some data storage, too. Happily NextBASIC does have named functions (PROC, or SUB in Chipmunk), local variables, and loops (REPEAT:WHILE x:...:REPEAT UNTIL x, verbose but usable), so there's less GOTO and no GOSUB in my code.

I thought the line numbers and other limitations would kill me when I started doing it again, but in practice they haven't bothered me much. I start each logical section at multiples of 10 or 100, add 1 per line, indent function and loop bodies, and rarely have to renumber. Since I don't GOTO/GOSUB much, finding specific lines is less of a problem; the only time I need a specific line number is RESTORE xxx for DATA.

I'm starting to see BASIC as an interesting language again: Low-level enough to do something on simple machines, and reveal exactly how your algorithms work, high-level enough that you can get something done (much easier than ASM). You wouldn't want to build a large system in it, but focused problems, "tunnels through rock" as Chris Crawford wrote in De Re Atari, are well suited to it.

The original home computers having "turn on, maybe hit one key, you're in BASIC" interaction was amazing, unparalleled in any other system since, and we need to get back to as close to that as possible.

VR Micro Online

I've got a new project in the works, VR Micro:

A shared infosystem based on spatial reasoning,
a world of many interconnected maps,
with documents and services physically represented as objects.
Users have avatars and can see and interact with each other.

The short version is, this is a file share, as a memory palace by way of graphical adventure games. It is to "real" (but costs too much and makes you puke) VR, as HTML was to "real" (but never shipped) hypertext like Xanadu: It's small and simple enough to actually work. To be something a semi-normal person could run on a rented server and make some maps and share their thought world.

All I have so far is a spec, but I got the server stood up this morning; I expect to have a standalone client working soon™, and then I can work on the server software.

I plan to keep it all licensed BSD for code, CC-BY for content. You are of course encouraged to contribute feedback, source, content, and/or money.

Hey, you like this? Servers aren't free. You know I have a Patreon tip jar up there? <rattle> <rattle>

Nanorogue2 in BASIC

I've completed my BASIC 10-Liner contest entry, download on itch.io or find the latest version here:

Just shove the disk (.atr) in Atari800MacX or any other compatible Atari 800XL emulator, disable BASIC, and hit reset; it should boot up into the launcher:

Where you can read docs or source:

And play the game!

So, the source for my first pass was manually packed down, and I couldn't really fit everything I wanted in there, or switch to text-graphics mode. With some rethinking, and a better source-editing tool, I could… So I wrote a filter program, "Basic2List.py", that removes comments & blank lines, joins everything after a numbered line with colons, and lets me insert binary codes with \xFF escapes. It still looks a little dense, because I have to manually use abbreviated statement names or remove spaces; I'd like to make it smart enough about BASIC source to do that itself.
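The guts of such a filter are a dozen lines. This is a sketch of the transform, not the real Basic2List.py, and the input filename is made up:

import re

def pack(src):
    # Drop //-comments and blank lines; join any unnumbered (indented)
    # line onto the previous numbered line with a colon.
    # Naive: doesn't protect "//" inside string literals.
    out = []
    for line in src.splitlines():
        line = re.sub(r"//.*", "", line).rstrip()
        if not line.strip():
            continue
        if line[0].isdigit():
            out.append(line.strip())
        elif out:
            out[-1] += ":" + line.strip()
    return "\n".join(out)

def unescape(text):
    # \xNN escapes become raw chars: inverse & graphics chars on the Atari.
    return re.sub(r"\\x([0-9A-Fa-f]{2})",
                  lambda m: chr(int(m.group(1), 16)), text)

with open("NANOROG2.LST") as f:
    print(unescape(pack(f.read())))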

But it lets me turn source like:

5   POKE731,1       // noclick
    GR.1            // 20x20 wide chars, 40x4 regular
    SE.1,13,15      // palette 1 to gold
    W=20            // world size
    DIME$(27),M(W,W)    // E$() encounter table, M() map
    //RRRZZZD$$..........#######>
    E$="\xF2\xF2\xF2\xFA\xFA\xFA\xE4\x04\x04\xAE\xAE\xAE\xAE\xAE\xAE\xAE\xAE\xAE\xAE#######\x3E"
    H=10        // Hit Points
    L=1     // Level
    // G=0      // Gold, default value
    ?"NANOROGUE BY MDHUGHES"

into:

5 POKE731,1:GR.1:SE.1,13,15:W=20:DIME$(27),M(W,W):E$="RRRZZZD$$..........#######>":H=10:L=1:?"NANOROGUE BY MDHUGHES"

(except the RRR... are inverse & graphics chars)

See the Atari BASIC Quick Reference Guide to learn the abbreviations and some of Atari's peculiarities. And it's running in Turbo Basic XL which really helped the program size, so I was able to squeeze in stairs!

Last time I was using the compiler, and that worked, but it distorted my sounds, and I couldn't make LAUNCHER.CTB run NANOROG2.CTB! So I just left them all as uncompiled (but tokenized) .BAS files, and it works fine.

The only downside is it's stuck in easy mode. I'd love to have a difficulty setting that increases the GP needed to level up, and makes monsters hit harder (but not reward more), but that didn't quite make the cut.

Generally I'm pretty pleased by this!

The ZX Spectrum (non-Next) port is turning out to be hard, it lacks a few things and doesn't have ELSE, either, so I don't know if it can be done.

10-Line BASIC Contest

Let's go back to the 1980s!

So I knocked down a tiny subset of my already tiny BASIC demo program, NANOROGUE, and plan to make 10-line versions for Atari 800 and ZX Spectrum. Getting it running on desktop in Chipmunk BASIC was trivial, just a little ANSI for screen positioning.

Initially I just used standard Atari BASIC, and that worked fine, if very very tightly packed, and not fast… but the lack of an ELSE statement left me with an 11-line program: I wanted to end with 50 ... :GOTO 20:ELSE:GOTO 20:ENDIF but had to move it to a new line. Very frustrating. So I'm using an enhanced BASIC, which is allowed for the PUR-120 tier, for one command.

Being "compiled" (to bytecode, don't expect miracles) is nice, and it makes building an AUTORUN.SYS easier than my own utility did. TBXL is at least a 1985 tool; at the time I was using the "official" BASIC XE cartridge instead, which had similar features, and was mainly moving over to Action!, 6502 ASM, and C. But for retrocomputing it's fair game.

Making a TBXL executable & bootable disk is a little fussy.

  1. Make a blank floppy ATR (emulator disk format), format it, put Atari DOS 2.5 on it or whatever you like (from DOS 2.5, H to write DOS files). This is your program disk. Like TRON, everything you do will be encoded on it, and losing it will subject you to immediate deresolution.
  2. Turbo BASIC XL disk in D1, program disk in D2. Control menu, Disable BASIC, Cold Reset (Sh-F5). You should see a red load screen, then READY. Check you're in TBXL by typing DIR.
  3. Write your program. I recommend writing BASIC as LST files in a desktop editor, then Cmd-E "Edit an atr disk image", click "Atari/Mac Linefeed Translation", "Import Files From Mac". In BASIC, ENTER "D2:FOO.LST". RUN to test it. But if you like living the '80s lifestyle all the way (or using non-ASCII chars, which are annoying to work with), you can work entirely in TBXL.
  4. Save your program tokenized: SAVE "D2:FOO.BAS"
  5. BRUN "D1:COMPILER.COM", now swap D1 and D2 (Cmd-D, click the swap buttons; you want your program disk in D1!), hit 1, pick FOO.BAS, save as AUTORUN.CTB. Swap disks back (you want your program disk in D2!).
  6. Ctrl-D, J for Ja (yes) to go to DOS-XE. COPY D1:RUNTIME.COM D2:AUTORUN.SYS
  7. Now put your program disk in D1, reset, and it should come right up into your program, then prompt for Dos, Run, or Load when it ends. Nice!


So resuming work on NANOROG, I get:

The only downside is this tiny version has a very slow redraw: it reprints the entire screen each move, instead of just fixing the last/new positions. I'm pondering changing it to GRAPHICS 1 (wide text, 4 colors) and poking screen memory, which is probably faster than printing. I made some acceptable bleeps and buzzes with SOUND commands; I'm a poor sound designer, but I get there with some trial and error.
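The fix is the classic one: only repaint the cells that changed. In Python terms (the BASIC version is the same shape, with PRINT CHR$(27);"[10;20H"; doing the cursor move):

import sys

def put(row, col, ch):
    # ANSI cursor positioning: ESC [ row ; col H (1-based), then the char.
    sys.stdout.write(f"\x1b[{row};{col}H{ch}")

def move(old, new):
    # Two cell writes instead of reprinting the whole map.
    put(old[0], old[1], ".")   # restore the floor under the old spot
    put(new[0], new[1], "@")   # draw the player at the new spot
    sys.stdout.flush()

sys.stdout.write("\x1b[2J")    # clear the screen once at startup
put(10, 20, "@")
move((10, 20), (10, 21))       # one step east: no full-screen repaint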

Anyway, next week's task is the ZX Spectrum version. Speccy BASIC is pretty good, so I expect I can knock that out quick and without all these shenanigans.

Portable Computing Devices

"Dr Ed Morbius"[sic] posts The Case Against Tablets and one of the most unusable tables I've ever seen (too much data with no affordances, and Diaspora's design crops anything complex). But the premise is interesting, especially as I'm considering my next wave of new hardware. He's just going about it the wrong way (no, really, Samsung of the exploding batteries is bad? Tell me more news.), and then frustrated he can't succeed by going the wrong way.

So, luggables have been around since (counting only devices usable by the general public) the Osborne 1 (1981, 10.7kg), tablets since the Kyocera Kyotronic 85 aka TRS-80 Model 100 (1983, 1.4kg), laptops since the Toshiba T1100 (1985, 4.1kg). It's been possible to have handheld computing since at least the Apple Newton MessagePad (1993, 640g) and Palm Pilot 1000 (1996, 160g). I've used but didn't own most of those, mind, just a Sharp PC-3, a Psion, slightly later Toshiba and IBM laptops I hated, and a bunch of Palm devices. I read many ebooks (Baen's early CDs of ebooks were great! Pity they mostly ship right-wing milsf these days), and 160x160 2-bit grayscale is not ideal. I have, as they say, seen some shit.

These days, the choices of hardware are a little better, thousands of suppliers, almost all of which fit into a few categories:

| | iPhone | iPad | Apple M1 Laptop | Android phone | Android tablet | Microsoft Surface | Cheap Windows Laptop | Pricy Windows Laptop |
|---|---|---|---|---|---|---|---|---|
| Price | mid-y | mid-y | hi-r | low-g | low-g | hi-r | mid-y | hi-r |
| Weight/Bulk | low-g | mid-y | hi-r | low-g | low-g to mid-y | mid-y | hi-r | hi-r |
| Battery | hi-g | hi-g | hi-g | mid-y | low-r | mid-y | mid-y | low-r |
| Performance | hi-g | hi-g | hi-g | mid-y | low-r to mid-y | mid-y | mid-y | hi-g |
| Security | hi-g | hi-g | hi-g | low-r | low-r | mid-y | mid-y | mid-y |
| Books | low-r | hi-g | mid-y | low-r | hi-g | hi-g | low-r | low-r |
| Video | mid-y | hi-g | hi-g | mid-y | hi-g | hi-g | hi-g | hi-g |
| RSS/news sites | mid-y | hi-g | hi-g | mid-y | hi-g | hi-g | hi-g | hi-g |
| Social media | hi-g | mid-y | low-r | hi-g | mid-y | mid-y | low-r | low-r |
| Online shopping | low-r | mid-y | hi-g | low-r | mid-y | mid-y | mid-y | hi-g |
| Reference | hi-g | mid-y | low-r | hi-g | mid-y | mid-y | low-r | low-r |
| Videogames | mid-y | mid-y | hi-g | mid-y | mid-y | mid-y | low-r | hi-g |
| Writing | low-r | mid-y to hi-g | hi-g | low-r | mid-y | mid-y to hi-g | hi-g | hi-g |
| Programming | mid-y | mid-y | hi-g | low-r | low-r | mid-y | mid-y | hi-g |

Figure out the things you care about, and pick whatever has the most green, maybe yellow, avoiding red cells. Now you know what to buy.

What do you actually care about?

  • Price: You do largely get what you pay for in this category. Apple's devices aren't really much more expensive than equal hardware, but they never ship anything in the bottom price category. They do gouge on the top memory prices, which is unpleasant. But any Apple device will last for years longer than an Android or a cheap PC, and have good resale value. The incredibly low prices on Android stuff are tempting, but it's a trap ("get an axe").
  • Weight/Bulk: These days almost everything's under 2kg like a good sword, but an iPad held over your head in bed is very liable to fall and break your nose; this is one place where a phone or phablet is superior. Obviously holding a heavy laptop up is incredibly dumb, and laptops are completely useless while moving, standing, etc.; they can't be propped up anywhere, and are just always in the way.
  • Battery life: It's easy for manufacturers to lie about this, but if you run a real workload, you quickly see how wasteful any x86 PC is. Everyone else, it comes down to power management.
  • Performance: Only matters for Videogames and maybe Programming. But Apple's been putting absurdly powerful CPUs and GPUs in their mobile devices; the A12Z is essentially a forerunner of the M1 in the new Macs. Only the fastest AMDs or Intels are even competitive, and those burn too much power to be good mobile devices. Benchmarks are hard to compare exactly, but AnandTech on the A12Z puts it pretty high against laptops, and two years later that A12 generation is the SOC in the lower-end iPhone and iPad.
  • Security: I'm not trying to be biased here, but if you are concerned with security at all, you really only need to look at those first three columns. There's just no alternative at present.

    Only a fool would trust anything running Android, they often ship with malware, everything in the stores is contaminated and has ridiculous lists of permissions, and they stop updating at "EOL" which may come as soon as it ships, rarely more than 6-12 months later. Do not put anything of value or interest to others in your Android device.

    Microsoft wants to be good at security, but is functionally terrible at it. They live in an open sewer of constant attacks, and have cardboard walls of bad software. Your mobile device may be pwned and all your files crypto-ransomed the second you connect it to the Internet. MS monthly updates sometimes wipe drives or lock you out; those are just from the last year, and I'm sure they'll fuck up in new ways this year.

    One can, one supposes, install BSD or Linux on a laptop, but that just makes it unusable for most of the tasks below.

    You know who actually seems able to keep secure borders? The walled Apple garden. Other than nation-states getting physical access to an older device, and as long as you're not stupid enough to turn on iCloud backups for things you need to stay private (iMessage!), you are almost entirely safe on iOS or Mac OS.

The tasks you might reasonably do with a portable computing device are:

  • Books: Cannot be read comfortably on a small screen, or landscape laptop. Needs a good document management program. On iOS, there's Readdle Documents, which is a great storage/reading hub for almost everything. On the Mac, I use Murasaki to read epub, except those in Apple Books. On Android, I've found ReadEra and Simple File Manager do that well, are pleasantly minimalist, and are not apparently run by criminals out to rob you, unlike 99% of Android software. I guess on Windows you can just keep things in folders and click on them? As noted every time I have to use Windows, I don't know how people use that.
  • Video: Not ideal on small screens, but I've found almost everyone has caught up now. Everyone has players for all the major streaming services, can play web video fine. Android file management of videos is awful. Windows seems OK at this. The real losers, tho, are BSD and Linux laptops; they can't do any DRM video without jumping thru excessive hurdles. I've been fighting this off and on for a decade with my side terminals, and mostly end up playing video on the Mac desktop instead.
  • RSS/news sites: Relies on having a big screen for 2-pane or 3-pane view, and good RSS reader. I use Reeder on mobile, and Feedbin on desktop; there's inferior but functional apps on other platforms.
  • Social media: Doomscrolling is best done at arm's reach, where you can instantly push home or just throw it away to get away from it. You need a camera attached, so I don't consider laptops suitable at all. Can you imagine someone holding their laptop over their lunch or up for a selfie?
  • Online shopping: Requires multiple tabs, note-taking, preferably a spreadsheet. On the latest iPads, you can split-screen a notepad or Numbers and a browser, which definitely helps, but a laptop or desktop works best here. I don't know how you would even do this on Android, where programs rarely keep their contents when hitting Back a bunch of times.
  • Reference: Here I mean on-the-spot "what's the answer to X?". Mostly checking Wikipedia or Memory Alpha. So just a tiny bit of typing in search, maybe poke at a couple followup links, not extensive reading. 12 years ago I started doing this with my Treo, and it was addictive. This is an ideal use of a smartphone: every second that passes until you can Kirk someone with your online knowledge makes it less interesting.
  • Videogames: All mobile devices suffer from shitty controls. Cheap computers suffer from shitty GPUs; these days mobile GPUs are better. Macs don't have as many games as Windows, the official hybrid Excel/Call of Duty OS, but it's fine. The whole category shouldn't exist, we should just play games on the Switch or consoles, but it persists. Go play catch with your dog, it's more fun than poking at a tiny screen.
  • Writing: Long-form writing depends on screen, keyboard, and editor. There's plenty of BT keyboards for every mobile device. The original iPad had a keyboard dock stand which I bought with mine, and used until I got a better one; I now mostly use a Zagg keyboard with it, or just type on-screen. The current low-travel keyboard cases for Surface, iPad, etc. are kind of awful to type on, but they're very portable. Laptops will always win here, you can sit upright at any table and type ergonomically, and still have functioning hands in a decade. Even with an external keyboard, I find phones too small to compose much text on.

    The editor situation is more complex. I love Editorial (by the author of Pythonista), and it's great for writing text in Markdown, and is scriptable. Pages is fine for short, pretty documents, but it's incredibly slow as your document gets long, and very fiddly when you adjust layouts. There's dozens more on iOS, of varying quality. MS Word runs on iOS, Android, and something called "weeendows"; it's mildly awful but standard. I've found no native Android writing programs that weren't hate crimes, but I'm not super motivated to try every one.
  • Programming: As noted in Programming on your Phone, there's only a few good environments for iOS, but Pythonista is so good it makes up for a category. I've now seen a few Android programming environments, and they're comically, hatefully bad. Surface would be fine, except it's Windows; the only way to dev on that crap is a giant IDE that really needs a high-end desktop computer. Again you might put BSD or Linux on a laptop, but now it's useless for anything else.

I don't rank Drawing, even though that's a very important task for some people, because I'm not qualified to evaluate it; I can draw stick figures and collage art/"memes", but is the Apple Pencil super great? Maybe. What do the others have? No idea. Apparently MS reinstated MS Paint to their program store?

In hardware, I ignored e-ink readers because I find them unusable; a 2-4 second lag when flipping pages or trying to type anything is just unacceptable. We have cheap, low-power, high-refresh-rate LCD screens now; there's absolutely no benefit to e-ink. If you can stand it, fine, but I have no idea how to evaluate a thing I can't even look at.

(my table's not ideal because WordPress fights me; writing this in BBEdit/multimarkdown, I had the column labels rotated 90° with CSS, but for some reason WP positioned them wrong! I could render the HTML and paste that in, I guess, but then it's not easily editable later. And the margin of my site theme is a pain; I keep threatening to rewrite the style sheet entirely. Also, I'm aware there's colorblindness, but safe colors for them look awful to everyone else; so read the -r -y -g labels.)

September 10,000, 1993

The math, in Python (+1 to count the 1st; 1993-09-01 is day one):
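from datetime import date, timedelta

# Day 1 is 1993-09-01 itself, so day 10,000 falls 9,999 days later.
print(date(1993, 9, 1) + timedelta(days=9999))   # -> 2021-01-16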

So, checks out. 10,000 days of the September That Never Ended.

The world since is like a movie showing a few people coughing before the credits, wipe fade, zombie hordes tearing down barricades to eat the brains of the last few people. Someone's shivering in the corner with a gun, for the zombies or self, you can't tell. Freeze frame. "I bet you're asking how we got here…"

Note: I, uh, kinda infodumped here. Estimated reading time: 19 minutes.

What Went Wrong

At the time, I had a nice Gopherhole, finger and .plan (at times with a GIF of me uuencoded into it!), and was already annoyed by the overcomplicated World Wide Web rising. But in Feb 1993, UMinn saddled Gopher with threats of a license, which killed the better-organized system, and I was an adaptable guy. For quite a while I had both with equivalent content mirrored, but then my WWW site got more features, and the Gopher hole got stale so I closed it.

A bunch of new kids invaded USENET every September when school started; commercial Internet started in '89-'91 when NSFNet removed their commercial restrictions, and then fucking AOL unleashed bored neo-nazis from the flyover states on us. There was a vast onslaught of spam, bullshit, and trolls. So I switched from rn, which had primitive killfile regexps ("PLONK is the sound of your name hitting the bottom of my killfile"), to trn, which had threading and a slightly better killfile system, to strn, which had scoring: if you hit multiple good or bad keywords, you'd move up or down my queue or vanish. I bailed on all the big groups, tried moderation and was promptly attacked by scumbags who thought the moderation system was for protecting their corporate masters, not stopping spam, and then quit entirely.

We don't even have FAQs now. There's no netiquette at all (ha, Britannica, remember them? Site's probably not been touched since 1999). I hide off to the edges in Mastodon with very aggressive blocking of anyone who looks annoying. The big media sites, Twaddler and Fuckbook, are just poison, an endless scroller of screaming between everyone who wants to feel offended all the time, and the Orange Shitgibbon's mob of traitors; I see a little of Twaddler by way of RSS, but I won't go any closer than that.

Gabriel Dropout s1e2: Do you enjoy living like that, always being mad?

The Web. On most sites, there's megabytes of crappy scripts for tracking, style sheets, and giant custom fonts instead of banners & buttons burned into GIFs, so a page might take 100MB to show anything. The basic World Wide Web experience (click a link, page shows you slightly formatted text on an unpleasant background, click another link) is unchanged from 1993, but there's a dumpster of shit on top of it. I hate using the Web now: every goddamned page wants to track me, bounce banners up in front of me, demand I approve cookies but not let me say "DENY ALL FUCK YOU"; and even without cookies, they use fingerprinting to track me.

It doesn't have to be like this. Despite using WordPress, the dumbest and most bloated thing possible, I've tried to keep my site down to a minimal setup, go read the page source, it's just CSS, content, and the search widget. If I ever get around to purging the default CSS, it'll be even lighter. But most people not only don't live up to that ethic, they aggressively want the opposite, the biggest, fattest, most unusable crap site full of autoplaying videos they can make.

Criminals can now use the Internet to attack physical infrastructure, or hostilely encrypt computers (including in hospitals; some people need a stern talking-to with a 2x4 or a shotgun). Back in the day, RTM's worm was a novel disaster, but fixable. Microsoft's garbage OS was trivially infected with viruses then and now, but back then it didn't matter much; you might lose a few un-backed-up files, not real money.

The Internet as trivial research device seems like it should be good, but what it's meant is that the Kids Today™ don't bother to learn anything, they just look up and recite Wikipedia, which is at least 50-80% lies. They "program" by searching StackUnderflow for something that looks like their problem, pasting it in, then searching again to solve the error messages. Most of them could be replaced with a Perl script and wget. I assume non-programming fields are similarly "solve it by searching", which is why infrastructure, medicine, and high-speed pizza delivery are so far inferior to 28 years ago.

Search was very slow and mostly manually-entered into index sites back in the '90s. Now it's very fast, but only things linked from corporate shitholes actually show up, and spam and SEO poison all the results, so all you really get is Wikipedia, which might have a few manually-entered links at the bottom which might still exist or be in archive.org, or a few links to spam. Try searching for anything, it's all crap.

Vernor Vinge in 1992's A Fire Upon the Deep called a 50,000-years-from-now version of USENET "The Net of a Million Lies". Just a bit of an overshoot on the date, and a massive underestimate of the number of lies.

There's a lot of knock-on effects from the Internet as a sales mechanism. Like, videogames used to get QA tested until they mostly worked; fiascos like Superman64 were rare. Now, Cyberpunk2077 ships broken because they can patch it off the Internet, won't be fixed until actual 2077. Sure, not all games. I'm usually satisfied with Nintendo's QA, though even Animal Crossing: New Horizons shipped with less functionality and more bugs than Wild World on the (no patches!) DS cartridge.

What Is Exactly the Same

IRC, war never changes. I used ICB for my social group back then, and we moved from there to Slack. Most technical crap is discussed on IRC, rarely on Slack, Matrix, or Discord (which literally means conflict). Doesn't matter, it's just a series of text messages, because nobody's figured out how to make anything better that lasts.

I'm still using some version of UNIX. If you'd told me in 1993 that I'd be a Mac guy, I'd've opened your skull to see what bugs had infested your brain; Macs were only good for Photoshop and Kai's Power Tools. But Linux never got better, BSD is functional but never got a great desktop, SUN and SGI are dead <loud sustained keening wail>, and Apple bought/reverse-takeovered NeXT with a nice enough BSD-on-Mach UNIX. And the Internet is, largely, UNIX. There was a horrible decade mid-90s to early-00s when Windows servers were gaining ground, people were ripping out perfectly good UNIX data centers to install garbage at a huge loss in efficiency because their CTOs got bribed millions by Microsoft. But that tide washed up and back out taking most of the MS pollution with it. Maybe it won't be back.

I still write web sites in Vim or BBEdit (since 1993: It Doesn't Suck™). Well, I say that, but I'm writing this mostly in the WordPress old text editor, using Markdown. Markdown's new-ish (2004), but behaves like every other text markup system going back to SGML in the '80s and ROFF in the '70s.

What's Good About the Internet

Not fucking much.

Streaming or borrowing digital copies of music, movies, and books is easier than ever. I speak mainly of archive.org, but sure, there's less-legal sites, too. I have access to an infinite library, of whatever esoteric interest I have; I've lately been flipping through old Kilobaud Magazine as part of my retrocomputing; I like the past where just getting or using a computer was hard and amazing. In 1993 those might have been mouldering away in a library basement, if they could be found at all. Admittedly, I hate most new media; nothing's been good enough for Mark since 1999, and really I could put the line at grunge, or maybe 1986 when The Police broke up. But at least it is accessible.

I spent most of today writing new stuff for the Mystic Dungeon, and even with all the overcomplicated web shit, it's a little easier to build a secure, massively parallel message system in JS than it was in C or Perl 30 years earlier. Not by much, but some.

Internet pornography (link barely NSFW?) is a tough one. '70s-80s VHS porn was expensive, flickery, way too mainstream; fine if you liked chunky old guys banging ugly strippers, I did not. DVD porn in the '90s was still expensive, but got much better production, and every niche interest, that was the golden age. But now everything is "free" on the thing-hubs and x-things, but only in crappy 6-minute excerpts stolen from DVD, horrible webcam streams, and the creepifyin' rise of incest porn. Because the Internet enables weird interests, but what if a whole generation have massive mommy/daddy issues? You can in fact pay for good non-incest porn, but payment processors and credit cards make it hard to do, so it's easier to just watch garbage. And then there's prudes and religious zealots who think porn is bad; in the old days, they had the law and molotov cocktails on their side, but now they're impotent, so I guess that's barely a win for the Internet.

What Didn't We Get

The Metaverse. OK, there was and is Second Life, but Linden fucked the economy up, and never made it possible to take your grid and host it yourself without a gigantic effort. There's WebVR and a few others, but they have terrible or no avatars, construction, and scripting tools. We should be able to be scanned and be in there, man, like in TRON.

The Forum. There's no place of polite social discourse. There's hellsites, and some sorta private clubs, and a bunch of abandoned warehouses where people are chopped up for body parts/ad tracking. Despite my loathing of Google, who are clearly trying to implement SkyNet & Terminators and exterminate Humanity, Google+ was OK, so of course they shut it down.

The Coming Golden Age of Free Software That Doesn't Suck. Turns out, almost everyone in "FLOSS", the FSF, and GNU, are some of the shittiest people on Earth, and those who aren't are chased out for daring to ask for basic codes of conduct and democracy. Hey you know that really good file system? Yeah, the author murdered his wife, and the "community" is incompetent to finish the work, so keep using ext which eats your files. Sound drivers on Linux, 16 years after I ragequit because I couldn't play music and alarm sounds at the same time, still don't work. "Given enough eyes, everyone goes off to write their own implementation instead of fixing bugs"; nothing works, every project just restarts at +1 version every 2-5 years. Sure, you can blame capitalism, but there's a couple of communist countries left, why aren't they making infinitely better software without the noose of the dollar dollar around their necks?

The Grand Awakening of Humanity. This was always delusional, but the idea that increased communication between people of Earth would end war, everyone would come together, align their chakras/contact the UFOs, and solve all our problems. Ha, no, you put 3 people in a chat room and you'll have 5 factions and at least one dead body in a week. As we approach 7 billion people online, many with explosively incompatible and unfriendly views, this is only going to get worse, if that's even imaginable.

Final Rating: The Internet

★★½☆☆ — I keep watching this shitshow, but it's no damn good. Log off and save yourself.

Informational Hygiene Directives

That's what I call my rules around contacting me, and getting a (non-vulgar) reply from me.

This is brought to mind by Wednesday's spam mail reaching my contact address, and why that made me so mad.

  • Casual, "hey what about" messages: Social media, currently @mdhughes@appdot.net — if this changes, it'll be in the About page. I don't always respond (if I do, it's within 24 hours but rarely immediately), but I'll probably see it. I may or may not care; this is very low attention span, and I may be drunk and posting about Dracula or Godzilla. It's not you, it's me.
  • Do not: IRC messaging, Discord messaging, etc., unless I'm specifically engaged in that activity at that moment; otherwise I won't see it, won't care.
  • Sorta: WordPress post replies (and replies from micro.blog) I will only see next time I load my WP dashboard; I use StupidComments.css to hide them on my front page, which I rarely visit anyway. I do appreciate post replies, I'd hit little favstars by them if I could, but they're not allowed to be intrusive.
  • Junk mail, Mailing lists: I have an email address for that on a popular and possibly hostile AI service, I manage junk there, messages to me are unlikely to get thru. This address generates no notifications.
  • Professional email: Only mission-critical services and people who have business to do with me should be using this address. This address does generate notifications.
  • Private email, iMessage, SMS, Slack: You probably don't have this. Unless you're one of a half-dozen people, and if someone else finds it I tell them the correct junk/professional address to use and block them. This gets notifications. The one time I let one of these slip while I was working, tragedy ensued, so I won't do that again.

When I was all business business business numbers, I got at most a couple dozen emails a day on my professional box, from direct reports, management, and interested outside teams, and I hated it, but that was manageable. Since I got The Man's boot off my neck, it's much lower, but I like barriers and being able to utterly ignore stuff outside one box if I feel like it.

Which brings me to today's hilarious idea of email sabbaticals. There's more recent people doing the same, it's not just this one Microsoftie 10 years ago, but I'll address the original.

What is wrong with you? Thousands of emails in 2 weeks (hundreds a day)? Everything you're doing there is wrong. Everyone sending you stuff is playing "my problem is your problem", and it is NOT.

Organize, filter, and delegate.

  • Organize: Use message boxes to put away automated or group content you don't need to pay attention to now. You can read that when you have spare time, or not, because it's not directly affecting you.
  • Filter: Don't let people throw everything into your "must read now" box. Block the people who can't learn.
  • Delegate: If you do have a firehose of stuff coming in, you probably can afford to hire someone to read it all and just send the useful parts to you. If you're running an open source project, you're kind of screwed, but there may be volunteers (or you can "voluntell" some overly enthusiastic but less useful contributor). You can also set up a wiki or forum for the Kilkenny Cats solution.

Walt Mossberg had this ridiculous screed about getting hundreds of emails and too many notifications… Now, he's a (now-retired) journalist who does get a lot of legitimate "my problem is your problem" email. But he also complains about birthday notices, CVS pharmacy ads, Starbucks ads… Turn all that shit off! Nobody needs any of that crap.

"A text, or short internet message, on the other hand, seems to demand instant attention, and may even lead to a whole thread of conversation."

No, it does not. Mute, delete, block anyone who can't learn. If people persist in sending you junk, you can't let them have access to a ringing bell.

Videogames and Storytelling Mix like Water and Sodium

At best you get tears & corrosive salt water, at worst you get a sodium explosion.

My philosophy of games:

  1. Games are about environment and gameplay only.
  2. Graphics don't matter much, as long as they communicate.
  3. Character and story are what you bring to it, they should not be part of the game.

So, I just dropped a lot of words there with fuzzy definitions:

  • Games: I mean all of tabletop boardgames, role-playing games, and most often videogames of all genres. There's less difference between the Warlock of Firetop Mountain gamebook and Myst than there is between that gamebook and David Foster Wallace's Infinite Jest. And if you tear out the system from Warlock, you get Advanced Fighting Fantasy or Troika!, which is a very nice little RPG for wandering a weird, almost hallucinatory fantasy world with no book, no defined character, no story.

  • Environment: The world you explore. Some of this uses traditional writing skills for designing non-player characters and describing the tone and events, but also architecture, painting, 3D modelling for designing environments, music for writing soundtracks, foley for making environmental sounds.

    I recently enough mentioned this in Videogame Exploration, and I want to especially repeat my suggestion of Bernband, which is goofy, low-rez, standee sprites… and one of the most immersive environments I've ever played in.

  • Gameplay: The continuous loop of doing something, getting feedback on what happened (maybe scores, or your position, or just your understanding of the environment changes), and then repeating forever. That loop might take milliseconds in action games, or minutes to hours in hard adventures. There's a… fixation? a high… you get from that loop when it works right. "Just one more turn" says the Civilization junkie at 4AM before blowing off work. "Just one more mineshaft" says the Minecraft player. "Just one more quest" says the ESO player. (There's a runnable sketch of this loop after this list.)

  • Graphics: This is almost irrelevant, really, despite the huge amount of effort and money spent on it. It doesn't matter if it's text adventures like Colossal Cave Adventure or Infocom's games, character-grids like Rogue and many descendants, 2D or 3D tiled graphical environments like Ultima IV, Super Mario Bros, or Castlevania, painted images along with text like Sierra's King's Quest or the LucasArts SCUMM games, up to 3D FPS graphics like Doom or Elder Scrolls Online. Good gameplay with any graphics is immersive, bad gameplay with perfect graphics is not.

    Easy way to test that: The most popular videogames of all time are: Mario (2D tiles), Zelda (2D & very simple 3D), Minecraft (blocky 3D with the worst programmer-art textures), Animal Crossing (very simple 3D imitating 2D). Graphics-intensive games pop up and vanish, because they're uninteresting.

  • Character: Who you are. In the better kinds of games, this is left blank for you to fill in. If the game engine doesn't accommodate dialogue even as well as Ultima I did, you're a mute wanderer who breaks into people's homes, smashes their crockery looking for coins & drugs/potions, maybe hits X to hear if they have any rumors or leads, then leaves. In action games, very little dialogue is necessary; your weapons speak for you.

    If you can freely define your Character, that interferes with Story. Until recently, at least you could rename your character, but with full voice acting for many games, they either obnoxiously refer to you as "Vestige", "Adept", "Friend", etc., or don't refer to you at all… or don't let you rename your character.

  • Story: This ties in closely with Character: What do you do? If you can wander as you please, make your own fun, whether that's good or harmful to the environment or NPCs, then you have no story, only gameplay. If you can only ride along like an amusement park railroad ride, get a story told to you and then pew-pew-pew to shoot targets, move on to the next stop, you have no gameplay, only story.

    The Disneyland ride model is a big influence, but AAA "games" with story are mostly frustrated Hollywood wastrels in the wrong medium. The obvious recent example is Death Stranding, which has hours of awful cutscenes with Hollywood people who have nothing to do with the game: A mediocre walking simulator/platformer; without the cutscenes, it might even be fun, if tedious.
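Here's the Gameplay loop from above, boiled down to a runnable toy (pure illustration, not any particular game):

import random

pos, gold = 0, 0
while True:   # act -> feedback -> the world changes a little -> repeat
    cmd = input(f"[pos {pos:+d}, {gold} gold] (l)eft, (r)ight, (q)uit? ")
    if cmd == "q":
        break
    pos += -1 if cmd == "l" else 1     # doing something
    if random.random() < 0.3:          # feedback from the environment
        gold += 1
        print("You found a coin. Just one more turn...")

Everything else (graphics, sound, story) decorates that loop.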

An unfortunate result of focusing on Story has been forcing the player to make bad dialogue/action choices to advance, staying on the railroad, unable to get out and wander away: Heavy Rain's no-choice "Press X to Cry Jason" rather than manning up and going to look for your lost child.

The now-defunct Telltale Games' Minecraft Story Mode had a painfully fixed main character and plot, and a doomed character, but let you choose social consequences with allies… which were then forgotten in the next chapter.

Early Final Fantasy games had a totally blank slate. FF3 is right on the cusp; it gives you a sandbox to explore, eventually hit a switch to open the next, bigger sandbox, repeat a couple more times, finally a long multi-part endgame and post-game sidequests. The characters have a secret backstory, but you can rename them, give them any job you want, play them however you want. I did one playthrough with boring Warrior, Thief, White Mage, Black Mage, another using Monk/Black Belt, Red Mage/Dragoon, Scholar/Geomancer, Evoker/Summoner. Utterly different gameplay even if I ended up clearing the same dungeons. My bizarro party got to level 99 to fight the giants.

By FF4, the characters and story are locked in place, you can enjoy it or not, and certainly the art's great and I quote "you spoony bard!" all the time, but you have no choices. Not that I'm blaming that all on JRPGs — there's Japanese games with freedom of choice, and Western games fixed on one character, Gabriel Knight is one of the earliest of this archetype.

Gamebooks like Tunnels & Trolls solos, Fighting Fantasy, Lone Wolf, etc. are odd hybrids, since they have story but almost never a defined character (a few do, like Creature of Havoc). The more linear the gamebook is, the better the story is, but the less interesting it is to play; there's several I've done that had one win and many deaths, and so cannot be replayed. The more meaningful choices they offer, the more incoherent the gamebook becomes, just a bunch of random scenes, because you can't build up meaning the way linear fiction does.

My objection to Dungeons & Dragons adventures from Dragonlance (1984) on is that the game went from freeform dungeon crawls, hex crawls, or "West Marches" wanderings of the Referee's world (maybe loosely using a Greyhawk map or Outdoor Survival, often made up in the days between games or improvised on the spot) to railroaded "adventure paths" with fixed character roles (either named and unkillable like DL, or just "must have fighter, thief, cleric, magic-user, bard, or you will fail"). 5E has become entirely that: their healing/action economy even requires a specific pacing along the railroad, and their world maps are just one-path flowcharts you move along like Candyland.

So in conclusion (almost), just say no to story in your games. Look for that infinite high of gameplay.

  • The Devil's Advocate: There are some attempts to make character or story "gameable", rather than just a railroad, most notably Chris Crawford's Erasmatron, which he then replaced with Storytron, now Wumpus (no relation to the real Hunt the Wumpus game). These have computer-controlled drama: you talk/choose interactions with different "emotional weights", and the NPCs react appropriately. These suck as games. They can be a little interesting as a puzzle, to talk to the NPCs, find out what's going on, maybe push one of them into a "win" state. Nobody'd spend long on one.

It's worth looking at Chris's development woes, like his notes on sequentiality and the list of encounters: in Le Morte d'Arthur he gave up on gameplay; it's a railroad click-thru of Malory's book, with a single fame/piety score to determine win/lose.

His Gamers or Storytellers seems to be an admission of defeat. Yet he still has bigoted, ignorant ideas like:

This also plays into the old “evolution versus revolution” dilemma. I have long held that games will never evolve into anything with artistic merit, because the gaming audience does not expect artistic content from games. You can’t sell Beef Wellington to people who want candy. You can’t sell poetry to people who read comic books. You can’t sell art-house movies to people who watch cartoons. And you can’t sell artistic content to gamers who want action and instant gratification. Games as a medium are ill-disposed to evolve in a storytelling direction.

This is why he fails. Games can have artistic content, just not inbred Hollywood-imitating content. There is plenty of poetry in comic books, obviously Sandman but many an issue of Detective Comics (the smarter Batman series) has moved me deeply. Many art-house movies are cartoons, or vice versa, or were when theatres were a thing, I'd start with Don Hertzfeldt's Rejected and Ralph Bakshi's Wizards. You can't sell poison apples to gamers, not more than once anyway.

I had a look at his soi-disant "Wumpus", and got this, his "non-technical" user interface. It's incredible to me that this is the guy who made Eastern Front and Balance of Power, which were techy but not a giant wall of UI clickies, badly sized in a window. Yes, it's Java, but you can make attractive and usable Java UI, it just requires effort.

I figured out eventually that you can hit Editor/Run Rehearsal (?) to play in something like a dialog-box UI, was able to play through a very dull conversation, and then it got stuck with Jeff explaining widgets to Sam in an infinite loop. Excellent. Obviously story-gaming is a solved problem.

Reinventing the Wheel

Sure, there's existing code. Somebody Else's Code. It works fine, maybe not as fast as you'd like, or the interface isn't quite right. That's how it often is with me and SRFI-13. Olin Shivers is a skilled Schemer, back when that wasn't cool (OK, it's still not cool), but some of his APIs enshrined in early SRFIs drive me a little nuts, and the implementation is slow because it's so generalized.

So after a few false starts and failed tests, I now have these pretty things: (updated 2020-11-10, enforced hstart, hend boundaries)

;; Returns index of `needle` in `haystack`, or #f if not found.
;; `cmp`: Comparator. Default `char=?`, `char-ci=?` is the most useful alternate comparator.
;; `hstart`: Starting index, default 0.
;; `hend`: Ending index, default (- haystack-length needle-length)
(define string-find (case-lambda
    [(haystack needle)  (string-find haystack needle char=? 0 #f)]
    [(haystack needle cmp)  (string-find haystack needle cmp 0 #f)]
    [(haystack needle cmp hstart)  (string-find haystack needle cmp hstart #f) ]
    [(haystack needle cmp hstart hend)
        (let* [ (hlen (string-length haystack))  (nlen (string-length needle)) ]
            (set! hstart (max 0 (min hstart (sub1 hlen))))
            (unless hend (set! hend (fx- hlen nlen)))
            (set! hend (max 0 (min hend (fx- hlen nlen))) ) ;; clamp so hi+ni can't index past the haystack
            (if (or (fxzero? hlen) (fxzero? nlen) (fx>? nlen hlen))
                #f
                (let loop [ (hi hstart)  (ni 0) ]
                    ;; assume (< ni nlen)
                    ;(errprintln "hi=" hi ", ni=" ni ", hsub=" (substr haystack hi hlen) ", bsub=" (substr needle ni nlen))
                    (cond
                        [(cmp (string-ref haystack (fx+ hi ni)) (string-ref needle ni))  (set! ni (fx+ ni 1))
                            ;; end of needle?
                            (if (fx>=? ni nlen)  hi  (loop hi ni) )
                        ]
                        [else  (set! hi (fx+ hi 1))
                            ;; end of haystack?
                            (if (fx>? hi hend)  #f  (loop hi 0) )
                        ]
        ))))
    ]
))

;; Test whether 'haystack' starts with 'needle'.
(define (string-has-prefix? haystack needle)
    (let [ (i (string-find haystack needle char=? 0 0)) ]
        (and i (fxzero? i))
))

;; Test whether 'haystack' ends with 'needle'.
(define (string-has-suffix? haystack needle)
    (let* [ (hlen (string-length haystack))  (nlen (string-length needle))
            (i (string-find haystack needle char=? (fx- hlen nlen)))
        ]
        (and i (fx=? i (fx- hlen nlen)))
))

Written for Chez Scheme, caveat implementor. BSD license, do what thou wilt. If you find a bug, send me a failing test case.
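For instance, a few cases I'd expect to pass, given the definitions above (results are my expected values):

(string-find "hello world" "world")            ; => 6
(string-find "Hello World" "world" char-ci=?)  ; => 6
(string-find "hello" "xyzzy")                  ; => #f
(string-has-prefix? "hello" "he")              ; => #t
(string-has-suffix? "hello" "lo")              ; => #t
(string-has-suffix? "ab" "abc")                ; => #f, needle longer than haystack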

I don't normally bother with fx (fixnum) operations, but in tight loops it makes a difference over generic numeric tower +, etc.