Writing Objective-C with Mulle-Objc

mkdir CLICalc
cd CLICalc
mulle-sde init -m foundation/objc-developer executable

This takes more or less forever.

… Still going. OK, finally done. I hate to think it's gonna do that every new project? Or whenever it updates?

Anyway, bbedit . (fuck Xcode), and add at the bottom of import.h and import-private.h:

#import <Foundation/Foundation.h>

Make src/main.m useful:

// main.m
#import "import-private.h"
#import "CLICalc.h"

int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CLICalc *calc = [[CLICalc alloc] init];
    [calc push:42.0];
    [calc push:69.0];
    double a = [calc pop];
    double b = [calc pop];
    NSLog(@"a=%f, b=%f", a, b);

    [pool release];
    return 0;
}

Create an Objective-C class, src/CLICalc.h:

// CLICalc.h
 "import-private.h"

@interface CLICalc : NSObject

@property (retain) NSMutableArray *stack;

- (void)push:(double)n;
- (double)pop;

@end

and src/CLICalc.m:

// CLICalc.m
 "CLICalc.h"

@implementation CLICalc

@synthesize stack = _stack;

- (id)init {
    self = [super init];
    if (self) {
        _stack = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)dealloc {
    NSLog(@"CLICalc dealloc");
    [_stack release];
    [super dealloc];
}

- (void)push:(double)n {
    [_stack addObject:@(n)];
}

- (double)pop {
    if ( ! [_stack count]) {
        // ERROR: stack underflow
        return 0.0;
    }
    double n = [[_stack lastObject] doubleValue];
    [_stack removeLastObject];
    return n;
}

@end

Doing that without a template was a little hard on the old memory, and I had to use DDG to look up some method names without autocompletion. But I'm pretty sure that's fine.

In mulle-ide, type update to add the new class to cmake. If you look in cmake/_Sources.cmake, you should now see CLICalc.m listed.

Now craft to compile. You'll get a spew of crap, but hopefully no errors.

I am getting this, which I can't resolve:

/Users/mdh/Code/CodeMac/CLICalc/src/main.m:21:55: warning: 'NSAutoreleasePool'
      may not respond to 'init'
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                                   ~~~~~~~~~~~~~~~~~~~~~~~~~ ^
1 warning generated.

But NSAutoreleasePool certainly has init, and it seems to not die?

% ./build/Debug/CLICalc
a=69.000000, b=42.000000

Hooray!

Yeah, this isn't amazing. Except: It's supposedly portable now. I can maybe rebuild this on Linux, or Windows? I dunno.

This is almost classic Objective-C, slightly enhanced from 1.0: We didn't have property/synthesize, or nice object wrappers like @(), when I were a lad; I typed so many [NSNumber numberWithInteger:n]. So get used to the retain/release/autorelease dance. There's no dot-syntax for property access; you type them [] like old-school. But hey, it's a proper compiled language with a nice object system and no GC pausing.

I tried importing Cocoa and got a ludicrous spew of errors, so Mac GUI is gonna be a challenge. But I could import SDL and use that for portable UI, since Objective-C is just C.

Sweet. I'll finish up the calculator's parser in a bit, but then see about doing something useful in it.

Spread of Terrible Programming Languages

Abstract—The English-like business programming language COBOL saw widespread use from its introduction in 1960 well into the 1980s, despite being disdained by computer science academics. This article traces out decisions made during COBOL’s development, and argues that its English-like appearance was a rhetorical move designed to make the concept of code itself more legible to non-programming management at computer-using companies.

I found some of the references much more interesting than the paper, which is a pretty high-level history avoiding the actual boots on the ground details.

COBOL was designed (and fought over very hard on this point) so that unskilled managers could "read" it, but in my view that had little to do with its spread. Middle management where that would matter has no buying power, and executives won't read more than a sentence on a slideshow.

Ubiquity made much more of a difference; no two computer installations were compatible until the late '60s, so the alternatives were COBOL, FORTRAN, LISP, and a hundred weird languages invented at each facility. Given those choices, I'd pick FORTRAN or LISP, but even COBOL would beat rewriting on every machine. A bunch of companies and government agencies ended up clustered on that choice, so it became widespread, not on any merits but because programmers could move code semi-automatically.

I know this because it happened at least five more times that I can think of, and only once with unskilled readability as a goal:

  1. BASIC is a tutorial language for children, very poor for large programs, very slow compared to C or ASM, grossly inferior to Pascal or Logo for any role. BASIC became ubiquitous because it can be implemented in a few K of RAM and worked nearly the same on hundreds of incompatible timesharing and microcomputer systems.
  2. Java is a mediocre Objective-C/Smalltalk replacement, and applets turned out to be too heavyweight for the web and insecure, but cross-platform on servers turned out to be very valuable; cross-compiling C++ is a total crapshoot. Developers can have nice Macs and still compile Java code that runs on non-Mac servers.
  3. Linux (not a language, I know, but same pattern) is hot garbage, the product of a drunk, belligerent Finnish student putting a kernel that'd get him a failing grade in an OS class on his 386. But because it's so quarter-assed and has no device driver support, it runs on anything, like a virus. So now UNIX is all but dead, killed by a nematode parasite that fills the niche.
  4. PHP is a cruel joke, a gross hack to put server-side script in HTML instead of generating HTML in code or templating. But it was easily installed in Apache, and runs everywhere with no setup. So half the web runs on this shit, from WordPress to Facebook.
  5. JavaScript started life as a six-week hack to get LISP- and Self-like programming, with C-like syntax for marketing reasons, in a web browser. And until the early 2000s, it wasn't portable enough for anything useful. But when IE died and the other browsers implemented ECMAScript consistently, it became the universal language. It's still weird and fragile; I don't dare write it without eslint. But it may be the language of the century.

There's the similar case of IBM PC/DOS/Windows vs microcomputers and Macintosh, which were better tools but fragmented, but that's more about central authorities imposing Nazi-supporting IBM, and convicted criminal organization Microsoft bribing and extorting to kill competition. Common languages would likely have been enough to keep competition and diversity going if IBM & MS had been burned to the ground and their scatterlings shot as they ran back in the '70s.

The author of the paper sort of slouches in this direction but doesn't quite get it, when pointing out how science and technical culture has standardized on English. We are all incompatible machines, but a common language lets us argue.

I hate when papers list references without URLs:

  1. 10 PRINT CHR$(205.5+RND(1)):GOTO 10: Fun little book, not at all relevant to the paper.
  2. N. Wardrip-Fruin, Expressive Processing
  3. M.C. Marino, Critical Code Studies
  4. B. Shneiderman, "The Relationship Between Cobol And Computer Science"
  5. J. McCarthy, "Memo To P. M. Morse: A Proposal For A Compiler," Memo CC-56
  6. D. Nofre, M. Priestley, and G. Alberts, "When Technology Became Language: The Origins Of The Linguistic Conception Of Computer Programming, 1950–1960"
  7. M.D. Gordin, Scientific Babel: How Science Was Done Before And After Global English

End of 2018

Let's watch Poseidon — Only available on Netflix until tomorrow! Normally I watch Strange Days, but I feel an upside-down sinking ship is a more accurate metaphor for the year than failed love and revolution and pretty Angela Bassett. Maybe for Chinese New Year (Feb 5), Gabriel Dropout's New Year/armageddon episodes.

I don't go super intimate online, but it's been a rough year. I've lost a friend and two of my last few relatives to cancer, my dad's had some close calls, and his dog died. Doing any kind of work under the stress load is… not great. And I'm not a good friend or coworker in this state. My new puppy is a terror, both looks and behavior like a jackal puppy, but the one really good thing.

The state of software I touched on yesterday. This is the year a new Perilar rises from the ashes, and Learn2JS is moving along nicely; I think that's going to be a big deal, it's a sweet environment.

I goofed off yesterday and started writing tbasic, a Tiny BASIC interpreter in C, because that's a useful thing to do! I've done this before, but made a messy parser. The new one is a tiny single file and much cleaner. Might be published tomorrow morning sometime. While nobody needs BASIC, it's a good C programming exercise, and I can link in SDL2 and give it cross-platform graphics and sound, which is actually kinda neat.

"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
—Edsger W. Dijkstra, EWD 498: How do we tell truths that might hurt?
[mdh: In case you can't read the paper and get the joke, he's joking. Sort of.]

I got a little writing in on Delvers in Darkness, I'm thinking about more adventures for it, solo gamebooks and Refereed.

Poseidon is really terrible already. Everyone's a ridiculous caricature. Oh, this is gonna be a good shipwreck.

What I'm Watching and Criticizing: The Good Place

So, up front: This is a trashy show in a lot of ways, one that's aiming much, much higher and mostly failing. It exists so someone who wasted their college tuition reading philosophers can pick up a paycheck name-dropping Kant in each script; kudos to that guy for finding a way to make philosophy pay. (Disclosure: I also read philosophers in college and since, mostly on my own time, and never tried to extract money for it.) But it is not written as a philosophy treatise, though it occasionally tries; mostly it's just a dumb sitcom.

The main cast are a trashy girl-next-door mean chick, a hot chick with an English accent I hate, a philosophy nerd (irony/shitty writing: black guy, entirely teaching from texts written by old white guys, all but the latest of whom kept slaves; not a single non-honky philosophy is ever discussed), and a moron, trying to survive an afterlife where they don't quite seem to belong, run by well-past-sell-date Ted Danson and a slightly frumpy robot girl (who says she's neither), in standard sitcom cycles (literally: There's mental reboots that happen so episodes can restart at the beginning), though it does change up the formula eventually. I do like the hot chick and the mean chick; they have character. Maybe the robot girl, even limited by her role. Sadly, the nerd is one-note, the moron is barely able to breathe in and out without electric shocks, the ancient stick-figure of Ted Danson is stiff and overacts when he does break being stiff.

The key premise of the show is that you earn "points" by your actions in life, which sorts you into "The Good Place" or "The Bad Place". There's, uh, roughly everything wrong with this.

Obviously first, there's no magical afterlife. It makes no sense: There's no evolutionary advantage to an afterlife, and Humans being the only animals who can rationalize and make up stories to deal with our fear of death is infinitely more likely than that a magic sky fairy suddenly gifted Homo sapiens with an invisible remote backup system. When you die, your brain patterns rot and the program that was you ceases to be recoverable in about 5-10 minutes. There's probably nothing like an Omega Point or Roko's Basilisk for the same reason; that information won't survive from the current hot period of the Universe to the long cold efficient computational period, so no AI can reconstruct you. I'm as sad and angry about this as anyone, but I don't delude myself.

Second, even if we say "YER A WIZARD HARRY" and you have a magical afterlife, it's populated by immortal beings (IB), somehow. Where do they come from? How does that evolve? How do they get magical powers? If Humans can get a half-measure of sanity and wisdom by 40, 60, 80 years, every IB should be perfectly enlightened and know every trick and skill possible by 1000, 100000, 13.5 billion years old. The IBs shown are as stupid and easily-tricked as Humans, when you get to The Actual Plot of this show. To pick the exact opposite of this show, Hellraiser had an internally consistent magical afterlife: "Hell" is an alien universe inhabited by Cenobites with a wide range of power, whose experiences are so powerful that they would seem like torture to a Human; they collect Humans who seek that experience with magical devices, not to reward or punish meaningless behaviors on Earth; good or evil means nothing in Hellraiser.

Every IB in this show is insultingly stupid: repetitive physical tortures by frat boy demons, inferior to Torquemada's work here on Earth; farting evil robot girls; a neutral Judge too silly to be on a daytime TV show, who only wants to eat her burrito. Low, low, lowest-fucking-brow comedy quite often.

Third, and most damning (heh), any system of morality with a scoring system then becomes solely about that scoring system. If "God and/or Santa are Watching" as Christians claim, you must act good according to the dictates of the Bible to score high enough to enter Heaven; it doesn't matter what's logically right and wrong, only the specific rules of an eternal sex-obsessed Middle-Eastern tyrant. Everyone who ate shellfish or wore mixed fibers or got a tattoo, forbidden by Leviticus, or failed to commit genocide & slavery when ordered by a prophet of God, as throughout the entire Old Testament, or masturbated to anyone but their lawfully wedded spouse, as forbidden by Jesus in Matthew 5:28, is gonna have a real bad eternity in Hell.

The scoring system for The Good/Bad Place makes it impossible to commit a "selfless" act unless you're a total moron (so, possibly the moron character, but he's unthinkingly rotten as often as nice). They treat this as a feature, as if you can only do good deeds when you can't see the score.

In philosophy without gods, you can choose to do good (try to define "good" in less than 10,000 pages…) instead of evil (same) because your personal or societal reward system is rigged that way (laws, in general), or because you selfishly want to look altruistic (maybe virtue-signalling to attract a mate), or because universalizing your behavior means you should selfishly do right to raise the level for everyone including yourself ("think global, act local"), or purely at random, and you have still done good deeds. While the ancient Stoics (especially my favorite, Marcus Aurelius) respected piety to the immortalized Emperors and gods of the Pantheon, they didn't ask the gods for rules, they found a way to live based on reason, a modicum of compassion, and facing the harsh world that exists.

But once the authorities put in an objective score system with infinite reward/punishment, you must act to maximize your score; there's no moral debate possible, you would just find the highest reward you can achieve each day and grind on it. Those born with the most wealth and privilege will be much more capable of raising their score instead of attending to life's necessities, so the rich get rewarded, the poor get punished.

This show seems to think Jiminy Cricket sits in your head as a quiet voice without any training, and you just have to listen to it to know good and evil. There's a discussion about Les Miserables re stealing bread (worth exactly -17 points) that's only used for mockery, but in real life that ambiguity is impossibly hard to make rules for.

I liked Eleanor and Tahani, and sometimes Michael, playing off each other enough to keep watching this through S2, but every time Chidi speaks I roll my eyes and wish that just once he'd reference someone not on the Dead Honkys shelf; especially not Prussian Immanuel Kant who wrote some of the earliest texts on "scientific racism", including such gems as "The Negroes of Africa have by nature no feeling that rises above the trifling" (1764, Observations on the Feeling of the Beautiful and the Sublime). Fuck that guy.

★½☆☆☆

The Mother of All Demos

December 9, 1968, Douglas Engelbart's presentation of NLS and teleconferencing:

  • Youtube: This is at 360p, most other uploads are at 240p fuzzy mud, I'd love to have a good HD one where I can read the text. Alas.
  • TheDemo@50

"If in your office, you as an intellectual worker were supplied with a computer display, backed up by a computer that was alive for you all day, and was instantly responsible—responsive—to every action you had, how much value would you derive from that?"

Of course, in reality what we mostly do with that is look at social media, hardly any better than watching TV. But we could do more.

It's been years since I've watched this, and some things jump out at me as I rewatch:

The keyboard beep is infuriating, it's what I consider an error sound. And Doug's fumbling a few times, which suggests the keybindings aren't visible, well-organized, or practiced yet. We see later that they're just code mapped to keys in a resource list.

The NLS word demo is somewhat like a modern programmer's editor with code folding; but notably I don't ever use folding, it's slow (even on 1 million times faster machines!) and error-prone, sucking up far more text than expected. It's also a lot like outliners like OmniOutliner; but while I do sometimes use OO to organize thoughts, I would never keep permanent data in it, because I can't get it into anything else I use. Dumb text is still easier and more reliable; I put my lists in Markdown lists:

- Produce
    + Banana
        * Skinless
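
And since it's dumb text, any scripting language can handle it. Here's a minimal node sketch (mine; nothing to do with NLS or OmniOutliner) that parses that 4-space-indented list into a tree of plain objects:

// parseOutline: turn indented markdown bullets into a JSON tree.
// Assumes 4-space indents and -/+/* bullets, like the list above.
function parseOutline(text) {
    const root = { text: '(root)', children: [] };
    const stack = [{ depth: -1, node: root }];
    for (const line of text.split('\n')) {
        const m = line.match(/^(\s*)[-+*]\s+(.*)$/);
        if (!m) continue; // skip non-bullet lines
        const depth = m[1].length / 4;
        const node = { text: m[2], children: [] };
        while (stack[stack.length - 1].depth >= depth) stack.pop();
        stack[stack.length - 1].node.children.push(node);
        stack.push({ depth, node });
    }
    return root;
}

const md = '- Produce\n    + Banana\n        * Skinless';
console.log(JSON.stringify(parseOutline(md), null, 2));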

Maybe the answer is we should have better tools and APIs for managing outlines? Right now I can manage dumb text from the shell, or any scripting language, or with a variety of GUI tools. OmniOutliner's "file format" is a bundle folder with some preview images and a hideous XML file with lines like:

    <item id="kILRUkulXwk" expanded="yes">
      <values>
        <text>
          <p>
            <run>
              <lit>Stuff</lit>
            </run>
          </p>
        </text>
      </values>
      <children>

Nothing sane can read that; even if I use an xml-tree library, it's still item.values[0].text.p.run.lit to get a single value out!

If I export it to OPML, it loses all formatting and everything else nice, but I get a more acceptable:

    <outline text="Stuff">
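
A minimal node sketch of the difference, using xml2js from npm; I've trimmed both excerpts above into standalone documents, which real files aren't:

// Dig one string out of each format with xml2js.
const xml2js = require('xml2js');

const ooXML = '<item id="kILRUkulXwk" expanded="yes"><values><text><p><run><lit>Stuff</lit></run></p></text></values></item>';
const opml = '<outline text="Stuff"/>';

xml2js.parseString(ooXML, (err, doc) => {
    if (err) throw err;
    // xml2js wraps every child element in an array, so one value
    // costs a chain of [0]s:
    console.log(doc.item.values[0].text[0].p[0].run[0].lit[0]); // "Stuff"
});

xml2js.parseString(opml, (err, doc) => {
    if (err) throw err;
    // Attributes land under '$': one hop.
    console.log(doc.outline.$.text); // "Stuff"
});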

Back to the demo.

The drawing/map editor's interesting. This is pretty much what Hypercard was about, and why it's so frustrating that nobody can make a good modern Hypercard.

Basically every document seems to be a single page, fixed on screen. If a list gets too long, what happens? It doesn't scroll, it just pages a full screen forward/back.

Changing the view parameters is basically CSS; CSS for the editor! Which is what makes Atom so powerful, but it's not easy to switch between views; you'd probably have to make your own theme plugins, or just a script to alter the config file and then reload the editor view.
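
For instance, a sketch for Atom's init script (untested; the theme names are just whatever you have installed):

// A command to flip between two theme sets; setting core.themes
// restyles the editor live, no manual reload.
atom.commands.add('atom-workspace', 'custom:toggle-theme', () => {
    const dark = ['one-dark-ui', 'one-dark-syntax'];
    const light = ['one-light-ui', 'one-light-syntax'];
    const current = atom.config.get('core.themes');
    atom.config.set('core.themes', current[0] === dark[0] ? light : dark);
});

Bind that to a key or run it from the command palette, and you get something faintly like NLS view-switching.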

Inline links to other documents in your editor are interesting. Obviously we can write HTML links, but they have to be rendered out, and no editor can figure out where a reference goes and let you click on it. Actually, this does work in Vim's help system, but nowhere else.

The mouse changed three ways since then: The tail moved to the top, the wheels became a ball which drives two roller-potentiometers inside, and then was replaced with a laser watching movement under a window. Don't look into the butthole of your mouse with your remaining eye. But the basic principle of relative movement of a device moving the pointer, rather than a direct touch like Ivan Sutherland's Sketchpad light pen, or modern touch screens, or a Bluetooth stylus, remains unchanged and is still the fastest way to point at a thing on a screen. Oh, and the pointer was called a "bug" and pointed straight up; Xerox copied this directly in their Star project, while everyone since Apple has used an angled arrow pointer.

The chording keyboard never took off, and having used a few, I see why: It's incredibly hand-cramping. While a two-handed keyboard is awkward with a mouse, you have room to spread your fingers out, and only half the load of typing is borne by each hand. On a chord, each finger is doing heavy work on every character.

The remote screen/teleconferencing setup is hilarious: a CRT being watched by a TV camera, which runs to a microwave transmitter; they couldn't send it over phone lines, since acoustic coupler modems were only 300 baud (bits per second, roughly) at the time.

As with Skype today, every chat starts with "I can't hear you, can you hear me? Fucking (voice chat system)." Later, audio drops out, and all Doug can do is wave his mouse at the other presenter. I've joked before that the most implausible thing in Star Trek isn't FTL, even though that's physically impossible; it's not aliens indistinguishable from humans with pointy ears, half black/white makeup, or bumpy foreheads; it's that you meet an alien starship and can instantly set up two-way video conferencing.

They seem to have a mess of languages; MOL (Machine Oriented Language) is a macro assembler in modern terms. All the languages have to be adapted to NLS, they couldn't just use LISP or FORTRAN. Since changes are recorded by userid, they had git blame!

Split screen! That's a thing I love, and few editors do. You can drag a bar down from the top in BBEdit, and Atom has "Split up/down/left/right" for panes, but then you have to re-open the document in each and it's a pain.

Messaging is a public board (or rather, an outline with each statement as a message), with #INITIALS for addressing, like @USERNAME in the Twitters and such. Like those, there's too much data to process for live updating; everything runs as a batch job that can crash the database. War… Computing never changes.

Cold & hot retrieval are just file search; on the Mac we have Spotlight, and can search by keywords or filename. Though I have some problems with the cmd-space search these days, and mostly open Finder and search from there to get a list of files matching various requirements, or sometimes use mdfind whatever|less from shell, then winnow down "whatever" until I have only a few results. On Windows or Linux, you're fucked; get used to very long slow full-text searches.


What NLS Did, and How We Can Do That

  1. Mouse, Keyboard, bitmapped displays: We have that.
  2. Teleconferencing: Still sucks.
  3. System-Wide Search: Mac users have that, everyone else is boned.
    • It's faster on Linux or Windows to search Google for another copy of existing data than to search the local machine.
  4. Outlining to enter hierarchical data: Nope.
    • All data goes into outlines contained in files.
    • Code as data: Some data is program instructions, in a variety of languages, which can operate on outlines.
    • To enter this outline, I had to keep adjusting my numbers, because I'm writing it in markdown text.

As mentioned above, OmniOutliner is logically very similar, but it's a silo, a trap for your data. The pro version (WHY not every version?!) lets you use Omni Automation, which is basically AppleScript using JavaScript syntax; the problem is waiting for an app to launch, then figuring out where your data is hidden inside some giant structure like app.documents[0].canvases[0].graphics[2] (example from the Omni docs), just so you can extract it for your script.

Brent Simmons is working on Rainier/Ballard, which is a reimagining of Dave Winer's Frontier. I think building a new siloed language and system doesn't solve the real problem, but maybe it'll get taken up by others.

I have for some time been toying with enhancing my Learn2JS shell into an Electron application that would let you write, load, save, and run scripts in a common framework, without any of the boilerplate it needs now. A pure JS shell is just too limited around file and network access, and node by itself is too low-level to get any useful work done. I'm not sure how that works with everything else in your system. While browser localStorage of 2MB or so is sufficient for many purposes, you really want to save local files. While this doesn't force data into outlines, it makes code-as-data easy, and JavaScript Object Notation (JSON) encourages storing everything as big trees of simple objects, which your functions operate on.
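
Something like this minimal sketch is what I mean (hypothetical file name, not actual Learn2JS code): the outline is a plain JSON tree, and "code as data" is just functions walking it.

const fs = require('fs');

// An outline as a plain JSON tree of simple objects.
const outline = {
    text: 'Produce',
    children: [
        { text: 'Banana', children: [{ text: 'Skinless', children: [] }] },
    ],
};

// Any function can operate on the tree; this one renders it back
// out as a markdown list.
function toMarkdown(node, depth = 0) {
    const line = '    '.repeat(depth) + '- ' + node.text + '\n';
    return line + node.children.map((c) => toMarkdown(c, depth + 1)).join('');
}

fs.writeFileSync('outline.json', JSON.stringify(outline, null, 2));
const reloaded = JSON.parse(fs.readFileSync('outline.json', 'utf8'));
console.log(toMarkdown(reloaded));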

(I'm having fun with Scheme as a logic puzzle; but it's not anything I'd inflict on a "normal" person trying to work on data that matters).

If you want to talk about doing more with this, reach me @mdhughes@cybre.space on Fediverse.

China, Shenzhen, and Quality Control

And much more. There's a good lesson there, and it's not just in China; they're only the worst-case end scenario. I look at this shit and see us in 20 years.

Beer (since they bring that up in the first video) is in a precarious place. Since the '80s, we've had some great microbreweries come up like Ninkasi, Big Sky, Deschutes. We used to have good regional beers like Rainier and Lucky, but then they were bought by giant breweries and turned into fake labels. The "mainstream" beers are garbage, mildly contaminated water you wouldn't wet a pigsty down with, sold entirely on marketing budgets and association with sportsball. If I see someone drinking Coors or Bud, I know they're tasteless and scream at hooligans on TV giving each other concussions. Microbreweries are often on shoestring budgets and one downturn can kill them; happily it's easy to homebrew and start a new microbrewery, but it's a thin line keeping us from going dry or drinking fucking Pabst.

We've very stupidly sent all our electronics manufacturing over to China. If you want good hardware from China, you have to stand over them and QA everything like Apple does, and then get accused of using slave labor, which it is (best-paying slave labor in China, but still so unethical it makes me nauseous). Huawei's CFO has been arrested, and many places aren't using their equipment now because they almost certainly have CCP spyware, but it's still the only place that makes most electronics. This is civilization-ending-level stupidity.

We should have our own Shenzhen SEZ (I'm not suggesting loosening our environmental or worker protections, weak as they are under the hideous cheeto person's administration; but some tax incentives would be great), and be making our own hardware for secure systems, but even if we could get workers to do it, we don't have anyone to train them. We used to have Radio Shack for learning to make electronics, but RS fucked it up by selling garbage consumer toys (mostly sourced from China), so they were driven out of business by cheaper online ordering of garbage consumer toys. So we're at the mercy of another country being driven into self-destruction.

US physical infrastructure is crumbling in many places, and entire cities like Detroit are unsafe to inhabit, because politicians have no reason to do anything about it; all money goes back to the companies that bribed them into office, while poor people in cities by definition don't have enough money to bribe them (PR con artists call these bribes "campaign finances"; I call them malfeasance and we should hang them all). The culture of "not my problem" is just as endemic here, in non-"Communist" countries; if we haven't collapsed as far yet it's because most of our buildings are under 100 years old and we build over things faster than they rot.

In software, we do it to ourselves. I dislike/distrust most software, because we have absolutely zero quality control; people ship any damn thing, and even these "walled gardens" don't do anything to stop it. There's so much garbage on the Apple App Store, basically just screenshots and RSS scrapers, or recompiled demo or tutorial book projects, named to take advantage of Apple's shitty search and advertising interfaces, and nobody cares because you can't set a decent price. So only scammers and ad companies and loot-box sellers can make money there. Why does anyone buy a $1000 iPhone when almost all the software on it is shit? Google Play is 100x worse, it's essentially nothing but viruses and scams, because Google's not just uninterested in QA, but profits better from spyware.

Commercial desktop software isn't much better, I think harder about wasting $20 on software than I would on a $100+ non-electronic physical object. In the last 2 years, I have literally upgraded two programs (somewhat reluctantly in one case; old version didn't work on Mojave or I'd've kept using it forever, and the new version can take 5-10 seconds to start, with a splash screen) and bought nothing new, because everything new that I try is shit. I'm using a free thing called "LimeChat" for IRC, because Adium's half-broken by neglect, and it's awful, but slightly better than command-line irc; I wouldn't pay for this.

There's some quasi-commercial stuff where "open source" means you can use the tool but there's ways for the corporation who supports it to make money, and some of these aren't the worst software ever made. WordPress, obviously. Atom's in a dangerous position where it's supported by GitHub, which was making a little money on services, and is now owned by Microsoft, who makes Quality Job #NaN. Will Atom get the performance rewrite finished before Microsoft shutters GitHub? Will it keep working? Nobody knows! Fucking Slack is appallingly bad, not because it's Electron but because non-corporate customers don't matter to them.

Free software is mostly garbage, and we get things like the npm event-stream takeover because nobody maintains their own shit; they just make junk and throw it away, and then we're SHOCKED when criminals see this as an opportunity.

As usual, I don't have solutions, only problems. I write my own software so I don't have to rely on other people's software. I ought to grow my own food, dig a well, and stockpile guns and ammo, but I'll probably just turn Reaver if I survive the coming collapse of everything.

Resize Windows with AppleScript

So I downloaded it with youtube-dl (after more annoyances with MacPorts updates) and a helper script, ytplaylist: [updated 2019-06-22]

#!/bin/sh
# Download a whole playlist as sanely-named mp4s, then pop a notification.
youtube-dl -i --yes-playlist --restrict-filenames --recode-video mp4 -o '%(playlist)s/%(playlist_index)s-%(title)s.%(ext)s' "$1"
osascript -e 'display notification "Youtube playlist downloaded"'

where $1 is the actual playlist URL: "show video list" under the video player, or pick from the DNA Lounge playlists.

Now I have a folder full of properly-named videos. VLC can be opened from the shell with:

~/Applications/VLC.app/Contents/MacOS/VLC jwz_mixtape_200 &

Frustrated by VLC constantly resizing, I ignored the problem for most of the morning, then finally wrote resizeWindow.applescript:

#!/usr/bin/osascript

global appName
global windowX, windowY, windowW, windowH

on run argv
    parseArgs(argv)
    wrapCoords()
    resizeWindow()
end run

on parseArgs(argv)
    set argc to (count of argv)
    if argc ≠ 5 then
        display dialog "Usage: resizeWindow.applescript APPNAME X Y W H"
        error number -128 -- User canceled
    end if
    set appName to item 1 of argv
    set windowX to item 2 of argv as number
    set windowY to item 3 of argv as number
    set windowW to item 4 of argv as number
    set windowH to item 5 of argv as number
end parseArgs

-- Wrap negative coords around to other side
on wrapCoords()
    tell application "Finder"
        set desktopBounds to bounds of window of desktop
    end tell
    if windowX ≥ 0 then
        -- no changes
    else
        set windowX to windowX + (item 3 of desktopBounds) - windowW
    end if
    if windowY ≥ 0 then
        set windowY to windowY + 24 -- menu bar
    else
        set windowY to windowY + (item 4 of desktopBounds) - windowH
    end if
end wrapCoords

on resizeWindow()
    tell application "System Events"
        tell process appName
            set frontWindow to the first window
            set appPos to position of frontWindow
            set appSize to size of frontWindow
            -- display dialog ("front window of " & appName & ": " & (item 1 of appPos) & ", " & (item 2 of appPos) & ", " & (item 1 of appSize) & ", " & (item 2 of appSize))
            -- display dialog (appName & " at " & windowX & ", " & windowY & ", " & windowW & ", " & windowH)
            set size of frontWindow to {windowW, windowH}
            set position of frontWindow to {windowX, windowY}
        end tell
    end tell
end resizeWindow

Now I can just leave it running to update every 5 seconds:

while true; do resizeWindow.applescript VLC 0 -64 720 640; sleep 5; done

Slight annoyance: sometimes it still expands the window further down than it should until I size it smaller, and then it works. Fucking software.

I don't know that what I've done is productive in any way, but I have my MTV.

The kids are disco-dancing
They're tired of rock and roll
I try to tell them, "Hey, that drum machine ain't got no soul"
But they don't want to listen, no
They think they've heard it all
They trade those guitars in for drum machines and disco balls
We can't rewind now; we've gone too far
Internet killed the video star
—The Limousines, "Internet Killed the Video Star"

The Stubbornness of Windows Users

What we've got here is, a total failure to understand the purpose of the device or the OS.

A somewhat long sidebar here, state of the world in desktop operating systems:

  • Windows: Redmond still ships a garbage toy OS which is the bastard child of VMS and MS-DOS, that costs a lot of money, but runs on cheap (but not sub-$200) computers, many of which come in every shape and size. In order to run Windows, you need to have a total lack of aesthetic sense, a willingness to put up with "updates" that brick your computer, a tolerance for Microsoft-Quality™ software ("let's add more buttons to a ribbon bar and ship it!"), and a willingness to use junk hardware that consumes twice as much power as needed and makes noise all the time.
    Slightly positive, the graphics and sound systems work, and you get all the games; if that's all you're after, though, a PS4 or Xbox OnePlus+Ultra/190 (whatever the name is) is a better deal. You can generally browse the web on Windows, and you'll get some viruses and ransomware but it works. Dev tools on Windows are expensive and shitty, so in order to get real dev work done, Redmond now also ships Linux inside Windows. My bias shows, sure: I've never owned a Microsoft product in my life, and I'd eat broken glass before doing so, but I've had to use them in some workplaces. Dire, but minimally functional.

  • Linux: Distros ship a garbage OS for free that runs on garbage computers, including sub-$200 microcontrollers. In order to run Linux, you need to be masochistic, technically educated, not have any need for desktop apps, sound support, graphics support, games (some Steam stuff now works, sometimes, on higher-end machines!). As a server or microcontroller OS, or a very nerdy dev machine (emacs and C), it's adequate and somewhat supported. Only insane people use Linux as a working desktop. I say that as someone who ran it as a working desktop for a decade, and I loathed it.

  • FreeBSD, OpenBSD, NetBSD: Great server OS's, that ship for free and run on slightly more demanding computers. Only the most technical nerds will even know that these exist. Software, you basically write your own or port from other POSIX systems, which half the time is written for broken Linux APIs and so doesn't work right. On the bright side, they have such limited sound and graphics driver support that if you do have compatible hardware, you'll have working sound and graphics. If Mac OS X didn't exist, I'd be using FreeBSD.

  • Haiku (aka BeOS): Seriously, they shipped a working beta, and it seems nice. Great desktop, graphics and sound support if you're on compatible hardware. Down side, minimal software for it, and if you want to write your own the APIs are in C++. Fuck that, no. But… I do like BeOS, probably tolerable as a nerdy dev computer.

  • Mac OS X (or "tacOS", er, "macOS" as they now style it): The last of the UNIX® workstation OS's, that only runs on expensive devices Apple makes (it's possible to "Hackintosh" a garbage computer to run Mac OS X, but half the services won't work; don't do it unless you're nerdier than a FreeBSD user). Everything actually fucking works. Sound has no latency, and always works. Graphics, aside from low-end devices having a shitty Intel GPU, always works; I'm unhappy with them deprecating OpenGL and going with Metal instead of Vulkan, but Vulkan libraries have been ported. It's fine.
    There are games, Elder Scrolls Online and World of Warcraft in particular, and Steam's full of Mac games. Desktop applications on the Mac are adequate to amazing; there's no "you must use this one shitty program because it's all we've got" like GIMP on Linux. As a dev machine, it's unmatched. I don't touch Xcode unless I have to anymore, but that's what you use to make iOS and Mac apps, and it has some good dev tools like Xcode Server.

Of course, I say that, and:

[screenshot: libswiftcore crash] Good job, Apple, ship it.

So, end sidebar, the reason you buy Mac hardware is to run Mac OS X, the least bad hardware/software combination available in this horrible century.

What you'd use a Mac Mini for is what you'd use an iMac for, but cheaper and often hidden away:

  • Switch to the Mac from another OS. Steal the keyboard and screen from your garbage computer. You may need some dongles to convert the cables. Learn how to use a Mac. Buy something better when you need it, and re-sell the Mini, which will still be worth 2/3 or more of the original price.

  • Run a multimedia display. Put a playlist of music, photos, or videos on one or a bunch of LCD panels; you can't do this with a Linux microcontroller like Peter suggests, because their graphics and sound don't work worth a fuck. Just try playing a random folder of media on Linux, you'll throw it through the window. It's worth $800 to not fight with Linux.

  • Run a build farm for Xcode Server. Probably need a mid-priced Mini for this, but speed won't matter much because it's an invisible server. It doesn't need to be rack-mounted, because not every workplace has a machine room with racks; they just need a little device in some (well-ventilated) cupboard to support the developers.

  • Streaming audio, video, or other server. Put this in a colo farm with a static IP, and deliver whatever media you want. Your podcast and web site has to live somewhere. Now, you can do that with AWS/EC2 and other shared servers, much cheaper, but you don't control the computer, they mostly run Linux (ugh), and often you've written software for the Mac.
    I have an old Mini at colo that runs Minecraft, some file shares, holds backups, used to run Xcode builds but I don't need that now, and sometimes runs one-off networking services I want to try out. I may upgrade to a new one, but my needs aren't quite as heavy on it as they were. Invalidstream is currently run from an old Mac Pro, but I think he'd be fine on a higher-end Mini now.

  • Literally any other use that doesn't require using it on the move, or extremely heavy CPU or GPU loads. Not a top-of-the-line gaming, Photoshop, or movie editing device by itself, but an external GPU could put it on par with an iMac, maybe even an iMac Pro for some jobs. Probably not a stage DJ device, there they'd use a MacBook Air or even an iPad, but ideal for sticking in an audio booth and doing podcast recording and mixing. Unlike a garbage computer, you can be in the room with a Mini and not be blasted off the air by the overheating fans and clicking drives, and unlike a MacBook it has enough ports.

Most of those tasks require it to be small, quiet, and still attractive if it is visible.

The $799 base model is for only the most minimal uses. For $1,599, you can get:

3.2GHz 6‑core 8th‑generation Intel Core i7 (Turbo Boost up to 4.6GHz)
8GB 2666MHz DDR4
Intel UHD Graphics 630
512GB SSD storage
10 Gigabit Ethernet (Nbase-T Ethernet with support for 1Gb, 2.5Gb, 5Gb, and 10Gb Ethernet using RJ‑45 connector)

That would be a reasonable Mac workstation, if it had more RAM. +$200 to get 16GB RAM is OK, +$600 to get 32GB RAM is overpriced, and +$1400 for 64GB is "bend over and squeal like a pig". You can get 64GB of the same RAM for under $500, and there's a Snazzy Labs RAM Upgrade Tutorial and Teardown; the disassembly doesn't look fun, but it's worth doing if you're going to use it as a server. A casual user can live on somewhat less RAM, given the Apple RAM tax.

[image: lain-s1e3-open navi]

Windows and Linux users, people who've only used garbage computers, are confused by Apple's attitude on pricing, upgrades, and repairs because they've never thought about non-garbage computing.

Apple doesn't price based on hardware costs (except for RAM, which they tax 50-300% over cost), but on where it fits in a Portability/Power chart, starting at $1000, because you're buying "machine that runs Mac OS X", not "random collection of parts that does not run Mac OS X". You'll never see Apple micro-adjust prices day to day as part prices or exchange rates change, because it has nothing to do with that.

If you want to upgrade an Apple device, other than RAM in some models, you sell it at a high resale value and buy a better one. Garbage computers are useless in a couple years, cost more to replace parts than they're worth, and have no resale value at all.

If you want to repair an Apple device, it's either free for as long as your AppleCare lasts, or $100 in most cases. Don't keep open containers of liquid on your desk (*), don't abuse your expensive hardware, and the repair isn't a problem (notably, the same guy at Snazzy Labs fucked up his iMac Pro and Apple unsurprisingly told him to go piss up a rope).

*: I wanted to link in the atp.fm episodes where John warns about this, and then gets to say "I told you so", but I can't find them with obvious keywords.

Site Redesign

As part of my site redesign, I'm moving everything off my old "markdamonhughes.com" and "markrollsdice.wordpress.com" domains into this site: Software Gallery, Tools, and RPG. Take a look at the front page, browse around, see if you like it. I'm open to advice at this point. I know I haven't done anything too weird with art and design yet, that's coming.

Content management in WordPress isn't trivial, but it's better than the ad-hoc pile of folders and PHP scripting I was doing. I'm still getting by with the standard media folder, but I'm usually disciplined about naming images so search works; there are advanced media manager plugins, but I won't let it get to that point.

Many of the software pages are just "museums" right now. My iPhone software is not currently available (and likely never will be on the iPhone again; Apple's "everything is free" sabotage of developers means it's not possible to charge what software costs to make), but I will rerelease some of it as Mac/Marzipan ports when I get around to it. There's a couple of very cool apps like DungeonJournal (replacement for DungeonDice, but with a mapping & journaling tool!) that were never released properly, and I'd like to get those out. Brigand got adapted back into PerilarFK, so I'm not bothering with it.

I may import the old markrollsdice and dev blog/not-a-blog posts, still pondering on that.

World of Warcraft Classic

This is pretty exciting.

  • I started in original World of Warcraft as a Dwarf Hunter (IIRC, I made a Nelf Rogue first, but that did NOT work for me), determined quickly that I hated Alliance as a bunch of juvenile Nazi-wannabes and switched to the real heroes, The Horde!, played an Undead Mage, didn't progress as much as I'd like due to real life but had fun.
  • Burning Crusade was great, and rather stereotypically I started over with a Blood Elf Warlock (selama ashal'anore!) and maxed out, did some raiding, loved that version of the game. Outland is amazing, that's some of the best content WoW or any game ever made.
  • Lich King was OK but kind of same-y and grindy—why make Dalaran when Shattrath exists and is better? Why are there two indistinguishable forest zones and two indistinguishable undead zones?
  • Cataclysm wrecked everything in the world and in character; it did add some detail to areas that needed it, but it's not worth it.
  • Pandaria had a nice setting and the Monk was a fun class, with beer as a powerup drink!, but letting the Pandas choose their faction made no sense (Goblins in particular, and all the neutral-ish races in general, should've been able to choose, too!), and the higher-level content is trash.
  • Everything after is even worse. After 15 years, what we've learned is that Blizzard can't improve content, only make it worse; everyone good left after Classic and Burning Crusade, and institutionally they aren't able to hire best-of-breed writers or game designers. Restoring Classic is really the only option they have.

So, I've been following this WoW Classic thing a while, since 2016, when the Nostalrius (RIP) and Elysium private servers let me play classic "vanilla" World of Warcraft again; I was frustrated by bugs, and basically couldn't use airships or ships because they'd drop me halfway in an out-of-level zone or the ocean, but it was interesting to have a version of WoW that didn't suck. As previously, I played an Undead Mage; that or Warlock seems most likely when Classic goes live. I am what I am, which is an evil sorcerous corpse.

Crendor and ClassiCast have done excessively exhaustive dialogues of what they want from Classic, which is a super-conservative "No Changes!". I'm much more moderate about this. If they add achievements (added in Lich King), pet battles (added in Pandaria), barbershop (added in Lich King), and use new models (added in Warlords of Draenor), I'd be fine with that; those improve the game without changing any balance. Dungeon/Raid Finder wouldn't bother me, I likely wouldn't use them, since I loathe PUGs (pick-up groups, not tiny inbred dogs), but they're a boon to people in no guild or a small guild that can't organize those things. The actual gameplay should be like the last patch of Classic, or the last patch of Burning Crusade when/if they add that content.

I'd totally be up for getting a "Virtual Ticket" for Blizzcon just to try the demo, except it's $50! That's like, a case of beer and a bottle of whiskey, which is a far more entertaining weekend (please drink more responsibly than me). Minecon on Saturday is free; it's not a real convention anymore (I went to Minecon Vegas back when the audience was much more adult). How the fuck does Blizzard justify $50 for some videos?! So anyway, guess I'll just watch YouTubes of the demo when they come out.