Tower of Babble

Programmers almost compulsively make new languages; within just a few years of there being computers at all, multiple competing languages appeared: FORTRAN, LISP, ALGOL, COBOL.

They proliferated from there into the thousands; probably half of all programmers with 10+ years of experience have written one or more.

I've written several, as scripting systems or toys. I really liked my Minimal script in Hephaestus 1.0, which was like BASIC+LISP, but implemented as it was in Java, the performance was shitty and I had better options to replace it. My XML game schemas in GameScroll and Aiee! were half programmer humor, but very usable if you had a good XML editor. Multiple apps have shipped with my tiny lisp interpreter Aspic, despite the fruit company's ban on such things at the time. There's also been a Brainfuck/FORTH-like Stream, a working-but-incomplete tbasic, and a couple of PILOT variants (I think PILOT is hilariously on the border of "almost useful").

Almost every new language is invented as marketing bullshit based on a few Ur-languages:

  • C++: Swift
  • Java: Javascript (sorta), C#, Go
  • Awk: Perl, Python, PHP, Julia
  • C: Rust
  • Smalltalk: Objective-C
  • Prolog: Erlang, Elixir
  • ALGOL: C, Pascal, PL/1, Simula, Smalltalk, Java
  • LISP: Scheme, ML, Haskell, Clojure, Racket
  • BASIC: None, other than more dialects of BASIC.
  • FORTRAN: None in decades, but is the direct ancestor of ALGOL & BASIC.
  • COBOL: None in decades.

A few of these improve on their ancestors in some useful way, often just better performance, but most do nothing new; it's plausible that ALGOL 68 is a better language than any of its descendants, it just has mediocre compiler support these days.

Certainly I've made it clear I think Swift is a major regression: less capable, stable, fast, or even readable than C++, a feat I would've called impossible except as a practical joke a decade ago. When Marzipan comes out, I'll be able to rebuild all my 15 years of Objective-C code and it'll work on 2 platforms. The Swift 1.0 app I wrote and painfully ported to 2.0 is dead as a doornail, and current Swift apps will be uncompilable in 1-2 years, and lost when Apple abandons Swift.

When I want to move my Scheme code to a new version or any other Scheme, it's pretty simple: I made only a handful of changes, other than library importing, going from MIT Scheme to Chez to Chicken 4 to Chicken 5. When I tested it in Racket (which I won't be using) I had to make a handful of aliases. Probably even CLISP (which is the Swift of LISPs, except it fossilized in 1994) would be 20 or 30 aliases; its broken do iterator would be hard, but the rest is just naming.

Javascript is a pernicious Herpes-virus-like infection of browsers and desktops, and nothing can ever kill it, so where it fits the problem, there's no reason not to use it. But there's a lot it doesn't do well.

I was leery of using FreePascal because it has a single implementation (technically Delphi still exists, but it's $X,000 per seat on Windows) and minimal libraries, and in fact when it broke on macOS Mojave, I was disappointed but I-told-you-so.

I'm not saying we should quit making new Brainfuck and LOLCODE things; I don't think it's possible for programmers to stop without radical brain surgery. But when you're evaluating a language for a real-world problem, try moving backwards until you find the oldest and most stable thing that works and will continue to work, rather than piling more crap onto a rickety new framework.

The Biblical reference in the title amuses me, because we know now that it requires no malevolent genocidal war deity scared of us invading Heaven to magically confuse our languages and make us work at cross purposes; anyone who can write and think splinters their thought into a unique language and then argues about it.

Return of the Objective-C Jedi

[[[ These ]]] are your father's square brackets, the weapons of a Jedi Knight.
Not as clumsy or random as C++.
Elegant weapons for a more civilized age.

What's Different in Mulle-ObjC

This is like Objective-C circa 2010(?), good but not fully baked. Far better than circa 1986-2009, when it was a very thin translation layer over C.

  • No ARC (Automatic Reference Counting). This is just invisible sugar to hide retain/release/autorelease, and while ARC's convenient, it's trivial if you actually know how reference counting works. Don't really miss it.
  • No dot property syntax. [[myObj name] length] instead of myObj.name.length, and [myObj setName:newName] instead of myObj.name = newName. I can live with it, but I really did like dot syntax, even if it does "overload" the . operator and hide the distinction between methods and variables.
    • When dot syntax came out, Objective-C nerds came close to fistfights over this. You would not believe the venom some people had for it. Most of those nerds died or quit or got old & tired before fucking Swift came around, I guess.
  • No array syntax. [myList objectAtIndex:i] instead of myList[i]. This is a pain in the ass, I'll have to write some shorthand macros (or rather, go dig them out of my very oldest code); see the sketch after this list.
  • No blocks. This one hurts, but it's a reasonable pick-your-battles decision. Classic: write a method, dispatch to it, and call back success somehow. Blocks: create a weakSelf reference, enclose it, search-replace self in your block, pick one of a half-dozen complex GCD methods, get a memory leak because you retained something across the block boundary (a sketch of that mess follows this list, for contrast). The classic way is annoying but logically simpler:
    [self performSelectorInBackground:@selector(computeData:) withObject:inputData];
    
    - (void)computeData:(id)inputData {
        // create outputData
        [self setOutputData:outputData];
        [[NSNotificationCenter defaultCenter] postNotificationName:NOTI_DataComputed object:nil];
    }
    
  • Has object literals: @42 and @(var) create an NSNumber, @[] creates an NSArray, @{} creates an NSDictionary; dicts use key:value order, not the reverse order used in -[NSDictionary dictionaryWithObjectsAndKeys:], and array and dicts don't need a trailing nil, which was a constant source of mystifying bugs back in the day. Big win!
    • Hmn, crashes if you do something janky like [@[] mutableCopy]: mulle_objc_universe 0x1006adef0 fatal: unknown method 5e1b0403 "-getObjects:range:" in class 7aa0d636 "_MulleObjCEmptyArray"
  • Has for (id x in container) loops, using NSFastEnumeration. The 1.0 process of looping enumerations was awful, so this is very nice.
  • Huh, does have @autoreleasepool, so maybe I should use that instead of NSAutoreleasePool like a caveman? It compiles and seems to work.
  • Properties have attributes assign/retain, nonatomic/atomic, nonnullable, readonly; the default is assign nonatomic, with no "nullable" or "readwrite" flags needed. As it should be.
  • Weird isa define instead of pointer: blog post
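
To make those concrete, here's the flavor of code this dialect wants. A sketch, not gospel; OBJ_AT is my own shorthand macro (the kind I mean above), not something mulle-objc ships:

#import "import-private.h"  // Foundation, as set up in the walkthrough below

// Hypothetical shorthand for the missing array syntax:
#define OBJ_AT(a, i) [(a) objectAtIndex:(i)]

static void demo(void) {
    @autoreleasepool {
        // Object literals: @() boxes numbers, @[] and @{} build collections
        // in key:value order, no trailing nil.
        NSArray *langs = @[ @"ALGOL", @"LISP", @"BASIC" ];
        NSDictionary *years = @{ @"ALGOL": @1958, @"LISP": @1958, @"BASIC": @1964 };

        // No dot or [] syntax: brackets all the way down.
        NSString *first = OBJ_AT(langs, 0);
        NSLog(@"first is %@ (%@)", first, [years objectForKey:first]);

        // NSFastEnumeration:
        for (NSString *lang in langs) {
            NSLog(@"%@", lang);
        }

        // Manual reference counting: you own what you alloc/copy/retain,
        // so you release it; everything else, leave alone.
        NSMutableArray *stack = [[NSMutableArray alloc] init];
        [stack addObject:@(42)];
        [stack release];
    }
}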
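
And for contrast, roughly the blocks/GCD version you'd write on Apple platforms, inside some method. This won't compile under mulle-objc (no blocks, no __weak), and computeOutputFrom: is a made-up helper:

// Apple-platform blocks/GCD version; NOT valid mulle-objc.
__weak typeof(self) weakSelf = self;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    id outputData = [weakSelf computeOutputFrom:inputData]; // hypothetical helper
    dispatch_async(dispatch_get_main_queue(), ^{
        [weakSelf setOutputData:outputData];
        [[NSNotificationCenter defaultCenter] postNotificationName:NOTI_DataComputed
                                                            object:nil];
    });
});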

TODO

  • I haven't set up an NSRunLoop or the equivalent of NSApplication (which is in AppKit, not Foundation), need to do that and then I'll have a working app template; rough sketch below.
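
Roughly what I expect that to look like, Foundation-only. AppDelegate and tick: are placeholder names, and I haven't tested this against mulle-objc's Foundation:

// Sketch: keep a Foundation-only process alive without AppKit's NSApplication.
AppDelegate *app = [[AppDelegate alloc] init]; // placeholder class
[NSTimer scheduledTimerWithTimeInterval:1.0
                                 target:app
                               selector:@selector(tick:)
                               userInfo:nil
                                repeats:YES];
[[NSRunLoop currentRunLoop] run]; // blocks, servicing timers and input sources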

Writing Objective-C with Mulle-ObjC

mkdir CLICalc
cd CLICalc
mulle-sde init -m foundation/objc-developer executable

This takes more or less forever.

… Still going. OK, finally done. I hate to think it's gonna do that every new project? Or whenever it updates?

Anyway, bbedit . (fuck Xcode), and add at the bottom of import.h and import-private.h:

#import <Foundation/Foundation.h>

Make src/main.m useful:

// main.m
#import "import-private.h"
#import "CLICalc.h"

int main(int argc, char *argv[]) {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CLICalc *calc = [[CLICalc alloc] init];
    [calc push:42.0];
    [calc push:69.0];
    double a = [calc pop];
    double b = [calc pop];
    NSLog(@"a=%f, b=%f", a, b);

    [pool release];
    return 0;
}

Create an Objective-C class, src/CLICalc.h:

// CLICalc.h
#import "import-private.h"

@interface CLICalc : NSObject

@property (retain) NSMutableArray *stack;

- (void)push:(double)n;
- (double)pop;

@end

and src/CLICalc.m:

// CLICalc.m
#import "CLICalc.h"

@implementation CLICalc

@synthesize stack = _stack;

- (id)init {
    self = [super init];
    if (self) {
        _stack = [[NSMutableArray alloc] init];
    }
    return self;
}

- (void)dealloc {
    NSLog(@"CLICalc dealloc");
    [_stack release];
    [super dealloc];
}

- (void)push:(double)n {
    [_stack addObject:@(n)];
}

- (double)pop {
    if ( ! [_stack count]) {
        // ERROR: stack underflow
        return 0.0;
    }
    double n = [[_stack lastObject] doubleValue];
    [_stack removeLastObject];
    return n;
}

@end

Doing that without a template was a little hard on the old memory, and I had to use DDG to look up some method names without autocompletion. But I'm pretty sure that's fine.

In mulle-ide, type update to add the new class to cmake; if you look in cmake/_Sources.cmake you should now see CLICalc.m listed.

Now craft to compile. You'll get a spew of crap, but hopefully no errors.

I am getting this, which I can't resolve:

/Users/mdh/Code/CodeMac/CLICalc/src/main.m:21:55: warning: 'NSAutoreleasePool'
      may not respond to 'init'
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
                                   ~~~~~~~~~~~~~~~~~~~~~~~~~ ^
1 warning generated.

But NSAutoreleasePool certainly has init, and it seems to not die?

% ./build/Debug/CLICalc
a=69.000000, b=42.000000

Hooray!

Yeah, this isn't amazing. Except: It's supposedly portable now. I can maybe rebuild this on Linux, or Windows? I dunno.

This is almost classic Objective-C, slightly enhanced from 1.0: we didn't have property/synthesize, or nice object wrappers like @(), when I were a lad. I typed so many [NSNumber numberWithInteger:n]. So get used to the retain/release/autorelease dance. There's no dot syntax for property access; type the brackets out old-school. But hey, it's a proper compiled language with a nice object system and no GC pausing.
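
If ARC is all you've ever known, the dance goes like this (a refresher sketch, nothing mulle-objc-specific):

NSNumber *owned = [[NSNumber alloc] initWithInteger:42]; // alloc: you own it
NSNumber *temp = [NSNumber numberWithInteger:42];        // autoreleased: the pool owns it
[owned retain];                                          // claim it again: count 2
[owned release];                                         // count 1
[owned release];                                         // count 0, freed
// temp goes away when the enclosing autorelease pool drains.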

I tried importing Cocoa and got a ludicrous spew of errors, so Mac GUI is gonna be a challenge. But I could import SDL and use that for portable UI, since Objective-C is just C.

Sweet. I'll finish up the calculator's parser in a bit, but then see about doing something useful in it.

New Electron Dance

No, wait, that's the Neutron Dance, I get those confused.

Electron

Since abandoning any hope of the iOS App Store paying my bills, I've had to look back at the web or desktop. My current available time-at-computer and energy these days isn't sufficient for a day job or even contracts, much to everyone's dismay. So time for another hard look at the situation.

I like working on my Mac, but Mac isn't that big a market. I also want to ship on Windows (and Linux, I suppose). Objective-C is one of my favorite languages ever, Cocoa & UIKit (on iOS) were great APIs, AppKit on the Mac much less so, but since Apple's killed Obj-C and it's not portable, my happy years of typing [ ] are over.

WHAT HAVE YOU DONE?!

Swift might be the worst mental disorder to strike programmers in decades. Swift is orders of magnitude slower than Objective-C, crashes constantly, the moving-target "spec" creates incompatible changes every year, and because they're too stupid to standardize a binary interface, every program has a 20MB+ blob of Swift runtime. For a single-platform joke language perpetrated by a C++ bozo who fucked off after a year to play with cars. So I'm all too happy to say good riddance to that bullshit. I mean exactly this: If you're using Swift, you either don't know better (it's OK to say you don't know!), or are defrauding your employer for hours, or have something wrong inside.

13 years ago, Project Builder/Interface Builder was a pretty good dev toolkit since I could use a real editor (BBEdit) with it, but Xcode locked that out, and then as Apple sucked in more tools over time, it sucked harder and harder; I can't stand the rickety deathtrap these days. I was getting by in JetBrains' AppCode, but still had to use Xcode for Interface Builder (RIP) and to get builds onto a device half the time. Xcode is a crashy, substandard pile of shit with maybe the worst editor in any IDE in history. Syntax highlighting stops working at random, for most of a decade it has code-completed "nss" as "NSStreamDelegate" rather than the slightly more useful "NSString" (before that it couldn't code-complete at all!), I could go on for hours or days about how Xcode kicks you in the input/output ports every time.

And the worst part is you can't fix the fucking thing, no user-serviceable parts inside, Radar is a black hole, no scripting or plugins. Just bend over and take what Apple Developer gives you good and hard. It's kind of a relief that current Xcode doesn't run on the last stable MacOS version (Sierra).

I'll stick with BBEdit for text and Atom for code, thanks. If I'm angry at Atom I can fix it myself or file a publicly-trackable ticket; I'm rarely angry at BBEdit but I can ask Rich to fix it.

So I'm writing web-type software in Javascript, with Node or Electron behind it. Javascript aka ECMAScript has become a good language in the last 5-10 years, and the V8 runtime in Node/Electron runs close enough to native now for most needs. I love that I can just write UI in HTML again. No fucking around with Apple's bullshit of deprecating APIs out from under me (I "get" to rewrite alert/menu code again?!), or promising to support SpriteKit/SceneKit across iOS & Mac and then doing fuck-all on either. WebGL (or Three.js, anyway) isn't fast enough for complex scene-graphs, but 2D work in Canvas is mostly fine (and it gets better every year, instead of bit-rotting like unused S*Kit APIs). localStorage in a web page isn't enough for any real program, thus Node is needed to reach the filesystem.

I slander Swift for leaving a giant runtime turd in every program, but Electron's the same way: It has to contain a browser, Node, and system APIs. But I'm not at the mercy of Apple's marketing-driven dev tools.

Certain Mac nerds obsess about Purity of Essence, insisting that everyone should love Xcode, Swift, and AppKit, and that use of any other technology is an abomination to the end-users, whom they clearly love more than me. Can you hear that slurping sound? That's someone fellating Apple marketing. Roughly 4 billion more people are familiar with web pages and will find a web-like UI more comfortable.

I intend to keep up my experiments in Scheme and Pascal when I have time, I'd far rather have small, fast, native binaries on every platform, but shipping beats purity.

Progress is being made:

[screenshots: tile-20180426-map, tile-20180426-view]

(the + road texture there will get replaced soonish)

Swiftian Satire, or Tragedy?

I honestly cannot tell if Swift developers are seriously eating Irish babies, or taking the Mickey.

From bad implementations of Equatable and Hashable, the wag leaps to:

extension GridPoint : HashVisitable {
    func hash<H: Hasher>(_ hasher: inout H) {
        self.x.hash(&hasher)
        self.y.hash(&hasher)
    }
}

Bravo! That's easily the funniest punchline to a programming joke since "where do you think the chaos came from?"

It starts with the most bizarre strawman Objective-C Sieve of Eratosthenes I've ever seen. A real implementation would be in C, because Obj-C is C with objects, and it'd be massively faster:

#include <stdlib.h>
#include <stdio.h>

typedef char BOOL; // or link in Foundation
#define YES 1
#define NO 0

int main(int argc, char **argv) {
    if (argc != 2) {
        printf("Usage: primes COUNT\n");
        exit(1);
    }
    long n = atol(argv[1]);
    if (n < 2) {
        printf("COUNT must be >= 2\n");
        exit(1);
    }
    BOOL p[n];               // p[i]: is i still possibly prime?
    p[0] = NO;
    p[1] = NO;
    for (long i = 2; i < n; ++i) {
        p[i] = YES;
    }
    for (long i = 2; i < n; ++i) {
        for (long j = i*i; j < n; j += i) {
            p[j] = NO;       // cross off multiples; start at i*i, smaller ones are done
        }
    }
    for (long i = 1; i < n; ++i) {
        if (p[i]) {
            printf("%ld ", i);
        }
    }
    puts("");
    return 0;
}

This does use more lines of code, but they're short, low-density, and it's instantly obvious what it's doing (my predilection for 1-char var names aside). Can you actually decode his filter-based version?

func sieve(_ sorted: [Int]) -> [Int] {
    guard !sorted.isEmpty else { return [] }
    let (head, tail) = (sorted[0], sorted[1..<sorted.count])
    return [head] + sieve(tail.filter { $0 % head > 0 })
}

let numbers = Array(2...1000000)
let primes = sieve(numbers)
print(primes)

And the runtime experiment:

mdh@Aegura:~/Code/CodeC% time clang -O3 -o primes primes.c                
clang -O3 -o primes primes.c  0.04s user 0.41s system 83% cpu 0.534 total
mdh@Aegura:~/Code/CodeC% time ./primes 1000000 >~/Desktop/primes.txt      
./primes 1000000 > ~/Desktop/primes.txt  0.02s user 0.00s system 89% cpu 0.025 total

mdh@Aegura:~/Code/CodeSwift% time swiftc -O -o swiftPrimes swiftPrimes.swift
swiftc -O -o swiftPrimes swiftPrimes.swift  0.57s user 0.64s system 69% cpu 1.754 total
mdh@Aegura:~/Code/CodeSwift% time ./swiftPrimes >~/Desktop/swiftPrimes.txt  
./swiftPrimes > ~/Desktop/swiftPrimes.txt  51.79s user 26.49s system 99% cpu 1:18.78 total

So the naïve C implementation is about 3,151x faster (0.025s vs 78.78s of total run time). I can't measure it more precisely, because any limit big enough to time accurately in C would take Swift until the heat death of the Universe.

So here's my question: Is Vincent aware of this? Is his running theme of "diabetes", "sugar", "saccharine", etc. pointing at how fat, bloated, slow, and deadly Swift is? He never lets on that it's a joke; he just keeps tossing more syntax layers on top of Swift.

Installers

Indie game dev leads you to some dark and terrible places.

I so miss the App Store being an endless payout slot machine without spending $10M on advertising, and miss the 6-figure jobs for fixing peoples' apps because nobody knew Objective-C (even fewer know it now, but they're stupidly trying to rewrite code they don't understand into Swift, which will break again in 6 months).

Now I'm a poor but honest pixel farmer, forced to shovel shit to get to market.

Making a Mac binary for Reaper's Crypt was trivial (on a Mac, probably impossible elsewhere), and produced 1 file: "Reaper's Crypt.app" (a Mac application bundle, hiding all the mess so you don't see it).

Making a Linux binary was not much harder, and produced 17 files and directories, with libraries and data scattered all over, with the binary sitting in the middle where nobody could see it. So I'll have to make a little script to go launch that untidy mess. When I did Linux, there were at least 3 standards for icons, and by now I'm sure there are 13 more, so they get a raw image file.

Making a Windows binary required me to install WINE with MacPorts, which took hours, and the binary is in the middle of a similar mess of 20 files and directories. So for this I need an installer to make a .msi file, which nobody I know has done this decade; I think I have a handle on this. But now I don't know if I need 32-bit "win32" or 64-bit "win32" (what.); there's no fat binaries in Windows, so it's one or the other.

I am not Hercules, and these Augean stables are filthy.

Swift

Swift amazes me. A beta language that breaks your code every 6 months, a type system so totalitarian and inescapable it makes BDSM Haskell look like a vacation (and apparently nobody's read Gödel's paper), the founder abandoned it to go work on cars, Apple won't ship production code in it, compiling burns your fucking CPU to the ground for tens of minutes on code C can compile in seconds, and after 3 years Xcode still can't refactor it.

And stupid motherfuckers write their production apps in it.

I know Objective-C is hard. It's C plus Smalltalk, both of which are subtle and take a year or two to learn. [brackets scare:theNoobs] && dot.syntax.isOverloaded;
But the tools fucking work. Dynamic code makes programmers efficient. A more elegant weapon for a more civilized age.