Swiftian Satire, or Tragedy?

I honestly cannot tell if Swift developers are seriously eating Irish babies, or taking the Mickey.

From bad implementations of Equatable and Hashable, the wag leaps to:

extension GridPoint : HashVisitable {
    func hash<H: Hasher>(_ hasher: inout H) {
        self.x.hash(&hasher)
        self.y.hash(&hasher)
    }
}

Bravo! That's easily the funniest punchline to a programming joke since "where do you think the chaos came from?"

It starts with the most bizarre strawman Objective-C Sieve of Eratosthenes I've ever seen. A real implementation would be in C, because Obj-C is C with objects, and it'd be massively faster:

#include <stdlib.h>
#include <stdio.h>

typedef char BOOL; // or link in Foundation
#define YES 1
#define NO 0

int main(int argc, char **argv) {
    if (argc != 2) {
        printf("Usage: primes COUNT\n");
        exit(1);
    }
    long n = atol(argv[1]);
    BOOL p[n]; // p[i]: is i prime?
    p[0] = NO;
    p[1] = NO;
    for (long i = 2; i < n; ++i) {
        p[i] = YES;
    }
    for (long i = 2; i < n; ++i) {
        for (long j = i*i; j < n; j += i) {
            p[j] = NO;
        }
    }
    for (long i = 1; i < n; ++i) {
        if (p[i]) {
            printf("%ld ", i);
        }
    }
    puts("");
    return 0;
}

This does use more lines of code, but they're short, low-density, and it's instantly obvious what it's doing (my predilection for 1-char var names aside). Can you actually decode his filter-based version?

func sieve(_ sorted: [Int]) -> [Int] {
    guard !sorted.isEmpty else { return [] }
    let (head, tail) = (sorted[0], sorted[1..<sorted.count])
    return [head] + sieve(tail.filter { $0 % head > 0 })
}

let numbers = Array(2...1000000)
let primes = sieve(numbers)
print(primes)

And the runtime experiment:

mdh@Aegura:~/Code/CodeC% time clang -O3 -o primes primes.c                
clang -O3 -o primes primes.c  0.04s user 0.41s system 83% cpu 0.534 total
mdh@Aegura:~/Code/CodeC% time ./primes 1000000 >~/Desktop/primes.txt      
./primes 1000000 > ~/Desktop/primes.txt  0.02s user 0.00s system 89% cpu 0.025 total

mdh@Aegura:~/Code/CodeSwift% time swiftc -O -o swiftPrimes swiftPrimes.swift
swiftc -O -o swiftPrimes swiftPrimes.swift  0.57s user 0.64s system 69% cpu 1.754 total
mdh@Aegura:~/Code/CodeSwift% time ./swiftPrimes >~/Desktop/swiftPrimes.txt  
./swiftPrimes > ~/Desktop/swiftPrimes.txt  51.79s user 26.49s system 99% cpu 1:18.78 total

So the naïve C implementation is about 3,151x faster. I can't push the comparison any further, because a limit big enough to measure in C would take Swift until the heat death of the Universe.

So here's my question: Is Vincent aware of this? Is his theme of "diabetes", "sugar", "saccharine", etc. pointing at how fat, bloated, slow, and deadly Swift is? He never lets on if this is a joke; he just keeps tossing more syntax layers on top of Swift.

Apple Special Event

  • Live video doesn't work in Safari beta. Had to watch on iPad.
  • Steve Jobs tribute. Makes me a little uncomfortable. I don't think Steve would've liked being deified like this. He wasn't sentimental; he wanted the work to speak for itself.
  • Apple Park. Very pretty as a modern cathedral, but still that open plan is going to be Hell for developers.
  • Apple "Town Squares". This is a very 20th Century kind of thing, a real-world gathering place, where you're supposed to learn from others. But now everyone just lives by their computer and talks online, watches online video. There's a lot of "we're restoring historic buildings" in this; the Medicis funding arts while politicking to get their Popes elected.
  • The stream isn't doing the usual dual-camera picture-in-picture of presentation and zoomed-in view of presenter, so often I only see a tiny bit of a presentation screen, then it flips out for context. Very jarring.
  • Apple Watch: exercise ad, a lot of heart-health study talk. Almost every watch app is getting redesigned again, because they can't figure out what it's for, beyond being a watch. "Now you can take a phone call, while you're surfing!" Streaming audio is their solution to killing the iPod… But how much is that going to impact your data cap?
  • AppleTV. A presentation screen on streaming video can't show HD vs. 4k HDR video, so they desaturated the "HD" image to make the images look different.
    • Aside: I've never much liked Spider-Man, but the new movie looks stupid, the costume looks like a plastic CGI figure (which it is, I guess), and the fight scene was so over-choreographed it looked like ballet, not a kid in a brawl with thugs.
    • Aside 2: I love thatgamecompany's games, Flower and Journey are amazing. Sky looks just as good. But I'm dubious in the extreme about social gaming in it; that was the weakest part of Journey, which has almost no interaction.
  • "For the first time, you were actually touching the button!" And then iOS 7 destroyed that UI by removing buttons and making everything a bland white void. Thanks IVE-1138.
  • Ha ha everyone who had "iPhone X" on their bingo card, it's "iPhone 8". There's a regular fat model, and a super-fat + model.
  • The camera is much better. I dislike the term "portrait mode", which doesn't mean portrait-vs-landscape, but bokeh.
  • Using ARKit (Augmented Reality) to put virtual objects into the real world is still silly: you're still staring at your phone. It replaces a convenient on-screen camera control with having to spin around like a doofus; you can't just sit in your chair and play comfortably.
  • There are sane uses of AR to overlay physical things, like landmarks, or provide auto-translation. If Glassholes and naked Robert Scoble hadn't ruined Google Glass, it might have been a useful interface. But holding up your phone to do it is still silly.
  • Wireless charging is a nice thing. In a car it's a strange fit, since the phone would just bounce around without something holding it in place, like a cable does.
  • One More Thing: A separate model of iPhone X, with all the crazy rumor stuff. No home button, edge-to-edge screen.
  • FaceID: From now on, you need to wear a mask at all times or anyone can use your "true face" to unlock your phone. The pigs can just hold your phone up in front of you to dig thru it. Good thing there's animoji, so you can send a completely virtual face (panda, poop, robot, or alien) to replace that pesky human interaction.
    • Aside: I am wearing a Star Trek Mirror, Mirror t-shirt. I'm the one with the goatee.
  • iPhone X (pronounced "Ten"), and Qi chargers (pronounced "Chi") provide all new ways for Apple devotees to "well, actually" everyone else.
  • Skate to where the oh not this quote again.

Probably makes sense for me to wait a couple months for the iPhone X and get the bleeding-edge device rather than a better what-I-already-have.

OK, get back to work.

Pascal Learning Curve

What I've learned so far:

  • I spent a while trying graphics libraries (or failing to even compile them) before deciding I don't understand the UI model enough yet, so I'll prototype with some high-level drawing and circle back around to OpenGL or SDL.
  • Build a do-nothing app in Lazarus, make a single form with a default FormCreate method, then quit out and write code starting from there in BBEdit.
    • Part of that is that I'm not going to use a ton of GUI components, and building UI by generating code is the evil opposite of Interface Builder. In IB, you edit the UI and connect it to method names scanned out of the source code; it doesn't touch your code.
    • The Lazarus editor is nightmarishly wrong and keeps inserting stuff in my code, which makes me crazy. Maybe there are non-crazy-making settings, and probably it seems fine to masochistic Windows and Linux users, but I make enough problems in my life.
  • Naming conflicts are a giant problem, so my current naming scheme is: for class "foo", it goes in file FooUnit.pas, containing unit FooUnit and type TFoo = class…. I'm naming instance fields _bar and accessors bar() and setBar() as I do in most languages (there's a sketch after this list). I've mostly got the compiler to stop screaming at me every build. Not letting you name a unit, class, and field the same thing is infuriating.
  • Build with lazbuild -B --bm=Release whatever.lpi; I wrote a script to choose Debug or Release builds and launch the app if nothing went wrong, which is close enough to hitting Cmd-R.
  • Bookmark the docs for:
    • RunTime Library
    • Free Component Library - in particular unit 'contnrs' wants to buy some vowels but has dictionaries, lists, etc.
    • Lazarus Class Library
    • There's very little explanation, so often I have to go digging in source like /Developer/lazarus/lcl. I lost about 30 minutes today because they didn't document that TCanvas.FillRect uses Brush settings and TCanvas.Rectangle uses Pen settings; I figured it out by reading the Carbon implementation. ☕️
  • Almost always if there's a non-domain-specific type I need it already exists. Batteries are included but mostly they're named badly, or upside down, or hidden in sofa cushions, or the dog buried them and I need a metal detector, or my psycho ex stole them and is holding them hostage for a pity fuck.
  • The actual implementation code isn't much different from any other procedural language. For a guy who codes in Pascal for one year every 10 years, it's rolling along pretty fast. The near-equivalency of records and classes, and of functions, properties, and methods is convenient. Defining vars before using them, like in old-timey K&R C, is not convenient. Inline variable definition would be a gigantic quality of life improvement, which I doubt they'll do.
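
Concretely, here's the naming scheme from the list above (a minimal sketch with placeholder names, not anything from my real project):

unit FooUnit;

{$MODE OBJFPC}{$H+}

interface

type
  TFoo = class
  private
    _bar: integer;
  public
    function bar: integer;
    procedure setBar(aValue: integer);
  end;

implementation

function TFoo.bar: integer;
begin
  result := _bar;
end;

procedure TFoo.setBar(aValue: integer);
begin
  _bar := aValue;
end;

end.

Unit FooUnit, class TFoo, field _bar: three different spellings, so the compiler has nothing left to scream about.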

Pascal

Trying out alternative languages to work around my performance and native binary problems, I've circled back around to the '70s and '80s: Pascal. I used classic Pascal on Atari 800 and TRS-80 back in the day, and did quite a bit with Kylix (Linux version of Delphi) before Borland killed that.

Pro

  • Fast Compiles. FreePascal compiles faster than anything I've used in ages. That was always a major Pascal selling point, and it still is. Optimized for programmer time.
  • Fast Runtime. As close to perfectly optimized machine code as you're going to get. Computer Language Shootout has competitive times with C++ for most benchmarks, and I think the worst-cases are variations in style.
  • Object-Oriented. FreePascal reimplements Delphi-style objects, which are pretty standard Simula-type OOP. I dislike having to tag methods as virtual, like some C++ or Swift peon, but it has everything I'd expect in a modern OOP system.
  • Reference Counting. No GC pausing, no manual memory management. Like Objective-C 1.0, you have to nil-out field references in your destructor, but otherwise you never need to worry about it (there's a sketch after this list).
  • Exceptions. Unlike Objective-C and Swift (which relies on Obj-C frameworks), you can throw exceptions and catch them and the program keeps working. Hooray! Flow control that isn't insane! There are no checked exceptions, which is sad, but it works.
  • Cross-Platform. Mac, Linux, Windows, Android, and iOS. Has SDL and OpenGL bindings, and some other options. I'll see how building out UI for each of those works, but it's not trapped on Mac like most other choices.
  • Native Binary. No source code included in the downloaded app. Dynamic language obfuscators are a minor obstacle at best, while machine language is hard enough to decompile. Sure, the other option is to put the program online and just have a thin client in the user's hands, but I'm old-fashioned, I believe in networkless programming and not paying Amazon for server time.
  • Easy Native Library Integration. Pretty much just declaring functions as external and calling them (also in the sketch after this list).
  • Case-Insensitive. I'm usually neurotic about proper capitalization, but here it's a mercy: The classic Pascals were all-uppercase, Delphi CapitalizedEveryWord, but I prefer lazyCaps. FreePascal doesn't care.
  • Real Programs. There are working programmers using FreePascal to keep their (often very expensive) Delphi software running, and writing new code in it. That makes me confident it's not an unsupported toy, and there's current documentation and help.
  • BBEdit. Object Pascal syntax mode works fine.
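
To put some meat on the reference-counting, exceptions, and native-library bullets, here's roughly the shape of what I mean. A throwaway sketch: TThing and the libm sqrt declaration are illustrations, not code from anything I ship.

program ThingDemo;

{$MODE OBJFPC}{$H+}

uses
  SysUtils, Classes;

// native library integration is mostly just declaring the function external
function cSqrt(x: double): double; cdecl; external 'm' name 'sqrt';

type
  TThing = class
  private
    _items: TStringList;
  public
    constructor Create;
    destructor Destroy; override;
    procedure addPositive(n: integer);
  end;

constructor TThing.Create;
begin
  inherited Create;
  _items := TStringList.Create;
end;

destructor TThing.Destroy;
begin
  FreeAndNil(_items);  // release owned references here; strings and other managed fields clean themselves up
  inherited Destroy;
end;

procedure TThing.addPositive(n: integer);
begin
  if n <= 0 then
    raise Exception.CreateFmt('expected a positive number, got %d', [n]);
  _items.Add(IntToStr(n));
end;

var
  thing: TThing;
begin
  writeln('sqrt(2) from libm: ', cSqrt(2.0):0:6);
  thing := TThing.Create;
  try
    thing.addPositive(3);
    thing.addPositive(-1);  // raises
  except
    on e: Exception do
      writeln('caught: ', e.Message);  // and the program keeps running
  end;
  thing.Free;
end.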

Con

  • Bondage & Discipline. Not quite as BDSM as Java, Swift, Haskell, or classic ISO Pascal, which in practice have no safewords. You can use dynamic arrays, Variant and OOP types, and even dangerously cast anything to anything or screw around with pointers, but it's not beautiful anarchy like Python or JavaScript.
  • Pascal Syntax. Verbose begin/end pairs everywhere, long words of function, procedure, and such. Semicolon rules are insane (yes, they're separators, not terminators; this is not how we use them in any other language, including English). I've taken to just always using begin/end blocks because I don't trust a misplaced semicolon not to terminate the wrong block (see the sketch after this list).
  • Documentation. FPC's docs assume you already know Delphi. I found some decent docs at Borland's site and old Pascal textbooks, but I dunno how a normal person would learn this. Some of the libraries have moved in 3.0, and you're never going to figure this out unless you like digging thru the guts of a language.
  • Configuration. Put this in fpc.cfg somewhere, and export PPC_CONFIG_PATH to the path containing it:
    #WRITE Compiling with fpc.cfg
    -O3
    -Xs
    -MOBJFPC
    -Sh
    -Fu/usr/local/lib/fpc/$fpcversion/units/$fpctarget/rtl-console
    -Fu/usr/local/lib/fpc/$fpcversion/units/$fpctarget/regexpr
    

    The write is just a sanity check that I have it configured. Instead of hard-coding the next two lines, I could wrap them in #IFDEF blocks and put -dDEBUG or -dRELEASE on the fpc command-line, but I'm not currently using gdb (unfrozen caveman Mark debugs by writeln), so this is easier. -MOBJFPC forces modern FreePascal mode, not a compatibility mode. -Sh makes the default string type ansistring instead of shortstring; but to be precise, I always specify utf8string. The -Fu lines add some paths where libraries have been moved.

    I want to have the local directive {$M+} (reflection support) always turned on, but I can't figure out any command-line option to do that.

  • Look Like a Crazy Person. But sometimes the crazy people are right.
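
The semicolon bullet above deserves a concrete example. A made-up sketch of the two ways it bites me:

program SemicolonTrap;

{$MODE OBJFPC}

var
  i, n: integer;
begin
  n := 5;

  // syntax error waiting to happen: a semicolon before "else" ends the if
  // statement, so adding one after the first writeln breaks the compile
  if n > 0 then
    writeln('positive')   // no semicolon allowed here
  else
    writeln('not positive');

  // silent logic bug: the stray semicolon after "do" is an empty statement,
  // so this writeln runs once, after the loop, instead of five times
  for i := 1 to n do;
    writeln(i);

  // which is why I just always use begin/end:
  for i := 1 to n do begin
    writeln(i);
  end;
end.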

Example

To (re)learn the language, I wrote a 4-function RPN calculator for Mac console: RealCalc

Presumably it compiles just fine on Windows or whatever, but you'll have to customize the fpc.cfg file. I'm a ways from dealing with that.
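
For flavor, the core loop of a 4-function RPN calculator in Pascal looks about like this. This is a stripped-down sketch for the post (no underflow or error checking), not RealCalc itself:

program RpnSketch;

{$MODE OBJFPC}{$H+}

uses
  SysUtils;

var
  stack: array[0..63] of double;
  top: integer = 0;

procedure push(x: double);
begin
  stack[top] := x;
  inc(top);
end;

function pop: double;
begin
  dec(top);
  result := stack[top];
end;

var
  token: string;
  b: double;
begin
  // one token per line: a number pushes, an operator pops two and pushes the result
  while not eof do begin
    readln(token);
    if token = '' then
      continue;
    if (length(token) = 1) and (token[1] in ['+', '-', '*', '/']) then begin
      b := pop;
      case token[1] of
        '+': push(pop + b);
        '-': push(pop - b);
        '*': push(pop * b);
        '/': push(pop / b);
      end;
    end
    else
      push(StrToFloat(token));
    if top > 0 then
      writeln(stack[top - 1]:0:6);  // show the top of the stack after each token
  end;
end.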

Apple Music

Sort of agreed. Certainly I use it every day and link all my music to it, since that's the easiest way to get a high-quality stream of most every song ever recorded (80% coverage? Based on my somewhat eclectic tastes).

The three mixes are doing better, but my daily playlists are now almost exclusively Rock Hits: 1970-1990 and Tears Go By, with rarely one more curated list; Metal Meets Industrial is today's.

The album choices are far better, a good mix of blues, metal, old rock, and industrial. Since I listen to Howlin' Wolf, today it suggested Bad News is Coming by Luther Allison, which is pretty goddamned good; I didn't know the man before.

And since they buried Connect instead of putting it on the front page, far fewer bands bother to post to it. Some Shonen Knife merch and 1-2 posts a week from Apple Music {genre}. This is a travesty: they had a real shot at connecting bands to the audience (like Ping, without the distraction of other people with dubious musical taste) and have so far squandered it.

(Album art: Luther Allison, Bad News is Coming, front and back covers.)

Premium Subscription ★☆☆☆☆

Day One Goes Premium Subscription, and of course the mob outrage in App Store ratings is what you'd expect: MacDrifter.

And this is why I only do bare minimum maintenance of my App Store software now. I released Brigand as free with a $10 unlock, and got savaged for it, so I pulled it. If Nintendo can't make that work with Mario, Apple giving them the front page, and millions in advertising, I sure can't. I love Brigand, but unless I put in more work changing the business model, I can't sell it; and not chasing sunk costs tells me not to put in that work.

Productivity software should cost more than a game, but very few people on iOS are willing to pay up front for every single new version.

Apple doesn't let you give old customers an upgrade price, and presumably never will; maybe an upgrade killed Phil Schiller's pet/child/Camaro in front of him, or something, given the 9 years he's heard developers request this feature and told us to pound sand. And Apple does nobody any favors by Sherlocking and undercutting developers with "free" or cheap productivity apps.

The older solution of releasing a new numbered version and abandoning the old one every year or so was completely user-hostile. I just refused to do it, and would always switch apps whenever someone tried, and often found a better app by doing this.

Maybe the subscription model is terrible, but it's less terrible than anything else going on.

Michael Tsai wonders if the hostile reviews are from prices going up, but they're just catching up to desktop/web service prices, usually because a subscription gets you cross-platform access now.

Long-term, I think the App Store will be seen as the worst-managed disaster in the history of software. It went from a nice slot machine for indie devs and gallery for a few professional companies, to a predatory flea market full of thieves and frauds. Trying to tell anyone you make real software and here's a reasonable price, in that environment, is a waste of time.

Data and Reality (William Kent)

A book that's eternally useful to me in modelling data is William Kent's Data and Reality. Written in what we might call the dark ages of computing, it's not about specific technologies, but about unchanging but ever-changing reality, and strategies to represent it. Any time I get confused about how to model something or how to untangle someone else's representation, I reread a relevant section.

The third ambiguity has to do with thing and symbol, and my new terms
didn’t help in this respect either. When I explore some definitions of
the target part of an attribute, I get the impression (which I can’t
verify from the definitions given!) that the authors are referring to
the representations, e.g., the actual four letter sequence “b-l-u-e”,
or to the specific character sequence “6 feet”. (Terms like “value”,
or “data item”, occur in these definitions, without adequate further
definition.) If I were to take that literally, then expressing my
height as “72 inches” would be to express a different attribute from
“six feet”, since the “value” (?) or “data item” (?) is different. And
a German describing my car as “blau”, or a Frenchman calling it
“bleu”, would be expressing a different attribute from “my car is
blue”. Maybe the authors don’t really mean that; maybe they really are
willing to think of my height as the space between two points, to
which many symbols might correspond as representations. But I can’t be
sure what they intend.
—Bill Kent

I originally read the 1978 edition in a library, eventually got the 1998 ebook, and as of 2012 there's a posthumous 3rd edition which I haven't seen; I would worry that "updated examples" would change the prose for the worse, and without Bill having the chance to stop an editor.

See also Bill Kent's website for some of his photography and other papers.

This book projects a philosophy that life and reality are at bottom
amorphous, disordered, contradictory, inconsistent, non-rational, and
non-objective. Science and much of western philosophy have in the past
presented us with the illusion that things are otherwise. Rational views
of the universe are idealized models that only approximate reality. The
approximations are useful. The models are successful often enough in
predicting the behavior of things that they provide a useful foundation
for science and technology. But they are ultimately only approximations
of reality, and non-unique at that.

This bothers many of us. We don’t want to confront the unreality of
reality. It frightens, like the shifting ground in an earthquake. We are
abruptly left without reference points, without foundations, with
nothing to stand on but our imaginations, our ethereal self-awareness.

So we shrug it off, shake it away as nonsense, philosophy, fantasy. What
good is it? Maybe if we shut our eyes the notion will go away.
—Bill Kent

★★★★★

Swift

Swift amazes me. A beta language that breaks your code every 6 months, a type system so totalitarian and inescapable it makes BDSM Haskell look like a vacation (and apparently nobody's read Gödel's paper), the founder abandoned it to go work on cars, Apple won't ship production code in it, compiling burns your fucking CPU to the ground for tens of minutes for code C can do in seconds, and after 3 years Xcode still can't refactor it.

And stupid motherfuckers write their production apps in it.

I know Objective-C is hard. It's C plus Smalltalk, both of which are subtle and take a year or two to learn. [brackets scare:theNoobs] && dot.syntax.isOverloaded;
But the tools fucking work. Dynamic code makes programmers efficient. A more elegant weapon for a more civilized age.

BBEdit

My editing weapon of choice is BBEdit: It Doesn't Suck™. Why? I want you to read the typical release notes:
http://www.barebones.com/support/bbedit/notes-11.6.html
Attention to fucking detail. Best $100 I ever spent (it's cheaper now).

On iPad I use Textastic (has regexp) or Editorial (has scripting filters), but they're amateur hour in comparison.

If I have to shell in and can't edit remotely & upload with BBEdit, I'll use vi, which I used for ~20 years (actual vi, STeVIe, Elvis, & Vim).

Oh, and for iOS/Mac apps with a ton of UI/framework shit, I use AppCode. BBEdit for clean C/Obj-C and I can just xcodebuild from Terminal. As the avatar says, STOP Xcode!

It's a hell of a thing. Apple had a reasonable but limited Project Builder/Interface Builder pair of apps from NeXT. PB could set an external editor (BBEdit), so I was happy. Then Xcode combined them and lost the external editors, and every version since has been worse: slower, crashier, rendering ASCII text as Russian, less capable of even simple refactors. Now it just shoves fucking Swift (C++ for masochists) down your throat and shits 10k messages into Console so you can't debug.