So here's an idea for tech culture inclusion: Teaching non-nerds something nerdy.
Today's lesson is making a web page. Start by going to Neocities, signing up, and following the tutorial. Ask questions if you need help.
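If you're curious what the tutorial will have you typing, a whole web page can be one small text file. A minimal sketch (the page title and text are placeholders, not anything Neocities requires):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- Shows in the browser tab -->
  <title>My First Page</title>
</head>
<body>
  <h1>Hello, world!</h1>
  <p>Edit this text, save, re-upload, refresh. That's the whole loop.</p>
</body>
</html>
```

Save it as index.html, upload it, and you have a site.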
I just made kamimark
Get Out: '60s-'70s horror movies could start with a girl walking alone, at risk of being murdered, because it was the Golden Age of Serial Killers; now, post-Trayvon, you feel just as tense for a black guy walking in fucking suburbia, or at a roadside police stop, or at a white country house, and just as justifiably.
The "Guess Who's Coming to Dinner" parents and sibling are spine-crawlingly uncomfortable. The black servants are weird as hell. Even if there was nothing scary going on, this is one of the most socially terrifying things ever.
And when someone screams at you "GET OUT", you get the fuck out.
Atomic Blonde: Based on the graphic novel "The Coldest City". MTV-era music, good period props & sets. But the 21st-century cancer of teal-and-orange film tinting is hideous and grossly offensive to my '80s-era eyes. Occasionally a shot is in period hot blue and pink neon, or a weird gel like green & yellow, but not enough. '80s colors POPPED.
Charlize is a competent ass-kicker in the dancer-doing-fight-choreography style; she's not talkative or an especially convincing actress, and her clipped South African accent can't pass for English or anything but maybe Dutch or German. But this doesn't require much more than action. Chick version of Arnie.
The lesbian sex is perfunctory, unattractive, and uncommented on; that's not plausible for that era, when anyone who knew would've used it as lethal leverage against a spy.
The MacGuffin is a list; the spy plot barely matters. It's a bunch of cool scenes. The two fake ending scenes after the interrogation room are so brain-damaged stupid and unnecessary I have to assume they were written by Hollywood producers (who never met a bowl of soup they wouldn't "improve" by pissing in it), not the original writer.
★★★½☆ mostly because the tinting so offended me.
SpaceX Falcon Heavy launch about to go! 🚀
Altered Carbon is now on Netflix, based on the cyberpunk books by Richard Morgan (which I read about 15 years ago and am somewhat fuzzy on). I'm up to ep 5 of 10 now; time for binging is hard to come by but I'm trying.
"Avoid blunt force trauma to the base of the brain, and energy weapons fired at the head!"
Good story adaptation. Doesn't flinch from any of the gross biology, the casual homicides and "organic damage", the sex and nudity. It's some good old-fashioned porn and torture porn at times.
So first, the weird premise: everyone has an alien-tech chip in their spine which backs up the brain and lets them transfer to another "sleeve" (body). I have problems with this: alien tech shouldn't interact with human biology, and how did they get interstellar travel in the very near future? The show doesn't do much to establish the year or future history, but best I can figure:
I don't remember how much was explained in the book, but it's way too fast up front and then nothing happens for 250 years.
There are too many physical hardware devices, when almost everything should be software projected onto any flat surface or into your optic nerve.
The Methuselahs, rich assholes who can't die, don't really show off how debauched they are until a few eps in, but it's pretty tame compared to Caligula.
The Neo-Catholic and Muslim fruitloops who don't want to be resurrected never made any sense to me in the book, and of course they're committing demographic suicide, there shouldn't be any "believers" this long after the chip.
I don't like the goomba actor they "sleeved" Kovacs in, but Ortega, Elliott, Poe, and most of the others are fine. Kovacs' Hello Kitty backpack full of guns makes me laugh every scene. The fight scenes are great, very bloody and physical, up-close combat. The hotel fight was excellent, once the mooks realize the hotel's killing them.
Visuals are sometimes very derivative of Blade Runner, which wasn't at all the impression I got from the book. Later it gets more of its own look, more gutter SF. The trash areas look like Richard Stanley's Hardware, but not as dirty. The upper city has pneumatic tubes for cars like Futurama, and flying cars with manual controls which seems so implausible it may as well be a sleigh with flying reindeer.
But it's well-shot, with CG mixed into the world constantly, as you'd expect from neural-interfaced brains.
Should be ★★★★★ because they made a show of guns, fucking, and brain-fucking for me, but the stupid timeline knocks it down to ★★★★☆
Micro Monday: Anyone in a conversation you've been in, but only if they never post real politics.
So did the Superb Owl hurl that spheroid, and see its shadow, and have some really exciting ads for products, services, and terrible all-consuming megacorporations? I was busy, mostly on videogames.
I learned to program on a TRS-80 Model I. And for almost any normal need, you could get by just fine on it. You could program in BASIC, Pascal, or Z80 assembly, do word-processing, play amazing videogames and text adventures, or write your own.
There are still people using their TRS-80s as a hobby: the TRS-80 Trash Talk podcast, the TRS8BIT newsletter, hardware like the MISE Model I System Expander. With the latter, it's possible to use one for some modern computing problems. I listen to the podcast out of nostalgia, but every time the urge to buy a Model I and MISE comes over me, I play with a TRS-80 emulator and remember why I shouldn't be doing that.
I'm planning to pick one up, case-mod it inside a keyboard, and make myself a retro '80s cyberdeck, more as an art project than a practical system, but I'll make things work on it, and I want to ship something on Raspbian.
"It was hot, the night we burned Chrome. Out in the malls and plazas, moths were batting themselves to death against the neon, but in Bobby's loft the only light came from a monitor screen and the green and red LEDs on the face of the matrix simulator. I knew every chip in Bobby's simulator by heart; it looked like your workaday Ono-Sendai VII, the "Cyberspace Seven", but I'd rebuilt it so many times that you'd have had a hard time finding a square millimeter of factory circuitry in all that silicon."
—William Gibson, "Burning Chrome" (1982)
Back in the day, I would work on Pascal, C, or Scheme code in a plain text editor (ed, vi (Bill Joy's version), or steVIe) all morning, start a compile, go to lunch, come back and read the error log, go through and fix everything, recompile and go do something else, repeat until I got a good build for the day. Certainly this encouraged better code hygiene and thinking through problems instead of just hitting build, but it wasn't fun or rapid development. So that's a problem with these retro systems; the tools I use take all RAM and CPU and want more.
These days, I mostly code in Atom, which is the most wasteful editor ever made but great when it's working. I expect my compiles to take seconds or less (and I don't even use the ironically-named Swift). When I do any audio editing (in theory, I might do some 3D in Unity, or video editing, but in practice I barely touch those), I can't sit there trying an effect and waiting minutes for it to burn the CPU. And I'm still and forever hooked on Elder Scrolls Online, which runs OK but not highest-FPS on my now-3-year-old iMac 5k.
For mobile text editing and a little browsing or video watching, I can use a cheap iPad, which happily gets me out of burning a pile of money on laptops. But I'm still stuck on the desktop work machine; I budget $2000 or more every 4 years for a dev and gaming Mac. Given the baseline of $8000 for an iMac Pro I'd consider useful, and whatever more the Mac Pro is going to cost, I'd better get some money put together for that.
I can already hear the cheapest-possible-computer whine of "PC Master Race" whom I consider to be literal trailer trash Nazis in need of a beating, and I'd sooner gnaw off a leg than run Windows; and Lindorks with dumpster-dived garbage computers may be fine for a little hobby coding, but useless for games, the productivity software's terrible (Gimp and OpenOffice, ugh), and the audio and graphics support are shit. The RasPi is no worse than any "real" computer running Linux.
'80s low-end computers, barely more than game consoles, were $200, and "high-end" machines with almost the same specs other than floppy disks and maybe an 80-column display were $2500 ($7921 in 2017 money!!!), but you simply couldn't do professional work on the low-end machines. Now there's a vast gulf of capability between low-end and high-end, the price difference is the same, and I still need an expensive machine for professional work. Is that progress?
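That $7921 figure is just an inflation multiply. A quick sketch of the arithmetic, assuming a cumulative US CPI factor of roughly 3.17x from the early '80s to 2017 (the exact multiplier is an assumption and depends on which start year you pick):

```python
# Rough inflation check for the prices above.
# CPI_FACTOR is an assumption: cumulative US CPI-U inflation from
# the early 1980s to 2017, roughly 3.17x (varies with start year).
CPI_FACTOR = 3.1684

def in_2017_dollars(price_1980s):
    """Convert an early-'80s dollar price to approximate 2017 dollars."""
    return round(price_1980s * CPI_FACTOR)

print(in_2017_dollars(2500))  # "high-end" machine -> 7921
print(in_2017_dollars(200))   # "low-end" machine  -> 634
```

By the same multiplier, the $200 low-end machine comes out around $634 in 2017 money, which is still nowhere near enough for the "useful" desktop above.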