TR1 is a little hard to get into as a modern game. The old TR games use tank controls, where you rotate and move Lara relative to her current heading, as opposed to modern 3D games where you move relative to the camera. It felt clunky at first but after a while I really started to grasp its elegance.
You see, Tomb Raider is built on a strict grid system, and Lara’s movement can be measured in squares. Every step is half a square, every running jump exactly two squares, and once you grasp the rules you know exactly what you can and cannot do. There is no ambiguity, no hidden walls, no sneaky missed jumps: it is a system that is completely consistent.
Gameplay is not fast paced; it is considered and challenging. You are expected to observe your surroundings and make your own way, using your knowledge of the movement rules and few clues beyond that.
Speaking of which, a big part of Tomb Raider is sound. The theme is classic and well loved, but outside of fights TR’s music is generally very understated and ambient. Sometimes the game is silent entirely. This proves important because TR’s combat is punishingly hard, and enemy sounds are an important warning you’ll always be listening out for. Other times, sound provides clues about your environment that are critical to the game’s puzzles.
Listening for clues, observing the environment, watching out for traps and secrets, moving with care and precision - your discipline in doing all this will make or break your adventure. Tomb Raider is incredibly tense and demands sustained focus.
I think that’s also why Tomb Raider has such a minimal UI. The user interface is very stripped back: unless you’re armed or in poor health, there’s no heads-up display, and the health bar, when it does appear, comes in a very understated gold. A nice nod to the treasure theme, and one that also stops the HUD distracting from the environment.
There’s no radio chatter either (unlike certain other TR games…). You really do feel like a solo adventurer, completely self-reliant in a hostile environment. When you finally scale the game’s awesome structures, it feels like conquering a physical and metaphorical mountain.
Loneliness helps the supernatural elements land a lot better too, like when Lara suddenly runs into a T-Rex in the lost valley - or the Gigeresque body horror of Atlantis. These moments just wouldn’t work if we had to explain to a radio sidekick how dinosaurs had survived, or why Atlanteans exist. It would be a distraction, and it would lose its mystery.
TR1 then is very much about focus - on the world, on the player, on the movement - and with that focus comes the player’s strong sense of self. I very much feel like I am the one controlling Lara, I am solving the puzzles, I am beating the levels.
Yes, the tank controls are granular, but that’s partly the point: you have to input every movement Lara makes yourself. If she needs to turn a corner, you must rotate her. If she needs to grab a ledge, you must put her guns away. Nothing happens automatically; every success is your own.
Compare that to Tomb Raider Anniversary or Legend, where the controls are much more automated. Lara will grab ledges automatically; you can mash O to do acrobatics, and the game leans very heavily on Quick Time Events.
It’s not an invalid design philosophy, but when I play Legend, I think, “wow, look at the cool thing Lara can do” - whereas in TR1 I think “look at the cool thing I did”. The new controls are more fluid and fast paced, but they lose some sense of accomplishment.
That’s not to say TR1 is perfect. The combat is weak, the camera flails about when enemies are nearby, and killing enemies is a chore. But I think you have to forgive that in a game that was innovating so much.
Which brings me onto the remaster. It’s not clear, right now, how the re-releases will handle the original game’s design. Will they alter the tank controls, or will they stick proudly to the original system? Will they improve the combat, or leave it as is? Do you think the game’s systems still hold up, or is Tomb Raider 1996 strictly an ancient relic?
My argument is not that MFEs never pay off their complexity. I think that for a sufficiently large team, with well factored domains, having separate pipelines within a monorepo arrangement is a reasonable design for keeping teams moving independently.
But my scepticism is that few teams are in this position, and the ones that are should work on factoring their domains first, before adopting a more complex architecture. The many moving parts of MFEs make it much harder to release, move and test code as a coherent effort.
Instead, start with a modular monolith and do the hard work of refactoring your domains before the easy work of creating new pipelines.
Micro-frontends (MFEs) are not a stupid idea. We all intuitively know that software systems are easier to evolve if they can be split into isolated parts. There’s no intrinsic reason front-end JavaScript apps shouldn’t benefit from the same modularisation once they reach a certain size.
Microservices are similar. The idea of microservices on the backend was originally sold as a way to make systems more scalable under load, but the real motive was that it’s easier to scale teams if each one gets its own walled context to work in, free of interference.
MFEs have history. Back in 2016 I worked on an MFE-like architecture at The Guardian, where our goal was to completely isolate the ad-tech, news and feature code, which meant that you could read the news without waiting for, say, the crosswords code to download, or the adware bloat. These architectures weren’t called anything because they were just solutions to specific technical problems.
Eventually enough companies found themselves doing this that they coined a name for it, micro-frontends, meant to evoke the intuitive benefits of microservices. The usual pattern is that instead of one JavaScript app in one repo with one pipeline, you split out N apps into N repos with N pipelines. Occasionally the apps are held in a monorepo arrangement, but that’s not universal.
Microfrontends are very popular on the conference circuit, so I am probably about to make future job interviews awkward by criticising them. But I’m seeing enough teams adopt MFEs at the wrong time to feel concerned about the hype. My objections boil down to three core complaints.
Those of us who’ve worked on backend microservice transitions have probably witnessed the anti-pattern where, instead of getting cleanly separated, independent microservices, we find ourselves with a complex hairball of dependencies now split across servers. This is dubbed a “distributed monolith”, and it’s painful because the same interdependence still exists, but now it’s mediated over things like HTTP calls or events.
Most large frontend apps that would be candidates for MFEs have this level of interdependence. This leads to MFE architectures proposing some kind of context sharing or mechanism for passing around state. That could be JavaScript bundles exposing global functions, or postMessage used as an event bus. The more state that gets shared, the more dependencies exist, and the further away you are from actual isolation.
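To make the coupling concrete, here is a minimal sketch of the kind of postMessage-style event bus described above. The topic name and payload shape are hypothetical, and the browser transport (`window.postMessage` plus a `message` listener) is stubbed behind an interface so the logic is self-contained:

```typescript
// A hypothetical cross-MFE event bus. In the browser, Transport would wrap
// window.postMessage and the "message" event; here it's injectable.
type Handler = (payload: unknown) => void;

interface Transport {
  post(message: { topic: string; payload: unknown }): void;
  onMessage(listener: (message: { topic: string; payload: unknown }) => void): void;
}

class EventBus {
  private handlers = new Map<string, Handler[]>();

  constructor(private transport: Transport) {
    // Deliver incoming messages to every handler subscribed to that topic.
    transport.onMessage(({ topic, payload }) => {
      this.handlers.get(topic)?.forEach((h) => h(payload));
    });
  }

  subscribe(topic: string, handler: Handler): void {
    const list = this.handlers.get(topic) ?? [];
    list.push(handler);
    this.handlers.set(topic, list);
  }

  publish(topic: string, payload: unknown): void {
    this.transport.post({ topic, payload });
  }
}

// An in-memory loopback standing in for the real postMessage plumbing:
function makeLoopbackTransport(): Transport {
  const listeners: Array<(m: { topic: string; payload: unknown }) => void> = [];
  return {
    post: (m) => listeners.forEach((l) => l(m)),
    onMessage: (l) => { listeners.push(l); },
  };
}
```

Note that every topic a frontend publishes or subscribes to is an invisible dependency on some other frontend - exactly the coupling that a type checker or a single repo would otherwise surface.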
Why is this worse than having interdependencies within a monolith? Because within a monolith, you can refactor very easily. You can remove the dependency from A -> B by changing A and B in a single commit. You can understand the dependencies by looking around a single repository. You can have TypeScript verify that nothing is using A’s method any more. You probably have a single test pipeline to verify the change.
Microfrontends have overheads if you aren’t smart about this. You can end up having to deploy code in multiple places in careful orchestration. You can end up with shared code that has to be released with an awkward versioning mechanism. You have to invent a way to integration test everything. Your engineers have to sniff around multiple repositories to understand how the whole thing fits together. That’s not a scalable developer experience.
My view on MFEs, where they exist, is that any dependencies should be carefully vetted and factored down to the absolute minimum. At The Guardian our “MFEs” didn’t share any state at all. But factoring down dependencies is hard when your code is a tightly-coupled mess, which leads to my next point.
So let’s say our frontend app is a hairball, full of interdependencies. How do we break that link? I think engineers - and managers! - are so excited about the prospect of escaping the monolith that they forget about the intermediate stage of actually chopping up the hairball.
Refactoring any code is hard, uncomfortable, and thankless, but just like brushing your teeth or going to the gym, you can only avoid it for so long before the consequences catch up with you. If you try splitting an app without decoupling its parts, one of two things happens.
What does good decoupling look like?
You might be tempted to skip decoupling and say you’ll do it as you migrate the code. This is a bad idea. You end up in a half-migrated state where part of the code is moved but the rest can’t be decoupled. You want to minimise the scope of each incremental change you make, not try to combine refactoring and replatforming into a single mission.
Sometimes engineers (and managers) get so sick of working in a bad codebase that they fantasise about throwing the lot away. In microfrontend projects, teams sometimes feel they can use MFEs to effectively rewrite the codebase from scratch, avoiding all the complexity of the original project.
Rewrites are really dangerous. They’re tempting because writing code is much easier than reading it. Developers imagine that with modern tools and consolidated knowledge, they can do a much better job than the original engineers. There is a bit of main character syndrome here (the last project was a disaster because _I_ didn’t write it). But the rewrite turns out to be more complicated than expected, because the old code was full of secrets and edge cases that weren’t immediately apparent.
My experience of rewrites is that they start great. Early on velocity seems high as you’re focused on scaffolding the new code and adding tools to the codebase. It’s when you start implementing the features that things get sticky. Rewrites are sold as everything to everyone and end up becoming a way bigger project than you first imagined.
Usually in MFE projects the motivation is a monolith codebase bad enough to make a rewrite a strong temptation. But if you’ve reached that point, your pages and UI are too complex to just be rewritten in a few weeks. The Product page that has accumulated five years of code is going to take a lot of work to understand and unpick. Moving it into a Next.js app isn’t going to shortcut that. An incremental rewrite via MFEs is better than a total rewrite in a new monolith, but not by a lot.
What is to be done, then, about frontend repositories that reach such a critical mass they can’t be worked on effectively? My advice is to start with modularisation.
Keep a single repo and build, for the moment, but start organising your code into packages with very strict architectural boundaries. Have these packages interact only via well-defined, minimal interfaces - ideally just a small user context object with basics like IDs and tokens. Packages can share idempotent state like API caches, but they should not share context like Redux stores or React context.
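As a sketch of what that boundary might look like: all the names below (`UserContext`, `FrontendModule`, `checkoutModule`) are hypothetical illustrations rather than a real API. The point is that each package only ever sees this narrow context object, never another package’s stores:

```typescript
// The minimal, well-defined interface packages are allowed to depend on:
export interface UserContext {
  userId: string;
  authToken: string;
  locale: string;
}

// Each package exposes a single entry point and nothing else:
export interface FrontendModule {
  // Returns a description of what was mounted, purely for illustration.
  mount(rootId: string, ctx: UserContext): string;
  unmount(rootId: string): void;
}

// A hypothetical "checkout" package implementing the boundary:
export const checkoutModule: FrontendModule = {
  mount(rootId, ctx) {
    // In a real app this would render into document.getElementById(rootId),
    // using only the data carried in ctx.
    return `checkout mounted for ${ctx.userId} in #${rootId}`;
  },
  unmount(_rootId) {
    // ...tear down this package's own DOM and internal state only.
  },
};
```

Because every package implements the same tiny interface, TypeScript can verify the boundary is respected, and moving a package out to its own pipeline later becomes a mechanical change.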
As you split the code up, you can then adopt tooling like Yarn workspaces, which is designed for managing a monorepo. This lets you do things like split up the unit tests and only execute the suites for the parts that have changed. That starts bringing in MFE-like benefits without all the overhead.
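For illustration, the arrangement above might start from a root `package.json` like this - the `workspaces` field is the standard Yarn mechanism, though the package names here are hypothetical:

```json
{
  "name": "frontend-monorepo",
  "private": true,
  "workspaces": [
    "packages/ads",
    "packages/news",
    "packages/shared-ui"
  ]
}
```

Each entry then gets its own `package.json`, test suite and build script, which is what lets tooling run only the suites for the packages that changed.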
Over time you can move incrementally to a microfrontend-like architecture that sits inside a monorepo, with separate pipelines and fully differential builds.
Why a monorepo? In my experience it’s a good tradeoff between oversight and independence. You can still typecheck a monorepo app as a single unit, whilst building and deploying modules independently. Code can be shared without complex versioning and packaging; you can implement separate deployment pipelines with a little effort, and it’s much easier to move code across packages as you need to.
Oversight matters. By the time you reach this level of complexity, you need a team who are acting to oversee the overall architecture and provide common tooling. If you let every team diverge you’ll get chaos.
Refactoring _is_ hard. In one frontend I’ve worked on, with over 400,000 lines of source code (not including dependencies), I ended up writing a program just to understand the codebase itself. It used the AST (abstract syntax tree) of each file to discover where components were reading or writing Redux data, parsed that into a graph, and then let you execute queries against the graph to understand the ties between different pages. The graph data alone was hundreds of kilobytes.
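The querying idea can be sketched like this. In the real tool the edges came from AST analysis; here the edges, component names and state keys are hard-coded hypothetical examples:

```typescript
// One edge per discovered access: which component touches which Redux slice.
type Access = { component: string; key: string; mode: "read" | "write" };

class ReduxGraph {
  constructor(private edges: Access[]) {}

  // Which components read a given slice of state?
  readersOf(key: string): string[] {
    return this.edges
      .filter((e) => e.key === key && e.mode === "read")
      .map((e) => e.component);
  }

  // Which slices couple two pages together (both touch the same key)?
  sharedKeys(a: string, b: string): string[] {
    const keysFor = (c: string) =>
      new Set(this.edges.filter((e) => e.component === c).map((e) => e.key));
    const keysOfA = keysFor(a);
    return [...keysFor(b)].filter((k) => keysOfA.has(k));
  }
}

const graph = new ReduxGraph([
  { component: "ProductPage", key: "basket", mode: "write" },
  { component: "CheckoutPage", key: "basket", mode: "read" },
  { component: "CheckoutPage", key: "user", mode: "read" },
]);

console.log(graph.sharedKeys("ProductPage", "CheckoutPage")); // the slices tying the two pages together
```

Queries like `sharedKeys` are what made the coupling between supposedly separate pages visible, rather than a matter of opinion.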
But by doing this we could visualise the problem. Before, the argument was that we could rewrite everything via MFEs and engineers would figure things out as they went. Once we could visualise the graph of data dependencies, we could see how difficult this would be without a dedicated effort to decouple the domains in the code.
Despite all the criticism above, I think in time this app could become something like a microfrontend project, although with more control and oversight than the polyrepo MFEs I see other organisations adopting. But it will only do so once the domain isolation and design-level refactoring is largely complete. Then the parts of the codebase will be sufficiently isolated for us to work on them independently.
You could call this microfrontends by stealth; I call it agile architecture.
Don’t take on big hero projects to implement The Grand New Architecture you’ve read about. Identify your problems, move towards them iteratively, and don’t try to skip the hard work of figuring out how to split the code into genuinely separated units - or all you’ll get is your horrible frontend monolith turned into a distributed-monolith MFE.
Some words: I know, objectively, the last three months have been some of the softest restrictions we’ve been under, anyone’s been under.
As seasons go, it was nowhere as tough as two years ago, when the world came to a standstill; it was nowhere as tough as one year ago, when the virus tore through our hospitals and care homes; it was nowhere as wretched and painful and pitiless as last Christmas, which I spent alone. But - for me - I found myself far more drained, more dispirited and even disturbed. This time I couldn’t see a path back to normal.
At the time the first lockdowns broke out, in April 2020, I treated it as a bargain, a trade, a compromise made with the world, the kind of thing you narrate to give yourself some agency when you’re really under compulsion. “I will give up my job and my friends and my freedom”, I said to the world, tightening my chest, “and you will give us time to invent a vaccine”. The world obliged. “I will give up this year, I will give up on touching any other person”, chest rising again, “and you will give us time to vaccinate”. And so it went.
This Christmas was different. We weren’t staying home to make new medicine, distribute jabs, or get “on top” of Coronavirus (what was that ever supposed to mean?). It wasn’t a deal we were making with the world, something we could rationalise as a choice. Omicron came, it killed, we stayed at home, to banish the nth variant of x variants, until the nth + 1 comes along, and it suffocates us again.
But what was I really after? I’ve spent two years pining for a “return to normal”, meaning that I hoped the world of 2022 would be the same as the world of 2019. But… that couldn’t ever be so, coronavirus or no. Things change.
There’s a saying that being critical is, fundamentally, being disappointed people weren’t who you wanted them to be - which underlines quite nicely how absurd it really is. Obviously, living under the pandemic has been awful. Screamingly, pitilessly, nail-rendingly awful, the kind of awful that every so often just leaves me reeling and agape at all we’ve taken in these 24 months. But I’ve also carried a share of disappointment and even anger simply that the world has changed, that it couldn’t be like the version of the pre-pandemic world I carried in my head. A world of infinite freedoms, rich social life and personal fulfilment.
Looking back… I might have overrated that. Where was I in 2019? For one thing, I was a workaholic. I don’t mean ‘spends too long on presentations when he could be watching The Witcher’. I mean I had a problem: using work and career to regulate my emotional state, which is really a kind of (socially acceptable) dependency. I want to write at length elsewhere about this and how destructive it is, precisely because it feels like doing the right thing.
But in 2019 I was overworked, obese, I rarely dated, I didn’t take care of myself, my sleep was awful, I relied on work drinks to prop up my social life, I bought stuff - objects, clothes, holidays - to try and make peace with how unhappy I was. And then I look where I am now, what I’ve gained over the pandemic:
Am I really sure I want to roll the world back to 2019? That was two and a half years ago. Things change. I changed.
There’s a larger point about the pandemic. Before COVID hit, movies taught us that the biggest threat in an apocalypse would be the masses going feral; rioting and anarchy; that the moment the norms of the world dissolved, humanity’s bestial instincts would tear up the tenuous veneer of civilisation - just you watch. Actually, in this apocalypse, it was the opposite.
But the problem this time was the people wishing the world “back to normal”. Plugging their ears and “doing their own research” in favour of living their lives exactly as they always had done. The people who yanked folk back into the office and smouldered with anger to think someone might be working from home (obviously a skiver). The biggest danger in COVID-19, after the virus itself, was the people who desperately wanted to believe it was still 2019.
Nostalgia is an intoxicant. It’s fun to visit once in a while but if you live that way 24/7 it distorts your view of what’s good about today, what we can gain from change. I’ve spent all this time howling in pain and disappointment that life has been stopped and I’ve forgotten to take stock.
I am cautious about 2022. I think it will take time for us to emerge again. I think it will be hard to risk disappointment. I think we will try impersonating the people we were three years ago only to realise we’re not quite them any more, and we don’t quite want what they did. The healthiest thing is to embrace that.
What began there was eight months of gradual, sometimes spotty, sometimes rapid weight loss, from 96kg to 81kg (about 2.4 stone). That’s roughly 16% of my total body weight, and the loss is still ongoing (I’d like to hit 77kg). It was gradual and spotty because I had a lot of things to learn, and I made several mistakes along the way.
I want to write about these mistakes, in part to share them with other people but largely because I suspect I will drift towards making the same mistakes in three years’ time, and don’t want to spend another year re-learning them.
They come in no particular order.
Sorry. This one is a blow. This is the one I was hoping wasn’t true, because I didn’t believe I was up to the demand. A lot of the diet advice out there is really tailored to a contradictory demand: “how can I change the shape of my body without experiencing any stress doing so?”
But the only times I saw my body seriously losing weight was when I was regularly hungry. Hunger is something that can be managed and mitigated but I don’t think it can be avoided. Trying to change your metabolic pathways and deprive your organs of glucose won’t feel nice, for a while. It will get easier, but it won’t go away.
I spent a long time overweight because I was trapped by my fear of hunger. I’m not certain where this came from but I _do_ remember long stretches of hunger as a teenager, which was a time of quite disordered eating for me. As an adult I carried with me a dread anxiety of being left even mildly underfed, particularly the idea of going to bed hungry.
But this year I discovered intermittent fasting and one of its psychological benefits has been adapting me to hunger by repeated, controlled, exposure. I know I have food ready and waiting for me at the end of my fasting period, but I have to make it through the waves of hunger and get to feel the ninety-minute cycles of peaks and troughs. Knowing hunger dissipates, wanders, sinks into the background, and that I can distract myself out of it has been a powerful lesson.
Not least because you will inevitably make mistakes as you learn to balance your calories and macros. The diet that sounds solid on paper may disintegrate the first few times you experience strong sugar cravings. Or you may miss a macro like protein and end up with an unbearable need for meat and pulses. Or you might decide that you can ‘get away’ with a few too many sweet treats and accidentally negate your calorie deficit. It does happen.
On that point, treats and snacks will wipe out your calorie deficit before you know it. Every day you diet, you plan out meals to leave yourself a specific number of calories short of your TDEE, to force fat metabolism. But a 500 cal deficit can be cut directly in half by a single serving of cookies or chips - and that halves your weight loss for that day.
Overindulge a couple of times a week and you could cut your 1lb-a-week weight loss back to maintenance. And you won’t notice for two weeks, because it takes that long to see the difference.
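The arithmetic above can be sketched as a back-of-envelope calculation, using the common rule of thumb that roughly 3,500 kcal of deficit equals about 1lb of fat (the treat sizes here are made-up examples):

```typescript
const DAILY_DEFICIT = 500;  // planned kcal below TDEE each day
const KCAL_PER_LB = 3500;   // rough rule of thumb for 1 lb of fat

// Weekly fat loss after accounting for unplanned treats, in pounds.
function weeklyLossLb(treatKcalPerWeek: number): number {
  const netDeficit = DAILY_DEFICIT * 7 - treatKcalPerWeek;
  return netDeficit / KCAL_PER_LB;
}

console.log(weeklyLossLb(0));    // a disciplined week: 1 lb
console.log(weeklyLossLb(1750)); // a few big treats: 0.5 lb
console.log(weeklyLossLb(3500)); // two real blowouts: 0 - back to maintenance
```

Two big indulgences really are enough to erase a week of careful eating, which is why they’re so dangerous.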
Snacks are really dangerous in the form of a bag or pack that doesn’t have a clear serving size. They’re basically unbounded calories. That’s one reason I’ve chosen to simply stop buying them. They aren’t in my home any more.
As someone who experiences very little hunger in the morning, intermittent fasting has been fantastic for restricting my calories by limiting my opportunities to eat. It’s also introduced me to the richness and depth of black coffee. Yes, I’m becoming one of those people.
Longer fasts are not easy. They are a serious endurance, make demands of your body and concentration, require great focus and discipline. However, whilst fasting is not an easy way to lose weight, it is a simple one. It is unambiguous and resistant to any mental tricks you might play on yourself, like miscounting calories or overestimating serving sizes. You can’t half-fast. Whilst you are fasting, you just don’t eat.
You don’t have to eat X square meals a day, and you don’t have to eat at all one day if you’ve overindulged (some electrolytes and vitamins are recommended if you do a day fast though). You can try and get away with a sandwich or promise yourself a better meal at the end of a longer fast. It’s hugely valuable to build a habit of only eating when you’re actually hungry.
Cravings are something else: they should be listened to, but not obeyed mindlessly. Cravings for sugar aren’t a response to something your body needs, but are actually (and I will use some strong language here) a kind of addiction. Do you feel comfortable that you have a dependency on sweets and junk? For me it felt disempowering, and I tapped into that offended sense of free will to push back.
However, some cravings are important. Protein is a big one. If you find yourself craving meat and pulses, that could indicate that recently you haven’t been giving yourself enough. You can break your fast with lean proteins: eggs, tuna, chicken, beans.
Worse than being a Coffee Guy, I may become a Huel Guy. Whilst your eyes are rolling, let me remonstrate: Huel and similar products are an excellent way to break your fasts with lots of protein and very strictly controlled calories (preventing bingeing). You are absolutely certain that e.g. a shake of two Huel scoops contains 400 calories and a decent chunk of your daily nutrients.
And if you use very cold water and a proper blender, it’s actually quite pleasant. Nothing to write home about, but it’s filling for what it is, somewhat sweet, and at the right consistency - like a thick protein shake. And when you’re hungry enough, anything will taste good. Give it a try.
Most of all, things like Huel are convenient. Eating well can be so difficult, you need every convenience you can take.
I think culturally, at least in Britain, we are big on making identity and social connections over a shared love of food, whether it’s gut-sticking weekend feasts or sneaky treats or seasonal pleasures like an ice cream at the beach. However. It is also a commercial enterprise that exploits that cultural charge to, well, sell us things.
Walk into your local coffee chain and you’ll see decals everywhere of hot chocolate smothered in mountains of cream, groaning under their own weight; cinderblock bricks of carrot cake; cheese toasties oozing lustfully. The effect isn’t just to tempt us when we’re feeling vulnerable to their charms, but to imply certain norms about luxury foods. It should not be the norm to drink a coffee with the same calorie load as a chocolate eclair and shrug it off without altering the rest of our daily calories. It should not be the norm to treat an 800kcal ham and cheese panini as a ‘snack’ rather than a full lunch.
Nor should it be the norm to celebrate good times (or bad) with food. I got into that in a bad way. Every high or low meant a takeaway. So was every night of overtime at work or just a night feeling lazy and in need of a pick-me-up. Each of these indulgences was a calorie surplus I was never getting rid of. Over the years it makes you bigger and you forget that food doesn’t have to be a sensory barrage. Food can be boring and six days out of seven it probably should be.
So will buying new clothes. Track your progress obsessively and celebrate when you make it through a plateau or an important number. Obviously, don’t celebrate with food.
It really does. Initially you have everything against you:
But as you diet (if you diet successfully) you gradually chip away at each of these. There is a compound effect, and you build momentum. You’d think the initial kilos would be easy and the later ones tough, because your body fights back - actually, I’ve found my most recent pounds no harder at all, perhaps even easier in some ways, because although dieting is hard, good habits make it simple.
Right click any image + open in new tab for a full sized view. Or view the gallery on imgur.
Upscaling classic games matters. It’s not just that graphics designed for 90s CRTs look like crap on modern LCDs, or that modern sensibilities have spoiled us for simpler graphics. It’s that upscaling lets players soak in far more detail than was visible originally, preserving the original vision and helping out older gamers whose eyesight might not be what it was 20 years ago.
There’s a quote about Crash Bandicoot that always stuck out for me:
The secret to Crash’s success was its Art. And the secret to its Art was its Programming.
[…] To that effect, we took the very unusual step of hiring real “Hollywood” cartoon designers to help with the visual part of the production. This was Mark’s idea at first, although Jason and I saw the brilliance of it immediately.
You can read all about the genesis of Crash on Andy Gavin’s website, but the upshot was that Naughty Dog wanted to create a playable cartoon. The bold shapes and sharp colours made Crash play like a Saturday morning series - complete with Wile E Coyote style death animations.
4K Crash Bandicoot is about as close as it gets.
Crash 1 is the simplest of the games graphically, but it still holds up pretty well.
This is the simplest of the Crash models. He runs around with a face locked in perpetual anxiety.
According to the game’s development bible, there’s some lore about an ancient civilisation named the Lemurians. These were the folk who built the Lost City and its pseudo-Oceania, pseudo-South American inspired architecture.
Apparently they built the teleporters you keep running across, and although the games never lean into the idea hard, Lemurian artefacts keep cropping up in Cortex’s collection.
Something I noticed in the world map! The Cortex Power stage actually has a little model of the sign you see in-game, with the R hanging off. Dorky little detail but I’d always missed it until now:
The second game is where the upscaling really shines. Naughty Dog invested in dynamic lighting, much more confident camera direction, dialogue, and all-round polish.
A little detail I noticed - the master crystal sits on a plinth surrounded by numbers. Below each number is a slot for the crystals Crash is collecting. There’s also a lab assistant in the background (a kind of recurring meme for the series).
Crash 2 makes a few attempts to tie things back to the original islands and their Lemurian ruins.
Playing in high-res I noticed a lot more Lemurian architecture. Now I think about it, maybe one of the ideas in the background was that Cortex and co. had largely pilfered ancient tech as their own, including the warp pads.
As you may remember, the story of Crash 3 is that Dr Neo Cortex and fellow evil-doers use time-travel to try and collect the crystals before the good guys were born. On the way, they install themselves as iron-fisted rulers of each time period, becoming Kings of the Medieval age or Emperors of Rome. I guess you’ve got to have a hobby.
Dr Neo Cortex = Dr Big Mood
Crash himself isn’t much more mentally stable this time around - although he does have a more detailed model.
This game also introduces Dr Nefarious Tropy, inventor of the time-whatsit. His boss theme is absolutely wicked! Check out the N Sane version too. Too bad he goes down so quick.
In the intro background we see a photo of Crash’s girlfriend Tawna and a promotional render from the original game’s marketing
Dingodile terrorises the local wildlife
Tiny Tiger (no relation to Tony) sets himself up as King in what appears to be Arthurian England. The locals do him the favour of decking out his castle with his own heraldry, complete with naff 12th-century style tiger drawing, which I’d never noticed until high-rezzed:
And this is the final battle. In widescreen you can see a lot more of that Lemurian architecture - a nice touch that ties the end of the trilogy to its origins, and perhaps implies something about where the time travel tech really came from
Defeated again! This is not fair! Maybe I should retire… to a nice, big beach, with a nice big drink. And a woman, with nice, big… bags of ice for my head!
You can play Crash Bandicoot with upscaled graphics for yourself by installing RetroArch and using the Beetle PSX HW core. Set up internal GPU scaling and texture mapping correction and enjoy!
Final Fantasy VIII is a curious one.
You play a group of teens at a high-school-cum-military academy who spend their days planning prom dates, riding hoverboards and munching hot dogs at the school cafeteria. They’re training to become elite special forces, but have no idea their school is really a front for a millennia-long war against a time-travelling sorceress who wants to destroy all existence.
It gets… strange. You are mortally wounded and impaled in the chest at the end of Disc 1, then awake apparently unharmed. Everyone grew up together in an orphanage but forgot due to Plot Convenient Amnesia (discussed once and then forgotten). You fight a T-Rex in the school gym.
VIII can be maddeningly vague. Just who is Ultimecia, the sorceress from the future, and what does she really want? It’s never satisfyingly explained, nor how the sorceresses came about. There seems to be a link between her and your love interest Rinoa - with tantalising clues and strange allusions - but it’s a lacuna, an absence, like so many elements of FFVIII’s lore.
But a recent replay changed my mind. In fact: I now think VIII is the smartest and most self aware of the whole series. It has its faults and some bad writing in parts, but I think there’s a way of looking at FFVIII that makes sense out of the game’s weirdness.
I think that if you look at FF8 as a story about stories - a metastory - a structure falls into place. Like the game’s own time travel loop, this structure is a paradox, collapsing and uncollapsing on itself indefinitely. VIII tells a story about stories so dangerous, it has to abandon its own story in disgust.
I want to show you the real strangeness of Final Fantasy VIII.
Here’s the idea: Final Fantasy VIII is about many things, but at the center of it are people so obsessed with history and mythology, and making sense of time, that they lose all perspective of the present. The infinite perspective of seeing across history warps people: it makes them justify atrocities; it steals from them their humanity; it makes them sacrifice everything that makes them themselves.
Ultimecia is so traumatised by the grief of her ancestors that through time-travel she retroactively commits genocide. SeeD in response is so fearful of Ultimecia’s wrath that it throws thousands of teenagers’ lives at any Sorceress who emerges, brainwashing cadets with memory-destroying magic powers. The result, of course, is that they cause the very oppression Ultimecia comes back to avenge. Seifer meanwhile is so obsessed with myth and movies that he plays his own fictional character, and loses himself in the process.
But this puts the game in a quandary. FF games love to tell grand stories about the origins of their worlds. VII has its Lifestream and the Ancients; IX has Terra and Garland. FF Tactics has its elaborately architected story of religious charlatans, corrupt priests and supernatural horror. What do you do, though, when the gist of your story is that Stories on that scale are themselves evil and dangerous?
I think what happens is that the game gets so revolted at the misuse of history and myth, it decides to stop trying to explain itself. Stop trying to fill in Ultimecia and her origins. Stop trying to explain Hyne or the origins of the Sorceresses. Instead it just focuses on two kids, Squall and Rinoa, making their own future.
And I think that makes sense, within the game’s internal logic. Because I want to go over just how much damage history does in FF8.
Something I need to clarify is that - in FF8 at least - history and mythology are just two facets of the same thing. You could say both attempt to reduce time to a single, simpler, meaning.
For those fighting against the Sorceress, History is nothing but an endless war against Her, which in objective terms means a succession of women who inherit supernatural powers and are willing to use them politically.
For those fighting with the Sorceress, history is nothing but a Witch and her faithful Knight battling against all odds, protecting the descendants of Hyne, the great creator-god. Even if the Witch must lose, there is tragic beauty in facing the world alone.
However, FF8 treats mythology very sceptically. Consider the first (playable) scene of the game:
Squall is recovering from his fight with Seifer and wakes up in a medical bay. Dr Kadowaki asks him to focus his eyes.
What’s striking is its realism. This is Final Fantasy after all. Squall isn’t healed with a Curaga spell or a Megalixir; it’s all very conventional medicine.
This scene tells us four things: FF8’s world is like our own; technology is modern but restrained; magic, if it exists, is severely limited; and healing is done by doctors - not shamans. We’re in a recognisably modern world, which puts us outside of mythology, and by definition, on the outside of history.
Beyond that, though, VIII is coy to provide details. You’re a mercenary and your job is to complete missions to the letter, not get involved in politics. You’re encouraged to keep the lore of the world, and the interpretation of its events, very distant.
When mythology does come up it’s… hazy. Only occasionally do you find characters willing to bring up how they think the world was created.
1 | Old Man to Grandchild (overheard): |
The Hyne story has the ring of a myth that’s been translated too many times to be intelligible, cribbed from the real-world mythology of Atra-Hasis, written 4,000 years ago.
There are “Guardian Forces” that resemble beings from religion and myth, but they’re a motley pantheon with no obvious origin. Depending on which fan theories you subscribe to, GFs like Griever might even be memes come to life, rather than beings that ever really existed.
By setting the game in the modern day and telling us to withhold our judgement, Final Fantasy VIII wants us to keep history and myth at arm’s length. Even when magic and monsters are introduced, it’s always behind a semi-scientific or sci-fi mechanic, like the Lunar Cry or electromagnetic radiation.
Perhaps it’s also why the focus of play is so much about stats and “junctioning” stats outside of battle, rather than performing superhuman feats in-battle: a game that’s sceptical of myth is just as sceptical of heroes, and underplays their power. Speaking of which -
In Squall’s SeeD exam, your role is simple: follow a mission brief, don’t leave the assignment zone, don’t engage with civilians. You even get points docked from your exam result if you speak to NPCs. What the Dukedom of Dollet is, and what Galbadia want with their broadcast station, is none of our concern. We’re military subcontractors, not heroes.
Except… we _do_ have a hero on the team: Seifer Almasy. He tells us himself - and he seems to be the only person who gives a shit about what Galbadia are up to in Dollet.
Seifer carries the same weapon as Squall, with the same R1-for-bonus-damage gimmick. He has unique “limit break” conditions and his “posse” of Raijin and Fujin give him a party of three - just like your traditional Final Fantasy protagonist.
But Seifer is punished severely for his heroism.
On return from Dollet, he’s the object of utter contempt from Quistis and Xu - because Seifer stepping above and beyond represents a loss of earnings for SeeD. They’d rather the Galbadians run riot and burn Dollet to the ground: it’s good for business.
But Seifer is irrepressible: when he abducts President Deling in Timber, it’s because he’s outraged your team of three junior SeeDs were sent on a suicide mission:
1 | Quistis: |
Seifer’s problem, though, is that he’s too self-aware. He’s theatrical, egotistical, completely obsessed with books and movies. He quotes the Sorceress’ Knight at length to Squall and he holds his gunblade exactly like Laguna does in the Sorceress’ Knight movie:
Seifer’s fiction-obsession makes him live like a protagonist, fighting increasingly implausible odds. And boy, it costs him everything. Not just his humiliating defeats, but his values, his dignity, his selfhood.
On Disc 2 he taunts your party for fighting like monsters, a swarm of three against one. By Lunatic Pandora, though? He leaves any code of honour behind as he seizes Rinoa off-screen and sacrifices her to the monstrous Adel.
His grand, theatrical coat is by then worn and ragged; his slick-backed hair greasy and unkempt. Raijin and Fujin no longer recognise him. He’s become nothing, literally a Cipher in someone else’s story.
One of the things that always bothered me about FFVIII was the void where its villain is. I was used to Final Fantasy games with grand, bombastic, larger than life villains like Kefka, Sephiroth or Kuja. Ultimecia by comparison has very little screen presence.
I counted, and excluding what she says through Edea, she only directly speaks about eight lines across the whole game. Yet she’s the ultimate cause of the entire Time Compression plot. Why she wants to do it is unclear, and what it even means to compress time is a very abstract threat.
Well, if we want to understand Ultimecia, we should start with what she says whilst controlling Sorceress Edea - these are her first lines in-game, if we include possessions:
1 | Edea: |
You probably know already that Ultimecia is going back in time to enact revenge for the oppression she’s suffered across history. But take another look. She’s very focused on fantasy and reality: the fantasy of the evil sorceress; the reality that’s unfolding; the threat to live that fantasy after all.
Ultimecia is specifically punishing people for mythology by making their myths real. She turns statues of mythic monsters into living ones and lets them loose on the crowd.
It’s easy to read Ultimecia as a garden-variety megalomaniac, but I see her as a force of History. She lives to grieve the past, and even the parts of history that weren’t true, she’ll go back in time to make real retrospectively.
Ultimecia is obsessed with history. Why does a woman 2000 years in the future live in a Renaissance-style castle, replete with sixteenth-century style portraiture? Even the castle soundtrack is a Baroque ensemble of organs and harpsichords, a pastiche of 17th century music:
Compressing time - making all moments be experienced at once - is just another facet of that. Because that’s essentially what History, when it runs out of control, really does: by “over-matching” patterns across time, it “compresses” them into a simpler narrative. Everything becomes About The Time War, and nothing else has real meaning.
Once you become obsessed enough with a historical narrative, you’re living Time Compression already: experiencing the same meanings and emotions over and over. All joy becomes empty when you know only grief for the inevitable future; people and places just become interchangeable soldiers on the forever battlefield. There’s no meaning to anything, just war.
Eventually Ultimecia’s grief becomes literal as she merges her body with a monstrous, man-eating being called The Griever. Once that’s destroyed, she reveals her real form, whose head is just an empty void cradling a brilliant golden light. To me it always resembled a solitary candle in the blackness, a symbol of memorial to everything Ultimecia has lost, the ultimate sum of what’s really inside her.
Enough on Ulti. Let’s talk about SeeD for a moment. I think they’re far more morally grey than first appears.
Let’s review: to fight Ultimecia, Cid Kramer converted an orphanage into an elite paramilitary academy. He locates orphans, as well as displaced, vulnerable and “troubled” children, and he trains them up as soldiers, released into battle at age 15. To increase their power, he introduces these children to Guardian Forces. What he doesn’t disclose is that GFs demand a sacrifice for their power, by replacing memories in their hosts’ heads.
Children who grew up together greet one another as strangers. Garden guidelines discourage forming relationships. SeeDs don’t have friends, just colleagues, and they’re willing to fight former compatriots to the death if a contract demands it.
If you can’t remember anything, and you can never form friendships or relationships, only fight, I have to question - just how different is that existence really from Time Compression? If your life is endless battle against the Sorceress with no friends, no connections, and no personal history, are you really doing that much better than what Ultimecia has in store for you?
At the end of history, all that was left of Ultimecia was a glowing orb, a dying light of memory and grief. So what was left in Squall’s head after ten years of SeeD brainwashing and repression?
And let’s not forget how many SeeDs - mostly teens - die in this millennia-long war against the Sorceress. FF8 has a high-school setting to remind you that these people are kids. They talk about love, exploration and romance, because they are young adults with a lifetime of possibility ahead of them. These are the people Cid Kramer sends on suicide missions against Galbadia, because legend has it that a trio of SeeDs eventually win. It’s horrific.
There’s a bigger theme I want to strike at, that stretches across other Final Fantasies. I think a lot of FF games in the late-90s era use child (or teen) sacrifice as a shorthand for civilisations that are fundamentally sick.
IX was probably the most overt about this: Vivi Ornitier and the Black Mages are beings manufactured to die young. IX’s lore is about an ancient planet of infertile immortals, the Terrans, consuming the life energy of younger planets in a bid to keep their civilisation alive.
X continues the idea with Summoners: teenagers who volunteer for a dangerous pilgrimage to fight a being called Sin - that they can only subdue by sacrificing their lives. This is a fraud, however, perpetuated by the elderly and undead priests of Yevon.
These games have societies that have stopped being able to replicate themselves, and are eating their own young to keep going. It’s grotesque and intuitively unsustainable.
In Final Fantasy VII the crisis is all-pervasive. The world is controlled by the dystopian Shinra Electric Power Company, whose reactors gobble up the very “lifestream” that’s required to conceive new life.
FFVII’s society is sacrificing the young as a literal power source. Everywhere we go in VII’s world, youths are leaving to go work in the city, whilst their villages slowly die. Shinra’s armed force SOLDIER recruits teenagers aged 14, and subjects many of these kids to human experimentation, with brutal results:
In FFVIII, meanwhile, kids are chewed up by the war machine on both sides. SeeD seeks out vulnerable children to turn into young soldiers. Meanwhile Sorceresses like Adel scour the land kidnapping little girls who might make successors.
So I don’t think we can excuse how SeeD operates as just a bit of a plot contradiction, because the theme crops up too many times in FF to be anything less than deliberate. SeeD might be better than Ultimecia’s tyranny, but the sacrifices they make are on a similar level of atrocity - and end up triggering Ultimecia’s revenge anyway.
And it largely happens because Cid Kramer’s obsession with history that hasn’t even happened leads him to decide that the ends justify the means.
Squall’s arc cuts through this endless cycle of revenge and war.
For Squall, becoming part of the past doesn’t mean becoming a legend, like it does for Seifer. It means losing control of your story and becoming part of someone else’s narrative - that’s what really freaks him out about death:
1 | Zell: |
As Squall grows through the story, his journey is to let go of his childhood trauma and embrace the possibility of new friends and relationships. He stops caring about any narrative larger than that. Fighting Ultimecia is a faraway second to finding Rinoa:
1 | Zell: |
Whilst the gang go on about All Elusive Time Kompression, Squall Leonhart couldn’t give a fig. He only knows one thing:
1 | Squall: |
In the end, Squall doesn’t care about the binary of Knights versus SeeDs. He’s indifferent to history and contemptuous of plot. Falling in love with Rinoa offers him a future that sits outside the endless Sorceress War.
So the future Squall cares about - one with Rinoa - is a timeline that isn’t bound within the Ultimecia time loop: it’s not part of the present day nor the final battle of SeeD. It’s also completely outside the scope of the game’s plot - this is an important idea, and we’ll come back to it.
Squall’s not the only one indifferent to the Sorceress War or even FF8 lore itself. Doctor Odine, VIII’s Einstein-figure and lore-explainer, positively can’t be bothered:
1 | Doc Odine: |
This is our one opportunity to learn the truth about what Ultimecia truly wants. And it doesn’t matter.
I used to rage at this scene! How could Square leave this completely ambiguous? How could they refuse me the neat explanation that tied everything together, like I had in all the other FF games? With no explanation beyond power itself it just seemed like pointless nihilism.
Now I feel differently. Too many characters in Final Fantasy VIII want their equivalent of an Odine-brand explanation: A single sentence that summarises the history of the FF8 world, nice and pat. A single sentence that draws a battle line in an endless war between side A and side B. I did too.
But getting that sentence, getting that explanation, is Time Compression. Reducing everything to a single experience, a single meaning. Nothing else meaningfully matters in that worldview. No space for love, hope, or anything outside endless war.
Doing that is as nihilistic as anything Ultimecia does. I am Ultimecia. Time shall compress… All existence denied. Grief and grief alone motivates her to reduce all existence to her own unquenchable anger and loss, perpetuating itself over the centuries. If Squall had taken the SeeD Prophecy too seriously, he could have accidentally done the same.
History is the real villain of Final Fantasy VIII and the game wants no part of it. Odine could have been written to fob us off with a simple explanation, about Ultimecia wanting absolute power or whatever. No. Instead he tells us that it truly isn’t important.
We should take him at his word. None of the lore about Hyne or Griever or the Guardian Forces goes anywhere - because the game wants us to move onto a different story.
I want you to have another look at the final scenes of Final Fantasy VIII:
It’s a camcorder movie, of all of Squall’s friends, celebrating in the Garden. Zell is munching his beloved hot dogs. Quistis is finally relaxed and happy. Selphie and Irvine are reconnecting after years split apart. Even Edea and Cid are remarried. Everyone is in good spirits, but where is Squall?
Rinoa turns to the camera, smiles, and it’s obvious: Squall’s behind the camera, filming a new story of his own.
]]>Maybe not rituals like those of ancient humans - worshipping the arc of the sun, or the retreat of the winter frost - but even modern lives have an ebb and flow, are patterned in cycles. The beginning of September school term; the pre-Christmas drinks. The dry Januaries and the big spring cleans in March.
If premodern rituals let early humans gather in groups to celebrate the passage of time, fatten up on available foods and renew their relationships with each other, modern rituals serve a similar purpose. They bond us and help us make sense of the continuum of time. Their novelty and intensity makes them landmarks of our memory.
Rituals help turn time from an unbounded axis to an ordered cycle, punctuated with wine, family and song. Knowing a new year is beginning is a spur to action, to improve ourselves and remember who we intend to be.
Observing time’s passing has its cutting edge, too: the season for suicide isn’t Christmas, but Spring. The idea of starting a whole new year can be too much for some of us. Chaucer sang of “April’s showers sweet”; TS Eliot called it the “cruellest month”. Both chose April as their day zero. Humans think and live in cycles of time.
But lockdown has put time on pause. It has hushed our parties and quenched our hearths. It has quietly devastated little family habits and quirks. Each one has its own. The big Easter Sunday lunch. The summer barbecue with all the friends you mean to see more often. Your son home from uni to help put up the Christmas tree, as he does every year. COVID has flattened out the fabric of time into an unceasing undifferentiated “present”.
COVID-time has no landmarks, and no celebrations of life or renewal. No weddings or kids’ birthdays, no house moves or new jobs. It is like a long run-on sentence: without punctuation, it has no resolutions except endings and no breaks except full stops.
We have lost time, but more fundamentally we have lost “time”. It’s not as painful a loss as the thousands of deaths, or the millions of lost jobs, the mounting debts and widespread fear for our loved ones. No-one could, no-one would, argue that. Nor is this any kind of anti-lockdown post. All I am writing is that time is one of several smaller losses, and insidious exactly because it is so subtle.
I fear I can feel it already. With my friends, it gets harder every week to think of things to say, think of things to write. Self-isolation becomes the default, and it - disturbs me how comfortable I’m getting with my own company. What scares me is that we could come out of lockdown strangers to each other. We could come out of lockdown expecting a return to normality, when we still have all the hard work of learning to live together again, and the disappointment could be crushing.
In April I made a mistake. I pictured COVID-19 as a temporary problem with a definite end, sometime near Christmas. I put too many things off: staying in touch, learning new things, celebrating good news, getting back in shape - because I felt I had to “get through” COVID first.
Perhaps you did the same? It was folly though: even with a vaccine, COVID isn’t going anywhere soon. The effort to get back on track, get the jabs distributed, get the shops back open, get people back to where they were - will be tremendous.
Let’s stop allowing Coronavirus to hold time hostage. Make time to make memories, and don’t be afraid to reach out. Send a friend a message asking if they want a phone call. Get a group video call going, go back to all the ideas we had in March and April. It will feel awkward, and if you’ve been alone as much as me this year it will probably feel uncomfortable.
Nevertheless. Humans need a sense of time: it helps us regulate ourselves emotionally, psychically and physically. It puts positive pressure on us to pursue the things we’d otherwise forget. It inspires us to renew ourselves and interrogate whether we are really living the life that meets our needs. Fight hard for a way to get some rituals back. I wish everyone the very best of luck.
]]>First, you’re going to need an emulator. Dolphin is a reliable, fast, well-supported GameCube and Wii emulator with support for PC, Mac and Linux. You will need to use the Windows build to get mouse and keyboard support (as far as I know), but if you’re on another platform you can still play with a joypad.
I used version 5.0 to play the game and it worked fine.
Dolphin isn’t able to play GC games straight off the disc, so you’ll need an ISO of a disc backup. Obviously, I can’t provide you one, but I can tell you that you can use a Wii to extract a backup image. Alternatively… I suppose you could search the internet for an ISO. Not that I would condone such a thing, of course.
The mouse injector is a Windows program that sits between your mouse and your running Dolphin instance, translating inputs from one to the other.
You can download the mouse injector here.
Unzip the folder and copy the contents into your Dolphin installation folder. You’ll notice it overrides the Dolphin.exe
file - this is needed so the injector can discover your game.
Open Dolphin and go to the controller settings - be sure to set the controller profile to TimeSplitters
.
To make the mouse injector work you have to follow a few steps in order. Don’t worry, it’s not too complicated:

1. Run the `Dolphin.exe` you’ve just copied into the Dolphin folder
2. Boot your game, making sure the TimeSplitters controller profile is set for Player 1
3. Run `MouseInjector.exe`. It should detect the running `Dolphin.exe`
The MouseInjector works pretty well, but there are a couple of moments you might get stuck, at least in Future Perfect:

- In Khallos Express (the train level with Harry Tipper), you can’t defuse the bomb with the default mouse and keyboard controls. Instead you’ll need a workaround.
- In Machine Wars (set in the future, with R-110), during the section where you have to control the tank’s cannon, you can’t move the aim up and down. You can still beat the section, you just have to be exact with your timings and hit the spacecraft as they fly straight overhead. It is annoying though.

Create a repository named `[username].github.io`
. Github will serve whatever content is on `master` under that URL. Then:

1. Create a `src` branch - this will contain the source content of your static site
2. Add a `.travis.yml` file to the root of `src` and configure your build (details below)
3. Create a Github personal access token with the `repo` scope and note the value
4. In TravisCI, open the settings for `[username].github.io` and add an environment variable `GH_TOKEN` equal to the access token
5. TravisCI will then build the site from `src` and output the result to `master`
The .travis.yml
file is needed to configure CI:
1 | sudo: false |
(The fqdn
property is needed so that master will contain a CNAME
file, needed to resolve your Github pages site for a custom domain)
The build.sh
script will need to be written for whatever static site generator you use. In my case I’m using Hexo with a theme I’ve shared on Github:
1 | #!/bin/bash |
Once the build has run, you should have a `master` branch with an `index.html` and a `CNAME` file at root. Visit `[your-username].github.io` and see your lovely content.

Google Domains is fairly cheap (a dotcom address is about £10 a year) and highly convenient, but most importantly, it lets you forward emails for a custom domain to Gmail without paying for GSuite. If you want to use another domain registrar, you’ll probably have to sort out an alternative webmail provider.
Signing up to Google Domains is very quick though - the only holdup will be if you’re transferring an existing domain, in which case you’ll need to jump through a couple of hoops handing over an EPP code and (typically) replying to some emails from your old registrar.
Once done, Google Domains provides a UI for managing email forwarding via your custom domain to any email you choose. You don’t even have to write any MX records.
At this point, you should be able to send an email to e.g. `me@yourcooldomain.com` and have it land in your Gmail inbox.
So you can receive emails to your custom address, but how do you reply with the same address? You need to set up Gmail aliases.
Rather helpfully, Google have written a guide to do just that. You will need to have 2FA enabled on your account.
Head to Google Domains / your domain registrar and get ready to write some DNS entries. We’ll need two:
- An `A` record for name `@` (meaning: the current domain) pointing to these IP addresses:

1 | 185.199.108.153 |

- A `CNAME` record for name `www` (your subdomain) pointing to `[your-username].github.io.` (note the dot at the end)

Save the changes, wait a short while, and check the DNS entries are resolving correctly by clearing your DNS cache (Google how to do this for your OS) and using `dig` (*nix) or `nslookup` (Windows) to check the IP resolution for your domain.
Sometimes DNS propagation can take a while - if it’s been more than an hour and things still don’t seem right, try a tool like https://www.whatsmydns.net/ that will make DNS lookups on your behalf using servers across the globe. It’s highly possible things are still being cached on your end.
Once this clears everything should be sorted - you’ll be able to see your static website at www.mycooldomain.com
and send/receive emails from your custom domain too. Publishing to your blog is just a matter of pushing content to your src
branch - TravisCI will pick up and deploy changes automatically. And the only thing to pay for is the domain itself.
`async/await` and `Proxies` are one thing, but every year there’s a steady stream of small, incremental improvements that go under the radar for me, as there’s always something bigger to learn.

So in this post, I’ve collected some modern JS features that didn’t get much airtime when they first came out. Some of these are just quality of life improvements, but others are genuinely handy and can save whole swathes of code. Here are a few you might have missed:
Binary manipulation isn’t something one has to do very often in JavaScript, but every so often a problem rolls around that just can’t be feasibly solved otherwise. You might be writing high perf code for lower power devices, squeezing bits into local storage, doing pixel RGB manipulation in the browser, or having to work on tightly packed binary data formats.
This can mean lots of work masking / merging binary numbers, which I’ve always felt is needlessly obscured in decimal. Well, ES6 added a binary literal number format just for this purpose: 0b
1 | const binaryZero = 0b0; |
This makes binary flags really easy:
1 | // Pizza toppings |
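The flags example above is cut off, so here’s a hedged sketch of how binary flags typically work - the topping names are my own invention:

```javascript
// Each topping gets its own bit
const CHEESE    = 0b0001;
const PEPPERONI = 0b0010;
const MUSHROOM  = 0b0100;
const OLIVES    = 0b1000;

// Combine flags with bitwise OR...
const order = CHEESE | MUSHROOM; // 0b0101

// ...and test for them with bitwise AND
const hasMushroom = (order & MUSHROOM) !== 0; // true
const hasOlives   = (order & OLIVES) !== 0;   // false
```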
Likewise for octal numbers. These are a bit niche in the JS world, but they’re quite common in networking and certain file formats. You can now write an octal with the syntax 0o
.
Not to be confused with window.isNaN()
, this is a new method with much more intuitive behaviour.
You see, the classic isNaN
has some interesting quirks:
1 | isNaN(NaN) === true |
What gives? Bar the first, none of those parameters are actually NaN
. The problem, as usual, is everyone’s “favourite” JavaScript feature: type coercion. Arguments to window.isNaN
are coerced to numbers via the Number
function.
Well, the new Number.isNaN()
static method solves all that. It returns, definitively, once and for all, the equality of the argument you give it and NaN
. It is utterly unambiguous:
1 | Number.isNaN(NaN) === true |
Signature: Number.isNaN : (value: any) => boolean
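Since the blocks above are truncated, here’s a reconstruction of the kind of contrast involved:

```javascript
// window.isNaN coerces its argument to a number first, so values that
// merely *fail* to coerce get reported as NaN:
isNaN(NaN);       // true
isNaN('hello');   // true  - Number('hello') is NaN
isNaN(undefined); // true  - Number(undefined) is NaN
isNaN('42');      // false - coerces cleanly to 42

// Number.isNaN does no coercion: only the real NaN value qualifies
Number.isNaN(NaN);       // true
Number.isNaN('hello');   // false
Number.isNaN(undefined); // false
```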
This crops up now and again, so it’s nice to have a literal syntax for powers:
1 | 2**2 === 4 |
(It’s weird because I was convinced JavaScript already had this - I may have been thinking of Python)
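A quick sketch of the operator in use:

```javascript
const kb = 2 ** 10;     // 1024 - equivalent to Math.pow(2, 10)
const cube = (-3) ** 3; // -27 - the base must be parenthesised here,
                        // as "-3 ** 3" is a SyntaxError
```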
This one was a little hard to miss, but if you’ve been spending the last three years writing array.indexOf(x) !== -1
, rejoice in the new includes
method:
1 | [1, 2, 3].includes(2) === true |
includes
uses the Same Value Zero Algorithm, which is almost identical to the strict equality (===
) check, except that it can handle NaN
values. Like an equality check it will compare objects by reference rather than contents:
1 | const object1 = {}; |
includes
can take a second parameter, fromIndex
, which lets you provide an offset:
1 | // positions 0 1 2 3 4 |
Handy.
Signature: Array.prototype.includes : (match: any, offset?: Int) => boolean
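A sketch of `includes` in action, with the `fromIndex` offset and the NaN handling - the sample data is my own:

```javascript
const letters = ['a', 'b', 'c', 'b'];

letters.includes('b');    // true
letters.includes('z');    // false

// fromIndex: start searching from position 2
letters.includes('a', 2); // false - 'a' only appears before index 2
letters.includes('b', 2); // true  - the 'b' at index 3 still matches

// Unlike indexOf, includes finds NaN
[NaN].includes(NaN);      // true
[NaN].indexOf(NaN);       // -1
```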
This is a great pair of features that may prove invaluable if you’re doing a lot of work with web workers. They allow you to directly share memory between processes, and set up locks to avoid race conditions.
They’re both quite major features with fairly complex APIs, so there isn’t space to give them an overview here, but take a look at this Sitepen article to learn more. Browser support is still spotty but should hopefully improve over the next couple of years.
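Just to give a flavour of the API shape, here’s a minimal single-threaded sketch - in real use you’d hand the buffer to a worker via `postMessage`:

```javascript
// SharedArrayBuffer allocates memory that can be shared between threads;
// Atomics reads and writes it without tearing or races.
const sab = new SharedArrayBuffer(4); // room for one Int32
const shared = new Int32Array(sab);

Atomics.store(shared, 0, 41);          // atomic write
Atomics.add(shared, 0, 1);             // atomic read-modify-write
const value = Atomics.load(shared, 0); // atomic read: 42
```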
ES2018 introduced a whole flurry of regular expression features:
In runtimes that support it, you can now write a regex that looks for characters before your match. For example, to find all numbers prepended by a dollar:
1 | const regex = /(?<=\$)\d+/; |
The key is the new lookbehind group, evil twin to lookahead groups:
1 | Look ahead: (?=abc) |
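A sketch of both directions, with sample strings of my own:

```javascript
// Lookbehind: match digits only when preceded by a dollar sign,
// without capturing the '$' itself
const price = /(?<=\$)\d+/;
'Total: $42'.match(price)[0]; // '42'

// Negative lookbehind: match numbers NOT preceded by a dollar sign
const notDollar = /(?<!\$)\b\d+\b/;
'5 apples at $3'.match(notDollar)[0]; // '5'
```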
Unfortunately there isn’t presently any way to transpile the new lookbehind syntax for older browsers, so you may be stuck with just using this on Node for the time being.
A really powerful feature of regex is the ability to pick out sub-matches and use them to do some simple parsing. But until recently we could only refer to sub-matches by number, e.g.
1 | const getNameParts = /(\w+)\s+(\w+)/g; |
But there’s now a syntax to assign these sub-matches (or capture groups) names, by putting `?<name>` at the beginning of the parens for each group you want to name:
1 | const getNameParts = /(?<first>\w+)\s(?<last>\w+)/g; |
Unfortunately this is Chrome- and Node-only for the moment.
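A sketch of named groups in practice (sample name mine; I’ve dropped the `g` flag so `match` exposes the groups directly):

```javascript
// Name each capture group with ?<name> and read results off .groups
const getNameParts = /(?<first>\w+)\s(?<last>\w+)/;
const match = 'Ada Lovelace'.match(getNameParts);

match.groups.first; // 'Ada'
match.groups.last;  // 'Lovelace'

// Named groups can also be used in replacements via $<name>
'Ada Lovelace'.replace(getNameParts, '$<last>, $<first>'); // 'Lovelace, Ada'
```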
You just have to provide the /s
flag, e.g. /someRegex./s
, /anotherRegex./sg
.
I was so pleased to see these on MDN.
flat()
, very simply, flattens a multi-dimensional array by a specified maximum depth
:
1 | const multiDimensional = [ |
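Since the example above is cut off, here’s a sketch of how `flat` behaves at different depths - the array contents are my own:

```javascript
const multiDimensional = [1, [2, 3], [4, [5, [6]]]];

multiDimensional.flat();         // [1, 2, 3, 4, [5, [6]]] - depth defaults to 1
multiDimensional.flat(Infinity); // [1, 2, 3, 4, 5, 6] - flatten everything
```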
flatMap
is essentially a map
followed by a flat
of depth 1. It’s handy when a mapping function returns an array but you don’t want the result to be a nested data structure:
1 | const texts = ["Hello,", "today I", "will", "use FlatMap"]; |
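The block above is truncated, so here’s a sketch of the `texts` example - a reconstruction, not the original code:

```javascript
const texts = ["Hello,", "today I", "will", "use FlatMap"];

// map alone would leave us with a nested array of word-arrays;
// flatMap flattens one level as it goes
const words = texts.flatMap(t => t.split(' '));
// ['Hello,', 'today', 'I', 'will', 'use', 'FlatMap']
```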
You can now write try/catch statements without binding the thrown error:
1 | try { |
Incidentally, catches where you don’t care about the value of e
are sometimes termed Pokémon exception handling. ‘Cos you gotta catch ‘em all!
Pretty minor but a nicety:
1 | const padded = ' Hello world '; |
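The block above is cut off, but given the whitespace-padded string it’s presumably demonstrating the new `trimStart`/`trimEnd` methods - an assumption on my part:

```javascript
const padded = '   Hello world   ';

padded.trimStart(); // 'Hello world   '
padded.trimEnd();   // '   Hello world'
padded.trim();      // 'Hello world' - this one has been around for ages
```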
If you liked this post, let me know and I might find the time to write up something similar for TypeScript!
]]>On the first: DHCLG must implement recommendations of the Hackitt Review, including a more effective ‘outcomes-based’ building safety regulations framework, with clearer terms and responsibilities. It must move swiftly to identify risks in existing high-rises, and cooperate fully with the Grenfell Tower Public Inquiry.
On the second: the department must explore ways to stimulate further housebuilding. It could open up opportunities for SMEs and housing associations to build more homes. It could consider helping organisations raise more capital to invest in this. It could focus on ensuring planning-consented sites are built out faster. Above all it must do so without losing sight of housing quality.
But new builds help no-one if they sit empty. DHCLG should investigate ways to improve the efficiency and cost of the home-buying process, and give private renters more clarity over fees.
At a communities level, DHCLG has ongoing but important work to do developing an Integrated Communities Strategy, encouraging social mixing. Given patchy evidence for the benefit of previous interventions, it should establish meaningful measurements to help track the impact of individual pilot policies and build an evidence-based ‘toolbox’ of approaches that work.
Finally: Brexit. DHCLG must work quickly to establish a UK Shared Prosperity Fund that will cover the shortfall in local authority EU funding.
]]><script type='application/javascript'>
With this little incantation a website author - or ‘webmaster’ - had the power to launch his or her visitors on a fantastic journey to infoscapes hewn from pure imagination. Exhilarating games, virtual shopping malls, columns of animated flames and those little visitor counters you never see any more. All powered by the humble <script>
tag. OK, so the web of the 1990s and early 2000s wasn’t terribly elegant. But it was very easy to develop websites. All you had to do was plop some files on an Apache server and point a bit of XML at the appropriate resource. There was no notion of modules, or bundling, or minification, or code splitting. No Gulp or Grunt or Webpack or Broccoli. Just plain old HTML.
What if I told you there was a way to make webdev simple again?
Parcel.js is a web application bundler, a replacement for the likes of Webpack and Rollup. You give it an HTML file (or JavaScript file, if you prefer), it reasons about all the assets you need, and it outputs them all in a single, specified folder. Where Parcel differs from Webpack is that it requires no configuration whatsoever: no finagling to consume different assets or choose your sourcemap ‘strategy’. Parcel just infers it from your code.
At a time of rapid churn and constant change in the JavaScript world, replacing something as fundamental as Webpack might seem like technological arson. But we cannot solve our present problems by doubling down on yesterday’s tools. They are, at least partly, responsible for the issues we face. Webpack’s plugin-based architecture means managing an enormous list of devDependencies, each one prone to change and obsolescence every time the wind turns.
Is my career as a Webpack whisperer at an end?
(cries tears of joy)
In comparison, Parcel simply infers as much as possible from the semantics of your HTML, CSS and JavaScript itself. Need a second HTML file? Use a hyperlink! Need content for an iframe? Set its src. Want code splitting? Use the official ES6 dynamic import syntax and everything just works.
Of course, modern web applications rely on much more exotic technologies than 2002 Geocities pages, and Parcel has you covered. It can consume TypeScript files, React templates and Vue files out of the box. It can handle your usual range of CSS preprocessors, and the production build (passing a CLI flag) will minify, optimise and MD5-version-stamp your assets utterly without fuss.
Should you ever need extra functionality - and in my experience so far, it’s been rare - just add a plugin to your package.json. Parcel will sniff it out and call it automatically. (Why can’t other things be this simple?)
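As an illustration - the plugin name below is hypothetical, though the parcel-plugin- prefix convention is real - enabling a plugin is nothing more than listing it in package.json:

```json
{
  "devDependencies": {
    "parcel-bundler": "^1.9.0",
    "parcel-plugin-example": "^1.0.0"
  }
}
```

Parcel scans your dependencies for anything matching parcel-plugin-* and loads it automatically, no config file required.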
It’s also very quick - almost an order of magnitude faster than Webpack 3.x builds, by my count - which is doubly remarkable: one, that builds taking thirty seconds now take three; and two, that this is somehow the least interesting of all Parcel’s features.
It is not perfect. The bundler is very new and has a few beta-ish bugs; the documentation is sparse and so far tree-shaking is not supported (though it is on the project radar). Nevertheless, working with Parcel on three commercial projects has been an absolute joy, and it has become my go-to for new projects. Try it sometime!
]]>What we don’t often see is an examination of why.
The typical explanation (a la r/programming) seems to be something or other about webdevs being naturally impatient, faddish and incompetent, which may constitute a more general fallacy: assuming behaviour you cannot understand is caused by an entire group being foolish, wicked or greedy (whereas your own unwise behaviour is due exclusively to factors beyond your control).
Still, fallacy or no, we do have a problem - don’t we?
Before we get carried away, it’s worth validating whether the meme really has basis in reality. Do front end technologies actually change that quickly?
In the sense of major view technologies, probably not. Consider this list of the highest ‘starred’ JavaScript front-end technologies on Github:
[Table: the highest ‘starred’ JavaScript front-end frameworks on GitHub, with their ages]
2.5 years for the youngest isn’t that old in the scheme of things - it’s less than half the support lifespan of your typical desktop OS, for example - but it’s still a ways off our caricature. So what is causing this perception of rapid, even unsustainable change?
It might be React. As powerful a tool as it is, it requires an army of helper modules and support libraries to be used seriously, and this is where the problem sets in. The React community is very big on what I would call the ‘microlibrary architecture’, where applications are composed of a myriad discrete, single-purpose JavaScript libraries, in homage to the Unix philosophy.
The advantage of this architecture is that you can easily adapt as new practices emerge, which makes sense at a time of rapid innovation (like the past few years). The disadvantage is that it increases your surface area for breaking changes and demands a great deal (often too much) vetting and selection of said microlibs.
And this is the thrust of my argument: what’s wrong with JavaScript isn’t the language [1], the web, or any technology in particular, but a poor ‘choice architecture’ that makes developers slaves to fads and trends.
Modern JavaScript’s greatest asset - and liability - is NPM. It provides an enormous wealth of modules, catering to just about any specific purpose one can conceive, but it is very difficult to filter and curate. Which ones are really being supported? Which ones are actually functionally correct? Which ones aren’t really just vectors for evil malware? The only heuristic a JavaScript developer can really use is popularity - number of downloads and Github stars - which exacerbates faddishness.
There are other ways to validate a library, of course: you can read through Github issues and search for StackOverflow questions. You can do some testing or even examine the sourcecode for yourself (in most cases). But this takes time, which isn’t really warranted when choosing e.g. a date parsing doodad.
I will concede that this is something of a cultural weakness of JavaScript developers. As an interviewer I often like to ask candidates how they choose technologies, and it depresses me somewhat that popularity is almost always the only marker they know. Software engineering is at least partly a research job and we need to train junior programmers in research skills. But even if we did, the odds would still be stacked against them.
Put yourself in the shoes of a junior-to-mid-level JavaScript developer, writing a new application for the first time.
It starts innocently enough. You have a completely clean slate and want to keep things simple. You are a devout Agilist and YAGNI is your watchword. So you begin with a ‘simple, barebones framework’. That sounds good, doesn’t it? (Even if it did not, that’s often the only choice you’ve got).
Being barebones it does little, so the task falls on your shoulders to choose some helper libraries. If you are doing frontend work, it might be helpers for Redux for forms and API requests. If backend, it might be middlewares for Express [2].
So you do a Google search, which reveals a Medium post that heartily recommends X.js. It later transpires the post was written by X’s author, though she never announces that particular conflict of interest (she does, however, provide a GitTip jar). Not that you could tell - all Medium articles look the same, so you can never rely on a ‘brand’ to identify reputable material.
You miss the replies pointing out some critical inadequacies in X.js, because Medium deliberately suppresses them, and move on to finding a Y.
This time you find a link on Twitter - with over a hundred hearts! You guess that’s a pretty good signal it’s been “curated” by a community more knowledgeable than yourself. You add a heart of your own in gratitude (like the hundred before) and follow the link to Github.
But not so fast. That link was old - the library is now deprecated. You can tell because the word DEPRECATED is slapped everywhere like CONDEMNED signs on a Scooby Doo theme park.
You see, Y.js was “object oriented”. You thought this was a good thing, vaguely recalling something from first year ComSci about Smalltalk and message passing. But apparently it is Very Bad.
Another Medium article tries to explain why, though its reasoning is hazy and packed in dense terminology you don’t recognise. It later turns out the terminology was invented by the post’s author, as were the neutral-looking external blog posts he cited as authorities to his argument.
It gets worse. The post claims that even mentioning OOP in a JavaScript interview will render you utterly unemployable! You are seriously disoriented now. Thankfully help is at hand - in the form of his $50 JavaScript webdev course. You take a note of the link, thinking how lucky you are to have found it, and give another clap in gratitude. (Nineteen thousand and one).
So you move onto Z.js, which seems to have a lot more Github stars, though the documentation seems less useful. Lots of methods are listed, but how do I practically use it? You are heartened at least to see it uses something called ‘Standard JS’, which you assume has something to do with the ECMA Standards Committee. It doesn’t.
But how could you do better, Junior Developer? Who was there to guide you? The Senior Developers, too, are learning as they go. We’re caught in this avalanche too, just trying to keep up to date and remain employable.
So. You take the path of least resistance: you choose the Github project with the most votes, the most stars. And that is why JavaScript dev is driven by fads and hype.
Like most natural complainers I am generally better at moaning about problems than, y’know, SOLVING them. But I have a few ideas:
Medium incentivises clickbait somewhat and makes it harder to distinguish authoritative content. Classical blogging allows good authors to establish a distinct visual theme, which helps visitors recognise a source that’s helped them before.
Over the last few years I’ve seen much more aggressive self-marketing in the JavaScript world, possibly due to the rise of paid online training materials and the employment/consulting advantage of being a Github ‘celebrity’. I’ve no problem with people being incentivised for good content, but increasingly I feel I see dishonest tactics: self-citation; invented, proprietary terminology (so searching takes you back to the author’s materials), and name-squatting (e.g. ‘Standard.js’)
Try to start your projects in frameworks that provide a large surface area of features and don’t require many plugins to get productive - this will immediately reduce the number of moving parts and exposure to unexpected, breaking change. It’s one reason I’m very interested in Vue.js. You could also use React as part of a starter kit or larger framework, like Next.
The only people who need to know a company’s whole stack inside and out on day zero are freelance contractors, who are paid a handsome wage to parachute in and get a project out the door. Otherwise, most employers are absolutely fine with you not knowing the ins and outs of the latest React helper library. So avoid the call to learn absolutely everything: most of it is noise.
[1] Though it has many, many faults.
[2] Can you believe Express requires a middleware just to parse JSON POST bodies? Sorry, but that is utterly bananas.
]]>You might be forgiven for thinking of Malaysia Airlines’ MH370, which disappeared in 2014 under these exact circumstances. But this was the Hawaii Clipper - some eighty years earlier. It turns out these kinds of aviation mysteries aren’t nearly as rare as you’d think. Christine Negroni’s The Crash Detectives is a lively and readable account of “the world’s most mysterious air disasters”.
Aerophobes fear not: this book is no almanac of fire and destruction. It is really about the extraordinary systems, technology and people who not only keep us safe in the air but continuously improve airline safety on the ground, meticulously poring over the wreckage of each accident to ensure it can never happen again. You will finish this book more confident about flying than ever before.
But Negroni is no gearhead and her book is less about the tech than the human factor: how pilots react to stressful situations and pull together as a team. Sometimes they triumph, and these make the book’s most uplifting moments. But sometimes they fail - like one crew so distracted by a broken dashboard light they failed to notice their plane tank into the Florida Everglades. Rather than laugh, though, Negroni would rather ask why: what is it that makes intelligent humans do such silly-seeming things, and how can we design systems to stop it?
She also takes a look at some genuine conspiracies in aviation history, like the mysterious death of a UN Secretary-General and some strange cases during the Iran-Contra affair. It’s genuinely interesting to see what cover-ups actually look like: not hundreds of willing participants and ‘crisis actors’ hired by murky government agencies in a perfectly orchestrated whitewash, but something much more mundane: hasty conclusions, awkwardly revised pathology reports, reams of legal threats and politically sympathetic incompetents installed as investigators.
The Crash Detectives is not just a book for aviation nerds, like me. It should interest anybody who is curious about psychology, human factors, heroism or just inquisitive about the magnificent flying machines we take for granted every day.
]]>What exactly is procrastination? I like to think of it as a conflict between the superego and the id. I am supposed to do something and have turned my spirit towards it. But my body and emotions are quietly rebelling. What if they had good reason to?
Why was I procrastinating?
The obvious explanation is that I was just being lazy. I let my mind drift when it should have been fixed on its purpose, coursing relentlessly towards my goal like a shark stalking its prey.
But could there be a more humane excuse? Maybe I was stalling because I just hadn’t given myself time to get up. Maybe I was daydreaming because I can’t just switch from ‘sleep mode’ to ‘work mode’ like a robot vacuum cleaner. I need to get up earlier and enjoy my morning first. I have to ruminate over what I heard on Radio 4’s Today Programme because that’s just something important to me.
So we might say my procrastination tells me I’m neglecting myself. I’m not getting up because I’m not ready, and rather than suppressing that perhaps I need to acknowledge it.
A person procrastinating is just ‘being’. They are not moving towards any goal or change, and their reason for not doing so comes entirely from within. There is a kind of honesty to that.
What would we make of a person who could never ‘just be’, who was always on the go? I think I’d wonder if there was something wrong with them. Why can’t they be at peace with themselves? What are they distracting themselves from?
A person dithering is giving audience to a full and healthy range of emotions. Fear, doubt, tiredness, contentment - all of these can tell us something that matters, even if it’s not something we particularly want to hear.
A person on the go, on the other hand, is putting their trust entirely in their superego. They think that when the brain’s executive function says - you, run that mile! you, practice that guitar! - obedience is the path to happiness. But the superego has blind spots.
As a young man at Cambridge I lived and breathed my field and became almost possessed by my studies. At age eighteen I had already decided that literary criticism was the only thing the world could offer my interest and every day was pledged to that end. I worshipped my cold, sharp will almost as much as my subject, and I lived much like a monk tending to his holy devotions. I ignored all my other needs and spurned the world quite proudly.
Naturally, it was a failure. No-one can live like that. My relentless studying and utter intolerance for personal failure sent me into a spiral of obsession, compulsion and exhaustion that almost cost me my degree. I had placed too much trust in my superego and not enough in the body and emotions that were trying to tell me to take care of myself, have a rest, be more than just a student.
How often have you thrown yourself into the wrong goal? Perhaps an impressive-sounding career that left you miserable? It’s likely your superego that got you there, but I bet it wasn’t your superego that got you out. The tension at the back of your neck, the cold drop in your stomach that tells you ‘this job is all wrong! this place is all wrong! all this, it’s all wrong!’ - that’s your id, loosely defined, and that’s what tells us to procrastinate. We should give it more credit.
I don’t trust my superego, because I know my limitations and how often my ambitions run ahead of them. And I’m not sure I trust people who can’t procrastinate, either, because people with neither fear, uncertainty nor self-doubt can be terrible individuals. One of the most productive and relentless men I ever knew was also a mendacious sociopath. He was as merciless with you as he was with himself.
But a person who does nothing but procrastinate might be living life dishonestly. If they keep deciding one thing and then quietly resisting they may be making decisions that tally poorly with their real needs. Maturity involves listening to your doubt, indecision and fatigue, taking them seriously, along with all the dreams and the commitments and the ambition, and reconciling them realistically.
]]>The mod also includes a package to use all the plant weapons (except the HF blade, sadly). It was produced by the now-defunct RedCode Interactive.
Demonstration of the mod:
Mod files: tinyurl
Mod file mirror: mirror
To use the mod, unzip and point the installer at whatever directory holds your MGS2 installation. This mod tool replaces files, so you might want to backup your MGS2 folder first.
You can play using several models, including Fortune, MGS1 Ocelot, and a Gurlukovich Commando - but Solidus remains my personal favourite. Sadly there seems to be no equivalent mod for the Plant chapter.
]]>You start the first game literally ‘on a rail’, trapped on a commuter train shuttling you towards a humdrum day as a ‘research assistant’ at the Black Mesa Research Facility. This fast-paced go-getting career entails climbing ladders, pressing brightly lit buttons and pushing carts on the command of uppity scientists who watch your every move.
You are late, and everyone is most keen to ‘helpfully’ remind you of the fact. Gordon is on the bottom rung. He barely even earned the distinction of his PhD.
But as he goes on he gains more agency and notoriety. He stops fleeing for the surface and starts seeking out the Lambda team. He destroys bigger and badder targets and the military begin to single him out. Eventually he becomes the personal saviour of Lambda and is entrusted as the only man with enough combat experience to take on the Xenian threat at source. By the end of the game the Nihilanth, the alien leader himself, is actually addressing you by name!
And this is the point of the ending – to remind us just how much Gordon has changed. He started the game on a rail, but his last action happens to be the very first ‘choice’ the player can make interactively – to step off the train into their new career as intergalactic assassin-for-hire. That’s quite a promotion.
Half Life 2 does get a bit carried away with its Gordon-as-messiah trope and may approach parody a couple of times. But the game’s central arc always felt to me a pretty well-executed power fantasy: in the first half Valve builds up the Combine threat, in the second half Gordon overthrows it.
If there’s a single moment where it all pivots it’s when Gordon deactivates the first thumper on the approach to Nova Prospekt beach. Combine technology has pushed you around all game, blocking your way, covered in indecipherable hieroglyphs and utterly indestructible. Now you take control of it directly, break into Overwatch territory and tear their fortress down. It’s giddy stuff.
By the later Episodes Gordon is a free agent entirely. The G-Man loses control and can only beg the player to save Alyx Vance. Episode 2 opens with a rail crash, and I’ve always felt this is a callback to the tightly controlled train sequences of the earlier games. Everywhere we look are wrecks of broken carriages.
By the end of the game Gordon is entirely independent: Eli’s death removes the last authority figure and Gordon stands with Alyx by choice.
And now we’re told that HL3 was to subvert this arc entirely? That Gordon and the Borealis were nothing but an insignificant mote, a half-second speck in the Combine multiverse, barely earning a synapse’s pulse in the intergalactic sentience? It’s inconsistent and it’s shoddy. You can get away with this if you do the work laying down the clues early on. You can get away if there’s always some tension, always some uncertainty just how powerful the hero is. But you can’t get away with telling the player how powerful they are for fifteen years, growing and growing their silhouette and their accomplishments, then popping the bubble in the final act. That’s having your cake and eating it.
It’s also a cheap way to cover up plots you can’t or won’t tie up. You can’t generate all these lines, set up all these places and characters, then tell us we’re not getting a proper ending because the universe is, like, so incomprehensible to our monkey minds, maaan. But stick around for the spin-off series because Alyx is still alive and we ain’t done making money yet! You can accuse Valve of many things, but the one thing they’ve never done is eke out the HL franchise for extra cash.
So, I don’t buy it. I am convinced Laidlaw’s “fanfic” was nothing more than that – a half-formed offshoot from an old revision of the plot, a flight of fancy or just a bit of speculation how the whole thing could play out. I believe the real HL3 plot was never nailed down and never now shall be.
]]>With shrinking manufacturing industry and the threat of automation as hot-button topics in the press, then, Utopia for Realists will be read eagerly by technosceptics, anti-capitalists and futurists alike. But how well will it satisfy them?
Here’s what I think is a fair-minded assessment: UFR is a fairly short book with a very ambitious scope, and it suffers from being spread so thinly. It raises lots of interesting ideas, but doesn’t cover any of them with enough depth to make them stick.
Things started poorly. The book is prefaced by four pages of positive review quotes, presumably inserted as encouragement for the indecisive bookshop customer. At least I hope that’s who it’s all for, because it’s a pretty weird species of self-congratulation if not. These reviews all gush over Bregman’s insight and perceptive power, but none of them come from economists or political scientists: several are written by Bregman’s fellow UBI-campaigners, another handful from tech entrepreneurs, and one praising the writing, extracted from an otherwise negative review in the Independent. One may have been from a popstar, too, I can’t remember.
The other thing was the typesetting. I know this must sound like a trivial complaint, but the book is set with extremely wide margins and very loose leading between the lines, so much so that the 260-page work could have been easily printed in half those pages. The cynic in me wondered if the book was padded to make it look more profound. So you may say I started UFR with a little prejudice.
My problem is that Bregman constructs his argument by leaping between a handful of studies and historical test cases, quoting friendly think-tanks second hand, and making vague assertions about the costs of existing benefit systems and the unhappiness of the west. A charitable take might be that his style of argument is journalistic rather than academic. He certainly knows how to write punchy prose, and not to get lost in details, in a desert of percentages and statistics with lots of zeroes in them. But the side-effect of this breeziness is a lack of serious detail or reliable numbers.
That is ultimately the book’s gravest sin: it never actually calculates the price of UBI. You’d think this would be the first order of the day, but all it does is assert society is more than productive enough already (Keynes said it would happen, so it must be so, apparently), and that we can probably save some cash just by dint of streamlining benefits anyway. How much money? It’s never quantified.
Utopia for Realists does have its moments. If you feel dissatisfied with how capitalism is working, it will give you many memorable factoids for pointed conversation. It torches several populist myths about welfare and immigration, and it makes a solid case that what the poor need isn’t education, interference, or community centres, but money. There’s a great chapter on the “Broadway Experiment”, when thirteen homeless men in London were given £3000 strings-free and left to get on with it.
But read it sceptically – Bregman recycles material from anti-capitalist groups and pro-UBI campaigners quite uncritically, and he rarely unpacks their evidence. (To his credit, he does provide a helpful bibliography, with internet links wherever possible).
Let’s take an example. At one point, the author argues that the need to work pushes citizens into jobs that may actively harm the common good. With UBI, this would no longer be necessary. One industry he cites is advertising, where workers ostensibly ‘destroy’ £7 of value for every £1 created. Disregarding the basic counterargument that most people in marketing and PR are probably there by choice, where does this £7 figure come from? The New Economics Foundation, using a methodology I would classify as contentious at best: it counted all consumption above the Joseph Rowntree Foundation’s “minimum citizenry level” as “unnecessary, and therefore harmful”. It included unquantified (and frankly unquantifiable) harms to mental health caused by “advertising media”, and it also counted in some very broad-brush assumptions about ecological impact. Even as someone who is very sympathetic to the NEF’s political ideas: I think that figure is bogus.
It may seem a minor point – who really doubts that humans have better things to do than advertising? – but it underlines a general sort of sloppiness and poor judgement on the part of the author. Bregman kept throwing these quotes and the odd supportive statistic at me, but I read each of them with increasing scepticism and unease. It left me distrustful of anything the author wrote.
I came into UFR eager to be convinced, ready to be ignited with positivity and conviction, marching with the banner of the good idea whose time has come, the kind that ends up looking so inevitable that future folk can only look back and wonder incredulously how it might ever have been resisted. But I was left deflated and annoyed: Bregman’s arguments are superficial and unserious. His book is oddly structured and lacks focus, ambling around irrelevant topics, like a chapter on the history of GDP, and one on free movement, quite out of the blue.
Then there is a chapter on automation. This was the meat and gristle I was really waiting for, hearing of self-driving cars about to replace entire droves of logistics workers, and warehouse drones threatening those thousands who work tirelessly through December dark to get us our Christmas gifts on time. This should be the good stuff. Yet all the author does is assert that computers may retire some unspecified range of jobs, probably, over some kind of timescale. Also that the Luddites were a thing. It’s never fleshed out how big the threat is nor how hard the jobs are to replace. This should be Bregman’s coup de grace – he bodges the execution.
If Utopia for Realists is light on the details, it is light on the theory too. There is no political philosophy here. Bregman asserts that more equality makes us happier, but he never takes a moment to ask whether it’s morally desirable. I’m a socialist and even I think that one can’t be taken for granted. And for a historian, Bregman seems unreflectively Whiggish: he suggests all history has been nothing but a ceaseless, beautiful, great march forwards; that we will keep going on and on and that there will always be a bright new idea waiting to advance us. UFR all but says UBI is a historical necessity just by dint of being new and progressive.
Utopia for Realists is a TED talk expanded into a book. It is slickly argued, and it successfully makes the case that people don’t just work out of hunger, but out of their basic social natures. But like TED it is shallow, it is self-promotional, it is too excited about the solutions and not thoughtful enough about the problems. My sceptic’s antennae twirled madly throughout and I left half the text marked with my trademark ‘wavy underline of doubt’. I was not convinced.
]]>Billy Pilgrim, Vonnegut’s semi-auto-biographical stand-in, is a war veteran who experiences time in a non-linear order after being visited by four-dimensional aliens who can see all times at once. Now, like them, his mind slips between present and past, sudden flashbacks returning him to the war. They should be traumatic: Billy is often in mortal danger – but he experiences it all with a kind of beatific calm, finding curious beauty and humour in the most unlikely places. And yet, why does he return to the present day and find himself sobbing?
By moving between times and places, calling back to earlier events and compulsively repeating phrases and images, Slaughterhouse Five isn’t just a marvellous illustration of post-traumatic stress disorder – it’s also an introduction to postmodernism. This movement shies away from imposing greater meaning on human experience, because it knows to do so is always reductive, and sometimes even a step towards totalitarian thinking.
Despite its name, Slaughterhouse Five has little violence, and much wit. It is a short, self-contained book that ends with a satisfying kind of circularity. It does not linger with questions, but it does its job: Vonnegut writes it, gets it out of his system, and moves on with his life.
These micro-reviews summarise a book, essay or work in 200-300 words. I’m writing them to practise slimming down my prose.
]]>Another Day in the Death of America examines the maelstrom of forces that converge to kill so many young Americans: modern segregation; the psychology of infanticide; parenting on the poverty line; media disinterest in ‘complicated’ victims; the toxicity of gun control debate in the US; the trauma and fatalism of teens scarred by gun death; the legacy of revenge; the ubiquity of gangs; the frightening availability of weapons; the psychology of adolescent risk-taking – it is a searching work that ultimately covers an anatomy of American life.
But the analysis never loses sight of the human, and Younge never speaks on behalf of his people. They have their own voices, their own explanations, sometimes contrary to the author’s. There is one moment when a victim’s family agree the eleven-year-old shooter should be executed: the author is taken aback, agape, but gives them space to explain themselves, without presuming to interject or interpret. He knows the power journalists have to shape human pain to their own stories, and he is cautious about exercising that power.
This scepticism is what makes the book such fine journalism. Younge questions explanations relentlessly, and has a healthy fear of making the wrong call, of writing the wrong polemic. His even-handedness can occasionally be a fault, paralysing his momentum. But at a moment when so much journalism is tendentious, shallow, and scared of nuance, we need more Gary Younges.
These micro-reviews summarise a book, essay or work in 200-300 words. I’m writing them to practise slimming down my prose.
]]>