The Extinction Trials (47)



Alister crossed his arms. “I go to work early. Every day. You want to get something done? Go to work early. My dad taught me that. About the only thing he taught me before he took off. Anyway, I go to work early because I can think early in the morning—and most importantly, there’s no one there talking. I tell you what, you cut out talking, this world would have to work half as much—if that.”

He paused, waiting for someone to challenge him. Of course, Maya couldn’t see how the vast, vast majority of people could do their jobs without talking—and she didn’t think Alister really believed it either. He was stirring the pot, but none of them were taking the bait.

“As I said back at that station, I’m a city bus mechanic. I’ll tell you one thing: those buses take a beating.” He nodded. “Sure, they drive themselves and they have programming to optimize everything, but that’s half the problem. They never miss a pothole. Never. The AI that drives them loves hitting potholes. It’s like the axle manufacturers have got a mole in the programming group that makes the blasted things veer for them. I’ve complained repeatedly, but the answer is always the same: the AI prioritizes human life above mechanical wear.”

He threw his head back. “Ha! What a joke that turned out to be. I knew the AI and robots would be the death of us—one way or another. Every year they take—or took—a little more of our freedom. Now, they would tell you that they were freeing us up to do other work, but it just isn’t true. You want to know what’s wrong with the bus? Back when I first picked up a wrench, you opened the thing up and looked and used your brain. It’s pretty simple, really. Now? You hook it up to a computer for a full diagnostic. The screen tells you what’s wrong. It even automatically orders the parts if we don’t have them. Here’s the best part: if the mechanic doesn’t know how to fix it, no problem—it will activate the camera, play a video walk-through of how to fix it, and then watch you and tell you when you’re screwing it up. Could that mechanic repeat the exercise without the computer? Of course not! We just sit there and follow instructions like we’re the robots’ assistants.”

Alister motioned to Owen. “He was right—it was like a dress rehearsal, with us humans only playing a mechanic toward the end. And that’s not even the worst of it. The worst is that if you screw up the repair, the robot mechanic comes by and fixes it. I’m not kidding. It’s the supervisor. And it never messes up. It always gets it right. I know the city was just saving up money to buy more of those blasted repair bots and get rid of all the greasers like me. Just a matter of time.”

He locked eyes with Owen. “Believe me, I sympathize with your story—those firebots slowly taking away that last bit of joy you got from your work.”

Owen nodded. “Yeah.”

Alister held his hand out toward Will. “But frankly, the bots weren’t the worst part of the job. It was the programmers. They come in there, skinny as rails, carrying coffee that costs more than my life. Using words and acronyms nobody understands, like API and runtime exception and asynchronous request and token authentication and who knows what else. It’s like doctors—they’ve got their own language and they like it like that, so you don’t know what they’re saying and they feel superior. They like to keep you in the dark.”

Will frowned. “The use of specialized terminology makes any complex job easier. Without it, you’d spend half your day using known language to describe new and specific concepts. And I believe excessive use of spoken language is one of the faults you find in workplace society.”

Alister threw his hands up. “That’s what I’m talking about. Even when programmers are telling you what you just said, you can’t understand it. Your own words. I remember once telling a programmer that a bus stopped running. So, he fires up the diagnostic and studies it and I ask what happened. His answer? That the bus AI threw a runtime exception when it made an API call to the maps service because the maps service encountered a database timeout expiration.”

Alister turned to Will. “Do you know what that means? Do you?”

“Yes,” Will said. “I do.”

“Well, me too—’cause I finally badgered them until they told me. It means, quite simply, that the idiot bus AI was driving along and one of the roads was closed for construction, so it asked for directions, and the maps AI tried to dig them up in its database, but the database was busy, so it couldn’t respond in time, and the bus didn’t know what to do. The fix? Well, the fix is simple: the bus has to know that it may have to wait sometimes. So, they updated the code to have it park by the side of the road and wait—what a novel concept. It logs the wait times. They get too high; they add more servers. Why couldn’t they just say that?”

“I think,” Owen said, “we’re getting a little off subject here.”

“On the contrary,” Alister snapped, “this is exactly my point. The bots and the programmers—they’re how the world ended. I could’ve been a programmer. I didn’t want to. I chose not to—because I didn’t want to spend my life staring at a screen, typing stuff in and being tired all the time. I need to work on things I can see. I want to feel like I got something done every day. The buses come in broken, they go out running. That’s something. And that’s what I was doing the day it all ended.”
