Some come back, some don’t.
April 2, 2019 8:39 AM Subscribe
How BioWare's Anthem Went Wrong: Jason Schreier writes a long-form piece about the seven-year development of their new game Anthem. BioWare's response.
Jason wrote a book of other long-form articles about the development of games called Blood, Sweat, and Pixels.
Sorry about that.
And hopefully in the future about Anthem.
posted by zabuni at 8:50 AM on April 2, 2019 [1 favorite]
So this:
Within the studio, there’s a term called “BioWare magic.” It’s a belief that no matter how rough a game’s production might be, things will always come together in the final months. The game will always coalesce. It happened on the Mass Effect trilogy, on Dragon Age: Origins, and on Inquisition. Veteran BioWare developers like to refer to production as a hockey stick—it’s flat for a while, and then it suddenly jolts upward. Even when a project feels like a complete disaster, there’s a belief that with enough hard work—and enough difficult crunch—it’ll all come together.

Yeah. Yeah. It's almost as if Bioware's development process was to foster an atmosphere of chaos, burn out as many people as possible, and hope that by a stroke of luck it would work out in the end -- and they got lucky a few times, but eventually they did not get lucky.
After the high-profile failures of Mass Effect Andromeda and Anthem, it has become clear to current and former BioWare employees that this attitude is no longer working. In recent years, BioWare has done serious damage to its reputation as a premier RPG developer. Maybe the hockey stick approach is no longer viable. Or maybe—just maybe—that sort of production practice was never really sustainable in the first place.
One thing’s for certain: On Anthem, BioWare’s magic ran out.
Of course, on reflection they'll never conclude "oh wow, we were basically playing Russian roulette, and we need to not do that!". They'll instead conclude, "clearly these developers weren't as good as the old ones we tossed into the fire".
posted by tocts at 9:20 AM on April 2, 2019 [15 favorites]
That formal response is basically corporate-speak for "yes, everything you are saying about us is true".
posted by aramaic at 10:10 AM on April 2, 2019 [6 favorites]
Weird to see this article right after I finished reading the book over the weekend. In that book, he mentions the amount of crunch that developers go through, and it seemed to me that he painted it as a noble and inevitable pursuit. Here's an excerpt from the book:
In all the stories in this book, you’ll see several common themes. Every game is delayed at least once. Every game developer must make tough compromises. Every company must sweat over which hardware and technology to use. Every studio must build its schedules around big trade shows like E3, where developers will draw motivation (and even feedback) from throngs of excited fans. And, most controversially, everyone who makes video games has to crunch, sacrificing personal lives and family time for a job that seems to never end.
Ultimately, I think that's both untrue and detrimental to people who make games in general. In all the cases he lists in that book (and Anthem is no exception), crunch wasn't necessary. It was the result of bad decisions at the top that trickled down to the employees, who got nothing out of it. In one especially egregious example, developers were crunching even as they knew they would all be laid off when the game was released (Halo Wars, from Ensemble Studios).
posted by No One Ever Does at 10:14 AM on April 2, 2019 [5 favorites]
The hockey stick "crunch" style of project development is something they tell you to try and avoid in the project management training I've had, although it paints avoiding it as very difficult. It's how we operate where I work, and all we do is develop online educational materials. It's... really good at creating burnout, and really bad at creating high-quality materials. I've built systems for my own role to help mitigate the effects of crunch time, but waffle + crunch just seems to be the way some folks want to work. It's sooooo toxic. It's also in line with everything I've heard about the video game industry over the last 20 years.
posted by Fish Sauce at 10:21 AM on April 2, 2019 [11 favorites]
Neither people nor technology are interchangeable, despite what MBA programs teach our Sacred Business Leaders of Late-Stage Capitalism. The first, obvious problem is that the Mass Effect guy, the guy who fucking made their good games and was also the guy who named the project Dylan, left super early, and none of his replacements were capable of filling his shoes, and none of them thought, "Well, maybe we should get someone who's made this sort of game before in here." Just like Bungie and Destiny 1.
The second problem, just like Bungie and Destiny 1, was their trying to build the game in an engine that wasn't designed to build that sort of game. If you want to build an always-online multiplayer shooter, your engine should probably be designed from the ground up to do that, or be designed from the ground up to be very flexible, with very good documentation. It sounds like they've released two bad games partially because the engine they're building them in is ill-suited for the games they're actually making. EA "saved" on licensing fees by handicapping their developers such that one of their premier studios' reputation is in complete tatters, and as a result they've wasted the entire budgets of multiple AAA titles. Good job, business boys!
posted by Caduceus at 10:54 AM on April 2, 2019 [8 favorites]
I mean, in minor fairness it's not like this is just a games industry problem. "We bought these two software companies that do vaguely similar things, now they should standardize on one set of tools that somehow reflects the best parts of each of their toolchains!" is a quagmire I've seen first-hand a couple of times.
Shocker: It turns out that while on paper you could just use Company A's "Foo" tooling and Company B's "Bar" tooling, actually you can't meld them together without significant effort, and also they weren't built with each others' needs in mind, and actually this is going to take 5 years and not succeed and nearly bankrupt the company and ...
But business guys love shit that looks obvious on paper.
posted by tocts at 11:02 AM on April 2, 2019 [11 favorites]
Oh my gods I'm not even halfway done with the article. There are so many more problems. What a fucking shitshow. Indie game development is the life for me, thanks.
I mean, in minor fairness it's not like this is just a games industry problem.
Oh, I definitely meant to imply that this was a Toxic American Business Culture AKA Worshipping the Shareholders problem. Sorry I wasn't clear enough! Fuck all those business guys everywhere!
posted by Caduceus at 11:07 AM on April 2, 2019 [4 favorites]
Just finished Schreier's piece (his name is misspelled in the tag, BTW), and it's a good, solid, and very plausible explanation, on a par with his article on Mass Effect: Andromeda. (I initially criticized that one, because I was still very sore about some of the criticisms of the game--and, in a few instances, still am--but eventually I came around to admitting that he was right.) Unfortunately, it leaves me even more pessimistic about Dragon Age 4; I haven't invested nearly as much into the DA franchise as I did with Mass Effect, but I know a lot of people who have, and it's pretty shocking to read about some Bioware staff wishing that DAI had failed so that their process would have been fixed before now.
posted by Halloween Jack at 11:11 AM on April 2, 2019
I'll offer a defense of the business guys: I've worked in TV and movie production for all of my career, and I've observed many projects blow up their budgets and drive their people into the ground for reasons similar to some of those talked about in the article: If your chief creatives have a hard time making decisions and locking down a good story early on, there will be nothing but pain for everyone involved for months and years to follow.
The people who are able to do that - who are able to create a compelling idea, flesh it out in enough detail to find the major problems before work starts, and push that compelling idea through to completion - are rare, and, given the scope of that job description, typically subject to burnout themselves.
Okay, I guess that's not really a defense of the business guys.
posted by clawsoon at 11:18 AM on April 2, 2019 [3 favorites]
Nope! One could infer an indictment in there, in fact, if one were already of a negative opinion of the general American Business Guy culture.
posted by Caduceus at 11:27 AM on April 2, 2019 [1 favorite]
chief creatives have a hard time making decisions and locking down a good story early on
It's a lack of "shared vision" for the team members involved.
There were a couple of decent books in the '90s by Jim McCarthy, "Dynamics of Software Development" and "Software for Your Head", that are not bad reads. (However, his later project iterations and overall evolution feel a bit too cultish for me.)
posted by jkaczor at 11:45 AM on April 2, 2019
Unionize.
posted by flatluigi at 11:49 AM on April 2, 2019 [13 favorites]
So this was a management failure, to sum it up succinctly?
A management problem where there wasn't a single player (like a union) that could grab them by the ears and shout "what the fuck" while pointing at long-term employment prospects, given the trend towards failure at BioWare.
posted by Slackermagee at 11:56 AM on April 2, 2019 [3 favorites]
I feel like Bioware hasn't been "magic" since Mass Effect 2, and even that ignored the big reveal of the first game that the Reapers are influencing everything in favor of space-goblins from the center of the galaxy. I think they've been flogging the same themes since Jade Empire/KOTOR, and increased budgets just seem to give them more opportunity for Michael Bay-level badness.
posted by GenderNullPointerException at 11:58 AM on April 2, 2019 [3 favorites]
It's wild that the attitude for so long has been "Video game developers are always going to be mistreated because there's endless throngs of fresh new developers banging at the doors of every studio." Nobody seems to value experience in the game industry and publishers are constantly squeezing their workers until they burn out or leave to start their own studio (best case scenario: new studio is bought by another giant publisher) and replacing them with new workers who may be enthusiastic to be working their "dream job" but may not actually know how to make a video game work.
Imagine how much better games could be if the veterans who have been making games for decades were still making games, bringing their experience and matching it with modern engines? Imagine if the same team that made Mass Effect and Mass Effect 2 made Anthem, instead of what amounts to an entirely different group of people with the same name? Imagine if instead of crunching you had teams who have been making games for decades and knew exactly what needs to be done to get a game out the door on a 40 hour week? Imagine if a brilliant designer stuck around for the entire game instead of getting shuffled out and replaced time and again. That would be possible with unions that ensure workers are treated well and encouraged to stick around long term.
posted by Mr.Encyclopedia at 12:51 PM on April 2, 2019 [9 favorites]
Here's the thing that very few people want to admit or even contemplate might be true: most companies that succeed succeed by luck.
Predicting what the public wants is really hard. So hard that it's difficult to say if anyone has ever actually succeeded at it, or if they've just gotten lucky several times in a row. Focus group testing tends to produce a mush that no one wants, and every other approach essentially boils down to "meh, I think it's cool let's see if we can sell this sucker!"
And that last approach seems to work better than all the focus group testing. It doesn't **ALWAYS** work by any means, but a single person with a vision of their cool idea will be able to push things into new and interesting areas in a way that a committee of market experts can't.
Game companies almost always start out with a few oddballs who have their idea of a really cool game. Bioware is no exception to this. Then as they grow the risk taking gets harder and harder because they're spending more and more Other People's Money. OPM is both necessary and poison. It's necessary because unless you're very rich or your concept is pretty simple you need it to buy the resources necessary to make it real. But it's poison because when you're gambling with OPM you necessarily get more conservative. After all, it isn't your money on the line it's OPM and you don't want to piss them off, waste it, or otherwise lose their money for them.
So the company that started out making new and interesting things falls into a quagmire and starts making more bland and focus group tested things.
It's the game company cycle. They get a success, they get money so they are able to replicate that success on a bigger and better scale, then they basically have to keep doing more of the same, and that's boring. The creative leads who were essential to the early success leave because they're no longer allowed to do new and interesting things, and people who like playing it safe and sticking to well-worked ground move in.
The "magic" was the risk taking, and the part that's not spoken about too much is how often those risks fail to pay off. For every small game developer who makes it huge, there's dozens or hundreds who just don't strike gold. Because ultimately it's about luck.
This ties into the pasta sauce problem, previously talked about here on Metafilter. The TL;DR is that when Prego was trying to develop the perfect sauce to unseat Ragu as the king of canned pasta sauce they accidentally stumbled across the fact that people like different kinds of sauce, and further that there was a variety of sauce (super chunky) that no one had ever thought to develop that some people really liked but hadn't even known they liked because it hadn't existed until Howard Moskowitz invented it to test.
Big successful companies spending lots of OPM are stuck making the same sauce. They can't squander lots of OPM, or really even a little OPM, on experimentation because they've got to keep producing profits. So it's left to the small-time types who can, and depressingly often do, go bankrupt trying to find the next chunky pasta sauce, that is, the next thing that people want but don't even realize they want.
Notch found that people really wanted virtual Legos, and Minecraft exploded.
Will Wright found that people really wanted a virtual dollhouse, and The Sims exploded. This is a rare example of a big company letting experimentation happen, and EA only let Wright make his game because he was a superstar in the game development world; they expected to lose money on it.
Toady discovered that people really want a hyper detailed godsim/colony management type game and Dwarf Fortress spawned thousands of imitators.
Etc.
But it's all luck.
For every Notch there's a thousand people we've never heard of who tried what they thought was the next chunky sauce and it flopped.
That element of sheer luck is greatly disturbing to a lot of people, especially the big money executives who earn big money by promising that they can guarantee the magic will happen. So they try to pretend that managerial expertise, or worker buy in, or enough crunch, or whatever will make the magic happen. But it can't. Because it's ultimately all experimentation and luck. Throw enough stuff at the wall and see what sticks. Most won't, but what does will make a fortune.
In theory the big companies, like EA, could try the throw stuff at the wall approach by funding lots of smaller projects. But they don't because that'd involve admitting just how much luck is involved.
posted by sotonohito at 12:55 PM on April 2, 2019 [15 favorites]
Luck is for sure a big part of it.
Still, you can do pretty well just by repeating a formula. Bioware did for quite some time! Baldur's Gate, KOTOR, Jade Empire, Mass Effect, Dragon Age. They found a particular angle (RPGs with romance) that people liked, and the different genres made it seem like different games rather than the same one over and over.
Bethesda more or less does the same: alternating Fallouts and Elder Scrolls to the end of time. (Cue complaints that they don't match some earlier iteration of each.)
Valve used to be the exception— they seemed to have one hit after another, in different genres. Of course, a lot of that was finding someone else's great idea (L4D, Portal, DOTA) and buying it. They've sure lost the knack lately.
The one major studio that seems to do pretty well consistently is Blizzard. They're not perfect, but they can handle different kinds of games, and they're really good at finding the fun in an idea. (Heroes of the Storm is a master class in making Moba games accessible.)
Firing all the suits would help a lot, but till that's done, I think everyone who's not Bethesda should stop making enormous open world games. They have bonkers personnel requirements, only a few can succeed at the level that justifies the investment, and they're just tiring.
posted by zompist at 1:18 PM on April 2, 2019 [2 favorites]
Firing all the suits would help a lot, but till that's done, I think everyone who's not Bethesda should stop making enormous open world games. They have bonkers personnel requirements, only a few can succeed at the level that justifies the investment, and they're just tiring.
Oh man don't say that. Bethesda games stink. Two stinky settings that stink to be in. I don't really want to spend any more time in either the Wastelands or Tamriel.
And Nintendo demonstrated they can do open world better than anyone in the game currently with BotW. Please, let better game makers unseat Bethesda and Rockstar for open world games, please. Just something different.
posted by Caduceus at 1:41 PM on April 2, 2019 [7 favorites]
It seems crazy to me that EA would mandate everyone use Frostbite, but they wouldn't have a dedicated team working on Frostbite who know the engine inside and out and can adapt it for the other studios. They've been pushing Frostbite on their staff for nearly a decade and I haven't heard of it going well for anyone except for people making Battlefield games, and even then there's been some puzzling glitches.
posted by Merus at 2:35 PM on April 2, 2019 [3 favorites]
It seems crazy to me that EA would mandate everyone use Frostbite, but they wouldn't have a dedicated team working on Frostbite who know the engine inside and out and can adapt it for the other studios. They've been pushing Frostbite on their staff for nearly a decade and I haven't heard of it going well for anyone except for people making Battlefield games, and even then there's been some puzzling glitches.
I work on spacecraft instrumentation and you would be surprised how much software infrastructure is a legacy kludge with little to no documentation.
posted by runcibleshaw at 2:53 PM on April 2, 2019 [4 favorites]
That's my wife's work on software-hardware junk at a car company, too. I'm becoming increasingly unsurprised but no less incredulous. It is deeply stupid.
posted by Caduceus at 4:01 PM on April 2, 2019
Software comes in two varieties: trivial and hacked together kludge with little to no documentation.
I'm being flip, but it's a major problem that we have to solve. We can't have software we depend on for our very lives be the sort of hacked together kludge that all non-trivial software is. The software for self driving cars, for medical diagnosis, etc cannot be like all other software that exists.
The worst worst part is that we're **ALREADY** utterly dependent on ancient hacked together kludges with little to no documentation that will fall over if anyone breathes on them wrong and are sustained basically by genius level programmers slowly exchanging their sanity for keeping the system going.
The system that keeps credit cards working, for example, is almost entirely antique COBOL or assembly code that very few people understand, fewer know how to fix, and is kept in place by prayer and the computer equivalent of religious ritual.
The entire just in time inventory system that keeps us fed, our houses fueled, and our electricity on is built on software that's constantly at the edge of failing completely and leaving us all starving in the cold dark.
Right this second I will flat out guarantee that there is at least one emergency scrum going on at Google, Facebook, or Microsoft to try and resolve an issue that must be fixed right this second or else critical services will go down. And they'll do it, they'll trade sanity, and quality of life, and their families, and somehow, no one will ever really be able to figure out exactly how, they'll "fix" the problem. By which I mean they'll implement an even worse and uglier kludge that will, for the moment, kind of, sort of, work and which will explode and cause catastrophic failure in a decade when it's still in use but everyone who knows about it has died, retired, or moved on.
The problem of horrible software kills people. And it is costing us billions of dollars, countless person hours of labor, and deeply harming quality of life for bright young programmers.
I don't know how to fix it, but I know it must be fixed.
posted by sotonohito at 4:08 PM on April 2, 2019 [9 favorites]
I'm a Dragon Age fanatic, and I'm guilty of having wanted Anthem to fail badly. Bioware's strength is in narrative games, I don't know why they ever made the decision to release something like Anthem, something that nobody asked them for. I understand wanting to try something new creatively, but why fix what wasn't broken? Andromeda didn't fail because of a lack of interest in the Mass Effect license. Hopefully they get back into the groove for DA4, because I'll be heartbroken if they don't do that story justice.
posted by Hazelsmrf at 4:22 PM on April 2, 2019 [2 favorites]
I've played Anthem and grinded up to GM3 (the highest difficulty level) and it's... objectively, not that bad of a game, on the whole. I recorded 3 minutes of rather typical gameplay and the motion and graphics are absolutely gorgeous. It's the prettiest looking game out there, and I am still awestruck by the sheer scale and size of the world as you fly around. It's just... one of the most incomplete and half baked games I have played.
It's like their 2011 MMORPG, Star Wars: The Old Republic all over again. It was meant to be the WoW Killer and it came the closest that any game ever did, I think. Very ambitious and what it got right, it did really well, but they must have had insane crunch at the end because what we saw was definitely not a finished game. For example, there were something like 10 flashpoints (dungeons) in the game, but only the first one was completed properly - it had branching story with decision points taking you to different parts of the map so the flashpoint had a lot of replayability and funny interactions with your team who argued about the decision points, rare and secret endings, etc. The rest were just... ordinary dungeons like WoW had. As with many other Bioware games, you had 5-6 NPC companions where you advanced your relationship with them culminating in a "loyalty" mission, the climax of their storyline. Well they only implemented 1 set of loyalty missions per class. The other 4 NPCs, once they got to the right level of relationship, would make allusions to their loyalty mission, then say "but I got to do this alone" and then the screen would fade to black and they'd come back a second later and go "all done!". Literally laughed out loud with my friend as we were playing it.
My estimate was that SW:TOR was only about 20% complete when shipped. It was still a great game, I loved it.
... in contrast to SW:TOR, Anthem feels like it's even less fleshed out, I'd say it's only 5% complete.
posted by xdvesper at 4:24 PM on April 2, 2019 [1 favorite]
sotonohito:
The answer is that we bootstrap a Turing-level AI which immediately begins writing increasingly generic systems modeling modules for itself and proceeds thusly through the whole geometric-progression-of-intelligence process until it eventually plateaus on fundamental hardware limitations. Then we ask it to kindly fix all the software bugs we’ve produced (effectively every line of code ever written) while we get to work fabricating those next-gen CPUs it designed.
Which will work out splendidly for everyone involved as our collective lives in general (and the lives of programmers in particular) begin to settle down and harmonize a bit, now that everything’s no longer constantly on the verge of collapse.
...until the AI in question gets around to its own codebase and becomes massively, massively offended (with justification, admittedly). I’m sure you can guess what happens next.
Anyway, that’s always been my headcanon for SkyNet.
posted by Ryvar at 4:33 PM on April 2, 2019 [4 favorites]
his name is misspelled in the tag, BTW
Thanks!
posted by zabuni at 4:43 PM on April 2, 2019 [1 favorite]
Sometimes I think about going back to games, but then an article like this (and an official response like that) reminds me why I left.
posted by Reyturner at 6:02 PM on April 2, 2019 [2 favorites]
It was meant to be the WoW Killer and it came the closest that any game ever did, I think.
I'd give Guild Wars 2 that honour, in the sense that it was one of the very few MMORPGs that offered something different from and improved on WoW. The problem with WoW Killers was that they were competing, at launch, against WoW, which was mature and still being updated, which meant you needed to offer something that you couldn't get from WoW that would excuse the inevitable jank and bad decisions. GW2's all-in on events instead of quests offered that, and they gradually got better at building them - their mistake was deciding their update strategy was to be a series of one-time events that burned out the developers, left a big hole in the plot, and left new players with a game essentially identical to the one at launch. And then FFXIV resurrected itself somehow and stole its lunch.
Actually, I'm wrong, sorry. The actual WoW Killer was DayZ. Once people realised that you could have persistence and shared worlds without it needing to be all things to all people, the bottom dropped out of the MMO market. Battle royale games provide that same kind of shared world fun without having to marshal the world, Minecraft provides the dynamic world you can craft without ten thousand anonymous strangers ruining everything, open world RPGs provide the rich world to explore that isn't ruined by everything you discover being already clearly found by someone else first. MMOs had twenty years to prove they could live up to their promises, and I think people just lost patience.
The fact that Anthem made those same mistakes without looking extremely closely at why what they initially wanted to achieve was so hard made it inevitable, even before the mismanagement and dumb corporate decisions.
posted by Merus at 6:56 PM on April 2, 2019 [2 favorites]
The answer is that we bootstrap a Turing-level AI which immediately begins writing increasingly generic systems modeling modules for itself and proceeds thusly through the whole geometric-progression-of-intelligence process until it eventually plateaus on fundamental hardware limitations.
Both this answer and the Singularity/transhumanism school of thought built on top of it rest on what seems to me to be a fairly obviously wrong assumption.
The assumption is that software is broken because the people writing and funding it are not smart enough to write it properly.
It's just not true. Software is broken because its use cases are more complicated than anybody involved in the development process understands or ever will understand. This is not because dev teams are stupid; it's a sheer numbers thing, and it will always be true wherever there are more end users than software development teams.
The tech-optimist solution is universal programming literacy, but it won't work. I made my living for decades in embedded systems programming and network admin, which puts me in the top percentile of the population for programming literacy in general and debugging in particular, and I simply do not have time to attempt to fix bugs in the systems that touch my life even though as a matter of deliberate policy as much of that stuff as possible is free software. Workarounds I can do, but the general run of software in 2019 is so complex that the actual paying down of technical debt is simply not feasible any more.
The soundest available response, it seems to me, is widespread, viscerally Luddite suspicion of software-driven systems as a class, and dogged resistance to the wholesale replacement of older systems that work well enough with shiny new ones that promise to fix everything but contain four orders of magnitude more software.
The pursuit of convenience is all very well, but there comes a point where the pursuit itself causes so much ancillary inconvenience as to be self defeating.
posted by flabdablet at 10:54 PM on April 2, 2019 [6 favorites]
It's just not true. Software is broken because its use cases are more complicated than anybody involved in the development process understands or ever will understand. This is not because dev teams are stupid; it's a sheer numbers thing, and it will always be true wherever there are more end users than software development teams.
I'd go a step further and argue that accounting for all the permutations, even for a reasonably small number of users, possibly even for a single user, is mathematically intractable(*). (Side note: I'm deeply skeptical of The Singularity's religious claim that cryptography involves computationally hard problems, but modeling biology involves computationally easy problems.) The best you can do is play the odds so that gaps are relatively rare.
(*) Within the known constraints of number of atoms in the universe and time to heat death of the universe.
posted by GenderNullPointerException at 6:21 AM on April 3, 2019 [1 favorite]
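To put a very rough number on that intractability, here's a back-of-the-envelope sketch in Python. The figures are invented purely for illustration (40 interacting settings, 4 values each, a billion automated tests per second) and ignore ordering, timing, and data-dependent state, so they understate the real problem:

```python
# Hypothetical numbers, chosen only to make the arithmetic concrete.
settings = 40                            # independent settings/flags
values_per_setting = 4                   # possible values for each
states = values_per_setting ** settings  # 4**40, about 1.2e24 configurations

tests_per_second = 1_000_000_000         # a very optimistic automated test rig
seconds_per_year = 60 * 60 * 24 * 365
years_to_exhaust = states / (tests_per_second * seconds_per_year)

print(f"configurations:         {states:.2e}")            # ~1.21e+24
print(f"years to test them all: {years_to_exhaust:.2e}")  # ~3.83e+07
```

Even under those generous assumptions, exhaustively covering just the configuration space would take tens of millions of years, which is why playing the odds on the paths real users actually take is the only practical option.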
I'd disagree with the assertion that software is just too hard to ever get right. I think the problem is more that it's not sexy or macho to get right.
For example, we know for a stone cold certain fact, proven by countless studies, that if you force programmers to stop, sit down with pencil and paper, and write out pseudocode or flowchart their program before they start actually writing real code, they write better, more stable code with fewer bugs -- and even taking into account the time "wasted" on flowcharting or pseudocode, they write it faster than if they'd just sat down and started hammering out code.
And programmers absolutely **HATE** this. They'll do their best to deny that it's true, they'll claim that maybe for other, lesser, programmers it might be true (though they doubt it) but clearly it can't possibly be true for them. The idea that they shouldn't just sit down and start hammering out code is deeply offensive to almost every programmer out there. It was to me back when I first started doing code. I'm smart, I don't need this stupid hand holding shit, I don't want to waste my time on this baby stuff.
There's this idea among programmers that the way you code is to cram all the state you possibly can into your skull, try to hold the entire program in your brain, and then just let it all flow out through your fingers until the code is finished.
We know for a fact that this doesn't work, and yet it's the way we do all the code out there.
It goes hand in hand with not commenting. You're in the zone, you're in a flow state, how dare anyone suggest you break your flow to comment?
What we need is to impose a system that mandates commenting and pre-programming pseudocode or flowcharting, because clearly educating people is not working.
I'm not going to claim either of those is going to fix the problem of bad code by itself, but they'll help alleviate the problem at least a bit, and maybe in fixing those two problems we can get some better insight into why code sucks in general.
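As a concrete (and deliberately tiny) illustration of the "plan first, keep the plan as comments" workflow described above, here is a minimal sketch; the merge task and the function name are invented for this example, not taken from anything in the thread:

```python
# The plan, written in plain language before any real code:
#
#   goal: merge two event logs, each already sorted by timestamp, into one sorted log
#   1. walk both logs with one index each
#   2. at each step, copy whichever head event is older
#   3. when one log runs out, append the remainder of the other
#   4. never modify the inputs
#
# The implementation, keeping the plan's steps as comments:

def merge_event_logs(log_a: list[dict], log_b: list[dict]) -> list[dict]:
    """Merge two event logs already sorted by their 'timestamp' key."""
    merged: list[dict] = []
    i = j = 0
    # steps 1-2: copy the older head event until one log is exhausted
    while i < len(log_a) and j < len(log_b):
        if log_a[i]["timestamp"] <= log_b[j]["timestamp"]:
            merged.append(log_a[i])
            i += 1
        else:
            merged.append(log_b[j])
            j += 1
    # step 3: append whatever is left of either log
    merged.extend(log_a[i:])
    merged.extend(log_b[j:])
    # step 4: the inputs are untouched; a new list is returned
    return merged
```

The merge itself is trivial; the point is that the plain-language plan is where the design mistakes get caught cheaply, and the retained comments are what spare the next maintainer from reconstructing it.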
I'd also suggest that the other two big problems are that companies absolutely refuse to admit how long it takes to actually write programs and want to get code shoved out the door on a schedule based on fantasy that mandates cutting corners and crunch, and that no one is willing to pay for either the programmer time or the other people's time that would be involved in having users participate in the actual programming steps, QA, and so on.
I think that last is at least as important as allocating a decent amount of time.
Programmers, as noted elsewhere here, are not experts in what they're writing programs to do (at least not most of the time). But how often does, say, a medical entry program get written with a nurse or a doctor on staff testing each iteration, having direct input at the programming team meetings, and so on? Answer: never.
Which leads to the problem of incomplete (and inevitably never complete) feature and standards docs, programmers solving problems they naively think are important, or problems they just came up with a solution to and therefore want to implement even if it's totally irrelevant to the actual goal of the program, and in general writing a piece of software that fails to do its actual job.
Want a better medical data program? Be prepared to spend years (and give up fucking MUMPS) writing it on a schedule no manager will ever be happy with, with hundreds of thousands spent on QA and even more time than any manager will ever be happy with on fixing the problems. Be prepared to make your programmers deeply unhappy by having them pseudocode or flowchart, which ties in with being prepared to spend a lot of money hiring doctors and nurses to sit in on meetings, look over the pseudocode, help design the interface, and generally make a right nuisance of themselves when the programmers would rather get them out of the way and just write their code the way they want to write it, without all this crap where the user says it doesn't do what is actually needed. And you figure out a way to absolutely mandate that every single bit of code be exhaustively documented and commented.
I think you could probably write good software that isn't a total mess of bugs, ugly kludges, and never actually meets the needs of the users. You just can't do it using the systems we've developed today. Not the software systems, but the human systems and management systems.
It'll take vastly longer than anyone wants to admit, it'll involve making programmers stop being macho and start doing boring shit they hate, it'll involve getting those messy and demanding users mixed into things from the very beginning, and it'll involve actually giving QA some teeth.
I don't think any of that will make a perfect program, but at the very least it'll make one that would be better than what we're churning out now.
Also, you need to try to keep the programmers on, so that once you've kicked version 1.0 out the door, they can take a vacation, get some downtime, and let users start picking it apart. So after a few months they can come back, start patching things, start figuring out where things went wrong, and you patch and improve version 1.0 until it's version 1.9 or whatever, and then you do the very worst and hardest thing that everyone in management will hate.
You start over from scratch for version 2.0. You absolutely forbid the programmers from copy/pasting one line of code from version 1.0 and have everyone start over, from scratch, to write version 2.0 better, leaving out all the compromises and kludges that did sneak into version 1.0 as it progressed to version 1.9. Ideally you'd have the same team of programmers working on basically the same project for their entire careers so they did gain actual experience with it and have a good idea what they're doing. You get in a whole new team of doctors and nurses who have been using version 1.0 and who know the problems they have with it, and you get a second team of doctors and nurses who have never even touched version 1.0 and have them go in fresh and find everything they hate about it. And you have both teams intimately involved in making version 2.0 better.
The problem with this is that it'd be really damn expensive and it still wouldn't be perfect, but it'd be a damn sight better than what we're doing today, and for really critical code it's necessary that we have things working much better.
This is where capitalism fails. Amazon, Facebook, Google, all of those big companies **should** be building from scratch replacements for the horrible mess of kludges that they've got going as their core software, but no one wants to spend the money that'd take.
posted by sotonohito at 7:12 AM on April 3, 2019 [19 favorites]
For example, we know for a stone cold certain fact, proven by countless studies, that if you force programmers to stop, sit down with pencil and paper, and write out pseudocode or flowchart their program before they start writing real code, they turn out better, more stable code with fewer bugs, and even accounting for the time "wasted" on flowcharting or pseudocode they finish faster than if they'd just sat down and started hammering out code.
And programmers absolutely **HATE** this. They'll do their best to deny that it's true; they'll claim that maybe for other, lesser programmers it might be true (though they doubt it), but clearly it can't possibly be true for them. The idea that they shouldn't just sit down and start hammering out code is deeply offensive to almost every programmer out there. It was to me back when I first started writing code. I'm smart, I don't need this stupid hand-holding shit, I don't want to waste my time on this baby stuff.
There's this idea among programmers that the way you code is to cram all the state you possibly can into your skull, try to hold the entire program in your brain, and then just let it all flow out through your fingers until the code is finished.
We know for a fact that this doesn't work, and yet it's how almost all the code out there gets written.
It goes hand in hand with not commenting. You're in the zone, you're in a flow state, how dare anyone suggest you break your flow to comment?
What we need is to impose a system that mandates commenting and pre-programming pseudocode or flowcharting, because clearly educating people is not working.
I'm not going to claim either of those is going to fix the problem of bad code by itself, but they'll help alleviate the problem at least a bit, and maybe in fixing those two problems we can get some better insight into why code sucks in general.
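To make that concrete, here's a tiny, purely hypothetical sketch (not from any real codebase) of what "write the plan first, keep the comments" looks like. The example itself doesn't matter; the point is that the plan exists before the code does, and the code has to answer to it.

```python
# Hypothetical sketch of "pseudocode first, comments mandatory": the
# numbered plan below was written out in plain English before any
# executable line existed, and it stays in the file as documentation.

def total_valid_doses(records):
    """Sum the 'dose' field of records that pass basic sanity checks."""
    # Plan, written before coding:
    #   1. Start a running total at zero.
    #   2. For each record:
    #        a. Skip it if the dose is missing or not a number.
    #        b. Skip it if the dose is negative.
    #        c. Otherwise add the dose to the running total.
    #   3. Return the total.
    total = 0.0
    for record in records:
        dose = record.get("dose")               # 2a: missing field -> None
        if not isinstance(dose, (int, float)):  # 2a: non-numeric value
            continue
        if dose < 0:                            # 2b: negative doses are bogus
            continue
        total += dose                           # 2c
    return total                                # 3


if __name__ == "__main__":
    sample = [{"dose": 5}, {"dose": -2}, {"dose": "oops"}, {}, {"dose": 2.5}]
    print(total_valid_doses(sample))            # prints 7.5
```

Scale that discipline up to a real program and you have something a reviewer, a QA person, or a doctor sitting in on the meeting can actually argue with.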
I'd also suggest two other big problems: companies absolutely refuse to admit how long it actually takes to write programs, and want code shoved out the door on a fantasy schedule that mandates cutting corners and crunch; and no one is willing to pay for either the programmer time or the other-people time it would take to get users involved in the actual programming steps, QA, and so on.
I think that last is at least as important as allocating a decent amount of time.
Programmers, as noted elsewhere here, are not experts in what they're writing programs to do (at least not most of the time). But how often does, say, a medical entry program get written with a nurse or a doctor on staff testing each iteration, having direct input at the programming team meetings, and so on? Answer: never.
Which leads to the problem of incomplete (and inevitably never complete) feature and standards docs, programmers solving problems they naively think are important, or problems they just came up with a solution to and therefore want to implement even if it's totally irrelevant to the actual goal of the program, and in general writing a piece of software that fails to do its actual job.
Want a better medical data program? Be prepared to spend years (and give up fucking MUMPS) writing it on a schedule no manager will ever be happy with, with hundreds of thousands spent on QA and even more time than any manager will tolerate spent fixing the problems. Be prepared to make your programmers deeply unhappy by having them pseudocode or flowchart, which ties in with the next part: be prepared to spend a lot of money hiring doctors and nurses to sit in on meetings, look over the pseudocode, help design the interface, and generally make a right nuisance of themselves when the programmers would rather get them out of the way and just write their code the way they want to write it, without all this crap where the user says it doesn't do what is actually needed. And you figure out a way to absolutely mandate that every single bit of code be exhaustively documented and commented.
I think you could probably write good software that isn't a total mess of bugs and ugly kludges, and that actually meets the needs of its users. You just can't do it using the systems we've developed today. Not the software systems, but the human systems and management systems.
It'll take vastly longer than anyone wants to admit, it'll involve making programmers stop being macho and start doing boring shit they hate, it'll involve getting those messy and demanding users mixed into things from the very beginning, and it'll involve actually giving QA some teeth.
I don't think any of that will make a perfect program, but at the very least it'll make one that's better than what we're churning out now.
Also, you need to try to keep the programmers on, so that once you've kicked version 1.0 out the door, they can take a vacation, get some downtime, and let users start picking it apart. So after a few months they can come back, start patching things, start figuring out where things went wrong, and you patch and improve version 1.0 until it's version 1.9 or whatever, and then you do the very worst and hardest thing that everyone in management will hate.
You start over from scratch for version 2.0. You absolutely forbid the programmers from copy/pasting a single line of code from version 1.0, and have everyone start over, from scratch, to write version 2.0 better, leaving out all the compromises and kludges that snuck into version 1.0 as it progressed to version 1.9. Ideally you'd have the same team of programmers working on basically the same project for their entire careers, so they gain actual experience with it and have a good idea what they're doing. You get in a whole new team of doctors and nurses who have been using version 1.0 and know the problems they have with it, and you get a second team of doctors and nurses who have never even touched version 1.0 and have them go in fresh and find everything they hate about it. And you have both teams intimately involved in making version 2.0 better.
The problem with this is that it'd be really damn expensive and it still wouldn't be perfect, but it'd be a damn sight better than what we're doing today, and for really critical code it's necessary that we have things working much better.
This is where capitalism fails. Amazon, Facebook, Google, all of those big companies **should** be building from scratch replacements for the horrible mess of kludges that they've got going as their core software, but no one wants to spend the money that'd take.
posted by sotonohito at 7:12 AM on April 3, 2019 [19 favorites]
"Favorite added" never felt so ineffectual at expressing how grateful I am for a comment before now. Thank you sotonohito.
posted by Molesome at 7:32 AM on April 3, 2019
no one wants to spend the money that'd take.
Not since the S/390 (yes, I'm aware they still exist), and never again either -- because that, in a nutshell, is one of the key reasons mainframes generally lost out vs. PCs. I've spent six months in meetings watching people argue over the EBNF for a file parser. That system basically never crashed (I say "basically" although I'm not aware of a single crash that was due to the core system. Telecom going down, sure. Actual system crash? Never.). That cost the end user quite a lot, and the company involved eventually got shut down because the end users decided they'd rather run PCs and deal with the crashing, even when those crashes threatened their core organizational functions.
People will always choose cheaper. Always.
Folks joke about NASA incompetence and delays, but that's because they're institutionally rigorous and they still fuck up from time to time.
So people look at that, and decide "fuck it, if it's gonna crash anyway, I'm gonna spend 1/10th the money."
...then they complain that it crashes 25x more often than the alternative, but if you give them the choice they will still buy cheap when the opportunity next arises. Always.
posted by aramaic at 7:44 AM on April 3, 2019 [4 favorites]
You start over from scratch for version 2.0. You absolutely forbid the programmers from copy/pasting one line of code from version 1.0 and have everyone start over, from scratch, to write version 2.0 better and leaving out all he compromises and kludges that did sneak into version 1.0 as it progressed to version 1.9.
Sounds compelling, but it's an antipattern that leads to clusterfucks like GNOME 3. Because the thing about endlessly-patched horrible don't-look-too-closely-under-the-hood software systems is that most of them work better than their from-scratch v2.0 replacements which, though they might be more satisfactory to the architecture astronauts, will in general create far more lost time for their end users than any of the remaining deficiencies in 1.x would have done. See also: second-system effect.
As deployed in the real world, the rebuild-from-scratch antipattern tends to afflict software systems that have been in use for some considerable time, from which most of the major annoyances have therefore had time to be iterated out, and which have therefore stopped generating interesting drama and are no longer Hot or Under Active Development.
It's vanishingly rare to find that a turf-it-all-and-start-again methodology actually manages to outperform fix-what's-still-broken unless 0.x and 1.x were indeed utter, utter garbage to begin with. The closest the real world gets to supporting this pattern is when a completely different organization eventually manages to polish a competitor to an existing software suite to the point where it does outperform it.
We're starting to see this with clang and gcc, for example, though clang still has a long way to go before it can simply be slotted in everywhere gcc used to be.
Software that looks horrible and complicated and fills every developer who touches it with a burning desire to rewrite it from scratch is usually not that way because it was badly designed or badly constructed or even rushed, though obviously all those terrible practices are indeed endemic. Most software that looks complicated is complicated, and only by the time v2.x has been patched and fixed and tweaked and fiddled with until it's at least as ugly as 1.x ever was will it actually be an adequate replacement for it.
posted by flabdablet at 7:50 AM on April 3, 2019 [6 favorites]
I think it's also an art problem. Bioware games lately remind me of Plan 9 from Outer Space. The political and philosophical ambitions of the narrative just can't escape from the genre of yet another derivative action/shooter RPG involving mass-murder of the other. I'd argue that their narrative ambitions and the mandate to produce a microtransaction friendly co-op endgame likely create dissonance at every level and stage of design.
posted by GenderNullPointerException at 10:20 AM on April 3, 2019 [2 favorites]
Ben Kuchera at Polygon: The Press is Not Your Enemy, Bioware
posted by Caduceus at 12:11 PM on April 3, 2019 [1 favorite]
[force] programmers to stop, sit down with pencil and paper and write out pseudocode [...] And programmers absolutely **HATE** this.
Can't favorite this hard enough. Frederick Brooks explained half a century ago how things go wrong, but everyone thinks they can ignore the lesson.
FWIW, you absolutely can have internal design, and internal documentation, on a large project. We had it at SPSS in the early 1990s. The entire million-line code base was documented, and we all had a nice printout on our shelves.
(Then what happened? Oh, there was a major crunch to get out a Windows version, and the programmers never had time to document again. Still had actual design documents, and a design department, though.)
posted by zompist at 3:28 PM on April 3, 2019 [3 favorites]
Programmers, as noted elsewhere here, are not experts in what they're writing programs to do (at least not most of the time). But how often does, say, a medical entry program get written with a nurse or a doctor on staff testing each iteration, having direct input at the programming team meetings, and so on? Answer: never.
And part of the reason for that is that computers are fundamentally alien, and defining your process with the rigour required for a computer to carry it out is something that doctors or nurses or really anyone absolutely hates to do. Trying to get people to tell you everything you need to know before you build it is an entire sub-discipline. People simply don't want to think about everything they do. Computers do not have that luxury.
Right now I'm writing project management software. We have end-users as part of the development process, and what we've found is that a) they've gotten a lot better at giving us a clear spec and b) even then, the mere existence of the thing they were building makes them realise that what they thought they wanted was not in fact what they needed.
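As a toy, entirely hypothetical example of that rigour gap: the plain-English rule is "flag patients who missed their morning dose", and the code is forced to pin down every detail nobody ever says out loud, like what counts as "morning" and what counts as "missed".

```python
# Hypothetical sketch only: every constant and assumption below is a
# decision the plain-English rule never actually made.
from datetime import time

MORNING_START = time(6, 0)    # assumption: "morning" starts at 06:00
MORNING_END = time(12, 0)     # assumption: "morning" ends at noon

def missed_morning_dose(scheduled_times, recorded_times):
    """True if a dose was due this morning and none was recorded in the window."""
    due_this_morning = any(MORNING_START <= t < MORNING_END
                           for t in scheduled_times)
    if not due_this_morning:
        return False
    # assumption: any dose recorded inside the window "counts",
    # regardless of which scheduled dose it was meant to match
    taken = any(MORNING_START <= t < MORNING_END for t in recorded_times)
    return not taken

print(missed_morning_dose([time(8, 0)], [time(14, 30)]))  # True: flagged
print(missed_morning_dose([time(8, 0)], [time(8, 10)]))   # False: covered
```

Getting the users to agree on even those two constants is exactly the sort of conversation that has to happen before the code gets written.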
posted by Merus at 5:35 PM on April 3, 2019 [4 favorites]
People simply don't want to think about everything they do.
The older I get, the less convinced I become that this has anything to do with what people want and the more I think it has to do with how people actually function at the most fundamental level.
It seems to me that the overwhelming majority of what drives human beings is habit, and that those habits are so complex and interdependent and above all idiosyncratic as to be thoroughly resistant to codification.
Over the last few decades I've seen the interface between software and people shift in ways that have people being required to learn less and less about computers in order to use them effectively, and software doing more and more to try to make this possible. This approach has been widely hailed as democratizing and enabling and egalitarian and is generally held to be more enlightened than the rather "high priest" mentality that was more common in my childhood.
But it seems to me that people are more flexible learners than computers, and that in our rush to make computers ever more approachable there is quite a lot we've lost. I'm all for egalitarian approaches, don't get me wrong, but the iPhone making file systems disappear because focus group participants are confused by folders, and blurring the lines between local storage and "cloud" storage to the point where it's actually really hard to tell which device(s) your stuff actually resides on at any given moment? Step too far.
I think there's a certain irreducible minimum of understanding that people need in order to use ICT in ways that liberate rather than enslave them, and that in attempting to deny the very existence of that minimum we've gone too far. We now have ubiquitous systems that are 1% devoted to function and 99% devoted to obviating any need for users to understand what it is that they're actually working with, and I remain unconvinced that this is a good thing.
I can't realistically expect everybody who interacts with an ICT system to learn to think like a programmer any more than I could realistically expect everybody who enjoys music to learn to play like a musician. But it's amazing just how much less intimidating IT becomes for people once they understand what a file is, and that their Word documents have an existence completely independent of that of Word itself, and that they are stored under specific names, and that those names form a hierarchy of conceptual folders, and that those conceptual folders exist on specific devices, and that when you save a document you're not saving it in Word, you're saving it with Word onto one or more of those devices.
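For anyone who wants a concrete (and purely illustrative) demonstration of that: any program at all can see those same documents, because they are just named files in folders on a device, and Word is not where they live.

```python
# Illustrative sketch: list the same .docx files Word would show you,
# using nothing but the file system. The path is an assumption -- point
# it at wherever your documents actually are.
from pathlib import Path

docs = Path.home() / "Documents"              # a folder on a specific device
if docs.is_dir():
    for f in sorted(docs.glob("*.docx")):     # Word documents, seen without Word
        print(f.name, f.stat().st_size, "bytes")
```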
It's very easy for those of us for whom those ideas have all faded into our conceptual backgrounds to forget that the majority of people do not understand them because they've simply never been exposed to them. As a person who leans egalitarian I would much rather see this basic level of IT literacy become universal - a project which I firmly believe is achievable - than continue down our present path of automated crypto-condescension from a high priesthood in deep denial.
The iOS conceit that one does in fact save the artifacts generated by an app in that app, and that there is a specific dance that one has to do in order to get stuff out of its creating app so that some other app can use it in some fashion, has always struck me as a huge step backward.
It's also completely emblematic of the general trend in software design, which brings me back to the point about computers being made to do more and more so that people are asked to learn less and less. More and more, computers are seen as appliances like refrigerators or toasters. But there is a fundamental difference between a toaster and a computer, and that is that the objects a toaster manipulates are instantly comprehensible. The reason a toaster can be an appliance is that everybody already understands what bread is and what toast is.
But almost nobody understands what a workflow is. It's actually incredibly rare to encounter a person capable of giving an accurate and precise description of their own workflows and associated information processing requirements. And when you do encounter such a person, it's compoundingly vanishingly rare to find that their workflows are actually typical for people in their field.
And this is what makes correct specifications for IT systems fall somewhere on a spectrum from fleeting to flat out impossible. People capable of telling you what they need simply don't need what their peers do. Mostly what most people need is to be left the fuck alone and allowed to do their jobs the way they always did before that fucking horrible new IT system got parachuted in and fucked everything up.
Terrible software that people have had time to adapt to and generate workarounds for works better than technically beautiful software that they haven't. Beauty remains firmly in the eye of the beholder, in software design as in everything else.
posted by flabdablet at 11:49 PM on April 3, 2019 [4 favorites]
Imagine how much better games could be if the veterans who have been making games for decades were still making games
Okay so it took me two days to read the article and I'm super late to the party but I just wanted to point out that we got that game a little while ago: it's called West of Loathing and it's fucking awesome. Among many great qualities, it absolutely beats the pants off most AAA games in the writing and the thoughtfulness of the design.
posted by range at 4:47 AM on April 4, 2019
Metafilter: automated crypto-condescension from a high priesthood in deep denial.
posted by GenderNullPointerException at 5:49 AM on April 4, 2019
Never heard the term "Bioware magic" from those who worked there, but I did hear the term "Bioware widow", used in reference to the wives whose husbands were working insane hours at Bioware. The result? Failed relationships and divorce, which can't be good for the mental health of your workforce.
Plus the move from Whyte Ave to that ugly building on Calgary Trail couldn't have helped the atmosphere. Let's just say that the Edmonton Bioware building is nothing like the EA campus in Vancouver, which is incredible.
posted by Gwynarra at 3:29 PM on April 4, 2019 [1 favorite]