"[T]reating the world as software promotes fantasies of control"
June 29, 2016 6:09 AM
Maciej Ceglowski on the moral economy of tech.
Previously and previously.
But as anyone who's worked with tech people knows, this intellectual background can also lead to arrogance. People who excel at software design become convinced that they have a unique ability to understand any kind of system at all, from first principles, without prior training, thanks to their superior powers of analysis. Success in the artificially constructed world of software design promotes a dangerous confidence.
From his talk at the SASE Conference on 26 June 2016.
See also: Brazil (1985), The map is not the territory, The Analytical Language of John Wilkins, Falsehoods Programmers Believe About Names, etc etc.
posted by ZenMasterThis at 6:24 AM on June 29, 2016 [3 favorites]
Models aren't perfect, they're just useful.
posted by Leon at 6:30 AM on June 29, 2016 [7 favorites]
I don't know if it's "liberals" who do that specifically. There are lots of human systems which, without regulation, will tend to an undesirable end-point -- waterbodies polluted, all the money in the hands of one guy, gerrymandering, etc., etc. I see this as a practical view based on observations of natural systems that doesn't have to fit a left-right mindset. In fact, the growing extreme divide between "liberal" and "conservative" is itself the result of an out-of-control system.
posted by sneebler at 6:41 AM on June 29, 2016 [7 favorites]
The reality is, opting out of surveillance capitalism means opting out of much of modern life.
I learned this firsthand when I finally bit the bullet, a year or so ago, and deleted my Facebook account. It's astonishing how many of my social connections only sustained themselves while I was reachable over Facebook Chat.
(Some irony there in that, by the time I did this, about half my friends were online-only MeFites. Please answer my texts, Cortex, I miss our Harry Potter rap-offs.)
posted by rorgy at 6:44 AM on June 29, 2016 [12 favorites]
Machine learning is like money laundering for bias. It's a clean, mathematical apparatus that gives the status quo the aura of logical inevitability. The numbers don't lie.
Yep - it also dovetails nicely with the capitalist tendency to dismiss anything that isn't profitable or that resists simple quantification as being sentimental or foolish. Meet the new boss, etc.
posted by ryanshepard at 6:47 AM on June 29, 2016 [16 favorites]
There's so much good stuff in this piece. I wish I could single out one pull quote. We're so much better for having Maciej around to poke the techies where it hurts.
posted by SansPoint at 6:47 AM on June 29, 2016 [27 favorites]
He sees it as an engineering problem, and has gone for grants to "hack" at it - never mind I think he's never actually met a homeless person or invited the organizations involved in day-to-day service to be part.
Stewart Brand, who is in many ways at the root of the Silicon Valley strain of the thinking Ceglowski is describing, is a good example of how seductive and blinding systems theory can be. As the piece points out, there are many, many privileged, highly educated, otherwise extremely intelligent people who just will not be convinced that they are wrong about their narrow, Olympian belief that social problems are merely engineering problems.
posted by ryanshepard at 7:01 AM on June 29, 2016 [12 favorites]
I've reached the point where, when a new acquaintance shows the first signs of thinking that most other people are "idiots" etc, I head for the escape pod.
posted by thelonius at 7:05 AM on June 29, 2016 [12 favorites]
There is a powerful magical belief in the generalizability of intelligence and technical skill in the IT world, even outside Silicon Valley. But I don't think it's really any more uniquely a problem of software engineers than of any other class of engineers, except that all the money is being thrown at the software engineers in Silicon Valley and all the cameras are pointing in their direction.
This is America we're talking about here. Our successful business people have always had messiah complexes. Hell, even such seemingly innocuous Americana as Kellogg's Corn Flakes were originally the spearhead of a totalizing lifestyle and nutrition cult. This phenomenon of Americans succeeding at something and thinking that means they've got all the answers is nothing new or unique to the contemporary scene.
posted by saulgoodman at 7:07 AM on June 29, 2016 [25 favorites]
Compounding this is the problem that people---engineers and civilians alike---seem to think computers are rational, logical, and infallible, when instead they're just really good at following instructions. The algorithms and AIs we create inherit the biases, flaws, and irrationality of the people creating them, and make no mistake, EVERYONE who programs is biased, flawed, and irrational, whether they want to admit it or not.
So when the computer does something flawed, biased, and irrational, the assumption is that well, the computer did it, and the computer is rational, logical, and infallible. Therefore, it must be right.
The sooner we wake up to this problem, the better.
posted by SansPoint at 7:14 AM on June 29, 2016 [10 favorites]
Racial supremacists and enlightened rationalists both passionately advocate for belief systems as shining counter examples to the efficacy and integrity thereof
And they're related in more ways than that: it's a very oppressive thing to reduce people to first order models and blame them for failing to reproduce your expectations
posted by an animate objects at 7:15 AM on June 29, 2016 [1 favorite]
I work for a digital services agency that would be just another web development firm, except that we have design and UX components and we pursue and get service design jobs, where the basic premise is that the end-to-end human experience of something is the important part, so we do journey mapping and ethnography of, say, public transit, or signing up for services. It's made the dev team much less engineering-headed than before, but it also costs us when we partner with other tech firms, because we're trying to explain that React/Redux isn't going to improve the experience of someone trying to self-file for divorce online: the problem isn't that input fields aren't immediately responsive, it's that the language around the question sounds accusatory.
posted by fatbird at 7:16 AM on June 29, 2016 [29 favorites]
That's all to say that the problems this brilliant essay points out, in terms of being stuck with engineering head and the effects of that, resonated deeply with me and I passed this around the office yesterday.
posted by fatbird at 7:19 AM on June 29, 2016 [2 favorites]
That was an interesting and entertaining read, and I say that as someone who is guilty of thinking my ability to create a relatively elegant SQL statement is proof of my viability in areas I've no experience with whatever.
posted by Mooski at 7:38 AM on June 29, 2016 [3 favorites]
Sym Roe published a good article, Unbiasing, yesterday offering some related assumption-challenges for technologists to bear in mind/apply when solving such "engineering problems".
posted by amcewen at 7:41 AM on June 29, 2016 [2 favorites]
Our intentions are simple and clear. First we will instrument, then we will analyze, then we will optimize. And you will thank us.
This can actually work, IF you instrument the right metrics. Measure the wrong thing, and....
posted by mikelieman at 7:45 AM on June 29, 2016
My first thought on reading this was that it sheds a lot of light on Uber and Lyft's our-way-or-the-highway behavior in Austin recently.
posted by tippiedog at 7:46 AM on June 29, 2016 [3 favorites]
Behavioral science - in particular behavioral economics - gives us a way to make concise, quantitative arguments about things like the role of meaningfulness and other parts of being human that must factor into all kinds of decisions.
I think it's possible that tech people are persuadable. Many are confident enough that they're not usually too defensive to listen to new information. Arguments from behavioral science and other quantitative-ish disciplines could be a key tool for connecting with these guys.
In addition, pretty much every course on this stuff starts with a series of demonstrations designed to convince people that they aren't as rational as they think they are. Maybe people need really good regular reminders of things like this.
These kinds of things, if they were presented well and consistently, and if there were a way to say "are you taking X into account" every single time tech proposals like this were made, *might* make a difference.
The problem of "tech people gonna tech" is really difficult. It's not impossible, though.
posted by amtho at 8:07 AM on June 29, 2016 [1 favorite]
IF you instrument the right metrics...
And there's the trap, because there is no such thing as the right metric: by definition, all metrics are proxies for the thing measured. You are always subject to the problem of "be careful what you measure because that's what you'll get."
posted by fatbird at 8:34 AM on June 29, 2016 [7 favorites]
One, this is the reason that the humanities have a valuable place in engineering education.
Two, I've found it noticeable and telling that the EFF's annual "Who's Got Your Back" report focuses solely on government surveillance, and not at all on commercial surveillance, even though standards like finite data retention, minimum necessary policies, and such would be easy to track. They would have to give zeroes across the board, which would be biting the hand that feeds them.
Three, I'm wondering if the solution might be the same one used for the health industry - make the data a liability. Tech firms might be less enamored of siphoning data if its acquisition and retention had real cost to it.
posted by NoxAeternum at 8:35 AM on June 29, 2016 [7 favorites]
the best kind of control is control without responsibility
Sounds like fun as long as you never lose the control and also are amoral.
Also this is the most accurate description of Engineer's Disease:
People who excel at software design become convinced that they have a unique ability to understand any kind of system at all, from first principles, without prior training, thanks to their superior powers of analysis
posted by jeather at 8:38 AM on June 29, 2016 [8 favorites]
This [NYRB] is related - applying metaphors from the information age to biology is very powerful, but still not accurate. Blinded by the power of the model we forget we're dealing with metaphors.
posted by grubby at 8:47 AM on June 29, 2016
Not a single mention of Mencius Moldbug? I'm surprised.
posted by Pope Guilty at 8:47 AM on June 29, 2016 [1 favorite]
What's even worse is that, in addition to squads of all-knowing developers, there are also squads of all-knowing analysts, managers, and consultants, many of whom essentially begin their careers getting thrown into other people's systems and business with a mandate to analyze them quickly and make them more efficient. I honestly think management science and its emphasis on thinking of all business as systems and all workers as fungible, interchangeable widgets that can be studied and understood as abstractions contribute to this, too. As part of the job description, developers and non-developer IT analysts are literally thrown into unfamiliar systems and told to fix them routinely. We criticize it here as a negative personality affect, but the industry demands developers and non-techies in the industry come into their roles confident they can understand and solve any problem.
posted by saulgoodman at 8:47 AM on June 29, 2016 [11 favorites]
My first thought on reading this was that it sheds a lot of light on Uber and Lyft's our-way-or-the-highway behavior in Austin recently.
The most illuminating thing about Lyft and Uber is how easily they buy off most local politicos, which is why it's so shocking when they fail.
posted by srboisvert at 8:52 AM on June 29, 2016 [4 favorites]
Anyone who wants to rage at people totally missing the point can check out the Hacker News thread. My favorite is the guy who tries to use Uber as a counterexample. I don't even.
posted by neckro23 at 8:57 AM on June 29, 2016 [4 favorites]
I liked this essay and I think it said a lot of true things.
It is, however, possible that other tech-biz experiences, like in-depth exposure to other people's systems, can produce an opposite default assumption: That if things are broken, there's probably a reason. Often a dark secret Eldritch reason. A reason that requires great time and effort to unearth and understand. A reason that may turn out to be intractable, meaning you've wasted your effort, in a sense. That if something were really so easy to fix, it probably would have been fixed already.
You can dismiss that as cynicism or pessimism if you like, but my point is: the cynics are quieter. You won't get an accurate picture of what "tech people" are like in general by listening mostly to noisy optimists.
posted by Western Infidels at 9:02 AM on June 29, 2016 [9 favorites]
The recent episodes of Silicon Valley dealt with the issue of being an engineer. Though they could see how amazing their technology was, their users were lost with it. It was too "engineer" for us common folk. And the engineers couldn't see the problem until after a long confrontation and dialog with the common folks. Everybody has a point of view but the difficulty comes when you can't see beyond your own point of view. When coupled with the notion that one's point of view is really objective fact it becomes even more difficult to see things with other people's eyes. Maybe dialog is the key. Dialog with other people, dialog with reality, even dialog with yourself...
posted by njohnson23 at 9:18 AM on June 29, 2016
"We should not listen to people who promise to make Mars safe for human habitation, until we have seen them make Oakland safe for human habitation. We should be skeptical of promises to revolutionize transportation from people who can't fix BART, or have never taken BART. And if Google offers to make us immortal, we should check first to make sure we'll have someplace to live."
rekt
posted by AAALASTAIR at 9:26 AM on June 29, 2016 [20 favorites]
I've reached the point where, when a new acquaintance shows the first signs of thinking that most other people are "idiots" etc, I head for the escape pod.
Yeah, and use of things like "sheeple" or "libtard" is a useful way for me to know to immediately check out of any "discussion" online.
posted by phearlez at 9:30 AM on June 29, 2016 [3 favorites]
I blame Hari Seldon.
posted by charred husk at 9:32 AM on June 29, 2016 [3 favorites]
Of all the hours of lectures I have gone to in the course of my physics education, one quote in particular has stuck with me, and I paraphrase:
"This is one of the more accurate first-principles predictions in thermodynamics. It predicts the real world value to within a factor of 3 or so."
It is this I think of every time I want to rely too much on theory alone.
posted by Zalzidrax at 9:40 AM on June 29, 2016 [17 favorites]
"This is one of the more accurate first-principles predictions in thermodynamics. It predicts the real world value to within a factor of 3 or so."
It is this I think of every time every time I want to rely too much on theory alone.
posted by Zalzidrax at 9:40 AM on June 29, 2016 [17 favorites]
a) fatbird is onto something here. language counts, and there is more to problem solving than meets the (full-stack-in-twelve-weeks-bootcamp!-web-dev) eye. or even the i-invented-a-commercial-web-service-and-am-entrepreneurial-lord-of-all-i-survey eye.
b) won't take much editing to get to a more realistic pov:
People who excel at ~~software design~~ analysis become convinced that they have a ~~unique~~ pretty decent ability to understand ~~any kind of system at all~~ a lot of systems, from first principles, without prior training, thanks to ~~their superior powers of analysis~~ practicing analysis professionally for twenty years.
maybe I'm myopic here, I guess.
c) seems like time for a metafilter 'engineer's disease' super post.
posted by j_curiouser at 11:08 AM on June 29, 2016 [1 favorite]
The perfect isn't just the enemy of the good, it's the enemy of the possible. Seeing people using software development mindsets to view the world is painful, especially after a day of writing software.
Much like the programmer who is sitting in a meeting picking holes in every approach due to fringe cases where things may not work, efforts to affect the physical world through this mindset tend to aim for the ideal state and leave the implementation to others -- or just deliver the first step and then stomp off when no one seems to be getting closer to the ideal.
I'm guessing Elon Musk was somewhat surprised by the negative feedback after he proposed Tesla buy out SolarCity, but I'm sure that the perfect vision is some sort of off-grid solar-to-home-to-transport sealed system that is beautiful in execution and not at all what either of these companies is anywhere near doing. If you live in the world where the final implementation is the only thing worth judging, you don't understand why people are so angry at you in the interim.
I'm sure someone at Uber thinks they're an amazingly automated personal transport company, using driverless cars and providing a great service, but that ignores the view out the window here in 2016 where they're cutting driver rates in most mid-sized cities, the same places where driverless cars won't be viable for a very long time.
But I'm sure these people will tell you their product is good -- the one that doesn't exist, not the one you use today.
posted by mikeh at 11:10 AM on June 29, 2016 [3 favorites]
I kind of feel like Uber and the like started as "like taxis, but better due to lack of cruft in the existing system" and then quit 2/3rds of the way into the experiment when they decided they'd rather not play the game at all when confronted with the infrastructural and legal realities the existing system works within.
posted by mikeh at 11:17 AM on June 29, 2016 [7 favorites]
Again with this guy being awesomely correct. So sad I missed this panel when I could totally have attended. I don't know two of the speakers, but one of the others up with Ceglowski was Kieran Healy, a sociologist (and blogger lamentably largely absent from Crooked Timber in recent years), whose contribution is also very worth reading.
posted by col_pogo at 11:21 AM on June 29, 2016 [2 favorites]
You can dismiss that as cynicism or pessimism if you like, but my point is: the cynics are quieter. You won't get an accurate picture of what "tech people" are like in general by listening mostly to noisy optimists.
I think the point of view you're describing is actually quite common among rank-and-file software people - which makes sense, because one of the lessons of software development is that software itself is almost inevitably compromised (just from a purely technical perspective) in practice. The thing is, it often ends up manifesting (Maciej gets at this, actually) as "social problems are intractable, therefore we shouldn't even bother to try to address them." Which is to say, plain ol' conservatism.
posted by atoxyl at 12:33 PM on June 29, 2016 [1 favorite]
People really need to get comfortable with the idea that the goal sometimes should be managing and ameliorating problems. It's true some kinds of problems may be unsolvable, but that doesn't mean there aren't better ways to help people manage and deal with them. We didn't have to solve the problem of rain to figure out you can stay dry anyway with a roof overhead.
posted by saulgoodman at 1:02 PM on June 29, 2016 [3 favorites]
I learned this firsthand when I finally bit the bullet, a year or so ago, and deleted my Facebook account. It's astonishing how many of my social connections only sustained themselves while I was reachable over Facebook Chat.
I deactivated mine in July 2014*, and imposed a rule upon myself that I couldn't delete it for a year. I didn't miss it, so a year to the day later I logged in again and deleted it. At this point friends seem to recognize me (not incorrectly, I might add) as a crank, and maybe every two or three months somebody will ask my wife to pass a message along to me.
* (when they started requiring the separate Messenger app, as a matter of fact)
For that matter I never even signed up for LinkedIn, except when I needed developer access to the API for work (not under my real name either, but somebody I worked with easily guessed the new account at the company was me).
I still don't miss Facebook, but I do wonder how sensitive the average passport control officer would be to my "no really, I just found it an irritating waste of time" argument. Maybe I could refer to it in terms of the KonMari Method. "Facebook did not bring me joy, so I said goodbye to it. I had too much clutter already and chose not to bring LinkedIn into my digital home."
posted by fedward at 1:09 PM on June 29, 2016 [1 favorite]
I'm surprised nobody's mentioned Maciej's corporate Twitter account, which is consistently, hilariously anti-technocratic.
posted by rorgy at 1:10 PM on June 29, 2016 [3 favorites]
I've not-had a Facebook so long that a.) it seems unlikely that I ever will and b.) it seems that people have come back around from "that's so weird" to "oh yeah I get it." At the same time I've come to realize that I have not built nearly as much of a social media profile for myself as a professional (yes, in software, I was busy with Problems and stuff) as would be ideal for my career prospects. You can opt out of things but you can't opt out of everything.
posted by atoxyl at 1:49 PM on June 29, 2016 [2 favorites]
I also run the Bedbug Registry, a searchable database of bedbug reports
So the author maintains an unverifiable blacklist of hotels with no democratic oversight? Physician, heal thyself.
posted by RobotVoodooPower at 1:59 PM on June 29, 2016
The Pinboard Twitter account is one of the better side benefits of being a Pinboard user.
posted by epersonae at 2:45 PM on June 29, 2016 [3 favorites]
Also, I find this from the conclusion to be weirdly inspiring, maybe because I've been listening to Hamilton so much? "Techies will complain that trivial problems of life in the Bay Area are hard because they involve politics. But they should involve politics. Politics is the thing we do to keep ourselves from murdering each other."
posted by epersonae at 2:46 PM on June 29, 2016 [5 favorites]
The Pinboard Twitter account is one of the better side benefits of being a Pinboard user.
Oh, man, yes. I love the Pinboard Twitter account. And I read the FA yesterday, but forgot to bookmark it to share later. So smart.
posted by suelac at 3:38 PM on June 29, 2016 [1 favorite]
Love @Pinboard's snarking about tech industry bullshit; wish it'd give the Brexit jokes a miss, though. (Same goes for you, @rustyk5.)
posted by We had a deal, Kyle at 4:13 PM on June 29, 2016
RobotVoodooPower, please get it right. It's an unverifiable blacklist of hotels that runs highly intrusive ads!
posted by idlewords at 7:27 PM on June 29, 2016 [10 favorites]
After decades of programming, I have learned that software has these properties: never works the first time, always contains errors, can sometimes be made to work well enough to be helpful in the real world.
posted by Triplanetary at 3:06 AM on June 30, 2016 [2 favorites]
I think this is really just the most recent large-scale demonstration of the truth of that old adage: "when all you have is a hammer..."
Not everything can be solved with software, let alone "solved". [1]
But you know, admitting that writing code isn't gonna do jack-shit to fix some problem would mean -- if your only marketable skill is coding -- that you have no power to fix something that you care about.
Some people can't deal with that.
So we wind up with all these shared delusions of grandeur. And sometimes the delusions pick up a couple rounds of funding.
---
[1] I'm pretty sure that poverty as a whole falls into that category, although poverty *rates* probably don't. But poverty has always been part of the human condition in some form or another. And ain't no app gonna change it.
posted by -1 at 9:00 AM on June 30, 2016
Artificial Intelligence's White Guy Problem - "Our world is increasingly shaped by biased algorithms that have been built with little oversight."
Sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many “intelligent” systems that shape how we are categorized and advertised to.
weapons of math destruction!
“Cathy O’Neil has seen Big Data from the inside, and the picture isn’t pretty. Weapons of Math Destruction opens the curtain on algorithms that exploit people and distort the truth while posing as neutral mathematical tools. This book is wise, fierce, and desperately necessary.”
—[mefi's own] Jordan Ellenberg, University of Wisconsin-Madison, author of How Not To Be Wrong
Surveillance capitalism has some of the features of a zero-sum game.
compare silicon valley to wall street :P
"Neoliberal politics have been a long-term, iterated form of the ultimatum game..."
posted by kliuless at 5:15 PM on July 5, 2016 [1 favorite]
This thread has been archived and is closed to new comments