“Cryptocurrency can’t crash soon enough,” one gamer fumes.
January 21, 2018 1:29 PM
It's a terrible time to buy a graphics card [PC Gamer] “Not too long ago we warned of a potential impending surge in graphics card prices. Unfortunately for those looking to upgrade graphics cards or buy a new GPU, that prediction has come true, and then some. Right now is the worst time in the history of graphics cards to buy or upgrade this all-important gaming component. I've run through the list of current generation GPUs, and I've even looked at previous generation GPUs. The price changes in the past two weeks are staggering.”
• Why Building a Gaming PC Right Now Is a Bad Idea, Part 2: Insane Graphics Card Prices [Techspot]
“Just as we thought graphics card pricing was starting to settle down late last year, it has skyrocketed higher than ever as a result of increasing demand from cryptocurrency miners and Chinese gamers, in addition to equally inflated memory prices. Altogether, this scenario has provided the perfect storm of reasons for holding off on your next GPU upgrade. The surging profitability of cryptocurrencies is likely the biggest contributing factor to the sudden hike in GPU pricing. The price of Ethereum has skyrocketed from ~$450 to well over $1,000 in the last month. At the time of writing, the value had hit just over $1,200, or a 170% increase in about a month and a half, so it's no wonder miners are snapping up every last GPU they can get. Ethereum is just one example, but there are other "up and coming" cryptocurrencies that have miners excited. So, while we hoped for cryptocurrency mining to start cooling off in 2018, so far the opposite has occurred.”
• Graphics card prices are ludicrous. [PC World]
“Graphics card prices are downright ludicrous, as the cryptocurrency craze skyrockets in lock-step with Bitcoin’s rising price. That’s nothing new. Miners have been inflating graphics card prices since the middle of the year. But the damage was limited to the middle of the market for most of 2017. Currently, virtually every gaming-class graphics card is affected. The theoretically $200 3GB GeForce GTX 1060 currently goes for $380 to $550—when you can find it in stock, that is. The Radeon RX 570, RX 580, and 6GB GTX 1060—graphics cards with suggested retail prices of $200 to $250—are selling for $500 to $800 on Newegg. And if you thought the GTX 1080 Ti’s $700 price tag was steep, prepare to clutch your chest: Nvidia’s gaming flagship can’t be found for less than $1,300 on Newegg today. It’s bleak. Real bleak. And the prices for used graphics cards are just as depressing.”
• Here’s why you can’t buy a high-end graphics card at Best Buy [Ars Technica]
“On Wednesday, I visited my local Best Buy to see the graphics card shortage for myself. A locked cabinet in the gaming section was supposed to hold a wide range of graphics cards. But the more expensive ones—cards like the AMD Radeon RX 580 and Nvidia GeForce GTX 1080 that were best suited for cryptocurrency mining—were sold out. All that was available were a handful of low-end graphics cards not suitable for Ethereum mining. I asked a sales representative to check to see if I could get an RX 580 at another Best Buy store. No luck. "It's not available in any store," she told me. "It's not online." I had an experience that's being played out all over the country and around the world. Sullivan McIntyre, a Redditor from San Francisco, sent us this photograph of the graphics card section at Central Computers. Not only are there hardly any graphics cards available for purchase, but the company has notified customers that it is suspending its return policy for graphics cards. The reason: graphics card prices have been fluctuating wildly.”
• To Combat Shortage, Nvidia Asks Retailers to Limit Graphics Card Orders [PC Mag]
While there isn't much major manufacturers AMD and Nvidia can do about the overwhelming demand for GPUs, Nvidia is at least trying to let retailers know that they should be holding their stock for the company's core audience: gamers, not miners. “For NVIDIA, gamers come first. All activities related to our GeForce product line are targeted at our main audience. To ensure that GeForce gamers continue to have good GeForce graphics card availability in the current situation, we recommend that our trading partners make the appropriate arrangements to meet gamers' needs as usual.” Nvidia is suggesting that retailers limit graphics card orders to just two per person, but that's just an idea—one Nvidia can't actually enforce beyond restricting sales on its website, which it's currently doing. That said, it wouldn't be difficult to place multiple orders on behalf of you and your friends to skirt Nvidia's two-per-person limit. While Nvidia's move will at least force amateur cryptocurrency miners to get creative instead of just buying 16 graphics cards at once, the larger supply and demand issues cryptocurrency miners and gamers face might only be settled by time.
Unsourced graphic showing Bitcoin energy consumption in the last few years (if anyone can find sourced or better data, please post.)
tl;dr: "virtual" currency costs a lot of real fossil fuel.
posted by splitpeasoup at 1:46 PM on January 21, 2018 [35 favorites]
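A quick sanity check of the Ethereum figures in the Techspot excerpt above; a minimal sketch in Python, using only the article's own ~$450 and ~$1,200 price points:

```python
# Rough check of the "170% increase" claim in the Techspot excerpt.
# Both prices are the article's figures, not independently sourced.
start_price = 450.0      # USD, roughly six weeks earlier per the article
current_price = 1200.0   # USD, "at the time of writing"

increase = (current_price - start_price) / start_price
print(f"Increase: {increase:.0%}")   # ~167%, consistent with the quoted ~170%
```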
Yeah I got my Asus 1070 gaming laptop October 2016. I could never afford that now. Well, for various reasons I couldn't afford a graphing calculator right now. But still, yeah, the bubble can't pop loudly and spectacularly soon enough.
posted by Splunge at 1:47 PM on January 21, 2018 [6 favorites]
This Bitcoin bubble can't collapse quick enough.
posted by GallonOfAlan at 1:51 PM on January 21, 2018 [46 favorites]
I found my GPU in the electronic recycling at work about a year ago. (Actually, I found a stack of six, but only one of them was any good.) I then went out and got a refurbished gaming rig from newegg, which happened to already have a pretty decent GPU in it. So now my two GPUs run in a headless tower, trying to figure out how to turn speech into text with deep neural networks all hours of the day.
In short, nevermind gaming and cryptocurrencies, the real victims here are the amateur machine learning enthusiasts.
posted by kaibutsu at 1:52 PM on January 21, 2018 [52 favorites]
Unsourced graphic showing Bitcoin energy consumption in the last few years (if anyone can find sourced or better data, please post.)
I hope it's not too much of a derail, but regarding environmental impact and bitcoining, this episode from The Current on CBC [Full Transcript] is relevant: Is bitcoin an environmental hazard?
“"I would say it's roughly comparative to running a vacuum cleaner in my kitchen 24 hours a day," says Drew Taylor, a Montreal bitcoin miner. Taylor has two specialized super-computers needed to keep up in bitcoin mining — which is essentially a race to solve the complex math riddles that underpin the bitcoin network, in return for new coins. "Right about now, [bitcoin is using] almost 37 TWh per year, which is the electricity consumption of a country like Bulgaria," says Alex de Vries, a data consultant and creator of The Bitcoin Energy Consumption Index. "The real shocker here is that the bitcoin network only processes 350,000 transactions per day. So if you average out the number per transaction ... [that's equivalent] to the power needed for one U.S. household for almost 10 days." At the current rate, de Vries says a bitcoin uses 70,000 times more energy than Visa per transaction.”posted by Fizz at 1:53 PM on January 21, 2018 [34 favorites]
It's okay; as linked/described in an FPP from earlier today, Bitcoin is scientifically proven to crash later on this year. Hang on in there.
posted by Wordshore at 1:54 PM on January 21, 2018 [3 favorites]
Three bitcoin posts just today. It'll definitely crash.
Says the people who don't have anything in crypto. Including me.
posted by Made of Star Stuff at 1:55 PM on January 21, 2018 [15 favorites]
I work with a kid who is trying to build a gaming PC right now out of a mixture of gifted and purchased components. The struggle is real. His difficulties have completely quashed any interest I've had in building a computer right now, though my interests these days are more in low-end Raspberry Pi-style projects.
posted by Mr.Encyclopedia at 1:56 PM on January 21, 2018 [5 favorites]
It was going all so well. Bitcoin ASICs were making cryptocoin mining on anything but ASICs wholly unprofitable.
But then Ethereum decided it was going to lean on the GPU so that everyone could mine.
Fucking idiots.
posted by Talez at 2:00 PM on January 21, 2018 [12 favorites]
I feel like my GPU is probably much happier with its new life since it gave up mining and took up fiddling with neural networks. It pays only slightly less, and if Leelaz one day grows up to be the AI that takes over the world I'll be able to say I beat up on it when it was a baby.
posted by sfenders at 2:00 PM on January 21, 2018 [3 favorites]
Does anyone want the GTX 580 I found while cleaning out the basement? Only $499.
posted by miyabo at 2:02 PM on January 21, 2018
The economics at work here are interesting, but speaking as a person who plays games: I could do with less fetishizing of graphics anyway. None of my favorite games in recent memory have been great because they were visually stunning. (Well, some of them have been - but because they were thoughtfully art-directed with a unique aesthetic that suits the game, not because they ran at 1080p with bleeding-edge shaders or whatever.)
They've been good because they present the player with interesting choices and challenges, or tell a real story (not just "ur an uber-macho military d00d; shoot all the things"), or explore new territory in the gaming medium, or are just really well tuned examples of established genres.
Heck, half of them use sprites / pixel graphics, or even ASCII.
tl;dr Not every game has to be a CGI blockbuster. Games are about choices - tactics, strategy, exploration - and while pretty graphics can be nice, they don't change the choices that the game offers to the player.
posted by escape from the potato planet at 2:02 PM on January 21, 2018 [49 favorites]
I'm running a GTX 970 and it's starting to show a little bit of its age. I can still play most AAA games but I'm starting to throttle back on some of the graphical settings because the framerate is dipping and certain games just demand a bit more power than I'm able to run. I still have another year left before I'll have to upgrade/invest. Hoping prices will settle by then.
posted by Fizz at 2:04 PM on January 21, 2018 [2 favorites]
Heh, about six months ago I was brainstorming with a friend, and we were actually discussing the options of simply buying and flipping GPUs instead of wasting time actually mining or HF trading.
It's a no brainer. If you want to get really rich the easy way during a gold rush - sell expensive shovels.
Part of the hitch in that idea was just finding appropriate GPUs at all. At any price. It's not like I can set up a shade tree forge and hammer out some ASICs and FPGAs to sell.
Also, I would like to point out what the market looks like right now. This is a live link, so what I'm talking about is going to be good for about 24-48 hours. That recent huge dip is still on the graphs, and you can see it mirrored over 99% of the coins and how tightly locked they are to Bitcoin's value.
Those graphs don't always look like that, though. Sometimes certain coin pairs will have inversely related dips and spikes where you can see a major arbitrage trade happening directly between coins. The major sell off and downward run and the linked movements indicate a true panic selloff, not a whale rolling over and manipulating the market.
Also, I hate to say this but I don't see a full crash of cryptocoins happening. I see a correction back to saner prices happening, but I don't see the hype train derailing entirely in 2018.
Whether or not Bitcoin itself continues to retain value, I don't know. It's pretty obviously eating itself alive with speculation, high transaction fees and high power consumption costs, and with the resistance to changing any of that.
There are already a dozen other established coins with market caps in the near or multi-billions that have much less power consumption per transaction, nearly unlimited scalable transaction limits and low or zero fees.
posted by loquacious at 2:07 PM on January 21, 2018 [9 favorites]
The biggest problem is that even after the crash there's going to be a run on boards again with RMAs which will further constrain supply. Hopefully vendors are going to be really strict about returning "faulty" boards.
posted by Talez at 2:07 PM on January 21, 2018 [2 favorites]
I picked up an RX 580 right when they were released for about $330 CAD. The cheapest version of that card is on Canadian Newegg for $699.
If the reason I bought that card in the first place wasn't to replace a dead video card, I would have flipped it for profit a while ago. But that's the thing - if I sell it, I've got no chance of replacing it with anything comparable at that price point.
posted by thecjm at 2:09 PM on January 21, 2018 [2 favorites]
what's weird is NVIDIA recently got in hot water for banning use of the 1080 Ti in datacenters. (It's an extremely awesome GPU for deep learning at a fraction of the price of their "enterprise" offerings.) It's the same sort of thing, restricting the use of a supposedly consumer-oriented chip for non-gaming uses.
but then, in the EULA that banned its use in datacenters, they made an explicit exception allowing use for cryptocurrencies.
so I think any efforts to make sure Joe Q. Gamer gets priority in buying these must just be PR moves, whatever their words, their actions say they definitely want people to buy their chips and use them for mining.
posted by vogon_poet at 2:11 PM on January 21, 2018 [15 favorites]
This Bitcoin bubble can't collapse quick enough.
I dunno, I'd rather not see what miners' attention turns to after they can't trade their Dunning-Krugerrands anymore. SWATting futures? Fedora trading options?
posted by Dr. Twist at 2:11 PM on January 21, 2018 [14 favorites]
The hat is called a trilby, thankyouverymuch
posted by DoctorFedora at 2:19 PM on January 21, 2018 [49 favorites]
> Fizz:
"I'm running a GTX 970 and it's starting to show a little bit of its age. I can still play most AAA games but I'm starting to throttle back on some of the graphical settings because the framerate is dipping and certain games just demand a bit more power than I'm able to run. I still have another year left before I'll have to upgrade/invest. Hoping prices will settle by then."
Shush. I am still on a (gifted) GTX-760. And that was an upgrade from the old GeForce 9000 I used to use.
posted by Samizdata at 2:21 PM on January 21, 2018 [1 favorite]
"I'm running a GTX 970 and it's starting to show a little bit of its age. I can still play most AAA games but I'm starting to throttle back on some of the graphical settings because the framerate is dipping and certain games just demand a bit more power than I'm able to run. I still have another year left before I'll have to upgrade/invest. Hoping prices will settle by then."
Shush. I am still on a (gifted) GTX-760. And that was an upgrade from the old GeForce 9000 I used to use.
posted by Samizdata at 2:21 PM on January 21, 2018 [1 favorite]
There's a PC builder born every minute.
posted by Beholder at 2:21 PM on January 21, 2018 [2 favorites]
Also, I don't mine ANYTHING. I play games (on a 2008 Core 2 Quad system - Also gifted as one of those "If you can get this working, you can have it" things.)
posted by Samizdata at 2:22 PM on January 21, 2018 [2 favorites]
So grateful for all the games in my Steam backlog that my aging card can still handle.
posted by straight at 2:26 PM on January 21, 2018 [25 favorites]
GPUs are fantastic for parallel processing. So: image analysis, cryptocurrency, etc. Some things have more potential than Skyrim mods.
posted by grumpybear69 at 2:27 PM on January 21, 2018
tl;dr Not every game has to be a CGI blockbuster. Games are about choices - tactics, strategy, exploration - and while pretty graphics can be nice, they don't change the choices that the game offers to the player.
No, it doesn't have to be anything. Except for what each individual wants it to be. You are happy with your set up. Good for you. Some of us like VR. Good for us.
posted by Splunge at 2:28 PM on January 21, 2018 [17 favorites]
Some things have more potential than Skyrim mods
Blasphemy!
posted by Homo neanderthalensis at 2:33 PM on January 21, 2018 [4 favorites]
Pretty sure I've got an AGP 4X Riva TNT2 in a box somewhere, is that worth anything?
posted by parm at 2:35 PM on January 21, 2018 [4 favorites]
Yeah, the VR angle is a notable one; that setup by definition requires twice the GPU output (and maybe more so, because framerate and consistency of framerate matter a whole lot more in VR to avoid the upchucks) as an otherwise identical flatscreen gaming experience, which means people trying to set up a machine to work with what's currently a burgeoning technology and gaming paradigm are gonna feel the burn especially hard with this weird market spike.
Beyond that there's the reality that as a game consumer you don't control the design decisions being made for the games you're interested in. I love love love a lot of indie stuff and retro/pixel-styled stuff that runs fine on underpowered machines so it's not the end of the world to me if I fall behind on the graphics race, but I can't just e.g. cause Plunkbat to suddenly be optimized better for graphical performance. It's not by far the prettiest game in the world, it just doesn't run so great on *anything* still, and my having a healthy appreciation for less graphically demanding game design doesn't make Plunkbat run any better.
Anyway, the whole larger phenomenon is equal parts fascinating and maddeningly stupid. To the extent that the GPU makers are making bank off this, I sure hope it turns into some dividends-paying R&D for the next few years of cards.
posted by cortex at 2:36 PM on January 21, 2018 [7 favorites]
Some things have more potential than Skyrim mods.
[Strong hated that]
posted by GCU Sweet and Full of Grace at 2:38 PM on January 21, 2018 [20 favorites]
Looking forward to the renaissance of 2D side scrollers.
posted by gwint at 2:45 PM on January 21, 2018 [9 favorites]
To cut down on the energy (over-)consumption, what about charging egregious amounts for fossil fuel usage well above what the median household uses, after adjusting for HH size? E.g. I live alone, I used 130 kWh last month running normal appliances and stuff. I don't know if that's average or not, but let's say yes. What if anything up to 300% of that is charged at the normal rate, and then anything over that is 50x the rate. Exceptions for, idk, medical equipment? and seasonal adjustments for electric heat/AC.
posted by AFABulous at 2:47 PM on January 21, 2018 [3 favorites]
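For illustration only, a minimal sketch of the tiered-billing idea in that comment. The 130 kWh baseline, 300% threshold, and 50x multiplier are the comment's own numbers; the $0.13/kWh base rate is an assumed figure:

```python
# Hypothetical tiered electricity bill as sketched in the comment above.
# Baseline, threshold, and penalty multiplier come from the comment; the base rate is assumed.
def monthly_bill(usage_kwh, median_kwh=130.0, base_rate=0.13, penalty_multiplier=50):
    threshold = 3 * median_kwh                 # up to 300% of median usage billed normally
    normal_portion = min(usage_kwh, threshold)
    excess_portion = max(usage_kwh - threshold, 0)
    return normal_portion * base_rate + excess_portion * base_rate * penalty_multiplier

print(f"${monthly_bill(130):,.2f}")    # typical single-person household: $16.90
print(f"${monthly_bill(2000):,.2f}")   # mining-rig territory: ~$10,516, mostly at the 50x rate
```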
There's a parallel issue here where RAM prices have been skyrocketing as well. RAM has virtually no importance to cryptomining, so the explanations I've seen have ranged from simple supply chain problems to price-fixing conspiracies between the two or three manufacturers that produce the memory chips that everybody else repackages into RAM sticks.
posted by tobascodagama at 2:51 PM on January 21, 2018
RAM has virtually no importance to crypto-mining,
Bitcoin yes, Ethereum no. Ethereum requires gigabytes of scratch space and as much memory bandwidth as you can throw at it.
posted by Talez at 2:53 PM on January 21, 2018 [3 favorites]
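The memory-hardness Talez is describing comes from Ethash's design: every hash forces many pseudo-random reads from a multi-gigabyte dataset (the DAG), so memory bandwidth rather than raw arithmetic is the bottleneck. A simplified Python sketch of that access pattern follows; it is not the real Ethash algorithm, just an illustration of why the work is dominated by random reads over an array too big to fit in cache:

```python
# Illustrative sketch of a memory-hard proof-of-work, loosely inspired by Ethash's DAG lookups.
# NOT the real Ethash algorithm; the real DAG is measured in gigabytes and grows over time.
import hashlib
import random

DATASET_WORDS = 1_000_000
dataset = [random.getrandbits(64) for _ in range(DATASET_WORDS)]

def memory_hard_hash(header: bytes, nonce: int, rounds: int = 64) -> str:
    mix = int.from_bytes(hashlib.sha256(header + nonce.to_bytes(8, "big")).digest(), "big")
    for _ in range(rounds):
        index = mix % DATASET_WORDS      # pseudo-random index derived from the running mix
        mix ^= dataset[index]            # each round forces a read somewhere in the big array
        mix = int.from_bytes(hashlib.sha256(mix.to_bytes(32, "big")).digest(), "big")
    return hashlib.sha256(mix.to_bytes(32, "big")).hexdigest()

print(memory_hard_hash(b"block header", nonce=42))
```

The real scheme sizes the dataset so it can't fit in the cheap on-chip SRAM an ASIC would use, which is what keeps commodity GPUs and their fast DRAM competitive.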
speaking as a person who plays games: I could do with less fetishizing of graphics anyway.
An interesting bit of anecdote: It may be more due to the way I've raised them, but my kids don't differentiate between high-end blockbusters like Horizon and games like Sonic Mania or even classic games like Pac-Man or Super Mario Brothers. I've never known them to turn their noses up at a game because it has primitive graphics.
Millennials grew up in a video game landscape where the newer more exciting games and consoles were always coming out with better graphics, which encouraged them to think better graphics = better game. Today there's lots of indie games hitting it big and much smaller differences between a good-looking game today and a good-looking game 5 years ago, so it's easier to focus on the other aspects that make or break a good game.
posted by Mr.Encyclopedia at 2:59 PM on January 21, 2018 [20 favorites]
Like 95% of the time, I'm an indie gamer who doesn't care at all about my graphics card. But I'm still in a position where that other 5% is stuff that every year, new releases get harder to play on my computer, and maybe there's really only one thing a year that I'm that into playing, but I'd still like to be able to play that one thing. I've switched more to my PS4 for anything that should be pretty, because I should have upgraded my graphics card a few years ago and didn't, and now I'm not sure it seems worth it.
posted by Sequence at 3:03 PM on January 21, 2018 [5 favorites]
We've discontinued our locomotive simulator product so no need for the CGI computer for graphics, so no need for the 6 GTX 1080 Ti's on the shelf. Guess we'll just throw them out.
posted by Grumpy old geek at 3:08 PM on January 21, 2018
Yeah, you can bet a billion satoshis that GPU OEMs are capitalizing on this and doing a lot of back channel stuff.
Also, even without weird backchannel and collusion stuff - keeping production, demand and scarcity in balance is standard practice for any chip fab and downstream product line. This is true for CPUs, DRAM, NAND/Flash, ASICs, FPGAs - just about any high value device on a die - it's a pretty delicate balance between production, predicting demand and profits because of the R&D and lead time.
There's always been at least some planned, functional and intentional scarcity in the bleeding and leading edges of tech. Cripes, I've heard stories about RAM factories slowing production to keep prices inflated going back to the early 1990s.
And this has been true of GPU tech pretty much since the time when GPUs first acquired their own cooling fans. Beyond the fact that these complex devices have high failure/reject rates in the factory and they can only make so many of them, the hype to have the latest and greatest helps keep the prices inflated above the sea of sunk costs of developing and making chips.
And they've been making mining-focused or friendly GPUs for at least a year now. I mean, specifically mining targeted cards, not just general purpose machine learning or parallel processing cards, like "here's your bare bones, reduced cost, reduced power, ultra-maximum hashes-per-second 'GPU' with rack-friendly power risers" cards.
So yeah, count on it. GPU OEMs love crypto. There's a reason why you can't find cards reliably through your regular channels, and it's not just because Joe Homeminer bought four of them at retail prices and drilled them into an Ikea end table in his kitchen.
You can bet on the idea that product is being diverted or slipstreamed directly to miners with big wallets. Why sell 500 GPUs to 10 different suppliers at wholesale rates and stand by and watch them go for double or triple your MSRP when someone knocks on your back door and wants to buy, well, all the GPUs and they're offering a cash and carry price above the wholesale price? Or full MSRP? Under the table? Gee, is that a briefcase full of cash, er, a Trezor with 100 BTC? Load up the truck, Newegg can wait!
Side note and prediction:
Watch for China's GPU, FPGA and ASIC technology level to blow up and drastically increase production to drive down the costs of mining hardware and processing power in general.
I know they were already working towards this in major ways way before cryptocurrencies, but now there's a direct financial incentive to create extremely fast, advanced and cheap silicon.
We're seeing results of this now with ASICs and other advanced chips, but what I'm predicting is that they may actually blow up the general GPU/MPU market and scene in ways that reach the consumer market for gaming GPUs.
posted by loquacious at 3:10 PM on January 21, 2018 [8 favorites]
Watch for China's GPU, FPGA and ASIC technology level to blow up and drastically increase production to drive down the costs of mining hardware and processing power in general.
The biggest problem is that Ethereum is pretty resistant to being run on an ASIC. Any ASIC and board required for massive hash output is going to look suspiciously like a GPU anyway.
posted by Talez at 3:14 PM on January 21, 2018 [2 favorites]
How much has the low price of GPUs in the past depended on low labour costs and externalized environmental costs? Have we gotten used to cheap GPUs because we've gotten used to indirectly exploiting workers and wrecking the environment?
posted by clawsoon at 3:14 PM on January 21, 2018 [2 favorites]
By their nature, chips have very little human involvement in their production in comparison to other areas, and what labor they do require is pretty skilled. Your biggest inputs on the marginal production are sand and energy. The biggest externality is probably the shitty chemicals used as doping agents, photoresist layers, and cleaning agents (hydrofluoric acid).
posted by Talez at 3:21 PM on January 21, 2018 [7 favorites]
See, the way I avoid all this fooferaw is by buying Macs, whose graphics cards you can’t upgrade at all. Oh, and consoles.
posted by ejs at 3:21 PM on January 21, 2018 [4 favorites]
The biggest problem is that Ethereum is pretty resistant to being run on an ASIC. Any ASIC and board required for massive hash output is going to look suspiciously like a GPU anyway.
Right, I'm talking in general - and I'm also specifically predicting cheaper GPU/MPU systems, since they can be repurposed and cross-marketed in ways ASICs can't.
Starting a new process or line can mean having to build an entirely new fab plant because retooling an existing one and taking it offline can be more expensive than just building a whole new one. Both options are incredibly expensive.
posted by loquacious at 3:30 PM on January 21, 2018
What if a Skyrim mod where, when you mine, the GPU starts mining cryptocurrency
posted by um at 3:38 PM on January 21, 2018 [12 favorites]
Looks like AMD can easily clear out that warehouse of RV350s.
posted by Jessica Savitch's Coke Spoon at 3:42 PM on January 21, 2018
Pretty sure I've got an AGP 4X Riva TNT2 in a box somewhere, is that worth anything?
posted by parm at 5:35 PM on January 21
Perhaps you jest. But old RAM is worth money. As soon as a type of RAM is no longer manufactured it gets more expensive.
posted by Splunge at 3:53 PM on January 21, 2018 [4 favorites]
Unsourced graphic showing Bitcoin energy consumption in the last few years (if anyone can find sourced or better data, please post.) tl;dr: "virtual" currency costs a lot of real fossil fuel.
And so does "gaming" - which it could be argued that 1st person shooters show you how to be a better soldier as a positive externality.
From my POV the cost for gaming or cryptocurrency are both a negative social good and instead the social considerations of this kind of computational power and resulting electric power consumption on things like the ability to have a more effective surveillance state along with development and refinement of what is called AI using the kind of hardware being discussed is more important to figure out 1st before thinking about the plate of beans over who should be more morally outraged - gamers or cryptominers that their hobby costs more.
(And no one bother to point out the 12 TB hard drive for $450 is due to the surveillance state and desire for corps to better datamine citizens. Already made THAT observation last week, just not here on The Blue.)
posted by rough ashlar at 3:58 PM on January 21, 2018 [1 favorite]
And so does "gaming" - which it could be argued that 1st person shooters show you how to be a better soldier as a positive externality.
From my POV the cost for gaming or cryptocurrency are both a negative social good and instead the social considerations of this kind of computational power and resulting electric power consumption on things like the ability to have a more effective surveillance state along with development and refinement of what is called AI using the kind of hardware being discussed is more important to figure out 1st before thinking about the plate of beans over who should be more morally outraged - gamers or cryptominers that their hobby costs more.
(And no one bother to point out the 12 TB hard drive for $450 is due to the surveillance state and desire for corps to better datamine citizens. Already made THAT observation last week, just not here on The Blue.)
posted by rough ashlar at 3:58 PM on January 21, 2018 [1 favorite]
From my POV the costs for gaming and cryptocurrency are both a negative social good. The social considerations of this kind of computational power and the resulting electric power consumption - things like the ability to have a more effective surveillance state, along with the development and refinement of what is called AI using the kind of hardware being discussed - are more important to figure out first, before thinking about the plate of beans over who should be more morally outraged that their hobby costs more: gamers or cryptominers.
Damn, the 90's really are back in style.
posted by codacorolla at 4:06 PM on January 21, 2018 [11 favorites]
though my interests these days are more in low-end Raspberry Pi-style projects.
PSA to you (and others) - Consider the Rock64. Sub-$50 for four 64-bit 1.2 GHz cores, gigabit Ethernet, and 4 GB of DRAM. All those numbers are better than the Pi, and if the "small" Pi size is the concern, you only go a bit bigger at the expense of a bit more cash, no built-in wifi or bluetooth, and no ribbon connection for a camera or LCD screen. They just made some code to let 'em boot from USB or Ethernet, so as that makes it into the various UNIXes you can end up with the PC-esque standard of "attach thing and boot" that people are spoiled with on Intel-PC kinds of things.
posted by rough ashlar at 4:08 PM on January 21, 2018 [14 favorites]
From my POV the cost for gaming or cryptocurrency are both a negative social good
How about TV?
posted by Splunge at 4:10 PM on January 21, 2018 [2 favorites]
things like the ability to have a more effective surveillance state along with development and refinement of what is called AI using the kind of hardware being discussed is more important to figure out 1st before thinking about the plate of beans over who should be more morally outraged - gamers or cryptominers that their hobby costs more.
Other concerns I have about cryptocurrencies:
-People going back in time to set up crypto mining rigs in the late '90s/early oughts.
-People using quantum technologies to mine coins in parallel universes.
posted by TheWhiteSkull at 4:10 PM on January 21, 2018 [41 favorites]
Damn it TheWhiteSkull. Ssssssshhhhhh!!!
posted by Splunge at 4:12 PM on January 21, 2018 [3 favorites]
Regardless of whether you think "gaming" is good--I think it's worth noting that there's a lot of high-end-graphics games that aren't FPS games, but it's not super relevant to this particular part of the discussion--the fact remains that I can only play video games on one computer at a time. The bulk of cryptocurrency mining is not, as I understand it, people running this on the background of their regular desktop while they aren't using it, or even people setting up one extra machine to do it. The fact that Ethereum is more GPU-friendly doesn't mean that the prices on these are because a bunch of individual people are buying one each. A relatively small number of people are buying a lot of hardware and using a lot of power. One gamer's environmental impact is not the same as one Ethereum miner's environmental impact just because they're both using the same graphics card.
posted by Sequence at 4:14 PM on January 21, 2018 [10 favorites]
Some things have more potential than Skyrim mods.
I am sworn to compute your burdens.
posted by Celsius1414 at 4:17 PM on January 21, 2018 [7 favorites]
So that's why the one and only time someone hacked my ebay account, they tried selling a bunch of graphics cards.
posted by drezdn at 4:20 PM on January 21, 2018 [1 favorite]
Watch for China's GPU, FPGA and ASIC technology level to blow up and drastically increase production to drive down
The semiconductor industry has always been boom-and-bust cyclical. One of the best jobs I ever had was short-lived due to the cycle, and, well, funny story: I was violently ill, and the next day I came in and the office was empty; the HR guy found me and filled me in that the entire group was whoosh.
The next huge draw on GPU tech will be hitting soon: every self driving car will need redundant high performance boards for both the image recognition and the AI(ish) smart algorithms. It'll push in another direction as low power chips will be increasingly important, so more volume (building out billion dollar fab plants) and more demand. Poor gamers, can't win. ;-)
posted by sammyo at 4:24 PM on January 21, 2018 [1 favorite]
Also, I wouldn't buy a second-hand graphics card any time in the next few years; chances are it will have been overclocked, used to mine coins and, quite probably, burned out in the process.
Future archaeologists will identify the Bitcoin era by the layer of scorched GPUs and useless single-purpose ASICs in the landfill.
posted by acb at 4:24 PM on January 21, 2018 [14 favorites]
“According to these artifacts primitive 21st century humanity had a craze at the start of the century that involved hashing random numbers. Nobody knows why they engaged in this pointless activity. Some have speculated it is some sort of currency but the notion has been laughed out of the historical community as they were not as primitive as Yap society in the Middle Ages.”
posted by Talez at 4:31 PM on January 21, 2018 [17 favorites]
If archaeologists are at a loss for what caused society to reorient itself to this pointless activity, they'll probably settle on it having been some kind of religious ritual sacrifice.
posted by acb at 4:41 PM on January 21, 2018 [11 favorites]
> Grumpy old geek:
"We've discontinued our locomotive simulator product so no need for the CGI computer for graphics, so no need for the 6 GTX 1080 Ti's on the shelf. Guess we'll just throw them out."
I will take one, and swear NO MINING AT ALL! (I mean, I don't even own a copy of Notch's game...)
posted by Samizdata at 4:43 PM on January 21, 2018
"We've discontinued our locomotive simulator product so no need for the CGI computer for graphics, so no need for the 6 GTX 1080 Ti's on the shelf. Guess we'll just throw them out."
I will take one, and swear NO MINING AT ALL! (I mean, I don't even own a copy of Notch's game...)
posted by Samizdata at 4:43 PM on January 21, 2018
I mean, if someone is offering a GTX 1080 for a reasonable price, I'm not going to say no. I have a nearly ten year old graphics card and I've been waiting for the crash for long enough that everything else is upgraded to modern standards.
posted by Merus at 4:50 PM on January 21, 2018
PSA to you (and others) - Consider the Rock64. Sub-$50 for four 64-bit 1.2 GHz cores, gigabit Ethernet, and 4 GB of DRAM.
My most recent purchase was a MinnowBoard Turbot, which I specifically sought out because of its Intel x64 chip. It's attached to an external RAID box and manages a sizable Plex Media Server library. I got sick of Plex faltering at transcoding video on the various ARM chips I previously had it running on. The Turbot's processor does the job quite nicely.
posted by Mr.Encyclopedia at 4:52 PM on January 21, 2018 [3 favorites]
huh, I have like ten old cards laying around. I think I hear ebay calling my name!
posted by mwhybark at 4:52 PM on January 21, 2018 [1 favorite]
I mean, if someone is offering a GTX 1080 for a reasonable price, I'm not going to say no. I have a nearly ten year old graphics card and I've been waiting for the crash for long enough that everything else is upgraded to modern standards.
Nvidia sells them directly on their website at MSRP (though not quickly; rather than being able to add to cart, there are "notify me" buttons). I know a couple years ago we used to get things lower than MSRP, but for the near future MSRP sounds like a good deal.
posted by tclark at 4:55 PM on January 21, 2018
a datapoint re the coming bitcoin crash: my longterm pal Nnn, who is a community manager for a US-based international quick-gig outsourcing firm (I think), shared that one of his firm's contractor base, a Venezuelan man, had named his son after Nnn. The new father told Nnn that without Nnn providing him and local Venezuelan colleagues the opportunity to participate in contract-based Bitcoin mining tasks, he would not have had a child. According to Nnn, Venezuela's electricity is free to the end user.
I did not dig into this fascinating anecdote in detail, but I definitely thought, "fuckin a, the future, man".
posted by mwhybark at 5:03 PM on January 21, 2018 [2 favorites]
Gee people, just kidding. I'm not throwing out any 1080's! We use them for spares, but it is weird to see them double in price while they sit on the shelf.
posted by Grumpy old geek at 5:07 PM on January 21, 2018
Maybe graphics cards are the real crypto currency.
posted by Going To Maine at 5:18 PM on January 21, 2018 [15 favorites]
The GPU I bought over three years ago (and it was about two years old at the time) for $120 is currently selling online for anywhere between $200 and $550.
And that is why I just bought a laptop with a slightly less good GPU (I only play Borderlands, my graphics needs aren't massive) instead of building another gaming rig. Or fixing the last one.
For all I know that old GPU still works, but I don't have the patience or spoons to figure out what the hell is wrong with my old gaming machine so I shoved it in the closet and got a laptop.
posted by elsietheeel at 5:27 PM on January 21, 2018 [2 favorites]
sfenders: "I feel like my GPU is probably much happier with its new life since it gave up mining and took up fiddling with neural networks. It pays only slightly less, and if Leelaz one day grows up to be the AI that takes over the world I'll be able to say I beat up on it when it was a baby."
Wow, that was a fascinating rabbit hole! So let me see if I understand this right. DeepMind made an AI program called AlphaGo Zero that plays Go better than anyone (or anything else). Leela Zero is a free, open source AI program based on AlphaGo Zero. But it isn't as good as the original because it doesn't have the network weights. Calculating those weights on your average home computer would take about 1700 years, so now there's a whole community splitting up the work by running Leela Zero on their home computers, and combining their results so that someday Leela will be as good as DeepMind's program. Is that about it?
If so, wow once again! We really are living in the future. And that's definitely a better use for your GPU than mining.
posted by Kevin Street at 6:00 PM on January 21, 2018 [5 favorites]
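The 1700-year figure above is a single-machine estimate; since self-play games are independent, the distributed effort's wall-clock time scales down roughly with the number of contributing GPUs. A toy calculation, assuming that ~1700 GPU-year figure from the comment (the contributor counts are made-up round numbers):

```python
# Toy estimate of distributed self-play, assuming the ~1700 single-GPU-years figure
# quoted in the comment above. Contributor counts are illustrative round numbers.
total_gpu_years = 1700

for contributors in (100, 1_000, 10_000):
    wall_clock_years = total_gpu_years / contributors   # games are embarrassingly parallel
    print(f"{contributors:>6} GPUs -> ~{wall_clock_years * 12:.1f} months of wall clock")
```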
> Grumpy old geek:
"Gee people, just kidding. I'm not throwing out any 1080's! We use them for spares, but it is weird to see them double in price while they sit on the shelf."
Although I would indeed smile if I got a free update to a 1080, did my response come off somehow as completely serious?
posted by Samizdata at 6:02 PM on January 21, 2018
"Gee people, just kidding. I'm not throwing out any 1080's! We use them for spares, but it is weird to see them double in price while they sit on the shelf."
Although I would indeed smile if I got a free update to a 1080, did my response come off somehow as completely serious?
posted by Samizdata at 6:02 PM on January 21, 2018
> Kevin Street:
"sfenders: "I feel like my GPU is probably much happier with its new life since it gave up mining and took up fiddling with neural networks. It pays only slightly less, and if Leelaz one day grows up to be the AI that takes over the world I'll be able to say I beat up on it when it was a baby."
Wow, that was a fascinating rabbit hole! So let me see if I understand this right. DeepMind made an AI program called AlphaGo Zero that plays Go better than anyone (or anything else). Leela Zero is a free, open source AI program based on AlphaGo Zero. But it isn't as good as the original because it doesn't have the network weights. Calculating those weights on your average home computer would take about 1700 years, so now there's a whole community splitting up the work by running Leela Zero on their home computers, and combining their results so that someday Leela will be as good as DeepMind's program. Is that about it?
If so, wow! We really are living in the future. And that's definitely a better use for your GPU than mining."
My "file server" machine does World Community Grid with some of it's GPU horsepower (crappy built-in Radeon, but I run it headless, so who cares?).
posted by Samizdata at 6:04 PM on January 21, 2018
"sfenders: "I feel like my GPU is probably much happier with its new life since it gave up mining and took up fiddling with neural networks. It pays only slightly less, and if Leelaz one day grows up to be the AI that takes over the world I'll be able to say I beat up on it when it was a baby."
Wow, that was a fascinating rabbit hole! So let me see if I understand this right. DeepMind made an AI program called AlphaGo Zero that plays Go better than anyone (or anything else). Leela Zero is a free, open source AI program based on AlphaGo Zero. But it isn't as good as the orginal because it doesn't have the network weights. Calculating those weights on your average home computer would take about 1700 years, so now there's a whole community splitting up the work by running Leela Zero on their home computers, and combining their results so that someday Leela will be as good as DeepMind's program. Is that about it?
If so, wow! We really are living in the future. And that's definitely a better use for your GPU than mining."
My "file server" machine does World Community Grid with some of it's GPU horsepower (crappy built-in Radeon, but I run it headless, so who cares?).
posted by Samizdata at 6:04 PM on January 21, 2018
Maybe graphics cards are the real crypto currency.
I think the real cryptocurrency is the friends we made along the way.
posted by TheWhiteSkull at 6:06 PM on January 21, 2018 [37 favorites]
"My "file server" machine does World Community Grid with some of it's GPU horsepower (crappy built-in Radeon, but I run it headless, so who cares?)."
Also very awesome! Never heard of that one before, but after a quick search it sounds like a noble use for your spare silicon and electricity.
posted by Kevin Street at 6:09 PM on January 21, 2018
Seconding kaibatsu; this has a real effect on machine learning work too. Desktop GPUs are amazingly powerful, I've got my 1080 helping to reproduce AlphaGo Zero now. And I'm slowly working my way up to knowing enough about TensorFlow to run my own code on it.
Supply is clearly limited. I wonder what Nvidia's production pipeline looks like and whether they're not expanding because they expect this to just be a bubble. OTOH they have to be selling a lot of Tesla machine learning cards to big datacenter companies, too. And they just implemented a new shitty policy about that, forbidding datacenters from buying high end Nvidia cards and then leasing time on them.
posted by Nelson at 6:10 PM on January 21, 2018 [1 favorite]
Supply is clearly limited. I wonder what Nvidia's production pipeline looks like and whether they're not expanding because they expect this to just be a bubble.
There's no spare capacity. There's nowhere near enough wafer starts on leading nodes and wait times are measured in months. Any increases in production are coming from improved yields at this point.
posted by Talez at 6:27 PM on January 21, 2018 [2 favorites]
None of my favorite games in recent memory have been great because they were visually stunning.
i mean. if dragon age inquisition had the same graphics quality that origins had, despite the great gameplay and storyline i would have physically fought everyone at bioware
twice
posted by poffin boffin at 6:30 PM on January 21, 2018 [5 favorites]
so that someday Leela will be as good as DeepMind's program
Yes, I am hoping it does get there, although there's a long way to go. If it gets that far, it should in principle be able to become even stronger than AlphaGo Zero, since DeepMind has reportedly stopped developing and training their program without reaching its ultimate limits. They say it's "retired" now, with its very short undefeated record. They may feel that the game of Go has given them all it can for their goals in machine learning, but they stopped well short of what they could have done for researching and understanding the game of Go.
posted by sfenders at 6:34 PM on January 21, 2018 [1 favorite]
Perhaps Leela Zero deserves its own FPP. I've really enjoyed participating in the past few weeks, it's been a good excuse for me to learn more about how neural networks work in practice. Self-training AI is a really appealing thing to work on right now.
posted by Nelson at 6:37 PM on January 21, 2018 [1 favorite]
> Kevin Street:
""My "file server" machine does World Community Grid with some of it's GPU horsepower (crappy built-in Radeon, but I run it headless, so who cares?)."
Also very awesome! Never heard of that one before, but after a quick search it sounds like a noble use for your spare silicon and electricity,"
It also runs on my Kindle Fire/smart alarm clock. I had two other machines that would run it, but I need to resolve some issues. There is a Metafilter team on there too! The more the merrier, you know!
posted by Samizdata at 6:46 PM on January 21, 2018
""My "file server" machine does World Community Grid with some of it's GPU horsepower (crappy built-in Radeon, but I run it headless, so who cares?)."
Also very awesome! Never heard of that one before, but after a quick search it sounds like a noble use for your spare silicon and electricity,"
It also runs on my Kindle Fire/smart alarm clock. I had two other machines that would run it, but I need to resolve some issues. There is a Metafilter team on there too! The more the merrier, you know!
posted by Samizdata at 6:46 PM on January 21, 2018
→ Consider the Rock64. Sub $50 for 4 64 bit 1.2 GHz cores 1 gig Ethernet and 4 gigs of DRAM. All those numbers are better than the Pi
Old kernel, limited vendor support and no community tho. Remember the Via APC!
posted by scruss at 7:11 PM on January 21, 2018 [1 favorite]
Yeah self driving cars seem like they are going to be very computationally intensive. The Velodyne-128 LIDAR will reportedly fire 6-7 million laser pulses per second, using the latency information to accurately map objects in a 360 degree field of view up to 300m away. Then you need to combine that data with cameras and radars. The NVIDIA Drive PX2, meant to be the self driving brain, is reportedly a 320 teraflop monster.
In contrast, a $1000 GTX 1080 today hits about 9 teraflops. But I guess cost is not a big issue, as they're looking to replace 5-10 years worth of wages (truck driver / taxi driver) with the cost of the sensor package and computer.
posted by xdvesper at 8:54 PM on January 21, 2018 [1 favorite]
the 12 TB hard drive for $450 is due to the surveillance state and desire for corps to better datamine citizens.
I heard John McAffee was shilling BURST, a cryptocurrency which secures its blockchain with proofs of storage capacity rather than proofs of computation. So maybe he bought them all. :)
posted by Coventry at 9:24 PM on January 21, 2018
The Velodyne-128 LIDAR will reportedly fire 6-7 million laser pulses per second
This sounds like a lot, but keep in mind VGA video at 640x480 at 30fps is over 9 million pixels per second.
posted by Pyry at 10:53 PM on January 21, 2018 [4 favorites]
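For scale: 640 × 480 pixels × 30 frames per second works out to 9,216,000 samples per second, so the lidar's reported 6-7 million returns per second is indeed on the order of low-resolution video.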
I don't actually know how much computation you really need for a self-driving car, but:
1) inference (i.e. using a trained neural net) is somewhat cheaper than training. you don't have to compute gradients at least and there's simply fewer examples to deal with. although for something with a lot of parameters that still has to run fast you still probably want a GPU.
2) given a trained net there's a fair amount of research into shrinking them or otherwise making them cheaper to deal with, for purposes of inference. use the big net to train a smaller net ("distillation"), make it work with integer instead of floating-point arithmetic, etc.
and indeed
The NVIDIA Drive PX2, meant to be the self driving brain, is reportedly a 320 teraflop monster.
wikipedia reports 320 "INT8 TOPS", so 320 trillion operations on 8-bit integers per second, which are much cheaper than floating point operations. possibly the plan is people are going to be using integer arithmetic for inference (active area of research), which would be faster and better for power consumption.
wikipedia mentions two future discrete GPUs, so the whole thing is probably gonna have 30-ish teraflops. which is enormous but thankfully not like a small datacenter in your trunk.
posted by vogon_poet at 11:19 PM on January 21, 2018 [1 favorite]
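As a rough illustration of the int8 idea vogon_poet mentions, here is a toy symmetric per-tensor quantizer (a sketch only; real inference frameworks handle calibration, zero points and per-channel scales far more carefully):

import numpy as np

def quantize_int8(x: np.ndarray):
    # Map the largest absolute value in the tensor to 127, everything else proportionally.
    max_abs = float(np.abs(x).max())
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

weights = np.random.randn(4, 4).astype(np.float32)
q, scale = quantize_int8(weights)
print("worst-case reconstruction error:", float(np.abs(weights - dequantize(q, scale)).max()))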
/r/Iota:
Lately we’ve been getting a lot of reports regarding people losing/missing their IOTA and people having outgoing transactions not done by them. These seeds were generated online using unofficial websites. IOTA has no official website to generate a seed. (discussion: /r/buttcoin)
‘This is crypto--YOU are the bank. It is 100% your fault.’
posted by sebastienbailard at 3:16 AM on January 22, 2018
‘Don’t forget making it with dice. The fully trust-less solution. Make a 9x9 matrix on paper and roll dice for the values in each cell.’
Based on this twitter post it looks like /r/iota actually had some of these online seed generators in the subreddit sidebar. Expand the initial post and look at the follow up comments.
> Many users, at the suggestion of IOTA devs, used online seed generators (some were linked in the sub sidebar). However, some users had the misfortune of using the wrong online seed generators, and were burned.
What if a Skyrim mod that when you mine the gpu starts mining crypto currency.
Greeeat. Now I want to see an Antminer ASIC rendered in Minecraft redstone.
More seriously, background browser mining and hijacking is rapidly becoming a huge thing. The Pirate Bay apparently now runs mining code in your browser as standard ToS, and browser exploits have been reported in the wild.
Watch for it to evolve to replace advertisements and even micropayments on porn sites and other content sites.
Someone smart and greasy has an opportunity right now (and, oh, as of about a year or two ago) to build a services/micropayment system that trades CPU time from customers/clients in exchange for, say, credits for porn, gaming, gambling or even gift cards from providers.
Instead of BonziBuddy, consider CryptoBuddy and leveraging the buzz and hype around crypto. Make it cross platform and make apps for desktops and mobiles, make browser plugins and widgets and third party API extensions. App devs could include code in their app to monetize the app instead of ads, and users would recognize the ad free experience and user points and rewards for using the app with the embedded crypto mining widget.
Succeed at this and pretty soon not only do you have one of the world's largest single crypto mining pools - but you may also have one of the world's largest (if slowest) distributed supercomputers available for hire as well as a truly massive marketing, tracking and consumer data mining operation all in one system.
It would/could be incredibly evil yet totally legal, and people would probably sign up for it in droves if leaving their computer on every night meant they got "free" pron and Netflix and whatever in exchange for the hidden cost on their electricity bill.
It'd probably be way more appealing to users who take those endless surveys and do microtasks for rewards and points, and you'd still get oodles of consumer data to sell off to marketers in addition to all that filthy crypto. They passively get their free shit, you get the crypto and marketing data, and even if scammers/bots try to game your platform, they're still mining for you anyway.
See, aren't you all glad I'm a slacker who didn't get into marketing?
posted by loquacious at 3:20 AM on January 22, 2018 [6 favorites]
To cut down on the energy (over-)consumption, what about charging egregious amounts for fossil fuel usage well above what the median household uses, after adjusting for HH size?
I think we need a hefty carbon tax on cryptocoin transactions.
posted by sebastienbailard at 4:05 AM on January 22, 2018 [1 favorite]
You could call it CryptoBook, loquacious
posted by Annika Cicada at 4:30 AM on January 22, 2018
One gamer's environmental impact is not the same as one Ethereum miner's environmental impact just because they're both using the same graphics card.
Also: there are only so many hours in a day in which a gamer pushes their video card hard. Miners are going to run several 24/7.
For a while the trend with graphics cards was to make them more energy efficient and cooler-running. That trend started to reverse at about the same time Nvidia CUDA came along... and at about the same time, they started producing GPUs for laptops, tablets, etc. which are cooler and more efficient. Those would be pretty much just fine running modern games on a desktop, but not computing blockchain crap.
posted by Foosnark at 4:32 AM on January 22, 2018
Miners are going to be as easy to find on the power grid as growops, it seems.
posted by bonehead at 5:33 AM on January 22, 2018 [2 favorites]
I'm surprised no one has built an app to run ethereum on a PS4 or Xbone.
posted by musofire at 5:36 AM on January 22, 2018
I'm surprised no one has built an app to run ethereum on a PS4 or Xbone.
Can't be bothered, too busy playing Horizon Zero Dawn.
posted by Fizz at 5:39 AM on January 22, 2018 [5 favorites]
Sigh. I was tossing around the idea of a new build just before Christmas, carefully planned out my parts, then watched GPU prices go through the roof. My boyfriend bought me some beautiful RAM for Christmas, but with prices the way they are, I won't be able to build.
posted by rachaelfaith at 5:45 AM on January 22, 2018 [1 favorite]
It'll be interesting to see if this GPU inflation leads to increased console purchases. I don't NEED to upgrade at the moment but with these prices, for the foreseeable future that will not be happening. I'm definitely going to hold off until prices settle and become more reasonable.
In the meantime, I have my Switch to keep me occupied and happy.
posted by Fizz at 5:49 AM on January 22, 2018
I wonder if Dutch gardeners were this inconvenienced when they had to switch to chrysanthemums for a few years.
posted by Mayor West at 5:57 AM on January 22, 2018 [9 favorites]
Do the manufacturers get any of this demand pricing in return?
Other than keeping up with demand at MSRP, if they don't get any portion of this increased price due to demand, what incentive do they have to produce more cards?
posted by filtergik at 6:22 AM on January 22, 2018
Do the manufacturers get any of this demand pricing in return?
Nvidia? No. GPU board makers? If you're unsavory.
Other than keeping up with demand at MSRP, if they don't get any portion of this increased price due to demand, what incentive do they have to produce more cards?
It's not like these cards are unprofitable at MSRP. You're literally asking if graphics card OEMs don't like to make more money by increasing production. I mean I guess they could refuse to increase production out of spite but I'm pretty sure they will take every GPU they can lay their hands on.
posted by Talez at 6:46 AM on January 22, 2018
DDR4 prices have also taken a sharp swing upwards. This is the only time I can recall where building a comparable PC is more expensive than in the previous year.
posted by thecjm at 7:28 AM on January 22, 2018
Yeah, I eyeballed putting another 8 gig in my (old, 2012) DDR3 Linux box and it would've been a $160-ish upgrade to go from 8g to 16g which felt foolish to put into a 6 year old build, so I looked at current builds and was so shocked I ended up building a _2009_ vintage Xeon Server because datacenter castoffs are cheap by comparison.
(performance is great, just shy of realtime h265 transcoding, memory accesses are shockingly fast, that 8mb cache is deeeeeeep. It's a great little KVM platform and it was only $160 out the door for cpu/board/16g ram.)
posted by Kyol at 7:54 AM on January 22, 2018
I'm in the mostly indie game category, but I'd been thinking of upgrading my GPU when my tax return got in. Guess that isn't happening...
What really bothers me is that Bitcoin doesn't seem to be actually used, and cashing out seems really difficult. So why is everyone still after it so much?
No one is spending it to speak of, and for good reason. With its deflationary nature you'd have to be stupid to spend Bitcoin on pizza or rent or whatever when sitting on it means it keeps going up and up and up in value.
But actually cashing out and getting that theoretical value in dollars seems to be quite difficult, the value is much more theoretical than actual. A bit like rare comic books. Sure, in theory your mint condition copy of X-Men issue #1 is worth millions, good luck finding someone who will actually pay that amount for it though. Same goes with bitcoin, those guys with lots of BTC who want to cash out millions of dollars aren't really finding buyers.
posted by sotonohito at 8:19 AM on January 22, 2018
If you had told me in 2016 when I bought a 1060 that I'd be saving a huge amount of money and hassle by purchasing one of the first cards to come out at launch... I'm not sure what I would have thought.
posted by selfnoise at 9:02 AM on January 22, 2018 [1 favorite]
Between cryptocurrencies and self-driving cars... I'm not all that enthused about the future.
posted by Artful Codger at 9:06 AM on January 22, 2018 [2 favorites]
Between cryptocurrencies and self-driving cars... I'm not all that enthused about our future:
I refer you to this comment about the future.
“Unless you're over 60, you weren't promised flying cars. You were promised an oppressive cyberpunk dystopia. Here you go.”
posted by Fizz at 9:46 AM on January 22, 2018 [10 favorites]
Well, at least it's a good thing all these crypto miners are producing something of VALUE.
Between cryptocurrencies and self-driving cars... I'm not all that enthused about the future.
You think you're in trouble? I read a couple of the articles, and I still don't UNDERSTAND the connection between cryptocurrency and graphics cards.
(I didn't research much, but another PC Gamer article and this Street article provided more than enough info. I get it now. What a fucking colossal waste of time, energy, and resources.)
posted by mrgrimm at 10:05 AM on January 22, 2018 [1 favorite]
Could a GPU be designed to be a rockstar at shaders and vertexes but absolutely miserable at mining?
posted by crysflame at 10:08 AM on January 22, 2018
Yeah, it's called an AMD GPU. (OK, cheap shot, it's no longer true. But NVidia had a head start on general purpose GPU computing.)
posted by Nelson at 10:11 AM on January 22, 2018
Miners are going to be as easy to find on the power grid as growops, it seems.
I have friends who work for electric utilities, and one of the real challenges is how to deal with local electric distribution impacts of cryptocurrencies. On the one hand, these mining operations are in some places creating real problems for the local circuit, and maintaining reliable grid operations requires infrastructure upgrades. On the other hand, no one really expects this kind of activity to continue permanently, so making investments in infrastructure to support them might not be prudent. It's a tough position to be in.
posted by nickmark at 10:23 AM on January 22, 2018 [2 favorites]
> cheap shot
Heh, I'll restate. Could an otherwise-useful GPGPU be designed to be cycle-inefficient or power-inefficient at Bitcoin mining *while* remaining GPGPU-valuable, so that it's more profitable to use custom ASICs again?
posted by crysflame at 10:23 AM on January 22, 2018
You think you're in trouble? I read a couple of the articles, and I still don't UNDERSTAND the connection between cryptocurrency and graphics cards.
OK. So in BTC you're basically hashing random shit together to try and get a hash that starts with a certain number of zeros (this is the difficulty factor). This is ridiculously easy to make an ASIC out of.
Ethereum, the second largest cryptocoin by market cap, decided to make it resistant to ASIC mining (power to the people etc etc) and made you find a hash using data in a one gigabyte dataset. So you need both a gigabyte of scratch space to hold the dataset and you need enough memory bandwidth to transport random pieces of the dataset to whatever processor is doing the hashing. Turns out GPUs are great at this given that they have large enough caches (VRAM), massive parallelism (dozens, hundreds, thousands of ALUs), and memory bandwidth out the wazoo (hundreds of GB/sec).
The second most popular cryptocoin is in a gold rush period and is most efficiently mined by a GPU. Hmmmm... I wonder what happens to GPU demand.
posted by Talez at 10:23 AM on January 22, 2018 [2 favorites]
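A toy version of that "hash until it starts with enough zeros" loop, just to make the shape of the work concrete (purely illustrative; real Bitcoin double-SHA-256s an 80-byte block header against a far harder target):

import hashlib

def mine(header: bytes, difficulty_zeros: int = 4):
    # Try nonces until the hash of (header + nonce) starts with the required zeros.
    target = "0" * difficulty_zeros
    nonce = 0
    while True:
        digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine(b"toy block header")
print(nonce, digest)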
Heh, I'll restate. Could an otherwise-useful GPGPU be designed to be cycle-inefficient or power-inefficient at Bitcoin mining *while* remaining GPGPU-valuable, so that it's more profitable to use custom ASICs again?
Not with the way Ethash is designed. The need for a gigabyte of fast as hell memory is the killer to stopping cheap custom ASIC solutions.
posted by Talez at 10:24 AM on January 22, 2018
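And a very loose sketch of why Ethash-style hashing wants a big, fast dataset: every attempt does a series of pseudo-random reads into it, so memory bandwidth rather than raw hashing speed becomes the bottleneck (the real DAG is gigabytes and regenerated every epoch; the sizes and mixing here are made up):

import hashlib
import random

DAG_WORDS = 100_000                     # stand-in for a multi-gigabyte dataset
dag = [random.getrandbits(64) for _ in range(DAG_WORDS)]

def ethash_like(nonce: int, rounds: int = 64) -> str:
    lookups = random.Random(nonce)      # deterministic per-nonce lookup sequence
    mix = 0
    for _ in range(rounds):
        mix ^= dag[lookups.randrange(DAG_WORDS)]   # the memory-bound part
    return hashlib.sha256(mix.to_bytes(8, "big")).hexdigest()

print(ethash_like(42)[:16])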
Also I have this weird feeling that quantum computing will crush all cryptocoins to dust instantly, because once you can say "evaluate all solutions and show me any that work", you can bulk-mine the entire hashspace for the next-block in a single search operation.
posted by crysflame at 10:25 AM on January 22, 2018 [3 favorites]
Well, at least it's a good thing all these crypto miners are producing something of VALUE.
Are they?
My understanding of capitalism and the free market is that we let people make money in almost any fashion because we believe it incentivizes innovation and efficiency through competition, and because it is efficient at moving capital around for use in investment.
Ok, fine... but with crypto-currencies, we now have something that's completely artificial, no tie to a real product or service, and at present not even useful as a currency, and it has the ability to make someone money just by buying and selling it at the right time. And cryptocurrencies have produced some astounding gains for a lucky few.
So, if the crypto bubble doesn't burst soon, and it continues to produce real or imagined big gains... then serious money, institutional money will be moved into them. (or into launching crypto thingies of their own). Which means that there will be less money invested into ... actually making stuff, or doing stuff.
I naively believe that the economy must serve society, not vice versa. If the bulk of our economic activity is the buying and selling of a fiction... it's a casino.
posted by Artful Codger at 10:42 AM on January 22, 2018 [4 favorites]
I understand what it is the Bitcoin and the like are attempting to do but I just think the whole thing is stupid.
U.S. dollars are a fiat currency. In short, dollars are only worth anything because we all agree that they're worth something. Even currency backed by gold still counts on everyone at least agreeing that gold has value as it's not some universal rule that shiny metal has value.
Fiat currency in the U.S. mostly works because the government can levy taxes and has a big military to back it up.
Bitcoin is also a fiat currency but without the ability to tax or enforce much of anything. Not only that it takes inputs that cost real dollars to produce. In effect, you're taking one fiat currency, U.S. Dollars and turning them into Bitcoin and heat.
It just seems so inefficient and dumb. I just don't see how the BTC market, or any cryptocurrency market for that matter, is anything but zero sum. There is some arbitrage with electricity prices, but generally, increased demand for electricity to mine BTC drives up those prices along with the GPUs needed to create them, and that should, over time, compete away the profitability of mining at current prices. And then everyone who wants to convert their BTC into something useful has to find someone to sell their BTC to. If and when there is no one left to buy BTC, you'll have just bought a bunch of electricity and computer hardware and will have generated a bunch of extra heat.
There isn't a stabilizing force for BTC like there is for dollars. No Federal government to step in and keep the currency from totally collapsing. Eventually people are going to stop mining BTC as a way to make money and a bunch of people are going to be left holding the bag. The current glut of stories about people finding the keys to their BTC wallets so they can cash out indicates that lots of people are selling; eventually they're going to run out of buyers and things are going to crash hard and fast, but no one is going to be able to predict when it will happen.
If enough people are convinced that your fiat currency has no value, they make it true.
posted by VTX at 10:51 AM on January 22, 2018 [2 favorites]
There are three other open Bitcoin threads where we already have people pontificating on The Meaning of Money. Could you talk about in one of those?
posted by Nelson at 10:53 AM on January 22, 2018 [1 favorite]
I just had to check the site where I bought my card, and I'm really surprised; I took these screen caps. After nine months, it's $230 more expensive. But what was most surprising was today's advertised price: $749.00 $649.00. Usually they show the older price to show how much money you save if you buy now.
There's a certain bittersweet melancholy when you buy high performance hardware, because you know that in a couple of years your premium quality gear will be in the bargain bin while you're buying the latest expensive thing. I've never expected my tech to appreciate! And that's not a good thing at all. If I spend $230 more than I did last year I'd expect some improvement!
posted by adept256 at 11:32 AM on January 22, 2018 [2 favorites]
Man, yeah, it's really weird looking at it in personal detail. I bought a GTX 970 3+ years ago, and it was beefy but not bleeding-edge at the time. Paid a little over $300, had been the big big upgrade that was many years coming for me in a machine I'd bought in the late 2000s.
Just checked the same product listing and it's going for $400+ now.
posted by cortex at 11:56 AM on January 22, 2018 [1 favorite]
Since we're talking about ASICs vs GPUs, this article on a non-ASIC-resistant coin being de facto taken over by a giant coin-mining operation in China seems relevant.
ASICs -- custom chipsets designed to be highly optimised for a specific purpose (e.g., the particular mining algorithm used by a particular cryptocurrency) -- have a somewhat fixed start-up cost which makes them largely inaccessible to small-scale miners but their speed makes them very desirable for large-scale miners that can afford them. So it's easy for a big mining operation to buy a ton of ASICs and gain complete control over a small coin's blockchain. It's presumably possible for a big enough operation to even get control over BTC itself this way, though BTC's status as the most popular cryptocurrency makes this unlikely to actually happen (while also creating a huge incentive to try, of course).
ETH is specifically trying to avoid this problem to keep control of their network more distributed, so it's very likely that they'd adjust their algorithm again if somebody ever managed to create an ASIC that computes it more efficiently than a GPU.
To be quite honest, this seems like a huge problem with the whole concept of blockchain consensus, albeit one that was inevitable given cryptocurrency's origin among techy libertarians. In blockchains as in libertopia, the rich get richer and crowd out everyone else.
posted by tobascodagama at 12:03 PM on January 22, 2018 [2 favorites]
For a lot of those folks, if you asked them to decide which was more important, Video Games or Cryptocurrency, their heads would explode.
posted by Mr.Encyclopedia at 12:04 PM on January 22, 2018 [3 favorites]
Looks like there is a big arbitrage opportunity in older graphics cards. Prices I've seen vary from $250 to $850 for the GTX970.
posted by grumpybear69 at 12:04 PM on January 22, 2018 [2 favorites]
To be quite honest, this seems like a huge problem with the whole concept of blockchain consensus, albeit one that was inevitable given cryptocurrency's origin among techy libertarians. In blockchains as in libertopia, the rich get richer and crowd out everyone else.
If you think that's bad, some coins use proof of stake rather than proof of work, meaning that if there's a dispute between two parties about what the contents of the blockchain should be, the party with more coins wins!
posted by Pope Guilty at 12:10 PM on January 22, 2018 [2 favorites]
CS Engineers: "C'mon Nintendo, you're never going to be able to devise a security system that will defeat hundreds of nerds determined to pirate a few video games."
Also CS Engineers: "We devised a system for making secure, anonymous currency transactions that will defeat millions of people looking for ways to scam billions of dollars."
posted by straight at 1:32 PM on January 22, 2018 [15 favorites]
Could an otherwise-useful GPGPU be designed to be cycle-inefficient or power-inefficient at Bitcoin mining *while* remaining GPGPU-valuable, so that it's more profitable to use custom ASICs again
No. Because the answer to the question "Can I get $xxxxx dollars to fund R&D whose sole purpose is to force half of our customer-base to other suppliers?" is always no.
posted by rtimmel at 1:38 PM on January 22, 2018 [4 favorites]
"ETH is specifically trying to avoid this problem to keep control of their network more distributed, so it's very likely that they'd adjust their algorithm again if somebody ever managed to create an ASIC that computes it more efficiently than a GPU."
Ethereum wants a network that can't be dominated by a couple of industrial scale mining operations, so they use an algorithm that can be mined by ordinary people with high-end graphics cards, and that puts pressure on the global market for GPUs and makes prices (in real dollars) rise. Quite an interconnected world we live in.
posted by Kevin Street at 1:41 PM on January 22, 2018 [1 favorite]
Globalisation is when someone asks you "What's that got to do with the price of tea in China?" and if you are sufficiently well-informed you can take a deep breath and reply, "Well, you see, it's like this..."
posted by tobascodagama at 1:50 PM on January 22, 2018 [3 favorites]
If you think that's bad, some coins use proof of stake rather than proof of work, meaning that if there's a dispute between two parties about what the contents of the blockchain should be, the party with more coins wins!
PoS usually means that the next "miner" with the right to validate the transactions in the most recent interval is selected by a random draw, weighted by the number of coins they've committed to the validation process. So in a dispute, the only way the scenario you outline could happen is if one party to it has staked a huge fraction of their coins and wants to keep the transaction from ever being recorded in the blockchain. If any one entity has that much control over the validation process, the whole notion of distributed consensus has broken down.
posted by Coventry at 2:19 PM on January 22, 2018
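A quick sketch of that stake-weighted draw (a toy model only; real proof-of-stake protocols layer on randomness beacons, slashing and so on): the odds of validating the next block are simply proportional to coins staked.

import random
from collections import Counter

stakes = {"alice": 600, "bob": 300, "carol": 100}   # coins committed to validation

def pick_validator(stakes: dict) -> str:
    names = list(stakes)
    return random.choices(names, weights=[stakes[n] for n in names], k=1)[0]

# Over many rounds, selection frequency tracks stake: roughly 60/30/10 percent.
print(Counter(pick_validator(stakes) for _ in range(10_000)))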
I bought a premade system with a 1070 over the summer for about a grand, and I could claw more than half of it back if I sold just the card. Huh. I like having a nice system though, so I'll hold onto it.
posted by codacorolla at 2:58 PM on January 22, 2018
Built a computer last year with a gtx 1070 and spent about $330 for it. This afternoon the 1070s on Newegg were going for about $1000. Jeez.
posted by snwod at 4:26 PM on January 22, 2018
Got me a GTX 1080ti a month or two back. I would not even consider polluting this beautiful piece of technology by mining cryptocurrency with it.
I should clearly have just bought a ton of GPUs, never opened the boxes, and sold them on ebay. It's easier to get rich selling the panning equipment than by panning for gold.
posted by Justinian at 5:51 PM on January 22, 2018 [4 favorites]
Anyone know why Ethereum difficulty fell off a cliff in mid October?
posted by Coventry at 6:29 PM on January 22, 2018
Anyone know why Ethereum difficulty fell off a cliff in mid October?
The Byzantium fork.
Back in 2015 Ethereum put a difficulty bomb into the protocol in prep for a hard fork in the future once more of the protocol was fleshed out. This would ensure that anyone following the old protocol moves to the new fork since the difficulty of the old protocol would soon become far too difficult to mine effectively. Once they released the new fork the difficulty on that fork plummeted to ensure that the miners followed the fork.
posted by Talez at 6:55 PM on January 22, 2018 [4 favorites]
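For a sense of how the bomb forces the move, the exponential term added to the difficulty was roughly 2**((block_number // 100_000) - 2), doubling every 100,000 blocks (simplified; the full adjustment rule has more parts, and Byzantium also pushed the bomb back by about three million blocks):

def bomb_term(block_number: int) -> int:
    # Exponential "ice age" component: negligible early on, then doubling every 100k blocks.
    exponent = block_number // 100_000 - 2
    return 2 ** exponent if exponent >= 0 else 0

for block in (1_000_000, 3_000_000, 4_000_000, 4_370_000):
    print(block, bomb_term(block))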
I feel like it's been remarked, but cryptocurrency is the paperclip maximizer.
And you don't even get all the nice paperclips.
posted by aspersioncast at 5:31 AM on January 23, 2018 [7 favorites]
I should clearly have just bought a ton of GPUs, never opened the boxes, and sold them on ebay. It's easier to get rich selling the panning equipment than by panning for gold.
Loquacious beat you to the punch, but yes.
posted by aspersioncast at 7:16 AM on January 23, 2018
I just bought a basic gaming PC from HP for my son with a GTX 1060 in it for a little under $700. I got a little jealous and decided that I should upgrade the card in my aging rig, and quickly discovered that the same card by itself is roughly $500. Crazy.
posted by monju_bosatsu at 11:15 AM on January 23, 2018
Set up a nowinstock.net alert and keep trying. Restocks happen all the time just gotta be quick.
posted by Talez at 11:32 AM on January 23, 2018
There are a lot of 1080Ti products listed there. Are there any characteristics of a 1080Ti which make it better or worse for machine learning vs mining vs gaming?
I've been meaning to set a multi-card box up since the summer, but I'm afraid I made the best the enemy of the good. Probably a bad time to be mining, anyway.
posted by Coventry at 12:12 PM on January 23, 2018
I don't know if there's any special sauce in the 1080Ti, but AFAIK it is in terms of raw horsepower the most powerful card on the market right now.
posted by tobascodagama at 12:21 PM on January 23, 2018 [1 favorite]
It's the best bang for buck in terms of machine learning. I'm asking more about the packaging around the card. E.g., would I care if I got an ASUS AC-ORIGINS-ROG-STRIX-GTX1080TI as opposed to an EVGA 11G-P4-6390-KR?
posted by Coventry at 12:31 PM on January 23, 2018 [1 favorite]
Read somewhere a while back that the rise (no pun intended) of online porn accounts for a significant amount of processor needs nowadays, when you factor in people who consume it, how it's served and so forth. And was wondering if the adult entertainment industry would ever get into the whole cryptocurrency thing, for this or other reasons.
Some searches - from which I now have to clear out my browser history - turned up this, which is apparently real.
I have no words, and do not understand the modern world.
posted by Wordshore at 12:33 PM on January 23, 2018
There's also Pinkcoin. They basically want to be a dark-net market for the prostitution industry, which is technically, ethically and legally problematic, but is actually a thing.
posted by Coventry at 12:40 PM on January 23, 2018
E.g., would I care if I got an ASUS AC-ORIGINS-ROG-STRIX-GTX1080TI as opposed to an EVGA 11G-P4-6390-KR?
Ah, I see. For gaming applications, the specific features of the various SKUs are... I won't say important, but perhaps relevant is a good fit. You're talking about things like the number and type of video outputs on the back, mainly. Some will have blinkenlights for folks who like those. But for headless applications like machine-learning and cryptomining, there's not a whole lot of difference between them.
posted by tobascodagama at 1:19 PM on January 23, 2018 [1 favorite]
How about TV?
TV was covered by Marshall McLuhan back in the day, and the theta brain-wave pattern that old 60 Hz TVs induced was something research was done on.
No idea what today's faster refresh rates and higher visual fidelity will do, yet the people who may actually understand what their own products can do make different choices for their own families.
If one thinks advertising works well enough to justify college degrees in what is now called "Public Relations," or that viral things posted online are able to influence elections, then what you actually act out in a video game should also be able to influence humans over the longer term.
Reinforcing brain wiring via repetition is part of the education system because it works.
posted by rough ashlar at 1:37 PM on January 23, 2018
What really bothers me is that Bitcoin doesn't seem to be actually used, and cashing out seems really difficult. So why is everyone still after it so much?
1) The effort needed to "become rich" comes down to: do you have the money for electric power, an internet connection, and computer hardware?
2) Stories exist of people who were able to get in at the $5 or $10 price point, and an almost $20K price point was visible. The 2014 story below notes $30 a bitcoin.
3) The methods shown in stock-trading books and classes seem to work on bitcoin, so a straight technical trade looks workable.
4) Returns in other parts of the markets are nothing like those in this flashy 'coin thing.
5) It is an inexpensive and unobtrusive possible way to store value. Land, stock, gold/silver, and bank accounts are things of interest at a border, and elsewhere. A string of digits, not so much; unless you have some hokey artwork, then bitcoin in your bag is a thing. Of course, honey sends TSA agents to the hospital "overcome with chemical fumes" versus the realization that it's bee vomit.
6) Your wealth is able to be hidden.
and
7) The greater fool theory of investing says you aren't the fool and won't get suckered. 7a) says you are smarter than others and will be able to get out at the top.
The electrical power and the scarcity of hashing cards aren't going to slow interest in trying to take $30 and make it $19,000 or $11,000 in four years. As soon as one of the old, big wallets starts to cash out, rumors will start that someone has a working large-qubit machine or some other qubit method, and that will have lots of people rushing for the doors and crushing the "buy on the dips" people, given the very thin trading.
And if anyone figures out a way to convert TensorFlow-style distributed AI(esque) work into cash, the price of video cards won't come down for a long, long time.
posted by rough ashlar at 2:08 PM on January 23, 2018
Also I have this weird feeling that quantum computing will crush all cryptocoins to dust instantly, because once you can say "evaluate all solutions and show me any that work", you can bulk-mine the entire hashspace for the next-block in a single search operation.
I hate to be a pedant but in case anyone is reading this, quantum computing doesn't work this way! You simply can't do this with a quantum computer any more than with a regular computer. (I wrote this a while ago explaining the difference. But basically quantum computers can only work their magic on very specific and limited problems.)
(Well, being doubly pedantic, you could sort of improve things using Grover's algorithm but it wouldn't exactly be the magic people tend to expect from quantum algorithms, and might not be worth it.)
There definitely would be an extremely interesting effect of quantum computing on cryptocurrencies, but it wouldn't have to do with the hashing/mining.
Bitcoin et al currently use elliptic curve cryptography to secure transactions. Unfortunately, this happens to be one of the cryptographic schemes that a quantum computer can break, so if someone had a quantum computer they could essentially steal anyone's wallet.
Now, if quantum computers start to look like they're becoming real, it wouldn't be super super difficult to shift to a quantum-proof algorithm for signing transactions. People would just need to move their stuff over to new wallets.
However, lots of bitcoin -- say, the tons of bitcoin where people have lost their keys, or the stuff from the early days that's just hanging around -- probably wouldn't be shifted to the new scheme, and would be ripe for the taking.
posted by vogon_poet at 9:53 PM on January 23, 2018 [1 favorite]
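To put rough numbers on the Grover aside: a classical search over N candidate nonces takes on the order of N hash evaluations, while Grover's algorithm needs roughly sqrt(N) quantum evaluations. A back-of-envelope sketch, where the 2^70 difficulty figure is an assumption chosen purely for illustration:

    import math

    # Classical vs. Grover-style search for a proof-of-work nonce.
    classical_hashes = 2 ** 70                         # assumed expected classical work
    grover_iterations = math.isqrt(classical_hashes)   # ~sqrt(N) quantum evaluations

    print(f"classical: ~2^{math.log2(classical_hashes):.0f} hash evaluations")
    print(f"Grover:    ~2^{math.log2(grover_iterations):.0f} quantum hash evaluations")

    # The quadratic speedup only pays off if a quantum hash evaluation isn't
    # vastly slower than an ASIC hash, which, per the comment above, it very likely is.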
Anyway, there's a good chance that quantum computation won't work, and we'll learn some new physics instead.
posted by Coventry at 2:25 AM on January 24, 2018
Did y'all know that you can run algorithms on IBM's quantum computers right now? (Looks cold in there.)
posted by clawsoon at 5:27 AM on January 24, 2018
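For the curious, here is roughly the kind of thing you can run there: a two-qubit Bell-state circuit, written against the execute/Aer interface of IBM's Qiskit library as I remember it (the API has shifted between releases, so take the exact calls as approximate).

    from qiskit import QuantumCircuit, Aer, execute

    # Build a two-qubit Bell-state circuit.
    qc = QuantumCircuit(2, 2)
    qc.h(0)            # put qubit 0 into superposition
    qc.cx(0, 1)        # entangle qubit 1 with qubit 0
    qc.measure([0, 1], [0, 1])

    # Run on the local simulator; swap in an IBM Q backend to use real hardware.
    job = execute(qc, Aer.get_backend('qasm_simulator'), shots=1024)
    print(job.result().get_counts())   # expect roughly 50/50 '00' and '11'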
Hmmmm, what I'm hearing is to keep an ear out and I'll be able to pick up a REALLY fancy graphics card (Compared to my 270X) in a few months DIRT CHEAP? HECK YEAH!
posted by Canageek at 10:21 AM on January 24, 2018
I was about to say, how do I find out when prices are crashing, but then I realized it is all I'll hear about when they do crash.
posted by Canageek at 10:21 AM on January 24, 2018 [1 favorite]
you can run algorithms on IBM's quantum computers right now?
They might get it to work, but their error-correcting scheme looks infeasible for calculations complex enough to break contemporary cryptographic public keys. I think they're kind of cheating.
posted by Coventry at 11:26 AM on January 24, 2018 [1 favorite]
Hmmmm, what I'm hearing is to keep an ear out and I'll be able to pick up a REALLY fancy graphics card (Compared to my 270X) in a few months DIRT CHEAP? HECK YEAH!
I would not buy a card that has been mining for months or years. These cards have been running flat strap, modded to max voltage and speed to get the most performance possible out of them. The bearings of the fans are probably going, the transistors probably have advanced electromigration, and the heat sink grease is probably dust. A few months after you get that DIRT CHEAP card, you'll probably be disappointed.
posted by Talez at 5:00 PM on January 24, 2018 [2 favorites]
PonziCoin: The World's First Legitimate Ponzi Scheme
Q: Is this a real thing?
A: Yes, PonziCoins are an ERC20 standard token built on the ethereum blockchain that you can sell on this site.
Q: How does this work? Can I actually cash out? How does the price double?
A: When you buy PonziCoins, you are paying ether into the PonziCoin's Smart Contract balance, think of it like an ether bank account. Whenever you wish to cash out, the Smart Contract will pay you the ether you are owed out of its balance. Every time 100 tokens are sold, the Smart Contract will double the current price it allows people to buy coins from it and sell coins to it. [...]
Q: This seems to be too good to be true, how am I getting screwed?
A: It's a literal pyramid scheme. You are fairly likely to be one of the last people to buy PonziCoins, so another 100 tokens probably will not be sold and the price may not ever double again, in which case you could lose up to 75% of your investment. There's also a chance the contract runs out of money or gets hacked in which case you could lose all of your investment
That "Whitepaper Summary" should accompany anything cryptocurrency-related.
posted by tonycpsu at 11:50 AM on January 25, 2018 [3 favorites]
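The pricing rule in that FAQ is simple enough to model. Here's a toy Python sketch of the mechanic as described, not the actual Solidity contract; the class name, starting price, and the sell-back-at-a-quarter detail (implied by the "lose up to 75%" line) are assumptions for illustration.

    # Toy model of the PonziCoin pricing rule: the "contract" holds all ether
    # paid in and doubles its price every 100 tokens sold.

    class ToyPonzi:
        def __init__(self, initial_price_eth=0.001):
            self.price = initial_price_eth   # current buy price per token, in ether
            self.sold = 0                    # tokens sold so far
            self.balance = 0.0               # ether held by the contract

        def buy(self, tokens):
            for _ in range(tokens):
                self.balance += self.price
                self.sold += 1
                if self.sold % 100 == 0:     # every 100 tokens, the price doubles
                    self.price *= 2

        def cash_out(self, tokens):
            # Sell-back at roughly a quarter of the buy price, paid only from
            # whatever ether remains; late sellers may find the pot empty.
            payout = min(tokens * self.price / 4, self.balance)
            self.balance -= payout
            return payout

    p = ToyPonzi()
    p.buy(250)
    print(p.price, p.balance)   # price has doubled twice; the pot holds 0.5 ether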
I dropped a link to that in the other cryptocurrency thread. It might be a coincidence, but the author did his best to shut it down (it can't really be shut down) shortly after I pointed out it's an unregistered security in the r/cryptocurrency thread.
posted by Coventry at 1:27 PM on January 25, 2018
Apparently PonziCoin is a repeat from 2014. Same domain, at least.
posted by clawsoon at 4:22 PM on January 25, 2018
Wow, while I was checking on that contract, I came across this poor person, who seems to have burned $34,000-worth of ether by accident yesterday.
posted by Coventry at 6:09 PM on January 25, 2018
[a bit more on the quantum derail] This simulator supposedly kicks IBM's simulator in the butt. The developer of this one has apparently been hired by Google, according to Hackaday.
posted by Artful Codger at 7:21 AM on January 27, 2018 [1 favorite]
RPS had an interesting article about cloud gaming yesterday. That's where the fancy computer + GPU sits in a datacenter somewhere and you play via streamed video, audio, and controls. There's a new wave of those systems coming out now. One key benefit is that you don't have to buy the GPU. At Razer's current pricing you could lease one of the cloud systems for 4-5 years before you'd pay as much as if you'd bought the thing yourself.
posted by Nelson at 7:39 AM on January 27, 2018
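The arithmetic behind that 4-5 year figure is straightforward. A quick sketch with placeholder prices; the $1,300 card and $25/month subscription are assumptions, not Razer's actual rates:

    # Buying an inflated-price GPU outright vs. leasing a cloud gaming box.
    gpu_price = 1300       # assumed price of a high-end card today, USD
    monthly_lease = 25     # assumed cloud-gaming subscription, USD/month

    months_to_break_even = gpu_price / monthly_lease
    print(f"{months_to_break_even:.0f} months (~{months_to_break_even / 12:.1f} years)")
    # ~52 months, i.e. roughly the 4-5 years mentioned above.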
Wow, Artful Codger, that looks awesome.
posted by Coventry at 8:26 AM on January 27, 2018 [1 favorite]
Just bought 6 GTX 1070 Ti's myself over the past two weeks for $725-800 each; now they're hashing away. I started hobby mining back in 2014 with a GTX 560 I had at the time, later upgraded to a 970 that I bought for the Rift. Proceeds from those two cards alone since 2014 paid for the new ones several times over.
My carbon footprint is offset by renewable energy credits.
Gonna go over and read the 'crash' thread now and pop some popcorn.
posted by daHIFI at 8:03 PM on January 27, 2018
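For anyone wondering how that kind of payback pencils out, the usual per-card estimate is your share of the network hash rate times the block rewards, minus power costs. A sketch with made-up numbers; every figure below is an assumption, and real values move constantly:

    # Generic per-card mining profitability estimate (all numbers assumed).
    hash_rate = 30e6            # card hash rate, hashes/sec (~30 MH/s on Ethash)
    network_hash_rate = 180e12  # total network hash rate, hashes/sec
    block_reward = 3.0          # coins per block
    block_time = 15             # seconds per block
    coin_price = 1000.0         # USD per coin

    power_draw_kw = 0.15        # card power draw, kW
    electricity = 0.12          # USD per kWh

    blocks_per_day = 86400 / block_time
    revenue = (hash_rate / network_hash_rate) * block_reward * blocks_per_day * coin_price
    power_cost = power_draw_kw * 24 * electricity

    print(f"daily revenue ${revenue:.2f}, power ${power_cost:.2f}, net ${revenue - power_cost:.2f}")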
A LessWrong Crypto Autopsy. SlateStarCodex worries that only 3% of rationalists made over $100,000 from cryptocurrency.
posted by Coventry at 1:06 PM on January 29, 2018
Anyway, there's a good chance that quantum computation won't work, and we'll learn some new physics instead.
To clarify this comment a little: quantum computing is 100% compatible with quantum physics as we know it. It's hard to build a quantum computer, but it's a hard engineering challenge -- akin to how we knew it was physically feasible for an object to go from Earth to the moon even in the 1800s according to Newton's laws.
If we worked out the engineering and the things still didn't work, this would be disappointing for anyone who wants a quantum computer, but also a major breakthrough in physics, because it would imply that quantum mechanics is wrong or incomplete in a way we know how to test.
posted by vogon_poet at 1:17 PM on January 29, 2018 [4 favorites]
Yeah, I think failure of quantum computing is by far the most optimistic outcome, and not because QC would complicate our cryptographic lives, but because of the physics implications.
posted by Coventry at 1:21 PM on January 29, 2018 [1 favorite]