The future of file sharing
November 19, 2012 11:02 AM   Subscribe

In an attempt to make itself less desirable to copyright infringers, starting November 27, RapidShare will begin capping non-paying users at 1 gigabyte of outbound downloads per day. (Paying users will have 30 gigabytes.) Meanwhile, controversial Megaupload founder Kim Dotcom is planning a January debut for his new Mega service - which plans to insure itself against litigation by having all hosted material encrypted by the uploader's browser before transmission.
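For the technically curious, the Mega scheme boils down to: encrypt locally, upload only ciphertext, and pass the key around out of band. A rough sketch of the idea (in Python with the cryptography package, rather than the browser JavaScript Mega would actually use; file names are placeholders):

```python
# Sketch of "encrypt before upload": the host only ever stores ciphertext,
# so it cannot inspect what it is serving. The key never touches the server.
import base64
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)

with open("album.zip", "rb") as f:
    ciphertext = AESGCM(key).encrypt(nonce, f.read(), None)

with open("album.zip.enc", "wb") as f:  # this blob is what gets uploaded
    f.write(nonce + ciphertext)

# The decryption key rides along in the share link's fragment (or any
# out-of-band channel), never in the upload itself.
print("key:", base64.urlsafe_b64encode(key).decode())
```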
posted by Egg Shen (65 comments total) 8 users marked this as a favorite
 
The horses are gone! Quick, shut the barn doors!
posted by kirkaracha at 11:06 AM on November 19, 2012 [9 favorites]


I think the first filehosting company that gets North Korea (or any other "fuck you, America" government) to agree to host its data will really have earned their rocket cars and solid-gold houses.
posted by griphus at 11:14 AM on November 19, 2012 [6 favorites]


On a message board devoted to soundtracks, someone put up an excellent fan version of the soundtrack to The Empire Strikes Back. His MediaFire links aren't working reliably - due to that company's own chops-busting - so as a thank-you I plan to upload them to a new service.

Before I came across this news just now, I planned to use RapidShare - as its links have generally been durable and there's been no waiting period between downloads lately. But in the wake of this nonsense, I'll try peejeshare or sendspace instead - which seem to be the up-and-comers.
posted by Egg Shen at 11:21 AM on November 19, 2012


My actual thought process:

starting November 27, RapidShare will begin capping non-paying users


"Jesus, that is hardcore."

... at 1 gigabyte

"Oh. Right then. Carry on."
posted by ricochet biscuit at 11:27 AM on November 19, 2012 [10 favorites]


In an attempt to make itself less desirable to copyright infringers...

Non-paying ones, anyway.
posted by Blazecock Pileon at 11:29 AM on November 19, 2012 [2 favorites]


Is it just me being old, but isn't 1GB still a shitload of data to be downloading in a day?
posted by salmacis at 11:29 AM on November 19, 2012 [1 favorite]


Most file lockers already have limits and quotas for non-paying users.
posted by ceribus peribus at 11:30 AM on November 19, 2012 [1 favorite]


Yeah, and Rapidshare has always been more of a pain in the ass with the timers than most other file lockers.

(The main thing that I use them for is to do what YouSendIt used to before they got all weird — send huge files around for work. Our c3's Gmail can't handle anything over 25mb, but if I shoot an event I'll have 200 RAW files at 2mb each, and it's just a pain in the ass to send them separately. Megaupload used to be the easiest to use, but they apparently had ridiculously terrible management.)
posted by klangklangston at 11:31 AM on November 19, 2012 [1 favorite]


Is it just me being old, but isn't 1GB still a shitload of data to be downloading in a day?


Depends on what that data is, honestly. I have the most basic sort of cable internet available in my area, and I get 1mb/s speeds regularly, making 1GB a pretty trivial download. A demo of a relatively graphics-heavy game on Steam, for instance, will run you about 750GB. A full game is in the multiple-gigs category.

Meanwhile, streaming video isn't exactly in scope here, but Netflix accounts for a quarter of global bandwidth.
posted by griphus at 11:32 AM on November 19, 2012 [1 favorite]


Rapidshare also has, at least for me the most recent times that I've tried, the slowest upload speeds. And that tends to annoy me more than a download that I can fire and forget.
posted by klangklangston at 11:32 AM on November 19, 2012


salmacis: "Is it just me being old,but isn't 1GB still a shitload of data to be downloading in a day?"

Isn't it "outbound downloads"? So if you've uploaded a 100meg file, it can only be downloaded 10 times per day?
posted by radwolf76 at 11:35 AM on November 19, 2012


I think Kim Dotcom's goal (already partially achieved) is to make New Zealand into a "fuck you, America" government.
posted by oneswellfoop at 11:35 AM on November 19, 2012


Isn't it "outbound downloads"? So if you've uploaded a 100meg file, it can only be downloaded 10 times per day?

That was my understanding.
posted by Egg Shen at 11:36 AM on November 19, 2012 [1 favorite]


Personally, I've been using Dropbox and Google Drive for large file transfers (media assets).

How do these file exchange companies make any money? I would think the ad and subscription revenue wouldn't cover the cost of bandwidth. I guess I must be wrong.
posted by pez_LPhiE at 11:37 AM on November 19, 2012


Is it just me being old, but isn't 1GB still a shitload of data to be downloading in a day?

Not if you live in Kansas City these days.
posted by azpenguin at 11:38 AM on November 19, 2012 [2 favorites]


That was my understanding.

I suppose next they'll disable non-member downloading altogether. Which is a shame because the nice thing about rapidshare (and mediafire) is I could reach a 1GB limit in 30 seconds.
posted by Lorin at 11:39 AM on November 19, 2012


Or five minutes, if I knew how to count!
posted by Lorin at 11:40 AM on November 19, 2012


How do they plan to enforce the 1GB cap? If they're just dropping a cookie, then this is as easy as closing one incognito browser session and opening another...
posted by mullingitover at 11:42 AM on November 19, 2012


How do they plan to enforce the 1GB cap?

Again, the cap is on outbound downloads. They'll certainly know when they've served up 1 GB of data on a file they're hosting.
posted by Egg Shen at 11:46 AM on November 19, 2012 [2 favorites]


A demo of a relatively graphics-heavy game on Steam, for instance, will run you about 750GB.

Those days are coming sooner than our cable-providers would like.
posted by blue_beetle at 11:48 AM on November 19, 2012


"Sorry, the web sitefile you are trying to accessdownload has exceeded its allocated data transfer."
posted by ceribus peribus at 11:50 AM on November 19, 2012 [1 favorite]


GTA 12: The Moon, now with lander schematics compatible with most 3D printers!
posted by griphus at 11:50 AM on November 19, 2012 [2 favorites]


I think Kim Dotcom's goal (already partially achieved) is to make New Zealand into a "fuck you, America" government.

New Zealand governments swing between enthusiastic and genuine support for absolutely everything America does and mild, occasionally enthusiastic, occasionally reluctant support that acknowledges the realpolitik that we need to keep the US happy. Depending on who is in power. Dotcom isn't going to do much to change that, as far as I can see.
posted by Infinite Jest at 12:01 PM on November 19, 2012


Is it just me being old, but isn't 1GB still a shitload of data to be downloading in a day?

Those W1f3yzw0r1d siterips can be huge.
posted by Thorzdad at 12:01 PM on November 19, 2012 [4 favorites]


Those days are coming sooner than our cable-providers would like.

The metaphor I've come up with is that you have a restaurant which is an all-you-can-eat buffet. And it's incredibly popular, and your business is growing. Every day there are more people coming to eat at your restaurant. In fact, there are so many people that you don't have enough seating, kitchen capacity, or buffet table space to serve them all.

The sane thing to do is to expand! Build more tables, more kitchen space, more buffet table space, and become able to serve all the people who want to come eat at your business.

The American telcos' decision is to put up a sign that says "One plate per customer per day".
posted by Pope Guilty at 12:07 PM on November 19, 2012 [7 favorites]


Pope Guilty: ""One plate per customer per day"."

Challenge Accepted.

Ok, this doesn't actually work for this metaphor.
posted by radwolf76 at 12:16 PM on November 19, 2012


Not if you live in Kansas City these days.

Or Chattanooga -- their fiber is 100Mbit bidirectional for $70/mo... gigabit there costs $300. (250Mbit for $140). So not quite as cool, but you can still download a gig in about a minute and a half if your connection is maxed out.
posted by Malor at 12:36 PM on November 19, 2012


(Oh, and of course, on gigabit, you can download a gigabyte in 8 or 9 seconds, though session setup and speed ramping will probably make it about 20 seconds minimum)
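If anyone wants to check the arithmetic, it's easy enough to script. A quick sketch (best-case numbers: it ignores TCP ramp-up, protocol overhead, and anything else between you and the server):

```python
# Best-case transfer time: bits to move divided by link speed in bits/sec.
def transfer_seconds(size_gb, link_mbps):
    bits = size_gb * 8 * 1000**3          # GB -> bits (decimal units)
    return bits / (link_mbps * 1000**2)   # Mbit/s -> bit/s

for mbps in (1, 100, 250, 1000):
    print(f"1 GB at {mbps:>4} Mbit/s: ~{transfer_seconds(1, mbps):,.0f} s")

# 1 GB at    1 Mbit/s: ~8,000 s
# 1 GB at  100 Mbit/s: ~80 s
# 1 GB at  250 Mbit/s: ~32 s
# 1 GB at 1000 Mbit/s: ~8 s
```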
posted by Malor at 12:39 PM on November 19, 2012


Is it just me being old, but isn't 1GB still a shitload of data to be downloading in a day?

Specifically addressed with regard to Netflix in the link griphus found:
The "average subscriber" reportedly streams a gigabyte worth of content every day.

That amount is even more for gamers. Sandvine reports that people that access Netflix via a gaming console stream about 2.5GB of content daily.
Sandvine evidently is a vendor of network management products and among other things makes whatever Comcast uses to interfere with BitTorrent traffic.
posted by XMLicious at 12:49 PM on November 19, 2012


How do these file exchange companies make any money? I would think the ad and subscription revenue wouldn't cover the cost of bandwidth. I guess I must be wrong.

Dropbox's revenue is from premium subscriptions. AFAIK they don't even have ads (possibly because the taboo against sneaking ads into client-side software is still in effect, although apparently Windows 8 hopes to change that).
posted by qxntpqbbbqxl at 12:54 PM on November 19, 2012 [2 favorites]


salmacis: "Is it just me being old,but isn't 1GB still a shitload of data to be downloading in a day?"

No. It's only like 3 or 4 pinball tables or two thirds of a 43 minute HD multimedia presentation.
posted by wierdo at 12:55 PM on November 19, 2012


Holy shit are pinball tables up to 300MB apiece? Are we talking about Visual Pinball or something else?
posted by griphus at 12:59 PM on November 19, 2012


salmacis: "Is it just me being old,but isn't 1GB still a shitload of data to be downloading in a day?"

I don't really use rapidshare, but I own an iPhone, and as far as I know an update to an app just re-downloads the whole app; some games exceed 1GB on their own. Between app updates (it doesn't take many 100MB apps updating when, for instance, a new device comes out or a new iOS version comes down the pipe) and new episodes of podcasts (some of which can exceed 100MB), it's not that uncommon at all to see that usage from iTunes alone.
posted by juv3nal at 1:10 PM on November 19, 2012


Pope Guilty: your solution is one of several that can lead to a restaurant closing: over-expansion, inadequate capitalization, new debt, erosion of quality, less flexibility to adapt to new competition, etc. Other possibilities include raising prices, expanding hours, preferential pricing for off-peak times, increasing quality along with price, and adding specialty items for additional cost. If you are not running a business it is very easy to underestimate the direct and indirect costs of expansion. While expansion can bring significant rewards, it also carries equally significant risk.
posted by rmhsinc at 2:05 PM on November 19, 2012 [1 favorite]


griphus: "Holy shit are pinball tables up to 300MB apiece? Are we talking about Visual Pinball or something else?"

With all the HyperPin assets and everything, yes. The tables themselves aren't that large, thankfully. Not that you'd know that from their video memory requirements...
posted by wierdo at 2:20 PM on November 19, 2012


what this'll mean, please

When the aspiring downloader clicks the RapidShare link, they'll see a message saying the file's download limit for the day has been reached.

If MediaFire's example is any guide, said message will add that the original uploader has been contacted to ~~pony up some dough~~ remedy the issue.
posted by Egg Shen at 2:30 PM on November 19, 2012 [1 favorite]


item: "what this'll mean, please"

I would imagine that if the people who upload the files that supply guy-onna-train's habit want to continue to use rapidshare for free, they will end up spending time uploading multiple copies of the same popular file and passing around the whole collection of rapidshare download links instead of just a single link, to work around the fact that only 1 gig's worth can be downloaded per day.

Meanwhile, trainguy and his buddies are going to have to spend more time trying to find links that haven't hit the 1 gig cap for the day yet (and if all the redundant links provided by the uploader have hit their caps, come back the next day and try again).

All this change really does is add some friction, slowing down what's normally a low-effort instant-gratification process for the free Rapidshare uploaders, to try to drive this class of uploader to Rapidshare premium accounts or to other services. From Rapidshare's standpoint, if they upgrade to premium, that's additional revenue, and if they move to another service, then Rapidshare can tell regulators that steps were effectively taken to curb illegal activities on their servers.
posted by radwolf76 at 3:06 PM on November 19, 2012 [1 favorite]


griphus: "Is it just me being old,but isn't 1GB still a shitload of data to be downloading in a day?


Depends on what that data is, honestly. I have the most basic sort of cable internet available in my area, and I get 1mb/s speeds regularly, making 1GB a pretty trivial download. A demo of a relatively graphics-heavy game on Steam, for instance, will run you about 750GB. A full game is in the multiple-gigs category.

Meanwhile, streaming video isn't exactly in scope here, but Netflix accounts for a quarter of global bandwidth."


I am going to have to assume you mean megabytes and not gigabytes. If I am tossing out three quarters of a terabyte for a game, it better damn well not be a demo...
posted by Samizdata at 3:19 PM on November 19, 2012


if they move to another service, then Rapidshare can tell regulators that steps were effectively taken to curb illegal activities on their servers.

It will be interesting to see how their "We only facilitate copyright infringement for paying customers" defense plays on Capitol Hill.
posted by Egg Shen at 3:21 PM on November 19, 2012 [2 favorites]


From Rapidshare's standpoint, if they upgrade to premium, that's additional revenue, and if they move to another service, then Rapidshare can tell regulators that steps were effectively taken to curb illegal activities on their servers.

But, from Hollywood's perspective, suddenly there's 10x as many infringing files, and you can be certain they'll be screaming bloody murder about how much worse piracy is getting, and OMG Congress needs to do something NOW.
posted by Malor at 3:29 PM on November 19, 2012 [2 favorites]


Samizdata: I am going to have to assume you mean megabytes and not gigabytes.

Yeah, demos are usually 750 megs or so, not gigs. Yet. That must have been a typo.

Full games are getting into the 20 gig range now, for the really big titles.
posted by Malor at 3:29 PM on November 19, 2012


Malor: "Samizdata: I am going to have to assume you mean megabytes and not gigabytes.

Yeah, demos are usually 750 megs or so, not gigs. Yet. That must have been a typo.

Full games are getting into the 20 gig range now, for the really big titles."

I know that is sadly true. And then you factor in DLC...
posted by Samizdata at 3:32 PM on November 19, 2012


When I download an old [out of print] album it is far less than 1GB and is often in a zip file, anyway. So 1GB is plenty for my needs. If I have to pay a bit to get some rare music that I couldn't normally get without paying / bidding lots of cash on ebay, then so be it.
posted by Rashomon at 3:41 PM on November 19, 2012


Full games are getting into the 20 gig range now, for the really big titles.

Max Payne 3 was 35 gig.
posted by Sebmojo at 3:50 PM on November 19, 2012


"Sorry, the web sitefile you are trying to accessdownload has exceeded its allocated data transfer."

1997 called...
posted by Sys Rq at 3:53 PM on November 19, 2012


Egg Shen: "It will be interesting to see how their "We only facilitate copyright infringement for paying customers" defense plays on Capitol Hill."

Oh, I'm sure being a paying customer doesn't insulate you from whatever DMCA takedown process Rapidshare has in place; it's just that now you've subsidized the paperwork behind your file's deletion out of your own pocket, instead of it coming out of the revenue from people who got their premium subscriptions to download the files you've been uploading.

Also, I hope you were on their monthly subscription instead of an annual one, because they've closed your account, no refunds, due to violation of terms of service.

And I wonder, if a copyright holder finds an infringing file uploaded by a premium account holder, could they subpoena Rapidshare for the billing records?
posted by radwolf76 at 3:53 PM on November 19, 2012


Those of you who like transferring files too big for email: ge.tt. (though you should really go with Dropbox or Drive at this point.)

The rest of you: ShareBee and Slimrat.

On the one hand, online file lockers and sites like RapidShare provide a legitimate and valuable service. On the other, their business model works because they commercially exploit copyright infringement, and even with reform to the laws or the copyright term it's not clear that would change.

But one thing is clear: if they go under, the progress of animated meaningless gauges will never recover.
posted by 23 at 5:48 PM on November 19, 2012 [7 favorites]


salmacis writes "Is it just me being old,but isn't 1GB still a shitload of data to be downloading in a day?"

Massive game downloads aside, yeah, that would be a shit ton of data to be downloading from rapidshare on a daily basis. Most movie torrents are ~700MB, so a gig could keep you occupied for two hours a day, easy.
posted by Mitheral at 7:02 PM on November 19, 2012


Not if you're watching in HD, Mitheral. :) 720p usually comes out around 1.4 gigs for the typical 40 minutes of a TV episode, using much stronger compression than what's on, say, a BluRay. And a non-transcoded BluRay rip straight off the disc can run 20 gigs, easy.
posted by Malor at 7:06 PM on November 19, 2012


Very easy. There are quite a few Blu-Rays that clock in at over 40GB. Much like PS3 exclusives, which often seem to have their disks packed to the gills, actually. Regardless, there are plenty of legitimate large files out there. Drowned out by the likely-copyright-infringement stuff, to be sure, but still.
posted by wierdo at 7:26 PM on November 19, 2012


I'm a little confused about this 'outbound' thing. Someone help explain what's about to happen to my friend (who of course isn't me but just some guy I met on the train), a rapidshare user who stopped using bittorrent when the hammer began to fall and who routinely snags on average between 1 and 2 gigs' worth of totally pirated stuff a day that he has no right whatsoever to be downloading, what this'll mean, please. Are my trainmate and I right in assuming that with the new data rationing, when a file is uploaded to rapidshare it can only be downloaded a certain number of times? Or something? So confusing, and so glad I chose a career in social work instead of data wrangling.
Short answer - yes. In an attempt to go legit, or at least appear to go legit, they're implementing the following changes:

1) Registered but non-paying rapidshare user Alice uploads a file, say a totally legit BigBangTheeeryS12E05 xvid, 300MB in size, and shares the link publicly. As a non-paying user, her 'outbound' quota is 1GB/day.
2) User Bob, who doesn't know Alice, downloads BBTS12E05, using up 300MB of the 1GB quota.
3) Users Charlie and Dylan do the same, using up another 600MB between them.
4) User Ethan tries to use the link, and bzzt: Alice's quota is exceeded. Try again tomorrow.
5) Alice probably gets a message saying 'hey, you have a popular file! pay money to share it more.'
6) For some altruistic reason, Alice ponies up for a premium account - her outbound quota goes to 30GB a day.
7) Now 100 users a day can download Alice's copy of BBTS12E05 instead of 3, which Alice is paying for.

Even with the upper limit, large popular files - videos, say, or legally customised 500MB Samsung Galaxy Note ROMs - will be heavily limited in how many times a day they can be downloaded, regardless of the downloader's premium account status.

You can still directly share files privately, dropbox-style, without limit, I believe.

The idea is to reduce the appeal of using rapidshare for bulk public distribution of, well, anything big.
Personally, I'm not sure what legit business that leaves them with. Premium users are left paying for high-speed downloads of not much; uploaders have to be premium to share anything much at all, so they will leave in droves; and if you're just sharing private-ish legit stuff (docs, photos of your kids to granny, etc.) between a handful of users, dropbox, sugarsync, google drive and the like are nicer to use, aren't riddled with ads, and are largely free.
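To make the arithmetic concrete, here's a toy simulation of the quota as I understand it (names and sizes from the Alice example above; the real accounting is obviously done server-side):

```python
# Toy model of the per-uploader "outbound" quota described above.
# Assumption: every download of Alice's file is charged to Alice's daily quota.
QUOTA_MB = {"free": 1000, "premium": 30000}  # 1GB vs 30GB per day, decimal units

def serve_day(file_size_mb, downloaders, tier="free"):
    """Return the downloaders served before the uploader's quota runs out."""
    remaining = QUOTA_MB[tier]
    served = []
    for user in downloaders:
        if remaining < file_size_mb:
            print(f"bzzt: quota exceeded, {user} must try again tomorrow")
            break
        remaining -= file_size_mb
        served.append(user)
    return served

print(serve_day(300, ["Bob", "Charlie", "Dylan", "Ethan"]))
# bzzt: quota exceeded, Ethan must try again tomorrow
# ['Bob', 'Charlie', 'Dylan']
```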
posted by ArkhanJG at 12:07 AM on November 20, 2012 [2 favorites]


Also - if you're a premium uploader, you're tying a real name and credit card to what you upload, not just an IP address. If you're sharing popular dodgy material, that would strike me as a stunningly bad idea - whether you use your own credit card or a stolen one. Especially given that all the copyright infringement lawsuits have been going after uploaders and service providers, not downloaders* (since the law is much clearer about distribution being illegal, and less so about being a customer of an illegal distributor).

* bittorrent users are both public uploader and downloader, by design. Great for Ubuntu 12.10 ISOs, not so much for iffy film rips.
posted by ArkhanJG at 12:24 AM on November 20, 2012


Personally, I'm not sure what legit business that leaves them with.

I suppose they figure putting themselves out of business is preferable to finding themselves in Kim Dotcom's shoes.
posted by Egg Shen at 5:41 AM on November 20, 2012


"Those of you who like transferring files too big for email: ge.tt. (though you should really go with Dropbox or Drive at this point.)"

Dropbox had a weird TOS change a bit back where they wrote themselves some pretty over-broad perpetual licenses for things shared there, and Drive just flat broke when I was uploading raw video for our SF guy to edit (around 750mb each). We ended up dicking around with some FTP space he had on his private server, but it was a pain in the ass (still not sure why the password wouldn't take with, like, three of the clients I usually use — I had to use Terminal to connect).

Anyway, I'll take a look at ge.tt. If our stupid servers weren't such a mess, I'd agitate for a file locker just through FTP on our own site, but that has to wait for us to fuck around with a whole lot of other stuff.
posted by klangklangston at 2:43 PM on November 20, 2012


Once we have an actual IT guy instead of me and the digital marketing guy cobbling together kludges, I think that's worth pursuing. (Maybe I'll agitate to put an IT wishlist on our site, in case coders wanna scratch their pro bono c3 itch.)
posted by klangklangston at 7:44 PM on November 20, 2012


If you want cloud hosted, google apps (free with your own domain) includes drive with 5GB free space per user and a 10GB upload limit (obviously assuming you buy more space!); I've used it successfully recently to share a 4GB ISO, so it may be working better than when you last tried it; it's gone through a lot of updates as part of the changeover from google docs.

If you can live with microsoft, skydrive doesn't entirely suck these days - 7GB free, 2GB filesize upload limit from the desktop sync app (300MB from the webapp IIRC), with just released clients for osx, android & iOS.

If you want self-hosted entirely, have a look at tonido - you can share files from windows, osx or linux if you're prepared to forward the ports on the firewall to the box - use a box in the DMZ for security on a bigger network. You can specify how much, or how little, you share. Software is a little clunky, but it does work.

If you want to roll your own cloud storage, amazon S3 plus s3dropbox may be a decent starting point - hosting s3dropbox in amazon elastic cloud should also be doable...
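As a sketch of that roll-your-own approach (shown here with the modern boto3 library; the bucket and file names are placeholders), S3 can hand out expiring download links without you serving a single byte yourself:

```python
# Minimal roll-your-own file sharing on S3. Assumes AWS credentials are
# configured and that the bucket "my-share-bucket" already exists.
import boto3

s3 = boto3.client("s3")

# Upload the file; S3 objects are private by default.
s3.upload_file("holiday_photos.zip", "my-share-bucket", "holiday_photos.zip")

# Mint a link that works for 24 hours and then expires on its own:
# a built-in outbound limit of sorts, enforced by time rather than bytes.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-share-bucket", "Key": "holiday_photos.zip"},
    ExpiresIn=24 * 3600,
)
print(url)  # mail this to whoever needs the file
```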
posted by ArkhanJG at 11:03 PM on November 20, 2012 [1 favorite]


While I think about it, FTP these days is considered harmful for general use. It's totally insecure, so password leakage is a real issue, and unless you know what you're doing it's pretty easy to cock up and leave public access enabled on the server - and there are bots that prowl about looking for open FTP servers on which to dump dodgy material for sharing. Speed is woeful, and getting past firewalls properly with an FTP client can often be a right bugger.

If you want an old-school setup, i.e. CLI based, SFTP is a much better option and comes included with SSH on any linux distro. If all you have is windows servers, it's trivial to enable hyper-v and run a virtual linux server; ubuntu server includes hyper-v driver support in-the-box, so you just point hyper-v at the ISO and install, no cocking about required - you can either give the VM its own network card, or share one with the host windows.

Then you just open port 22 and point it at the linux IP address (or run it on a different port for a little obscurity) - root access is disabled by default, but you can ssh or sftp in as a normal user on the box and be locked to that user's space. WinSCP is a good SFTP client for windows, and OSX and linux have a CLI one built in. Bonus points for using a public/private ssh key for password-less login.

I actually run this setup at home - a windows 2008R2 server with hyper-v and an ubuntu 12.04 virtual server with ssh and sftp - and it works very well and probably took an hour to setup, including the time to download and install the ubuntu server. I use mine primarily for a self-hosted git server and development work with node.js, but the ssh/sftp side works just as well.
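And if you'd rather script transfers against that box than click around in WinSCP, here's a minimal sketch with Python's paramiko library (host, user, key and paths are all placeholders):

```python
# Scripted upload over SFTP, riding on the same SSH setup described above.
import paramiko

client = paramiko.SSHClient()
# Auto-accepting unknown host keys is fine for a home box;
# pin the host key properly for anything that matters.
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("myserver.example.com", port=22,
               username="alice", key_filename="/home/alice/.ssh/id_rsa")

sftp = client.open_sftp()
sftp.put("event_photos.zip", "/home/alice/incoming/event_photos.zip")
sftp.close()
client.close()
```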
posted by ArkhanJG at 11:39 PM on November 20, 2012 [2 favorites]


In many flavors of linux SFTP support is built in via GVFS or KIO and is natively handled by many of the file manager applications, so that it's indistinguishable from browsing a hard drive and any application—even one that wasn't built to support it—can open a file over SFTP.

(And in case it wasn't clear, WinSCP for Windows, which ArkhanJG mentioned, is a GUI tool rather than something you use from the command line; it's sort of like an old-fashioned two-pane FTP client from before the Windows file manager supported FTP.)
posted by XMLicious at 5:52 AM on November 21, 2012


klangklangston: "Once we have an actual IT guy instead of me and the digital marketing guy cobbling together kludges, I think that's worth pursuing. (Maybe I'll agitate to put an IT wishlist on our site, in case coders wanna scratch their pro bono c3 itch.)"

Even you can install owncloud on a server somewhere. These days it can use S3 and Drive and some other stuff as a backend, so the server itself need not have a crap ton of space or anything.
posted by wierdo at 10:06 AM on November 21, 2012


Another option for mounting an sftp site is sshfs; it works on both linux and osx via FUSE. (OSX used to have a nice GUI sshfs/FUSE tool in macfusion, but it's a bit broken in mountain lion).

In my case at work (using a similar setup with a virtual headless linux development server), I mount the source code directory in osx via sshfs, so it gets treated as a local file system mount - I then edit the code in a nice GUI text editor (sublime text 2), and commit to git as I go using either the OSX github app for my open source projects or sourcetree for my private ones - much nicer having a proper commit history graphically, easy feature branches etc.

I also ssh into the dev server with iterm 2; the dev server has all the libraries and the same versions as production, so I can test the code in situ using mocha etc, and a live dev database that's server side.

I don't need to setup anything locally on my mac for dev work, bar the git client and editor, and a browser for 'end result' testing. I don't even technically need the git client, as I can commit directly to git via ssh on the dev server if needed, and my dev environment ends up mirroring production closely for the sake of a bit of RAM and disk space on our vmware server farm.

Once my feature branch is complete and I'm happy it all tests properly, I merge it back into the master git branch, and all I need to do then is ssh into the production server, do a git fetch, restart if necessary, and the tested code is fully up to date in one step. If I did more commits, I'd probably set up a hook in git so that merging into master would trigger a production update automagically, but it's not been worth the effort.
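(For the record, the hook itself would be tiny. A hypothetical sketch, dropped into .git/hooks/post-merge on the production checkout and made executable; the test command and service name stand in for whatever the real app uses:)

```python
#!/usr/bin/env python
# .git/hooks/post-merge: runs after a successful merge/pull on this checkout.
import subprocess
import sys

def run(*cmd):
    print(">>", " ".join(cmd))
    return subprocess.call(cmd)

if run("npm", "test") != 0:
    sys.exit("tests failed; leaving the running app alone")

run("sudo", "service", "myapp", "restart")  # pick up the merged code
```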

It also means that all I need to demo a change on the fly is a computer with an ssh client and browser, which is a quick standalone download on windows, i.e. kitty.exe, or running on my nexus 7 - I can dial in, spin off a new git branch if needed (so I don't lose my place if I've left a half-finished feature branch open), manually text edit the changes directly on the dev server, and the changes go live immediately on my dev environment. So when my manager says 'can we move this logo a bit?', I ssh in from whatever's to hand, make the CSS change, and boom, live demo. If we decide to keep the changes, commit to git via ssh for posterity; if not, blow away the branch and forget it ever happened without ever touching the real production server.

It makes web app development work so damn easy, and so reversible in the event of cockup, that it feels like it ought to be illegal - though being the senior sysadmin for my day job probably helps.
posted by ArkhanJG at 12:21 PM on November 21, 2012


I don't know about KIO but GVFS is basically a wrapper around FUSE.

sshfs is the bomb, fer sure. See also s3ql and s3fs which mount a cloud storage service as a filesystem.
posted by XMLicious at 1:06 PM on November 21, 2012


Here's what Rapidshare shows now:

Download not available: File owner's public traffic exhausted.
posted by Egg Shen at 6:12 PM on November 27, 2012


This thread has been archived and is closed to new comments