Tile, Baby, Tile
January 28, 2025 12:59 PM   Subscribe

Why do we need a “desktop” at all? Our digital spaces are hyper-connected, non-linear, infinitely malleable—yet we continue to constrain them within the rigid, physical-world boundaries established four decades ago. Our computers are capable of generating entire worlds, processing billions of calculations per second, constantly connected to global networks—and yet we interact with them as if they were elaborate digital paper-pushing machines. The skeuomorphic interface made sense in the 70s when computers were alien technologies that needed familiar metaphors to feel approachable. But in 2025, it’s become a cognitive straightjacket, limiting how we conceptualize and interact with information. from The Lost Futures of Computing: How We Got Boxed Into the Desktop Metaphor
posted by chavenet (93 comments total) 25 users marked this as a favorite
 
One alternative, previously.
posted by lalochezia at 1:00 PM on January 28 [3 favorites]


Because I need a metaphorical place that I can fill with crap to match my literal desktop.
posted by jb at 1:14 PM on January 28 [22 favorites]


The future is unevenly distributed in this. Lots of folks are still using extremely nonmetaphorical file folders and inboxes in their offices—even spindles. And to be honest, growing up alongside a desktop-style computing system helped my physical organization.
posted by Countess Elena at 1:15 PM on January 28 [7 favorites]


(I don’t mean to threadshit, by the way! It’s an interesting article! And I’m sure as hell tired of my Windows desktop trying to sell me shit or give me cutesy reminders.)
posted by Countess Elena at 1:16 PM on January 28 [2 favorites]


I keep seeing comments such as "users are forgetting/never-learned" what a *file* is, much less file folders.
posted by aleph at 1:18 PM on January 28 [10 favorites]


DWM gang represent. All programs should have their options configurable by changing C and recompiling.
posted by andorphin at 1:19 PM on January 28 [7 favorites]


I don't ever use my desktop. Full screen windows launched from the start menu cover up anything that was left on or installed on the desktop. So I've got no idea why people care/don't care about it
posted by thecjm at 1:22 PM on January 28 [7 favorites]


that was interesting, if kind of light - you can tile windows with a hotkey, and i feel like sharepoint is doing its best to replace folder structures with ubiquitous search, and ios/android has its own non-desktop kind of metaphor, i guess?

fundamentally a file structure is a useful way to reflect clarity of thought, and has enough flexibility to cover a vast array of use cases, so it's not that surprising it's stuck around.
posted by Sebmojo at 1:25 PM on January 28 [6 favorites]


So, with a few unused laptops lying around, I decided to dive back into Linux.

Of course.

Linux.

Most people just want to open up their applications and have them work, but if it makes him happy to fiddle and tweak, that's fine. Life is short and people should do what makes them happy.

My system, a MacBook with many, many apps open fullscreen that I swipe through, has caused people stress when they see it. But, it makes me happy.
posted by betweenthebars at 1:26 PM on January 28 [6 favorites]


I run xmonad on arch linux. I'm also a vegetarian cyclist. Yes I have a mechanical keyboard, why do you ask?
posted by Alex404 at 1:32 PM on January 28 [33 favorites]


As a semi-antique unix user who deals with a lot of online storage office spaces (like the whole G Suite) for non-work projects, I keep running into the way that Drive and others do their damnedest to hide any underlying structure in favor of search.

It drives me mad. So does my partner's habit of "everything on the desktop with every app opened to full screen". :) (It works for them, but it gives me metaphorical hives)
posted by drewbage1847 at 1:39 PM on January 28 [11 favorites]


Just give me full-screen windows and the ability to put them on different monitors and access them from something like a taskbar, and that's 99% of my usage covered. 30 years of using Windows and I'm quite happy using a mouse, thanks. I know about half a dozen keyboard shortcuts (copy/paste, close an app, bring up the start menu, undo/redo) and have no desire to learn any more. As far as I'm concerned, the main job of an OS is to stay out of my way. I have VR for gaming, and really don't want to spend my work life waving my hands about with my head in a sweaty bucket.
posted by pipeski at 1:40 PM on January 28 [8 favorites]


Article is full of crap, sorry.

Floating windows that you manage with a mouse are crappy; we've known this for a long time. On Windows you can do basic tiling by selecting a window, holding the Windows key, and using the arrows to move/snap it, but it's not quite enough. Should be better from the get-go, but it has nothing to do with the concept of the desktop, which is basically a fancy folder displayed underneath your windows.

Files and folders are there so that you can "manually" organize your data. It's a bit skeuomorphic, but really they're just common building blocks to ease the back and forth between you and the multiple programs you can run. Your desktop is just a convenience folder for items you need to access quickly.

The real crime in computer interfaces is all the modern apps pushed upon us: it's either web apps, or web apps running in Electron. Those follow none of the standard UI elements or keyboard shortcuts; we've taken massive steps back in usability from that alone. And Chrome and the other browsers just suck at tiling even more than the base OS does.
posted by WaterAndPixels at 1:40 PM on January 28 [19 favorites]


unfortunately the post-desktop post-file utopia we got was iOS and Android, and something like 75% of all computing (figure pulled out of my ass) is done with them, not desktops
posted by BungaDunga at 1:45 PM on January 28 [2 favorites]


we tossed files as a common glue away in favor of bespoke app interfaces that lock every piece of information into a non-interoperable per-app walled garden
posted by BungaDunga at 1:47 PM on January 28 [25 favorites]


Oh, while I make jokes about my desktops (literal and digital), I'm all about hierarchical folder and file structures. Everything is filed in exactly the right place or else it is never seen again.

People need metaphors to organize their thoughts - and it's amazing how much I do remember about where my files are even when I come back months later. I think that computer folder systems are the modern equivalent of the "memory palace" technique - by giving a physicality to information, we tap into our brains' ability to remember places.

The desktop is for the stuff I'm not worried about seeing again.
posted by jb at 1:48 PM on January 28 [16 favorites]


One thing i couldn't do without is Everything by voidtools - seamless, instant search of everything on your multiple terabytes of storage, and it's like 250kb.
posted by Sebmojo at 2:04 PM on January 28 [7 favorites]


The one thing that kills me about windows is how one can steal focus from another. From the late 90s until 2010 I was using Linux all the time, and wherever my pointer was had focus, regardless of what else the computer was doing. I don't have a computer at home (I'll just use my spouse's or kid's if I actually need to use one; normally my phone is enough), but at work it's a Windows PC and it won't let me ignore it when some background program wants to throw up a dialog box.
posted by any portmanteau in a storm at 2:08 PM on January 28 [6 favorites]


betweenthebars: Of course. Linux. Most people just want to open up their applications and have them work

True. I'm one of them. I just prefer to do that on Linux.
posted by Too-Ticky at 2:24 PM on January 28 [12 favorites]


I will admit I kinda skimmed the article but nothing I saw dispelled my initial reaction of “what are you talking about the desktop metaphor has been in terminal decline since the iPhone and arguably the first blow was fucking Windows 95?”
posted by atoxyl at 2:25 PM on January 28 [4 favorites]


I guess the ultimate thrust here is really about windows, rather than files, but as people have pointed out most modern operating systems offer a variety of ways to manage windows
posted by atoxyl at 2:32 PM on January 28 [2 favorites]


I love the desktop and the concept of a desktop!
posted by tiny frying pan at 2:37 PM on January 28 [9 favorites]


I don't ever use my desktop. Full screen windows launched from the start menu cover up anything that was left on or installed on the desktop. So I've got no idea why people care/don't care about it


Like, I couldn't understand this less. Don't you ever look... underneath?

Yes I am of The Oregon Trail Generation, what of it
posted by tiny frying pan at 2:38 PM on January 28 [3 favorites]


Using tiling managers is the main reason I could never go back to a Mac.
posted by iamck at 2:46 PM on January 28 [2 favorites]


How odd that they state that Xerox invented the layered desktop metaphor without once mentioning that Xerox's Cedar was the first tiled window manager...

And for fans of virtual screens, that was also invented at PARC with Interlisp-D's Rooms, which you can still play with today at Interlisp.org
posted by Runes at 2:47 PM on January 28 [2 favorites]


Cloudy file systems may not actually be based on a folder structure at all, but instead use a database to store the files and metadata, and then hack on a hierarchical file organization post-hoc.
posted by kaibutsu at 3:04 PM on January 28 [2 favorites]
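(A toy sketch of the pattern kaibutsu describes: the store keeps a flat database table with parent pointers, and the familiar /folder/file hierarchy is reconstructed on top after the fact. The table, columns, and file names here are all invented for illustration.)

```python
import sqlite3

# Toy model of a "cloudy" file store: no real directories, just rows
# with parent pointers. Schema and data are invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE nodes (id INTEGER PRIMARY KEY, parent INTEGER, name TEXT)")
rows = [(1, None, "/"), (2, 1, "docs"), (3, 2, "taxes"), (4, 3, "2024.pdf")]
db.executemany("INSERT INTO nodes VALUES (?, ?, ?)", rows)

def path_of(node_id):
    """Reconstruct a familiar /folder/file path from the flat table."""
    parts = []
    while node_id is not None:
        parent, name = db.execute(
            "SELECT parent, name FROM nodes WHERE id = ?", (node_id,)
        ).fetchone()
        parts.append(name)
        node_id = parent
    # The root is stored as "/", so drop it and join the rest
    return "/" + "/".join(reversed(parts[:-1]))

print(path_of(4))  # -> /docs/taxes/2024.pdf
```

The hierarchy is pure presentation: rename one row and a whole "subtree" moves, which is roughly why these systems can treat folders so casually.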


I rarely see the desktop wallpaper of my laptop because there is a web browser open over it the majority of the time. I don't see how something so little-used is a cognitive straightjacket. But, agreed that the metaphor of the desktop is quite dated.
posted by skiles at 3:07 PM on January 28 [2 favorites]


Cloudy file systems may not actually be based on a folder structure at all, but instead use a database to store the files and metadata, and then hack on a hierarchical file organization post-hoc.

To be fair a file hierarchy is an abstraction anyway. But that one may find oneself using files built on a database that is built on files does perhaps say something about modern computing.
posted by atoxyl at 3:15 PM on January 28 [5 favorites]


i feel like sharepoint is doing its best to replace folder structures with ubiquitous search

That's like a squirrel doing its best to calculate orbital mechanics...
posted by mrgoat at 4:02 PM on January 28 [10 favorites]


hey gang it's 2025 and people still wonder why we can't jack directly into cyberspace the way the console cowboys do

cool cool cool
posted by Halloween Jack at 4:34 PM on January 28 [9 favorites]


I can understand Mac and Windows users, they don't know any better. But I am always confused when perfectly smart Linux/Unix developer types go for these desktop environments. What do they get out of them? I watch them fiddling with graphical file managers, so awkward and slow having to search for everything visually and then aiming with the mouse, dragging stuff around etc. A lot more error prone too.

*smugly looks down from his high horse*
posted by donio at 4:35 PM on January 28 [2 favorites]


I hate Teams and I hate SharePoint because I feel that I’m always doing things with one hand tied behind my back. I realize that the license that I have is part of the problem, and I also realize that there are some great innovations in SharePoint but until I feel like I can do everything that I want to, my desktop is my home. My background is a night time winter scene with the viewpoint of looking up a snowy hill, past a snow covered tree and into the night sky filled with northern lights. It’s my happy place and I don’t need a browser to get there.
posted by ashbury at 4:36 PM on January 28 [7 favorites]


Humanity has a half-life for outgrowing things. It took a decade or so before manufacturers were willing to let go of the virtual leather and brushed metal nonsense the people born in the 1950s insisted on. They still haven't been able to wean customers from the 80s off the physical remnants of that, even though composites are, in fact, better in every way.

Desktops will have to outgrow the limits of their users too. It's generational.

This is one reason immortality is foolish. You would be in a constant state of future shock. Most would simply live like anchorites, withdrawn from the world, using the interfaces they knew when they were 20.
posted by bonehead at 4:41 PM on January 28 [4 favorites]


I've invested decades into evolving a file structure for important documents, contracts, etc, projects, manuals and technical info, pictures and other images, and media files for editing. Within each silo are subfolders depending on the best way to organize that subject (eg pictures by date and/or event, tech data by category, make, model, docs by person, company or subject, etc). Most stuff I can retrieve just by drilling down; worst case I only have to search one silo.

Using Windows, Android and Linux.
posted by Artful Codger at 4:41 PM on January 28 [3 favorites]


Tiling's a start, but I'd like the ability to crop off several small bits of windows at once, like little live screengrabs without all of the full window chrome. Five lines of a text document that you can still mouse over and scroll as you reference it, just the volume monitors from four different apps in a chain, the notification bell icon on a reply you're waiting for, and the progress bar on the latest idle game you're playing.
posted by lucidium at 5:08 PM on January 28 [2 favorites]


Simultaneous with the birth of desktop-based computing were experiments in adding graphical structure to programs, such as DRAKON and Max. These experiments were/are successful ways to integrate multiple programming languages and programs into more flexible and legible representations of computing. They save time, improve quality control, and empower users.

Unfortunately, the culture of computing is conservative and regressive. Most visual interfaces are built as the “stupid person” interface to the “real computer” of a text interface. Never mind that text is just as abstract as any other human-readable representation of computing— computer culture treats text as primary and essential, and visualization as a limited abstraction.
posted by Headfullofair at 5:17 PM on January 28 [3 favorites]


Hierarchical databases (of which file systems are an avatar) can be quite useful, but they don't necessarily need to be the canonical way to access information.
posted by Monday, stony Monday at 5:21 PM on January 28 [1 favorite]


I loved when Mac started offering full screen apps and tabbed full screen finder windows. I do still have nice wallpaper, but rarely see the desktop. Sadly, a lot of the stuff I loved has stopped working consistently on Macs in the last year. Opening full-screen apps from the launchpad launches them without going to the app screen, and finder doesn’t know where to put a copy progress window when you are using full-screen windows, etc.
posted by snofoam at 5:28 PM on January 28 [1 favorite]


I've invested decades into evolving a file structure for important documents, contracts, etc, projects, manuals and technical info, pictures and other images, and media files for editing. Within each silo are subfolders depending on the best way to organize that subject (eg pictures by date and/or event, tech data by category, make, model, docs by person, company or subject, etc). Most stuff I can retrieve just by drilling down; worst case I only have to search one silo.
Same here. I'm fighting what is likely a futile battle at work to get people to properly organise files in a way that allows anyone (not just the person that created them) to find things. Search works up to a point, but often generates so many results you end up having to sift through them endlessly to find what you're looking for.

It's not just having a logical structure that matters, but I can't seem to get through to people the importance of using naming conventions for files that make sense. Any files I create are identifiable by their filename even outside their structure, but most people just don't care. I guess it's because I started off using hard copy files that had no way of being found unless they were properly organised and most of the people I work with have never used paper documents at all.

I don't think the desktop is the problem, although I do miss sharing desktop screen grabs with people. I do get frustrated with using phone and tablet devices that only allow full-screen apps, but that seems to be what most people use for most tasks these days, so maybe it's just me. Most of the time, I'm using a device with two 32" monitors, plus the laptop monitor and my happy place is being able to see all the things I'm currently working on at once, as well as monitoring things like email and Teams (which I hate with the heat of a white-hot sun). Maybe the future will be virtual screens and involve lots of hand-waving and gesturing, but I hope I'm dead and buried by then.
posted by dg at 5:30 PM on January 28 [9 favorites]


*drags a tattered arm up from a long trail of broken systems* taxonomy is unsolvable
posted by lucidium at 5:37 PM on January 28 [10 favorites]


I do most of my non-work, non-mobile computing on a 13-inch MacBook Air. I got my first Air after using an iPad for a while, and I've used my Mac a lot like I use my iPad...apps mostly almost full-screen, cmd-tabbing back and forth. I literally never see my desktop; if I can't open a program from Alfred (or Spotlight), it had better be on my dock.

But I don't understand how the young'uns don't understand files and folders. One of my biggest issues with iOS was the lack of a way of getting to the file structure, and the Files app, as pathetic as it is, is a godsend on iOS/iPadOS. I work with files all the time; how am I supposed to live without them? What is the alternative, if I have to work on a spreadsheet or a Pages document, and need to convert and send it to a Word user? Is the share sheet adequate for that? I know for a long time I ran into dead ends trying to make use of it.

I really do like the idea of tiled windows managers, but on a 13-inch screen, I'd be looking at postage stamp-sized tiles a lot of the time, and my eyes are getting too old for that.
posted by lhauser at 6:09 PM on January 28 [3 favorites]


If our computing environments (iOS/Android) still use metaphors, the predominant one is…. the stream?
posted by brendano at 6:19 PM on January 28 [1 favorite]


Linux.

Most people just want to open up their applications and have them work, but if it makes him happy to fiddle and tweak, that's fine.


This isn't true anymore. I've installed stock Fedora and Pop workstations and fiddled not a whit. It makes me happy to not have ads and telemetry woven into my workday though, and I've fiddled more with my Windows machines to disable the things I don't want.

I don't think my Linux desktops will even let me put things on them. I suppose I still keep a few clutter spots going on elsewhere because it helps me remember things.
posted by telepsism at 6:37 PM on January 28 [5 favorites]


Tile, Baby, Tile

🎵 Linux Inferno 🎶
posted by Greg_Ace at 6:41 PM on January 28 [5 favorites]


There are a lot of valid critiques of Season 6 of Community, but then there's this episode where Jim Rash made me laugh so hard I fell off my damn chair.
"The problem is in order to copy a file, you have to throw a fireball at it, then absorb the fire, then drop the flaming file into a crystal lake, then take out both copies and throw them into the side of a mountain."
posted by The Pluto Gangsta at 7:04 PM on January 28 [5 favorites]


You can always tell a Windows user on a Mac: they almost always maximize windows so they're only interacting with a single app at a time. And there's nothing wrong with that! It's just a different way of working with a GUI and it's what makes them comfortable.

Despite its name, Windows is not the least bit window-centric; it's app-centric, the window-managing side of it always nudging you to go full-screen. Again, nothing wrong with it. Not the way I like to work, but that's because I've used macOS flavors my entire computing life, from System software to the modern OS. And macOS is most definitely window-centric. Apple's GUI paradigm is structured such that it encourages you to use several apps simultaneously in a way I love because I've been using it for decades, but most Windows users won't because their OS of choice approaches application window management differently.

What I took from the article is that he found a window manager that melds the two approaches together.  Doesn't seem particularly revolutionary, but maybe I'm missing something.
posted by los pantalones del muerte at 7:11 PM on January 28 [3 favorites]


The one thing that kills me about windows is how one can steal focus from another.

There are X-style focus-follows-mouse programs for Windows! I use this one and have been happy with it.

grumble grumble xmouse used to be available thru powertoys but microsoft stupid grumble
posted by GCU Sweet and Full of Grace at 7:16 PM on January 28 [2 favorites]


I grew up a Mac user, and now I’m on Windows computers (except for my phone). But did growing up Mac influence my current style of using windows that are 70-90% of my screen size?

Part of my resistance to full screen applications likely has to do with having 3 monitors and one of them is pretty big. A full-screen application (other than a video game) would feel smothering and overwhelming. Also, I’m prone to having multiple cascading windows on a single screen at a time where I can easily click on the layer I want and bring it to the front. Full screen applications (browser, text editor, even photo or video editing) are unsettling to me the same way a closed door to an empty bathroom is.

Anyway. Windows’ tile-assist features are sometimes very useful and sometimes get in my way.

And my desktop picture (which spans all 3 of my monitors) is a photo of a desert sunset that I took.
posted by itesser at 7:28 PM on January 28 [2 favorites]


I suspect “desktop” in this article means the whole GUI representation, not the image you see when no programs except the GUI are running. “Desktop” as opposed to “terminal” historically, plus also this is someone who runs servers (that might not have GUIs at all).

Which makes a little more sense of what’s new. Still I didn’t think the description of the tiling manager had much to do with hierarchical files-and-folders as computers use them. Only a little to do with skeuomorphy, even! I’ve never seen the progressive-filter-as-you-type presented as a knitting-needle sort or anything else physical.

OTOH, not the first person I’ve heard joyful about a fancy tiling manager; and whenever I can work with keyboard only I am much faster and my shoulders hurt less. So I should see which would be easiest to install on my friendly Pop! Ubuntu.
posted by clew at 7:56 PM on January 28 [1 favorite]


You can always tell a Windows user on a Mac: they almost always maximize windows so they're only interacting with a single app at a time. And there's nothing wrong with that! It's just a different way of working with a GUI and it's what makes them comfortable.

I’m a Mac guy and these days I use a lot of full screen apps, because using trackpad gestures to switch workspaces works really nicely.
posted by atoxyl at 8:23 PM on January 28 [1 favorite]


Consider the absurdity:

Okay.

Why do we still “drag” files?

Because it turns out that quite a lot of the operations that it is useful to be able to perform on chunks of data - replicating them, backing them up, associating them with other chunks, versioning them - remain useful regardless of what their internal format is or what they represent, and it's often quicker and easier to get that work done using a point-and-grunt graphical interface than manipulating them by name with something more literate.

Why do we need a “desktop” at all?

Many of us don't. Others, though, find both the metaphor and its associated cached view of current work to be very useful.

Our digital spaces are hyper-connected, non-linear, infinitely malleable

Yes, we're all fully buzzword compliant now.

yet we continue to constrain them within the rigid, physical-world boundaries established four decades ago.

Some of us would rather know where our stuff is kept than entrust its safety to some your-call-is-important-to-us corporation. I guess you could call that preference a "constraint" if it makes you happy. I call it not being a rube.

Infinite malleability should mean that I get to work within a digital environment that makes me comfortable, and you get to work within one that makes you comfortable, and yet mutual intelligibility remains readily achievable when required. My preferred digital environment is not objectively better than yours but yours isn't objectively better than mine either, and I will resent the hell out of anybody who insists that I must break my workflow just to fall in line with whatever their focus group tells them will appeal to more rubes.

If there's any actual "constraining" going on in IT, it has far less to do with the ongoing availability of older user interface patterns than with the relentless drive toward replacing all of them with fucking touch screens. God, how I loathe fucking touch screens. And if there's one emerging trend I loathe and resent even more than fucking touch screens on fucking everything, it's fucking voice controls in fucking everything. Fuck the fuck off with that fucking shit. I like my nice precise non-view-occluding mouse cursor for point-and-grunt, and I like my nice unambiguous physical buttons for control-by-feel, and I like going to my computer when I want to compute instead of being followed about by fucking surveillance capitalism every fucking minute of every fucking day. Fuck digital ubiquity!

Ahem. Sorry. Mind went all infinitely malleable and lost composure for a moment.

Our computers are capable of generating entire worlds, processing billions of calculations per second, constantly connected to global networks

Indeed they are, even if the worlds they generate are weak sauce compared to the one they're all embedded within.

and yet we interact with them as if they were elaborate digital paper-pushing machines.

Turns out that quite a lot of what it is useful to accomplish with these elaborate machines does boil down to pushing metaphorical digital paper, yes. It's mundane, sure, and most of us will doubtless find that mundanity somewhat tedious. But that isn't helped by a relentless treadmill of mandatory "upgrades" that's even more tedious, especially given how often it happens that the New Thing becomes the Only Thing for no reason more compelling than digital designers' desire to follow fashion.

Sometimes it is also useful to float off on a cloud of vague abstraction, forgetting for a little while that in fact I live in this world with all its annoying little details that somehow keep on refusing not to matter.

There are various techniques for achieving that state. Some of them involve ongoing access to a constructed digital world that itself bristles with annoying little details that somehow keep on refusing not to matter. My favourite, though, involves visiting my local river swimming hole at midnight, floating naked on my back, allowing all my limbs and my head to arrange themselves as they please, and restricting my intentionality to the minimal breath control required to ensure that I don't accidentally sink while gazing outward into the blazing magnificence that is a Southern Hemisphere starscape.

Oddly enough, having a tiling window manager available doesn't improve that experience.
posted by flabdablet at 8:26 PM on January 28 [16 favorites]


Okay I read TFA and I don't get what they're on about. Tree shaped file structures underpin everything they're talking about. They can be flattened or stepped through one branch at a time. You can arrange only your favorite leaves in front of you if you like. Dragging things around with a mouse is a fine way to organize them, as is putting on goggles and flapping your arms around spatial-style. I guess the things the author is discovering as fresh and new seem to me like just another bug's eye view of the trees. I mean - they are big trees.

My first eureka moment like that was ages ago. I'd been learning to compute on Mac OS 9 and saw someone using the Win95 File Explorer just opening up entire branches of files and moving things around. All of the disjointed stacks of folders I'd been tunneling through on the Mac suddenly made sense viewed as parts of a tree. I liked the zoomed out view of the structure.

Also the JavaScript in that screenshot is triggering me. There's snake case, camel case, Pascal case, and whatever it is when you use all lowercase with no word delimiters. And mixing 'var' and 'let' like that...
posted by telepsism at 8:35 PM on January 28 [7 favorites]


That headline made me think they were going to recommend Lotus Notes and it felt like a disturbance in the Force.
posted by Sphinx at 9:33 PM on January 28 [6 favorites]


God, how I loathe fucking touch screens. And if there's one emerging trend I loathe and resent even more than fucking touch screens on fucking everything, it's fucking voice controls in fucking everything.
Preach on brother, I'm right there with you. Not just because I hate finger-smears all over everything, but it's just so fucking inefficient in most cases. On phones, sure, but there's no way I want touch controls on anything bigger.

Oddly enough, having a tiling window manager available doesn't improve that experience.
Definitely an experience best enjoyed in full-screen and even more so not viewed through Windows ;-)
posted by dg at 9:40 PM on January 28 [5 favorites]


search vs structure seem to be the metaphor- and all search is half assed bullshit. I don't need guesses, I need answers. or, give me a sql-like query language for search - this and only this, not that, plus only after last wednesday, plus includes the strict case sensitive term "bollocks".

otherwise, let my brain do the work, and get outta the fucken way.
posted by j_curiouser at 9:45 PM on January 28 [6 favorites]
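(The query language j_curiouser is asking for mostly already exists anywhere you can point SQL at an index of your files. A toy sketch, with the schema, data, and cutoff date all invented for illustration; SQLite's instr() compares bytes and so is case-sensitive, unlike LIKE, which covers the strict "bollocks" requirement.)

```python
import sqlite3

# Invented local index of documents, purely for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE docs (name TEXT, body TEXT, modified TEXT)")
db.executemany("INSERT INTO docs VALUES (?, ?, ?)", [
    ("notes.txt", "total Bollocks", "2025-01-10"),
    ("rant.txt", "strict bollocks, this and only this", "2025-01-25"),
    ("old.txt", "bollocks but that", "2024-06-01"),
])

# "this and only this, not that, only after last wednesday,
#  case-sensitive 'bollocks'" -- spelled out as one explicit query.
# instr() is case-sensitive, so "Bollocks" does not match.
hits = db.execute("""
    SELECT name FROM docs
    WHERE instr(body, 'bollocks') > 0
      AND instr(body, 'that') = 0
      AND modified > '2025-01-22'
""").fetchall()
print(hits)  # -> [('rant.txt',)]
```

No guessing, no relevance ranking: every clause is an answer to a question you actually asked.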


Computers are already full of distractions; I can't think of anything more hostile to concentrated work than having several programmes in little tiles, all competing for my attention. That would clearly be a huge leap backwards, at least for me.
posted by Termite at 9:49 PM on January 28 [3 favorites]


I was real keen to see what the cool alternate interface was! I was so ready, friends!

And then he just....that's what my workspace looks like a lot of the time. I just manually tile the windows myself, and frankly that's a good way to do it because then I'm the boss of it. I can put each tile where I want in the frame and size it as needed. I can use keyboard shortcuts to get around and often do. It's something I've been doing as a windows user since 3.11.

I can't express how disappointing that was. I would love a paradigm busting re-interpretation but friend, the tactic I've been using since 1994 is not it.

God, how I loathe fucking touch screens. And if there's one emerging trend I loathe and resent even more than fucking touch screens on fucking everything, it's fucking voice controls in fucking everything.

Flabdablet for Prime Minister. I've been trying to get a small lightweight laptop in the vein of my much beloved Acer Atom since it died on me in 2023, but they're all so fucking heavy thanks to touchscreens. I don't want to touch my screen. That's the opposite of useful. I just want to go typie typie on a real keyboard attached to the heaviest part of the machine so it's all balanced like. I don't want a tablet that needs a fucking kickstand and a separate keyboard of any type, two bits to lose and to keep charged and to find in my bag. Just a little machine that does what it is supposed to do. RIP my little Atom, I loved you so much.

Also voice controls are a fucking joke if you don't speak in a specific American accent, my housemate has his phone voice controls switched on all the time and it just randomly starts babbling definitions at you more or less at random, because something in the Australian vernacular inspires it to search for the most unpredictable of phrases. I can't imagine trying to drive a computer with those and it's only touch device tomfoolery that makes it even viable in the first place.
posted by Jilder at 10:19 PM on January 28 [3 favorites]


Enter tiling window managers. Think of it as the anti-desktop. Instead of dragging windows around and stacking them like a digital version of your messy desk, a tiling manager automatically arranges everything into neat, non-overlapping tiles. Open a new app? Boom—it snaps into place, perfectly aligned. No clutter, no wasted space, no hunting for that one lost window buried under 15 others.
If that works for you, I'm genuinely happy for you.

Tiling window managers do not work for me. I don't like having the machine "helpfully" decide for me how much of my screen it ought to dedicate to each task; it gets that wrong far more often than not, and I don't enjoy needing to translate my preferences into explicit policy just to improve that balance. Managing my windows by hand is one of those things that costs me so little effort as to be not worth automating.

Clutter doesn't bother me, and I run a panel that has a menu button bottom left, some quick-launch buttons for applications I frequently want to open, and separate window buttons grouped per open application for application windows I already have open. I can bury a window under 15 others if I want to with no risk of losing it. I also use a window manager that very easily lets me designate selected windows as always on top.

This is a UI pattern that's only very slightly improved over what Microsoft designed for Windows 95. I liked it better than the Mac UI in 1996, and I still like it better now.
this is the first time in twenty years that using a computer has felt exciting and fresh again.
I would not be at all surprised to find that in a year or so this author comes to see this feeling as having a lot more to do with the discovery that there's a whole world of freedom outside the proprietary walled gardens than the specifics of any given window manager.
Also, some things needed serious tinkering before they worked as I liked it. But that’s the beautiful thing about it. With enough time and dedication, you can basically get anything done, where with Windows and MacOS you will eventually run into some hard limits of the platform.
I've said it many times before, but I'll say it again: the problems people encounter with free software usually boil down to trying to work out how to make it do some specific thing that you want it to. Problems with proprietary software in general and Microsoft stuff in particular usually boil down to trying to work out how to stop it from doing endless amounts of things you'd rather it didn't.

Not knowing how to make your software do what you want is frustrating, and this is the disincentive to switching to non-proprietary alternatives for most people. What those people miss, though, is that frustration is just inherent in working with any software. It all sucks! So really, any fair comparison between the free and proprietary software experiences ought to account for the quality of the required problem-solving involved in both.

Figuring out how to make a system do some particular thing usually involves getting more familiar with one or more of its underlying design principles. This is a learning process, and undertaking it leaves the learner more skilled, more flexible and more creative. It also happens according to the desires and priorities of the learner. So although the process itself can be annoying and almost always involves more effort than anticipated, the outcome almost always feels unambiguously good.

Figuring out how to make a system not do some particular thing doesn't usually offer much in the way of new insight; it's almost always about finding where the designer has buried the controls for the new source of aggravation in order to turn it off. Getting it done also doesn't leave the user with much in the way of new knowledge or new capabilities, and is usually made necessary by some automatic "upgrade" whose arrival has nothing to do with the user's own desires and priorities. The aggregate result is a constant feeling of existing under low-level siege.

The flip side of that is that it's the same low-level siege that most people exist under, which makes it harder to see as something potentially avoidable and easier just to accept as a cost of doing business. Which is why 2025, like every year before it, will not be the Year of Linux on the Desktop.

This makes me sad when I think about it, because it's reflective of so many of the larger patterns that allow surveillance capitalism to keep on digesting the world and turning everything to shit.

Night swimming helps keep me workably sane in the face of all that. Making a regular practice of literally not drowning in a literal river of relatively clean water is a good counter to feeling myself metaphorically drowning in the Empire's metaphorical flood of shit.

let my brain do the work, and get outta the fucken way

This. I would much rather use what little brain I have left than choose to let it rot inside a Kessler Syndrome of spurious conveniences.
posted by flabdablet at 10:22 PM on January 28 [8 favorites]


give me a sql-like query language for search - this and only this, not that, plus only after last wednesday, plus includes the strict case sensitive term "bollocks".

Sounds like find
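A rough sketch of j_curiouser's query in find plus grep (GNU tools assumed; the directory, file names, and cutoff date here are all hypothetical placeholders):

```shell
# Set up some hypothetical files to search through.
mkdir -p /tmp/search-demo && cd /tmp/search-demo
printf 'utter bollocks\n' > recent.txt
printf 'utter Bollocks\n' > wrong-case.txt   # wrong case, should not match
printf 'utter bollocks\n' > old.txt
touch -d '2025-01-15' old.txt                # modified before the cutoff

# Files modified after the cutoff that contain the exact,
# case-sensitive term "bollocks" (grep is case-sensitive by default).
find . -type f -newermt '2025-01-22' -exec grep -l 'bollocks' {} +
# → ./recent.txt
```

Add `-not -name` patterns for the "not that" clause; the point is that the pieces compose into exactly the sort of precise query being asked for.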

Infinite malleability should mean that I get to work within a digital environment that makes me comfortable, and you get to work within one that makes you comfortable, and yet mutual intelligibility remains readily achievable when required.

Yeah, I think this - choice and customizability - is the important thing, more than the article's kind of confused conflation of lots of stuff under the title "desktop".

To me the big frustrating thing about the user interfaces on Windows, MacOS, and iOS is that you only get one UI option. Windows looks like this, Macs look like that, iPhones look like the other. You can customize a little, but only a little. It's all centrally controlled too - if Windows or Apple change their UI decisions, your UI has to change when you update, whether or not you want it to change (and often, whether or not you even want to update).

But on linux and other unix-y OSes, (including, to a certain extent, Android) you can choose any kind of UI you like. UIs and operating systems aren't tightly bound together. If you feel like it you can literally install a Windows-like UI alongside a Mac-like UI alongside the kind of tiling UI the article describes alongside some 3D UI where every window is actually a flying penguin living on a rotating 3D ice floe and you maximize them by feeding them fish.*

* the penguin one probably doesn't quite exist yet, although the 3D stuff does. But there's exactly nothing stopping anyone from coding that up and letting others install it.

You can switch between different UIs whenever you feel like it, like at the click of a button or the press of a key if that's how you want to set things up. You can even decide not to use a graphical interface at all and just live your life in a commandline terminal containing other commandline terminals, if that's what you really want.

If you have things set up in some way you like, you can often keep it that way for as long as you want. I have a setup I've been using for over 2 decades, and any changes in it are ones I decided I wanted, not ones that were forced by some update. I don't have to get used to other people's random design decisions (well, except for all the UI changes inside specific apps - hi, Firefox - but that's a different story).

And many (though not all) of the UIs are way, way more customizable than any UI I've seen outside of the unix world - in terms of appearance, in terms of keyboard or mouse or other shortcuts, in terms of (the equivalent of) macros, even in terms of logical setups.

For example, I use a UI where I can set up any number of simultaneous desktops that I can switch between in whatever way I define. (So if I were a more organized person I could set up all my work stuff to open in one desktop, all my entertainment media stuff to open in another, all my random browsing to open in a third, and so on, and then switch neatly between them like an organized person. In actuality I'm very messy so I just spread things out as I go, like I do IRL if I have multiple tables.) If I wanted to I could make those desktops bigger than my actual screen, or fit multiple desktops onto the screen at once, or whatever. I don't personally see the point of that, but apparently some people find it useful and it's great that they can if they want! It's not a full-on tiling UI but I was able to snap and tile literal decades before Windows and Mac users got that ability.

If I want icons on the "desktop", I can have them, but I don't have to; there are a bunch of alternatives to icons. If I do use icons I can make them live in specific places on the screen so I don't have to look for them, just use muscle memory. I can make apps open up in specific default locations or window sizes. I can set up any shortcuts I want, for pretty much anything I can imagine. I can do every non-drawing thing with just the keyboard, or every non-typing thing with just the mouse. I can set up context menus to contain anything I find useful and not contain anything I don't. I think I can even make them dynamic, and maybe one day I'll play around with that.

I don't have to do any tweaking of anything, but I can if I want to, so every once in a while when I get the urge I take the time to make something a little bit nicer, and little by little over the years I wound up with something I really like, that's set up for my needs and my preferences and makes me happy when I use it. With zero regard to how the heirs of Steve Jobs or Bill Gates or anybody else likes things. That feels nice!


Computers are already full of distractions; I can't think of anything more hostile to concentrated work than having several programmes in little tiles, all competing for my attention. That would clearly be a huge leap backwards, at least for me.

I wish the article had gone into at least some of the alternatives available besides tiling window managers, but as far as tiling ones go you can usually decide how you want things to behave, so if what you want is one single giant window at a time, or two tiles side by side and no more, or whatever, you can make that happen. And you can maximize or minimize anything any time, generally using whatever shortcuts you want.

But tiling UIs are just one kind of UI and there are plenty of others and you can choose whatever makes you happy, is the point.
posted by trig at 10:36 PM on January 28 [3 favorites]


I have a setup I've been using for over 2 decades, and any changes in it are ones I decided I wanted, not ones that were forced by some update.

It's exactly that kind of stability that I personally value more than the customizability that let me build the setup in the first place. I would not actually mind having had to adapt to some proprietary designer's vision of the "right" way to do a UI if they would then just leave it the fuck alone.

Muscle memory is expensive to acquire and I get super pissed off when required to abandon it, especially when that requirement is driven by nothing but fashion and fad.

I don't have to do any tweaking of anything, but I can if I want to, so every once in a while when I get the urge I take the time to make something a little bit nicer, and little by little over the years I wound up with something I really like, that's set up for my needs and my preferences and makes me happy when I use it. With zero regard to how the heirs of Steve Jobs or Bill Gates or anybody else likes things. That feels nice!

Sure does. And I look at all the people around me who appear to be wilfully depriving themselves of the opportunity to experience that feeling for themselves, and some days it takes a lot of effort to remind myself that sometimes it's not about the nail.
posted by flabdablet at 10:52 PM on January 28 [5 favorites]


"Also, some things needed serious tinkering before they worked as I liked it."

Nothing like the luxury of free time to get yourself used to environments whose setups are often poorly documented and not terribly intuitive. Linux is among the worst offenders in this regard, although it is my OS of choice mostly because I don't have to pay for it.

I'd like to get away from mousing, too, but I'm nearly 60 and I simply don't have much time or energy to devote to grooming my computer's user interface.
posted by rabia.elizabeth at 11:37 PM on January 28 [2 favorites]


I sometimes - very rarely - need to compare two versions of a Word document, or something similar. In that case, having two equally sized windows next to each other is very useful. But having all my open programmes in little tiles in front of me ... that would be just as helpful as if the majority of those tiles were filled with ads. What the author of the article describes is actually worse than what I currently have.

It's not about the nail
- thank you for this, flabdablet - that was great!
posted by Termite at 11:42 PM on January 28 [2 favorites]


Most visual interfaces are built as the “stupid person” interface to the “real computer” of a text interface. Never mind that text is just as abstract as any other human-readable representation of computing— computer culture treats text as primary and essential, and visualization as a limited abstraction.

That's not true at all. The vast majority of interactive computing today takes place in visual interfaces. This includes applications for working professionals: Adobe products, Blender, CAD software, etc.

And to the extent that textual interfaces are prioritized over GUIs, this isn't some kind of hacker elitism.

Text is composable. You can give a program textual input, get textual output, and then feed that output into another program.

For many reasons, GUIs do not generally compose this way. Admittedly, we have a little more GUI composability than we used to---things like the clipboard, drag-and-drop, "standard applications" for filetypes, etc. But this is still quite limited. And there's no obvious way to get around this. Every GUI has its own bespoke data model that it displays in its own bespoke way, and graphical applications are mostly created by different, non-cooperating teams. How are you supposed to move data between these applications in any meaningful way?

In contrast, if a program takes text as input and spits out text as output, the program itself doesn't have to know about anyone else's data model. You, the user, can simply arrange your data into whatever format the program expects and feed it in, no problem.

Also, GUI applications are difficult to use in an automated way unless they deliberately provide facilities to do so. This is less of a hard problem than the above one, but it is still a pain point.
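The pipe-style composability described above can be sketched in a couple of lines; the data and field layout here are invented for illustration:

```shell
# Three independent programs, none aware of the others' internals:
# printf emits lines of text, awk aggregates them, sort orders the result.
# Each one only agrees on the loose convention "whitespace-separated fields".
printf 'alice 3\nbob 1\nalice 2\n' \
  | awk '{sum[$1] += $2} END {for (k in sum) print k, sum[k]}' \
  | sort
# → alice 5
#   bob 1
```

Swapping `sort` for `sort -k2 -n -r` reorders by count instead of name, with no changes to the upstream programs; that drop-in substitutability is the property GUIs struggle to offer.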
posted by heraplem at 12:25 AM on January 29 [3 favorites]


I should also say that text is universal. Text is trivial to manipulate (shut up I don't want to talk about Unicode), it requires essentially no hardware or OS support, and it works basically the same everywhere. GUIs? You've got X11, GTK, and Qt, but those all kind of suck to write for. The closest thing to a universal GUI framework that's decent to program for is . . . the Web browser.
posted by heraplem at 12:32 AM on January 29 [2 favorites]


Text is trivial to manipulate (shut up I don't want to talk about Unicode), it requires essentially no hardware or OS support, and it works basically the same everywhere.

For sufficiently restricted values of "everywhere", sure.
posted by flabdablet at 12:57 AM on January 29 [3 favorites]


I just came here to laugh at the phrase "desktop wallpaper". Ah yes, just like in those 1950s office dramas, where everyone is gluing wallpaper onto the surface of their big desks!
posted by rum-soaked space hobo at 2:26 AM on January 29 [2 favorites]


Best part is these types always want to forbid things that actually work instead of improving them. I want a gui for every single thing that works on a computer - that doesn't mean it needs to be stupid.

It's a failure of computer touchers to imagine anything other than the command line, which is pretty insane. Like somebody said, the article is full of shit.

(also, look at how this kind of user interface took over the world - all things are designed in openscad instead of nx, solidworks and the others. it's because of the superior interface, no mouse, no gui. jesus)
posted by mayoarchitect at 2:44 AM on January 29 [3 favorites]


Yes, the article seems like a lot of overblown hyperbole just to say "I like my windows arranged in a neat grid". I didn't see anything about alternative metaphors or infinitely connected malleable spaces, which in any case sounds like a nightmarish LSD trip.

I'm quite happy to view one window at a time, thanks, as my monitor is not very high-res, and I'm not a software developer, nor a movie hacker who needs to have twenty windows of scrolling text visible at once. I agree that using a mouse is a time-waster though, and switching windows with the arrow keys does seem like it would be fractionally faster and easier than Alt-Tabbing, but I'm not sure that justifies this breathless article about paradigmal inversions and extraboxular thinking.
posted by mokey at 3:55 AM on January 29 [4 favorites]


I run xmonad on arch linux. I'm also a vegetarian cyclist. Yes I have a mechanical keyboard, why do you ask?

Xmonad - check.
Mechanical Kbd - check (since my XT-286; currently using a couple of Unicomp Classics).

Not a vegetarian nor a cyclist.
posted by mikelieman at 5:47 AM on January 29 [1 favorite]


Best part is these types always want to forbid things that actually work

Which types? This seems like a lot of projection about some guy who's definitely not just using a terminal with the screen command.

It's a failure of computer touchers to imagine anything other than the command line, which is pretty insane.

To me that actually seems like the opposite of the trend for the last million years. Desktop Linux users by and large tend to use heavy "desktop environments" like KDE and things derived from GNOME. Desktop environments basically encourage users to do everything through graphical interfaces instead of text files and commands. Which is fine! (Although it makes it harder if you actually prefer using the commandline, because when you Google for instructions on something half the time you see answers like "just check the whatever box in the settings dialogue you get to from the whatever menu", or "just run the whatever plugin".)

Sometimes GUIs feel like the most natural and easy way to do something. Sometimes just telling your computer straight up what you want to do, with words, only writing them instead of having to speak them out loud, feels easier, more natural, and more direct. Sometimes one is more powerful than the other. They're both important. It's good to have options.


Anyway, that article didn't even say much about the UI he's using. I took a look at its documentation out of curiosity, and just at first glance it looks like it has a bunch of features I don't care about (animations, eye candy) but also some things that seem nice (like, if I understood correctly, the ability to set universal shortcuts that can work in some specific app even if you're not currently in it, so that you don't have to switch away from app A to do something in app B). Again, not stuff that anybody actually has to learn or customize in order to have a perfectly nice working UI, but things that you can change if you want to or think it'll make your life better.


"Also, some things needed serious tinkering before they worked as I liked it."

I think the key words there are "as I liked it", not "before they worked".

Nothing like the luxury of free time to get yourself used to environments whose setups are often poorly documented and not terribly intuitive.

I haven't found any UIs that are especially intuitive (try teaching any unfamiliar adult using Windows or Macs or iPhones or Android for the first time - even if they're used to one system or the other, they keep getting stymied by how to do specific things on the new system). And none of the UIs for those systems is well documented. But if you're using them you don't have any choice but to get used to them (except, to some extent, on Android where you can run different "launchers"). If you're using something like Linux then (a) you can choose what you actually want to get used to, and (b) if you actually want to you can read some documentation (or get someone to help you, or just copy someone else's setup) to have your system adapt to you more than you adapt to it.

And again, if you feel like doing that it's on your own schedule - you don't have to keep adapting to things that somebody else decided to change. I keep thinking maybe I should try out a tiling window manager sometime, maybe I would like it, but if I ever do it it'll be when I have the time and headspace to explore and get myself used to a new way of doing things. Until I have that leisure and am in that headspace, I don't have to learn or get used to anything.
posted by trig at 5:58 AM on January 29 [2 favorites]


I had a 25 year career in advertising as an art director, so though i was weaned on a Commodore 64, I have MacOS hard wired into my brain. When I semi-switched careers, I was forced into using a PC with Windows in an agency that has extremely tight IT security (we very, very, very indirectly might have access to medical files, though I would never need to see one nor have I ever come close to seeing one). So I'm in a very rigidly locked-down Windows world at work. Being in Marketing & Communication, I am one of the very few people who has access to say, YouTube, or Facebook, etc. (I run social media among other things)

My work laptop has a touch screen that I loathe. I always forget that it is a touch screen until I bump it or something and then end up doing something I never intended to do. And I almost never see my desktop on my home MacBook Pro or my Lenovo work laptop.

The desktop on each machine is just a Limbo where I very occasionally save something that I can't think of what to do with. Oddball or funny things that I don't really intend on keeping. I rarely see my desktops except briefly at startup. I still float my windows around for different applications because I'm 54 years old and I don't want to switch using the dock (or whatever they call it in Windows) and my Windows machine constantly "helpfully" tries to get me to lock my windows into some rigid configuration.

Systems like this are generational I guess. I really don't want to be assed to learn a new way of doing my regular work (learning Windows quick-keys was painful enough thank you). I see the young kids of my nephews and nieces who are given tablets in the cradle, and I'm not sure that many of them have even touched a mouse or keyboard. Not sure how that's going to work out, but change is the only thing that is constant.
posted by SoberHighland at 6:29 AM on January 29 [1 favorite]


I'm a big command-line nerd who personally finds tiling WM stuff more comfortable, and I agree with the sentiment that mobile phones captured the Alternative To The Desktop Metaphor for everyday people. The desktop files/folders thing was always limited to folks who'd had office jobs and understood why filing existed in the first place. Showing what a shell-brained nerd like me chooses as the alternative isn't a helpful conclusion.

I'm in a graduate degree programme right now where I'm a little over twice the age of everyone around me. Even stuff like google docs has made a pretty savvy set of people see no benefit for making hierarchy an intrinsic property of their documents. Instead, they tend to slurp their corpus into some kind of mind-map software and navigate it like a Wikipedia binge, or just search for stuff. We're all doing our readings of journal articles in PDF and epub before seminars, slurping sources out of JSTOR, managing bibliographies in zotero (well, I'm a total nerd, so I'm using bibTeX, naturally), and turning in digital documents that get annotated and marked digitally. At no point is anyone putting ink on paper anywhere in this process, and most people are getting it all done without having to know what a "folder" is.
posted by rum-soaked space hobo at 7:08 AM on January 29 [1 favorite]


My work laptop has a touch screen that I loathe. I always forget that it is a touch screen until I bump it or something and then end up doing something I never intended to do.

My first experience of that specific kind of aggravation happened at a customer's house while working as a PC fixit guy. The machine with the fault was an Acer all-in-one, essentially a laptop motherboard built into the back of a 22" flat screen monitor. I had just changed some Windows setting or other in the course of trying to track down the fault, with the immediate result that the mouse cursor began to wander about on the screen all by itself.

So I reverted the change, but the cursor continued to wander about on its own. I put a sheet of paper under the mouse; it kept happening. Unplugged the mouse altogether and it still kept happening.

Turns out, it was a bug. Literally an insect, some kind of small beetle. Little bastard was hiding in plain sight right on top of the pointer arrow, and as it walked its little feet across what I had until then not realized was a touch screen, it dragged the pointer along with it.

Why the fuck anybody designs a touch screen into a machine you'd have to stretch your arm all the way across your desk to poke and prod at, I will never understand. The entire point of inventing the mouse was to obviate the need for that kind of strain after it had proved problematic for light pen users.

my Windows machine constantly "helpfully" tries to get me to lock my windows into some rigid configuration

Holy fuck do I hate that behaviour, which was among those at top of mind when I made my remarks above about problem-solving on proprietary OS offerings being mostly about disabling unwanted "helpful" crap. And of course there is a hidden control that turns this aggravation off which your IT department has hopefully not "helpfully" locked you out of using.
posted by flabdablet at 7:20 AM on January 29 [7 favorites]


Metaphors, shmetafors, who cares whether there are virtual desktops, bulletin boards, file cabinets and rolodexes in there somewhere. The key is to develop your own mental map of things, and to keep the organization of your stuff as simple as possible. As a geezer on a Mac, here's my method, for what it is worth:

1. I don't use icons, I use list view, almost always. The exception sometimes is when searching through some pictures. Also I don't drag and drop, unless forced to, because it's much less intuitive than just clicking on what you want to open or insert.

2. I don't create complex folder structures, either in email or documents/files. People who do that are the ones who write to ask you to resend stuff they know you sent, but they can't find it. I have had TWO principal folders used for everything I've created and saved since 2008. One is personal, one is business. No subfolders. (Plus a downloads folder with everything downloaded since about 2018.) In the personal and business folders, the key is in consistently naming files, and using the list view to find them. So for example, for an organization called, say, LMN, I might have a file called "LMN budget 25." If I need it, I go to finder, open the big Business folder, scroll to LMN, and there it is right below LMN budget 24 and LMN budget 23, etc. OR, in "last opened" view, it might be near the top. Easy peasy, no need for an LMN subfolder, and a budget subfolder (or 2025 subfolder) under that. Worst case, a simple search of Business (or the whole Mac) locates the file.

3. For screen/desktop management, I use hot corners: top right displays all open files in the current application; top left displays the desktop so I can admire that picture of my grandchildren and retrieve one of the odds and ends that land there, such as screenshots because I've been too lazy to make them go somewhere else by default; bottom left opens all applications and open items if I need to go back to something I know is there (OR I can use the application icons bar to get there). Bottom right, the screen goes dark, if I've had enough of it.

4. That applications icons bar, which on Mac lives at the bottom of the screen by default — I moved it to the right side of the screen (you can put it on the left also), AND I have it hide by default unless I move my cursor over there. This opens up quite a bit of real estate: if it is always in view, left, right or bottom, my 13-inch screen would effectively be a 12.5-inch screen. (I recommend exploring your System Settings to find out how to do this and to find other tricks you might like. I'm always amazed how few people actually go there to customize their machine.)

5. In my browser (Chrome) I always have THREE tabs open to Gmail, so I can be working on up to three things at once without losing my place. And as noted above, I never use folders or labels, just keep everything in the main Inbox, and never delete anything. A simple search, or if necessary, advanced search, quickly brings up what I need 99.99% of the time. Along with these three I have pinned tabs with Calendar, Contacts, Google Drive, and a few other things like Metafilter.

That's it, it works for me, YMMV.
posted by beagle at 7:21 AM on January 29 [2 favorites]


Heraplem, read about Max. It has existed since 1985 and shows how a windowed, visual system can represent composability and data flow. And it is largely used for waveform manipulation, which is not representable by text.
posted by Headfullofair at 7:54 AM on January 29


> The desktop on each machine is just a Limbo where I very occasionally save something that I can't think of what to do with.

> top left displays the desktop so I can admire that picture of my grandchildren and retrieve one of the odds and ends that land there

This is why I hate the desktop folder as a place for files—it's often the equivalent of a real-life "Miscellaneous" folder and about as useful. I use Macs and Windows daily, and every computer I have control over loses the ability to display items on the desktop. Let me see the pretty landscapes in my wallpaper folder instead.
posted by Wilbefort at 8:06 AM on January 29 [2 favorites]


2. I don't create complex folder structures, either in email or documents/files. People who do that are the ones who write to ask you to resend stuff they know you sent, but they can't find it.

It's the opposite, in my experience. People without any organizing framework might be all over their current work, but let a project get done and slide into the past, and they suddenly need something specific out of that project...could get ugly.

Now it seems to me that by using list view, and your file-naming conventions, you do have some baked-in hierarchy, but still, you are in ONE directory with thousands of files, docs, records etc? [shudder]. Screw up a file name, it could be lost forever.

Any email client worth having will search all its folders as easily as searching just one. It's also bad form, in my opinion, to use your email in-box as long-term storage. If I've received something important, I save it right away with the project it belongs to.

No, I'm almost never the person who misplaces an email attachment. :-)
posted by Artful Codger at 8:15 AM on January 29 [2 favorites]


Let me see the pretty landscapes in my wallpaper folder

No! No! You're doing it wrong!

Our digital spaces are hyper-connected, non-linear, infinitely malleable, and yet you continue to constrain them to picturing rigid physical-world boundaries! What are you, some kind of luddite?
posted by flabdablet at 8:16 AM on January 29 [3 favorites]


I have had TWO principal folders used for everything I've created and saved since 2008. One is personal, one is business. No subfolders.

I kinda wish I could do this but I have, idk, half a million photos I've taken over the past 35 years.

So for example, for an organization called, say, LMN, I might have a file called "LMN budget 25." If I need it, I go to finder, open the big Business folder, scroll to LMN, and there it is right below LMN budget 24 and LMN budget 23, etc. OR, in "last opened" view, it might be near the top.

This is just folders with fewer steps.
posted by Mitheral at 8:42 AM on January 29 [4 favorites]


It's also the way folders got simulated on Macs before HFS was a thing.
posted by flabdablet at 8:50 AM on January 29 [2 favorites]


i'm on i3wm on linux and very happy. No complaints :)
posted by mahadevan at 9:15 AM on January 29 [2 favorites]


slurp their corpus into some kind of mind-map software and navigate it like a Wikipedia binge

So, as alternatives to a user-editable tree structure of names, what do we have so far for finding computer information?

Full-text search
Directed graphs (mind-maps, Wikipedia, etc)
Recency*
Folksonomy (tagging)
Database (SQL implicit??)
Zettelkasten, obsidian, etc which seem like glue between some of the above??
Let the apps do it
Make your coworkers do it


* ages ago there was a totally skeuomorphic GUI of “volcano sort” or “messy desk”, in which stuff was visible in proportion to having recently been opened. And I just saw someone’s cunning use of touch to organize their file-based task list.
posted by clew at 9:28 AM on January 29 [1 favorite]
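Two of the entries on that list (folksonomy tagging and recency) compose naturally; here is a toy sketch of a tag index with last-touched ordering. The class name, file names, and tags are all made up for illustration; real systems (Gmail labels, macOS Finder tags) are of course far more elaborate.

```python
# A toy folksonomy-plus-recency index: files carry free-form tags, and
# lookups return everything matching all requested tags, most recently
# touched first. All names here are hypothetical.
from collections import defaultdict

class TagIndex:
    def __init__(self):
        self._by_tag = defaultdict(set)  # tag -> set of paths
        self._touched = {}               # path -> logical "last touched" tick
        self._clock = 0

    def add(self, path, *tags):
        for tag in tags:
            self._by_tag[tag.lower()].add(path)
        self.touch(path)

    def touch(self, path):
        self._clock += 1
        self._touched[path] = self._clock

    def find(self, *tags):
        """Paths carrying every requested tag, most recently touched first."""
        sets = [self._by_tag[t.lower()] for t in tags]
        hits = set.intersection(*sets) if sets else set()
        return sorted(hits, key=lambda p: -self._touched.get(p, 0))

idx = TagIndex()
idx.add("lmn-budget-24.xlsx", "LMN", "budget")
idx.add("lmn-budget-25.xlsx", "LMN", "budget")
idx.touch("lmn-budget-25.xlsx")
print(idx.find("lmn", "budget"))  # most recently touched first
```

Note that nothing here is hierarchical: a file can carry any number of tags, which is exactly what a single-parent folder tree can't give you.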


Looking at those tiled screens gave me a headache, even if I weren't on a 16-inch laptop. In part, perhaps, it's because I'm in my 40s and have to have everything at 125% to read it easily (my eye doctors keep telling me that reading glasses/bifocals/whatever won't help yet). Reading anything in any of those tiles would be impossible. Give me a full screen and let my eyes do their thing.

I will agree that keyboards are better than using the mouse, but there are easy ways to handle that. There is Quicksilver on the Mac, and its younger, slightly inferior cousin Launchy on the PC. Alt+Space, type in a program name (or, if I set it up right, a file name, webpage, etc.) and I get it. The biggest issue is that it doesn't work easily with Steam, given the way that Steam stores games. (Quicksilver was faster than Find when I last used it, though that may not be true anymore; Launchy is so much faster than Windows search, even with AI cruft eliminated at the registry level.)

I like the start menu and pinned programs interface. Windows 7 and 10 have been the easiest-to-use operating systems I've had, and I really don't look forward to having to tame 11 when I have to switch. That said, even if my laptop were not also my machine for gaming (yes, some games are available on Linux; I will get insulting if anyone tries to use that as a reason to switch), I don't think I'd be willing to brave a Linux machine. In part it's from memories of borrowing my girlfriend's cheap laptop in college, the one she (now a senior something-or-other programmer at a FAANG) could never get wifi working on under Linux, so that I lost the internet and its distractions while writing my undergrad thesis. In part it's also being comfortable in an environment and not needing anything Linux does. Maybe if I were a programmer instead of an analyst for whom Python is an R alternative, but as it is, why bother? The tiny annoyances are small enough that the major annoyances of getting used to a completely new system and learning all its annoyances make it not worth the switch.
posted by Hactar at 9:56 AM on January 29 [2 favorites]
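The core trick behind launchers like Quicksilver and Launchy (type a few letters, Alt+Space, get the app) can be sketched as subsequence matching. This is only the kernel of the idea; the actual tools layer learned frequency and scoring on top, and the app names below are just examples.

```python
# Toy launcher-style matching: an entry matches if the typed letters
# appear, in order, somewhere within its name. Real launchers
# (Quicksilver, Launchy) add ranking and learning on top of this.
def subsequence_match(query, name):
    letters = iter(name.lower())
    # `ch in letters` consumes the iterator up to the first match,
    # so each query letter must appear after the previous one.
    return all(ch in letters for ch in query.lower())

apps = ["Firefox", "File Manager", "Terminal", "Text Editor"]
print([a for a in apps if subsequence_match("ffx", a)])  # ['Firefox']
```

Typing "ffx" matches Firefox but not File Manager, because the second "f" and the "x" have to occur in order after the first "f".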


The tiny annoyances are small enough that the major annoyances of getting used to a completely new system and learning all its annoyances make it not worth the switch.

That's legit! (I do think there are bigger reasons to move to linux). Still, it's interesting how those annoyances and learning curves also exist when you move from Windows to Mac or vice versa, or from iOS to Android or vice versa - not to mention the hardware/software issues you can face with all of them - and yet most people don't really fear those moves in the same way they fear "braving" a move to linux. Which, again, can give you the kind of UI you feel at home with, whatever that may be.
posted by trig at 10:54 AM on January 29 [3 favorites]


beagle: I never use folders or labels, just keep everything in the main Inbox, and never delete anything

... which works fine, until Google says that you've run out of your 15 GB of free storage.
posted by Termite at 10:56 AM on January 29 [3 favorites]


These days the Mac OS is pretty much what I'm using all day for the office job part of my job. I'm generally OS-agnostic really, but with homebrew and bash I can do any of the terminal stuff I actually need, and I just live in fullscreen or tiled modes for each app on one monitor, switching between Slack and a terminal in the other. The apple "desktop" switcher has good keyboard shortcuts and I dislike using the mouse for any basic UX functions. I quite like xmonad but I wouldn't say it always got out of my way.

But to The Desktop. Don't use it, don't need it, think it encourages bad habits. The major non-phone OSes all let you fire up any known app from the keyboard, so I don't need stupid bubbly touchscreen icons. Every download either gets filed in its proper place where I can find it later, or sits in the Downloads folder until I dump it.
posted by aspersioncast at 10:58 AM on January 29 [1 favorite]


Termite, I ran out of that 15GB long ago. I don't mind paying a few bucks a month for 138GB, which covers Drive as well.
posted by beagle at 11:18 AM on January 29 [2 favorites]


Imagine a brand new world. It can be anything you want. Your imagination is limitless. Ok, so look around, what do you see? Probably a street, a house, a forest, a lake?

All of these are things that exist in our real world. But even with a 100% unlimited imagination, you still added them to your imaginary world. Why? Because it is SO much easier to riff off of existing ideas rather than create everything 100% from scratch.

Computers are much the same way. We use skeuomorphism because it's easy and natural to do so. Buttons have been around for hundreds or thousands of years. They've been refined and are part of our cultural consciousness. We all know how a button works: you press it and something happens.
Think of that cultural consciousness as a low-level API for interacting with humans.

That is an established method that we use to interact with our world. No training required. That is a powerful thing to just casually toss aside.

I'm 41 years old, and I've been using computers since the 386 days. It appears that UX designers have completely forgotten that this problem has already been solved. Take, for instance, swiping on a smartphone. There is nothing on the screen that indicates that you can swipe. We don't have an analogue for this in "real life". What other objects can you just swipe on to affect them? It is 100% NON-INTUITIVE, meaning that without prior experience or reading a manual, there is no way for us to use it except by trial and error. Very poor user experience.

I know that some may argue that a button is the same: that without prior experience, you don't know that a button should be "pressed". But it had already been established even prior to electronics. Buttons have been in use for 1000 years, and we already know them. I'd also argue that with buttons, you have an explicit understanding of the trigger; the button is pressed or not. Swiping gives no indication to the user that anything has ever been done. Definitely inferior in that regard.

TL;DR: Skeuomorphism has been used for a reason. That reason still exists and is valid. And it won't change unless the change provides an extremely compelling reason to do so, one that overrides the existing benefits.
posted by ianhorse at 11:52 AM on January 29 [11 favorites]


I agree with that a lot (I remember the first time I got a smartphone - I was fairly late to the game, and it came with zero instructions or introduction, and it was like "really, they expect everyone to just get this?")

But as far as "What other objects can you just swipe on to affect them?" - I think the metaphor there is flipping a page in a book (and, in some cases, unwinding an actual scroll/spool, pulling out a drawer, pulling down a rolled-up blind/screen, moving a slider, shooing someone away, etc.)
posted by trig at 12:19 PM on January 29 [3 favorites]


Death to the desktop metaphor! And while we're reducing metaphors, let's ditch "copy," "paste," "scroll," "bookmark," "enter," and "exit"!

Any interface is a metaphor—even and especially the "memory palace." If only we had something like a divine language that would allow us to express the true names of things, or a map that was truly equivalent to the territory.
posted by vitia at 12:28 PM on January 29 [3 favorites]


sometimes it's not about the nail.
That was awesome!
posted by dg at 2:28 PM on January 29 [2 favorites]


Buttons have been in use for 1000 years, and we already know them.

1000 years? The.... Abbasid engineers, possibly? Why not the classical automata of Alexandria or Zhou?
posted by clew at 4:55 PM on January 29 [2 favorites]


> fundamentally a file structure is a useful way to reflect clarity of thought, and has enough flexibility to cover a vast array of use cases, so it's not that surprising it's stuck around.

Yeah, I don't see the "problem". When I'm doing some "real" work I'm almost always fullscreen (and my display is only 24"; I don't want a bigger one, it would look ridiculous on my desk, though I can set up multiple displays if need be), and otherwise I find myself not using the full 1920px width of my desktop anyway; I like to read manageable lines of text. However, I'm old school in a lot of ways, including never owning a smart phone (or actually, I once owned what was then called something like a smart phone, a communicator, but it doesn't work anymore), and I dislike touch screens and find using apps on a phone just a painful experience COMPARED TO using a desktop app. Not a fan of the web app repackaged or whatever as a desktop app, either. I use the desktop as a temp folder for stuff I don't need to archive elsewhere. There is a folder called ALL OF ME where these random screenshots and memes get dragged to for a clean, fresh desktop.

Also: at least on macOS, Spotlight Search still very regularly shits the bed completely and takes to searching only a couple of levels deep, or not working at all. Finder search sucks too; I really can't count on it, and I don't know why it is still so bad. I'm on a 2020 M1 Mac Mini running Sonoma 14.6.1. My Windows machines (Thinkpads) have all been under lighter load, so pretty much everything I use is on Dropbox, which, yes, is a folder on my desktop on both machines. I recently installed Win11 on my unsupported X220 and ran a couple of scripts from the internet to uninstall all the crap. Surprisingly (why was I surprised?), there apparently is a lot of AI-assisting crap in the regular install, not to mention the regular annoying shitty little widgets and whatever. No widgets on my desktop, thanks. Luckily these are quite easy to remove.

Douglas Adams wrote the following which resonates with me greatly:

Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works.

Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it.

Anything invented after you’re thirty-five is against the natural order of things.

posted by fridgebuzz at 4:31 AM on February 4 [3 favorites]

