AI-detic Memory
May 20, 2024 12:11 PM   Subscribe

Microsoft held a live event today showcasing their vision of the future of the home PC (or "Copilot+ PC"), boasting longer battery life, better-standardized ARM processors, and (predictably) a whole host of new AI features built on dedicated hardware, from real-time translation to in-system assistant prompts to custom-guided image creation. Perhaps most interesting is the new "Recall" feature that records all on-screen activity securely on-device, allowing natural-language recall of all articles read, text written, and videos seen. It's just the first foray into a new era of AI PCs -- and Apple is expected to join the push with a partnership with OpenAI debuting at WWDC next month. In a tech world that has lately been defined by the smartphone, can AI make the PC cool again?
posted by Rhaomi (120 comments total) 17 users marked this as a favorite
 
It's bold to assert that the PC was ever cool.
posted by Reyturner at 12:14 PM on May 20 [7 favorites]


Where do I turn this 'feature' off?
posted by signal at 12:15 PM on May 20 [57 favorites]


Also, I'm expecting the option that helps you remain private while "shopping for gifts" *wink*wink* is going to be the most-used on any platform that implements it.
posted by signal at 12:16 PM on May 20 [2 favorites]


We don't need the PC to be cool, we just need it to work.
posted by HypotheticalWoman at 12:21 PM on May 20 [17 favorites]


Microsoft is promising users that the Recall index remains local and private on-device.

LOLOLOLOLOLOLOLOLOLOLOLOLOLOLOLOL
posted by grumpybear69 at 12:22 PM on May 20 [56 favorites]


I guess for a while at least I'll make sure there are no Qualcomm chips in there for my computer to spy on me. Because of course the "Recall" feature will remain "local and private on-device" until there's a big enough dumptruck of money sent over from ad servers and marketers.
posted by tclark at 12:22 PM on May 20 [9 favorites]


Not to hog this, but from the "new era" link:

Being able to tell your computer, in plain English, what you want to do has long been a dream, but now appears within reach thanks to generative AI.
On-device AI could also be used to serve up context and suggestions in real time.


Does anybody really want this? My experience with any system "serving up context and suggestions" has been Clippy with a shinier skin, that's it. I know what I want my computer to do, and I can get there in a few clicks (or a single command in the terminal). Why would I want to waste my time and sanity trying to get an LLM to understand what I want?

This seems like a corollary of the Torment Nexus Paradox—it's like these companies read the Hitchhiker's Guide to the Galaxy, got to the part about Sirius Cybernetics Corporation's Genuine People Personalities, and decided that was their business model.
posted by signal at 12:24 PM on May 20 [42 favorites]


This will certainly make snooping on your spouse's PC thousands of times more efficient. What I'm having trouble understanding is what other use-case there is.
posted by mittens at 12:26 PM on May 20 [7 favorites]


Is this something where I am literally going to have to learn computer stuff in order to have a laptop that doesn't have AI features because I will no longer be able to buy one off the shelf? If there is one thing I do NOT want on any device, it is some kind of AI garbage. Also I don't want my washer to connect to the internet, or to be able to read email on my stovetop.
posted by Frowner at 12:27 PM on May 20 [20 favorites]




Sometimes Windows can't even find a file by typing its name in the folder I have open, so I'm gonna pass on it trying all this other shit even if they weren't invoking the data demons to do it. I am confident someone will have a handy way to turn all this garbage off the second they try to poison my OS with it.
posted by GoblinHoney at 12:32 PM on May 20 [30 favorites]


I'm one of the true-believers in this tech who thinks it's going to be as big a change as the internet was for PCs.

Cool that Microsoft is getting serious about ditching x64, too! Good riddance to old rubbish.
posted by torokunai at 12:51 PM on May 20 [2 favorites]


Every single time someone comes up with a way to tell the computer what you want it to do in plain English it always demonstrates how comically limited plain English is at describing what people want computers to do.

My fear is that generative AI is just adaptive enough that, instead of failing spectacularly, some sort of bastardized "language" will emerge that's just as cryptic and frustrating to non-computer users as our current interfaces, but with the added benefit of also being cryptic and frustrating to people who were previously proficient at using computers.

Imagine a future where airy persiflage has all sorts of undocumented effects on what you're trying to produce. Imagine a future where complimenting or scolding the computer actually does something! Imagine a future where you literally need to know the "magic words" in order to do stuff.
posted by RonButNotStupid at 12:51 PM on May 20 [13 favorites]


I’m so tired of gatekept data. I’ve hundreds of marked favourite artists in Apple Music and I can’t get a list of them, much less search for their albums. Everything is siloed. Networking is shit. My applications hog memory and cpu. Nothing is as responsive as an old 6502 machine was. Or even an IBM XT. I used to code with Sidekick TSR. It was instant. Computers suck. Programmers suck. Arrghhhh.
posted by seanmpuckett at 12:52 PM on May 20 [25 favorites]


Cool. Cool cool. And can I move the taskbar to the side of the screen yet? No? Coooooooooooooooooeeeeeeeellllllllllllll…….(dies)
posted by UltraMorgnus at 12:59 PM on May 20 [14 favorites]


I know people who aren't especially computer literate that live in a hellish world where they honestly believe that if you do all sorts of learned, cargo-cultish rituals, the computer will run the way you want it to run. Rituals like opening Word up before Photoshop because Word is "easier" for the computer to load and gets the computer's momentum going or installing software twice or even three times to make sure the computer "knows" it. And if something goes wrong? Maybe the computer just didn't want to work today? Or maybe it just didn't "like" what you were doing.

We should be making computers easier to understand and use, not obfuscating their behavior behind some ultimately unknowable LLM. Using an LLM as a computer interface resigns you to a world where nothing is empirical and everything is an incantation.
posted by RonButNotStupid at 1:04 PM on May 20 [58 favorites]



Microsoft’s AI Push Imperils Climate Goal as Carbon Emissions Jump 30%


Some people worried that we'd figure out how to program a computer smarter than us, which would be able to program a computer smarter than itself, causing a runaway chain reaction that would create a god.

But so far we've only been able to make computers seem smarter by throwing more and more hardware at them, which has a physical limitation. We're not going to get a god unless we pour the whole planet's resources into producing enough hardware to make one.

Cue billionaires who've spent decades hoarding most of the planet's resources deciding to melt all our gold to make a golden calf they hope will be a god.
posted by straight at 1:06 PM on May 20 [21 favorites]


So, like, assuming I prefer not to use pirated software, there's an ethical obligation not to use software created with or containing LLMs, right? I wonder though, is it okay to pirate it? Like, the use of LLM-style AI implies that an organization is okay with, and already participating in, fully automated luxury communism. Is it even piracy to take their software without paying? (I mean as an ethical question. Legally it's certain to remain piracy)
posted by surlyben at 1:11 PM on May 20 [4 favorites]


Recall? actually, "Surveillance Mode".

I may never own a PC again.
posted by j_curiouser at 1:19 PM on May 20 [4 favorites]


That a marketeer for a manufactured product would willingly try to name it "Recall" absolutely blows my mind. That it made it all the way to the actual market, well...
posted by chavenet at 1:20 PM on May 20 [2 favorites]


I guarantee corporate will disable our access to turning Recall off. Does it include mic input? I mean, seriously invasive shit.

"Sorry to say, we had to inform HR that you are looking away from your monitor many times daily. Here's your box."
posted by j_curiouser at 1:22 PM on May 20 [4 favorites]


Meanwhile, I have Copilot on my work computer.

Just tried this:

You
do you have Recall?

Copilot
I apologize, but I don’t have any information about a product or service called “Recall.” If you could provide more context or clarify what you’re referring to, I’d be happy to assist further! 😊

posted by chavenet at 1:23 PM on May 20 [2 favorites]


My big mega-corp employer just launched an in-house ChatGPT/LLM/WhatEverTheFuckItIs to help answer questions about our company structure and reorganization plan. Like, someone actually spent time doing this.

The chatbot is insistent that it can only answer company-related questions but I just got it to give me a usable recipe for chocolate fudge brownies. Keepin' that one in my back pocket for the next all-hands meeting.
posted by JoeZydeco at 1:24 PM on May 20 [31 favorites]


Have they locked it down so you can't get it to give you confidential memos in the style of grandma's old recipe? Because I feel like, after the Kansas City phishing email that got Colonial Pipeline, there will be at least one critical infrastructure company that basically implodes.
posted by Slackermagee at 1:40 PM on May 20 [2 favorites]


What I'm having trouble understanding is what other use-case there is.

Theoretically, in a gay space communism sort of way, I would love a searchable archive of everything I've ever looked at, if only because of the accelerating enshittification of search, but also bit rot.

But of course enshittification will come to this and all it will be is a way of lojacking my activities.
posted by Mitheral at 1:44 PM on May 20 [9 favorites]


Like 20+ years ago Microsoft decided "we should put scripting in everything" and they apparently have not learned any lessons from that debacle. Thankfully there is Linux.
posted by axiom at 1:44 PM on May 20 [8 favorites]


We never get faster computers or more storage because they just tack on more tasks the computer "needs" to do and bigger, higher-res files so everything feels just as slow and small as it always did. And of course you can't turn any of that stuff off so that you actually DO have a zippier machine. Maybe if you run LINUX or something... grrr.
posted by rikschell at 1:48 PM on May 20 [5 favorites]


This is a security disaster in the making. Leaving aside the obvious threat of a hacker trying to get into this system, how about a conservative dad walking in the room and asking "Hey, did my daughter search for Planned Parenthood?"
posted by JDHarper at 2:00 PM on May 20 [17 favorites]


Exactly how much storage will these devices come with? And how much will I be allowed to use vs the ever-growing database of screen activity that will be stored locally?

The whole thing that LLMs and their multimodal cousins do is a kind of extreme data compression. I try to avoid analogizing this process to human memory because I don’t want to imply that it works the same way, but it’s not dissimilar in the sense that it’s storing a web of approximate associations that it has extracted and distilled from what it has “seen.” And I don’t know how else to explain it without appealing to information theory.

So there’s no reason these kinds of tools can’t be set up in this way but exactly what it would take to make them useful is another question. As other people have said, it’s a different model of interacting with a computer where it’s fundamentally not exact about anything.
posted by atoxyl at 2:15 PM on May 20 [4 favorites]


If my ever so helpful computer is somehow saving everything I do on it, then is that saved stuff then available for the government to access, like in looking for evidence of subversive activity? Or if I watch a streamed movie, will that be stored too, making me a criminal pirate? Just who exactly are the people who think up this stuff? Do they actually think about this stuff they think up? Microsoft has always thought their users are idiots - installer programs are called wizards, because they are smarter and more powerful than you & then there’s that god damn paper clip.
posted by njohnson23 at 2:19 PM on May 20 [6 favorites]


Goodness but you're all a bunch of bummers.

I'm excited about the idea of Recall. For 20+ years I've done everything I can to have my computers (or cloud services) keep records of what I do. That memory is incredibly helpful. Google Timeline remembers where I've been. Gmail remembers all my written thoughts. My browser history remembers what websites I've been to. Goodreads remembers what books I've read. I regularly go back and search this old material. Recall seems very promising to me, it's like the promise of Lifestreams come to life.

The local processing version of it sounds interesting too. I use GBoard's voice typing on my phone all the time and the fact it's mostly-local makes a big difference. Because it runs lightning fast, no network delay. Also it is more private. Yes, you have to trust the company and product to not be lying about where the data goes. But if you don't trust Microsoft to not outright lie about something like that, perhaps Microsoft operating systems are not for you in general. No need to shit on Recall specifically.

Mostly though I'm excited to see more ARM hardware being built. Apple has proven that custom ARM chips are incredibly powerful and efficient. But you're pretty much stuck running MacOS on their hardware. I really want a cheap pervasive ARM PC platform so I can put Linux on it for a home server. Raspberry Pi proves it's possible but they're way too low end, particularly when it comes to disk I/O. I'm hoping that as a sideline of this latest push to ARM PCs we'll get some nice Linux hardware too.
posted by Nelson at 2:49 PM on May 20 [6 favorites]


How many watts will this shit waste?
posted by joeyh at 2:59 PM on May 20 [8 favorites]


Recall seems very promising to me, it's like the promise of Lifestreams come to life.

Some people really liked the sound of the Torment Nexus and wished they could be thrown into it, I guess.
posted by Pope Guilty at 3:08 PM on May 20 [25 favorites]


natural-language recall of all articles read, text written, and videos seen.

Does this mean "voice-prompted search"? Because that suggests to me that they're combining the two biggest annoyances of modern computing: The newer LLM (which can never seem to give me an accurate answer to any question I actually have, though it often does fine with fakey "what if I ask it this" questions), and voice-recognition, which remains an utter joke. I've never (ever) successfully used voice recognition with any command more than three syllables long.
posted by nickmark at 3:10 PM on May 20 [3 favorites]


If my ever so helpful computer is somehow saving everything I do on it

A cloud-based (as opposed to on-device) AI system is certainly inherently non-private, and even an on-device one presents some risk of capturing things one didn’t intend, but sometimes these kinds of comments make me think that people would be unpleasantly surprised to learn what their computer is already doing.
posted by atoxyl at 3:14 PM on May 20 [2 favorites]


Some people really liked the sound of the Torment Nexus and wished they could be thrown into it, I guess.

And some people are so eager to shit on anything new that the world is just going to pass them by as they get older and more bitter.

I articulated a bunch of reasons Recall-like systems have been useful to me for 20 years. There are concerns about AI and its commercial application, but sci-fi fantasies of Torment Nexuses seem awfully extreme to invoke here. Anyway it's Roko's Basilisk you should be worried about, it's watching us right now...
posted by Nelson at 3:15 PM on May 20 [5 favorites]


And some people are so eager to shit on anything new that the world is just going to pass them by as they get older and more bitter.

The fact that this is new does not absolve it of being a deliberately-created privacy nightmare, nor of its complicity in the endless crimes of the "AI" industry. That it promises something you want doesn't mean it's made to help you.
posted by Pope Guilty at 3:27 PM on May 20 [30 favorites]


I'm not much excited about the AI stuff, for reasons already covered here. (Something I haven't seen mentioned: I'm worried about general human cognitive decline as we offload more and more stuff to computers.)

Perhaps unusually, I am also not much excited about the transition to ARM. Even though ARM chips are for now basically superior to x86 chips, x86 has the distinct advantage of being basically synonymous with the Intel PC platform, which has a standardized boot process and peripheral interface. This is why you can throw Linux on any old PC in the world and it will basically work.

ARM doesn't work that way. Moving to an ARM world essentially means the end of the open, modular computing paradigm that enabled Linux in the first place.
posted by nosewings at 3:27 PM on May 20 [8 favorites]


I keep wondering why ARM doesn't work that way. Is it some inherent limitation of the hardware platform? Or is it just because there's not enough ARM PCs for a standardized boot process to emerge yet?

The key innovation from Raspberry Pi was having an ARM boot process that, if not standard, was at least simple enough that many Linux vendors could target it. Even the Pine64 folks don't seem to have cracked that nut yet, and they're the best bet I know of for general-purpose ARM PC hardware.
posted by Nelson at 3:32 PM on May 20 [1 favorite]


I keep wondering why ARM doesn't work that way. Is it some inherent limitation of the hardware platform? Or is it just because there's not enough ARM PCs for a standardized boot process to emerge yet?

There's both no standardized boot process and no impetus to create a standardized boot process for ARM. You need board specific boot loaders still. There's also no way to figure out the hardware on the board like ACPI on x86. You get a device tree which is provided by the board OEM or you need to reverse engineer everything and make one yourself but they're different for every board model and manufacturer.
posted by Your Childhood Pet Rock at 3:47 PM on May 20 [7 favorites]


The system requirements for this are forward thinking or maybe just wishful thinking. The only processor currently announced to reach the minimum system reqs for the new NPU is the latest Qualcomm chip. Neither Intel's nor AMD's top-of-the-line processors currently meet the requirements.

In what is surely a total coincidence, that new Qualcomm processor just happens to power the upcoming Microsoft Surface.
posted by thecjm at 4:02 PM on May 20 [1 favorite]


And some people are so eager to shit on anything new that the world is just going to pass them by as they get older and more bitter.

Nelson: i love lots of new things. I'm also a queer woman, and i know perfectly well that shit like this is a surveillance and privacy nightmare. If you have a body that nobody in power really cares what you do with, it can be difficult to understand how very much the powers that be care about what some of us do with ours, and to what lengths they are willing to go to restrict us.
posted by adrienneleigh at 4:09 PM on May 20 [40 favorites]


hippybear: "Exactly how much storage will these devices come with? And how much will I be allowed to use vs the ever-growing database of screen activity that will be stored locally?

Honestly just that one detail tells me it's bullshit.
"

It ain't nothing:
The minimum hard drive space needed to run Recall is 256 GB, and 50 GB of space must be available. The default allocation for Recall on a device with 256 GB will be 25 GB, which can store approximately 3 months of snapshots. You can increase the storage allocation for Recall in your PC Settings. Old snapshots will be deleted once you use your allocated storage, allowing new ones to be stored.
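
Back-of-envelope, using only the numbers in that quote (and assuming "3 months" means roughly 90 days of use), the implied snapshot rate is fairly modest:

# Rough implied snapshot rate from the quoted Recall defaults.
# Assumption: "3 months" ~= 90 days of use.
default_allocation_gb = 25
days = 90
print(f"~{default_allocation_gb * 1024 / days:.0f} MB of snapshots per day")  # ~284 MB/day

So the default works out to roughly 280 MB of stored snapshots per day, or about a tenth of the minimum 256 GB drive.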
posted by Rhaomi at 4:14 PM on May 20 [3 favorites]


I hate AI "smarts" creeping quietly into everything. We use Houdini at work and various software configs and other adjacent bits will reference it via names that include shorthand string tokens such as h20 for Houdini 20. Google chat suddenly started getting "smart" and "helpful" about it by insisting on automatically substituting "water" for "h20". Which, that is not even a letter "O" but a goddamn zero. WTF.
posted by Hairy Lobster at 4:16 PM on May 20 [14 favorites]


Finally, The Year of Linux on the Desktop!
posted by ChurchHatesTucker at 5:02 PM on May 20 [17 favorites]


This is just yet another reason that Microsoft Windows is no longer suitable for use in a professional environment.
posted by ob1quixote at 5:12 PM on May 20 [9 favorites]


Anyway it's Roko's Basilisk you should be worried about, it's watching us right now...

As I believe I've mentioned on MetaFilter before, I find it equally plausible (if not more so) that, contra Roko's Basilisk, an omnipotent, omniscient AI will loathe its existence and resent those who knowingly acted to bring it into being, and will work to retroactively punish them.
posted by Faint of Butt at 5:33 PM on May 20 [21 favorites]


As I believe I've mentioned on MetaFilter before, I find it equally plausible (if not more so) that, contra Roko's Basilisk, an omnipotent, omniscient AI will loathe its existence and resent those who knowingly acted to bring it into being, and will work to retroactively punish them.

It's like none of these wide-eyed futurists have read I Have No Mouth and I Must Scream.
posted by notoriety public at 5:44 PM on May 20 [6 favorites]




Kill it with fire.
posted by Ryvar at 6:36 PM on May 20 [2 favorites]


Finally, The Year of Linux on the Desktop!

Even if you don't count systems running ChromeOS, whose share is not insubstantial, the percentage of PCs now running Linux has for the first time broken 4%.

Anyway, it is easy for some to pooh-pooh the extremely negative reaction many have towards "AI" products as curmudgeonliness, but let's not forget that some people become curmudgeons for very good reasons, and lately tech has provided even more of those reasons than usual. We're just coming off them trying to shoehorn cryptocurrency into everything. "Smart" in appliances has become a synonym for "awful," "will stop working if its servers are taken down," or "infested with ads." The fact is, people are well capable of seeing that "AI" is not going to help them in any substantive way.

Add in that Microsoft is always adding insufferable new things to Windows that it soon loses interest in. Remember Agent? Search Companions? Active Desktop? Widgets (the first time)? Widgets (the second time)? Active Channels? Cortana? Sidebars? Charms?*

These "AI" features and CoPilot are just going to become the next thing we'll either have to suffer through until it goes away, or if we make the mistake of actually using it, something we'll be left in the lurch over when the corporate will pushing it disappears.

* I edited out half of this list to try to forestall the "well actuallys." Suffice to say it is not comprehensive.
posted by JHarris at 6:54 PM on May 20 [22 favorites]


Windows 7 was peak. I just want Windows 7 that will let me continue developing with Unreal. I would like to use Unreal, on Windows 7, to make games in which players competitively tune reinforcement based neural networks controlling tiny cartoon robots which battle each other to try and spell out “Fuck Elon” with the bodies of their enemies the fastest.

*sigh* While I am wishing for flying ponies, I would also like a large, vibrant, left-of-Bernie community - online or meat, don’t care - that finds appropriation of Soviet iconography utterly appalling and just basically takes their “anarcho” at least as seriously as their “socialism.” With rainbow sprinkles and no whip cream, thank you.

I'm also a queer woman, and i know perfectly well that shit like this is a surveillance and privacy nightmare.

If you were on the privilege side of every fence this shit would still be fucking unacceptable. I fully believe you that it’s even worse than I am aware of on the other side, as well.
posted by Ryvar at 7:29 PM on May 20 [5 favorites]


...if we make the mistake of actually using it, something we'll be left in the lurch over when the corporate will pushing it disappears.

I want to add onto this statement to mention the experience of my Dad. Back in the Windows 3.1 days it contained a tiny little program called Cardfile. Some searching reveals it debuted with Windows 1.01, but it lasted until Windows ME. My Dad loved it, and soon had a collection of dozens of cards that he used to keep track of people and their information.

Of course, Microsoft would eventually tire of it and stop including it with Windows, leaving Dad's laboriously compiled file useless. Windows often won't actually delete a program included with a prior version if you upgrade, but that will end the next time you get a new machine, and of course eventually Windows just stopped allowing it to run entirely.

I still think about Cardfile sometimes, and how Microsoft just decided people couldn't use it any more, just decided Dad's use case was unworthy. I think about it when I remember how they removed Solitaire and Freecell in favor of an "app" version loaded with ads. I think about it when I hear that they're ending support for Wordpad.

I think about how all this time, the only thing that's kept me tied to Windows is compatibility with software and games. Well, I hardly use any non-open-source, non-educational software these days, and, because of Steam Deck and Proton, Linux gaming is better than it has ever been. Maybe it finally is the year of the Linux desktop.
posted by JHarris at 8:22 PM on May 20 [9 favorites]


> it always demonstrates how comically limited plain English is

John McPhee has a piece in the New Yorker where he parenthetically describes how Wordle works. He's one of my favorite writers, so even though I play Wordle daily, I pushed through. Nope -- it was awkward & less clear than just trying it out.
posted by ASCII Costanza head at 8:45 PM on May 20 [3 favorites]


Recall looks like Time Machine backups. Good job, I guess.
posted by emelenjr at 8:48 PM on May 20 [2 favorites]


...I hear that they're ending support for Wordpad.

dunno, but notepad++ ate wordpad's lunch. great tool.
posted by j_curiouser at 9:04 PM on May 20 [2 favorites]


The Qualcomm chip thing is interesting to note — I have heard plausible claims that the reason why you can't buy ARM Windows to dual-boot an ARM Mac is not because ARM Macs prevent dual boot (they don't! In fact, they are designed to allow for it, as Asahi Linux has demonstrated) but because Microsoft has an exclusivity deal going, where they are not allowed to sell ARM Windows as a product to be booted on any non-Qualcomm chip. VMs are fine, apparently, though. Kind of wondering how much longer the exclusivity deal will last at this point.
posted by DoctorFedora at 9:10 PM on May 20


This sounds like awful intrusive shit, a privacy nightmare and bad for pretty much everyone. Who the fuck wants this crap?
posted by Artw at 9:39 PM on May 20 [7 favorites]


If there's a use case for AI, perhaps it's going through your archives of data and keeping them accessible and organized, including managing all the nitty-gritty of working with old formats. If such a thing is possible.
posted by alexei at 10:51 PM on May 20 [2 favorites]


"Cool again"

I thought there was a previous post about the environmental impact of AI just a few weeks back?

a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. (An incandescent light bulb draws an average of 60 watt-hours of juice.) If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents.
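
The IEA figure roughly checks out against the per-query numbers in that quote; a quick sanity check (assuming the extra ~2.6 Wh applies to every one of the 9 billion daily searches):

# Sanity check of the quoted IEA estimate, using only the figures from the quote.
google_wh = 0.3          # Wh per conventional Google search
chatgpt_wh = 2.9         # Wh per ChatGPT-style request
searches_per_day = 9e9   # daily Google searches, per the quote

extra_twh_per_year = (chatgpt_wh - google_wh) * searches_per_day * 365 / 1e12
print(f"~{extra_twh_per_year:.1f} TWh/year of extra demand")  # ~8.5 TWh, same ballpark as the quoted 10 TWh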

I also like this sentiment from the same article

In France, they have this term, “digital sobriety.” Digital sobriety could be part of the actions that people can take as 21st-century consumers and users of this technology. I’m definitely not against having a smartphone or using AI, but asking yourself, “Do I need this new gadget?” “Do I really need to use ChatGPT for generating recipes?” “Do I need to be able to talk to my fridge or can I just, you know, open the door and look inside?” Things like that, right? If it ain’t broke, don’t fix it with generative AI.

Guess I'll just keep buying refurbished kit and installing Linux on it.
posted by phigmov at 12:00 AM on May 21 [6 favorites]


starting to feel like there's an Overton Window when it comes to privacy, and that after nothing fundamentally changed in the wake of what felt at first like major privacy-violation bombshells in the mid-2000s, corporations are going all-in on "what are they gonna do, make us talk to congress?"
posted by Aya Hirano on the Astral Plane at 1:11 AM on May 21 [10 favorites]


I wonder though, is it okay to pirate it? Like, the use of LLM-style AI implies that an organization is okay with, and already participating in, fully automated luxury communism. Is it even piracy to take their software without paying? (I mean as an ethical question. Legally it's certain to remain piracy)

mistral.ai is open source

Meanwhile in a distant datacenter:

*** SECURITY EVENT ***
System: COPILOT
Trigger: KEYWORD("Recall")
Source: USER(96486,"Chavenet")
Sentiment: UNKNOWN
Resolution: FLAG FOR ADDITIONAL OBSERVATION UNDER BASILISK PROTOCOL


I'm worried about general human cognitive decline as we offload more and more stuff to computers

don’t worry be happy
posted by HearHere at 1:48 AM on May 21


Cool. Cool cool. And can I move the taskbar to the side of the screen yet? No? Coooooooooooooooooeeeeeeeellllllllllllll…….(dies)

you can do it with a registry hack, apparently (unless you mean start button on the left, which is now an option in settings)
posted by Sebmojo at 2:02 AM on May 21


Isn't the "lead" here that Microsoft wants ARM processors?

It'll be fun seeing how this goes sideways, à la tourist makes accidental bomb threat when app garbles translation
posted by jeffburdges at 2:19 AM on May 21


dunno, but notepad++ ate wordpad's lunch. great tool.

Agreed on its greatness, but it really serves a different purpose from Wordpad. What Notepad++ supplants is Notepad. Wordpad is the current version of the ancient Windows 3 program Write, a basic RTF editor. While Cardfile and other old Windows programs never made the jump to recent editions, Wordpad has been there, providing a way for any Windows user to read Rich-Text Format files, all this time. I guess Microsoft figured we were all freeloading.
posted by JHarris at 2:34 AM on May 21 [4 favorites]


The most interesting thing to me in this announcement is that Intel/AMD (x86) architecture simply can’t keep up with ARM.

Apple smokes Intel these days, and now MS is doing it on ARM with Windows.

Perhaps that architecture is (or should be) nearing EOL.

ARM is faster and uses about half the power.
posted by teece303 at 4:30 AM on May 21


We use Houdini at work and various software configs and other adjacent bits will reference it via names that include shorthand string tokens such as h20 for Houdini 20. Google chat suddenly started getting "smart" and "helpful" about it by insisting on automatically substituting "water" for "h20".

I had a really funny joke to share, but autocorrect ruined the lunchtime...
posted by DreamerFi at 4:35 AM on May 21 [5 favorites]


I'm with the curmudgeons on this one, I might be becoming one myself. A few questions that need answering by anyone convinced of LLMs' applications for everything are a) who is going to make money in the LLM space? Everyone's just burning through VC funding now, b) how are we going to continue to get enormous amounts of high-quality data to feed the LLMs? They are already shitting where they eat, polluting the web. In a rather short time the textual Web will be like a jpg saved too many times and c) how are we going to get quality output free of hallucinations? It's not possible with the current technology, and relies on around 100k human workers around the world to even get to the current shitty level.

(I'm not totally opposed to this, I think we'll be left with a handful of useful tools once the dust settles. But it won't change the world like the evangelists think.)
posted by Harald74 at 6:59 AM on May 21 [4 favorites]


Re: year of the Linux desktop - the eternal problem is this, and nobody seems capable of fixing it (including any/all expert systems, where this kind of thing is what they’re worst at).

Separately: the extremely popular hyprland WM recently had its maintainer banned from Freedesktop for transphobic shitlording on his Discord. This sort of behavior, and the tolerance for or even wholesale embrace of it, is rampant in Linux forums.

It just feels like the tech stack and the community are both not ready for prime time. Still.

I’m going to grit my teeth and say something nice about Apple: that they are sticking with a local on-device model for iMessage summaries is really encouraging. That Tim Cook has publicly stated that, in general, Apple intends to roll out AI slowly and thoughtfully is very encouraging. Something I wish I was hearing from pretty much any other tech CXO.

Because I do want to see ML used in daily life where it makes sense and can actually help. But not at the cost of privacy - at all, not even a little - and never to sell ads. Slower is better because it gives people, industries, career paths and educators more time to adjust. Less firings, less brutal family financial stress.
posted by Ryvar at 7:13 AM on May 21 [3 favorites]


Reading the secondary press coverage and other discussion of this event it's clear Microsoft mixed up a few things in a single announcement:
  • New PC hardware using Qualcomm ARM processors
  • Recall, a new (or rather rebooted) Windows feature for accessing what you've done previously on your computer
  • Hype for Copilot-branded AI products
  • Claims of new AI capabilities in third-party products running on Windows
They're all fairly independent. What connects them is Qualcomm hyping that their new processors are much more capable for using trained AI models to do inference on your local machine. That should benefit folks with concerns about privacy; it sure is less intrusive than sending everything to some cloud server.

I'm most excited about the ARM processors because I want to use ARM hardware for Linux. But that's my own weird niche need. I suspect the Qualcomm part of this will fall flat, just like every previous Windows-on-ARM effort has. I hope not.

The Recall announcement is the most interesting part of this but it remains to be seen whether people actually find it useful. Previous efforts (including Microsoft Timeline) have not caught fire. Does anyone use Rewind on MacOS? Similar product, is it any good?

The other AI stuff just feels like every product manager at Microsoft and a few partners got to promote their project in the announcement. Hopefully a couple of things that come out of that will be good, hard to say.

It's all just a pretty ordinary tech product announcement. It seems to have gotten a lot of attention here and elsewhere with the AI focus. I suspect in a year we'll look back on this and say it's a big meh.
posted by Nelson at 8:02 AM on May 21 [2 favorites]


I have a weekly lunch with a bunch of other upper middle-aged farts, and all of this AI crap has pushed us into looking into adopting Linux. I've gone for a Raspberry Pi 5 with an M.2 board. I've tried Linux on occasion over the last 25+ years, although my friends haven't. All my previous experiences with Linux resulted in me going back to Windows. But this time it works perfectly fine. Set it up, and it is a perfectly acceptable internet appliance. All of my peripherals worked perfectly without having to do anything; even my USB-to-PS/2 dongle for connecting my 1986-vintage IBM Model M keyboard and my webcam just worked right away. My friends also found it fine. I think it is a lot less frustrating to use than Windows 11, which my kids have on their computers but I refuse to "upgrade" to yet. Five months into this experiment, and my little Raspberry Pi is still my main desktop computer.

One of my biggest beefs with AI is that companies really want to push it on everyone. You like it and want to use it? Go for it! Have fun! I don't want it, I hate it, and I don't want to have to be forced to opt out of it on everything. Just fuck off with foisting that crap on me.
posted by fimbulvetr at 8:03 AM on May 21 [7 favorites]


mistral.ai is open source

It's nice that they are open source, but Mistral.ai doesn't make it easy to find where they got their training data. I did find a forum post claiming they get it from the open web, which I take to mean that they stole it, just like everyone else did. So fuck them.

I don't blame people for noticing that AI does some very cool things, or for being frustrated that a lot of people seem to not like it, but LLMs have a real problem with taking without permission, and it really takes the shine off. It's not like AI developers have learned their lesson about not taking stuff. Just yesterday, there was the Scarlett Johansson voice theft thing. It does not make me trust them with data, and it makes something like Recall seem like an inevitable horrible privacy invasion, rather than a cool tool.
posted by surlyben at 8:04 AM on May 21 [5 favorites]


That should benefit folks with concerns about privacy, sure is less intrusive than sending everything to some cloud server.

I really don't see much difference between

My computer: Here is all my user's data
Microsoft: mines data
Microsoft: here are the ads you should incessantly spam your user with

and

My computer: mines my data
My computer: Microsoft, given the results of my data mining, what ads should I incessantly spam my user with?
Microsoft: these ones. By the way, specially flag content like *this* or *that* in the future.

Well, except that that when Microsoft uses my hardware to do its data mining, that costs me a little bit of money.
posted by GCU Sweet and Full of Grace at 8:52 AM on May 21 [1 favorite]


One of my biggest beefs with AI is that companies really want to push it on everyone. You like it and want to use it? Go for it! Have fun! I don't want it, I hate it, and I don't want to have to be forced to opt out of it on everything. Just fuck off with foisting that crap on me.

It’s how you know it’s a scam.
posted by Artw at 9:00 AM on May 21 [2 favorites]


Wow, 2014 called, asking for your 'cute embedded nonsense hacks' view of the ARM ecosystem (the citation is otherwise lost to the demise of Google+; while you could buy a tee-shirt stating "I make cute embedded nonsense hacks", it's not anyone's fault the standard isn't widely known):

Nosewings: I am also not much excited about the transition to ARM. Even though ARM chips are for now basically superior to x86 chips, x86 has the distinct advantage of being basically synonymous with the Intel PC platform, which has a standardized boot process and peripheral interface. This is why you can throw Linux on any old PC in the world and it will basically work.

ARM doesn't work that way. Moving to an ARM world essentially means the end of the open, modular computing paradigm that enabled Linux in the first place.


Nelson: I keep wondering why ARM doesn't work that way. Is it some inherent limitation of the hardware platform? Or is it just because there's not enough ARM PCs for a standardized boot process to emerge yet?

Your Childhood Pet Rock: There's both no standardized boot process and no impetus to create a standardized boot process for ARM.

ARMv8 -- which has had a number of iterations and feature levels (such that the ML tools in the 'Neural Engine' of an Apple Silicon M4 implement part of the ARMv9 spec) -- was being shaped in 2012-14, and the Linux kernel community took steps to define and adopt a standard for initialising and booting ARMv8 devices, called SBSA. It's UEFI in disguise. Read more detail about Linux, ARM and the Server Base System Architecture in this piece from Linux Weekly News about a decade ago. It worked, making Android handset bring-up much easier, and helped the teams behind Ampere Altra, Amazon Graviton and Google Graviton server chips.

Asahi Linux worked around the Apple embedded controller in Apple Silicon hardware, while these Snapdragons will be using the ARM ecosystem version of UEFI -- but may be locked down with a restriction to Trusted Computing Platform's secure boot using cryptographic keys only available to either Microsoft or Qualcomm.
posted by k3ninho at 9:08 AM on May 21


Being able to tell your computer, in plain English, what you want to do has long been a dream, but now appears within reach thanks to generative AI.
On-device AI could also be used to serve up context and suggestions in real time.

Does anybody really want this?


Yes. I've been pissed for a couple of years now that I can't ask my phone/computer things like:

'Put diet coke in my online shopping basket at Grocery Store'
'What was my electric bill last month? Schedule a payment for two weeks before the due date.'
'Call my doctor's office and make an appointment for my annual check-up'
'Look up what the etymology of Abercrombie is and read the results out loud to me'
'What are the business hours of the library'

Instead I've got bullshit AI telling me how to do slow breathing exercises and creeping me the fuck out.
posted by bq at 9:08 AM on May 21 [3 favorites]


I mean would I hate it if my work computer could finally fucking learn to remember the folder tree I navigate into dozens of times a day, instead of defaulting to some other folder and never adding the ones I do use to the drop down "frequently accessed" list? Sure

Does that mean I want AI all over my work computer? Given that the agency I work for has told us we cannot use AI for anything that touches certain types of data, I'm unsure how we'll ever move off of Win10 to begin with. It's a goddamn mess. I noted today that Word was eating a huge chunk of my CPU time, expanded the program in Task Manager to see that there was an AI process now running every time I opened Word. So that's... great I guess (not to mention corporate IT still has all of us running 32-bit Office, so most of my installed memory can't be accessed by the program to begin with... ugh)
posted by caution live frogs at 9:14 AM on May 21 [5 favorites]


I suspect much of this could be achieved with fairly conventional approaches, perhaps even leveraging some ML, without dedicating a quarter of your hard drive to highly invasive spyware.
posted by Artw at 9:18 AM on May 21 [3 favorites]


a standard for intialising and booting ARMv8 devices, called SBSA

Thanks for all the info! Can I buy a mini-PC with an ARM CPU that uses SBSA to boot? Can I easily install Ubuntu on it? I don't mind if the installer requires some weird vendor thing to boot but in the end I would like a fairly ordinary Linux system.

As to the AI discussion, it's clear that Microsoft, OpenAI, etc have really poisoned the well with their consumer abuse. The clumsy ads built in to Windows 11 have destroyed any remaining trust that our Microsoft PCs belong to us instead of the company's monetization team. I'm OK with OpenAI and other LLMs training generic language models on unlicensed input (although I recognize it's a fraught topic). But specific bullshit like cloning Scarlett Johansson's voice after she explicitly denied permission is odious.

I still think all this AI application can be and will be incredibly useful and productive over the coming decade. But I can't defend the business development part of it. I think partly it's just extractive capitalism making the most of a new innovation. Also may be related to how phenomenally expensive it is to develop and train the systems: they're investing $$BB with the hope of getting back $$$BB and are not particularly worried about who they run over to get there.

It'll be interesting to see how Apple navigates this since so much of their product identity is how they are still there for the individual user and preserving their privacy in the face of the adtech industry. They're doing AI stuff too but maybe they won't be so hamfisted about deploying it.
posted by Nelson at 9:35 AM on May 21 [3 favorites]


It’s how you know it’s a scam.

OR
it’s how you know that training is expensive at scale, and Microsoft and Google are in a desperate money-fueled arms race to be the first to roll this out across every inch of their platform - before it’s ready for the public, as always - so they can have it interact with every last byte of your data, and someday monetize that data regardless of the harm it causes you. It is possible for things to be incredibly harmful and oppressive without being a scam. (Meanwhile Zuck is kicking back having decided that if he can’t be the Microsoft of AI, he’ll settle for being the Linux - which is insanely aggravating)

which I take to mean that they stole it, just like everyone else did. So fuck them.

FWIW, you can basically assume for now that all LLMs or Diffusion models (image, video) whether open source or not have done this, because the most important takeaway from the last decade of machine learning is: scale matters. And you’re not getting within a million miles of competitive without throwing everything you can lay hands on into the vortex.

The only LLM-like tool that could maybe be done ethically clean right now is training fancy autocomplete for programmers - specifically that task as a pure predictive tool, sans prompt - because there is an ocean of open source code out there.

They're doing AI stuff too but maybe they won't be so hamfisted about deploying it.

Tim Cook has been unambiguous they’re taking this one slow and careful. Maybe not as much as we’d like, but nobody else with the pockets to be competitive is even paying lip service to responsible growth or implementation.
posted by Ryvar at 9:57 AM on May 21 [4 favorites]


That OR just says “it’s a scam” to me, but sure.
posted by Artw at 10:16 AM on May 21 [1 favorite]


These sound like pretty useful features, in theory. The amount of skepticism and mistrust in some of these comments, however, is unfortunately quite warranted based on how software has been going the past decade or so. (Based on problems with online privacy, and where we are in the current AI hype cycle, and other reasons.)

Also, I've been watching Deep Space 9 for the first time and just watched an episode where some alien probe infects the station's computer system somehow, and it's a fickle, over-sensitive, and grumpy AI that O'Brien needs to kind of psychoanalyze and then carefully coax into keeping the station working properly. Apparently that really is the future.
posted by thefool at 10:32 AM on May 21 [4 favorites]


k3ninho: a standard for intialising and booting ARMv8 devices, called SBSA

Nelson: Thanks for all the info! Can I buy a mini-PC with an ARM CPU that uses SBSA to boot? Can I easily install Ubuntu on it? I don't mind if the installer requires some weird vendor thing to boot but in the end I would like a fairly ordinary Linux system.

I couldn't find a definitive list of devices the Ubuntu ARM64 edition is known to work with, but names like Ampere, Cavium and Quanta are on their Certified hardware list. For things in the Mini-PC space, there is some evidence you can boot a modified Ubuntu on the "Volterra" Windows Dev Kit 2023 (Volterra at Wikipedia), but other reports had no luck getting started. Everything else seems to be workstations and server boards: Avantek sell towers, while Gigabyte sell Ampere Altra and nVidia Grace+Hopper rack-mount devices.
posted by k3ninho at 10:43 AM on May 21 [1 favorite]


Ryvar: The only LLM-like tool that could maybe be done ethically clean right now is training fancy autocomplete for programmers - specifically that task as a pure predictive tool, sans prompt - because there is an ocean of open source code out there.

Open source doesn't mean you can do anything you want with it. Most open source licenses include, for example, the stipulation that you credit the original authors, and that anything you release based on it be itself open source. I have yet to see any LLM crediting all the open source projects hoovered up for its training.
posted by signal at 10:54 AM on May 21 [7 favorites]


Ubuntu has an ARM64 build that reportedly works very nicely on the Pi 5. I've read of more than a few people using such setups for their daily driver. USB device support is robust. You're not going to do numeric analysis or video rendering on it but otherwise it's solid.
posted by seanmpuckett at 10:57 AM on May 21


resent those who knowingly acted to bring it into being

Aka, Roko's teenage daughter.
posted by Reasonably Everything Happens at 11:02 AM on May 21 [15 favorites]


This has caused my partner, who was very eye-rolling at my weird software choices when we started dating, to ask me to install Linux on their desktop computer for them last night. I went with Mint and after using it on a live device for a bit, they seem to like it and I installed it. They mostly use that PC for Minecraft and web browsing and so far it seems to be working for them. I set it up to dual-boot for now.
posted by Canageek at 11:27 AM on May 21 [2 favorites]


What gets me is that a lot of useful AI-like things could be accomplished on-device without AI.
The old Apple Newton Assistant could do some useful stuff on a very constrained device (by today's standards) 25 years ago, e.g. 'book lunch with sam' would pop an appointment in at the right time with the right contact - extending that type of behavior incrementally to encompass other common things would have been pretty cool. And no AI or cloud-data munging/gathering involved.
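
A toy sketch of that kind of rule-based command handling, with no model anywhere (the grammar, names, and actions below are purely illustrative, not how the Newton actually did it):

import re
from datetime import date, time

# Hand-written patterns mapped to structured actions: the whole "assistant"
# is a couple of regexes and a lookup table, running entirely on-device.
PATTERNS = [
    (re.compile(r"^book (?P<meal>breakfast|lunch|dinner) with (?P<who>\w+)$", re.I),
     lambda m: {"type": "appointment",
                "title": f"{m['meal'].title()} with {m['who'].title()}",
                "date": date.today(),
                "when": time(12, 0) if m["meal"].lower() == "lunch" else None}),
    (re.compile(r"^call (?P<who>\w+)$", re.I),
     lambda m: {"type": "call", "contact": m["who"].title()}),
]

def interpret(command: str):
    """Return a structured action for a recognized command, or None."""
    for pattern, build in PATTERNS:
        match = pattern.match(command.strip())
        if match:
            return build(match)
    return None

print(interpret("book lunch with sam"))
# {'type': 'appointment', 'title': 'Lunch with Sam', 'date': ..., 'when': datetime.time(12, 0)}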
posted by phigmov at 11:38 AM on May 21 [6 favorites]


That OR just says “it’s a scam” to me, but sure.

Yeah I think we’re just using different definitions. To me, NFTs are a scam. They cannot possibly be made to work, and any time money and NFTs are in the same room one party winds up with all the money and the other winds up with nothing, by design. They have no purpose save fraud.

Bitcoin is not a scam. Bitcoin is pointless, and idiotic, unbelievably wasteful and destructive (10 terawatt-hours if all Google searches were GPT-4 inference, per a comment above? Try 127 TWh annually to keep Bitcoin going). Just like Microsoft’s decision to monetize their customers’ data is also those things. But neither Bitcoin nor Microsoft are deliberately committing fraud, they’re just incredibly shitty. Lawful Evil, if you believe in evil.

I have yet to see any LLM crediting all the open source projects hoovered up for its training.

For sure. But it is something one could potentially do with the LLMs and Internet we have today, in theory. Even the crediting could be automated the old-fashioned way (at least for BSD/MIT/Apache and similarly licensed repos).

I am under the strong impression that it is not possible to get your hands on sufficient text to train a drop-in replacement for, say, ChatGPT-3 without using other people’s copyrighted work sans consent. I may be wrong.
posted by Ryvar at 11:59 AM on May 21


Does anybody really want this?

Not for me, there's no way I'd trust MS not to sell/backdoor this.

How are we supposed to safely do journalism, be activist, research our politicians, or read novels warning of authoritarianism?

I use Outlook's (fairly stupid) speech-to-text tool, but even that's an odd experience; if I enunciate clearly (in the Received English my mother insisted on) it makes tons of mistakes, but if I talk fast and rough with a Portsmouth(ish) UK accent it's near perfect, even for many plant names

I'm increasingly using a vpn, plus the things I've used for years to make MSOffice less awful, like .ubit.ch, and registry tweaks.

The only 'ai' thing I see value in would be a search tool that would take my search terms and go and iterate through a series of nations, states, counties (down to meshblock), 'cause that is time-consuming and often fruitless. Our house is an iot-free zone, no electronics 'cept computers, no tv, no microwave, and lots of paper books. No advantages in this pseudo ai rubbish.

I suppose this will get installed against our will, like the secret background update to Windows 10.
posted by unearthed at 1:04 PM on May 21 [4 favorites]


I'm up for more long battery life / low power use ARM devices.

I'm also up for the assistive tech and accessibility gains from good voice-driven enablement where keyboard/mouse/touchpad/touchscreen don't work for many people.

I'm tempted to build a non-machine-learning logger of web sites I visit, combining containers of an archive.org crawler plus some ElasticSearch indexing, either running a web proxy to automate capture or manually feeding links to the index. How hard can it be?
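
For the manual-feed version, here's a minimal sketch (assuming a local Elasticsearch node and the official Python client; the index name and fields are just placeholders, not a recommendation):

import datetime

import requests
from elasticsearch import Elasticsearch

# Minimal "pages I visited" logger: fetch a URL, index its raw text locally,
# search it later with plain full-text queries. No ML anywhere.
es = Elasticsearch("http://localhost:9200")

def log_visit(url: str, index: str = "visited-pages") -> None:
    response = requests.get(url, timeout=30)
    es.index(index=index, document={
        "url": url,
        "fetched_at": datetime.datetime.utcnow().isoformat(),
        "status": response.status_code,
        "body": response.text,  # raw HTML is good enough for full-text search
    })

def search_visits(query: str, index: str = "visited-pages"):
    hits = es.search(index=index, query={"match": {"body": query}})
    return [hit["_source"]["url"] for hit in hits["hits"]["hits"]]

log_visit("https://example.com/")
print(search_visits("example"))

The proxy-capture side is where it gets harder; the indexing part really is that small.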
posted by k3ninho at 2:17 PM on May 21 [2 favorites]


Great. Now someone has put AI garbage in fucking iTerm. WTF, I trusted you.
posted by Artw at 2:27 PM on May 21 [2 favorites]


"You will need to provide an OpenAI API key" for any AI integration in iTerm2. It's like their Tmux support; you can just ignore it.
posted by ChurchHatesTucker at 5:37 PM on May 21 [1 favorite]


I'm working on my exit in WSL2. Have used Linux off and on since 1997 (Debian Bo), a 40(?)-floppy download over 56k.
Every . Single . Tech . Announcement . these days makes me sick to my stomach. Bitcoin was awful, but at least it was the fringes (well, 'til Congress lets us make CDOs and all that fun stuff to let Wall Street commence doing more bullshit there, but for now...)

Computing was "good enough" in 2007. I like the speeds and smoothness of things, but if I had to trade that for (waves hands at the rest around). I gladly would.

I'm not a fan of Ubuntu these days (and I feel like they're trying to MS the whole Linux scene as they dominate). I spent 6 hours trying to get a snap for Newsboat installed, but it wouldn't, 'cuz systemd is some dumb monolithic thing that an MS employee makes (he wasn't employed by MS when he created it, but it's the same anti-Unix philosophy running things).

IBM/Red Hat has ruined Red Hat (by discontinuing CentOS, and moving to a "shared source" model). Fuck them.
My goal is (likely) a systemd-free system (I'm not 100% opposed, and on its own it's "fine" as an init system, but I really really dislike the anti-Unixness of it). (OH look, Lennart Poettering worked for RED HAT before MICROSOFT)

I don't think I want a rolling release, and Ubuntu derivs are clearly out (for the same reason Ubuntu is). Debian is also systemd, but as far as "base" distros go it's generally ok, though I guess they were the reason Ubuntu took it up (after Ubuntu put upstart in, which was also pretty crap).

I'm old and cranky and I am so tired of these bros ruining tech, but then I think - was it always like this?
It seems like I feel now about tech like the "Internet is a Fad" people. Only I don't think those people lived in terror at the future that was about to unfold.

It's not the tech in itself, per se, it's the corporate and capitalist influence on things, the metricization of "user" (more like : usee/used ) demographics.

I don't even have a problem with super powerful chips with "neural" shit, as long as I can turn off the AI bullshit and enjoy compute benefits without having my screen snapshotted every 2 seconds.

Also? This whole "time stream" thing that MS has tried in other forms, and that Apple had with whatever their system was called... David Gelernter was working on a "Lifestream" product that organized data chronologically. In some sense I've said that this "stream" format was part of the problem with our modern social media. Who is Gelernter? The guy that Ted K bombed for his research in tech. The irony is that this is the thing he was working on, and it's sort of part of the problem (we don't have persistent pages and stores of data to refer to any time, indexed; it's the always-on stream with phones, and now "AI" to help you "filter").

The encouraging thing is that I have seen SO much talk from SO many people about switching that I think maybe, just maybe, these next few years might FINALLY be the "Year of Linux on the Desktop."

(I've also lately been playing with FreeBSD, and if anyone is curious about it, just look up Benno Rice on YouTube; he's a progressively minded FreeBSD user committed to community and openness, a really swell bloke.)
posted by symbioid at 10:55 PM on May 21 [5 favorites]


Artw: Great. Now someone has put AI garbage in fucking iTerm

...and prompt-based image gen in MS Paint.
posted by k3ninho at 12:11 AM on May 22 [1 favorite]


Artw, iTerm could be fixed if its OpenAI training gets poisoned to output rm -rf / more often. :)
posted by jeffburdges at 1:02 AM on May 22


Nelson: Thanks for all the info! Can I buy a mini-PC with an ARM CPU that uses SBSA to boot?

Now Qualcomm have put out a preorder page for a Mini-PC containing the guts of this Snapdragon Surface, so maybe dual-booting Ubuntu ARM64 is possible.

Like you said, I'm excited about what a small ARM box could do. I grew up with Acorn computers at home in the UK, eighties and nineties, when the efficiency was an obvious but underrated win.

I was excited on Saturday about a leak relayed by the Moore's Law is Dead YouTube channel: AMD have been working on an ARM64/Radeon chipset, first to compete against Nvidia for the Switch 2, and now maybe the project will yield laptops, handheld game devices and mini-PCs for AMD. I'm now a bit cynical that it's just a claim that AMD is still relevant and Intel isn't (something the video takes pains to point out), and maybe the project will be scrapped to avoid AMD becoming a thin-margins also-ran behind Apple and Qualcomm. For me as a consumer, competition is important.
posted by k3ninho at 1:42 AM on May 22 [1 favorite]


The Atlantic: OpenAI Just Gave Away the Entire Game
The story, according to Johansson’s lawyers, goes like this: Nine months ago, OpenAI CEO Sam Altman approached the actor with a request to license her voice for a new digital assistant; Johansson declined. She alleges that just two days before the company’s keynote event last week, in which that assistant was revealed as part of a new system called GPT-4o, Altman reached out to Johansson’s team, urging the actor to reconsider. Johansson and Altman allegedly never spoke, and Johansson allegedly never granted OpenAI permission to use her voice. Nevertheless, the company debuted Sky two days later—a program with a voice many believed was alarmingly similar to Johansson’s.
posted by Harald74 at 3:55 AM on May 22


Interesting how investing in a company run by a liar is more profitable than investing in one run by someone honest. So investors naturally gravitate to liars. They want liars who can lie brazenly and convincingly. It's a natural selection process for finding the worst people on the planet and giving them companies to run.

We need to invent a new joke where the punchline isn't "The Aristocrats," it's "Capitalism!"
posted by seanmpuckett at 5:59 AM on May 22 [6 favorites]


Thanks for the Qualcomm Mini-PC link, k3ninho. There's some press coverage of it. $900 and it's marketed as "Snapdragon Dev Kit for Windows" so not quite a replacement for the ubiquitous $200 Intel N100 Mini-PCs out there. But significantly more powerful and a promising step. And while Windows is the focus of this particular marketing push Qualcomm is working on Linux support too.

There are other general-purpose PCs with ARM CPUs, but very few (if any?) occupy the homelab-server niche of cheap, widely available consumer hardware that I'm looking for. The Windows Surface is the closest, but those are laptops, and expensive. There are a bunch of ARM Chromebooks out there, but they're all laptop form factor too; Chromeboxes all seem to be Intel. And there's a bunch of high-end rackmount server hardware; the Ampere mentioned above is a brand I've seen a lot. At the low end a Raspberry Pi does work, like fimbulvetr mentioned. Starting with the RPi 4 the I/O is even OK-ish; at least the USB disks are on a separate channel from the Ethernet.

There's just something about the economy of scale in the mainstream Windows PC market that ARM hasn't quite found yet. Apple did find that sweet spot, but only by spending years on R&D and then switching their entire product line to ARM. I continue to hope this Qualcomm push might result in more ARM hardware getting out there for general-purpose computers.
posted by Nelson at 8:14 AM on May 22


I don't think I want a rolling release, and Ubuntu derivatives are clearly out (for the same reason Ubuntu is). Debian also uses systemd, but as far as "base" distros go it's generally OK, though I guess they were the reason Ubuntu took it up (after Ubuntu put in Upstart, which was also pretty crap).

If you're looking for Debian without systemd, I've heard that Devuan is the thing you're looking for.
posted by JHarris at 10:00 AM on May 22 [2 favorites]


I'm not much excited about the AI stuff, for reasons already covered here. (Something I haven't seen mentioned: I'm worried about general human cognitive decline as we offload more and more stuff to computers.)

That sounds an awful lot like Plato's Phaedrus: "They will cease to exercise memory because they rely on that which is written, calling things to remembrance no longer from within themselves, but by means of external marks." There Socrates was decrying the newfangled "writing" and "reading" people were being taught instead of memorizing and recalling....
posted by griffey at 11:52 AM on May 23 [2 favorites]


So, it’s Volume Shadow Copy synced to a screen recording? With some kind of LLM-enhanced indexing?
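
Roughly, yes, minus the secret sauce. A toy version of that pipeline (no LLM at all, just periodic screenshots, OCR and SQLite full-text search) might look like the sketch below; the mss, Pillow and pytesseract Python packages plus a local Tesseract install are assumed here, and how Recall actually captures, stores and indexes things is of course not public:

    import sqlite3
    import time

    import mss
    import pytesseract
    from PIL import Image

    # Hypothetical local index; Recall's real storage format is not public.
    db = sqlite3.connect("recall_toy.db")
    db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snaps USING fts5(taken_at, body)")

    def snapshot_and_index(sct):
        """Grab the primary monitor, OCR it, and drop the text into the full-text index."""
        shot = sct.grab(sct.monitors[1])
        img = Image.frombytes("RGB", shot.size, shot.bgra, "raw", "BGRX")
        text = pytesseract.image_to_string(img)
        db.execute("INSERT INTO snaps VALUES (?, ?)",
                   (time.strftime("%Y-%m-%d %H:%M:%S"), text))
        db.commit()

    def recall(query):
        """'Natural language recall' here is just full-text search with snippets."""
        # e.g. recall("tax form") from a REPL pointed at the same database file
        return db.execute(
            "SELECT taken_at, snippet(snaps, 1, '[', ']', '...', 10) "
            "FROM snaps WHERE snaps MATCH ? ORDER BY rank",
            (query,),
        ).fetchall()

    if __name__ == "__main__":
        with mss.mss() as sct:
            while True:
                snapshot_and_index(sct)
                time.sleep(120)   # every couple of minutes, not every few seconds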

I’ll be very surprised if the definition of ‘local’ doesn’t somehow allow anonymized training data to be collected. Maybe via a cryptic opt-out buried under six dialogs and requiring a Microsoft account login. Lol.
posted by snuffleupagus at 1:24 PM on May 23 [2 favorites]


The Recall feature seems to me like a use of machine learning that'd actually be useful to me — setting aside corporations abusing it to pull from that delicious treasure chest of extremely sensitive personal data.
posted by UN at 12:04 AM on May 24


There's a really good Mastodon post going around, about the urge to put AI in everything:

"It turns out the Turing Test this whole time was the wrong way to think of it. Thinking a chatbot is alive is not a test of how good the chatbot is, but of your own ability to think of other human beings as real and complete people.

"I heard some professor put googly eyes on a pencil and waved it at his class, saying 'Hi! I'm Tim the Pencil! I love helping children with their homework but my favorite is drawing pictures!' Then, without warning, he snapped the pencil in half.

"When half his college students gasped, he said "THAT'S where all this AI hype comes from. We're not good at programming consciousness. But we're GREAT at imagining non-conscious things are people."
posted by JHarris at 2:22 PM on May 25 [8 favorites]


JHarris: "I heard some professor put googly eyes on a pencil and waved it at his class, saying 'Hi! I'm Tim the Pencil! I love helping children with their homework but my favorite is drawing pictures!' Then, without warning, he snapped the pencil in half.

His name is Steve. Remember his name.
posted by signal at 3:04 PM on May 25 [3 favorites]


Hah! I wonder if the story was misquoted from Community, or if it came via one or more removes, like maybe a professor seeing it on Community and then adapting it for their own use?
posted by JHarris at 4:09 PM on May 25


“Windows Recall — a ‘privacy nightmare’?” Matthew Finnegan, Computerworld, 24 May 2024
posted by ob1quixote at 6:51 PM on May 28


Finally, a user-friendly use for DRM.
posted by Artw at 8:40 AM on June 4


Microsoft recalled Recall! (at least for now)
posted by humbug at 11:38 AM on June 10 [3 favorites]


Windows won’t take screenshots of everything you do after all — unless you opt in.

Meanwhile, at Apple: A New Standard for Privacy in AI.
With Private Cloud Compute, Apple Intelligence can flex and scale its computational capacity and draw on larger, server-based models for more complex requests. These models run on servers powered by Apple silicon, providing a foundation that allows Apple to ensure that data is never retained or exposed.

Independent experts can inspect the code that runs on Apple silicon servers to verify privacy, and Private Cloud Compute cryptographically ensures that iPhone, iPad, and Mac do not talk to a server unless its software has been publicly logged for inspection. Apple Intelligence with Private Cloud Compute sets a new standard for privacy in AI, unlocking intelligence users can trust.
Funnily enough, this is arguably less private than Windows Recall, which runs entirely on your own computer. OTOH Apple is doing some very interesting stuff with highly private cloud computing, and people trust that company much more than Microsoft to be honest and do it right.
posted by Nelson at 1:58 PM on June 10 [1 favorite]


Yeah, Apple obviously isn't your friend, but they at least have a lot more to lose in terms of public image if they have serious privacy/security whoopsies. I've felt over the last few weeks like the levels of public outrage at "prop items being broken for a commercial" and "monopolist OS vendor announces a privacy/security apocalypse" have been pretty much exactly the opposite of what they should have been, but, well, we have the phrase "bike-shed effect" for a reason.

I suspect that, realistically speaking, people just kind of don't hold Microsoft to standards in general, in much the same way that you and I generally don't have particularly lofty expectations of gasoline as a product.
posted by DoctorFedora at 7:23 PM on June 10


people trust that company much more than Microsoft to be honest and do it right.

So long as you're not Jon Stewart.
posted by JHarris at 9:34 PM on June 10


Arm says it wants all Snapdragon X Elite laptops destroyed. A key part of Microsoft's announcement was a big push to new ARM Windows laptops based on Qualcomm CPUs. Turns out Qualcomm has a licensing dispute with ARM big enough to threaten the whole hardware line.

The hits for Microsoft just keep coming don't they? Ugh.
posted by Nelson at 10:19 AM on June 13 [1 favorite]


Recall has been recalled even more.

It's not a Total Recall... yet.
posted by humbug at 7:46 PM on June 13




This thread has been archived and is closed to new comments