Does it upset you when people do things that upset others?


This question came about over a discussion my brother and I had about whether dogs should be on leashes when outside. We both agreed that yes, they should, for several reasons, but that's not the point.

Let's use a hypothetical to better illustrate the question. Imagine that there's a perfume - vanilla, for example - that doesn't bother you at all (you neither like nor dislike it), but that is very upsetting to some people and can even cause adverse reactions (allergies or something). In this hypothetical, based on the negative effects, you agree that vanilla perfumes should be banned. Currently, however, they are allowed.

You're walking down the street and catch the smell of someone passing by who's wearing a vanilla perfume.

Would that upset you? Why, or why not?


My answer is yes, without a doubt. Even though the smell itself doesn't bother me, the fact that someone would wear that perfume, and not only potentially upset others but put them in danger, is upsetting.

My brother, however, would say no! He couldn't explain his reasoning to me.

I know this is a little convoluted, but I hope I got my question across.

in reply to BryceBassitt

There was a certain type of perfume that seemed popular back in the 90s that would make me instantly gag and almost puke within seconds. I have no clue how anyone found that smell pleasant in any way.

To me it smelled like a woman with a nasty yeast infection trying to cover it up with potpourri. But it wasn't even about anyone's health: the smell of literal potpourri alone triggers the same gag reflex in me. The stuff just smells nasty to me, and I can't be in the same room as that smell for long.

So yes, there are reasons to be offended by particular scents, even if others somehow find them pleasant.

[Workaround] (Arch, KDE Plasma 6.4, Wayland) Resuming from sleep taking up to 30 seconds, display settings not loading, screen auto-rotate broken after suspend - issue with iio-sensor-proxy 3.7


Once again posting something for reference as I couldn't find it online

Symptoms


No issues after logging in.
After suspending (sleep) and resuming, the screen takes 25-30 seconds to turn on.
Display settings in Plasma take a long time to load and sometimes don't show the automatic rotation option.
Turning the screen on after turning it off (even without sleep) takes a long time.
No suspicious logs in the kernel log or journald (even after comparing against post-fix logs).
Switching kernels makes no difference.
Logging out and back in temporarily fixes screen rotation and screen waking until the next suspend.
Everything works in an X11 session apart from screen rotation (which appears unsupported).
Running `monitor-sensor` hangs after a suspend.
`systemctl stop iio-sensor-proxy` fixes the slowdown issues.

Workaround


Downgrading to iio-sensor-proxy 3.6-1 following Arch Linux package downgrade instructions.
In my case, with a cached package:

```
sudo pacman -U file:///var/cache/pacman/pkg/iio-sensor-proxy-3.6-1-x86_64.pkg.tar.zst
```

and optionally adding it to `IgnorePkg` in `/etc/pacman.conf`:

```
IgnorePkg   = iio-sensor-proxy # Issues in Wayland after suspend
```
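A different angle that might also work (untested sketch; whether restarting the proxy is actually enough here is an assumption on my part): a systemd system-sleep hook that restarts iio-sensor-proxy on resume, instead of pinning the old version:

```shell
#!/bin/sh
# Hypothetical hook: /usr/lib/systemd/system-sleep/restart-iio.sh
# (must be executable). systemd invokes system-sleep scripts with
# "pre" before suspending and "post" after resuming.
if [ "$1" = "post" ]; then
    systemctl restart iio-sensor-proxy.service
fi
```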

System info


OS: Arch Linux x64
Host: Lenovo ThinkPad L390 Yoga
Kernel: 6.12.35-1-lts
DE: Plasma 6.4.2
iio-sensor-proxy (broken version): 3.7-1
Last full system upgrade: 2025-07-06


Installing Guix as a Complete GNU/Linux System - System Crafters


Trying out Guix for the first time! Waiting for packages to download.

I'm a long-time Arch user. Any tips?!

I've heard there aren't as many packages for Guix as other distros, but I was thinking Flatpak and distrobox will help bridge the gap for me.

in reply to paequ2

Btw, here's how you configure HiDPI for GNOME. Unfortunately, my laptop has a hydeepeeay display, so it's not fully compatible with Linux. (It's 3840x2160, so at least 2x scaling is possible, hypothetically.)

Commands from the Arch Wiki, but also adds cursor scaling:

```
$ gsettings set org.gnome.settings-daemon.plugins.xsettings overrides "[{'Gdk/WindowScalingFactor', <2>}, {'Gtk/CursorThemeSize', <48>}]"
$ gsettings set org.gnome.desktop.interface scaling-factor 2
```

The default GNOME configuration is somehow missing that. I didn't have to do this on Arch, but I do on Guix. IDK. Anyway, if you don't run those commands, certain apps will be tiny, including the mouse cursor.
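In case the scaling misbehaves, the current values can be inspected, and either change undone, with the same schemas (just a sketch of the standard gsettings verbs):

```shell
# See what's currently applied:
gsettings get org.gnome.settings-daemon.plugins.xsettings overrides
gsettings get org.gnome.desktop.interface scaling-factor

# Revert to the schema defaults if something looks off afterwards:
gsettings reset org.gnome.settings-daemon.plugins.xsettings overrides
gsettings reset org.gnome.desktop.interface scaling-factor
```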

How do your thoughts work?


As per the title, I am curious: how do your mind and your thoughts work? I've only ever experienced my own thoughts, so I'm curious how it works for other people.

I for one feel like my thoughts are sometimes like me talking to myself silently. Sometimes I even let out a random short sound, which I've started disguising with a quiet laugh or a cough or whatever, like it was part of something and not an inner monologue almost leaking out.

So, how do your thoughts work?

billionaires are a cancer on society [literally]


Ok so how does a cancer kill its host?

It grows until it consumes so many nutrients that the other living cells don't get enough. The host literally starves, even while eating plentifully.

The same applies to the US: the billionaires are not only hoarding wealth, but by doing so they're crippling the economy for workers and everybody besides themselves.

in reply to Karna

Beyond raw horsepower, 7-Zip quietly tightens its handling of several legacy formats. Support for ZIP, CPIO, and FAT archives has been refined, smoothing edge-case extractions that previously required third-party tools.


Over the years there were a few .zip archives that 7z could not handle for whatever reason. For those cases I had to use another application, but I don't know the reason. And my bad for not keeping copies of those files for future testing.

Weird line tearing on KDE


Posting here too as I've not had any responses in the more relevant communities.

Hi there, I've got this really odd issue where certain windows will cause random lines, like the one in the screenshot, to appear on my screen. They will often flicker a bit and will disappear if I hover my mouse over them. The lines will display what is beneath the window itself. These occur quite frequently and are frankly getting quite annoying to deal with.

Is this a known issue with KDE right now? It does not happen while using Gnome on the same machine + screen. If it matters I am running CachyOS+KDE 6.4.1+AMD.

If there is anything I can do to fix this then I'd greatly appreciate some pointers!


Linux gave me a brand new laptop


I bought a Lenovo laptop, one from the bargain bin: 11th-gen Intel and 8 GB of soldered RAM.

Even after I reinstalled Windows to make sure all the bloatware was removed, it was almost unusable. At boot I was left with only 800 MB of free memory, and "Lenovo Vantage" kept reappearing automatically like malware. (It's a useless Electron app that wastes half a gig of RAM to show an on-screen indicator when you press Caps Lock, check for driver updates, and try to upsell you on an extended warranty.)

At idle the machine was as loud as a jet, with CrystalDiskMark always complaining "the NVMe drive is over 65°C!!" (I'm guessing from the constant swapping).

Battery life was a disaster: 2 hours at idle with no foreground apps open.

I thought the CPU was too slow for my use and the RAM not enough, so I was planning to spend a few hundred euros on a new laptop with at least 16 GB of RAM.

Then I installed CachyOS and, because I'm a masochist, I chose Hyprland at the "easy" install screen that asks which of the 19 available DEs you prefer.

After a week of suffering, trying to understand all the text configuration files for everything (it was a shock, everything needs the terminal), I'm now getting used to it and... it's like I got a brand new laptop?!?

Memory: a clean boot is now obviously the reverse situation. I don't have only 800 MB of free RAM; the whole system uses only 800 MB.

Temperatures: by default CachyOS shows the CPU temperature on Waybar, and it's always around 40-45°C. The fan is way quieter. At idle it can even stop; before, it was like a hair dryer even after a clean boot.

Battery life: astounding. I can't believe I can use it for a whole afternoon. I accidentally fell asleep, and when I came back after two hours it had lost only 10% (at idle, screen turned off automatically).

Gaming performance: I've only tried casual games, but with something like Tinytopia I get 60 fps on ultra, when on Windows it was choppy on high.

“It’s over”: David Suzuki says it’s too late to stop climate change now and the damage is already done


And guess what ... most of the people who 'did what they could' just kept driving their cars. 'What choice did we have?' None.

Should Humanity Continue? Glenn Reacts to Thiel Interview [20:56 | JUL 05 2025 | Glenn Greenwald]


cross-posted from: lemmy.world/post/32525534

SponsorBlock and Generated Summary below:

SponsorBlock:
1. 0:00.000 - 0:06.150 Intermission
2. 18:43.000 - 20:56.301 Unpaid/Self Promotion


Video Description:

This is a clip from our show SYSTEM UPDATE, now airing every weeknight at 7pm ET on Rumble. You can watch the full episode for FREE here: rumble.com/v6vontt-system-upda…

Now available as a podcast! Find full episodes here: linktr.ee/systemupdate_

Join us LIVE on Rumble, weeknights at 7pm ET: rumble.com/c/GGreenwald

Become part of our Locals community: greenwald.locals.com/


Generated Summary:

Main Topic: The video discusses Peter Thiel's interview where he hesitates when asked if the human race should continue, and Glenn's reaction to Thiel's views on transhumanism, AI, and the potential dangers of unchecked billionaire influence.

Key Points:

  • Thiel's Hesitation: The video starts by referencing Peter Thiel's interview where he seemed uncertain about whether humanity should continue.
  • Transhumanism and AI: The discussion explores the transhumanist philosophy prevalent in Silicon Valley, focusing on merging humans with technology and AI, as exemplified by Mark Zuckerberg's vision of brain implants.
  • Autism and Conformity: Glenn discusses Thiel's perspective on autism, suggesting that it can provide a detachment from societal norms, fostering innovation.
  • Billionaire Culture: A significant portion of the video critiques the culture of Silicon Valley billionaires, arguing that their wealth and power, combined with constant flattery, can lead to detachment from reality, dangerous levels of self-confidence, and utopian/dystopian visions for society.
  • Essentialism vs. Nihilism: The video touches on the philosophical implications of transhumanism, contrasting it with essentialist views of humanity and raising concerns about the potential destruction of what it means to be human.
  • Lack of Debate: Glenn expresses concern about the lack of societal debate and safeguards surrounding the rapid advancement of AI, driven by billionaires with unchecked power.

Highlights:

  • Glenn's concern about billionaires' ability to reshape society without proper debate due to their wealth and perceived brilliance.
  • The discussion of Thiel's autism and how it might influence his unconventional thinking.
  • The comparison of mind-altering drugs to autism as a means of achieving transcendent thought.
  • The critique of Mark Zuckerberg's vision of brain implants and the potential implications for humanity.

About Channel:

Independent, Unencumbered Analysis and Investigative Reporting, Captive to No Dogma or Faction.

Musk founds his own party after conflict with Trump


[CH] French people are no longer allowed into the public pool in Porrentruy JU, yet again


#Europa is closing itself off.

In a municipality in the Jura, foreigners (more precisely: people without a residence or job in #Schweiz) are no longer allowed into the swimming pool.

The French are allegedly behaving too badly.

berlin.social/@mina/1147969856…


Preventing Asian Citrus Psyllid


I have a small lemon tree that was bought from a local grower and came with the extra bonus of an Asian Citrus Psyllid infestation. The tree is dead now and I'd love to get a new tree, but I want to make sure I've done everything I can to prevent a new tree from being infested by any psyllids still in the area.

Is there anything I can do to treat my soil or surrounding plants to make sure those little buggers aren't going to keep coming back? I'm in California, where the sale of imidacloprid products is banned; that was previously the primary treatment for this.

in reply to anticonnor

In all seriousness, you need to contact your local university agriculture extension. Like right now.

This is a huge problem generated by unlicensed growers, who are shipping these trees all over the country and causing massive outbreaks.

Unless you live in an area with an abundance of mantids or wasps, I don't think there are any means of control aside from harsh pesticides.

Call your local extension immediately, tell them where you got it, and have them come visit to treat if necessary.

Weird line tearing on KDE


Hi there, I've got this really odd issue where certain windows will cause random lines, like the one in the screenshot, to appear on my screen. They will often flicker a bit and will disappear if I hover my mouse over them. The lines will display what is beneath the window itself. These occur quite frequently and are frankly getting quite annoying to deal with.

Is this a known issue with KDE right now? It does not happen while using Gnome on the same machine + screen. If it matters I am running CachyOS.

If there is anything I can do to fix this then I'd greatly appreciate some pointers!

GrapheneOS version 2025070300 released


Tags:
  • 2025070300 (Pixel 6, Pixel 6 Pro, Pixel 6a, Pixel 7, Pixel 7 Pro, Pixel 7a, Pixel Tablet, Pixel Fold, Pixel 8, Pixel 8 Pro, Pixel 8a, Pixel 9, Pixel 9 Pro, Pixel 9 Pro XL, Pixel 9 Pro Fold, Pixel 9a, emulator, generic, other targets)

Changes since the 2025070100 release:

  • increase virtual memory reserved for Binder buffers from 1MiB to 8MiB due to Android 16 having a very large Binder transaction scaling up based on the number of apps and profiles which can go beyond the total size limit and break fully booting the OS, which occurred for a tiny number of our Alpha testers (if you were one of the tiny number of Alpha channel testers running into this, you can sideload this release to resolve the issue)
  • fix issues with display of the end session button to avoid it being wrongly displayed for Owner or not displayed for secondary users (we may remove this part of the upstream end session UI or make it optional since the functionality is also in the power menu)
  • update Pixel USB HAL to Android 16 (this was omitted in the initial port due to needing special handling for our USB-C port and pogo pins control feature)
  • always use UTC as the time zone for build date properties
  • kernel (6.6): update to latest GKI LTS branch revision

GrapheneOS Foundation Commentary On ICEBlock's False Claims About Push Notifications


ICEBlock is making incredibly false privacy claims for marketing. They falsely claim it provides complete anonymity when it doesn't. They're ignoring both data kept by Apple and data available to the server but not stored. They're also spreading misinformation about Android:

iceblock.app/android

Their claims about push notifications on Android compared to iOS are completely false. Both Firebase Cloud Messaging (FCM) and the Apple Push Notification service (APNs) function in a similar way with similar privacy. However, Android does not force using FCM and apps can use other push systems.

iOS forces the use of Apple services, including getting apps through Apple, where they have a record of which apps each person and account has installed, and using their push notification service. Both FCM and APNs have tokens. Android doesn't allow apps to access device IDs. Push tokens aren't device IDs.

Apple and Google can identify devices/users based on push tokens obtained by law enforcement from services. Unlike Google, Apple only recently began requiring warrants:

reuters.com/technology/apple-n…

ICEBlock's claims about this are highly inaccurate and they haven't acknowledged corrections.

in reply to spujb

           MY ANTI DEPRESSANTS JUST KICKED
                 IN ! FANTASTIC !
                   __      _______ ______
                ╱    /\__/\       //     ╲╲
        ______⊂╱    ( ´∇`  )     // ⊃     ||╲ フ 🡖
      ,´__▔▔▔▔╱  ▔╱▔  ⌒▔▔▔▔╱▔▔▔▔ 🡖▔ ▔▔▔▔▔🡖 ▔▔▔▔ |
    ,╱_ _╱   /-o—/ ___ ╱▔▔╱ ___/\  |     ▔ | /\__|
   ,========————´=============/⌒ ╲=/=======||🡖 ||
   | __  |  GAY!  |   __ "    |⌒| |/    ___/|  )╯
   )|🞕|_∈≡≡≡≡≡≡≡≡≡∋__|🞕|"  __|| ╯ ╯__ -‒‒‒‒‒┘  ╯
   ▔╲ ▔╲__╯▔▔▔▔▔▔▔▔三三三▔╲  ╲__╯ ▔▔     三三三三╯
     三三三三三三三三三三三三三三三三三三三三三三三三三三三三
       三三三三三三三三三三三三三三三三三三三三三三三三三三三三

OOP tweet (they explain in this thread that it’s from circa 2012): web.archive.org/web/2022090606…

pastebin archive (mildly unsatisfactory recreation, for example uses the wrong character for the headlights instead of perhaps 日): web.archive.org/web/2023061201…

recreation credit: web.archive.org/web/2022031512…

work of a fellow archivist: web.archive.org/web/2025021621…

have given up looking for older versions to archive but
~~if anyone knows how to search usenet communities that might be where the forum OOP got this from lives~~ actually i’m pretty sure 4-ch.net/dqn/index.html is the forum as it perfectly matches the description—all posts are timestamped 1993-09

cute redraw:


US consultancy firm [Boston Consulting Group] involved in GHF aid scheme modelled plans to 'relocate' Palestinians


The spy, private equity baron and ghost of a Trump donor: The revolving door behind a Gaza mercenary firm


in reply to cheese_greater

It depends on the transit service, and how much their IT people suck. I'm pretty sure there have been multiple attempts to make standardized APIs for this sort of thing, but you shouldn't necessarily expect them to be widely used except maybe in Europe.

Do a web search for "[transit service name] API" and start from there.

Edit: My local transit service apparently publishes a GTFS feed, which may be more widespread than I assumed, but I'm honestly kinda surprised they didn't try to roll their own or something stupid like that.
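Since GTFS came up: a feed is just a zip of CSV files with spec-defined columns, so it's easy to poke at. A minimal sketch (the file and column names come from the GTFS spec; the sample data is made up):

```python
# Reading stop locations from a GTFS feed's stops.txt.
import csv
import io

def parse_stops(stops_csv: str) -> dict[str, tuple[float, float]]:
    """Map stop_id -> (lat, lon) from the contents of stops.txt."""
    reader = csv.DictReader(io.StringIO(stops_csv))
    return {
        row["stop_id"]: (float(row["stop_lat"]), float(row["stop_lon"]))
        for row in reader
    }

# Tiny inline sample standing in for a real downloaded feed:
sample = """stop_id,stop_name,stop_lat,stop_lon
S1,Main St,45.5017,-73.5673
S2,Elm Ave,45.5090,-73.5540
"""
stops = parse_stops(sample)
print(stops["S1"])  # (45.5017, -73.5673)
```

A real feed would come zipped from the agency's published URL; `zipfile.ZipFile` plus the same `csv.DictReader` covers that case too.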


BlackRock Halted Ukraine Fund Talks After Trump’s Election Win


archive.ph/xZJXi

Wafrn: a tumblr clone that federates with fedi and now also has opt in native bluesky


Hello, it's me, gabbo, the creator of this hellsite. I am totally not making this post to make sure that Lemmy federation works properly.
in reply to Imhotep

Bit confused: what would they even do with a report of a downvote? It doesn't make sense.

Plus, I don't even understand why someone cares so much about a downvote that they would message you and report it. Upvotes and downvotes mean seriously nothing. They're "thin air".

Put down your device and it has no impact on your life. Keep using Lemmy and it will have no impact on how you use Lemmy.

in reply to LadyButterfly

All of these things can be helped by using a tracking app that projects all your balances and recalculates every time you enter more information or simulate the various choices you're about to pull the trigger on. You get instant feedback and see how a choice screws up your bill money, and hopefully you learn to hate that feeling, so sticking to the plan stops being such an active effort and becomes going with the flow.
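The projection idea is simple enough to sketch. A toy version (all names and amounts invented) of the "recalculate every balance as you add transactions" loop:

```python
# Fold planned transactions into a running balance, so the effect of a
# purchase is visible before you commit to it.
def project(balance: float, upcoming: list[float]) -> list[float]:
    """Running balance after each planned transaction (+income, -bills)."""
    out = []
    for amount in upcoming:
        balance += amount
        out.append(round(balance, 2))
    return out

# e.g. paycheck, rent, groceries, then a purchase you're simulating:
print(project(500.00, [+1200.00, -900.00, -150.00, -300.00]))
# [1700.0, 800.0, 650.0, 350.0]
```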

What is the supposed workflow for vanilla Gnome for keyboard users?


Question is in the title: What is the supposed workflow for vanilla Gnome for keyboard users?

Is there any video/design documents which explain, how the workflow is supposed to be?

Assume I have a full-screen web browser on workspace 1. Now I want a terminal... I hit the Super key, type "terminal", hit Enter... and then I have a terminal which does not start maximized on workspace 1. So I can either maximize the terminal and switch between the applications, or arrange them side by side... or I can navigate to workspace 2 and start the terminal there (the terminal will not start maximized on an empty workspace 2 either)... and switch between the two workspaces (AFAIK there are no hotkeys bound by default to navigate directly to a workspace)...

What I simply do not understand: does the vanilla Gnome workflow expect you to mix mouse and keyboard? Like hit Super, use the mouse to go to the next workspace, type "terminal", click to maximize the terminal (or use Super+Up)?

It just seems like a lot of work/clicks/keys to achieve something simple. And to my understanding Gnome expects you to use basically every application with a full screen window anyway, so why does it not open a new application on the next free workspace full screen by default?

in reply to wolf

Settings -> Keyboard -> Keyboard Shortcuts will show all the available keyboard shortcuts. You can also create your own custom keybindings.
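For example, the direct switch-to-workspace keys (unbound by default) can be set from a terminal; a sketch using the standard GNOME keybinding schema, with Super+number as an arbitrary choice:

```shell
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-1 "['<Super>1']"
gsettings set org.gnome.desktop.wm.keybindings switch-to-workspace-2 "['<Super>2']"
gsettings set org.gnome.desktop.wm.keybindings move-to-workspace-1 "['<Super><Shift>1']"
gsettings set org.gnome.desktop.wm.keybindings move-to-workspace-2 "['<Super><Shift>2']"
```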

These seem like a lot of personal design complaints rather than actual issues with GNOME itself.

And to my understanding Gnome expects you to use basically every application with a full screen window anyway


You misunderstood, that's not what GNOME expects at all. Your app not maximizing on startup is because the app doesn't maximize on startup. GNOME doesn't have a setting to maximize all apps by default since that should be the app's responsibility.

If you want the auto-tiling window manager experience, you'll need to install an extension (PaperWM, Tiling Shell, Forge, Pop Shell). Extensions are like applications; there's no shame in using them.


Moving to the US in the worst of times…


I’m a teen from Turkey. I’m moving to the US at the end of this month with my mom to live with her fiancé, so we’re going on a K-1 visa. He lives in Los Angeles County. I’ve been following the news regularly and I’d be lying if I said I wasn’t nervous by what’s happening in the country these days… such a bad time to move there.

‘The vehicle suddenly accelerated with our baby in it’: the terrifying truth about why Tesla’s cars keep crashing


It was a Monday afternoon in June 2023 when Rita Meier, 45, joined us for a video call. Meier told us about the last time she said goodbye to her husband, Stefan, five years earlier. He had been leaving their home near Lake Constance, Germany, heading for a trade fair in Milan.

Meier recalled how he hesitated between taking his Tesla Model S or her BMW. He had never driven the Tesla that far before. He checked the route for charging stations along the way and ultimately decided to try it. Rita had a bad feeling. She stayed home with their three children, the youngest less than a year old.

At 3.18pm on 10 May 2018, Stefan Meier lost control of his Model S on the A2 highway near the Monte Ceneri tunnel. Travelling at about 100kmh (62mph), he ploughed through several warning markers and traffic signs before crashing into a slanted guardrail. “The collision with the guardrail launches the vehicle into the air, where it flips several times before landing,” investigators would write later.

The car came to rest more than 70 metres away, on the opposite side of the road, leaving a trail of wreckage. According to witnesses, the Model S burst into flames while still airborne. Several passersby tried to open the doors and rescue the driver, but they couldn’t unlock the car. When they heard explosions and saw flames through the windows, they retreated. Even the firefighters, who arrived 20 minutes later, could do nothing but watch the Tesla burn.

At that moment, Rita Meier was unaware of the crash. She tried calling her husband, but he didn’t pick up. When he still hadn’t returned her call hours later – highly unusual for this devoted father – she attempted to track his car using Tesla’s app. It no longer worked. By the time police officers rang her doorbell late that night, Meier was already bracing for the worst.


The crash made headlines the next morning as one of the first fatal Tesla accidents in Europe. Tesla released a statement to the press saying the company was “deeply saddened” by the incident, adding, “We are working to gather all the facts in this case and are fully cooperating with local authorities.”

To this day, Meier still doesn’t know why her husband died. She has kept everything the police gave her after their inconclusive investigation. The charred wreck of the Model S sits in a garage Meier rents specifically for that purpose. The scorched phone – which she had forensically analysed at her own expense, to no avail – sits in a drawer at home. Maybe someday all this will be needed again, she says. She hasn’t given up hope of uncovering the truth.

Rita Meier was one of many people who reached out to us after we began reporting on the Tesla Files – a cache of 23,000 leaked documents and 100 gigabytes of confidential data shared by an anonymous whistleblower. The first report we published looked at problems with Tesla’s autopilot system, which allows the cars to temporarily drive on their own, taking over steering, braking and acceleration. Though touted by the company as “Full Self-Driving” (FSD), it is designed to assist, not replace, the driver, who should keep their eyes on the road and be ready to intervene at any time.

Autonomous driving is the core promise around which Elon Musk has built his company. Tesla has never delivered a truly self-driving vehicle, yet the richest person in the world keeps repeating the claim that his cars will soon drive entirely without human help. Is Tesla’s autopilot really as advanced as he says?

The Tesla Files suggest otherwise. They contain more than 2,400 customer complaints about unintended acceleration and more than 1,500 braking issues – 139 involving emergency braking without cause, and 383 phantom braking events triggered by false collision warnings. More than 1,000 crashes are documented. A separate spreadsheet on driver-assistance incidents where customers raised safety concerns lists more than 3,000 entries. The oldest date from 2015, the most recent from March 2022. In that time, Tesla delivered roughly 2.6m vehicles with autopilot software. Most incidents occurred in the US, but there have also been complaints from Europe and Asia. Customers described their cars suddenly accelerating or braking hard. Some escaped with a scare; others ended up in ditches, crashing into walls or colliding with oncoming vehicles. “After dropping my son off in his school parking lot, as I go to make a right-hand exit it lurches forward suddenly,” one complaint read. Another said, “My autopilot failed/malfunctioned this morning (car didn’t brake) and I almost rear-ended somebody at 65mph.” A third reported, “Today, while my wife was driving with our baby in the car, it suddenly accelerated out of nowhere.”

Braking for no reason caused just as much distress. “Our car just stopped on the highway. That was terrifying,” a Tesla driver wrote. Another complained, “Frequent phantom braking on two-lane highways. Makes the autopilot almost unusable.” Some report their car “jumped lanes unexpectedly”, causing them to hit a concrete barrier, or veered into oncoming traffic.

Musk has given the world many reasons to criticise him since he teamed up with Donald Trump. Many people do – mostly by boycotting his products. But while it is one thing to disagree with the political views of a business leader, it is another to be mortally afraid of his products. In the Tesla Files, we found thousands of examples of why such fear may be justified.
[Illustration: "My husband died in an unexplained accident. And no one cared." Carl Godfrey/The Guardian]

We set out to match some of these incidents of autopilot errors with customers’ names. Like hundreds of other Tesla customers, Rita Meier entered the vehicle identification number of her husband’s Model S into the response form we published on the website of the German business newspaper Handelsblatt, for which we carried out our investigation. She quickly discovered that the Tesla Files contained data related to the car. In her first email to us, she wrote, “You can probably imagine what it felt like to read that.”

There isn’t much information – just an Excel spreadsheet titled “Incident Review”. A Tesla employee noted that the mileage counter on Stefan Meier’s car stood at 4,765 miles at the time of the crash. The entry was catalogued just one day after the fatal accident. In the comment field was written, “Vehicle involved in an accident.” The cause of the crash remains unknown to this day. In Tesla’s internal system, a company employee had marked the case as “resolved”, but for five years, Rita Meier had been searching for answers. After Stefan’s death, she took over the family business – a timber company with 200 employees based in Tettnang, Baden-Württemberg. As journalists, we are used to tough interviews, but this one was different. We had to strike a careful balance – between empathy and the persistent questioning good reporting demands. “Why are you convinced the Tesla was responsible for your husband’s death?” we asked her. “Isn’t it possible he was distracted – maybe looking at his phone?”

No one knows for sure. But Meier was well aware that Musk has previously claimed Tesla “releases critical crash data affecting public safety immediately and always will”; that he has bragged many times about how its superior handling of data sets the company apart from its competitors. In the case of her husband, why was she expected to believe there was no data?

Meier’s account was structured and precise. Only once did the toll become visible – when she described how her husband’s body burned in full view of the firefighters. Her eyes filled with tears and her voice cracked. She apologised, turning away. After she collected herself, she told us she has nothing left to gain – but also nothing to lose. That was why she had reached out to us. We promised to look into the case.

Rita Meier wasn’t the only widow to approach us. Disappointed customers, current and former employees, analysts and lawyers were sharing links to our reporting. Many of them contacted us. More than once, someone wrote that it was about time someone stood up to Tesla – and to Elon Musk.

Meier, too, shared our articles and the callout form with others in her network – including people who, like her, lost loved ones in Tesla crashes. One of them was Anke Schuster. Like Meier, she had lost her husband in a Tesla crash that defies explanation and had spent years chasing answers. And, like Meier, she had found her husband’s Model X listed in the Tesla Files. Once again, the incident was marked as resolved – with no indication of what that actually meant.

“My husband died in an unexplained and inexplicable accident,” Schuster wrote in her first email. Her dealings with police, prosecutors and insurance companies, she said, had been “hell”. No one seemed to understand how a Tesla works. “I lost my husband. His four daughters lost their father. And no one ever cared.”

Her husband, Oliver, was a tech enthusiast, fascinated by Musk. A hotelier by trade, he owned no fewer than four Teslas. He loved the cars. She hated them – especially the autopilot. The way the software seemed to make decisions on its own never sat right with her. Now, she felt as if her instincts had been confirmed in the worst way.


Oliver Schuster was returning from a business meeting on 13 April 2021 when his black Model X veered off highway B194 between Loitz and Schönbeck in north-east Germany. It was 12.50pm when the car left the road and crashed into a tree. Schuster started to worry when her husband missed a scheduled bank appointment. She tried to track the vehicle but found no way to locate it. Even calling Tesla led nowhere. That evening, the police broke the news: after the crash her husband’s car had burst into flames. He had burned to death – with the fire brigade watching helplessly.

The crashes that killed Meier’s and Schuster’s husbands were almost three years apart but the parallels were chilling. We examined accident reports, eyewitness accounts, crash-site photos and correspondence with Tesla. In both cases, investigators had requested vehicle data from Tesla, and the company hadn’t provided it. In Meier’s case, Tesla staff claimed no data was available. In Schuster’s, they said there was no relevant data.

Over the next two years, we spoke with crash victims, grieving families and experts around the world. What we uncovered was an ominous black box – a system designed not only to collect and control every byte of customer data, but to safeguard Musk’s vision of autonomous driving. Critical information was sealed off from public scrutiny.

Elon Musk is a perfectionist with a tendency towards micromanagement. At Tesla, his whims seem to override every argument – even in matters of life and death. During our reporting, we came across the issue of door handles. On Teslas, they retract into the doors while the cars are being driven. The system depends on battery power. If an airbag deploys, the doors are supposed to unlock automatically and the handles extend – at least, that’s what the Model S manual says.

The idea for the sleek, futuristic design stems from Musk himself. He insisted on retractable handles, despite repeated warnings from engineers. Since 2018, they have been linked to at least four fatal accidents in Europe and the US, in which five people died.

In February 2024, we reported on a particularly tragic case: a fatal crash on a country road near Dobbrikow, in Brandenburg, Germany. Two 18-year-olds were killed when the Tesla they were in slammed into a tree and caught fire. First responders couldn’t open the doors because the handles were retracted. The teenagers burned to death in the back seat.

A court-appointed expert from Dekra, one of Germany’s leading testing authorities, later concluded that, given the retracted handles, the incident “qualifies as a malfunction”. According to the report, “the failure of the rear door handles to extend automatically must be considered a decisive factor” in the deaths. Had the system worked as intended, “it is assumed that rescuers might have been able to extract the two backseat passengers before the fire developed further”. Without what the report calls a “failure of this safety function”, the teens might have survived.

Our investigation made waves. The Kraftfahrt-Bundesamt, Germany’s federal motor transport authority, got involved and announced plans to coordinate with other regulatory bodies to revise international safety standards. Germany’s largest automobile club, ADAC, issued a public recommendation that Tesla drivers should carry emergency window hammers. In a statement, ADAC warned that retractable door handles could seriously hinder rescue efforts. Even trained emergency responders, it said, may struggle to reach trapped passengers. Tesla shows no intention of changing the design.

That’s Musk. He prefers the sleek look of Teslas without handles, so he accepts the risk to his customers. His thinking, it seems, goes something like this: at some point, the engineers will figure out a technical fix. The same logic applies to his grander vision of autonomous driving: because Musk wants to be first, he lets customers test his unfinished Autopilot system on public roads. It’s a principle borrowed from the software world, where releasing apps in beta has long been standard practice. The more users, the more feedback and, over time – often years – something stable emerges. Revenue and market share arrive much earlier. The motto: if you wait, you lose.

Musk has taken that mindset to the road. The world is his lab. Everyone else is part of the experiment.

By the end of 2023, we knew a lot about how Musk’s cars worked – but the way they handle data still felt like a black box. How is that data stored? At what moment does the onboard computer send it to Tesla’s servers? We talked to independent experts at the Technical University Berlin. Three PhD candidates – Christian Werling, Niclas Kühnapfel and Hans Niklas Jacob – made headlines for hacking Tesla’s autopilot hardware. A brief voltage drop on a circuit board turned out to be just enough to trick the system into opening up.

The security researchers uncovered what they called “Elon Mode” – a hidden setting in which the car drives fully autonomously, without requiring the driver to keep their hands on the wheel. They also managed to recover deleted data, including video footage recorded by a Tesla driver. And they traced exactly what data Tesla sends to its servers – and what it doesn’t.

The hackers explained that Tesla stores data in three places. First, on a memory card inside the onboard computer – essentially a running log of the vehicle’s digital brain. Second, on the event data recorder – a black box that captures a few seconds before and after a crash. And third, on Tesla’s servers, assuming the vehicle uploads them.

The researchers told us they had found an internal database embedded in the system – one built around so-called trigger events. If, for example, the airbag deploys or the car hits an obstacle, the system is designed to save a defined set of data to the black box – and transmit it to Tesla’s servers. In both the Meier and Schuster cases, unless the vehicles were in a complete network dead zone, the cars should have recorded and transmitted that data.
‘Is the car driving erratically by itself normal? Yeah, that happens every now and then.’ Illustration: Carl Godfrey/The Guardian

Who in the company actually works with that data? We examined testimony from Tesla employees in court cases related to fatal crashes. They described how their departments operate. We cross-referenced their statements with entries in the Tesla Files. A pattern took shape: one team screens all crashes at a high level, forwarding them to specialists – some focused on autopilot, others on vehicle dynamics or road grip. There’s also a group that steps in whenever authorities request crash data.

We compiled a list of employees relevant to our reporting. Some we tried to reach by email or phone. For others, we showed up at their homes. If they weren’t there, we left handwritten notes. No one wanted to talk.

We searched for other crashes. One involved Hans von Ohain, a 33-year-old Tesla employee from Evergreen, Colorado. On 16 May 2022, he crashed into a tree on his way home from a golf outing and the car burst into flames. Von Ohain died at the scene. His passenger survived and told police that von Ohain, who had been drinking, had activated Full Self-Driving. Tesla, however, said it couldn’t confirm whether the system was engaged – because no vehicle data was transmitted for the incident.

Then, in February 2024, Musk himself stepped in. The Tesla CEO claimed von Ohain had never downloaded the latest version of the software – so it couldn’t have caused the crash. Friends of von Ohain, however, told US media he had shown them the system. His passenger that day, who barely escaped with his life, told reporters that hours earlier the car had already driven erratically by itself. “The first time it happened, I was like, ‘Is that normal?’” he recalled asking von Ohain. “And he was like, ‘Yeah, that happens every now and then.’”

His account was bolstered by von Ohain’s widow, who explained to the media how overjoyed her husband had been at working for Tesla. Reportedly, von Ohain received the Full Self-Driving system as a perk. His widow explained how he would use the system almost every time he got behind the wheel: “It was jerky, but we were like, that comes with the territory of new technology. We knew the technology had to learn, and we were willing to be part of that.”

The Colorado State Patrol investigated but closed the case without blaming Tesla. It reported that no usable data was recovered.

For a company that markets its cars as computers on wheels, Tesla’s claim that it had no data available in all these cases is surprising. Musk has long described Tesla vehicles as part of a collective neural network – machines that continuously learn from one another. Think of the Borg aliens from the Star Trek franchise. Musk envisions his cars, like the Borg, as a collective – operating as a hive mind, each vehicle linked to a unified consciousness.

When a journalist asked him in October 2015 what made Tesla’s driver-assistance system different, he replied, “The whole Tesla fleet operates as a network. When one car learns something, they all learn it. That is beyond what other car companies are doing.” Every Tesla driver, he explained, becomes a kind of “expert trainer for how the autopilot should work”.

According to Musk, the eight cameras in every Tesla transmit more than 160bn video frames a day to the company’s servers. In its owner’s manual, Tesla states that its cars may collect even more: “analytics, road segment, diagnostic and vehicle usage data”, all sent to headquarters to improve product quality and features such as autopilot. The company claims it learns “from the experience of billions of miles that Tesla vehicles have driven”.

It is a powerful promise: a fleet of millions of cars, constantly feeding raw information into a gargantuan processing centre. Billions – trillions – of data points, all in service of one goal: making cars drive better and keeping drivers safe. At the start of this year, Musk got a chance to show the world what he meant.

On 1 January 2025, at 8.39am, a Tesla Cybertruck exploded outside the Trump International Hotel Las Vegas. The man behind the incident – US special forces veteran Matthew Livelsberger – had rented the vehicle, packed it with fireworks, gas canisters and grenades, and parked it in front of the building. Just before the explosion, he shot himself in the head with a .50 calibre Desert Eagle pistol. “This was not a terrorist attack, it was a wakeup call. Americans only pay attention to spectacles and violence,” Livelsberger wrote in a letter later found by authorities. “What better way to get my point across than a stunt with fireworks and explosives.”

The soldier miscalculated. Seven bystanders suffered minor injuries. The Cybertruck was destroyed, but not even the windows of the hotel shattered. Instead, with his final act, Livelsberger revealed something else entirely: just how far the arm of Tesla’s data machinery can reach. “The whole Tesla senior team is investigating this matter right now,” Musk wrote on X just hours after the blast. “Will post more information as soon as we learn anything. We’ve never seen anything like this.”

Later that day, Musk posted again. Tesla had already analysed all relevant data – and was ready to offer conclusions. “We have now confirmed that the explosion was caused by very large fireworks and/or a bomb carried in the bed of the rented Cybertruck and is unrelated to the vehicle itself,” he wrote. “All vehicle telemetry was positive at the time of the explosion.”

Suddenly, Musk wasn’t just a CEO; he was an investigator. He instructed Tesla technicians to remotely unlock the scorched vehicle. He handed over internal footage captured up to the moment of detonation. The Tesla CEO had turned a suicide attack into a showcase of his superior technology.

Yet there were critics even in the moment of glory. “It reveals the kind of sweeping surveillance going on,” warned David Choffnes, executive director of the Cybersecurity and Privacy Institute at Northeastern University in Boston, when contacted by a reporter. “When something bad happens, it’s helpful, but it’s a double-edged sword. Companies that collect this data can abuse it.”
‘In many crashes, investigators weren’t even aware that requesting data from Tesla was an option.’ Illustration: Carl Godfrey/The Guardian

There are other examples of what Tesla’s data collection makes possible. We found the case of David and Sheila Brown, who died in August 2020 when their Model 3 ran a red light at 114mph in Saratoga, California. Investigators managed to reconstruct every detail, thanks to Tesla’s vehicle data. It shows exactly when the Browns opened a door, unfastened a seatbelt, and how hard the driver pressed the accelerator – down to the millisecond, right up to the moment of impact. Over time, we found more cases, more detailed accident reports. The data definitely is there – until it isn’t.

In many crashes in which Teslas inexplicably veered off the road or hit stationary objects, investigators didn’t actually request data from the company. When we asked authorities why, there was often silence. Our impression was that many prosecutors and police officers weren’t even aware that asking was an option. In other cases, they acted only when pushed by victims’ families.

In the Meier case, Tesla told authorities, in a letter dated 25 June 2018, that the last complete set of vehicle data was transmitted nearly two weeks before the crash. The only data from the day of the accident was a “limited snapshot of vehicle parameters” – taken “approximately 50 minutes before the incident”. However, this snapshot “doesn’t show anything in relation to the incident”. As for the black box, Tesla warned that the storage modules were likely destroyed, given the condition of the burned-out vehicle. Data transmission after a crash is possible, the company said – but in this case, it didn’t happen. In the end, investigators couldn’t even determine whether driver-assist systems were active at the time of the crash.

The Schuster case played out similarly. Prosecutors in Stralsund, Germany, were baffled. The road where the crash happened is straight, the asphalt was dry and the weather at the time of the accident was clear. Anke Schuster kept urging the authorities to examine Tesla’s telemetry data.

Every road user trusts the cars around them not to be a threat. Does that trust still stand when a car is driving itself?

When prosecutors did formally request the data recorded by Schuster’s car on the day of the crash, it took Tesla more than two weeks to respond – and when it did, the answer was both brief and bold. The company didn’t say there was no data. It said that there was “no relevant data”. The authorities’ reaction left us stunned. We expected prosecutors to push back – to tell Tesla that deciding what’s relevant is their job, not the company’s. But they didn’t. Instead, they closed the case.

The hackers from TU Berlin pointed us to a study by the Netherlands Forensic Institute, an independent division of the ministry of justice and security. In October 2021, the NFI published findings showing it had successfully accessed the onboard memories of all major Tesla models. The researchers compared their results with accident cases in which police had requested data from Tesla. Their conclusion was that while Tesla formally complied with those requests, it omitted large volumes of data that might have proved useful.

Tesla’s credibility took a further hit in a report released by the US National Highway Traffic Safety Administration in April 2024. The agency concluded that Tesla failed to adequately monitor whether drivers remain alert and ready to intervene while using its driver-assist systems. It reviewed 956 crashes, field data and customer communications, and pointed to “gaps in Tesla’s telematic data” that made it impossible to determine how often autopilot was active during crashes. If a vehicle’s antenna was damaged or it crashed in an area without network coverage, even serious accidents sometimes went unreported. Tesla’s internal statistics include only those crashes in which an airbag or other pyrotechnic system deployed – something that occurs in just 18% of police-reported cases. This means that the actual accident rate is significantly higher than Tesla discloses to customers and investors.

There’s more. Two years prior, the NHTSA had flagged something strange – something suspicious. In a separate report, it documented 16 cases in which Tesla vehicles crashed into stationary emergency vehicles. In each, autopilot disengaged “less than one second before impact” – far too little time for the driver to react. Critics warn that this behaviour could allow Tesla to argue in court that autopilot was not active at the moment of impact, potentially dodging responsibility.

The YouTuber Mark Rober, a former engineer at Nasa, replicated this behaviour in an experiment on 15 March 2025. He simulated a range of hazardous situations, in which the Model Y performed significantly worse than a competing vehicle. The Tesla repeatedly ran over a crash-test dummy without braking. The video went viral, amassing more than 14m views within a few days.

The real surprise came after the experiment. Fred Lambert, who writes for the blog Electrek, pointed out the same autopilot disengagement that the NHTSA had documented. “Autopilot appears to automatically disengage a fraction of a second before the impact as the crash becomes inevitable,” Lambert noted.

And so the doubts about Tesla’s integrity pile up. In the Tesla Files, we found emails and reports from a UK-based engineer who led Tesla’s Safety Incident Investigation programme, overseeing the company’s most sensitive crash cases. His internal memos reveal that Tesla deliberately limited documentation of particular issues to avoid the risk of this information being requested under subpoena. Although he pushed for clearer protocols and better internal processes, US leadership resisted – explicitly driven by fears of legal exposure.

We contacted Tesla multiple times with questions about the company’s data practices. We asked about the Meier and Schuster cases – and what it means when fatal crashes are marked “resolved” in Tesla’s internal system. We asked the company to respond to criticism from the US traffic authority and to the findings of Dutch forensic investigators. We also asked why Tesla doesn’t simply publish crash data, as Musk once promised to do, and whether the company considers it appropriate to withhold information from potential US court orders. Tesla has not responded to any of our questions.

Elon Musk boasts about the vast amount of data his cars generate – data that, he claims, will not only improve Tesla’s entire fleet but also revolutionise road traffic. But, as we have witnessed again and again in the most critical of cases, Tesla refuses to share it.

Tesla’s handling of crash data affects even those who never wanted anything to do with the company. Every road user trusts the car in front, behind or beside them not to be a threat. Does that trust still stand when the car is driving itself?

Internally, we called our investigation into Tesla’s crash data Black Box. At first, because it dealt with the physical data units built into the vehicles – so-called black boxes. But the devices Tesla installs hardly deserve the name. Unlike the flight recorders used in aviation, they’re not fireproof – and in many of the cases we examined, they proved useless.

Over time, we came to see that the name held a second meaning. A black box, in common parlance, is something closed to the outside. Something opaque. Unknowable. And while we’ve gained some insight into Tesla as a company, its handling of crash data remains just that: a black box. Only Tesla knows how Elon Musk’s vehicles truly work. Yet today, more than 5m of them share our roads.

Some names have been changed.

This is an edited extract from The Tesla Files by Sönke Iwersen and Michael Verfürden, published on 24 July by Penguin Michael Joseph at £22. To support the Guardian, order your copy at guardianbookshop.com. Delivery charges may apply.