Pavel Durov is warning that what many people think of as “deleted” messages may still live on in places they rarely consider: push notification logs and other bits of device‑level data that sit outside encrypted chats.
His comments came after a report describing how US investigators were able to recover supposedly deleted Signal messages from an iPhone by pulling information from the phone’s notification history. The case has reignited an old but still misunderstood debate: end‑to‑end encryption can protect the contents of a message in transit, yet it does not automatically wipe every trace that message leaves on the devices that send and receive it.
According to Durov, the weak point is not necessarily the encryption protocol, but the way modern phones and apps handle push notifications. When a message arrives, an app or operating system often stores enough information to display a banner alert or lock‑screen preview. That data can remain on the device even after a user deletes the conversation inside the app.
Crucially, Durov emphasized that turning off text previews on your own phone is not a complete solution. You can disable notification previews on your own device and still have your messages exposed through the other person’s phone if they keep the default settings that show message snippets on the lock screen. In a two‑way conversation, your privacy depends not only on your own settings but also on the choices of everyone you communicate with.
The episode that sparked the discussion was first outlined in a report from 404 Media. In a criminal investigation, the FBI reportedly obtained Signal message content by examining notification logs stored on an iPhone. The messages themselves were protected within Signal’s end‑to‑end encrypted environment, but the iOS notification system had saved enough data for investigators to read what was sent.
This distinction is essential: encryption can keep outsiders from reading messages as they move between devices or from accessing data stored inside an app’s encrypted database. But the systems around that core (notification services, backup mechanisms, keyboard caches, screenshots, and system logs) may still hold fragments of the same conversations in unencrypted form.
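The gap between an app’s own storage and the surrounding system can be shown with a toy model. Everything here is illustrative: the class and field names are invented and do not correspond to any real iOS or Android API.

```python
# Toy model: deleting a message from an app's own store does not
# touch the copy the OS notification system kept to render the banner.
# All names are illustrative, not real platform APIs.

class SecureMessenger:
    """Stands in for an app's end-to-end encrypted message store."""

    def __init__(self, os_notifications):
        self.messages = {}                  # app-side (notionally encrypted) store
        self.os_notifications = os_notifications

    def receive(self, msg_id, text):
        self.messages[msg_id] = text
        # To display an alert, the OS records enough data to render it.
        self.os_notifications.append({"id": msg_id, "preview": text})

    def delete(self, msg_id):
        # "Delete" only removes the app's own copy.
        self.messages.pop(msg_id, None)


os_log = []                                 # stands in for the OS notification history
app = SecureMessenger(os_log)
app.receive("m1", "meet at 9pm")
app.delete("m1")

print(app.messages)   # {} -- gone from the app
print(os_log)         # the preview text is still sitting in the OS log
```

The point of the sketch is the asymmetry: the app controls only its own dictionary, while the notification history lives outside it and is not cleared by the app’s delete.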
As a result, the case has put fresh attention on metadata and auxiliary data that surround secure messaging. Even when message content is locked down, the devices and services involved can reveal who talked to whom, when, from where, and in some cases, what was said via notification previews. For law enforcement and adversaries, that “data about the data” can be almost as valuable as the messages themselves.
The renewed scrutiny is also spilling over into a wider examination of how messaging apps and operating systems manage, sync, and store notifications. Many platforms rely on centralized servers that route push notifications through Apple’s and Google’s systems, and each hop can create logs. Even if those logs don’t contain full text, they may record identifiers, timestamps, IP addresses, or other technical details that can later be pieced together to reconstruct communication patterns.
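Even content-free delivery logs can reveal a great deal. The sketch below uses hypothetical push-relay log entries (the field names are invented, loosely modeled on what such a relay might retain) to show how identifiers and timestamps alone expose activity and likely conversation pairs.

```python
# Hypothetical push-relay log: no message text, yet the identifiers
# and timestamps alone suggest who talked to whom, and when.
from collections import Counter

relay_log = [
    {"device_token": "tok_A", "app": "messenger", "ts": 1700000000, "ip": "198.51.100.7"},
    {"device_token": "tok_B", "app": "messenger", "ts": 1700000002, "ip": "203.0.113.9"},
    {"device_token": "tok_A", "app": "messenger", "ts": 1700000300, "ip": "198.51.100.7"},
    {"device_token": "tok_B", "app": "messenger", "ts": 1700000301, "ip": "203.0.113.9"},
]

# Pushes per device: a crude activity profile.
activity = Counter(entry["device_token"] for entry in relay_log)

# Pair up deliveries landing within a few seconds of each other --
# a naive signal that two devices are in the same conversation.
pairs = [
    (a["device_token"], b["device_token"])
    for i, a in enumerate(relay_log)
    for b in relay_log[i + 1:]
    if a["device_token"] != b["device_token"] and abs(a["ts"] - b["ts"]) <= 5
]

print(activity)   # both tokens received 2 pushes
print(pairs)      # [('tok_A', 'tok_B'), ('tok_A', 'tok_B')]
```

Real traffic analysis is far more sophisticated, but the principle is the same: correlation across logs substitutes for content.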
Developers who focus on privacy‑preserving tools are increasingly experimenting with designs that minimize what is stored outside the user’s direct control. Local‑first architectures, different routing schemes, and alternative network layers aim to limit how much recoverable information is left behind after a message is sent, viewed, or deleted. But as the Signal notification case shows, changing the app alone is not enough when the operating system itself can capture data independently.
This is where decentralized platforms come into the conversation. Their proponents argue that when messaging and social tools are built without central custodians, there are fewer large databases and fewer single points of failure that can be queried or compromised. Instead, information stays on user devices or is distributed across a network in ways that make bulk collection and centralized logging more difficult.
Interest in such decentralized messaging and social networks has grown sharply in recent years, particularly during periods of blackout, unrest, and heavy internet restrictions. Since 2025, more people have been searching for alternatives that can survive shutdowns or state‑ordered bans. Data cited in the report showed that online search interest in decentralized social platforms jumped 145% over a five‑year period, reflecting a steady shift in how users think about resilience and control.
One example highlighted was Bitchat, a Bluetooth mesh messaging app that connects nearby phones directly without needing an internet connection. During a social media ban in Nepal in September 2025, the app was reportedly downloaded more than 48,000 times as people searched for ways to stay in touch despite blocks on mainstream platforms. Because such tools rely on local, device‑to‑device communication, they can operate outside traditional networks and leave fewer centralized records behind.
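Device-to-device mesh messaging of this kind is typically built on flooding: each phone rebroadcasts a message to its radio neighbors until a hop limit runs out, with duplicate suppression so nodes don’t relay the same message twice. The sketch below is a simplified simulation under assumed names; it is not Bitchat’s actual protocol, and the topology and hop limits are made up.

```python
# Simplified flood routing over a Bluetooth-style mesh.
# Each node relays to its neighbors, decrementing a hop budget (TTL),
# and remembers nodes already reached to avoid duplicate relays.
from collections import deque

# Hypothetical topology: which phones are in radio range of which.
neighbors = {
    "alice": ["bob"],
    "bob": ["alice", "carol"],
    "carol": ["bob", "dave"],
    "dave": ["carol"],
}

def flood(origin, ttl):
    """Return the set of nodes a message starting at `origin` reaches."""
    seen = {origin}
    queue = deque([(origin, ttl)])
    while queue:
        node, hops = queue.popleft()
        if hops == 0:
            continue                      # hop budget exhausted here
        for peer in neighbors[node]:
            if peer not in seen:          # duplicate suppression
                seen.add(peer)
                queue.append((peer, hops - 1))
    return seen

print(flood("alice", ttl=2))   # reaches bob and carol, not dave
print(flood("alice", ttl=3))   # reaches all four phones
```

Note what is absent: there is no central server and no routing table, which is exactly why such networks leave fewer centralized records, at the cost of limited range and delivery guarantees.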
Durov also pointed to his own experience with censorship efforts. When Telegram faced restrictions in Iran, authorities hoped users would migrate to state‑backed services. Instead, many turned to VPNs, routing around blocks rather than embracing government‑controlled alternatives. That behavior underscores a broader trend: when people perceive that their communication is being monitored or controlled, they increasingly look for ways to decentralize or obfuscate their online presence.
However, decentralization is not a magic bullet for privacy. Even decentralized and mesh‑based apps can still generate local logs, store notifications, or leak details through the operating system or hardware they run on. The Signal notification incident is a reminder that privacy is a full‑stack problem: encryption, app design, OS behavior, user habits, and legal environments all interact, and weaknesses in any layer can undermine the rest.
The concept of “deleted” messages is particularly misleading in this context. Deleting a message in a chat application usually means removing it from that app’s visible history. It does not guarantee that no copies or traces exist elsewhere on the device or in its associated services. Backups, notifications, screenshots, cloud syncs, and even predictive text keyboards may all hold pieces of what once appeared only in a secure conversation.
In practice, this means users who rely on secure messengers to protect sensitive information should think beyond the app icon. Are message previews disabled on both your device and the recipient’s? Do notifications remain visible on the lock screen? Are you syncing notifications or messages to cloud backups? Each of these factors can make the difference between a message disappearing from casual view and being recoverable in a forensic examination.
For those genuinely concerned about leaving digital traces, there are additional behavioral strategies to consider. Keeping devices fully encrypted and locked with strong passcodes, minimizing automatic backups, regularly clearing notification histories, and restricting lock‑screen content can all reduce what is available to anyone who gains access to the physical device. In some cases, the safest course is to avoid putting the most sensitive information in writing at all.
From a broader policy and design angle, the Signal notification case is pushing developers and regulators to confront an uncomfortable reality: the usability features people love (instant alerts, quick replies from the lock screen, seamless syncing between devices) are often in tension with strict privacy goals. Each convenience creates extra copies of data, increases the number of components that touch it, and broadens the attack surface.
Messaging platforms are starting to experiment with ways to reconcile these competing demands. Some are exploring encrypted notification payloads that reveal as little as possible to the operating system, or options to send “silent” notifications without text content. Others allow users to choose aggressive deletion policies for logs and backups, or to limit how much metadata is exposed when messages are routed across the network.
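An encrypted notification payload can be sketched in a few lines: the push service carries only ciphertext plus a generic banner string, and the plaintext appears only after the app decrypts on-device. The XOR "cipher" below is a deliberately toy stand-in for real authenticated encryption (such as AES-GCM), and the payload shape is invented for illustration, not any platform’s actual push format.

```python
# Sketch: the server sends only ciphertext; the OS notification layer
# never sees plaintext. The XOR "cipher" is a toy stand-in for real
# authenticated encryption -- do not use it for anything real.
import base64

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext))

toy_decrypt = toy_encrypt                 # XOR is its own inverse

shared_key = b"per-device-secret"         # assumed provisioned out of band

# What the push relay and the OS notification log would see:
push_payload = {
    "alert": "New message",               # generic, content-free banner text
    "ciphertext": base64.b64encode(
        toy_encrypt(shared_key, b"meet at 9pm")).decode(),
}

# What the app displays after decrypting locally:
plaintext = toy_decrypt(
    shared_key, base64.b64decode(push_payload["ciphertext"])).decode()

print(push_payload["alert"])   # only this ever reaches the lock screen
print(plaintext)               # visible only inside the unlocked app
```

The trade-off is visible even in the toy: the lock screen loses its useful preview, which is precisely the usability cost the article describes.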
Still, device manufacturers and OS vendors play a decisive role. If the underlying system logs too much information or provides broad access to notification histories, even the best‑designed secure messaging app can be undermined. This has led to calls for clearer, more granular controls over what notifications store, how long they persist, and which apps or investigators can access them.
Ultimately, Durov’s warning is less about any single application and more about expectations. Many users equate “end‑to‑end encrypted” with “invisible” or “unrecoverable,” and assume that hitting delete erases a message from existence. The reality is more nuanced: encryption can strongly protect message contents against interception and unauthorized server‑side access, but it does not automatically sanitize every trace left on the devices involved.
As interest in decentralized and privacy‑focused communication continues to grow, accelerated by censorship attempts, political instability, and high‑profile investigations, the technical and legal battles over metadata, notifications, and device logs will only intensify. Users, meanwhile, will need a clearer understanding of where their messages really live, and what “delete” truly means in an ecosystem where convenience and privacy are constantly at odds.
For now, the lesson from the Signal notification case is straightforward but unsettling: even if you trust the encryption of your chosen messenger, your conversations may still be exposed through the very tools that make smartphones so easy to use. Push notifications, system logs, and device‑level data remain weak points, and until they are redesigned with privacy at their core, “deleted” will not always mean gone.

