Public-key encryption, like Signal uses, offers good security for most purposes. For example, it's fantastic for credit card transactions. The problem with using it for transmitting state secrets is that you can't rely on it for long-term secrecy. Even if you avoid MITM or other attacks, a message sent via Signal today could be archived in ciphertext and attacked ten years from now with the hardware and algorithms of ten years in the future. Maybe Signal's encryption will remain strong in ten years. Maybe it will be trivial to crack. If the secrets contained in that message are still sensitive ten years from now, you have a problem.
Anything sent with Signal needs to be treated as published with an unknown delay. If you're sharing intelligence with the U.S., you probably shouldn't find that acceptable.
The reason the policies restrict access to government systems isn't that anyone thinks those systems are magically immune to security bugs; it's that there are entire teams of actually-qualified professionals monitoring them and proactively securing them. His phone is at risk from, say, a dodgy SMS/MMS message sent by anyone in the world who can get his number, potentially needing no more than a commercial spyware license. But his classified computer on a secure network can't even receive traffic from them, has a locked-down configuration, and is monitored, so a compromise would be detected a lot faster.
That larger context is what really matters. What they’re doing is like the owner of a bank giving his drunken golf buddy the job of running the business, and the first thing he does is start storing the ledger in his car because it’s more convenient. Even if he’s totally innocent and trying to do a good job, it’s just so much extra risk he’s not prepared to handle for no benefit to anyone else.
I assume he copy-pasted the message on his unsecured device.
How many apps had access to that text in his clipboard?
To me this isn't a technical problem with Signal, it's an opsec problem, and that's quite a lot harder to explain to people.
At least in the case of the leak the culprit was the UX, no?
Suppose a user wants the following reasonable features (as was the case here):
1. Messages to one's contacts and groups of contacts should be secure and private from outside eavesdroppers, always.
2. Particular groups should only ever contain a specific subset of contacts.
With Signal, the user can easily make the common mistake of attempting to add a contact who is already in the group. But in this case, Signal's UI autosuggested a new contact, displaying initials for that new contact that matched the initials of a current group member.
Now the user has unwittingly added another member to the group.
Note in the case of the leak that the contact was a bona fide contact-- it's just that the user didn't want that particular contact in that particular group. IIRC Signal has no way to know which contacts are allowed to join certain groups.
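One conceivable client-side guard against this failure mode is a per-group allowlist that the UI checks before completing an add. A minimal sketch in Python — the class, contact IDs, and API are all hypothetical, not anything Signal actually implements:

```python
# Hypothetical per-group allowlist: the group owner pre-declares which
# contacts may ever be members, and the client refuses anything else,
# regardless of autosuggestions or matching initials.
class Group:
    def __init__(self, name, allowed):
        self.name = name
        self.allowed = set(allowed)   # contact IDs permitted in this group
        self.members = set()

    def add_member(self, contact_id):
        if contact_id not in self.allowed:
            raise PermissionError(
                f"{contact_id} is not on the allowlist for group {self.name!r}")
        self.members.add(contact_id)

g = Group("principals", allowed={"waltz", "hegseth", "vance"})
g.add_member("hegseth")             # on the allowlist: accepted
try:
    g.add_member("jgoldberg")       # same initials as a member? Still refused.
except PermissionError as e:
    print(e)
```

The point is that the policy check lives in code rather than in the user's attention span, which is exactly where the leak happened.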
I don't know much about DoD security. But I'm going to guess no journalist has ever been invited to access a SCIF because they have the same initials as a Defense Dept. employee.
Too deep. The problem is the physical environment: the room in which the machine displays the information. Computer and technological security mean nothing if the information displayed on a screen is in a room where anyone with a camera can snap a pic at any time.
Also, to be clear, Signal doesn't use public-key cryptography in the naive way (i.e. to encrypt/decrypt messages) as was/is possible with RSA. It uses asymmetric key pairs to first do a Diffie-Hellman key exchange, i.e. generate ephemeral symmetric keys, which are then used for encryption/decryption. This then also guarantees forward secrecy, see https://signal.org/blog/asynchronous-security/ . (Add to that they incorporate an additional post-quantum cryptographic scheme these days, and I'm probably omitting a lot of other details.)
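The core idea — both sides derive the same symmetric key, which never crosses the wire — can be shown with a toy finite-field Diffie-Hellman exchange. This is a deliberately simplified sketch with demo parameters: real deployments (Signal included) use elliptic-curve DH (X25519) inside X3DH and the Double Ratchet, not this construction.

```python
import hashlib
import secrets

# Toy finite-field Diffie-Hellman. Demo parameters only -- NOT secure choices.
p = 2**127 - 1   # a Mersenne prime, used here just for illustration
g = 5

a = secrets.randbelow(p - 2) + 1   # Alice's ephemeral private key
b = secrets.randbelow(p - 2) + 1   # Bob's ephemeral private key

A = pow(g, a, p)   # Alice transmits this publicly
B = pow(g, b, p)   # Bob transmits this publicly

# Each side combines its private key with the other's public value.
k_alice = pow(B, a, p)   # (g^b)^a mod p
k_bob = pow(A, b, p)     # (g^a)^b mod p
assert k_alice == k_bob  # same shared secret, never transmitted

# Hash the shared secret down to a symmetric key for an AEAD cipher.
key = hashlib.sha256(k_alice.to_bytes(16, "big")).digest()
```

Because the private exponents are ephemeral and discarded after use, recording the public traffic doesn't let a future attacker recompute past session keys — that's the forward-secrecy property the linked Signal post describes.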
For their use case, which requires communication between two (or more) arbitrary users among millions who have never communicated before, running on cheap commodity hardware over wireless connectivity to the internet.
Leaving encryption aside, looking only at the network level, the DoD is capable of using a dedicated fiber line. Or rather a parallel fiber infrastructure.
About a month ago there was a discussion here saying Signal is preinstalled and widely used at the CIA.
https://news.ycombinator.com/item?id=43478091
It's also recommended by the government's cybersecurity agency CISA.
https://www.cisa.gov/sites/default/files/2024-12/guidance-mo...
Edit: I didn't state something perhaps I should have. Symmetric-key crypto is considered more secure because public-key crypto is more complicated, so there's more room for side-channel mistakes, and the computation needed to break public keys doesn't scale as fast with key size. I am not an expert, but that is what I've read.
Maybe it’s the servers that are the problem.
Oddly enough, they have thought of that already, to the point that all encryption systems in use in the gov are thought of in these terms.
All that matters are the different assumed times to publication (weeks to years), and then treating the strength of measures involved differently based on what is reasonable for the given use.
If you absolutely need something to never be published then encryption isn't the solution, and nor are computers generally.
You shouldn't share state secrets with the US. They will be on or transferred between misconfigured cloud accounts. Some agency will eventually get authorization for analysis of them with an intention of financial espionage. The probable or confirmed loss of them will serve as a plausible deniability for the US when it misuses them.
Obviously using signal here is a terrible opsec failure, I'm just not sure how what you are saying changes anything
"A one-time pad (OTP) is considered theoretically the most secure method of communication — when it's implemented correctly. That means:

1. The key (pad) is truly random.
2. The key is at least as long as the message.
3. The key is used only once.
4. The key is securely shared in advance and kept completely secret.

When all these conditions are met, a one-time pad provides perfect secrecy — an eavesdropper cannot learn anything about the message, even with infinite computing power."
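The mechanism itself is trivial to demonstrate: XOR the message with a truly random pad of equal length, and XOR again with the same pad to decrypt. A minimal sketch — the hard part in practice is condition 4, distributing the pad securely:

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    # Encryption and decryption are the same operation: XOR is its own inverse.
    # Conditions 1-3 from the quote: the pad must be truly random, at least
    # as long as the data, and never reused for another message.
    assert len(pad) >= len(data)
    return bytes(d ^ k for d, k in zip(data, pad))

msg = b"attack at dawn"
pad = secrets.token_bytes(len(msg))   # fresh random pad, used exactly once
ct = otp_xor(msg, pad)                # ciphertext: statistically uniform noise
pt = otp_xor(ct, pad)                 # decrypt with the same pad
assert pt == msg
```

Perfect secrecy holds because for any ciphertext, every plaintext of the same length is equally likely under some pad — which is also why reusing a pad (condition 3) destroys the guarantee instantly.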
Distribute a bunch of physical artifacts (smartcards) across the globe; guard a central facility (a symmetric key exchange center) extremely well etc.
The military can also afford to run its (encrypted or plaintext) communications over infrastructure it fully controls. The same isn't true for a service provided out of public clouds, on the public Internet.
That's not the threat model. The threat model is that Signal is a tiny LLC making an app on behalf of a foundation and open source software project. It's a small group of human beings.
Small groups of human beings can be coerced or exploited by state-level actors in lots of ways that can't feasibly be prevented. I mean, if someone walks up to you and offers $2M (or blackmails you with whatever they found in your OneDrive, etc...[1]) to look the other way while you hand them your laptop, are you really going to say no to that keylogger? Would everyone?
At scale, there are auditing techniques to address this. The admins at e.g. github are (hopefully) not as vulnerable. But Signal is too small.
[1] Edit: Or, because let's be honest that's the scale we're playing at: straight up threatens to Novichok you or your family.
You and I know that. So do the adversaries. The biggest issue for them is going to be not tripping over the intelligence collecting agencies (or corps) already on their devices.
Right, but this is nothing new: Hegseth is only a recent example of Trump's camp mishandling sensitive docs; I'll bet there's been an inner secret Four Eyes group since the Mar-a-Lago bathroom official-document-archive story dropped years ago.
What surprises me is that I expected Tulsi Gabbard to be the centre of mishandling allegations, not SecDef.
Tulsi is by all appearances more experienced in operating under the radar. That said, I’m sure she won’t disappoint.
It is clear there is a gap between how people imagine this works, or should work in theory, and how it actually works.
For lunch orders and office softball schedules. Not top secret information.
https://www.theguardian.com/us-news/2016/sep/02/hillary-clin...
https://www.theguardian.com/us-news/2016/jul/05/fbi-no-charg...
Also:
https://www.fbi.gov/news/press-releases/statement-by-fbi-dir...
"To be clear, this is not to suggest that in similar circumstances, a person who engaged in this activity would face no consequences. To the contrary, those individuals are often subject to security or administrative sanctions. But that is not what we are deciding now."
But it’s not hypocritical of our country to want to improve our government officials and not for them to stagnate or slip backwards.
The Legal Eagle channel did an analysis of the two situations, "Signal War Plans vs. Hillary's Emails":
* https://www.youtube.com/watch?v=cw1tNTIEs-o
The two situations are not actually (legally) equivalent. One huge difference is that Hegseth et al. are setting communications to auto-delete, which is against record-keeping statutes (there is no evidence Clinton purged e-mails).
Every single sender and recipient (excluding bcc) was aware or could have been aware that she was not using a .gov email address and is somewhat complicit or tacitly ok with her using that server.
Occasionally previously unclassified materials can later be deemed classified, or there can be a data spill where a sender transmits classified information and recipients need to participate in deletion, investigations, etc.
I agree that her using an external server was bad but it was also in plain sight the whole time.
Hypocrisy indeed.
Whataboutism is when you bring up something about person A, then the only argument against it is something relating to person B.
For example, when you point out the call the president made to the secretary of state in Georgia begging him to "find" 11,780 votes. Then, without a great excuse, the other person brings up Biden's mental decline.
Both true, both concerning, but the reply just being blatant and desperate misdirection.
OP's comment was pointing out the similarities between issue #1 and issue #2. There's no dismissal.
Maybe the DoD should work on developing some internal Android and Signal forks that focus on adding additional critical security controls without impacting usability. There's an obvious desire path here.
I know personally that given the choice I'd probably rather use Signal than whatever messaging system the DoD contractors managed to come up with. And private conversations between senior military officials over encrypted DoD communication channels probably aren't FOIAable anyway.
It's not just this. Security involves compromises and trade-offs. Humans will be stupid humans and re-use passwords, install better but insecure software, not ever update, etc. It's an old story.
In the year 2025, if communication with any other human on the globe isn't as simple as opening an app and typing, then people will find another way, because there are about a thousand better ways.
So I doubt they are trying to get away with anything. They're just preferring the trivial option over the option that probably involves a physical token or slow biometrics or 15-second logout or whatever arduous security features the government comms probably have. Just like any human would.
Perhaps this will force the government COMSEC people to re-evaluate their practices.
Updated to add: I'm not defending their practices, just giving a likely explanation. Blaming the users is not always the best way to evaluate a security failure.
I think big companies' influence on purchasing decisions (aka corruption) drives a lot of this.
(The recent cringe inducing Deniro series comes to mind)
I suspect this is somewhat common in history (this is not meant to excuse it), but we can’t tell because those people still wrote the narrative.
Should be a disqualifier for US security clearance.
Is easily manipulable - either give him booze to pry secrets loose, or encourage "booze mind" in 1-on-1 conversations to do the same. Plus the huge slip-up of using Signal #signalgate.
One skirts the official tools like this to prevent accountability from a written record. Completely sensible if you're planning to be judged for your actions.
For a High-Tech President, a Hard-Fought E-Victory
For more than two months, Mr. Obama has been waging a vigorous battle with his handlers to keep his BlackBerry, which like millions of other Americans he has relied upon for years to stay connected with friends and advisers. (And, of course, to get Chicago White Sox scores.)
He won the fight, aides disclosed Thursday, but the privilege of becoming the nation’s first e-mailing president comes with a specific set of rules.
“The president has a BlackBerry through a compromise that allows him to stay in touch with senior staff and a small group of personal friends,” said Robert Gibbs, his spokesman, “in a way that use will be limited and that the security is enhanced to ensure his ability to communicate.”
[...]
The presidency, for all the power afforded by the office, has been deprived of the tools of modern communication. George W. Bush famously sent a farewell e-mail to his friends when he took office eight years ago.
While lawyers and the Secret Service balked at Mr. Obama’s initial requests to allow him to keep his BlackBerry, they acquiesced as long as the president - and those corresponding with him - agreed to strict rules. And he had to agree to use a specially made device, which must be approved by national security officials.
"Some of the classified emails found on former secretary of state Hillary Clinton’s home server were even more sensitive than top secret, according to an inspector general for the intelligence community."
even Bush fooled everyone into thinking he was literate (save for the two times he held books upside down) while in office.
https://news.sky.com/story/trumps-fixer-was-made-to-wait-eig...
His personal PC? Send Big Ballz his way to do some upgrades
https://www.npr.org/2025/04/15/nx-s1-5355896/doge-nlrb-elon-...
maybe a free Starlink dish
https://www.nytimes.com/2025/03/17/us/politics/elon-musk-sta...
I'm guessing there are a few scenarios where they could be tortured / blackmailed into compliance, even if it meant that the DoD would know about it in a day or two, and it would still be worth it.
E.g., shortly before a real fight over Taiwan began.
I really, really hope Hegseth gets his OPSEC act together, yesterday.
Signal’s protocol secures the message in transit. But their desktop app may or may not have client-side vulnerabilities. And if he clicks a link, you’re out of Signal and into the browser. If the link downloads a file, you’re into the OS.
Title:”0-click deanonymization attack targeting Signal, Discord, other platforms”
Maybe not 0-click anymore, but it still applies if the user is browsing the internet.
Yes, I should have thought of that old and obvious one. It opens up a universe of possibilities.
Get me inside the minds of these freaks. The contrast between
a) bureaucrats' real comms setups (3 telephones, four monitors all sitting on the desk rather than mounted on arms or the wall, full of clutter, on an anachronism of a wood desk)
and b) what you'd see in any "spy" movie (dark-mode graphics displaying fancy l33t charts on quad-monitor setups mounted on arms, probably in a low-light setting, where the bureaucrat doesn't look at the "small" monitors himself, his cronies do that; the only monitor he looks at is the single 136" on the wall used for teleconferencing with villains)
is hilarious
1) He is trying to keep some sort of corrupt signals-intelligence folks from knowing what he's working on.
2) He is trying to avoid the government catching him in some corruption by circumventing the official records act.
Anything else?
Or the same reason I have Whatsapp - communication in my social groups happens there, and if I don't have it I get left out.
Your explanations assume there is some deeper meaning, looking at the tradeoffs for each communication platform, and then coming to some rational conclusion. I don't think there's much evidence for that.
The people around trump just happen to be used to using signal to communicate, and if Pete doesn't get on board he gets left out.
We have to assume malicious intent. These people could start a nuclear war. They get zero flexibility or grace.
Say what you want about the usability of DoD home grown solutions, but it was a military system backed up by military budgets and guns - civilians are less likely to be collateral damage in an attack against these systems.
Now, all the civilians using Signal are potential splash damage casualties in a military conflict.
I also suspect Signal does not have the budget, staffing, or desire to serve as a front line soldier in a cyber war; but this exposes them to military-grade risks, whether they like it or not.
Unless you can predict the future, I'm not sure how you would generate a key that would be unknowable now but generally available in the future.
I was thinking of encrypting a secret in the structure of a Rust program so it can only be decrypted by compiling and running it.
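A better-known construction for "decryptable only after a delay" is the Rivest-Shamir-Wagner time-lock puzzle: the setter, who knows the factorization of n, computes the unlocking value cheaply, while everyone else must perform an inherently sequential chain of squarings. This is a different technique from the compile-and-run idea above, and the sketch below uses deliberately tiny toy parameters (real puzzles use ~1024-bit primes and enormous t):

```python
import hashlib

# Rivest-Shamir-Wagner time-lock puzzle (toy parameters, NOT secure).
p, q = 101, 103            # toy primes known only to the puzzle setter
n = p * q
phi = (p - 1) * (q - 1)
t = 10_000                 # number of sequential squarings required to unlock

# Fast path (setter only): reduce the exponent mod phi(n) via Euler's theorem,
# so 2^(2^t) mod n costs two modular exponentiations instead of t squarings.
e = pow(2, t, phi)
unlock_setter = pow(2, e, n)

# Slow path (everyone else): t squarings, which cannot be parallelized
# because each step depends on the previous result.
x = 2
for _ in range(t):
    x = (x * x) % n
assert x == unlock_setter

# Hash the unlocking value down to a symmetric key for the actual secret.
secret_key = hashlib.sha256(str(unlock_setter).encode()).digest()
```

The delay is bounded by raw sequential squaring speed rather than by compiler behavior, which makes the "available in the future" window at least roughly predictable.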
1) DoD and other departments have either tacitly or explicitly approved the use of Signal for internal matters for several years now, with proper opsec.
2) You cannot govern exclusively from a SCIF, hence 1.
If you have the resources available to the SecDef, you frankly should be able to. Mobile SCIFs are something private companies can provide off the shelf for a few hundred thousand dollars. That's a drop in the bucket.
Obviously, nobody can or should spend all their time in one unless they're some kind of watch officer, but when handling TS/SCI material, there really is no reason for a principal not to have access to a SCIF at a moment's notice if they make it a priority. And there's no reason to be sharing TS/SCI with anyone who is not themselves in a SCIF. We have a declassification/reclassification process if information needs to be more widely disseminated.
(1) doesn't have to be Signal. It should be some "enterprise" solution that DoD can own and operate, and it should federate with the same thing used in other executive agencies, and the WH itself. And it should have military grade authorization (meaning labeled, multi-level security).
That said, (2) is quite right: you cannot govern from a SCIF. SCIFs are mainly tools to control access to long-ago classified information. New classified information cannot be born in a SCIF, for the simple reason that SCIFs cannot scale to the needs of those who govern.
I guess the Treasury Department could stop transferring funds into DoD accounts, but that seems unlikely.
Perhaps he could be prosecuted for violating various laws, but that would require action by the DoJ, which also seems unlikely.
Congress could also hold Trump responsible for Hegseth's actions, but that also strikes me as unlikely.
The past 9 years have been a really good education in why the Separation of Powers is important, and what's at risk when it doesn't function properly.
I’m guessing that’s the product in question: https://www.vertiv.com/490454/globalassets/products/monitori...
During the UK Covid-19 enquiry into gov decision making at that time it came to light that most of the UK cabinet were co-ordinating via Whatsapp groups. Again, I'm not a fan of Boris and Dom Cummings but this makes some sort of sense to me. I recognise the need for government teams to have quick convenient chat available to them. Things move too fast these days to wait for the next cabinet meeting or to arrange things via a series of phone calls.
Similarly we can look back to Obama having to fight to keep his Blackberry in 2009 https://www.nbcnews.com/id/wbna28780205
That 20-year-old tech is simply more secure... specifically because it is less convenient. By doing things the way they do them, they can enforce access at the desired levels of security by controlling physical access to the equipment. With something like Signal, that access is entirely the responsibility of the user. The user will inevitably mess that up, particularly when things get exciting. ... and Signal is not even really all that good at preventing the user from messing the identity thing up.
* https://articles.59.ca/doku.php?id=em:sg (my article)
And unarchived. It's very convenient to not have to do things in meetings with minutes where people might later question your decisions. Or report them to the police.
Also, I complain a lot about Teams, but my understanding is modern DoD basically runs on Microsoft, AWS, (also Google?) just the same as private companies. Probably not Zoom, which is unfortunate from a usability perspective but also wise I think.
Can you name a popular civilian tech that blocks adding random journalists to small chat groups? That includes strong identity guarantees? That meets compliance requirements around logging calls?
Bloomberg might come the closest on this. Why don't you go out and price a Bloomberg terminal for yourself, at the grade that lets you trade options with other Bloomberg terminal owners over the chat interface?
Do you know at all or are you just relying completely on your imagination to justify the Trump admin's actions?
... but unlike Signal, SDC respects laws requiring accurate record-keeping. And that's why this bunch of lawbreakers want to use Signal. They want to evade any and all accountability once this administration is over.
1. The Defense Department bans the use of Signal for everybody else. Why is that? Why is the Secretary exempt?
2. As we've seen it's pretty easy to add unauthorized people to what should be secure communication channels where classified information is shared; and
3. There are laws around the preservation of governmental records. Expiring Signal messages seems intentionally meant to circumvent these legal requirements, i.e., it's illegal.
We're only 100 days in. We've got 1200 more days of this.
NB: I’m not arguing that this change in policy was done after a careful Chesterton’s Fence analysis and weighing of all relevant factors, but it would seem stranger if a new leader couldn’t change any policies than if they can.
Same place everyone else is now. Nobody cares about the flagrant violations by the executive. This is the foxes walking around freely now.
edit: To the lazy down voters. Address the 'my side never does anything wrong' issue and I might concede.
This has never been the case; JFK appointed his little brother AG. The problem is that the Congress should be investigating and prosecuting the president but will not.
Everyone in this administration has to know they’re spending the decade after Trump in front of the Congress and various investigators.
The extreme bipartisan view is that government business done by public officials should be hidden from the public record at their whim, even with the explicit goal of avoiding FOIA. Democrats believe that this is not only justified but virtuous, because Hillary Clinton lost an election.
If someone gave me a whole set of locked-down _Windows_ computers and a bunch of archaic phone lines and told me to use them in 2025, I'd also try to circumvent such inconvenience.
I personally started using signal some time around 2018 and I'm sure there were millions of users by the time Biden began his term.
Not to pick on this in particular – nearly all the reporting on this starts and ends with "Signal is insecure" as if that was all it took to be wrong. And in other eras, that was enough.
The man likes Signal. For better or worse, he is the Secretary of Defense...The man we've entrusted to help coordinate our national defense.
There's so many questions I genuinely don't have an answer for...
Has Congress made it illegal to use an off-brand messaging app for secure communications? _Why_ is it insecure? What is the probability that China is reading these messages in real-time? 100%? 25%? 0.2%?
We need to start from the presumption that the people-in-power don't care that it's always been done this way...in fact, they have a ton of pressure to be different. But, in some cases, these people may be willing to listen to reasonable arguments which clearly establish _why_ using Signal is unreasonably worse than using US Government Issue messaging.
Real-time might be nice but there's value in reading material at this level with almost any delay.
In 1949 a US counter-intelligence program(me), the Venona project[1] decrypted Soviet cables from 1945 which made it almost certain the First Secretary to the British Embassy in Washington DC [2] was a Soviet asset. That wouldn't have happened if the Soviets hadn't misused their channels of communication.
[1] https://www.osti.gov/opennet/manhattan-project-history/Event... [2] https://en.wikipedia.org/wiki/Kim_Philby
> Has Congress made it illegal to use an off-brand messaging app for secure communications? _Why_ is it insecure? What is the probability that China is reading these messages in real-time? 100%? 25%? 0.2%?
Is your point that, in the space of your own lack of knowledge, a reasonable rationale may exist? Could you share what gives you trust in this administration to be so generous?
It isn’t enough to say “don’t use Signal”, at some point they need to address the reality that there are no functional alternatives.
The thing I am more bothered by is why would he take a picture of his desk, thereby narrowing the attack profile.
Yes. The law requires that classified information be handled under certain standards.
> _Why_ is it insecure?
Classified data is being transmitted on an unsecured device. If Hegseth's personal phone has Uber, Tinder, ... whatever apps installed, that software is running on a device that contains national secrets.
Systems which handle classified data are meant to be airgapped from the normal internet/normal software.
The issue is not that signal is insecure, but rather that sensitive government information demands additional precaution (e.g. airgapping).
There's a separate issue that there are legal requirements for maintaining records of government communication. Using a personal device (especially with disappearing messages) is illegal since it doesn't maintain this documentation.
Additionally, classified information is tracked to see who read it and when. In the event of a security leak, this can help isolate where the leak happened. If the information gets posted on Signal, then there's nothing more that can be tracked.
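The tracking idea can be sketched as an append-only, hash-chained read-audit log, where each access record is chained to the previous one so after-the-fact tampering is detectable. This is a hypothetical toy design (the class, field layout, and user/document IDs are all invented for illustration), not a description of any real classified system:

```python
import datetime
import hashlib

# Toy append-only audit trail: each entry's digest covers the previous
# digest plus the new record, so deleting or editing any entry breaks
# verification of everything after it.
class AuditLog:
    def __init__(self):
        self.entries = []          # list of (record_bytes, chained_digest)
        self.head = b"\x00" * 32   # genesis digest

    def record(self, user, doc_id):
        ts = datetime.datetime.now(datetime.timezone.utc).isoformat()
        entry = f"{ts}|{user}|{doc_id}".encode()
        self.head = hashlib.sha256(self.head + entry).digest()
        self.entries.append((entry, self.head))

    def verify(self):
        # Recompute the chain from the genesis digest and compare.
        h = b"\x00" * 32
        for entry, digest in self.entries:
            h = hashlib.sha256(h + entry).digest()
            if h != digest:
                return False
        return True

log = AuditLog()
log.record("analyst1", "doc-001")   # hypothetical accesses
log.record("secdef", "doc-001")
assert log.verify()
```

Once the content is pasted into Signal, nothing like this follows it: the chain of custody simply ends.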
> For better or worse, he is the Secretary of Defense...The man we've entrusted to help coordinate our national defense.
That's not the way rule of law works. The Secretary of Defense doesn't get to _decide_ we're doing things differently now. His actions, as well as the actions of his staff, are bound by the laws that congress has passed.
> We need to start from the presumption that the people-in-power don't care that it's always been done this way...in fact, they have a ton of pressure to be different. But, in some cases, these people may be willing to listen to reasonable arguments which clearly establish _why_ using Signal is unreasonably worse than using US Government Issue messaging.
The onus should not be on the general public to convince the Secretary of Defense to adhere to bog standard requirements for handling sensitive information. If he has an idea, "I think using Signal on my personal phone to discuss imminent military actions is better than using a secure line," he could push that idea forward. Have the Pentagon's security staff evaluate the idea. Instead, he simply did it.
Check out what happened to the Signal FOSS fork.
Then check out what Molly is doing, and why.
Personally I'd favor Briar over Signal any day.
I mean, thinking the DoD is actually defending the U.S. is where you went wrong. The stakes are so incredibly low that none of this actually matters.