> "For example, by including a specially formatted but otherwise innocuous file in an app on a device that is then scanned by Cellebrite, it’s possible to execute code that modifies not just the Cellebrite report being created in that scan, but also all previous and future generated Cellebrite reports from all previously scanned devices and all future scanned devices in any arbitrary way (inserting or removing text, email, photos, contacts, files, or any other data), with no detectable timestamp changes or checksum failures. This could even be done at random, and would seriously call the data integrity of Cellebrite’s reports into question."
They may have just gotten a lot of evidence collected using Cellebrite from phones with (or without) Signal installed on them thrown out of court.
I don't recall the details, but there was an absolutely unsubstantiated, speculative, and surely fictional rumor of at least one entirely theoretical zero-day non-GIF-formatted image file that exploited a similar class of vulnerability in what was probably not a market-leading tool used tangentially for the same purposes, floating around well over a decade ago as well.
I for one am very glad that these hypothetical issues have almost surely been fixed.
It attacks Cellebrite's ability to operate by casting doubt on the reports generated by the product that their customers may wish to use in court.
It places them in legal peril from Apple and removes any cover Apple would have for not taking legal action. (I assume someone at Apple knew Cellebrite was shipping Apple's DLLs?)
It makes a thinly-veiled threat that any random Signal user's data may actively attempt to exploit their software in the future and demonstrates that it's trivial to do so.
edited to add a bonus one:
Publish some data about what they are doing to help create a roadmap for any other app that doesn't want their data to be scanned.
Fortunately, parallel construction means you never really have to throw out bad evidence as long as you can find some good evidence too!
It reminds me of the story of https://en.wikipedia.org/wiki/Annie_Dookhan
The defense lawyers have to read it, and the people in law enforcement need to read the cases where judges throw out Cellebrite evidence based on that.
CoreFoundation is open source: https://github.com/opensource-apple/CF
libdispatch is open source: https://apple.github.io/swift-corelibs-libdispatch/post/libd...
ASL is open source: https://opensource.apple.com/source/syslog/syslog-349.1.1/li...
The objective C runtime is open source: https://github.com/opensource-apple/objc4/blob/master/APPLE_...
icu, pthread, zlib, and libxml did not even originate at Apple.
That does not make me feel good about Signal.
In contrast, it would strike me as strange if a Signal user switched to another messenger that allowed the data extraction because they were uncomfortable with Signal blocking it.
Last year I managed to gain partial access to one of their systems, and it took me weeks of emailing their internal email addresses to finally get the bug fixed. They were a total ass about it.
Now I've got complete access to their entire database and I don't know what to do. Can HN advise?
So if the database is fingerprintable to the GP specifically in any way, they're very very dead. And the random username doesn't even count here; they probably didn't post from Tor, so their real IP is connected to this post.
I wish I could see those files in action...
Pretty sure it's the former, since the above is a way to ensure that Cellebrite can't just gather all the implied exploit files and make sure those specific problems are all patched. This is, quite literally, an informational attempt at guerrilla/asymmetric warfare, where Signal is trying to make engaging with them too costly, while also landing a few blows well above its weight class. Cellebrite now has to decide whether to keep after an adversary that is hard to pin down, ambushes them, and has shown it can hit them hard where it matters (credibility, and thus their pocketbook).
They are basically putting the threat out that if you use Cellebrite on Signal in the future, you might not get the data you expect, and at worst, it may corrupt the report/evidence.
This also brings into question the chain of custody, as an untrusted device being imaged can alter reports of unrelated devices.
It's as though Theo decided that OpenSSH should respond to portscanners by trying to pwn the source systems.
"We have strict licensing policies that govern how customers are permitted to use our technology and do not sell to countries under sanction by the US, Israel or the broader international community."
And these policies are obviously quite effective at preventing such uses.
[1] https://www.theregister.com/2021/04/21/signal_cellebrite/
The official Cellebrite policy has always been "don't worry, if you get stuck, we can send you an expert to testify to the reliability of the scientific evidence due to previous cases," but what happens when the pyramid of previous cases falls apart? Do you suddenly own a paperweight?
I've also published papers (with NIST's help) on using consumer grade hardware for forensics and why testing your tools across a wide variety of scenarios is critical.
This purported vulnerability does not rely on FFmpeg, hence the disclaimer.
My fear, and prediction, is that the authorities will frame this as an even more egregious attack on law enforcement, and that interfering with investigations is a crime (I'm not a lawyer, but I play one in Hacker News comments, and that sounds like a crime). They'll lean on the app stores, and the app stores will lean on or remove Signal.
2. Signal stirred FUD in a blog post. That's a very different thing from actually doing it.
I am SO IMPRESSED with this middle finger from the Signal team.
On the one hand:
> One way to think about Cellebrite’s products is that if someone is physically holding your unlocked device in their hands, they could open whatever apps they would like and take screenshots of everything in them to save and go over later. Cellebrite essentially automates that process for someone holding your device in their hands.
But on the other hand:
> We are of course willing to responsibly disclose the specific vulnerabilities we know about to Cellebrite if they do the same for all the vulnerabilities they use in their physical extraction and other services to their respective vendors, now and in the future.
If UFED just copies data from unlocked phones, why would they be using vulnerabilities to do so?
I guess my question is, is Cellebrite capable of copying locked devices, or more to the point - has vulnerabilities to unlock devices without knowing the access PIN?
Yes, they even brag about it in their marketing materials: https://www.cellebrite.com/en/a-practical-guide-to-checkm8/
That's a public vulnerability; it's anyone's guess how many nonpublic ones they're using.
"Lawfully access locked devices with ease. Bypass pattern, password or PIN locks and overcome encryption challenges quickly on popular Android and iOS devices."
I think what he meant to say was that if Cellebrite is used on your locked phone, it could be the equivalent of a person having your unlocked phone in their hands, where they can do whatever they want. Only Cellebrite doesn't look at random things; it grabs everything.
Cellebrite devices are also frequently used by phone carriers to image your old phone and transfer the data to your newly purchased phone, to give you a clearer idea of what they can do.
"Required" to comment? By whom and for what reason?
Obviously there may be some backchannel, but that is probably how it would go if you assume Apple and Cellebrite have no relationship.
By stockowners who might not like their valuable IP used in this way or by this company without permission?
This, if Apple does pursue it, is a copyright matter, not trademark.
The "Physical Analyzer" is just a forensics tool. There are dozens of competitors out there that will take a phone and surface the things that might be interesting in a court case or law enforcement investigation.
The product Signal didn't talk about - which I think is the one they are upset about - is Cellebrite Premium. That is their service where law enforcement can send locked or damaged devices to their lab and get back an image to load into PE. However, in 99% of cases, devices are accessed either because they are running old software with public vulnerabilities, or via the magic phrase "would you mind unlocking your phone so we can clear this matter up?"
Nailed it!
This will just prompt Cellebrite to improve its security process and sandbox the entire tool.
If they wanted to destroy the credibility of the tool, using the vulnerabilities to silently tamper with the collected data or even leaking it online would be a much better option and hit them without any warning, not only jeopardizing those cases but forever casting doubt on not just Cellebrite but their competitor tools.
Sandboxing doesn't really help. The problem isn't that the tool is used to infect the rest of the system, but that the tool itself is compromised, the reports it generates are compromised, and past reports may be compromised. Unless you're pushing that data outside the sandbox (which is a hole in the sandbox, and while much more limited might also be an exploit vector or a way to cause problems in the data), it's still fair game if the sandboxed tool is compromised.
There's multiple reasons to disclose it. First, because as another comment noted it attacks the credibility of the company, and credibility is very important for tools used in court.
Second, because their main goal is to protect Signal, not attack Cellebrite. Making Signal a problem to gather data from may lead Cellebrite to simply blacklist Signal as an app they collect. This could be temporary, but since Signal alluded to many exploits, with a bunch queued up for the future, attempting to gather info from Signal will always be a risk for Cellebrite, so they might just continue to skip it.
The process of e-discovery is rife with risks of this sort. When you forensically collect data from a random set of devices from a party, the contents may include porn, HIPAA-protected records, GDPR-covered data, sample viruses, malware, who knows what all.
The short version: even if ingesting this data crashes the device, there are mitigations and protections that will allow the evidence to ultimately be produced.
A crash of the Windows host during collection will not invalidate the case.
disclosure: ex-CSO of Relativity, leading provider of e-discovery software.
edit:
One thing they could try to do is to sandbox the parser itself to lower attack surface area... but the damage is done here and I really doubt they will win a security tit-for-tat with Signal.
IANAL but I could imagine Cellebrite has existing or pending litigation where this disclosure upsets their position.
I assume Apple would rather file for copyright infringement than risk being accused of collusion and losing the copyright on iTunes, or parts of it.
> When Cellebrite announced that they added Signal support to their software, all it really meant was that they had added support to Physical Analyzer for the file formats used by Signal.
Your point about potential judicial impact is valid, but it would require Signal to monitor cases involving Cellebrite and step forward, unprompted, to help the defense. Furthermore, Cellebrite's clients seem to include entities that do not care much about a fair trial.
Silently tampering with the data might cross a legal line. Doing this might put at risk current or past cases where there is a legitimate reason to use this sort of tool.
Privacy can be hard. While I 100% defend everybody's right to privacy, I can also see the need for the capability to break it. Maybe the answer is very tight regulation around the use of this kind of hardware/software, but that regulation would have to keep up with the pace of technology.
I've always wondered if there could be a cryptographic solution to this. Issuing decryption keys to governments seems ripe for abuse, but some sort of multi-party scheme where governmental and non-governmental entities have to cooperate (with actual multiparty key material) to perform a decryption authorized by a warrant, with those non-governmental entities acting as a check on the usage and frequency of warrants and on the eventual publication of their usage, could be an interesting approach.
Personally, I think strong encryption should be a requirement for digital evidence, but even that can be forged.
Strange times we live in.
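The multi-party idea above can be sketched with threshold secret sharing: split a decryption key so that no single party (governmental or otherwise) can use it alone. Below is a minimal toy sketch using Shamir secret sharing over a prime field; all names and parameters are illustrative, and a real system would use vetted libraries and audited key ceremonies, not this.

```python
import random

# Toy Shamir secret sharing: a decryption key is split so that any
# `threshold` of the parties (courts, oversight bodies, NGOs) must
# cooperate to reconstruct it. Illustrative only.
PRIME = 2**127 - 1  # a Mersenne prime, large enough for a toy key

def split_secret(secret, n_parties, threshold):
    """Split `secret` into n_parties shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_parties + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x=0 reconstructs the polynomial's constant term."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

key = 0xDEADBEEF
shares = split_secret(key, n_parties=5, threshold=3)
# A warrant alone is not enough: fewer than three parties learn nothing,
# but any three can jointly reconstruct the key.
assert recover_secret(shares[:3]) == key
assert recover_secret(shares[2:5]) == key
```

The check-and-balance property falls out of the math: any two shares are statistically independent of the key, so publication and oversight can be enforced as a precondition for the third party to contribute its share.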
And it doesn't destroy the credibility of the tool to silently mess with its data. People have to know it's happening.
Couldn't Apple now sue Cellebrite?
I'm trying to understand the risk profile here.
I guess I see the value for, e.g., a border crossing, where they can inconvenience you and ask you to unlock your phone, but instead of flicking through your messages briefly, they authorize a pairing and quickly backup your entire disk content. You expected a quick perusal by a human, but unknowingly gave them a lot more. If you've blocked pairing, they can't get nearly as much data as quickly.
But if you're being investigated for committing a crime, everything we think we know about device unlocking is still true, right? They'd need me to unlock it before it'd trust a new device to pair to, and they'd need a court order to get me to unlock it for them. Five quick taps of the power button and biometric unlocks are off--now they need my passcode.
Perhaps there's still value, even in that case, in that if I were compelled via court order to give my passcode, they still can't quickly / easily dump the disk contents from a device pairing. Although I imagine if you have the passcode there's probably many other ways of accomplishing the same result.
Well, mostly yes, assuming Cellebrite doesn't have 0-days or other exploits that can be delivered to the device via SMS or similar. Using Cellebrite's software you can also send silent SMS, so it's not far off either.
A German Cellebrite ambassador showed me and my colleagues the tools mentioned in the blog post and told us he participates in law-enforcement raids. At 6 in the morning they raid the suspects' houses, detain them, and immediately ask for PINs and passwords. He said it works surprisingly often, and no further decryption attempts have to be performed.
In the US that won't work if unlocking the device requires a password or pin. In practice, you can't be compelled to provide that unless you openly admit that you know it. (Even then, the 5th amendment might afford you some protection.) YMMV, IANAL, etc.
1) "..saw a small package fall off a truck ahead of me..."
2) The very last paragraph is just great!
Aren't Cellebrite products/services more advanced than that? I mean don't they use publicly unknown zerodays to extract data from locked phones?
"In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage.
These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software.
Files will only be returned for accounts that have been active installs for some time already, and only probabilistically in low percentages based on phone number sharding.
We have a few different versions of files that we think are aesthetically pleasing, and will iterate through those slowly over time. There is no other significance to these files."
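The "phone number sharding" in the quote above is a standard staged-rollout technique: hash a stable identifier into buckets and enable the feature for a low, deterministic percentage of them. A minimal sketch follows; the function name, salt, and bucket count are all illustrative, not anything Signal has published.

```python
import hashlib

def in_rollout(phone_number: str, percent: float, salt: str = "aesthetic-files") -> bool:
    """Deterministically decide whether an account falls in a low-percentage
    shard, based on a hash of its phone number. Same input -> same answer,
    so an account's eligibility is stable across checks."""
    digest = hashlib.sha256((salt + phone_number).encode()).digest()
    # Map the first 8 bytes of the hash to a bucket in [0, 10000).
    bucket = int.from_bytes(digest[:8], "big") % 10_000
    return bucket < percent * 100  # e.g. percent=5 covers buckets 0..499

# Roughly `percent` of a large population lands in the shard.
numbers = [f"+1555{i:07d}" for i in range(10_000)]
hits = sum(in_rollout(n, percent=5) for n in numbers)
```

Determinism matters here: Cellebrite can't tell from any single device whether it drew a "special" file, and re-scanning the same account yields the same answer, so the threat is persistent rather than a one-shot coin flip.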
That does not mean adding stuff like untraceable cryptocurrency payments or very publicly tweaking the noses of law enforcement, and bragging about how you're putting exploits in your app to hack them.
This isn't 1993 and the last thing we need is more pretexts to ban E2E encrypted apps in the countries where they're needed the most. I think this trades a moment's satisfaction for a very bad long-term outcome.
That being said, I agree with you 100% on the cryptocurrency payments issue and think that was a misstep on their part.
The reason I want e2e encryption is because I want control of my devices, control of my information, control of what's going on. It's not Moxie's phone to drop random files on to, regardless of purpose. It's my phone, and I consider programs that are doing things that I'm not aware of malware.
(Admittedly, Android is rife with stuff I don't want going on, so it's not really my phone, it's Google's and Motorola's and a bunch of other entities who have their tentacles in it, but still...)
Maybe the last paragraph is a joke, and they have no intention of randomly placing files on unwitting client machines. It's open source, so I could compile the client myself and make sure it's not doing anything funny. What a pain though. At that point, so much trust is lost in the organization and codebase that really I need to find some other messaging protocol / app / network.
Mwahahahaha. Tell us how you hack our app and everyone else's, and we'll tell you how we hacked yours.
The middle finger is strong with this one.
Curious if there'll be a response of sorts.
That's just hilarious! A nice way of saying we got our hands on one of these boxes, but we don't want to reveal how. It fell off a truck.
https://www.uspis.gov/wp-content/uploads/2020/02/FY-2019-ann... (p. 35)
See: https://news.ycombinator.com/item?id=26892180 https://news.yahoo.com/the-postal-service-is-running-a-runni...
Don't get me wrong, the implication is enough to discredit Cellebrite, but my initial thoughts are that either this bluff gets called, or there's a non-zero risk of someone landing in even hotter water down the line for using Signal. Of course, this assumes that you're not already neck-deep for having encrypted data and upholding your right to privacy.
Correct me if I am wrong, but did they really say they were going to be running active attacks against Cellebrite units? Also funny... but they probably are not actually going to be doing that.
If Cellebrite doesn't want to punch a spiky rock, they could have just not done that in the first place.
Yeah, but they probably figured they're not being attacked. But now? Now they'll have to figure they are.
- Cellebrite helps oppressive regimes read your messages
- Signal keeps your messages private
- Cellebrite announces "Signal support"
- Signal finds 9 years of vulnerabilities in Cellebrite
- Signal permanently pwns Cellebrite
You come at the king, you'd best not miss.
Additionally, I see from the video that the purported vulnerability is present in UFED version 7.40.0.229. There is nothing stopping Cellebrite from patching this purported vulnerability, and shipping trustworthy versions of UFED going forward.
If there is a concern that the purported vulnerability still exists, the burden of proof will be with the person claiming the vulnerability exists, for each new version of UFED. Cellebrite doesn’t even need to implement actual code, but merely increment the UFED version number. It will be an endless cat and mouse game driven by baseless claims from both sides.
Since this vulnerability has not been reproduced by third parties, it could be equally likely that Signal is running a psyop rather than exploiting a genuine vulnerability. In either scenario, it casts doubt on Cellebrite; the damage is done by convincing you, the reader.
Also, not disclosing specifics is reasonable here, given that the vendor is themselves known for using, hoarding, and selling access to 0days.
There is no obligation for a researcher to share their research with such a corrupt vendor.
As it stands, the vulnerability is not reproducible by anyone other than Signal. Reproducibility is key in the scientific method and in the court of law.
Couldn't Apple simply revoke the signature?
>Looking at both UFED and Physical Analyzer, though, we were surprised to find that very little care seems to have been given to Cellebrite’s own software security.
People keep saying this. It has never changed since the 90s. There is no bar to become a "software engineer".
If Moxie can get his hands on these devices and hack them, why can't Apple or Google, with all their resources, seem to be capable of REing them to fix the mobile device bugs they currently exploit?
Tinfoil hat perspective suggests they don't want to.
[1] https://www.phrases.org.uk/meanings/fell-off-the-back-of-a-t...
"In completely unrelated news, upcoming versions of Signal will be periodically fetching files to place in app storage. These files are never used for anything inside Signal and never interact with Signal software or data, but they look nice, and aesthetics are important in software.[...]"
Does anyone who finds a package that dropped off a truck first take a picture of it, then pick it up and go home to open it?
Even if someone picks up the package, taking a picture of it usually doesn't come to mind. It's an unusual bit in the story. Unless they went back and put the bag on the road to show where it was found, just for the sake of recreating the story.
How does something like a small briefcase just "fall from a truck"? By what mechanism? A briefcase would be stored inside the cabin.
If you're the author, can you explain my suspicion?