283
submitted 2 months ago by [email protected] to c/[email protected]
[-] [email protected] 95 points 2 months ago

E2EE is not supposed to protect you if your device gets compromised.

[-] [email protected] 40 points 2 months ago

One could argue that Windows is compromised right out of the box.

[-] [email protected] 15 points 2 months ago

Intrinsically/semantically, no, but the expectation is that the texts are encrypted at rest and the keys are password- and/or TPM+biometric-protected. That's just how this works at this point. It's also the government standard for literally everything from handheld devices to satellites (yes, actually).

At this point one of the most likely threat vectors is someone just taking your shit: border crossings, rubber-stamped search warrants, cops raiding your house because your roommate pissed them off, protests, needing to go home from work near a protest, on and on.

[-] [email protected] 4 points 2 months ago* (last edited 2 months ago)

If your device is turned on and you are logged in, your data is no longer at rest.

Signal data will be encrypted if your disk is also encrypted.

If your device's storage is not encrypted, and you don't have any type of verified boot process, then that's on you, not Signal.

[-] [email protected] 4 points 2 months ago* (last edited 2 months ago)

That's not how this works.

If the data Signal stores is encrypted but the keys are not protected, then that is a security risk, and one that can be mitigated using common tools that every operating system provides.

You're defending Signal from a point of ignorance. This is a textbook risk just waiting for a series of latent failures to allow leaks of, or access to, your "private" messages.

There are many ways attackers can dump files without actually having privileged access to write to or read from memory. That's a moot point anyway, as neither you nor I is capable of enumerating all potential attack vectors and risks. So instead of waiting for a known failure to happen because you are personally "confident" in your level of technological omnipotence, we should not be so blatantly arrogant and should fill the hole that's waiting to be used.


Also, this is a common problem for which frameworks already provide solutions:

https://www.electronjs.org/docs/latest/api/safe-storage

This is such a common problem that it has been abstracted into APIs for most major desktop frameworks, and every major operating system provides a keyring-like service for this purpose.

Because this is a common hole in your security model.
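
For anyone wondering what using that API actually looks like, here's a minimal sketch of wrapping a database key with Electron's safeStorage before it ever touches disk. The `db-key.bin` file name and the flow around it are illustrative assumptions, not Signal's actual code:

```ts
// Minimal sketch: wrap a database key with the OS keystore via Electron's
// safeStorage (DPAPI on Windows, Keychain on macOS, libsecret/kwallet on Linux).
// The "db-key.bin" file name and surrounding flow are illustrative only.
import { app, safeStorage } from 'electron';
import { promises as fs } from 'fs';
import * as path from 'path';

const keyFile = path.join(app.getPath('userData'), 'db-key.bin');

// Call these after app.whenReady(); safeStorage needs the app to be initialized.
async function storeDbKey(dbKeyHex: string): Promise<void> {
  if (!safeStorage.isEncryptionAvailable()) {
    throw new Error('OS-level encryption unavailable; refusing to write plaintext');
  }
  const wrapped = safeStorage.encryptString(dbKeyHex); // returns an encrypted Buffer
  await fs.writeFile(keyFile, wrapped);
}

async function loadDbKey(): Promise<string> {
  const wrapped = await fs.readFile(keyFile);
  return safeStorage.decryptString(wrapped); // throws if the blob can't be decrypted
}
```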

[-] [email protected] 4 points 2 months ago

Secrets should never be stored in plaintext in any application that deals with security, ever.

[-] [email protected] 42 points 2 months ago* (last edited 2 months ago)

While it would certainly be nice to see this addressed, I don't recall Signal ever claiming their desktop app provided encryption at rest. I would also think that anyone worried about that level of privacy would be using disappearing messages and/or regularly wiping their history.

That said, this is just one of the many reasons why whole disk encryption should be the default for all mainstream operating systems today, and why per-app permissions and storage are increasingly important too.

[-] [email protected] 22 points 2 months ago

Full disk encryption doesn't help with this threat model at all. A rogue program running on the same machine can still access all the files.

[-] [email protected] 13 points 2 months ago

It does help greatly in general though, because all of your data will be encrypted when the device is at rest. Theft and B&Es will no longer present a risk to your privacy.

Per-app permissions address this specific threat model directly. Containerized apps, such as those provided by Flatpak, can ensure that apps remain sandboxed and unable to access data without explicit authorization.

[-] [email protected] 4 points 2 months ago

Does encrypting your disks change anything for the end user in day-to-day usage? To be honest, I've never used encrypted disks in my life.

[-] [email protected] 7 points 2 months ago

Whole disk encryption wouldn't change your daily usage, no. It just means that when you boot your PC you have to enter your passphrase. And if your device becomes unbootable for whatever reason, and you want to access your drive, you'll just have to decrypt it first to be able to read it/write to it, e.g. if you want to rescue files from a bricked computer. But there's no reason not to encrypt your drive. I can't think of any downsides.

[-] [email protected] 5 points 2 months ago* (last edited 2 months ago)

No, the average user will never know the difference. I couldn't tell you exactly what the current performance impact is for hardware-accelerated disk encryption, but it's likely around 1-4% depending on the platform (I use LUKS under Linux).

For gamers, it's likely a 1-5 FPS loss, depending on your hardware, which is negligible in my experience. I play mostly first and third person shooter-style games at 1440p/120hz, targeting 60-90 FPS, and there's no noticeable impact (Ryzen 5600 / RX 6800XT).

[-] [email protected] 4 points 2 months ago

Exactly.

I'll admit to being lazy and not enabling encryption on my Windows laptops. But if I deployed something for someone, it would be encrypted.

[-] [email protected] 41 points 2 months ago* (last edited 2 months ago)

How in the fuck are people actually defending Signal for this, and with stupid arguments such as "Windows is compromised out of the box"?

You. Don't. Store. Secrets. In. Plaintext.

There is no circumstance where an app should store its secrets in plaintext, and there is no secret which should be stored in plaintext. Especially since this is not some random dude's random project, but a messenger claiming to be secure.

Edit: "If you got malware then this is a problem anyway and not only for signal" - no, because if secure means to store secrets are used, than they are encrypted or not easily accessible to the malware, and require way more resources to obtain. In this case, someone would only need to start a process on your machine. No further exploits, no malicious signatures, no privilege escalations.

"you need device access to exploit this" - There is no exploiting, just reading a file.

[-] [email protected] 9 points 2 months ago

You. Don't. Store. Secrets. In. Plaintext.

SSH stores its secret keys in plaintext too, in a home dir accessible only by the owning user.

I won't speak about Windows, but on Linux and other Unix systems the presumption is that if your home dir is compromised, you're fucked anyway. Effort should be spent on actually protecting access to the personal files in your home directory, not on security theater.

[-] [email protected] 8 points 2 months ago* (last edited 2 months ago)

How in the fuck are people actually defending signal for this

Probably because Android (at least) already uses file-based encryption, and the files stored by apps are not readable by other apps anyways.

And if people had to type in a password every time they started the app, they just wouldn't use it.

[-] [email protected] 6 points 2 months ago

Popular encrypted messaging app Signal is facing criticism over a security issue in its desktop application.

Emphasis mine.

[-] [email protected] 5 points 2 months ago

I think the point is the developers might have just migrated the code without adjustments, since that is how it was implemented before. Similar to how PC game ports sometimes run like shit because they are a close 1:1 of the original, which is not always the most optimized or ideal, but is the quickest to ship.

[-] [email protected] 39 points 2 months ago

That applies to pretty much all desktop apps; your browser profile can be copied to get access to all your already-logged-in cookie sessions, for example.

[-] [email protected] 9 points 2 months ago

IIRC this is how those Elon Musk crypto livestream hacks worked on YouTube back in the day. I think the bad actors got hold of cached session tokens and gave themselves access to whatever account they were targeting. Linus Tech Tips had a good bit about it in a WAN Show episode.

[-] [email protected] 21 points 2 months ago

The real problem is that the security model for apps on mobile is much better than that for apps on desktop. Desktop apps should all have private storage that no other non-root app can access. And while we're at it, they should have to ask permission before activating the mic or camera.

[-] [email protected] 11 points 2 months ago* (last edited 2 months ago)

macOS has nailed it*, even though it's still not as good as iOS or Android, but it's leaps and bounds better than Windows and especially Linux.

Edit: *sandboxing/permission system

[-] [email protected] 4 points 2 months ago

What's wrong with the Flatpak permissions system on Linux?

[-] [email protected] 19 points 2 months ago

Why is Signal almost universally defended whenever another security flaw is discovered? They're not secure, they don't address security issues, and their business model is unsustainable in the long term.

But, but, if you have malware "you have bigger problems". But, but, an attacker would have to have "physical access" to exploit this. Wow, such bullshit. Do some of you people really understand what you're posting?

But, but, "windows is compromised right out of the box". Yes...and?

But, but, "Signal doesn't claim to be secure". Fuck off, yes they do.

But, but, "just use disk encryption". Just...no...WTF?

Anybody using Signal for secure messaging is misguided. Any one of your recipients could be using the desktop app, and there's no way to know unless they tell you. On top of that, all messages filter through Signal's servers, adding a single point of failure to everything. Take away the servers, no more Signal.

[-] [email protected] 26 points 2 months ago

If someone can read my Signal keys on my desktop, they can also:

  • Replace my Signal app with a maliciously modified version
  • Install a program that sends the contents of my desktop notifications (likely including Signal messages) somewhere
  • Install a keylogger
  • Run a program that captures screenshots when certain conditions are met
  • [a long list of other malware things]

Signal should change this because it would add a little friction to a certain type of attack, but a messaging app designed for ease of use and mainstream acceptance cannot provide a lot of protection against an attacker who has already gained the ability to run arbitrary code on your user account.

[-] [email protected] 7 points 2 months ago* (last edited 2 months ago)

Those are outside Signal's scope and depend entirely on your OS and your (or your sysadmin's) security practices (e.g. I'm almost sure that on Linux you need extra privileges for those things, on top of just read access to the user's home directory).

The point is, why didn't the Signal devs code it the proper way and obtain the credentials every time (interactively from the user or automatically via the OS password manager) instead of just storing them in plain text?

[-] [email protected] 6 points 2 months ago* (last edited 2 months ago)

Not necessarily.

https://en.m.wikipedia.org/wiki/Swiss_cheese_model

If you read anything, at least read this link to self correct.


This is a common area where non-security professionals out themselves as not actually being such: broken, fallacious reasoning about security risk management. Generally the same "dismissive security by way of ignorance" premises.

It's fundamentally the same as "safety" (think OSHA and the CSB): the same thought processes, the same risk models, the same risk factors, etc.

And, similarly, the same negligence toward filling in the holes in your Swiss cheese model.

"Oh that can't happen because that would mean x,y,z would have to happen and those are even worse"

"Oh that's not possible because A happening means C would have to happen first, so we don't need to consider this is a risk"

....etc

The same logic you're using is the same logic that the industry has decades of evidence showing how wrong it is.

Decades of evidence indicating that you are wrong, you know infinitely less than you think you do, and you most definitely are not capable of exhaustively enumerating all influencing factors. No one is. It's beyond arrogant for anyone to think that they could 🤦🤦 🤦

Thus, most risks are considered valid risks (this doesn't necessarily mean they are all mitigatable though). Each risk is a hole in your model. And each hole is in itself at a unique risk of lining up with other holes, and developing into an actual safety or security incident.

In this case

  • Signal was alerted to this over 6 years ago.
  • The framework they use for the desktop app already has built-in features for this problem.
    • This is a common problem with common, industry-wide solutions.
  • Someone has already made a pull request to enable the Electron safeStorage API, and Signal has ignored it.

Thus, this is just straight-up negligence on their part.

There's not really much in the way of good excuses here. We're talking about a run-of-the-mill problem that has baked-in solutions in most major frameworks, including the one Signal uses.

https://www.electronjs.org/docs/latest/api/safe-storage

[-] [email protected] 5 points 2 months ago* (last edited 2 months ago)

98% of desktop apps (at least on Windows and Linux) are already broken by design anyway. Any one app can spy on and keylog all other apps, read all your home folder data, everything. And anyone can write a desktop app, so only using solutions that (currently) don't have a desktop app version seems silly to me.

[-] [email protected] 18 points 2 months ago* (last edited 2 months ago)

The backlash is extremely idiotic. The only two options are to store it in plaintext or to have the user enter the decryption key every time they open it. They opted for the more user-friendly option, and that is perfectly okay.

If you are worried about an outsider extracting it from your computer, then just use full disk encryption. If you are worried about malware, they can just keylog you when you enter the decryption key anyways.
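
For reference, the "enter the decryption key every time" option would look roughly like deriving a wrapping key from a passphrase at startup and using it to unwrap the stored database key. A rough sketch using Node's crypto module; the AES-GCM wrapping scheme and parameter names are assumptions for illustration, not how Signal does it:

```ts
// Rough sketch: derive a wrapping key from a user-supplied passphrase with a
// memory-hard KDF, then unwrap the locally stored database key with AES-256-GCM.
// The wrapping scheme and parameter names are illustrative assumptions.
import { scryptSync, createDecipheriv } from 'crypto';

function unwrapDbKey(passphrase: string, salt: Buffer, iv: Buffer,
                     wrappedKey: Buffer, authTag: Buffer): Buffer {
  const wrappingKey = scryptSync(passphrase, salt, 32); // 256-bit key from passphrase
  const decipher = createDecipheriv('aes-256-gcm', wrappingKey, iv);
  decipher.setAuthTag(authTag); // reject tampered ciphertext
  return Buffer.concat([decipher.update(wrappedKey), decipher.final()]);
}
```

The usability cost is exactly what's described above: the user has to type that passphrase on every launch, or the app has to cache it somewhere, which brings you back to the original problem.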

[-] [email protected] 8 points 2 months ago

The third option is to use the native secret vault. macOS has its Keychain, Windows has DPAPI, and Linux has non-standardized options available depending on your distro and setup.

Full disk encryption does not help you against data exfil, it only helps if an attacker gains physical access to your drive without your decryption key (e.g. stolen device or attempt to access it without your presence).

Even assuming that your device is compromised by an attacker, using safer storage mechanisms at least gives you time to react to the attack.
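
As an illustration of that third option, here's a minimal sketch using the keytar module, one of several Node bindings that front exactly those backends (Keychain on macOS, Credential Manager/DPAPI on Windows, Secret Service/libsecret on Linux). The service and account names are placeholders:

```ts
// Sketch: keep the database key in the platform credential store via keytar.
// The service/account names are made up for illustration.
import * as keytar from 'keytar';

async function saveDbKey(dbKeyHex: string): Promise<void> {
  await keytar.setPassword('my-messenger', 'db-encryption-key', dbKeyHex);
}

async function loadDbKey(): Promise<string | null> {
  // Resolves to null if nothing has been stored for this service/account pair
  return keytar.getPassword('my-messenger', 'db-encryption-key');
}
```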

[-] [email protected] 7 points 2 months ago

Linux has the Secret Service API, which has been a freedesktop.org standard for 15 years.

[-] [email protected] 17 points 2 months ago

Ah yes, another prime example demonstrating that Lemmy is no different from Reddit. Everyone online thinks they're a professional.

Nothing sensitive should ever lack encryption, especially in the hands of a third-party company managing your data while claiming you are safe and your privacy is protected.

No one is invincible, and it's okay to criticize the apps we hold in high regard. If you're pissed that people are shitting on Signal, you should be pissed that Signal gave people a reason to shit on them.

[-] [email protected] 7 points 2 months ago

Whatever it stores and however it stores it doesn't matter to me: I moved its storage space to my ~/.Private encrypted directory. Same thing for my browser: I don't use a master password or rely on its encryption, because I set it up so it too saves my profile in the ~/.Private directory.

See here for more information. You can essentially secure any data saved by any app with eCryptfs - at least when you're logged out.

Linux-only of course. In Windows... well, Windows.

[-] [email protected] 7 points 2 months ago

Bruh, Windows and Linux have a secrets vault (Credential Manager and the keyring, respectively, IIRC) for this exact purpose.

Even Discord uses it on both OSes, no problem.

[-] [email protected] 6 points 2 months ago

Sure, I was aware. You have the same problem with SSH keys, GPG keys, and many other things.

[-] [email protected] 5 points 2 months ago

This just in: threat actors compromising your devices is bad. More at 11.

[-] [email protected] 4 points 2 months ago

This shows an incredibly cavalier approach to security on the part of the team working on Signal.
