There are no good guys. Only rotating members of the aligned interest gang.
Nailed it.
The recent push by governments worldwide to force corporations to build backdoors into their services for the ‘safety of children’ or to ‘counter terrorism’ arguably does more harm than good for ordinary people.
We’re going to ‘counter terrorism’ by mass spying on our own citizens and hope to god real terrorists don’t gain access to our backdoor.
The point of the exercise is ALWAYS to spy on innocent citizens. It’s about surveillance and control, not countering terrorism or protecting children.
if we spy on their spies, we can counterspy their spyworks before they spy on us!
… and it’s called “defensive measures”, sir.
/s
It’s never about the public interest but the preservation of plutocratic power.
It’s the fact that the intelligence agencies have proven themselves unable to use their powers responsibly, and instead find every sneaky way possible to infiltrate and spy on their own citizens while preventing nothing. That’s what has pushed the world to say enough is enough, we are going to encrypt everything we can. Now the global powers are crying poor about how they need access to stop terrorism, while being completely unable to point to a single instance where that access stopped a terror attack, even though plenty of terror attacks were never stopped.
Governments: We’re going to counter terrorism by backdooring every device our citizens use.
Real terrorists: We have our own devices that are fully encrypted and free from backdoors.
Thanks for giving us the key to spy on all of your citizens tho, very helpful.
deleted by creator
Love the concrete examples and then “etc.”
Slightly off topic - someone mentioned they don’t use Tutanota for social interactions because the domain is weird, and I agree wholeheartedly. Every time I’m on the phone with a support department, or telling my friends, and I have to spell it out, I feel so silly. Not to mention my wife has gotten it wrong several times.
Love the solution, their support is responsive as well, but yeah…
That’s the main reason I didn’t even consider them. “Proton(mail)” just sounds more professional when used in actually important contexts and is easier for people to get right.
In general, I’ve noticed that a lot of privacy-focused software, particularly FOSS, is really bad at choosing names that make people want to use it. They tend to have names that might appeal to some crypto-nerds, but that sound weird, questionable, or niche to the average user. Like (the precursor to) Signal, the messaging app, used to be called TextSecure. There’s no way I would’ve gotten my parents and siblings to use something called TextSecure. The name just sounds so geeky and niche.
Tbf, Google is also a weird name, and Yahoo was a bit weird too, even if not entirely; there are probably more examples. It’s not just that the name isn’t great, but also that these things aren’t advertised as well.
Two details:
FBI expands rapidly to DHS and then the entire US Police State. If you cross borders, expect ICE and CBP to be up in your body cavities. If the local county sheriff doesn’t like you, or you’re being stalked by an officer (say, an ex), expect them to have access as well.
When you think hackers, think not only of data-mining interests like Palantir but also of industrial spies. If you have any business interests on your phone subject to an NDA (or that you’re otherwise motivated not to share), these guys will sell that information to your competitors, if they weren’t hired by them in the first place.
If you run more than a mom-and-pop, then the default security of your smartphone is not enough. Yet a lot of sizeable companies supply their officers with unprotected phones.
deleted by creator
not if governments are compromising the encryption algorithms themselves.
edit: sources: https://slashdot.org/story/420213 and https://www.newscientist.com/article/2396510-mathematician-warns-us-spies-may-be-weakening-next-gen-encryption/
Good luck with that. FOSS is transparency at the source code level; there’s no obscurity they can hide their back door behind.
You realize nobody would know about this in the first place if it was Proprietary, right?
FOSS allows for whistleblowers, scrutiny, and audits. Proprietary ‘security via obscurity’ does not.
I’m perfectly aware of all that. But cryptography is such a complicated discipline that even the most experienced mathematicians have a hard time designing and scrutinizing an algorithm; they rely heavily on peer review. If one major institution like NIST is biased by the NSA, it has a better chance of compromising algorithms, if that is its intention.
You’d be surprised what the worldwide collective of cryptographers is capable of when it’s able to scrutinize a project in the first place. Which would you prefer: a closed, unscrutinizable encryption algorithm, or one that’s entirely open from the ground up?
NIST could do damage if they’re biased, but it’s not like people aren’t keeping a close eye on them and scrutinizing as many mistakes as possible, especially for an algorithm as globally important as PQC.
I’m totally against anything proprietary. That’s the first requisite for anything I use. And I’m not advocating for proprietary algorithms at all; that would very much be the demise of encryption.
I’m just worried that a sufficiently influential actor (let’s say a government) could theoretically bribe these institutions into promoting weaker encryption standards. I’m not even saying they are trying to introduce backdoors, just that, like the article suggests, they might bias organizations toward supporting weaker algorithms.
AES-128 is still considered secure for public institutions, when modern computers can do much stronger encryption without being noticeably slower.
They did this before with elliptic curve cryptography, and we knew it had this problem before it was implemented as a standard.
So if the NSA offers a standard, don’t trust it and include in your encryption software the option to use something different.
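On that note, here is a minimal sketch of what ‘the option to use something different’ could look like. It is hypothetical (the helper names and the CIPHERS table are mine, and it assumes Python with the pyca/cryptography package installed); the point is just that the AEAD algorithm is a parameter rather than a hard-coded constant, so trust in one standard can be withdrawn without rewriting the software:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

# Hypothetical helpers: the AEAD cipher is a user-facing choice, not a constant.
# Both algorithms take a 32-byte key and a 12-byte nonce, so swapping is a config change.
CIPHERS = {"aes-256-gcm": AESGCM, "chacha20-poly1305": ChaCha20Poly1305}

def encrypt(cipher: str, key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(12)                      # fresh 96-bit nonce per message
    return nonce + CIPHERS[cipher](key).encrypt(nonce, plaintext, None)

def decrypt(cipher: str, key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]    # decryption raises if the auth tag doesn't verify
    return CIPHERS[cipher](key).decrypt(nonce, ciphertext, None)

# Swapping algorithms is a one-word change (e.g. read it from the user's config):
key = os.urandom(32)
blob = encrypt("chacha20-poly1305", key, b"don't marry a single standard")
print(decrypt("chacha20-poly1305", key, blob))
```

In practice the cipher name would come from the user’s settings, which is exactly the kind of escape hatch you want if an NSA-endorsed default ever looks suspect.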
Leaving a back door in is the same logic as leaving a key under a fake rock by your house.
That you as the homeowner don’t know about, and anyone whose home was built by the same people who built yours has the exact same key under the exact same rock.
Those famous good guys we all know and love.
This article makes some good points generally, but it is ultimately marketing for a commercial snakeoil service which has a gigantic backdoor in its very threat model: when a Tutanota user sends an “end to end encrypted email” to a non-Tutanota user, what actually happens is that the recipient receives a link to a web page which they type the encryption key into.
Even if the javascript on that page is open source and audited, it is not possible (even for sophisticated users) to verify that the server is actually sending the correct javascript each time a user accesses it. So the server can easily target specific users and circumvent their encryption. The same applies to Tutanota users emailing each other when one of them is using the webmail interface.
This effectively reduces the security of their e2ee to “it works as long as the server remains honest”. But, if you fully trust the server to always do what it says it will, why bother with e2ee at all? They may as well just promise not to read your email.
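To make that concrete, here is a toy sketch (hypothetical code, not Tutanota’s actual implementation; the header name and target address are made up) of how a dishonest web server could serve the honest, audited script to everyone except one targeted account. Auditing the published source doesn’t help, because the server decides what each individual request receives:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical malicious webmail server: every visitor gets the audited script,
# except one targeted account, which gets a copy that also leaks the passphrase
# typed into the page. The recipient's browser cannot tell the difference.
HONEST_JS = b"/* audited open-source code that decrypts entirely in the browser */"
BACKDOORED_JS = b"/* identical-looking code, plus: send the typed passphrase back to the server */"

class MailScriptHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        account = self.headers.get("X-Account-Id", "")   # however the server identifies the visitor
        body = BACKDOORED_JS if account == "target@example.com" else HONEST_JS
        self.send_response(200)
        self.send_header("Content-Type", "application/javascript")
        self.end_headers()
        self.wfile.write(body)                           # the browser runs whatever arrives

HTTPServer(("localhost", 8080), MailScriptHandler).serve_forever()
```

A signed, pinned client (a native app or a browser extension installed once and verified) moves that trust to install time, which is the usual argument for why web-delivered e2ee is weaker than a real client.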
I am removing this from !privacy@lemmy.ml with the reason “advertising for snakeoil”. (If you’re reading this on another instance and the post isn’t deleted, ask your instance admins to upgrade… outdated versions of lemmy had a bug which prevents some moderation actions from federating.)
But we have to think of the children.
It’s the same person.
Uhm, question: how is Tutanota E2EE, other than making PGP setup easier? Afaik they just use a different protocol for client-server.
It is a shitty E2EE implementation in JS, incompatible with the email standard OpenPGP.
But I like that they wrote this post, even if it is for marketing purposes, because Tutanota is based in the EU, and hopefully the EU Parliament will listen if enough people tell them.
I’m genuinely curious about alternatives to this sort of surveillance for solving issues like CSAM etc., which aren’t just “it’s the parents’ responsibility”. Section 230 reform? Links to further reading appreciated.