The Issues with Free and Open-Source Software for Privacy Tools

Once you get down to this level, you almost need to reevaluate everything about your threat model and what you are doing to protect yourself. Even the smallest things can bring a whirlwind of issues if you are up against the wrong people. In the previous section, we discussed how open-source software is a really, really good thing. Now we need to discuss some issues with it, and what you can do to combat those issues and stay safe.

FOSS is great because it allows us to look at the code in its entirety and verify that it is doing what we are led to believe it is doing. But for that to be true, we need to understand everything about the published code. I, for one, do not know how to code anything beyond a simple website in HTML, so I have to rely on the word of others. That word is only as good as the people checking it, though. Say we are planning on using ServiceX (just as an example) to communicate securely with someone else, but ServiceX pushes out updates on a fairly regular (monthly) basis. Unless we know how to read, understand, and validate the code ourselves, we need another trusted person who can do this, and that person needs to do it every time an update is pushed. Then we have to ask whether one skilled person looking at the code is enough. If this person misses something that has the potential to compromise us, we would keep using ServiceX until someone else notices the fault. Even if that window is only a matter of days, those are days in which everything we do with this service is compromised, which by association compromises us and the entire model of security, privacy, AND anonymity we have worked so hard to build up.
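If you do want to keep an eye on what changes between releases, a reasonable starting point is simply diffing the two tagged versions of the source before you update. The sketch below is a minimal example, assuming git is installed and the project tags its releases; the repository URL and tag names are placeholders, not a real review tool, and reading the resulting diff is still the hard part.

```python
import subprocess

# Hypothetical repository and release tags -- substitute the project you actually use.
REPO = "https://github.com/example/servicex.git"
OLD_TAG = "v1.4.0"   # the version someone you trust has already reviewed
NEW_TAG = "v1.5.0"   # the update you are about to install

# Clone the source (or fetch, if you already have a copy checked out).
subprocess.run(["git", "clone", "--quiet", REPO, "servicex"], check=True)

# Summarize which files changed between the reviewed release and the new one.
stat = subprocess.run(
    ["git", "-C", "servicex", "diff", "--stat", f"{OLD_TAG}..{NEW_TAG}"],
    check=True, capture_output=True, text=True,
)
print(stat.stdout)

# The full diff is what a reviewer actually has to read. The point is that this
# work has to be repeated for every single update, not done once and forgotten.
diff = subprocess.run(
    ["git", "-C", "servicex", "diff", f"{OLD_TAG}..{NEW_TAG}"],
    check=True, capture_output=True, text=True,
)
print(f"{len(diff.stdout.splitlines())} changed lines to review")
```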

Another issue with free, open-source software is mobile platforms. On most desktop operating systems, we can take the source code from GitHub (or another code-publishing website) and build/compile it ourselves, provided it has been written to work with our OS. On mobile operating systems, we can't do that easily. And even in the cases where we can, we still face a huge challenge that doesn't yet have a magical solution. To get an application onto my iPhone, it needs to be published to the App Store by the company that developed it. I can't go to the Open Whisper Systems website and download Signal straight to my phone. So even if we are checking the source code of the service/application (or having someone else do it for us), we still can't validate that the same application is being sent to the App Store for us to download. If the company were compromised by a law enforcement body and forced to comply, it could publish a clean update to GitHub, making slight UI changes to avoid suspicion, but then send a backdoored version of the same application to the App Store for thousands of users to download. This holds true, in a sense, for Android devices and the Google Play Store as well. The only real way around this with the Google Play Store is to publish reproducible builds for the public to see and make use of. Open Whisper Systems has just pushed this out for Signal, and it would be really nice to see other services do the same (hint hint: ProtonMail, Tutanota, ChatSecure): https://github.com/signalapp/Signal-Android/tree/master/reproducible-builds. Since we can't easily verify that the application we are using on our phones isn't doing malicious things, it is a fair assumption that ditching mobile devices and using strictly desktop versions of programs, ones we can compile from source and monitor ourselves, is the best route to travel down.
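The idea behind reproducible builds is simple: compile the published source yourself, then check that the APK you produced matches the one the store delivered, ignoring the signing metadata, which will always differ. Signal ships its own comparison script for this; the snippet below is only a stripped-down sketch of that idea, with made-up file paths, and is not a substitute for the project's official tooling.

```python
import hashlib
import zipfile

def entry_digests(apk_path):
    """Map each file inside the APK to its SHA-256, skipping the signature block."""
    digests = {}
    with zipfile.ZipFile(apk_path) as apk:
        for name in apk.namelist():
            # META-INF holds the signing certificates, which legitimately differ
            # between your own build and the Play Store build.
            if name.startswith("META-INF/"):
                continue
            digests[name] = hashlib.sha256(apk.read(name)).hexdigest()
    return digests

# Hypothetical paths: one APK built from source, one pulled off the device.
built = entry_digests("Signal-built-from-source.apk")
store = entry_digests("Signal-from-play-store.apk")

mismatched = {n for n in built.keys() | store.keys() if built.get(n) != store.get(n)}
if mismatched:
    print("APKs differ in:", sorted(mismatched))
else:
    print("All non-signature contents match: the build is reproducible.")
```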

Code Audits for Privacy Tools

Even after reading all of the above about open-source software, there is still a huge hurdle to clear before we can be certain that the software we are using is secure. It isn't fair to assume that 100% of the people reading this section are able to check through the code of an application themselves. Hell, it isn't even fair to assume that 5% of the people reading this could perform such a daunting task. Take TrueCrypt, for example. The code audits performed to make sure it was secure took months, carried out by people light years ahead of me in the field of encryption, some of them holding advanced degrees in the area with years of experience under their belts (cough, cough @matthew_d_green). So assuming that one individual can do this sort of thing to keep himself or herself secure is silly. Code audits on the applications and services we trust with our security at this level are crucial. And once a code audit is complete, you then have to consider that it won't be valid for later versions of the application. The second the developers push an update and you install it, you are back to square one unless someone is reviewing the changes and verifying them with every release.
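One practical consequence is to pin yourself to the exact version an audit covered and verify that what you actually downloaded is that version. Below is a minimal checksum sketch; the filename and the published hash are placeholders you would take from the project's release page for the audited version, and many projects also sign these hashes with PGP, which is worth verifying as well.

```python
import hashlib

# Placeholder values -- take the real hash from the project's release notes
# for the specific version the audit covered.
AUDITED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"
DOWNLOADED_FILE = "servicex-1.4.0-installer.dmg"

sha256 = hashlib.sha256()
with open(DOWNLOADED_FILE, "rb") as f:
    # Hash in chunks so large installers don't need to fit in memory.
    for chunk in iter(lambda: f.read(1 << 20), b""):
        sha256.update(chunk)

if sha256.hexdigest() == AUDITED_SHA256:
    print("Download matches the audited release.")
else:
    print("Hash mismatch: this is NOT the binary that was audited.")
```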

See: The Importance of Free and Open-Source Software for Privacy Tools