Life & much, much more

Deleting Facebook, and a reflection on digital privacy

In the wake of the recent Cambridge Analytica privacy scandal, I have decided to #DeleteFacebook. The thinkMoult blog is still represented via the public Facebook thinkMoult page, but my private profile has been cleared out. Given that Facebook is increasingly sharing our profile data (as shown in the graph below, produced from Facebook's very own reports), clearing out the account makes a difference, albeit a small one. I also thought it would be good to share a few things I've learned about Facebook in the past couple of weeks, related to my New Year's resolution to improve my digital security.

Facebook government requests over time

(Note: you can compare with Google’s data disclosure over time)

First, I'd like to commend Facebook's behaviour so far. Being the world's largest social network probably isn't easy, and Facebook has taken steps to increase its transparency. For instance, they issue a transparency report, and they use the Signal secure messaging protocol for a secure chat mode in FB Messenger. It is also possible to download your Facebook data, and to place restrictions on data sharing with apps and advertisers. Their data retention policy also seems to suggest that if you delete data from your account, it is gone from their servers too.

Of course, this isn't the complete picture. Take, for instance, the world map of Facebook government requests in the first half of 2017, from their very own transparency report.

Facebook government requests in 2017

The map (classified into Jenks natural breaks) shows that the US government is miles ahead of the rest of the world in asking Facebook for information. Most other governments play hardly any part in this.
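For the curious, Jenks natural breaks is a classic choropleth classification scheme: it splits sorted values into contiguous classes so that the squared deviation from each class mean is as small as possible. Here is a minimal brute-force sketch of the idea (the function name is my own; this naive search is only practical for the small datasets and class counts a map legend uses):

```python
from itertools import combinations

def jenks_breaks(values, num_classes):
    """Brute-force Jenks natural breaks: partition the sorted values into
    contiguous classes minimising the within-class sum of squared
    deviations from each class mean. Only practical for small inputs."""
    data = sorted(values)
    n = len(data)

    def squared_dev(chunk):
        mean = sum(chunk) / len(chunk)
        return sum((x - mean) ** 2 for x in chunk)

    best_cost, best_bounds = float("inf"), None
    # Try every way of placing num_classes - 1 cut points between values.
    for cuts in combinations(range(1, n), num_classes - 1):
        bounds = (0,) + cuts + (n,)
        cost = sum(squared_dev(data[a:b]) for a, b in zip(bounds, bounds[1:]))
        if cost < best_cost:
            best_cost, best_bounds = cost, bounds

    # Report the upper value of each class, as map legends usually do.
    return [data[b - 1] for b in best_bounds[1:]]

# Request counts cluster naturally: one heavy requester, many near zero.
print(jenks_breaks([0, 1, 2, 40, 45, 32000], 3))  # → [2, 45, 32000]
```

Unlike equal-interval or quantile classification, the breaks land where the data itself clusters, which is exactly what makes the US outlier stand out on the map above.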

However, the map is incomplete. It does not show data shared through indirect means. Developers can easily create apps that integrate with Facebook. Whether you answer a survey through Facebook or use Facebook to log into another service, these apps can have varying degrees of access to your profile and friend information. This may also occur without your explicit consent. For instance, my meager Facebook usage has resulted in my details being shared with 138 companies. And that is before counting trackers: Facebook trackers sit on 25% of websites online. Oh, and let's forget Facebook altogether for a moment: Google trackers sit on 75% of websites online (and yes, also on my blog). Basically, you are always tracked online, from the way you move your mouse to how you feel, and these signals can be combined through machine learning to indirectly infer character profiles, interests, and demographics.

Like most technologies, this data can be used for very positive and very negative things alike. The negative side appears when services we assume are private social platforms are actually not. This data may be used to influence political elections, help China rank all of its citizens, rebrand political news as fake news in Malaysia, or even be accessed by any law enforcement agency around the world without notification or warrant. Whatever the case, people misunderstand the nature of posting on Facebook: it is not a private matter, it is public.

Deleting Facebook is one step of many to promote the idea that just as there are public outlets for expression online (blogs, Twitter, Facebook) there equally are private outlets (Signal, Tor, ProtonMail). Of course, there is nothing inherently wrong with either outlet, but we should recognise these differences in privacy and know when to choose between them.

For more reading, see why digital rights matters, even though you don’t think it impacts you, and how you can improve human rights by changing your messaging app.


Improving human rights through secure messaging

Earlier this year, I talked about how important digital privacy is (even if you don’t think it is). I talked about political oppression, and how raising the awareness of basic digital privacy largely benefits those who are politically oppressed. Using secure services increases the amount of infrastructure dedicated towards them and raises the standards of digital security worldwide. But before we talk about how we make the first steps, let’s remind ourselves why this is so important.

2018 Freedom in the World index map

The map above shows the results of the 2018 Freedom in the World index, derived largely from the Universal Declaration of Human Rights. When the declaration was adopted, no UN member state voted against it. Green is free, yellow is partly free, and red is not free. As of 2018, more than half of the world's countries are rated as less than fully free.

Percentage of countries in the freedom in the world index over time

There are quite a few ways to slice and dice freedom index data, but the general trend can be seen in the graph above, which shows the distribution of free, partly free, and not free countries over time. Generally, we've improved a bit since the 1970s, but have largely stalled in the past 20 years. More than half the world still has problems with political and civil liberties, and a few countries are getting worse. Of course, the above data is a gross simplification, so if you're interested in more detailed and granular metrics, I urge you to check other dimensions, such as Our World In Data's Human Rights graphs.

The good news is that we all use the internet, and by using it we shape how it grows, and that allows us to make an impact on human rights. The World Economic Forum illustrates the link between digital privacy and human rights in the quote below:

Digital rights are basically human rights in the internet era. The rights to online privacy and freedom of expression, for example, are really extensions of the equal and inalienable rights laid out in the United Nation’s Universal Declaration of Human Rights.

As a case study, Facebook biannually releases its Global Government Requests report (see the 2017 Global Government Requests Report blog post). For the first half of 2017, it shows roughly 79,000 government requests for data covering 115,000 user accounts. That's more than triple what it was three years earlier (35,000 user accounts). Every report sees an increase in the number of requests, easily growing more than 30% each year. Yikes! That's some serious compounding privacy interest!
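That compounding claim is easy to check with some back-of-envelope arithmetic on the two report figures quoted above:

```python
# Accounts subject to government requests, per Facebook's transparency
# reports: first half of 2014 versus first half of 2017 (figures quoted above).
accounts_2014 = 35_000
accounts_2017 = 115_000
years = 3

# Compound annual growth rate: (end / start) ** (1 / years) - 1
cagr = (accounts_2017 / accounts_2014) ** (1 / years) - 1
print(f"{cagr:.0%}")  # → 49%
```

A compound growth rate of nearly 49% per year comfortably clears the "more than 30% each year" figure.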

However, there are steps we can take to raise the basic levels of digital privacy online. By adopting these technologies, we increase the global average cost per capita of digital mass surveillance — and reduce its efficacy as a tool to control and oppress those in need.

Our online activity can largely be grouped into three categories: messaging, email, and web browsing. By changing a few habits in our day-to-day online activities, we can make a difference. In this article, we'll concentrate on messaging.

We send messages all the time: SMSes, Facebook Messenger, WhatsApp, Skype, Google Hangouts, and so on. If you're the statistically average user, you have two messaging apps, and they're both on the chart below. The data comes from Statista; I've reworked it slightly.

Global monthly active users for different messaging apps

(note: due to a formatting error, you will need to multiply the horizontal axis by 1,000. So Facebook’s numbers are over 2.5 billion!)

What you may not know is that big data on the internet is owned by a handful of companies, governed by a handful of countries. The USA's Facebook and China's Tencent gather more of your messages than probably everything else combined. These companies have little to no incentive to protect your data, actively create digital profiles of you, and are based in countries whose governments are more than happy to ask for that data to be disclosed.

But don't listen to me; listen to Amnesty International's Encryption and Human Rights report instead. Unless you're using Facebook's WhatsApp (which is the least bad), Amnesty International thinks you deserve a slap on the wrist. Worst of all messaging apps are China-based Tencent's QQ and WeChat, which score 0 out of 100 in protecting human rights. Tencent has no encryption specification, does not recognise threats to human rights, has made no commitment to freedom of expression, actively detects and censors content, and does not refuse backdoor implementations. So, if you send money through WeChat (yes, WeChat has higher transaction volumes than PayPal), guess what? It's public! We could go through the many examples of public data, but I'll let you read the publication yourself and judge.

So what makes Facebook's WhatsApp the least bad? Well, for a start, it has publicly stated there is no encryption backdoor: no built-in mechanism for sharing your data. It's more transparent, tries to notify you if your data is being requested, and produces the biannual reports we saw above. But perhaps the most effective secret sauce, the gold standard of digital human rights protection, is that it supports end-to-end encryption. This means that from the moment your message leaves your device, nothing but the recipient's device can read it.
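The core idea is worth illustrating. Below is a toy sketch (a throwaway XOR pad, emphatically not real cryptography, and nothing like the Signal protocol's actual key exchange): the server in the middle only ever relays ciphertext, and without the key held by the two endpoint devices, that ciphertext is unreadable.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each message byte with a key byte.
    return bytes(m ^ k for m, k in zip(message, key))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob share a random key that only their two devices hold.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(message, key)          # this is all the server relays
print(decrypt(ciphertext, key))             # only a key holder recovers it
```

Real end-to-end systems replace the shared pad with per-conversation keys negotiated between devices (the Signal protocol's double ratchet), but the property is the same: the relay never holds anything it can read.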

WhatsApp's end-to-end encryption isn't its own invention. Like any robust cryptography, it is based on free and open-source software. Many years ago, defectors from Twitter started a collaborative effort called Open Whisper Systems and developed the Signal secure messaging system. Signal is not owned by any company or country, is open-source, and is primarily funded by the Freedom of the Press Foundation. For instance, if you want to tip off The Guardian, Signal is one of your options.

Signal logo

However, despite WhatsApp's best intentions in using the Signal system under the hood, its nature as a Facebook acquisition, its organisational structure, and some of its other technical decisions mean that WhatsApp falls short of Signal's encryption standards. In short, WhatsApp retains metadata about your contacts and messages, which may be used to infer information about you (much more than you might think!). Luckily, the small core team that built the Signal system also has its own app, which is completely privacy focused. It looks just like any other messaging app out there, and anyone who truly wants top-notch security and privacy can use it. Here's a screenshot of it from the official Signal website. If you have an iPhone or Android, you can download it from the app store for free. It also works on your computer via a desktop app, and even as a command line app if you're a terminal junkie.

Signal messenger app screenshot

In fact, the core Signal app sets such a high bar for privacy in the messaging world that, apart from earning a special mention in the Amnesty International report, it also received a 50 million USD donation from WhatsApp co-founder Brian Acton. Acton was around when WhatsApp made the initial jump to the Signal system under the hood, and after he left Facebook and WhatsApp, he donated the funds to create the Signal Foundation: a non-profit organisation to protect data privacy, transparency, and open-source development, in line with his personal beliefs.

If two people want a private conversation, electronic or not, they should be allowed to have it. – Brian Acton, WhatsApp co-founder

There's still so much to talk about, but let's stop here. Even if you do not fully understand the technical background behind encryption or the full extent of the human rights impact, I highly recommend you take the first step and install Signal.

See you on the other side!

P.S. For the more technically inclined, you may instead be interested in setting up your own XMPP server that supports the OMEMO XEP. OMEMO is an implementation of the same cryptographic technique pioneered by the Signal protocol, and XMPP offers decentralised messaging, in contrast to Signal Messenger, which for all practical purposes is a centralised system (theoretically, it is possible for somebody to use the protocol and build in federation support).
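If you do go down that route, note that OMEMO itself lives in the clients; the server mostly just needs Personal Eventing (XEP-0163) for key distribution, plus ideally message archiving and carbons so encrypted history reaches all your devices. A minimal sketch for a Prosody server might look like the fragment below (module names are from Prosody's stock distribution, but treat this as a starting point rather than a complete, hardened config):

```lua
-- Hypothetical fragment of /etc/prosody/prosody.cfg.lua
modules_enabled = {
    "roster";
    "saslauth";
    "tls";       -- transport encryption between client and server
    "pep";       -- XEP-0163 Personal Eventing, used for OMEMO key exchange
    "mam";       -- XEP-0313 archiving; stores only OMEMO ciphertext
    "carbons";   -- XEP-0280, mirrors messages to all logged-in devices
}
```

Pair it with an OMEMO-capable client (Conversations on Android and Gajim on the desktop both support the XEP) and you get Signal-style end-to-end encryption over a server you control.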


Digital privacy is important, even though you think it doesn’t impact you

The average person (or business entity) publicly shares their personal information on the internet. If you search with Google, send email with Gmail, talk with Facebook Messenger, and browse the Web with Chrome, you are being tracked. These free services, and many more, store and analyse your personal messages, search history, cloud photos, and the websites you visit. This information is readily available to governments, hackers, or really any business or person who is interested and willing to pay (law firms, journalists, advertisers, etc).

This is not news to most people. You have perhaps seen an advertisement suddenly pop up, related to a website you visited that you thought was private. You have probably had Facebook recommend new friends whom you only met a week ago. However, these are all rather benign examples that don't seem to warrant paranoia over your digital security.

As part of my 2018 New Year's resolution, I have been taking a closer look at my online privacy. Many people have questioned me about it, so I thought I would address it in a blog post. To begin with, I'd like to refer you to a great TED Talk on Why Privacy Matters. Take 20 minutes to watch it and come back.

Glenn Greenwald - TED - Why Privacy Matters

For those too lazy to click, Glenn Greenwald makes the point that we don’t behave the same way in the physical world and the virtual world. In the physical world, we lock our houses, cover our PIN at the ATM, close the curtains, don’t talk about business secrets in public, and use an empty room when having a private conversation. This is largely because we understand that in the physical world, we can open unlocked doors, glance at PIN keypads, peek through curtains, listen to company gossip, and overhear conversations.

In the virtual world, we are unfortunately uneducated about how others can snoop on our private information. We assume that sending an email on Gmail is private, or that opening an incognito mode browser hides everything. This is far from the truth: mass surveillance is relatively cheap and easy, and many organisations are well invested in knowing how to snoop. However, most of us only experience this through tailored advertising. As a result, there is little motivation to care about privacy.

In this post, I will not talk about how you are tracked, or how to secure yourself. These are deep topics that deserve more discussion by themselves. However, I do want to talk about why privacy matters.

The right to privacy is a basic human right. Beyond the obvious need to protect company secrets and financial and medical information, we behave differently when we are being watched. You watch adult videos only if you close the door, buy different things when there is no judgmental cashier, and talk about different things on the phone when you aren't sitting on a train in public.

Again, these are benign and socially accepted norms. However, there are people living in countries where the norms are largely biased against them. Global issues like corruption and political oppression exist, even if many of us are lucky enough to be able to turn a blind eye. Victims in these countries are censored, incarcerated, and killed. See for yourself where your country ranks in the list of freedom indices.

In these societies, a greater percentage of the population is impacted by the poor digital security the rest of us practise. We can see this in the following graph, which shows that usage of Tor, a tool that anonymises internet traffic, correlates with political oppression (read the original study).

Correlation of Tor usage and political repression

Further investigation shows that Tor usage (see how Tor statistics are derived) similarly correlates with politically sensitive events. At the time of writing this post, I rewound the clock to the three most recent political events that occurred in countries experiencing censorship and political oppression.

First, we have the 19th National Congress of the Communist Party of China. You can see the tripling in activity as this event occurred. The red dots show potential censorship.

Chinese Tor usage spikes during the 19th National Congress of the Communist Party of China

Similarly, we can see a turbulent doubling in usage during the blocks of social media and TV channels in Pakistan.

Pakistan Tor usage during the social media block

Finally, there is a spike in usage, along with statistically relevant censorship and release-of-censorship events, during the anti-government protests in Iran.

Iran Tor usage spikes during Protests in Iran, blocking of various services including Tor

These three events were simply picked as the three most recent political events. Whether they are good or bad is largely irrelevant, and I hold no opinion on them whatsoever. However, it is clear that others do have an opinion, and are using services like Tor in response. Of course, it's not just Tor. For example, a couple of weeks ago, 30,000 Turks were incorrectly accused of treason based on a 1×1 tracking pixel. Accusations like these result in jobs, homes, and innocent lives being lost. In the US, governors are still signing orders in support of net neutrality.

Despite these issues, some believe that as long as we do not do anything bad, there is nothing to hide; that privacy tools are used by criminals, not the common population. This is also untrue. The definition of "bad" changes depending on who is in power, and criminals are motivated individuals with access to far better privacy tools than most of us will ever use. Statistically, raising basic awareness of privacy does not increase criminal activity, but it does increase protection of the unfairly oppressed.

Those who are fortunate enough to live a complacent digital life tend to lower the average awareness of digital privacy. Just as we donate relief aid to countries that experience wars or natural disasters, we should promote awareness of digital freedom on behalf of those who do not have it. Nurturing a more privacy-aware generation (a generation born with a tablet in its hands) is a responsibility that ensures social justice and the expression of marginalised populations remain possible.

Next up, I’ll talk a bit about what tracking does occur, and what privacy tools are out there.