23 Jun, 2014

Dating Securely In The Mobile Age

by Abdullah Munawar & Jack Mannino

One in five adults between the ages of 25 and 34 uses at least one dating app. Almost all of the big players in the mobile dating space have mobile apps for iOS and Android. Some have both a traditional web application and a mobile application, while some dating services are mobile only. We looked at the security and privacy of 30 different apps for both iOS and Android, and we found some pretty interesting results from our digging around.

Today’s mobile dating apps cater to many different interests and guilty pleasures. Here are a few examples of applications we examined and their unique demographics:

  • Ashley Madison - a dating application for people looking to cheat on their spouses, as well as a “Black Book” application
  • Positive Singles - an STD dating site where you try to match up with people who have the same STDs as you
  • HowAboutWe - speed dating and meeting new people by experiencing things together like eating food, drinking, or wandering around a museum 
  • Tinder - a location-based app where you meet people nearby by swiping left or right

Depending on who you are and what you are hiding, you may have a very different purpose for being on these networks than other users. If you are cheating on your spouse and you just so happen to be a senator, this may be valuable information, useful for extortion purposes. If you are a sexual predator, you may want to fly under the radar to avoid detection, so you might be looking for any advantage you can find. Although several apps claim to screen for and attempt to block any known sex offenders, we found that these ‘protections’ are, for the most part, completely useless.

In a joint statement, eHarmony, Match, and Spark Networks (Christian Mingle, JDate) vowed to band together to ensure that users of these networks have a secure and safe experience. At the same time, their language indicates that they understand it’s impossible to make any guarantees. An example is their description of tools to filter out sex offenders, a task which, unfortunately, is neither easy nor accurate:

“The providers will remind members that the members are responsible for their own safety and offline activities. As noted, because there are limitations to the effectiveness of sex offender screening tools and use of such tools does not guarantee member safety, providers will not promote or publicize sex offender screening tools in a manner intended to lead members to assume that due to the providers’ use of sex offender screening tools, meeting people online is any safer than meeting people any other way. The providers will disclose in the Terms of Use or User Agreements for their websites that members should not rely on sex offender screenings or other protective tools as a guarantee for safety or a replacement for following Safety Tips.”

In our research, we found that over 80% of the apps had at least one issue that allowed for de-anonymizing users. Using one or more pieces of information, we were able to tie a user’s online persona back to their real-world identity. Anonymity is important for both privacy and safety reasons. Over half of the apps sent sensitive information over HTTP, either to the app’s backend services or to analytics services. In addition to not even attempting encrypted communications, a considerable number of apps had certificate validation issues and didn’t apply SSL best practices. So besides just “Big Brother” intercepting the traffic, anyone at your local coffee shop with minimal expertise can also read this information.

Many of these apps also had authentication, session management, and access control issues that ultimately allowed other users to impersonate you. A considerable amount of sensitive information was also insecurely stored on the mobile device, including credentials, user location history, and messages.

As internet good Samaritans, we disclosed these issues to the app developers. Some were fixed pretty quickly, some are in the process of being fixed, and a few app developers were completely non-responsive. Some even insisted that these were non-issues. To protect the innocent, we’ll be holding some things back. For the purpose of education and driving awareness that these issues exist, we will only be disclosing things that have been fixed, have been deemed “acceptable” by the developers, or seem to have been completely ignored.

####

Location Data

Traditionally, most dating apps ask you a series of questions about yourself, your interests, and your ideal mate. Using that information, they attempt to match you up with people their matchmaking algorithms deem compatible with you, or people you may have found via search. Nowadays, many of them also pull this information directly from your social graph by authenticating via your Facebook, Twitter, or Google+ account.

So what really makes mobile dating different? For starters, your location. Where you are right now, where you’ve been, and where you’re going are all important. Most of the dating apps we looked at (over 75%) requested access to your location. While mobile browsers support accessing user geolocation data, mobile apps and mobile SDKs provide stronger capabilities for interacting with this data. This includes providing background updates as well as timely notifications such as when you’re within a few hundred feet of your dream girl (or guy).
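For example, on Android, an app that holds the ACCESS_FINE_LOCATION permission can register for continuous location updates with just a few lines of code. The sketch below is illustrative only; the class name, update intervals, and upload helper are assumptions, not taken from any app we tested.

    import android.app.Activity;
    import android.content.Context;
    import android.location.Location;
    import android.location.LocationListener;
    import android.location.LocationManager;
    import android.os.Bundle;

    public class NearbySinglesActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            LocationManager lm = (LocationManager) getSystemService(Context.LOCATION_SERVICE);
            // Ask for a fix at most every 60 seconds, or whenever the user moves 50 meters.
            lm.requestLocationUpdates(LocationManager.GPS_PROVIDER, 60000, 50, new LocationListener() {
                @Override
                public void onLocationChanged(Location location) {
                    // A typical dating app ships these coordinates straight to its backend.
                    uploadLocation(location.getLatitude(), location.getLongitude());
                }
                @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
                @Override public void onProviderEnabled(String provider) { }
                @Override public void onProviderDisabled(String provider) { }
            });
        }

        private void uploadLocation(double latitude, double longitude) {
            // Hypothetical helper: POST the coordinates to the app's backend service.
        }
    }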

Most of the geolocation exposure issues we discovered were available in a user’s public profile. Any authenticated (or sometimes unauthenticated) user had the ability to access this information and track it in realtime. In addition to geolocation data, email addresses, social network IDs (your real identity), birthdays, phone numbers, or home addresses were also found to be leaking in several applications within public profiles. For sites that guarantee anonymity, this is a bad thing.

Mobile dating is all about instant gratification. In order to unlock the full capabilities of these apps, your location is important. Typically, an app will pull your location and send a request to a backend service to update your location and potentially return results of people within a certain distance from you.

When you provide your current location, the service returns a list of the users closest to you who may have similar interests. One thing to note is the distance, which can almost always be used to triangulate a user’s current location. In fact, we typically encountered a few issues that could be used to figure out where you are or where you’ve been:

  • Returning the user’s actual GPS coordinates (latitude and longitude)
  • Returning the precise distance between you and another user
  • Geolocation EXIF information not removed from user-uploaded images (e.g. your bathroom selfies)

In the first scenario, many of the apps we looked at returned a user’s last updated GPS coordinates. The exposure was sometimes intentional and sometimes not. Where it was intentional, there was usually a feature that allowed you to opt out, but more often than not, users’ locations were exposed by default.

However, not all apps that exposed a user’s location did it intentionally. In the unintentional bucket, there were typically two root causes that we identified:

  • Automatically binding the entire user model to JSON and failing to take into account that the location was included
  • Intentionally returning the data but not displaying it, erroneously thinking that this information still couldn’t be recovered

Most of the popular web MVC frameworks provide built-in capabilities, or leverage popular libraries, that allow simple binding of objects to JSON or XML responses. This behavior cuts down on a significant amount of code bloat but also leads to many programmatic flaws along the way. An example of this is using the Rails as_json method. Care should be taken to ensure that you aren’t exposing parts of your model that you wouldn’t want an attacker to know about. This includes password hashes (yes, we saw those too), geolocation information, birthdays, and email addresses.
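In Rails, the usual fix is to pass explicit :only/:except options to as_json. The same class of bug shows up with Java serializers as well; the sketch below uses Jackson with a hypothetical User model (not taken from any of the apps we tested) to show how serializing an entity as-is leaks everything, and how annotating sensitive fields keeps them server-side.

    import com.fasterxml.jackson.annotation.JsonIgnore;
    import com.fasterxml.jackson.databind.ObjectMapper;

    public class User {
        public long id;
        public String displayName;

        @JsonIgnore public String passwordHash;   // never belongs in an API response
        @JsonIgnore public double latitude;       // expose a coarse city name instead
        @JsonIgnore public double longitude;
        @JsonIgnore public String birthday;

        public static void main(String[] args) throws Exception {
            User u = new User();
            u.id = 42;
            u.displayName = "coffee_fan_88";
            u.passwordHash = "not-a-real-hash";
            u.latitude = 38.8895;
            u.longitude = -77.0353;
            u.birthday = "1985-06-23";
            // Without the @JsonIgnore annotations, every field above would be
            // written into the public profile response.
            System.out.println(new ObjectMapper().writeValueAsString(u));
            // Prints something like: {"id":42,"displayName":"coffee_fan_88"}
        }
    }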

One such issue was discovered and reported to HowAboutWe, and their development team closed it almost immediately: a simple request for a public profile returned both the user’s current GPS coordinates and their birthday.


The other scenario we encountered is where a developer returned the same verbose user model data for multiple request types, picking and choosing which parts to consume within the application for each response. In this scenario, they return more than they need simply out of convenience. These developers may not have fully understood that an attacker could still intercept and view this data if it was being leaked back to the client, even if their application did not display it directly to the user. Intercepting proxies such as Burp and ZAP allow an attacker to intercept all web traffic between a mobile app and a backend web service, even if SSL is in place. There are techniques, such as certificate pinning, that make this type of self-interception more difficult, but generally these techniques can be bypassed.

The distance between yourself and another user can be used to triangulate their actual location, even if the application doesn’t return their GPS coordinates. As an example, the Skout application returned another user’s exact distance from you, with very high precision, within their public profile.

How is it possible to turn the distance into an exact location? The distance is calculated from your current position. If your position changes, so will the distance between you and your target user. By manipulating your position and triangulating each point, you can narrow the person’s location down to a range of only a few feet. This issue was pretty prevalent within the set of applications we looked at, and in most cases it was very similar, if not identical, to an issue found in Tinder a few months ago. Apparently, no one was paying attention.
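To make the math concrete, here is a minimal sketch of the attack: spoof your own position three times, record the distance the service reports to the same target from each spot, and solve for the single point that is consistent with all three readings. The coordinates and distances below are made up, and the code works on a flat-plane approximation, which is fine over a few miles.

    // Sketch: recover a target's position from three "distance to target" readings
    // taken from three different attacker positions. Coordinates are meters
    // east/north of an arbitrary local origin.
    public class Trilateration {

        // Solve (x - xi)^2 + (y - yi)^2 = ri^2 for i = 1..3 by subtracting the
        // equations pairwise, which leaves two linear equations in x and y.
        static double[] locate(double x1, double y1, double r1,
                               double x2, double y2, double r2,
                               double x3, double y3, double r3) {
            double a = 2 * (x2 - x1), b = 2 * (y2 - y1);
            double c = r1 * r1 - r2 * r2 - x1 * x1 + x2 * x2 - y1 * y1 + y2 * y2;
            double d = 2 * (x3 - x2), e = 2 * (y3 - y2);
            double f = r2 * r2 - r3 * r3 - x2 * x2 + x3 * x3 - y2 * y2 + y3 * y3;
            double det = a * e - b * d;  // zero only if the three points are collinear
            return new double[] { (c * e - b * f) / det, (a * f - c * d) / det };
        }

        public static void main(String[] args) {
            // Hypothetical readings: the API reported these distances to the same
            // target while we "stood" at three different spots.
            double[] target = locate(0, 0, 1000,       // observed from the origin
                                     1500, 0, 1204,    // observed 1.5 km to the east
                                     0, 1500, 922);    // observed 1.5 km to the north
            System.out.printf("Target is ~%.0f m east, ~%.0f m north of the origin%n",
                    target[0], target[1]);
        }
    }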

Another method for figuring out a user’s location and potentially their identity is to use EXIF information extracted from their uploaded images. EXIF information typically includes data such as the time a picture was taken, the type of camera, geolocation, and many other fields. By default, the iPhone and some Android phones geotag pictures taken with your phone. When these images are uploaded, this information isn’t always stripped, allowing for someone to extract the location the picture was taken. There are plenty of open source and free tools that can extract this information in an easily readable format.
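As a quick illustration, a few lines of Java using the open source metadata-extractor library (assuming its 2.x API; the file name is a placeholder) are enough to pull the coordinates out of a geotagged photo:

    import java.io.File;

    import com.drew.imaging.ImageMetadataReader;
    import com.drew.lang.GeoLocation;
    import com.drew.metadata.Metadata;
    import com.drew.metadata.exif.GpsDirectory;

    public class ExifPeek {
        public static void main(String[] args) throws Exception {
            // Read all metadata embedded in the uploaded image.
            Metadata metadata = ImageMetadataReader.readMetadata(new File("bathroom-selfie.jpg"));
            GpsDirectory gps = metadata.getFirstDirectoryOfType(GpsDirectory.class);
            if (gps != null && gps.getGeoLocation() != null) {
                GeoLocation where = gps.getGeoLocation();
                // Prints the latitude/longitude where the photo was taken.
                System.out.println(where.getLatitude() + ", " + where.getLongitude());
            } else {
                System.out.println("No GPS data embedded in this image.");
            }
        }
    }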

####

Transport Layer Protection

Insufficient Transport Layer Protection was another issue found to be pretty prevalent within the pool of mobile dating apps we examined. This is described well by OWASP. Most of the issues we found fell into a few different areas:

  • No SSL at all, and no attempt at it
  • Encrypting only the login function
  • Encrypting the app’s own traffic, but using ad or analytics libraries that don’t
  • Certificate validation flaws

We were surprised to see some of the larger apps with significant user bases vulnerable to these problems. A few sites that we found to be sending your data over the internet insecurely include:

  • Match.com
  • Christian Mingle
  • Plenty of Fish
  • Skout

We received a response from Christian Mingle, where their support team gave us the standard-issue “go away, security” reply.

However, clearly that’s not the case; the app was still sending user data in the clear.

Many apps also implemented ad networks and analytics tools. A considerable number of these libraries leaked identifying information about the users to third parties, often over an insecure communication channel. The majority of apps leaked at least one piece of identifying information to a third party, including:

  • Your actual credentials
  • Geolocation data
  • Your MAC address
  • Your device ID
  • Your sexual preference

Skout, for example, sent your GPS coordinates to an ad network in plain text.

In the cases where an application did use SSL, the implementation wasn’t always perfect. Certificate exceptions were frequently handled by silently ignoring the errors, within both iOS and Android applications. This issue usually pops up as a result of a developer using a self-signed certificate in development and not understanding the impact of doing this in a production environment. An example of this was found within the Jaumo app on Android, in its TrustManager implementation.
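The anti-pattern generally looks like the sketch below. This is a representative reconstruction, not Jaumo’s actual decompiled source: every check simply returns, so the client will happily complete a handshake with anyone who terminates the connection, including a man in the middle.

    import java.security.cert.X509Certificate;
    import javax.net.ssl.X509TrustManager;

    // A "trust everything" TrustManager: the checks silently pass no matter
    // what certificate the server (or an attacker) presents.
    public class TrustAllTrustManager implements X509TrustManager {
        @Override
        public void checkClientTrusted(X509Certificate[] chain, String authType) {
            // no-op: any client certificate is accepted
        }

        @Override
        public void checkServerTrusted(X509Certificate[] chain, String authType) {
            // no-op: any server certificate is accepted, including a forged one
        }

        @Override
        public X509Certificate[] getAcceptedIssuers() {
            return new X509Certificate[0];
        }
    }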

In addition to SSL implementation vulnerabilities, we also found several best practices completely absent across the board: certificate pinning wasn’t in use anywhere, and Perfect Forward Secrecy wasn’t deployed anywhere.
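Pinning doesn’t have to be painful. As one illustration (not something any of the apps we reviewed actually did), OkHttp ships a CertificatePinner that rejects any certificate chain whose public key hash doesn’t match a value baked into the app; the hostname and pin below are placeholders.

    import okhttp3.CertificatePinner;
    import okhttp3.OkHttpClient;

    public class PinnedClient {
        public static OkHttpClient build() {
            // Pin the SHA-256 hash of the backend's public key. The hostname and
            // pin value here are placeholders, not a real service or key.
            CertificatePinner pinner = new CertificatePinner.Builder()
                    .add("api.example-dating-service.com",
                         "sha256/AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=")
                    .build();
            return new OkHttpClient.Builder()
                    .certificatePinner(pinner)
                    .build();
        }
    }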

####

Local Data

In addition to being able to de-anonymize you and intercept your data, a significant amount of information about you is stored on your device. This includes your credentials, location history, message databases, and more. Some of the apps we found to be the worst culprits include:

  • Match.com
  • Skout
  • Ashley Madison
  • Singles Around Me

We found very few applications that implemented a local app passcode lock. The purpose of having this in place is to prevent your significant other, or anyone else who is curious, from reading your data if you left your phone unlocked or if they knew your device passcode. This could be implemented as a 4-digit PIN or a longer string. An implementation of this concept for iOS can be found here.

The Ashley Madison Black Book app is used for being “discreet” by using a disposable phone number, “private” text messages, and a “confidential” contacts list. This is the type of app that says “I’m up to no good”, basically.

Black Book was one of the few applications that used a local app passcode lock (a 4-digit PIN), and we found a bunch of implementation flaws in it.

While this is certainly a decent deterrent against the lazy or non-technical, someone with enough motivation and skill plus physical access to your device will likely win. I know for a fact that if my wife ever found this app on my phone, she would try to guess the password until her fingers hurt. Or, maybe she would use a robot.

First, the application doesn’t wipe your data after an excessive number of failed unlock attempts, nor does it give you the option to. With only 10,000 possible combinations, this means your spouse who thinks (and is probably right) that you are up to no good can try again and again and again to unlock this data. I’d recommend not using your anniversary as your passcode.

Second, both your local passcode as well as your account credentials are stored locally. Your account credentials are stored in plain text, while the app attempts to encrypt your stored local passcode. If you are one of those cool kids who uses a rooted or jailbroken device, you make it even easier to recover this information. The internet is filled with tutorials and 1-click tools to make this easy for anyone semi-technical and motivated.

Your local passcode is stored encrypted on the device. While this is only a 4-digit PIN, changing it to a longer, more complex value wouldn’t help much, because it is symmetrically encrypted with AES using a key hardcoded in the app, plainly visible in the decompiled com.hushed.base.providers.SecurityProvider.java class.
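The pattern looks roughly like the sketch below; it is a representative reconstruction of the anti-pattern rather than the actual decompiled source, and the key shown is a placeholder. Because the key ships inside the APK, anyone who can read the app’s files can decrypt the stored passcode.

    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    public class PasscodeCrypto {
        // The fatal flaw: the AES key is a constant baked into the app, so it is
        // identical for every user and trivially recovered from decompiled code.
        // (This key is a placeholder, not the one we actually found.)
        private static final byte[] HARDCODED_KEY = "0123456789abcdef".getBytes();

        public static byte[] encryptPasscode(String passcode) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(HARDCODED_KEY, "AES"));
            return cipher.doFinal(passcode.getBytes());
        }

        public static String decryptPasscode(byte[] ciphertext) throws Exception {
            // Anyone with the APK has the key, so "decryption" is a one-liner.
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(HARDCODED_KEY, "AES"));
            return new String(cipher.doFinal(ciphertext));
        }
    }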

Within apps like Skout, your password is stored in plain text in both iOS .plist files and Android’s SharedPreferences.
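On the Android side, this kind of storage is usually nothing more exotic than the following (the preference file and key names are illustrative). The value ends up as readable XML under /data/data/<package>/shared_prefs/.

    import android.content.Context;
    import android.content.SharedPreferences;

    public class CredentialStore {
        // Writes the password, as-is, into an XML file under the app's data
        // directory. Anyone with root, a backup, or physical access can read it.
        public static void saveCredentials(Context context, String username, String password) {
            SharedPreferences prefs = context.getSharedPreferences("login", Context.MODE_PRIVATE);
            prefs.edit()
                 .putString("username", username)
                 .putString("password", password)  // stored in plain text
                 .commit();
        }
    }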

We also found a bunch of Plain Text Offenders who email your credentials to you in plain text. If they can send your password to you in plain text, they are storing it either in plain text or with a reversible symmetric encryption scheme, rather than a proper one-way hash. These culprits certainly have some room to tighten up their approach to secure password storage. Match.com, for example, emails your password back to you in plain text.
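For comparison, storing passwords properly takes only a few lines. This sketch uses the open source jBCrypt library, though any adaptive one-way hash such as bcrypt, scrypt, or PBKDF2 works; nothing reversible ever reaches the database, so there is nothing to email back.

    import org.mindrot.jbcrypt.BCrypt;

    public class Passwords {
        // Store only the salted, one-way bcrypt hash of the password.
        public static String hashForStorage(String plaintextPassword) {
            return BCrypt.hashpw(plaintextPassword, BCrypt.gensalt(12));
        }

        // At login, compare the supplied password against the stored hash.
        public static boolean matches(String candidate, String storedHash) {
            return BCrypt.checkpw(candidate, storedHash);
        }
    }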

These are just a few examples of the many problems we found. In our opinion, the current crop of mobile dating apps puts user privacy and safety at great risk. While we can argue that users should read an app’s terms and conditions as well as the end user licensing agreement, in reality, most users don’t read them nor would they understand most of the intentionally vague legalese. Hence, it is up to the developers of these applications to be responsible with the data they collect and how they protect it.

For users of these applications, we recommend a few ways to stay safe:

  • Review the app’s settings and opt out of things that seem questionable

Download the full-sized infographic here.