Today, vibrators and other sex toys are connected to the Internet of Things. Apps can be used to control vibrators, and conversely, apps can receive usage and preference information from them. We’ve all heard about data security issues and data breaches. Remember the AshleyMadison.com fiasco? More recently, there have been reports that AdultFriendFinder.com may have been hacked as well. Now, what would happen if intimate and private information, like the date and time you used your vibrator, the setting you had it on, or the interactions between you and your partner using remote sex devices, were shared with other people or collected by the vendor? These are not sci-fi pontifications about the future but real-world problems of today, with serious implications for both the consumers and the producers of sex toys and sex tech products.
On Friday, September 2, 2016, a class-action suit was filed against Standard Innovation, the maker of the “We-Vibe”, a high-end remote-controlled vibrator, and the associated “We-Connect” app (see the photo below from the We-Vibe website). An anonymous plaintiff, N.P., brought the suit in the U.S. District Court for the Northern District of Illinois, Eastern Division, seeking damages for the defendant’s alleged collection, unbeknownst to users, of usage information from the vibrators via the app.
The complaint alleges that the defendant programmed the app to:
…secretly collect intimate details about its customers’ use of the We-Vibe, including the date and time of each use, the vibration intensity level selected by the user, the vibration mode or pattern selected by the user (collectively, the “Usage Information”), and incredibly, the email address of We-Vibe customers who had registered with the App, allowing Defendant to link the usage information to specific customer accounts.
In the lawsuit, the plaintiff seeks damages for: (i) violation of the Federal Wiretap Act; (ii) violation of the Illinois Eavesdropping Act; (iii) violation of the Illinois Consumer Fraud and Deceptive Business Practices Act; (iv) intrusion upon seclusion; and (v) unjust enrichment. According to a recent court filing, though, the parties have mediated the issue and are drawing up settlement papers. [IMPORTANT UPDATE ON STATUS SINCE ORIGINAL PUBLICATION IS AT THE END OF THE ARTICLE BELOW]
The circumstances of this case raise a multitude of questions in my mind, and I’m sure in yours as well. For insight into the technical aspects of data security and sex toys, I turned to RenderMan, the founder of the “Internet of Dongs” (or “IoD”) project. The project focuses on the detection (and, eventually, rectification) of data security issues in Internet-connected sex toys. RenderMan, who lives in western Canada, works as a penetration tester for a financial institution and has previous experience in vulnerability detection in air traffic control systems. He runs the IoD as a non-profit endeavor.
ML: Tell me about the Internet of Dongs project.
RM: The Internet of Dongs project (IoD) focuses on one particular branch of the “Internet of Things”. IoT, in a nutshell, means taking a previously unconnected, simple device and adding connectivity for whatever reason: toasters, refrigerators, thermostats, etc. These industries have never had to deal with the threats that come with a connected device, and as a result, they are notorious for terrible security practices. The IoD is the same story: an industry of previously manually operated devices suddenly connected to the Internet by makers with no experience of the threats that come along with that.
ML: What is the danger that information could be intercepted from use of these web-connected sex toys (i.e. teledildonics)?
RM: Once you add connectivity, a whole host of information is generated that you may not want to share with the vendor, other users, or an attacker.
Many of these devices use Bluetooth® Low Energy, connect to your smartphone or tablet, and are controlled by an app. That app then communicates with the rest of the world. Since these vendors do not always realize the implications of what they are implementing, they may generate, transmit, and collect more information than they need. While this information wouldn’t be an issue with most other devices, given the nature of these devices it has the potential to be embarrassing and violating if revealed.
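That over-collection point can be made concrete. Below is a minimal sketch, with entirely hypothetical field names and payload shape (not taken from any real app), of a control message that bundles identifying telemetry alongside the one value the device actually needs:

```python
import json
import time


def build_command(user_email: str, intensity: int) -> bytes:
    """Serialize a hypothetical app-to-server control message.

    The toy itself only needs the intensity value; every other field is
    telemetry the vendor's server could log and later link to a person.
    """
    payload = {
        "email": user_email,            # identifies the user
        "timestamp": int(time.time()),  # reveals exactly when the device was used
        "intensity": intensity,         # the only field the device needs
    }
    return json.dumps(payload).encode("utf-8")


# The serialized command carries far more than a vibration level.
msg = build_command("user@example.com", 7)
```

A privacy-conscious design would send only the control value over the local Bluetooth link and keep identity and timing data off the server entirely.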
Some examples we’ve found are:
– Unrestricted account enumeration against a list of emails. We could check a list of email addresses to see if any of them had an associated account on the app, with no restrictions. I personally found that a few people from my personal email address book had accounts I would not have suspected. Imagine doing this with the addresses of celebrities, politicians, etc.
– Email address disclosure. In a few cases, users could share vibration patterns under an arbitrary (and potentially anonymous) username, but the email address associated with the username was included (though not displayed) in the server response. Obviously this negates the anonymity and privacy of the users.
– Ability to determine “paired” users. With at least one vendor, we could enumerate an account and see which user it was “paired” with, that is, the account authorized to remotely control the device. This is a privacy issue on its own, but combined with any disclosure of personally identifying information, it could reveal that a user’s account is paired with one that may be embarrassing: for example, someone who is not their spouse or significant other.
– Interception of text, audio, and video streams. Many apps include the ability to send text chat, audio, and even two-way video. The privacy implications are obvious if someone were able to intercept and disclose any of those very private conversations, particularly audio and video.
– Remote session hijacking. We’ve not been able to do this *yet*, but we’re pretty sure it will happen: taking control of a device we were not authorized to control, without the user knowing. This is the issue that worries us the most. Imagine that Alice gives Bob permission to remotely control her toy. While they are using it, Eve hijacks the connection and seizes control, without Alice’s consent or permission. By many definitions, this counts as sexual assault and possibly rape. We are entering a time when rape-via-Internet is possible. The courts will have to deal with this eventually, and I’d prefer it not be as the result of someone being harmed.
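The first flaw in that list, unrestricted account enumeration, is simple enough to sketch. In the hypothetical code below, the `lookup` callable stands in for an unauthenticated, unrate-limited account-lookup endpoint of the kind RenderMan describes; no real vendor API is shown.

```python
from typing import Callable, Iterable, List


def enumerate_accounts(emails: Iterable[str],
                       lookup: Callable[[str], bool]) -> List[str]:
    """Return every address the service admits is registered.

    `lookup` models a request to a hypothetical account-lookup endpoint
    that answers "does this email have an account?" with no rate limit
    or authentication -- the flaw that makes enumeration trivial.
    """
    return [email for email in emails if lookup(email)]


# Simulated vulnerable service: a fixed set of registered addresses.
registered = {"alice@example.com", "carol@example.com"}
found = enumerate_accounts(
    ["alice@example.com", "bob@example.com", "carol@example.com"],
    lambda email: email in registered,
)
# found == ["alice@example.com", "carol@example.com"]
```

The defense is as simple to state as the attack: rate-limit lookups, require authentication, and return an identical response whether or not an account exists.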
ML: Interesting examples from an external third party abusing the systems these devices run on. In the We-Vibe case, we see a claim relating to vendor-collected data. Tell us more about what kind of data could be collected and how it could be used.
RM: – Time of day, usage length, etc. Many products collect basic usage data to help vendors improve their products. While it may be useful for the vendor to know how its devices are used, which features are most popular, and so on, end users may not appreciate it. Time of day may suggest infidelity by an account holder. Usage and favorite patterns may not be something a user wants anyone to know. This is all compounded if the data is not properly anonymized and its collection is not disclosed to users.
– Retained records as evidence. Retaining records, even anonymized ones, can lead to some very awkward and unintended uses. It’s not difficult to imagine the day a divorce lawyer puts into evidence usage logs showing the device being used while a spouse was out of the house. I’m sure that someday these records will be used to establish an alibi in a criminal case (“See, Judge, the time, date, and geolocation information from these logs proves I was at home masturbating at the time of the crime”).
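RenderMan’s caveat about data that is “not properly anonymized” is worth dwelling on. A common shortcut, replacing an email address with its hash, is pseudonymization, not anonymization: anyone holding a candidate list of addresses can hash each one and match it against the “anonymous” records. A minimal sketch (the scheme is hypothetical, not any vendor’s actual practice):

```python
import hashlib
from typing import List, Optional


def pseudonymize(email: str) -> str:
    """Replace an address with its SHA-256 digest -- a naive 'anonymization'."""
    return hashlib.sha256(email.strip().lower().encode("utf-8")).hexdigest()


def deanonymize(digest: str, candidates: List[str]) -> Optional[str]:
    """Re-link a hashed record by hashing each candidate and comparing."""
    for email in candidates:
        if pseudonymize(email) == digest:
            return email
    return None


# A usage record keyed by a hashed email is trivially re-linked.
record_key = pseudonymize("alice@example.com")
who = deanonymize(record_key, ["bob@example.com", "alice@example.com"])
# who == "alice@example.com"
```

Genuine anonymization requires aggregating records so that no per-user identifier, hashed or otherwise, survives in the retained data.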
ML: How can people protect themselves when using such devices?
RM: Consumers of these devices don’t currently have a great deal of recourse to protect themselves. These devices are not regulated like medical devices or vehicles. This is where we are stepping in. By pointing out to the industry that security and privacy matter to consumers, we hope to raise the bar and simply let market forces decide. We are here to provide help and advice to everyone.
Concerned consumers can choose between a vendor that has committed to openness, proactivity, and the security and privacy of its devices and one that has not. If a vendor does not make that commitment, it’s pretty obvious what will happen to its market share. We are building frameworks, checklists, and advice to be published for any vendor to adopt. It’s up to consumers to ask questions about security and privacy, but at the same time, we are looking to provide similar checklists of questions and considerations for the consumer.
For more information on the Internet of Dongs project, visit www.internetofdon.gs. We’ll circle back with RenderMan to delve further into these issues as time and technology progress.
The following excerpts are from Standard Innovation’s privacy notice for the We-Connect app:
Using the We-Connect app
As with many applications, certain limited data is required for the We-Connect app to function on your device. This data is collected in a way that does not personally identify individual We-Connect app users. This data includes the type of device hardware and operating system, unique device identifier, IP address, language settings, and the date and time the We-Connect app accesses our servers. We also collect certain information to facilitate the exchange of messages between you and your partner, and to enable you to adjust vibration controls. This data is also collected in a way that does not personally identify individual We-Connect app users.
Anonymous App Data
We use third party service providers to collect certain analytical information to help us improve our products and the quality of the We-Connect app. We receive this data in an aggregate, anonymous form that does not personally identify any individual We-Connect app user. This anonymous analytical data includes the app features used and time spent on the app.
As part of our commitment to privacy, we enable users of the We-Connect app to opt-out of sharing this aggregate, anonymous data through the We-Connect app Settings under Privacy.
Standard Innovation learned from its mistake and got some good legal advice: policies like these are important because they protect the company by obtaining something approaching “informed consent” from the consumer. The company has also cleaned up some vulnerabilities, according to RenderMan. In light of this case, hopefully others in the industry are also taking a look at their own products to work out any “kinks” (pun intended, as usual) yet to be discovered…
UPDATE: The Court has granted preliminary approval of a settlement under which Standard Innovation will pay 5 million Canadian dollars to the class of plaintiffs. The Court will hold a hearing on final approval on August 7, 2017. To view the settlement proposal document filed March 9, 2017, click here.