Sex + Extortion: A Call for Federal Criminalization of a Rising Cybercrime

“I can never get that photo back. It’s out there forever.”1 These were the words of 15-year-old Amanda Todd as she documented her story of bullying, harassment, and extortion on YouTube.2 Todd used flashcards to narrate how she became a victim, detailing how her aggressor tormented her by posting a compromising photo of her on the Internet after she refused to give in to his sexual demands.3

While Todd was struggling to find an escape route from her tormentor, Luis Mijangos was being prosecuted in the United States for hacking into the computers of approximately 230 victims and blackmailing them for sexual material.4 A federal investigation revealed that the 32-year-old paraplegic had more than 15,000 webcam captures, 900 audio recordings, and 13,000 screen captures of his victims saved on his computer.5

Todd committed suicide just over a month after posting the 2012 YouTube video.6 To date, the Canadian teenager’s video7 imploring viewers for help and support has garnered more than eleven million views.8 Todd’s suspected tormentor is a 38-year-old Dutch man, who will face separate charges for blackmail and distribution of child pornography in the Netherlands before being extradited to Canada to stand trial.9 Mijangos, by contrast, whose blackmailing and harassment scheme reached as far as New Zealand, was sentenced to six years and is scheduled to be released next year.8

Both cases highlight the devastating and egregious effects of the cybercrime phenomenon informally known as “sextortion.” Sextortion is the use of coercion and intimidation to force victims to produce sexual material and satisfy demands for sexual favors.9 With the advent of new technologies and cyberspace, clear and defined borders have lost much of their meaning as human interactions over the World Wide Web have increased exponentially. Sextortion is therefore a sex crime that transcends national borders: “For the first time in the history of the world, the global connectivity of the Internet means that you don’t have to be in the same country as someone to sexually menace that person.”10

Given the serious privacy issues that arise in connection with this cybercrime, one would think the subject has been thoroughly addressed. On the contrary, sextortion remains heavily understudied.11 Although the underlying conduct is nothing new, sextortion has only recently garnered media attention as a growing threat,12 and a proposal for federal criminalization of sextortion was introduced just this July.13

Although sextortion is recognized as criminal conduct in the U.S., no state or federal statute specifically classifies it as such.14 Instead, sextortion is prosecuted under a myriad of federal and state laws concerning extortion, cyber hacking, and child pornography, producing inconsistent sentencing across jurisdictions.15 Furthermore, because of the Government’s strong interest in protecting minors, child pornography laws produce widely divergent sentencing results in federal courts between sextortion cases involving minors and those involving adults.16

Although much remains to be done to tighten cyber security and online privacy, the passage of the Interstate Sextortion Prevention Act, which calls for the federal criminalization of sextortion, would be a first step in that direction.17

Security Overhaul in the iPhone 7

On September 7, 2016, Apple, Inc. (“Apple”) revealed its much-anticipated new product, the iPhone 7.1 The reveal was met with mixed reviews, with much of the focus on the company’s decision to remove the 3.5mm headphone jack in an attempt to push users to wireless headphones.2 Lost in the debate over whether Apple made a mistake in removing the headphone jack, however, was the overhaul Apple had made to the security of its products.

In February 2016, a United States Magistrate Judge issued an order pursuant to the All Writs Act directing Apple to assist the Federal Bureau of Investigation (“FBI”) in bypassing the passcode security feature of an iPhone 5c.3 The order requested that Apple develop a new version of the iPhone’s operating system that would allow the FBI to circumvent the phone’s encryption and security systems.4 Apple declined, and CEO Tim Cook released a letter to Apple’s customers addressing the order, insisting that the company would not acquiesce to the request or honor any similar requests in the future.5 Cook stated, “In today’s digital world, the ‘key’ to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”6
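Cook’s point, that an encrypted system is only as secure as the protections around its key, can be illustrated with a short sketch. The snippet below is a minimal illustration using the third-party Python cryptography library, not Apple’s actual encryption scheme; it simply shows that whoever holds the key (or a bypass) can read everything.

```python
# Minimal illustration of symmetric encryption, assuming the third-party
# Python "cryptography" package; this is NOT Apple's actual scheme.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # the "key" to the encrypted system
token = Fernet(key).encrypt(b"private photos, messages, health data")

print(token[:20])            # ciphertext: unreadable without the key

# Any party holding the key -- owner, vendor, or attacker -- can decrypt.
# Once the key leaks or a bypass exists, the encryption is defeated.
print(Fernet(key).decrypt(token))
```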

As a result of Apple’s refusal to create the software, Apple and the FBI were set to appear in court on March 22, 2016. However, the FBI claimed to have found a third party capable of bypassing the phone’s security and subsequently withdrew its request.7 While the case ended when the request was withdrawn, the issue continues to be a hot topic in the world of technology. In response to the standoff between Apple and the FBI, many major competitors, such as Facebook and Google, pledged their support for Apple and vowed to implement their own versions of encryption in their software.

In the wake of the case, Apple made a major commitment to the security and encryption of its software and products.8 Apple has gone on to state that it believes it is the “most effective security organization in the world.”9 With the release of the newest iPhone and operating system, Apple has added many features aimed at better protecting its customers’ data and information. iPhones will now utilize the Apple File System (“APFS”), which “improves the way information is organized and protected to make it faster and more secure.”10 APFS introduces new encryption and security features that will make it much more difficult for hackers to access information stored on Apple products.

As Apple continues to introduce new innovations in the way it protects its customers’ data and information, the FBI will find it increasingly difficult to bypass those security features and access the information it seeks. Undoubtedly, it will be very interesting to see where the battle over information between technology companies and the FBI goes from here.

Privacy Concerns and Fitness Trackers in the Workplace

There has been an enormous increase in recent years in the number of people utilizing “wearable technology.”1 Wearable technology can be described as devices that have the ability to “collect data [and] track activities.”2 Fitness trackers, including the Fitbit3 and the Jawbone4, have been part of this market growth.5

Employers have latched on to this growing trend by encouraging,6 and sometimes mandating,7 employees to wear these devices as part of a health and wellness program.8 Approximately ninety percent of companies offer wellness programs,9 and about forty to fifty percent use fitness trackers as part of these programs.10 There are financial incentives for employers to encourage, or mandate, that employees wear these trackers.11 Similarly, there are incentives for employees to take part in health programs that utilize fitness trackers, because of reductions in the price of employees’ health care plans.12

There are concerns, however, due to the tracking component of these fitness devices. Fitbit, for example, enables constant tracking,13 which can benefit employees who wish to track their activity, food, and exercise.14 However, there are also concerns about employees’ privacy and the data that these devices collect.15 Employers will have access to ample information about employees collected through these tracking devices.16

Moreover, there are additional concerns that employers could use the information gathered from the tracking devices to factor into employment decisions, including raises and promotions.17 There could be a new wave of litigation from less active, or disabled, employees if employers were to use this data in job performance reviews.18

Additionally, many of the companies that develop these fitness trackers, including Fitbit19 and Apple20, sell the data collected from these devices to employers and third parties. There are additional concerns that these wearables can track employees’ locations and may have audio and video recording features.21 Employers could potentially track employees, and their exact locations, by “spying” on them in and out of the workplace.22 This has created an area of privacy law in which consumer information is unprotected.23

There are many benefits for employees who use wearable technology, including sleep, activity, and health and wellness management.24 Additionally, certain professions, such as medicine, may be drastically improved by wearable technologies.25 However, there is little, if any, legislation protecting employees.26 Currently, the Health Insurance Portability and Accountability Act, the Americans with Disabilities Act Amendments Act, the Electronic Communications Privacy Act, and the Computer Fraud and Abuse Act do not protect employees from this very specific form of health data collection.27

The only way to protect employees who wish to use these trackers during their employment is to create specific rules, laws, and guidelines for employers. This would regulate what employers can and cannot do with the information they collect. Further, employees should be made aware of what data is being collected from them and who is able to access it, opting either to allow or disallow the collection of their information.

Privacy Victory or Criminal Loophole?: Implications of Second Circuit’s Decision in Microsoft v. USA

Many are calling the recent Second Circuit decision in Microsoft Corp. v. USA “a victory for privacy.”1 The Second Circuit ruled that a Stored Communications Act (SCA) warrant does not compel production of email content stored exclusively on foreign servers.2 The Stored Communications Act was passed in 1986 with the intent “to protect the privacy of digital communications.”3 The SCA allows a customer to sue a service provider that discloses private data, unless the disclosure was made in “good faith reliance on a warrant, order, or subpoena.”4

After asserting that a Microsoft email account was being used to facilitate drug trafficking, federal prosecutors in New York served a search warrant on Microsoft Corporation seeking disclosure of “information associated with a particular individual’s email address, including the email contents.”1 Microsoft complied in part with the search warrant and provided the basic “information about the customer that was being stored” on its United States servers.5 However, Microsoft refused to provide the email contents, asserting they were stored on its servers in Dublin, Ireland.6 Microsoft justified its refusal to fully comply with the search warrant on the grounds that the court did not have the authority to compel the production of data maintained outside of the United States.7 The Second Circuit agreed.

The Court reasoned that traditional federal rules dictate that warrants issued by the courts only permit law enforcement officials to search property within the boundaries of the United States.8 Even if the property the government seeks is electronic in nature, the same rules apply. The Second Circuit’s ruling demonstrates the United States’ desire to avoid interfering with the laws of foreign countries.9 Foreign countries should be free to utilize services and technology of American tech companies without having to answer to the United States government.10

On its face, this may seem like a victory for privacy, but what are the negative implications of the decision? In theory, it has created a loophole that will allow criminals to conduct illegal business activity through email. Upon registering for a Microsoft email account, an individual is prompted to enter his or her location, and Microsoft takes this information at face value.11 Based on the location entered by the customer, Microsoft then stores the customer’s email contents in a “data center assigned to that country.”12 As long as a criminal enters a location outside of the United States, the contents of his or her emails will remain beyond the reach of a search warrant. The only way U.S. law enforcement will be able to access the emails it seeks is by collaborating with law enforcement officials in the country where the data is stored.13 This could significantly thwart law enforcement’s ability to investigate illicit activity. In addition, the decision could have a significant impact on U.S. service “providers’ decisions to exclusively store information abroad.”14 More U.S.-based service providers may opt to store data exclusively on foreign servers to protect their customers’ privacy rights and to decrease the likelihood of U.S. government interference with those rights. [Id.]
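The mechanics the court’s reasoning turns on can be sketched in a few lines. This is a hypothetical simplification: the mapping, region names, and function below are illustrative assumptions, since Microsoft’s actual provisioning logic is not public; the point is only that the self-reported location is taken at face value.

```python
# Hypothetical sketch of location-based mailbox provisioning. The mapping,
# names, and default are illustrative assumptions, not Microsoft's code.
DATACENTER_BY_COUNTRY = {
    "United States": "us-east",
    "Ireland": "dublin",
    "Germany": "frankfurt",
}

def assign_mailbox(reported_country: str) -> str:
    # The self-reported location is taken at face value -- no verification.
    return DATACENTER_BY_COUNTRY.get(reported_country, "us-east")

# A criminal who simply types a foreign country at sign-up ends up with
# email content stored abroad, beyond the reach of an SCA warrant.
print(assign_mailbox("Ireland"))  # dublin
```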

Weighing the Costs of Privacy and Security

Security. For most people, this means putting their money in a bank, having basic home security, and using their birthday as a computer password. A simple concept, and for the most part an inexpensive one.

On a national scale, however, security becomes prohibitively expensive, not only in money but in costs to citizens’ privacy and liberty. These costs were highlighted when, in the wake of the San Bernardino terror attack, the Department of Justice (DOJ) demanded under the All Writs Act of 1789 that Apple create a program that would allow the DOJ to access the information on a San Bernardino terrorist’s phone.1 When Apple refused, the DOJ obtained a federal court order to compel Apple to produce such a program.2

This particular request is almost unrivaled in its audacity compared to previous demands by the United States Government.3 While national security is vital to our nation’s interests, this grab at power by our government ups the ante. It does not merely limit private citizens, corporations, and entities in a preventative sense; it forces them to act.

There are three problems with this. First, providing the government with access to anyone’s private phones, computers, and documents at any time may violate the Fourth Amendment.4 Second, forcing a citizen or entity to act affirmatively can violate basic freedom and liberty under the First and Fifth Amendments.5 Finally, when the executive branch gets to decide what can be demanded from a citizen in the name of homeland security, judicial oversight is limited.6 The only remaining question would then be what your country can request, or demand, from you. If you ask Stalin or Mussolini, a lot.

Until now, and even with the recent decision in Sebelius,7 the government could compel private citizens to act only in very limited circumstances. Compelling private entities to create and do things as the government wishes, however, is a vastly enlarged scope of government power with untold consequences. Of course, anything done will be in the name of national security and the United States’ interest. But the key word is “national” security: this is for the “greater good,” not the individual citizen. Further, everything has a good reason and a real reason, and under rational basis review, all the government needs to put forth to compel citizen action is the good reason.

Also, as the debacle between Edward Snowden and the NSA demonstrated,8 once the government has power, it cannot simply be assumed that the government will use that power appropriately and fairly. If the government can compel entities to produce programs, what else can it compel? Can the government demand that every citizen register and produce the keys and passwords to their home, car, personal locker, computer, and phone “just in case” the government needs to enter? True, in this instance the request was part of an investigation, but it did not apply only to that specific security situation; it encompassed the privacy of millions of users. Further, it demanded that Apple affirmatively act.

In conclusion, given the amount of access millions of users sign away in phone contracts and in signing up for the latest apps, one has to wonder how much we as a society truly value our privacy, and therefore how much such privacy concerns should weigh against our national security needs.

The Use of StingRays – Constitutional?

We are not talking about the stingray found in the ocean. We are talking about a new, emerging piece of technology that can spy on, record, and track people via their cell phones. This technology is called the StingRay, and its widespread use has been kept under wraps for almost two decades.1 Police have been using this spy tool, most often without a proper warrant or permission from supervising officers.2

As it stands today, sixty government agencies in twenty-three states employ StingRay technology, allowing police to grab cell phone data, text messages, and more.3 Those states are Washington, California, Idaho, Arizona, Texas, Tennessee, Georgia, Florida, Louisiana, Delaware, Oklahoma, Michigan, Wisconsin, Illinois, Minnesota, North Carolina, Maryland, Pennsylvania, New York, Virginia, Massachusetts, Hawaii, and Missouri.4

This portable, high-tech scanning device works by masquerading as a cell tower and is usually mounted in a police vehicle.5 Cell phones constantly seek the nearest cell tower, even when they are not in use.6 A phone near a police StingRay may connect to it and route its data through the device.7 The data is relayed to a connected laptop, which displays and translates it for the officers.8 The data is then passed on to an actual cell tower, so the phone’s user never knows the difference.9 Police can obtain the identity of the phone’s user, call records, voicemails, text messages, the location of the connected phone, and much more.10 When a StingRay is used to track a suspect’s cell phone, it also gathers information about the phones of countless bystanders who happen to be nearby, even if they are not targets of the surveillance.11 In essence, StingRays are invasive cell phone surveillance devices that mimic cell phone towers and send out signals to trick cell phones in the area into transmitting their locations and other identifying information.
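The luring step can be modeled as a toy simulation, shown below: phones simply attach to whichever base station advertises the strongest signal, so a device broadcasting more strongly than the legitimate tower captures every phone in range. This is a conceptual sketch only; real cellular attach procedures are considerably more complex.

```python
# Toy model of the StingRay's luring step: a phone attaches to whichever
# "tower" advertises the strongest signal. Conceptual sketch only; real
# cellular protocols are considerably more complex.
towers = [
    {"name": "carrier-tower", "signal_dbm": -85, "legitimate": True},
    {"name": "stingray",      "signal_dbm": -60, "legitimate": False},
]

def attach(phone_id: str) -> dict:
    best = max(towers, key=lambda t: t["signal_dbm"])
    if not best["legitimate"]:
        # The device sees the phone's identifiers and traffic before
        # relaying them to a real tower, so the user notices nothing.
        print(f"{phone_id} captured by {best['name']}")
    return best

# Every phone in range attaches -- suspect and bystanders alike.
for phone in ["suspect-phone", "bystander-1", "bystander-2"]:
    attach(phone)
```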

The use of these devices by government agencies amounts to warrantless cell phone tracking, as they have frequently been used without informing the courts or obtaining a warrant.12 The Fourth Amendment protects the public from warrantless searches.13 As such, the use of StingRays for warrantless cell phone searches should be held unconstitutional. In Kyllo v. United States, the Supreme Court ruled that thermal imaging of Kyllo’s home constituted a search within the meaning of the Fourth Amendment.14 Because the police did not have a warrant when they used the device, which was not commonly available to the public, the search was presumptively unreasonable and therefore unconstitutional.15 The majority reasoned that a person has an expectation of privacy in his or her home, and the government therefore cannot conduct unreasonable searches, even with technology that does not physically enter the home.16 Justice Scalia discussed how future technology could invade one’s right to privacy and authored the opinion so that it would protect against more sophisticated surveillance equipment.17 Accordingly, Justice Scalia asserted that the difference between “off the wall” surveillance and “through the wall” surveillance was non-existent, because both methods intruded upon the privacy of the home.18

Here, the facts about the use and operation of StingRay technology are analogous to the use of thermal imaging. Justice Scalia warned the American people about the more sophisticated, invasive technology of the future,19 and here it is: StingRay devices. For the first time, a federal judge, Judge Pauley, has put everyone on notice about warrantless searches by kicking DEA StingRay evidence to the curb, deeming its warrantless use unconstitutional.20

At this point, the chief problem going forward is that citizens subjected to StingRay technology may have a difficult time defending their right to privacy and their constitutional right against warrantless searches. The main reason is that most people are unaware of the very existence of this technology and often unknowingly fall victim to this “all-you-can-eat data buffet.” The proper call to action, then, is for all police departments to abandon this technology, as it directly violates each American citizen’s right against unreasonable searches and invades their privacy. It is, of course, hard to rule something unconstitutional if the case never reaches the Supreme Court. Our challenge then becomes: how do we as American citizens protect our rights against invasions of privacy and unreasonable searches if we do not know our rights are being violated? StingRays are so secretive, and are deemed classified information by the FBI,21 that moving forward this will be an arduous task for the American people.

NY, COPPA, and the Fight to Protect Children from Online Tracking

The grand realm of the Internet has regularly created new challenges over the years. The latest trend in online fiascos concerns the use of tracking technologies on websites. For years, social media companies, such as the world-renowned Facebook, have been collecting data from vast numbers of individuals.1 Although this has been a major issue for some time, a more dire concern arises from websites that particularly target children. On September 13, 2016, NY Attorney General Eric Schneiderman announced settlements with four industry giants over their failure to regulate the online tracking of children.2 Viacom, Mattel, and others are now required to pay “a combined $835,000 in penalties and implement ‘significant’ reforms to the way they monitor third-party tracking technologies on their sites.”3 This will spark a major movement concerning the use and misuse of online tracking.

Websites tracking children are subject to the Children’s Online Privacy Protection Act (COPPA), enforced by the Federal Trade Commission (FTC), which provides, in relevant part, that “it is unlawful for an operator of a website or online service directed to children, or any operator that has actual knowledge that it is collecting personal information from a child, to collect personal information from a child in a manner that violates the regulations prescribed under subsection (b).”4 Subsection (b) requires various disclosures outlining, in a conspicuous manner, that the website is collecting information.5 In 2013, the FTC amended the COPPA Rule, subjecting website owners to nothing less than strict liability for “the collection, use and disclosure of personal information by independent third parties that are allowed to plug into their sites.”6 Any website operators found to violate COPPA are subject to fines by the FTC. The NY Attorney General has construed the regulation in a way that coincides with the views of the FTC, signaling full cooperation between the federal government and the states on this subject.

The statute has been in effect since 1998; however, states have been actively pursuing this issue only in the past few years. Texas, New Jersey, and Maryland are among the few states that have so far led the charge in filing suits under COPPA.7 The action in NY is a large-scale movement toward the regulation and protection of children online. With mobile games becoming an even larger threat, it is particularly important that other states follow suit. Children’s privacy should be a top priority not only for the individual states but for the federal government as well. The broad expansions to COPPA create various complications and difficulties for website operators, who rely on website views to generate revenue. Fortunately, there are FTC-approved safe-harbor programs that monitor compliance with the statute, making it easier for websites to make changes without facing lawsuits. These FTC-sponsored compliance programs are not in any way cheap. However, compared to the $500,000 that Viacom paid and the $250,000 penalty imposed upon Mattel, Hasbro, which had a safe-harbor program in place, paid nothing in the action against it.8 These safe-harbor programs should be a major incentive for companies with websites geared toward children to ensure that those websites not only comply with the statute but also provide a safe environment for children to enjoy themselves. Surely, in the next year, massive changes will take place, and states will begin to regulate websites aimed at children more effectively.

Google Perpetuates Privacy Concerns with Its New Personalized Search Feature

Google recently enhanced its search engine functionality so that a user’s personal information from Gmail, Google Calendar, and Google+ is utilized to generate user-specific answers to personal questions such as “When is my flight leaving?” or “What time is my reservation?”[1] The feature is available on any device on which Google search is available.[2] Although Google claims this feature to be its most innovative yet, it is likely in violation of the Federal Wiretap Act.[3]

Google launched its personalized search tool in 2004.[4] At that time, the function used the individual’s Google search history to produce results most relevant to the user’s particular interests.[5] In 2005, the company began ranking search results based on cumulative data about personal behavior collected from all users.[6] In 2009, social media data was also integrated into Google search,[7] and, since August 2013, Google search has been able to pull information from users’ personal accounts to offer the most relevant and individualized results.[8] The company explains that it collects the data by embedding cookies and anonymous identifiers on users’ devices, and claims that the right to do so had been added to the privacy policy to which all Google users must agree.[9]
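What “embedding cookies and anonymous identifiers” amounts to in practice can be sketched with the Python standard library: a long-lived, pseudonymous identifier is set in the browser and returned with every request, letting activity across services be linked to one profile. The cookie name, domain, and lifetime below are assumptions for illustration, not Google’s actual values.

```python
# Sketch of a long-lived pseudonymous identifier cookie, standard library
# only. The name, domain, and lifetime are illustrative assumptions.
import uuid
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["anon_id"] = uuid.uuid4().hex                  # pseudonymous ID
cookie["anon_id"]["max-age"] = 2 * 365 * 24 * 3600    # persists ~2 years
cookie["anon_id"]["domain"] = ".example.com"          # sent to all subdomains

# The browser returns this identifier on every request, so searches, mail,
# and calendar activity can all be linked to a single profile.
print(cookie.output())
```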

This new privacy policy does not allow consumers to keep their information in separate accounts in any practical manner.[10] Enrollment in the commingling of all services is automatic, while opting out all at once is not available.[11] Even when users opt out of the data-collecting services account by account, the cookies embedded on their devices are not deactivated.[12]

Since the feature is fairly new, it has not yet been the subject of many complaints. But in September 2013, a federal judge in California allowed a consolidated wiretapping action to proceed against Google in connection with its previous data collection practices involving Gmail accounts.[13] In this ongoing case, the plaintiffs allege that Google “intentionally intercepted, read and acquired content” from their emails for the purposes of targeted advertising.[14]

Google filed a motion to dismiss, asserting that the scanning of emails fell within the “ordinary course of business” exception[15] and that the users had consented to the company’s activities by accepting its privacy policy.[16] However, the court denied Google’s motion, holding that data collection for the purposes of targeted advertising does not constitute the “ordinary course of business” for an email account provider, and that merely adjusting the privacy policy does not mean that users have “explicitly” or “implicitly” consented to the access to their accounts.[17]

Since Google’s new search feature deepens the intrusion into users’ private information, the company may have subjected itself to increased liability, depending on the outcome of this ongoing case.[18]



[1] Jolie O’Dell, Google’s big brain now includes your calendar, tracking numbers, and more, Venture Beat (Aug. 14, 2013, 11:00 AM), http://venturebeat.com/2013/08/14/googles-big-brain-now-includes-your-calendar-tracking-numbers-more/.

[2] Id.

[3] See 18 U.S.C. § 2511 (2013).

[4] Elinor Mills, Google automates personalized search, CNET (June 28, 2005, 1:53 PM), http://news.cnet.com/Google-automates-personalized-search/2100-1032_3-5766899.html.

[5] Id. (explaining that, for example, searching the word “bass” would lead to results related to fish for someone who had previously searched fishing terms on Google, and results related to musical instruments for someone who had previously searched terms related to music).

[6] Eugene Agichtein et al., Improving Web Search Ranking by Incorporating User Behavior Information, SIGIR ’06 19, available at http://web.cs.dal.ca/~anwar/ir/review/grads.pdf (last visited Oct. 23, 2013). User behavior data consists of scrolling time for each search term, dwell time on each link, and reformulation patterns of search terms until the user gets the desired result.

[7] Sarah Kessler, Why Google’s Social Search Is Too Much, Too Soon, Mashable (Jan. 13, 2012), http://mashable.com/2012/01/13/google-social-search-too-much-too-soon/. A search for a name would return a result for someone who is already in the user’s social network, rather than many strangers with the same name.

[8] See O’Dell, supra note 1.

[9]  See Privacy Policy, Google (June 24, 2013), http://www.google.com/policies/privacy/.

[10] Complaint at 62, Hoey v. Google, Inc., No. 12-cv-01448 (E.D. Pa. filed Mar. 22, 2012), available at http://www.courthousenews.com/2012/03/26/Goog.pdf.

[11] Id.

[12] See Demand for Jury Trial, Yngelmo v. Google, Inc., available at http://www.technologyreview.com/sites/default/files/legacy/yngelmo_v_google.pdf (last visited Oct. 26, 2013).

[13] Mathew J. Schwartz, Google Wiretapping Lawsuits Can Proceed, Judges Say, InformationWeek (Oct. 2, 2013, 1:32 PM), http://www.informationweek.com/security/privacy/google-wiretapping-lawsuits-can-proceed/240162124.

[14] See Order Granting in Part and Denying in Part Defendant’s Motion to Dismiss, In Re: Google, Inc. Gmail Litigation, No. 13-MD-02439-LHK (N.D. Cal. filed Sept. 26, 2013), available at https://www.documentcloud.org/documents/799772-google-class-action.html.

[15] Id. at 12.

[16] Id. at 22.

[17] See id. at 22, 28.

[18] See id.; Schwartz, supra note 13.

Are the FTC’s “Recommendations” Enough to Effectively Govern the Use of Facial Recognition Technologies?

Thanks to social networking sites like Facebook, there are billions of photographs available to the public across the Internet.  Indeed, the Federal Trade Commission (FTC) recently reported that “in a single month in 2010, 2.5 billion photos were uploaded to Facebook.”[1]  With this information in mind, it may come as little surprise that the field of facial recognition technology has expanded and improved tremendously over the past few years.[2]  Tests conducted by the National Institute of Standards and Technology between 1993 and 2010 “showed that the false reject rate – the rate at which facial recognition systems incorrectly rejected a match between two faces that are, in fact, the same – was reduced by half every two years.  In 2010, in controlled tests, the error rate stood at less than one percent.”[3]
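A quick back-of-the-envelope calculation shows what that halving rate implies if it held steadily over the seventeen-year test period:

```python
# Back-of-the-envelope: a false reject rate that halves every two years,
# compounded from 1993 to 2010 (assuming the trend held steadily).
years = 2010 - 1993                   # 17 years of testing
halvings = years / 2                  # 8.5 halvings
reduction = 2 ** halvings
print(f"~{reduction:.0f}x overall reduction")   # ~362x
```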

Companies are using facial recognition technologies in a variety of ways.  For example, facial detection technologies are used in virtual eyeglass fitting systems and virtual makeover tools.  Additionally, some technologies are able to determine an individual’s engagement with a video game or excitement during a movie by identifying moods or emotions from facial expressions.[4]  Further, social networking sites like Facebook utilize technology that scans and compares new photos a user uploads with existing “tagged” photos, which enables the site to automatically identify the person in the new photo.  Likewise, it is now possible to use your face, rather than a password, to unlock your mobile device.[5]  As for the future, recent studies suggest that we are not far away from a world where facial recognition technology can be used to “identify anonymous individuals in public places, such as streets or retail stores, or in unidentified photos online.”[6]
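Under the hood, matching a new photo against “tagged” photos typically means reducing each face to a numeric embedding and comparing embeddings for similarity. The sketch below assumes such embeddings already exist; the vectors, names, and 0.9 threshold are illustrative stand-ins, not any vendor’s actual parameters.

```python
# Sketch of embedding-based face matching with cosine similarity. The
# vectors and the 0.9 threshold are illustrative, not real parameters.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

tagged = {"alice": np.array([0.9, 0.1, 0.3]),    # embeddings of known,
          "bob":   np.array([0.2, 0.8, 0.5])}    # already-tagged faces
new_face = np.array([0.88, 0.12, 0.31])          # embedding of a new upload

name, score = max(((n, cosine_similarity(v, new_face))
                   for n, v in tagged.items()), key=lambda x: x[1])
if score > 0.9:                                  # arbitrary match threshold
    print(f"auto-tag suggestion: {name} (similarity {score:.3f})")
```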

Despite their advantages, facial recognition technologies raise a number of privacy concerns. As of now, however, there are no steadfast regulations governing their use. Instead, the FTC has so far merely released a report containing “recommendations” for companies on how best to protect consumer privacy.[7] The FTC developed the list of recommendations after hosting a workshop in December 2011 to explore developments in the field of facial recognition technology.

Following the workshop, the FTC received eighty public comments discussing [facial recognition technology related] issues from private citizens, industry representatives, trade groups, consumer and privacy advocates, think tanks, and members of Congress. In [its] report, FTC staff . . . synthesized [workshop] discussions and comments in order to develop recommended best practices for protecting consumer privacy in this area, while promoting innovation.[8]

The following list represents a summary of the report’s recommendations to companies for preserving consumer privacy: (1) maintain data security for consumer’s biometric data; (2) establish and maintain reasonable retention periods and disposal practices for collected images; (3) avoid placing facial recognition technologies in sensitive areas such as bathrooms and health care facilities; and (4) make practices transparent to consumers.[9]

Finally, the report states, there are at least two scenarios in which companies should get consumers’ affirmative consent before collecting or using biometric data from facial images. First, they should obtain consent before using consumers’ images or any biometric data in a different way than they represented when they collected the data. Second, companies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative consent first.[10]

The FTC seems hopeful that its recommendations are enough to ensure that players in the field of facial recognition technology will respect consumer privacy now and into the future, stating “[i]f companies consider the issues of privacy by design, meaningful choice, and transparency at this early stage, it will help ensure that this industry develops in a way that encourages companies to offer innovative new benefits to consumers and respect their privacy interests[.]”[11]  Although such optimism is laudable, the FTC should remain skeptical, as not all companies engaging in the use of facial recognition technologies have consumers’ best interests and privacy in mind.  The FTC should instead consider protecting consumers by adopting a formal set of rules to govern the use of facial recognition technologies, including penalties and fines for noncompliance.



[1] Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies, Federal Trade Commission (FTC) (Oct. 2012), available at http://www.ftc.gov/reports/facialrecognition/p115406commissionfacialrecognitiontechnologiesrpt.pdf.

[2] See id. at 4 (“Th[e] multitude of identified images online can eliminate the need [for companies] to purchase proprietary sets of identified images, thereby lowering costs and making facial recognition technologies commercially viable for a broader spectrum of commercial entities.”).

[3] Id. at 3 (internal citation omitted).

[4] Id. at 5.

[5] Id. at 6.

[6] Id.

[7] See id.

[8] Id. at ii.

[9] See id.

[10] FTC Recommends Best Practices for Companies That Use Facial Recognition Technologies, FTC (Oct. 22, 2012), http://www.ftc.gov/opa/2012/10/facialrecognition.shtm.

[11] See id. at iii.

New Technologies Potentially Raise HIPAA Concerns

Through the use of innovative products such as stretchable electronics, wearable technologies, and microchips, individualized health data can be seamlessly recorded, wirelessly transmitted, and stored for later use.[1]  These products can be applied externally or internally to monitor a person’s vital signs.[2]  Stretchable electronics, for example, are applied to the skin like a small sticker.[3]  They can record and transmit biological functions such as heart rate, brain activity, respiration, body temperature, and hydration levels, as well as information about a person’s bloodstream.[4]

These new technologies will undoubtedly have significant implications for the healthcare industry, such as increased efficiency in the delivery of healthcare and reduced costs.  However, any time health data is being transmitted, it is wise to analyze whether the Health Insurance Portability and Accountability Act (HIPAA)[5] applies.  If HIPAA does apply, the organizations transmitting and using the data compiled by these devices will need to be careful not to run afoul of the federal statute.

HIPAA protects the privacy of an individual’s health information.  HIPAA’s protections are enforced through the Privacy Rule, a group of federal regulations promulgated by the U.S. Department of Health and Human Services.[6]  The Privacy Rule prohibits a “covered entity” from disclosing or unlawfully using a person’s “individually identifiable health information” without the person’s specific written consent.[7]  Violators of HIPAA can be subject to civil penalties ranging from $100 to $50,000 per violation, depending on the circumstances.[8]

Several legal issues may arise from the use of health data collected by stretchable electronics, wearable technologies, and microchips.  The first is whether the organizations transmitting and using the health data qualify as covered entities under HIPAA.  If not, their use or disclosure of the health data is outside the reach of HIPAA.[9]  Covered entities include healthcare providers, health plans, healthcare clearinghouses, and the business associates of any of these three types of entities.[10]  Whether HIPAA applies will therefore depend on whether the health data is being collected by or shared with one of these covered entities.

Another issue is whether the health information being collected and transmitted is individually identifiable health information protected by HIPAA.  Completely anonymous, or “de-identified,” health information is not protected by HIPAA.[11]  However, de-identification is not as simple as removing a person’s name, address, year of birth, and so on from the health data.[12]  De-identification is achieved only when no information remains that could create a reasonable basis to believe it can be used to identify the individual.[13]
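The intuition behind de-identification can be sketched as stripping directly identifying fields before the data is shared. The field list below is a small illustrative subset; the Privacy Rule’s Safe Harbor method enumerates eighteen categories of identifiers, and even then the data must leave no reasonable basis for re-identification.

```python
# Simplified sketch of de-identification: drop directly identifying fields
# before sharing. This list is a small illustrative subset of the Privacy
# Rule's 18 Safe Harbor identifier categories, and removal alone is not
# always sufficient if re-identification remains reasonably possible.
IDENTIFIERS = {"name", "address", "birth_date", "ssn", "email", "device_id"}

def deidentify(record: dict) -> dict:
    return {k: v for k, v in record.items() if k not in IDENTIFIERS}

reading = {"name": "Jane Doe", "birth_date": "1984-03-02",
           "device_id": "patch-7F3A", "heart_rate": 72, "body_temp_f": 98.7}
print(deidentify(reading))   # {'heart_rate': 72, 'body_temp_f': 98.7}
```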

A third issue to consider is whether the health data being collected will be used for one of the several uses specifically permitted under HIPAA.[14]  Permitted uses include, among others, “public interest and benefit activities,” which include research and disclosures necessary to prevent serious threats to health or safety.[15]

Finally, the use of health data collected by these devices may be permitted under HIPAA if the person supplying the data consented to its use.[16]  However, the consent needs to be sufficient.  For example, under HIPAA, the individual’s consent must be in writing and in plain terms, must be specific about the information to be disclosed and to whom, and must provide an expiration date and a right to revoke permission.[17]
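Those required elements lend themselves to a simple checklist. The sketch below is an illustrative data structure, with field names that are assumptions for this example; the controlling text is 45 C.F.R. § 164.508(c).

```python
# Illustrative checklist for a HIPAA-style authorization. Field names are
# assumptions for this sketch; 45 C.F.R. § 164.508(c) controls.
from dataclasses import dataclass
from datetime import date

@dataclass
class Authorization:
    in_writing: bool                # must be a written authorization
    information_described: str      # what data may be disclosed
    recipient: str                  # to whom it may be disclosed
    expiration: date                # expiration date (or event)
    revocation_notice: bool         # right to revoke was disclosed

def is_valid(auth: Authorization, today: date) -> bool:
    return (auth.in_writing
            and bool(auth.information_described)
            and bool(auth.recipient)
            and auth.expiration >= today
            and auth.revocation_notice)

auth = Authorization(True, "heart rate and sleep data from a skin patch",
                     "Acme Wellness Program", date(2026, 1, 1), True)
print(is_valid(auth, date(2025, 6, 1)))   # True
```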

Stretchable electronics, wearable technologies and microchips will provide rapid electronic exchange of health information.  These technologies will likely make the health industry more effective and efficient.  However, organizations planning to use the health information collected by these devices should carefully consider HIPAA and the Privacy Rule before doing so.



[1] Quentin Hardy, Big Data in Your Blood, Bits, NYTimes.com (Sept. 7, 2012, 10:37 AM), http://bits.blogs.nytimes.com/2012/09/07/big-data-in-your-blood. See also David Talbot and Kyanna Sutton, Making Stretchable Electronics, Technology Review (Aug. 21, 2012), http://www.technologyreview.com/demo/428944/making-stretchable-electronics; Robert T. Gonzalez, Breakthrough: Electronic circuits that are integrated with your skin, Tecca, http://www.tecca.com/news/2011/08/12/breakthrough-electronic-circuits-that-are-integrated-with-your-skin/#uW441jvxJhsYfpO3.03 (last visited Sept. 22, 2012).

[2] Id.

[3] Id.

[4] Id.

[5] Pub. L. No. 104–191, 110 Stat. 1936 (1996).

[6] 45 C.F.R. pts. 160, 162, 164 (2011), available at http://www.gpo.gov/fdsys/pkg/CFR-2011-title45-vol1/pdf/CFR-2011-title45-vol1.pdf.

[7] 45 C.F.R. §§ 160.102, 160.103.

[8] 45 C.F.R. § 160.404.

[9] 45 C.F.R. §§ 160.102, 160.103.

[10] Id.

[11] 45 C.F.R. §§ 164.502(d)(2), 164.514(a).

[12] 45 C.F.R. § 164.514(a) & (b).

[13] Id.

[14] 45 C.F.R. § 164.502(a)(1).

[15] 45 C.F.R. § 164.502(a)(1), 164.512.

[16] 45 C.F.R. § 164.508.

[17] 45 C.F.R. § 164.508(c).