
D’oh! Stupid Television. How I’ll be missing you soon.


I have a dumb TV. A really dumb TV. How dumb? So dumb that it doesn’t even know who I am. It has no idea what I watch on it. Or when. It doesn’t listen to me no matter how much I talk to it. Without my Comcast voice remote giving it the technological crutch it needs, my TV would ignore me completely when I speak. It can’t see me either—it’s blind as a bat. The internet? Never heard of it. And it doesn’t even try to learn anything about me—it has no motivation at all. So it’s deaf, dumb, blind, and lazy. Oh, and OLD too. Very old; I bought it about 9 years ago. So old in fact, it can’t even remember what show I watched just an hour before. And it can’t tell if I liked it.

I want to break up with my TV and move on. But you know how it goes: You get so used to having it around, you know all its idiosyncrasies, it’s really not that much trouble, sure the grass always looks greener, blah blah blah. I know there’s an internet-connected “Smart TV” out there somewhere that so wants to meet me and has all the great interpersonal qualities that my current one lacks. It’ll pay a lot of attention to me. And it will listen to me. Closely. This new generation of televisions practically exclaims, “Hey, Always Listening!” (or HAL for short). What’s a person like me to do? You know, someone who still believes in some semblance of privacy and the sanctity of the home, wants some occasional solitude, and who doesn’t think it’s anyone’s business when I view four commercials for the new-and-improved Snuggie.

During the holiday shopping season last year, I realized that I’ll have to break down and buy a new TV sooner or later. Screen size and clarity keep going up while prices keep coming down. Don’t I want eye-popping reds and blues? And the blackest of black blacks? On a 65″ screen so thin that not even Kate Moss could hide in its profile? And of course, a TV so smart that in time it’ll know me better than I know myself? Given the ever-increasing demand for Smart TVs, it may soon be the only choice that I have. So… what’s not to love? Plenty, actually, if you understand the legal and practical ramifications of what your new TV demands in return. It’s a Faustian bargain, digital-style.

The security and privacy issues surrounding Smart TVs first started making a public splash in 2014, when people learned that Vizio, which has sold millions of Smart TVs since 2010, automatically and surreptitiously tracked what its consumers were watching, and then transmitted that data back to the company. What did Vizio do with it then? The Federal Trade Commission blog doesn’t mince words:

Vizio then turned that mountain of data into cash by selling consumers’ viewing histories to advertisers and others. And let’s be clear: We’re not talking about summary information about national viewing trends. According to the complaint, Vizio got personal. The company provided consumers’ IP addresses to data aggregators, who then matched the address with an individual consumer or household. Vizio’s contracts with third parties prohibited the re-identification of consumers and households by name, but allowed a host of other personal details – for example, sex, age, income, marital status, household size, education, and home ownership. And Vizio permitted these companies to track and target its consumers across devices. [emphases added]

Wow. Talk about techno-chutzpah. Not only did Vizio secretly collect this data in the first place and use it internally for its own purposes, but it also sold the data to third-party aggregators—and who knows what they did with it—AND then allowed these aggregators to track Vizio’s consumers across their devices (which generated their own trails of data to be further aggregated and analyzed). So how much data was collected? According to the FTC, “Vizio captured as many as 100 billion data points each day from millions of TVs” (emphasis added). It was a virtual all-you-can-eat smorgasbord of data rapine, and Vizio never informed its customers that it was happening.

As an attorney who drafts and evaluates privacy policies and practices (and sometimes litigates privacy issues too), I was surprised that a major electronics company would engage in such blatant misconduct. For many years prior to this incident, the FTC had clear standards in place addressing these types of data collection practices in various other technological contexts; the agency had been bringing successful enforcement actions and fining companies for quite some time. Smart TVs were nothing more than the device du jour. Vizio presumably had competent legal counsel who knew this. The company just wanted to see how long it could get away with it and how much money it could generate along the way. It was a trial balloon.

Not surprisingly, Vizio settled the FTC case in early 2017 by agreeing to: (1) stop collecting consumers’ data; (2) delete most of the data it had collected; (3) conspicuously disclose its collection practices; and (4) obtain consumers’ express consent before collecting and sharing their information (more on this shortly). The agency also fined the company $2.2 million and required monitoring and evaluation of its privacy practices for 20 years. Given the company’s $3.5 billion in revenue in 2016 though, the FTC fine amounted to nothing more than a financial pinprick. Just the cost of doing business.

If Vizio’s sale and collection of your personal viewing data gives you pause, then what Samsung did in 2015 shows the frightening reach of the corporate surveillance state. The following language was found buried in the fine print of Samsung’s privacy policy for its Smart TV: “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party.” Yikes. Double yikes. So basically, anything you say to anyone within earshot of the TV’s microphone gets recorded, transmitted, analyzed, and categorized by some mysterious “third party” (and that third party’s third parties, i.e., data brokers and aggregators). The possibilities are scary.

If you and your spouse were arguing about the latest Lexus repair bill in front of your TV, would you suddenly be getting ads on your TV from Infiniti? If your kids told you that they wanted the new Xbox system, would you find a coupon for $50 off in the mail two weeks later? Or how about the really personal stuff: Suppose your doctor just diagnosed you with depression. Would you start getting e-mails for Prozac or Zoloft? And if Samsung does what Vizio did and allowed third parties to track consumers across devices, would you receive a geo-located advertisement on your smart phone while walking past a CVS? The possibilities are endless, and endlessly invasive. Your Samsung TV is a silent (but not deaf) witness to some of your most personal moments, presumably to commercialize them without conscience at some undefined later time.

It also highlights how our collective mindset about surveillance has changed. There was a time, not so long ago, when if the bad guys wanted to listen to your private conversations, they had to physically break into your home and surreptitiously plant bugs in strategic locations. However, thanks to our unrelenting demand for convenience combined with our need for social validation, we not only plant the bugs ourselves right out in the open, but we then brag on Facebook that we got a great deal on Amazon while doing it. Who’s the bad guy again?

Anyway, Samsung created such an uproar that the company rushed out a verbose clarification a few days later. The standard exculpatory language was used: We at Samsung take privacy seriously, it’s merely voice recognition, here’s how it really works, you can interact better with your TV, you can disable it, and on and on. It’s interesting how just one sentence in its privacy policy engendered over 20 sentences and a full-page explanation in a not-quite-contrite moment of corporate introspection.

Why didn’t Samsung just explain all of this in its original privacy policy? Were its lawyers charging by the word? Perhaps an extra few lines of exposition in its already legally dense privacy policy would have allayed fears—or at least made less of a public splash. Surely its lawyers could have easily done so. Like Vizio, were Samsung and its attorneys unfamiliar with FTC guidelines and best practices? The policy was drafted poorly, and the company handled the fallout poorly (and one of my biggest pet peeves is that a company’s marketing department and its legal counsel rarely see eye-to-eye on much of anything). The damage was done, an FTC complaint ensued, and most importantly: the public was notified and the cat was out of the bag. It was a PR fiasco for Samsung. Not much has changed since then.

Which leads me to the recent February 2018 Consumer Reports (CR) article evaluating some of the newer models of Smart TVs, including Samsung, Vizio, LG, and Sony. CR didn’t mince words in its article entitled, “Samsung and Roku Smart TVs Vulnerable to Hacking.” CR states right at the outset:

Consumer Reports has found that millions of smart TVs can be controlled by hackers exploiting easy-to-find security flaws.

The problems affect Samsung televisions, along with models made by TCL and other brands that use the Roku TV smart-TV platform, as well as streaming devices such as the Roku Ultra.

We found that a relatively unsophisticated hacker could change channels, play offensive content, or crank up the volume, which might be deeply unsettling to someone who didn’t understand what was happening. This could be done over the web, from thousands of miles away. (These vulnerabilities would not allow a hacker to spy on the user or steal information.) [emphases added]

Let that sink in. It’s been a couple of years since the Vizio and Samsung disclosures, and not only are there still security concerns (which will never change), but they are “easy-to-find security flaws” that even an “unsophisticated hacker” could exploit in “millions of smart TVs.” So hacking into certain Smart TVs doesn’t even take much effort. While “easy-to-find” is relative as technology marches forward, the fact remains that it’s not marching fast enough to outsmart even basic hackers armed with an internet search or a YouTube tutorial.

But here’s what’s really problematic, which CR notes parenthetically: “These [security] vulnerabilities would not allow a hacker to spy on the user or steal information.” That’s small comfort. Perhaps CR should have added at the end of the sentence: “not at the moment” or “as far as we know.” Simply because CR didn’t find any major security issues with Sony and Vizio TVs (for example) doesn’t mean that someone else won’t find a way to compromise them. Someone will.

For example, the CIA has reportedly hacked Smart TVs and can turn them into listening devices (much like it can with our always-present smartphones). And if the CIA can do it, how many other non-state actors can too—especially when the supposedly secure agencies that protect our nation’s secrets get hacked themselves and their methods are disseminated for all to see? So even if you turn these settings off through the TV’s menu, how hard is it for someone to surreptitiously turn them back on? How hard is it for them to turn a “passive” microphone or camera into an “active” one, even though manufacturers assure us that their software won’t allow that? Short of tearing out the microphone and taping over the camera, your new TV will always have those capabilities. The only question is whether they’re on or not, and of course, whether you’re aware of it.

CR also discusses how Smart TVs collect data, as well as its new “Digital Standard,” which CR developed with “cybersecurity and privacy organizations to help set expectations for how manufacturers should handle privacy, security, and other digital rights.” It’s an ambitious effort (and worth a look) to try to simplify for consumers the relevant factors in an increasingly complicated area—and one where consumers do not have the upper hand when it comes to data privacy.

How technology companies have framed the privacy issue for us is paramount. The word “privacy” is used so ubiquitously these days in so many different contexts, it means even less now than when Scott McNealy, former CEO of Sun Microsystems, famously declared in 1999: “You have zero privacy anyway. Get over it.” And that was well before smartphones—let alone Smart TVs—existed to collect all types of data about us. How much less does privacy mean to people now, almost 20 years later? According to some views, while consumers have much greater awareness of privacy, their inquiry has shifted from “Is someone collecting my data?” to “Is someone using my data inappropriately, unethically, or illegally?” Sigh. Perhaps this erosion is due to the steady drip, drip, drip of data breaches in the news almost every week.

If consumers have essentially thrown in the towel, it should come as no surprise that TV manufacturers still refer to the data-gathering options on Smart TVs as “privacy settings” or “privacy preferences” in their ever-present “user agreements” that you must agree to in order to use your new TV. It sounds so innocuous. Indeed, Vizio diluted the concept of privacy even further and referred to its dubious practices as “Smart Interactivity”—and then described it deceptively and generically to consumers as “enables program offers and suggestions.” Huh? We the people need to do much, much better than that and call it what it is.

Perhaps if we called this invasive functionality “surveillance settings” or “eavesdropping preferences”—manufacturers certainly won’t—we would take far better notice of what our new TVs demand from us in return. It’s like the difference between calling Santa Claus a jolly old man who asks a red-nosed Rudolph to guide his sleigh through the snow, or an obese and senile animal abuser who exploits the disadvantaged by making them work in a storm. Same guy, but who do you want giving toys to your kids? It’s all how you frame the issue.

This isn’t to say that all data collection is bad. It isn’t. Personally, I don’t usually mind when Amazon suggests certain products and lower-cost alternatives based on my past purchases. Or when iTunes recommends a new band based on my downloads; I’ve discovered a lot of new music that way. Learning about new things and saving money is fine. But what exactly do Apple and Amazon do with my data besides analyze my preferences and spending patterns? I can’t really say—I’m not sure they can either, especially if the data ever leaves their hands (whether deliberately or accidentally). And therein lies the problem for society at large.

In the FTC’s view, one of the central issues governing data collection and privacy is “consent.” Consent is as American as apple pie. Because, so the argument goes, it’s all about freedom. (Isn’t everything nowadays?) Americans should be free to make their own choices about how their personal information is used, including their Smart TV data. If they want to trade privacy for convenience or some other incentive, they should be allowed to do so. Technology companies are only too happy to oblige. (This laissez-faire view is far different from that of our European counterparts.)

But here’s the thing about consent in an increasingly interconnected world where your data is collected, parsed, analyzed, aggregated, sold, parsed again, transmitted, and re-transmitted to unknown third parties: Do you truly, really, genuinely KNOW what it is you’re consenting to? (And that’s assuming the company isn’t lying about what really happens to your data.) Full transparency is key—and is also an established FTC guideline—but opacity is often the norm instead, since what’s transparent tends to be relative. And what’s transparent to a lawyer is often opaque to a layman.

Perhaps more importantly though: Do you care? Do you even have the time to care while running around trying to earn a living and navigating between an array of personal devices? I care since I practice in the privacy and technology law field, but my 20-something nephews and nieces? Not so much. According to a poll of Smart TV users cited by CR, “51 percent were at least somewhat worried about the privacy implications of smart TVs and 62 percent were at least somewhat worried about the sets’ security practices.” But what does “somewhat worried” really mean? Are they worried enough to not use their TVs? Probably not. Or to not consent to their TV’s data collection practices?

And what happens if you don’t give your consent and you try to limit the TV’s data collection? According to CR, if you don’t consent you lose a lot of functionality, depending upon the manufacturer: “You can hook up a cable box or an antenna, but you won’t be able to stream anything from Amazon, Netflix, or other web-based services.” And Sony’s “all or nothing” privacy policy for its Smart TV was even worse: It “required you to agree to a privacy policy and terms of service to complete the setup of the TV.” So if you don’t consent, you turn your sleek and advanced Smart TV into, well, my TV. And you didn’t pay good money to do that. Our choices are much more limited than we realize.

So as always, we have to keep a very close eye on what information and data the technology companies demand from us and what they will do with it, especially when the next shiny new gadget comes along (shortly). It’s a precarious balance between privacy and convenience.  Feel free to contact me if you have any questions on privacy issues, privacy practices and policies, or if you believe that you’ve been a victim of a privacy or data breach. Time for me to grab a bite to eat. Hey Alexa, I need to end this blog po—

© 2018 Daniel A. Batterman
