Couldn’t make it to DC? Watch our event in its entirety here ft. a keynote address from former FTC Chairman Jon Leibowitz.
By Tim Sparapani
“Bad UI leads to bad UX” is one of the most common sayings in Silicon Valley. Translated, this means that bad user interface (UI) – the look, feel and relative usability of an app or website’s design – will inevitably create a bad user experience (UX). Silicon Valley spends considerable resources trying to build more intuitive, instinctive designs, especially when trying to get consumers’ attention and permission for using their data to offer them products and services. This challenge is at the heart of an upcoming Federal Trade Commission conference this week focusing on the effectiveness of online disclosures to consumers.
Companies have long had to balance completeness and usefulness in disclosures — take our constantly evolving nutrition labels, for example. In the digital age, tech companies have made important strides in communicating with consumers, but striking the right balance is still a challenge, just as it is when describing the most important details about your favorite cereal.
Despite decades of work by all kinds of companies to figure out best disclosure practices, it’s remarkable how much we still have to learn about designing disclosures of critical information to consumers. The challenge is most stark online: tech companies must figure out how to get their consumers’ attention, tell them what they need to know, and obtain their permission when needed, all while not creating the dreaded “notice fatigue” where consumers ignore disclosures or, worse, abandon the website or app out of annoyance.
That’s why the upcoming FTC workshop to explore consumer disclosures is both so interesting and so important. The FTC, state attorneys general, and consumers themselves rightfully expect companies to communicate clearly what consumers should or must know about a company’s products or services. Today, the FTC will gather experts from various disciplines to explore consumer messaging cognition and the challenges of disclosures, permissions and warnings, all with the goal of advancing UI.
How to communicate something you really need someone to know is a vexing problem in life. Perhaps, if you are married, you are really good at sharing important news and wisdom with your spouse. You probably do not try to use the same words to share the same bit of wisdom with your children or someone who is from an older generation. Words and phrases, much less idioms or technical language, often mean different things to people with different experiences. Those consumers for whom English is not their first language may understand words in translation differently than those who are native speakers.
By Tim Sparapani
If you want to buy someone’s private data, it’s disturbingly easy to do. It’s all there for sale on the dark web, a completely anonymous twin of the web most of us use daily.
The dark web, a subset of the broader deep web, is “dark” because the sites on it cannot be indexed by search engines such as Google. That makes it hard for ordinary people, and for law enforcement, to find specific websites. This anonymity has the advantage of creating a zone of free speech where individuals can communicate, think and explore ideas without government interference.
But it also creates a haven for illicit activity, including the buying and selling of drugs, child pornography and individuals’ private information such as social security numbers, health records and passwords.
People who don’t closely follow privacy issues probably associate the dark web with Silk Road, the infamous illegal drug marketplace that did millions in business before the FBI managed to shut the site down in 2013.
But the death of Silk Road didn’t put an end to the dark web. This shady technological playground is still going strong, and many sites that thrive on the dark web are a daily threat to privacy and the economy.
Over the past three months, the website LeakedSource has uncovered huge caches of account data being sold on the dark web from eight websites including Twitter, MySpace and LinkedIn. In some cases, those accounts came from privacy breaches at the web companies. In other cases, data thieves were able to steal information directly from users.
The way the account information was stolen matters less than the fact that so much of it is for sale. Need a Netflix password? Plenty are available for pennies on the dark web. You can also get stolen passwords for Hulu, HBO Go and Spotify.
The dark web has also become a haven for child pornography. According to an article on Wired, over 80% of dark web searches are related to pedophilia.
By Tim Sparapani
The US Supreme Court has just made the law of privacy in the US about as settled as wet cement. Now, neither consumers nor companies handling consumer data know where things stand.
This all came about when a data broker – a company that gathers data about individuals, typically without their knowledge or consent, and then resells that data – created a file of wholly inaccurate information about an individual for resale. Upon learning of the data broker Spokeo’s actions, the individual sued, citing a violation of his rights under the Fair Credit Reporting Act, a federal statute that creates a right to sue for violations. The trial court dismissed the case, but the US Court of Appeals for the Ninth Circuit allowed it to proceed. The Supreme Court overturned that decision and sent the case back for additional consideration because the Ninth Circuit had not determined whether the plaintiff’s alleged injury was sufficiently real and tangible – in the Court’s words, “concrete” – to meet Constitutional standards for sustaining a lawsuit.
This effective non-decision by the US Supreme Court, coupled with the barest of guidance, has created tremendous controversy in privacy law. A debate is now raging in Washington and in the offices of corporate General Counsels and plaintiffs’ attorneys nationwide about what it takes to satisfy this vague standard. Pitched battles are being waged to influence the interpretation of that non-decision and to shape what happens next, because so much is at stake at a time when our economy is driven by identifying and unlocking value from consumers’ data.
How do we know when a company’s use of a consumer’s data – especially erroneous data – has harmed that consumer’s privacy? The whole debate will now turn on the definition of the term “concrete.” It’s a word that’s hard to lock down, and dictionaries provide only slightly more help than the thin guidance offered by the US Supreme Court. The Cambridge English Dictionary defines “concrete” as “based on sure facts or existing things rather than guesses or theories”; Merriam-Webster offers “specific, particular; real, tangible.”
This non-decision has corporate America celebrating because fewer privacy cases will be successful. Influential privacy and consumer advocates, in contrast, argue that giving the lower court a do-over, in effect, changes nothing. The truth lies somewhere in between, of course.
This is, no doubt, a blow for plaintiffs trying to bring lawsuits. By forcing people who want to sue to describe a tangible injury – and perhaps barring ephemeral or hard-to-quantify privacy harms, even when Congress created a statutory right to sue – the Court has raised the barrier to successfully suing to vindicate privacy invasions. While that is not the end of all privacy suits, as some have erroneously claimed, it does mean that some privacy cases that would have gone forward in the past will not make the cut. That surely means some not-well-articulated but nonetheless important privacy harms will go unaddressed in the courts.
By Tim Sparapani
Los Angeles is considering new regulations around Airbnb and other home-sharing platforms that should deeply worry anyone who cares about keeping their personal information private. If approved, the regulations would require people who rent out space via a home-sharing platform to hold on to three years’ worth of information about who rented their property, for how long and at what price. The Office of Finance would have the right to inspect these records at any time.
It’s unclear exactly why the government is proposing this level of privacy invasion. The main thrust of the proposed legislation, which will eventually need to be approved by the LA City Council, is to set out guidelines and fines that would ensure a level of safety and accountability for home rentals. This market is growing quickly. According to a recent poll by Time magazine, 26% of the population has used a home-sharing service. As such, it’s not a stretch for the government to set some commonsense rules around the market and collect taxes from commercial activity.
But the invasion of privacy outlined in the Los Angeles proposal will create unnecessary risks for consumers.
Think about when you check into a hotel. If you pay with a credit card, the hotel will likely look at your driver’s license to make sure it matches the name on the card, but they don’t have to. If you pay with cash, they don’t need any kind of proof of identity. You pay your money and you get your room.
So why is the sharing economy potentially going to be held to a different standard?
By Tim Sparapani
It’s late in the college basketball season and just as the best teams have to improve their defense in order to win a championship, it may be time for our privacy regulators to try increasing their defensive intensity in order to deter and prevent additional consumer privacy violations. Just like in basketball, sometimes it takes switching up your tactics and your defense to take your team’s game up to a new level.
The challenges of policing misuse of consumers’ data are getting harder, not easier, for regulators on the privacy beat. Those regulators – chiefly the Federal Trade Commission and the state Attorneys General – have performed admirably during an age when accidental or clumsy data breaches are daily events, and cyber attacks – many successful – are now the norm, not the exception. The FTC, like a seven-foot-tall center, is well-practiced at swatting away the easy, slam-dunk cases where companies deceive consumers about their privacy practices, through conduct ranging from neglect to poor cross-company coordination to outright lies. Yet, as a longtime privacy and consumer advocate, I’m eager to see more done. I want to shout “De-Fense!” – or maybe “Un-Fair-Ness!” – and exhort the FTC to do more for consumers, and in a new way.
Let’s be honest, however: the FTC has resource constraints; it simply cannot police every violation of consumer trust or misuse of consumers’ data by a corporation. Nor is privacy defense the only role assigned to the FTC. The FTC is statutorily required to enforce dozens of consumer protection statutes, and its work on consumer data privacy, while it has led to groundbreaking and important results, is limited by a lack of staff and the intricacies of sophisticated new scams. The FTC’s data privacy work is also limited by the fast-moving pace of technology. New systems are brought to the public and then obviated by subsequent iterations, often within months, not years.
By Tim Sparapani
CALinnovates, a tech advocacy organization where I am senior policy counsel, recently sponsored a survey of 806 Americans on questions of trust concerning the presidential candidates. Assuming the race comes down to Hillary Clinton and Donald Trump, they wanted to know whom voters trust more.
One question particularly caught my eye. It asked: “Which of these candidates do you trust more to manage the delicate balance between privacy and national security?” Almost half of all respondents (48%) said they trusted Clinton more, while 27% chose Trump.
This is going to be a big issue for the next president, but it’s about more than just who is more trustworthy. We must get the right data policies in place. That cannot happen if we focus on the wrong goals.
The notion that there is a zero-sum game going on between privacy and national security is, and always has been, the wrong way to look at the issue. Privacy and national security don’t actually need to be balanced; they need to be optimized – and by optimizing privacy, you optimize security. It may sound counterintuitive, but strengthening national security does not depend on limiting personal privacy. We need strong privacy rules in order to enhance national security.
By Tim Sparapani
It is time to rethink the concept of consent and to change data privacy law to better align corporate data practices with consumers’ expectations. We should give companies that interact directly with consumers more flexibility to innovate, while restricting opportunities for companies with no relationship to consumers to misuse consumers’ data.
Recently, I’ve read articles speculating about why the Match Group, which runs online dating services including Match.com and Tinder, would purchase the Princeton Review, a leading student test prep service. According to those articles, Princeton Review has been unprofitable. Why would an online dating company buy a test prep company? Perhaps the Match Group knows how to turn money-losing test prep companies around, or maybe it wants to use or sell Princeton Review’s customer data to create new niche dating services. If the latter is true, this unexpected use of sensitive data would confuse, frustrate or anger most customers. When I used Princeton Review to prep for my law school entrance exam, neither my girlfriend nor I would have expected that I’d be offered dating services along with new test-taking skills. I would never have consented to this unexpected, third-party use of my data.
This got me thinking that it is high time we rethought data privacy laws to benefit consumers in two ways, both centered on consent. Ideally, consumers should be free to choose exciting and unexpected innovations derived from their personal data and offered by companies they know and trust, while facing fewer privacy risks from unknown companies accessing that data. Legislators should empower consumers both to consent to sharing their data for opportunities they choose and to restrict or deny access by corporations they do not know.
By Tim Sparapani
The Apple-FBI saga playing out in a very public way is a classic case of overreach by a law enforcement agency. The FBI is putting extraordinary (and unprecedented) pressure on Apple following the horrific San Bernardino shootings. The U.S. government has filed a motion in court to compel Apple to re-engineer its operating system so that the FBI can investigate whether the shooter used his iPhone to communicate or plan with other potential co-conspirators.
Forcing Apple to crack open its own code might appeal to some people clamoring for a quick fix for the ever-increasing threat of terrorism in our country. Unfortunately, there are no quick fixes, and the government’s move is an extraordinary threat to civil liberties. It also won’t solve the larger problem. A backdoor won’t stop terrorism, but it will weaken smartphone security with no likelihood of any real public benefit. The public and policymakers should support Apple’s resistance to the FBI’s pressure tactics. The FBI’s proposal is dangerous for at least these four reasons:
It Won’t Stop Terrorism
The FBI wants Apple to build a post-incident forensic investigation tool to unpack what may have happened. But that will not actually deter or prevent terrorism. Terrorists will simply switch to using encrypted phones from other countries.
It Will Open Security Loopholes
If the government is allowed to force Apple to provide a backdoor to its operating system, it will weaken security for all U.S. consumers going forward. This will not force committed terrorists to think twice; instead, it could make Apple’s operating system vulnerable to large-scale hacking of consumer data, especially given how publicly this story is playing out, with the hacking community awaiting the court decision with bated breath.