By: Mike Montgomery
Verizon’s most recent statement on net neutrality is the height of hypocrisy. To be clear, I completely agree with the no-blocking and no-throttling principles outlined in the post. Yet the company’s sudden enthusiasm to ban zero-rating services and grant the FCC power to chill innovation through antiquated regulation will ultimately harm broadband investment and consumer access to modern, 21st-century Internet-based services and applications. It’s curious that Verizon has recently found religion on the single most divisive and long-running tech public policy issue of this century.
For the past six years my organization has been pushing for a third way on net neutrality: laws that support an open, free Internet but are affirmatively enshrined into law by Congress instead of mandated by the FCC and are subject to change every 4–8 years. Unfortunately, we’re currently in legal no-man’s land, and, as we predicted, the rules are now going through the legal meat grinder as we are essentially legislating through litigation.
The FCC has set in motion a process in which tech policy is made through litigation rather than through collaborative, publicly vetted legislation. This creates many problems. Waiting means tremendous uncertainty for anyone building a business that might be affected by changing net neutrality rules (in other words, everyone building a business on the Internet).
No one knows what the courts will decide. The legal process is, by definition, uncertain. And we’re in the middle of a heated presidential race.
By: Tim Sparapani
CALinnovates, a tech advocacy organization where I am senior policy counsel, recently sponsored a survey of 806 Americans on questions of trust concerning the presidential candidates. Assuming the race comes down to Hillary Clinton and Donald Trump, we wanted to know whom voters trust more.
One question particularly caught my eye. It asked: “Which of these candidates do you trust more to manage the delicate balance between privacy and national security?” Almost half of all respondents (48%) said they trusted Clinton more, while 27% chose Trump.
This is going to be a big issue for the next president, but it’s about more than just who is more trustworthy. We must get the right data policies in place. That cannot happen if we focus on the wrong goals.
The notion that there is a zero-sum game between privacy and national security is, and always has been, the wrong way to look at the issue. Privacy and national security don’t need to be balanced; they need to be optimized — and by optimizing privacy, you optimize security. It may sound counterintuitive, but strengthening national security does not depend on limiting personal privacy. We need strong privacy rules in order to enhance national security.
By: Mike Montgomery
When the Copyright Royalty Board issued its rate increases last winter, it seemed like the battle over reasonable royalty rates for music was finally settled. But the music labels were furious because they were banking on extracting far more from yet-to-be-profitable digital music services, so they could enjoy even higher margins at the rest of the industry’s expense.
Undeterred, the publishing arms of these foreign-owned behemoths, already raking in record revenues from streaming services, now want to leverage their monopoly control over musical works to extract higher royalty payments to further enrich themselves instead of the songwriters they represent. But they’ve run into an obstacle: their collection goons, ASCAP and BMI (aka the performance rights organizations, or PROs), are limited by federal antitrust consent decrees.
And that’s a good thing. The Department of Justice originally sued ASCAP and BMI – which together control use of approximately 90 percent of all music – for collusive, anticompetitive behavior more than 70 years ago. Today, the consent decrees are the only obstacles keeping these cartels from throttling the growth of innovative music platforms.
And yet the music mafia is begging the DoJ to bless the very kind of collusive behavior that landed them in antitrust court in the first place; behavior that the consent decrees safeguard against.
Today, when a digital music service (or anything else that plays music) negotiates a license with the PROs, they get the full use of the song even if ASCAP or BMI controls less than the full ownership stake in that song. In other words, any partial owner of a song can license the entire work (they just need to share revenues proportionally with other owners). This is known as “100 percent licensing,” and is a bedrock of the current blanket licenses.
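The proportional revenue sharing behind 100 percent licensing can be sketched in a few lines of code. This is purely an illustration; the ownership shares, dollar amounts, and owner names below are hypothetical, not drawn from any actual license.

```python
# Hypothetical illustration of "100 percent licensing": any partial owner
# can license the whole song, then owes the other owners their pro-rata
# share of whatever the license brings in.
def split_royalties(total_royalty, ownership_shares):
    """ownership_shares: dict mapping owner -> fractional share (sums to 1.0)."""
    return {owner: round(total_royalty * share, 2)
            for owner, share in ownership_shares.items()}

# A song co-owned 60/40. The 60% owner licenses 100% of the work for
# $1,000 and distributes the proceeds proportionally.
shares = {"Writer A (ASCAP)": 0.6, "Writer B (BMI)": 0.4}
print(split_royalties(1000.00, shares))
# {'Writer A (ASCAP)': 600.0, 'Writer B (BMI)': 400.0}
```

The key point the sketch makes concrete: the licensee negotiates once and gets the whole work, while the accounting among co-owners happens downstream.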
By: Tim Sparapani
It is time to rethink the concept of consent and to change data privacy law to better align corporate data practices with consumers’ expectations. We should increase the flexibility to innovate for companies that directly interact with consumers while restricting the chances that companies with no relationship with consumers have to misuse consumers’ data.
Recently, I’ve read articles speculating about why the Match Group, which runs online dating services including Match.com and Tinder, would purchase the Princeton Review, a leading student test prep service. Princeton Review has been unprofitable, according to those articles. Why would an online dating company buy a test prep company? Perhaps the Match Group knows how to turn money-losing test prep companies around, or maybe it wants to use or sell Princeton Review’s customer data to create new niche dating services. If the latter is true, this unexpected use of sensitive data would confuse, frustrate or anger most customers. When I used Princeton Review to prep for my law school entrance exam, neither my girlfriend nor I would have expected that I’d be offered dating services along with new test-taking skills. I would never have consented to this unexpected, third-party use of my data.
This got me thinking that it is high time we rethought data privacy laws to benefit consumers in two ways, both centered on consent. Consumers should be free to choose exciting and unexpected innovations derived from their personal data when those innovations come from companies they know and trust, and they should face less privacy risk from unknown companies accessing that data. Legislators should empower consumers both to consent to sharing their data for opportunities they choose and to restrict or deny access to corporations they do not know.
By: Tim Sparapani
The Apple-FBI saga playing out in a very public way is a classic case of overreach by a law enforcement agency. The FBI is putting extraordinary (and unprecedented) pressure on Apple following the horrific San Bernardino shootings. The U.S. government has filed a motion in court to compel Apple to re-engineer its operating system so that the FBI can investigate whether the shooter used his iPhone to communicate or plan with potential co-conspirators.
Forcing Apple to crack open its own code might appeal to some people clamoring for a quick fix for the ever-increasing threat of terrorism in our country. Unfortunately, there are no quick fixes and the government’s move is an extraordinary threat to civil liberty. It also won’t solve the larger problem. A backdoor won’t stop terrorism, but it will weaken smartphone security systems with no likelihood of any real public benefit. The public, and policymakers, should support Apple’s public resistance to the FBI’s pressure tactics. The FBI’s proposal is dangerous for at least these four reasons:
It Won’t Stop Terrorism
The FBI wants Apple to build a post-incident forensic investigation tool to unpack what may have happened. But that will not actually deter or prevent terrorism. Terrorists will simply switch to using encrypted phones from other countries.
It Will Open Security Loopholes
If the government is allowed to force Apple to provide a backdoor to its operating system, it will weaken security for all U.S. consumers going forward. This will not force committed terrorists to think twice. Instead, given how publicly this story is playing out, it could expose Apple’s operating system to large-scale hacking of consumer data: the hacking community will be awaiting the court decision with bated breath.
By: Tim Sparapani
What we’re seeing right now with Apple is a classic case of law enforcement overreach. The FBI is putting extraordinary (and unprecedented) pressure on Apple in the wake of the San Bernardino shooting, which left 14 dead and 22 wounded. The U.S. government has filed a court motion to press Apple to rewrite its operating software so that it can investigate whether the shooter used his phone to communicate and plan with others.
That might appeal to some people who are looking for a quick fix for the threat of terrorism. But the truth is there are no quick fixes and the government’s move is an extraordinary threat to liberty. It also won’t work. A backdoor won’t stop terrorism, it will only weaken phone security with no likelihood of any kind of public benefit. The public, and policymakers, should help Apple resist the FBI’s pressure. The FBI’s proposal is dangerous for at least these three reasons:
It Won’t Prevent Terrorism
The government wants Apple to build an after-the-incident forensic tool to figure out what may have happened. But that will not actually deter or prevent terrorism. Terrorists will simply switch to using encrypted phones from other countries. At the same time, the government’s move will weaken security for all U.S. consumers. This cannot and will not stop committed terrorists.
It Sets A Terrible International Precedent
If the U.S. government forces this technology on Apple, it’s also giving this technology to the rest of the world. That means rogue regimes, dictatorships and oligarchs will have access to the same security busting technology as the U.S. government. One nation’s terrorist is another’s journalist, reformer, freedom fighter or human rights advocate. Limiting security on iPhones could put these people, who are often on the frontlines of fights against oppression, in grave danger.
By: Tim Sparapani
Some horrifying stories surfaced recently about glaring data security vulnerabilities for the Internet of Things. A company called Shodan, which is a search engine for connected devices, has had no trouble pulling up video camera feeds of sleeping babies, marijuana plants and schoolrooms. The site found insecure connections for everything from traffic lights to ice rinks. Those gaps are a hacker’s playground, and they should worry consumers and companies hoping to capitalize on the market for Internet-connected devices of all kinds.
By collecting data from things like lightbulbs, factories and home appliances, engineers will be able to design endless apps to make things work more efficiently, saving energy and water while preventing equipment failure. That’s the essential promise of the Internet of Things (IoT) era. Thanks to the burgeoning IoT economy, we’re on the verge of having self-driving cars and appliances that tell us that their parts are about to fail.
But right now, that bright future looks a little dim. Security is paramount, and if manufacturers don’t take steps to assure the public that their devices are secure, that revolution will be delayed.
Perhaps because IoT devices are to date opaque — after all, there’s no interface for a lightbulb with sensors embedded in it — consumers haven’t been overly concerned about safety issues. Since this is still a relatively new industry, things like price and convenience have taken priority. We are in a type of technology limbo where we are learning that securing the data collected by these devices is essential, yet too few manufacturers have implemented robust data security protections for these devices.
But it will take just a few high-profile hacks to change that. Say, for example, all of the traffic lights in a big city suddenly went red at the same time and stayed that way. Or all of the lightbulbs linked to a given system went on in the middle of the night. An event like that would be enough to potentially scare people away from the IoT.
By: Tim Sparapani
A federal judge’s order to help the Justice Department unlock a phone used by a suspect in the San Bernardino, Calif., shootings has put unprecedented pressure on Apple. In a letter to customers detailing the company’s opposition, Apple CEO Tim Cook noted that there are “implications far beyond the legal case at hand.” Yes, the owner of the phone–Syed Rizwan Farook’s former employer–has given permission to search the device. But those who view the case as a potential means to combat the threat of terrorism are missing its threat to liberty, its potentially dangerous precedent, and the fallout to technological security. Consider:
Apple has said it complied with government search warrants and subpoenas. The Justice Department’s motion for Apple to disable particular security features on the phone presses the company to reformulate its operating software so that U.S. investigators can learn whether Mr. Farook used the iPhone to communicate with others about the November shootings. Forcing companies to create technologies that break their operating systems or override security features creates an after-the-incident forensic tool to figure out what may have happened. This does not actually deter or prevent terrorism. People determined to carry out attacks will continue to do so. They will simply use encrypted products and devices sold by companies based outside the U.S., in countries whose governments do not pry open their devices. At the same time, security protections for all consumers of those products will be weakened.
Such a move would set a dangerous international precedent. If the U.S. government forces Apple to undermine its technology there will be no means for companies to take a principled stand when rogue regimes, dictatorships, oligarchs, and other bad actors around the world make a similar request. One nation’s terrorist is another’s journalist. Or reformer, or freedom fighter, or rights advocate. In the wrong hands, the implications could extend to instances regarding human life, free speech, privacy, and other fundamental human rights around the globe.
In the immediate and long term, there is also a malware risk. Forcing Apple to reformulate its operating system is all but asking for the introduction of a bug, flaw, or defect–whether one forced upon the company by a government or one introduced through vulnerabilities exploited by criminal hackers, identity thieves, and the government-sponsored spies of foreign nations.
By: Mike Montgomery
2016 has started out on a sour note for Live365. The online radio service, which specializes in user-curated music, announced that it has had to lay off a significant portion of its staff and will likely shut down later this year.
The reason: A decision by the Copyright Royalty Board to raise the rates non-interactive Internet streaming services like Pandora have to pay for the right to spin music. In December, the board raised the rate from 14 cents per 100 plays to 17 cents.
Three cents is trivial, right? Not exactly. It might not sound like much, but for small Internet streamers like Live365, it’s the difference between survival and ruin. It’s hard enough to run a business when 50% or more of a non-interactive streaming company’s revenue goes toward royalty payments. It’s even more challenging when what’s left over can’t be reinvested in innovation or marketing to enhance the customer experience or grow the listener base.
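A back-of-the-envelope calculation shows why three cents per 100 plays matters at streaming scale. The rates below come from the article; the monthly play count is a hypothetical assumption for a small streamer, chosen only to make the arithmetic concrete.

```python
# Impact of the CRB's rate change for a small non-interactive streamer.
# Rates are from the article (per 100 plays); the monthly play count
# is a hypothetical assumption.
OLD_RATE = 0.14 / 100   # $0.0014 per play (14 cents per 100 plays)
NEW_RATE = 0.17 / 100   # $0.0017 per play (17 cents per 100 plays)

monthly_plays = 50_000_000  # hypothetical small-streamer volume

old_cost = monthly_plays * OLD_RATE
new_cost = monthly_plays * NEW_RATE
print(f"old: ${old_cost:,.0f}/mo  new: ${new_cost:,.0f}/mo  "
      f"increase: {new_cost / old_cost - 1:.0%}")
# old: $70,000/mo  new: $85,000/mo  increase: 21%
```

However large or small the play count, the proportional effect is the same: a roughly 21% jump in royalty costs, on top of revenues that already cede half or more to rights holders.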
Live365 isn’t the only victim of the CRB’s decision. SmoothJazzChicago, a site run by radio vet Rick O’Dell, is also shutting down. O’Dell cited the new royalty rates as one of the main reasons he’s turning off the lights.
While the rate hike certainly harms the bigger players, it’s devastating to a whole tier of streaming companies that either serve niche audiences or were just getting off of the ground. There’s no doubt it’s also affecting the army of entrepreneurs in Silicon Valley and elsewhere who are currently hard at work on the next big thing for Internet music, not to mention the venture capital that will instead go toward startups that don’t have to give away the lion’s share of their revenue in order to avoid collapse.