
The Digital Self: Fight the man, buy a CD

Don’t even think about trying to re-sell any of those under-played MP3s on your computer – that would likely be against the law. This is according to U.S. District Judge Richard Sullivan in Manhattan, who ruled this week against Arizona-based company ReDigi, the “world’s first” digital equivalent of a used record store, which allows people to buy and sell “used” music tracks and albums originally purchased through Apple’s iTunes.

Sullivan’s decision stems from a January 2012 lawsuit filed by Capitol Records, which asserted that ReDigi’s business was based entirely on copyright infringement. ReDigi countered the claim, saying that its service abides by copyright law because it does not make any copies of the MP3 files it re-sells – instead, it simply transfers the original file to the ReDigi servers, and permanently erases it from the seller’s hard drive. Because no copy is being made, ReDigi’s argument goes, its business should be legal. Sullivan, of course, disagreed.

The ruling against ReDigi is seen as a win for Capitol Records and the music industry in general, and a loss for ReDigi (which may have to shut down entirely) and any other company that wants to make a business dealing with pre-owned digital goods – like Apple and Amazon. But the real losers here are the consumers. It’s time we fight back the only (legal) way we can: Stop buying digital music.

Thwarted by copyright

The first issue here is something known as the “first sale” doctrine, which says you are allowed to “sell or otherwise dispose” of a copyright work that you’ve purchased without first obtaining permission from the copyright holder. “First sale” is what allows any secondary marketplace, like eBay, Craigslist, Amazon, or your local used book store, to operate legally.
ReDigi claims that its business should be legal based on “first sale,” a provision of copyright law that the Supreme Court reaffirmed just this month (PDF). Sullivan dismissed this argument based on his interpretation of the Digital Millennium Copyright Act (DMCA), which prohibits the unlicensed reproduction of protected works. What ReDigi was doing, says Sullivan, clearly violated the DMCA because copies were being made (even though ReDigi disputes that claim). As an alternative, Sullivan suggested that users could simply sell the hard drive on which the copyrighted file is stored. Yes, really.

The bigger issue at hand is the DMCA itself, which even the U.S. Register of Copyrights Maria A. Pallante believes is outdated and confusing. It is the DMCA that sits behind most of the copyright issues we have in the U.S. It’s the law that killed Megaupload, and the one that makes it a federal crime to unlock your new smartphone. According to Pallante, Congress would do well to revamp U.S. copyright law to clear up the issue of “first sale” as it applies to digital goods.

Here’s how Pallante explained the issue during a recent lecture at Columbia Law School:

On the one hand, Congress may believe that in a digital marketplace, the copyright owner should control all copies of his work, particularly because digital copies are perfect copies (not dog-eared copies of lesser value) or because in online commerce the migration from the sale of copies to the proffering of licenses has negated the issue. On the other hand, Congress may find that the general principle of first sale has ongoing merit in the digital age and can be adequately policed through technology . . . Or more simply, congress may not want a copyright law where everything is licensed and nothing is owned.

Let me repeat that last bit: “everything is licensed and nothing is owned.” This line captures the crux of the problem: when it comes to digital goods, we don’t actually “own” anything. That album you just purchased from iTunes isn’t really yours – you’ve simply paid for a license to play the songs in the privacy of your home or car. The same rules apply to “your” ebooks, downloaded video games, apps, and any other software-based product.

Look at it this way

From copyright owners’ perspective, this setup is completely understandable. While a physical book and an ebook might have the same content, it is virtually impossible for your average Web user to provide an infinite number of copies of a physical book to anyone who wishes to have one. The “licensed, not owned” technicality is there to stave off piracy. Were that setup to disappear, say experts, it could “wreak havoc” on content publishers.

For consumers, a licensing-only model makes much less sense. When you pay for something, you should own it. You should be able to resell it, delete it, or play it on whatever device you wish. At least, that’s the theory. Unfortunately for us consumers, it’s no longer the reality.

So, what are we to do? The first step is to contact your representatives in Congress and tell them to firmly establish the re-selling of digital goods as a legal practice. (Not a very satisfying option, I know.) As we wait for Congress to take action on this issue – it will be a very long wait – the only real way to fight back against the current system is to go analog-only. Only buy used books, used CDs or vinyl – not because you want to be a hipster, but because it’s the only legal way to subvert a system that is designed specifically to make your life more difficult.

What other choice do we have?


Andrew Couts
Former Digital Trends Contributor
The Digital Self: Facebook’s teen privacy effort is great … oh, wait – no it’s not

Well, this is just precious. Facebook and the National Association of Attorneys General (NAAG) announced Monday a new partnership to help teach teenagers and their parents how to protect their privacy online. The campaign, which brings at least 19 attorneys general together with the world's biggest social network, will include a pro-privacy video series and other tips for how people can keep their personal information within their control, both on Facebook and the Web at large.
“Teenagers and adults should know there are tools to help protect their online privacy when they go on Facebook and other digital platforms,” said Attorney General Douglas F. Gansler in a statement. “We hope this campaign will encourage consumers to closely manage their privacy and these tools and tips will help provide a safer online experience. Of course, attorneys general will continue to actively protect consumers’ online privacy as well.”
While NAAG's intentions are seemingly in the right place, this partnership sounds like one giant farce - a plan that will not only fail to help people protect their privacy, but could cause further damage.
Foremost is the fact that advising people to avoid Facebook altogether is perhaps the best privacy tip you can give. Instead, the NAAG partnership allows Facebook to promote itself as pro-privacy, despite the fact that the company's entire existence is built upon people surrendering their own personal data. It is a pure marketing move for Facebook - announce the partnership, and generate some headlines that make it seem as though the company is making a concerted effort to keep its users' data private. The partnership also allows Facebook to show that it is taking extra steps to comply with the recently updated Children's Online Privacy Protection Act (COPPA), which prohibits most data collection from users under the age of 13 (users who should, incidentally, not be on Facebook at all, according to its terms of service).
Meanwhile, it's partnering with a slew of the country's biggest data brokers to combine its trove of online data with the offline purchasing histories of its users. That is the real Facebook. The NAAG partnership is nothing more than a pretty mask.
The next worrisome detail is what the partnership says about Facebook itself. According to Facebook's press release on the partnership, the campaign will include a "tip sheet" that highlights the "top 10 tools" that allow users to protect their data on the social network. That's right - there are so many different privacy settings on Facebook that the company can create a freakin' top-10 list of the best ones. Does anyone else see that as a problem?
While Facebook is quick to tout its granular privacy controls as a feature, the plethora of options (different settings for status updates, apps, ads, photos, on and on) is clearly confusing people. And when people are confused, they make mistakes - they share more about themselves and their relationships than they would if they really understood what was going on. This confusion is good for Facebook; the more people share, the more valuable the company's advertising platform becomes. Keeping its privacy settings complicated seems to be part of its business strategy - otherwise it would have fixed them by now, right?
Now, some might say that the Facebook partnership is a win for online privacy protection. After all, Facebook and NAAG are releasing a variety of new pro-privacy tools that didn't exist before the partnership came to be. How is that a bad thing? In a bubble, it's not - the trouble is, this partnership further distorts the reality of what Facebook is all about: Gathering data, and selling the value of that data to advertisers. Privacy is just not an important part of the company's general makeup, no matter how exuberantly its press releases sing that it is.
If NAAG and other organizations want to help people protect their online privacy, I'm all for it. (Though it would help if state and federal lawmakers crafted some legislation that provides real privacy protection.) But NAAG's partnership with Facebook appears to do far more to boost the social network's image as a champion of online privacy than to keep the rest of us any safer than we were before.

The Digital Self: We need laws that empower consumers in the face of big data

Imagine for a moment that I am looking over your shoulder at your computer screen. I can see you, but you can't see me. Sure, you might know I'm there, but you don't really think about it – perhaps you've forgotten about me, or just grew used to my presence.
On your screen, I can see your complete browser history – every website you've ever visited, even the ones viewed with "incognito mode" turned on. I also know your name, date of birth, sexual orientation, every place you've ever lived, everyone you contact, and everything you buy, online and off. I can also see your smartphone, which tells me where you've been, who you call or text, which apps you use, and more. All told, I know more than a thousand tidbits about your life.
I have all of this information collected into files about you. Sometimes I share those files with other people. Sometimes they pay me for that information.
One day, you realize what I've been up to. So you stop by my house and ask to see your files. "Oh, I just can't do that," I tell you. "That would simply be too much trouble." Besides, I say, there is no law that requires that I tell you what I know about your life. And that information was given to me voluntarily – you agreed to hand it over when we first met, remember? You don't, but tough luck. Now leave me alone.
Stranger than fiction
This is a true story. Rather than me looking over your shoulder, however, it's thousands of companies – advertising networks, Facebook and Facebook apps, mobile apps, Google and Google apps, data brokers, and more. And while some of these companies do allow you to find out what information they've collected about your life, you remain at their mercy – gaining access to your file, if possible, usually isn't easy, and sometimes carries a fee. Other times the information you receive is only a fraction of what the company has on you. Most of the time, access is simply not an option.
If this doesn't make you angry, it should. And it's long past time for this imbalance of power over our information to come to an end.
Think about the issue of user data collection and use in terms of "control" – not just control over the data, but control over our lives.
For residents of California, this imbalance may soon right itself, thanks to a recently proposed bill known as "The Right to Know Act" (or AB 1291). If "Right to Know" passes, Californians will have the power to demand a year's worth of their data from any company that has information on them. Companies will also be required to tell users which "third parties" have access to that data. "Right to Know" wouldn't stop data collection; it would simply make data collection practices more transparent.
Like other consumer rights advocates, including the Electronic Frontier Foundation and the American Civil Liberties Union, I strongly support "Right to Know." Problem is, we need precisely this kind of law at the federal level – and for now, it doesn't look like that's going to happen.
Privacy vs. control
Debates about data collection inevitably center around "privacy." While privacy is important, it is also a problematic concept – privacy likely means different things to each of us, rendering discussions about its importance meaningless. Instead, let's think about the issue of user data collection and use in terms of "control" – not just control over the data, but control over our lives.
Here's the thing: The life details collected about us are not just used for serving targeted ads and search results; they are defining who we are to the world at large. In turn, the world is placing us in an increasing number of boxes – safe and risky, big spenders and low spenders, high performers and underachievers, on and on. These details are used to determine all types of important decisions: whether we should qualify for loans, whether we deserve to get a job, or even how much we should pay for a particular product or service.
The problem here is not that companies are using data and algorithms to figure out which customers to target or with whom to do business; it's that many of us have no way of knowing that our information will be used in this way – and far too often, the information is entirely incorrect.
Barriers to entry
Because the U.S. currently lacks privacy laws like "Right to Know," we are left completely ignorant of the ways in which our data is used to define us, and completely powerless to change incorrect data. This must change.
Our politicians know the status quo is broken. In February 2012, the Obama administration proposed a "Consumer Bill of Rights," which would put us firmly in control of our data. This was soon followed by a list of policy recommendations by the Federal Trade Commission (PDF), which offered further remedies to the problem of data collection and dissemination. Despite this, not a single new federal law has come to our rescue.
This inaction likely stems from opposition in the business sector. Businesses are not happy about "Right to Know," for example. According to the Wall Street Journal, a coalition of powerful trade groups, including the Internet Alliance, TechNet, and TechAmerica, sent a letter to the bill's author, Democratic assemblywoman Bonnie Lowenthal, arguing that the bill would leave technology companies vulnerable to lawsuits. Some say the bill's requirements would add crippling burdens on companies, which would hurt innovation and kill jobs.
It is difficult for me to care about these woes. Thanks to European privacy laws, any company that has customers and users within the European Union already does business in this way. If new businesses need to learn how to disclose our data properly and cheaply, there are professionals in this world who can walk them through the process. Furthermore, these companies are often getting our data for free, so if they have to hire a whole team of people to deal with requests for our data, that seems like a fair trade.
What is not fair is allowing anyone to peek over our collective shoulder, and then refuse to even tell us what they saw. What is not fair is categorizing people based on information they don't know they have shared – or, worse, information that is entirely false – which can have profound effects on their lives. What is not fair is allowing this imbalance of power to exist.
For Californians, "Right to Know" is a step in the right direction. It's time for our leaders in Washington to let the rest of America walk along with them.

The Digital Self: Ignorance about privacy is not an excuse – but it is our excuse

A person whose name I failed to find on Google once said, "If you are not paying for the product, you are the product." What this mystery wordsmith likely said next is, "If you are not making money off the product, you probably suck at using it."
Such is the plight of the average technology user – me, you, and anyone else this side of a Stanford University computer science degree. We fail to fully grasp most of the gadgets, apps, and social networks that permeate 21st century life – especially when it comes to online privacy, the opposite of which fuels a healthy chunk of the Web and apps. And as a result of our ineptitude, we've let a bunch of strangers make money off of our lives.
Most talk about technology and privacy centers around the ways companies collect our data – advertising cookies, Facebook "Like" buttons, location-collecting apps, everything made by Google. That's perfectly understandable. But it's also counterproductive. It's time for us, the uneducated users, to take some responsibility for our ignorance.
So here's a simple challenge: If you don't understand how a certain connected service really operates, don't use it until you do.
I realize this request may seem unreasonable. But why should it? If you don't know how to drive a car, don't get behind the wheel for a cross-country road trip. If you've never fired a gun, don't go moose hunting with a bazooka. (Do people hunt moose with bazookas? God, I hope so.) And if you don't understand that everything you do on Facebook can make its way onto the public Web, don't start posting status updates ragging on your boss. It's really that simple.
In an age when new digital products and services pop up literally every day, we've become trained to jump on board the hot thing without a single thought. New app? Download it. New social network? Join. New gadget? Go buy it. Click. Click. Click. Our promiscuous approach to technology adoption – though totally understandable – makes us junkies, not victims.
This does not absolve companies of responsibility for the privacy-crushing practices that permeate consumer tech. Some companies seem to make privacy protection as complicated as possible, just to dupe us into giving out the goods. Sharing options are turned on by default. Activity tracking takes place without our knowledge. Our account settings are explained in gibberish, splintered into 19 different options that could bewilder even the savviest among us. Or they don't give us privacy settings at all, as is the case with the vast majority of websites. Terms of service and privacy policies, where all of this information is supposedly laid out for us, are almost universally unintelligible, and always too damn long. All of these failures belong to those who created these products – whether they decided to trick us on purpose, underestimated how easily we get confused, or just suck at privacy, too. 
For example, a friend recently purchased a new health band – one of those watch-like gizmos that tracks how much you exercise. He asked me, as the local expert on such matters, whether it was a better privacy practice to sign up for an account directly with the company, or to use Facebook Connect to create a profile.  No question: Don't use Facebook – that's guaranteed privacy suicide. But after perusing the health band company's privacy policy, which vaguely "explained" that collected user data could essentially be shared with basically any "third party," I was at a loss. "Just don't use it at all," I said. "That's really the only good option."
That users must sacrifice privacy to use the product is the company's failure. But it is our failure if we do so anyway – even if we don't realize what's happening. We have access to more information than at any time in history. Ignorance is not an option.
Instead, as my third-grade D.A.R.E. officer liked to say, arm yourself with knowledge that lets you make better choices: Download an anti-tracking tool for your browser. Don't download apps willy-nilly. Turn off location services on your phone. Don't share revealing details on social networks. Never check in anywhere. Don't connect via Facebook and Twitter. Read blog posts about specific products and privacy. Use Tor when browsing, and DuckDuckGo for search. And, as painful as it may be, try your best to read through the legally binding terms of service and privacy policies – even if they make you want to tear out your eyeballs – then sign up, if you so choose.
Of course, this modus operandi has two glaring problems: You can't control what your friends post about you. And no matter how informed you are, no matter how careful, your personal information will probably still get collected – it usually starts as soon as you sign up for Internet or wireless access. The goal here is to limit the bleeding as much as possible. 
Even as I write this, I realize the futility of this advice – I don't even follow much of it myself. But as we move deeper into this age of constant full disclosure, we could at least think about trying.
