Amazon’s Brand Registry – Logos Need Not Apply

You may have heard of Amazon’s “Brand Registry.”  If not, it is a way for a brand owner to register and claim a brand name so that the claimant can better monitor for counterfeit goods, rogue sellers, and other issues on Amazon.

In a trademark application, there is an area for filling out “Correspondent” information.  This is typically where a lawyer’s information goes if a lawyer filed the application.  It’s also where Amazon looks to determine where to send Brand Registry emails.  For example, if my client is ACME Anvils, and my client owns a registered trademark for “ACME Anvils,” then I am likely the “Correspondent” for that registration.  Thus, when ACME Anvils decides to list its anvils on Amazon and also wants to claim the ACME Anvils brand as its own, Amazon will send me, as the Correspondent, an email such as the following:


We write to validate the identity of an individual seeking to enroll brand: “ACME Anvils” in Amazon Brand Registry. You are listed as the contact for the registered trademark for brand: “ACME Anvils”.

We are unable to provide you with the name of the applicant and have directed the applicant to contact you. In order to give the applicant approval to enroll brand: “ACME Anvils” in Amazon’s Brand Registry, provide the verification code listed below to the applicant. If you decline, do not provide the code.

Verification code:  [redacted]

Thank you in advance for your assistance.

Best regards,


Amazon Brand Registry Support

When we receive such emails, we simply forward them to the client, and the client ultimately claims the brand on Amazon.

Sounds easy, right?  It is, but there is a MAJOR caveat:


That’s right – if you only have a logo-version of your mark registered with the USPTO, you cannot claim the brand via Amazon’s Brand Registry.  You must have a registration for the plain-text version of your mark.

(In case the graphic doesn’t show up, the highlighted portion says that to enroll in the Brand Registry, the brand name must be “a live registered trademark, which is a word mark and NOT a stylized, illustration, or design mark.”)

This is a very important reason to seek protection for the plain-text version of your mark in conjunction with, or prior to, seeking protection for your logo.  We often see clients who are adamant about only filing a trademark application for their logo.  This is understandable.  The logo was probably the product of much debate, love, and cash.  People seem more wedded to their logos than to their simple textual brand names.  But we typically advise clients to begin with the plain-text version of the mark, as it can be seen as providing “broader” protection: there are no claims to specific colors or symbols – it’s just the name of the mark regardless of how it’s presented.  However, if you sell goods that you would like to ultimately list on Amazon, at least under the current policy, you should consider registering the plain-text version of your mark.

New take on an old (infringing?) idea

I read about a recently published patent application from Apple that would allow an audio device (presumably an i-Device) to “self-censor” explicit content.  The application is titled “Management, Replacement and Removal of Explicit Lyrics during Audio Playback,” and you can read the application HERE.

Figure 1 from the patent application

Here is the abstract from the application:

Unwanted audio, such as explicit language, may be removed during audio playback. An audio player may identify and remove unwanted audio while playing an audio stream. Unwanted audio may be replaced with alternate audio, such as non-explicit lyrics, a “beep”, or silence. Metadata may be used to describe the location of unwanted audio within an audio stream to enable the removal or replacement of the unwanted audio with alternate audio. An audio player may switch between clean and explicit versions of a recording based on the locations described in the metadata. The metadata, as well as both the clean and explicit versions of the audio data, may be part of a single audio file, or the metadata may be separate from the audio data. Additionally, real-time recognition analysis may be used to identify unwanted audio during audio playback.
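Stripped of the patent-speak, the abstract describes metadata-driven segment substitution: the metadata marks where the unwanted audio sits in the stream, and the player swaps in clean audio (or silence) over just those ranges.  As a rough illustration only – the function, data layout, and names below are my own assumptions, not anything from Apple’s application – the idea might be sketched like this:

```python
# Sketch of metadata-driven "clean playback": the player outputs the
# explicit recording, but over ranges flagged in the metadata it
# substitutes the corresponding samples from a clean version, or mutes
# them.  Purely illustrative; not Apple's actual format or code.

def censor(explicit, clean, ranges, mode="clean"):
    """Return a playback stream with flagged ranges replaced.

    explicit -- list of samples from the explicit recording
    clean    -- list of samples from the clean recording (same length)
    ranges   -- list of (start, end) index pairs marking unwanted audio
    mode     -- "clean" to splice in the clean version, "silence" to mute
    """
    out = list(explicit)
    for start, end in ranges:
        for i in range(start, min(end, len(out))):
            out[i] = clean[i] if mode == "clean" else 0
    return out

if __name__ == "__main__":
    explicit = [1, 2, 3, 4, 5, 6, 7, 8]
    clean = [1, 2, 0, 0, 5, 6, 0, 8]
    metadata = [(2, 4), (6, 7)]  # sample ranges containing explicit lyrics
    print(censor(explicit, clean, metadata))             # splice in clean audio
    print(censor(explicit, clean, metadata, "silence"))  # mute instead
```

The legally interesting point is that nothing here copies or re-distributes the recording – the substitution happens at playback time, which is why the CleanFlicks analysis discussed below does not map onto it neatly.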

That description immediately reminded me of an old copyright infringement case that was brought on behalf of Hollywood movie directors who objected to a service called “CleanFlicks.”  CleanFlicks would purchase DVDs of movies, edit out the “objectionable” content, and then sell or rent the “clean” versions.  Ultimately, this practice was considered copyright infringement.  Here’s a brief analysis of the case from the Copyright Office.  From the analysis:

The court held that public distribution of edited versions of plaintiffs’ films for the purpose of eliminating objectionable content did not constitute fair use. It ruled that the edited film versions were not transformative because they added nothing new to the originals. It further held that the “amount and substantiality” factor weighed against a finding of fair use because the movies were copied in their entirety for non-transformative use. Regarding the fourth factor, plaintiffs claimed that there was no adverse effect on the market for the films because they maintained a one-to-one ratio between original and edited films, and that but for their editing, the defendants would not have sold those particular original copies. The court, however, stated that this argument ignored the defendants’ “right to control the content of the copyrighted work,” and further remarked that “[w]hether these films should be edited in a manner that would make them acceptable to more of the public … is a question of what audience the copyright owner wants to reach.” The court also found that editing the versions as a form of comment or criticism was a public policy argument that was not appropriately raised in the copyright context.

But Apple’s technology might not be making a new copy of a song to then censor; rather, it’s merely bleeping or silencing out a portion, so there is a decent argument that this will not be a copyright issue.  “Moral rights,” which can protect the integrity of (or cursing in) a work, are generally not recognized in the U.S., so the mere act of censoring a work isn’t necessarily copyright infringement.  But Apple’s technology is similar to the facts of a 1970s case in which ABC was sued by the people behind the Monty Python program based on ABC’s edits to Monty Python re-broadcasts.  There, the court found that ABC might actually be violating the Lanham Act, which forms the basis of federal trademark and unfair competition law.  By editing the episodes, ABC misrepresented the source of the episodes.  That is, by screwing around with the way that Monty Python’s writers intended the episodes to look, ABC damaged Monty Python because viewers might think that the edited, “less good,” episodes originated with Monty Python.  This is similar to a line of cases in which Rolex has successfully sued after-market jewelers who add “bling” to Rolex watches.  Rolex argues that it would never add a dial full of diamonds or other tacky embellishments, and when jewelers do so, they damage the Rolex brand because consumers might think that Rolex was the source of the tackiness.

In sum, Apple may not even bring this technology to market, and even if it does, I would imagine that for artists to have their music on iTunes (the patent anticipates use in a streaming environment), they will have to agree to allow the use of the technology.  But at the end of the day, this may not be legally actionable anyway.  Artists typically create a “clean” mix of a song.  I wonder if Tipper Gore would have approved?

Apple’s full application may be downloaded HERE.

(Bonus thoughts:  the idea of piecing a song together from multiple audio streams located at different sources is interesting.  The patent application doesn’t just address adding a “beep” in place of a dirty word.  The “clean” music can be seamlessly spliced into the song on-the-fly.  As we’ve recently seen with Kanye tinkering around with edits and mixes of songs from The Life of Pablo after its release, this technology may have some positive creative merit – i.e. custom MP3/audio files depending on what time of day the audio is streamed or downloaded.)

Lock Up Your Files and Throw Away the Fourth Amendment?

Florida IP Trends is taking another IP-detour to bring you a new case from the 9th Circuit regarding the contours of Fourth Amendment protection with regard to border searches.  This topic dovetails nicely with the other privacy-related topics I addressed in my Mobile Devices and Privacy presentation last year, and this new case will definitely be included in my update to that presentation.

Beginning when we are very young, we are conditioned to believe that seeking privacy means we’re doing something wrong.  Whether it’s locking the door to our bedrooms as children, having dark tinted windows on our cars, or telling a police officer that they have no right to search your personal belongings at will, seeking privacy often carries that unfortunate stigma.  Another example is putting a password on a computer file.  Fortunately, a new case out of the 9th Circuit explains why having a locked file or an encrypted hard drive does not automatically trigger “reasonable suspicion” such that your entire laptop can be forensically examined.

The Fourth Amendment guarantees the “right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures.”  Even so, border searches, such as when law enforcement searches a car for narcotics when re-entering the United States from Mexico, are considered an exception to the general requirement that a warrant be obtained prior to commencing a search or seizure.  However, as the case below recognizes and reaffirms, even a border search must be “reasonable.”  Just because a search is being conducted at the border, that is not cause for an “anything goes” approach.  (In reality, despite the search being triggered at the border, the actual search could take place hundreds of miles away from the border in a forensics lab and still be considered a “border search.”)

The case is United States v. Cotterman, Case No. 09-10139 (9th Cir. March 8, 2013) (en banc).  The quick facts are that Mr. and Mrs. Cotterman were returning home from a vacation in Mexico, and, at an Arizona Port of Entry, a border agent performed a primary inspection of the vehicle, which included running Mr. Cotterman’s name through a database of known or suspected criminals.  The database query revealed that Mr. Cotterman had some prior involvement with child pornography.  The border agents contacted the people associated with Mr. Cotterman’s database record, and that led to additional information about why Mr. Cotterman’s name triggered the alert.  Specifically, the alert was part of a child sex tourism operation which tracked registered sex offenders who travel frequently outside the U.S.

During the initial search, the agents found various media equipment, including a laptop, and they began to examine the devices.  Under the case law interpreting the border search exception to the Fourth Amendment, a suspicionless, cursory scan of one’s personal items is allowed.  There are already cases holding that a “quick look” at a laptop is a Constitutional and unintrusive search.  See United States v. Arnold, 533 F.3d 1003, 1009 (9th Cir. 2008) (holding border search reasonable where border patrol agents had the traveler boot up a laptop so that the agents could perform a cursory review).  So, a basic scan of Mr. Cotterman’s laptop would have been reasonable, and thus legal.

However, after encountering password-protected files, the agents decided to seize the Cottermans’ laptops and digital camera.  Those devices were transported 170 miles away to the ICE office in Tucson, Arizona.  The more thorough forensic examination revealed many images of child pornography, including images showing Mr. Cotterman himself abusing a child.  He was indicted by a grand jury; however, he moved to suppress the evidence found on his laptop on the premise that the extended forensic search of his laptop required more suspicion of wrongdoing than the initial border patrol agents could have reasonably found during that initial search.  His motion to suppress was granted.  The government appealed that order, and the 9th Circuit reversed the lower court, finding that reasonable suspicion is not required for the type of extended search that was performed in Tucson.  But the case didn’t end there, because the 9th Circuit agreed to rehear the case “en banc,” meaning that the entire panel of judges would decide the case, not just a 3-judge panel as was the case the first time around.  It is the en banc opinion that we’re addressing with this blog post.

At this point, it is natural to lose any amount of sympathy you may have had for Mr. Cotterman.  If it turned out that he was engaged in despicable behavior – as the very first database alert suggested – then why should we be concerned with how that behavior was uncovered?  There are several two-word answers to that:  Due Process.  Reasonable Suspicion.  Presumed Innocent.  Fourth Amendment.

I’ll start with the good news – the 9th Circuit ultimately concluded that reasonable suspicion was present, so the extended forensic examination of the laptop was legal, and Mr. Cotterman hopefully will not be able to hurt any more children.  So, why is this case important then?  Justice served.  Isn’t it enough that Cotterman is a registered sex offender who had password protected files?  No.  And that is the very important line in the sand that the majority of the en banc panel draws.

In prosecuting the case, the government attorneys argued that the existence of password-protected files was an important factor in determining whether or not to perform a heightened search of the laptop.  Fortunately, the 9th Circuit did not permit that factor to hold much independent weight.  Rather, it was the combination of factors – Cotterman’s status as a sex offender, multiple documented trips to and from a known sex tourism country, multiple cameras and camera equipment, and, only in light of those other factors, the existence of password-protected files – that supported reasonable suspicion.  That’s the importance of this ruling – the mere existence of password-protected files is not enough for law enforcement or the government to seize and thoroughly search your “papers.”  The 9th Circuit majority understood that the “papers” written into the Fourth Amendment are today’s laptops, iPads, and other digital devices.  The Founding Fathers wanted the citizenry to be free from unreasonable searches and seizures of their papers, and that logic extends to our virtual papers.  Securing one’s papers does not equate to reasonable suspicion of criminal activity.

The majority opinion does a wonderful job of cautioning the government about the unique nature of digital searches. The Court stated that “legitimate concerns about child pornography do not justify unfettered crime-fighting searches or an unregulated assault on citizens’ private information.  Reasonable suspicion is a modest, workable standard that is already applied in the extended border search, Terry stop, and other contexts. Its application to the forensic examination here will not impede law enforcement’s ability to monitor and secure our borders or to conduct appropriate searches of electronic devices.”  Indeed, it worked in this very case.  Despite the government not thinking that it needed reasonable suspicion, ultimately, reasonable suspicion was found, with the Court still taking the time to explain what reasonable suspicion is not.

The Court explained that “reasonableness” is a dynamic concept that changes depending on the media.  For example, searching a suitcase is much different than searching a laptop, so what is “reasonably suspicious” when searching one, is not necessarily “reasonably suspicious” when searching the other.  Again, it is the Court’s determination to explain and safeguard digital privacy that is the hallmark of this case.  The Court did not have to go into such detailed analysis, but, thankfully, it did.

With the TSA allowing basic pocketknives and other items back on board airplanes, and with cases like this one guarding against further intrusion of the government into our private lives, maybe we will remain the home of the free.

I’ll close this post by just providing and emphasizing some of the excellent language from the case (internal citations and quotations removed):

We rest our analysis on the reasonableness of this search, paying particular heed to the nature of the electronic devices and the attendant expectation of privacy.

Notwithstanding a traveler’s diminished expectation of privacy at the border, the search is still measured against the Fourth Amendment’s reasonableness requirement, which considers the nature and scope of the search.

The amount of private information carried by international travelers was traditionally circumscribed by the size of the traveler’s luggage or automobile. That is no longer the case. Electronic devices are capable of storing warehouses full of information. The average 400-gigabyte laptop hard drive can store over 200 million pages—the equivalent of five floors of a typical academic library…Even a car full of packed suitcases with sensitive documents cannot hold a candle to the sheer, and ever-increasing, capacity of digital storage.

The nature of the contents of electronic devices differs from that of luggage as well. Laptop computers, iPads and the like are simultaneously offices and personal diaries. They contain the most intimate details of our lives: financial records, confidential business documents, medical records and private emails. This type of material implicates the Fourth Amendment’s specific guarantee of the people’s right to be secure in their “papers.” U.S. Const. amend. IV. The express listing of papers reflects the Founders’ deep concern with safeguarding the privacy of thoughts and ideas—what we might call freedom of conscience—from invasion by the government.

Electronic devices often retain sensitive and confidential information far beyond the perceived point of erasure, notably in the form of browsing histories and records of deleted files. This quality makes it impractical, if not impossible, for individuals to make meaningful decisions regarding what digital content to expose to the scrutiny that accompanies international travel. A person’s digital life ought not be hijacked simply by crossing a border. When packing traditional luggage, one is accustomed to deciding what papers to take and what to leave behind. When carrying a laptop, tablet or other device, however, removing files unnecessary to an impending trip is an impractical solution given the volume and often intermingled nature of the files. It is also a time-consuming task that may not even effectively erase the files.

The present case illustrates this unique aspect of electronic data. Agents found incriminating files in the unallocated space of Cotterman’s laptop, the space where the computer stores files that the user ostensibly deleted and maintains other “deleted” files retrieved from web sites the user has visited. Notwithstanding the attempted erasure of material or the transient nature of a visit to a web site, computer forensic examination was able to restore the files.  It is as if a search of a person’s suitcase could reveal not only what the bag contained on the current trip, but everything it had ever carried.

With the ubiquity of cloud computing, the government’s reach into private data becomes even more problematic. In the “cloud,” a user’s data, including the same kind of highly sensitive data one would have in “papers” at home, is held on remote servers rather than on the device itself. The digital device is a conduit to retrieving information from the cloud, akin to the key to a safe deposit box. Notably, although the virtual “safe deposit box” does not itself cross the border, it may appear as a seamless part of the digital device when presented at the border. With access to the cloud through forensic examination, a traveler’s cache is just a click away from the government.

The point is technology matters. The Department of Homeland Security has acknowledged as much in the context of international travelers:
“Where someone may not feel that the inspection of a briefcase would raise significant privacy concerns because the volume of information to be searched is not great, that same person may feel that a search of their laptop increases the possibility of privacy risks due to the vast amount of information potentially available on electronic devices.” DHS, Privacy Impact Assessment for the Border Searches of Electronic Devices 2 (Aug. 25, 2009), available at

The relevant inquiry, as always, is one of reasonableness. But that reasonableness determination must account for differences in property.

Freedom of Speech Online: Are We Entering an Internet Governance Cold War?

My co-associate at Walters Law Group, Kimberly Harchuck, recently presented an excellent CLE via the West LegalEdcenter.  She included a large amount of valuable information concerning recent domestic and foreign attempts to regulate not only speech on the Internet, but the very function of the Internet itself.

She has graciously made the slides available for Florida IP Trends readers.  Enjoy!

KAH Webcast Presentation – 01.23.13

The full CLE, including the audio file, will be available via the West LegalEdcenter next week.

Mobile Devices and Privacy – Presentation

I just realized that I didn’t post the slides from a presentation I gave back in July.  It dealt with regulations, laws, and technical issues concerning mobile devices and First Amendment rights.  I cited several recent cases concerning the right to film the police, as well as some cases concerning compelled password disclosure (i.e. forcing citizens to reveal encrypted contents).

Usual disclaimer applies: by themselves, the slides don’t give the full effect of the presentation, but you might find them interesting.

Mobile Device Privacy-Summer 2012-public