Interoperability across the creator economy, Sample generative AI contract clauses, Louisiana and California pass new social media laws, and more!


This is Creator Economy Law, a newsletter dedicated to exploring and analyzing the legal issues surrounding the creator economy, creators, and internet platforms. If you enjoy what you’re reading, share it with friends, invite them to subscribe using the button above, and spread the word using #CreatorEconomyLaw.


☕️ Support this Newsletter ☕️

Consider buying a recommended book from my Amazon store! I have some suggestions that are focused on the creator economy, so check them out. If you order one (or more), I’ll receive a commission from Amazon.


🖖 We might not be alone in this universe, according to a whistleblower alleging that the U.S. government has been holding onto ‘intact and partially intact’ unidentified anomalous phenomena… the fancy name for UFOs. 🛸

Do you think any other worlds have a well-established creator economy?? Asking for a friend. ✨


Here’s what’s been happening in the world of creator economy law.


What You Should Know

The Future of Interoperability in Social

Are high-priced APIs for social networks bad for the future of the internet? An article on Mashable explores the latest revenue-seeking trend from tech companies, including those that survive off user-generated content. The trend arguably started when Twitter went private and, under Elon Musk’s leadership, began targeting its API and third-party developers and partners as a new revenue stream. Now, it appears that interoperability across the creator economy is entering a new era.

Interoperability is a characteristic of a product or system to work with other products or systems. –Wikipedia

Android Authority reached out to Reddit, Inc. for a statement on the upcoming user protest over its API changes and the impact on third-party app developers. The company gave Android Authority five short statements, each addressing specific concerns of the general community. It boils down to pressure on the company to control costs while recognizing there are situations in which third-party developer access may warrant an exception. As The Verge reports, one such exception from Reddit will focus on accessibility-focused apps.

A lack of interoperability across social platforms impacts creators. We’re already experiencing the loss of cross-posting to Twitter from some applications due to the API restrictions. If the trend continues and expands, the costs of content distribution within the creator economy, and operations as a whole, are going to skyrocket and potentially shut out some creators.

📖 Read:

Adobe’s Firefly Exits Non-Commercial Beta

Fast Company’s Chris Stokel-Walker reports on Adobe’s newly unveiled enterprise expansion of its Firefly generative AI tools, which includes “full indemnification for the content created through these features,” according to Claude Alexandre, VP of digital media at Adobe.

Personally, I’m interested to see the full terms and whether there’s any associated liability cap, limitations on the indemnification, or other requirements for how the output is used. I’m assuming it’s a mix of Firefly software limitations and protections that prevent substantially similar outputs, whether generated unintentionally or at the direction of a user (arguably misusing the service, or manipulating it with intent to infringe).

But I’m also very curious how the fact that Adobe built Firefly from licensed content (debates aside) plays into the indemnification obligation. For example, if the output is substantially similar or identical to an asset from Adobe Stock that was used to train the Firefly models, I’d assume the photographer gave up the right to sue Adobe under the Adobe Stock submission terms.

Ultimately, I think the risk is rather low for Adobe here, based on what the company has said publicly about how the training materials were gathered and used to build the models that power the Firefly product and service lines.

📖 Read: Adobe is so confident its Firefly generative AI won’t breach copyright that it’ll cover your legal bills

I’d also note that there’s a paragraph in the article about indemnification that doesn’t really make sense. It says:

“Alexandre declined to answer whether the indemnity means that anyone who believes their copyright has been infringed by Firefly should sue Adobe rather than the person who used Firefly, noting instead, ‘It’s a guarantee against litigation, the consequences of litigation.’”

At its core, indemnification is triggered by a third party suing, or making a claim against, the end user. What Adobe must do turns on whether the clause says “indemnify,” “defend,” and/or “hold harmless.” Neither party can control whether a lawsuit is brought against the end user, Adobe, or both.

Meta’s Oversight Board 2022 Annual Report

Did you catch it? Meta’s Oversight Board published its 2022 Annual Report, and there’s a lot!

The Oversight Board was created to help Facebook and Instagram answer some of the most difficult questions around freedom of expression online: what to take down, what to leave up, and why. The report covers 91 recommendations made to Meta and 12 published decisions during 2022.

Here are some of the highlights from the Executive Summary:

👤 The Board received nearly 1.3 million cases from users around the world (around 25% YoY increase)

👤 The Board issued its first policy advisory opinions on: sharing private information and Meta’s cross-check program.

👤 The Board overturned Meta’s content moderation decisions in 9 of 12 cases and upheld them in 3.

👤 The Board caused Meta to reverse its original decision in 32 cases considered for selection, where its original decision on a post was incorrect.

👤 The Board expanded its scope to include the ability to add warning screens to eligible content.

The Board’s recommendations resulted in Meta:

👤 Telling people which specific policy their content violated when removed.

👤 Systematically measuring the transparency of its enforcement messaging to users.

👤 Enhancing how it identifies breast cancer context in content on Instagram, which contributed to thousands of additional posts being sent for human review that would previously have been automatically removed.

👤 Creating a new section in the Community Standards on misinformation.

👤 Completing global rollout of new messaging telling users whether human or automated review led to their content being removed.

👤 Introducing a new Crisis Policy Protocol.

The report is also forward-looking into what the Board aims to accomplish in 2023:

👤 Publish first summary decisions on cases where Meta reversed its original decision on a piece of content.

👤 Issue its first expedited decisions, publishing a decision on a case within days.

👤 Reach updated full Board membership goal for maximum efficiency.

👤 Deepen engagement around the Board’s seven strategic priorities.

👤 Pursue long-term plans for scope expansion.

👤 Monitor how Meta is implementing the Board’s recommendations and push the company to provide evidence of implementation and impact.

Authors Guild Releases Model Contract Clauses for Generative AI

The Authors Guild has released four new model contract clauses that address the use of generative AI in the publishing sector.

The four new clauses cover language that addresses:

🤖 The use of generative AI by authors

🤖 Prior consent for generative AI audiobooks

🤖 Prior consent for generative AI translations

🤖 Prior consent for generative AI book cover design

The translation and book cover clauses do allow for the assistive use of generative AI tools in connection with a human translator or designer.

These four new clauses now join the previous model clause the AG released that prohibited the use of an author’s work for training generative AI models.

Why does this matter? The AG explains:

“The purpose of these demands is to prevent the use of AI to replace human creators. The Authors Guild strongly believes that human writing, narration, and translation are vastly superior to their AI mimics. Moreover, as an ethical matter, the Authors Guild opposes relying on these tools to replace human creators, in part because current AI content generators have largely been trained on pre-existing works without consent.”

Looking for the language? Check out the following, which are helpful sample clauses to review if you’re looking for similar language to incorporate into your own contracts:

Model Trade Book Contract

  • Section 2(f) – No Generative AI Training Use
  • Section 2(g) – Audio Book Clause (for use with audiobook grants)
  • Section 2(h) – Translation Clause (for use with grants of translation rights)

Translator Book Contract

  • Section 1(d) – No Generative AI Training Use
  • Section 1(e) – Audio Book Clause (for use with audiobook grants)
  • Section 2(c) – Clause relating to Translators’ use of AI

I share some practical drafting considerations over on the Creator Economy Law blog, so check it out!


Don't Miss

YouTube rolled back its rules against misinformation, as Axios first reported. It’s a move that comes as many social media platforms appear to be loosening their controls ahead of the 2024 elections. The Verge also reports on the recent changes from Twitter, Meta, and YouTube.

Instagram, YouTube, TikTok, and Twitter are the targets of an EU crypto advertising complaint.

Twitter

This week marked Linda Yaccarino’s takeover as CEO of Twitter. She shared in a tweet, “It happened — first day in the books! Stay tuned…”

I’m goin’ down… Twitter’s head of Trust & Safety, Ella Irwin, has resigned. The move comes following an issue with how the platform handled transgender content.

Twitter has left the EU’s voluntary Code of Practice against misinformation, as announced by European Commissioner for Internal Market Thierry Breton on Twitter. France has also joined in on issuing a warning to Twitter for its apparent lack of compliance with EU law come August.

The Wall Street Journal reports that EU regulators are planning a “stress test” on Twitter to determine its ability to comply with the upcoming Digital Services Act. Meanwhile, France’s digital minister Jean-Noël Barrot said, “Twitter, if it repeatedly doesn’t follow our rules, will be banned from the EU.” And now, in the U.S., a group of senators is questioning Twitter’s privacy compliance in light of the company’s oversight by the FTC. Senators Warren, Markey, Wyden, and Hirono have signed the letter to Twitter. Read the letter.

Twitter is facing a lawsuit in New York from 11 former office cleaners claiming the company owes back pay under NY labor laws over the way they were fired.

Twitter is under fire regarding its ability to maintain brand safety, as CNBC explores.

Oh… and Twitter’s U.S. advertising revenue is down 59% YoY and, according to Fidelity, the company is now worth about a third of Elon Musk’s purchase price.

TikTok

TikTok opens up eligibility for Series. Beginning June 6, 2023, creators in select regions who are 18 years or older, have an account that is at least 30 days old with at least 10K followers, have posted more than three public videos in the last 30 days, and have at least 1K authentic video views in the last 30 days are eligible to join Series. Creators with fewer than 10K followers who meet the other requirements can apply by providing a link to premium content they’ve previously sold on other platforms via the Creator Center in the TikTok app.
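Purely as an illustration, the eligibility criteria above read like a simple predicate. The data model and field names below are hypothetical (this is not an official TikTok API); it just shows how the published thresholds combine:

```python
from dataclasses import dataclass

# Hypothetical data model mirroring the criteria described above;
# field names and structure are illustrative only.
@dataclass
class CreatorStats:
    age: int
    account_age_days: int
    followers: int
    public_videos_last_30d: int
    authentic_views_last_30d: int
    has_prior_premium_content_link: bool = False  # link to premium content sold elsewhere

def eligible_for_series(c: CreatorStats) -> bool:
    base = (
        c.age >= 18
        and c.account_age_days >= 30
        and c.public_videos_last_30d > 3       # "more than three" public videos
        and c.authentic_views_last_30d >= 1_000
    )
    # Creators under 10K followers can still apply by providing proof of
    # premium content previously sold on another platform.
    return base and (c.followers >= 10_000 or c.has_prior_premium_content_link)

print(eligible_for_series(CreatorStats(21, 400, 12_000, 5, 4_000)))  # True
print(eligible_for_series(CreatorStats(21, 400, 5_000, 5, 4_000)))   # False
```

As the sketch shows, the follower threshold is the only criterion with an alternative path; everything else is a hard requirement.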

Internal messaging and collaboration app Lark is causing headaches for TikTok following reports of employees sharing user information through the system. According to a report by The New York Times, the system is allegedly accessible in China and to all employees of parent company ByteDance. Information shared ranged from names and addresses to driver’s licenses and CSAM.

The Wall Street Journal spoke with a former ByteDance employee who claims the company opened up access for the Chinese Communist Party (CCP) to the data of Hong Kong protestors in 2018.

TikTok is under fire from some of music’s most powerful players. Its global music boss says the future’s bright – and answers some tough questions in this interview with Music Business Worldwide.

The United States Department of Defense (DoD), General Services Administration (GSA), and National Aeronautics and Space Administration (NASA) have issued an interim rule banning U.S. contractors from having or using TikTok, or any successor application or service developed or provided by ByteDance Limited or an entity it owns.

Despite the uproar, Chinese apps remain highly popular in the U.S., as CNBC reports.

Meta

The Wall Street Journal reports that Instagram is being openly used to connect a sprawling pedophile network. “The Meta unit’s systems for fostering communities have guided users to child-sex content; company says it is improving its internal controls.” The Washington Post published a story exploring a new task force that Meta is starting to fight back. The EU is now asking questions, too, following a tweet from Commissioner Breton.

Perry Bashkoff, Instagram’s Director of Music Partnerships, was part of the recent job cuts at Meta, as he explained in a LinkedIn post.

Judge throws out DC’s privacy lawsuit against Meta — A Superior Court judge ruled that Meta ‘did not materially mislead consumers as to their response to Cambridge Analytica.’

Meta is asking the U.S. District Court for the District of Columbia to block the Federal Trade Commission’s sanctions for the company’s alleged violations of a 2019 settlement with the agency. The Wall Street Journal and Bloomberg offer more details.

Microsoft

An FTC action against Microsoft for alleged violations of the Children’s Online Privacy Protection Act (COPPA) Rule was filed by the Department of Justice. The $20 million proposed settlement will require Microsoft to bolster privacy protections for kids who use its #Xbox gaming system. The order also makes it clear that COPPA covers information like avatars generated from a child’s image, biometric data, and health data collected with other personal information – and reminds businesses that the Rule imposes strict limitations on the retention of data from kids.

Microsoft signed a deal with Nvidia-backed CoreWeave to meet AI computing demand.

Artificial Intelligence

As reported by Natasha Lomas over at TechCrunch, “The European Union is leaning on signatories to its Code of Practice on Online Disinformation to label deepfakes and other AI-generated content.” Read the full article for more details.

Jetpack, owned by Automattic, is integrating generative AI into the WordPress blogging platform through a new tool called Jetpack AI Assistant. There is a free tier and a $10 per month tier. The company has also announced the launch of paid newsletters… which, you know, I have a WordPress.com-run website, so maybe 🤔 TechCrunch has additional coverage.

Chad Rutkowski published an interesting article that explores what the recent SCOTUS decision in Warhol v. Goldsmith might mean for training AI systems.

Van Lindberg also published an article that “analyzes applicable copyright law in relation to the factual foundation developed in part one. Comparing ML to technologies in previous cases, the article argues that the development and use of generative ML models in most cases falls outside the scope of copyright or constitutes fair use.” Check out: Building and Using Generative Models Under US Copyright Law, 18 Rutgers Bus. L.R. No. 1, 2023.

Extinction Level Event. 350 executives across tech and AI have signed on to a one-sentence statement on AI risk. Here is that sentence: “Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”

A new release from the Congressional Research Service focuses on the intersection of #GenerativeAI and #DataPrivacy, exploring (1) the origins of the content, information, and data used to train and build these models, (2) how those materials are shared and used, and (3) policy considerations for Congress. Check it out.

A district court in Texas announced one of what will likely be many generative AI legal requirements to come. 🤖 Here’s why we should be paying attention.

Snapchat launches “My AI Snaps” for paid subscribers to “send it Snaps of what they’re up to and receive a unique generative Snap back that keeps the conversation going,” according to the press release.

How will generative AI affect the creator economy? Christianna Silva explores the question in this article for Mashable.

Other News

Think of the Children. Louisiana joins a handful of states (Utah, Arkansas, and, arguably, California) that have passed laws requiring parental consent for minors to sign up for online services. The state legislature passed a law prohibiting kids under the age of 18 from signing up for online services without parental consent. It now goes to the governor for signature before taking effect.

California Journalism Preservation Act. The California State Assembly passed AB 886 in a vote of 55-6, with 19 not voting. The bill would require social media companies, like Meta, to pay a royalty to news publishers when their content is shared on platforms. Meta, facing a similar issue in Canada, has already planned to remove news content rather than pay the royalties. Axios provides an overview of other countries that have passed similar measures. Meanwhile, Canadian PM Justin Trudeau is calling Meta’s move “bullying tactics”.

More Strikes in Hollywood. The Screen Actors Guild-American Federation of Television and Radio Artists (SAG-AFTRA) has commenced formal contract negotiations with the Alliance of Motion Picture and Television Producers (AMPTP) following a vote this week in which almost 98% of SAG-AFTRA members supported authorizing a strike if a deal isn’t reached by the June 30 deadline. Dana Kempler shares a great post with more details about what this means.

Podcast Unions Unite. Spotify is facing backlash from the Gimlet and Parcast unions following the absorption of both companies into Spotify Studios along with the layoff of 200 employees. Also, check out this article from The Hollywood Reporter that explores M&A and layoff activity within the podcast sector. (h/t Simon Pulman)

Eric Goldman explores the question of social media account ownership in light of the recent issues surrounding Bang Energy drinks.

Adidas has started the sell-off of its remaining inventory of Yeezy brand sneakers, a move the company likely hopes will quickly bring its relationship with controversial rapper Kanye “Ye” West to an end.

SocialMediaToday published an article that explores a recent report by EnTribe that finds “81% of the consumers it surveyed didn’t feel that influencer endorsement had any impact on their purchase activity at all, and can even have a negative impact in some respects.”

Apple’s WWDC 2023 contained a host of new software updates, products, and other features. Of note to the creator economy is the upcoming automatic removal of tracking parameters from links as part of iOS 17.

Struggling to keep up with the changes across social media? HubSpot published its research: “The Future of Social Media: What Marketers Need to Know” which you might find helpful.

Rolling Stone published its Creators Issue, including a list of the top 20 most influential creators. You can also check it out on Apple News+.

The U.S. Supreme Court declined to hear a child sex abuse case against Reddit, which sought to hold the platform liable for allegedly allowing images of minors to be shared by its users. Gizmodo has a full report. As the article notes, it’s another sign that the Court is punting to Congress on any reforms to Section 230 and internet law more broadly.


Learn With Me

Learn about AI and IP

What are the current best practices for counseling clients on artificial intelligence? 🤔

We’ve got you covered! I had such a blast discussing #AI and #IntellectualProperty law for the Practising Law Institute (PLI)’s IP Discussions AI program. The panel included:

  • Joshua L. Simmons of Kirkland & Ellis
  • Mary Rasenberger of The Authors Guild
  • Luis Villa of Tidelift

Note that a subscription to PLI or purchase of the program is required to watch.

Here’s more about the program 🤖 A day does not go by without an announcement of a new technological development relating to artificial intelligence. At the same time, legislators and regulators are only just starting to wrap their arms around the legal consequences of these technological advancements. Moreover, although a number of artificial intelligence-related intellectual property cases already have been litigated, new litigations and those not yet filed are likely to determine the boundaries of the protection for and the permissible use of artificial intelligence.

After completing this program, participants will be able to answer: 

💡 What are the fault lines in protecting, through intellectual property rights, the outputs of artificial intelligence? 

💡 How can we learn from past experience with open-source licensing to determine how to license artificial intelligence-related technologies? 

💡 What are the current best practices for counseling clients on artificial intelligence?

Thanks to Kenneth Min for helping to organize the panel!

Learn about fandoms and IP

How does #FairUse impact fandoms within the #CreatorEconomy? In the latest episode of the Creative Control podcast from Fast Company, the discussion centers on the legal aspects of creators utilizing intellectual property owned by (often) large media companies.

#Copyright applies because the fandom works are often considered derivative of the original works, which would typically require a license.

#Trademark applies because the fandom works often rely on protected brand elements from their favorite universes.

However, there’s often a give-and-take between creators and IP owners. Creators offer a vibrant, passionate community of fans that not only engage with a media property but also directly contribute to the financial success of any activity coming out of the property.

On the other side, media companies recognize the downside of policing every single unauthorized use of their IP, which could agitate their fandom, while still ensuring they aren’t devaluing the potential market for licensing and exploitation of their IP.

Take a listen and let me know what you think! Note: it’s the third in a three-part series on fandom, so check out the first two episodes, too!

Copyright Office Hosting AI Webinar

Save the date! On June 28th, the U.S. Copyright Office is hosting a public webinar on the current guidance for registering works created using generative AI technology. Check out more details.


Music Video of the Week

This week, I’ve been enjoying some Apple Fitness+ workouts with Madonna as the featured artist. They’re great! They also brought back memories of the CD single I had for Ray of Light back in elementary school that I bought at the local Borders bookstore. #memories Enjoy the Ray of Light music video!

Watch on YouTube or Apple Music.


Editor's Notes

Affiliate Links. As an Amazon Associate, I earn from qualifying purchases. I have noted above where links to products on Amazon may earn me a commission if you make a purchase. Thanks for supporting my work!

Not Legal Advice. This newsletter is published solely for educational and entertainment value. Nothing in this newsletter should be considered legal advice. If you need legal assistance or have specific questions, you should consult a licensed attorney in your jurisdiction. I am not your attorney. Do not share any information in the comments you should keep confidential.

Personal Opinions. The opinions and thoughts shared in this newsletter are my own, and not those of my employer or any of the third parties mentioned or linked to in this newsletter. No affiliation or endorsement is implied or otherwise intended with third parties that are referenced or linked.


Enjoying this? Share with someone you think might be interested! If this was forwarded to you, jump over to LinkedIn and subscribe for free.
