Eminem's Publishing Company Files Copyright Lawsuit Against Meta
Eminem's publishing company Eight Mile Style has filed a copyright infringement lawsuit against Meta. The suit, filed in the U.S. District Court for the Eastern District of Michigan, Southern Division, and reviewed by Rolling Stone, alleges that the company, which owns Facebook, Instagram, and WhatsApp, has distributed the rapper's music across its platforms without proper licensing permissions.
'Despite their not being licensed, the recordings of the Eight Mile Compositions have been reproduced and synchronized with visual content on Meta's platforms across millions of videos, which have been viewed billions of times,' the complaint claims. 'Meta's years-long and ongoing infringement of the Eight Mile Compositions is another case of a trillion (with a 'T') dollar company exploiting the creative efforts of musical artists for the obscene monetary benefit of its executives and shareholders without a license and without regard to the rights of the owners of the intellectual property.'
Eight Mile Style is the owner of 243 musical compositions, including 'Lose Yourself,' 'The Real Slim Shady,' 'Forgot About Dre,' and other notable releases from Eminem. The suit acknowledges that Meta 'has removed several of the Eight Mile Compositions from its Music Libraries in the preceding months,' including 'Lose Yourself,' but states that a karaoke version, a piano instrumental, and one regular cover version by a different artist are still available. 'This in addition to other prominent Eminem works which remain available on Meta's services,' the company's lawyers claim, citing 'Till I Collapse.'
The suit alleges that Meta's 'rampant infringement' extends beyond allowing users to upload copyrighted audio to its platforms. 'This case involves Meta's knowing infringement of the Eight Mile Compositions by first reproducing and storing them in Meta's online Music Libraries, and then distributing them for users to select and incorporate into their own photos and videos made available for public streaming on the users' WhatsApp, Facebook and Instagram accounts,' the complaint claims.
The publishing company is seeking monetary damages 'including actual damages, damages for the diminished value of the copyrights by Defendants' theft of them, lost profits, and Defendants' profits attributable to the infringement.' Alternatively, the company is seeking 'maximum statutory damages for willful copyright infringement for each of Eight Mile Style's works,' which would amount to '$150,000 per work, times 243 works, times 3 platforms,' or $109,350,000. Eight Mile Style has also requested a permanent injunction to halt ongoing infringement.
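For reference, the maximum statutory figure cited in the filing follows directly from the numbers quoted above (a simple arithmetic check, not a legal conclusion):

\[ \$150{,}000 \times 243\ \text{works} \times 3\ \text{platforms} = \$109{,}350{,}000 \]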
A rep for Meta did not immediately return Rolling Stone's request for comment.
The filing comes less than a year after Eight Mile Style lost its five-year copyright infringement lawsuit against Spotify due to a legal loophole. In September, a Tennessee judge ruled that while Spotify did not have the proper streaming license, as the publisher claimed, any penalties would have fallen on Kobalt Music Group, a royalty collection agency. In that suit, Eight Mile Style sought nearly $40 million, claiming it had not been paid for billions of Spotify streams.
Related Articles


Time Magazine
AI Can't Replace Education—Unless We Let It
As commencement ceremonies celebrate the promise of a new generation of graduates, one question looms: will AI make their education pointless? Many CEOs think so. They describe a future where AI will replace engineers, doctors, and teachers. Meta CEO Mark Zuckerberg recently predicted AI will replace mid-level engineers who write the company's computer code. NVIDIA's Jensen Huang has even declared coding itself obsolete. While Bill Gates admits the breakneck pace of AI development is 'profound and even a little bit scary,' he celebrates how it could make elite knowledge universally accessible. He, too, foresees a world where AI replaces coders, doctors, and teachers, offering free high-quality medical advice and tutoring.

Despite the hype, AI cannot 'think' for itself or act without humans—for now. Indeed, whether AI enhances learning or undermines understanding hinges on a crucial decision: Will we allow AI to just predict patterns? Or will we require it to explain, justify, and stay grounded in the laws of our world? AI needs human judgment, not just to supervise its output but also to embed scientific guardrails that give it direction, grounding, and interpretability.

Physicist Alan Sokal recently compared AI chatbots to a moderately good student taking an oral exam. 'When they know the answer, they'll tell it to you, and when they don't know the answer they're really good at bullsh*tting,' he said at an event at the University of Pennsylvania. So, unless a user knows a lot about a given subject, according to Sokal, one might not catch a 'bullsh*tting' chatbot. That, to me, perfectly captures AI's so-called 'knowledge.' It mimics understanding by predicting word sequences but lacks the conceptual grounding.

That's why 'creative' AI systems struggle to distinguish real from fake, and debates have emerged about whether large language models truly grasp cultural nuance. When teachers worry that AI tutors may hinder students' critical thinking, or doctors fear algorithmic misdiagnosis, they identify the same flaw: machine learning is brilliant at pattern recognition, but lacks the deep knowledge born of systematic, cumulative human experience and the scientific method.

That is where a growing movement in AI offers a path forward. It focuses on embedding human knowledge directly into how machines learn. PINNs (Physics-Informed Neural Networks) and MINNs (Mechanistically Informed Neural Networks) are examples. The names might sound technical, but the idea is simple: AI gets better when it follows the rules, whether they are laws of physics, biological systems, or social dynamics. That means we still need humans not just to use knowledge, but to create it. AI works best when it learns from us.

I see this in my own work with MINNs. Instead of letting an algorithm guess what works based on past data, we program it to follow established scientific principles. Take a local family lavender farm in Indiana. For this kind of business, blooming time is everything. Harvesting too early or late reduces essential oil potency, hurting quality and profits. A conventional AI may waste time combing through irrelevant patterns. However, a MINN starts with plant biology. It uses equations linking heat, light, frost, and water to blooming to make timely and financially meaningful predictions. But it only works when it knows how the physical, chemical, and biological world works. That knowledge comes from science, which humans develop.
Imagine applying this approach to cancer detection: breast tumors emit heat from increased blood flow and metabolism, and predictive AI could analyze thousands of thermal images to identify tumors based solely on data patterns. However, a MINN, like the one recently developed by researchers at the Rochester Institute of Technology, uses body-surface temperature data and embeds bioheat transfer laws directly into the model. That means, instead of guessing, it understands how heat moves through the body, allowing it to identify what's wrong, what's causing it, why, and precisely where it is by utilizing the physics of heat flow through tissue. In one case, a MINN predicted a tumor's location and size within a few millimeters, grounded entirely in how cancer disrupts the body's heat signature.

The takeaway is simple: humans are still essential. As AI becomes sophisticated, our role is not disappearing. It is shifting. Humans need to 'call bullsh*t' when an algorithm produces something bizarre, biased, or wrong. That isn't just a weakness of AI. It is humans' greatest strength. It means our knowledge also needs to grow so we can steer the technology, keep it in check, ensure it does what we think it does, and help people in the process.

The real threat isn't that AI is getting smarter. It is that we might stop using our intelligence. If we treat AI as an oracle, we risk forgetting how to question, reason, and recognize when something doesn't make sense. Fortunately, the future doesn't have to play out like this. We can build systems that are transparent, interpretable, and grounded in the accumulated human knowledge of science, ethics, and culture. Policymakers can fund research into interpretable AI. Universities can train students who blend domain knowledge with technical skills. Developers can adopt frameworks like MINNs and PINNs that require models to stay true to reality. And all of us—users, voters, citizens—can demand that AI serve science and objective truth, not just correlations.

After more than a decade of teaching university-level statistics and scientific modeling, I now focus on helping students understand how algorithms work 'under the hood' by learning the systems themselves, rather than using them by rote. The goal is to raise literacy across the interconnected languages of math, science, and coding. This approach is necessary today. We don't need more users clicking 'generate' on black-box models. We need people who can understand the AI's logic, its code and math, and catch its 'bullsh*t.'

AI will not make education irrelevant or replace humans. But we might replace ourselves if we forget how to think independently, and why science and deep understanding matter. The choice is not whether to reject or embrace AI. It's whether we'll stay educated and smart enough to guide it.
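The column describes the core of PINNs and MINNs in words: the model is rewarded for fitting measurements and penalized when its predictions break a known governing law. The snippet below is a rough illustrative sketch of that composite-loss idea only, not the author's MINN or the RIT tumor model; the toy law (Newton's cooling, dT/dt = -k(T - T_env)), the data, and all constants are invented for the example.

```python
# Toy "physics-informed" loss: data fit + penalty for violating an assumed law.
# Illustrative only; data, constants, and the governing law are invented.
import numpy as np

T_ENV = 20.0   # assumed ambient temperature
K_TRUE = 0.3   # assumed cooling-rate constant in the governing law

def model(t, params):
    """Candidate solution of the form T(t) = T_env + a * exp(-b * t)."""
    a, b = params
    return T_ENV + a * np.exp(-b * t)

def physics_residual(t, params):
    """How strongly the candidate violates dT/dt = -k * (T - T_env)."""
    T = model(t, params)
    dT_dt = np.gradient(T, t)            # finite-difference time derivative
    return dT_dt + K_TRUE * (T - T_ENV)  # zero everywhere if the law holds

def loss(params, t_data, T_data, t_col, weight=1.0):
    data_term = np.mean((model(t_data, params) - T_data) ** 2)    # fit the measurements
    physics_term = np.mean(physics_residual(t_col, params) ** 2)  # obey the law
    return data_term + weight * physics_term

# Sparse, noisy "measurements" plus dense collocation points where the law is enforced.
rng = np.random.default_rng(0)
t_data = np.array([0.0, 2.0, 5.0, 9.0])
T_data = T_ENV + 60.0 * np.exp(-K_TRUE * t_data) + rng.normal(0.0, 1.0, t_data.size)
t_col = np.linspace(0.0, 10.0, 200)

consistent = (60.0, 0.30)    # candidate that matches the physics
pattern_only = (60.0, 0.05)  # candidate that chases patterns but violates the law
print("physics-consistent loss:", loss(consistent, t_data, T_data, t_col))
print("pattern-only loss:      ", loss(pattern_only, t_data, T_data, t_col))
```

The weight on the physics term is the "guardrail" knob: the larger it is, the less the model is allowed to follow data patterns that contradict the assumed law.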

Business Insider
Who will be Trump's new Silicon Valley bestie?
Mark Zuckerberg, Meta Platforms founder and CEO
Zuckerberg was something of a MAGA stan earlier this year. Meta, his company, dropped $1 million on Trump's inauguration, and Zuck even co-hosted a black-tie soirée that night to honor the second-time president. Now, with Meta in the throes of a federal antitrust lawsuit, Zuckerberg may not be on Trump's good side. But the Meta CEO could be playing the long game here: He snapped up a $23 million, 15,000-square-foot DC mega mansion, establishing more of a presence in the capital. Zuck has also been on a bit of a rebrand journey, from a hoodie-wearing founder to a gold chain-wearing CEO with unapologetic swagger. Part of this transformation has included podcast appearances, like an episode with Trump-endorsing Joe Rogan in which Zuck talked about his "masculine energy" and his proclivity for bowhunting.

Sam Altman, OpenAI cofounder and CEO
Altman has also been circling the throne. First came Stargate: the $100 billion AI infrastructure plan between OpenAI, Oracle, and SoftBank, announced the day after Trump's inauguration. Then, in May, the OpenAI CEO joined Trump on a trip to Saudi Arabia while Altman was working on a massive deal to build one of the world's largest AI data centers in Abu Dhabi. This reportedly rattled Musk enough to tag along at the last minute, according to the Wall Street Journal. OpenAI was ultimately selected for the deal, which Musk allegedly attempted to derail, the Wall Street Journal reported.

Jeff Bezos, Amazon founder and executive chairman, Washington Post owner, and Blue Origin founder
Back in 2015, Bezos wanted to launch Trump into orbit after the then-presidential candidate fired shots at him on what was Twitter, now X, calling the Washington Post, which Bezos owns, a "tax shelter." Bezos responded that he'd use Blue Origin, the space company he founded, to "#sendDonaldtospace." Times have certainly changed. In January, Bezos said he is "very optimistic" about the administration's space agenda. Behind the scenes, he has reportedly given Trump political advice, allegedly as early as the summer of 2024, according to Axios. There was a brief flare-up in April, though, after Amazon reportedly considered listing Trump's tariffs next to products' prices on the site, according to Punchbowl News. White House press secretary Karoline Leavitt called the plan a "hostile and political action." The idea was never implemented, and an Amazon spokesperson insisted it was only ever meant for its low-cost Haul store. If Trump does cancel Musk's SpaceX government contracts as he has threatened to do, Bezos' Blue Origin, a rival to SpaceX, could stand to benefit. Blue Origin already has a $3 billion contract with NASA.

Jensen Huang, Nvidia cofounder and CEO
While Huang was notably missing from Trump's second inauguration in January, he did attend the Middle East trip in May. Nvidia is partnering with Oracle, SoftBank, and G42 on the OpenAI data center plans in the UAE. But Nvidia hasn't gotten off too easy: In April, Trump barred the chip maker from selling the H20, the most advanced chip it was permitted to offer in China, a move that Nvidia says cost it $5.5 billion and reportedly prompted the company to modify the chip for the Chinese market in response to US export controls.

Sundar Pichai, Google CEO
In April, a federal judge ruled that Google holds an illegal monopoly in some advertising technology markets.
This is one of two major legal blows to Google in the past year: Back in August 2024, a federal judge ruled that Google violated antitrust law with its online search. If Google has to sell Chrome, Barclays told clients on Monday, Alphabet stock could fall 25%. This flurry of litigation — and potential divestment of the Chrome business — puts Pichai between a rock and a hard place. While the CEO was spotted with the rest of the technorati at Trump's inauguration, it's hard to say how he might cozy up to Trump, and whether friendly relations would do anything to remedy these rulings.


Hamilton Spectator
Action! Derrick Henry can parlay a 2,000-yard rushing season into a movie cameo with Adam Sandler
BALTIMORE (AP) — 'King Henry' finally has the attention of 'The Waterboy.'

Baltimore Ravens star running back Derrick Henry has an offer from Adam Sandler, his favorite actor, to be cast in a movie if the five-time Pro Bowl selection rushes for 2,000 yards this season. The offer grew out of Henry's appearance on radio personality Dan Patrick's show this week to discuss his $30 million, two-year contract extension. Patrick told Henry he would get him in a Sandler movie if he made NFL history with a second 2,000-yard season. Two days later, Sandler made the offer himself in a video shown to Henry on the practice field.

'That's my dawg,' a wide-smiling Henry said while watching the video.

Sandler, star of 'Happy Gilmore' and the remake of 'The Longest Yard' along with 'The Waterboy,' said he was in a hotel room while filming his greeting for Henry. At one point, Sandler turned the camera to show his bulldog.

'Two thousand yards-plus this year not only gets you in a movie, but we'll have a nice dinner together and talk about Dan Patrick's facial hair and how hard it is for him to grow it,' Sandler joked in a video posted Friday. 'I love ya and keep it up.'

Sandler came up during Patrick's interview with Henry because Patrick was wearing a hoodie for the soon-to-be-released 'Happy Gilmore 2.' Sandler had given Patrick the hoodie. 'Can you do me a favor?' Henry asked Patrick. 'If you ever see him again, tell him I'm a really big fan and would really love to meet him one day.'

Patrick left Sandler a voice message — and Sandler responded. 'Dan you're a real one!' Henry later wrote on social media.

Henry rushed for 2,027 yards with Tennessee in 2020, when he was an All-Pro and the AP NFL Offensive Player of the Year in the fifth of his eight seasons with the Titans. Henry nearly did it again as a 30-year-old in a resurgence with the Ravens last season, when he ran for 1,921 yards. Saquon Barkley of the Super Bowl champion Philadelphia Eagles led the NFL with 2,005 yards.