
Rewriting The Script: How Black Millennial Streaming Executives Are Centering Ownership In Film
Devin White of Blacktivity and Chanel Nicole Scott of In The Black Network
The entertainment industry has long profited from Black creativity without investing in Black creators. But a new class of millennial media executives is working to disrupt that dynamic—this time, on their own terms.
Devin White, co-founder of the streaming and creative ecosystem Blacktivity, and Chanel Nicole Scott, CMO of In The Black Network, are leading the charge. From content licensing and compensation to strategic partnerships and platform design, both are reshaping the business of storytelling with equity at the center.
'Blacktivity was born from frustration—but also imagination,' said White. 'We're building a space that gives more than it takes. That means curating authentic stories, compensating our creatives fairly, and fostering a community where Black and BIPOC artists can grow—without having to pay to be in the room.'
Founded in 2024, Blacktivity offers an intentional alternative to traditional streaming models. It operates as a nonprofit and gives creators a 50/50 revenue split for non-exclusive content and 60/40 for exclusive deals—figures that sharply contrast with industry norms, where creators often relinquish ownership for limited exposure. Payouts are distributed quarterly, and agreements are capped at three years to protect creator autonomy.
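By way of illustration, here is a minimal sketch of how those splits might translate into a quarterly payout. The split ratios, quarterly cadence, and three-year cap come from the article; everything else, including the assumption that the creator takes the larger side of the 60/40 exclusive split, is hypothetical, since the article doesn't specify:

```python
# Illustrative sketch only. The split ratios, quarterly payouts, and
# three-year cap are stated in the article; the function names and the
# assumption that creators take the 60% side of exclusive deals are not.

CREATOR_SHARE = {
    "non_exclusive": 0.50,  # 50/50 split
    "exclusive": 0.60,      # 60/40 split, creator side assumed
}

MAX_TERM_YEARS = 3  # agreements are capped at three years

def quarterly_payout(revenue: float, deal_type: str) -> float:
    """Creator's share of one quarter's gross revenue for a given deal type."""
    return revenue * CREATOR_SHARE[deal_type]

# Example: $10,000 of quarterly revenue on an exclusive deal
print(quarterly_payout(10_000, "exclusive"))  # 6000.0
```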
But Blacktivity is more than just a platform. It's also an incubator, offering monthly masterclasses, networking events, and soon, micro-grants. 'We don't just distribute films—we build a community,' White emphasized. 'We know our creators. We talk to them. And we're investing our revenue right back into them.'
Similarly mission-driven, In The Black Network is a hybrid AVOD/TVOD (ad-supported and transactional video-on-demand) platform co-founded by entertainment veteran James DuBose and co-owned by Scott, who also serves as the brand's Chief Marketing Officer. Since its 2023 launch, the platform has secured distribution on Apple TV, Roku, Samsung, and most recently, Vizio.
'Our model is designed to empower content creators—especially Black creators—to maintain ownership,' Scott explained. 'We're not in the business of telling creators what their story should be. We want them to come with a finished product, and we help them amplify it.'
The platform's approach requires creators to have skin in the game: they fund their productions independently, then enter licensing agreements with In The Black. Creators earn a revenue split tied to performance, shaped by viewership and pricing strategy. 'It keeps everyone accountable,' Scott said. 'If the work is strong and the audience shows up, you're compensated fairly.'
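The article doesn't spell out the formula, but a performance-based split could look something like the following sketch, where the tier thresholds and rates are entirely hypothetical:

```python
# Hypothetical performance tiers. In The Black Network's actual
# thresholds and rates are not disclosed in the article.
TIERS = [
    (100_000, 0.60),  # 100k+ views: 60% to the creator
    (10_000, 0.50),   # 10k+ views: 50%
    (0, 0.40),        # below that: 40%
]

def creator_split(views: int) -> float:
    """Return the creator's hypothetical revenue share for a view count."""
    for threshold, rate in TIERS:
        if views >= threshold:
            return rate
    return TIERS[-1][1]

print(creator_split(25_000))  # 0.5
```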
That emphasis on creator control marks a major shift from legacy practices—especially for Black talent. According to McKinsey, Black professionals hold less than 6% of executive positions in film and TV, despite making up a significant share of audience viewership. Meanwhile, Nielsen reports that Black audiences consume 35% more streaming content than the general population, often shaping pop culture narratives without seeing proportional returns.
For White and Scott, that gap between cultural influence and economic power is precisely what they're working to close.
'We know that Black viewership drives the culture—and the market,' said White. 'So why shouldn't we benefit from the platforms we help build?'
Both platforms also prioritize storytelling that pushes beyond tropes. 'We're not here to replicate the conflict-heavy, drama-driven formulas,' Scott added. 'We're building a space where our culture is reflected with nuance—where normal Black life, joy, and complexity are just as valuable.'
Still, disrupting an industry comes with challenges. Scott noted that In The Black remains investor-backed and is still on the path to profitability. 'It's a long game. But we've been intentional about bringing in people who believe in our mission—not just the margins.'
For White, Blacktivity's nonprofit structure has taken some convincing with potential funders. 'We've had people question why we're offering such high pay splits,' he said. 'But this is about long-term sustainability for our community. We're not here to extract—we're here to empower.'
As both executives prepare for major milestones—including Blacktivity's June 14th launch event at Baltimore's CFG Arena and the first annual Blacktivity International Film Summit in Barbados this fall—they're clear on the legacy they hope to build.
'Equity, to me, is ownership,' Scott said. 'It's having the power to tell your story, your way—and reap the benefits of it.'
White agrees: 'This isn't just about streaming. It's about shifting the power back to the people who create culture. And we're just getting started.'