The age of incredibly powerful 'manager nerds' is upon us, Anthropic cofounder says

Managers need to have "soft skills" like communication alongside harder technical skills. But what if the job becomes more about managing AI agents than people?
Anthropic cofounder Jack Clark says AI agents are ushering in an era of the "nerd-turned-manager."
"I think it's actually going to be the era of the manager nerds now, where I think being able to manage fleets of AI agents and orchestrate them is going to make people incredibly powerful," he said on an episode of the "Conversations with Tyler" podcast last week.
"We're going to see this rise of the nerd-turned-manager who has their people, but their people are actually instances of AI agents doing large amounts of work for them," he added.
Clark said he's already seeing this play out with some startups that have "very small numbers of employees relative to what they used to have because they have lots of coding agents working for them."
He's not the only tech exec to predict AI agents will let teams do more with fewer people.
Meta CEO Mark Zuckerberg said at the Stripe Sessions conference last week that tapping into AI can help entrepreneurs "focus on the core idea" of their business and operate with "very small, talent-dense teams."
"If you were starting whatever you're starting 20 years ago, you would have had to have built up all these different competencies inside your company, and now there are just great platforms to do it," Zuckerberg said.
Y Combinator CEO Garry Tan said in March that he thinks "vibe coding" — or using generative AI tools to quickly develop and experiment in software development — will help smaller startup teams do the work of 50 to 100 engineers.
"People are getting to a million dollars to 10 million dollars a year revenue with under 10 people, and that's really never happened before in early stage venture," Tan said. "You can just talk to the large language models and they will code entire apps."
AI researchers and other experts have warned that there are risks to over-reliance on the technology, especially as a replacement for human labor, including LLM hallucinations and concerns that vibe coding can in some instances make code harder to scale and debug.
Mike Krieger, the cofounder of Instagram and chief people officer at Anthropic, said on a podcast earlier this year that he predicts a software developer's job will change in the next three years to focus more on double-checking code generated by AI rather than writing it themselves.
"How do we evolve from being mostly code writers to mostly delegators to the models and code reviewers?" he said on the " 20VC" podcast.
The job will be about "coming up with the right ideas, doing the right user interaction design, figuring out how to delegate work correctly, and then figuring out how to review things at scale," he added.
A spokesperson for Anthropic previously told BI the company sees itself as a "testbed" for workplaces navigating AI-driven changes to critical roles.
"At Anthropic, we're focused on developing powerful and responsible AI that works with people, not in place of them," the spokesperson said. "As Claude rapidly advances in its coding capabilities for real-world tasks, we're observing developers gradually shifting toward higher-level responsibilities."


Related Articles

OpenAI Launches Improved Version of Latest Reasoning Model. Here's What It's Best For

CNET

OpenAI this week announced an improved version of its o3 generative AI reasoning model, saying it performed better on benchmarks and is designed to spend more time thinking on difficult problems. The new model, called o3-pro, is now available to Pro and Team users in ChatGPT and to API users, and will come to Enterprise and Edu users next week.

In its release notes, OpenAI said the model is similar to o1-pro, which users favored for its math, science and coding skills. It incorporates improvements from the newer o3 model, which can search the web and use more reasoning skills. "Because o3-pro has access to tools, responses typically take longer than o1-pro to complete," the company said. "We recommend using it for challenging questions where reliability matters more than speed, and waiting a few minutes is worth the tradeoff."

Kevin Weil, OpenAI's chief product officer, posted on X that the company is also dropping the price of o3 in the API by 80%, and that rate limits for o3 will be doubled for Plus users. (Disclosure: Ziff Davis, CNET's parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

OpenAI continues to build on its latest round of updates to its large language models, which rolled out in April. CEO Sam Altman has teased the larger release of an "open-weights" model, a kind of middle ground between open- and closed-source models that provides more insight into how the models work. He doubled down on that teasing this week, posting on X that it is expected later in the summer, but not in June. "[O]ur research team did something unexpected and quite amazing and we think it will be very very worth the wait, but needs a bit longer," he said.

Applied Optoelectronics Announces First Volume Shipment of Data Center Transceivers to Recently Engaged Major Hyperscale Customer

Yahoo

SUGAR LAND, Texas, June 11, 2025 (GLOBE NEWSWIRE) -- Applied Optoelectronics, Inc. ("AOI") (Nasdaq: AAOI), a leading provider of fiber-optic access network products for the internet datacenter, cable broadband, telecom and fiber-to-the-home (FTTH) markets, today announced the first volume shipment of high-speed data center transceivers to a recently re-engaged major hyperscale data center customer. This milestone is the first volume shipment of these advanced high-speed data center transceivers to this customer, and the first shipment of significant quantity to this customer in several years.

"Throughout the year, we have been expecting growth in data center transceiver sales, particularly in the second half of the year," commented Dr. Thompson Lin, AOI's Founder, Chairman, and CEO. "This first volume shipment to this customer represents a significant milestone on a journey to what we continue to expect to be significant business opportunities with this newly re-engaged customer. As we execute on our previously announced US-based capacity expansion plan, we continue to expect shipments to this customer and other customers to increase in line with our previous commentary of a second-half ramp."

For more information about AOI's industry-leading line of advanced optical transceivers for AI-focused data centers, please refer to the information on AOI's website.

About Applied Optoelectronics, Inc.

Applied Optoelectronics, Inc. (AOI) is a leading developer and manufacturer of advanced optical products, including components, modules and equipment. AOI's products are the building blocks for broadband fiber access networks around the world, where they are used in the internet datacenter, CATV broadband, telecom and FTTH markets. AOI supplies optical networking lasers, components and equipment to tier-1 customers in all four of these markets. In addition to its corporate headquarters, wafer fab and advanced engineering and production facilities in Sugar Land, TX, AOI has engineering and manufacturing facilities in Taipei, Taiwan and Ningbo, China. For additional information, visit AOI's website.

Forward-Looking Information

This press release contains forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. In some cases, you can identify forward-looking statements by terminology such as "believe," "may," "estimate," "continue," "anticipate," "intend," "should," "could," "would," "target," "seek," "aim," "predicts," "think," "objectives," "optimistic," "new," "goal," "strategy," "potential," "is likely," "will," "expect," "plan," "project," "permit" or by other similar expressions that convey uncertainty of future events or outcomes. These statements include management's beliefs and expectations related to our outlook for the second quarter of 2025 and the remainder of 2025. Such forward-looking statements reflect the views of management at the time such statements are made. These forward-looking statements involve risks and uncertainties, as well as assumptions and current expectations, which could cause the company's actual results to differ materially from those anticipated in such forward-looking statements.

These risks and uncertainties include but are not limited to: reduction in the size or quantity of customer orders; change in demand for the company's products due to industry conditions; changes in manufacturing operations; volatility in manufacturing costs; delays in shipments of products; disruptions in the supply chain; change in the rate of design wins or the rate of customer acceptance of new products; the company's reliance on a small number of customers for a substantial portion of its revenues; potential pricing pressure; a decline in demand for our customers' products or their rate of deployment of their products; general conditions in the internet datacenter, cable television (CATV) broadband, telecom, or fiber-to-the-home (FTTH) markets; changes in the world economy (particularly in the United States and China); changes in the regulation and taxation of international trade, including the imposition of tariffs; changes in currency exchange rates; the negative effects of seasonality; and other risks and uncertainties described more fully in the company's documents filed with or furnished to the Securities and Exchange Commission, including our Annual Report on Form 10-K for the year ended December 31, 2024 and our Quarterly Report on Form 10-Q for the quarter ended March 31, 2025. More information about these and other risks that may impact the company's business is set forth in the "Risk Factors" section of the company's quarterly and annual reports on file with the Securities and Exchange Commission.

You should not rely on forward-looking statements as predictions of future events. All forward-looking statements in this press release are based upon information available to us as of the date hereof, and qualified in their entirety by this cautionary statement. Except as required by law, we assume no obligation to update forward-looking statements for any reason after the date of this press release to conform these statements to actual results or to changes in the company's expectations.

Investor Relations Contact: The Blueshirt Group, Investor Relations, Lindsay Savarese, +1-212-331-8417, ir@

Meta Launches AI Video Restyling Tools in Edits and the Meta AI App

Yahoo

This story was originally published on Social Media Today. To receive daily news and insights, subscribe to our free daily Social Media Today newsletter.

After previewing its AI-powered video editing tools late last year, Meta is now making the first of these available in both the Meta AI app and Instagram's Edits app, with a new "Restyle" option that uses AI to completely alter the context of your video clips. If you want to make it look like you're in an animated film, or underwater, you can use these AI filters to reform your content.

As explained by Meta: "You'll now be able to easily edit your short-form videos using a variety of preset AI prompts that can transform your outfit, location, style and more. It's available in the U.S. and more than a dozen countries around the world."

So you do have to choose from a preset; you don't yet have the option to edit your clip with free-form text prompts. But that's also in the pipeline, with Meta working on a range of AI editing tools to help customize and reimagine your video clips.

Meta says that there are currently more than 50 editing prompts that you can use to transform 10 seconds of your video "for free for a limited time." Which is worth noting. Instagram chief Adam Mosseri has previously said that Edits is available for free for now, but that Meta may have to charge for access in future to cover the cost of things like AI editing elements. From the wording here, that still appears to be the case, with Meta likely moving to a paid subscription tier for Edits at some stage.

But for now, you can try out these prompts for free, and you can also use them in the Meta AI app: "Once you've chosen a preset prompt, Meta AI will edit your video to match the selected scenario. You can turn your video into a graphic novel, and see yourself reimagined as a vintage comic book illustration. Or change the lighting of a video you captured on a rainy Seattle day to create a dreamy mood with shimmery sparkles, pearlescent blur and soft focus. Or you can turn your video into a video game, complete with fluorescent lighting and battle clothing."

There's a range of options here, and I can imagine that a lot of people will be looking to try them out and see what can be done with Meta's evolving video editing tools.

Will that change video on IG as we know it? I imagine the impact here will be similar to Snapchat's Lenses, in that some people will be interested in trending effects and features, but most will continue to share their regular uploads instead. Because social apps are ultimately about connection, and filtering your content into unreal scenarios, while an impressive novelty, isn't helping on that front. But like Lenses, there will be trending filters, and there will be a lot of people posting their own take on these trends by using the same effects.

Either way, it's an impressive-looking option, based on Meta's examples, and it could open up more sharing and engagement opportunities in Meta's apps.
