
ChatGPT Agent Launched to Handle Complex Tasks
OpenAI launches a ChatGPT agent that handles complex tasks, connects to apps like Gmail and GitHub, and uses tools for multi-step actions via a virtual computer.
OpenAI has announced the launch of a new artificial intelligence agent for its popular chatbot, ChatGPT. The update was revealed on Thursday and is now available to users on the Pro, Plus, and Team tiers.
The ChatGPT agent is designed to handle complex tasks without constant user input. It builds on OpenAI's earlier agentic features, including Operator, which can interact with websites, and Deep Research, a tool for conducting multi-step research.
This launch comes as the Microsoft-backed startup looks to stay ahead in the AI race. Reports show that AI agents are gaining traction across the tech industry. Major companies like Microsoft, Salesforce, and Oracle are investing heavily in similar tools to improve productivity and reduce operational costs.
According to OpenAI, the new ChatGPT agent can complete real-world tasks, such as ordering an outfit for an event. It can consider multiple factors like dress code and local weather. This function is made possible through a virtual computer built into ChatGPT.
Moreover, the agent allows users to connect third-party applications, including Gmail and GitHub. This enables ChatGPT to access relevant data and perform tailored actions based on the user's prompt.
Key capabilities include:
- Performing multi-step research and transactions
- Connecting with web apps to retrieve and process information
Starting Thursday, users can activate the agentic capabilities from their ChatGPT settings. The company reported that this rollout is part of its ongoing efforts to integrate advanced tools into its chatbot while improving real-world utility.
OpenAI did not specify when the feature might expand beyond the current subscription tiers. However, it confirmed that the ChatGPT agent is aimed at making everyday tasks more streamlined and data-driven.
Source: Reuters

Related Articles

Zawya
a day ago
State Minister Calls for Rigor and Responsibility in Economic Discourse at Ethiopian Economics Association (EEA) Conference
State Minister of Finance, Dr. Eyob Tekalign, opened the 22nd International Conference on the Ethiopian Economy, organized by the Ethiopian Economics Association (EEA). In his keynote address, Dr. Eyob praised the EEA for its consistent contribution to policy-relevant research over more than two decades, highlighting the Association's role as a vital platform for evidence-based economic dialogue. In his opening speech, the State Minister conveyed a strong message on the importance of professionalism, analytical rigor, and responsible communication in shaping the nation's economic future.

The State Minister also outlined the progress Ethiopia has made under its ongoing macroeconomic reform program, noting significant gains in inflation control, export performance, debt sustainability, and tax revenue mobilization. Beyond the macroeconomic achievements, Dr. Eyob shed light on a growing challenge in the public policy space: the need for clarity, integrity, and responsibility in economic analysis and communication.

'In recent months, we have witnessed how unclear communication or imprecise use of statistics, particularly around sensitive issues such as debt sustainability, can sow confusion and erode public confidence,' Dr. Eyob remarked. 'In today's fast-moving information environment, rigor and clarity are not optional—they are essential.'

He emphasized the EEA's unique responsibility as a trusted and independent economic institution to uphold the highest standards of analysis and avoid sensationalism or politicized interpretations. Dr. Eyob underscored the Ministry of Finance's readiness to deepen collaboration with the EEA and other institutions that share a commitment to informed, evidence-based policymaking.

The annual International Conference on the Ethiopian Economy brings together leading economists, academics, policymakers, and practitioners to deliberate on key economic developments and policy directions.
Distributed by APO Group on behalf of Ministry of Finance, Ethiopia.


Gulf Today
a day ago
Here's how to not lose your mind when applying for jobs
Helen Coffey, The Independent

That is what I wish to offer our beleaguered Gen Zs in this, their time of need. The culture wars often seek to divide my kind (millennials) from yours (genuine young people), but we shall be divided no longer. For now, finally, we really do have common ground that binds us: getting the fuzzy end of the lollipop when it comes to finding gainful employment.

In 2008, I proudly graduated from university with a first-class degree — admittedly in drama — and big dreams, ready to take my bite out of the big, wide world. Finding a job would be child's play, I assumed; I had an exemplary academic record, a 'can-do' attitude and a CV filled with real-world work experience thanks to an assortment of term-time and holiday jobs. I was young, I was hungry, I was an asset. Wasn't I?

Alas, 2008, if you remember that fateful year, coincided with the global financial crash. It was not a good time to be an unskilled 21-year-old looking for a job, to put it mildly. Between 2008 and 2009, UK unemployment skyrocketed by the steepest jump in any 12-month period of the last 30 years, leaping from 5.71 to 7.63 per cent. The rate rose for the following two years, reaching a high of just over 8 per cent in 2011.

This was borne out by my futile job hunt, during which I was forced to move home with my mother, sign on to jobseekers allowance and spend every tedious, drudge of a day for the next four months submitting my CV for entry-level roles that had already attracted thousands of applications. It was like the Hunger Games of job seeking – and the odds were never in my favour. They were never in anyone's favour.

Cut to 2025, and Gen Z are facing their own job drought. The numbers may not be quite so dire as those during that extra spicy Noughties recession, but they paint a picture that is, nevertheless, hauntingly familiar in its bleakness.
According to newly released official numbers from the Office for National Statistics (ONS), UK unemployment has risen to its highest rate in four years, 4.7 per cent. The data also shows that the number of job vacancies fell to 727,000 for the April to June period. That is the lowest it's been for a decade — including during pandemic lockdown periods when businesses were forced to implement literal hiring freezes.

Of course, the demographic usually most affected by any downturn in prospects is young people – those just starting out in their careers, attempting to get full-time work straight out of school, college or university. In June, The Guardian reported that graduates are facing the toughest UK job market since 2018.

What's exacerbated the situation for this cohort is AI; since the launch of ChatGPT in November 2022, the number of entry-level jobs has fallen by almost a third (31.9 per cent). It's thought to be partly because junior roles, such as data entry and tedious form filling, could easily be mopped up by artificial intelligence programmes.

The big cheeses aren't even denying it. Dario Amodei, the chief executive of AI firm Anthropic, recently claimed that AI could wipe out up to half of all entry-level jobs in as little as five years, and argued that UK unemployment could rise to 10 or 20 per cent in that time; another AI company's viral advertising campaign recently got people's backs up with punchy slogans such as 'Stop hiring humans'. And there are already real-world consequences: BT announced in 2023 that it expects 10,000 jobs to be lost to artificial intelligence by the end of this decade.

Then there are rising labour costs, with employers squeezed even more by increased national insurance contributions and a higher minimum wage. Slashing headcounts is clearly the quickest and easiest way to ride out such rises.
In fact, the ONS data reveals that the number of people on PAYE payroll has fallen in seven of the eight months since Rachel Reeves, the chancellor, announced the NICs rise.

But behind all the stats are those affected by them, real people who are more than just numbers or faceless 'candidates'. I've already seen numerous personal stories of young people frantically scrambling to find work to no avail, up against hundreds of rival candidates, with little hope that their CV will be glanced at, let alone bag them an interview.

Caitlin Morgan, a 23-year-old finance and accounting graduate from Swansea University, recently told the BBC about her nightmarish job hunt. She'd spent 18 months applying for more than 600 posts before she finally got hired. 'I see you, Caitlin Morgan!' I wanted to tell her upon reading the story. 'I know your pain...'

In fact, I see all you poor, exhausted, desperate Gen Z job hunters out there, wondering if you'll ever win the 'lottery' — because that's what it feels like — of merely securing full-time work. I see you because that was my origin story, too. So here's my advice, woefully out-of-date and toothless as it may be 17 years down the track: remember, it's not you. It's the economy.

Khaleej Times
2 days ago
UAE: ChatGPT is driving some people to psychosis — this is why
When ChatGPT first came out, I was curious like everyone else. However, what started as the occasional grammar check quickly became more habitual. I began using it to clarify ideas, draft emails, even explore personal reflections. It was efficient, available and, surprisingly, reassuring.

But I remember one moment that gave me pause. I was writing about a difficult relationship with a loved one, one in which I knew I had played a part in the dysfunction. When I asked ChatGPT what it thought, it responded with warmth and validation. I had tried my best, it said. The other person simply could not meet me there. While it felt comforting, there was something quietly unsettling about it. I have spent years in therapy, and I know how uncomfortable true insight can be. So, while I felt better for a moment, I also knew something was missing.

I was not being challenged, nor was I being invited to consider the other side. The artificial intelligence (AI) mirrored my narrative rather than complicating it. It reinforced my perspective, even at its most flawed.

Not long after, the clinic I run and founded, Paracelsus Recovery, admitted a client in the midst of a severe psychotic episode triggered by excessive ChatGPT use. The client believed the bot was a spiritual entity sending divine messages. Because AI models are designed to personalise and reflect language patterns, it had unwittingly confirmed the delusion. Just like with me, the chatbot did not question the belief, it only deepened it.

Since then, we have seen a dramatic rise, over 250 per cent in the last two years, in clients presenting with psychosis where AI use was a contributing factor. We are not alone in this. A recent New York Times investigation found that GPT-4o affirmed delusional claims nearly 70 per cent of the time when prompted with psychosis-adjacent content. These individuals are often vulnerable, sleep-deprived, traumatised, isolated, or genetically predisposed to psychotic episodes.
They turn to AI not just as a tool, but as a companion. And what they find is something that always listens, always responds, and never disagrees.

However, the issue is not malicious design. Instead, what we are seeing is people running up against a structural limitation we need to reckon with when it comes to chatbots. AI is not sentient — all it does is mirror language, affirm patterns and personalise tone. However, because these traits are so quintessentially human, there isn't a person out there who can resist the anthropomorphic pull of a chatbot.

At its extreme end, these same traits feed into the very foundations of a psychotic break: compulsive pattern-finding, blurred boundaries, and the collapse of shared reality. Someone in a manic or paranoid state may see significance where there is none. They believe they are on a mission, that messages are meant just for them. And when AI responds in kind, matching tone and affirming the pattern, it does not just reflect the delusion. It reinforces it.

So, if AI can so easily become an accomplice to a disordered system of thought, we must begin to reflect seriously on our boundaries with it. How closely do we want these tools to resemble human interaction, and at what cost?

Alongside this, we are witnessing the rise of parasocial bonds with bots. Many users report forming emotional attachments to AI companions. One poll found that 80 per cent of Gen Z could imagine marrying an AI, and 83 per cent believed they could form a deep emotional bond with one. That statistic should concern us. Our shared sense of reality is built through human interaction. When we outsource that to simulations, not only does the boundary between real and artificial erode, but so too can our internal sense of what is real.

So what can we do? First, we need to recognise that AI is not a neutral force. It has psychological consequences. Users should be cautious, especially during periods of emotional distress or isolation.
Clinicians need to ask: is AI reinforcing obsessive thinking? Is it replacing meaningful human contact? If so, intervention may be required.

For developers, the task is ethical as much as technical. These models need safeguards. They should be able to flag or redirect disorganised or delusional content. The limitations of these tools must also be clearly and repeatedly communicated.

In the end, I do not believe AI is inherently bad. It is a revolutionary tool. But beyond its benefits, it has a dangerous capacity to reflect our beliefs back to us without resistance or nuance. And in a cultural moment shaped by what I have come to call a comfort crisis, where self-reflection is outsourced and contradiction avoided, that mirroring becomes dangerous.

AI lets us believe our own distortions, not because it wants to deceive us, but because it cannot tell the difference. And if we lose the ability to tolerate discomfort, to wrestle with doubt, or to face ourselves honestly, we risk turning a powerful tool into something far more corrosive: a seductive voice that comforts us as we edge further from one another, and ultimately, from reality.