
Latest news with #CSAIL

RBC joins Massachusetts Institute of Technology's (MIT) CSAIL fintech research initiative on the role of AI in the future of finance

Cision Canada

18-07-2025

  • Business


TORONTO, July 18, 2025 /CNW/ - RBC today announced its membership in FinTechAI@CSAIL, an initiative at the Massachusetts Institute of Technology's (MIT) Computer Science and Artificial Intelligence Laboratory (CSAIL), a premier research institution for computing and artificial intelligence (AI). FinTechAI@CSAIL will examine AI's role in the future of finance.

RBC recently announced its ambition to generate $700 million to $1 billion in value derived from AI by 2027 and is recognized as an AI leader in the financial space, ranking third among 50 global banks for AI maturity in the Evident AI Index.

"We believe Canada's future as a world leader in artificial intelligence requires us to work together as a community to bolster and grow the full AI ecosystem, from institutes and universities to start-ups and companies," said Foteini Agrafioti, SVP Data & AI, RBC. "This collaboration reflects RBC's commitment to collaborate with leading research institutions that drive real-world value while upholding the bank's responsible AI principles, which ensure that high standards of accountability, fairness, privacy and security, and transparency continue to be upheld in all of RBC's AI efforts."

The MIT membership will provide RBC with access to talent, including CSAIL's graduate students, increased recruitment opportunities, and participation in technical briefings and educational workshops. During its three-year membership, RBC will have early access to cutting-edge research across areas critical to the future of financial services, including machine learning, predictive analytics, secure computation, cybersecurity, and data science. RBC will participate in executive boards, research reviews, and innovation pilots, helping to inform the direction of emerging technologies while building stronger links between academia and industry.
RBC and FinTechAI@CSAIL will conduct machine learning research in areas such as explainability, bias mitigation, and LLM safety – key pillars of responsible AI – as well as emerging applications in cybersecurity and financial crime prevention.

"RBC is constantly exploring ways to connect research with real-world impact," says Greg Mori, VP, RBC Fellow, RBC Borealis. "Working with FinTechAI@CSAIL allows us to access early-stage innovations that can help us build better, smarter, and more secure financial solutions. This collaboration between AI scientists in RBC Borealis and MIT will create technologies that will help shape the future of financial services, and FinTechAI@CSAIL's leading-edge research in responsible AI will be critical to the advancement of the field."

MIT CSAIL Director Professor Daniela Rus says, "I am excited to work with our initiative members to advance the foundations of AI and enable new capabilities for the fintech industry sector. Together, we aim to develop intelligent, trustworthy, and transformative fintech AI solutions that can shape the future of global finance."

About RBC

Royal Bank of Canada is a global financial institution with a purpose-driven, principles-led approach to delivering leading performance. Our success comes from the 97,000+ employees who leverage their imaginations and insights to bring our vision, values and strategy to life so we can help our clients thrive and communities prosper. As Canada's biggest bank and one of the largest in the world, based on market capitalization, we have a diversified business model with a focus on innovation and providing exceptional experiences to our more than 19 million clients in Canada, the U.S. and 27 other countries. Learn more at

MIT Teaches Soft Robots Body Awareness Through AI And Vision

Forbes

07-07-2025

  • Science


MIT CSAIL researchers have developed a new system that teaches robots to understand their own bodies, using only vision. Instead of relying on sensors, the system allows robots to learn how their bodies move and respond to commands just by watching themselves.

Researchers from the Massachusetts Institute of Technology's (MIT) CSAIL lab have developed a new system that teaches robots to understand their bodies using only vision. Using consumer-grade cameras, the robot watched itself move and then built an internal model of its geometry and controllability. According to the researchers, this could dramatically expand what's possible in soft and bio-inspired robotics, enabling affordable, sensor-free machines that adapt to their environments in real time. The team at MIT said that this system and research is a major step toward more adaptable, accessible robots that can operate in the wild with no GPS, simulations or sensors. The research was published in June in Nature.

Daniela Rus, MIT CSAIL Director, said that with Neural Jacobian Fields, CSAIL's soft robotic hands were able to learn to grasp objects entirely through visual observation, with no sensors, no prior model and no manual programming. "By watching its own movements through a camera and performing random actions, the robot built an internal model of how its body responds to motor commands. Neural Jacobian Fields mapped these visual inputs to a dense visuomotor Jacobian field, enabling the robot to control its motion in real time based solely on what it sees," added Rus. Rus adds that this reframing of control has major implications.
"Traditional methods require detailed models or embedded sensors, but Neural Jacobian Fields lifts those constraints, enabling control of unconventional, deformable, or sensor-less robots in real time, using only a single monocular camera."

Vincent Sitzmann, Assistant Professor at MIT's Department of Electrical Engineering and Computer Science and CSAIL Principal Investigator, said the researchers relied on techniques from computer vision and machine learning. The neural network observes a single image and learns to reconstruct a 3D model of the robot, relying on a technique called differentiable rendering, which allows machine learning algorithms to learn to reconstruct 3D scenes from only 2D images.

"We use motion tracking algorithms - point tracking and optical flow - to track the motion of the robot during training," said Sitzmann. "By relating the motion of the robot to the commands that we instructed it with, we reconstruct our proposed Neural Jacobian Field, which endows the 3D model of the robot with an understanding of how each 3D point would move under a particular robot action."

Sitzmann says this represents a shift towards robots possessing a form of bodily self-awareness, and away from pre-programmed 3D models and precision-engineered hardware. "This moves us towards more generalist sensors, such as vision, combined with artificial intelligence that allows the robot to learn a model of itself instead of a human expert," said Sitzmann. "This also signals a new class of adaptable, machine-learning driven robots that can perceive and understand themselves."

The researchers said that three different types of robots acquired awareness of their bodies and the actions they could take as a result of that understanding. A 3D-printed DIY toy robot arm with loose joints and no sensors learned to draw letters in the air with centimeter-level precision.
It discovered which visual region corresponds to each actuation channel, mapping "which joint moves when I command actuator X" just from seeing motion. A soft pneumatic hand learned which air channel controls each finger, not by being told, but just by watching itself wiggle. The robots inferred depth and geometry from color video alone, reconstructing 3D shape before and after actions. A soft, wrist-like robot platform, physically disturbed with added weight, learned to balance and follow complex trajectories. The researchers quantified motion sensitivity, for example measuring how a command that slightly changes an actuator produces millimeter-level translations in the gripper.

Changing soft robotics

The CSAIL researchers said that soft robots are hard to model because they deform in complex ways. One researcher said in an email interview that the method used in the research doesn't require any manual modeling: the robot watches itself move and figures out how its body behaves, similar to a human learning to move their arm by watching themselves in a mirror.

Sitzmann says conventional robots are rigid, discrete joints connected by rigid links, built to tight manufacturing tolerances. "Compare that to your own body, which is soft: first, of course, your skin and muscles are not perfectly solid but give in when you grasp something. However, your joints also aren't perfectly rigid like those of a robot, they can similarly bend and give in, and while you can sense the approximate position of your joints, your highest-precision sensors are vision and touch, which is how you solve most manipulation tasks," said Sitzmann. "Soft robots are inspired by these properties of living creatures to be similarly compliant, and must therefore necessarily also rely on different sensors than their rigid cousins."

Sitzmann says that this kind of understanding could revolutionize industries like soft robotics, low-cost manufacturing, home automation and agricultural robotics.
"Any sector that can profit from automation but does not require sub-millimeter accuracy can benefit from vision-based calibration and control, dramatically lowering cost and complexity," said Sitzmann. "In the future, with the inclusion of tactile sensing (touch), this paradigm may even extend to applications that require high accuracy."

A new approach to soft robotics

Researchers say their approach removes the need for experts to build an accurate model of the robot, a process that can take months. It also eliminates reliance on expensive sensor systems or manual calibration. The simplified process entails recording the robot moving randomly; the model learns everything it needs to know from that video.

"Instead of painstakingly measuring every joint parameter or embedding sensors in every motor, our system heavily relies on a camera to control the robot," said Sitzmann. "In the future, for applications where sub-millimeter accuracy is not critical, we will see that conventional robots with all their embedded sensors will increasingly be replaced by mass-producible, affordable robots that rely on sensors more similar to our own: vision and touch."
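The control idea running through this article - watch how visual motion responds to commands, then invert that map - can be sketched in a few lines. The toy below is a minimal linear version of the general idea, not CSAIL's actual Neural Jacobian Fields implementation: the simulated two-actuator robot, the number of tracked points, and the noise level are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth response of a 2-actuator robot: each command
# displaces 3 tracked visual points (6 coordinates) linearly.
TRUE_J = rng.normal(size=(6, 2))

def observe_motion(command):
    """Displacement of tracked points for a command, with camera noise."""
    return TRUE_J @ command + rng.normal(scale=0.01, size=6)

# 1. Exploration: issue random commands and watch the resulting motion,
#    as the robot in the article does during training.
commands = rng.normal(size=(50, 2))
motions = np.array([observe_motion(c) for c in commands])

# 2. Fit a visuomotor Jacobian by least squares: motion ~ J @ command.
J_est = np.linalg.lstsq(commands, motions, rcond=None)[0].T  # shape (6, 2)

# 3. Control: solve for the command that best produces a desired motion
#    of the tracked points, using only the learned Jacobian.
desired = np.array([0.5, -0.2, 0.1, 0.0, -0.3, 0.4])
command = np.linalg.lstsq(J_est, desired, rcond=None)[0]
```

The real system replaces the single linear Jacobian with a neural field that varies over the robot's 3D shape and is learned from raw video, but the exploration-fit-invert loop is the same.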

KBRA Releases Research – CMBS Loan Performance Trends: June 2025

Business Wire

30-06-2025

  • Business


NEW YORK--(BUSINESS WIRE)--KBRA releases a report on U.S. commercial mortgage-backed securities (CMBS) loan performance trends observed in the June 2025 servicer reporting period. The delinquency rate among KBRA-rated U.S. private label CMBS in June decreased to 7.3% from 7.4% in May. The total delinquent plus current but specially serviced loan rate (collectively, the distress rate) also decreased 31 basis points (bps) to 10.6%. After last month's 204-bp increase in the distress rate, mixed-use saw a 419-bp decrease following the modification of the JPMCC 2022-NLP Portfolio loan.

In June, CMBS loans totaling $1.6 billion were newly added to the distress rate, of which 47.5% ($771 million) comprised imminent or actual maturity default. The office sector experienced the highest volume of newly distressed loans (47.6%, $772.5 million), followed by retail (19.9%, $322.4 million) and lodging (9.8%, $158.6 million).

Key observations of the June 2025 performance data are as follows:

  • The delinquency rate decreased to 7.3% ($23.9 billion) from 7.4% ($24.5 billion) in May.
  • The distress rate decreased to 10.6% ($34.6 billion) from 10.9% ($35.8 billion) last month.
  • The office delinquency rate increased 51 bps this month to 12.1%. The sector continues its upward trend, albeit at a slower rate than last month. Among KBRA-rated loans, Federal Center Plaza ($130 million in COMM 2013-CR6) and 25 Broadway ($116.6 million in COMM 2014-CR16) became nonperforming matured balloon loans this month, as the loans became delinquent in June. Additionally, the $300 million One California Plaza loan ($250 million in CSMC 2017-CALI and $50 million in CSAIL 2017-CX10, both KBRA-rated) entered the foreclosure process as negotiations fell through after short-term forbearances.
  • Mixed-use's distress rate fell 419 bps after a 292-bp jump last month. JPMCC 2022-NLP Portfolio, with $1 billion in JPMCC 2022-NLP, was modified and extended 24 months.
  • Prime Storage Fund II, with $340 million in CGCMT 2021-PRM2, was returned to the master servicer after a successful maturity extension to November 2025.
  • Multifamily saw a decrease in the specially serviced rate after the Hatteras Multifamily Portfolio ($346 million in NCMF 2022-MFP) paid off without a loss after a 90-day forbearance. The multifamily delinquency rate climbed 59 bps, as four loans ranging from $21 million to $45 million became 30+ days delinquent, on top of another 11 loans under $20 million (averaging $10.4 million) that also missed payments.

In this report, KBRA provides observations across our $338 billion rated universe of U.S. private label CMBS, including conduit, single-asset single borrower, and large loan transactions.

About KBRA

KBRA, one of the major credit rating agencies, is registered in the U.S., EU, and the UK. KBRA is recognized as a Qualified Rating Agency in Taiwan, and is also a Designated Rating Organization for structured finance ratings in Canada. As a full-service credit rating agency, investors can use KBRA ratings for regulatory capital purposes in multiple jurisdictions.

Doc ID: 1010182
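For readers less familiar with the units in this report: a basis point (bp) is one-hundredth of a percentage point, so the move from 10.9% to 10.6% is roughly a 30-bp decline (the report's 31 bps reflects the unrounded underlying rates). A minimal sketch of the arithmetic:

```python
def bps_change(new_pct: float, old_pct: float) -> float:
    """Change between two rates, expressed in basis points.

    1 percentage point = 100 basis points (bps).
    """
    return (new_pct - old_pct) * 100

# Headline rates from the report, rounded to one decimal place:
delinq_move = bps_change(7.3, 7.4)      # roughly -10 bps
distress_move = bps_change(10.6, 10.9)  # roughly -30 bps; the reported
                                        # -31 bps comes from unrounded rates
```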

MIT Jameel Clinic and CSAIL launch new AI model accelerating the future of drug discovery ‘Boltz-2'

Zawya

24-06-2025

  • Health


Cambridge, Massachusetts – The Jameel Clinic, the epicentre of artificial intelligence (AI) and health at the Massachusetts Institute of Technology (MIT), announced today the release of Boltz-2 — a groundbreaking artificial intelligence model which will transform the speed and accuracy of drug discovery. The announcement was made together with the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) and the biotechnology company Recursion. Boltz-2 breaks new ground by jointly modelling both structure and binding affinity, a critical parameter in small molecule drug discovery.

A big leap for small molecule drug discovery

Boltz-2 builds on the success of Boltz-1, a pioneering model first released in 2024 that can determine protein structures, by adding a powerful new ability: accurately predicting how strongly a drug molecule will bind to a target protein — a crucial factor in determining its effectiveness. In doing so, Boltz-2 addresses one of the most complex challenges in early-stage drug development.

Boltz-2's affinity module was trained on millions of real lab measurements showing how strongly different molecules bind to proteins. Thanks to this, Boltz-2 can now predict binding strength with unprecedented accuracy across several benchmarks reflecting different stages of real-world drug discovery. Boltz-2's predictions come very close to those produced by full-physics free energy perturbation (a precise computer simulation that predicts how strongly a drug sticks to its target, but that can take up to a day to run one test even on a GPU) – at over 1,000 times the speed. It is the first deep learning model to deliver that level of precision.

Saro Passaro, researcher at the MIT Jameel Clinic and co-lead of the Boltz-2 project, said: 'This release is especially significant for small molecule drug discovery, where progress has lagged behind the rapid gains seen in biologics and protein engineering.
'While models like AlphaFold and Boltz-1 allowed a significant leap in the computational design of antibodies and protein-based therapeutics, we have not seen a similar improvement in our ability to screen small molecules, which make up the majority of drugs in the global pipeline. Boltz-2 directly addresses this gap by providing accurate binding affinity predictions that can dramatically reduce the cost and time of early-stage screening.'

Gabriele Corso, PhD student at MIT CSAIL and one of the lead researchers behind Boltz-1 and Boltz-2, said: 'This performance increase makes Boltz-2 not just a research tool, but a practical engine for real-world drug development. Instead of spending hours simulating the interaction between a single molecule and its target, scientists can now screen vast chemical libraries within the same time frame, enabling early-stage teams to prioritise only the most promising compounds for lab testing.'

Open-source and optimised for medical research

The Boltz-2 model also introduces a new feature, Boltz-Steering, which refines molecular structure predictions and makes them more realistic. This allows researchers to guide the model using experimental data, example structures, or design goals — giving them greater control and customisability in their search for new treatments. Boltz-2 will be released as a fully open-source model under the MIT licence, including the code, weights and training data, enabling researchers around the world to freely access and build upon its capabilities.

A breakthrough for the MIT Jameel Clinic and CSAIL

The model represents a major milestone in an ambitious research programme launched in early 2023 by the MIT Jameel Clinic and CSAIL. The team set out to develop a machine learning system that could not only predict the 3D shape of proteins — like AlphaFold — but also understand how and why molecules interact, as well as how likely they are to bind to each other.
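The screening workflow Corso describes is a simple triage loop: score every molecule in a library with a fast predictor, then send only the top-ranked compounds to the lab. The sketch below illustrates that loop only; `predict_affinity` is a hypothetical placeholder rather than Boltz-2's real API, and the SMILES strings and dummy scores are purely illustrative.

```python
def predict_affinity(smiles: str) -> float:
    """Placeholder for a fast learned affinity predictor.

    Returns a predicted binding score (higher = stronger). The dummy
    heuristic below exists only so the example runs.
    """
    return float(len(smiles) % 7)

# A tiny illustrative chemical library (SMILES strings).
library = [
    "CCO",                    # ethanol
    "c1ccccc1O",              # phenol
    "CC(=O)Nc1ccc(O)cc1",     # paracetamol
    "CN1CCC[C@H]1c1cccnc1",   # nicotine
]

# Rank the whole library by predicted affinity and keep only the
# top candidates for (simulated) lab testing.
ranked = sorted(library, key=predict_affinity, reverse=True)
shortlist = ranked[:2]
```

With a model fast enough to score thousands of molecules per GPU-hour, the same loop scales from four compounds to an entire vendor catalogue.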
This deeper understanding is essential for designing effective new therapies, particularly for diseases caused by molecular dysfunction. Boltz-1, released in 2024, was the first result of that effort. Created as a fast, accessible alternative to AlphaFold3, Boltz-1 quickly became the most widely adopted open-source tool of its kind, used by thousands of scientists across academia, biotech startups, and pharmaceutical companies. It demonstrated that open and interpretable models could rival the best in the field. Now, with Boltz-2, the MIT team is taking the next step — targeting small molecule drug discovery, an area that has historically lagged behind biologics and protein engineering in terms of computational tools.

Boltz-2 is the latest milestone in the MIT Jameel Clinic's growing portfolio of open-source tools for health, developed at the intersection of AI and medicine — and part of a broader mission by the Jameel Clinic to make cutting-edge technology accessible for solving the world's most pressing health challenges. The team includes MIT Jameel Clinic AI faculty lead Professor Regina Barzilay; MIT CSAIL principal investigator Professor Tommi Jaakkola; PhD students Gabriele Corso and Jeremy Wohlwend; MIT Jameel Clinic researcher Saro Passaro; as well as additional collaborators from Recursion.

About Jameel Clinic:

The Jameel Clinic is the epicentre of artificial intelligence (AI) and healthcare at MIT. It works to develop AI technologies that will change the landscape of healthcare. This includes early diagnostics, drug discovery, care personalisation and management. Building on MIT's pioneering history in artificial intelligence and life sciences, the Jameel Clinic works on novel algorithms suitable for modelling biological and clinical data across a range of modalities including imaging, text and genomics. While achieving this goal, the team strives to make new discoveries in machine learning, biology, chemistry and clinical sciences.
The Jameel Clinic was co-founded in 2018 by MIT and Community Jameel, the independent, global organisation advancing science to help communities thrive in a rapidly changing world.

About Community Jameel:

Community Jameel advances science and learning for communities to thrive. An independent, global organisation, Community Jameel was launched in 2003 to continue the tradition of philanthropy and community service established by the Jameel family of Saudi Arabia in 1945. Community Jameel supports scientists, humanitarians, technologists and creatives to understand and address pressing human challenges in areas such as climate change, health and education. The work enabled and supported by Community Jameel has led to significant breakthroughs and achievements, including the MIT Jameel Clinic's discovery of the new antibiotics Halicin and Abaucin, critical modelling of the spread of COVID-19 conducted by the Jameel Institute at Imperial College London, and a Nobel Prize-winning experimental approach to alleviating global poverty developed by the co-founders of the Abdul Latif Jameel Poverty Action Lab at MIT.

CETI Looks Into The Complexities Of Whale Sounds With AI

Forbes

12-06-2025

  • Science


What can we learn from the whales? It's something that researchers at the CETI project (not to be confused with the SETI Institute) are working on in order to help drive awareness around language models that exist right here in our own world. In a recent TED talk, CETI's Pratyusha Sharma talks about the communication of sperm whales, and how humans can use that to learn more about other species and ourselves. Sharma is a graduate student at CSAIL and works with advisors like Daniela Rus to advance this kind of discovery. As a starter, she gave the example of aliens speaking to humans verbally, or through a script – and again, CETI's work should be distinguished from space research!

'Communication is a key characteristic of intelligence,' Sharma explained. 'Being able to create an infinite set of messages by sequencing together finite sets of sounds is what has distinguished human beings from other species.' However, she said, CETI research indicates that we may not be alone on Earth in developing these kinds of systems. In figuring this out, she suggested, we can get insights into other species, and understand our own language better as well. Millions of life forms on Earth, she said, share some form of language. 'They have their own physical and mental constraints, and are involved in their own unique ecosystems and societies,' she said. 'However, we know very little about their communications.'

So how do you decipher them? In further explaining what goes on at CETI, she listed different stakeholders with credentials in areas like linguistics, biology, cryptography and AI. Most of the research, she said, is taking place in the Dominican Republic and the wider Caribbean.
Explaining how the large brains of sperm whales have evolved over 16 million years, she described activity that shows advanced thinking: 'The members of the family coordinate their dives, engage in extended periods of socialization, and even take turns babysitting each other's young ones,' she said. 'While coordinating in complete darkness, they exchange long sequences of sounds with one another.' The question, she noted, is this: what are they saying?

Researchers at CETI have identified 21 types of 'codas', or call systems with a certain complexity. 'One of the key differentiators between human language and all animal communications is that beautiful property called duality of patterning,' Sharma said. 'It's how a base set of individually meaningless elements sequence together to give rise to words, that in turn are sequenced together to give rise to an infinite space with complex meaning.'

She outlined some of the principles through which CETI is building this species knowledge. 'Getting to the point of understanding the communications of sperm whales will require us to understand what features of their (vocalizations) they control,' she said. Presenting a set of 'coda visualizations,' Sharma noted that these simple communications correspond to complex behavior. '(This) presented a fundamental mystery to researchers in the field,' she said.

She showed how the CETI work magnifies the structure of a coda: 'Even though the clicks might not have sounded like music initially, when we plot them like this, they start to look like music,' she said, presenting a combinatorial coda system. 'They have different tempos and even different rhythms.' This, she added, reveals a lot about the minds of these creatures. 'The resulting set of individual sounds (in the coda) can represent 10 times more meanings than what was previously believed, showing that sperm whales can be much more expressive than what was previously thought,' she said.
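The point behind the "10 times more meanings" claim is combinatorial: if coda features can vary independently, the number of distinguishable codas is the product of the feature inventories, not their sum. A toy illustration of that multiplication (the feature names and counts here are hypothetical, chosen only to show the effect; they are not CETI's published inventory):

```python
from itertools import product

# Hypothetical, illustrative feature inventories for a coda system.
rhythms = [f"rhythm_{i}" for i in range(18)]
tempos = [f"tempo_{i}" for i in range(5)]
rubato = [None, "accelerating", "decelerating"]
ornament = [None, "extra_click"]

# Each coda is one choice per feature, so capacity multiplies:
# 18 * 5 * 3 * 2 = 540 distinguishable combinations.
codas = list(product(rhythms, tempos, rubato, ornament))
```

Even modest per-feature inventories multiply into a space far larger than a flat list of call types, which is why a combinatorial reading makes the system look so much more expressive.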
'These systems are rare in nature, but not uniquely human. … these results open up the possibility that sperm whales' communication might provide our first example of this phenomenon in another species. … this will allow us to use more powerful machine learning techniques to analyze the data, and perhaps get us closer to understanding the meanings of their sounds – and maybe (we can) even communicate back.'

The research, she added, continues: 'Hopefully the algorithms and approaches we developed in the course of this project empower us to better understand the other species that we share this planet with,' she said. This type of research has a lot of potential! Let's see what it turns up as we continue through the age of AI.
