Nepalese Start-Up Deploys Drones To Remove Rubbish From Mount Everest

Bernama

ISTANBUL, June 26 (Bernama-Anadolu) -- A Nepalese start-up has begun deploying drones to remove rubbish from Mount Everest, Anadolu Ajansi reported.
A drone successfully completed a delivery test at an altitude of over 6,000 metres (19,685 feet) while carrying a 15-kilogramme (33-pound) payload, according to Airlift Technology's website.
"The maximum payload tested at Everest Base Camp was 32 kilogrammes (70.5 pounds)," the company said, adding the delivery of rubbish from Camp 1 to the base camp was also "tested and found to be successful."
The project is being conducted with the cooperation of the Khumbu Pasang Lamhu Rural Municipality, where Mount Everest is located, and China-based DJI, the world's largest drone manufacturer.
Everest Base Camp and Camp 1 are separated by the Khumbu Icefall, "one of the most perilous stages of the ascent," DJI said in a statement, adding that "while helicopters can theoretically make the same journey, they are rarely used due to the significant dangers and costs."
The Chinese company also said that each climber is estimated to leave 8 kilogrammes (17.6 pounds) of rubbish behind on Everest.
The agreement signed by the Cimex BYD Charity Foundation (CBCF) and Airlift Technology provides for the removal of more than 1,000 kilogrammes (2,204 pounds) of waste this season.
The Nepalese government has launched annual clean-up initiatives on the mountain since 2019.
By 2024, these efforts on Mount Everest and other Himalayan peaks had resulted in the collection of 119 tonnes of waste and the recovery of 14 human bodies and several skeletons.


Related Articles

Of calculators and trust — Fatin Nabila Abd Latiff

Malay Mail

JUNE 26 — My experience teaching Mathematics in two countries, Malaysia and China, has revealed an important reality: the way students master this subject is deeply influenced by the educational culture and assessment systems of each country. In today's modern educational era, tools such as scientific calculators and artificial intelligence (AI) have become increasingly prevalent in the classroom. However, students' approaches to using these tools are still firmly rooted in the foundational values shaped by their respective systems.

In Malaysia, the use of scientific calculators is standard practice beginning at the upper secondary level. Students rely on calculators for a wide range of mathematical operations, and for some, they become an inseparable part of problem-solving. While calculators help speed up calculations and minimize errors, overreliance can sometimes lead to weaker mastery of basic computational skills and reduced understanding of core mathematical concepts.

This culture of calculator dependency is also reflected in Malaysia's national examination, the Sijil Pelajaran Malaysia (SPM). In SPM, calculators are permitted for Mathematics and Additional Mathematics papers. The structure of the exam often assumes that students have access to calculators, especially for questions involving trigonometry, logarithms, or statistical calculations. While this allows for efficiency, it may inadvertently discourage the development of mental calculation and manual problem-solving strategies.

By contrast, my experience teaching foundation students under the PASUM offshore program at Xi'an International University in China revealed a very different learning environment. Many students there had never used a calculator. Since they were preparing to pursue their undergraduate degrees at Universiti Malaya, I took the initiative to introduce calculator usage and made it a requirement in both lectures and assessments. Initially, they were unfamiliar and hesitant, but I could see their excitement when they first tried using the device. Even so, most of them continued to prefer solving problems such as multiplication, square roots, and trigonometric expressions manually with remarkable confidence, speed, and precision.

This comfort with manual computation stemmed from their early training and a system that actively reinforces such skills. One of the main reasons for this is China's national university entrance exam, Gaokao. Known for its intensity and competitiveness, Gaokao strictly prohibits the use of calculators in the mathematics paper. This policy is intentional. It aims to assess a student's genuine computational skills, ensure fairness across all regions and backgrounds, and encourage deep mastery of mathematical principles without reliance on technology. As a result, Chinese students are trained from a young age to memorize formulas and solve problems manually. The outcome is a generation of students who possess strong fundamental skills and a high level of discipline when tackling complex problems using logical and structured steps.

Despite these systemic differences, global developments continue to impact both countries. Students in Malaysia and China are now increasingly turning to AI-powered apps such as ChatGPT, DeepSeek, Symbolab, and Photomath.
These tools allow students to input or scan questions and receive complete answers, including solution steps, within seconds. While these technologies offer convenience and accessibility, I have observed a troubling trend: students are becoming increasingly dependent on AI-generated solutions without fully engaging with the problem-solving process.

To address this, I apply a simple yet effective approach in my classroom. Students are required to first attempt questions manually, using their own reasoning, before they are allowed to check or verify their answers using AI. This method trains students to think critically, assess their own solutions, and compare them thoughtfully with the output provided by AI tools. It also builds confidence in their conceptual understanding.

What I find most encouraging is how students respond when their answers differ from AI-generated ones. On several occasions, I have heard students say confidently, 'I think my answer is correct. The AI is wrong.' To me, this is a clear indicator of authentic learning. These students are not simply replicating solutions — they have internalized the logic, can explain their reasoning, and are unafraid to challenge the authority of a machine when they believe in their understanding.

I am not against the use of technology. On the contrary, I fully support the integration of AI as a learning tool, provided it is used wisely and with the right guidance. However, I believe that manual problem-solving and conceptual mastery must remain the foundation of Mathematics education. Technology should enhance students' learning but not replace their ability to think.

Calculators, SPM, Gaokao, and AI each represent tools, systems, and educational paradigms that shape students in different ways. What truly matters, however, is ensuring that students are able to understand, reason logically, and trust their own thinking. When a student can confidently say, 'AI is wrong, I know my answer is correct,' because they fully understand the concept, that is where the true success of a teacher lies.

*Dr. Fatin Nabila Abd Latiff is a Senior Lecturer of the Mathematics Division, Centre for Foundation Studies in Science, Universiti Malaya (PASUM), and may be reached at [email protected].

** This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.

Balancing AI Benefits And Academic Integrity

Bernama

In the rapidly advancing era of artificial intelligence (AI), tools like ChatGPT are reshaping the landscape of higher education, bringing profound changes to institutions of higher learning (IPTs) nationwide.

ChatGPT offers substantial benefits as a learning tool, such as generating essays, enhancing writing creativity, analysing data, accelerating research processes, and providing instant answers to complex questions. However, this convenience also raises concerns—particularly over misuse by students who rely on the software to complete assignments automatically, without true comprehension or critical engagement.

Academic dishonesty is becoming more complex, as conventional plagiarism tools struggle to detect AI-generated content. Even more concerning is the growing reliance on AI, which blurs the line between genuine student effort and machine-assisted work—raising important ethical and pedagogical questions.

THE CHANGING LANDSCAPE OF ACADEMIC DISHONESTY

According to Associate Professor Dr Mohd Khairie Ahmad, Dean of the School of Multimedia Technology and Communication at Universiti Utara Malaysia, the philosophy of technology is to simplify and enhance capabilities—and when it comes to the issue of AI in learning, it depends on context.

'Generative AI is a technological advancement capable of producing content that previously required human thought and effort. AI can certainly generate student assignments or coursework.

'If students rely entirely on AI, it could potentially hinder their learning process. This irresponsible or unethical use of AI to complete assignments—while claiming them as original work—is referred to as 'AIgiarism' or AI plagiarism,' he told Bernama.

Sharing that digital plagiarism or academic dishonesty is not a new phenomenon, Mohd Khairie said AI's development has made academic misconduct more dynamic. He noted that since generative AI gained popularity around 2022, the higher education world has become aware of and anticipated the challenges it brings.

'It is undeniable that the use of AI in learning—especially for assignment completion—has become common over the past year or two. There are students who rely entirely on AI to complete assignments or even answer tests or quizzes, especially when conducted online.

'Many students believe such actions are not wrong since AI is legal and not a prohibited technology. However, this is considered unethical because the work does not stem from the student's own cognitive effort or thinking. In fact, such conduct is regarded as a form of plagiarism.

'Typically, lecturers evaluate student assignments by measuring the similarity index, and now also through AI detection. Among the AI applications that can detect AI plagiarism are Turnitin, GPTZero, Winston AI and Copyleaks AI Detector,' he said, adding that evaluating the style, language structure, and content of assignments also helps detect breaches of academic integrity.

While not denying that educators, particularly lecturers, also use AI for teaching and research purposes, he said there can be no compromise when it comes to violating the principles of academic integrity. According to him, the world of higher education upholds the practice of respecting and valuing past scholarly works.

'A scholarly work requires reading and digesting prior writings as part of the process of generating new thoughts or ideas. This is a defining feature of academic writing and a core principle of scholarly work—to acknowledge references used, at the very least by listing them in citations.

'In the context of AI being a productive tool that supports scholarly work, it is therefore ethical to clearly disclose its use and to list the AI sources used to obtain information, ideas, and so on,' he said.

ESTABLISHING GUIDELINES

Responding to whether IPTs have clear guidelines on AI usage by students and lecturers, Mohd Khairie said to his knowledge, the Malaysian Qualifications Agency (MQA) was among the earliest to issue brief guidance through an Advisory Note in 2023 on the use of generative AI across all Malaysian institutions.

He added that in 2024, Universiti Teknologi Malaysia (UTM) published more specific guidelines for educators and students on the application of generative AI. These guidelines focus on lawful, responsible, transparent, trustworthy, and ethical use of AI, grounded in values, regulations, and legislation.

'Since AI has become a foundational and routine part of the teaching and learning process, all IPTs should have clearer and more specific guidelines for generative AI. Furthermore, these guidelines should eventually align with the AI Act currently being drafted by the National Artificial Intelligence Office (NAIO), under the Ministry of Digital,' he said.

Describing the best approach as educating students to use AI ethically and responsibly—as a learning aid rather than a shortcut to complete assignments—he stressed the importance of awareness education, especially since AI is poised to become an essential tool for enhancing learning efficiency and productivity.

'AI should be understood not as the end product but as a process that supports students' cognitive (thinking) activities. If this understanding doesn't take root, it's not impossible that digital 'illnesses' like brainrot (mental fatigue) may affect university students.

'AI is an unavoidable phenomenon and, at the same time, a current necessity. Its exposure and practice as a learning support tool should be promoted as a value and part of the academic culture.

'A study by leading international publisher Wiley found that in 2024, AI contributed to a 72 per cent increase in academic dishonesty compared to 2021 in the United States and Canada. However, responsible and ethical AI guidance by educators has been shown to potentially reduce academic misconduct among students,' he said.

AI AS PART OF THE ECOSYSTEM

Meanwhile, the Malaysian Cyber Consumers Association (MCCA) views the increasing use of AI—particularly ChatGPT—among students in IPTs as a clear sign that higher education is undergoing a profound technological transformation. Its president, Siraj Jalil, said that AI is no longer a tool of the future but has already become an integral part of the current ecosystem in IPTs.

'MCCA does not see this issue as entirely a threat, nor as an opportunity without risks. It lies in a grey area that can bring either benefits or harm depending on how it is used.

'If a student uses AI to enhance their understanding of a subject, generate ideas, or organise their thoughts, it can lead to progress. However, if it is used entirely without the involvement of reasoning, critical thinking, and a sense of responsibility, then it clearly challenges academic integrity.
'Therefore, MCCA believes this is the time for IPTs to re-evaluate their approaches to assessment and learning—not to reject AI from educational methods, but to develop a framework that allows AI to be used ethically and effectively,' he explained.

He noted that the concerns of some lecturers regarding this issue should also be taken seriously. MCCA has received a great deal of direct feedback from lecturers reporting a sharp increase in students submitting assignments almost entirely generated by AI.

'This not only disrupts the academic assessment process but also raises uncertainty in terms of academic aesthetics and values. The solution to this issue isn't merely to impose restrictions or punishments, but to create a more responsible academic ecosystem—one that focuses on ethics and perhaps even redefines academic benchmarks beyond AI usage.

'Every IPT should develop clear AI usage guidelines and integrate AI literacy and academic ethics modules into student orientation and professional development for lecturers. Assignments should also be restructured to emphasise process rather than just outcomes, such as through presentations, reflective portfolios, or fieldwork,' he added, noting that ethical use is shaped not by fear, but through understanding and clear guidance.

At the same time, Siraj suggested that lecturers be given training on the use of AI in research and academic writing, including the importance of disclosing AI usage openly in methodology or references to safeguard academic integrity.

'Academic publications—especially journals and conference proceedings—should begin adapting their policies on AI-generated content. What matters most is striking a balance between innovation and integrity. This is to address concerns that some research content could be produced without critical review or clear AI usage disclosure,' he said.

Siraj also believes that the Ministry of Higher Education (MOHE), in collaboration with NAIO, could formulate a national policy or official guidelines on AI usage in IPTs. He proposed that such a policy include several key components: permitted levels of AI usage, types of assignments appropriate for AI support, forms of misuse that warrant action, and AI literacy and ethics requirements for all campus communities.

'This policy should be developed inclusively, with engagement from academic experts, students, technology practitioners, and industry stakeholders to ensure it is responsive and practical.

'Responsible use of AI begins with the fundamental principle that AI is a tool—not a replacement for human reasoning. For students, responsibility begins with the awareness that learning is a process of self-development and understanding one's field, not just completing tasks for grades.

'Using AI to understand concepts or review writing structure is acceptable. But copying or generating an entire assignment without comprehension goes against the spirit and discipline of education,' he said, adding that both students and lecturers must understand the risks and threats of AI misuse, including the possibility of false information, biased algorithms, and unverified content dissemination.

AWARENESS AND HIGH LITERACY

Sharing his views, Muhammad Haziq Sabri, President of the Student Representative Council at Universiti Teknologi MARA Shah Alam for the 2024/2025 session, said ChatGPT has now become a common tool among university students and has helped him significantly in completing assignments and preparing notes for exams.

'It enables note generation from lecture slides and helps in understanding certain topics. Using ChatGPT to correct grammar and sentence structure also speeds up the process of completing assignments,' he said.

Rejecting the notion that the use of AI—particularly ChatGPT—is a form of academic cheating, he said it should be seen as a modern learning support tool that must be used responsibly.

'It becomes academic dishonesty when students just 'copy and paste' without understanding or modifying the content generated by ChatGPT. Almost all my friends also use ChatGPT, but not excessively—they focus on things like assignment structure and grammar checking.

'So far, I have not heard of any students facing disciplinary action for AI misuse. Most students use ChatGPT responsibly because they understand that misuse could violate the university's academic ethics policies,' he said.

Muhammad Haziq noted that according to Academic Circular No. 5 of 2023, official guidelines on the use of ChatGPT in teaching and learning have been issued, adding that lecturers are encouraged to guide students on using ChatGPT ethically as a learning tool. He said the circular also stresses the importance of ensuring that AI is used to foster critical thinking, understanding, and values—not merely for copying answers—as outlined in Article 6.

'This shows that the university not only allows the use of AI but encourages its responsible use and provides guidelines,' said the Bachelor of Public Relations student from the Faculty of Communication and Media Studies.

For Muhammad Asyraf Daniyal Abdul Halid, 24, a Master's research student in Marine Biotechnology at Universiti Malaysia Terengganu, ChatGPT serves as a guide, but over 90 per cent of the work comes from the student's own effort in sourcing credible information with proper citations.

'ChatGPT really helps us search and compile necessary information, develop ideas, and get an overview of the assignments or projects given by lecturers. However, plagiarism and failure to fact-check information are common forms of misuse among students,' he added, noting that not all students in higher learning institutions have a high level of awareness and literacy when using such software.
