March 31, 2023
Good morning. In today’s either/view, we discuss whether a moratorium on Artificial Intelligence models is necessary. We also look at the launch of the ‘Green Wall’ project in Haryana, among other news.
📰 FEATURE STORY
Is a moratorium on Artificial Intelligence models necessary?
Much of history is written in asterisks and footnotes about rupture, innovation, or protest. When San Francisco-based OpenAI launched ChatGPT, 2022 was quickly dubbed the year of the technological breakthrough. The chatbot is quicker and more capable than previous chatbots and natural language processing models.
Recently, the Future of Life Institute published an open letter, signed by over 1,377 Artificial Intelligence (AI) experts and industry leaders, calling for a six-month pause on training AI systems more powerful than GPT-4. While the signatories argue that such a break is necessary for the responsible development of AI, other experts in the field find it impractical and overhyped.
Context
OpenAI has released Generative AI tools that create text, art, images, and 3D models. A simple prompt lets DALL-E 2 consult millions of visual data points and produce original artwork. If these tools can do what humans do, and do it faster, people naturally begin to worry about their jobs.
Recently, OpenAI introduced GPT-4, which produces far fewer hallucinations and even passes the Scholastic Assessment Test and the bar exam. Coincidentally or not, it was released within two months of Microsoft laying off its ethics and society team.
Quick on its heels, other companies, including Adobe, IBM, and Meta, started to integrate Generative AI into their products and services. Google rolled out Bard, a generative AI product integrated with Google Search. Firms have also been advertising the ethical considerations behind their models: Microsoft has its Office of Responsible AI, and Google has its Responsible AI Principles.
But, as demonstrated by Big Tech’s long list of antitrust cases, self-policing has its limits. Back in 2020, Google even stopped one of its researchers, Timnit Gebru, from publishing her work on the ethical and environmental concerns of large language models.
The Association for Computing Machinery’s (ACM) Code of Ethics and Professional Conduct devolves the responsibility of upholding the public good to tech employees. And more often than not, a focus on ethical innovation comes from the employees rather than the top.
The recent letter, signed by Elon Musk and Apple co-founder Steve Wozniak among others, calls for hitting pause on AI advancement until its risks and long-term impacts are clear. It asks for a public and verifiable moratorium covering all key players in the industry.
The letter recommends that companies adopt the Asilomar AI Principles to avoid an out-of-control race and prioritise the common good. The 2017 Asilomar Conference on Beneficial AI resulted in 23 guidelines on ethics and long-term issues to consider during research and development (R&D). The principles were signed by Stephen Hawking, Elon Musk, and Ilya Sutskever, co-founder and research director of OpenAI.
The letter raises some serious concerns, but some question whether pausing giant AI experiments is the way forward. Does the open letter raise the right questions about responsible innovation?
VIEW: It’s the right call
Generative AI’s ability to make up facts and images can be leveraged to produce disinformation. At a time when polarisation and propaganda have become a global crisis, it might be prudent to take stock of the dark side of Generative AI. The signatories argue that a brief pause will help assess the risks that automation poses to jobs in industries like art and programming.
The letter argues that training powerful LLMs should be accompanied by a robust assessment of their long-term impact; there isn’t enough evidence yet to predict how AI systems will evolve. It also argues that the industry should jointly enforce a set of safety protocols for AI design and development, audited by independent experts. Even OpenAI has noted the need for an independent safety review of future AI systems.
Responsible AI development requires ethical decision-making and algorithmic constraints. The ACM Code is ineffective without an assessment of the possible risks of advanced AI models. Algorithmic regulations can prevent such advanced systems from becoming unpredictable black boxes.
COUNTERVIEW: It’s overhype
Security risks aren’t a threat made possible only by future AI – they exist in current models too. Consider the recent ChatGPT bug that leaked payment data and conversation titles of users. A moratorium does not guarantee solutions for the privacy, security, and unemployment risks that AI carries. In fact, it’s argued that exaggerated claims like those in the letter are more likely to get systems deprecated and locked down than to get their risks addressed.
The creation of propaganda and lies has evolutionary roots; it’s a function of the human brain. Researchers argue that creating misinformation isn’t the hard part, distributing it is. Malicious users can easily pay for ChatGPT Plus or DALL-E and generate inaccurate content anyway. There’s also the fact that further training and development will minimise factual errors.
The letter doesn’t talk about the risks associated with existing Generative AI models. Speculation over AI displacing humans and rendering civilisation obsolete paints AI as more potent than it is. It could be more productive to strengthen the internal auditing policies of tech giants and collaborate with policymakers.
Reference Links:
- Pause Giant AI Experiments: An Open Letter – Future of Life
- The AI arms race highlights the urgent need for responsible innovation – The Conversation
- The AI Arms Race Is Changing Everything – Time
- Top AI ethics researcher Timnit Gebru says Google fired her over an email – Fast Company
- A misleading open letter about sci-fi AI dangers ignores the real risks – AI Snake Oil
- ChatGPT bug leaked payment data, conversation titles of users, confirms OpenAI – Mint
- Elon Musk asking for a 6-month halt after ChatGPT 4. He is not alone – DailyO
What is your opinion on this?
(Only subscribers can participate in polls)
a) Pausing the training of future AI systems is necessary.
b) Pausing the training of future AI systems is unnecessary.
🕵️ BEYOND ECHO CHAMBERS
For the Right:
Kejriwal’s Great Gamble As He Drops His Restraint On Modi
For the Left:
Opinion: Can Hindutva Replace Old Social Coalitions In Karnataka?
🇮🇳 STATE OF THE STATES
Government launches “Green Wall” project (Haryana) – The government of Haryana launched an extensive afforestation and plantation project named the “Green Wall” on March 25. The project will cover 75 villages and aims to revitalise the Aravalli range while offering numerous ecological advantages to the surrounding areas. The green belt is inspired by Africa’s Great Green Wall programme in the Sahel, the region bordering the Sahara Desert, which was introduced to increase the amount of fertile land.
Why it matters: The Aravalli range serves as a natural barrier protecting Gurugram and Delhi from the Thar desert’s unrelenting eastward march, a process known as desertification. It also remains an important ecological zone, acting as a green lung that helps offset the polluted air of the capital region and as a groundwater recharge zone for the area. Groundwater levels in Gurugram and surrounding areas have plummeted to historic lows as a result of the region’s rapid growth. This project, therefore, will act as a strong buffer against the ill effects of rapid industrialisation in Gurugram and Delhi.
100th year of the Vaikom Satyagraha (Kerala) – On March 30, 1924, three men, Kunjappy, Bahuleyan, and Venniyil Govinda Panicker, walked hand in hand towards a board outside the Vaikom Mahadevar Temple that denied entry to untouchables. These three men took the first step in the nation’s historic march towards a great goal – equity for all. And thus began a historic people’s fight – the Vaikom Satyagraha – to end social prejudice in the former princely state of Travancore.
Why it matters: The protest was planned after T K Madhavan raised the injustice faced by lower-caste people at the Congress party’s 1923 Kakinada session, and it was led by T K Madhavan, K P Kesava Menon, and George Joseph with the blessings of the All India Congress Committee (AICC) and Mahatma Gandhi. The 603-day struggle had several ups and downs. Frontline leaders were arrested within two weeks of the protest, but Periyar E V Ramasamy was brought in from Tamil Nadu and gave the movement new energy. People from all over the nation rallied in support; even Akalis from Punjab set up a camp at Vaikom to prepare meals for the satyagrahis. The protest truly united people from all walks of life and succeeded in obtaining access to three of the four entryways for lower-caste people.
Resort building plan finalised at paragliding spot (West Bengal) – The government of West Bengal has finalised plans to build a tourism resort near Kalimpong’s paragliding take-off spot. Locals and specialists, however, disagree with the decision. The Kalimpong Paragliding Association (KPA) claims that building a resort behind the Kalimpong Science Centre would obstruct the adventure sport and cause losses to the overall business.
Why it matters: On Tuesday, a survey team composed of officials from the tourism department, the PWD, and the district office inspected the take-off spot. According to KPA President Prashant Rana, gliders sometimes stray from the landing spot due to wind currents, and paragliding involves numerous technical challenges; so far, he and his team have been able to anticipate major accidents and respond appropriately every time. The stakeholders hope the government will change its mind about the initiative, believing that construction on this site will harm Kalimpong’s tourism and adventure sports more than it will help.
SC order favours Goa in Mahadayi case: Sawant (Goa) – Goa’s case in the ongoing Mahadayi river water dispute with Karnataka has been strengthened by “favourable orders from the Supreme Court” that bar Karnataka from carrying out any further work or diversion (of the water) unless environmental permits are obtained, Chief Minister Pramod Sawant said on Tuesday. Sawant added that the Goa government was investigating and developing plans for the appropriate utilisation of Mahadayi basin resources for the benefit of the people.
Why it matters: According to Sawant, his administration has essentially halted any further construction on or diversion of the Mahadayi waters. He stated that the state government is closely monitoring Karnataka’s activities as well as the issues pending before various central government agencies. Sawant also said that, besides presenting a firm case for the defence of Goa’s interests in the Mahadayi conflict to Union Cabinet ministers, all arms of the government are working together to ensure that no damage occurs. The Mahadayi River has been a source of intense political conflict between the two states in recent years.
New thesis studying gaming among kids (Sikkim) – Aaron E Lepcha, of Namprick, Tingvong, Upper Dzongu, Mangan district, successfully defended his thesis, “Uses of Mobile Phone Gaming and Gratification: A Study on the Responses of Sikkim Teenagers”, under the supervision of Dr Pooja Basnet of Sikkim University’s Department of Mass Communication. It is probably the first study in Sikkim to examine gaming practices and their impact on the academic and social lives of teenagers studying in the state’s educational institutions.
Why it matters: Lepcha was the gold medallist of the 2013-2015 Masters batch at Sikkim University’s Department of Mass Communication. During his research, he presented numerous papers at national and international conferences in colleges across India on subjects such as mobile gaming and gratification, video gaming and gender, and other gaming-related topics. According to the research, kids play games to satisfy their social, emotional, and achievement needs. He also found that despite being relatively new to the state, gaming has established a firm grip on Sikkim.
🔢 KEY NUMBER
4,200 – NH 65 is Telangana’s deadliest road, with 4,200 accidents in 9 years.