India Mandates Approval for Release of Unreliable AI Tools

India has told tech companies that they need to get approval before releasing artificial intelligence (AI) tools that are considered “unreliable” or are still in the testing phase. The government has emphasized that these tools should be clearly labeled, warning users that they might give incorrect answers to queries.

The Indian Ministry of Information Technology issued this advisory last Friday to various platforms, stating that the release of AI tools, including generative AI, to users on the Indian internet must be done with explicit permission from the Government of India. This move aligns with the global trend of countries working on regulations to control the use of AI.

India has been particularly focused on tightening regulations for social media companies, which see the country as a significant growth market. The advisory comes just a week after a senior minister criticized Google’s Gemini AI tool on February 23 for a response suggesting that Prime Minister Narendra Modi had been accused of implementing policies some described as “fascist.” Google later acknowledged the tool’s limitations, especially concerning current events and political topics.

Deputy IT Minister Rajeev Chandrasekhar emphasized on social media that ensuring safety and trust is a legal obligation for platforms, and that labeling a tool as “Sorry, unreliable” does not exempt them from the law.

The advisory also addressed concerns related to India’s upcoming general elections, scheduled for this summer, urging platforms to ensure that their AI tools do not compromise the integrity of the electoral process. The call for caution comes at a time when the ruling Hindu nationalist party is expected to secure a clear majority.

Source: Reuters