Meta, Parent Company of FB, Insta, Gears Up for Lok Sabha Polls 2024; Sets Up Elections Operations Centre

As the Lok Sabha polls in India approach, Meta, the parent company of Facebook, Instagram, Threads and WhatsApp, announced that it will activate an Elections Operations Centre to identify potential threats and put mitigations in place in real time.

(Image credit: Meta Newsroom)

By ETV Bharat Tech Team

Published : Mar 19, 2024, 3:52 PM IST

Updated : Mar 19, 2024, 4:20 PM IST

Hyderabad: Meta on Tuesday announced that it will activate an India-specific Elections Operations Centre, bringing together experts to identify potential threats and put specific mitigations in place across its apps, with the aim of curbing AI-generated fake or manipulated content spread via Facebook, WhatsApp, Instagram and Threads.

"As the world’s largest democracy prepares for the 18th General Elections, Meta will continue efforts to limit misinformation, remove voter interference, and enhance transparency and accountability on our platforms to support free and fair elections," said Meta in its latest press release.

“We are dedicated to the responsible use of new technologies like GenAI and collaborating with industry stakeholders on technical standards for AI detection, as well as combating the spread of deceptive AI content in elections through the Tech Accord,” said the company.

According to Meta, around 40,000 of the company's employees globally work on safety and security, and it has invested more than $20 billion in teams and technology in this area since 2016. This includes 15,000 content reviewers who review content across Facebook, Instagram, and Threads in more than 70 languages, including 20 Indian languages.

“We are closely engaged with the Election Commission of India via the Voluntary Code of Ethics that we joined in 2019, which gives the Commission a high-priority channel to flag unlawful content to us,” said Meta.

The company also said it has been running an integrated eight-week safety campaign, ‘Know What’s Real,’ since the end of February. The campaign focuses on educating users to identify and address misinformation on WhatsApp and Instagram by promoting digital best practices and highlighting available safety tools, and it encourages people to double-check suspicious or inaccurate-sounding information by sending it to WhatsApp tiplines.

To address virality, Meta said WhatsApp will continue to limit message forwarding: as announced last year, any message that has already been forwarded once can only be forwarded on to one group at a time, down from the previous limit of five.

Read More

  1. Most Teens Report Feeling Happy or Peaceful Without Smartphones, Pew Survey Finds
  2. Google and Meta have Strong Political Bias: Elon Musk
  3. Meta Working with Sarvam AI to Build Vernacular LLMs
