
Twitter Implements Limits on Daily Tweet Reading to Combat Data Scraping and Manipulation

© Thomson Reuters 2023

In a bid to deter excessive data scraping and system manipulation, Twitter has capped the number of tweets accounts can read per day, Executive Chair Elon Musk said in a recent post. Initially, verified accounts were restricted to reading 6,000 posts daily, while unverified accounts were limited to 600. Musk later announced that the temporary limits would be raised to 10,000 posts per day for verified users, 1,000 for unverified users, and 500 for new unverified users, but he did not provide further details on the changes.

The move comes one day after Twitter began requiring users to have an account in order to view tweets, a step Musk described at the time as a “temporary emergency measure.” The restrictions are intended to address aggressive data scraping by numerous organizations, which Twitter says has been degrading the user experience on the platform.

Musk has been openly critical of artificial intelligence firms, including OpenAI, the organization behind ChatGPT, for using Twitter’s data to train their large language models.

In addition to these recent developments, Twitter faced a temporary outage on Saturday morning, affecting thousands of users. According to Downdetector.com, approximately 7,500 users reported difficulties accessing the platform at around 11:17 am ET.

Under Musk’s ownership, Twitter has taken various steps to win back advertisers that left the platform and to boost subscription revenue, including making verification check marks part of the Twitter Blue program.

While the new reading limits may inconvenience some users, Twitter hopes they will curb data scraping and manipulation, ultimately making the platform more secure and reliable.