Ever since taking over Twitter, Elon Musk has vowed to revolutionize the platform with sweeping changes, as seen over the past month. The latest of these makes child protection his primary focus.
- The Twitter CEO has made child safety a major priority as one of his reforms at the company
- The problem is that Child Sexual Abuse Material (CSAM) moderation is now handled by a single staff member after massive layoffs
- The issue could worsen given the massive layoffs Musk has ordered over the past month
Elon Musk’s Child Safety Priority Faces a Problem
Elon Musk is a name everyone is familiar with, and one heard frequently over the past year as the billionaire entrepreneur engaged in a legal tussle with Twitter, which filed a case against him after he backed out of his $44 billion deal to purchase the social media giant.
It goes without saying that Twitter has become a cesspool of uncertainty ever since the Tesla CEO took over as the company’s CEO on October 28, 2022: employees have little job security, and users cannot be sure what feature will launch next.
Musk has now made it a top priority to turn Twitter into a safe haven for children, aiming to bring, in his own words, some sanity to the company. But there is a huge problem he seems to have overlooked.
Apparently, after the massive layoffs and staff resignations, the key team tasked with moderating Child Sexual Abuse Material (CSAM) on Twitter has only one permanent employee left.
Given the situation, it is a big question mark how Elon Musk can hope to protect children when the department tasked with doing so has only one staff member left.
Four employees of the Singapore-based child safety team resigned in November 2022 alone, and the issue is that the Singapore team handles Twitter’s busiest markets, Japan included.
It is the job of Twitter’s child safety experts to review CSAM content, with groups like the UK’s Internet Watch Foundation (IWF) and the US’s National Center for Missing and Exploited Children (NCMEC) providing backend support.
The experts scan and identify images posted and shared on Twitter, while the above groups work only with the internal CSAM team; they focus on the final product and have no access to Twitter’s internal data.
With only one staff member left on the team, Musk can hardly claim child safety as a priority; the department is depleted, and he can give the issue meaningful attention only once it is staffed with a significant number of members.