Late last week, Apple pulled popular messaging app Telegram — and its recently released offshoot, Telegram X — from the iOS App Store.
At the time, the only explanation for the removal was a tersely worded statement from Telegram CEO Pavel Durov saying that Apple had removed the two apps due to their inclusion of “inappropriate content.”
“We were alerted by Apple that inappropriate content was made available to our users and both apps were taken off the App Store,” wrote Durov at the time. “Once we have protections in place we expect the apps to be back on the App Store.”
It turns out that a moderation oversight by Telegram had allowed the company’s userbase to view and exchange child pornography.
In an email obtained and verified by 9to5Mac, Apple’s Phil Schiller explains what happened to a user who emailed him about the removal:
“The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps,” wrote Schiller. “After verifying the existence of the illegal content, the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
Schiller then goes on to write:
“The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.”
For what it’s worth, it took Telegram all of a day to put the proper protections in place. However, this isn’t the first time the company has run into issues involving its less savoury users. Last year, Indonesia threatened to ban the app if the company didn’t take a more active role in stopping the spread of “radical and terrorist propaganda.”
The full letter is viewable on 9to5Mac.