As much as Apple has tried to explain that its intelligent system will scan images on iPhones automatically and refer only those that fall outside "normality" for human review, the explanation has not settled the matter. Apple has introduced controversial security measures before and seen them eventually accepted, but this time the measure appears to be working against the company itself.
A security measure that fails to convince
After the open letter signed by thousands of Apple users, it is now employees who are voicing their complaints, using the company's Slack group to object to a measure they find inappropriate. It is a change that directly affects privacy and contradicts Apple's own stance on the issue, one the company has long boasted about in its advertising.
The firm has received more than 800 complaints from employees. As AppleInsider explains, the rejection does not come only from the privacy or security departments: beyond the risk that the images could be used inappropriately, the core problem is how little peace of mind the system provides.
iOS 15 child protection in the spotlight
Although from the beginning the company has presented this tool as a security measure to locate inappropriate images of children and stop their spread, critics argue that the system does not provide the necessary safeguards. Employees have been looking for alternative solutions, and some even feared that such data could end up in the hands of governments.
For now, the measure remains in place, with a launch planned for all iPhones running iOS 15 this fall, though without a specific date. We will have to wait and see what Apple decides, how it implements the system, or whether the backlash forces a change of course. Google, Android, and other manufacturers remain open to alternatives, although it is worth remembering that Microsoft and Google have used similar scanning methods for years.