Apple introduced the new iPhone 13 series and a new version of iOS this week. Yet despite advertisements everywhere, by the company’s own standards it has had a nightmarish week. While Israel’s NSO Group has shown it can exploit vulnerabilities in Apple’s operating systems, the company’s image-scanning plans have created a separate problem.
Apple wants to deploy some form of CSAM (Child Sexual Abuse Material) screening to catch up with its competitors. The stated purpose of this practice is to protect children and to identify those who sexually abuse or otherwise harm children in photos uploaded to the cloud. Still, many people consider the scanning of all their photos a violation of their privacy.
Apple resists, up to a point
Apple has previously said that when government demands in the US and other countries conflict with local law, it will not bow to such demands or act under government pressure. Having claimed at the iPhone 13 launch that it needs no government intervention and can provide its own security, the company now finds itself in a difficult position.
In addition, a move by Apple as elections approached in Russia enlarged the debate. Apple, which has often clashed with the Chinese administration before, this time had problems with Russia, over certain voting apps.
Elections will be held in Russia on Friday. Apps designed to let citizens dissatisfied with the current administration cast protest votes and monitor the count drew the ire of the country’s government. Google and Apple then removed these applications from their app stores.
Apple blames others
One of Apple’s slogans is that what happens on your iPhone stays on your iPhone. Still, the firm’s defense on issues like CSAM scanning seems rather weak. According to Apple, it does not scan in a way that violates privacy as its competitors do: in its telling, Google and Microsoft uploading images to the cloud and reviewing them there is the real breach, whereas Apple performs the check on the phone itself.
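The on-device approach Apple describes is, in outline, a match of each photo against a database of fingerprints of known images, with only the match result ever leaving the phone. The sketch below illustrates that idea only: Apple’s actual system uses a perceptual hash (NeuralHash) with cryptographic blinding, while this example substitutes a plain SHA-256 digest, and all names and sample data here are hypothetical.

```python
import hashlib

# Hypothetical database of digests of known images. Apple's real system
# uses a perceptual "NeuralHash" plus cryptographic blinding; a plain
# SHA-256 digest stands in here purely for illustration.
KNOWN_DIGESTS = {
    hashlib.sha256(b"sample-known-image").hexdigest(),
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Check an image against the known-digest set on the device.

    In the scheme Apple describes, only the yes/no result -- never the
    image itself -- would leave the phone for this check.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_DIGESTS

print(matches_known_image(b"family-vacation-photo"))  # no match
print(matches_known_image(b"sample-known-image"))     # match
```

Note that a cryptographic digest, unlike a perceptual hash, changes completely if a single pixel changes, which is exactly why the real system cannot use one; the example shows only the matching logic.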
Of course, people are unlikely to accept the idea of their on-device images being examined without reacting, and many say they simply do not trust the companies. Although Apple has explained its methods and process step by step, for many people that is not enough.
Protests staged by various organizations in front of Apple offices do not help the Cupertino-based company either. Apple’s past statements are what make its position so awkward: it would be far easier to argue that such reviews are now an industry standard, and that Apple simply performs them more safely, had the company not spent years arguing the opposite.
It previously opposed image scanning
Apple has engaged in heated debates and lengthy explanations in the past about why scanning images in the cloud is bad for privacy. So the company needs to find a clever way out, so that whatever it does now is not seen as hypocrisy.
The firm is under pressure to deploy CSAM detection and has already fallen behind its competitors in this area. Its statements so far do not suggest the crisis has been managed well. For example, Apple says an automated system performs the matching and that human reviewers correct its erroneous decisions. This raises the question of how many people can view the photos in total, and creates reservations about end-to-end encryption.
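Apple has described a threshold design for the human-review step: an account is flagged for reviewers only after the number of on-device matches crosses a limit, a figure Apple has publicly put at around 30. The sketch below is a simplified illustration of that gating logic, not Apple’s implementation; the function name and counter are assumptions for the example.

```python
# Simplified illustration of a threshold-gated review flow, not Apple's
# implementation. Matches accumulate per account, and human reviewers
# are involved only after the threshold is crossed.
MATCH_THRESHOLD = 30  # Apple has publicly cited a figure around 30


def needs_human_review(match_count: int) -> bool:
    """Return True once enough matches accumulate to involve a reviewer."""
    return match_count >= MATCH_THRESHOLD


print(needs_human_review(1))   # a single match stays machine-only
print(needs_human_review(30))  # threshold reached: humans step in
```

The threshold is meant to bound both false positives and, as the article notes, the amount of material any human reviewer can ever see.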
It is almost impossible for Apple to sell users on the idea of examining every image on the device, at least at first. Of course, Apple could press ahead and wait for the reactions to fade over time, but the privacy the company has promised so far would be broken. It therefore needs to find the right way to communicate.
What will Apple do?
First of all, whatever Apple decides to do, it needs to do quickly. A change made to iOS today will have an impact over the next year, meaning not only the iPhone 13 but also the iPhone 14 will be affected. Alternatively, the firm could ship interim operating system updates, but that would push consumers away.
The current situation has turned into a nightmare for Apple. The backlash against WhatsApp’s privacy policy update and Google’s FLoC proposal now seems to be turning toward Apple. How Apple will solve this problem is eagerly awaited.