August 11, 2021
either/view ⚖️
When a pioneer succumbs

To: either/view subscribers

Good morning. Did you check out our ‘Know Your Rights’ edition published last Saturday? We wrote about your right to access free water and use the restrooms of any hotel even if you are not a customer. It’s an interesting and useful read. Check it out here.


Is Apple’s new child safety feature a cause for concern?

There’s something fantastic about Apple products. Call it fascination or craze, but many people desire to own an Apple device. A big reason for this is Apple’s commitment to privacy. Time and again, the company has reminded us that ‘Privacy is a fundamental right.’

Well, it seems all’s not well for Apple’s image as a privacy advocate. Recently, the company revealed its big plans to introduce a child safety feature in iOS 15: it will soon start scanning Apple devices for photos containing child sexual abuse material (CSAM). If someone is found to be in continuous possession of such sexually explicit images, they will be flagged by the company and reported to law enforcement.

Now, let’s face it: this feature is going to be a major help. It will put a check on the distribution of child sexual abuse material. So what is the critics’ real concern with it? Privacy. Many people worry that this is only the beginning of a breach of their privacy. If the company can scan images for CSAM now, it might soon expand the scanning for other purposes. What would then become of the privacy the company promised?


For many years now, Apple’s selling point has been its core value – privacy. The spectacular 2016 standoff between Apple and the FBI stands as proof that it hasn’t lied about this. Even at the request of the top law enforcement agency, the company refused to allow access to iPhone users’ data. It assured customers that it would never betray them and open a backdoor to the FBI.

Everyone’s respect for Apple soared, and it largely stays that way. Of course, all of us appreciate that the company is taking steps to combat child sexual abuse; nobody condones sharing CSAM. In fact, big tech companies like Microsoft, Google and Facebook are already involved in tracing and reporting CSAM. So this is not a new practice, and nobody is against the company’s intention.

Apart from the scanning feature, Apple plans to add a safeguard to protect minors from viewing inappropriate content. Whenever a child receives or attempts to send such an image on their iMessage account, they will be warned that sensitive content is ahead. The child can then proceed to view it or avoid opening the image. You could say the mechanism works like a parent trying to shield their child from harmful material. If the child decides to view it anyway, their actual parent will be notified by the company.
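The flow described above can be pictured as a simple decision rule. To be clear, this is a hypothetical sketch of the described behaviour, not Apple’s actual code – the function name and event strings are our own:

```python
def handle_image(is_sensitive: bool, child_views_anyway: bool) -> list[str]:
    """Hypothetical sketch of the iMessage safeguard described above:
    warn the child first; notify the parent only if the child chooses
    to view the flagged image anyway."""
    events = []
    if is_sensitive:
        events.append("warn child: sensitive content ahead")
        if child_views_anyway:
            events.append("show image")
            events.append("notify parent")
    else:
        events.append("show image")
    return events

# A sensitive image the child opens anyway triggers a parent alert.
print(handle_image(True, True))
# A sensitive image the child declines to open triggers only the warning.
print(handle_image(True, False))
```

The key design point is that the warning comes first and the parent alert only fires on the child’s decision to proceed.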

But this feature is largely being seen as a double-edged sword. Its intention is definitely good, but it has the potential to compromise user privacy.

New feature violates privacy

Now, Apple has been very devoted to its consumers. The brand is all about users’ rights and their privacy. Take, for instance, its advertising slogan: “What happens on your iPhone, stays on your iPhone.” But after the company’s announcement of this feature, people are saying, “No, it does not.” They worry that their trust in Apple’s privacy policies is about to be shattered.

Many Apple consumers came together and penned an open letter asking the company to reconsider the feature. Firstly, they argue, it’s not possible to create a system that will scan only for sexually abusive content involving children. So every photo you upload to iCloud will be scanned, significantly undermining your privacy as an iCloud Photos user.

Next, let’s take the case of detecting sexually explicit images being sent to a minor’s device. When this feature rolls out, children will be notified whenever they share or receive a sensitive image on iMessage. Note that the sensitivity of the image is decided by an algorithm, and it’s not news that algorithms sometimes get the math wrong and classify files under the wrong category. This means that not only is the company voluntarily peeping into our images and messages, it will also break its commitment to “end-to-end encryption.”

Again, scanning for CSAM does not come from bad intentions. But people think it sets a precedent for scanning other files in the future. Think of it as a slow, passive breach of privacy. Nobody would object to this feature, because it is morally correct to stop child sexual abuse. However, it could be a stepping stone to gradually opening a backdoor to the government. For all we know, one day our data privacy might be compromised.

The fear of the company letting governments view our data is not unwarranted. There have been instances where Apple has bent its privacy rules to fit the demands of local governments. People cite how the company agreed to accommodate government censorship in the Chinese App Store, and how it removed FaceTime from devices sold in Saudi Arabia because the government there opposes encrypted phone calls. So how long before the company bows to governmental pressure and gives away our privacy?

Still committed to privacy: Apple

Apple is very clear on its stance. This feature is its way of taking responsibility in the fight against CSAM. As we discussed, many companies have been doing this already, so it’s high time Apple joined them in catching the bad guys.

But here’s the flash news: Apple says its new feature will not compromise privacy the way other companies’ scanning does. It is designed to identify accounts with CSAM while preserving user privacy. Let’s look into the details to understand clearly.

Firstly, not all of your images will be scanned by the company. Only the images you choose to upload to iCloud will be checked. Before you upload them, Apple’s brand new CSAM detection technology – NeuralHash – kicks in. The system comes with a database of hashes of known CSAM images. So essentially, NeuralHash matches the photos you are trying to upload to iCloud against this database. If there is a match, the company will be notified. Else, nothing will happen.

The best part is that the technology never views your images directly. Each image is converted into a string of letters and numbers, called a hash, and only these hashes are compared with the database. Plus, this whole process happens on your device, and the company does not gain access to your images. So it is not a case of needless privacy invasion.
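To get a feel for hash matching, here is a rough sketch. Two assumptions to flag: Apple’s real NeuralHash is a perceptual hash designed to tolerate small image edits, while this illustration uses SHA-256 (an exact-match hash) purely to show the idea – only hashes, never the images themselves, are compared; and the “database” here is a stand-in with made-up byte strings, not real data:

```python
import hashlib

# Illustrative database of known flagged images, stored as hashes
# rather than as the images themselves (stand-in byte strings only).
known_flagged_hashes = {
    hashlib.sha256(b"flagged-image-1").hexdigest(),
    hashlib.sha256(b"flagged-image-2").hexdigest(),
}

def matches_database(image_bytes: bytes) -> bool:
    """Hash the image on-device and check for membership in the
    database; the raw image never leaves the device."""
    return hashlib.sha256(image_bytes).hexdigest() in known_flagged_hashes

print(matches_database(b"flagged-image-1"))  # True: hash found in database
print(matches_database(b"holiday-photo"))    # False: no match, nothing happens
```

Notice that the check operates entirely on hashes, which is why the scanning can happen without the images being readable by anyone.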

Moreover, not every match is legally reported. Only users whose count of matched CSAM images crosses a set threshold will be identified by the software. Once that happens, the flagged images will be manually reviewed by the company in their original format. Yes, your images can be viewed, but only if your account keeps getting flagged. If the review confirms you are possessing CSAM, your account will be disabled and reported. So the process is quite elaborate and effective. And if what Apple says is right, there is only a one-in-a-trillion chance of an account being falsely reported.
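The threshold step can be pictured like this. The threshold value below is a hypothetical number for illustration only – Apple had not published the exact figure at the time of the announcement:

```python
THRESHOLD = 30  # hypothetical value, for illustration only

def flag_for_review(matched_image_count: int, threshold: int = THRESHOLD) -> bool:
    """An account is surfaced for manual review only once its count of
    database-matched images exceeds the threshold; a stray false match
    or two never triggers anything on its own."""
    return matched_image_count > threshold

print(flag_for_review(2))   # False: a couple of matches stays below the limit
print(flag_for_review(45))  # True: sustained matches trigger manual review
```

This is why the system is described as elaborate: one accidental match does nothing, and only a pattern of matches leads to human review.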


For the Right:

Consent during sex should always be legally essential – Marriage or No Marriage

For the Left:

Why Dalit is no longer an empowering word for some marginalised communities in UP


Reel to real (Madhya Pradesh) – Remember the movie ‘Special 26’, starring Akshay Kumar as Ajay Singh, who along with his comrades poses as a CBI officer and loots money? It has now happened for real. A gang of six robbed ₹2 lakh at gunpoint while posing as CBI officers. But the police were quicker, these robbers are no Ajay Singh, and this is not a film. So even though they took away the CCTV recording hard disk, the robbers were nabbed by the police. And guess what? One of them told the police he was inspired by the Hindi film and planned the robbery accordingly. Such inspirations!

Women in Sports (Tamil Nadu) – Women may be winning Olympic medals for the country, but issues of discrimination still persist. For instance, Sameeha, a long jump and 100m track athlete, has been denied a spot in India’s squad for the championships in Poland. She is a three-time gold winner in national athletic championships for deaf athletes, and she had cleared the selection round. Then why was she dropped, you ask? She is the only woman athlete to have been selected, and the Sports Authority of India (SAI) claims there aren’t enough funds to arrange an escort for one person. On July 26, Kanyakumari MP V Vijayakumar wrote to Union Sports Minister Anurag Thakur requesting that she be included in the squad, adding that the funds could be arranged. Alas, there has been no reply yet. With the team expected to leave on August 14, a talented athlete is being kept out of selection because she is a woman.

Lathi Charge (Jharkhand) – Visuals of police lathi-charging a group of girls went viral. Here is what happened. On August 6, a group of girl students gathered before State Minister Banna Gupta to protest exam results. They were unhappy over the results declared by the Jharkhand Academic Council (JAC) and demanded reconsideration. The girls reportedly made their way into the venue where Gupta was chairing a meeting, prompting the lathi charge by the police. An enquiry committee has been set up to look into the matter. Addressing the students, State Education Minister Jagarnath Mahto said there are well-established procedures for reassessment and that students should approach the grievance cell. While a further probe into the matter is underway, we hope that justice prevails.

Activist turned Politician (Gujarat) – If you want to change the system, you have to enter the system. That is what farmer activist and founding president of the Khedut Ekta Manch, Sagar Rabari, said as he joined the Aam Aadmi Party (AAP). The party is rejoicing that a person who has fought for Gujarat’s farmers for many years has joined them. Rabari joined AAP because its workers are in sync with his ideology, and he believes AAP is the only party that can offer an alternative form of governance. Whether that’s true, only time will tell.

Transportation Peace (Mizoram) – The Assam-Mizoram border conflict is a long-standing dispute, and the issue has kept brewing over the past few weeks. There is yet another development, but this time it hints at peace. The economic blockade staged by Mizoram locals after the July 26 border clash was lifted on August 9. Since then, over 600 vehicles carrying goods and people have moved smoothly along the highway. People have also been asked to welcome others and show hospitality. Earlier, both states agreed to resolve the issue amicably. The change in transportation is a positive sign for peace.


₹3,355 crore – The value of electoral bonds sold in 2019-20. FYI, electoral bonds are a way to donate funds to political parties anonymously. In 2019-20, the BJP got ₹2,555 crore, a whopping 76% of the total funds donated. The Congress party, on the other hand, received only ₹318 crore.