
'Child Sexual Abuse Is Stored On iCloud. Apple Allows It': Anti-Apple Posters Emerge In Front Of Apple Park

Apple Inc.'s (NASDAQ:AAPL) decision to abandon plans to scan iPhones for child sexual abuse material (CSAM) has invited the wrath of protestors, who have now set up banners in front of Apple Park to coincide with the iPhone 15 launch.

What Happened: Protestors have set up anti-Apple banners in front of Apple Park, condemning Apple's decision to abandon its iPhone CSAM detection plans and implying that the company is enabling the storage of such material.

See Also: Want To Upgrade To The iPhone 15 or 15 Pro From iPhone 14? Here Are The Trade-In Prices Apple Is Offering

"Child sexual abuse is stored on iCloud. Apple allows it," says the poster put up by Heat Initiative, a collective of child safety experts and advocates. Danish Khan (@dankh4n) spotted and shared the poster on Instagram.

Image Credits – @dankh4n on Instagram.

The Heat Initiative's website outlines its effort to press Apple to make good on its commitment to protecting children. Back in 2021, Apple announced plans to scan iPhones and iCloud for CSAM – a move child safety experts praised but privacy advocates condemned.

The Electronic Frontier Foundation, a digital rights group, called it opening a “backdoor” to the private lives of users.

After facing immense backlash, Apple quietly pulled references to CSAM detection from its website and called the plan off entirely in 2022.

Why It Matters: CSAM is a huge problem. A case in point: according to the Heat Initiative, the iCloud account of a 32-year-old man was found in 2019 to contain nearly 2,400 child abuse images.

"We are calling on Apple to detect, report, and remove child sexual abuse images and videos from iCloud," the collective says.

Here's What Apple Has To Say

Apple, for its part, says its initial CSAM detection plans would have compromised user privacy – a conclusion it reached after immense backlash from cybersecurity and privacy experts.

The company said the scanning plan would not only create a privacy issue but also introduce security risks and new attack vectors for malicious parties.

Instead, Apple has settled on a different approach: on-device detection within the apps themselves – Messages, FaceTime, AirDrop, and others, for example, now include on-device nudity detection.

Apple has also released an application programming interface (API) so third-party apps can implement the same detection without adversely impacting user privacy – this aligns with one of the Heat Initiative's demands, though its protests suggest the group is not entirely satisfied with the steps Apple has taken so far.
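The article doesn't name the interface, but Apple's publicly documented route for third-party apps here is the SensitiveContentAnalysis framework introduced with iOS 17. The following is a rough sketch of how an app might use it, assuming that is the API in question; the helper function and file URL below are illustrative, not from the article.

```swift
import Foundation
import SensitiveContentAnalysis  // iOS 17+ / macOS 14+; third-party apps also need Apple's client entitlement

/// Checks whether an image on disk likely contains nudity, using Apple's
/// on-device model. Nothing leaves the device; the analysis runs entirely locally.
func imageIsSensitive(at url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The check only runs if the user has turned on Sensitive Content Warnings
    // (or Communication Safety via Screen Time); otherwise the policy is .disabled.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Run the on-device classifier against the image file.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```

Notably, this framework flags nudity on-device rather than matching iCloud photos against known CSAM hashes, which is the gap the Heat Initiative's campaign appears to be highlighting.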

Check out more of Benzinga’s Consumer Tech coverage by following this link.

Read Next: How To Preorder Apple's iPhone 15: Price, Where To Order And Everything You Need To Know

 
