Apple will now automatically put a stop to any nudity on FaceTime

In the latest version of iOS, Apple automatically detects any nudity on FaceTime and immediately stops it by pausing the video and audio.

TL;DR: Apple's iOS 26 beta introduces a new FaceTime feature that automatically pauses video and audio if nudity is detected, aiming to enhance Communication Safety. This on-device machine learning tool raises privacy concerns as it intervenes during calls, with unclear application to adult accounts beyond child protections.

If you are one to get somewhat spicy over FaceTime, those days are coming to an end, as Apple has updated the app to automatically pause video and audio if any nudity is detected.


The new stipulations for using FaceTime were discovered in the iOS 26 beta. Apple did mention last month when it unveiled iOS 26 that new features were coming to child accounts, but now it appears some of those features are making their way over to adult accounts as well. 9to5Mac reports that as part of the expansion of features, "Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos."

What happens when FaceTime detects nudity? The video and audio of the call both freeze, and the app shows the following message: "Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call." There are two options to select from on this screen: "Resume Audio and Video" and "End Call". Presumably, users can simply resume the call and carry on with their spicy activity.

However, the fact that nudity was even detected and Apple's decision to interrupt the call has raised some serious questions about privacy. Notably, Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity.

Apple writes, "Because the photos and videos are analyzed on your child's device, Apple doesn't receive an indication that nudity was detected and doesn't get access to the photos or videos as a result."
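FaceTime's internal Communication Safety pipeline is not public, but Apple does ship a public framework, SensitiveContentAnalysis (iOS 17+/macOS 14+), that performs the same kind of on-device nudity detection described above. As a rough illustration of the approach, here is a minimal sketch using that framework; the `checkImage` function and its print messages are hypothetical, and only the framework calls are real API:

```swift
import Foundation
import SensitiveContentAnalysis

// Hypothetical helper illustrating on-device sensitive-content detection
// with Apple's public SensitiveContentAnalysis framework. The image is
// analyzed entirely on-device; nothing is sent to Apple.
func checkImage(at url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // Analysis only runs if the user (or a parent, for child accounts) has
    // enabled Communication Safety or Sensitive Content Warnings in Settings.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is disabled on this device")
        return
    }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            // An app would intervene here, e.g. blur the image or pause a call.
            print("Nudity likely detected")
        } else {
            print("No sensitive content detected")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```

Because the classifier runs locally, an app using this framework only learns a boolean verdict; this mirrors Apple's claim above that it receives no indication that nudity was detected.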

Currently, it remains unclear whether this is a bug in the iOS 26 beta, with a feature intended only for child accounts inadvertently applied to adult accounts, or a deliberate change. Hopefully it is the former, as enforcing this on adult accounts would be an overreach, encroaching on users' privacy under the guise of protection.

News Source:9to5mac.com

Tech and Science Editor


Jak joined TweakTown in 2017 and has since reviewed 100s of new tech products and kept us informed daily on the latest science, space, and artificial intelligence news. Jak's love for science, space, and technology, and, more specifically, PC gaming, began at 10 years old. It was the day his dad showed him how to play Age of Empires on an old Compaq PC. Ever since that day, Jak fell in love with games and the progression of the technology industry in all its forms.
