Apple’s Use of Bug Report Data for AI Training in iOS 18.5 Beta Raises Privacy Concerns

In the latest iOS 18.5 beta, Apple has introduced a policy that allows the company to use data from user-submitted bug reports to train its artificial intelligence (AI) models. The change has sparked discussion among developers and privacy advocates because it offers no opt-out option.

Users in Apple’s beta testing program who report issues through the Feedback app must now consent to the use of their submissions—including logs and diagnostic files—for AI training purposes. The updated privacy notice within the Feedback app states:

“Apple may use your submission to improve Apple products and services, such as training Apple Intelligence models and other machine learning models.”

This means that any attachments included in a bug report, such as sysdiagnose files, can be used by Apple to enhance its AI capabilities. The only way to keep this data out of training is to refrain from submitting bug reports altogether.

The change was first noticed by developer Joachim, who highlighted the issue on social media. He criticized Apple for updating the privacy terms without providing a clear opt-out mechanism. This sentiment has been echoed by others in the developer community who find the change invasive, despite Apple’s assurances of privacy safeguards.

Apple asserts that its AI training process employs Differential Privacy, a technique that injects statistical noise into user data so that individual contributions are difficult to trace back to specific people. The same method is already in use for features like Genmoji and Image Playground.
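To illustrate the general idea behind Differential Privacy (not Apple’s actual implementation, whose details are not public), here is a minimal sketch of the classic Laplace mechanism: a query over user data gets random noise added before it leaves the device or is aggregated, so any single person’s contribution is masked. The function names and the ε parameter value are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a zero-centered Laplace distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def privatized_count(true_count: int, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one user
    # changes the result by at most 1, so Laplace noise with scale
    # 1/epsilon yields epsilon-differential privacy for that query.
    return true_count + laplace_noise(1.0 / epsilon)

# Each noisy report hides the individual, but an aggregate over many
# reports still tracks the true value closely.
noisy = privatized_count(100, epsilon=1.0)
```

A smaller ε means more noise and stronger privacy; averaged over many submissions, the noise cancels out, which is why the technique suits large-scale training data while limiting what can be learned about any one contributor.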

However, privacy advocates argue that making consent a precondition of bug reporting, with no alternative offered, crosses a line. Developers working with Apple’s beta software are often the first to identify flaws, and they now face a choice between helping Apple improve iOS and retaining control over their data.

While users can opt out of broader Apple Intelligence training by navigating to Settings > Privacy & Security > Analytics & Improvements and toggling off Share iPhone & Watch Analytics, this does not prevent the use of bug report content submitted through the Feedback app.

This policy marks a significant expansion of how Apple handles diagnostic data. The company has not publicly responded to the criticism or clarified whether future updates will include an opt-out option.

For now, participants in Apple’s beta program who plan to report issues should be aware that their data will contribute to training Apple’s AI models, regardless of their preferences.