Like all other corporate titans, Apple has every desire to know more about its customers. At the same time, however, the Cupertino-based giant has quite successfully branded itself as Silicon Valley's sole defender of privacy. Now, the company may have figured out a way to collect its users' data without sacrificing privacy. It calls the approach "Differential Privacy."

During its Worldwide Developers Conference in San Francisco last Monday, Apple delivered a keynote address through its Senior VP of Software Engineering Craig Federighi. The exec said that to strengthen Apple customers' privacy, the company doesn't assemble user profiles for marketing purposes, ensures end-to-end encryption of messages sent through Messages and FaceTime, and, as much as possible, keeps computation involving private data on users' personal devices rather than processing it on Apple's servers.

Federighi also made a cautious, careful, and well-phrased transition to acknowledging the importance of collecting private data in creating good software in an era when big data analysis and increasingly autonomous machine learning are becoming staples.

"Differential privacy is a research topic in the area of statistics and data analytics that uses hashing, sub-sampling and noise injection to enable this kind of crowdsourced learning while keeping the information of each individual user completely private," Federighi explained, Tech Crunch reported.

The "Differential Privacy" isn't exactly new. For years, academics have been the studying the concept. But Apple's forthcoming rollout of iOS 10 will make a big difference in terms of applying the concept to collect and analyze a huge amount of user data from its devices as well as iOS tools and features.

How does it work?

In a nutshell, "Differential Privacy" refers to "the statistical science of trying to learn as much as possible about a group while learning as little as possible about any individual in it." What this means is that Apple can get as much data it needs and extract useful output as to the users' general preference and patterns of behavior without committing a privacy violation against a specific user.

"With a large dataset that consists of records of individuals, you might like to run a machine learning algorithm to derive statistical insights from the database as a whole, but you want to prevent some outside observer or attacker from learning anything specific about some [individual] in the data set," says Aaron Roth, a University of Pennsylvania computer science professor whom Apple quoted in its address, Wired reported.

"Differential privacy lets you gain insights from large datasets, but with a mathematical proof that no one can learn about a single individual."

In theory, at least, the concept should prevent hackers or government intelligence agencies from acquiring specific knowledge of an individual's life or background. But of course, some security and other tech experts are not wholly convinced, saying that the concept is still very much in its "infancy."

"What remains to be seen is how these features will be implemented. Implementing privacy-protecting algorithms are often tricky to get right-just because it works on paper doesn't mean it will act properly when implemented in software," remarked Electronic Frontier Foundation's technology expert William Budington as quoted by Fast Company.