Apple is letting developers peer into the core of its mobile operating system for the first time - a move that could have major implications for security.
Last week the tech firm released a preview version of iOS 10.
Its kernel - the central component that controls how software is processed by a device's hardware - was unencrypted.
The move should make it easier for researchers to flag flaws that could otherwise be exploited by hackers.
Experts say that will make it harder for organisations to keep secret techniques used to overcome privacy measures on iPhones and iPads.
In a recent high-profile case, the FBI refused to share an exploit it had used to crack an iPhone used by a gunman who had killed several people in San Bernardino, California.
"In general, transparency is good for security," commented Dr Steven Murdoch from University College London.
"Well-resourced attackers like government intelligence agencies have always been able to find vulnerabilities.
"And while Apple's move will make that job easier, it will also make it easier for less well-resourced security researchers to find the vulnerabilities and get them fixed."
Apple has not commented on the matter, which was first made public by a report in MIT Technology Review.
It noted that the move could also benefit "jailbreakers" - people who release code that removes an operating system's restrictions so that a wider range of software can be used.
Unlike many tech firms, Apple does not currently run a bug bounty programme that pays researchers to alert it to flaws.
One researcher suggested that it might now be a good idea to introduce one.
"If Apple has deliberately opened up its code, then it needs to make sure it is very thoroughly reviewed by the community and the firm must then be very responsive in fixing stuff that is found," said Ken Munro from Pen Test Partners.
"A bug bounty would get everyone interested, meaning the security community would be working for Apple for a comparatively low cost."