Unlock the Future: Apple Challenges You to Win a Million with Its Revolutionary AI!

If you fancy getting your hands on $1,000,000 and are something of a cybergeek, clear your weekend – as Apple is offering the life-changing sum to anyone who can get the better of its sophisticated new AI system.

We’ve all heard a lot about Apple Intelligence over the last few months, and it is finally set to be released next week when iOS 18.1 launches on Monday (28 October).

iPhone users have been eagerly awaiting the software update, which the tech giant says will open up a world of ‘new possibilities’ for their devices, while CEO Tim Cook promised it ‘raises the bar for what an iPhone can do’.

The new AI features will bring improvements to your emails, messages, notification centre and photos, as well as introducing AI writing tools, voice transcription and reimagined emojis.

Apple has made some pretty big assertions about the abilities of its AI system, particularly when it comes to its security.

According to the firm, Apple Intelligence will ‘draw on your personal context without allowing anyone else to access your personal data – not even Apple’.

It explains that the AI system is ‘designed to protect your privacy at every step’, and although it is ‘aware’ of your personal information, it doesn’t collect any of it.

The company is so confident in its security system and its Private Cloud Compute (PCC) technology that it has offered $1 million (£771,500) to anyone who can crack it.

Apple has called on tech whizzes to try and hack its Private Cloud Compute (PCC) technology (GREG BAKER/AFP via Getty Images)

What is Private Cloud Compute (PCC)?

For those who have no idea what PCC is, let us introduce you.

Apple explains that this ‘groundbreaking cloud intelligence system’ was created specifically for the purpose of private AI processing.

These servers will receive and process users’ Apple Intelligence requests when they are too complex for the device to handle on its own – and once a request is done and dusted, the data is deleted immediately.

Thanks to end-to-end encryption, not even the tech firm itself can access the data sent to PCC, which is built with custom Apple silicon and a ‘hardened operating system designed for privacy’.
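To give a rough flavour of how that works in general – this is a toy Python sketch of end-to-end encryption as a concept, not Apple’s actual protocol, ciphers or key handling – the device seals its request against a key that only the server node holds:

```python
# Toy illustration of end-to-end encryption (NOT Apple's actual protocol):
# the device seals its request against the compute node's public key, so
# anything in the middle - including the operator - sees only ciphertext.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def derive_key(shared_secret: bytes) -> bytes:
    """Stretch the raw Diffie-Hellman secret into a 32-byte AEAD key."""
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"toy-e2e-demo").derive(shared_secret)


# The compute node holds a private key; devices only ever see the public half.
node_key = X25519PrivateKey.generate()

# Device side: generate an ephemeral key, agree a shared secret, encrypt.
device_key = X25519PrivateKey.generate()
aead_key = derive_key(device_key.exchange(node_key.public_key()))
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(aead_key).encrypt(
    nonce, b"summarise my inbox", None)

# In transit, an observer sees only ciphertext plus the device's public key.
# Node side: only the node's private key can recreate the secret and decrypt.
node_aead_key = derive_key(node_key.exchange(device_key.public_key()))
print(ChaCha20Poly1305(node_aead_key).decrypt(nonce, ciphertext, None))
```

PCC is described as layering attestation on top of this sort of scheme, so your iPhone will only send data to a node it can verify – but the basic point stands: without the node’s private key, the request is unreadable in transit.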

Apple describes the tech as the ‘most advanced security architecture ever deployed for cloud AI compute at scale’, a claim that is sure to sting its competitors.

PCC was built with a core set of requirements, including stateless computation of personal user data, enforceable guarantees, no privileged runtime access and, most importantly, non-targetability and verifiable transparency.

To put these latter two to the test, Apple is encouraging people to try and hack into the system.

$1 million bounty

On Thursday (24 October), Apple called on ‘all security researchers – or anyone with interest and a technical curiosity’ to conduct their ‘own independent verification’ of the company’s big claims about the capabilities of PCC.

It said in an announcement: “To further encourage your research in Private Cloud Compute, we’re expanding Apple Security Bounty to include rewards for vulnerabilities that demonstrate a compromise of the fundamental security and privacy guarantees of PCC.”

Third-party auditors have already had a go at it, but now, anyone can attempt to break into the system.

You could earn $1 million for cracking into the supposedly iron-clad system (Getty Stock Image)

And for your valiant efforts, you can earn a range of different rewards – but one lucky bugger could get their hands on a whopping $1 million if they can remotely run arbitrary code on the system.

All they have to do is avoid detection and access sensitive info. Simple, right?

The idea, in Apple’s words, is that researchers will be able to ‘learn more about PCC and perform their own independent verification of our claims’ by trying to break into it.

Hackers don’t have to start from scratch, as Apple is granting access to the source code for key components of PCC, so people can get their heads around the software before they take a whack at it.

Those who have a go will also be given a security guide and access to a Virtual Research Environment that runs on macOS.
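If you fancy a look before committing your whole weekend, a first step might be as simple as the sketch below (Python, assuming you have git installed; the repository URL is the github.com/apple/security-pcc one Apple published alongside the announcement):

```python
# Rough sketch: fetch the PCC source components Apple has published so you
# can read through them before taking a whack at the system itself.
# Assumes git is installed and on your PATH.
import subprocess

subprocess.run(
    ["git", "clone", "--depth", "1",
     "https://github.com/apple/security-pcc.git"],
    check=True,  # raise an error if the clone fails
)
print("PCC source fetched - read the Security Guide before diving in.")
```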

There is $100,000 up for grabs for anyone who manages to execute unattested code – code that hasn’t been signed off by the company – on the system.

You could also bag a tidy $250,000 if you find a way to get into PCC remotely and expose a user’s request data, and if you manage to find any other vulnerabilities in the tech, you could be rewarded financially too.

Apple said that people who find any weak points could still earn some cash, ‘even if it doesn’t match a published category’.

“We hope that you’ll dive deeper into PCC’s design with our Security Guide, explore the code yourself with the Virtual Research Environment, and report any issues you find through Apple Security Bounty,” it added.

“We believe Private Cloud Compute is the most advanced security architecture ever deployed for cloud AI compute at scale, and we look forward to working with the research community to build trust in the system and make it even more secure and private over time.”

So, what are you waiting for?
