Apple won’t weaken encryption on your iPhone without a fight

Apple is fighting to keep your data encrypted. The tech giant has now refused the FBI's request to devise an operating system that would allow the security services access to encrypted data. We look at why Apple is so determined to secure your information

Apple chief executive Tim Cook has said the company will challenge a court order to help FBI investigators build a “master key” to access encrypted data.

The specific phone that the FBI wants to access belongs to San Bernardino gunman Syed Rizwan Farook. But in a message to Apple customers, Cook stated that he believes the FBI’s current demands would represent only the beginning of its encroachment on privacy and would set a “dangerous precedent”.

The Apple CEO also criticised the FBI’s unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority, rather than asking for legislative action through Congress.

“The implications of the government’s demands are chilling,” said Cook. “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.

“The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”

Gaining access through the backdoor

The FBI has requested Apple build a version of iOS – iPhone’s operating system – that could be installed and used to circumvent current security features.

At present Apple says it has complied with valid subpoenas and search warrants, but has taken exception to what it sees as an “overreach by the US government”.

“The FBI may use different words to describe this tool, but make no mistake: building a version of iOS that bypasses security in this way would undeniably create a backdoor,” says Cook. “While the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

Apple has said that it objects to the FBI’s request “to expose its customers to a greater risk of attack”, and that the FBI’s court-ordered mandate amounts to asking the engineers who built strong encryption into the iPhone “to weaken those protections and make our users less safe”.
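
For a sense of scale, here is a purely illustrative sketch – not based on any Apple code, and with an assumed per-guess timing – of why a short numeric passcode is only as strong as the protections that stand in front of it, such as retry limits, escalating delays between attempts and the option to erase the device after too many failures.

```python
# Purely illustrative: the arithmetic of guessing a short numeric passcode
# once retry limits, escalating delays and auto-erase are out of the way.
# The per-guess time is an assumed figure, not a measured one.

from itertools import product


def candidate_pins(length: int, digits: str = "0123456789"):
    """Yield every possible numeric passcode of the given length."""
    for combo in product(digits, repeat=length):
        yield "".join(combo)


if __name__ == "__main__":
    seconds_per_guess = 0.08  # assumption: time to test one passcode electronically
    for length in (4, 6):
        total = sum(1 for _ in candidate_pins(length))
        hours = total * seconds_per_guess / 3600
        print(f"{length}-digit passcodes: {total:,} candidates, "
              f"roughly {hours:.1f} hours to exhaust at {seconds_per_guess}s per guess")
```

On those assumptions a four-digit passcode falls in minutes and a six-digit one in about a day, which is why stripping away the surrounding protections amounts to the backdoor Cook describes.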

Apple’s motivation

Some commentators have suggested Apple’s motives might not be quite as noble as they seem at first glance. “I’m not in a position to guess whether Apple can break the encryption on its devices – that’s one of those things where you need highly skilled cryptanalysts to bang on them for some years and not find holes,” says Open Rights Group advisory council member Wendy Grossman.

“What we do know is that Apple promised its customers that it could not access their data. So either it’s infeasible, as they say, or they would be breaking their word to customers. Neither is a desirable state for a public company, so I’m not surprised they’ve gone to court.”

Whatever the tech giant’s rationale for refusing the FBI’s request, Grossman agrees with Apple’s argument that once a backdoor has been established, innocent people’s data will be exposed.

“There are always hard cases with respect to law enforcement’s desire for more information. However, Apple’s decision to provide encryption it can’t crack for its customers is a rational one because opening the gunman’s phone, for example, doesn’t just expose the gunman’s data but also data relating to innocent family members and friends and other contacts,” says Grossman.

Battling on multiple fronts

The FBI’s request to bypass the iPhone’s encryption follows proposals made by policymakers in California and New York to ban the sale of encrypted phones. In its letter to customers, Apple points out that such a policy would “hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data,” while criminals would still be able to encrypt data using tools already available to them.

“Policies such as those that have been alluded to by both the US and the UK, of banning the use of encryption where law enforcement can’t gain access, are a really bad idea, for several reasons. One, you cannot make a hole that only good guys can use, so a law like that opens all of us up to much worse and more pervasive criminal attack than we’ve seen before,” says Grossman.

“Democratic societies have long imposed limits on what law enforcement can access in an effort to balance the right to privacy of ordinary people and their right to protection from crime. Criminals plan in houses, but we don’t require that every householder deposit a copy of their house key in the local police station – this is a close analogy.”

China uses facial recognition to monitor ethnic minorities

China has been criticised for adding facial recognition to an already obtrusive surveillance system in Xinjiang, a Muslim-dominated region in the country's far west. The "alert project" matches faces from surveillance camera footage to a watchlist of suspects, and is supposedly designed to thwart terrorist attacks.

Source: Engadget

Microsoft execs say the ultimate form of AI is a digital assistant

In an interview with Business Insider, Microsoft president Brad Smith and EVP of AI and research Harry Shum have said the ultimate manifestation of AI in 20 years will be a digital assistant that will serve as an "alter ego." The two argue that we need to set ground rules for our AI assistants while we still can.

Facebook’s head of AI isn't impressed by Sophia the robot

Facebook's head of AI, Yann LeCun, isn't happy with Sophia the robot. Following a Business Insider interview with Sophia, LeCun took to Twitter to call the whole thing “complete bullsh*t”. He went on to say Sophia masquerading as a semi-sentient entity was “to AI as prestidigitation is to real magic”.

Source: The Verge

Drone saves the lives of two swimmers

Two teenage boys were rescued by a brand-new lifesaving drone in Australia, while lifeguards were still training to use the device. When a member of the public spotted them struggling in heavy surf about 700m (2,300ft) offshore, the drone was sent out and dropped an inflatable rescue pod, which allowed the pair to make their way safely to shore.

Source: BBC

Google defends the right to not let people be forgotten online

Google is going to court to defend its right not to abide by "the right to be forgotten", which it says “represent[s] a serious assault on the public’s right to access lawful information”. Two anonymous people want the search engine to take down links to information about their old convictions because search engine results attract “adverse attention”.

Source: Bloomberg

UK Police delivering daily briefings via Amazon Echo

Lancashire police have begun streaming daily briefings straight to people's homes through Amazon Echo. Users will get hourly updates as well as pictures of wanted and missing people sent directly to their devices. "Alexa works alongside traditional policing methods to inform the public about the important issues in their neighbourhoods," said PC Rob Flanagan.

Source: BBC

A quarter of ethical hackers don’t report cybersecurity concerns because it’s not clear who they should be reporting them to

Almost a quarter of hackers have not reported a vulnerability that they found because the company didn’t have a channel to disclose it, according to a survey of the ethical hacking community.

With 1,698 respondents, the 2018 Hacker Report, produced by the cybersecurity platform HackerOne, is the largest documented survey of the ethical hacking community ever conducted.

In the survey, HackerOne reports that nearly 1 in 4 hackers have not reported a vulnerability because the company in question lacks a vulnerability disclosure policy (VDP) or a formal method for receiving vulnerability submissions from the outside world.

Without a VDP, ethical "white hat" hackers are forced to go through other channels, such as social media or emailing personnel in the company, but, as the survey states, they are “frequently ignored or misunderstood”.
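
One widely used way of advertising such a channel – not something the HackerOne report prescribes, and mentioned here only as an illustration – is a security.txt file served from an organisation's /.well-known/ path. The short sketch below, using a placeholder domain, shows how a researcher might check for one.

```python
# Illustrative sketch: look for a vulnerability disclosure contact via the
# security.txt convention (a plain-text file at /.well-known/security.txt).
# "example.com" is a placeholder, not a real disclosure endpoint.

from typing import Optional
from urllib.error import URLError
from urllib.request import urlopen


def find_disclosure_contact(domain: str) -> Optional[str]:
    """Return the first Contact: entry in the site's security.txt, if any."""
    url = f"https://{domain}/.well-known/security.txt"
    try:
        with urlopen(url, timeout=5) as response:
            text = response.read().decode("utf-8", errors="replace")
    except URLError:
        return None
    for line in text.splitlines():
        if line.lower().startswith("contact:"):
            return line.split(":", 1)[1].strip()
    return None


if __name__ == "__main__":
    contact = find_disclosure_contact("example.com")
    print(contact or "No security.txt contact found")
```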

Despite some companies lacking a VDP, the hackers surveyed in the report did say that companies are becoming more open to receiving information about vulnerabilities than they were in the past.

Of the 1,698 respondents, 72% noted that companies have become more open to receiving vulnerability reports in the past year; that figure includes the 34% of hackers who believe companies have become far more open.

Unlike a bug bounty program, a VDP does not offer hackers financial incentives for their findings, but VDPs can still be highly effective.

The US Department of Defense, for example, has received and resolved nearly 3,000 security vulnerabilities in the last 18 months through its VDP alone.

India (23%) and the United States (20%) are the top two countries represented by the HackerOne hacker community, followed by Russia (6%), Pakistan (4%) and the United Kingdom (4%).

The report revealed that, because bug bounties usually have no geographical boundaries, the payments involved can be life-changing for some hackers.

The top hackers based in India earn 16 times the median salary of a software engineer. And on average, top-earning hackers make 2.7 times the median salary of a software engineer in their home country.

In terms of which demographics are attracted to a life of ethical hacking, the report found that over 90% of hackers are under the age of 35, and unsurprisingly the vast majority of hackers on the HackerOne platform are male.