Apple’s recent FaceTime vulnerability raises an ongoing concern: as a society, we want it both ways, expecting perfection in an imperfect world while all too willingly jumping with both feet into the technology black hole, adopting the latest technology, installing apps and programs without hesitation, and embracing every medical advancement or tech fad.
Where does the line between personal and corporate due diligence fall? This raises the question: who is, or should be, accountable and liable? The user? The developer? Anyone at all?
For clarity, I am not a lawyer. Like most of my blog articles, this is the view of a guy looking in from the outside. However, does that necessarily make me wrong?
Apple FaceTime Vulnerability
The latest poster child for expecting perfection in an imperfect world is technology giant Apple, as news breaks of a vulnerability in the group chat functionality of its FaceTime application that allows users to eavesdrop on the people being called, even if they didn’t pick up the call. The simple exploit works with any pair of iOS devices running iOS 12.1 or later. “The bug lets you call anyone with FaceTime, and immediately hear the audio coming from their phone – before the person on the other end has accepted or rejected the incoming call,” according to Benjamin Mayo at 9to5Mac, who first broke the story and adds that “there’s a second part to this which can expose video too…”. This begs the question: should Apple be liable for this, or any other, security vulnerability?
Apple Under Attack over FaceTime Bug
A Houston lawyer argues “yes,” having filed a lawsuit against Apple over the security vulnerability that let people eavesdrop on iPhones using FaceTime, alleging that Apple “failed to exercise reasonable care” and that Apple “knew, or should have known, that its Product would cause unsolicited privacy breaches and eavesdropping.” The suit alleged Apple did not adequately test its software and that Apple was “aware there was a high probability at least some consumers would suffer harm.” I would argue differently, for to err is human.
We are human! We commonly quote Alexander Pope’s proverb, “to err is human; to forgive, divine” (An Essay on Criticism). Yet we expect perfection? For corporations such as Apple, that expectation carries significant liability. But should it? iOS is estimated to contain over 10 million lines of code, yet we expect perfection across all of them: odds on the order of 1 in 10,000,000 per line. Do we expect to win the lottery? Do we expect our choice of a girl over a boy at 1 in 2? I admit the latter is a bit of a stretch in comparison; however, considering the exponential gap between 1:2 and 1:10,000,000, maybe not. So how can we expect industry to know every outcome of every possibility? Applications drive the world, our smartphones, our cars, our industries; at those odds, finding an error is a given, an expectation, and it should come with a parallel level of tolerance.
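To put those odds in perspective, here is a back-of-the-envelope sketch in Python. The defect densities below are hypothetical round numbers chosen for illustration, not Apple’s actual figures; published industry estimates of residual defects per thousand lines of shipped code vary widely.

```python
# Back-of-the-envelope illustration of why a zero-bug expectation is
# unrealistic at scale. All figures are assumptions, not Apple data.
lines_of_code = 10_000_000  # rough public estimate for iOS

# Assumed residual defects per 1,000 lines (KLOC) after testing,
# from optimistic to more typical.
for defects_per_kloc in (0.1, 0.5, 1.0):
    expected_defects = lines_of_code / 1000 * defects_per_kloc
    print(f"{defects_per_kloc} defects/KLOC -> ~{expected_defects:,.0f} residual bugs")
```

Even at the optimistic end of these assumed rates, a codebase of that size would be expected to ship with hundreds of undiscovered bugs, which is the point: at 10 million lines, the question is not whether a flaw exists, but when someone stumbles onto it.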
Software Development Model
One of my favourite quotes is from Donald Rumsfeld: “As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know.” This applies to all aspects of life, including software development.
We know what we know:
We test our code, its structure, its syntax, and confirm it works as expected. We develop tests; we try to break the software, find the bugs, find the exploits.
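As a minimal sketch of what “known knowns” testing looks like, consider the toy Python example below. The `call_recipient` function is a hypothetical stand-in for a calling API, not Apple’s code; the tests assert the behaviour the specification describes.

```python
# A toy sketch of "known knowns" testing. call_recipient is a
# hypothetical stand-in for a calling API, not Apple's actual code.
def call_recipient(has_answered: bool) -> dict:
    """Audio should only flow once the recipient has accepted the call."""
    return {"audio_active": has_answered}

# The tests we know to write: the behaviour the spec describes.
def test_audio_off_before_answer():
    assert call_recipient(False)["audio_active"] is False

def test_audio_on_after_answer():
    assert call_recipient(True)["audio_active"] is True

test_audio_off_before_answer()
test_audio_on_after_answer()
print("known-known tests passed")
```

The FaceTime bug lived outside tests like these: no one thought to write an assertion about what happens when a caller adds their own number to a group call mid-ring, because no one knew that path existed.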
We know what we don’t know:
Humans are also creative, and we all bring different perspectives to an issue. This is why software developers like Apple run open beta and developer programs. They open their software to professionals and tech geeks to play, to explore, to exploit, to test; to find what the development team may have missed; to look at the software, the app, the interface with a different perspective.
We don’t know we don’t know:
What more can we expect from a developer? Shouldn’t the existing beta model be transparent enough? Shouldn’t the existing beta model protect industry from the “we don’t know we don’t know”?
Going back to Apple’s recent FaceTime vulnerability: tens of thousands of users have tested and explored every aspect of FaceTime, both as Apple employees and as members of the public beta program. In Rumsfeld’s words, we didn’t know what we didn’t know. None of these users thought to test for this vulnerability. Who can, or should, be blamed for being human?
Where does corporate liability begin?
Should Apple be liable for the FaceTime vulnerability? Let’s look at the facts as they are known.
- Apple receives hundreds, if not thousands, of feature requests and bug notifications every week. It takes time to review, test, and respond.
- Apple disabled the group calling feature on its devices once the exploit was identified.
- Apple had a clear and transparent process for testing the group calling feature before it was incorporated into an official update.
Considering the lawsuit as filed, did Apple “fail to exercise reasonable care”, and did it “know, or should have known, that its Product would cause unsolicited privacy breaches and eavesdropping”? I would argue no. The existing testing and beta process was, presumably, followed; simply put, as a community of Apple users we did not consider testing that feature. How, then, could Apple have been “aware there was a high probability at least some consumers would suffer harm”? Again: tens of millions of lines of code. Playing devil’s advocate, it would be easy to argue that Apple “should have recognized” that you could involuntarily connect to a group conversation. However, it appears that both Apple and, more importantly, the beta-testing public failed to recognize this as a potential issue (we don’t know we don’t know).
Apparently society expects everyone to have a crystal ball.