An Enigma machine. Photo Credit: Wikimedia
By: Roxanne Heston, Columnist
Imagine a clunky metal typewriter with a bizarre keyboard. In the 1940s, this machine was highly sophisticated: the linchpin, and ultimate downfall, of its users’ war plans. These were Enigma machines, used by Nazi Germany during World War II for secret communications. Enigmas, or electromechanical rotor machines, converted typed words into messages that appeared to unintended recipients as gibberish. A complex algorithm allowed senders and receivers to flip their secret text between readable and unreadable formats. Since government and military plans were extraordinarily valuable during World War II, the German military obscured its communications using Enigmas. German cryptographic efforts were effective for some time; however, Allied cryptographers ultimately broke the Enigma’s cipher in 1941.[i] This cryptanalysis gave the Allies such an edge that it ended the war an estimated two years earlier than it otherwise would have.[ii]
This is one of the most iconic stories of cryptography for security and typifies the adversarial nature of many of its applications. Fortunately, it is possible to use cryptography in ways that provide benefits without escalating combat.
Cryptography includes both the magnificent and the mundane. Some messages sent online are “encrypted” (that is, made “cryptic,” or unintelligible without a decryption key) so that eavesdroppers cannot understand messages they intercept.[iii] Sensitive stored information can also be encrypted, as passwords and card details are in password management software.[iv] While private users benefit from encryption, governments, and particularly their intelligence communities, rely especially heavily on these methods.[v]
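The core idea of encryption can be illustrated with a deliberately minimal sketch: a simple XOR cipher in Python, where the same shared key both scrambles and unscrambles the message. (Real systems use far stronger algorithms such as AES; this toy is only to show the lock-and-key pattern.)

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the corresponding key byte; applying the
    # same operation twice with the same key restores the original.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # the shared secret key

ciphertext = xor_cipher(message, key)    # gibberish to an eavesdropper
plaintext = xor_cipher(ciphertext, key)  # readable again with the key
assert plaintext == message
```

Without the key, the ciphertext is uninformative; with it, decryption is trivial. That asymmetry between key-holders and everyone else is what every application below exploits.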
Since cryptography is fundamentally about securing information, it helps with more than two-party communication.[vi] For instance, blockchain, the basis of digital currencies like Bitcoin, runs on these principles. Encryption, one element of cryptography, allows blockchain users to keep a log of everyone’s digital transactions without actually seeing them. The record generated is a “decentralized public ledger,” composed of transactions (“blocks”) in a conjoined list (“chain”). As with two-party communication, the information is “locked” with a cryptographically generated key that only the intended receiver can use to decrypt the message. Unlike two-party communication, when a person sends digital currency, the encrypted versions of other users’ transactions also appear on that person’s computer. The sender now owns a copy of the ledger, which helps to confirm its integrity.[vii]
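The “chain” part of the ledger can be sketched in a few lines of Python. This is a toy model, not how Bitcoin actually works: each block simply stores the cryptographic hash of the block before it, so any copy of the ledger can be checked for tampering without trusting its sender.

```python
import hashlib
import json

def make_block(transaction: str, prev_hash: str) -> dict:
    # Each block records a transaction plus the hash of the previous
    # block; altering any earlier block breaks every later link.
    block = {"tx": transaction, "prev": prev_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("alice -> bob: 5", chain[-1]["hash"]))
chain.append(make_block("bob -> carol: 2", chain[-1]["hash"]))

# Anyone holding a copy can verify the ledger's integrity:
for prev, block in zip(chain, chain[1:]):
    assert block["prev"] == prev["hash"]
```

Because every participant holds such a copy and can run this check, no single party needs to be trusted with the authoritative record.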
As with most defensive technologies used in conflict, the moral perception of cryptography depends on who is using it and how. For instance, password-protected cellphone footage is usually great for personal privacy and innocuous to law enforcement. But sometimes cellphone footage is both the only evidence of a crime and inaccessible to law enforcement. In such cases, criminals can “go dark” and walk free while authorities’ hands are tied.[viii]
However, one future application of cryptography, coupled with artificial intelligence (AI), could be incontrovertibly good: privacy-preserving surveillance.[ix] In recent years, government surveillance has garnered pushback, fueled by events like the Snowden leaks that highlighted its reach. The public protest is reasonable, but limiting oversight would also be costly. If machines could surveil instead of humans, we could, paradoxically, retain the benefits of both oversight and confidentiality.
While the tradeoff between privacy for citizens and surveillance by government seems immutable, some existing techniques, like bomb-sniffing dogs, preserve the former without forgoing the latter. These dogs are fairly accurate and, unlike human inspectors, do not need to look inside bags to operate.[x] Cryptography and AI may bring the equivalent of bomb-sniffing to the digital realm.
Companies have data relevant to identifying criminals, and governments have the relevant methods. Today, governments either collect this data themselves or obtain it from companies. In the future, companies and governments could each provide their respective pieces of the puzzle to a jointly owned processing facility and let an algorithm flag people who pattern-match with criminal behavior. The AI could autonomously generate a list of suspicious individuals and send it to authorities. As with blockchain technologies, no one party would have all of the relevant components to build the desired product: in blockchain, a validated ledger; here, a criminal shortlist.[xi]
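A rough sense of the idea can be conveyed with a toy Python sketch: both parties blind their identifiers with a shared salt before contributing them, so the joint facility compares only opaque digests and reveals nothing but the matches. (All names and the salt here are hypothetical; a production system would need genuine secure multiparty computation or homomorphic encryption, as discussed in Trask’s post, since simple salted hashes can be reversed over small name spaces.)

```python
import hashlib

def blind(name: str, salt: str) -> str:
    # Both parties hash identifiers with a shared salt, so the
    # matching service sees only opaque digests, not raw names.
    return hashlib.sha256((salt + name).encode()).hexdigest()

SALT = "shared-secret-salt"  # hypothetical value agreed in advance

company_users = {blind(n, SALT) for n in ["alice", "bob", "carol"]}
watchlist = {blind(n, SALT) for n in ["bob", "mallory"]}

# The joint facility computes the overlap without seeing either
# party's raw data; only the matches are revealed to authorities.
flagged = company_users & watchlist
assert flagged == {blind("bob", SALT)}
```

The company never learns the watchlist, the government never sees the full customer roll, and only the intersection, the “criminal shortlist,” emerges from the computation.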
This approach will become even more feasible as machine learning (ML), currently the preferred AI technique, improves. The more a machine can perform classically human tasks (say, flagging suspicious connections or threatening language), the less humans need to examine the actions, people, or writing directly. ML has already made rapid advances in text, speech, and facial recognition.[xii]
Should this be implemented, society could face a tradeoff between the efficacy and the transparency of the evaluation process, as with any automated practice. This problem would be more pronounced, however, if data providers and algorithm developers kept their components separate. Obscuring the process will be harmful if the algorithm is biased or simply inaccurate, as has already proven true of AI-assisted criminal sentencing.[xiii] Fortunately, the better developed ML algorithms become, the smaller this tradeoff will be. With enough training, AI may become better at these investigative tasks than humans are. In some domains, such as many components of driving and medical diagnostics, it already is.[xiv]
Cryptography has the potential to transform many components of international competition. Up until now, improvements to one’s cryptographic methods often escalated a cat-and-mouse game, spurring the adversary to reciprocally improve their cryptanalysis. Such was the case with the Axis and Allied powers in World War II. Amidst the many adversarial applications of cryptography, it behooves us to find applications, like privacy-preserving surveillance, that reduce the social costs of security without prompting such dynamics.
Bibliography
[i] “History of the Enigma.” n.d. Crypto Museum. https://www.cryptomuseum.com/crypto/enigma/hist.htm.
[ii] Garfinkel, Ben. 2018. “Recent Developments in Cryptography and Why They Matter.” Oxford Internet Institute, May 1. https://www.oii.ox.ac.uk/events/recent-developments-in-cryptography-and-why-they-matter/.
[iii] “Cryptography: Roles, Market, and Infrastructure.” n.d. In Cryptography’s Role in Securing the Information Society, 1996th ed., 51–78. https://www.nap.edu/read/5131/chapter/6.
[iv] Buchanan, Ben. 2017. “Introduction.” In The Cybersecurity Dilemma. Oxford University Press.
[v] “Encryption and National Security.” n.d. MIT OpenCourseWare. https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-805-ethics-and-the-law-on-the-electronic-frontier-fall-2005/readings/read_tlp4/.
[vi] Garfinkel.
[vii] Kore. 2018. “Blockchain for the Internet of Things.” Blog. Hackernoon. January 9, 2018. https://hackernoon.com/blockchain-for-the-internet-of-things-71a06afce81.
[viii] Nakashima, Ellen. 2016. “Apple Vows to Resist FBI Demand to Crack iPhone Linked to San Bernardino Attacks.” The Washington Post, February 17, 2016. https://www.washingtonpost.com/world/national-security/us-wants-apple-to-help-unlock-iphone-used-by-san-bernardino-shooter/2016/02/16/69b903ee-d4d9-11e5-9823-02b905009f99_story.html?utm_term=.0a2f73456a1c.
[ix] Andrew Trask. 2017. “Safe Crime Detection.” GitHub. I Am Trask. June 5, 2017. https://iamtrask.github.io/2017/06/05/homomorphic-surveillance/.
[x] Garfinkel.
[xi] Garfinkel.
[xiii] Angwin, Julia, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. “Machine Bias.” ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
[xiv] Aggarwal, Alok. 2018. “Domains in Which Artificial Intelligence Is Rivaling Humans.” Scry Analytics. January 20, 2018. https://scryanalytics.ai/domains-in-which-artificial-intelligence-is-rivaling-humans/.