What I told nuclear regulators about quantum computers

Today I spoke at a cybersecurity roundtable at the International Atomic Energy Agency (IAEA) Safeguards Symposium in Vienna.
I’ve been looking forward to sharing my paper about quantum computing and cybersecurity, hopefully spreading awareness and bringing more developers into the process of developing standards for post-quantum cryptography.

Annotations throughout the paper will look like this.
All cited references appear at the end.
All views are my own and not associated with my employer.

If you’re new to quantum computing and post-quantum cryptography, I recommend reading recent articles in Scientific American or The Economist, or listening to a Microsoft Research podcast episode. Also see this 90-second video about the IAEA’s Safeguards program:


After making computer chips with the smallest possible transistors, researchers and technologists are pursuing new technologies, including qubits: the beginnings of practical quantum computers. This paper addresses the advantages that quantum computers have over today’s classical computers, how internet and e-mail security would be impacted, and how we could start securing our systems and servers with post-quantum encryption (PQE). It provides an overview of expected developments in the next few years (mainly in chemistry and other research problems), and explains why Google and Akamai are already evaluating present-day solutions to the future post-quantum encryption problem. Meanwhile, NIST is evaluating multiple proposals for a PQE standard, especially lattice-based encryption.


According to Moore’s Law, the number of transistors which fit on the leading computer chip doubles every 12–18 months, even while the cost of the chip halves. This miniaturization trend holds true even when projecting back to pre-transistor computing, to earlier computing models such as vacuum tube machines [1]. Computer chip fabrication labs have recently reported building transistors as small as a single ion or a molecule, approaching the physical limit for the transistors used in classical computing [2]. Researchers continue to explore computing models which could be alternatives to transistors, including optical computing (using lasers and photons in circuits) and multiple models of computing which rely on quantum physics.

I looked everywhere for where I’d learned this fact on Moore’s Law predating transistors, and finally traced it back to the diagram below, which I read in Ray Kurzweil’s The Singularity is Near while I was volunteering on a One Laptop per Child project in Uganda. I’m delighted to make it the first citation of my paper.

This image taken directly from The Singularity is Near does not appear in the paper

When practical, large-scale quantum computers are developed, they will pose a threat to the most popular encryption algorithms used today. In 1994, in the early days of quantum computing research, Dr. Peter Shor published algorithms which could be run on a computer with a sufficient number of entangled quantum bits (known as qubits). Shor’s algorithm significantly reduces the computation time needed to reveal the prime factors of a number, breaking encryption keys of length n with roughly 2n error-corrected qubits. This vulnerability applies to common protocols for encrypting e-mail (PGP and GPG, which depend on RSA) [3] and to the public-key exchanges which protect internet traffic in TLS; symmetric algorithms such as AES are weakened rather than broken, as discussed below [4]. Research is ongoing into how these applications could best be upgraded with post-quantum cryptography which would be resistant to attacks from both classical and quantum computers.

As of 2018, publicly known quantum computers have too few qubits to pose a threat to cybersecurity. Commercial providers of quantum computing access, such as IBM, Microsoft, and Rigetti Computing, offer machines with only four to nineteen qubits, while researchers have built larger machines in labs, such as a 72-qubit machine recently announced by Google [5]. An RSA encryption key generated today with a key length of 2,048 bits could be decrypted only by a machine with 4,096 or more error-corrected qubits. With researchers continuing to develop larger quantum computers, technologists need to anticipate future changes to their cybersecurity strategy.
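To put that rule of thumb in concrete terms, here is a small back-of-the-envelope sketch (my own illustration; the two-qubits-per-key-bit factor is the simplification used above, and real error-correction overhead can be far larger):

```python
# Rough illustration of the "2n qubits to break an n-bit RSA key" rule
# of thumb cited above. Real estimates vary widely with the chosen
# error-correction scheme, so treat these as order-of-magnitude figures.

def qubits_to_break_rsa(key_bits, overhead_factor=2):
    """Estimated error-corrected qubits to run Shor's algorithm
    against an RSA key of the given length."""
    return overhead_factor * key_bits

for bits in (1024, 2048, 4096):
    print(f"RSA-{bits}: roughly {qubits_to_break_rsa(bits)} qubits")
```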

The NSA and comparable agencies in other countries are known to be developing their own quantum computers. In 2016, the NSA hinted that NIST should get moving on post-quantum cryptography standards. But through gossip (not really citable in a scientific paper) and venture funding signals, my understanding is that the US government believes large-scale quantum computers are still many years away.


The potential benefits of quantum computers will only be proven when one surpasses the ability of classical computers, a test known as quantum supremacy. The design and repeatability of quantum supremacy experiments remains debated, with early efforts being questioned by other researchers [6]. One quantum algorithm for recommendation systems, which had been suggested as a potential real-world application for quantum supremacy, was recently discovered to have a classical algorithm with similar efficiency [7]. It is unclear which application might first demonstrate quantum supremacy, and when the first quantum supremacy experiment may succeed.

“Quantum supremacy” has become as controversial as “passing the Turing test.” It’s hard to design Turing’s experiment to fairly test if a computer ‘sounds human’. Even basic chatbots can fool people into sharing personal information and photos, but other more serious tests are still easily won by humans. A recent one-word Turing test was won with the very human message “poop.”

For a quantum computer to show supremacy, the questions sound more like: what’s the right algorithm to test on? Does supremacy mean anything which proves n qubits are better than n regular bits, or is it only reached when the machine can no longer be simulated even by classical supercomputers?

Rigetti Computing just set up a $1M prize for the first team which can prove quantum supremacy on their platform.

In October there was a write-up on Urmila Mahadev’s Ph.D. thesis, which uses quantum homomorphic encryption to verify quantum supremacy.
Homomorphic encryption running on quantum computers is a hot research topic: while I was looking for this article, I saw Jonas Zeuner had discussed another implementation at a recent conference.

The advantage of a quantum computer also depends on which quantum effects are manipulated for computing. For example, some companies’ chips use the physical properties of quantum annealing [6], while others use lasers to confine ions in a supercooled ‘ion trap’ [8]. The details of these two computing models may seem minor or only of interest to a manufacturer, but different designs are potentially better suited to different computational problems, or scale more easily. One of the major challenges of constructing a quantum computer is keeping its qubits in stable superposition, free from outside interference, and it remains undetermined which design might provide the most stable and reliable computation.

This difference in counting qubits is why D-Wave reports 2,040 qubits in their quantum annealing machine, while Google reported 72 superconducting qubits. It’s unfortunate again that I don’t have deep knowledge of the hardware differences here.
You can find your favorite qubit technology, and the researchers using it, at https://quantumcomputingreport.com/scorecards/qubit-technology/


At least three companies have detailed plans and public programming interfaces for connecting to their universal quantum computing chips: IBM, Microsoft, and Rigetti Computing.

Did I miss anyone? Another company launched their portal on October 5th (after I submitted the paper), and TechCrunch reported it as “the first public access to a quantum computer.” 🙄

Due to the limited number of quantum computers available, each company has developed a queueing system to accept jobs, schedule them, and return results within a complex cloud computing system. Each provider has also developed a method to simulate qubit interactions on classical computers, either on one’s local computer or on a cloud service. Currently there are more simulated qubits than true qubits available through these services, but ultimately quantum computers may outpace the simulators. When writing programs for these computers, developers are limited to assembly-like programming languages. Each instruction in a quantum assembly language connects qubits through various quantum logic gates, replacing the classical AND, OR, XOR, NOR, NAND, and NOT gates.

This was hard for me to understand originally, and I want to emphasize understanding it before going forward. Quantum computing today is not copying a JS file or compiling C to a special binary, but painstakingly designing an experiment with quantum gates, making measurements, and running the code many times to measure probabilities. The Python, JS, and .NET libraries available today exist to generate this gate code.
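To make this concrete, here is a toy sketch of my own (not any vendor’s API) showing what gate-level instructions do: it applies a Hadamard gate and a CNOT to a two-qubit state vector, producing the entangled ‘Bell state’ whose outcome probabilities real hardware would estimate by running the circuit many times.

```python
import math

# Toy two-qubit state-vector simulator, to illustrate what
# assembly-like quantum instructions (H, CNOT) actually do.
# Amplitudes are ordered |q1 q0>: [|00>, |01>, |10>, |11>].

def apply_h_q0(s):
    """Hadamard on qubit 0: mixes each pair of amplitudes differing in bit 0."""
    r = 1 / math.sqrt(2)
    return [r * (s[0] + s[1]), r * (s[0] - s[1]),
            r * (s[2] + s[3]), r * (s[2] - s[3])]

def apply_cnot(s):
    """CNOT with qubit 0 as control, qubit 1 as target: swaps |01> and |11>."""
    return [s[0], s[3], s[2], s[1]]

state = [1.0, 0.0, 0.0, 0.0]           # start in |00>
state = apply_cnot(apply_h_q0(state))  # H then CNOT -> Bell state
probabilities = [a * a for a in state]
# Measuring many times would show ~50% |00> and ~50% |11>.
```

The libraries mentioned above emit the equivalent gate instructions for real or simulated qubits, rather than doing this arithmetic directly.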

The cloud computing providers who currently offer public access to quantum computers are listed below in alphabetical order:

  • IBM’s ‘Quantum Experience’ is available to anyone with an e-mail sign-up, and includes both a programmatic interface and a graphical user interface (GUI). IBM released a Python library known as QISKit and a JavaScript complement QISKit-JS to facilitate coding quantum assembly [9].
  • Microsoft offers simulated qubits on their Azure cloud computing platform, and indicates that they plan to add physical qubits in the future. Currently users can write code in a new programming language called Q# [10].
  • Rigetti’s Forest is entirely an application programming interface (API), which comes with a Python development kit pyQuil to write programs in another dialect of quantum assembly. Third-party developers have supported the same interface in JavaScript [11].


The table doesn’t fit into Medium’s formatting, so check out the more comprehensive and continuously updated page: https://quantumcomputingreport.com/scorecards/qubit-count/

Table 1 above lists the number of qubits in public announcements of quantum computers by these companies and other major startup providers (as of August 2018). With no known physical limits to the size of quantum computers, these companies expect to continue increasing their quantum chip sizes, with over-100-qubit chips announced in their projections for 2019.

There are some physicists who believe that there are physical limits, or unsurpassable noise, which will make it impossible to have today’s machines scale up to 100 or 1,000 qubits, but it is difficult for me to assess or verify their claims. Scott Aaronson’s book Quantum Computing Since Democritus makes the interesting point that if quantum computing is resisted so strongly by physics, those resistance effects would also be complex and able to be harnessed for computing. 🤯


4.1. Chemistry and Physics

Considering that quantum computation is based on the interaction of atoms or molecules in a quantum state, one potential application would be to simulate the interactions of more complex atoms and molecules. Researchers at Oak Ridge National Laboratory used the qubits available from one cloud provider to calculate the binding energy of the deuteron, the nucleus of a stable hydrogen isotope. More complex calculations will require not only more qubits to simulate more parts of the atom, but also more error-correcting qubits to correct for noise, which increases with the number of quantum logic gates added to the experiment. Another method to improve accuracy and reduce noise is to run a quantum program many thousands of times, to determine the overall probability of different results [12].

This is a very simplified explanation of what will likely be the first profitable application of quantum computing. Simulations and modeling are feasible with a small number of qubits and are directly related to the problem. One company in this space is https://proteinqure.com/
They recently posted a paper preprint on a successful protein-folding program (blog post and scientific paper updated last week).
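The “run it many thousands of times” step mentioned above works like ordinary sampling. A toy sketch of mine, assuming we already know the true output distribution (which real experiments are trying to estimate):

```python
import random

def estimate_probabilities(true_probs, shots, seed=0):
    """Sample `shots` measurements from a known distribution and
    estimate each outcome's probability from the observed counts."""
    rng = random.Random(seed)
    outcomes = list(range(len(true_probs)))
    counts = [0] * len(true_probs)
    for _ in range(shots):
        counts[rng.choices(outcomes, weights=true_probs)[0]] += 1
    return [c / shots for c in counts]

# A Bell-state-like distribution: half |00>, half |11>.
# More shots shrink the statistical error on each estimate.
estimates = estimate_probabilities([0.5, 0.0, 0.0, 0.5], shots=10_000)
```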

4.2. Quantum Cryptography

Two quantum computers can exchange highly secure encrypted messages using a process known as quantum encryption. Each computer possesses a qubit in superposition with the other’s qubit, giving them a shared secret. If an outside observer eavesdrops on communication between the two quantum computers, it would alter the distribution of the entangled particles, and this tampering would be detected by the recipient. This provides a major benefit for secure messaging [13]. Recent experiments have shown that quantum encryption can be applied outside of a laboratory setting, for example protecting communications between satellites and ground stations [14]. Unfortunately, many quantum computing methods rely on advanced hardware, such as refrigerators operating near absolute zero, to maintain qubits’ state, which might not be possible to miniaturize for personal computers and handheld devices.

I’ve been told that this depends on the type of qubits in the machine, so keep your hopes up.
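As a toy illustration of the tamper-evidence idea, here is a sketch in the style of the BB84 prepare-and-measure protocol (a simpler cousin of the entanglement scheme described above; the simulation and all of its simplifications are mine). An intercept-resend eavesdropper shows up as a roughly 25% error rate in the bits where sender and receiver happened to use matching bases:

```python
import random

def measure(bit, prepared_basis, measuring_basis, rng):
    """A matching-basis measurement reads the bit faithfully;
    a wrong-basis measurement yields a random result (and disturbs the qubit)."""
    return bit if prepared_basis == measuring_basis else rng.randint(0, 1)

def qkd_error_rate(n_qubits, eavesdrop, seed=1):
    """Simulate one key exchange and return the error rate on the
    'sifted' bits (positions where Alice's and Bob's bases matched)."""
    rng = random.Random(seed)
    sifted_errors, sifted_total = 0, 0
    for _ in range(n_qubits):
        alice_bit, alice_basis = rng.randint(0, 1), rng.randint(0, 1)
        bob_basis = rng.randint(0, 1)
        bit, basis = alice_bit, alice_basis
        if eavesdrop:  # intercept-resend attack
            eve_basis = rng.randint(0, 1)
            bit = measure(bit, basis, eve_basis, rng)
            basis = eve_basis  # the qubit is re-sent in Eve's basis
        bob_bit = measure(bit, basis, bob_basis, rng)
        if alice_basis == bob_basis:
            sifted_total += 1
            sifted_errors += (bob_bit != alice_bit)
    return sifted_errors / sifted_total
```

Comparing a sample of the sifted bits over a separate channel reveals the elevated error rate, which is how the tampering gets detected.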

It’s important to draw a distinction between quantum cryptography and post-quantum cryptography, whose primary purpose is to replace existing cryptography algorithms with ones which are difficult for both classical and quantum computers to break.

4.3. Prime Factorization and Key Search

Three types of algorithms commonly used today to encrypt internet traffic are RSA (Rivest, Shamir, and Adleman), multiple elliptic-curve cryptography (ECC) algorithms, and Advanced Encryption Standard (AES).
These algorithms are all vulnerable either to Shor’s algorithm or to Grover’s key-search algorithm on quantum computers. Both algorithms have been demonstrated with tiny key sizes on the quantum computers which exist today, but the number of qubits required means that only small prime factors can be found on current hardware. A 2017 paper described a more efficient quantum ring algorithm, GEECM, which would allow decryption of RSA with fewer qubits than the original Shor’s algorithm, but still does not currently threaten existing keys. AES would only be affected by a computer running Grover’s algorithm, which is less threatening, so it may be possible to continue using AES with a 256-bit key length for many years [3].
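The asymmetry between the two attacks comes down to simple arithmetic: Grover’s algorithm searches an n-bit keyspace in roughly the square root of 2^n steps, so it only halves the effective key length rather than collapsing it, which is why AES-256 holds up. A back-of-the-envelope sketch (ignoring circuit depth and error-correction costs):

```python
import math

def grover_effective_bits(key_bits):
    """Grover needs ~sqrt(2**n) = 2**(n/2) search steps, so an n-bit
    symmetric key keeps only n/2 bits of security against it."""
    return key_bits // 2

def grover_steps(key_bits):
    """Approximate number of Grover iterations to search the keyspace."""
    return math.isqrt(2 ** key_bits)

# AES-256 retains 128-bit security against Grover; AES-128 drops to 64.
for n in (128, 256):
    print(f"AES-{n}: ~2^{grover_effective_bits(n)} quantum search steps")
```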

I’ve rethought this section after reading a recent blog post by legendary privacy and security writer Bruce Schneier. He wrote, “If public-key cryptography ends up being a temporary anomaly based on our mathematical knowledge and computational ability, we’ll still survive.” and “Maybe the whole idea of number theory­-based encryption, which is what our modern public-key systems are, is a temporary detour based on our incomplete model of computing…”

Many of the post-quantum crypto standards under consideration for the internet infrastructure’s future are asymmetric, so I was surprised to read the pessimistic view that these new algorithms may also be vulnerable, and that we just haven’t understood quantum algorithms fully yet. I didn’t think at all to distinguish between symmetric and asymmetric cryptography in this section. Schneier lists some symmetric cryptographic ideas which could be brought into use if this is indeed what happens.


There are multiple approaches to replacing the RSA and ECC algorithms in post-quantum cryptography, but none can immediately be picked out as a future standard. Even after information security standards have been set, it has taken years for users’ older browsers and operating systems to completely adopt them; consider, for example, the ongoing upgrade of TLS to version 1.3. Even though new standards will take years to develop, Brian Sniffen, Chief Security Architect of Akamai, warned in 2016: “protocols are going to live for ten [to] fifteen years… the quantum computer people tell us they are 10 to 15 years out, which means the time to start shipping mainstream browsers and mainstream crypto endpoints with post-quantum crypto in it is now.” [15]

5.1. NIST Competition

The US-based National Institute of Standards and Technology (NIST), which has developed and maintains several encryption algorithm standards, responded to these encryption concerns by announcing the Request for Nominations for Public-Key Post-Quantum Cryptographic Algorithms in December 2016. The objective of the competition is to allow public discussion, debate, and comment on multiple alternatives for post-quantum cryptography. The competition received 69 proposed algorithms in the first round. As of August 2018, five algorithms have been withdrawn, and the others are being evaluated. Multiple algorithms are likely to be selected to cover a variety of use cases. It will take an additional two years or more for NIST to provide feedback and publish recommendations, but the algorithms finally selected in the contest are likely to have long-reaching impacts on information security [16].

I’ve mostly followed this process on Twitter via accounts such as https://twitter.com/DemocraticLuntz and threads including https://twitter.com/grittygrease/status/983360199313551360
Also conferences https://twitter.com/pqc_eu and https://twitter.com/PQCryptoConf .

5.2. Evolving Internet Security

Researchers from major browser vendors Microsoft and Google have published independent and joint research into post-quantum encryption for common internet security applications, such as public key exchange and encryption protocols. Every time that a user loads a web page, an IP address is looked up in a trusted DNS server, the browser opens a connection to that web server, and the server decides how to respond to the query. Websites typically protect this client-server communication with the HTTPS and TLS protocols, to confirm their identity and encrypt their content. The public key for a website can typically be verified by checking that it was issued by a known certificate authority (CA) [17]. These CAs include private companies (Google), government agencies (the United States federal government), universities (MIT), and non-profits (Let’s Encrypt). Updating each of these protocols and certificate authorities with new post-quantum algorithms and encryption keys will take a significant investment of time and resources, as multiple vendors and service providers create and satisfactorily test their security.

Some people have chosen to disable the US government CAs, and some CAs have been compromised in the past (allowing attackers to spoof Google for users in Iran, before public-key pinning).

To support multiple public key exchange algorithms and standards, which evolve over time, web servers and browsers typically negotiate over a list of algorithms to find one efficient and secure algorithm which they both can use to communicate. If the client web browser is significantly outdated, it may not be able to find an algorithm which has not been deprecated. As of 2017, certificate authorities mutually agreed not to issue any new certificates using the SHA-1 hash algorithm, because it is no longer believed to offer long-term security even against classical computers. ATMs and other credit card payment systems from multiple operators worldwide had been built to support only SHA-1, so it was necessary to make temporary security compromises until old hardware could be replaced [18]. The problems faced by certificate authorities, web browser developers, and end users could be much greater in the years following the development of large-scale quantum computers. Prudent software developers could act now on both server and client-side applications, using the existing negotiation process to list and prefer post-quantum algorithms which they trust.
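The “list and prefer” suggestion can be sketched as a simple preference-ordered negotiation; the algorithm names below are hypothetical placeholders of mine, not real TLS identifiers:

```python
def negotiate(server_preferences, client_supported):
    """Return the first algorithm in the server's preference order
    that the client also supports, or None if there is no overlap."""
    client_set = set(client_supported)
    for algorithm in server_preferences:
        if algorithm in client_set:
            return algorithm
    return None

# Hypothetical names: post-quantum hybrids first, classical fallbacks after.
SERVER_PREFS = ["pq-hybrid-frodo", "pq-hybrid-newhope",
                "ecdhe-x25519", "rsa-2048"]

modern_client = ["ecdhe-x25519", "pq-hybrid-frodo"]
legacy_client = ["rsa-2048"]
```

A modern client lands on the post-quantum hybrid while the legacy client still connects with its old algorithm, which is the upgrade path described above.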

The ATM / payments issue gave shady CAs an opening to compromise the whole certificate infrastructure. WoSign and StartCom secretly merged and issued poorly-scoped or backdated certificates using algorithms that all CAs had agreed to deprecate, eventually getting their certificates distrusted in major browsers and renewing interest in certificate transparency (or some sort of blockchain solution).

Google’s research initially built on a ring learning with errors key exchange scheme published as “A New Hope” in 2015, and continued with joint Google-Microsoft research into “Frodo” in 2017. Both are lattice-based encryption schemes rather than relying on factorization, with the newer “Frodo” scheme no longer having a ring structure [17].

Oh, wow, the names on these algorithms…

When testing post-quantum algorithms on users’ private internet browsing, it is crucial to avoid relying solely on new algorithms which lack the long-term scrutiny applied to current industry standards. It would significantly risk users’ privacy, and their confidence in internet security, if browsers tested new security schemes without users’ permission and understanding. In the joint Microsoft and Google experiment, users’ browsers used a hybrid scheme which protected data with both a post-quantum algorithm and a current standard algorithm, AES-256. Google has also included this algorithm in BoringSSL, their fork (derivative version) of OpenSSL, which opens its use to other potential adopters. In the short term, these hybrid algorithms allow browser vendors and server libraries to prepare and test in advance of quantum computers.
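The hybrid idea can be sketched as key combination: derive the session key from both a classical shared secret and a post-quantum one, so an attacker must break both exchanges. This is my simplified sketch of the principle (a bare hash over the concatenated secrets); real deployments use a vetted key-derivation function:

```python
import hashlib

def hybrid_session_key(classical_secret: bytes,
                       post_quantum_secret: bytes) -> bytes:
    """Combine two independently-negotiated shared secrets so the
    session key stays safe unless BOTH key exchanges are broken."""
    return hashlib.sha256(classical_secret + post_quantum_secret).digest()

# Placeholder byte strings standing in for real negotiated secrets.
key = hybrid_session_key(b"ecdhe-shared-secret", b"lattice-shared-secret")
```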


The IAEA is a significant authority on nuclear safety, technology, and planning. Regular inspections of nuclear facilities depend on accurate reporting, and the Additional Protocol sometimes involves encrypted streaming sensor data, video feeds, or unannounced visits. These responsibilities underline the importance of the IAEA developing best practices together with its member states.

Additional Protocol is the term for country-specific IAEA agreements which allow more in-depth inspections. The general idea was to avoid the blind spots which some countries 🙄 had exploited in the past. Since the late 1990s, these agreements have come into effect with 133 of 179 IAEA members (including the United States since 2009). They sometimes use newer technologies for video streaming and sensing, electronic tamper-proof locks, and PGP-signed messages / logs, so I wanted to highlight the cybersecurity risk.

6.1. Risks of Cybersecurity Failures

When providing expertise on nuclear power and waste facilities, the IAEA must consider future threats to their physical and IT infrastructure. Even seemingly small gaps in cybersecurity which have occurred in other organizations could pose a risk to the IAEA and its member states, for example:

  • A nuclear facility has a video or sensor feed, which could be monitored by unauthorized viewers, or attacked with unexplained disruptions to build uncertainty about compliance.
  • E-mails and documents containing internal details, inspection and logistics strategies, or personal and political matters are revealed to other countries or ‘leaked’ to the public.
  • A device used by IAEA or member state teams supports only the encryption algorithms and key sizes which were common when the device was manufactured, and its lack of software updates makes it incompatible or insecure on newer systems.

These cybersecurity failure scenarios are not solely a quantum computing issue, but these scenarios all involve systems which have trusted security standards today, which would need to be revisited in a post-quantum world. The IAEA and member states should be vigilant about how their systems store information, communicate over the internet, and receive cybersecurity updates.

There are a lot of things which could go wrong if you’re targeted by a bad actor with a quantum computer, but in the preceding and following sections through to the end, I wanted to focus on technical decisions today which should be influenced by future quantum threats. They should seem reasonable considering that (a) through their use of remote sensing, Additional Protocol efforts are not and cannot fully be air-gapped, and (b) nuclear cybersecurity is already a major issue.

It is possible that the consequences of quantum computing and post-quantum encryption have not been included in previous cybersecurity discussions, as their potential continues to be overlooked or dismissed in the technology industry. When a new vulnerability is uncovered, a nuclear facility or an inspection team cannot download a patch into their secure systems or upgrade their industrial control systems (ICS) hardware as easily as a software-based company. Past experience shows that nuclear facilities are targeted by advanced cybersecurity programs, which could also be motivated to build decryption technology outside of public knowledge. One intelligence agency (which supports the NIST competition for post-quantum encryption) has also been reported by the Washington Post and other news outlets to be developing its own quantum computer [19]. A bad actor with a quantum computer could monitor inspectors and nuclear plant workers, anticipate their investigative actions and movements, disrupt reporting methods, or cast suspicion on inspectors’ work or countries’ compliance.

Obviously there is a lot which could be said here about industrial control systems, the NSA, and Stuxnet, but it didn’t seem appropriate for me, an American new to the nuclear conference, to discuss it here without any new information.

6.2. Adapting Existing Encryption Algorithms and Key Sizes

Acknowledging the risk to current encryption algorithms when faced with an adversary who has a quantum computer, one common suggestion is to continue using the same algorithms, but with an increased encryption key size. This type of change has happened multiple times in the past, as classical computers’ speeds have improved under Moore’s Law. Unfortunately, the speedup from a quantum computer is more significant, allowing keys to be decrypted in polynomial time. Keeping up with this change would require extremely large RSA keys, which are not efficient, or even realistically storable and transferable between users [3].

For applications using AES encryption, AES-256 is still usable, but against a quantum adversary it offers only the security of half its key size (equivalent to AES-128 against classical computers). AES-128 (commonly used and preferred by some web browsers) would be weakened to the point that it cannot be considered for post-quantum encryption [4].

Regardless of how existing algorithms and key sizes are changed, internet security remains vulnerable in public key exchange and certificate authority systems. Server and browser developers must research alternatives at each step of the protocol. The IAEA could evaluate which of its systems use internet connections (even secure connections such as VPNs) and the long-term security of these connections.

Really, please don’t suggest ‘existing Algorithm A but longer keys’. The internet is using the wrong algorithms. We already know how to have the server and client agree to use other ones. We only need to decide on which are best in the post-quantum world, distribute them to servers and clients, and prefer them before other algorithms.

6.3. Preventing “Record Now, Break Later”

Even though quantum computers exist only at a small scale today, it is possible for a determined adversary to collect encrypted internet traffic now with the intent to decrypt it on future quantum computers.

Cryptography researcher Dr. Brian LaMacchia of Microsoft warns, “If the time in which you need the information to be protected is longer than when we think quantum computers are going to show up, you have to assume that information’s going to be recorded and broken when an attacker has access to a quantum computer.” Dr. LaMacchia was referring specifically to private industries such as pharmaceuticals, but the same reasoning applies to encryption of documents related to power generation, waste management, industrial control systems, and nuclear safeguards [21]. Member states with an interest in cybersecurity would benefit from participating in the discussions led by NIST and local experts to develop better post-quantum encryption standards, and from implementing them in their critical infrastructure in the near future, to prevent future decryption.

Thanks for reading! Here come the references.


[1] KURZWEIL, R., The Singularity is Near, Penguin, New York (2005).

[2] LUNDSTROM, M., Moore’s Law Forever?, AAAS. Science 299 5604 (2003) 210–211.

[3] BERNSTEIN, D., HENINGER, N., LOU, P., VALENTA, L., Post-quantum RSA (2017), www.cr.yp.to/papers/pqrsa-20170419.pdf

[4] GRASSL, M., LANGENBERG, B., ROETTELER, M., STEINWANDT, R., “Applying Grover’s Algorithm to AES: Quantum Resource Estimates”, Post-Quantum Cryptography, Springer (2016).

[5] KELLY, J., A Preview of Bristlecone, Google’s New Quantum Processor (2018), ai.googleblog.com/2018/03/a-preview-of-bristlecone-googles-new.html

[6] RONNOW, T., WANG, Z., JOB, J., BOIXO, S., ISAKOV, S. et al., Defining and Detecting Quantum Speedup, Science 345, 420 (2014). www.arxiv.org/abs/1401.2910

[7] TANG, E., A Quantum-Inspired Classical Algorithm for Recommendation Systems, University of Texas at Austin (2018), www.arxiv.org/abs/1807.04271

[8] KIELPINSKI, D., MONROE, C., WINELAND, D., Architecture for a Large-Scale Ion-Trap Quantum Computer, Nature 417, 711 (2002).

[9] IBM, QISKit (2018), developer.ibm.com/code/open/projects/qiskit/

[10] MICROSOFT, Microsoft Quantum Development Kit (2018), www.microsoft.com/en-us/quantum/development-kit

[11] RIGETTI, Welcome to the Docs for Forest and pyQuil! (2018), pyquil.readthedocs.io/en/stable/

[12] DUMITRESCU, E., MCCASKEY, A., HAGEN, G. et al., Cloud Quantum Computing of an Atomic Nucleus, Physical Review Letters 120, 210501 (2018).

[13] JENNEWEIN, T., SIMON, C., WEIHS, G., et al., Quantum Cryptography with Entangled Photons, Physical Review Letters 84, 4729 (2000).

[14] LIAO, S., CAI, W., HANDSTEINER, J., et al., Satellite-Relayed Intercontinental Quantum Network, Physical Review Letters 120, 30501 (2018). www.arxiv.org/abs/1801.04418

[15] SNIFFEN, B., Akamai Faster Forward Crypto at Scale (2016), www.youtube.com/watch?v=bAGiXimZ4kQ

[16] NIST, Post-Quantum Cryptography (2018), csrc.nist.gov/Projects/Post-Quantum-Cryptography

[17] BOS, J., COSTELLO, C., DUCAS, L., et al., “Frodo: Take Off The Ring! Practical, Quantum-Secure Key Exchange from LWE”, 23rd ACM Conference on Computer and Communications Security (2017) eprint.iacr.org/2016/659.pdf

[18] COCLIN, D., The Challenges of Transitioning Non-Browser Applications to SHA-2 (2016), community.digicert.com/en/blogs.entry.html/2016/03/01/the-challenges-of-transitioning-non-browser-applications-to-sha-2.html

[19] RICH, S., GELLMAN, B., “NSA Seeks to Build Quantum Computer that Could Crack Most Types of Encryption.” The Washington Post (2014).

[20] MOORE, S., IBM Edges Closer to Quantum Supremacy with 50-Qubit Processor (2017), spectrum.ieee.org/tech-talk/computing/hardware/ibm-edges-closer-to-quantum-supremacy-with-50qubit-processor

[21] LAMACCHIA, B., Cryptography for the Post-Quantum World with Dr. Brian LaMacchia (2018), www.microsoft.com/en-us/research/blog/cryptography-for-the-post-quantum-world-with-dr-brian-lamacchia

looking up at the IAEA building in Vienna — tbh this is for the preview image

Nomadic web developer and mapmaker.