As the quantum threat approaches, the need to prepare our cryptographic systems has never been more critical. Post-Quantum Cryptography (PQC) is positioned as THE solution to protect data and communications against quantum computers. One common misconception I frequently observe among my clients is the belief that once the National Institute of Standards and Technology (NIST) releases its PQC standards, implementing these new solutions will be simple and straightforward, instantly making their systems compliant and secure. Unfortunately, the reality is far more complex. While PQC offers a viable path to quantum readiness, it also presents significant challenges that must be addressed.

Algorithm Maturity and Standardization

While significant progress has been made, many PQC algorithms are still in the experimental phase and have not yet undergone the extensive testing and validation that current cryptographic standards have. PQC algorithms are designed to withstand the capabilities of quantum computers, which can break classical cryptographic methods such as RSA and ECC. Bodies like NIST have been at the forefront of developing and standardizing these new algorithms. NIST’s Post-Quantum Cryptography Standardization Project, initiated in 2016, aims to evaluate and endorse quantum-resistant algorithms. Although several promising candidates have emerged, they have not yet reached the level of maturity required for widespread adoption.

Many of these algorithms are still undergoing rigorous testing to evaluate their security, performance, and practicality. Unlike classical algorithms, which have been tested and validated over decades, PQC algorithms are relatively new and must prove their resilience against both classical and quantum attacks. This process involves extensive cryptanalysis, implementation trials, and real-world testing, which takes time and resources.

The standardization process for PQC is ongoing, and it is essential for organizations to stay informed about developments from bodies like NIST. The finalization of PQC standards involves multiple rounds of evaluation, public comments, and revisions. Even when NIST releases its final standards, the selected algorithms may still undergo changes as new vulnerabilities are discovered and as our understanding of quantum computing evolves.

One crucial point to consider is that the cryptographic landscape will likely become much more dynamic with the advent of quantum computing. Historically, certain cryptographic algorithms and implementations, such as RSA and AES, have been considered safe for decades. However, the rapidly evolving nature of quantum computing means that the field of cryptography will need to adapt continuously. Organizations must be prepared for the possibility that the PQC algorithms they implement today might require updates or replacements in the future.

To manage expectations, it is important to acknowledge that PQC implementation will not be a one-time effort. Instead, it will be an ongoing process of adaptation and improvement. This reality highlights the need for organizations to become crypto-agile: able to quickly adopt and deploy new cryptographic algorithms as they become available and as the threat landscape evolves. For more on crypto-agility, see “Introduction to Crypto-Agility”.
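As a concrete illustration, crypto-agility is often implemented as an algorithm registry behind a stable API, so that replacing a deprecated algorithm becomes a configuration change rather than a code change. The sketch below is a minimal Python version of that pattern; the HMAC-based entries and the algorithm names are stand-ins for real signature schemes (e.g., ECDSA today, a PQC scheme tomorrow), chosen only so the example runs with the standard library.

```python
import hashlib
import hmac

# Hypothetical registry mapping algorithm identifiers to (sign, verify)
# pairs. HMAC stands in for real signature schemes purely to illustrate
# the pattern; production entries would wrap actual crypto libraries.
REGISTRY = {
    "classical-hmac-sha256": (
        lambda key, msg: hmac.new(key, msg, hashlib.sha256).digest(),
        lambda key, msg, tag: hmac.compare_digest(
            hmac.new(key, msg, hashlib.sha256).digest(), tag),
    ),
    "pqc-hmac-sha3-256": (
        lambda key, msg: hmac.new(key, msg, hashlib.sha3_256).digest(),
        lambda key, msg, tag: hmac.compare_digest(
            hmac.new(key, msg, hashlib.sha3_256).digest(), tag),
    ),
}

def sign(alg: str, key: bytes, msg: bytes) -> bytes:
    return REGISTRY[alg][0](key, msg)

def verify(alg: str, key: bytes, msg: bytes, tag: bytes) -> bool:
    return REGISTRY[alg][1](key, msg, tag)

# Swapping algorithms is now a configuration change, not a code change:
tag = sign("classical-hmac-sha256", b"k", b"payload")
assert verify("classical-hmac-sha256", b"k", b"payload", tag)
```

Callers depend only on `sign`/`verify` and an algorithm identifier, so retiring a broken algorithm means removing its registry entry and updating configuration, not touching application code.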

Performance Challenges with Post-Quantum Cryptography (PQC)

The adoption of PQC will also bring significant performance challenges to your infrastructure. Unlike classical cryptographic methods, PQC algorithms generally demand more computational resources, leading to potential performance bottlenecks, particularly in environments with limited processing power and memory.

The performance impact of PQC is particularly concerning for real-time applications and systems that require high performance. For instance, IoT devices often have limited processing power and memory, and implementing PQC in these devices can lead to slower response times and increased energy consumption. Similarly, embedded systems in industrial control systems, medical devices, and automotive systems might struggle with the additional computational load imposed by PQC.

Moreover, the cumulative impact of upgrading to PQC across an entire infrastructure can be massive. While a single larger key may not seem problematic, the aggregate effect of deploying PQC algorithms throughout all systems and applications can lead to significant performance degradation. This includes increased key sizes and computational requirements, which can consume more bandwidth and introduce latency in network communications.
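To make the aggregate effect concrete, a back-of-envelope calculation helps. The sketch below compares the published public-key and ciphertext sizes of ML-KEM-768 (one of NIST's selected key-encapsulation mechanisms) against an X25519 key share; the daily handshake count is a hypothetical workload chosen only for illustration.

```python
# Back-of-envelope aggregate overhead of swapping X25519 for ML-KEM-768
# in a TLS-style key exchange. Key and ciphertext sizes are the published
# parameter sizes; the handshake count is a made-up illustrative workload.
X25519_KEY_SHARE = 32          # bytes, sent in each direction
ML_KEM_768_PUBKEY = 1184       # bytes, encapsulation key
ML_KEM_768_CIPHERTEXT = 1088   # bytes, returned ciphertext

extra_per_handshake = (ML_KEM_768_PUBKEY - X25519_KEY_SHARE) + \
                      (ML_KEM_768_CIPHERTEXT - X25519_KEY_SHARE)

handshakes_per_day = 10_000_000  # hypothetical fleet-wide workload
extra_bytes_per_day = extra_per_handshake * handshakes_per_day

print(f"{extra_per_handshake} extra bytes per handshake")
print(f"{extra_bytes_per_day / 1e9:.1f} GB of extra traffic per day")
```

A single handshake grows by roughly 2 KB, which is negligible in isolation; multiplied across a fleet handling tens of millions of handshakes a day, it becomes tens of gigabytes of additional daily traffic, before counting the extra CPU cycles.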

Several studies and experiments have highlighted these performance challenges. NIST has conducted extensive evaluations of various PQC algorithms. Their findings indicate that while some PQC algorithms perform well in specific use cases, others may not be suitable for resource-constrained environments due to their high computational demands. Google and Cloudflare experimented with integrating PQC algorithms into the TLS protocol and demonstrated increased handshake times and computational overhead for some of the post-quantum algorithms, showcasing the practical challenges of adopting PQC in existing internet infrastructure. See the Cloudflare blog post “Towards Post-Quantum Cryptography in TLS” for a great introduction to those experiments.

Addressing these performance challenges requires several strategic approaches. One practical method is to use hybrid cryptographic schemes that combine classical algorithms with PQC algorithms. This allows organizations to benefit from the security of PQC while maintaining acceptable performance levels. For example, a hybrid scheme could use classical algorithms for performance-sensitive operations and PQC for long-term data protection.
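One common way to build such a hybrid scheme, similar in spirit to the hybrid key-exchange designs proposed for TLS, is to run both a classical and a PQC key exchange and feed both shared secrets into a key-derivation function, so the session key remains safe as long as either exchange holds. A minimal sketch, with random bytes standing in for the two shared secrets and a made-up context label:

```python
import hashlib
import hmac
import os

def combine_shared_secrets(classical_ss: bytes, pqc_ss: bytes,
                           context: bytes = b"hybrid-kex-v1") -> bytes:
    """HKDF-style extract-and-expand over the concatenated secrets:
    an attacker must recover BOTH inputs to derive the session key."""
    prk = hmac.new(context, classical_ss + pqc_ss, hashlib.sha256).digest()
    return hmac.new(prk, b"session-key\x01", hashlib.sha256).digest()

# In practice these would come from an ECDH exchange and a PQC KEM;
# random bytes stand in for both here.
session_key = combine_shared_secrets(os.urandom(32), os.urandom(32))
assert len(session_key) == 32
```

The concatenate-and-derive step is cheap; the cost of the hybrid approach is carrying both key exchanges on the wire, which is exactly the bandwidth trade-off discussed above.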

Improving the efficiency of PQC algorithm implementations can also help reduce their performance impact. This includes optimizing code, leveraging hardware acceleration, and using efficient mathematical libraries. Collaboration with hardware manufacturers to develop PQC-optimized processors could play a significant role in addressing performance issues.

Conducting extensive performance testing and benchmarking is crucial for understanding the impact of PQC on specific systems and applications. Organizations should test PQC algorithms in their actual operating environments (or in a testbed that mirrors production) to identify bottlenecks and optimize configurations accordingly. Rather than a big-bang switch to PQC, organizations can take an incremental approach to deployment, gradually introducing PQC algorithms into the infrastructure, monitoring their performance, and making adjustments as needed. This approach helps manage the transition and allows for iterative improvements.
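A lightweight harness for this kind of benchmarking might look like the following. The two operations being timed are hashing stand-ins (the larger input crudely mimics the bigger keys and ciphertexts of PQC); in a real test they would be the actual classical and PQC handshake or sign/verify calls under representative load.

```python
import hashlib
import os
import statistics
import time

def benchmark(op, iterations: int = 200) -> float:
    """Run op() repeatedly and return the median latency in milliseconds."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        op()
        samples.append((time.perf_counter() - start) * 1000)
    return statistics.median(samples)

# Stand-in operations, not real cryptographic handshakes.
classical_like = lambda: hashlib.sha256(os.urandom(64)).digest()
pqc_like = lambda: hashlib.sha256(os.urandom(4096)).digest()

print(f"classical-like: {benchmark(classical_like):.4f} ms median")
print(f"pqc-like:       {benchmark(pqc_like):.4f} ms median")
```

Using the median rather than the mean keeps one garbage-collection pause or scheduler hiccup from distorting the comparison; for capacity planning, tail latencies (p95/p99) matter at least as much.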

Effective resource allocation and planning are essential for managing the increased computational demands of PQC. This includes ensuring that systems have adequate processing power, memory, and network bandwidth to handle the additional load. Investing in scalable infrastructure can help accommodate future performance requirements.

Implementation Complexity

Adopting PQC algorithms requires substantial changes to existing cryptographic libraries and protocols. Traditional cryptographic systems are deeply integrated into the infrastructure, including software applications, hardware devices, and communication protocols. Transitioning to PQC involves rewriting cryptographic libraries, modifying protocols, and optimizing performance.

Existing cryptographic libraries must be updated or replaced to support PQC algorithms. This process involves significant code modifications and rigorous testing to ensure the new algorithms are correctly implemented and secure. Many communication protocols, such as TLS and IPsec, are designed around classical cryptographic methods. Integrating PQC algorithms into these protocols requires changes to their fundamental operations, which can be complex and time-consuming. Finally, PQC algorithms often have different performance characteristics compared to classical algorithms, and optimizing these new algorithms for speed and efficiency within existing systems is a crucial but challenging task.

To manage these integration challenges effectively, organizations should adopt a phased approach. Beginning with pilot projects that implement PQC in non-critical systems can help understand the practical implications and challenges. Gradually extending the implementation to more critical systems once initial hurdles are overcome allows for a smoother transition. Using hybrid cryptographic schemes that combine classical and PQC algorithms can facilitate a gradual transition, maintaining security while integrating new algorithms. Additionally, implementing automated testing frameworks and continuous integration (CI) pipelines can streamline the integration process, ensuring that any changes to cryptographic libraries and protocols are thoroughly tested before deployment.

Ensuring that new PQC solutions are backward compatible with existing systems is another significant hurdle. Many existing applications and protocols are tailored to specific cryptographic algorithms, making it challenging to introduce new ones without disrupting functionality. Legacy systems, in particular, may not easily accommodate new cryptographic methods. Ensuring that these systems can operate seamlessly with PQC algorithms requires careful planning and potentially significant modifications. Moreover, new PQC solutions must be interoperable with systems and protocols still using classical cryptography. This requirement adds complexity, as it requires maintaining dual compatibility during the transition period.

To achieve backward compatibility, organizations can implement PQC algorithms in a layered manner, allowing both classical and quantum-resistant algorithms to operate concurrently. This approach facilitates a smoother transition without immediate disruption to existing systems. Using intermediary solutions, such as gateways or proxies, to translate between classical and PQC algorithms can help bridge the gap between old and new systems, ensuring compatibility and security.
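Such layered operation usually reduces to preference-ordered negotiation: each side advertises the suites it supports, and the connection uses the strongest mutually supported one, falling back to classical for legacy peers. A minimal sketch (the suite names are illustrative, not registered identifiers):

```python
# Preference-ordered negotiation: pick the strongest mutually supported
# suite, falling back to classical for legacy peers. Suite names are
# illustrative, not real registered identifiers.
PREFERENCE = [
    "hybrid-x25519-mlkem768",  # classical + PQC, preferred
    "mlkem768",                # PQC only
    "x25519",                  # classical fallback for legacy peers
]

def negotiate(ours: list[str], theirs: list[str]) -> str:
    their_suites = set(theirs)
    for suite in PREFERENCE:
        if suite in ours and suite in their_suites:
            return suite
    raise ValueError("no mutually supported suite")

# A legacy peer that only speaks classical still connects:
assert negotiate(PREFERENCE, ["x25519"]) == "x25519"
# A modern peer gets the hybrid suite:
assert negotiate(PREFERENCE, PREFERENCE) == "hybrid-x25519-mlkem768"
```

Keeping the preference list in configuration rather than code means that, once the transition period ends, dropping the classical fallback is a one-line policy change.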

The introduction of PQC algorithms requires developers and security professionals to acquire new skills and knowledge. Implementing and maintaining PQC systems is fundamentally different from working with classical cryptographic methods. Developers must undergo specialized training to understand the intricacies of PQC algorithms and their implementation. This training includes understanding the mathematical foundations of PQC, algorithm-specific characteristics, and best practices for secure implementation. Collaborating with academic institutions and participating in industry consortia can provide valuable insights and access to the latest research in PQC. Engaging external experts and consultants to provide training and guidance on PQC implementation can also accelerate the learning curve and ensure best practices are followed.

Compliance and Regulatory Challenges

Compliance with new requirements and obtaining certification for PQC systems could present another significant set of challenges. The regulatory landscape for PQC is still developing, with standards and requirements continuously emerging. This uncertainty makes it challenging for organizations to ensure compliance with current and future regulations. Regulatory bodies and standards development organizations (SDOs) such as NIST and ISO are actively working on establishing PQC standards, but these efforts are ongoing, and definitive guidelines are not yet fully established. In a year or so, when NIST is expected to release its PQC standards, we can expect a flurry of regulatory requirements to follow.

Organizations must stay on top of these developments to ensure compliance. Regularly monitoring updates from regulatory bodies and participating in industry forums can help organizations stay informed about emerging standards. Engaging with regulatory agencies and providing feedback during public comment periods can also help shape the evolving regulatory framework. Larger organizations should consider establishing a dedicated compliance team responsible for monitoring regulatory developments and ensuring alignment with emerging standards. This team should work closely with legal, security, and IT departments.

To address regulatory uncertainty, organizations should adopt a proactive approach to compliance. This involves implementing flexible and adaptable cryptographic systems that can be updated as new standards emerge (see “Introduction to Crypto-Agility”). Developing a comprehensive compliance strategy that includes regular audits and reviews can ensure that the organization remains aligned with the latest regulatory requirements.

Obtaining certification for PQC systems from regulatory bodies can be a lengthy and complex process. Certification ensures that cryptographic systems meet specific security standards and are robust against potential threats. The certification process can be resource-intensive, requiring significant time, effort, and expertise. The novelty of PQC algorithms adds an extra layer of complexity to this process.

Organizations should also foster a culture of compliance by promoting awareness and understanding of regulatory issues across all levels of the organization. Regular training sessions and workshops can help employees stay informed about the latest regulatory developments and best practices for PQC implementation.

Organizations should also leverage technology and automation to streamline compliance processes. Implementing automated compliance tools can help monitor and enforce regulatory requirements, reducing the risk of non-compliance and ensuring timely updates to cryptographic systems.
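As one small example of such automation, a recurring job can scan a cryptographic inventory against an approved-algorithm list and flag anything outside policy. The sketch below uses made-up service names and algorithm identifiers; a real tool would pull the inventory from certificate stores, configuration management, or network scans.

```python
# Hypothetical approved-algorithm policy; identifiers are illustrative.
APPROVED = {"ml-kem-768", "ml-dsa-65", "x25519-hybrid"}

def audit(inventory: dict[str, list[str]]) -> dict[str, list[str]]:
    """Return, per service, any algorithms not on the approved list."""
    return {
        service: [alg for alg in algs if alg not in APPROVED]
        for service, algs in inventory.items()
        if set(algs) - APPROVED
    }

inventory = {
    "payments-api": ["x25519-hybrid"],
    "legacy-vpn": ["rsa-2048"],  # made-up legacy entry, out of policy
}
print(audit(inventory))  # flags the legacy-vpn entry
```

Wiring a check like this into CI or a nightly job turns "are we still compliant?" from a quarterly audit question into a continuously answered one.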


Cost Implications

Transitioning to PQC involves substantial financial outlays in various domains, including new technologies, training, and infrastructure upgrades. The cost implications can be daunting, especially for organizations with extensive and complex cryptographic infrastructures.

Adopting PQC requires investment in new technologies that are designed to withstand quantum attacks. These technologies include updated cryptographic algorithms, hardware security modules (HSMs), and other cryptographic hardware and software solutions. The transition involves not only purchasing these new technologies but also integrating them into existing systems, which can incur additional costs for customization and implementation.

Purchasing these solutions at the same time as the rest of the market could increase overall post-quantum transition costs significantly. Two upcoming milestones will likely drive a sharp increase in market interest in post-quantum cryptography (PQC). The first milestone will be the release of the PQC standards by NIST. Once these standards are established, they will likely be integrated into various regulatory, insurance, and compliance frameworks, driving widespread adoption and reaction from the market. The second milestone, hopefully much further down the line, will be the achievement of quantum supremacy by a vendor with universal quantum computers. In both scenarios, there is an expected surge in demand for PQC-enabled equipment, which could lead to shortages and price hikes.

To prepare for these eventualities, organizations should consider early investments in necessary hardware and software. However, these investments should be made with the assurance that the equipment’s cryptographic systems can be easily upgraded to meet future standards. Organizations should seek contractual guarantees for refunds if upgrades to NIST PQC algorithms are not feasible. Alternatively, organizations with sufficient leverage over their vendors might secure current prices and quantities in advance, while awaiting the finalization of NIST PQC standards.

To manage the financial investment required for PQC, organizations should adopt a phased approach. Starting with pilot projects allows organizations to spread out the costs and gain valuable insights into the practical implications of PQC implementation. Additionally, seeking funding and grants from government bodies and industry consortia dedicated to quantum readiness can help alleviate some financial burdens.

Consider also that PQC algorithms often require more computational power and memory than classical algorithms, which will necessitate infrastructure upgrades. These upgrades can include purchasing new servers, enhancing network capabilities, and expanding data storage solutions. Again, if you wait for one of the two significant events to start upgrading your infrastructure at the same time as the whole market, be ready to pay a premium. My recommendation would be to transition to the cloud as much as possible and, for on-prem infrastructure, to build up some capacity ahead of demand.


The transition to post-quantum cryptography is a complex, multi-faceted process that requires careful planning, significant investment, and a proactive, adaptable approach. By addressing these challenges head-on and preparing for the dynamic cryptographic landscape of the future, organizations can achieve crypto-agility and secure their digital assets against the emerging quantum threat.

Marin Ivezic

For over 30 years, Marin Ivezic has been protecting critical infrastructure and financial services against cyber, financial crime and regulatory risks posed by complex and emerging technologies.

He held multiple interim CISO and technology leadership roles in Global 2000 companies.