
TECHNOLOGY

Quantum Computing Trends



Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory.

Tens of billions of dollars in public and private capital are being invested in quantum technologies. Countries across the world have realized that quantum technologies can be a major disruptor of existing businesses; collectively, they invested $24 billion in quantum research and applications in 2021.


A Comparison of Classical and Quantum Computing

What Is Quantum Computing?

Classical computing relies, at its ultimate level, on principles expressed by Boolean algebra. Data must be processed in an exclusive binary state at any point in time; these binary states are what we call bits. While the time that each transistor or capacitor needs to be in either the 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch states.

As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold beyond which the classical laws of physics no longer apply. Past this point, the quantum world takes over. In a quantum computer, elemental particles such as electrons or photons can be used, with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. Classical computers use transistors as the physical building blocks of logic, while quantum computers may use trapped ions, superconducting loops, quantum dots, or vacancies in a diamond.

Physical vs Logical Qubits

When discussing quantum computers with error correction, we distinguish between physical and logical qubits. Physical qubits are the individual hardware qubits in a quantum computer, whereas logical qubits are groups of physical qubits used as a single, more reliable qubit in a computation to fight noise and enable error correction.

To illustrate this, consider a quantum computer with 100 qubits. Suppose this computer is prone to noise; to remedy this, we can combine multiple physical qubits into a single, more stable qubit. We might decide that we need 10 physical qubits to form one acceptable logical qubit. In that case, we would say our quantum computer has 100 physical qubits, which we use as 10 logical qubits.
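The arithmetic in this example can be sketched in a few lines. The 10:1 and 1,000:1 ratios below are illustrative assumptions, not figures for any real error-correcting code:

```python
# Toy model of error-correction overhead: how many logical qubits a
# machine yields for a given physical-to-logical ratio. The ratios
# used here are illustrative assumptions, not real hardware figures.
def logical_qubits(physical: int, physical_per_logical: int) -> int:
    return physical // physical_per_logical

print(logical_qubits(100, 10))           # the example above: 10 logical qubits
print(logical_qubits(1_000_000, 1_000))  # millions of physical -> thousands of logical
```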

Distinguishing between physical and logical qubits is important. There are many estimates of how many qubits we will need to perform certain calculations, but some of these estimates refer to logical qubits and others to physical qubits. For example, to break RSA cryptography we would need thousands of logical qubits but millions of physical qubits.

Another thing to keep in mind: in a classical computer, compute power increases linearly with the number of transistors and the clock speed, while in a quantum computer the accessible state space roughly doubles with each logical qubit added, growing exponentially.
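A quick way to see this exponential growth: an n-qubit register is described by 2^n complex amplitudes, so each added qubit doubles the state space. A back-of-the-envelope sketch:

```python
# The state of n qubits is described by 2**n complex amplitudes, so
# the state space doubles with each added qubit, while a classical
# register's capacity grows only linearly with the number of bits.
for n in (1, 2, 10, 50):
    print(f"{n} qubits -> {2 ** n:,} amplitudes")
```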

Quantum Superposition and Entanglement

The two most relevant aspects of quantum physics are the principles of superposition and entanglement.

Superposition: Think of a qubit as an electron in a magnetic field. The electron’s spin may be either in alignment with the field, known as the spin-up state, or opposite to the field, known as the spin-down state. According to quantum law, the particle can enter a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit can thus take a superposition of both 0 and 1. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can represent all four simultaneously, because each qubit represents two values. Each added qubit doubles this capacity, so it expands exponentially.
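The 2-qubit register described above can be emulated on a classical machine as a vector of four amplitudes. A minimal NumPy sketch, using the standard Hadamard gate H to create an equal superposition:

```python
import numpy as np

# A 2-qubit register is a vector of 4 amplitudes, one per basis
# state |00>, |01>, |10>, |11>. Applying a Hadamard gate to each
# qubit puts the register into an equal superposition of all four.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # single-qubit Hadamard gate
zero = np.array([1.0, 0.0])                   # a qubit in state |0>
state = np.kron(H @ zero, H @ zero)           # the 2-qubit register
print(state)  # [0.5 0.5 0.5 0.5]: all four configurations at once
```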

Entanglement: Particles that have interacted at some point retain a type of connection and can become entangled with each other in pairs, a phenomenon known as correlation. Knowing the spin state of one entangled particle (up or down) allows one to know that the spin of its mate is in the opposite direction. These correlations hold no matter how great the distance between the particles, and they persist as long as the particles remain isolated; note, however, that entanglement cannot be used to transmit information faster than light. Taken together, quantum superposition and entanglement create an enormously enhanced computing power.
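The entangled pair described above can likewise be sketched as statevector math. This is the textbook Bell-state recipe (Hadamard followed by CNOT), with NumPy standing in for real hardware:

```python
import numpy as np

# Build a Bell pair: Hadamard on the first qubit, then CNOT.
# Afterwards only |00> and |11> have nonzero amplitude, so measuring
# one qubit fully determines the other -- the correlation the text
# describes (illustrative math only; no faster-than-light signaling).
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
state = np.array([1.0, 0.0, 0.0, 0.0])  # start in |00>
state = np.kron(H, I) @ state           # superpose the first qubit
state = CNOT @ state                    # entangle the pair
print(state)  # ~[0.707 0 0 0.707]: only |00> and |11> remain
```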


Quantum computers fall into four categories:

  1. Quantum Emulator/Simulator
  2. Quantum Annealer
  3. Noisy Intermediate Scale Quantum (NISQ)
  4. Universal Quantum Computer – which can be a Cryptographically Relevant Quantum Computer (CRQC)

Quantum Emulator/Simulator

These are classical computers that you can buy today that simulate quantum algorithms. They make it easy to test and debug a quantum algorithm that someday may be able to run on a Universal Quantum Computer (UQC). Since they don’t use any quantum hardware, they are no faster than standard computers.
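The reason emulators offer no speedup is visible in the memory they need: the full state of n qubits takes 2^n amplitudes of ordinary RAM. A small illustration (the sizes are simple arithmetic, not benchmarks):

```python
import numpy as np

# A classical emulator stores all 2**n amplitudes explicitly, so
# memory (and gate-application time) grows exponentially with n.
n = 20
state = np.zeros(2 ** n)   # one float per amplitude
state[0] = 1.0             # initialize to |00...0>
print(state.nbytes)        # 8388608 bytes (~8 MB) for only 20 qubits
# At ~50 qubits the same vector would need petabytes of memory.
```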

Quantum Annealer

A special-purpose quantum computer designed only to run combinatorial optimization problems, not general-purpose computations or cryptography. While annealers have more physical qubits than any other current system, those qubits are not organized as gate-based logical qubits. Currently this is a commercial technology in search of a viable future market.
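To make "combinatorial optimization" concrete, here is a purely classical simulated-annealing sketch of a tiny Ising-style energy minimization, the kind of problem annealers target. The couplings and cooling schedule are made-up illustrative values; nothing quantum happens here:

```python
import math
import random

# Classical simulated annealing on a 3-spin Ising-style problem:
# find spins in {-1, +1} minimizing sum of J[i,j] * s[i] * s[j].
# The couplings J and the cooling schedule are arbitrary examples.
random.seed(0)
J = {(0, 1): 1.0, (1, 2): -1.0, (0, 2): 1.0}

def energy(spins):
    return sum(w * spins[i] * spins[j] for (i, j), w in J.items())

spins = [random.choice([-1, 1]) for _ in range(3)]
temp = 2.0
for _ in range(2000):
    i = random.randrange(3)
    before = energy(spins)
    spins[i] *= -1                      # propose a spin flip
    delta = energy(spins) - before
    if delta > 0 and random.random() >= math.exp(-delta / temp):
        spins[i] *= -1                  # reject uphill move
    temp = max(temp * 0.995, 0.01)
print(spins, energy(spins))             # prints the best configuration found
```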

Noisy Intermediate-Scale Quantum (NISQ) Computers

Think of these as prototypes of a Universal Quantum Computer, with several orders of magnitude fewer qubits. They currently have 50-100 qubits, limited gate depths, and short coherence times. Because they fall several orders of magnitude short of the qubit counts required, NISQ computers cannot yet perform broadly useful computation; however, they are a necessary phase of learning, especially for driving total-system and software development in parallel with the hardware. Think of them as the training wheels for future universal quantum computers.

Universal Quantum Computers / Cryptographically Relevant Quantum Computers (CRQC)

This is the ultimate goal. If you could build a universal quantum computer with fault tolerance (i.e., millions of error-corrected physical qubits resulting in thousands of logical qubits), you could run quantum algorithms for cryptography, search and optimization, quantum systems simulation, and linear equation solving.

Post-Quantum / Quantum-Resistant Codes

These are new cryptographic systems that would be secure against both quantum and conventional computers and that can interoperate with existing communication protocols and networks. The symmetric-key algorithms of the Commercial National Security Algorithm (CNSA) Suite were selected to remain secure for national security systems even if a CRQC is developed. Cryptographic schemes that commercial industry believes are quantum-safe include lattice-based cryptography, hash trees, multivariate equations, and supersingular isogeny elliptic curves.

Difficulties with Quantum Computers


• Interference – During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say, a stray photon or a wave of electromagnetic radiation) can cause the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase.

• Error correction – Given the nature of quantum computing, error correction is ultra-critical – even a single error in a calculation can cause the validity of the entire computation to collapse.

• Output observance – Closely related to the above two, retrieving output data after a quantum calculation is complete risks corrupting the data. 




TECHNOLOGY

Snowflake becomes available in UK on Microsoft Azure


Cloud Computing News


Snowflake, a data cloud company, has announced its general availability on Microsoft Azure in the UK, driven by high customer demand for local data residency from both the private and public sector in the UK.

Snowflake enables institutions across multiple industries to deploy data and analytical workloads to suit their business-critical needs. With true multi-cloud availability across three major public clouds, organisations can deploy Snowflake’s unique data capabilities with region-to-region and cross-cloud data replication for workloads, across their chosen cloud service providers. Multi-cloud deployment supports firms’ ability to manage operational resilience requirements to meet ever-changing regulations.

Snowflake is supporting local organisations with their data localisation implementations, including organisations across a wide range of industries, such as financial services and the public sector. For such organisations, a Data Cloud located in the UK can facilitate compliance with laws and regulations linked to handling sensitive customer data. This new deployment will enable businesses using Snowflake to keep their data in the UK, while at the same time taking advantage of the flexibility and scalability of Snowflake Data Cloud.

Julien Alteirac, area VP UK&I, Snowflake, said: “This Microsoft Azure deployment further highlights Snowflake’s commitment to helping businesses in the UK take full advantage of cloud technology while also enjoying data residency.

“Snowflake customers in the UK can keep their data in the country, and at the same time benefit from the flexibility of multi-cloud to drive innovation and adaptability for their organisations. This deployment is a further representation of our ongoing commitment to data innovators in the UK.”

Orla McGrath, global partner solutions lead, Microsoft UK, said: “This launch further demonstrates our commitment to meeting our customers’ most stringent requirements, working hand in hand with our key partners.

“By ensuring availability of the Snowflake Data Cloud in our UK data centres, customers and partners across a wide range of industries can be better prepared to meet local data residency requirements, whilst leveraging the tools they need to accelerate their own data and AI strategies.”

Customers using Snowflake’s Data Cloud can discover and securely share data, as well as execute diverse analytic workloads. The platform is a cloud-native powerhouse of business intelligence capabilities, including applications, cybersecurity, collaboration, data warehousing, data lake, data engineering, data science and ML, and Unistore. Snowflake uses an innovative per-second pricing model, giving customers access to almost limitless capacity while paying only for the resources they consume.




