
Researchers Achieve Significant Progress Towards Quantum Information Processing

A research team from the National Physical Laboratory (NPL) has developed a first-of-its-kind monolithic three-dimensional (3D) ion microtrap array that may be scaled up to confine several tens of ion-based quantum bits (qubits).

Semiconductor chip used by scientists at the National Physical Laboratory to test the first scalable 3D ion microtrap. (Credit: National Physical Laboratory)

In a paper published in the journal Nature Nanotechnology, the research team describes how the device was realized, embedded in a semiconductor chip, and demonstrates the chip's ability to trap single ions.

The monolithic ion microtrap array combines a near-perfect 3D geometry with a scalable production process, a significant advance in the field. In terms of its basic operating properties, the microtrap chip outperforms other scalable devices for ions.

The research team fabricated the microtrap chip from a silica-on-silicon wafer using an innovative method based on conventional semiconductor fabrication technology. The team was able to trap single ions, and strings of around 14 ions, in one segment of the array. The fabrication technique should allow the device to be scaled up to handle larger numbers of ions without sacrificing the ability to control each of them individually.
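For a rough sense of the physics behind such a device, the sketch below computes the standard Mathieu stability parameter and secular (confinement) frequency of a single ion in an RF (Paul) trap, in the lowest-order pseudopotential approximation. The drive voltage, frequency, electrode distance, and ion mass used here are hypothetical illustrative values, not NPL's actual trap parameters.

```python
import math

# Physical constants (CODATA values)
E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def stability_parameter(V, omega_rf, r0, mass_amu, charge=1):
    """Mathieu stability parameter q = 2 Q V / (m Omega^2 r0^2).

    Stable trapping requires roughly q < 0.9.
    """
    m = mass_amu * AMU
    return 2 * charge * E_CHARGE * V / (m * omega_rf**2 * r0**2)

def secular_frequency(V, omega_rf, r0, mass_amu, charge=1):
    """Radial secular frequency omega_sec ~ q * Omega / (2 * sqrt(2)),
    valid in the pseudopotential approximation (q << 1)."""
    q = stability_parameter(V, omega_rf, r0, mass_amu, charge)
    return q * omega_rf / (2 * math.sqrt(2))

# Hypothetical microtrap values: 100 V RF drive at 20 MHz, electrodes
# 250 um from the trap axis, a singly charged ion of mass 40 amu.
omega_rf = 2 * math.pi * 20e6
q = stability_parameter(100.0, omega_rf, 250e-6, 40.0)
f_sec = secular_frequency(100.0, omega_rf, 250e-6, 40.0) / (2 * math.pi)
print(f"q = {q:.3f}, secular frequency = {f_sec / 1e6:.2f} MHz")
```

Tighter 3D geometries (smaller `r0`) give stiffer confinement at the same voltage, which is one reason a near-ideal 3D microtrap geometry is attractive for scalable arrays.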

Remarkable developments in nanotechnology have allowed the power of conventional processor chips to scale in line with Moore's Law. Although quantum processors are still at a nascent stage, the NPL microtrap chip represents significant progress towards scaling up such devices for ion-based qubits.

NPL's Principal Scientist, Alastair Sinclair, stated that the research team has fabricated a key device that is essential for cutting-edge research and advancement in quantum technologies. The device lays the cornerstone for a future atomic clock with applications in timing, location and navigation services, or for a future quantum processor chip built upon trapped ions, thus paving the way to a quantum computer as well as a quantum information network.

Source: http://www.npl.co.uk

Written by

Will Soutter

Will has a B.Sc. in Chemistry from the University of Durham and an M.Sc. in Green Chemistry from the University of York. Naturally, Will is our resident chemistry expert, but a love of science and the internet makes him the all-rounder of the team. In his spare time Will likes to play the drums, cook and brew cider.

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Soutter, Will. (2019, February 18). Researchers Achieve Significant Progress Towards Quantum Information Processing. AZoQuantum. Retrieved on May 07, 2024 from https://www.azoquantum.com/News.aspx?newsID=72.

  • MLA

    Soutter, Will. "Researchers Achieve Significant Progress Towards Quantum Information Processing". AZoQuantum. 07 May 2024. <https://www.azoquantum.com/News.aspx?newsID=72>.

  • Chicago

    Soutter, Will. "Researchers Achieve Significant Progress Towards Quantum Information Processing". AZoQuantum. https://www.azoquantum.com/News.aspx?newsID=72. (accessed May 07, 2024).

  • Harvard

    Soutter, Will. 2019. Researchers Achieve Significant Progress Towards Quantum Information Processing. AZoQuantum, viewed 07 May 2024, https://www.azoquantum.com/News.aspx?newsID=72.
