ORIGONE QUANTUM Offering

The offering is aimed first at international groups whose core business development relies on solving complex scientific problems in a short time, in sectors such as finance, aerospace, defense, big data, and the pharmaceutical industry. ORIGONE Quantum's secondary market will focus on ground-breaking start-ups that need computational power but cannot afford it.

ORIGONE Quantum is intended to be the company that develops operating systems for quantum computers. It aims to be the key player and the reference in adapting industrial applications to quantum systems.

Don’t hesitate to contact our team of experts for more information: info@origone.fr or quantum@origone.fr

Quantum Applied Industries

Quantum Applied Industries is a not-for-profit organization that aims to:

  • Contribute to the development of the international scientific community's knowledge of physics and materials science, high-performance computing, quantum computers, and their industrial and academic applications, by offering:
    • A forum for reflection and a source of proposals for measures to promote research, together with scientific and material support for the benefit of academic organizations, industry, and start-ups;
    • A fundamental and applied research activity, conducted with public and private institutions, to build familiarity with fundamental research among national institutions, cultural communities, industrial development sites, and priority education audiences, with the objective of globally consistent knowledge of physics, high-performance computing, quantum computers, and their industrial and academic applications.
  • Act as a source of proposals, support, and action in familiarizing public institutions with applied research measures, and in developing and implementing the technological tools and organizational and human resources needed to advance fundamental physics, high-performance computing, quantum computers, and their industrial and academic applications.
  • Propose projects, actions, and services related to the purpose of the Association.

D-Wave Systems Breaks the 1000 Qubit Quantum Computing Barrier

With the kind authorization of our partner D-Wave Systems Inc.

New Milestone Will Enable System to Address Larger and More Complex Problems

Palo Alto, CA – June 22, 2015 – D-Wave Systems Inc., the world’s first quantum computing company, today announced that it has broken the 1000 qubit barrier, developing a processor about double the size of D-Wave’s previous generation and far exceeding the number of qubits ever developed by D-Wave or any other quantum effort.  This is a major technological and scientific achievement that will allow significantly more complex computational problems to be solved than was possible on any previous quantum computer.

D-Wave’s quantum computer runs a quantum annealing algorithm to find the lowest points, corresponding to optimal or near-optimal solutions, in a virtual “energy landscape.” Every additional qubit doubles the search space of the processor. At 1000 qubits, the new processor considers 2^1000 possibilities simultaneously, a search space which dwarfs the 2^512 possibilities available to the 512-qubit D-Wave Two. In fact, the new search space contains far more possibilities than there are particles in the observable universe.
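A back-of-the-envelope check of those figures (the ~10^80 estimate for particles in the observable universe is the commonly cited one, not taken from the release):

```python
# Each additional qubit doubles the number of candidate configurations.
two_512 = 2 ** 512    # 512-qubit D-Wave Two
two_1000 = 2 ** 1000  # new 1000-qubit processor

print(f"512 qubits : ~10^{len(str(two_512)) - 1} configurations")
print(f"1000 qubits: ~10^{len(str(two_1000)) - 1} configurations")

# Particles in the observable universe are commonly estimated at ~10^80,
# far below either figure.
print(two_1000 > 10 ** 80)  # True
```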

“For the high-performance computing industry, the promise of quantum computing is very exciting. It offers the potential to solve important problems that either can’t be solved today or would take an unreasonable amount of time to solve,” said Earl Joseph, IDC program vice president for HPC. “D-Wave is at the forefront of this space today with customers like NASA and Google, and this latest advancement will contribute significantly to the evolution of the Quantum Computing industry.”

As the only manufacturer of scalable quantum processors, D-Wave breaks new ground with every succeeding generation it develops. The new processors, comprising over 128,000 Josephson tunnel junctions, are believed to be the most complex superconductor integrated circuits ever successfully yielded. They are fabricated in part at D-Wave’s facilities in Palo Alto, CA and at Cypress Semiconductor’s wafer foundry located in Bloomington, Minnesota.

“Temperature, noise, and precision all play a profound role in how well quantum processors solve problems.  Beyond scaling up the technology by doubling the number of qubits, we also achieved key technology advances prioritized around their impact on performance,” said Jeremy Hilton, D-Wave vice president, processor development. “We expect to release benchmarking data that demonstrate new levels of performance later this year.”

The 1000-qubit milestone is the result of intensive research and development by D-Wave and reflects a triumph over a variety of design challenges aimed at enhancing performance and boosting solution quality. Beyond the much larger number of qubits, other significant innovations include:

  • Lower Operating Temperature: While the previous generation processor ran at a temperature close to absolute zero, the new processor runs 40% colder. The lower operating temperature enhances the importance of quantum effects, which increases the ability to discriminate the best result from a collection of good candidates.
  • Reduced Noise: Through a combination of improved design, architectural enhancements and materials changes, noise levels have been reduced by 50% in comparison to the previous generation. The lower noise environment enhances problem-solving performance while boosting reliability and stability.
  • Increased Control Circuitry Precision: In the testing to date, the tighter control circuitry, coupled with the noise reduction, has been shown to improve precision by up to 40%. Accomplishing both while also improving manufacturing yield is a significant achievement.
  • Advanced Fabrication:  The new processors comprise over 128,000 Josephson junctions (tunnel junctions with superconducting electrodes) in a 6-metal layer planar process with 0.25μm features, believed to be the most complex superconductor integrated circuits ever built.
  • New Modes of Use: The new technology expands the boundaries of ways to exploit quantum resources. In addition to performing discrete optimization like its predecessor, the system will, through firmware and software upgrades, become easier to use for sampling applications.

“Breaking the 1000 qubit barrier marks the culmination of years of research and development by our scientists, engineers and manufacturing team,” said D-Wave CEO Vern Brownell. “It is a critical step toward bringing the promise of quantum computing to bear on some of the most challenging technical, commercial, scientific, and national defense problems that organizations face.”

Former Department of Defense official and author releases new cyber security book

April 28, 2015
Contact: Rebecca Abrahams, 240-200-4490

New book, “Essays in Technology, Security and Strategy,” now available on Kindle

WASHINGTON, D.C. – Technology security visionary Dr. Stephen Bryen has published a new collection of pivotal essays on national security and cyber security to help policy makers and citizens understand the real threats facing the security of the United States.

“Essays in Technology, Security and Strategy” provides unique insight and new information from Dr. Bryen, who has more than 40 years of experience in government and in the defense and technology industries. The book guides readers through a unique landscape of original ideas and practical solutions to the ever-increasing threats to U.S. security and our way of life.

“These interesting, colorful, and engaging essays demonstrate a deep understanding of what has exacerbated the technological, foreign policy, and national security challenges facing America today,” said noted author and terrorism expert Rachel Ehrenfeld.

“Essays in Technology, Security and Strategy” targets important questions, including:

  • Is the U.S. still a Great Power?
  • Will NATO and Europe fight?
  • Will Japan build its own nuclear weapons?
  • Why Iraq is a national security disaster
  • After an Iran deal will there be a Saudi-Israeli alliance?
  • Why spying is out of control
  • Sharing our defense budget with China

The book also provides insight on domestic affairs such as:

  • Why the Stingray police spy tool will end up in the Supreme Court
  • The day U.S. critical infrastructure goes up in smoke
  • U.S. Policy and Cyber Attacks – time for a Byte for a Byte

Dr. Bryen served as a senior staff director of the U.S. Senate Foreign Relations Committee and as Deputy Under Secretary of Defense for Trade Security Policy. He was also the founder and first director of the U.S. Defense Technology Security Administration and served as a Commissioner of the U.S. China Security Review Commission. Dr. Bryen’s extensive experience and high effectiveness have earned him the U.S. Defense Department’s highest civilian award on two occasions and established him as a proven government, civic, and business leader in Washington, D.C. and internationally.

Contributing co-authors of “Essays in Technology, Security and Strategy” include Peabody and Edward R. Murrow award-winning producer, journalist, and author Rebecca Abrahams, and Shoshana Bryen, an internationally recognized expert on defense policy and Senior Director of the Jewish Policy Center in Washington, D.C. Mrs. Bryen is also the editor of inFocus Quarterly.

More information about co-author Rebecca Abrahams can be found at www.linkedin.com/pub/rebecca-abrahams/0/9b9/648. For more information about co-author Shoshana Bryen, visit www.jewishpolicycenter.org/board/shoshana-bryen.

For more information about “Essays in Technology, Security and Strategy” and author Dr. Stephen Bryen, please visit www.amazon.com/author/stephenbryen.

About the Author

Dr. Stephen Bryen served as a senior United States Department of Defense official responsible for technology security and has headed a major international corporation in the United States. He brings more than 45 years of experience in government, international politics, business, and policy expertise into focus in this important new book, “Essays in Technology, Security and Strategy.” Dr. Bryen twice was awarded the Pentagon medal for Distinguished Public Service.

OIVs Face the Obligations of the LPM

A decree specifies the new cybersecurity obligations.

Constraints, red tape, costs, and the obligation to report cyber attacks to the authorities: such is the price to be paid by operators of vital importance (OIV), companies whose activity is deemed strategic for the nation. A year and a half after the military programming law (LPM) was passed, a decree strengthening the obligations of 218 companies from all sectors (banks, telecom operators, mass retail, etc.) was published on Sunday.

OIVs will have to deploy systems to detect the intrusions they are subject to, and conduct audits by calling on either the Agence Nationale de la Sécurité des Systèmes d’Information (ANSSI, France’s national cybersecurity agency) or certified providers such as Thales.

Read the full article HERE.

Rise of In-Memory Database Management Systems

As technology advances, more and more options become available to the end user, much to the relief of many individuals, businesses, and organizations. Take, for instance, the in-memory database. In-memory database technology has been at the forefront of enabling businesses to mine operational data for business intelligence. But what exactly does “in-memory database” refer to? It is a database that runs entirely within the working memory, or RAM, of a server or servers.
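As a minimal illustration of the concept, SQLite (bundled with Python’s standard library) will run a database entirely in RAM when given the special ":memory:" path; the schema and rows below are invented for the example:

```python
import sqlite3

# SQLite runs entirely in RAM when given the special ":memory:" path:
# no file is created, and the data lives (and dies) with the connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO events (payload) VALUES (?)",
                 [("login",), ("purchase",), ("logout",)])

for row in conn.execute("SELECT id, payload FROM events"):
    print(row)

conn.close()  # the database disappears here; persistence requires snapshots
```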

In 2014, the performance-monitoring service New Relic launched an application that enables businesses to use the data acquired through their operations to build business intelligence. Business intelligence is crucial: it enables businesses to understand how to handle customer service, security, and targeted marketing. The application unveiled by New Relic muscles through huge volumes of data to arrive at answers to the queries users put to it.

In addition to the many applications that can be built on in-memory databases, this database type is growing in popularity in its own right, for several reasons. Some time back, in-memory databases were the preserve of well-funded, fast-trading financial firms. Fast forward to today and, thanks to the falling cost of server memory, more institutions can afford them. The fact that customers are demanding ever-faster access to Internet services has also pushed companies to ensure those demands are met.

An increasing number of technology companies are offering databases with in-memory capabilities, making the benefits of the in-memory database available to companies and, consequently, their customers. A growing number of caching tools, such as Memcached and Redis, have also been developed to let companies keep relational database content in memory. For example, Facebook uses MySQL to store user data but relies on Memcached to get material to users quickly, as sketched below.
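Here is a minimal sketch of this cache-aside pattern, assuming the redis-py client, a Redis server on localhost, and a hypothetical query_mysql helper standing in for the real relational query (none of these details come from the article):

```python
import json
import redis  # assumes the redis-py client and a local Redis server

cache = redis.Redis(host="localhost", port=6379)

def query_mysql(user_id):
    # Hypothetical placeholder for the real relational query.
    return {"id": user_id, "name": "example"}

def get_user(user_id, ttl_seconds=300):
    """Cache-aside: serve from RAM when possible, else fall back to the DB."""
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)  # cache hit: no database round trip
    user = query_mysql(user_id)    # cache miss: hit the relational store
    cache.set(key, json.dumps(user), ex=ttl_seconds)  # populate for next time
    return user
```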

Microsoft, Oracle, Altibase, and other companies considered among the leading in-memory database providers continue to upgrade and develop their technology solutions.

The in-memory database is quickly becoming a favorite among many enterprise owners, with the question shifting from “What is it?” to “How can I do it?”. It is clear that, as far as operations are concerned, business owners are constantly seeking ways to improve efficiency and service delivery. For instance, the in-memory capabilities of a database do not just improve delivery speeds; they can create new business lines as well, such as the ability to change product prices to counter the dynamism of competitors’ prices.

Dynamic pricing can be done with standard relational databases, but with in-memory databases the process is much faster and delays are all but eliminated. By integrating in-memory databases with the appropriate data-processing platforms, big data can be handled in these databases as well.
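As a rough sketch of that dynamic-pricing idea, again using SQLite’s in-memory mode; the table, SKU, and undercutting rule are illustrative assumptions, not any vendor’s API:

```python
import sqlite3

# Illustrative in-memory price table; repricing against it avoids disk I/O.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE prices (sku TEXT PRIMARY KEY, price REAL)")
db.execute("INSERT INTO prices VALUES ('widget-42', 19.99)")

def match_competitor(sku, competitor_price, undercut=0.01):
    """Reprice only when a competitor is cheaper (hypothetical rule)."""
    (ours,) = db.execute("SELECT price FROM prices WHERE sku = ?",
                         (sku,)).fetchone()
    if competitor_price < ours:
        db.execute("UPDATE prices SET price = ? WHERE sku = ?",
                   (competitor_price - undercut, sku))

match_competitor("widget-42", 18.50)
print(db.execute("SELECT price FROM prices WHERE sku = 'widget-42'").fetchone())
# (18.49,)
```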