Ajay Malik

Deep learning to avoid real time computation


“The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known, and the difficulty is only that the exact application of these laws leads to equations much too complicated to be soluble. It therefore becomes desirable that approximate practical methods of applying quantum mechanics should be developed, which can lead to an explanation of the main features of complex atomic systems without too much computation” — Paul Dirac, 1929. For example:

  • The Navier–Stokes equations are the fundamental basis of almost all Computational Fluid Dynamics (CFD) problems. They are used to model the weather, airflow around an airplane wing, ocean currents, water flow in a pipe, and pollution dispersion. But they cannot be solved in real time because of the computation they require.
  • Fully describing an arbitrary many-body quantum state requires an exponential amount of information. While simulating a quantum system of 30 qubits requires just tens of gigabytes, simulating 300 qubits would require more bytes than there are atoms in the observable universe! Even state-of-the-art approximation methods for quantum mechanics, such as Hartree–Fock (HF) theory and Density Functional Theory (DFT), can take hours, days, or even weeks to compute.
  • Even with the computing power available today, simulations of acoustic transmission and absorption in buildings take far too long for real-time analysis.
  • Computer simulation of electrons in the potential of atomic nuclei is the workhorse of modeling material properties such as phase stability, mechanical behavior, and thermal conductivity. However, these simulations are limited by their computational cost.
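The exponential scaling in the quantum-mechanics bullet above is easy to verify with a little arithmetic (an illustrative calculation, not from the original article): a dense n-qubit state vector holds 2^n complex amplitudes.

```python
# An n-qubit state vector stores 2**n complex amplitudes;
# at double precision (complex128) each amplitude takes 16 bytes.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

print(f"30 qubits:  {state_vector_bytes(30) / 2**30:.0f} GiB")  # 16 GiB
print(f"300 qubits: about 10**{len(str(state_vector_bytes(300))) - 1} bytes")
```

For comparison, the observable universe is estimated to contain on the order of 10^80 atoms, far fewer than the roughly 10^91 bytes a 300-qubit state vector would need.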
Deep Learning

Deep learning has the potential to break through this limitation.

Imagine creating a deep learning model by constructing a dataset that covers the entire physically relevant set of configurations for the problem, and then using that model to bypass the costly calculations entirely in the future.

You can use the predictive power of deep neural networks to cut the computation time down to a couple of seconds.

Yes, you will do the complex simulations once, but only once! Once you have built your dataset and chosen a fitting methodology, such as a simple feed-forward neural network (FFN), a restricted Boltzmann machine (RBM), or any other neural network architecture, your model can serve as a template for all future work!
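The workflow above can be sketched in a few lines. This is a hypothetical illustration, not the author's implementation: the "expensive simulation" is a stand-in analytic function, and for brevity the network's hidden layer is fixed at random while only the output weights are fitted by least squares (a simplification of training a full FFN).

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_simulation(x):
    """Stand-in for a costly solver (CFD, DFT, ...)."""
    return np.sin(3.0 * x) + 0.5 * x

# 1. Run the costly simulation ONCE to build the training dataset.
X = rng.uniform(-1.0, 1.0, size=(512, 1))
y = expensive_simulation(X)

# 2. Fit the surrogate: random tanh hidden features + linear output layer.
W1 = rng.normal(0.0, 3.0, size=(1, 64))      # fixed random hidden weights
b1 = rng.uniform(-3.0, 3.0, size=64)         # fixed random hidden biases
H = np.tanh(X @ W1 + b1)                     # hidden-layer activations
W2, *_ = np.linalg.lstsq(H, y, rcond=None)   # fit output weights

# 3. At run time, evaluate the cheap surrogate instead of the solver.
def surrogate(x):
    return np.tanh(x @ W1 + b1) @ W2

x_new = rng.uniform(-1.0, 1.0, size=(100, 1))
mse = float(np.mean((surrogate(x_new) - expensive_simulation(x_new)) ** 2))
print(f"surrogate mean-squared error: {mse:.6f}")
```

The surrogate is a single matrix multiply plus a nonlinearity, so evaluating it takes microseconds, regardless of how long the original simulation took to generate the training data.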

To me, this is one of the best uses of deep learning to build an explanation without too much computation!

People often think of AI as boosting growth by substituting for humans, but much of its value will actually come from how humans use AI. This is yet another perfect example of how deep learning will help us advance.

By Ajay Malik
