
Getting ready for the upcoming DNA computation revolution

The growing cost of AI

AI delivers innovation at a pace the world has never experienced, but it comes at a substantial cost. AI generates enormous volumes of data, and machine learning models are expensive to train and maintain.

These exorbitant costs point to a limitation of the current computation paradigm.

While the advent of the microprocessor and its exponential development over the decades is largely responsible for the world as we know it today, the basic von Neumann architecture surrounding the microprocessor hasn’t changed much since World War II. And it is this architecture, this computing paradigm, that is increasingly becoming incompatible with the ever-increasing demand for data storage and computation.

DNA computation: A paradigm shift

Our cells are DNA-based computers that come together to form our bodies, which collectively process trillions of operations in parallel with very little energy. Scientists have mimicked this process, using synthetic DNA to store and compute digital data in laboratory settings.

Compared to existing microprocessor technologies, which process workloads serially, a significant benefit of DNA computation platforms is the ability to use enzymes or DNA probes to compute in a massively parallel fashion.

Imagine mixing a container of blue liquid with a container of red liquid. The result of this computation, a new color, appears not by serially mixing each color molecule one at a time but by mixing all of them together in parallel. Just as in this thought experiment, computation is performed in a massively parallel manner directly on the data, without having to travel to memory or processor to be processed.
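The contrast between the two paradigms can be sketched in code. This is an illustrative analogy only, not a real DNA-platform API: the "molecules" are just numbers, and the averaging operation is a hypothetical stand-in for mixing. The serial loop mirrors the von Neumann style of visiting one element at a time, while the single bulk operation mirrors how enzymes act on every DNA strand in a test tube at once.

```python
import numpy as np

rng = np.random.default_rng(0)
blue = rng.random(100_000)  # a container of "blue" molecules (hypothetical data)
red = rng.random(100_000)   # a container of "red" molecules (hypothetical data)

# Serial paradigm: visit each pair of molecules one at a time,
# shuttling each value through the processor in turn.
serial_mix = [(b + r) / 2 for b, r in zip(blue, red)]

# Parallel paradigm: one bulk operation acts on all molecules at once,
# the way mixing two liquids combines every molecule simultaneously.
parallel_mix = (blue + red) / 2

# Both paradigms reach the same result; only the execution model differs.
assert np.allclose(serial_mix, parallel_mix)
```

The point of the sketch is that the answer is identical either way; what changes is the cost model, since the bulk operation does not pay a per-element trip between memory and processor.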

Potential DNA computation application areas

DNA-based computation has the potential to allow the generation of insights from data sets that are not currently possible with existing computers. Early application areas include search, signal processing, and machine learning.

One practical example is satellite imagery of the entire surface of the Earth. We’ll soon have decades’ worth of images taken every second of every day. Given the amount of data, a simple search using conventional technology could become prohibitively expensive, but with DNA, it could be as simple as a COVID test.

Other expected areas of early application are artificial intelligence, machine learning, data analytics, and secure computing. In addition, initial use cases are expected to include fraud detection in financial services, image processing for defect discovery in manufacturing, and digital signal processing in the energy sector.

Borrowing heavily from natural processes and cutting-edge synthetic biology tools, automated and scalable DNA-based computation platforms are, in addition to their parallelism, free of the limitations of traditional electronic systems. They offer low energy consumption, a small physical footprint, and secure computing.


In “The Structure of Scientific Revolutions,” physicist and philosopher Thomas Kuhn introduced the concept of a paradigm shift, which he used to describe a fundamental change in the basic framework of thinking in natural sciences. Throughout history, however, such paradigm shifts have occurred not just in natural sciences but across the entire spectrum of human endeavor, providing solutions to problems that appeared to be insurmountable under the old paradigm.

The field of data storage and computation is a case in point. As demand for data creation, retention, and computation keeps growing, the current computing paradigm requires enterprises to continuously build data centers the size of football fields, along with nuclear power plants to power them. The real problem is not whether we can keep building such facilities quickly enough, but that the current computing paradigm is fundamentally incompatible with a scalable solution.
