Back to the future
From Shaastra :: vol 04 issue 11 :: Dec 2025
A look back at the technologies that evolved in 2025 and will transform the next decade and more in science.
When looking to identify the most important scientific and technological developments of the year gone by, the Nobel Prizes are probably not the place to begin the search. Laureates are typically chosen for work done long ago, sometimes decades earlier. And in most cases, their names and their work reveal little about the quality of contemporary research. Although the time gap between discovery and prize has shrunk over the years, it is rare to see scientific advances acknowledged with a Nobel soon after the publication of the research. Even for veterans in a field, it takes a long while to assess the real importance of a discovery.
The year 2025 marks a fundamental shift in the way science is done as well as the way scientific problems are defined.
However, Nobel Prizes are often awarded with an eye on the future as well. The Nobel Committee is guided not just by the quality of the work and its impact on current science. Its members look at the work's impact on other areas of science, the new avenues of research it has opened up, and its importance to emerging and future scientific fields. The Prizes are given after thorough research and discussion, and provide insights into the minds of the Nobel Committee. They can, therefore, give a sense of the direction of future research. Increasingly, they act as pointers to the direction in which technology is evolving as well.
Till the 1990s, almost all Nobel Prizes went to basic research in science – such as the discovery of a new law or a fundamental aspect of nature that hadn't been seen till then. From the 1990s, new technologies blossomed through the application of science, and these had a significant impact on society. This drove a gradual shift in the outlook of the Nobel Committee, which increasingly began to reward science that led to transformative technologies. The committee is, in a sense, mirroring what the rest of the world thinks. There is a recognition around the world, especially among policymakers and funders of science, that scientific research must have a transformative effect on society. Specifically, political and industrial leaders now look towards science to solve existential problems like climate change, pollution, pandemics, and chronic diseases.
THE SILVER LINING
This felt need to solve serious problems now shapes scientific research and drives funding. Although research funding in the United States fell in 2025, and climate change programmes there were cancelled by the Trump administration, the year accelerated a shift towards orienting research to solve major global challenges, with major developed economies contributing large amounts to developing the technologies of the future. The European Union's Horizon Europe, a €93.5-billion programme to tackle global challenges, allocated €7.3 billion to boost European competitiveness in the green and digital transitions. In November, India launched a new programme, the Research, Development and Innovation Scheme, with a budget of ₹1 lakh crore, to accelerate research in the areas of quantum technologies, clean tech, and biotechnology. Japan continued research under its ¥2 trillion Green Innovation Fund to decarbonise the country by 2050. China enhanced its research budget by 10% during the year; it now matches the U.S. in government funding for science.
These projects largely fund a new kind of research, based on methods that are being created rapidly. It uses tools like quantum models, data-based modelling and artificial intelligence (AI) simulations – tools that did not exist at this level a few years ago. Meanwhile, scientists are moving across silos and creating new disciplines. Science has been multidisciplinary for a while, with several areas layering their contributions separately and ultimately creating a new finding or a product like a telescope or a microscope or an MRI machine. It is increasingly interdisciplinary too, as it synthesises and integrates different fields to solve difficult problems such as protein folding or climate change. Research is now becoming transdisciplinary as well, as it creates new disciplines – pandemic research, complexity science – by combining many older areas of research. All of these strands – multidisciplinary, interdisciplinary, and transdisciplinary – will shape the path to the future.
The major breakthroughs and events of 2025 were all driven by these trends. Many were in areas key to solving the big problems of the world, from climate change to pollution to chronic diseases. Quantum science and technology made major advances. In chemistry, new materials held centre stage, as large databases of metal-organic frameworks (MOFs) were created with assistance from AI and tools based on large language models (LLMs). These two disciplines – quantum technologies and AI – are now at the forefront of scientific research. The United Nations had declared 2025 the International Year of Quantum Science and Technology. The Nobel Prize in Physics for 2025 went to three researchers who had demonstrated quantum effects in a chip "big enough to be held in the hand" (as the press release from the Royal Swedish Academy of Sciences noted). The Nobel Prize in Chemistry for 2025 went to three researchers who developed MOFs.
Work that led to the Nobel Prize in Physics will be used in robust quantum computers, quantum cryptography, quantum sensors, and other technologies based on quantum mechanics. New MOFs will be central to solving climate change, through the development of technologies for water harvesting and carbon capture from the air, new kinds of catalysts, hydrogen storage, and so on. The Nobel Prize in Physiology or Medicine 2025 went to three scientists for discovering mechanisms by which immune cells are prevented from attacking the body's own tissues. Their research, conducted since the mid-1990s, underlines the importance of immune regulation for medicine: at least 200 drugs that use these mechanisms are in clinical trials for a variety of conditions – from kidney transplantation to cancer. These clinical trials, along with recent evidence of the importance of immune tolerance in tackling several diseases, informed the Prize in 2025.
A SCIENTIFIC SHIFT
So, 2025 marks a fundamental shift in the way science is done as well as the way scientific problems are defined. A lot of basic or blue-sky research is now being carried on under the umbrella of large projects created with specific goals. Solving climate change, for instance, requires the development of new materials, and development of new materials requires an understanding of matter at a deep level. Tackling neurodegenerative diseases and improving mental health requires a keener understanding of the brain, which in turn requires independent advances in many traditional disciplines. Understanding pandemics is now a transdisciplinary science.
Scientific research increasingly uses tools that did not exist at this level a few years ago: quantum models, data-based modelling and artificial intelligence simulations.
The Cover Story package over the next few pages showcases some of the more noteworthy science headlines from 2025. They relate to areas that are vigorously active. Despite being areas of blue-sky research in an age of translation, astronomy and high-energy physics have been thriving in recent years. Cosmology has been undergoing a 'stress test', as new data question old assumptions about the standard cosmological model, a framework built using physics that has not yet been observed in experiments. In fact, recent measurements made through different methods have yielded different expansion rates for the universe, with the difference too large to ignore. The James Webb Space Telescope has spotted extraordinary events in the universe.
Similar discrepancies have been jolting physics as well. The muon, a heavy fundamental particle otherwise similar to the electron, has been teasing physicists for a few years, with data hinting at new physics and holes in the Standard Model of Particle Physics (which is distinct from the standard cosmological model). Data went the other way in 2025, strengthening the Standard Model and seemingly weakening rival theories like supersymmetry. The gap between prediction and experiment for the muon will continue to generate excitement over the next few years. There were plenty of advances in other areas of physics: precise optical clocks, quantum computers, improvements in quantum chips, and much else that is exotic.
In chemistry, scientists took further steps towards using safer and cleaner materials for synthesis, removing the need for rare and expensive metals and thereby opening avenues for clean carbon capture and chemical synthesis. Overall, in several incremental but still significant breakthroughs, scientists found ways of making the environment cleaner through novel methods of synthesis. Some chemists developed plastics that can be programmed to break down in the environment after a specific lifetime. Meanwhile, news about microplastics and other pollutants became grimmer. Climate scientists discovered that the Earth may have crossed some points of no return.
Some of the best news came from biology and medical research. Biologists laid the groundwork for future disease treatment by moving towards understanding diseases as network perturbations. Evidence for this idea has been strengthening over the past decade, and in 2025 scientists also took steps towards using the theory to treat diseases with multidrug combinations. Advances were also made in understanding cellular ageing, developing tests for Alzheimer's disease, deepening the understanding of immune regulation, and so on. Gene editing techniques got better and more precise. These advances are expected to generate a paradigm shift in the way scientists look at disease. They would also change the way doctors treat disease. Therapies may be round the corner for several intractable problems in medical science. It will be rewarding to watch the next year closely.
PHYSICS/ASTRONOMY
How does the universe end? That question keeps cosmologists awake at night. For nearly a century, they have known that the universe is expanding – a hypothesis confirmed when American astronomer Edwin Hubble published results showing that galaxies were all receding from each other, with velocities proportional to their distance from the Earth. Until the end of the 20th century, many astronomers also believed that the universe would one day collapse under its own gravity, a process called the Big Crunch. A universe expanding forever didn't make sense to them. In 1998, cosmologists got a shock.
Three American astronomers showed that the universe's expansion was accelerating: the space between galaxies was stretching ever faster. If this acceleration continued, there would come a time when everything in the universe would disappear from view. The three astronomers – Brian Schmidt, Adam Riess, and Saul Perlmutter – won the 2011 Nobel Prize in Physics for their discovery. The search since then has been to determine what causes this acceleration. Cosmologists named its source dark energy, but what is in a name?
More DESI data became available in October 2025, increasing confidence in a weakening dark energy. Scientists now await the next release.
Now, the cosmology community is bracing itself for one more shock. To study the expansion history of the universe, the U.S. Department of Energy has funded the Dark Energy Spectroscopic Instrument (DESI), based in Arizona. Astronomers got excited when DESI data from 6 million galaxies became available in 2024, suggesting that dark energy might not be a constant and might have changed over time. In March 2025, data from 14 million galaxies became available, providing stronger evidence that dark energy has been weakening over time. More data became available in October 2025, increasing confidence in a weakening dark energy. Now all eyes are trained on the next release in 2026, promising two exciting years ahead in cosmology.
FROZEN LIGHT
Early in their science lessons, schoolchildren learn about the three states of matter: solid, liquid, and gas. These states are determined by factors such as atomic size, temperature and pressure. The human senses, however, cannot perceive the realms of the very small or very large, very hot or very cold. A fourth state of matter occurs at high temperatures, when atoms break into their constituent electrons and nuclei. This state is called plasma, and is what makes up the stars in the universe. When matter becomes very cold, close to the lowest temperature it can possibly reach, atoms of some elements can coalesce and behave as if they were one atom, like soldiers marching in unison.
This cold, fifth state is called a Bose-Einstein condensate. It was theorised by S.N. Bose and Albert Einstein in 1924 and confirmed experimentally in 1995. However, the bizarre world of quantum mechanics has thrown up surprises, with matter that cannot be classified precisely into these states: superfluids that flow without any resistance, and supersolids whose atoms are ordered as in solid crystals and yet flow without resistance, as in liquids. In precise scientific language, these are phases rather than states. They exist at very low temperatures, and the existence of supersolids was fully confirmed only in 2017. Supersolids are engineered quantum systems, not natural matter simply cooled. In 2025, scientists coaxed light itself into behaving as a supersolid.
Researchers in Italy cooled particles of light — photons — into a Bose-Einstein condensate. When many such photons were cooled, they behaved like a supersolid, an entity that is frozen and yet flows. The scientific and technological possibilities of this research are enormous: among other prospects, more precise quantum computers and sensors, and tools to explore new physics.
QUANTUM TECHNOLOGIES
The atomic world is considered a bizarre realm, which runs by the rules of quantum mechanics that do not conform to human notions of reality. Human beings have evolved to perceive what happens at the middle scales — not in the nanoworld of atoms, nor in the scales of the universe. The quantum world has phenomena such as superposition and entanglement, in which something can take values of, say, one and two at the same time, and two particles are tuned to behave as one, even though they are physically far apart. Physicists have learned to describe such things mathematically, although they do not fully understand them in terms of common perception. These properties are useful for building computers.
When a quantum computer is created, it is not easy to know whether it can perform tasks better than a classical computer.
There are many ways of physically realising the two quantum states of entities called qubits. Qubits are to a quantum computer what 'bits' taking values 0 and 1 are to classical computers. However, a quantum computer can hold superpositions of two or more qubits, and this superposition provides ways of representing problems of great complexity. It is hard to create and maintain these qubits. When a quantum computer is created, it is also not easy to know whether it can perform tasks better than a classical computer – a feat called quantum advantage. In October 2025, Google published a paper in the journal Nature (bit.ly/Google-Advantage) claiming to have achieved quantum advantage with its computer.
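The arithmetic behind superposition can be sketched in a few lines of NumPy. This is an illustrative toy, not how real quantum hardware is programmed: a qubit is a unit vector of two complex amplitudes, and n qubits together need 2**n amplitudes, which is where the representational richness comes from.

```python
import numpy as np

# A classical bit is 0 or 1; a qubit is a unit vector in C^2.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
plus = H @ ket0

# Two qubits live in a 4-dimensional space (the tensor product), so n qubits
# need 2**n amplitudes -- exponentially more than n classical bits.
two_qubits = np.kron(plus, plus)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(two_qubits) ** 2
print(probs)  # each of the four basis states 00, 01, 10, 11 is equally likely
```

Simulating such state vectors classically is exactly what becomes infeasible as qubit counts grow, which is why claims of quantum advantage hinge on problems too large for this kind of simulation.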
The machine was a 105-qubit superconducting quantum computer, which was used to predict properties of simple molecules by computing the interactions between the spins of electrons in the atoms. The predicted structures could be verified using Nuclear Magnetic Resonance (NMR) spectroscopy. The method, however, did not work for larger molecules. Some scientists question whether there was indeed a quantum advantage, since it cannot be ruled out that classical computers and algorithms could one day be developed to match this performance.
THE KEY
What is a quantum computer good for? This is not a trivial question when classical computers and algorithms keep advancing, and no one can predict how good they will get. However, there is at least one problem that classical computers cannot solve quickly: breaking large numbers into their constituent factors. As the numbers grow larger, classical computers take longer and longer to factorise them; at some stage, it takes longer than a human lifetime. Quantum computers, on the other hand, are expected to factorise large numbers in no time. And that is a problem rather than a solution.
It is easy to multiply numbers to create a large number. But, once created, the large number is hard to factorise unless you know the original numbers. Engineers use this property of numbers to build secure communications: the secret key is, in essence, the original set of factors, and anyone intercepting the message sees only the large number, which they cannot factorise to crack the code. Since quantum computers may one day factorise it easily, the world needs quantum techniques to secure communications as well. In March 2025, the Japanese firm Toshiba made an important advance in quantum communications. The company exchanged quantum keys securely between two locations, with equipment operating at temperatures between -20°C and -60°C. In fact, it used standard optical fibres between Frankfurt and Kehl, German cities 250 km apart. Previous demonstrations by Toshiba were over longer distances, but with equipment colder than interstellar space. The current demonstration brings the technology into a more practical domain. Real quantum communication systems no longer seem too far in the future.
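The asymmetry is easy to see even at toy scale. A minimal Python sketch, for illustration only: real cryptography uses numbers hundreds of digits long, and attackers use far better methods than the trial division shown here, but the exponential gap between multiplying and factoring remains.

```python
import math

# Multiplying two primes is instant, even for enormous numbers.
p, q = 104723, 104729          # two known primes
n = p * q                      # 10,967,535,067 -- computed in microseconds

# Recovering p and q from n by trial division takes about sqrt(n) steps.
# For a 2048-bit n that is far beyond any classical computer's reach;
# Shor's algorithm on a quantum computer would do it in polynomial time.
def factor(n):
    """Return the smallest prime factor of n and its cofactor."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return None  # n is prime

print(factor(n))  # (104723, 104729)
```

Here the search is still feasible (about a hundred thousand divisions); doubling the number of digits in p and q squares the work, which is why scaled-up versions of this one-way street underpin today's secure communications.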
BIOLOGY/MEDICINE
In 1972, American scientists Stanley Cohen and Herbert Boyer developed a technique to cut DNA using enzymes and transfer the pieces from one organism to another, marking the origin of DNA modification. It was a clumsy technique then and prone to errors, but was being used by the early 1990s for gene therapy. New methods were developed in the new millennium, but even these were slow, expensive and error-prone. In 2012, Emmanuelle Charpentier and Jennifer A. Doudna developed the CRISPR-Cas9 system, a toolkit for precisely editing DNA. Rather than transferring DNA from one organism to another, it edits DNA within an organism. The technique holds enormous possibilities in medicine and agriculture.
Using the technique, scientists from the Indian Council of Agricultural Research (ICAR) in 2025 developed two new rice varieties: Pusa Rice DST1 and DRR Dhan 100. Pusa Rice DST1 is designed for better drought and salt tolerance: the CRISPR editor silenced a gene that was suppressing stress resistance in the parent rice variety, thereby increasing the plant's ability to tolerate stress. Overall, the edited variety showed lower water requirements, higher grain yield, and greater salt tolerance. DRR Dhan 100 was developed by editing the common Samba Mahsuri variety, resulting in a 20-25% higher yield, a 20-day shorter maturation cycle, and greater tolerance to drought and low-fertiliser conditions than its parent.
Plant scientists across India are using the CRISPR toolkit to improve a plant's nutritional value, climate resilience, yield and other traits.
It will take a few years for these rice varieties to reach the farmer's field, but plant scientists across India are using the CRISPR toolkit to improve nutritional value, climate resilience, yield and other traits. Researchers at the National Agri-Food and Biomanufacturing Institute (NABI), Mohali, have edited the LCYε gene in bananas to develop a Vitamin A-rich banana. At the National Institute of Plant Genome Research, New Delhi, researchers have edited genes in the mustard plant to improve oil quality. These are just the beginnings of a transformative technology.
TREATING ALZHEIMER'S
In 1906, after an autopsy, German neurologist Alois Alzheimer described a brain disease with characteristic neuroanatomical features, such as the accumulation of plaques. It was later named Alzheimer's disease. For nearly a century, the only reliable diagnosis of the disease was an autopsy. By the end of the 20th century, neuroimaging techniques began to detect brain tissue shrinkage in late-stage disease. Soon after, spinal fluid markers emerged as good diagnostic indicators, but only long after symptoms manifest. All of these methods were expensive and difficult to administer. Till recently, the drugs available for treatment were also not effective. In the last few years, both the diagnosis and treatment of Alzheimer's have taken a new turn.
Earlier in 2025, the U.S. Food and Drug Administration (FDA) approved a blood-based test for diagnosing the disease — the first such test approved by the FDA to confirm Alzheimer's. Developed by the U.S. company Fujirebio and called the Lumipulse test, it measures the ratio of two proteins in the blood — pTau217 and β-Amyloid 1-42. The FDA later approved another blood-based test for Alzheimer's disease — called Elecsys — developed by Roche. Unlike the Lumipulse test, it is designed to rule out the disease, not confirm it.
These tests have been built on years of academic research worldwide, most notably at Johns Hopkins University. The new blood tests are expected to become widely available over the next few years and can diagnose the disease well before symptoms manifest. As new drugs have also been developed in the last five years, Alzheimer's disease has become a treatable condition.
CHEMISTRY/MATERIALS SCIENCE
Photosynthesis, which originated around 3.5 billion years ago, is the only way of storing the Sun's energy on the Earth on a large scale. Scientists have for a long time been trying to emulate the natural process and fix carbon using only sunlight. Successfully mimicking this process offers many benefits: the ever-increasing carbon dioxide in the atmosphere could be converted into carbohydrates and other useful substances using freely available solar energy, and artificial photosynthesis could also be harnessed to produce green hydrogen. The year 2025 saw key advances towards this goal.
An emerging field of skeletal editing aims to provide chemists with useful synthetic shortcuts and the ability to fine-tune molecular properties.
Photosynthesis is a highly complex, multi-step process that involves numerous pigments, proteins, and other interacting molecules within the plant. Chlorophyll captures energy from the Sun, which is used to split water and release oxygen, and in the process produces energy-carrying molecules. Carbon dioxide is then captured from the air and fixed to another molecule, and the third step produces sugars from this carbon dioxide, using hydrogen from the split water. It is not easy to mimic this process developed over billions of years. However, scientists at Julius-Maximilians-Universität (JMU) Würzburg in Germany and Dongho Kim of Yonsei University in South Korea have synthesised a stack of dyes that comes very close to replicating the photosynthetic apparatus of plants. It absorbs light energy, uses it to separate charge carriers, and then quickly and efficiently transfers them through the stack.
In another equally significant development, researchers at the University of Basel in Switzerland have synthesised a novel molecule that can simultaneously store two positive and two negative charges when exposed to sunlight. The ability to temporarily store multiple charges is a key requirement for converting sunlight into chemical energy. In a third development, a multi-institutional collaboration supported by the Lawrence Berkeley National Laboratory of the U.S. Department of Energy also developed an artificial leaf that used solar energy to convert carbon dioxide into valuable chemicals by recreating natural processes.
SKELETAL EDITING
For a long time, chemists have been tinkering with the surface of molecules. They oxidise alcohols, change acids into esters, and hydrogenate double bonds into single bonds. These processes form the backbone of today's chemical and pharmaceutical industries. This is changing with a new form of chemistry called skeletal editing, in which chemists edit the backbone of a molecule itself, removing or adding atoms to create new scaffolds. This ability has enormous potential for making the molecules of the future.
In January 2025, a team of researchers at The University of Oklahoma developed a novel method for inserting a single nitrogen atom into molecules, thereby vastly improving the biological and pharmacological properties of the resulting drug compounds. Nitrogen atoms and nitrogen-containing chemical structures, called heterocycles, play a pivotal role in medicinal chemistry. Nearly 85% of all drugs approved by the U.S. Food and Drug Administration (FDA) contain one or more nitrogen atoms. The work belongs to an emerging field of skeletal editing, which aims to provide chemists with useful synthetic shortcuts and the ability to fine-tune molecular properties with precision.
This skeletal editing process enables greater drug diversity: rather than developing new drugs from scratch, researchers can incorporate a single nitrogen atom to create a new set of drug candidates. Nitrogen is important here because DNA, RNA, proteins, and amino acids all contain nitrogen atoms. The work could have far-reaching implications for the treatment of diseases such as cancer and neurological disorders. Previous research in this field had demonstrated similar concepts, but required conventional nitrenes and excessive amounts of oxidising agent, which were not compatible with many drug molecules. This team, in contrast, used additive-free and metal-free methods compatible with the other functional groups within a molecule.
CLIMATE CHANGE
How do you assess the long-term impact of human activities on the Earth? This is not as simple as it seems, because the Earth is a complex network of physical, chemical and biological processes. The question is important because human beings thrive when the Earth remains a stable environment over long periods. It is important to know what keeps the Earth's climate stable over centuries and millennia, and yet such an assessment was not made holistically until a decade and a half ago. In 2009, scientists from the Stockholm Resilience Centre and the Australian National University, in collaboration with scientists from other institutions, created a framework to do it.
They identified nine planetary boundaries which, if crossed, can tip the Earth into dangerous environmental change: climate change, change in biosphere integrity, land system change, freshwater change, modification of biogeochemical flows, ocean acidification, increase in atmospheric aerosol loading, stratospheric ozone depletion, and introduction of novel entities. Of these, seven had defined safe limits in 2009, and scientists found that the Earth had already crossed three. By 2023, the Earth had crossed six. In 2025, it crossed the seventh: ocean acidification.
Research by MIT scientists now forms a framework for developing water-based systems with switches that can absorb and release carbon dioxide.
According to a study published in June 2025 in Global Change Biology, over 40% of the global surface ocean has crossed the ocean acidification boundary. Moreover, up to 60% of the subsurface ocean (down to 200 metres) has crossed the boundary. Under acidic conditions, the calcium carbonate that makes up the shells and skeletons of marine organisms, such as corals, crustaceans, molluscs and plankton, starts to dissolve. New shells cost more energy to build as carbonates become less available. Acidification also impairs the recovery of coral reefs that have bleached from marine heatwaves. All this has consequences for the marine food web as well as the coastal communities that rely on marine ecosystems.
CARBON CAPTURE
The main reason for global warming is the rapid increase in atmospheric carbon dioxide levels through the burning of fossil fuels. Carbon dioxide traps heat radiating from the Earth into space, making the atmosphere warmer. Reversing this change requires the removal of carbon dioxide from the atmosphere, which natural processes can do only over millennia. Planting more trees removes carbon dioxide, but there is a limit to how many trees the Earth can support. For a few decades, scientists have been trying to develop processes to remove carbon dioxide from the atmosphere, but existing processes are too expensive for practical use, mainly because fixing carbon dioxide is energy-intensive. Nature does it through photosynthesis, but natural processes are too slow for rapid impact. Carbon capture is thus one of the most actively researched areas in technology. In fact, most climate scientists believe that the Earth cannot return to a stable and tolerable climate unless a substantial amount of carbon dioxide is removed from the atmosphere every year.
Many scientists believe that the answer lies in new materials that can trap carbon dioxide with low energy consumption and then release it where required. In 2025, a noticeable advance was made by scientists at the Massachusetts Institute of Technology in the U.S. They used substances called 'photoswitchable' bases, which become stronger bases – substances that accept hydrogen ions and neutralise acid – when exposed to light. Their research now forms a framework for developing water-based systems with switches that can absorb and release carbon dioxide. The system is stable in the presence of oxygen and can work in sunlight to trap carbon dioxide. Since an air capture system itself should not release carbon dioxide into the air, photoswitchable bases provide a good starting point for future low-cost and scalable systems to capture carbon dioxide.