

Collision of neutron stars gives a new measure to one of the fundamental cosmic constants

It’s been almost two years since the collision of two neutron stars in a galaxy 130 million light years from Earth. What made this collision unique was that it improved our pre-existing knowledge of the universe and its features.

Neutron stars are the condensed, burnt-out remnants of humongous stars that have run out of fuel, blown up, and died. The collision has contributed to our understanding of the Hubble constant, but on the other hand, the results have also created more confusion. The Hubble constant is the rate at which the universe is expanding, and astronomers have more than three ways to calculate this crucial constant. The study has been published in the journal Nature Astronomy.

The first method uses data from the Planck satellite, which measures the cosmic microwave background radiation. This method gives a Hubble constant of 67.4 kilometres per second per megaparsec.

Apart from this, another method studies the nebulae left behind by Type Ia supernovae, a kind of exploding star. This approach gives a Hubble constant of 72.78 kilometres per second per megaparsec.

Adding to the dilemma, a third method uses standard candles such as Cepheid stars, whose well-understood luminosity enables accurate distance calculations. Most recently, a measurement based on the motions of 70 Cepheid variables returned a result of 74.03 kilometres per second per megaparsec. So, you see the dilemma: this third measurement implies a significantly faster expansion than the other two.

All of this disagreement between methods has forced the scientific community to question the validity of the methods, and even the models that underpin them.

This brings us back to the collision: from this merger of stars, a team of scientists has calculated yet another value for the Hubble constant. By recognizing minor changes in the location and shape of the merger's aftermath, they worked out the collision's orientation, which in turn gave them its precise distance. The result was a different value for the Hubble constant: 70.3 kilometres per second per megaparsec.
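All of these measurements rest on Hubble's law, v = H0 × d. As an illustrative sketch (the distance and H0 values are the ones quoted in this article; the light-year-to-megaparsec conversion is a standard figure), here is how the competing constants translate into a recession velocity for a galaxy 130 million light years away:

```python
# Hubble's law: recession velocity v = H0 * d.
# Illustrative sketch using the distance quoted in the article
# (130 million light years) and the competing H0 measurements.

LY_PER_MPC = 3.262e6  # light years in one megaparsec

def recession_velocity(h0_km_s_mpc, distance_mly):
    """Recession velocity in km/s for a distance in millions of light years."""
    distance_mpc = distance_mly * 1e6 / LY_PER_MPC
    return h0_km_s_mpc * distance_mpc

d = 130  # million light years, the distance to the merger's host galaxy

for h0 in (67.4, 70.3, 72.78, 74.03):
    v = recession_velocity(h0, d)
    print(f"H0 = {h0:5.2f} km/s/Mpc -> v = {v:6.0f} km/s")
```

The spread of a few hundred km/s at a single distance is exactly why the disagreement between H0 values matters.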

In summary, it will surely take some time to reconcile the results of these different methods and consolidate them into a single value for the Hubble constant. Discoveries of this kind have contributed greatly to our understanding of the universe, so it is essential to seek precise conclusions and eliminate any sources of error.


Quasiparticles known as magnons could help in detecting dark matter

Nearly 80% of the matter present in the cosmos has a form that is totally unknown to present-day physics. It is commonly known as dark matter. Several experiments around the globe have attempted to capture a dark matter particle. Unfortunately, there have been no positive results.

A group of scientists has come up with a new way of searching for dark matter with the help of quasiparticles (not real particles, but something you can describe mathematically as if they were) known as magnons. The theorists claim that these tiny so-called particles could reveal even lightweight particles of dark matter.

Although dark matter cannot be detected directly, its presence can be inferred with the help of telescopes. The first evidence came in the 1930s through observations of galaxy clusters, which are among the largest structures in the universe. The galaxies in these clusters are held together by a gravitational bond whose strength depends on their mass: heavier galaxies mean stronger gravitational glue. However, the galaxies were observed to be moving much faster than the limit set by that gravitational glue. This indicates that something must be holding the clusters together and preventing them from ripping apart, while neither interacting with nor emitting light.
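The "gravitational glue" argument can be sketched numerically with a virial-style mass estimate, M ≈ σ²R/G: the observed galaxy speeds tell you how much mass the cluster must contain to stay bound. All input numbers below are illustrative, Coma-like assumptions, not figures from the study:

```python
# Rough "gravitational glue" check, in the spirit of the 1930s cluster
# argument. All numbers here are illustrative assumptions, not values
# from the study being reported.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
MPC = 3.086e22       # metres per megaparsec

def dynamical_mass(sigma_km_s, radius_mpc):
    """Virial-style mass estimate M ~ sigma^2 * R / G, in solar masses."""
    sigma = sigma_km_s * 1e3          # velocity spread, m/s
    radius = radius_mpc * MPC         # cluster radius, m
    return sigma**2 * radius / G / M_SUN

# Assumed numbers: galaxies moving ~1000 km/s in a ~3 Mpc cluster.
m_dyn = dynamical_mass(1000, 3)
m_visible = 2e13  # assumed luminous mass in solar masses (illustrative)

print(f"mass needed to hold the cluster together: {m_dyn:.1e} solar masses")
print(f"ratio to assumed visible mass: {m_dyn / m_visible:.0f}x")
```

The dynamical mass comes out far larger than any plausible luminous mass, which is the mismatch that first pointed to dark matter.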

Dark matter is present in every galaxy. The visible portion of a galaxy, comprising stars, gas clouds, and dust, is just a tiny fraction compared to the vast numbers of dark matter particles. Particles of dark matter are present all around us. However, they go unnoticed because they do not interact with light or charged particles. The only way to observe dark matter is through the force of gravity: every form of energy and matter in the universe, dark or not, is influenced by gravitational forces.

It is also possible that some dark matter particles interact via the weak nuclear force, the one responsible for radioactive decay. In a very large detector, a sufficiently heavy dark matter particle could knock an atomic nucleus via the weak nuclear force, producing a measurable recoil. However, the lack of results is worrying, as the heaviest of the candidates have been ruled out. If these mysterious particles are too light, then present-day setups cannot detect them.

Therefore, scientists have come up with a setup in which the material is kept near absolute zero. Here, the electron spins all point in the same direction. However, if enough dark matter particles strike the material, they would flip some of the spins. Each flipped spin causes a little ripple in the energy of the material, and those wiggles can be viewed as a quasiparticle called a magnon. Such a setup could thus detect a lightweight dark matter particle, if one exists.

Breaching a “carbon threshold” could lead to mass extinction


In the brain, when neurons fire off electrical signals to their neighbours, this happens through an “all-or-none” response. The signal only happens once conditions in the cell breach a certain threshold.

Now an MIT researcher has observed a similar phenomenon in a completely different system: Earth’s carbon cycle.

Daniel Rothman, professor of geophysics and co-director of the Lorenz Center in MIT’s Department of Earth, Atmospheric and Planetary Sciences, has found that when the rate at which carbon dioxide enters the oceans pushes past a certain threshold — whether as the result of a sudden burst or a slow, steady influx — the Earth may respond with a runaway cascade of chemical feedbacks, leading to extreme ocean acidification that dramatically amplifies the effects of the original trigger.

This global reflex causes huge changes in the amount of carbon contained in the Earth’s oceans, and geologists can see evidence of these changes in layers of sediments preserved over hundreds of millions of years.

Rothman looked through these geologic records and observed that over the last 540 million years, the ocean’s store of carbon changed abruptly, then recovered, dozens of times in a fashion similar to the abrupt nature of a neuron spike. This “excitation” of the carbon cycle occurred most dramatically near the time of four of the five great mass extinctions in Earth’s history.

Scientists have attributed various triggers to these events, and they have assumed that the changes in ocean carbon that followed were proportional to the initial trigger — for instance, the smaller the trigger, the smaller the environmental fallout.

But Rothman says that’s not the case. It didn’t matter what initially caused the events; for roughly half the disruptions in his database, once they were set in motion, the rate at which carbon increased was essentially the same.  Their characteristic rate is likely a property of the carbon cycle itself — not the triggers, because different triggers would operate at different rates.

What does this all have to do with our modern-day climate? Today’s oceans are absorbing carbon about an order of magnitude faster than the worst case in the geologic record — the end-Permian extinction. But humans have only been pumping carbon dioxide into the atmosphere for hundreds of years, versus the tens of thousands of years or more that it took for volcanic eruptions or other disturbances to trigger the great environmental disruptions of the past. Might the modern increase of carbon be too brief to excite a major disruption?

According to Rothman, today we are “at the precipice of excitation,” and if it occurs, the resulting spike — as evidenced through ocean acidification, species die-offs, and more — is likely to be similar to past global catastrophes.

“Once we’re over the threshold, how we got there may not matter,” says Rothman, who is publishing his results this week in the Proceedings of the National Academy of Sciences. “Once you get over it, you’re dealing with how the Earth works, and it goes on its own ride.”

A carbon feedback

In 2017, Rothman made a dire prediction: By the end of this century, the planet is likely to reach a critical threshold, based on the rapid rate at which humans are adding carbon dioxide to the atmosphere. When we cross that threshold, we are likely to set in motion a freight train of consequences, potentially culminating in the Earth’s sixth mass extinction.

Rothman has since sought to better understand this prediction, and more generally, the way in which the carbon cycle responds once it’s pushed past a critical threshold. In the new paper, he has developed a simple mathematical model to represent the carbon cycle in the Earth’s upper ocean and how it might behave when this threshold is crossed.

Scientists know that when carbon dioxide from the atmosphere dissolves in seawater, it not only makes the oceans more acidic, but it also decreases the concentration of carbonate ions. When the carbonate ion concentration falls below a threshold, shells made of calcium carbonate dissolve. Organisms that make them fare poorly in such harsh conditions.

Shells, in addition to protecting marine life, provide a “ballast effect,” weighing organisms down and enabling them to sink to the ocean floor along with detrital organic carbon, effectively removing carbon dioxide from the upper ocean. But in a world of increasing carbon dioxide, fewer calcifying organisms should mean less carbon dioxide is removed.

“It’s a positive feedback,” Rothman says. “More carbon dioxide leads to more carbon dioxide. The question from a mathematical point of view is, is such a feedback enough to render the system unstable?”

An inexorable rise

Rothman captured this positive feedback in his new model, which comprises two differential equations that describe interactions between the various chemical constituents in the upper ocean. He then observed how the model responded as he pumped additional carbon dioxide into the system, at different rates and amounts.

He found that no matter the rate at which he added carbon dioxide to an already stable system, the carbon cycle in the upper ocean remained stable. In response to modest perturbations, the carbon cycle would go temporarily out of whack and experience a brief period of mild ocean acidification, but it would always return to its original state rather than oscillating into a new equilibrium.

When he introduced carbon dioxide at greater rates, he found that once the levels crossed a critical threshold, the carbon cycle reacted with a cascade of positive feedbacks that magnified the original trigger, causing the entire system to spike, in the form of severe ocean acidification. The system did, eventually, return to equilibrium, after tens of thousands of years in today’s oceans — an indication that, despite a violent reaction, the carbon cycle will resume its steady state.
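The threshold behaviour described here, where small pushes decay quietly while large pushes trigger a full excursion before the system relaxes back, can be illustrated with a classic excitable system, the FitzHugh–Nagumo model. This is a generic sketch of excitability, not Rothman's actual two-equation carbon model:

```python
# Generic excitable system (FitzHugh-Nagumo), used here only to
# illustrate threshold behaviour -- it is NOT Rothman's actual
# carbon-cycle model.

def simulate(kick, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=20000):
    """Euler-integrate from rest after an instantaneous perturbation `kick`;
    return the largest value the fast variable reaches."""
    v, w = -1.1994, -0.6243   # resting equilibrium for these parameters
    v += kick                 # the external trigger
    v_max = v
    for _ in range(steps):
        v += dt * (v - v**3 / 3 - w)
        w += dt * eps * (v + a - b * w)
        v_max = max(v_max, v)
    return v_max

small = simulate(kick=0.2)   # sub-threshold: relaxes back quietly
large = simulate(kick=1.0)   # super-threshold: full excitation spike
print(f"peak after small trigger: {small:.2f}")
print(f"peak after large trigger: {large:.2f}")
```

The small kick barely moves the system; the large one produces a spike whose size is set by the system's own dynamics rather than by the trigger, which is the signature of excitability the article describes.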

This pattern matches the geological record, Rothman found. The characteristic rate exhibited by half his database results from excitations above, but near, the threshold. Environmental disruptions associated with mass extinction are outliers — they represent excitations well beyond the threshold. At least three of those cases may be related to sustained massive volcanism.

“When you go past a threshold, you get a free kick from the system responding by itself,” Rothman explains. “The system is on an inexorable rise. This is what excitability is, and how a neuron works too.”

Although carbon is entering the oceans today at an unprecedented rate, it is doing so over a geologically brief time. Rothman’s model predicts that the two effects cancel: Faster rates bring us closer to the threshold, but shorter durations move us away. Insofar as the threshold is concerned, the modern world is in roughly the same place it was during longer periods of massive volcanism.

In other words, if today’s human-induced emissions cross the threshold and continue beyond it, as Rothman predicts they soon will, the consequences may be just as severe as what the Earth experienced during its previous mass extinctions.

“It’s difficult to know how things will end up given what’s happening today,” Rothman says. “But we’re probably close to a critical threshold. Any spike would reach its maximum after about 10,000 years. Hopefully, that would give us time to find a solution.”

“We already know that our CO2-emitting actions will have consequences for many millennia,” says Timothy Lenton, professor of climate change and earth systems science at the University of Exeter. “This study suggests those consequences could be much more dramatic than previously expected. If we push the Earth system too far, then it takes over and determines its own response — past that point there will be little we can do about it.”

Materials provided by Massachusetts Institute of Technology


Instability in Antarctic Ice Projected to Make Sea Level Rise Rapidly


Images of vanishing Arctic ice and mountain glaciers are jarring, but their potential contributions to sea level rise are no match for Antarctica’s, even if receding southern ice is less eye-catching. Now, a study says that instability hidden within Antarctic ice is likely to accelerate its flow into the ocean and push sea level up at a more rapid pace than previously expected.

In the last six years, five closely observed Antarctic glaciers have doubled their rate of ice loss, according to the National Science Foundation. At least one, Thwaites Glacier, modeled for the new study, will likely succumb to this instability, a volatile process that pushes ice into the ocean fast.

How much ice the glacier will shed in the coming 50 to 800 years cannot be projected exactly, due to unpredictable fluctuations in climate and the need for more data. But researchers at the Georgia Institute of Technology, NASA Jet Propulsion Laboratory, and the University of Washington have factored the instability into 500 ice flow simulations for Thwaites with refined calculations.

The scenarios diverged strongly from each other but together pointed to the eventual triggering of the instability, which will be described in the question and answer section below. Even if global warming were to later stop, the instability would keep pushing ice out to sea at an enormously accelerated rate over the coming centuries.

And this is if ice melt due to warming oceans does not get worse than it is today. The study went with present-day ice melt rates because the researchers were interested in the instability factor in itself.

Glacier tipping point

“If you trigger this instability, you don’t need to continue to force the ice sheet by cranking up temperatures. It will keep going by itself, and that’s the worry,” said Alex Robel, who led the study and is an assistant professor in Georgia Tech’s School of Earth and Atmospheric Sciences. “Climate variations will still be important after that tipping point because they will determine how fast the ice will move.”

“After reaching the tipping point, Thwaites Glacier could lose all of its ice in a period of 150 years. That would make for a sea level rise of about half a meter (1.64 feet),” said NASA JPL scientist Helene Seroussi, who collaborated on the study. For comparison, current sea level is 20 cm (nearly 8 inches) above pre-global warming levels and is blamed for increased coastal flooding.

The researchers published their study in the journal the Proceedings of the National Academy of Sciences on Monday, July 8, 2019. The research was funded by the National Science Foundation and NASA.

The study also showed that the instability makes forecasting more uncertain, leading to the broad spread of scenarios. This is particularly relevant to the challenge of engineering against flood dangers.

“You want to engineer critical infrastructure to be resistant against the upper bound of potential sea level scenarios a hundred years from now,” Robel said. “It can mean building your water treatment plants and nuclear reactors for the absolute worst-case scenario, which could be two or three feet of sea level rise from Thwaites Glacier alone, so it’s a huge difference.”


Why is the Antarctic ice the big driver of sea level rise?

Understanding the instability is easier with this background information.

Arctic sea ice is already floating in water. Readers will likely remember that about 90% of an iceberg’s mass is underwater; when floating ice melts, the meltwater simply fills the volume the submerged ice displaced, resulting in no change in sea level.
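The famous "90% underwater" figure follows directly from Archimedes' principle: a floating iceberg displaces its own weight of water, so the submerged fraction equals the ratio of ice density to seawater density. The densities below are typical textbook values used as illustrative assumptions:

```python
# Archimedes' principle: a floating iceberg displaces its own weight of
# water, so the submerged fraction equals the density ratio ice/seawater.
# Typical densities are used here as illustrative assumptions.

RHO_ICE = 917.0        # kg/m^3, glacial ice
RHO_SEAWATER = 1025.0  # kg/m^3, typical surface seawater

def submerged_fraction(rho_ice=RHO_ICE, rho_water=RHO_SEAWATER):
    """Fraction of a floating iceberg's volume below the waterline."""
    return rho_ice / rho_water

f = submerged_fraction()
print(f"submerged fraction: {f:.1%}")  # roughly the famous 90%
```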

But when ice masses long supported by land, like mountain glaciers, melt, the water that ends up in the ocean adds to sea level. Antarctica holds the most land-supported ice, even if the bulk of that land is seabed holding up just part of the ice’s mass, while water holds up part of it. Also, Antarctica is an ice leviathan.

“There’s almost eight times as much ice in the Antarctic ice sheet as there is in the Greenland ice sheet and 50 times as much as in all the mountain glaciers in the world,” Robel said.

What is that ‘instability’ underneath the ice?

The line between where the ice sheet rests on the seafloor and where it extends over water is called the grounding line. In spots where the bedrock underneath the ice behind the grounding line slopes down, deepening as it moves inland, the instability can kick in.

That works like this: on deeper beds, ice moves faster because the water gives it a little more lift to begin with. Warmer ocean water can then hollow out the bottom of the ice, adding water to the ocean. But, more importantly, the ice above the hollow loses contact with the land and flows faster out to sea.

“Once ice is past the grounding line and only over water, it’s contributing to sea level because buoyancy is holding it up more than it was before,” Robel said. “Ice flows out into the floating ice shelf and melts or breaks off as icebergs.”

“The process becomes self-perpetuating,” Seroussi said, describing why it is called “instability.”

How did the researchers integrate instability into sea level forecasting?

The researchers borrowed math from statistical physics that calculates what haphazard influences do to predictability in a physical system, like ice flow, acted upon by outside forces, like temperature changes. They applied the math to data-packed simulations of possible future fates of Thwaites Glacier, located in West Antarctica, where ice loss is greatest.

They made a surprising additional discovery. Normally, when climate conditions fluctuate strongly, Antarctic ice evens out the effects: ice flow may increase, but gradually, not wildly. The instability, however, produced the opposite effect in the simulations.

“The system didn’t damp out the fluctuations, it actually amplified them. It increased the chances of rapid ice loss,” Robel said.

How rapid is ‘rapid’ sea level rise and when will we feel it?

The study’s time scale was centuries, as is common for sea level studies. In the simulations, Thwaites Glacier’s colossal ice loss kicked in after 600 years, but it could come sooner if oceans warm and as the instability reveals more of its secrets.

“It could happen in the next 200 to 600 years. It depends on the bedrock topography under the ice, and we don’t know it in great detail yet,” Seroussi said.

So far, Antarctica and Greenland have lost a small fraction of their ice, but already, shoreline infrastructures face challenges from increased tidal flooding and storm surges. Sea level is expected to rise by up to two feet by the end of this century.

For about 2,000 years until the late 1800s, sea level held steady, then it began climbing, according to the Smithsonian Institution. The annual rate of sea level rise has roughly doubled since 1990.

Materials provided by Georgia Institute of Technology


Researchers have found new clues about methane on Mars

For 15 years, the search for life on Mars has intensified, and during this period methane (an organic molecule linked with life on Earth) was observed there. Since then, attempts to study Mars’ atmospheric methane have produced varying results: in some cases it was found at normal concentrations, in others at higher concentrations, and in still others it was absent.

A recent study by a team from Aarhus University looked for possible ways that methane gas could be removed from the atmosphere of Mars. Methane levels on Mars vary from about 0.24 parts per billion (ppb) in winter to about 0.65 ppb during summer in the northern hemisphere.

Extended plumes were detected by Curiosity, one in December 2014 and another in June 2019. This indicates that methane is released periodically from discrete regions. Over the years, different mechanisms have been proposed for both the production and the disappearance of methane. On the production side, these range from non-biological processes such as serpentinization to biological production by microbes.

How it is removed has remained an even greater mystery.

The most prominent known mechanism for the disappearance of methane is photochemical degradation: the breakdown of methane into carbon dioxide, formaldehyde, and methanol by UV radiation from the sun. However, this process is far too slow to account for the observed rate of methane disappearance, which is a crucial part of the puzzle.
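To put a rough number on the photochemical pathway: lifetimes quoted for Martian methane against UV breakdown are on the order of centuries (the 300-year figure below is an order-of-magnitude assumption for illustration, not a value from the study). With first-order decay, N(t) = N0·exp(−t/τ), such a lifetime barely dents the methane over a single Mars year:

```python
# Why photochemistry alone struggles: with an assumed photochemical
# lifetime of ~300 years (an order-of-magnitude figure, not from the
# study), methane decays as N(t) = N0 * exp(-t / tau) -- far too slowly
# to produce the observed seasonal ppb swings.
import math

TAU_YEARS = 300.0           # assumed photochemical lifetime
MARS_YEAR = 687 / 365.25    # one Mars year in Earth years (~1.88)

def fraction_remaining(t_years, tau=TAU_YEARS):
    """Fraction of methane left after t_years of photochemical loss only."""
    return math.exp(-t_years / tau)

f = fraction_remaining(MARS_YEAR)
print(f"after one Mars year, {f:.2%} of the methane would remain")
```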

In the scientific journal Icarus, the research team from Aarhus University’s laboratory proposed that wind-driven erosion (saltation) could be responsible for the ionization of methane into fragments such as methyl (CH3), methylene (CH2), and carbyne (CH). Using Mars-analog minerals such as basalt and plagioclase, the team found that methane ionized during the erosion process reacts with and bonds to the mineral surfaces. In plagioclase, a major component of Mars’ surface material, silicon atoms bond with methyl groups derived from methane.

Based on these results, the team concluded that this mechanism is much more effective than the photochemical process and could explain how methane is removed from the Martian atmosphere and deposited within its soil over the monitored time period. These findings have intriguing implications for the existence of life on Mars.

Further, the team proposes follow-up studies to investigate whether this bound methane (complex organic material) originated on Mars or was deposited by meteorites. In particular, they want to see whether the same erosion process is responsible for changing or removing atmospheric methane.

The conclusions drawn from these investigations will guide future Mars missions like ESA’s ExoMars 2020 rover and NASA’s Mars 2020 rover, and will hopefully clear up important questions regarding the existence of life on Mars and the preservation of organic materials there.


Chasing SpaceX, Amazon plans to launch 3000 internet satellites

  1. Amazon asked the United States for permission to launch 3236 communication satellites under its ambitious Kuiper satellite project
  2. Amazon mentioned that the satellites would be operating at altitudes of nearly 370-390 miles.
  3. It is a separate Amazon project and doesn’t involve Blue Origin LLC, an aerospace manufacturer company also owned by Bezos
  4. Kuiper System will be helping several mobile network operators in expanding wireless services

Amazon has asked the United States for permission to launch 3236 communication satellites under its Kuiper project. With this, it joins Elon Musk’s SpaceX in a space race to offer internet connectivity from low orbits. In a July 4 filing, Amazon informed the Federal Communications Commission that its Kuiper satellites would deliver broadband to several million consumers and businesses that lack adequate internet access. The FCC’s task is to coordinate the satellites’ trajectories and their use of radio frequencies.

The FCC has previously approved almost 13,000 low-Earth orbiting satellites. Of these, a huge chunk, precisely 11,943, belongs to Elon Musk’s Space Exploration Technologies (SpaceX), which made an initial launch of 60 spacecraft in May.

In low-Earth orbit, at altitudes of 112 to 1,200 miles (180 to 2,000 kilometers), satellites have to keep moving around the Earth to stay aloft, completing an orbit in as little as 90 minutes. When one satellite sinks towards the horizon, it hands off signal duties to the next satellite coming into view. Continuous, wide coverage therefore requires many satellites. In its FCC application, Amazon said the satellites would operate at altitudes of roughly 370 to 390 miles.
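The quoted orbit times follow from Kepler's third law for circular orbits, T = 2π·sqrt(a³/μ), where a is Earth's radius plus the altitude. A small sketch (the altitudes are the article's figures, plus the Kuiper range; the constants are standard values):

```python
# Orbital period of a circular low-Earth orbit from Kepler's third law:
# T = 2*pi*sqrt(a^3 / mu), with a = Earth radius + altitude.
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6371e3           # mean Earth radius, m
MILE = 1609.344            # metres per statute mile

def orbital_period_minutes(altitude_miles):
    """Period of a circular orbit at the given altitude, in minutes."""
    a = R_EARTH + altitude_miles * MILE
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

for alt in (112, 380, 1200):  # miles: the quoted range, plus Kuiper's ~380
    print(f"{alt:5d} mi -> {orbital_period_minutes(alt):6.1f} min")
```

At the bottom of the range the period is indeed close to 90 minutes, and at Kuiper's planned altitude each satellite circles the Earth in well under two hours.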


Jeff Bezos, Amazon’s founder and CEO, said last month that the Kuiper project will cost several billion dollars. It is a separate project and does not involve Blue Origin LLC, an aerospace manufacturer also owned by Bezos. Amazon said in its statement that it is a long-term project that plans to serve millions of people who lack basic access to internet connectivity.

Amazon filed its satellite program with the International Telecommunication Union in April. It plans to offer fixed broadband service to communities in the United States, including hard-to-reach areas such as rural regions.

Kuiper System will be helping several mobile network operators in expanding wireless services. In addition to this, it also promises to offer high-throughput wireless broadband connectivity services for airplanes, land vehicles and maritime vessels.

Amazon cited FCC studies finding that close to 21 million Americans lack fixed or residential broadband service and almost 33 million Americans do not have access to high-speed internet connectivity. Worldwide, according to the application, the figure balloons to 3.8 billion people without reliable broadband connectivity.


Scientists find gravity as key to optimal quantum computation

Scientists have been persistently trying to achieve significant success in the field of quantum computing, but the consensus is that there is still huge scope for improvement.

Recently, however, those arduous efforts have given scientists something to cheer about: a claim that gravity, a natural phenomenon that has been studied extensively, can provide a pathway to in-depth knowledge of quantum computing. The scientific community has hailed these findings because of the belief that quantum computing will drastically change the power and scale of computation. The study has been published in Physical Review Letters.

The link between gravity and quantum computing lies in the geometric rules that general relativity uses to find the shortest distance between two points on a curved surface. Those same geometric rules can be used to find the most efficient ways to process information in quantum computing.

These paths of shortest travel, whether across a spherical planet or inside a quantum computing system, are known as geodesics. Notably, the discovery involves conformal field theory, and the new study raises the possibility of faster calculations in that setting.

Physicist Pawel Caputa, who was involved in the discovery, explains that finding minimal lengths on complexity geometry is equivalent to solving the equations of gravity. This makes transparent how gravity is linked to quantum computing.

On the other side, error rates still need to be reduced, and scientists are also looking to suppress the interference that hinders computation. Quantum computing is thought of as the future of computing because it operates on qubits, a different form of information. The striking feature of a qubit is its ability to exist in a superposition of states, in contrast to binary digits, which take only two values (0 and 1).
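The superposition idea can be sketched in a few lines: a qubit's state is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The equal-superposition state below is a textbook illustration, not anything from the study:

```python
# Minimal sketch of the qubit idea: a qubit's state is a pair of complex
# amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1, giving the
# probabilities of measuring 0 or 1. A classical bit is always (1,0) or (0,1).
import math

def measure_probs(alpha, beta):
    """Measurement probabilities for the state alpha|0> + beta|1>."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0, abs_tol=1e-9), "state must be normalised"
    return p0, p1

# An equal superposition (the state a Hadamard gate makes from |0>):
amp = 1 / math.sqrt(2)
p0, p1 = measure_probs(amp, amp)
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```

A classical bit only ever yields one outcome with certainty; the superposed qubit yields either outcome with equal probability, which is the extra expressive power the article alludes to.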

Alongside this discovery, recent progress has made quantum computing more space-efficient, and significant improvements in accuracy have taken place.

All of this promises a bright future for the field, but the finding is still limited, and much deeper research is required to uncover its multidisciplinary applications.


Giant heat dome over Alaska about to break all-time temperature records

Alaska has been setting all-time heat records in recent days as a massive and abnormally intense area of high pressure locks in over the region and strengthens. It is expected to set records for the highest temperatures ever recorded over several days, mainly in southern Alaska.

Anchorage’s all-time record high of 85 degrees Fahrenheit, set in 1969, could be matched or beaten on as many as 5 consecutive days, and the temperature could even touch 90 degrees Fahrenheit. The National Weather Service has predicted temperatures in southern Alaska in the 80s Fahrenheit, and even the low 90s. Overnight lows during this hot stretch may only fall to around 60 degrees Fahrenheit, near the average daytime temperature for this time of year. This is climatologically the warmest time of year for Alaska, and the hot spell is expected to last close to 7 days.

The heat wave is the latest in a nonstop barrage of warm weather for Alaska. It comes at the end of a June that was already above average and filled with calamities like wildfires, which continue even as July arrives. The spring and winter before that were also unusually warm. It also follows the heat wave that hit Europe and shattered record high temperatures there as well.

Temperatures in Alaska have shifted abruptly in the past few years, and a similar change is under way across the Arctic region due to climate change and global warming. The sea ice surrounding the state has fallen to record-low levels. Open water and the absence of ice have raised ocean temperatures to about 2.5 degrees Celsius above normal. The combination of the high-pressure heat dome, unusually warm coastal waters, and maximum energy from the sun will all act as contributing factors, maximizing the potential for historically high temperatures.

A climatologist from Alaska has tweeted that Anchorage, Kotzebue, Talkeetna, and Yakutat posted their warmest Junes on record, while Nome, King Salmon, and McGrath logged their second warmest. The high temperatures have pushed up monthly averages, and the mercury hit 92 degrees Fahrenheit at Northway, near the eastern border with Canada. In southeast Alaska, drought has persisted for close to a year now, with Juneau recording its third warmest day on record and completing its warmest five-day stretch since 1936. The heat blast is expected to ease by next week, and forecasts predict near-normal temperatures in July and August.

A new way of making complex structures in thin films

Self-assembling materials called block copolymers, which are known to form a variety of predictable, regular patterns, can now be made into much more complex patterns that may open up new areas of materials design, a team of MIT researchers says.

The new findings appear in the journal Nature Communications, in a paper by postdoc Yi Ding, professors of materials science and engineering Alfredo Alexander-Katz and Caroline Ross, and three others.

“This is a discovery that was in some sense fortuitous,” says Alexander-Katz. “Everyone thought this was not possible,” he says, describing the team’s discovery of a phenomenon that allows the polymers to self-assemble in patterns that deviate from regular symmetrical arrays.

Self-assembling block copolymers are materials whose chain-like molecules, which are initially disordered, will spontaneously arrange themselves into periodic structures. Researchers had found that if there was a repeating pattern of lines or pillars created on a substrate, and then a thin film of the block copolymer was formed on that surface, the patterns from the substrate would be duplicated in the self-assembled material. But this method could only produce simple patterns such as grids of dots or lines.

In the new method, there are two different, mismatched patterns. One is from a set of posts or lines etched on a substrate material, and the other is an inherent pattern that is created by the self-assembling copolymer. For example, there may be a rectangular pattern on the substrate and a hexagonal grid that the copolymer forms by itself. One would expect the resulting block copolymer arrangement to be poorly ordered, but that’s not what the team found. Instead, “it was forming something much more unexpected and complicated,” Ross says.

There turned out to be a subtle but complex kind of order — interlocking areas that formed slightly different but regular patterns, of a type similar to quasicrystals, which don’t quite repeat the way normal crystals do. In this case, the patterns do repeat, but over longer distances than in ordinary crystals. “We’re taking advantage of molecular processes to create these patterns on the surface” with the block copolymer material, Ross says.

This potentially opens the door to new ways of making devices with tailored characteristics for optical systems or for “plasmonic devices” in which electromagnetic radiation resonates with electrons in precisely tuned ways, the researchers say. Such devices require very exact positioning and symmetry of patterns with nanoscale dimensions, something this new method can achieve.

Katherine Mizrahi Rodriguez, who worked on the project as an undergraduate, explains that the team prepared many of these block copolymer samples and studied them under a scanning electron microscope. Yi Ding, who worked on this for his doctoral thesis, “started looking over and over to see if any interesting patterns came up,” she says. “That’s when all of these new findings sort of evolved.”

The resulting odd patterns are “a result of the frustration between the pattern the polymer would like to form, and the template,” explains Alexander-Katz. That frustration leads to a breaking of the original symmetries and the creation of new subregions with different kinds of symmetries within them, he says. “That’s the solution nature comes up with. Trying to fit in the relationship between these two patterns, it comes up with a third thing that breaks the patterns of both of them.” They describe the new patterns as a “superlattice.”

Having created these novel structures, the team went on to develop models to explain the process. Co-author Karim Gadelrab PhD ’19, says, “The modeling work showed that the emergent patterns are in fact thermodynamically stable, and revealed the conditions under which the new patterns would form.”

Ding says “We understand the system fully in terms of the thermodynamics,” and the self-assembling process “allows us to create fine patterns and to access some new symmetries that are otherwise hard to fabricate.”

He says this removes some existing limitations in the design of optical and plasmonic materials, and thus “creates a new path” for materials design.

So far, the work the team has done has been confined to two-dimensional surfaces, but in ongoing work they are hoping to extend the process into the third dimension, says Ross. “Three dimensional fabrication would be a game changer,” she says. Current fabrication techniques for microdevices build them up one layer at a time, she says, but “if you can build up entire objects in 3-D in one go,” that would potentially make the process much more efficient.

These findings “open new pathways to generate templates for nanofabrication with symmetries not achievable from the copolymer alone,” says Thomas P. Russell, the Silvio O. Conte Distinguished Professor of Polymer Science and Engineering at the University of Massachusetts, Amherst, who was not involved in this work. He adds that it “opens the possibility of exploring a large parameter space for uncovering other symmetries than those discussed in the manuscript.”

Russell says “The work is of the highest quality,” and adds “The pairing of theory and experiment is quite powerful and, as can be seen in the text, the agreement between the two is remarkably good.”

Materials provided by Massachusetts Institute of Technology

Scientists turn damaging carbon dioxide into pellets to restore soils and increase crop yields

Carbon dioxide (CO2) captured from the atmosphere could be used to restore degraded soils, save water and boost crop yields, according to new research.

Scientists at the University of Sheffield’s Institute for Sustainable Food, in collaboration with industry partner CCm Technologies Ltd, have developed pellets made from a mixture of captured CO2 and waste straw or anaerobic digestate from slurry, which can be used like a normal fertilizer to improve the health and water retention of soils.

The production of each tonne of these pellets generates up to 6.5 tonnes less CO2 than a conventional fossil fuel-based fertilizer – and could therefore dramatically reduce the carbon footprint of foods like bread.
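The reported saving scales linearly with the quantity of pellets produced. As a rough, illustrative sketch (the 6.5-tonne figure is the reported upper bound; the 10-tonne farm output below is a made-up example, not from the study):

```python
# Upper-bound CO2 saving reported for pellet production, versus a
# conventional fossil fuel-based fertilizer (tonnes CO2 per tonne of pellets).
CO2_SAVING_PER_TONNE = 6.5

def co2_saved(pellet_tonnes: float) -> float:
    """Return the maximum tonnes of CO2 avoided for a given pellet output."""
    return pellet_tonnes * CO2_SAVING_PER_TONNE

# A hypothetical farm producing 10 tonnes of pellets could avoid
# up to 65 tonnes of CO2 compared with conventional fertilizer.
print(co2_saved(10.0))  # 65.0
```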

A new study published in the Journal of CO2 Utilization found the pellets improved soil water retention by up to 62 percent with immediate and prolonged effect, potentially helping crops to survive drought conditions for longer. They also resulted in a 38 percent increase in crop yields – demonstrating the pellets’ potential to grow more food using fewer resources.

There was a 20 percent increase in microbial growth in soil treated with the pellets, which is crucial for soil fertility and soil functions like decomposition and nutrient cycling. The pellets also increased the pH of the soil, making it less acidic, which could help restore degraded or even contaminated soils – and potentially increase their ability to act as a carbon sink.

Dr. Janice Lake from the Institute for Sustainable Food, an Independent Research Fellow at the University of Sheffield’s Department of Animal and Plant Sciences, is the lead author of the study. She said: “Faced with a climate emergency and a growing population, we urgently need innovative solutions to feed the world.

“As well as reducing greenhouse gas emissions, we need to capture carbon dioxide from the atmosphere to limit temperature rises. These new pellets could turn damaging CO2 into something positive – helping communities to cope with increasingly extreme droughts by allowing farmers to grow more food while using less water.

“These initial results are really exciting, and we hope to be able to prove this new product’s potential with field tests in the near future.”

Dr. Lake collaborated with Pawel Kisielewski, Dr. Fabricio Marques and Professor Peter Hammond from CCm Technologies. Professor Hammond is also a visiting Professor in Chemical and Biological Engineering at the University of Sheffield.

Materials provided by the University of Sheffield