
Comets in the solar system might share the same place of origin

According to new research, all comets might come from the same place. Astronomer Christian Eistrup of Leiden University applied chemical models to fourteen well-known comets and found a surprisingly clear pattern.

Comets travel through the solar system and are made of dust, ice, and small rocks. Their nuclei can span several kilometers. Some have unusual orbits around the Sun, and some have hit Earth in the past. Eistrup noted that the composition of comets is well known: they are usually regarded as icy balls. He wanted to find out whether they actually form one group or are divided into several subsets.

Eistrup’s research team also included Ewine van Dishoeck, a Kavli Prize winner who developed models for predicting the chemical composition of protoplanetary discs, the flat discs of dust and gas that surround young stars and from which stars and planets form. These models were now applied to comets. Van Dishoeck and Eistrup used statistics to determine whether there was a particular place in the solar system where the models matched the comets’ data. All fourteen comets gave the same result: each could be described by a single model, pointing to a common origin.
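
As a loose illustration of that kind of statistical matching, the sketch below compares one comet’s measured ice abundances against model predictions computed at different disc temperatures and keeps the best fit. All numbers and molecule choices here are invented for illustration; the published chemical models and statistics are far more detailed.

```python
# Illustrative only: toy numbers, not the published models or comet data.
import numpy as np

# Hypothetical model-predicted ice abundances (relative to water ice)
# on a grid of midplane temperatures in the protoplanetary disc.
temperatures = np.array([20.0, 24.0, 28.0, 35.0, 50.0])      # kelvin
model_abundances = np.array([                                  # CO, CO2, CH3OH
    [0.30, 0.12, 0.04],
    [0.24, 0.15, 0.05],
    [0.18, 0.17, 0.06],
    [0.05, 0.20, 0.08],
    [0.01, 0.22, 0.10],
])

# Hypothetical abundances observed in one comet.
comet = np.array([0.22, 0.16, 0.05])

# Least-squares mismatch between the comet and each model snapshot;
# the smallest value marks the temperature (i.e. disc location) that
# best reproduces the comet's composition.
mismatch = ((model_abundances - comet) ** 2).sum(axis=1)
best = np.argmin(mismatch)
print(f"best-fitting midplane temperature: {temperatures[best]:.0f} K")
```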

That origin lies near our young Sun, at the time when it was still encircled by a protoplanetary disc and the planets were forming. The model points to a zone relatively far from the Sun where the temperature ranges from 21 to 28 kelvin, low enough for carbon monoxide (CO) to freeze into ice. Several reactions take place in the ice phase over a time frame of a hundred thousand to a million years, which explains why different comets have somewhat different compositions.

The comets’ orbits vary because some of them were later disturbed by planets such as Jupiter.

Eistrup wants to test the hypothesis on many more comets, since the current sample of fourteen is small. He hopes that astronomers studying the solar system and its origins can use his results to gain new insights, and he is keen to discuss the model with other comet researchers.

We still do not know how life started on our planet, but the chemistry of comets could be responsible for some of life’s building blocks. Life could start with the right comet hitting the right planet in a suitable environment. Understanding comets could therefore help us understand the origin of life on Earth.

Researchers have found the relationship between genetic differences and left-handedness

For the first time, scientists have identified specific regions of the genome that influence whether a person is left-handed. They have also found that these variants are linked to differences in brain structure.

It was already established that handedness is roughly 25 percent determined by the genetic code at birth. However, until now, researchers had not been able to locate the specific areas of the genome responsible.

The latest study drew on nearly 400,000 individual records from a national database of the United Kingdom. Four genetic regions were found to be associated with handedness, three of which are linked to proteins involved in brain development and structure. These proteins relate to the cytoskeleton, the scaffolding responsible for the construction and function of cells. The study appears in the journal Brain.
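
To make concrete what “associated with handedness” means statistically, the sketch below runs a simple chi-square test of variant carriers against handedness on invented counts. This is only a toy stand-in: the actual study used genome-wide association methods across hundreds of thousands of genotyped records, not a single contingency table.

```python
# Toy association test between one genetic variant and handedness.
# The counts below are invented for illustration; a real analysis
# scans millions of variants across ~400,000 records.
from scipy.stats import chi2_contingency

#                carriers  non-carriers
contingency = [[1200,      8800],     # left-handed
               [9000,      81000]]    # right-handed

chi2, p_value, dof, expected = chi2_contingency(contingency)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3g}")
# A genome-wide significant hit would typically need p < 5e-8.
```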

Brain scans were performed on about 10,000 participants, allowing the researchers to link the genetic variants to differences in the white matter tracts connecting the brain’s language-processing regions; the brain’s cytoskeleton is present in these tracts. Akira Wiberg, a physician involved in the work, said the large UK Biobank dataset made a deeper understanding of the processes behind left-handedness possible. In left-handed participants, the language areas on the left and right sides of the brain communicate with each other in a more coordinated way, which might mean that left-handers have better language and verbal skills.

Left-handed people are outnumbered by right-handed people by a ratio of roughly 1:9. Cytoskeletal differences, such as those that set the coiling direction of a snail’s shell, can be strongly influenced by genetics, so scientists think the first steps toward a dominant hand may already be taken in the womb.

The research is still at too early a stage to say conclusively how handedness and genes are related, but it has revealed significant associations between the two. Scientists are now beginning to understand how a dominant hand is influenced by the genetic code. The findings also help dispel misconceptions that left-handedness is a sign of bad luck, or that being right-handed is somehow superior.

Dominic Furniss, a plastic surgeon who researches molecular genetics, said the study demonstrates that left-handedness is a consequence of the brain’s developmental biology, shaped in part by a complex interplay of genes.

Journal Reference: Brain

The world’s first voice AI crime

We might have witnessed the world’s first crime powered by artificial intelligence: synthetic audio was used to imitate the voice of a chief executive and trick a subordinate into transferring more than 240,000 US dollars into a secret account.

Euler Hermes, the company’s insurer, did not name the firm involved. On the day in question, the company’s managing director received a call in which a voice resembling his superior’s instructed him to wire the amount to an account based in Hungary. The payment was supposedly needed to avoid late-payment fines, and the financial details of the transaction were sent by email while the managing director was still on the call. Euler Hermes said the software was able to mimic the voice, including the superior’s German accent, tonality, and intonation.

The thieves attempted a second transfer in the same manner, but this time the managing director grew suspicious and called his boss directly. While he was on the phone with his real boss, the artificial voice was simultaneously demanding to speak to him.

Deepfakes have grown steadily more sophisticated in the last few years. Online platforms cannot detect them easily, and companies have struggled to respond. Their constant evolution makes it clear that simple detection will not be enough, as deepfakes keep gaining an audience through monetization and a steady stream of viral content. There are apps that can paste someone’s face into a film clip for entertainment. Technology of this kind would have sounded fanciful a few years ago; now it can be misused by anyone creative enough to channel it in improper ways. There are positive uses too: it can humanize automated call systems and help people who have lost their voices speak again. Left unregulated, however, it could enable fraud and cybercrime on a massive scale.

The cybersecurity firm Symantec reported that it had found at least three cases in which executives’ voices were mimicked to swindle cash from companies, though it did not name the victims.

In this technique, a recording of a person’s voice is broken down into its phonetic building blocks, such as syllables or individual sounds. These can then be rearranged to form new phrases with the same tone and speech patterns.
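
A minimal sketch of that rearrangement idea is shown below, using synthetic tones as stand-ins for recorded speech units; the unit labels and parameters are invented, and real voice-cloning systems rely on neural acoustic models rather than this naive concatenation.

```python
# Naive concatenative synthesis: stitch stored voice "units" into a new
# phrase. The units here are synthetic tones standing in for the
# syllable- or phoneme-level snippets a real system would extract.
import numpy as np

RATE = 16000  # samples per second

def tone(freq, dur=0.15):
    t = np.linspace(0, dur, int(RATE * dur), endpoint=False)
    return 0.3 * np.sin(2 * np.pi * freq * t)

# Invented inventory of units keyed by label.
units = {"pay": tone(220), "now": tone(180), "to": tone(260),
         "the": tone(240), "account": tone(200)}

def synthesize(labels, fade=0.01):
    # Crossfade adjacent units so the joins sound less abrupt.
    n_fade = int(RATE * fade)
    out = units[labels[0]].copy()
    for label in labels[1:]:
        nxt = units[label].copy()
        ramp = np.linspace(0, 1, n_fade)
        out[-n_fade:] = out[-n_fade:] * (1 - ramp) + nxt[:n_fade] * ramp
        out = np.concatenate([out, nxt[n_fade:]])
    return out

phrase = synthesize(["pay", "now", "to", "the", "account"])
print(phrase.shape, phrase.dtype)
```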

Researchers are working hard on systems that can detect fraudulent audio clips and combat them. Google, meanwhile, has created one of the world’s most persuasive voice AI services, Duplex, which can call restaurants to book tables in a simulated, lifelike voice.

Caution is needed when these technologies are released, along with proper systems to fight the scams they enable.

Researchers might have finally solved the mystery of the holes in the head of Tyrannosaurus rex

We generally picture Tyrannosaurus rex as a ferocious animal seething with rage. A new study, however, indicates that two mysterious holes in its skull may have helped control the temperature inside its head. The work appears in The Anatomical Record.

These holes, termed the dorsotemporal fenestrae, were previously thought to be occupied only by muscles that helped operate the jaw. Casey Holliday, an anatomist at the University of Missouri, said it would be strange for a muscle to extend from the jaw all the way to the top of the skull. Evidence gathered from alligators and other reptiles now suggests that blood vessels occupy this region instead. Similar fenestrae are seen in the skulls of the group of animals known as diapsids, which includes crocodilians, birds, lizards, and the tuatara; the openings are estimated to have evolved nearly 300 million years ago and are also found in tyrannosaurs and pterosaurs. The team analyzed several diapsid skulls to find out which animals have fenestrae resembling those of T. rex; the closest match was the crocodilians.

Holliday and his colleagues William Porter and Lawrence Witmer of Ohio University and Kent Vliet of the University of Florida used thermal cameras to study alligators at the St Augustine Alligator Farm Zoological Park. Because alligators are cold-blooded, their body temperature depends on the temperature of their surroundings, so their thermoregulation differs from that of warm-blooded (endothermic) animals.

“We noticed when it was cooler and the alligators are trying to warm up, our thermal imaging showed big hot spots in these holes in the roof of their skull, indicating a rise in temperature,” Vliet said.

“Yet, later in the day when it’s warmer, the holes appear dark, like they were turned off to keep cool. This is consistent with prior evidence that alligators have a cross-current circulatory system – or an internal thermostat, so to speak.”

Whether dinosaurs were endothermic or ectothermic is still heavily debated; some scientists think they fell between the two categories, a state called mesothermy. Previous research suggested that the armoured ankylosaurs had tunnels in the skull for keeping the brain at an optimal temperature.

The findings suggest that T. rex used some of the thermoregulation tactics of ectotherms. What can be stated with more confidence is that the tyrannosaur skull shows no osteological features indicating that the fenestrae were sites of muscle attachment. Based on modern alligators, the researchers infer that the fenestrae could instead have helped control the temperature inside the skull of T. rex by warming or cooling the blood flowing through the vessels there.

Witmer said that, like T. rex, alligators have holes in the roof of the skull that are filled with blood vessels, yet in dinosaurs these openings had still been attributed to muscle. The anatomy and physiology of present-day animals can be used to discard such early hypotheses.

Journal Reference: The Anatomical Record


Researchers develop new technology for isolating software components using less computing power

In the future, protecting sensitive information such as passwords and credit card numbers will require less computational work. Scientists at the Max Planck Institute for Software Systems in Kaiserslautern and Saarbrücken have developed a new technology, known as ERIM, for isolating software components from one another. It allows sensitive data to be protected from hackers while the data is processed, for example during online transactions. The technique has roughly three to five times less computational overhead than the previous isolation technology, making it more practical for online services. For this work, the researchers were awarded the Internet Defense Prize 2019 by USENIX and Facebook.

Security technologies such as firewalls help protect computer programs from malicious software, but even a small security lapse can give hackers access to individual components of a program, and from there to users’ financial details, with which they can make credit card transactions. The Heartbleed bug in the OpenSSL encryption software, for example, exposed the usernames and passwords of many online services to hackers.

To prevent such attacks, developers can isolate different software components from one another, much as the inner walls of a fortress block access to its centre even when attackers have overcome the outer defences. Current isolation methods, however, often require up to 30 percent more CPU power, and correspondingly more servers running in parallel, which drives up infrastructure costs. Deepak Garg, a lead researcher at the Max Planck Institute, said that many services do not consider the extra cost justified and therefore forgo isolation. Their technique, he added, adds only about five percent to computation time, which makes it attractive to companies; this is also why the work earned the 100,000 US dollar prize from USENIX and Facebook.

A team led by Deepak Garg and Peter Druschel, director at the Max Planck Institute for Software Systems, built the isolation method by combining software techniques with a hardware feature that Intel introduced in its microprocessors, known as Memory Protection Keys (MPK).

On its own, MPK cannot isolate components, as it can still be exploited by clever attackers, so it is combined with a second technique known as instruction rewriting. Peter Druschel explained that a program’s code can be rewritten so that an attacker cannot breach the “walls” between components, without changing what the software is meant to do. Together, the two methods divide a program’s memory and isolate its parts from one another with very little computational work, whereas other isolation technologies go through the operating system kernel for this purpose and thereby incur more overhead. As software development keeps accelerating, data protection has to remain practical, which often calls for unconventional approaches.
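
The sketch below is a hedged illustration of the raw MPK primitive that ERIM builds on (allocate a key, tag a page with it, toggle access rights), written via ctypes against glibc’s pkey_* wrappers. It is not ERIM itself, which adds the instruction-rewriting layer, and it runs only on Linux x86-64 with an MPK-capable CPU and glibc 2.27 or newer.

```python
# Illustration of Memory Protection Keys (Linux x86-64, glibc >= 2.27,
# MPK-capable CPU). Not ERIM; just the hardware primitive it relies on.
import ctypes, ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)
libc.mmap.restype = ctypes.c_void_p
libc.mmap.argtypes = [ctypes.c_void_p, ctypes.c_size_t, ctypes.c_int,
                      ctypes.c_int, ctypes.c_int, ctypes.c_long]

PROT_READ, PROT_WRITE = 0x1, 0x2
MAP_PRIVATE, MAP_ANONYMOUS = 0x02, 0x20
PKEY_DISABLE_ACCESS = 0x1
PAGE = 4096

# One anonymous page that will hold the "sensitive" data.
addr = libc.mmap(None, PAGE, PROT_READ | PROT_WRITE,
                 MAP_PRIVATE | MAP_ANONYMOUS, -1, 0)

# Allocate a protection key and tag the page with it.
pkey = libc.pkey_alloc(0, 0)
libc.pkey_mprotect(ctypes.c_void_p(addr), PAGE,
                   PROT_READ | PROT_WRITE, pkey)

secret = (ctypes.c_char * PAGE).from_address(addr)
secret[:6] = b"secret"                     # write while access is allowed

libc.pkey_set(pkey, PKEY_DISABLE_ACCESS)   # "untrusted" code path: no access
# Any read or write of `secret` here would now trigger SIGSEGV.

libc.pkey_set(pkey, 0)                     # "trusted" component: access back
print(bytes(secret[:6]))                   # b'secret'
```

Switching the key costs only a userspace register write, which is why MPK-based isolation avoids the overhead of going through the kernel.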


For the first time in several centuries, compasses in Greenwich will point at true north

Compasses in Greenwich are about to point toward true north for the first time in almost 360 years. This coincidence of magnetism and geography is set to occur within the coming fortnight, and it is a reminder that Earth’s magnetic north pole wanders constantly, unlike the geographic north pole.

The angular difference between the geographic and magnetic meridians at any place is known as the magnetic declination. Although the difference does not noticeably affect everyday life, it can persist for a long time: for almost a hundred years, compass needles in the United Kingdom have pointed west of true north because the declination has been negative.

This is not permanent. The agonic line, an invisible line joining Earth’s north and south magnetic poles along which the declination is zero, has been shifting westward at a rate of nearly 20 kilometres (about 12 miles) per year.

If this rate continues, the line will pass through Greenwich, the site of the Royal Observatory, this month, creating a historic occurrence. Ciaran Beggan, a geomagnetism researcher at the British Geological Survey, said that in September the agonic line will meet zero longitude at Greenwich; for the first time since the Observatory was built, the geomagnetic and geographic coordinate systems will coincide there.

The Royal Observatory was founded by decree of King Charles II in 1675. By coincidence, compasses in Greenwich around that time also pointed toward true north, because the declination was zero. Since then, the agonic line has kept shifting as Earth’s magnetic north pole moves in response to changes in the molten outer core.

Beggan said the agonic line will continue to pass across the United Kingdom for the next 15 to 20 years, so the phenomenon will outlast this September’s circumstantial synchronicity. By 2040, compasses will most probably point east of true north. Beyond that, scientists cannot offer predictions, because it is difficult to estimate future magnetic movements; Beggan noted that it is currently impossible to predict how the magnetic field will change over a span of several decades. Compasses in the United Kingdom may then point east of true north for the next 360 years.


Researchers demonstrate storage and release of mechanical waves without loss of energy

Light and sound waves are fundamental to many of today’s technologies for transporting energy and signals. Until now, however, there has been no way to store a wave for a long period and then redirect it to a specific location on demand. Such control would open opportunities to manipulate waves for purposes such as quantum computing, information storage, and energy harvesting.

A team led by Andrea Alù, founding director of the Photonics Initiative at the Advanced Science Research Center, CUNY, and Massimo Ruzzene, an aerospace engineering professor at Georgia Tech, has now demonstrated experimentally that it is possible to capture a wave, store it efficiently, and later release it toward a specific location. The work appears in the journal Science Advances.

Alù said the experiment shows that unconventional scattering methods can unlock new opportunities in wave propagation and control. The researchers found ways to change the basic interaction between waves and matter. When a light or sound wave strikes an obstacle, it is either partially absorbed or reflected and scattered. In absorption, the wave’s energy is converted into other forms, such as heat; materials that cannot absorb the wave reflect and scatter it instead.

In this experiment, the researchers aimed to mimic the process of absorption without converting the wave into another form of energy, storing it inside the material instead. The concept, known as coherent virtual absorption, was introduced by the ASRC team two years earlier.

To prove the idea, the time evolution of the waves had to be tailored so that, on reaching the non-absorbing material, they would be neither scattered, transmitted, nor reflected. This prevents the wave from escaping, so it is stored efficiently inside the material and can be released on demand. In the experiment, two mechanical waves were sent in opposite directions along a carbon-steel waveguide containing a cavity. The time variation of each wave was controlled so that the cavity retained all of the incoming energy. When the excitation of one of the waves was stopped, the stored energy could be steered out of the cavity in a chosen direction.
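
In simplified terms (a schematic condition, not the paper’s full elastodynamic analysis), the trick is to drive the structure at a complex zero of its scattering matrix, which requires an exponentially growing input envelope:

```latex
% Schematic picture of coherent virtual absorption:
% S(\omega) is the scattering matrix of the lossless cavity and
% \omega_z = \omega_0 + i\gamma one of its complex zeros.
\det S(\omega_z) = 0, \qquad
s_{\mathrm{in}}(t) \propto e^{\gamma t}\, e^{-i\omega_0 t}
\end{aligned}
```

As long as the incident signals keep this exponentially growing envelope, no net field leaves the structure and energy builds up inside; interrupting or detuning the excitation then releases the stored wave.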

The experiment used elastic waves travelling inside a solid material, but it could also be replicated with light or radio waves, opening the door to applications such as efficient energy harvesting, wireless power transfer, and greater control over wave propagation.

Research Paper: Coherent virtual absorption of elastodynamic waves


Researchers develop device which can forget things like our brain

Scientists keep trying to emulate the human brain, the ultimate computing machine. The latest effort has produced a device that can also “forget” memories, much as our brains do.

Known as a second-order memristor, it mimics a synapse in the human brain: it stores information but slowly loses it when it is not accessed for a long time. The device has no practical use yet, but it could be a stepping stone toward a new kind of neurocomputer that performs some of the same functions as a human brain. The work appears in ACS Applied Materials and Interfaces.

In an analogue neurocomputer, neurons and synapses would be replicated by on-chip electronic components, which could increase computational speed while reducing the computer’s energy requirements.

Analogue neurocomputers are not yet feasible because researchers still need to work out how to implement synaptic plasticity in electronics: the process by which frequently used synapses in the brain are strengthened while inactive ones weaken, causing memories to fade.

Previous memristors were based on nanosized conductive bridges that decayed over time, similar to the way we forget things.
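
As a loose illustration of that behaviour, the toy model below treats a synaptic weight as a conductance that is nudged up on every access and relaxes back toward zero when idle. The parameters and update rule are invented for illustration and are not the hafnium oxide device physics reported in the paper.

```python
# Toy "forgetting" synapse: conductance w is potentiated when the
# synapse is used and decays exponentially when it sits idle.
# All parameters are invented for illustration.
import numpy as np

TAU = 50.0        # decay time constant (arbitrary time steps)
BOOST = 0.2       # potentiation per access
W_MIN, W_MAX = 0.0, 1.0

def step(w, accessed):
    w = w * np.exp(-1.0 / TAU)          # passive decay (forgetting)
    if accessed:
        w = min(W_MAX, w + BOOST)       # strengthen on use
    return max(W_MIN, w)

w, history = 0.0, []
for t in range(300):
    w = step(w, accessed=(t < 50 and t % 5 == 0))  # train early, then stop
    history.append(w)

print(f"weight shortly after training: {history[60]:.2f}")
print(f"weight after long idling:      {history[-1]:.2f}")
```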

Anastasia Chouprik, a physicist at the Moscow Institute of Physics and Technology (MIPT) in Russia, said the problem with first-order memristors is that the device changes its behaviour over time and eventually breaks down. In the new work, synaptic plasticity was implemented in a more robust way: the device withstood having its state switched 100 billion times.

Instead, the researchers used a ferroelectric material, hafnium oxide, whose electric polarisation changes in response to an electric field. Hafnium oxide is already used by Intel in microchip manufacturing, which should make the memristors easier to introduce into production.

One challenge was finding the proper thickness for the ferroelectric layer: four nanometres turned out to be ideal, and a nanometre more or less would make it unsuitable for the application.

The forgetfulness itself is implemented through an imperfection, the same one that makes hafnium-based microprocessors difficult to develop: defects at the interface between hafnium oxide and silicon gradually reduce the memristor’s conductivity.

There is still a long way to go: the memory cells have to be made more reliable and suitable for integration into flexible electronics. Another physicist on the team, Vitalii Mikheev, said they will study the interplay between the various mechanisms by modifying the memristor, since effects other than ferroelectricity may also be at work.

Journal Reference: ACS Applied Materials and Interfaces

Scientists propose a solution to the worst estimate in the history of physics: the cosmological constant

Albert Einstein introduced the cosmological constant into his theory of relativity almost a century ago. Its value is anything but precisely pinned down: the theoretical prediction and the value based on astronomical measurements differ by a factor of the order of 10^121, which is why it is known as the worst estimate in the entire history of physics.

A researcher from the University of Geneva, Switzerland, has proposed a way to resolve this discrepancy. Lucas Lombriser, professor of theoretical physics and the article’s sole author, argues that we should accept that the universal gravitational constant G, which also appears in the equations of relativity, may vary. The idea has been received positively by the scientific community, although deeper work is still needed to confirm the theory. The study appears in Physics Letters B.

Lombriser says his approach consists of a different way of mathematically treating the equations of general relativity, one that helps close the gap between theory and observation for the cosmological constant. The constant, lambda, was originally introduced so that Einstein’s equations could describe a static universe, as he then assumed it to be. That assumption was refuted by Edwin Hubble in 1929, when he discovered that galaxies are moving apart, indicating that the universe is expanding. On learning this, Einstein called the introduction of the cosmological constant his greatest blunder.
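
For reference, the constant enters the field equations of general relativity in the standard textbook form shown below; Lombriser’s proposal additionally allows the gravitational constant G appearing here to vary.

```latex
% Einstein field equations with the cosmological constant \Lambda.
G_{\mu\nu} + \Lambda\, g_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}
```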

In 1998, analysis of distant supernovae showed that the universe’s expansion is not constant but accelerating. The cosmological constant was then repurposed to describe “vacuum energy”, the unknown energy actually responsible for this acceleration. Observations of the cosmic microwave background, the relic microwave radiation from the Big Bang that fills the whole sky, allowed the constant to be measured; it is a small number consistent with the observed expansion of the universe.

The theoretical value of the cosmological constant is calculated using quantum field theory, in which pairs of particles are continuously created and annihilated at every point in space at every moment. When the energy of these “vacuum fluctuations” is used to calculate the constant, the result is wildly incompatible with the value obtained from observation; it is the largest gap ever found between theory and experiment.

Researchers around the world are approaching this problem with their own ideas, but no consensus has been reached yet.

Lombriser’s earlier idea was to allow the universal gravitational constant G to vary in Einstein’s equations, which amounts to treating our universe as a single case among an endless number of theoretical possibilities. This approach makes it possible to calculate ΩΛ, a quantity that expresses the cosmological constant in a different, more tractable way: the fraction of the universe’s energy content made up of dark energy. The calculation gives 70.4 percent, close to the observed value of 68.5 percent. Further analyses are needed to confirm Lombriser’s framework, but it has already been received positively by the scientific community.
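
In standard cosmology, ΩΛ relates the cosmological constant to the Hubble constant H₀ as shown below, which is how a value of the constant translates into a dark-energy fraction; the 70.4 and 68.5 percent figures quoted above are the calculated and observed values of this quantity.

```latex
% Dark-energy density parameter: the fraction of the critical density
% contributed by \Lambda today (H_0 is the Hubble constant).
\Omega_{\Lambda} = \frac{\Lambda c^{2}}{3 H_{0}^{2}}
```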

Journal Reference: Physics Letters B


Researchers develop skin-like stickers for monitoring health

Some health devices are on the verge of merging seamlessly with our skin as wearable technology grows ever smaller and more sensitive. Researchers at Stanford University have developed a flexible digital sticker that can track a person’s pulse, respiration, and muscle activity simultaneously. The work appears in the journal Nature Electronics.

The system, called BodyNet, relies on delicate, lightweight sensors that cling to the skin, stretching and bending with each motion, heartbeat, or breath. The wireless measurements are transmitted from the sticker to a nearby flexible receiver clipped onto the person’s clothing. So far the device has been tested on only one individual, and the receiver is still somewhat clumsy and needs further development.

After three years of work, the researchers plan to keep improving the design. They expect that physicians could eventually use the device to track sleep disorders and heart conditions in real time.

Chemical engineer Zhenan Bao said the team thinks it will eventually be possible to create a full-body skin-sensor array that collects physiological information without interfering with a person’s normal behaviour.

Although there is a long way to go, wearable technologies are advancing rapidly. Researchers worldwide have recently devised new ways to attach medical devices to the skin or to embed medical sensors in tattoo ink. Industry reports predict that the wearables market could grow from US$8.9 billion in 2018 to US$29.9 billion by 2023.

Unlike conventional stick-on sensors, the new Stanford design uses a wireless system built around an antenna made of metallic ink screen-printed onto a rubber sticker, so the antenna can bend and stretch like human skin. As the sticker moves, the electric current flowing through the metallic ink changes, providing precise measurements of the wearer’s physiology. But the same flexing of the antenna can disturb the radio waves it sends to the receiver.

To fix this, the team developed a new type of wireless communication based on radio-frequency identification (RFID) that lets the antenna transmit strong, reliable signals to the receiver even while being stretched and contracted. As with a battery-free key card, the sticker harvests a little of the reader’s energy, uses it to generate an access code, and sends that code back to the receiver.

The authors report that the BodyNet stickers are insensitive to strain-induced antenna disruptions and maintain full functionality even at 50 percent strain. They add that by continuously analysing critical human signals (pulse, respiration and body movement), the device could enable real-time physiological and clinical monitoring in a modern personal health system.

The researchers now plan to integrate sweat, temperature and other sensors into the sticker and to shrink the receiver so that it can one day be woven into clothing.

Journal Reference: Nature Electronics