Tuesday, June 2, 2020


To safely explore the solar system and beyond, spaceships need to go faster – nuclear-powered rockets may be the answer

Over the last 50 years, a lot has changed in rocketry. The fuel that powers spaceflight might finally be changing too. CSA-Printstock/DigitalVision Vectors via Getty Images
Iain Boyd, University of Colorado Boulder

With dreams of Mars on the minds of both NASA and Elon Musk, long-distance crewed missions through space are coming. But you might be surprised to learn that modern rockets don’t go all that much faster than the rockets of the past.

There are a lot of reasons that a faster spaceship is a better one, and nuclear-powered rockets are a way to do this. They offer many benefits over traditional fuel-burning rockets or modern solar-powered electric rockets, but there have been only eight U.S. space launches carrying nuclear reactors in the last 40 years.

However, last year the laws regulating nuclear space flights changed and work has already begun on this next generation of rockets.

Why the need for speed?

The first step of a space journey involves the use of launch rockets to get a ship into orbit. These are the large fuel-burning engines people imagine when they think of rocket launches and are not likely to go away in the foreseeable future due to the constraints of gravity.

It is once a ship reaches space that things get interesting. To escape Earth's gravity and reach deep space destinations, ships need additional acceleration. This is where nuclear systems come into play. If astronauts want to explore anything farther than the Moon and perhaps Mars, they are going to need to be going very, very fast. Space is massive, and everything is far away.

There are two reasons faster rockets are better for long-distance space travel: safety and time.

Astronauts on a trip to Mars would be exposed to very high levels of radiation which can cause serious long-term health problems such as cancer and sterility. Radiation shielding can help, but it is extremely heavy, and the longer the mission, the more shielding is needed. A better way to reduce radiation exposure is to simply get where you are going quicker.

But human safety isn’t the only benefit. As space agencies probe farther out into space, it is important to get data from unmanned missions as soon as possible. It took Voyager-2 12 years just to reach Neptune, where it snapped some incredible photos as it flew by. If Voyager-2 had a faster propulsion system, astronomers could have had those photos and the information they contained years earlier.

Speed is good. But why are nuclear systems faster?

The Saturn V rocket was 363 feet tall and mostly just a gas tank. Mike Jetzer/heroicrelics.org, CC BY-NC-ND

Systems of today

Once a ship has escaped Earth’s gravity, there are three important aspects to consider when comparing any propulsion system:

  • Thrust – how fast a system can accelerate a ship
  • Mass efficiency – how much thrust a system can produce for a given amount of fuel
  • Energy density – how much energy a given amount of fuel can produce

Today, the most common propulsion systems in use are chemical propulsion – that is, regular fuel-burning rockets – and solar-powered electric propulsion systems.

Chemical propulsion systems provide a lot of thrust, but chemical rockets aren’t particularly efficient, and rocket fuel isn’t that energy-dense. The Saturn V rocket that took astronauts to the Moon produced 35 million Newtons of force at liftoff and carried 950,000 gallons of fuel. While most of the fuel was used in getting the rocket into orbit, the limitations are apparent: It takes a lot of heavy fuel to get anywhere.
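One way to see why the fuel requirement balloons is the classical Tsiolkovsky rocket equation, which the article does not spell out but which underlies this point: the fraction of a rocket's liftoff mass that must be propellant grows exponentially with the velocity change it needs divided by its exhaust velocity. Here is a minimal Python sketch; the numbers (roughly 9.4 km/s to reach low Earth orbit and a 4.4 km/s exhaust velocity for a good chemical engine) are illustrative assumptions, not figures from the article.

```python
import math

def propellant_fraction(delta_v_m_s: float, exhaust_velocity_m_s: float) -> float:
    """Fraction of liftoff mass that must be propellant to achieve a given
    delta-v, from the Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / m_final)."""
    return 1.0 - math.exp(-delta_v_m_s / exhaust_velocity_m_s)

# Illustrative (assumed) numbers, not from the article:
# ~9,400 m/s of delta-v to reach low Earth orbit including losses,
# ~4,400 m/s exhaust velocity for a high-performance chemical engine.
fraction = propellant_fraction(9_400, 4_400)
print(f"about {fraction:.0%} of liftoff mass must be propellant")
# -> roughly 88%, which is why the Saturn V was "mostly just a gas tank"
```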

Electric propulsion systems generate thrust using electricity produced from solar panels. The most common way to do this is to use an electrical field to accelerate ions, such as in the Hall thruster. These devices are commonly used to power satellites and can have more than five times higher mass efficiency than chemical systems. But they produce much less thrust – about three Newtons, or only enough to accelerate a car from 0-60 mph in about two and a half hours. The energy source – the Sun – is essentially infinite but becomes less useful the farther away from the Sun the ship gets.
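As a rough sanity check on that figure, Newton's second law (a = F/m) gives the time to reach 60 mph for a given thrust. The car mass below is an assumption (about 1,000 kg, a small car) chosen for illustration; it is not stated in the article.

```python
THRUST_N = 3.0            # Hall thruster thrust quoted in the article
CAR_MASS_KG = 1_000.0     # assumed mass of a small car (not stated in the article)
TARGET_SPEED_M_S = 60 * 1609.34 / 3600   # 60 mph in metres per second (~26.8 m/s)

acceleration = THRUST_N / CAR_MASS_KG     # Newton's second law: a = F / m
time_s = TARGET_SPEED_M_S / acceleration  # time to reach target speed: t = v / a
print(f"0-60 mph in about {time_s / 3600:.1f} hours")   # ~2.5 hours
```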

One of the reasons nuclear-powered rockets are promising is because they offer incredible energy density. The uranium fuel used in nuclear reactors has an energy density that is 4 million times higher than hydrazine, a typical chemical rocket propellant. It is much easier to get a small amount of uranium to space than hundreds of thousands of gallons of fuel.

So what about thrust and mass efficiency?

The first nuclear thermal rocket was built in 1967 and is seen in the background. In the foreground is the protective casing that would hold the reactor. NASA/Wikipedia

Two options for nuclear

Engineers have designed two main types of nuclear systems for space travel.

The first is called nuclear thermal propulsion. These systems are very powerful and moderately efficient. They use a small nuclear fission reactor – similar to those found in nuclear submarines – to heat a gas, such as hydrogen, and that gas is then accelerated through a rocket nozzle to provide thrust. Engineers from NASA estimate that a mission to Mars powered by nuclear thermal propulsion would be 20%-25% shorter than a trip on a chemical-powered rocket.

Nuclear thermal propulsion systems are more than twice as efficient as chemical propulsion systems – meaning they generate twice as much thrust using the same amount of propellant mass – and can deliver 100,000 Newtons of thrust. That’s enough force to get a car from 0-60 mph in about a quarter of a second.
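The same back-of-the-envelope arithmetic as in the Hall thruster example, with the thrust swapped for the 100,000-newton figure and the same assumed 1,000 kg car:

```python
THRUST_N = 100_000.0      # nuclear thermal thrust quoted in the article
CAR_MASS_KG = 1_000.0     # same assumed small car as before
TARGET_SPEED_M_S = 60 * 1609.34 / 3600   # ~26.8 m/s

time_s = TARGET_SPEED_M_S / (THRUST_N / CAR_MASS_KG)
print(f"0-60 mph in about {time_s:.2f} seconds")   # ~0.27 s, roughly a quarter of a second
```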

The second nuclear-based rocket system is called nuclear electric propulsion. No nuclear electric systems have been built yet, but the idea is to use a high-power fission reactor to generate electricity that would then power an electrical propulsion system like a Hall thruster. This would be very efficient, about three times better than a nuclear thermal propulsion system. Since the nuclear reactor could create a lot of power, many individual electric thrusters could be operated simultaneously to generate a good amount of thrust.

Nuclear electric systems would be the best choice for extremely long-range missions because they don’t require solar energy, have very high efficiency and can give relatively high thrust. But while nuclear electric rockets are extremely promising, there are still a lot of technical problems to solve before they are put into use.

An artist’s impression of what a nuclear thermal ship built to take humans to Mars could look like. John Frassanito & Associates/Wikipedia

Why aren't there nuclear-powered rockets yet?

Nuclear thermal propulsion systems have been studied since the 1960s but have not yet flown in space.

Regulations first imposed in the U.S. in the 1970s essentially required case-by-case examination and approval of any nuclear space project from multiple government agencies and explicit approval from the president. Along with a lack of funding for nuclear rocket system research, this environment prevented further improvement of nuclear reactors for use in space.

That all changed when the Trump administration issued a presidential memorandum in August 2019. While upholding the need to keep nuclear launches as safe as possible, the new directive allows nuclear missions carrying lower amounts of nuclear material to skip the multi-agency approval process. Only the sponsoring agency, NASA for example, needs to certify that the mission meets safety recommendations. Larger nuclear missions would go through the same process as before.

Along with this revision of regulations, NASA received US$100 million in the 2019 budget to develop nuclear thermal propulsion. DARPA is also developing a space nuclear thermal propulsion system to enable national security operations beyond Earth orbit.

After 60 years of stagnation, it’s possible a nuclear-powered rocket will be heading to space within a decade. This exciting achievement will usher in a new era of space exploration. People will go to Mars and science experiments will make new discoveries all across our solar system and beyond.


Iain Boyd, Professor of Aerospace Engineering Sciences, University of Colorado Boulder

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Thursday, May 28, 2020


SpaceX astronaut launch: here's the rocket science it must get right

Gareth Dorrian, University of Birmingham and Ian Whittaker, Nottingham Trent University

Two NASA astronauts, Robert Behnken and Douglas Hurley, will make history by travelling to the International Space Station in a privately funded spacecraft, SpaceX’s Falcon 9 rocket and Crew Dragon capsule. But the launch, which was due to take place on May 27, has been aborted due to bad weather, and will instead take place on May 30 at 3:22 pm EDT.

The astronauts will take off lying on their backs in the seats, and facing in the direction of travel to reduce the stress of high acceleration on their bodies. Once launched from Kennedy Space Centre, the spacecraft will travel out over the Atlantic, turning to travel in a direction that matches the ISS orbit.

With the first rocket stage separating just over two minutes after launch, the main Dragon capsule is then expected to separate from the second stage roughly an hour later and continue on its journey. All being well, the Dragon spacecraft will rendezvous with the ISS about 24 hours after launch.


Read more: SpaceX reaches for milestone in spaceflight – a private company launches astronauts into orbit


Launches and landings are the most critical parts of any space mission. However, SpaceX has conducted many tests, including 27 drops of the parachute landing system. It has also demonstrated an emergency separation of the Dragon capsule from the rocket. In the event of a failed rocket launch, eight engines would lift the capsule containing the astronauts up into the air and away from the rocket, with parachutes eventually helping it to land. The Falcon 9 rocket has made 83 successful launches.

Docking and return

The space station has an orbital velocity of 7.7km per second. The Earth's rotation periodically carries launch sites under the flight path of the ISS, and each such pass provides a "launch window".
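That 7.7 km per second follows from the physics of a circular orbit, where gravity supplies the centripetal force, so the speed is the square root of GM/r. Here is a small sketch, assuming an ISS altitude of roughly 400 km (a typical value, not stated in the article):

```python
import math

G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24          # mass of the Earth, kg
R_EARTH = 6_371_000.0       # mean radius of the Earth, m
ISS_ALTITUDE_M = 400_000.0  # assumed ISS altitude (~400 km), not stated in the article

r = R_EARTH + ISS_ALTITUDE_M
speed = math.sqrt(G * M_EARTH / r)         # circular orbit: gravity provides centripetal force
period_min = 2 * math.pi * r / speed / 60  # time for one complete orbit

print(f"orbital speed ~{speed / 1000:.1f} km/s, period ~{period_min:.0f} minutes")
# -> roughly 7.7 km/s, one orbit about every 92 minutes
```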

ISS orbit. Author provided

To intercept the ISS, the capsule must match the station’s speed, altitude and inclination, and it must do it at the correct time such that the two spacecraft find themselves in close proximity to each other. The difference in velocity between the ISS and the Dragon capsule must then be near to zero at the point where the orbits of the two spacecraft intersect.

Once these conditions are met, the Dragon capsule must manoeuvre to the ISS docking port, using a series of small control thrusters arranged around the spacecraft. This is due to be done automatically by a computer; however, the astronauts can control the manoeuvre manually if needed.

As you can see in the figure below, manoeuvring involves “translation control” as indicated by green arrows – moving left/right, up/down, forward/back. The yellow arrows show “attitude control” – rolling clockwise/anti-clockwise, pitching up/down, and yawing left/right.

How to manoeuver a spacecraft. Author provided

This is complicated by Newton's first law of motion – that any object at rest or in motion will continue to be so unless acted upon by an external force. In space there is no air resistance to provide such a force, so any manoeuvre, such as a roll to the right, will continue indefinitely until it is counteracted by firing thrusters in the opposite direction.
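Here is a toy numerical illustration of that point, purely a sketch and not a model of Dragon's actual control software: a short thruster pulse sets a roll rate that then persists unchanged until an equal and opposite pulse cancels it.

```python
# Toy one-axis attitude sketch: the roll rate changes only while a thruster fires.
ROLL_ACCEL_DEG_S2 = 1.0   # assumed roll acceleration while a thruster fires, deg/s^2
DT = 1.0                  # time step, seconds

roll_rate = 0.0           # deg/s
roll_angle = 0.0          # deg
for t in range(20):
    if t < 2:
        roll_rate += ROLL_ACCEL_DEG_S2 * DT   # 2-second pulse rolling right
    elif 15 <= t < 17:
        roll_rate -= ROLL_ACCEL_DEG_S2 * DT   # equal and opposite pulse to stop the roll
    # With no air resistance between pulses, roll_rate simply stays constant.
    roll_angle += roll_rate * DT
    print(f"t={t:2d}s  roll rate={roll_rate:4.1f} deg/s  angle={roll_angle:6.1f} deg")
```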

So now that you have a grasp of orbital manoeuvring, why not have a go yourself? This simulator, provided by SpaceX, allows you to try to pilot the Dragon capsule to the ISS docking port.

The astronauts will return to Earth when a new crew is ready to take their place, or at NASA's discretion. NASA is already planning the first fully operational flight of Crew Dragon, with four astronauts, although a launch date for that has not yet been announced and will undoubtedly depend on the outcome of this demonstration flight.

New era for spaceflight

The launch puts SpaceX firmly ahead of the other commercial ventures looking to provide crewed space launches. This includes both Boeing's Starliner, which first launched last year but was uncrewed, and Sierra Nevada's Dream Chaser, which is planned to be tested with cargo during a trip to the ISS next year.

The ability of the commercial sector to send astronauts to the ISS is an important step toward further human exploration, including establishing a human presence at the Moon, and ultimately, Mars.


Read more: To the moon and beyond 4: What's the point of going back to the moon?


With companies competing, however, it remains an open question whether safety could at some point be compromised to gain a commercial edge. There is no suggestion this has happened so far, but any crewed mission that failed due to a fault stemming from economic concerns would have serious legal ramifications.

In a similar way to modern aircraft legislation, a set of space safety standards and regulations will need to be put in place sooner rather than later. For commercial lunar and beyond missions, we also have to ensure that any spacecraft does not contaminate the location it is visiting with germs from Earth.

With more nations and companies developing plans for lunar missions, there are obvious advantages in international cooperation and in finding cost-efficient launch methods – not least because the commercial sector is less dependent on the whims of elected governments for direction, which can change completely from one administration to the next.

So for us scientists looking to expand our knowledge of space, it is a very exciting moment.

Gareth Dorrian, Post Doctoral Research Fellow in Space Science, University of Birmingham and Ian Whittaker, Lecturer in Physics, Nottingham Trent University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Monday, May 18, 2020


How the Hubble Space Telescope opened our eyes to the first galaxies of the universe

The launch of the Hubble Space Telescope on April 24, 1990. This photo captures the first time that there were shuttles on both pads 39A and 39B. NASA
Rodger I. Thompson, University of Arizona

The Hubble Space Telescope launched on the 24th of April, 30 years ago. It's an impressive milestone, especially as its expected lifespan was just 10 years.

One of the primary reasons for the Hubble telescope’s longevity is that it can be serviced and improved with new observational instruments through Space Shuttle visits.

When Hubble, or HST, first launched, its instruments could observe ultraviolet light, with wavelengths shorter than the eye can see, as well as optical light, with wavelengths visible to humans. A maintenance mission in 1997 added an instrument to observe near infrared light, which has wavelengths longer than people can see. Hubble's new infrared eyes provided two major new capabilities: the ability to see farther into space than before and to see deeper into the dusty regions of star formation.

I am an astrophysicist at the University of Arizona who has used near infrared observations to better understand how the universe works, from star formation to cosmology. Some 35 years ago, I was given the chance to build a near infrared camera and spectrometer for Hubble. It was the chance of a lifetime. The camera my team designed and developed has changed the way humans see and understand the universe. The instrument was built at Ball Aerospace in Boulder, Colorado, under our direction.

The light we can see with our eyes is part of a range of radiation known as the electromagnetic spectrum. Shorter wavelengths of light are higher energy, and longer wavelengths of light are lower energy. The Hubble Space Telescope sees primarily visible light (indicated here by the rainbow), as well as some infrared and ultraviolet radiation. NASA/JHUAPL/SwRI

Seeing further and earlier

Edwin Hubble, HST's namesake, discovered in the 1920s that the universe is expanding and that the light from distant galaxies is shifted to longer, redder wavelengths, a phenomenon called redshift. The greater the distance, the larger the shift. This is because the farther away an object is, the longer it takes for its light to reach us here on Earth and the more the universe has expanded in that time.

The Hubble ultraviolet and optical instruments had taken images of the most distant galaxies ever seen, known as the Northern Hubble Deep Field, or NHDF, which were released in 1996. These images, however, had reached their distance limit due to the redshift, which had shifted all of the light of the most distant galaxies out of the visible and into the infrared.
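To make that concrete, an observed wavelength is simply the emitted wavelength stretched by a factor of (1 + z), where z is the redshift. The sketch below applies this to the hydrogen Lyman-alpha line, a standard ultraviolet marker of young galaxies; the specific line and redshifts are illustrative choices, not values from the article.

```python
def observed_wavelength_nm(rest_wavelength_nm: float, redshift: float) -> float:
    """Cosmological redshift stretches wavelengths: lambda_obs = (1 + z) * lambda_rest."""
    return (1.0 + redshift) * rest_wavelength_nm

LYMAN_ALPHA_NM = 121.6   # ultraviolet hydrogen line emitted strongly by young galaxies
for z in (0, 1, 3, 7):
    print(f"z = {z}: observed at {observed_wavelength_nm(LYMAN_ALPHA_NM, z):6.1f} nm")
# At z = 7 the line lands near 973 nm -- beyond the visible range (about 380-750 nm)
# and into the near infrared, which is why an infrared camera can see galaxies
# that the optical instruments could not.
```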

One of the new instruments added to Hubble in the second maintenance mission has an awkward name: the Near Infrared Camera and Multi-Object Spectrometer, or NICMOS, pronounced "Nick Moss." The near infrared cameras on NICMOS observed regions of the NHDF and discovered even more distant galaxies with all of their light in the near infrared.

A typical image taken with NICMOS. It shows a gigantic star cluster in the center of our Milky Way. NICMOS, thanks to its infrared capabilities, is able to look through the heavy clouds of dust and gas in these central regions. NASA/JHUAPL/SwRI

Astronomers have the privilege of watching things that happened in the past, a delay they call the "lookback time." Our best measurement of the age of the universe is 13.7 billion years. The distance that light travels in one year is called a light year. The most distant galaxies observed by NICMOS were at a distance of almost 13 billion light years. This meant that the light that NICMOS detected had been traveling for 13 billion years and showed what the galaxies looked like 13 billion years ago, a time when the universe was only about 5% of its current age. These were some of the first galaxies ever created and were forming new stars at rates that were more than a thousand times the rate at which most galaxies form stars in the current universe.
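The 5% figure is simple arithmetic on the numbers in that paragraph:

```python
AGE_OF_UNIVERSE_GYR = 13.7     # best measurement cited in the article, billions of years
LIGHT_TRAVEL_TIME_GYR = 13.0   # how long the light from the most distant NICMOS galaxies traveled

age_when_light_left = AGE_OF_UNIVERSE_GYR - LIGHT_TRAVEL_TIME_GYR   # ~0.7 billion years
print(f"the universe was ~{age_when_light_left:.1f} billion years old then, "
      f"about {age_when_light_left / AGE_OF_UNIVERSE_GYR:.0%} of its current age")
```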

Hidden by dust

Although astronomers have studied star formation for decades, many questions remain. Part of the problem is that most stars are formed in clouds of molecules and dust. The dust absorbs the ultraviolet and most of the optical light emitted by forming stars, making it difficult for Hubble’s ultraviolet and optical instruments to study the process.

The longer, or redder, the wavelength of the light, the less is absorbed. That is why sunsets, where the light must pass through long lengths of dusty air, appear red.

The near infrared, however, has an even easier time passing through dust than the red optical light. NICMOS can look into star formation regions with the superior image quality of Hubble to determine the details of where the star formation occurs. A good example is the iconic Hubble image of the Eagle Nebula, also known as the pillars of creation.

The optical image shows majestic pillars which appear to show star formation over a large volume of space. The NICMOS image, however, shows a different picture. In the NICMOS image, most of the pillars are transparent with no star formation. Stars are only being formed at the tip of the pillars. The optical pillars are just empty dust reflecting the light of a group of nearby stars.

The Eagle Nebula in visible light. NASA, ESA and the Hubble Heritage Team (STScI/AURA)
In this Hubble Space Telescope image is the Eagle Nebula’s Pillars of Creation. Here, the pillars are seen in infrared light, which pierces through obscuring dust and gas and unveils a more unfamiliar — but just as amazing — view of the pillars. NASA, ESA/Hubble and the Hubble Heritage Team

The dawning of the age of infrared

When NICMOS was added to the HST in 1997, NASA had no plans for a future infrared space mission. That rapidly changed as the results from NICMOS became apparent. Based on the data from NICMOS, scientists learned that fully formed galaxies existed in the universe much earlier than expected. The NICMOS images also confirmed that the expansion of the universe is accelerating rather than slowing down as previously thought. The NHDF infrared images were followed by the Hubble Ultra Deep Field images in 2005, which further showed the power of near infrared imaging of distant young galaxies. So NASA decided to invest in the James Webb Space Telescope, or JWST, a telescope much larger than HST and completely dedicated to infrared observations.

On Hubble, a near infrared imager was added to the third version of the Wide Field Camera, which was installed in May 2009. This camera used an improved version of the NICMOS detector arrays that had more sensitivity and a wider field of view. The James Webb Space Telescope has much larger versions of the NICMOS detector arrays that have more wavelength coverage than the previous versions.

The James Webb Space Telescope, scheduled to be launched in March 2021, and the Wide Field Infrared Survey Telescope that will follow it form the bulk of NASA's future space missions. These programs were all spawned by the near infrared observations made by HST. They were enabled by the original investment in a near infrared camera and spectrometer that gave Hubble its infrared eyes. With the James Webb Space Telescope, astronomers expect to see the very first galaxies that formed in the universe.


Rodger I. Thompson, Professor of Astronomy, University of Arizona

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Sunday, May 17, 2020


The lack of women in cybersecurity leaves the online world at greater risk

Women bring a much-needed change in perspective to cybersecurity. Maskot/Maskot via Getty Images
Nir Kshetri, University of North Carolina – Greensboro

Women are highly underrepresented in the field of cybersecurity. In 2017, women’s share in the U.S. cybersecurity field was 14%, compared to 48% in the general workforce.

The problem is more acute outside the U.S. In 2018, women accounted for 10% of the cybersecurity workforce in the Asia-Pacific region, 9% in Africa, 8% in Latin America, 7% in Europe and 5% in the Middle East.

Women are even less well represented in the upper echelons of security leadership. Only 1% of female internet security workers are in senior management positions.

I study online crime and security issues facing consumers, organizations and nations. In my research, I have found that internet security requires strategies beyond technical solutions. Women’s representation is important because women tend to offer viewpoints and perspectives that are different from men’s, and these underrepresented perspectives are critical in addressing cyber risks.

Perception, awareness and bias

The low representation of women in internet security is linked to the broader problem of their low representation in the science, technology, engineering and mathematics fields. Only 30% of scientists and engineers in the U.S. are women.

The societal view is that internet security is a job that men do, though there is nothing inherent in gender that predisposes men to be more interested in or more adept at cybersecurity. In addition, the industry mistakenly gives potential employees the impression that only technical skills matter in cybersecurity, which can give women the impression that the field is overly technical or even boring.

Women are also generally not presented with opportunities in information technology fields. In a survey of women pursuing careers outside of IT fields, 69% indicated that the main reason they didn’t pursue opportunities in IT was because they were unaware of them.

Organizations often fail to try to recruit women to work in cybersecurity. According to a survey conducted by IT security company Tessian, only about half of the respondents said that their organizations were doing enough to recruit women into cybersecurity roles.

Gender bias in job ads further discourages women from applying. Online cybersecurity job ads often lack gender-neutral language.

Good security and good business

Boosting women’s involvement in information security makes both security and business sense. Female leaders in this area tend to prioritize important areas that males often overlook. This is partly due to their backgrounds. Forty-four percent of women in information security fields have degrees in business and social sciences, compared to 30% of men.

Female internet security professionals put a higher priority on internal training and education in security and risk management. Women are also stronger advocates for online training, which is a flexible, low-cost way of increasing employees’ awareness of security issues.

Female internet security professionals are also adept at selecting partner organizations to develop secure software. Women tend to pay more attention to partner organizations’ qualifications and personnel, and they assess partners’ ability to meet contractual obligations. They also prefer partners that are willing to perform independent security tests.

Increasing women’s participation in cybersecurity is a business issue as well as a gender issue. According to an Ernst & Young report, by 2028 women will control 75% of discretionary consumer spending worldwide. Security considerations like encryption, fraud detection and biometrics are becoming important in consumers’ buying decisions. Product designs require a trade-off between cybersecurity and usability. Female cybersecurity professionals can make better-informed decisions about such trade-offs for products that are targeted at female customers.

Attracting women to cybersecurity

Attracting more women to cybersecurity requires governments, nonprofit organizations, professional and trade associations and the private sector to work together. Public-private partnership projects could help solve the problem in the long run.

A computer science teacher, center, helps fifth grade students learn programming. AP Photo/Elaine Thompson

One example is Israel's Shift community, previously known as the CyberGirlz program, which is jointly financed by the country's Defense Ministry, the Rashi Foundation and Start-Up Nation Central. It identifies high school girls with the aptitude, desire and natural curiosity to learn IT and helps them develop those skills.

The girls participate in hackathons and training programs, and get advice, guidance and support from female mentors. Some of the mentors are from elite technology units of the country's military. The participants learn hacking skills, network analysis and the Python programming language. They also practice simulating cyberattacks to find potential vulnerabilities. By 2018, about 2,000 girls had participated in the CyberGirlz Club and the CyberGirlz Community.

In 2017, cybersecurity firm Palo Alto Networks teamed up with the Girl Scouts of the USA to develop cybersecurity badges. The goal is to foster cybersecurity knowledge and develop interest in the profession. The curriculum includes the basics of computer networks, cyberattacks and online safety.

Professional associations can also foster interest in cybersecurity and help women develop relevant knowledge. For example, Women in Cybersecurity of Spain has started a mentoring program that supports female cybersecurity professionals early in their careers.

Some industry groups have collaborated with big companies. In 2018, Microsoft India and the Data Security Council of India launched the CyberShikshaa program in order to create a pool of skilled female cybersecurity professionals.

Some technology companies have launched programs to foster women’s interest in and confidence to pursue internet security careers. One example is IBM Security’s Women in Security Excelling program, formed in 2015.

Attracting more women to the cybersecurity field requires a range of efforts. Cybersecurity job ads should be written so that female professionals feel welcome to apply. Recruitment efforts should focus on academic institutions with high female enrollment. Corporations should ensure that female employees see cybersecurity as a good option for internal career changes. And governments should work with the private sector and academic institutions to get young girls interested in cybersecurity.

Increasing women’s participation in cybersecurity is good for women, good for business and good for society.


Nir Kshetri, Professor of Management, University of North Carolina – Greensboro

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Saturday, May 16, 2020


Robots are playing many roles in the coronavirus crisis – and offering lessons for future disasters

A nurse (left) operates a robot used to interact remotely with coronavirus patients while a physician looks on. MIGUEL MEDINA/AFP via Getty Images
Robin R. Murphy, Texas A&M University; Justin Adams, Florida State University, and Vignesh Babu Manjunath Gandudi, Texas A&M University

A cylindrical robot rolls into a treatment room to allow health care workers to remotely take temperatures and measure blood pressure and oxygen saturation from patients hooked up to a ventilator. Another robot that looks like a pair of large fluorescent lights rotated vertically travels throughout a hospital disinfecting with ultraviolet light. Meanwhile a cart-like robot brings food to people quarantined in a 16-story hotel. Outside, quadcopter drones ferry test samples to laboratories and watch for violations of stay-at-home restrictions.

These are just a few of the two dozen ways robots have been used during the COVID-19 pandemic, from health care in and out of hospitals and automation of testing to supporting public safety and public works and keeping daily work and life going.

The lessons they're teaching for the future are the same lessons learned at previous disasters but quickly forgotten as interest and funding faded. The best robots for a disaster are ones that already exist in the health care and public safety sectors, like those in these examples.

Research laboratories and startups are creating new robots, including one designed to allow health care workers to remotely take blood samples and perform mouth swabs. These prototypes are unlikely to make a difference now. However, the robots under development could make a difference in future disasters if momentum for robotics research continues.

Robots around the world

As roboticists at Texas A&M University and the Center for Robot-Assisted Search and Rescue, we examined over 120 press and social media reports from China, the U.S. and 19 other countries about how robots are being used during the COVID-19 pandemic. We found that ground and aerial robots are playing a notable role in almost every aspect of managing the crisis.

R. Murphy, V. Gandudi, Texas A&M; J. Adams, Center for Robot-Assisted Search and Rescue, CC BY-ND

In hospitals, doctors and nurses, family members and even receptionists are using robots to interact in real time with patients from a safe distance. Specialized robots are disinfecting rooms and delivering meals or prescriptions, handling the hidden extra work associated with a surge in patients. Delivery robots are transporting infectious samples to laboratories for testing.

Outside of hospitals, public works and public safety departments are using robots to spray disinfectant throughout public spaces. Drones are providing thermal imagery to help identify infected citizens and enforce quarantines and social distancing restrictions. Robots are even rolling through crowds, broadcasting public service messages about the virus and social distancing.

At work and home, robots are assisting in surprising ways. Realtors are teleoperating robots to show properties from the safety of their own homes. Workers building a new hospital in China were able to work through the night because drones carried lighting. In Japan, students used robots to walk the stage for graduation, and in Cyprus, a person used a drone to walk his dog without violating stay-at-home restrictions.

Helping workers, not replacing them

Every disaster is different, but the experience of using robots during the COVID-19 pandemic presents an opportunity to finally learn three lessons documented over the past 20 years. One important lesson is that during a disaster robots do not replace people. They either perform tasks that a person could not do, or could not do safely, or they take on tasks that free up responders to handle the increased workload.

The majority of robots being used in hospitals treating COVID-19 patients have not replaced health care professionals. These robots are teleoperated, enabling the health care workers to apply their expertise and compassion to sick and isolated patients remotely.

A robot uses pulses of ultraviolet light to disinfect a hospital room in Johannesburg, South Africa. MICHELE SPATARI/AFP via Getty Images

A small number of robots are autonomous, such as the popular UVD decontamination robots and meal and prescription carts. But the reports indicate that the robots are not displacing workers. Instead, the robots are helping the existing hospital staff cope with the surge in infectious patients. The decontamination robots disinfect better and faster than human cleaners, while the carts reduce the amount of time and personal protective equipment nurses and aides must spend on ancillary tasks.

Off-the-shelf over prototypes

The second lesson is the robots used during an emergency are usually already in common use before the disaster. Technologists often rush out well-intentioned prototypes, but during an emergency, responders – health care workers and search-and-rescue teams – are too busy and stressed to learn to use something new and unfamiliar. They typically can’t absorb the unanticipated tasks and procedures, like having to frequently reboot or change batteries, that usually accompany new technology.

Fortunately, responders adopt technologies that their peers have used extensively and shown to work. For example, decontamination robots were already in daily use at many locations for preventing hospital-acquired infections. Sometimes responders also adapt existing robots. For example, agricultural drones designed for spraying pesticides in open fields are being adapted for spraying disinfectants in crowded urban cityscapes in China and India.

Workers in Kunming City, Yunnan Province, China refill a drone with disinfectant. The city is using drones to spray disinfectant in some public areas. Xinhua News Agency/Yang Zongyou via Getty Images

A third lesson follows from the second. Repurposing existing robots is generally more effective than building specialized prototypes. Building a new, specialized robot for a task takes years. Imagine trying to build a new kind of automobile from scratch. Even if such a car could be quickly designed and manufactured, only a few cars would be produced at first and they would likely lack the reliability, ease of use and safety that comes from months or years of feedback from continuous use.

Alternatively, a faster and more scalable approach is to modify existing cars or trucks. This is how robots are being configured for COVID-19 applications. For example, responders began using the thermal cameras already on bomb squad robots and drones – common in most large cities – to detect infected citizens running a high fever. While the jury is still out on whether thermal imaging is effective, the point is that existing public safety robots were rapidly repurposed for public health.

Don’t stockpile robots

The broad use of robots for COVID-19 is a strong indication that the health care system needed more robots, just like it needed more of everyday items such as personal protective equipment and ventilators. But while storing caches of hospital supplies makes sense, storing a cache of specialized robots for use in a future emergency does not.

This was the strategy of the nuclear power industry, and it failed during the Fukushima Daiichi nuclear accident. The robots stored by the Japanese Atomic Energy Agency for an emergency were outdated, and the operators were rusty or no longer employed. Instead, the Tokyo Electric Power Company lost valuable time acquiring and deploying commercial off-the-shelf bomb squad robots, which were in routine use throughout the world. While the commercial robots were not perfect for dealing with a radiological emergency, they were good enough and cheap enough for dozens of robots to be used throughout the facility.

Robots in future pandemics

Hopefully, COVID-19 will accelerate the adoption of existing robots and their adaptation to new niches, but it might also lead to new robots. Laboratory and supply chain automation is emerging as an overlooked opportunity. Automating the slow COVID-19 test processing that relies on a small set of labs and specially trained workers would eliminate some of the delays currently being experienced in many parts of the U.S.

Automation is not particularly exciting, but just like the unglamorous disinfecting robots in use now, it is a valuable application. If government and industry have finally learned the lessons from previous disasters, more mundane robots will be ready to work side by side with the health care workers on the front lines when the next pandemic arrives.


Robin R. Murphy, Raytheon Professor of Computer Science and Engineering; Vice-President, Center for Robot-Assisted Search and Rescue (nfp), Texas A&M University; Justin Adams, President of the Center for Robot-Assisted Search and Rescue/Research Fellow - The Center for Disaster Risk Policy, Florida State University, and Vignesh Babu Manjunath Gandudi, Graduate Teaching Assistant, Texas A&M University

This article is republished from The Conversation under a Creative Commons license. Read the original article.