Photo: Petersen and her students test the Martha robot in the Computer Systems Lab space in Rhodes Hall. Martha is an open-source platform designed for human-robot interaction studies. Video below.
How robotics is driving new research into foundational aspects of ECE
What is a robot? For Cornell’s School of Electrical and Computer Engineering, it’s the intersection of a broad range of research and a useful vehicle to explore core aspects of engineering education. In the past several years, Cornell ECE has been expanding its robotics faculty and building a program that positions robotics as a central pillar of teaching and research within the department.
Robots are physical systems that can perceive, reason about, and act upon their environment. They use sensors to monitor both internal signals and external surroundings. They can be programmed to make real-time computations and decisions about their tasks, and they take action to change their environment. Perception, computational intelligence, and action are foundational principles in ECE.
A robot is an intersectional device. From the design of its circuits and chips to the actuators and algorithms that allow it to move and think, a robot represents research from many disparate fields working together in increasingly novel ways to create something greater than electrical, mechanical, computer, or systems engineering could produce on its own. All of those fields intersect in ECE.
Robotics research in Cornell ECE is taking inspiration from surprising sources and asking interesting questions. The core elements of a robot (sensing, planning and decision making, and control and action) are all research areas within electrical and computer engineering. Robotics provides a unique platform for science education because so many disciplines converge to make a robot work.
In short, robots could not exist without ECE, a field that excels at bridging novel hardware and computational architecture to create intelligent physical systems.
Biological Inspiration
Assistant Professor Kirstin Petersen is interested in bio-inspired robot collectives and studies of their natural counterparts, especially in relation to construction, exploration, and agriculture. She founded the Collective Embodied Intelligence Lab in Cornell ECE in 2016 to research the design and coordination of large robot collectives that achieve complex behaviors beyond the reach of single-robot systems.
“We're looking at alternative types of intelligence,” Petersen said. “What kind of intelligence can we program into the body of the robot? What kind of intelligence comes from many robots working together?”
Petersen points out that social organisms demonstrate an implicit intelligence, incorporated into their morphology, into the physical interactions between organisms and into the way they modify their shared environments. They seem to use the environment itself as a shared database. Can social robots learn from this organic intelligence?
“There are a lot of things that many robots can do in collaboration that single robots can’t, even if the single robots individually are much, much smarter,” Petersen said. When many robots are working together embodied in the same environment, their physical interactions and their morphology mean something. “We’re looking at how we can bring intelligence out in those kinds of systems,” she said.
Once you get to big enough swarms, Petersen explained, it starts making sense to think about distributed intelligence instead of centralized controllers telling each robot what to do. “There are many examples in nature where we see that you can have really interesting and robust outcomes from systems like that,” she said.
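To make that concrete, here is a minimal sketch of one classic decentralized pattern: each robot repeatedly averages its estimate with those of its neighbors, and the group settles on a shared value with no central controller. The network topology and sensor readings below are hypothetical, chosen only for illustration; this is not code from Petersen's lab.

```python
# Minimal decentralized-consensus sketch: every robot repeatedly averages
# its own estimate with its neighbors'. No robot is in charge, yet the
# swarm converges to a shared value. Topology and readings are hypothetical.
neighbors = {
    0: [1, 2],
    1: [0, 2],
    2: [0, 1, 3],
    3: [2],
}
estimates = {0: 4.0, 1: 8.0, 2: 1.0, 3: 7.0}  # e.g. local nectar-density readings

for _ in range(50):  # synchronous rounds of purely local communication
    estimates = {
        robot: (value + sum(estimates[n] for n in neighbors[robot]))
               / (1 + len(neighbors[robot]))
        for robot, value in estimates.items()
    }

print(estimates)  # all four robots now hold (nearly) the same value
```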
The Collective Embodied Intelligence Lab represents the core of robotics research in Cornell ECE. It embraces the multidisciplinary and intersectional nature of ECE to take inspiration from biology, systems engineering and artificial intelligence to research how swarms of devices can work better together, or how to distribute intelligence throughout the body of a robot to make it perform better.
“We work a lot with biologists to develop new instruments that help their research,” Petersen said, “because that's where we get our inspiration from. Beyond that, we're also looking at how we can have robots and insects work together to do more interesting things.”
In some cases, it doesn't make sense to create complex robots. For example, if you’re trying to monitor an agricultural field, creating a robot to navigate that field can be tricky and require many different kinds of perception, which could be prohibitively expensive. But pollinators like bees already navigate fields very well. So, if pollinators could be outfitted with simple sensors, one could leverage their existing abilities and still get data without having to design a flight system.
“Following the form-and-function idea from biology, we develop custom dedicated hardware for particular tasks,” Petersen said. “This allows the individual robots to be much simpler, less expensive, and more robust. However, because we have flexibility in their distributed coordination mechanisms, they are still capable of adapting to different perturbations.”
Biological inspiration has led Petersen’s team not just to swarms of insect-like robots, but also to unconventional designs known as soft robots, which can integrate sensors throughout their forms instead of having discrete hard sensors on a rigid body. Because of their simplicity and inherent robustness, such robots are particularly well suited to operating in large collectives.
“The simple way of thinking of them is like a fancy balloon,” Petersen explained. “You might have a polymer encasing which is pneumatically driven; air is used to change the morphology significantly upon inflation, whereas the elastic energy stored up in the polymer helps reverse that motion upon deflation.”
Alexandra Nilles, a postdoctoral researcher in Petersen’s group, is working with students on soft robot collectives that inflate and deflate. The project involves new materials that are more compliant when they interact with the environment, materials not typically found in robotic systems.
“Usually we've used metal, plastic, hard materials that are not flexible,” Nilles said. “This new wave of soft materials is really exciting, because we can make new kinds of robots that people have never imagined before.”
Envisioning a New Platform
The newest member of the ECE faculty, Assistant Professor Elizabeth Farrell Helbling, looks at robots from a systems perspective. “I’m really focused on integrating all of the electronics and power systems on board to give a robot some level of autonomy,” she said.
As a researcher, Helbling asks: what can engineers do to improve on the sensing and locomotion methods we see in nature? Which tasks are better performed by flying or crawling? Would wheels work better than legs? “You need to think about the mechanisms,” she said. “You need to think about the actuators, the control systems, the sensors, planning and estimation models, power sources, the list goes on.” Understanding how biological systems integrate these elements is a useful method for exploring the capabilities of a robotic system.
Helbling’s postdoctoral work focused on the systems-level design of the Harvard RoboBee, an insect-scale flapping-wing robot, and HAMR, a bio-inspired crawling robot. “I work on insect-scale robots with micro-scale features,” Helbling said. “It's a hard problem to solve because of the size, weight and power constraints that exist in micro-scale platforms.” The RoboBees Helbling worked on had a wingspan of about three centimeters and weighed about 80 milligrams each. It takes about 30 of them to weigh as much as a penny.
Helbling is now envisioning a robotics platform that can serve as the basis for creating many new types of very small robots. She said that much of the recent growth in robotics has been made possible by the availability of stable, off-the-shelf platforms that researchers can modify, adding actuators, sensors, or other pre-packaged systems to suit their needs. But such off-the-shelf solutions are not really available for insect-scale robots.
“We have all the conventional scale manufacturing techniques, and we have all the nanoscale manufacturing techniques,” Helbling said, “but for everything in this millimeter to centimeter range, the equipment is not there. If I can create a robust autonomous physical platform at this size scale, then hopefully it would open up robotic exploration to a lot more researchers.”
Part of Nilles’ Ph.D. research explored automating the robot design process, including how to describe and formalize the trade-offs among the multitude of design decisions. “The goal is to eventually have some kind of automated assistant to help people design robots and to make the robots they design more robust,” she said, “but it’s a long way out.”
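One simple way such an assistant could formalize trade-offs is to score each candidate design on competing axes, such as cost and fragility, and keep only the Pareto-optimal set: designs that no alternative beats on every axis at once. The sketch below uses hypothetical designs and scores to illustrate the idea; it is not Nilles’ actual method.

```python
# Toy design-space trade-off: keep only Pareto-optimal candidates, i.e. those
# not dominated (worse or equal on both axes, strictly worse on one) by any
# other design. All designs and scores here are hypothetical.
designs = {
    "wheels+camera": {"cost": 120, "fragility": 0.4},
    "legs+lidar":    {"cost": 900, "fragility": 0.2},
    "wheels+bumper": {"cost": 60,  "fragility": 0.7},
    "legs+camera":   {"cost": 500, "fragility": 0.5},
}

def pareto_front(options):
    front = []
    for name, a in options.items():
        dominated = any(
            b["cost"] <= a["cost"] and b["fragility"] <= a["fragility"]
            and (b["cost"] < a["cost"] or b["fragility"] < a["fragility"])
            for other, b in options.items() if other != name
        )
        if not dominated:
            front.append(name)
    return front

print(pareto_front(designs))  # ['wheels+camera', 'legs+lidar', 'wheels+bumper']
```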
Deep Learning
Professor Daniel Lee is part of the AI research group at Cornell Tech. His research in robotics deals with the way intelligent machines perceive, make decisions, and execute their plans in an uncertain, dynamic world, from both practical and theoretical points of view. Artificial intelligence and machine learning have a major impact on how robots are programmed: is it possible to apply an algorithm and let the robot figure things out on its own?
The problem is the huge variability in robot sensor data. Lee has found that this sensor data has an underlying geometrical structure and wants to design deep learning algorithms that can more efficiently deal with this structure to achieve faster learning or provide some sort of performance guarantees for robotic systems.
“From a geometrical point of view,” Lee said, “you can think about your sensor input as a large high-dimensional vector. Every time something changes in the environment, this vector shifts its position in the vector space.” We can measure the robustness of a machine-learning system by looking at how well it handles these shifts.
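A toy version of that picture: treat one sensor reading as a vector, nudge it to simulate a change in the environment, and measure how far a model’s output moves in response. The linear model and perturbation scale below are illustrative stand-ins, not code from Lee’s group.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 1000))       # hypothetical stand-in for a learned model

def predict(x):
    return W @ x                      # maps a sensor vector to a decision vector

x = rng.normal(size=1000)             # one sensor reading as a 1000-d vector
shift = 0.01 * rng.normal(size=1000)  # small change in the environment

# Robustness proxy: how far the output moves per unit of input movement.
sensitivity = np.linalg.norm(predict(x + shift) - predict(x)) / np.linalg.norm(shift)
print(f"output change per unit input change: {sensitivity:.2f}")
```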
Before deep learning, even something seemingly as simple as detecting an object was not straightforward. “Machine learning and computer hardware developments really changed the landscape of what information robots can process,” said Nilles. “Ten years ago, a robot with a camera would not be able to determine ‘this is an apple’ very reliably, but now computer vision systems are much more robust.”
The next innovations will allow products and devices to better understand the world around them, and that’s critical for robots. “How does the robot understand the world, what are the actions it needs to take and how does it respond properly?” asks Lee. “This is one of the fundamental problems that my research addresses.”
Lee brings industry experience and perspective to his role in Cornell ECE. He works part-time as an executive vice president overseeing global AI research for Samsung Research, the R&D arm of Samsung Electronics. His work for Samsung Research is helping to develop future products including mobile phones, TVs and smart appliances, while also informing his teaching, keeping his students aware of the current needs in high-tech industries.
In his course at Cornell Tech, Intelligent Autonomous Systems, Lee and his students explore algorithms for robotic perception, planning and control with a focus on real-time adaptation and learning. “My approach in robotics is to think about how humans solve the problem,” Lee said, “and then try to build a machine that can emulate what humans do.”
Robotic Construction
Assistant Professor Nils Napp is also interested in taking inspiration from biological processes for robotics. His research looks at the way insects and animals manage to build structures in unstructured and fluctuating environments and applies these principles to robotic construction.
“What animals are able to do in building things is just mind boggling,” Napp said. “I'm interested in the reasoning capabilities, because animals never took mechanics, but they can achieve construction behavior that would be hard for humans to match.”
Napp works to understand how evolved biological systems operate reliably amid varied terrain and random disturbances, in order to design control strategies with similar robustness. A robot often fails at a task because of something subtle or unanticipated, like a pebble in its path or a doorway that’s slightly smaller than every other door. Humans and animals generally do not let slight variances in the environment derail them from their tasks.
Nilles describes two important algorithmic goals in robotics. “One is efficiency, being fast,” she said, “and the other is robustness, so that it doesn't break immediately when one small thing about the environment changes. Balancing those two characteristics in the algorithms is really difficult.”
Construction is about changing the environment toward a predictable outcome. That requires the robot to understand the state of the environment both before and after construction. Can researchers create representations or models of the external world in a way that guarantees the robot can complete its task successfully?
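One minimal way to frame the problem: represent the environment as a set of filled grid cells, the goal structure as another set, and a plan as an ordered sequence of placements that respects a simple support rule. The rule and grid below are simplified assumptions for illustration, not Napp’s actual model.

```python
# Construction as a state-change problem: the world and the goal are sets of
# filled (x, y) cells; a plan is a placement order in which every block rests
# on the ground or on an already-placed block. A simplified, hypothetical rule.
def plan_placements(current: set, goal: set) -> list:
    """Return an ordered list of (x, y) cells to fill, respecting support."""
    plan, world = [], set(current)
    remaining = set(goal) - world
    while remaining:
        # a cell is placeable if it sits on the ground or on a filled cell
        placeable = [c for c in remaining if c[1] == 0 or (c[0], c[1] - 1) in world]
        if not placeable:
            raise ValueError("goal unreachable under the support rule")
        cell = min(placeable)  # deterministic choice, just for the sketch
        plan.append(cell)
        world.add(cell)
        remaining.remove(cell)
    return plan

print(plan_placements(current=set(), goal={(0, 0), (0, 1), (1, 0)}))
# [(0, 0), (0, 1), (1, 0)] -- built bottom-up so every block is supported
```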
“I think what's going to make the biggest impact,” Napp said, “is better reasoning and better guarantees, being able to deal with conflicting information. The main driver of progress in making robots more capable is algorithmic.”
Creating the physical, mechanical, and sensory parts of a robot is achievable with off-the-shelf components, but developing its ability to plan, perceive, and adapt may be the greater challenge.
“I would eventually like to work on construction robots on a bigger platform capable of building structures on a human scale,” Napp said. “It’s structured in interesting ways because some parts are very predictable. Building materials can be standardized; a prefabricated house is designed to fit together a certain way. Constructing things with robots is a really interesting application domain.”
Robotics is transforming how we live and work, and it has even greater potential to provide enhanced levels of service, boost efficiency, and promote workplace safety. It is becoming the driving technology behind a new generation of autonomous devices and machines that learn, interact seamlessly with the world around them, and provide the missing link between the digital and physical worlds.
_____
Visit the Collective Embodied Intelligence Lab online at cei.ece.cornell.edu.