Robot Performs Laparoscopic Surgery

Autonomous robots are driving cars and sorting recycling, but a team at Johns Hopkins University has been developing a robot for a far more delicate task: performing surgery. In January 2022, the team reported a milestone achievement. For the first time, its machine, the Smart Tissue Autonomous Robot (STAR), performed laparoscopic surgery on a live pig under human supervision.

The team has been developing the STAR system for about 10 years. In 2016, the team reported success in performing an intestinal anastomosis—a surgery that involves stitching two sections of the intestine together—on a stabilized tissue sample. Moving on to a live animal subject was a huge step forward.

“We’ve made some big advances since our last landmark study on the design in 2016,” said Axel Krieger, assistant professor of mechanical engineering. “We’re excited to keep pushing the boundaries.”

Intestinal anastomosis using laparoscopic, or keyhole, technology is a common procedure, but it is far from simple. Even with advances in laparoscopic technology, leakage still occurs in at least 10 percent of cases because a stitch is not placed accurately, forcing the patient to undergo further corrective procedures and prolonging recovery.

Krieger and his team hope to improve the accuracy of this surgery with a suturing tool designed especially for STAR. The tool drives the needle through the tissue in a circular motion, so the needle stays inside the mechanism. Krieger says this reduces the need for the robot to reorient itself each time the needle passes through the tissue.

Because STAR performed the procedure without manual guidance or human intervention, the team believes full autonomy is within reach. One day, a surgeon might simply supervise while STAR handles this common surgery on its own.

Mending with a Mind of its Own

The team has also improved its machine learning algorithms. Before STAR begins the procedure, it surveys the site with a specially designed endoscope fitted with a hybrid 3D and infrared camera, which not only provides visibility but also allows STAR to construct a 3D model of the area. The machine learning algorithm then formulates a suturing plan based on the position of the tissue and an estimate of how it will look when the procedure is complete. From there, STAR refers back to these models as it performs the surgery and corrects course as needed.

The infrared camera is also used to detect breathing motion, which allows STAR to synchronize with the body so that it does not place a stitch while the tissue is moving. “Naturally, there's a bit of a break between breathing cycles,” Krieger said. “So, with artificial intelligence, the robot is able to find the exact moment when the tissue stops moving so that it has the best chance to get the tool on target.”

Together, these advances have shown that STAR can now place a stitch with 83 percent accuracy, a marked improvement not only over previous iterations of the machine but also over human surgeons performing the same job without robotic assistance.
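The breathing-synchronization idea can be sketched in a few lines: track tissue displacement over time, estimate its velocity, and flag windows where the motion stays near zero long enough to place a stitch. This is an illustrative sketch only, not the team's actual algorithm; the function name, thresholds, and simulated breathing signal are all assumptions.

```python
import numpy as np

def find_still_windows(displacement, fs, thresh=0.2, min_still_s=0.5):
    """Return (start, end) sample indices of windows where estimated tissue
    velocity stays below `thresh` for at least `min_still_s` seconds --
    candidate moments to drive a stitch. (Hypothetical helper.)"""
    # Approximate tissue velocity from the tracked displacement signal.
    velocity = np.abs(np.gradient(displacement) * fs)
    still = velocity < thresh
    windows, start = [], None
    for i, is_still in enumerate(still):
        if is_still and start is None:
            start = i                      # motion just stopped
        elif not is_still and start is not None:
            if i - start >= min_still_s * fs:
                windows.append((start, i)) # long enough to stitch
            start = None
    if start is not None and len(still) - start >= min_still_s * fs:
        windows.append((start, len(still)))
    return windows

# Simulated breathing: sinusoidal motion with a flat pause after each exhale.
fs = 100  # sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
disp = np.where(np.sin(2 * np.pi * 0.5 * t) > 0,
                np.sin(2 * np.pi * 0.5 * t), 0.0)

windows = find_still_windows(disp, fs)
print(windows)  # one still window after each of the two simulated breaths
```

In this toy signal, the robot would wait out each breathing cycle and act only inside the returned windows, mirroring the behavior Krieger describes.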

Challenges Ahead

There are still many challenges to overcome before STAR can begin performing surgery on humans. The first is creating a failsafe mechanism so that the robot can detect when an error has occurred and redo the stitch. The team also wants to increase the speed at which STAR operates, which remains slower than a human surgeon, even if the robot is more accurate.

Krieger is eager to explore other applications for the robot, including outfitting emergency vehicles and trauma suites with STAR. He imagines a world where the robot can assist professionals by temporarily blocking arteries to prevent exsanguination on the way to the hospital. “The goal is not to replace surgeons, but to help them in complex situations,” he said. “We also hope this technology becomes a valuable resource in areas where doctors are scarce or not as well equipped to perform procedures like this.”
