Cameras for eyes, the playful gait of a Labrador, plating wrapped around its torso like a onesie—walk up to Ghost Robotics’ Vision 60 robot dog, and you might think it’s a mechanical “man’s best friend.” Then you see the assault rifle strapped to its back.
In 2021, Ghost Robotics made headlines for showcasing their sniper–rifle–armed dog at the Association of the United States Army’s annual conference, fueling endless articles about the dystopian future of war. Now, Vision 60 dogs prowl the Gaza Strip—a 2023 Israeli TV segment shows the dogs firing assault rifles wildly into the distance. They’ve since doubled as mobile deployment platforms for Israeli Rooster drones, whose manufacturer has recently marketed them as weapons capable of conducting “surgical strikes inside buildings.”
Autonomous weaponry is nothing new—plenty of existing technologies, from point–defense cannons to bomber drones, involve some level of automation. But the sheer spectacle of the Vision 60 seemed to presage a future in which wars are fought by killer robots unburdened by conscience, spraying bullets or shooting missiles of their own accord.
At the center of that future is Penn. Ghost Robotics’ co–founders, Avik De (E ’17) and Gavin Kenneally (E ’21), met when they were both Ph.D. candidates at Penn’s General Robotics, Automation, Sensing and Perception Laboratory. Ghost Robotics is one of 13 companies listed on GRASP Lab’s startup directory, alongside a defunct company producing “flying security guards” and an industrial propulsion firm building parts for both civilian and military drones. Not all GRASP spinoffs focus on defense, but the Department of Defense is one of the biggest patrons of GRASP’s work, whether by funding the basic research Penn conducts or by buying up its results down the line.
As the Pentagon marches into the age of autonomy, labs like GRASP are building the future of warfare—but no one quite knows what that future will hold.
_____________
From 2023 to 2024, Perry World House Director and Penn professor Michael Horowitz served in the Pentagon as the deputy assistant secretary of defense for force development and emerging capabilities. A key part of his work involved accelerating the Pentagon’s adoption of artificial intelligence and automation, as well as developing norms and guidelines to govern their responsible use. “I think the Pentagon has now had, like, four AI strategies in the last five years,” he reflects, addressing the sheer pace at which the technology has evolved.
Horowitz is careful to point out that the Pentagon doesn’t get to set policies entirely on its own. “There are both federal law and treaty obligations … that determine the rules that the Pentagon has for the use of force,” Horowitz says. The DOD’s job is to translate those broader commitments into specific policies that its various arms can implement in action. “If you have big obligations and responsibilities, [they] fundamentally don’t change,” Horowitz explains. “The way that those are implemented can change.”
Many of those implementation shifts have been spurred by new developments in defense technology, which have led stakeholders at every level to question whether the U.S. military is ready to meet the moment. In Horowitz’s words, “Countries or violent nonstate actors can [now] use commercial manufacturing … to generate very low–cost short and long–range strike systems.” Alarmingly, says Horowitz, “the U.S. has really been behind in developing these kinds of systems. [We] already have the best military in the world, and that can sometimes make the U.S. slow when it comes to innovation adoption.”
When people picture the U.S. defense industry, they often imagine a World War II–era scene of massive factories pumping out planes and tanks by the thousands. The modern defense ecosystem, however, is far more decentralized. At the center of it all, of course, remains the DOD and its vast reserves of cash. Some of that funding goes towards the basic scientific research conducted at higher education institutions like Penn. That basic research can come in many forms, with GRASP Lab researchers engaged in projects that range from building navigationally autonomous flying robots to creating algorithms designed to learn continuously.
Equally important to the U.S. defense community, however, are the private corporations that apply this research towards practical ends. In recent years, more and more of that practical work has moved out of traditional powerhouses, like Raytheon and Lockheed Martin, and into smaller startups that seem better poised to innovate at the necessary scale. “The traditional defense industry is not delivering at the speed and scale that they need to, people like me have been arguing,” Horowitz says. “We’re in a bad spot when it comes to [our] defense industrial base, and it is time to push the ‘I believe’ button and give some of these smaller companies the opportunity to try to scale their capabilities.”
With startups like Ghost Robotics handling the work of implementation, the job of Penn researchers is simply to get the ball rolling. “Anybody at Penn that’s building a weapon, they’re not doing it on Penn’s time,” Horowitz says.
__________
Though he may not be building a weapon himself, Penn Electrical and Systems Engineering professor Daniel Koditschek has become increasingly concerned with the ways in which his research is being applied. The basic technology that Ghost Robotics built off of came out of Koditschek’s lab, a GRASP subsidiary dedicated to research on robot locomotion. In 2021, Koditschek published an open letter that demanded Ghost Robotics remove all references to his Kod*Lab from their website and any other promotional literature. Ghost Robotics’ work, the letter claimed, “corrupts the very aims and nature of robotics research.”
Though Koditschek got his Ph.D. in 1983, he admits that the social impacts of his research only dawned on him about ten years ago. “Long before the Ghost thing blew up, I was quite concerned that I had miscalculated in my notion that I didn’t have to worry about the implications of the research,” Koditschek says. “I came to see that that was an error on my part, and I’ve been working away at that as best I know how for the last six, seven years.” Part of that work has been in spurring others to consider how new developments in robotics are poised to reshape warfare—Horowitz and Koditschek were both members of a joint PWH–GRASP initiative on the social implications of autonomous weapons systems.
These initiatives haven’t gained as much traction as Koditschek hoped. He says that a complicated nexus of problems has prevented researchers from grappling with their work’s implications—not least of which is “the huge flows of money that disincentivize interest in carefully understanding … the capabilities and limitations of [our] machines.” Despite this, Koditschek still believes defense–adjacent researchers can act morally. “We’ve been distracted from thinking ethically, but no citizens have been prevented from thinking ethically,” Koditschek says. Yet, as he acknowledges, “very few of us choose to do so, right?”
When Koditschek began his career, the thought of robots roaming the streets felt like something ripped from the pages of a science fiction novel. Now, however, the real–world applications of his and his colleagues’ research have become impossible to ignore. Robot drones have become angels of death from Ukraine to the Strait of Hormuz. The United States and China have found themselves locked in an arms race to build the automated weapons of the future. Dogs from Ghost Robotics—direct descendants of Koditschek’s work—have found new life as tools of the Israel Defense Forces’ brutal siege on the Gaza Strip. “I’m not the only one who got surprised at how suddenly things became useful,” Koditschek says.
But Koditschek’s ethical qualms haven’t prevented him from taking funds from the DOD—since 2017 alone, his research has received just over $10 million in DOD grant money. That’s hardly out of the ordinary for GRASP’s researchers. Vijay Kumar, dean of Penn’s School of Engineering and Applied Science and leader of GRASP’s Kumar Lab, received a $5.5 million DOD grant in 2015 and $2.03 million since 2017 for his research on autonomously navigating drones. GRASP faculty members like Cynthia Sung and Mark Yim have also received six–figure grants from the DOD to fund their robotics research. “There are very few robotics faculty,” Koditschek says, “who don’t have a close association with [the] DOD.”
A key draw of defense money is its relative stability amidst immense federal turmoil. When the Department of Government Efficiency embarked upon its unsuccessful mission to aggressively cut federal spending, one of the few departments that escaped its watchful eye was the DOD. In fiscal year 2025, Congress approved an $895.2 billion budget for the DOD, allowing the Pentagon to foot the bill for plenty of potentially useful research.
Koditschek explains that some robotics researchers are able to conduct their work without taking DOD money—but he isn’t one of them. His first faculty appointment came in 1983, and his work at the time was entirely underwritten by grants from the National Science Foundation. In 1996, however, he received his first DOD grant, and he continues to take Pentagon funding today. “If you add up the actual dollar amounts,” Koditschek says, “roughly 85% of my support has come from DOD over the course of my career.”
The speed and scale of robotics development over the past decade have certainly surprised Koditschek, in both good and bad ways. “I’m horrified by the evolution of drone warfare. I’m horrified by the use of robots in warfare in general,” he says. With defense technology evolving at a dizzying speed, Koditschek thinks he bears at least some responsibility to help curtail its excesses. “It is the responsibility of the scientists and engineers who develop these technologies,” he says, “to help educate our citizens concerning the opportunities and risks so incurred.” In the final instance, however, he believes that regulation is a task for the American people. “I’m still convinced that the voice of democracy is going to have to be decisive here,” Koditschek says.
The Trump administration, however, seems more interested in cutting red tape around technology than regulating it. The Judge Advocate General’s Corps, responsible for providing legal advice to service members and prosecuting courts–martial, has been ordered by Defense Secretary Pete Hegseth to “execute a ruthless, no–excuses review” and slim down its bureaucracy. The directive to clean house comes a little more than a year after Hegseth purged top lawyers in the Army, Air Force, and Navy for being “potential roadblocks” to the orders of President Trump (W ’68). The Pentagon’s Civilian Harm Mitigation and Response Plan and its associated Center for Excellence have been similarly gutted, with staffing reduced by almost 90% and the framework for reducing collateral damage scrapped by Trump administration officials. In the wake of massive cuts and persistent interference from the White House, federal regulators seem less and less capable of ensuring that the future of warfare proceeds along ethical lines.
Despite these setbacks, Koditschek holds on to hope that America will come to its senses. “This is not the nation that we hold ourselves to be,” Koditschek says. “I’m convinced that the American citizenry are going to correct that as quickly as—I don’t know how quickly, but I’m convinced that we will correct that as the democracy that we are.”
But if the government refuses to hold itself accountable, do researchers themselves have to step in to regulate their work? “Great question. That’s a great question. That’s a great question,” Koditschek says. It’s a question he doesn’t have an answer to. Recent changes to federal policy, he says, have “been a shock for a lot of us in academia … I’ve never thought about this scenario before.”
_____________
Though Ghost Robotics is best known for the offensive firepower mounted on their quadruped Vision 60, CEO Gavin Kenneally says that it’s all about defending our troops. “Ghost has always been committed to supporting our US and Allied Government customers,” Kenneally says in a written statement to Street. “[That] may mean weaponized payloads on the robots to keep our armed service members out of harm’s way.” While many GRASP startups have attempted to market their technology to both civilian and defense authorities, Ghost Robotics has pursued military applications from the start. Their first–ever contract, in fact, came from the Naval Special Warfare Command. When asked about Ghost Robotics’ ethical commitments as a company, Kenneally simply says that they are “always in compliance with government rules and regulations.” Those regulations, for the moment, prohibit weapons from firing with true autonomy. “The robot’s chain of command starts—and ends—with people,” Kenneally says.
Ghost’s embrace of weaponization is far from universal in the robotics industry. In 2022, six of the world’s top robotics firms signed a pledge not to weaponize any of their products. That pledge may soon be broken—Unitree Robotics, one of the pact’s signatories, has begun collaborating with the Chinese military to strap assault rifles to their own robot dogs. Put another way, however, five of the six firms that signed the pact have kept their promises of peace, despite enormous financial rewards for defection.
For its part, Ghost Robotics has put up its own code of conduct—one that critiques the aforementioned pact for its nebulous language and lofty ideals. Ghost Robotics’ “Our Philosophy” page reads, “even with the best intentions, questions justifiably arise as to the efficacy of this ambiguous pledge.” Instead, they outsource ethics to the experts—the military bureaucrats and political actors who can be held directly accountable to the American people. The same page states that “in a democracy such as ours, the role of policy is so critical that it should be determined by the elected government and not by individual companies.”
___________________
While defense applications of robotics may garner the most media attention, plenty of GRASP startups also build civilian technology. GRASP spinoff Exyn Technologies develops hardware and software for autonomous drones, drawing on breakthroughs in drone research made by Kumar and his team. Exyn Chief Operating Officer and Chief Product Officer Ben Williams explains that most of the company’s work is focused on geospatial surveys, creating digital reconstructions of the physical world. Their technology has been adopted by firms operating in a variety of industries, from mining to oil and construction. Their commitment to civilian technology was evident from the words displayed front and center on their website’s “Policies” page. “We are committed,” the page stated plainly, “to ensuring the technology we develop does not put lives at risk. We do not support the weaponization of our autonomous systems, and we oppose fully autonomous lethal weapons systems.”
As of March 29, that page is no longer accessible.
In 2025, Exyn began exploring applications of its technology in a new sector—defense. Exyn’s “Company” page states that Range, their new wholly owned subsidiary, was created to deliver “mission-ready autonomy and modular robotic capabilities for defense and other high-consequence environments.” Williams explains that the subsidiary’s focus is “not defense specifically. It’s intended for working with government customers.” Despite this, a blurb at the top of Range’s homepage says that they are “structured explicitly to serve the defense community.” Range’s “About Us” page says the subsidiary is “focused exclusively on defense” and works with “U.S. and NATO-aligned integrators to advance the future of robotic warfare.” Nearly every page on the website is adorned with images of armed soldiers or desolate environments, emphasizing both the ruggedness and the intended use cases of Range’s technologies. Williams says that despite Exyn’s recent expansion, their policy on weaponization remains functionally unchanged. “It’s not something that we’re pursuing,” he affirms.
Nevertheless, Exyn’s shift from civilian to defense applications is hard to ignore—and it’s hardly unique within the robotics industry. From Unitree to Saildrone, company after company founded with noble intentions has shifted their focus to lucrative government use cases. With the amount of funding the federal government is willing to throw at the words “autonomy” or “drone,” the policy gradient for robotics firms points in only one direction.
Koditschek made headlines for publicly breaking from Ghost Robotics in 2021, with his letter calling their robots nothing less than “weapons of mass destruction.” Today, however, his view of the company is more sympathetic. “I’m quite comfortable with the Ghost Robotics position,” Koditschek says. “They’re no more going to be able to direct the use of those machines in the military laboratories than the car companies can direct the use of those machines when they get turned over to the military labs.”
As he sees it, the final choice on weaponization lies outside the scope of any one researcher—or any one company. “I am not going to render judgment on arms manufacturers,” Koditschek says, “any more than I’m going to render judgment on the people fighting for their liberty in Ukraine.”
______________
Others have not been so sympathetic. At the height of Penn’s Gaza Solidarity Encampment in 2024, protestors hung a massive banner between two trees on College Green that bore a single message for the University’s researchers: “UPENN, STOP ENGINEERING GENOCIDE.”
Though Penn conducts no classified research on campus, it still remains very much invested in the private sector applications of its discoveries. “What you’ll often see,” Williams explains, “is a combination of Penn taking equity in the startup and licensing technology.” Williams says that Penn still holds an equity stake in Exyn, though he declined to disclose the specific amount. Kumar, while not involved in the company’s decision making, also holds stock in the corporation. Though researchers at Penn might not build weapons themselves, they and the University still receive the dividends.
Eric—a Penn student who requested anonymity out of fear of disciplinary action—was involved in the encampment and remains a member of both Penn Students Against the Occupation and Philadelphia Students for Justice in Palestine. When it comes to the applications of robotics research, Eric says that scientists knew exactly what they were getting themselves into. “Basic research was the pact that science made with the U.S. government,” Eric says. “Science shook the government’s hand and said, ‘You give us funding to let us research the things we find interesting, and in exchange, our findings will be useful to you, eventually, to further U.S. security interests, U.S. hegemony, U.S. destruction abroad.’”
Horowitz, however, explains that the question of defense research isn’t so simple. In fact, Penn’s relationship with the DOD comes with key benefits for the undergraduate student body. “For every student that might be concerned about the relationship that the University has with the defense world,” he says, “there’s probably a student who’s getting summer money, working on a Defense–funded project at a lab, or a student who’s writing a senior thesis based on research that’s being done for a federal grant. Students actually benefit enormously—and directly—from the resources that the University gets from the government.”
Eric isn’t quite swayed by that argument. The University, he thinks, maintains an affirmative responsibility to serve the people independent of its own material interests. “Penn has a [motto]—it says, ‘Laws without morals are useless,’” he says. “Similarly, money without morals is useless; power without morals is useless.” That same responsibility of care extends to science as a whole. “The promise of science was that science could be a net positive. If it’s not doing that, why are you doing it?” Eric asks. “Yes, knowledge for knowledge’s sake—but at a certain point, people’s lives come first.” The basic research Penn conducts has a clear impact on the world—and for that, he thinks, it must be held accountable.
_____________
Eric is hardly the first to fight this battle. In 1969, a group of students opposing war research and gentrification staged a sit–in at College Hall, calling for Penn to end all classified research at the Penn–affiliated University City Science Center. Unlike the encampment, this protest was largely successful—in the end, the board of the UCSC agreed not to conduct research aimed at the “destruction or incapacitation of human life.”
The decentralized defense technology landscape of today, however, is far more difficult for activists to oppose. That dispersal also makes it easier for those involved to turn a blind eye to the real–world consequences of their research. Engineers feel unequipped to regulate their own work; startups, meanwhile, are content with outsourcing the hard work of ethics to the invisible hand of Democracy. Democracy, however, is not in the room with us—the institutions meant to ensure defense technology is deployed ethically are increasingly under siege, leaving them incapable of meeting the present moment. Everyone agrees that ethical guidelines are important—but no one wants to make them.
When it comes to defense development, the genie can hardly be put back in the bottle. Because neither the United States nor China has yet deployed its most bleeding–edge weaponized robots, the real potential of today’s technology has yet to be unleashed. “We’re putting stuff out into the wild that we really don’t understand very well. It’s much more powerful than we understand it,” Koditschek says. “It’s not as powerful as it will become.”
Koditschek holds no illusions about the destructive power of his technology, but he hopes that others will be able to solve the ethical problems that he and his colleagues still haven’t.
“You, and your generation, and generations after, should hold me accountable.”



