Buzzing like a mammoth insect, the four-prop drone lifts off the ground, rising straight up. Once it reaches about four feet, it hovers for a bit, then slowly settles back to the ground.
It doesn’t seem like much of a flight until you understand how it happened.
No handheld controls, no tether, no nothing.
Just brain waves.
Tony Ferguson sits about 12 feet away at a laptop in what has been dubbed the Unmanned Systems Lab. On top of his head is a lattice of white plastic dotted with blue electrodes, each about the size of a thumb. The electrodes, which register brainwave activity, can be tightened to hold the entire device firmly in place.
Unfortunately, at the skull-touching end, the electrodes have a circle of tiny points—little needles, basically—and they tend to dig into the flesh. Not enough to draw blood, but enough to make anyone wearing the headgear want to take it off as quickly as possible.
“These things are pretty uncomfortable; they stick and they aren’t fun. About an hour is all I can take,” Ferguson says, several small, reddish circles indented in the skin of his forehead just short of the hairline.
But the pain is worth it, he says, because his goal is to develop technology that will allow him to control all aspects of the drone simply through thought. Want it to move right? Think about it. Want it to take a photo with its camera? Imagine it.
“Visualization in the brain would control the drone,” says Ferguson, an electrical engineering major.
Future uses
Dr. Zach Ruble, a researcher in the Departments of Engineering and Electrical Engineering and director of the drone project, pictures a future in which one person can control multiple drones by thought. A separate research project that uses infrared light to control a swarm of drones—led by electrical engineering student Artem Malashiy—is taking place alongside the project on brain-wave control.
“Ultimately, we want to detect patterns in the brainwaves such that we can give abstract commands to multiple drones,” Ruble says. “So instead of giving a basic command such as ‘take off’ or ‘land,’ we can send out an abstract command such as ‘search and rescue.’
“After receiving the command, the swarm of drones will determine what needs to be done in order to carry out that command and execute the mission autonomously.”
A smaller, more comfortable headset should be possible as the technology develops, Ferguson says.
“It could have military applications, search-and-rescue applications, etc. Obviously, prosthetics has a huge, huge application here,” says Ferguson, a former Marine who served under fire in Iraq’s war-torn Fallujah.
In the current setup, his brainwaves are fed into his laptop, which sends a Wi-Fi signal to a second computer that, in turn, sends a wireless signal to the drone. The second computer, manned by Ed Steele, a graduate student in electrical engineering, contains the software that actually controls the drone. Steele developed the software and, without his work, the project would not be possible, Ferguson says.
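The article does not describe the software itself, but the relay chain it outlines, laptop to control computer to drone over Wi-Fi, could look roughly like the sketch below. This is a minimal illustration assuming simple UDP messages; the addresses, ports, and command strings are hypothetical placeholders, not details from the project.

import socket

# Hypothetical addresses for illustration only.
CONTROL_PC = ("192.168.1.20", 9000)   # second computer running the drone-control software
DRONE_ADDR = ("192.168.1.30", 8889)   # drone's wireless command port

def send_decoded_command(command: str) -> None:
    """Runs on the laptop: forward a command decoded from the EEG headset
    to the control computer over Wi-Fi."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), CONTROL_PC)

def control_computer_loop() -> None:
    """Runs on the second computer: receive commands from the laptop and
    pass them along to the drone over its own wireless link."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as recv_sock, \
         socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as send_sock:
        recv_sock.bind(("0.0.0.0", 9000))
        while True:
            data, _addr = recv_sock.recvfrom(1024)
            send_sock.sendto(data, DRONE_ADDR)   # e.g. b"takeoff" or b"land"

if __name__ == "__main__":
    send_decoded_command("takeoff")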
“He did all the groundwork, all the legwork, to fly the drone,” he says.
Both spent several months just getting the drone to move up and down, and both say the work took longer than they wanted, but now they want more. Much more.
Following footsteps
Using brainwaves to control robotic limbs is nothing new, but those systems rely on implants in the brain that detect the firing of nerves and send electric impulses to microprocessors in the limbs. What Ferguson and other members of the project team have planned is more difficult to create but will ultimately make things easier.
“I’m following in the footsteps of stuff that’s gone before, but the thought avenue is what I’m focusing on, to be able to control things just by thinking,” says Ferguson, who adds that, until he joined the drone project in January, he “had no idea” that this type of technology was even available, much less usable.
He’s in the process of creating a wide “vocabulary” for drone control and says he hopes to have that done by the end of December. Creating a wider command list is another step to increase the drone’s capabilities, Ruble says.
“For example, if we can detect a pattern in the EEG signals that differentiates a person moving his left arm versus moving his right arm, we can generate two different commands based off which arm the person is moving.”
Ferguson says he’s first working on gesture control, using “the brainwave response for this,” he explains, moving his right arm up and down, “to fly the drone left or right. Then I will move on to purely thought.”
“I’ve done it a little bit; it’s kind of hard to do.”
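To make the idea concrete, a command “vocabulary” of the kind Ruble and Ferguson describe might be organized along the lines sketched below. The feature extraction, channel layout, and thresholds are stand-ins chosen for illustration, not the team’s actual signal processing.

import numpy as np

# Hypothetical vocabulary: maps a detected EEG pattern label to a drone command.
COMMAND_VOCABULARY = {
    "left_arm_motion":  "move_left",
    "right_arm_motion": "move_right",
    "rest":             "hover",
}

def band_power(eeg_window: np.ndarray) -> np.ndarray:
    """Toy feature: mean squared amplitude per channel over a short window.
    Real EEG pipelines typically use band-pass filtering and spectral features."""
    return (eeg_window ** 2).mean(axis=1)

def classify(features: np.ndarray) -> str:
    """Stand-in classifier comparing power on a left- vs right-hemisphere channel.
    A real system would train a classifier on labeled recordings."""
    left, right = features[0], features[1]
    if left > 1.5 * right:
        return "right_arm_motion"   # motor activity is contralateral to the moving arm
    if right > 1.5 * left:
        return "left_arm_motion"
    return "rest"

def eeg_to_command(eeg_window: np.ndarray) -> str:
    return COMMAND_VOCABULARY[classify(band_power(eeg_window))]

if __name__ == "__main__":
    # Two channels (left/right motor cortex) by 256 samples of fake data.
    fake_window = np.random.randn(2, 256)
    print(eeg_to_command(fake_window))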
David Avery
This is absolutely amazing!!! I’m a nursing student at UTC and was always on the fence between nursing and engineering; after reading this I might have to switch majors haha! I build and race drones as a hobby, and I was wondering if it would be possible to come and see this drone in action?
Anton Nijholt
Races for BCI-controlled drones were already held at the University of Florida (Gainesville) in 2016. See e.g., https://techcrunch.com/2016/04/25/university-of-florida-held-the-worlds-first-brain-controlled-drone-race/
Ryankendall
This is great research, further proving the ability of the brain.