Connecting Unmanned Vehicles to the Cloud

By Michael August, Darren Powell, Chris Raney, and Charles Yetman


A team of scientists and engineers at Space and Naval Warfare Systems Center (SSC) Pacific in San Diego is developing unmanned vehicle control systems that could revolutionize warfighters’ battlespace awareness—an important asset in the Navy’s pursuit of information dominance.


“UxV to the Cloud via Widgets” is a science and technology research effort established to demonstrate distributed control of unmanned systems. UxV stands for any one of the four categories of unmanned vehicle: ground, air, surface, or undersea. Humans operate these vehicles from remote locations. Widgets are user-configured Web applications that provide a limited view into a larger application, similar to windows within a Web browser that display interactive Web applications. The cloud is the collective computing power of remotely accessed networked servers and computers—very much like the networks that you access on your cell phone or laptop.

Synthesizing these three distinct technologies presents a novel approach to unmanned vehicle control.

Currently, unmanned vehicles are commanded by dedicated control systems with proprietary hardware, and system software components must be custom-built for each platform. The UxV project challenges that practice by allowing an operator to control multiple vehicles simultaneously within a Web browser.

This project began in 2012 under the direction of SSC Pacific Executive Director Carmela Keeney, with funding from the Naval Innovative Science & Engineering (NISE) program. The NISE program was established in 2009 in legislation passed through the National Defense Authorization Act to fund efforts in basic and applied research, technology transition, workforce development, and capital equipment investment.

Cloud, widget, and unmanned vehicles team members joined forces to design and develop a prototype system using open-source components. The system has a realistic unmanned surface vehicle (USV) simulator, a software interface for controlling the vehicles in the simulator, widgets that provide human operators with a graphical user interface for controlling the vehicles, and a data cloud for storing all of the data received from the vehicles.

The team developed the widgets using the Ozone Widget Framework, an open-source Web application originally developed by the National Security Agency. The cloud implementation is based on Apache Accumulo, an open-source data cloud software bundle with security features. Government employees at SSC Pacific developed all software components of the system.


Multiple personnel can use the system to control an unmanned vehicle and record data in the cloud, and individuals not in control of the vehicle can view the unmanned vehicles’ observations. For example, an operator on ship A uses widgets on a control dashboard to send commands to the unmanned vehicle. An operator on ship B can request control of the vehicle from the operator on ship A. If the operator on ship A agrees with the request, then control is passed to the operator on ship B. As the vehicle is in transit, its sensor data and camera feeds are ingested into the cloud in near real time. An analyst on shore can monitor the archived historical data as well as the live data stream.
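The request-and-approve handoff described above can be sketched as a small arbiter that tracks which operator owns a vehicle. This is a minimal illustration; the class and method names are assumptions for this sketch, not the project’s actual software interface.

```python
# Hypothetical sketch of the control-handoff flow: ship B requests control
# from ship A, and control passes only if ship A approves.
class ControlArbiter:
    """Tracks which operator currently commands a vehicle and mediates
    requests to transfer that control."""

    def __init__(self, vehicle_id, owner):
        self.vehicle_id = vehicle_id
        self.owner = owner        # e.g., "ship_A"
        self.pending = None       # operator awaiting approval, if any

    def request_control(self, requester):
        # An operator elsewhere asks the current owner for control.
        self.pending = requester

    def grant(self):
        # The current owner approves; control passes to the requester.
        if self.pending is None:
            raise RuntimeError("no pending control request")
        self.owner, self.pending = self.pending, None
        return self.owner

    def deny(self):
        # The current owner refuses; ownership is unchanged.
        self.pending = None
        return self.owner


arbiter = ControlArbiter("usv_01", owner="ship_A")
arbiter.request_control("ship_B")
arbiter.grant()                     # ship A approves; ship B now controls
assert arbiter.owner == "ship_B"
```

The key property is that control changes hands only through an explicit grant by the current owner, matching the ship A/ship B exchange above.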

Connecting a data cloud to the system archives the incoming data for sharing among operators and analysts: control of a vehicle can pass from one operator to another, and the vehicle’s historical inputs remain accessible through the cloud.

Software and a set of widgets must reside both ashore and on each ship, and each ship can control a USV directly. The control widgets send commands directly to the USV without going through the cloud, as the time lag of the cloud is too long to perform real-time operations through it. As the operator on board the ship is controlling the USV, its position information, sensor data, and camera imagery are pushed into the cloud stack. These data points in the ship-based cloud stack are shared with other ships and the shore. From these other locations, analysts can open up analysis widgets and inspect the imagery taken from the USV as well as track the USV’s position over time. Operators and analysts on board other ships also can launch control and analysis widgets to access this information.

An operator managing the control dashboard within his or her Web browser can see the live feed from the USV’s camera. In addition to viewing the feed from the front-facing camera, the operator can see the video feeds from the rear- and side-facing cameras, known as the “quad view,” all within a single window. A recent addition to the control widget is a 3-D “can view,” which usability studies suggest is more intuitive than the 2-D quad view.

The operator can use a gamepad controller plugged into the computer to control the vehicle, a modification informed by usability studies at the Office of Naval Research. A unified map widget also combines the tactical map and the analysis map into one widget. This map has multiple zoom levels and different layers that operators can toggle on or off, including digital nautical charts for navigation. In vector mode, the operator controls the vehicle using the gamepad. In waypoint mode, the operator controls the vehicle by setting waypoints on the map. The operator can click and drag these waypoints around the map, redirecting the vehicle while it is in transit.
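The waypoint-mode behavior described above amounts to editing an ordered route in place while the vehicle is following it. The sketch below illustrates that idea; the class and method names are assumptions for this example, not the project’s actual interface.

```python
# Illustrative sketch of waypoint mode: the USV follows an ordered list of
# (lat, lon) waypoints, and dragging one on the map updates the route in
# place, so the vehicle is redirected mid-transit.
class WaypointRoute:
    def __init__(self, waypoints):
        self.waypoints = list(waypoints)

    def drag_waypoint(self, index, new_position):
        # Called when the operator clicks and drags a waypoint on the map.
        self.waypoints[index] = new_position

    def next_waypoint(self, reached_count):
        # The vehicle steers toward the first waypoint it has not reached.
        return self.waypoints[reached_count]


route = WaypointRoute([(32.71, -117.17), (32.72, -117.19), (32.74, -117.20)])
route.drag_waypoint(1, (32.73, -117.18))   # operator redirects the route
assert route.next_waypoint(1) == (32.73, -117.18)
```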


A single operator can control multiple vehicles displayed on the map, with a circle annotating the vehicle currently under control. Many digital nautical chart features and multiple layers can be toggled on or off. Three types of data—historical, near-real-time, and live—can be displayed on the map simultaneously. The historical data is retrieved from the cloud and displayed alongside the live data on the map. When the mouse hovers over the vehicle track, a thumbnail of the vehicle’s camera feed pops up. The waypoints representing the autonomous route for the USV are displayed on the map as lines.

A controlled vehicle simultaneously places data into the cloud. The data consists of geospatial location and various sensor readings, as well as video feeds from the cameras located on the vehicle. This data is ingested into the cloud and indexed for quick retrieval. As the number of data files ingested into the cloud increases, the size of the cloud grows to accommodate the larger data set. With the appropriate permissions, an analyst in a remote location with network connectivity can access the analysis dashboard. This dashboard consists of various widgets for investigating previous positions of USVs within the analyst’s area of responsibility. These include the unified map, data viewer, image viewer, and video widgets. The analyst can click on a point in the map for a particular USV, and the other widgets automatically display the data associated with that point.
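Indexing ingested telemetry for quick retrieval, as described above, typically means choosing row keys so that related records sort together in the underlying store. The sketch below mimics that pattern with an in-memory sorted index keyed by vehicle and timestamp; the key layout is an assumption for illustration, not the project’s actual Accumulo schema.

```python
# Minimal sketch of telemetry keyed for fast time-range retrieval in a
# sorted key-value store (Accumulo-style). Rows sort by
# (vehicle_id, timestamp), so a time-range query is a contiguous scan.
from bisect import insort, bisect_left, bisect_right

class TelemetryIndex:
    def __init__(self):
        self.keys = []     # sorted (vehicle_id, timestamp) keys
        self.records = {}  # key -> telemetry record

    def ingest(self, vehicle_id, timestamp, record):
        key = (vehicle_id, timestamp)
        insort(self.keys, key)
        self.records[key] = record

    def scan(self, vehicle_id, t_start, t_end):
        # All records for one vehicle within [t_start, t_end].
        lo = bisect_left(self.keys, (vehicle_id, t_start))
        hi = bisect_right(self.keys, (vehicle_id, t_end))
        return [self.records[k] for k in self.keys[lo:hi]]


idx = TelemetryIndex()
idx.ingest("usv_01", 100, {"lat": 32.71, "lon": -117.17})
idx.ingest("usv_01", 200, {"lat": 32.72, "lon": -117.18})
idx.ingest("usv_02", 150, {"lat": 32.60, "lon": -117.10})
assert len(idx.scan("usv_01", 0, 300)) == 2
```

Because the keys are sorted, retrieving a vehicle’s track over a time window never requires touching other vehicles’ data, which is what makes the analyst’s historical queries fast as the archive grows.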

The data viewer widget shows the coordinates and heading of the vehicle, as well as other relevant data. The image viewer widget displays the image taken from the forward-looking camera on the USV at that point in time. The video widget plays the full-motion video captured by the forward-looking camera starting at the point in time at which the analyst clicked on the track. If operators or analysts notice something they would like to analyze in detail later on, they can always come back and view the captured video and imagery and the associated position of the vehicle.
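The coordination above—one click on the map updating the data viewer, image viewer, and video widgets at once—is naturally expressed as publish/subscribe messaging, the style of inter-widget eventing the Ozone Widget Framework provides in the browser. The sketch below is a toy illustration of that pattern; the channel and widget handler names are assumptions, not the project’s actual API.

```python
# Toy publish/subscribe sketch: clicking a track point publishes the point,
# and each subscribed widget renders the data tied to that point.
class Channel:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def publish(self, message):
        for handler in self.subscribers:
            handler(message)


track_click = Channel()
shown = []   # stands in for the widgets' rendered output

# Each widget subscribes to the map's click channel.
track_click.subscribe(lambda pt: shown.append(("data_viewer", pt["heading"])))
track_click.subscribe(lambda pt: shown.append(("image_viewer", pt["image"])))
track_click.subscribe(lambda pt: shown.append(("video", pt["video_offset"])))

# One click on the track fans out to all three widgets.
track_click.publish({"heading": 270, "image": "frame_0412.jpg",
                     "video_offset": 412})
assert len(shown) == 3
```

Decoupling the widgets this way means new analysis widgets can be added to the dashboard without modifying the map widget at all.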


Currently in its third year of funding, the team is leveraging existing architecture and components to integrate control of an unmanned aerial vehicle (UAV) with minimal software changes required. Operating multiple UAVs from a Web browser will provide the air community with a flexible architecture for UAV control and provide a mechanism for sharing both surface and air data among operators and analysts. The team also is extending its full-motion video architecture to generate 3-D models of objects of interest within the camera’s field of view, an experimental technology referred to as “Structure from Motion.” In addition, social and collaboration widgets are being developed to enable operators and analysts aboard different platforms to communicate with one another directly using widgets within the dashboard.

Some of the technology developed within this project already has been applied successfully to a different domain: the management of logistics data from aircraft. As part of the Comprehensive Automated Maintenance Environment-Optimized project at SSC Pacific, an effort that uses widgets and the cloud is being developed to provide maintenance personnel with a true condition-based-maintenance-plus capability. Condition-based maintenance enables fault patterns in aircraft components to be discovered before a problem arises in the aircraft. The resulting Readiness Integration Center stores sensor data in the cloud for display within widgets. A suite of services providing visualization and analytics capabilities is currently being built into the system.

The “UxV to the Cloud via Widgets” prototype has successfully demonstrated a novel approach for operating the Navy’s growing number of unmanned systems and for managing and sharing the sensor data generated by those systems. As a NISE technology transition project, “UxV to the Cloud via Widgets” has secured agreements to transition its technology into multiple Navy programs of record for current and future technologies. Transitioning technology into Navy programs in support of the warfighter is the ultimate barometer of success for the NISE program. “UxV to the Cloud via Widgets” combines command, control, communications, computers, intelligence, surveillance, and reconnaissance assets to reduce manning of unmanned systems while enhancing battlespace awareness—a solution for reducing costs while providing superior information dominance capabilities to warfighters.

About the Authors

Michael August is the project manager for the Enterprise Cloud Team in the Enterprise Communications and Networks Division at Space and Naval Warfare Systems Center Pacific. Darren Powell is an engineer in the Unmanned Technologies Branch. Charles Yetman is a software engineer in the Command and Intelligence Systems Division. Christopher Raney is the head of the Command and Control Technology and Experimentation Division.





