The Dimensions of DJI Drone SDKs and APIs
Understanding the intersection between drones and APIs can be tricky. This article provides a look at all the nitty-gritty details.
I am going through the DJI drone developer area, which has three distinct SDKs that allow us to leverage a variety of APIs that make the drone magic happen. I'm still wrapping my head around the intersection of drones and APIs, and this is my attempt to distill what I'm finding in their developer area and absorb some of what is going on across the industry. This is not meant to be a complete list. It is meant for my learning, and hopefully yours along the way.
There are a variety of devices being connected to the Internet, but other than the automobile, I don't think there is another object as complex as the drone. I'm fascinated by what is possible with this device, the variety of APIs it has, and the interaction with the RC controller, mobile device, and other resources in the cloud. I personally fly a DJI drone, so I am going through the DJI developer area and learning about their three SDKs, as they seem to be the ecosystem furthest along in understanding the API potential (think Twitter for IoT).
The DJI Onboard SDK
This SDK allows for communication with the DJI flight controller over a direct serial connection to monitor and control aircraft flight behavior with the Onboard API, while utilizing the built-in Intelligent Navigation Modes to create autonomous flight paths and maneuvers. Some of the actions for the Onboard SDK are:
- Activation. Before you start exploring DJI Onboard SDK functionality via DJI's ROS examples, you will need to go through the activation process.
- Obtain and release flight control. Manage the process to get flight control.
- Takeoff. Initiate a take-off for the drone.
- Landing. Tell the device to land.
- Go home. Tell the device to go home.
- Gimbal control. Manage gimbal for camera.
- Altitude control. Manage the altitude for the drone.
- Photo taking. Allow for taking photos.
- Start and stop video recording. Start and stop the video for the camera.
- Virtual RC control. Control the drone through the serial port by simulated channel values.
- Broadcast frequency control. Manage which frequency the drone is broadcasting on.
- Arm and disarm control. Arm or disarm the controls.
- Timestamp synchronization. Synchronize the timestamp.
- Native waypoint. Manage the waypoints for the mission.
- Hotpoint. Manage the Hotpoint for circling.
- Local navigation. Go to a specified local position.
- Global navigation. Go to a specified global position.
- Waypoint navigation. Manage flying through a series of GPS coordinates.
- WebSocket with Baidu Map for navigation. This involves real-time mapping.
- MAVLink and QGroundStation. Manage the vehicle-to-air communication.
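To make the waypoint navigation piece more concrete, here is a minimal sketch of how a mission of GPS coordinates might be represented and measured before flight. This is illustrative Python, not the Onboard SDK's actual C++ interface; the function names are my own.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def mission_length_m(waypoints):
    """Total path length of a waypoint mission, summed leg by leg.
    `waypoints` is a list of (latitude, longitude) tuples in degrees."""
    return sum(
        haversine_m(lat1, lon1, lat2, lon2)
        for (lat1, lon1), (lat2, lon2) in zip(waypoints, waypoints[1:])
    )
```

Knowing the total leg length up front is useful for sanity-checking a mission against battery endurance before uploading it to the aircraft.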
These are the actions that the Onboard SDK opens up, but the SDK has other layers that go beyond the drone itself, covering the space and environment around a drone and its interaction with that world.
LiDAR
Light Detection and Ranging (LiDAR) sensors support commercial UAS applications such as 3D aerial mapping, surveying, inspection, collision avoidance, and autonomous navigation in both indoor and outdoor environments. There are three distinct elements of the drone's LiDAR system:
- Lidar API. Supporting the computation of point cloud and logging LiDAR data in real-time into a standard LAS (LiDAR Aerial Survey) or PCAP (packet capture) file.
- Lidar simulator. A LiDAR simulator for playing back PCAP files in real-time via the same Ethernet output as a real LiDAR for facilitating development and debugging for the system integration.
- Lidar logging. An API-based example to demonstrate how to use the API of real-time point cloud computation and LAS and PCAP logging. It can also be used for integrating a Velodyne LiDAR with the M100 without the Onboard SDK.
LiDAR opens up another universe within the DJI Onboard SDK, allowing for access to the system through APIs, managing the logging around activity, and simulating common navigation elements in the environment.
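To illustrate the kind of work the real-time point cloud computation involves, here is a hedged sketch of converting raw LiDAR returns (a measured distance plus the beam's angles) into Cartesian points. The function names and axis conventions are my own, not those of the Lidar API.

```python
import math

def lidar_return_to_xyz(distance_m, azimuth_deg, elevation_deg):
    """Convert one LiDAR return from spherical measurements to
    Cartesian coordinates in the sensor frame (y forward, x right, z up)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)
    y = distance_m * math.cos(el) * math.cos(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

def compute_point_cloud(returns):
    """Turn a list of (distance_m, azimuth_deg, elevation_deg) tuples
    into a list of (x, y, z) points."""
    return [lidar_return_to_xyz(d, az, el) for d, az, el in returns]
```

A real pipeline would then serialize these points into an LAS file or replay the raw packets from a PCAP capture, which is what the logging and simulator components handle.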
uAvionix ADS-B Receiver
The pingRX ADS-B receiver allows the drone to receive real-time traffic information broadcasted by other manned or unmanned aircraft, as well as temporary flight restriction (TFR) information broadcasted by the government. With this type of situation awareness, the onboard embedded system (OES) will be able to make some safety-critical decisions like collision avoidance or self-separation.
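As a rough illustration of the self-separation decision an onboard embedded system might make from ADS-B traffic reports, here is a toy Python check. The separation thresholds, tuple layout, and flat-earth distance approximation are my own assumptions, not pingRX or DJI specifics.

```python
import math

# Assumed self-separation thresholds; real systems tune these carefully.
HORIZONTAL_LIMIT_M = 500.0
VERTICAL_LIMIT_M = 100.0

def separation_alert(own, traffic):
    """Return True if a traffic report violates both the horizontal and
    vertical separation limits. `own` and `traffic` are
    (lat_deg, lon_deg, alt_m) tuples; distances use a flat-earth
    approximation, adequate at these short ranges."""
    lat1, lon1, alt1 = own
    lat2, lon2, alt2 = traffic
    m_per_deg = 111_320.0  # meters per degree of latitude
    dx = (lon2 - lon1) * m_per_deg * math.cos(math.radians(lat1))
    dy = (lat2 - lat1) * m_per_deg
    horizontal = math.hypot(dx, dy)
    vertical = abs(alt2 - alt1)
    return horizontal < HORIZONTAL_LIMIT_M and vertical < VERTICAL_LIMIT_M
```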
Precision Trajectory Mission Planning
With the Onboard SDK Precision Trajectory Mission Planning suite, DJI developers can now plan complex missions without having to use GPS waypoints. The new DJI Precision Trajectory Mission Planning library has the flexibility to deal with complicated trajectories, issues with GPS accuracy and cases when GPS is simply unavailable.
- Trajectory following library that can autonomously execute preplanned smooth spiral trajectories.
- SketchUp plugin to visualize trajectories, import 3D CAD models and geolocate the scene.
- Configurable speed, start and end radii, and pitch for the spiral.
- Start your drone from anywhere; real-time path planning to get to the trajectory's GPS location.
- Integration with DJI Assistant 2 to visualize simulations of the drone following the trajectory in the SketchUp scene.
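The configurable spiral described above is easy to sketch mathematically. Here is an illustrative Python generator with configurable start and end radii and pitch; it is my own approximation of such a trajectory, not DJI's trajectory-following library.

```python
import math

def spiral_trajectory(start_radius_m, end_radius_m, pitch_m, turns,
                      points_per_turn=36):
    """Sample (x, y, z) points along a smooth spiral: the radius
    interpolates linearly from start to end while altitude climbs by
    `pitch_m` per full turn."""
    n = int(turns * points_per_turn)
    points = []
    for i in range(n + 1):
        frac = i / n
        theta = 2 * math.pi * turns * frac
        r = start_radius_m + (end_radius_m - start_radius_m) * frac
        z = pitch_m * turns * frac
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points
```

A planner like this produces the dense sample points that a trajectory-following controller would then track, independent of GPS waypoint granularity.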
That is a pretty robust SDK. I'm taking the time to learn about each action, as well as the communication and mission planning components separately. I can tell that they are having trouble keeping the large amount of functionality and features coherent and organized in the documentation, one of the reasons I'm breaking things out separately here on my blog.
DJI Guidance SDK
The Guidance SDK allows you to develop vision-based applications by granting you full control over drone guidance. You can access all output data for any device using the DJI Guidance SDK; they break things down into five separate groups.
- Reset config. Clear the subscribed configuration if you want to subscribe to data different from last time.
- Initialize transfer. Initialize guidance and create data transfer thread.
- IMU. Subscribe to Inertial Measurement Unit (IMU) data.
- Ultrasonic. Subscribe to ultrasonic data.
- Velocity. Subscribe to velocity data of Guidance in the body coordinate system.
- Obstacle distance. Subscribe to obstacle distance data.
- Depth image. Subscribe to depth image data.
- Disparity image. Subscribe to the disparity image, which can be filtered with functions such as filterSpeckles.
- Greyscale image. Subscribe to the rectified greyscale image.
- Motion. Subscribe to global motion data, i.e., velocity and position of Guidance in the global coordinate system.
Set Callback and Exposure
- Event handler. Set the callback function handler. When data comes from Guidance, it is called by the data transfer thread.
- Exposure parameters. Get the exposure parameters.
- Get online status. Get the online status of Guidance sensors.
- Get stereo calibration. Get the stereo calibration parameters.
- Get device type. Get the type of device. Currently only one type of device is supported: Guidance.
- Get image size. Get the size of image data.
- Start transfer. Inform Guidance to start data transfer.
- Stop transfer. Inform Guidance to stop data transfer.
- Release transfer. Release the data transfer thread.
- Wait for board ready. Block until the Guidance board is ready.
This SDK acts as the real-time senses of the drone, allowing you to develop the experience you need to stay in control and guide the device.
One of the things that captivates me about the whole drone thing is its data collection capacity. I'm still learning about what is possible with the drone itself, but I know that much of the value generated by these flights will be based on the data that is gathered, as well as the images and video recorded. Here are the data points I have found so far in the DJI documentation for the DJI Guidance SDK.
- Error code. Enumerates possible error codes. When an error occurs, usually an error code will be given, and the developer can reference this enum to find the error type.
- Velocity data. Velocity in body frame. The unit is millimeters per second and the frequency is 10 Hz.
- Obstacle distance data. Obstacle distance from five Guidance sensors. The unit is centimeters and the frequency is 20 Hz.
- IMU data. IMU data including accelerometer (in units of acceleration of gravity g) and gyroscope (in quaternion format) data. The frequency is 20 Hz.
- Motion data. Pose and velocity data including quaternion orientation, position in the global frame, velocity in the global frame.
- Ultrasonic data. Outputs ultrasonic data from five Guidance sensors, including obstacle distance (in units of meters) and reliability of the data. The frequency is 20 Hz.
- Greyscale image. Outputs Greyscale images for five directions. The image size is 320 by 240 bytes for an individual sensor. The default frequency is 20 Hz and can be scaled down using API functions.
- Depth image. Outputs depth images for five directions. The image size is 320 by 240 by 2 bytes for each direction. The default frequency is 20 Hz and can be scaled down using API functions.
- Disparity image. Outputs disparity images for five directions. This data is useful when developers want to further refine the disparity images using functions like speckle filter.
This data is generated constantly by a drone, and you have control over this transfer process through the DJI Guidance SDK. I'm thinking I need to aggregate some JSON schemas for this data, to better help me understand the depth and relationships in this data. There is a lot going on here, and a wealth of data to consider in a wide range of scenarios.
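As a first step toward those JSON schemas, here is a sketch of two of these records as Python dataclasses that serialize to JSON. The field names and shapes are my own reading of the documented units, not an official DJI schema.

```python
import json
from dataclasses import asdict, dataclass

@dataclass
class VelocityData:
    """Body-frame velocity in millimeters per second, published at 10 Hz."""
    vx_mm_s: int
    vy_mm_s: int
    vz_mm_s: int

@dataclass
class UltrasonicData:
    """One reading per Guidance sensor: distance in meters plus a
    per-sensor reliability flag, published at 20 Hz."""
    distances_m: list
    reliable: list

def to_json(record):
    """Serialize a telemetry record to a JSON string."""
    return json.dumps(asdict(record))
```

Typed records like these make it easier to validate incoming telemetry and to see the relationships between the streams before building anything on top of them.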
DJI Drone Mobile SDK
I use the DJI drone application to operate my two drones. The DJI Mobile SDK is where you can get to work crafting your own custom application to deliver exactly the drone operation experience you want. It is what the iPhone app was for mobile, but for the consumer and commercial drone world. There are a wealth of areas you can develop around in this SDK.
- Flight controller. The flight controller is an onboard computer that combines control information from the pilot with sensor information to adjust the thrust at each propeller and fly the aircraft as desired.
- Camera. The camera captures photos and videos. Many different modes of operation, resolutions, frame rates, exposure settings, picture settings, and file types can be selected. Cameras have local storage to hold the media, which will typically be an SD card and, in some cases, an SSD (solid-state drive).
- Gimbal. Cameras fixed to an aircraft will record images that pitch and roll with the aircraft as it moves. Multi-rotor aircraft need to pitch and roll simply to move horizontally, so getting a stable horizontal shot is not possible without a gimbal compensating for the aircraft's motion.
- Airlink. AirLink describes the wireless link between aircraft, remote controllers, handheld cameras and mobile devices.
- Remote controller. The remote controller allows manual flight, gimbal and camera control, and provides a robust wireless control link for aircraft. The mobile device can connect to the remote controller to communicate to the aircraft and receive the live video stream from the camera.
- Smart battery. Smart batteries provide the energy required to run the products. Together with the flight controller, the smart battery can estimate remaining flight time and provide warnings when low battery thresholds are crossed. Batteries are easily swapped between flights, extending product use considerably.
- Missions. Missions can be used to easily automate flight. There are many different mission types that offer different product behavior. Some missions can be uploaded to and managed by the aircraft, while other missions are managed from the mobile device.
- SDK manager. Application registration to use the DJI Mobile SDK, product connection, debugging and logging services are handled through the SDK manager class DJISDKManager.
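The registration and connection flow that DJISDKManager coordinates can be pictured as a small state machine. This toy Python sketch is purely illustrative; the real SDK is Java/Objective-C, and its actual states and callbacks differ.

```python
class SDKManagerSketch:
    """Toy state machine for the register-then-connect lifecycle that an
    SDK manager coordinates. States and transitions are illustrative only."""

    STATES = ("unregistered", "registered", "product_connected")

    def __init__(self):
        self.state = "unregistered"

    def register_app(self, app_key):
        """Register the application; a key from the DJI developer portal
        would be validated here in a real SDK."""
        if not app_key:
            raise ValueError("an app key from the developer portal is required")
        self.state = "registered"

    def product_connected(self):
        """Mark a product (aircraft) as connected; only valid after
        successful registration."""
        if self.state != "registered":
            raise RuntimeError("register the app before connecting a product")
        self.state = "product_connected"
```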
I think that this stack of features speaks for itself, providing a wealth of valuable API-driven resources to think about. This blows my mind as I begin to think about the possibilities for developing drone applications, but it gets even better when you think about how this also applies to the rest of the IoT world. Flight controllers might not apply universally, but cameras, batteries, remote controls, and the network are common across other IoT devices, and in my world should be considered beyond just drones.
I just needed to wrap my head around what programmatic resources are available to me as a DJI drone operator and developer. Next, I will be diving in and learning about some of the more interesting layers of this drone ecosystem, but first, I am more interested in spending time looking through the platform API and SDK resources for other drone platforms, as well as some of the data solution providers like Airmap and other physical component providers like FLIR for imaging and LumeCube for lighting. I am always having to pick my battles on how deep I want to go down each rabbit hole, or whether to stay at a high level for a wider perspective.
There is a lot going on here. I find drones fascinating from a technical perspective, and terrifying when it comes to surveillance, privacy, policing, and some of the other bad behavior we've seen recently. Like other areas of the tech space, I think APIs are important not just for managing devices and the experience, but also for providing transparency, logging, auditing, and other observability considerations when it comes to the Internet of Things space.
Published at DZone with permission of Kin Lane, DZone MVB. See the original article here.