Search Results (4)

Search Parameters:
Keywords = DJI RC

29 pages, 32760 KiB  
Article
Digital Forensic Research for Analyzing Drone Pilot: Focusing on DJI Remote Controller
by Sungwon Lee, Hyeongmin Seo and Dohyun Kim
Sensors 2023, 23(21), 8934; https://doi.org/10.3390/s23218934 - 2 Nov 2023
Cited by 1 | Viewed by 3531
Abstract
Drones, also known as unmanned aerial vehicles (UAVs) and sometimes referred to as 'Mobile IoT' or 'Flying IoT', are widely adopted worldwide, with their market share continuously increasing. While drones are generally harnessed for a wide range of positive applications, recent instances of drones being employed as lethal weapons in conflicts such as those between Russia and Ukraine and between Israel and Hamas have demonstrated the potential consequences of their misuse. Such misuse poses a significant threat to cybersecurity and human lives, highlighting the need for research that can swiftly and accurately analyze drone-related crimes, identify the responsible pilot, and establish when and what illegal actions were carried out. In contrast to existing research, which involves limited data collection and analysis of the drone itself, our study focused on collecting and rigorously analyzing, without restrictions, data from the remote controller used to operate the drone. This comprehensive approach allowed us to unveil essential details, including the pilot's account information, the specific drone used, pairing timestamps, the pilot's operational location, the drone's flight path, and the content captured during flights. We developed methodologies and proposed artifacts to reveal these specifics, supported by real-world data. Significantly, this study is the pioneering digital forensic investigation of remote controller devices. We meticulously collected and analyzed all internal data, and we employed reverse engineering to decrypt critical information files. The outcomes of this research are expected to serve as a digital forensic methodology for drone systems, thereby making valuable contributions to numerous investigations.
(This article belongs to the Special Issue Security, Cybercrime, and Digital Forensics for the IoT)
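The abstract describes recovering the pilot's account and binding time from remote-controller artifacts such as the DJIFRSyncLog_YYYY-MM-DD_[HH-MM-SS].txt file (Figure 11). As a rough illustration of that kind of artifact parsing, here is a Python sketch: the filename pattern comes from the paper's figure caption, but the log line format, field names, and sample values are invented for illustration, since the real file layout is not public.

```python
import re
from datetime import datetime

# Hypothetical example log lines; the real DJIFRSyncLog content is not public.
SAMPLE = """\
2023-10-02 14:03:11 bind account=pilot01@example.com
2023-10-02 14:03:12 aircraft_sn=1581F5FJD228A00A1234
"""

def sync_time_from_name(filename: str) -> datetime:
    """Recover the sync timestamp encoded in a DJIFRSyncLog_YYYY-MM-DD_[HH-MM-SS].txt name."""
    m = re.match(r"DJIFRSyncLog_(\d{4}-\d{2}-\d{2})_\[(\d{2})-(\d{2})-(\d{2})\]\.txt$", filename)
    if not m:
        raise ValueError(f"unexpected filename: {filename}")
    date, hh, mm, ss = m.groups()
    return datetime.fromisoformat(f"{date}T{hh}:{mm}:{ss}")

def extract_artifacts(text: str) -> dict:
    """Pull account and aircraft-serial fields out of log text (key=value convention assumed)."""
    found = {}
    for key in ("account", "aircraft_sn"):
        m = re.search(rf"{key}=(\S+)", text)
        if m:
            found[key] = m.group(1)
    return found

print(sync_time_from_name("DJIFRSyncLog_2023-10-02_[14-03-11].txt"))
print(extract_artifacts(SAMPLE))
```

In a real examination the parsed timestamps would be correlated with the pairing and location artifacts from the other files the paper lists.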
Show Figures

Figure 1. Information of DJI RC. (a) Connection information of DJI RC. (b) Data for digital forensic analysis of DJI RC.
Figure 2. DJI RC rooting using DJI RC FCC. (a) Successful rooting using DJI RC FCC. (b) Checking root privilege using the ADB shell.
Figure 3. Imaging procedure for the DJI RC flash memory.
Figure 4. Source code of the Bluetooth app on the DJI RC.
Figure 5. Atrace log contents of the Sysobserver app on the DJI RC.
Figure 6. User information in the DJI Fly configuration file.
Figure 7. Comparison of data before and after decryption of the DJI Fly app log.
Figure 8. View of the dpad_flyshare and dpad_systemui apps. (a) dpad_flyshare app. (b) dpad_systemui app.
Figure 9. Configuration file of the dpad_setup app on the DJI RC.
Figure 10. settings.global.xml file containing the user account.
Figure 11. DJIFRSyncLog_YYYY-MM-DD_[HH-MM-SS].txt file containing the user account and binding time.
Figure 12. Drone pairing information #1. (a) /data/.../active/∗.txt file containing drone pairing information. (b) /data/.../deviceManager/∗.txt file containing drone pairing information. (c) /data/.../DJINotificationService/∗.txt file containing drone pairing information. (d) /data/.../LteService/∗.txt file containing drone pairing information.
Figure 13. Drone pairing information #2. (a) /data/.../UP_DATA_ALL/∗.txt file containing drone pairing information. (b) /data/.../UP_NEW_ALL/∗.txt file containing drone pairing information.
Figure 14. DJI RC location information #1. (a) /data/.../CACHE/∗.txt file containing DJI RC location information. (b) /data/.../CACHE/AeroScope/∗.txt file containing DJI RC location information. (c) /data/.../CACHE/AttitudeBar/∗.txt file containing DJI RC location information.
Figure 15. DJI RC location. (a) CACHE artifacts. (b) AeroScope artifacts. (c) AttitudeBar artifacts.
Figure 16. Decrypting the DJIFlightRecord file and viewing the flight records using the Phantom Help site.
26 pages, 8913 KiB  
Article
Small Fixed-Wing UAV Radar Cross-Section Signature Investigation and Detection and Classification of Distance Estimation Using Realistic Parameters of a Commercial Anti-Drone System
by Ioannis K. Kapoulas, Antonios Hatziefremidis, A. K. Baldoukas, Evangelos S. Valamontes and J. C. Statharas
Drones 2023, 7(1), 39; https://doi.org/10.3390/drones7010039 - 6 Jan 2023
Cited by 12 | Viewed by 10756
Abstract
Various types of small drones constitute a modern threat to infrastructure and hardware, as well as to humans; thus, special-purpose radars have been developed in recent years to identify such drones. When studying radar signatures, we observed that the majority of scientific studies refer to multirotor aerial vehicles; there is a significant gap regarding small, fixed-wing Unmanned Aerial Vehicles (UAVs). Driven by the security principle, we conducted a series of Radar Cross Section (RCS) simulations on the Euclid fixed-wing UAV, which has a wingspan of 2 m and is being developed by our university. The purpose of this study is to partially fill the gap that exists regarding the RCS signatures and identification distances of fixed-wing UAVs with the same wingspan as the Euclid. The software used for the simulations was POFACETS (v.4.1). Two different scenarios were carried out. In scenario A, the RCS of the Euclid fixed-wing UAV, with a 2 m wingspan, was analytically studied. Robin Radar Systems' Elvira Anti Drone System is the simulated radar, operating at 8.7 to 9.65 GHz; the θ angle is set at 85° for this scenario. Scenario B studies the Euclid RCS within the broader 3 to 16 GHz spectrum at the same θ = 85° angle. The results indicated that the Euclid UAV presents a mean RCS value (σ̄) of −17.62 dBsm for scenario A and a mean RCS value (σ̄) of −22.77 dBsm for scenario B. These values are much smaller than those of a typical commercial quadcopter, such as the DJI Inspire 1, which presents −9.75 dBsm and −13.92 dBsm for the same scenarios, respectively. As calculated in the study, the Euclid UAV can approach to within 1784 m of the Elvira Anti Drone System before detection, while the DJI Inspire 1 will be detected at 2768 m. This finding is of great importance, as the larger fixed-wing Euclid UAV will be detected about one kilometer closer to the anti-drone system.
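The detection-distance comparison in the abstract follows from the radar range equation, under which detection range scales with the fourth root of RCS when all other radar parameters are held equal. A Python sketch of that scaling, using the RCS and range figures quoted in the abstract; it deliberately ignores the Elvira system's actual detection model, so the result only approximates the study's 1784 m figure.

```python
def dbsm_to_m2(dbsm: float) -> float:
    """Convert an RCS value in dBsm to square meters."""
    return 10 ** (dbsm / 10)

def scaled_detection_range(r_ref_m: float, rcs_ref_dbsm: float, rcs_new_dbsm: float) -> float:
    """Radar range equation: detection range scales with the fourth root of the RCS
    ratio, all other radar parameters (power, gain, sensitivity) held constant."""
    ratio = dbsm_to_m2(rcs_new_dbsm) / dbsm_to_m2(rcs_ref_dbsm)
    return r_ref_m * ratio ** 0.25

# Reference point from the abstract: DJI Inspire 1 at -9.75 dBsm, detected at 2768 m.
# Scale to the Euclid's -17.62 dBsm mean RCS (scenario A).
r_euclid = scaled_detection_range(2768, -9.75, -17.62)
print(round(r_euclid), "m")  # close to the study's 1784 m, which uses the full detection model
```

The fourth-root dependence explains why an 8 dB RCS reduction shortens detection range by only about a third rather than proportionally.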
Show Figures

Figure 1. A typical identification process of a radar system for drones.
Figure 2. Elvira Anti Drone System's main lobe for the DJI Inspire 1 detection case. All lengths are in meters.
Figure 3. Elvira Anti Drone System's main lobe for the DJI Inspire 1 classification case. All lengths are in meters.
Figure 4. RCS measurements of some common commercial multicopters.
Figure 5. The two targets analyzed in this research and the overall validation methodology for the Euclid results.
Figure 6. θ angle clarification in the POFACETS software.
Figure 7. Proper placement of a target in the POFACETS software.
Figure 8. Check for normals for the Euclid UAV.
Figure 9. Graphical representation of the simulation parameters for the two scenarios.
Figure 10. DJI Inspire 1 RCS results for θ = 85°, φ = 0°–360°, and f = 8.7 GHz.
Figure 11. Euclid UAV RCS results for θ = 85°, φ = 0°–360°, and f = 8.7 GHz.
Figure 12. DJI Inspire 1 RCS results for θ = 85°, φ = 0°–360°, and f = 9.175 GHz.
Figure 13. Euclid UAV RCS results for θ = 85°, φ = 0°–360°, and f = 9.175 GHz.
Figure 14. DJI Inspire 1 RCS results for θ = 85°, φ = 0°–360°, and f = 9.65 GHz.
Figure 15. Euclid UAV RCS results for θ = 85°, φ = 0°–360°, and f = 9.65 GHz.
Figure 16. DJI Inspire 1 RCS signature for θ = 85° and φ = 45° within the 3 to 16 GHz spectrum.
Figure 17. Euclid UAV RCS signature for θ = 85° and φ = 45° within the 3 to 16 GHz spectrum.
13 pages, 1720 KiB  
Article
Comparison of Radar Signatures from a Hybrid VTOL Fixed-Wing Drone and Quad-Rotor Drone
by Jiangkun Gong, Deren Li, Jun Yan, Huiping Hu and Deyong Kong
Drones 2022, 6(5), 110; https://doi.org/10.3390/drones6050110 - 27 Apr 2022
Cited by 10 | Viewed by 5608
Abstract
Current studies rarely mention radar detection of hybrid vertical take-off and landing (VTOL) fixed-wing drones. We investigated the radar signals of an industry-tier VTOL fixed-wing drone, the TX25A, and compared them with the radar detection results of a quad-rotor drone, the DJI Phantom 4. We used an X-band pulse-Doppler phased-array radar to collect tracking data on the two drones in a coastal area near the Yellow Sea in China. The measurements indicate that the TX25A had double the radar cross-section (RCS) and flying speed of the DJI Phantom 4, and a 2 dB larger signal-to-clutter ratio (SCR). The radar signals of both drones contained micro-Doppler, or jet engine modulation (JEM), signatures produced by the lifting rotor blades, but the Doppler modulated by the puller rotor blades of the TX25A was undetectable. JEM provides radar signatures such as the rotation rate, which can be inferred from the JEM frequency spacing interval and the number of blades, for radar automatic target recognition (ATR), but it also interferes with the radar tracking algorithm by suppressing the body Doppler. This work provides an a priori investigation of new VTOL fixed-wing drones and may inspire future research.
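The abstract notes that the rotor rotation rate can be inferred from the JEM frequency spacing and the blade count: an n-bladed rotor produces JEM spectral lines spaced at n times the rotation rate. A minimal sketch of that relation; the 240 Hz spacing below is a hypothetical measurement for illustration, not a value from the paper.

```python
def rotor_rate_hz(jem_spacing_hz: float, n_blades: int) -> float:
    """For an n-bladed rotor, JEM lines are spaced at n_blades * rotation_rate,
    so the rotation rate (rev/s) follows directly from the measured spacing."""
    return jem_spacing_hz / n_blades

# Hypothetical measurement: JEM lines spaced 240 Hz apart, two-bladed rotor
# (typical for small quad-rotor propellers).
rate = rotor_rate_hz(240.0, 2)
print(rate, "rev/s =", rate * 60, "rpm")
```

This is the ATR use of JEM the abstract mentions: the spacing identifies the rotor rate, while the same sidebands can mask the body Doppler the tracker relies on.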
Show Figures

Figure 1. Comparison of the two drones: (a) TX25A, (b) DJI Phantom 4. (Photo of the TX25A is from http://www.harryskydream.com, accessed on 26 April 2022; photo of the DJI Phantom 4 is from https://www.dji.com/, accessed on 11 March 2022.)
Figure 2. Detection of drones using an X-band radar in a coastal area: (a) radar PPI screenshot; (b) radar signals, including range-Doppler images and time-frequency data; (c) confirmation with EO/IR data; (d) flowchart of one collection.
Figure 3. Detection of the hybrid VTOL fixed-wing drone (TX25A) and the quad-rotor drone (DJI Phantom 4) using a radar in a coastal area: (a) detection ranges; (b) azimuth degrees.
Figure 4. Comparison of signal amplitudes and SCR values of the VTOL and quad-rotor drones.
Figure 5. Raw radar data and spectra of the different types of drones: (a) detection range of 10 km; (b) detection range of 12 km.
Figure 6. JEM of the TX25A at different detection ranges.
Figure 7. Tracking Doppler values of the TX25A and DJI Phantom 4 drones.
20 pages, 26407 KiB  
Article
A Robust and Accurate Landing Methodology for Drones on Moving Targets
by Assaf Keller and Boaz Ben-Moshe
Drones 2022, 6(4), 98; https://doi.org/10.3390/drones6040098 - 15 Apr 2022
Cited by 12 | Viewed by 10352
Abstract
This paper presents a framework for performing autonomous precise landing of unmanned aerial vehicles (UAVs) on dynamic targets. The main goal of this work is to design the methodology and the controlling algorithms that allow multi-rotor drones to perform a robust and efficient landing under dynamic conditions of changing wind, dynamic obstacles, and moving targets. Unlike existing GNSS-based vertical landing solutions, the suggested framework does not rely on global positioning and instead uses adaptive diagonal-approach-angle visual landing. The framework was designed to work on existing camera-drone platforms, without any need for additional sensors, and it was implemented using DJI's API on Android devices. The presented concept of visual sliding landing (VSL) was tested on a wide range of commercial drones, performing hundreds of precise and robust autonomous landings on dynamic targets, including boats, cars, RC boats, and RC rovers.
(This article belongs to the Special Issue Honorary Special Issue for Prof. Max F. Platzer)
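The paper's distance-mode pitch control is a PID loop whose gains (Kp, Kd, Ki) are swept in its Figure 18. A minimal discrete PID sketch in Python, driving a toy one-dimensional plant with the Kp = 0.23, Ki = 0.0001, Kd = 0.02 gains reported in that figure; the plant model (pitch command mapped directly to closing speed) is invented for illustration and is not the paper's drone dynamics.

```python
class PID:
    """Minimal discrete PID controller of the kind used for distance-mode pitch control."""
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error: float, dt: float) -> float:
        self.integral += error * dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy plant: drive the drone-to-target distance from 10 m toward the 0 m setpoint.
pid = PID(kp=0.23, ki=0.0001, kd=0.02)
distance, dt = 10.0, 0.1
for _ in range(300):
    pitch_cmd = pid.step(0.0 - distance, dt)  # error = setpoint - measured distance
    distance += pitch_cmd * dt                # assume pitch command sets closing speed
print(round(distance, 3))
```

With these gains the toy distance converges close to zero over the 30 simulated seconds; on the real platform the same loop output would be sent as virtual-stick pitch commands through DJI's API.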
Show Figures

Figure 1. The concept of visual sliding landing (VSL) of a drone on a moving target: (upper left) a DJI Mavic Mini performing a 45° angle auto-VSL on a moving rover (using multiple targets); (upper right) a DJI Mavic Pro performing a 60° auto-VSL on a moving sailing boat; (bottom) the implementation of the suggested VSL method in an Android application, performing an almost horizontal (10°) auto-VSL onto a moving car (using a DJI Mavic Air).
Figure 2. (left) DJI camera with gimbal; (right) a common UAV in quadcopter form.
Figure 3. A standard remote controller, from DJI's Mavic Pro User Manual.
Figure 4. Height, distance, and concentration.
Figure 5. Basic scheme of a PID controller.
Figure 6. Local coordinate systems: 1. the coordinate system of the target; 2. the coordinate system of the drone; 3. the coordinate system of the gimbal; 4. the global coordinate system.
Figure 7. Scanning, centralization, and landing.
Figure 8. Scanning, distance reduction, centralization, and landing.
Figure 9. Scanning for the first target; detecting the first target; reducing distance and changing angle; detecting the second target; reducing distance; centralizing; and landing.
Figure 10. Slide landing: the concept of visual adaptive-angle landing of a drone on a moving target. The change in the landing vector is due to the shift of focus from the big target to the small target.
Figure 11. (a) Our experimental system setup, including a moving-target helipad and a Mavic Mini. (b) The image as captured from the landing position on the helipad. Our app includes a video streaming area, real-time data, PID parameters, and a state machine.
Figure 12. The generic VSL algorithm.
Figure 13. ArUco markers "10" and "20".
Figure 14. Fixing the orientation error. (a) The orientation error from a top view. (b) The orientation error from the app screen. (c) The roll error after fixing the yaw. (d) The orientation after fixing both the yaw and roll errors.
Figure 15. The experiment is divided into three steps. (a) Recognition of both the target and the helipad. (b) Closing the distance while changing the gimbal angle from −45° to 0° and hovering in relation to the target. (c) Starting the landing and changing the gimbal angle from 0° to 20°.
Figure 16. Safety envelope parameters.
Figure 17. Fail-safe policy.
Figure 18. Results for testing different values of the PID parameters Kp, Kd, and Ki in distance mode, where we control the pitch. (a) System response for different Kp values with Kd = 0, Ki = 0. (b) System response for different Kp values with Kd = 0, Ki = 0.0001. (c) System response for different Kp values with Kd = 0.01, Ki = 0.0001. (d) System response for different Kd values with Kp = 0.23, Ki = 0. (e) System response for different Ki values with Kp = 0.23, Kd = 0.02.
Figure 19. An almost horizontal landing into a moving car (back door). The top image shows the first detection of the landing marker. The second image was taken after the autonomous mode was engaged (see the "virtual sticks enabled" toast). The third image shows the drone maintaining a fixed (leashed) distance and angle from the target; once this distance has been maintained successfully for a few seconds, the drone lands, as shown in the bottom image.
Figure 20. Slide landing: a Mavic Pro (DJI) performing adjustable tracking and approach to a moving boat. This is an example of a case in which a diagonal sliding landing is needed because, due to obstacles such as the sails and wires, a vertical landing cannot be performed.
Figure 21. (a,b) Results of two successful experiments of hovering and landing on moving targets. There are two zones in the graph: the "hover zone" and the "land zone". The upper line represents the distance from the target, and the lower line represents the height above the surface. (c) Result of a successful landing on a static target.