
CN109871385B - Method and apparatus for processing data - Google Patents

Method and apparatus for processing data

Info

Publication number
CN109871385B
CN109871385B
Authority
CN
China
Prior art keywords
sensor
type
data
data fusion
target
Prior art date
Legal status
Active
Application number
CN201910151379.9A
Other languages
Chinese (zh)
Other versions
CN109871385A (en)
Inventor
程凯
王军
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910151379.9A
Publication of CN109871385A
Application granted
Publication of CN109871385B
Legal status: Active (current)
Anticipated expiration

Landscapes

  • Arrangements For Transmission Of Measured Signals (AREA)

Abstract

Embodiments of the present disclosure disclose methods and apparatus for processing data. One embodiment of the method comprises: first, receiving first data output by a first sensor; then determining whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; then triggering a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and finally, storing the type of the first sensor in response to triggering the data fusion operation. This implementation triggers data fusion without depending on a single sensor, thereby improving the reliability of triggering data fusion.

Description

Method and apparatus for processing data
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular to a method and apparatus for processing data.
Background
With the rapid development of sensing technology, multi-sensor data fusion has become increasingly important. In particular, in automatic driving and driver assistance systems for vehicles, a fusion module serves as the low-level perception module that fuses obstacle information perceived by different sensors. The frequency at which data fusion is triggered therefore determines both whether the data collected by each sensor can be fully utilized and whether the fusion process occupies too many system resources.
Data fusion is generally triggered in one of two ways. One is to designate a certain sensor as the primary sensor: whenever data transmitted by the primary sensor is received, the data of the other sensors is acquired and data fusion is triggered. The other is to trigger data fusion at a preset frequency, for example once every 100 ms.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for processing data.
In a first aspect, an embodiment of the present disclosure provides a method for processing data, the method including: receiving first data output by a first sensor; determining whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; triggering a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and storing the type of the first sensor in response to triggering the data fusion operation.
In some embodiments, the method further comprises: in response to determining that the type of the first sensor does not match the target sensor type, determining whether a difference between a timestamp corresponding to the time at which the first data was received and a target timestamp is greater than a target trigger threshold; and triggering a data fusion operation in response to determining that the difference is greater than the target trigger threshold.
In some embodiments, the target timestamp includes the timestamp at which the last data fusion operation was triggered, and the target trigger threshold includes a preset trigger time threshold corresponding to the first sensor. The method further includes: storing, in response to triggering the data fusion operation, the timestamp at which the data fusion operation was triggered.
In some embodiments, triggering the data fusion operation includes: acquiring second data from a second sensor, wherein the second sensor is associated with the first sensor; and sending information indicating the start of data fusion.
In some embodiments, the first sensor and the second sensor each correspond to a respective preset trigger time threshold.
In some embodiments, the target sensor type includes the type of the sensor that triggered the last data fusion operation.
In a second aspect, an embodiment of the present disclosure provides an apparatus for processing data, the apparatus including: a receiving unit configured to receive first data output by a first sensor; a determination unit configured to determine whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; a first triggering unit configured to trigger a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and a first storage unit configured to store the type of the first sensor in response to triggering the data fusion operation.
In some embodiments, the apparatus further comprises a second triggering unit configured to: determine, in response to determining that the type of the first sensor does not match the target sensor type, whether a difference between a timestamp corresponding to the time at which the first data is received and a target timestamp is greater than a target trigger threshold; and trigger a data fusion operation in response to determining that the difference is greater than the target trigger threshold.
In some embodiments, the target timestamp includes the timestamp at which the last data fusion operation was triggered, and the target trigger threshold includes a preset trigger time threshold corresponding to the first sensor. The apparatus further includes a second storage unit configured to store, in response to triggering the data fusion operation, the timestamp at which the data fusion operation was triggered.
In some embodiments, the first trigger unit includes: an acquisition module configured to acquire second data from a second sensor, wherein the second sensor is associated with the first sensor; a sending module configured to send information characterizing the start of data fusion.
In some embodiments, the first sensor and the second sensor each correspond to a respective preset trigger time threshold.
In some embodiments, the target sensor type includes the type of the sensor that triggered the last data fusion operation.
In a third aspect, an embodiment of the present disclosure provides a multi-source sensor fusion triggering system, including a primary sensor, a secondary sensor, and a processor, wherein the preset trigger time threshold of the primary sensor is smaller than the data output time interval of the primary sensor, the preset trigger time threshold of the secondary sensor is larger than the data output time interval of the primary sensor, and the processor is configured to implement the method described in any implementation of the first aspect.
In a fourth aspect, an embodiment of the present disclosure provides a terminal, including: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method described in any implementation of the first aspect.
In a fifth aspect, embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, which when executed by a processor implements the method as described in any of the implementations of the first aspect.
The method and apparatus for processing data provided by the embodiments of the present disclosure first receive first data output by a first sensor; then determine whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; then trigger a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and finally store the type of the first sensor in response to triggering the data fusion operation. Data fusion is thus triggered without depending on a single sensor, improving the reliability of triggering data fusion.
Drawings
Other features, objects, and advantages of the disclosure will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram for one embodiment of a method for processing data according to the present disclosure;
FIG. 3 is a schematic diagram of one application scenario of a method for processing data according to an embodiment of the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for processing data according to the present disclosure;
FIG. 5 is a schematic block diagram illustrating one embodiment of an apparatus for processing data according to the present disclosure;
FIG. 6 is a timing diagram of one application scenario of a multi-source sensor fusion triggering system 600, according to an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and embodiments. It is to be understood that the specific embodiments described herein merely illustrate the relevant invention and do not limit it. It should also be noted that, for convenience of description, only the portions relevant to the invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary architecture 100 to which the method for processing data or the apparatus for processing data of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include sensors 101, 102, 103, a network 104, and a processor 105. The network 104 provides a medium for communication links between the sensors 101, 102, 103 and the processor 105, and may include various connection types, such as wired or wireless communication links, or fiber-optic cables.
The sensors 101, 102, 103 interact with the processor 105 over the network 104 to receive or send messages. The sensors 101, 102, 103 may include various types of sensing devices that convert sensed or measured information into electrical signals or other formats that can be transmitted, processed, stored, and displayed. The sensors may include, but are not limited to, at least one of: LiDAR (Light Detection and Ranging), image sensors (e.g., cameras), ultrasonic radar, and millimeter-wave radar.
The processor 105 may be any of a variety of processors that support data processing and information transfer, such as a processor that provides a trigger signal for a data fusion operation. The processor 105 may receive the data output by the sensors 101, 102, 103 and record the type of sensor outputting the data, and determine whether to trigger a data fusion operation based on the type of sensor.
The processor may be hardware or software. As hardware, it can be implemented as a group of multiple processors or as a single processor. As software, it may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module. No specific limitation is made here.
It should be noted that the method for processing data provided by the embodiments of the present disclosure is generally executed by the processor 105, and accordingly, the apparatus for processing data is generally disposed in the processor 105.
It should be understood that the number of sensors, networks, and processors in fig. 1 is merely illustrative. There may be any number of sensors, networks, and processors, as desired for an implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for processing data in accordance with the present disclosure is shown. The method for processing data comprises the following steps:
step 201, receiving first data output by a first sensor.
In the present embodiment, the execution body of the method for processing data (such as the processor 105 shown in fig. 1) may receive the first data output by the first sensor through a wired or wireless connection. The first sensor may be a sensor capable of triggering a data fusion operation. In practice, because a data fusion operation must analyze and integrate the acquired data of each sensor under certain criteria, it occupies system computing resources. It is therefore usually predetermined that one or some of the sensors can trigger a data fusion operation, while data from the other sensors, even when acquired, cannot. The sensors communicatively connected to the execution body may each output their collected data at a preset transmission frequency, so the execution body may receive data output by different sensors at different times. As an example, in the field of autonomous driving, the first sensor may include, but is not limited to, at least one of: LiDAR, a camera, or millimeter-wave radar.
It should be noted that, in practice, there may be multiple sensors of the same type. When processing data from different types of sensors, the data collected by sensors of the same type is generally integrated first; the first data then corresponds to the integrated result of the data collected by the several same-type sensors. As an example, an autonomous vehicle may carry 3 cameras, in which case the first data may be the result of integrating the image data collected by the 3 cameras, and the first sensor may accordingly be the 3 cameras.
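To make the following steps concrete, here is a minimal Python sketch of a sensor reading as the processor might receive it. All names (SensorReading, sensor_type, payload) are hypothetical illustrations, not identifiers from this disclosure; the sketches after the later steps build on this one.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class SensorReading:
    """One unit of data handed to the processor (hypothetical layout)."""
    sensor_type: str   # e.g. "binocular_camera", "lidar", "millimeter_wave_radar"
    timestamp: float   # time the data was received, in seconds
    payload: Any       # raw output, or the integrated result of several
                       # same-type sensors, e.g. of the 3 cameras above
```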
At step 202, it is determined whether the type of the first sensor matches the target sensor type.
In this embodiment, the execution body may determine the type of the first sensor corresponding to the first data received in step 201, and then determine whether that type matches a target sensor type. The target sensor type may be associated with a history of triggered data fusion operations. As an example, the target sensor type may be the type of sensor that has triggered the data fusion operation the most times in the history. Generally, the more often a sensor's data has triggered data fusion, the more important that sensor's data is, which corresponds to a higher triggering priority for the sensor.
It should be noted that the sensor types mentioned above may generally include, but are not limited to, at least one of the following: LiDAR, camera, and millimeter-wave radar.
In this embodiment, the execution body may first determine the type of the sensor with the largest number of triggers in the history of triggered data fusion operations, and then compare the type of the first sensor with that type. In response to determining that the two types are consistent, the execution body may determine that the type of the first sensor matches the target sensor type; in response to determining that they are inconsistent, it may determine that the type of the first sensor does not match the target sensor type.
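As a sketch of this matching logic, and of the last-trigger variant described in the following paragraph, the history of triggered data fusion operations can be modeled as a list of (sensor_type, timestamp) records. The record layout and function names are assumptions for illustration only, building on the SensorReading sketch above.

```python
from collections import Counter

def target_sensor_type(trigger_history, mode="most_frequent"):
    """Derive the target sensor type from the trigger history.

    `trigger_history` is assumed to be a list of (sensor_type, timestamp)
    records, one per past trigger of the data fusion operation.
    """
    if not trigger_history:
        return None
    if mode == "last":  # the optional variant in the following paragraph
        return trigger_history[-1][0]
    counts = Counter(sensor_type for sensor_type, _ in trigger_history)
    return counts.most_common(1)[0][0]  # most frequently triggering type

def matches_target(reading, trigger_history, mode="most_frequent"):
    # Step 202: compare the incoming reading's type against the target type.
    return reading.sensor_type == target_sensor_type(trigger_history, mode)
```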
In some optional implementations of the present embodiment, the target sensor type may include the type of the sensor that triggered the last data fusion operation. The execution body may first determine the sensor type recorded for the most recent trigger in the history of triggered data fusion operations, and then compare the type of the first sensor with that type, determining a match when the two are consistent and a mismatch otherwise.
Step 203, in response to determining that the type of the first sensor matches the target sensor type, triggers a data fusion operation.
In the present embodiment, based on the determination result of step 202, in response to determining that the type of the first sensor matches the target sensor type, the execution body may trigger the data fusion operation in various ways. As an example, the execution body may analyze and process the acquired data according to a preset data fusion algorithm. As another example, the execution body may send an instruction that initiates the data fusion operation to an electronic device that executes it.
In some optional implementations of this embodiment, the execution body may trigger the data fusion operation according to the following steps:
In the first step, second data is acquired from a second sensor.
In these implementations, the execution body may obtain the second data from a second sensor communicatively connected to it, where the second sensor may be associated with the first sensor. As an example, the first sensor and the second sensor may be sensors whose separation in operation is less than a preset distance threshold. As another example, they may be sensors whose data output frequencies differ by less than a preset threshold.
Optionally, the first sensor and the second sensor may each correspond to a respective preset trigger time threshold. The trigger time threshold is generally associated with the frequency at which the sensor outputs data, i.e., the fixed time interval between successive outputs. As an example, in practice the preset trigger time threshold of the first sensor may be set to a value less than the time interval at which the first sensor outputs data, and the preset trigger time threshold of the second sensor to a value greater than that interval. The first sensor then corresponds to the primary sensor for triggering the data fusion operation, and the second sensor to a secondary sensor. The larger the preset trigger time threshold, the lower the corresponding sensor's priority for triggering the data fusion operation. Setting different preset trigger time thresholds for the sensors thus implements a triggering priority among them, as the configuration sketch after these steps illustrates.
In the second step, information indicating the start of data fusion is sent.
In these implementations, after obtaining the second data in the first step, the execution body may analyze and process the acquired first data and second data according to a preset data fusion algorithm. The execution body may also send information indicating the start of data fusion, which may take various forms, such as a control signal that turns on an indicator light or modifies the value of a flag bit. Optionally, the execution body may instead send the acquired second data to an electronic device that executes the data fusion operation, together with an instruction to start the data fusion operation.
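The per-sensor thresholds can be kept in a simple table. The sketch below borrows the numbers from the fig. 6 example later in this description (camera output interval 0.08 s, thresholds 0.06 s and 0.15 s); the table itself and the sensor names are assumptions for illustration.

```python
# Primary sensor: threshold below its own 0.08 s output interval, so its
# own frames normally arrive in time. Secondary sensor: threshold above
# the primary's output interval, so it only fires after the primary has
# been silent for longer than one of its normal frame gaps.
TRIGGER_THRESHOLDS = {
    "binocular_camera": 0.06,  # primary (output interval 0.08 s)
    "lidar": 0.15,             # secondary (output interval 0.10 s)
}
```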
In response to triggering the data fusion operation, the type of the first sensor is stored, step 204.
In this embodiment, the execution body may store the type of the first sensor in response to triggering the data fusion operation. Optionally, the execution body may also update the history of triggered data fusion operations with the stored type of the first sensor.
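Putting steps 201 to 204 together, the flow of fig. 2 reduces to a few lines. This is a hedged sketch built on the helpers above, not the disclosure's concrete implementation; trigger_fusion is a placeholder for whichever triggering mechanism is used (running a fusion algorithm directly, or signalling a separate fusion device).

```python
def trigger_fusion(reading):
    # Step 203 placeholder: acquire the associated second sensor's data
    # and/or send the information indicating that data fusion starts.
    print(f"fusion triggered by {reading.sensor_type} at t={reading.timestamp:.2f}s")

def on_sensor_data(reading, trigger_history, mode="most_frequent"):
    """Flow 200: step 201 (reading received), then steps 202-204."""
    if matches_target(reading, trigger_history, mode):                    # step 202
        trigger_fusion(reading)                                           # step 203
        trigger_history.append((reading.sensor_type, reading.timestamp))  # step 204
```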
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of a method for processing data according to an embodiment of the present disclosure. In the application scenario of fig. 3, image data output by a binocular camera 303 is received by an autonomous driving system 302 mounted on a vehicle 301. The autonomous driving system 302 determines from the history of triggered data fusion operations that the target sensor type is lidar. It then determines that the type of the binocular camera 303 does not match the target sensor type and does not perform a data fusion operation. Next, the autonomous driving system 302 receives point cloud data output by a lidar 304, determines that the type of the lidar 304 matches the target sensor type, and triggers a data fusion operation. It then stores the type of the sensor that triggered this data fusion operation. Optionally, the autonomous driving system 302 may also store the lidar type and the timestamp of the trigger in the history of triggered data fusion operations.
At present, the prior art generally designates a certain sensor as the primary sensor and, upon receiving data transmitted by the primary sensor, acquires the data of the other sensors and triggers data fusion. This approach, however, depends strongly on the primary sensor: when the primary sensor fails or its output frequency is unstable, the triggering of the data fusion operation is directly affected, and the information collected by the multiple sensors cannot be used in time. In the method provided by the embodiments of the present disclosure, whether to trigger the data fusion operation is determined by checking whether the type of the sensor from which the received data came matches the target sensor type. Data fusion can therefore be triggered by different sensors, which enriches the triggering modes, better realizes the advantage of redundant backup in fusion, and improves the reliability of triggering data fusion.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for processing data is shown. The flow 400 of the method for processing data includes the steps of:
step 401, receiving first data output by a first sensor.
At step 402, it is determined whether the type of the first sensor matches the target sensor type.
In response to determining that the type of the first sensor matches the target sensor type, a data fusion operation is triggered, step 403.
In response to triggering the data fusion operation, the type of the first sensor is stored, step 404.
Steps 401, 402, 403, and 404 are consistent with steps 201, 202, 203, and 204 of the foregoing embodiment, respectively; the descriptions of those steps also apply here and are not repeated.
Step 405, in response to determining that the type of the first sensor does not match the target sensor type, determining whether the difference between the timestamp corresponding to the time at which the first data is received and the target timestamp is greater than the target trigger threshold; and, in response to determining that the difference is greater than the target trigger threshold, triggering a data fusion operation.
In this embodiment, based on the result of step 402, in response to determining that the type of the first sensor does not match the target sensor type, the execution body may compute the difference between the timestamp corresponding to the time the first data was received and a target timestamp. A timestamp here may be the total number of seconds from 00:00:00 on January 1, 1970 (UTC; 08:00:00 Beijing time) to a specific time, such as the time the first data was received. The target timestamp may be determined dynamically according to a preset rule; as an example, it may represent the time of the last output of the sensor that last triggered a data fusion operation. The execution body may then compare the computed difference with a target trigger threshold, which may be any preset value or a value determined by the actual application; as an example, it may be the average interval between two data fusion operations as indicated by the history of triggered data fusion operations. In response to determining that the difference is greater than the target trigger threshold, the execution body may trigger a data fusion operation.
In some optional implementations of this embodiment, the target timestamp may include the timestamp at which the last data fusion operation was triggered, and the target trigger threshold may include the preset trigger time threshold corresponding to the first sensor. In response to triggering the data fusion operation, the execution body may further store the timestamp at which it was triggered, and may optionally update the history of triggered data fusion operations with the stored type of the first sensor.
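Combining this fallback with the matched path of flow 200 gives a compact sketch of flow 400. It assumes the optional implementation just described (last-trigger timestamp as the target timestamp, per-sensor thresholds from the TRIGGER_THRESHOLDS table above) and reuses the earlier helpers; it illustrates the logic under those assumptions rather than reproducing the patented implementation.

```python
def on_sensor_data_with_fallback(reading, trigger_history, mode="last"):
    """Flow 400: steps 401-404 plus the timed fallback of step 405."""
    if matches_target(reading, trigger_history, mode):     # steps 402-403
        trigger_fusion(reading)
        trigger_history.append((reading.sensor_type, reading.timestamp))
        return
    # Step 405: type mismatch - trigger anyway once the gap since the
    # last trigger exceeds this sensor's preset trigger time threshold.
    last_trigger_ts = trigger_history[-1][1] if trigger_history else float("-inf")
    threshold = TRIGGER_THRESHOLDS.get(reading.sensor_type, float("inf"))
    if reading.timestamp - last_trigger_ts > threshold:
        trigger_fusion(reading)
        trigger_history.append((reading.sensor_type, reading.timestamp))
```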
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, the flow 400 of the method for processing data in this embodiment adds a step: when the type of the first sensor does not match the target sensor type, whether to trigger the data fusion operation is determined by checking whether the difference between the timestamp of receiving the first data and the target timestamp exceeds the target trigger threshold. By introducing the target trigger threshold, the scheme described in this embodiment allows the data fusion operation to be handed over between different sensors. When frames are lost in the primary sensor's output, the data fusion operation can still be triggered by other sensors, fully realizing the advantage of redundant backup in data fusion.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for processing data, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for processing data provided by the present embodiment includes a receiving unit 501, a determining unit 502, a first triggering unit 503, and a first storage unit 504. The receiving unit 501 is configured to receive first data output by a first sensor; the determining unit 502 is configured to determine whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; the first triggering unit 503 is configured to trigger a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and the first storage unit 504 is configured to store the type of the first sensor in response to triggering the data fusion operation.
In the present embodiment, for the specific processing of the receiving unit 501, the determining unit 502, the first triggering unit 503, and the first storage unit 504 of the apparatus 500 for processing data, and for their technical effects, refer to the descriptions of steps 201, 202, 203, and 204 in the embodiment corresponding to fig. 2; details are not repeated here.
In some optional implementations of this embodiment, the apparatus 500 for processing data may further include a second triggering unit (not shown in the figures) configured to: determine, in response to determining that the type of the first sensor does not match the target sensor type, whether the difference between the timestamp corresponding to the time at which the first data is received and the target timestamp is greater than the target trigger threshold; and trigger a data fusion operation in response to determining that the difference is greater than the target trigger threshold.
In some optional implementations of this embodiment, the target timestamp may include the timestamp at which the last data fusion operation was triggered, and the target trigger threshold may include a preset trigger time threshold corresponding to the first sensor. The apparatus 500 for processing data may further include a second storage unit (not shown in the figure) configured to store, in response to triggering the data fusion operation, the timestamp at which the data fusion operation was triggered.
In some optional implementations of this embodiment, the first triggering unit 503 may include an acquisition module (not shown in the figure) and a sending module (not shown in the figure). The acquisition module may be configured to acquire second data from a second sensor, wherein the second sensor is associated with the first sensor; the sending module may be configured to send information indicating the start of data fusion.
In some optional implementations of this embodiment, the first sensor and the second sensor may each correspond to a respective preset trigger time threshold.
In some optional implementations of the present embodiment, the target sensor type may include the type of the sensor that triggered the last data fusion operation.
The apparatus provided by the above embodiment of the present disclosure first receives, through the receiving unit 501, the first data output by the first sensor; the determining unit 502 then determines whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations; the first triggering unit 503 then triggers a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; and finally, the first storage unit 504 stores the type of the first sensor in response to triggering the data fusion operation. Data fusion is thus triggered without depending on a single sensor, improving the reliability of triggering data fusion.
With further reference to fig. 6, a timing diagram of one application scenario of the multi-source sensor fusion triggering system 600 of an embodiment of the present disclosure is shown. The multi-source sensor fusion triggering system 600 may include a primary sensor 601, a secondary sensor 602, and a processor (not shown). The preset trigger time threshold of the primary sensor is usually smaller than the primary sensor's data output time interval, and the preset trigger time threshold of the secondary sensor is usually greater than that of the primary sensor. The processor may be configured to implement the method described in steps 201 to 204 or steps 401 to 405 of the foregoing embodiments.
As an example, data output by both the binocular camera and the lidar may trigger a data fusion operation. The data output time interval of the binocular camera may be 0.08 s, i.e., a data output frequency of 12.5 Hz, and that of the lidar may be 0.1 s, i.e., 10 Hz. To make the binocular camera the primary sensor, its preset trigger time threshold may be set to 0.06 s, i.e., a trigger frequency of 16.67 Hz. To make the lidar the secondary sensor, its preset trigger time threshold may be set to 0.15 s, i.e., a trigger frequency of 6.67 Hz.
When the binocular camera outputs data at its normal frequency, data fusion is triggered at the camera's frequency, as shown in the first half of fig. 6. If a camera frame is lost at some moment (shown by a dotted line in fig. 6), the processor determines the difference between the timestamp of the current lidar frame and the timestamp of the last data fusion trigger (by the binocular camera) in the history of triggered data fusion operations. In response to determining that a difference of 0.06 s is smaller than the lidar's preset trigger time threshold of 0.15 s, no data fusion operation is triggered. When the lidar outputs data again and the difference reaches 0.16 s, in response to determining that 0.16 s is greater than the lidar's threshold of 0.15 s, the data fusion operation may be triggered by the lidar's data, as shown in the middle of fig. 6. The information collected by the sensors is thus still fused even when the primary sensor drops frames, and a correct perception output is produced. Once the binocular camera's output frequency recovers, its smaller data output interval and preset trigger time threshold cause triggering to switch back to the camera's data. The frequency of data fusion operations therefore never falls too low, and downstream processing modules can perceive the surrounding environment in time from the fusion results.
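A toy replay of this scenario, reusing the sketches above with the last-trigger matching mode, makes the handover visible. To keep the timeline unambiguous, the camera here drops two consecutive frames (at 0.24 s and 0.32 s); the timestamps follow the intervals of the example but are otherwise assumptions for illustration.

```python
camera_ts = [0.08, 0.16, 0.40, 0.48]       # 12.5 Hz; frames at 0.24 s, 0.32 s lost
lidar_ts = [0.02, 0.12, 0.22, 0.32, 0.42]  # 10 Hz

events = sorted(
    [SensorReading("binocular_camera", t, None) for t in camera_ts]
    + [SensorReading("lidar", t, None) for t in lidar_ts],
    key=lambda r: r.timestamp,
)
history = [("binocular_camera", 0.00)]     # seed: camera triggered at t = 0
for reading in events:
    on_sensor_data_with_fallback(reading, history, mode="last")

# Printed triggers: camera at 0.08 s and 0.16 s; then, with the camera
# silent, the lidar frame at 0.32 s fires (gap 0.16 s > 0.15 s); the
# camera reclaims triggering at 0.40 s (gap 0.08 s > its 0.06 s threshold)
# and continues at 0.48 s, matching the switchover described above.
```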
Referring now to fig. 7, a block diagram of an electronic device 700 (e.g., the terminal device of fig. 1) suitable for implementing embodiments of the present disclosure is shown. Terminal devices in embodiments of the present disclosure may include, but are not limited to, mobile terminals such as vehicle-mounted terminals (e.g., automatic driving systems, driver assistance systems) and fixed terminals such as desktop computers. The terminal device shown in fig. 7 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present disclosure.
As shown in fig. 7, the electronic device 700 may include a processing device (e.g., a central processing unit or graphics processor) 701 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 702 or a program loaded from a storage device 708 into a random access memory (RAM) 703. The RAM 703 also stores various programs and data necessary for the operation of the electronic device 700. The processing device 701, the ROM 702, and the RAM 703 are connected to one another by a bus 704. An input/output (I/O) interface 705 is also connected to the bus 704.
Generally, the following devices may be connected to the I/O interface 705: input devices 706 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 707 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 708 including, for example, magnetic tape, hard disk, etc.; and a communication device 709. The communication means 709 may allow the electronic device 700 to communicate wirelessly or by wire with other devices to exchange data. While fig. 7 illustrates an electronic device 700 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 7 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communication means 709, or may be installed from the storage means 708, or may be installed from the ROM 702. The computer program, when executed by the processing device 701, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium, by contrast, may comprise a propagated data signal with computer readable program code embodied therein, for example in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (Radio Frequency), or any suitable combination of the foregoing.
The computer readable medium may be included in the terminal device; or may exist separately without being assembled into the terminal device. The computer readable medium carries one or more programs which, when executed by the terminal device, cause the terminal device to: receiving first data output by a first sensor; determining whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history record that triggers a data fusion operation; triggering a data fusion operation in response to determining that the type of the first sensor matches the target sensor type; in response to triggering the data fusion operation, the type of the first sensor is stored.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a receiving unit, a determining unit, a first triggering unit, and a first storing unit. Where the names of the units do not in some cases constitute a limitation of the unit itself, for example, the receiving unit may also be described as a "unit that receives the first data output by the first sensor".
The foregoing description presents only the preferred embodiments of the present disclosure and illustrates the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (15)

1. A method for processing data, comprising:
receiving first data output by a first sensor;
determining whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations;
triggering the data fusion operation in response to determining that the type of the first sensor matches the target sensor type;
in response to triggering the data fusion operation, storing the type of the first sensor.
2. The method of claim 1, wherein, after determining whether the type of the first sensor matches the target sensor type, the method further comprises:
in response to determining that the type of the first sensor does not match the target sensor type, determining whether a difference between a timestamp corresponding to a time at which the first data was received and a target timestamp is greater than a target trigger threshold; triggering the data fusion operation in response to determining that the difference is greater than the target trigger threshold.
3. The method of claim 2, wherein the target timestamp comprises the timestamp at which the last data fusion operation was triggered, and the target trigger threshold comprises a preset trigger time threshold corresponding to the first sensor; and
after the triggering the data fusion operation, the method further comprises:
in response to triggering the data fusion operation, storing a timestamp at which the data fusion operation was triggered.
4. The method of claim 1, wherein triggering the data fusion operation comprises:
obtaining second data from a second sensor, wherein the second sensor is associated with the first sensor;
and sending information representing the start of data fusion.
5. The method of claim 4, wherein the first sensor and the second sensor each correspond to a preset trigger time threshold.
6. The method of any one of claims 1-5, wherein the target sensor type comprises the type of the sensor that triggered the last data fusion operation.
7. An apparatus for processing data, comprising:
a receiving unit configured to receive first data output by a first sensor;
a determination unit configured to determine whether the type of the first sensor matches a target sensor type, wherein the target sensor type is associated with a history of triggered data fusion operations;
a first triggering unit configured to trigger the data fusion operation in response to determining that the type of the first sensor matches the target sensor type;
a first storage unit configured to store a type of the first sensor in response to triggering the data fusion operation.
8. The apparatus of claim 7, wherein the apparatus further comprises:
a second triggering unit configured to determine whether a difference between a timestamp corresponding to a time at which the first data is received and a target timestamp is greater than a target trigger threshold in response to determining that the type of the first sensor does not match the target sensor type; triggering the data fusion operation in response to determining that the difference is greater than the target trigger threshold.
9. The apparatus of claim 8, wherein the target timestamp comprises the timestamp at which the last data fusion operation was triggered, and the target trigger threshold comprises a preset trigger time threshold corresponding to the first sensor; the apparatus further comprises:
a second storage unit configured to store a timestamp at which the data fusion operation was triggered in response to triggering the data fusion operation.
10. The apparatus of claim 7, wherein the first triggering unit comprises:
an acquisition module configured to acquire second data from a second sensor, wherein the second sensor is associated with the first sensor;
a sending module configured to send information characterizing the start of data fusion.
11. The apparatus of claim 10, wherein the first sensor and the second sensor each correspond to a preset trigger time threshold.
12. The apparatus of any one of claims 7-11, wherein the target sensor type comprises the type of the sensor that triggered the last data fusion operation.
13. A multi-source sensor fusion triggering system, comprising: a primary sensor, a secondary sensor, and a processor, wherein a preset trigger time threshold of the primary sensor is less than a data output time interval of the primary sensor, and a preset trigger time threshold of the secondary sensor is greater than the data output time interval of the primary sensor, and the processor is configured to implement the method of any one of claims 1-6.
14. A terminal, comprising:
one or more processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-6.
15. A computer-readable medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-6.
CN201910151379.9A 2019-02-28 2019-02-28 Method and apparatus for processing data Active CN109871385B (en)

Priority Applications (1)

Application Number: CN201910151379.9A
Priority Date: 2019-02-28; Filing Date: 2019-02-28
Title: Method and apparatus for processing data

Applications Claiming Priority (1)

Application Number: CN201910151379.9A
Priority Date: 2019-02-28; Filing Date: 2019-02-28
Title: Method and apparatus for processing data

Publications (2)

Publication Number / Publication Date
CN109871385A (en) / 2019-06-11
CN109871385B (en) / 2021-07-27

Family

ID=66919523

Family Applications (1)

Application Number: CN201910151379.9A (Active; granted as CN109871385B)
Title: Method and apparatus for processing data

Country Status (1)

Country Link
CN (1) CN109871385B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112307094B (en) * 2019-07-26 2024-04-02 北京百度网讯科技有限公司 Automatic driving data reading method and device, computer equipment and storage medium
CN111310627B (en) * 2020-02-07 2024-01-30 广州视源电子科技股份有限公司 Detection method and device of sensing device and electronic equipment
CN111753901B (en) * 2020-06-23 2023-08-15 国汽(北京)智能网联汽车研究院有限公司 Data fusion method, device, system and computer equipment
CN112461245A (en) * 2020-11-26 2021-03-09 浙江商汤科技开发有限公司 Data processing method and device, electronic equipment and storage medium
CN113327344B (en) * 2021-05-27 2023-03-21 北京百度网讯科技有限公司 Fusion positioning method, device, equipment, storage medium and program product

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8427309B2 (en) * 2009-06-15 2013-04-23 Qualcomm Incorporated Sensor network management
JP5753286B1 (en) * 2014-02-05 2015-07-22 株式会社日立パワーソリューションズ Information processing apparatus, diagnostic method, and program
CN106017475B (en) * 2016-07-04 2019-03-08 四川九洲防控科技有限责任公司 A kind of track update method and device
JP6521096B2 (en) * 2016-10-21 2019-05-29 日本電気株式会社 Display method, display device, and program
CN106840242B (en) * 2017-01-23 2020-02-04 驭势科技(北京)有限公司 Sensor self-checking system and multi-sensor fusion system of intelligent driving automobile
CN108663988B (en) * 2018-05-31 2020-03-10 哈尔滨莫迪科技有限责任公司 Intelligent monitoring system of numerical control machine tool based on Internet of things
CN108983219B (en) * 2018-08-17 2020-04-07 北京航空航天大学 Fusion method and system for image information and radar information of traffic scene

Also Published As

Publication Number / Publication Date
CN109871385A (en) / 2019-06-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant