
CN110515092B - Plane touch method based on laser radar - Google Patents

Plane touch method based on laser radar

Info

Publication number
CN110515092B
CN110515092B (application CN201911011503.8A)
Authority
CN
China
Prior art keywords
laser radar
points
coordinates
touch
measurement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911011503.8A
Other languages
Chinese (zh)
Other versions
CN110515092A (en)
Inventor
杨帆
白立群
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaoshi Technology Jiangsu Co ltd
Original Assignee
Nanjing Zhenshi Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Zhenshi Intelligent Technology Co Ltd filed Critical Nanjing Zhenshi Intelligent Technology Co Ltd
Priority to CN201911011503.8A
Publication of CN110515092A
Application granted
Publication of CN110515092B
Active legal status (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/4802Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416Control or interface arrangements specially adapted for digitisers
    • G06F3/0418Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/66Analysis of geometric attributes of image moments or centre of gravity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a plane touch method based on a laser radar, which comprises the following steps: step 1, arranging at least one laser radar in a measurement area; step 2, for any two measurement points in the measurement area, continuously measuring the two points with the laser radar to obtain multiple groups of returned data, wherein the laser radar scans at a fixed increment angle and each group of returned data comprises the accumulated angle and the distance of the measurement point at the current laser radar angle; step 3, converting the data returned by the laser radar into a depth perception map; step 4, obtaining the indexes of the left and right end points of each bright-spot segment from the depth perception map; and step 5, converting the rectangular-coordinate-system coordinates into pixel coordinates in the display plane. The invention combines point cloud data with image processing, can efficiently and intuitively calculate the estimated points by image-based means, and has strong anti-interference capability. The estimated angle and estimated distance of each point are obtained by computing the gravity center within each region of the gray-level image, which improves the robustness and stability of the data.

Description

Plane touch method based on laser radar
Technical Field
The invention relates to the technical field of image touch interaction, in particular to a plane touch method based on a laser radar.
Background
At present, touch technologies are mainly classified into capacitive and resistive types. Both require a specific touch medium, such as a capacitive or resistive screen, whether it exists in an independent form or in a combined, spliced form (for example, the capacitive/resistive screen of a mobile phone, in which one layer of the touch screen realizes photoelectric conversion). Such touch schemes are relatively costly and cannot support touch interaction in large scenes.
Disclosure of Invention
The invention aims to provide a plane touch method based on laser radar, addressing the difficulty of support and installation and the high cost of existing capacitive/resistive touch interaction when applied in large scenes.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
the plane touch method based on the laser radar comprises the following steps:
step 1, arranging at least one laser radar in a measurement area;
step 2, for any two measuring points in the measuring area, expressing the positions of the two measuring points in polar coordinates as $(\theta_1, d_1)$ and $(\theta_2, d_2)$, and continuously measuring the two points with the laser radar to obtain multiple groups of returned data, wherein the laser radar scans at a fixed increment angle and each group of returned data comprises the accumulated angle of the measurement point at the current laser radar angle and its distance;
step 3, converting the data returned by the laser radar into a depth perception map whose abscissa is the accumulated angle of the laser radar measurement and whose ordinate is 1, the image brightness of the depth perception map representing the distance of the measurement point at the current laser radar angle;
step 4, obtaining the indexes of the left and right end points of each bright-spot segment from the depth perception map, wherein the indexes of the left and right end points of each bright spot are coordinates in a rectangular coordinate system centered on the laser radar;
and 5, converting the coordinates of the rectangular coordinate system into pixel coordinates in a display plane.
Further, in step 4, obtaining the indexes of the left and right end points of each bright-spot segment specifically comprises:
extracting the region where each bright spot is located;
determining the estimated abscissa $\hat{x}$ and the estimated depth value $\hat{d}$, specifically:
$$\hat{x} = \frac{\sum_{x=r_l}^{r_r} x\, I(x)}{\sum_{x=r_l}^{r_r} I(x)}, \qquad \hat{d} = \frac{\sum_{x=r_l}^{r_r} G(x)\, I(x)}{\sum_{x=r_l}^{r_r} G(x)}, \quad G(x) = \exp\!\left(-\frac{(x-\hat{x})^{2}}{2\sigma^{2}}\right)$$
where I denotes the depth map, i.e. a pseudo-image of t rows and 1 column, I(x) denotes the depth value represented by the image brightness at abscissa x, and $[r_l, r_r]$ are the left and right end points of a bright-spot segment. The gray-scale gravity center method yields the estimated abscissa $\hat{x}$, and a Gaussian distribution is used as the mass of each point to compute the estimated depth value $\hat{d}$. From the input depth image, $\hat{x}$ and $\hat{d}$ are calculated in each bright-spot segmentation region and converted into the rectangular coordinate system through coordinate conversion:
$$\theta = \hat{x}\,\Delta\theta, \qquad X = \hat{d}\cos\theta, \qquad Y = \hat{d}\sin\theta$$
where $\Delta\theta$ is the fixed increment angle of the laser radar. The estimates $\hat{x}$ and $\hat{d}$ calculated for each segment are substituted into the above formulas to obtain the index of the estimated point of each touch area.
Further, the conversion to pixel coordinates comprises:
projecting a picture onto the touch plane, the picture containing four touch points 1-4 located on the diagonals, the coordinates of touch point 1 being the origin coordinates to be mapped; during calibration, touching points 1-4 in sequence and recording the coordinates of each point in the laser radar coordinate system; setting the pixel coordinates to which points 1-4 are to be mapped, recording the measured coordinates corresponding to the four points, and computing a rotation-and-translation matrix as the correction parameter.
Further, x is determined by the scanning accuracy of the laser radar.
Further, the estimated depth value $\hat{d}$ is constructed as a weighted average depth modulated by a Gaussian distribution.
The invention has the following beneficial effects: it overcomes the need for special touch screen media in the prior art and provides a method that uses one or more laser radars for touch and calibration; no other medium is needed; distance is measured photoelectrically and converted into plane coordinates.
The method combines point cloud data with image processing, can efficiently and intuitively calculate the estimated points by image-based means, and has strong anti-interference capability. The estimated angle and estimated distance of each point are obtained by computing the gravity center within each region of the gray-level image, which improves the robustness and stability of the data.
Drawings
Fig. 1 is a schematic diagram of a planar touch method of the present invention.
FIG. 2 is a schematic representation of a longitudinally stretched depth perception map of the present invention.
FIG. 3 is a schematic diagram of the bright-spot index of the present invention.
Fig. 4 is a schematic diagram of a lidar plane, a measurement plane, and a display plane.
FIG. 5 is a schematic illustration of the coordinate calibration of the present invention.
Detailed Description
In order to better understand the technical content of the present invention, specific embodiments are described below with reference to the accompanying drawings.
In this disclosure, aspects of the present invention are described with reference to the accompanying drawings, in which a number of illustrative embodiments are shown. Embodiments of the present disclosure are not necessarily intended to include all aspects of the invention. It should be appreciated that the various concepts and embodiments described above, as well as those described in greater detail below, may be implemented in any of numerous ways.
Referring to fig. 1 to 5, a lidar-based planar touch method according to an embodiment of the present invention includes the following steps:
step 1, arranging at least one laser radar in a measurement area;
step 2, for any two measuring points in the measuring area, expressing the positions of the two measuring points in polar coordinates as $(\theta_1, d_1)$ and $(\theta_2, d_2)$, and continuously measuring the two points with the laser radar to obtain multiple groups of returned data, wherein the laser radar scans at a fixed increment angle and each group of returned data comprises the accumulated angle of the measurement point at the current laser radar angle and its distance;
step 3, converting the data returned by the laser radar into a depth perception map whose abscissa is the accumulated angle of the laser radar measurement and whose ordinate is 1, the image brightness of the depth perception map representing the distance of the measurement point at the current laser radar angle;
step 4, obtaining the indexes of the left and right end points of each bright-spot segment from the depth perception map, wherein the indexes of the left and right end points of each bright spot are coordinates in a rectangular coordinate system centered on the laser radar;
and 5, converting the coordinates of the rectangular coordinate system into pixel coordinates in a display plane.
One or more laser radars are installed in the measurement scene as sensors; the laser radar measures the angle of each touch point in the radar's coordinate system and the distance to it. In FIG. 1, the origin is at the laser radar center, the clockwise direction of the laser radar is the positive direction, the laser radar measures t times per revolution, and the depth of each measured touch point is in meters.
In a preferred embodiment, the measurement area may be pre-defined using the PNPOLY algorithm and may be any polygon. Points in the laser radar data that are not inside the measurement area are filtered out by the PNPOLY algorithm, and the set S of points inside the measurement area is returned.
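For illustration, a minimal Python sketch of this filtering step follows. PNPOLY is W. Randolph Franklin's ray-crossing point-in-polygon test; the function names (pnpoly, filter_points) are illustrative, not taken from the patent.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def pnpoly(poly: List[Point], pt: Point) -> bool:
    """Return True if pt lies inside the polygon (ray-crossing test)."""
    x, y = pt
    inside = False
    j = len(poly) - 1
    for i in range(len(poly)):
        xi, yi = poly[i]
        xj, yj = poly[j]
        # Count crossings of a horizontal ray cast to the right of pt;
        # the (yi > y) != (yj > y) guard also prevents division by zero.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def filter_points(points: List[Point], area: List[Point]) -> List[Point]:
    """Keep only the measured points that fall inside the measurement area S."""
    return [p for p in points if pnpoly(area, p)]
```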
From the multiple groups of data returned by the radar, a black-and-white pseudo-image of t rows and 1 column, referred to as the depth perception map, can be obtained from the measured angles and distances. In this way, the two-dimensional point cloud information is converted into a strip of image information whose length equals the number of measurement angles returned by the laser radar, whose width is 1, and whose brightness encodes the depth distance measured by the laser radar.
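A minimal sketch of this conversion, under the assumption (not stated numerically in the patent) that distances are normalized by a maximum range so that depth fits 8-bit image brightness:

```python
import numpy as np

def scan_to_depth_map(distances_m: np.ndarray, max_range_m: float = 10.0) -> np.ndarray:
    """Convert one lidar revolution (t distances, indexed by accumulated
    angle) into a t x 1 grayscale pseudo-image; brightness encodes depth."""
    d = np.clip(distances_m, 0.0, max_range_m)
    brightness = (d / max_range_m * 255.0).astype(np.uint8)
    return brightness.reshape(-1, 1)  # t rows, 1 column
```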
Referring to fig. 3, the regions where the bright spots are located are extracted, for example the regions [r0, r1], [r2, r3], [r4, r5], where r0 and r1 are respectively the left and right end points of one bright-spot region.
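A minimal sketch of this segment extraction: find the (left, right) index pairs of each contiguous run of bright pixels in the t x 1 depth map. The brightness threshold is an assumed parameter, not given in the patent.

```python
import numpy as np

def bright_segments(depth_map: np.ndarray, threshold: int = 0) -> list[tuple[int, int]]:
    """Return (r_left, r_right) index pairs for each bright-spot run."""
    bright = depth_map.ravel() > threshold
    # Transitions mark segment starts (0 -> 1) and ends (1 -> 0).
    edges = np.diff(bright.astype(np.int8), prepend=0, append=0)
    starts = np.flatnonzero(edges == 1)
    ends = np.flatnonzero(edges == -1) - 1  # inclusive right end point
    return list(zip(starts.tolist(), ends.tolist()))
```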
More preferably, in step 4, obtaining the indexes of the left and right end points of each bright-spot segment specifically comprises:
extracting the region where each bright spot is located;
determining the estimated abscissa $\hat{x}$ and the estimated depth value $\hat{d}$, specifically:
$$\hat{x} = \frac{\sum_{x=r_l}^{r_r} x\, I(x)}{\sum_{x=r_l}^{r_r} I(x)}, \qquad \hat{d} = \frac{\sum_{x=r_l}^{r_r} G(x)\, I(x)}{\sum_{x=r_l}^{r_r} G(x)}, \quad G(x) = \exp\!\left(-\frac{(x-\hat{x})^{2}}{2\sigma^{2}}\right)$$
where I denotes the depth map, i.e. a pseudo-image of t rows and 1 column, I(x) denotes the depth value represented by the image brightness at abscissa x, and $[r_l, r_r]$ are the left and right end points of a bright-spot segment. The gray-scale gravity center method yields the estimated abscissa $\hat{x}$, and a Gaussian distribution is used as the mass of each point to compute the estimated depth value $\hat{d}$. From the input depth image, $\hat{x}$ and $\hat{d}$ are calculated in each bright-spot segmentation region and converted into the rectangular coordinate system through coordinate conversion:
$$\theta = \hat{x}\,\Delta\theta, \qquad X = \hat{d}\cos\theta, \qquad Y = \hat{d}\sin\theta$$
where $\Delta\theta$ is the fixed increment angle of the laser radar. The estimates $\hat{x}$ and $\hat{d}$ calculated for each segment are substituted into the above formulas to obtain the index of the estimated point of each touch area.
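A minimal sketch of the estimation pipeline reconstructed above: the gray-scale gravity center, the Gaussian-weighted depth (cf. claim 4), and the polar-to-rectangular conversion. sigma is an assumed smoothing parameter the patent does not specify; delta_theta is the lidar's fixed increment angle (2*pi / t).

```python
import numpy as np

def estimate_touch_point(depth_map: np.ndarray, r_left: int, r_right: int,
                         delta_theta: float, sigma: float = 1.0) -> tuple[float, float]:
    """Return the (X, Y) estimate for one bright-spot segment [r_left, r_right]."""
    x = np.arange(r_left, r_right + 1, dtype=np.float64)
    I = depth_map.ravel()[r_left:r_right + 1].astype(np.float64)

    # Gray-scale gravity center: intensity-weighted mean abscissa.
    x_hat = np.sum(x * I) / np.sum(I)

    # A Gaussian centered on x_hat acts as the "mass" of each point,
    # giving a weighted average depth modulated by the distribution.
    g = np.exp(-((x - x_hat) ** 2) / (2.0 * sigma ** 2))
    d_hat = np.sum(g * I) / np.sum(g)

    # Polar -> rectangular, with the lidar at the origin.
    theta = x_hat * delta_theta
    return d_hat * np.cos(theta), d_hat * np.sin(theta)
```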
The set of estimated point coordinates consists of points in a rectangular coordinate system centered on the laser radar; usually these points need to be mapped into the screen coordinate system.
Fig. 4 shows the coordinate relationship among the laser radar plane, the measurement plane and the display plane; the transformation matrix can be determined by calibration, thereby completing the mapping from laser radar coordinates to display plane coordinates.
As shown in fig. 5, the calibration process comprises: projecting a picture onto the touch plane, the picture containing four touch points 1-4 located on the diagonals, the coordinates of touch point 1 being the origin coordinates to be mapped; during calibration, touching points 1-4 in sequence and recording the coordinates of each point in the laser radar coordinate system; setting the pixel coordinates to which points 1-4 are to be mapped, recording the measured coordinates corresponding to the four points, and computing a rotation-and-translation matrix as the correction parameter.
Finally, the set of estimated point coordinates is multiplied by the correction parameters to obtain the final output set of touch point coordinates.
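A minimal sketch of this calibration step, under the assumption that the rotation-and-translation correction is solved as a 2-D affine least-squares fit over the four recorded point pairs; the function names are illustrative.

```python
import numpy as np

def solve_calibration(lidar_pts: np.ndarray, pixel_pts: np.ndarray) -> np.ndarray:
    """lidar_pts, pixel_pts: (4, 2) arrays of corresponding points.
    Returns a 2x3 matrix M such that pixel ~= M @ [x, y, 1]."""
    n = lidar_pts.shape[0]
    A = np.hstack([lidar_pts, np.ones((n, 1))])        # (n, 3) homogeneous coords
    M, *_ = np.linalg.lstsq(A, pixel_pts, rcond=None)  # (3, 2) least-squares fit
    return M.T                                         # (2, 3)

def apply_calibration(M: np.ndarray, pts: np.ndarray) -> np.ndarray:
    """Map estimated lidar points into display-plane pixel coordinates."""
    A = np.hstack([pts, np.ones((pts.shape[0], 1))])
    return A @ M.T
```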
In other embodiments, the touch action may be completed by binding to the system touch events and simulating a system mouse click event.
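As an illustration only (the patent names no API), the mapped touch point could be dispatched as a click with a library such as pyautogui, an assumed stand-in here:

```python
import pyautogui

def dispatch_touch(px: float, py: float) -> None:
    """Simulate a system mouse click at the display-plane pixel coordinate."""
    pyautogui.click(int(px), int(py))
```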
The invention combines point cloud data with image processing, can efficiently and intuitively calculate the estimated points by image-based means, and has strong anti-interference capability. The estimated angle and estimated distance of each point are obtained by computing the gravity center within each region of the gray-level image, which improves the robustness and stability of the data.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to be limited thereto. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention should be determined by the appended claims.

Claims (4)

1. A plane touch method based on laser radar is characterized by comprising the following steps:
step 1, arranging at least one laser radar in a measurement area;
step 2, for any two measuring points in the measuring area, expressing the positions of the two measuring points in polar coordinates as $(\theta_1, d_1)$ and $(\theta_2, d_2)$, and continuously obtaining measurements of the two points by using the laser radar to obtain multiple groups of data returned by the radar, wherein the laser radar scans at a fixed increment angle, and each group of returned data comprises the accumulated angle of the measurement point at the current laser radar angle and its distance;
step 3, converting the data returned by the laser radar into a depth perception map whose abscissa is the accumulated angle of the laser radar measurement and whose ordinate is 1, the image brightness of the depth perception map representing the distance of the measurement point at the current laser radar angle;
step 4, obtaining the indexes of the left and right end points of each bright-spot segment from the depth perception map, wherein the indexes of the left and right end points of each bright spot are coordinates in a rectangular coordinate system centered on the laser radar;
step 5, converting the coordinates of the rectangular coordinate system into pixel coordinates in a display plane;
in step 4, obtaining the indexes of the left and right end points of each bright-spot segment specifically comprises:
extracting the region where each bright spot is located;
determining the estimated abscissa $\hat{x}$ and the estimated depth value $\hat{d}$, specifically:
$$\hat{x} = \frac{\sum_{x=r_l}^{r_r} x\, I(x)}{\sum_{x=r_l}^{r_r} I(x)}, \qquad \hat{d} = \frac{\sum_{x=r_l}^{r_r} G(x)\, I(x)}{\sum_{x=r_l}^{r_r} G(x)}, \quad G(x) = \exp\!\left(-\frac{(x-\hat{x})^{2}}{2\sigma^{2}}\right)$$
wherein I denotes the depth map, i.e. a pseudo-image of t rows and 1 column, I(x) denotes the depth value represented by the image brightness at abscissa x, and $[r_l, r_r]$ are the left and right end points of a bright-spot segment; the gray-scale gravity center method yields the estimated abscissa $\hat{x}$, and a Gaussian distribution is used as the mass of each point to compute the estimated depth value $\hat{d}$; from the input depth image, $\hat{x}$ and $\hat{d}$ are calculated in each bright-spot segmentation region and converted into the rectangular coordinate system through coordinate conversion:
$$\theta = \hat{x}\,\Delta\theta, \qquad X = \hat{d}\cos\theta, \qquad Y = \hat{d}\sin\theta$$
the estimates $\hat{x}$ and $\hat{d}$ calculated for each segment are substituted into the above formulas to obtain the index of the estimated point of each touch area.
2. The lidar-based planar touch method of claim 1, wherein the converting of the pixel coordinates comprises:
projecting a picture onto the touch plane, the picture containing four touch points 1-4 located on the diagonals, the coordinates of touch point 1 being the origin coordinates to be mapped; during calibration, touching points 1-4 in sequence and recording the coordinates of each point in the laser radar coordinate system; setting the pixel coordinates to which points 1-4 are to be mapped, recording the measured coordinates corresponding to the four points, and computing a rotation-and-translation matrix as the correction parameter.
3. The lidar-based planar touch method of claim 1, wherein x is determined by a scanning accuracy of the lidar.
4. The lidar-based planar touch method of claim 1, wherein the estimated depth value $\hat{d}$ is constructed as a weighted average depth modulated by a Gaussian distribution.
CN201911011503.8A 2019-10-23 2019-10-23 Plane touch method based on laser radar Active CN110515092B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911011503.8A CN110515092B (en) 2019-10-23 2019-10-23 Plane touch method based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911011503.8A CN110515092B (en) 2019-10-23 2019-10-23 Plane touch method based on laser radar

Publications (2)

Publication Number Publication Date
CN110515092A CN110515092A (en) 2019-11-29
CN110515092B true CN110515092B (en) 2020-01-10

Family

ID=68633547

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911011503.8A Active CN110515092B (en) 2019-10-23 2019-10-23 Plane touch method based on laser radar

Country Status (1)

Country Link
CN (1) CN110515092B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111831161B (en) * 2020-07-23 2023-10-03 吕嘉昳 Method for automatically identifying contact position in display screen based on touch method
CN111831162B (en) * 2020-07-23 2023-10-10 吕嘉昳 Writing brush shape correction method based on touch screen
CN112774181B (en) * 2021-01-11 2023-11-10 北京星汉云图文化科技有限公司 Radar data processing method, radar data processing system and computer storage medium
CN116990787B (en) * 2023-09-26 2023-12-15 山东科技大学 Scanning platform coordinate system error correction method based on airborne laser radar system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335021A (en) * 2015-09-09 2016-02-17 浙江工业大学 Laser radar based man-machine interaction system
CN105306991A (en) * 2015-09-09 2016-02-03 浙江工业大学 Interactive television based on laser radar
CN105320367A (en) * 2015-09-09 2016-02-10 浙江工业大学 Stage interaction system
US10817065B1 (en) * 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
CN108663677A (en) * 2018-03-29 2018-10-16 上海智瞳通科技有限公司 A kind of method that multisensor depth integration improves target detection capabilities

Also Published As

Publication number Publication date
CN110515092A (en) 2019-11-29

Similar Documents

Publication Publication Date Title
CN110515092B (en) Plane touch method based on laser radar
CN107610084B (en) Method and equipment for carrying out information fusion on depth image and laser point cloud image
CN109477710B (en) Reflectance map estimation for point-based structured light systems
JP7180646B2 (en) Detection device, information processing device, detection method, detection program, and detection system
CN102968800B (en) A kind of evaluation method of image definition
US20200380653A1 (en) Image processing device and image processing method
CN114814758B (en) Camera-millimeter wave radar-laser radar combined calibration method and device
CN103729846A (en) LiDAR point cloud data edge detection method based on triangular irregular network
CN114719966A (en) Light source determination method and device, electronic equipment and storage medium
CN110378174A (en) Road extracting method and device
CN116704048B (en) Double-light registration method
CN111862208B (en) Vehicle positioning method, device and server based on screen optical communication
CN110765631B (en) Effective imaging pixel-based small target judgment method for infrared radiation characteristic measurement
CN110716209B (en) Map construction method, map construction equipment and storage device
CN116342452A (en) Image generation method and fusion imaging system
KR102025113B1 (en) Method for generating an image using a lidar and device for the same
CN112965052A (en) Monocular camera target ranging method
CN107490363B (en) Production intensity monitoring method for heat production enterprise based on surface temperature data
CN115436936A (en) Radar map determination method, device, equipment and medium for target detection
CN115327540A (en) Radar map-based landslide detection method, device, equipment and medium
CN116089546A (en) Typhoon cloud system identification method, typhoon cloud system identification system, typhoon cloud system identification terminal and storage medium
CN111047635A (en) Depth image-based plane touch method and device and touch system
CN111222504A (en) Bullet hole target scoring method, device, equipment and medium
JP7527532B1 (en) IMAGE POINT CLOUD DATA PROCESSING APPARATUS, IMAGE POINT CLOUD DATA PROCESSING METHOD, AND IMAGE POINT CLOUD DATA PROCESSING PROGRAM
RU2398240C1 (en) Method of measuring speed of extended objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: No.568 longmian Avenue, gaoxinyuan, Jiangning District, Nanjing City, Jiangsu Province, 211000

Patentee after: Xiaoshi Technology (Jiangsu) Co.,Ltd.

Address before: No.568 longmian Avenue, gaoxinyuan, Jiangning District, Nanjing City, Jiangsu Province, 211000

Patentee before: NANJING ZHENSHI INTELLIGENT TECHNOLOGY Co.,Ltd.