CN111337013A - Four-linear-array CCD-based multi-target point distinguishing and positioning system - Google Patents
- Publication number
- CN111337013A (application CN201911308933.6A)
- Authority
- CN
- China
- Prior art keywords
- linear array
- point light
- receiving end
- linear
- assembly
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention discloses a multi-target point distinguishing and positioning system based on a four-linear-array CCD (charge-coupled device), which consists of four one-dimensional image acquisition units with fixed positions, a target composed of three LED point light sources, and an embedded system. Each one-dimensional image acquisition unit consists of a cylindrical optical lens and a linear-array CCD device whose central axes are perpendicular to each other. A point light source in space is projected through each cylindrical optical lens at the receiving end onto the corresponding linear-array CCD plane, forming a line of light perpendicular to the CCD photosensitive array; where this line intersects the CCD it yields the projected position, and the three-dimensional position of the point light source is determined from the projection positions on the four CCDs. The three LED point light sources can be distinguished and measured without synchronous exposure between the transmitting end and the receiving end, and the embedded system calculates the three-dimensional coordinate positions of the three point light sources in space relative to the receiving-end sensor. The system offers good stability, good four-axis symmetric consistency of the coordinate system, and low cost.
Description
Technical Field
The invention relates to an optical measurement system, and in particular to a photoelectric multi-target recognition system. It is applied in the technical field of indoor and outdoor 3D-space multi-target-point real-time measurement instruments and in the technical field of photoelectric three-dimensional real-time positioning, and can also be applied to image-guided navigation surgery and to determining the moving position of an intelligent agent in space.
Background
Methods for identifying multiple point-light-source targets are frequently used in computer vision: two cameras photograph the same object from different positions, and each point on the object can be distinguished using the epipolar geometry principle. Linear-array CCDs are common in computer-vision metrology and, owing to their high resolution and excellent frame rate, have significant advantages over area-array CCDs for accurate coordinate measurement and dynamic position tracking. However, for a three-dimensional optical positioning measurement system based on linear-array CCDs, if two or more LED point light sources are lit simultaneously within its viewing area, the system cannot determine which projection corresponds to which LED point light source.
When an optical measurement system based on a three-linear-array CCD measures multiple point light sources, the LED point light sources must be controlled to light up one by one in sequence, i.e., only one LED point light source is lit in the field of view at a time, so that the measurement system knows which LED point light source corresponds to the current projection point. However, this requires synchronous exposure control between the light sources and the linear-array CCD measurement system, which necessarily increases the complexity and power consumption of the system. Existing three-dimensional optical positioning measurement systems based on linear-array CCDs still lack a feasible method of distinguishing and measuring multiple simultaneously lit LED point light sources, so such systems cannot work normally under that condition. It is therefore necessary to develop the related technology so that a linear-array CCD optical measurement system can work normally even when multiple LED point light sources are lit simultaneously. This defect of existing linear-array-CCD-based three-dimensional optical positioning measurement systems has become a technical problem to be solved urgently.
Disclosure of Invention
In order to solve the problems of the prior art, the invention aims to overcome the above defects and provide a multi-target point distinguishing and positioning system based on a four-linear-array CCD (charge-coupled device). Without requiring synchronous exposure between the point light sources of the transmitting-end target and the receiving-end sensor, three simultaneously lit LED point light sources can be distinguished using only the projection relation that each point light source satisfies on the four linear-array CCDs, and the three-dimensional coordinates of the three point light sources are measured simultaneously, so that the system can be used in the fields of image-guided navigation surgery, measurement of moving positions, and indoor position and attitude measurement.
In order to achieve the purpose of the invention, the invention adopts the following technical scheme:
a multi-target point distinguishing and positioning system based on a four-linear-array CCD comprises a transmitting-end target, a receiving-end sensor and an embedded system. The receiving-end sensor serves as the receiving end of a linear photoelectric sensor and receives the point-light-source signals of the transmitting-end target. The transmitting-end target consists of three point light sources, each consisting of an LED light-emitting point that transmits optical signals by infrared emission; the transmitting-end target can be mounted on any movable or static object, and the connecting lines among its three point light sources form a triangle with three unequal side lengths. The receiving-end sensor consists of four groups of photoelectric sensors and an embedded system; the four groups of photoelectric sensors comprise four linear-array CCD assemblies, four cylindrical optical lenses with fixed focal lengths and a sensor base. Each linear-array CCD assembly and its cylindrical optical lens form a one-dimensional imaging unit, one lens per CCD assembly. Any point light source in space is projected through the cylindrical optical lenses onto the planes of the linear-array CCD assemblies, forming four lines of light perpendicular to the photosensitive arrays of the assemblies. The linear-array CCD1, CCD2, CCD3 and CCD4 assemblies form the linear-array CCD assembly system of the receiving-end sensor, and the intersection points of the four lines of light with the photosensitive surfaces of the CCD1, CCD2, CCD3 and CCD4 assemblies are defined as λ1, λ2, λ3, λ4, respectively. By the structure of the receiving-end sensor, the four intersection points satisfy the relation λ1 + λ4 = λ2 + λ3.
The three LED point light sources A1, A2, A3 of the transmitting-end target are lit simultaneously, and each LED projects onto the four linear-array CCD assemblies; using the relation λ1 + λ4 = λ2 + λ3, the projection points λ1, λ2, λ3, λ4 corresponding to each of the three LED point light sources are determined. Three groups of four distinct projection points are thus obtained, and from each group the coordinates of the corresponding point light source in the coordinate system of the receiving-end sensor are reconstructed. The distances between the three reconstructed LED point light sources are calculated and compared with the known distances A1A2, A1A3, A2A3 between point light sources A1, A2, A3, whereby the matching correspondence of the three LED point light sources is distinguished. The embedded system calculates in real time the three-dimensional spatial coordinate position of each point light source of the transmitting-end target relative to the receiving-end sensor and stores the data, completing the multi-target point distinguishing and positioning tasks.
As a preferred technical scheme of the invention, the three point light sources A1, A2, A3 are lit simultaneously and satisfy the structural relation that the distances A1A2, A1A3, A2A3 are all unequal.
As a preferred technical scheme of the invention, the four projection points λ1, λ2, λ3, λ4 of any point light source on the four linear-array CCDs satisfy the relation λ1 + λ4 = λ2 + λ3. According to this relation, the four projection points corresponding to each LED point light source can be identified, so that the coordinate values of the LED point light source in the coordinate system of the receiving-end sensor are reconstructed and the projection points corresponding to the three LED point light sources are distinguished.
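As an illustrative check (not part of the patent text), the projection constraint can be verified numerically with a forward model consistent with the plane equations given later in the embodiment; the focal length f and lens offset d below are the embodiment's example values, and the 3-D test point is hypothetical:

```python
# Demonstration of the projection constraint λ1 + λ4 = λ2 + λ3 for one point source.
# f (focal length) and d (lens offset) follow the embodiment's example plane
# equations; the 3-D test point is hypothetical.
def project(x, y, z, f=50.0, d=75.0):
    """Project a 3-D point onto the four linear CCDs (one coordinate each)."""
    s = f - z  # z < 0 for points in front of the sensor plane
    lam1 = (f * x - d * z) / s  # CCD1, positive X half-axis
    lam2 = (f * y - d * z) / s  # CCD2, positive Y half-axis
    lam3 = (f * x + d * z) / s  # CCD3, negative X half-axis
    lam4 = (f * y + d * z) / s  # CCD4, negative Y half-axis
    return lam1, lam2, lam3, lam4

l1, l2, l3, l4 = project(100.0, 80.0, -400.0)
# (λ1 + λ4) and (λ2 + λ3) both equal f*(x + y) / (f - z), so they must agree.
assert abs((l1 + l4) - (l2 + l3)) < 1e-9
```

The constraint holds for any point in front of the sensor, which is what lets the system pick out the four projections belonging to the same source.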
As a preferred technical scheme of the invention, the three point light sources A1, A2, A3 are lit simultaneously and the distances A1A2, A1A3, A2A3 are all unequal; the coordinate values of the three reconstructed LED point light sources are then compared against the known distances A1A2, A1A3, A2A3 to distinguish the matching correspondence of the three LED point light sources.
Preferably, the intersection point of the straight lines where the photosensitive line arrays of the four linear array CCD assemblies are located is the origin of the coordinate system of the receiving end sensor.
As a preferred technical scheme of the invention, the four linear-array CCD assemblies are distributed on the same plane in a cross-shaped layout. With the intersection point of the straight lines containing the photosensitive arrays of the four linear-array CCDs taken as the origin of the coordinate system of the receiving-end sensor, the linear-array CCD1 assembly lies on the positive X half-axis, the linear-array CCD2 assembly on the positive Y half-axis, the linear-array CCD3 assembly on the negative X half-axis, and the linear-array CCD4 assembly on the negative Y half-axis of that coordinate system.
Preferably, the distances from the innermost ends of the four linear array CCD assemblies to the origin of the coordinate system of the receiving end sensor are all equal.
Preferably, the focal lengths of the four cylindrical optical lenses matched with the four linear array CCD assemblies are all equal.
Preferably, the lengths of the photosensitive arrays of the four linear-array CCD assemblies are all equal.
Preferably, no synchronous exposure is required between the point light sources of the transmitting-end target and the receiving-end sensor.
Compared with the prior art, the invention has the following obvious and prominent substantive characteristics and remarkable advantages:
1. the system adopts a photoelectric multi-target recognition system to carry out photoelectric three-dimensional real-time positioning; under the condition that a transmitting end target and a receiving end sensor are not required to be synchronous, the four one-dimensional image acquisition units are used for identifying a plurality of point light source targets which emit light simultaneously, and multi-target point distinguishing and positioning based on the four-linear array CCD device are realized;
2. the system has high measurement precision, high data measurement efficiency and low delay;
3. the system of the invention calculates the three-dimensional coordinate position of the three point light sources relative to the receiving end sensor in space through the embedded system, and the system of the invention has good working stability, good four-axis symmetry consistency of a coordinate system and low manufacturing and measuring cost of the device.
Drawings
FIG. 1 is a schematic structural diagram of the multi-target point distinguishing and positioning system according to a preferred embodiment of the present invention.
FIG. 2 is a first diagram illustrating multi-target point distinguishing and positioning according to a preferred embodiment of the present invention.
FIG. 3 is a second diagram illustrating multi-target point distinguishing and positioning according to a preferred embodiment of the present invention.
FIG. 4 is a flowchart illustrating the operation of the multi-target distinguishing and positioning system according to the preferred embodiment of the present invention.
Detailed Description
The above-described scheme is further illustrated below with reference to a specific embodiment:
in the present embodiment, referring to figs. 1-4, a multi-target point distinguishing and positioning system based on a four-linear-array CCD includes a transmitting-end target 1, a receiving-end sensor 2 and an embedded system 3. The receiving-end sensor 2 serves as the receiving end of a linear photoelectric sensor and receives the point-light-source signals of the transmitting-end target 1. The transmitting-end target 1 consists of three point light sources, each consisting of an LED light-emitting point 4 that transmits optical signals by infrared emission; the transmitting-end target 1 can be mounted on any movable or static object, and the connecting lines among its three point light sources form a triangle with three unequal side lengths. The receiving-end sensor 2 consists of four groups of photoelectric sensors and the embedded system 3; the four groups of photoelectric sensors comprise four linear-array CCD assemblies, four cylindrical optical lenses 6 with fixed focal lengths and a sensor base. Each linear-array CCD assembly and its cylindrical optical lens 6 form a one-dimensional imaging unit, one lens per CCD assembly. Any point light source in space is projected through the cylindrical optical lenses 6 onto the planes of the linear-array CCD assemblies, forming four lines of light perpendicular to the photosensitive arrays of the assemblies. The linear-array CCD1 assembly 7, linear-array CCD2 assembly 8, linear-array CCD3 assembly 9 and linear-array CCD4 assembly 10 form the linear-array CCD assembly system of the receiving-end sensor 2, and the intersection points of the four lines of light with the photosensitive surfaces of these assemblies are defined as λ1, λ2, λ3, λ4, respectively. By the structure of the receiving-end sensor 2, the four intersection points satisfy the relation λ1 + λ4 = λ2 + λ3. The three LED point light sources A1, A2, A3 of the transmitting-end target 1 are lit simultaneously, and each LED projects onto the four linear-array CCD assemblies; using the relation λ1 + λ4 = λ2 + λ3, the projection points λ1, λ2, λ3, λ4 corresponding to each of the three LED point light sources are determined. Three groups of four distinct projection points are thus obtained, and from each group the coordinates of the corresponding point light source in the coordinate system of the receiving-end sensor 2 are reconstructed. The distances between the three reconstructed LED point light sources are calculated and compared with the known distances A1A2, A1A3, A2A3 between the point light sources A1, A2, A3, whereby the matching correspondence of the three LED point light sources is distinguished. The embedded system 3 calculates in real time the three-dimensional spatial coordinate position of each point light source of the transmitting-end target 1 relative to the receiving-end sensor 2 and stores the data, completing the multi-target point distinguishing and positioning tasks.
In the present embodiment, the cylindrical optical lens 6 is used to realize the projection mapping from three-dimensional space to a one-dimensional image. According to the basic principles of optics, light from an object point on one side of the cylindrical optical lens 6 forms a linear image on the other side, and the linear-array CCD together with the cylindrical optical lens constitutes a One-Dimensional Imaging Unit (ODIU), as shown in fig. 1.
In this embodiment, the four linear-array CCDs are distributed on the same plane in a cross-shaped layout. With the intersection point of the straight lines containing the photosensitive arrays of the four linear-array CCDs taken as the origin of the coordinate system of the receiving-end sensor 2, the linear-array CCD1 assembly 7 lies on the positive X half-axis, the linear-array CCD2 assembly 8 on the positive Y half-axis, the linear-array CCD3 assembly 9 on the negative X half-axis, and the linear-array CCD4 assembly 10 on the negative Y half-axis of that coordinate system. The distances from the innermost ends of the four linear-array CCDs to the origin are all equal to 60 mm, the focal length of each cylindrical optical lens 6 is 50 mm, and the photosensitive arrays of the four linear-array CCDs are of equal length.
In the present embodiment, when one LED point light source in the transmitting-end target 1 emits light, its light passes through the four cylindrical optical lenses 6 of the receiving-end sensor 2 and forms four projection points λ1, λ2, λ3, λ4 on the photosensitive arrays of the linear-array CCD1 assembly 7, CCD2 assembly 8, CCD3 assembly 9 and CCD4 assembly 10, respectively, as shown in fig. 2. According to the designed structure of the receiving-end sensor 2, the four plane equations whose planes intersect at one point are listed as follows:
50*x + 0*y - (75 - λ1)*z = 50*λ1    (1)
0*x + 50*y - (75 - λ2)*z = 50*λ2    (2)
50*x + 0*y + (75 + λ3)*z = 50*λ3    (3)
0*x + 50*y + (75 + λ4)*z = 50*λ4    (4)
the coefficient matrix A and the augmented matrix B of these four equations are formed, and elementary row transformations are applied. From linear algebra, the necessary and sufficient condition for the four planes to intersect at a single point is rank(A) = rank(B) = 3, from which it follows that the four projection points λ1, λ2, λ3, λ4 of a point light source satisfy λ1 + λ4 = λ2 + λ3.
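As a non-authoritative sketch, the reconstruction implied by equations (1)-(4) can also be inverted in closed form: subtracting (3) from (1) eliminates x and isolates z, after which (1) and (2) give x and y. The forward model `project` below is derived from the same four equations (focal length 50 mm, lens offset 75 mm as in the embodiment); both functions are illustrative, not the patent's implementation, and the test point is hypothetical.

```python
def project(x, y, z, f=50.0, d=75.0):
    # Forward model derived from plane equations (1)-(4): one λ per linear CCD.
    s = f - z
    return ((f*x - d*z)/s, (f*y - d*z)/s, (f*x + d*z)/s, (f*y + d*z)/s)

def reconstruct(l1, l2, l3, l4, f=50.0, d=75.0):
    """Recover (x, y, z) from the four projections λ1..λ4.

    Subtracting eq.(3) from eq.(1) eliminates x and isolates z;
    eqs.(1) and (2) then yield x and y directly.
    """
    z = f * (l1 - l3) / (l1 - l3 - 2.0 * d)
    x = (l1 * (f - z) + d * z) / f
    y = (l2 * (f - z) + d * z) / f
    return x, y, z

# Round-trip on a hypothetical point (units: mm).
x, y, z = reconstruct(*project(100.0, 80.0, -400.0))
```

The round trip recovers the original coordinates, confirming that four λ readings from one source determine its position uniquely.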
When the three LED point light sources of the transmitting-end target 1 emit light simultaneously, they need not be turned on one by one and remain constantly lit, as shown in fig. 3. The three LED point light sources A1, A2, A3 of the transmitting-end target 1 each pass through the four cylindrical optical lenses 6 of the receiving-end sensor 2 and form four intersection points on the photosensitive surfaces of the four linear-array CCDs, defined as λ1_A1, λ2_A1, λ3_A1, λ4_A1; λ1_A2, λ2_A2, λ3_A2, λ4_A2; λ1_A3, λ2_A3, λ3_A3, λ4_A3. The set of projection points on the linear-array CCD1 assembly 7 is λccd1 = {λ1_A1, λ1_A2, λ1_A3}, on the linear-array CCD2 assembly 8 it is λccd2 = {λ2_A1, λ2_A2, λ2_A3}, on the linear-array CCD3 assembly 9 it is λccd3 = {λ3_A1, λ3_A2, λ3_A3}, and on the linear-array CCD4 assembly 10 it is λccd4 = {λ4_A1, λ4_A2, λ4_A3}.
Each projection point on the linear-array CCD1 assembly 7 is added to each projection point on the linear-array CCD4 assembly 10, giving all possible sums (λccd1 + λccd4); likewise, each projection point on the linear-array CCD2 assembly 8 is added to each projection point on the linear-array CCD3 assembly 9, giving all possible sums (λccd2 + λccd3).
All possible values of |(λccd1 + λccd4) - (λccd2 + λccd3)| are computed, and the three smallest are selected: Min{|(λccd1 + λccd4) - (λccd2 + λccd3)|}. From the four projection points corresponding to each of these three minima, the coordinates of three markers are reconstructed and denoted C1, C2, C3. The distances C1C2, C1C3, C2C3 are then compared with the known distances A1A2, A1A3, A2A3 to find the correspondence between the estimated points C1, C2, C3 and the actual points A1, A2, A3, so that the three luminous marker points A1, A2, A3 are successfully distinguished at the receiving end. The system flow is shown in fig. 4. The embedded system 3 calculates in real time the three-dimensional spatial coordinate position of each point light source of the transmitting-end target 1 relative to the receiving-end sensor 2 and stores the data, completing the multi-target point distinguishing and positioning tasks.
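The matching procedure above (enumerate candidate quadruples, keep the three smallest residuals |(λccd1 + λccd4) - (λccd2 + λccd3)|, reconstruct, then label by distance comparison) can be sketched as follows. All marker coordinates and the focal length/offset parameters are hypothetical demonstration values consistent with the embodiment's plane equations, not the patent's calibration.

```python
from itertools import product
import math

F, D = 50.0, 75.0  # assumed focal length and lens offset (embodiment example values)

def project(x, y, z, f=F, d=D):
    # Forward model consistent with plane equations (1)-(4): one λ per linear CCD.
    s = f - z
    return ((f*x - d*z)/s, (f*y - d*z)/s, (f*x + d*z)/s, (f*y + d*z)/s)

def reconstruct(l1, l2, l3, l4, f=F, d=D):
    # Invert the plane equations: subtracting (3) from (1) isolates z.
    z = f * (l1 - l3) / (l1 - l3 - 2.0 * d)
    return ((l1*(f - z) + d*z)/f, (l2*(f - z) + d*z)/f, z)

# Three simultaneously lit markers A1, A2, A3 (hypothetical coordinates, mm).
targets = [(100.0, 80.0, -400.0), (-60.0, 120.0, -500.0), (30.0, -90.0, -450.0)]
proj = [project(*p) for p in targets]
# Each CCD only reports an unordered set of three projection coordinates.
ccd = [sorted(q[i] for q in proj) for i in range(4)]

# Score every candidate quadruple (one λ from each CCD) by the constraint residual
# |(λccd1 + λccd4) - (λccd2 + λccd3)| and keep the three smallest.
cands = sorted(product(*ccd), key=lambda q: abs(q[0] + q[3] - q[1] - q[2]))
recovered = [reconstruct(*q) for q in cands[:3]]

# Label recovered points by comparing pairwise distances with the known target geometry.
pairs = [(0, 1), (0, 2), (1, 2)]
known = sorted(math.dist(targets[i], targets[j]) for i, j in pairs)
found = sorted(math.dist(recovered[i], recovered[j]) for i, j in pairs)
assert max(abs(a - b) for a, b in zip(known, found)) < 1e-6
```

With noiseless values the three correct quadruples have residual near zero while mixed quadruples do not, so the three minima recover the three markers; a real device would need a noise-aware threshold instead of exact minima.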
The multi-target point distinguishing and positioning system based on the four-linear-array CCD is thus composed of four one-dimensional image acquisition units with fixed positions, a target composed of three LED point light sources, and an embedded system. Each one-dimensional image acquisition unit is composed of a cylindrical optical lens 6 and a linear-array CCD device whose central axes are perpendicular to each other. A point light source in space is projected through each cylindrical optical lens at the receiving end onto the corresponding linear-array CCD plane, forming a line of light perpendicular to the CCD photosensitive array; where this line intersects the CCD it yields the projected position, and the three-dimensional position of the point light source is determined from the projection positions on the four CCDs. The three LED point light sources can be distinguished and measured without synchronous exposure between the transmitting end and the receiving end, and the embedded system calculates the three-dimensional coordinate positions of the three point light sources in space relative to the receiving-end sensor. The system offers good stability, good four-axis symmetric consistency of the coordinate system, and low cost.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to these embodiments. Various changes, modifications, substitutions, combinations or simplifications made according to the spirit and principle of the technical solution of the present invention shall be regarded as equivalent substitutions and shall fall within the protection scope of the present invention, provided that they meet the purpose of the invention and do not depart from the technical principle and inventive concept of the four-linear-array CCD-based multi-target point distinguishing and positioning system of the present invention.
Claims (10)
1. A multi-target point distinguishing and positioning system based on a four-linear-array CCD, characterized in that: the system comprises a transmitting-end target (1), a receiving-end sensor (2) and an embedded system (3); the receiving-end sensor (2) serves as the receiving end of a linear photoelectric sensor and receives the point-light-source signals of the transmitting-end target (1); the transmitting-end target (1) consists of three point light sources, each consisting of an LED light-emitting point (4) that transmits optical signals by infrared emission; the transmitting-end target (1) can be mounted on any movable or static object, and the connecting lines among the three point light sources of the transmitting-end target (1) form a triangle with three unequal side lengths;
the receiving end sensor (2) consists of four groups of photoelectric sensors and an embedded system (3), wherein each group of photoelectric sensors comprises four linear array CCD assemblies, four cylindrical optical lenses (6) with fixed focal lengths and a sensor base, each group of linear array CCD assembly and each cylindrical optical lens (6) respectively form a one-dimensional imaging unit, each linear array CCD assembly corresponds to one cylindrical optical lens (6), any point light source in space is projected onto the plane of each linear CCD assembly through each cylindrical optical lens (6) to form four linear light rays perpendicular to the linear arrays of the linear array CCD assemblies, a linear array CCD1 assembly (7), a linear array 2 assembly (8), a linear array 3 assembly (9) and a linear array CCD4 assembly (10) are arranged to form the linear array CCD assembly system of the receiving end sensor (2), and the linear array CCD1 assembly (7) is arranged, The intersection points of the light sensing surfaces of the linear array CCD2 assembly (8), the linear array CCD3 assembly (9) and the linear array CCD4 assembly (10) and four linear rays are respectively defined as lambda1,λ2,λ3,λ4By the structure of the receiving end sensor (2), the four intersection points satisfy the following relation: lambda [ alpha ]1+λ4=λ2+λ3;
Three of the transmitting end target (1)LED point light source A1,A2,A3Simultaneously lightening, and respectively projecting each LED to four linear array CCD assemblies according to the relation lambda satisfied by the linear array CCD assemblies1+λ4=λ2+λ3Respectively determining projection points lambda respectively corresponding to the three LED point light sources1,λ2,λ3,λ4(ii) a Therefore, three groups of four different projection points can be obtained, and the coordinates of each point light source under the coordinate system of the receiving end sensor (2) are respectively reconstructed and determined according to each group of projection points; calculating the distance between the three LED point light sources, and comparing the calculated distance value with the known point light source A1、A2、A3A distance A between1A2、A1A3、A1A2Comparing, and distinguishing matching comparison relations of the three LED point light sources; and the embedded system (3) calculates the three-dimensional space coordinate position data of each point light source in the transmitting end target (1) relative to the receiving end sensor (2) in real time, stores the data and completes the multi-target point distinguishing and positioning tasks.
2. The system according to claim 1, characterized in that: the three point light sources A1, A2, A3 are lit simultaneously, and the distances A1A2, A1A3, A2A3 are all unequal.
3. The system according to claim 1, characterized in that: the four projection points λ1, λ2, λ3, λ4 of any point light source on the four linear-array CCDs satisfy the relation λ1 + λ4 = λ2 + λ3; according to this relation, the four projection points corresponding to each LED point light source can be identified, so that the coordinate values of the LED point light source in the coordinate system of the receiving-end sensor (2) are reconstructed and the projection points corresponding to the three LED point light sources are distinguished.
4. The system according to claim 3, characterized in that: the three point light sources A1, A2, A3 are lit simultaneously, and the distances A1A2, A1A3, A2A3 are all unequal; the coordinate values of the three reconstructed LED point light sources are then compared against the known distances A1A2, A1A3, A2A3 to distinguish the matching correspondence of the three LED point light sources.
5. The system according to claim 1, wherein said system comprises: the intersection point of the straight lines where the photosensitive line arrays of the four linear array CCD assemblies are located is the origin of the coordinate system of the receiving end sensor (2).
6. The system according to claim 5, wherein said system comprises: the four linear array CCD components are distributed on the same plane, and the plane layout is in a cross shape; when the intersection point of the straight lines where the photosensitive line arrays of the four linear array CCDs are located is made to be the origin of the coordinate system of the receiving end sensor (2), the linear array CCD1 assembly (7) is located on the X-axis positive half shaft of the coordinate system of the receiving end sensor (2), the linear array CCD2 assembly (8) is located on the Y-axis positive half shaft of the coordinate system of the receiving end sensor (2), the linear array CCD3 assembly (9) is located on the X-axis negative half shaft of the coordinate system of the receiving end sensor (2), and the linear array CCD4 assembly (10) is located on the Y-axis negative half shaft of the coordinate system of the receiving end sensor (2).
7. The system according to claim 5, wherein: the distances from the innermost ends of the four linear array CCD assemblies to the origin of the coordinate system of the receiving end sensor (2) are all equal.
8. The system according to claim 1, wherein: the focal lengths of the four cylindrical optical lenses (6) matched with the four linear array CCD assemblies are all equal.
9. The system according to claim 1, wherein: the photosensitive line arrays of the four linear array CCD assemblies are of equal length.
10. The system according to claim 1, wherein: no synchronous exposure is required between the point light source of the transmitting end target (1) and the receiving end sensor (2).
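The abstract describes each cylindrical lens as projecting a point source onto its linear CCD as a line perpendicular to the photosensitive array, so each CCD reading constrains the source to a plane through the lens. The patent's exact solver is not given; a minimal least-squares sketch of the final intersection step, assuming the plane normals n_i and offsets c_i have already been derived from the four projection readings, is:

```python
# Hedged sketch, not the patent's actual algorithm: each linear-array
# CCD with its cylindrical lens constrains the point light source to a
# plane n_i . x = c_i; four such planes over-determine the 3-D position,
# which is recovered by least squares.
import numpy as np

def intersect_planes(normals, offsets):
    """Least-squares solution of n_i . x = c_i for the 3-D point x.

    `normals` is an (m, 3) array-like of plane normals, `offsets` the
    m corresponding scalar offsets; m >= 3 planes are needed, and the
    fourth CCD adds redundancy.
    """
    N = np.asarray(normals, dtype=float)
    c = np.asarray(offsets, dtype=float)
    x, *_ = np.linalg.lstsq(N, c, rcond=None)
    return x
```

With consistent planes the solution is exact; with noisy readings the fourth plane averages down the error, which is one benefit of using four arrays rather than the minimal three.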
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911308933.6A CN111337013B (en) | 2019-12-18 | 2019-12-18 | Four-linear array CCD-based multi-target point distinguishing and positioning system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111337013A true CN111337013A (en) | 2020-06-26 |
CN111337013B CN111337013B (en) | 2023-05-16 |
Family
ID=71181348
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911308933.6A Active CN111337013B (en) | 2019-12-18 | 2019-12-18 | Four-linear array CCD-based multi-target point distinguishing and positioning system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111337013B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020024598A1 (en) * | 2000-07-25 | 2002-02-28 | Satoshi Kunimitsu | Detecting system for container's location |
CN1431628A (en) * | 2003-02-14 | 2003-07-23 | 清华大学 | 3D real time positioning method based on linear CCD and its system |
CN203719535U (en) * | 2013-09-18 | 2014-07-16 | 赵伟东 | Positioning system for multiple CCD (charge-coupled device) large-scene indicating target |
CN103983189A (en) * | 2014-05-16 | 2014-08-13 | 哈尔滨工业大学 | Horizontal position measuring method based on secondary platform linear array CCDs |
CN104819718A (en) * | 2015-04-09 | 2015-08-05 | 上海大学 | 3D photoelectric sensing localization system |
CN110109056A (en) * | 2019-04-24 | 2019-08-09 | 广州市慧建科技有限公司 | A kind of multiple target laser orientation system |
Non-Patent Citations (1)
Title |
---|
CHUANG WANG ET AL.: "Multiple Targets Identification in the Linear CCD Measurement System", 2020 IEEE 4th Information Technology, Networking, Electronic and Automation Control Conference (ITNEC 2020) * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112710234A (en) * | 2020-12-17 | 2021-04-27 | 中国航空工业集团公司北京长城航空测控技术研究所 | Three-dimensional dynamic measuring device and measuring method based on linear array and area array |
CN114111601A (en) * | 2021-12-07 | 2022-03-01 | 合肥工业大学智能制造技术研究院 | Method for detecting position offset of assembly hole by utilizing linear array CCD technology |
CN114111601B (en) * | 2021-12-07 | 2024-01-30 | 合肥工业大学智能制造技术研究院 | Method for detecting position offset of assembly hole by utilizing linear array CCD technology |
Also Published As
Publication number | Publication date |
---|---|
CN111337013B (en) | 2023-05-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3348958B1 (en) | Multi-line array laser three-dimensional scanning system, and multi-line array laser three-dimensional scanning method | |
CN110044300B (en) | Amphibious three-dimensional vision detection device and detection method based on laser | |
WO2017041418A1 (en) | Multi-line array laser three-dimensional scanning system, and multi-line array laser three-dimensional scanning method | |
CA2370156C (en) | Method for optically detecting the shape of objects | |
CN111964694B (en) | Laser range finder calibration method for three-dimensional measurement | |
CN101901501B (en) | Method for generating laser color cloud picture | |
CN111492265A (en) | Multi-resolution, simultaneous localization and mapping based on 3D lidar measurements | |
CN105004324B (en) | A kind of monocular vision sensor with range of triangle function | |
CN102155923A (en) | Splicing measuring method and system based on three-dimensional target | |
CN102410811A (en) | Method and system for measuring parameters of bent pipe | |
CN102438111A (en) | Three-dimensional measurement chip and system based on double-array image sensor | |
KR20120006306A (en) | Indoor positioning apparatus and method | |
CN111854622B (en) | Large-field-of-view optical dynamic deformation measurement method | |
US20210203911A1 (en) | Light Field Imaging System by Projecting Near-Infrared Spot in Remote Sensing Based on Multifocal Microlens Array | |
CN111337013B (en) | Four-linear array CCD-based multi-target point distinguishing and positioning system | |
US20190339071A1 (en) | Marker, and Posture Estimation Method and Position and Posture Estimation Method Using Marker | |
CN115218813B (en) | Large-size reflective surface measurement method | |
CN104748720B (en) | Spatial angle measuring device and angle measuring method | |
CN112595236A (en) | Measuring device for underwater laser three-dimensional scanning and real-time distance measurement | |
CN108051005A (en) | The single PSD detection methods of Target space position and posture | |
CN202406199U (en) | Three-dimensional measure chip and system based on double-array image sensor | |
Yamauchi et al. | Calibration of a structured light system by observing planar object from unknown viewpoints | |
CN110146032A (en) | Synthetic aperture camera calibration method based on optical field distribution | |
CN113624158A (en) | Visual dimension detection system and method | |
CN117629567A (en) | High instantaneous speed model attitude angle online measurement method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||