WO2023157443A1 - Object orientation calculation device and object orientation calculation method - Google Patents
- Publication number
- WO2023157443A1 (PCT/JP2022/045955)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- dimensional
- coordinates
- vertices
- image
- pallet
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B66—HOISTING; LIFTING; HAULING
- B66F—HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
- B66F9/00—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
- B66F9/06—Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
- B66F9/075—Constructional features or details
- B66F9/20—Means for actuating or controlling masts, platforms, or forks
- B66F9/24—Electrical devices or systems
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/26—Measuring arrangements characterised by the use of optical techniques for measuring angles or tapers; for testing the alignment of axes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
Definitions
- The present invention relates to an object orientation calculation device and an object orientation calculation method, and in particular to ones suited to accurately calculating the orientation of a cargo-loaded pallet in distribution work in a warehouse or the like, so that transportation work using cargo handling equipment such as a forklift can be carried out smoothly.
- In the recent logistics style, cargo is commonly carried using a forklift together with a loading platform called a pallet.
- The pallet has insertion openings (fork pockets) on both sides of the surface on which the cargo is placed; by inserting the forks of a forklift into them, the pallet can easily be raised, lowered, and transported, enabling efficient work.
- An algorithm for obtaining the position and orientation of an object using 3D reference points is described, for example, in Non-Patent Document 1.
- Non-Patent Document 1 gives mathematical formulas representing the positional relationship between 3D reference points and their images in the 2D plane (§2 THE CAMERA POSE FROM THREE POINTS REVISITED, Fig. 1).
- In relation to this, Patent Document 1 discloses a three-dimensional measurement and display apparatus that calculates the position and orientation of an object based on a depth image acquired by a depth camera and a planar region extracted from the depth image.
- FIG. 13 is a perspective view of a pallet and a diagram explaining its specifications. As mentioned above, because measurement limitations of current depth cameras make the quality of the 3D point cloud low, we consider recognizing the position and orientation of the target pallet using the few points whose measurement quality is high.
- Detecting the position and orientation of the pallet specifically means obtaining the relative tilt between the camera used for measurement and the pallet (a concrete spatial picture is given later).
- As shown in FIG. 13, the pallet 10 has a pallet side face 12 of height h and width s and is provided with two fork pockets 11 into which the forks of a forklift are inserted for carrying; an inter-fork-pocket rectangular region 13 is formed between the left and right fork pockets 11l and 11r.
- When detecting the position and orientation of the pallet, a bounding box is cut out by image recognition; the four 2D vertices of the side face recognized in this way are a11, a12, a21, a22 as shown in FIG. 13.
- The coordinate system is a three-dimensional orthogonal coordinate system (X, Y, Z) as shown in the figure.
- The object orientation calculation device of the present invention is preferably configured to receive as input a two-dimensional image captured by a two-dimensional camera and a three-dimensional image captured by a three-dimensional camera, and to calculate the angle of a target object with respect to a predetermined coordinate system.
- The device comprises an image vertex recognition unit that recognizes a plurality of vertices of the target object from the two-dimensional image, a three-dimensional reference point determination unit that determines the three-dimensional coordinates of a three-dimensional reference point based on the coordinates of the recognized vertices, and an object orientation calculation unit that calculates the angle of one side face of the target object with respect to a predetermined axis of the predetermined coordinate system.
- This provides an object orientation calculation device and an object orientation calculation method that calculate the orientation of a cargo-loaded pallet accurately and allow transportation work using cargo handling equipment such as a forklift to be carried out smoothly.
- FIG. 1 is a diagram showing the configuration of a forklift transportation system at a physical distribution site according to Embodiment 1.
- FIG. 2 is a hardware/software configuration diagram of the object orientation calculation device.
- FIG. 8 is a diagram showing the configuration of a forklift transportation system at a physical distribution site according to Embodiment 2, and FIG. 9 is a flowchart showing the object orientation calculation processing according to Embodiment 2.
- FIG. 10 is a diagram showing specifications related to the 3D reference point coordinate correction processing of Embodiment 2 on the front surface of the pallet, and FIG. 11 is a diagram showing an example of the center point correction data table.
- FIG. 12 is a diagram showing specifications related to the 3D reference point coordinate correction processing of Embodiment 3 on the front surface of the pallet, and FIG. 13 is a perspective view of a pallet and a diagram explaining its specifications.
- Various kinds of information may be described using expressions such as "table", "list", and "queue", but they may be expressed in data structures other than these.
- Various information such as an "XX table", "XX list", or "XX queue" may be referred to as "XX information".
- When describing identification information, expressions such as "identification information", "identifier", "name", "ID", and "number" are used, but these are interchangeable.
- In the embodiments, processing performed by executing a program may be described.
- A computer executes a program by means of a processor (for example, a CPU or GPU) and performs the processing determined by the program while using storage resources (for example, memory) and interface devices (for example, a communication port); therefore, the subject of the processing performed by executing a program may be said to be the processor.
- Similarly, the subject of processing performed by executing a program may be a controller, device, system, computer, or node having a processor.
- The subject of the processing performed by executing a program may be any arithmetic unit, and may include a dedicated circuit that performs specific processing.
- A dedicated circuit is, for example, an FPGA (Field Programmable Gate Array), ASIC (Application Specific Integrated Circuit), or CPLD (Complex Programmable Logic Device).
- A program may be installed on a computer from a program source.
- The program source may be, for example, a program distribution server or a computer-readable storage medium.
- A program distribution server may include a processor and storage resources that store the program to be distributed, and its processor may distribute that program to other computers.
- Two or more programs may be implemented as one program, and one program may be implemented as two or more programs.
- Each embodiment according to the present invention is described below with reference to FIGS. 1 to 12.
- [Embodiment 1] A first embodiment according to the present invention is described below with reference to FIGS. 1 to 7.
- First, the configuration of a forklift transportation system at a distribution site is described using FIGS. 1 and 2.
- The forklift 20 is a machine that carries the pallet 10 by inserting its forks 25 into the fork pockets 11.
- The forklift 20 has an operation command unit 30 that gives instructions to the driving wheels 28 and the fork operating unit 26 to operate them according to external commands or an operation program.
- A 2D (two-dimensional) camera (RGB camera) 21 captures a two-dimensional image (2D point image), and a 3D (three-dimensional) camera (depth camera) 22 captures a three-dimensional image (3D point image); these images are wirelessly transmitted to the access point 5 via the external interface unit 27.
- The 3D camera 22 is a camera that can refer to depth information for the captured image.
- The 3D camera 22 may be realized by a stereo system that measures distance with two cameras, by a ToF (Time of Flight) system that measures the reflection time of light, or by a structured-light system that measures depth from the appearance of projected special light.
- The captured images are then transmitted from the access point 5 to the object orientation calculation device 100.
- The radar sensor 40 measures surrounding objects and transmits the measured data to the operation command unit 30.
- The object orientation calculation device 100 is a device that obtains the orientation of the pallet 10 (the angle it forms with the coordinate system) based on the captured images.
- The object orientation calculation device 100 has the following functional units: an image vertex recognition unit 101, a 3D (three-dimensional) reference point determination unit 102, an object angle calculation unit 103, an image reception unit 104, an object angle transmission unit 105, and a storage unit 110.
- The image vertex recognition unit 101 is a functional unit that performs image recognition on the 2D point image and recognizes the vertices of the side face of the pallet 10.
- The 3D reference point determination unit 102 is a functional unit that determines, from points in the 3D point image, the 3D reference point used for object angle calculation (details are given later).
- The object angle calculation unit 103 is a functional unit that calculates the angle of the pallet 10 from the 2D and 3D points.
- The image reception unit 104 is a functional unit that receives the two-dimensional image (2D point image) captured by the 2D camera 21 and the three-dimensional image captured by the 3D camera 22.
- The object angle transmission unit 105 is a functional unit that transmits the calculated angle of the pallet 10.
- The operation command unit 30 of the forklift 20 receives the angle of the pallet 10 via the access point 5 and the external interface unit 27 and generates appropriate pallet handling operation data.
- The storage unit 110 is a functional unit that holds the data necessary for the object orientation calculation device 100: an image data DB 111 that stores 2D and 3D image data, a measurement data DB 112 that stores measurement data necessary for operation, and an object orientation calculation data DB 113 that stores data for calculating the orientation of an object.
- Next, the hardware/software configuration of the object orientation calculation device 100 is described with reference to FIG. 2.
- As hardware, the object orientation calculation device 100 can be realized by a general information processing device such as the personal computer shown in FIG. 2.
- The object orientation calculation device 100 has a CPU (Central Processing Unit) 202, a main storage device 204, a network I/F (InterFace) 206, a display I/F 208, an input/output I/F 210, and an auxiliary storage I/F 212, coupled to one another via a bus.
- The CPU 202 controls each part of the object orientation calculation device 100, loads the necessary programs into the main storage device 204, and executes them.
- The main storage device 204 is normally composed of volatile memory such as RAM and stores the programs executed by the CPU 202 and the data they refer to.
- The network I/F 206 is an interface for connecting to a network.
- The display I/F 208 is an interface for connecting a display device 220 such as an LCD (Liquid Crystal Display).
- The input/output I/F 210 is an interface for connecting input/output devices.
- In the example of FIG. 2, a keyboard 230 and a mouse 232 as a pointing device are connected.
- The auxiliary storage I/F 212 is an interface for connecting auxiliary storage devices such as an HDD (Hard Disk Drive) 250 or an SSD (Solid State Drive).
- The HDD 250 has a large storage capacity and stores the programs for carrying out this embodiment.
- An image vertex recognition program 261, a 3D point determination program 262, an object angle calculation program 263, an image reception program 264, and an object angle transmission program 265 are installed in the object orientation calculation device 100.
- These programs execute the functions of the image vertex recognition unit 101, the 3D reference point determination unit 102, the object angle calculation unit 103, the image reception unit 104, and the object angle transmission unit 105, respectively.
- The HDD 250 also holds the image data DB 111, the measurement data DB 112 of measurement data necessary for operation, and the object orientation calculation data DB 113 used to calculate the orientation of the object (pallet).
- Next, the processing related to object orientation calculation in the forklift transportation system is described using FIGS. 3 and 4.
- First, the object orientation calculation device 100 receives the two-dimensional image captured by the 2D camera 21 via the access point 5 (S01).
- Next, the object orientation calculation device 100 receives the three-dimensional image captured by the 3D camera 22 via the access point 5 (S02).
- Next, the object orientation calculation device 100 calculates the orientation of the object (pallet) based on the two-dimensional image captured by the 2D camera 21 and the three-dimensional image captured by the 3D camera 22 (S03); this process is detailed below with reference to FIG. 4.
- Next, the object orientation calculation device 100 transmits information on the calculated orientation of the object (the pallet angle) to the forklift via the access point 5 (S04).
- The following process corresponds to S03 in FIG. 3.
- First, the image vertex recognition unit 101 of the object orientation calculation device 100 performs image recognition processing using AI learning in accordance with a pallet model, cuts out a bounding box from the two-dimensional image, and obtains the four vertices of the pallet as seen from the forklift 20 side (a11, a12, a21, a22 shown in FIG. 13) (S10).
- In practice, because of recognition error, the four vertices obtained are A11, A12, A21, A22, corresponding to a11, a12, a21, a22, respectively.
- Next, the 3D reference point determination unit 102 of the object orientation calculation device 100 calculates, from the three-dimensional image captured by the 3D camera 22, the coordinates of the point that serves as the reference for 3D points (hereinafter, the "3D reference point") (S20).
- How the coordinates of the 3D reference point are obtained is described in detail later.
- The posture of the object (pallet) is then calculated from the 2D coordinates of one of the four vertices seen from the forklift 20 side and from the 3D reference point (S30); the detailed algorithm is described later.
- The 3D reference point is obtained as the intersection of the straight lines passing through opposing pairs of the vertices A11, A12, A21, A22 recognized in 2D coordinates; this intersection is called the apparent pallet front center point 14 and is used as the 3D reference point.
- The 2D-to-3D conversion can be performed, for example, using the rs2_deproject_pixel_to_point function provided by Intel's RealSense (registered trademark) technology.
- The rs2_deproject_pixel_to_point function receives as arguments an object holding the camera's intrinsic parameters, the coordinates (X, Y) in the image, and the depth, and converts them into a point in a space whose origin is the camera.
- As the coordinate system, the left-right direction of the 3D camera 22 is taken as the X axis with the right side as +X, the front-back direction as the Z axis with the front as +Z, and the up-down direction as the Y axis with the downward side as +Y.
- The positional relationship between the 3D camera and the pallet as seen from behind the 3D camera is shown in FIG. 6, and as seen from above the 3D camera in FIG. 7.
- Here, as described with reference to FIG. 13, h is the height of the pallet 10 and s is its width.
- rv is the horizontal resolution of the 2D image (for example, 1920 pixels),
- rh is its vertical resolution (for example, 1080 pixels),
- α is its vertical angle of view (for example, 59 degrees), and
- β is its horizontal angle of view (for example, 90 degrees).
- This formula describes the relationship between 2D coordinates and 3D coordinates based on the properties of 2D images.
- Rotation of the pallet 10 about the three-dimensional axes also exists for the X and Z axes; in normal use, however, the pallet rests on level ground or a shelf, so for the forklift 20 to recognize the position of the pallet 10 it is considered sufficient to consider only the rotation angle about the vertical axis (the pallet swinging left and right).
- In this way, the orientation of a pallet on which cargo is loaded can be determined accurately with simple calculations, and transportation work using cargo handling equipment such as a forklift can be carried out smoothly.
- [Embodiment 2] A second embodiment according to the present invention is described below with reference to FIGS. 8 to 11.
- In Embodiment 1, an object orientation calculation device was described that uses the 2D image from a 2D camera and the 3D data from a 3D camera to calculate the angle of the pallet with respect to the coordinate system.
- This embodiment corrects the 3D reference point used in the calculation of Embodiment 1 so that the orientation can be calculated more accurately.
- The configuration of the forklift transportation system at the logistics site is almost the same as in Embodiment 1, except that a 3D reference point correction unit 106 is added to the object orientation calculation device 100.
- The 3D reference point correction unit 106 is a functional unit that obtains a corrected pallet front center point from the apparent pallet front center point 14 of Embodiment 1 (details are given later).
- Next, the processing of the object orientation calculation device in Embodiment 2 is described using FIGS. 9 to 11.
- The object orientation calculation processing of Embodiment 2 is almost the same as that of Embodiment 1 shown in FIG. 4, except that 3D reference point coordinate correction processing (S21) is added between S20 and S30.
- The 3D reference point coordinate correction process obtains a corrected pallet front center point from the apparent pallet front center point 14.
- The 3D reference point coordinate correction process is described in detail below with reference to FIGS. 10 and 11.
- Let L1 be the length of the line segment between A11 and the apparent pallet front center point 14, and L2 the length of the line segment between the apparent pallet front center point 14 and A22.
- The ratio L1/L2 of these segments is computed, and the corrected pallet front center point 15 is determined from the apparent pallet front center point 14 according to this segment ratio L1/L2.
- Let the coordinates of the corrected pallet front center point 15 be (xa, ya, za), and let the correction amounts applied to the apparent pallet front center point 14 in three-dimensional coordinates be (Δx, Δy, Δz). Then the following (Equation 6) holds; the equation itself is an image in the original, and a hedged reconstruction is sketched below.
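(Equation 6) is not reproduced in this text; from the surrounding definitions it is presumably just the additive relation between the apparent center point (x0, y0, z0) and the correction amounts:

```latex
% Hedged reconstruction of (Equation 6): the corrected pallet front center
% point is the apparent center point shifted by the correction amounts.
(x_a,\; y_a,\; z_a) = (x_0 + \Delta x,\; y_0 + \Delta y,\; z_0 + \Delta z)
```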
- The center point correction data table 1131 is a table that defines an X correction amount Δx (1131b), a Y correction amount Δy (1131c), and a Z correction amount Δz (1131d) for each segment ratio L1/L2 (1131a).
- Appropriate values for the X correction amount Δx, the Y correction amount Δy, and the Z correction amount Δz may be obtained, for example, by statistical processing of measurement data.
- Here, an example was described in which the correction amounts are obtained from a table that defines them for each segment ratio L1/L2; alternatively, a library function may be prepared that takes the segment ratio L1/L2 as an argument and returns the correction amounts, as sketched below.
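The source gives no concrete correction values, so the following is a minimal sketch of such a lookup, assuming a hypothetical table keyed by the segment ratio L1/L2 with linear interpolation between rows; all numbers are illustrative placeholders, not values from the patent:

```python
import bisect

# Hypothetical center point correction table:
# segment ratio L1/L2 -> (delta_x, delta_y, delta_z), placeholder values.
CORRECTION_TABLE = [
    (0.50, (-0.020, 0.000, 0.015)),
    (0.75, (-0.010, 0.000, 0.008)),
    (1.00, ( 0.000, 0.000, 0.000)),
    (1.25, ( 0.010, 0.000, 0.008)),
    (1.50, ( 0.020, 0.000, 0.015)),
]

def correction_amounts(ratio):
    """Return (dx, dy, dz) for a segment ratio L1/L2, interpolating
    linearly between table rows and clamping at the table ends."""
    keys = [k for k, _ in CORRECTION_TABLE]
    if ratio <= keys[0]:
        return CORRECTION_TABLE[0][1]
    if ratio >= keys[-1]:
        return CORRECTION_TABLE[-1][1]
    i = bisect.bisect_left(keys, ratio)
    (k0, v0), (k1, v1) = CORRECTION_TABLE[i - 1], CORRECTION_TABLE[i]
    w = (ratio - k0) / (k1 - k0)
    return tuple(a + w * (b - a) for a, b in zip(v0, v1))

# Corrected front center point = apparent center point + correction amounts.
x0, y0, z0 = 0.05, -0.02, 1.80          # apparent center (illustrative, meters)
dx, dy, dz = correction_amounts(1.2)
xa, ya, za = x0 + dx, y0 + dy, z0 + dz
```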
- The processing in S30 of calculating the posture of the object (pallet) from the 2D coordinates of one of the four vertices seen from the forklift 20 side and from the 3D reference point (the corrected pallet front center point) is the same as in Embodiment 1.
- [Embodiment 3] A third embodiment according to the present invention is described below with reference to FIG. 12. Like Embodiment 2, this embodiment corrects the 3D reference point used in the calculation of Embodiment 1 so that it is calculated more accurately.
- The configuration of the forklift transportation system at the distribution site and the processing of the object orientation calculation device are the same as in Embodiment 2.
- Only the contents of the 3D reference point coordinate correction processing (S21) differ from Embodiment 2.
- In the 3D reference point coordinate correction process of this embodiment, first, as in Embodiment 1, the apparent pallet front center point 14 is obtained as the intersection of the straight lines passing through opposing pairs of the vertices A11, A12, A21, A22 recognized in 2D coordinates.
- Then, centered on the apparent pallet front center point 14, a point cloud cutout region 16 extending a length t vertically and a length p horizontally (that is, of height 2t and width 2p) is taken.
- The (X, Y, Z) coordinates of the 3D points contained in this region are each averaged to obtain the coordinates (xa, ya, za) of the corrected pallet front center point 15; a code sketch follows the definitions below.
- The point cloud cutout region 16 is set to an appropriate size in view of the shape of the pallet.
- For example, the vertical length t and the horizontal length p are given by the following (Equation 9), where
- Hh is the height of the fork pocket 11,
- Hs is the width of the fork pocket 11, and
- Rs is the width of the inter-fork-pocket rectangular region 13.
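A minimal sketch of this Embodiment 3 correction, assuming the 3D frame is available as an N×3 array of camera-coordinate points with their pixel positions and that t and p are expressed in pixels; the array layout and variable names are assumptions, not the patent's implementation:

```python
import numpy as np

def corrected_front_center(pixels, points, center_px, t, p):
    """Average the 3D points whose pixels fall inside the cutout region 16
    (height 2t, width 2p) centered on the apparent front center point.

    pixels:    (N, 2) array of (u, v) pixel coordinates
    points:    (N, 3) array of corresponding (X, Y, Z) camera coordinates
    center_px: (u0, v0) pixel of the apparent pallet front center point 14
    t, p:      half-height and half-width of the region, in pixels
    """
    u0, v0 = center_px
    inside = (np.abs(pixels[:, 0] - u0) <= p) & (np.abs(pixels[:, 1] - v0) <= t)
    if not inside.any():
        raise ValueError("no valid 3D points inside the cutout region")
    # (xa, ya, za): coordinates of the corrected pallet front center point 15
    return points[inside].mean(axis=0)
```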
- The processing in S30 of calculating the posture of the object (pallet) from the 2D coordinates of one of the four vertices seen from the forklift 20 side and from the 3D reference point (the corrected pallet front center point) is the same as in Embodiments 1 and 2.
- As described above, according to Embodiments 2 and 3, the apparent pallet front center point 14 obtained from the four pallet vertices A11, A12, A21, A22 is corrected to obtain a more accurate 3D reference point, so a more accurate pallet orientation can be obtained.
Abstract
This object orientation calculation device has: an image vertex recognition unit into which a two-dimensional image captured by a two-dimensional camera and a three-dimensional image captured by a three-dimensional camera are inputted, and which recognizes a plurality of vertices of an object according to the two-dimensional image; a three-dimensional reference point determination unit that, on the basis of coordinates for the vertices recognized by the image vertex recognition unit, determines the three-dimensional coordinates of a three-dimensional reference point; and an object orientation calculation unit that calculates the angle of one side surface of the object relative to a prescribed axis for a prescribed coordinate system. The object orientation calculation unit calculates the angle of the object on the basis of the coordinates for the vertices in three-dimensional coordinates, a constraint pertaining to the three-dimensional reference point in the three-dimensional coordinates, and a constraint pertaining to the three-dimensional coordinates that correspond to the two-dimensional coordinates of the vertices based on the characteristics of the two-dimensional image. This makes it possible to improve the accuracy of calculating the orientation of pallets on which loads are mounted, and to smoothly carry out a conveyance operation using a forklift or other cargo-handling apparatus, in a forklift transportation system at a physical distribution site.
Description
The present invention relates to an object orientation calculation device and an object orientation calculation method, and in particular to ones suited to accurately calculating the orientation of a cargo-loaded pallet in distribution work in a warehouse or the like, so that transportation work using cargo handling equipment such as a forklift can be carried out smoothly.

In the recent logistics style, cargo is commonly carried using a forklift together with a loading platform called a pallet. The pallet has insertion openings (fork pockets) on both sides of the surface on which the cargo is placed; by inserting the forks of a forklift into them, the pallet can easily be raised, lowered, and transported, enabling efficient work.

In view of this work style, technologies have been proposed that recognize pallets in images and grasp their positions, for transport by unmanned automatic forklifts or for assisting the drivers of manned forklifts.

To recognize a pallet in images, there is a technique that recognizes the position and orientation of the pallet from 2D images and a 3D point cloud obtained with a depth camera or the like. Such position and orientation recognition generally uses the depths of two or more separated points on the object (for example, the left vertex and the right vertex). In actual distribution work, however, the measurement quality of the 3D point cloud is low depending on the site environment and the state of the detection target, so the position and orientation of the target object must be recognized using the few points whose measurement quality is high.

An algorithm for obtaining the position and orientation of an object using 3D reference points is described, for example, in Non-Patent Document 1, which gives mathematical formulas representing the positional relationship between 3D reference points and their images in the 2D plane (§2 THE CAMERA POSE FROM THREE POINTS REVISITED, Fig. 1).

In relation to this, Patent Document 1 discloses a three-dimensional measurement and display apparatus that calculates the position and orientation of an object based on a depth image acquired by a depth camera and a planar region extracted from the depth image.
The problem to be solved by the invention is described below with reference to FIG. 13.

FIG. 13 is a perspective view of a pallet and a diagram explaining its specifications. As mentioned above, because measurement limitations of current depth cameras make the quality of the 3D point cloud low, we consider recognizing the position and orientation of the target pallet using the few points whose measurement quality is high.

The number of required 3D points is therefore reduced: the position and orientation of the pallet are detected using one detectable 3D point plus one 2D point on the pallet side face and the positional relationship between these two points.

Here, detecting the position and orientation of the pallet specifically means obtaining the relative tilt between the camera used for measurement and the pallet (a concrete spatial picture is given later).

As shown in FIG. 13, the pallet 10 has a pallet side face 12 of height h and width s and is provided with two fork pockets 11 into which the forks of a forklift are inserted for carrying; an inter-fork-pocket rectangular region 13 is formed between the left and right fork pockets 11l and 11r. Pallets come in various standards; for example, the T11 pallet specified by JIS has s = 1100 mm and h = 144 mm.

When detecting the position and orientation of the pallet, a bounding box is cut out by image recognition; suppose the four 2D vertices of the side face recognized in this way are a11, a12, a21, a22 as shown in FIG. 13. The coordinate system is a three-dimensional orthogonal coordinate system (X, Y, Z) as shown in the figure.

Here, one would like to detect the position and orientation of the pallet from the upper-right 2D point a22 and the 3D point corresponding to the upper-left point a21. In practice, however, image recognition error arises, for example as shown in FIG. 13: the coordinates recognized for a11 become A11 and those for a21 become A21, and the 3D point corresponding to the 2D coordinates A21, which contain the recognition error, makes the error in the pallet position and orientation large.
It is an object of the present invention to provide an object orientation calculation device and an object orientation calculation method that calculate the orientation of a cargo-loaded pallet accurately so that transportation work using cargo handling equipment such as a forklift can be carried out smoothly.

The object orientation calculation device of the present invention is preferably configured to receive as input a two-dimensional image captured by a two-dimensional camera and a three-dimensional image captured by a three-dimensional camera, and to calculate the angle of a target object with respect to a predetermined coordinate system. The device comprises: an image vertex recognition unit that recognizes a plurality of vertices of the target object from the two-dimensional image; a three-dimensional reference point determination unit that determines the three-dimensional coordinates of a three-dimensional reference point based on the coordinates of the vertices recognized by the image vertex recognition unit; and an object orientation calculation unit that calculates the angle of one side face of the target object with respect to a predetermined axis of the predetermined coordinate system. The object orientation calculation unit calculates the angle based on the coordinates of a vertex in three-dimensional coordinates, a constraint on the three-dimensional reference point in three-dimensional coordinates, and a constraint relating the two-dimensional coordinates of the vertex to the corresponding three-dimensional coordinates based on the properties of the two-dimensional image.

According to the present invention, it is possible to provide an object orientation calculation device and an object orientation calculation method that calculate the orientation of a cargo-loaded pallet accurately and allow transportation work using cargo handling equipment such as a forklift to be carried out smoothly.
Embodiments of the present invention are described below with reference to the drawings. The embodiments are illustrations for explaining the present invention, and omissions and simplifications are made as appropriate for clarity of explanation. The present invention can also be implemented in various other forms. Unless otherwise specified, each component may be singular or plural.

The position, size, shape, range, and the like of each component shown in the drawings may not represent the actual ones, in order to facilitate understanding of the invention. The present invention is therefore not necessarily limited to the positions, sizes, shapes, ranges, and the like disclosed in the drawings.

Various kinds of information may be described using expressions such as "table", "list", and "queue", but they may be expressed in data structures other than these. For example, various information such as an "XX table", "XX list", or "XX queue" may be referred to as "XX information". When describing identification information, expressions such as "identification information", "identifier", "name", "ID", and "number" are used, but these are interchangeable.

When there are multiple components having the same or similar functions, they may be described with the same reference numeral and different suffixes; when there is no need to distinguish them, the suffixes may be omitted.

In the embodiments, processing performed by executing a program may be described. A computer executes a program by means of a processor (for example, a CPU or GPU) and performs the processing determined by the program while using storage resources (for example, memory) and interface devices (for example, a communication port); therefore, the subject of the processing performed by executing a program may be said to be the processor. Similarly, the subject of such processing may be a controller, device, system, computer, or node having a processor. The subject of the processing performed by executing a program may be any arithmetic unit, and may include a dedicated circuit that performs specific processing, for example an FPGA (Field Programmable Gate Array), ASIC (Application Specific Integrated Circuit), or CPLD (Complex Programmable Logic Device).

A program may be installed on a computer from a program source, which may be, for example, a program distribution server or a computer-readable storage medium. When the program source is a program distribution server, the server may include a processor and storage resources that store the program to be distributed, and the processor of the server may distribute that program to other computers. In the embodiments, two or more programs may be implemented as one program, and one program may be implemented as two or more programs.
Each embodiment according to the present invention is described below with reference to FIGS. 1 to 12.

[Embodiment 1]
A first embodiment according to the present invention is described below with reference to FIGS. 1 to 7.

First, the configuration of a forklift transportation system at a distribution site is described using FIGS. 1 and 2.
The forklift 20 is a machine that carries the pallet 10 by inserting its forks 25 into the fork pockets 11. The forklift 20 has an operation command unit 30 that gives instructions to the driving wheels 28 and the fork operating unit 26 to operate them according to external commands or an operation program.

A 2D (two-dimensional) camera (RGB camera) 21 captures a two-dimensional image (2D point image), and a 3D (three-dimensional) camera (depth camera) 22 captures a three-dimensional image (3D point image); these images are wirelessly transmitted to the access point 5 via the external interface unit 27. The 3D camera 22 is a camera that can refer to depth information for the captured image. The 3D camera 22 may be realized by a stereo system that measures distance with two cameras, by a ToF (Time of Flight) system that measures the reflection time of light, or by a structured-light system that measures depth from the appearance of projected special light.

The captured images are then transmitted from the access point 5 to the object orientation calculation device 100.

The radar sensor 40 measures surrounding objects and transmits the measured data to the operation command unit 30.

The object orientation calculation device 100 is a device that obtains the orientation of the pallet 10 (the angle it forms with the coordinate system) based on the captured images. It has the following functional units: an image vertex recognition unit 101, a 3D (three-dimensional) reference point determination unit 102, an object angle calculation unit 103, an image reception unit 104, an object angle transmission unit 105, and a storage unit 110.

The image vertex recognition unit 101 performs image recognition on the 2D point image and recognizes the vertices of the side face of the pallet 10. The 3D reference point determination unit 102 determines, from points in the 3D point image, the 3D reference point used for object angle calculation (details are given later). The object angle calculation unit 103 calculates the angle of the pallet 10 from the 2D and 3D points. The image reception unit 104 receives the two-dimensional image (2D point image) captured by the 2D camera 21 and the three-dimensional image captured by the 3D camera 22. The object angle transmission unit 105 transmits the calculated angle of the pallet 10.

The operation command unit 30 of the forklift 20 receives the angle of the pallet 10 via the access point 5 and the external interface unit 27 and generates appropriate pallet handling operation data.

The storage unit 110 holds the data necessary for the object orientation calculation device 100: an image data DB 111 that stores 2D and 3D image data, a measurement data DB 112 that stores measurement data necessary for operation, and an object orientation calculation data DB 113 that stores data for calculating the orientation of an object.
Next, the hardware/software configuration of the object orientation calculation device 100 is described with reference to FIG. 2. As hardware, the object orientation calculation device 100 can be realized by a general information processing device such as the personal computer shown in FIG. 2.

The object orientation calculation device 100 has a CPU (Central Processing Unit) 202, a main storage device 204, a network I/F (InterFace) 206, a display I/F 208, an input/output I/F 210, and an auxiliary storage I/F 212, coupled to one another via a bus.

The CPU 202 controls each part of the object orientation calculation device 100, loads the necessary programs into the main storage device 204, and executes them.

The main storage device 204 is normally composed of volatile memory such as RAM and stores the programs executed by the CPU 202 and the data they refer to.

The network I/F 206 is an interface for connecting to a network.

The display I/F 208 is an interface for connecting a display device 220 such as an LCD (Liquid Crystal Display).

The input/output I/F 210 is an interface for connecting input/output devices. In the example of FIG. 2, a keyboard 230 and a mouse 232 as a pointing device are connected.

The auxiliary storage I/F 212 is an interface for connecting auxiliary storage devices such as an HDD (Hard Disk Drive) 250 or an SSD (Solid State Drive).

The HDD 250 has a large storage capacity and stores the programs for carrying out this embodiment. An image vertex recognition program 261, a 3D point determination program 262, an object angle calculation program 263, an image reception program 264, and an object angle transmission program 265 are installed in the object orientation calculation device 100.

The image vertex recognition program 261, 3D point determination program 262, object angle calculation program 263, image reception program 264, and object angle transmission program 265 execute the functions of the image vertex recognition unit 101, the 3D reference point determination unit 102, the object angle calculation unit 103, the image reception unit 104, and the object angle transmission unit 105, respectively.

The HDD 250 also holds the image data DB 111, the measurement data DB 112 of measurement data necessary for operation, and the object orientation calculation data DB 113 used to calculate the orientation of the object (pallet).
Next, the processing related to object orientation calculation in the forklift transportation system is described using FIGS. 3 and 4.

First, the object orientation calculation device 100 receives the two-dimensional image captured by the 2D camera 21 via the access point 5 (S01).

Next, the object orientation calculation device 100 receives the three-dimensional image captured by the 3D camera 22 via the access point 5 (S02).

Next, the object orientation calculation device 100 calculates the orientation of the object (pallet) based on the two-dimensional image captured by the 2D camera 21 and the three-dimensional image captured by the 3D camera 22 (S03). This process is detailed below with reference to FIG. 4.

Next, the object orientation calculation device 100 transmits information on the calculated orientation of the object (the pallet angle) to the forklift via the access point 5 (S04).

Next, the details of the object orientation calculation process are described using FIG. 4 and the already-described FIG. 13.

This process corresponds to S03 in FIG. 3.
First, the image vertex recognition unit 101 of the object orientation calculation device 100 performs image recognition processing using AI learning in accordance with a pallet model, cuts out a bounding box from the two-dimensional image, and obtains the four vertices of the pallet as seen from the forklift 20 side (a11, a12, a21, a22 shown in FIG. 13) (S10). In practice, because of recognition error, the four vertices obtained are A11, A12, A21, A22, corresponding to a11, a12, a21, a22, respectively.

Next, the 3D reference point determination unit 102 of the object orientation calculation device 100 calculates, from the three-dimensional image captured by the 3D camera 22, the coordinates of the point that serves as the reference for 3D points (hereinafter, the "3D reference point") (S20). How the coordinates of the 3D reference point are obtained is detailed later.

Then, the posture of the object (pallet) is calculated from the 2D coordinates of one of the four vertices seen from the forklift 20 side and from the 3D reference point (S30). The detailed algorithm for calculating the orientation of the object (pallet) is described later.
Next, how the coordinates of the 3D reference point are obtained is described using FIG. 5.

This is the process corresponding to S20 in FIG. 4.

First, in the process of obtaining the 3D reference point in S20, the intersection of the straight lines passing through mutually opposing pairs of the vertices A11, A12, A21, A22 recognized in 2D coordinates is taken; this is called the "apparent pallet front center point". In this embodiment, the apparent pallet front center point 14 is used as the 3D reference point.

Let the 2D coordinates of A11, A12, A21, A22 be A11 = (x11, y11), A12 = (x12, y12), A21 = (x21, y21), and A22 = (x22, y22). Solving the simultaneous equations of the following (Equation 1) gives the 2D coordinates (x0, y0) of the apparent pallet front center point 14.
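(Equation 1) itself is not reproduced in this text; from the description it is presumably the pair of line equations for the two diagonals A11-A22 and A12-A21, solved for their intersection. A minimal sketch under that assumption (the function name and example coordinates are illustrative, not from the source):

```python
import numpy as np

def apparent_front_center(A11, A12, A21, A22):
    """Intersection of the diagonals A11-A22 and A12-A21 of the recognized
    quadrilateral, i.e. the apparent pallet front center point (2D)."""
    (x11, y11), (x12, y12), (x21, y21), (x22, y22) = A11, A12, A21, A22
    # Write each diagonal as a line a*x + b*y = c.
    a1, b1 = y22 - y11, x11 - x22            # line through A11 and A22
    c1 = a1 * x11 + b1 * y11
    a2, b2 = y21 - y12, x12 - x21            # line through A12 and A21
    c2 = a2 * x12 + b2 * y12
    # Solve the 2x2 simultaneous equations (the role of Equation 1).
    x0, y0 = np.linalg.solve([[a1, b1], [a2, b2]], [c1, c2])
    return x0, y0

# Example with a slightly skewed pallet front (pixel coordinates).
print(apparent_front_center((100, 300), (900, 280), (120, 60), (880, 40)))
```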
Then, using 3D software, the 3D coordinates (x0, y0, z0) of the apparent pallet front center point 14 corresponding to its 2D coordinates (x0, y0) are obtained.

This can be done, for example, using the rs2_deproject_pixel_to_point function provided by Intel's RealSense (registered trademark) technology. The rs2_deproject_pixel_to_point function receives as arguments an object holding the camera's intrinsic parameters, the coordinates (X, Y) in the image, and the depth, and converts them into a point in a space whose origin is the camera.
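A minimal sketch of this deprojection using the pyrealsense2 bindings; the pixel coordinates are placeholders, and the streaming setup is the generic one rather than anything specified by the source:

```python
import pyrealsense2 as rs

pipeline = rs.pipeline()
pipeline.start(rs.config())
try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()

    # Intrinsic parameters of the depth stream (focal length, principal point, ...).
    intrinsics = depth_frame.profile.as_video_stream_profile().intrinsics

    # (x0, y0): pixel coordinates of the apparent pallet front center point,
    # obtained beforehand from the 2D image (hypothetical values here).
    x0, y0 = 640, 360
    depth = depth_frame.get_distance(x0, y0)   # depth in meters at that pixel

    # Deproject pixel + depth into 3D coordinates with the camera at the origin.
    point_3d = rs.rs2_deproject_pixel_to_point(intrinsics, [x0, y0], depth)
    print(point_3d)                             # [x, y, z] in meters
finally:
    pipeline.stop()
```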
Next, the processing of calculating the posture of the object (pallet) from the 2D coordinates of one of the four vertices and from the 3D reference point is described using FIGS. 6 and 7.

This is the process corresponding to S30 in FIG. 4.

As the coordinate system, the left-right direction of the 3D camera 22 is taken as the X axis with the right side as +X, the front-back direction as the Z axis with the front as +Z, and the up-down direction as the Y axis with the downward side as +Y.

Here, assume that the pallet 10 is rotated by θy about the Y axis.

The positional relationship between the 3D camera and the pallet as seen from behind the 3D camera is shown in FIG. 6, and as seen from above the 3D camera in FIG. 7.

At this time, letting the coordinates of the upper-left vertex a21 of the pallet 10 be (xt, yt, zt), the relationship shown in the following (Equation 2) holds.
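(Equation 2) is an image in the original and is not reproduced here. As a hedged reconstruction from the surrounding definitions: if the front face of the pallet, of width s and height h, is rotated by θy about the vertical axis around the front center point (x0, y0, z0), the upper-left vertex would plausibly satisfy something like the following (the sign of the sine term depends on the rotation convention, which the text does not fix):

```latex
% Hedged reconstruction of (Equation 2), not the patent's literal formula.
% Camera coordinates: +X right, +Y down, +Z forward; front center (x_0, y_0, z_0).
\begin{aligned}
  x_t &= x_0 - \tfrac{s}{2}\cos\theta_y \\
  y_t &= y_0 - \tfrac{h}{2} \qquad \text{(``up'' is $-Y$, since $+Y$ points down)} \\
  z_t &= z_0 + \tfrac{s}{2}\sin\theta_y
\end{aligned}
```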
ここで、図13で説明したように、hは、パレット10の高さ、sは、パレット10の幅である。
Here, h is the height of the pallet 10 and s is the width of the pallet 10, as described with reference to FIG.
次に、パレット10の左上頂点a21の座標を(xt,yt,zt)と、画像認識処理を求められた左上頂点A21=(x21,y21)の関係を考察する。
Next, consider the relationship between the coordinates of the upper left vertex a 21 of the palette 10 (x t , y t , z t ) and the upper left vertex A 21 =(x 21 , y 21 ) obtained by image recognition processing.
この両者の間には、以下の(式3)の関係がある。
Between these two, there is the following relationship (Formula 3).
ここで、rvは、2D画像の左右の解像度(例えば、1920pixels)、rhは、2D画像の上下の解像度(例えば、1080pixels)、αは、2D画像の上下の画角(例えば、59度)、βは、2D画像の左右の画角(例えば、90度)である。
Here, r v is the left and right resolution of the 2D image (e.g., 1920 pixels), r h is the top and bottom resolution of the 2D image (e.g., 1080 pixels), α is the top and bottom angle of view of the 2D image (e.g., 59 degrees ), and β is the left and right angle of view (for example, 90 degrees) of the 2D image.
This equation describes the relationship between the 2D coordinates and the 3D coordinates based on the properties of the 2D image.
By eliminating (xt, yt, zt) from (Equation 2) and (Equation 3), using (Equation 4) below, a trivial trigonometric identity, and restricting the angle to the range 0 ≤ θy < π, the rotation angle θy of the pallet 10 about the Y axis is obtained.
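The formulas (Equation 2) through (Equation 4) are reproduced only as images in this publication. As a hedged sketch only, under an assumed symmetric pinhole projection using the resolutions and angles of view defined above, and under the assumption that the vertex lies s/2 to the camera's left of the front-face center (x0, y0, z0) and is rotated by θy about the Y axis, the same angle can be recovered numerically in Python; every name and sign convention below is an assumption rather than the patent's own formulation:

```python
import math

def project_x(theta_y, center, s, r_v, beta_deg):
    """Horizontal pixel coordinate of the front-face upper-left vertex.

    Assumed pinhole model: pixel_x = (r_v/2) * (1 + X / (Z * tan(beta/2))).
    The vertex is taken to lie s/2 to the camera's left of the front-face
    center, rotated by theta_y about the Y axis; the sign conventions are
    assumptions, not the typeset (Equation 2)/(Equation 3).
    """
    x0, _, z0 = center
    xt = x0 - (s / 2.0) * math.cos(theta_y)
    zt = z0 + (s / 2.0) * math.sin(theta_y)
    return (r_v / 2.0) * (1.0 + xt / (zt * math.tan(math.radians(beta_deg) / 2.0)))

def solve_theta_y(x21_px, center, s, r_v=1920, beta_deg=90, steps=18000):
    """Scan theta_y over [0, pi) and keep the angle whose projection
    best matches the recognized pixel coordinate x21_px."""
    return min(
        (abs(project_x(math.pi * i / steps, center, s, r_v, beta_deg) - x21_px),
         math.pi * i / steps)
        for i in range(steps)
    )[1]
```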
Rotations of the pallet 10 also exist about the X and Z axes. In practice, however, the pallet 10 normally rests on level ground or on a shelf, so for the forklift 20 to recognize the position of the pallet 10 it is considered sufficient to consider only the rotation angle about the vertical axis (the pallet swinging left and right).
As described above, according to the present embodiment, by using the 2D image obtained from the 2D camera and the 3D data obtained from the 3D camera, the orientation of a pallet on which cargo is loaded can be grasped accurately with a simple calculation, and transport work by cargo handling equipment such as a forklift can be carried out smoothly.
[Embodiment 2]
A second embodiment according to the present invention will be described below with reference to FIGS. 8 to 11.
In Embodiment 1, an object orientation calculation device was described that uses the 2D image obtained from a 2D camera and the 3D data obtained from a 3D camera to calculate the angle of the pallet with respect to the coordinate system.
The present embodiment corrects the 3D reference point used in the calculation of Embodiment 1 so that it is calculated more accurately.
The description of this embodiment focuses on the differences from Embodiment 1.
The configuration of the forklift transportation system at the logistics site is substantially the same as in Embodiment 1, except that a 3D reference point correction unit 106 is added to the object orientation calculation device 100. The 3D reference point correction unit 106 is a functional unit that obtains a corrected pallet front center point from the apparent pallet front center point 14 of Embodiment 1 (details will be described later).
Next, the processing of the object orientation calculation device in Embodiment 2 will be described with reference to FIGS. 9 to 11.
The object orientation calculation processing of Embodiment 2 is substantially the same as that of Embodiment 1 shown in FIG. 4, except that a 3D reference point coordinate correction process (S21) is added between S20 and S30.
The 3D reference point coordinate correction process obtains the corrected pallet front center point from the apparent pallet front center point 14.
The 3D reference point coordinate correction process will be described in detail below with reference to FIGS. 10 and 11.
The 3D reference point coordinate correction process of this embodiment assumes that, as shown in Embodiment 1, the apparent pallet front center point 14 has already been obtained as the intersection of the straight lines passing through the opposing pairs of the vertices A11, A12, A21, A22 recognized in 2D coordinates.
Next, let L1 be the length of the line segment from A11 to the apparent pallet front center point 14, and L2 be the length of the line segment from the apparent pallet front center point 14 to A22.
L1 and L2 are obtained by (Equation 5) below.
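The typeset (Equation 5) is not reproduced in this text; from the definitions above it is presumably the ordinary planar Euclidean distance, which may be written (a reconstruction, not the original image) as:

$$L_1 = \sqrt{(x_{11} - x_0)^2 + (y_{11} - y_0)^2}, \qquad L_2 = \sqrt{(x_{22} - x_0)^2 + (y_{22} - y_0)^2}$$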
Then L1/L2 is obtained as the ratio of these line segments, and the corrected pallet front center point 15 is obtained from the apparent pallet front center point 14 according to this line segment ratio L1/L2.
Letting (xa, ya, za) be the coordinates of the corrected pallet front center point 15 and (Δx, Δy, Δz) be the correction amounts applied to the three-dimensional coordinates of the apparent pallet front center point 14, (Equation 6) below holds.
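Similarly, (Equation 6) appears only as an image; from the prose it is presumably the additive correction (again a reconstruction, not the original image):

$$(x_a,\; y_a,\; z_a) = (x_0 + \Delta_x,\; y_0 + \Delta_y,\; z_0 + \Delta_z)$$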
To obtain the correction amounts (Δx, Δy, Δz) from the line segment ratio L1/L2, it suffices, for example, to hold and consult data such as the center point correction data table shown in FIG. 11, with one entry per line segment ratio L1/L2.
As shown in FIG. 11, the center point correction data table 1131 defines an X correction amount Δx 1131b, a Y correction amount Δy 1131c, and a Z correction amount Δz 1131d for each line segment ratio L1/L2 1131a.
Appropriate values for the X correction amount Δx 1131b, the Y correction amount Δy 1131c, and the Z correction amount Δz 1131d may be obtained, for example, by statistical data processing.
In the description above, the correction amounts were obtained from table data that defines a correction amount for each line segment ratio L1/L2; alternatively, a library function that takes the line segment ratio L1/L2 as an argument and returns the correction amounts as its value may be prepared and used to obtain the correction amounts for a given L1/L2.
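A minimal Python sketch of the table-lookup variant, assuming a table keyed by discretized line segment ratios; the ratio grid and correction values below are placeholders, not measured data:

```python
import bisect

# Placeholder entries standing in for the center point correction data
# table 1131: sorted line-segment ratios L1/L2 -> (dx, dy, dz) in meters.
RATIOS = [0.8, 0.9, 1.0, 1.1, 1.2]
CORRECTIONS = [(-0.02, 0.0, 0.010), (-0.01, 0.0, 0.005),
               (0.00, 0.0, 0.000), (0.01, 0.0, 0.005),
               (0.02, 0.0, 0.010)]

def correction_for(ratio):
    """Return (dx, dy, dz) for the table entry nearest to the given ratio."""
    i = bisect.bisect_left(RATIOS, ratio)
    if i == 0:
        return CORRECTIONS[0]
    if i == len(RATIOS):
        return CORRECTIONS[-1]
    # Choose the closer of the two neighbouring entries.
    if ratio - RATIOS[i - 1] <= RATIOS[i] - ratio:
        return CORRECTIONS[i - 1]
    return CORRECTIONS[i]
```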
The subsequent processing of S30, which calculates the orientation of the object (pallet) from the 2D coordinates of one of the four vertices as seen from the forklift 20 and the 3D reference point (the corrected pallet front center point), is the same as in Embodiment 1.
However, instead of (Equation 2) of Embodiment 1, (Equation 7) below, in which the coordinates are replaced, is used.
In this embodiment, the apparent pallet front center point 14 obtained from the four pallet vertices A11, A12, A21, A22 is corrected and a more accurate 3D reference point is adopted; as a result, a more accurate pallet orientation can be obtained.
[Embodiment 3]
A third embodiment according to the present invention will be described below with reference to FIG. 12.
Like Embodiment 2, this embodiment corrects the 3D reference point used in the calculation of Embodiment 1 so that it is calculated more accurately.
This embodiment is also described focusing on the differences from Embodiment 1.
The configuration of the forklift transportation system at the logistics site and the processing of the object orientation calculation device are the same as in Embodiment 2.
This embodiment differs from Embodiment 2 in the content of the 3D reference point coordinate correction process (S21).
The 3D reference point coordinate correction process will be described in detail below with reference to FIG. 12.
The 3D reference point coordinate correction process of this embodiment also assumes that the apparent pallet front center point 14 has already been obtained as the intersection of the straight lines passing through the opposing pairs of the vertices A11, A12, A21, A22 recognized in 2D coordinates.
Here, a point cloud crop region 16 extending a length t above and below and a length p to the left and right of the apparent pallet front center point 14 (i.e., of height 2t and width 2p) is formed.
Then, assuming that the image acquired in S02 of FIG. 3 has m 3D coordinate points within the point cloud crop region 16, the arithmetic mean of their (X, Y, Z) coordinates is taken per axis as in (Equation 8) below, and the result is used as the coordinates (xa, ya, za) of the corrected pallet front center point 15.
Here, (xi, yi, zi) (i = 1, ..., m) are the coordinates of the points of the 3D image within the point cloud crop region 16.
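A hedged Python sketch of (Equation 8) as the prose describes it, the per-axis arithmetic mean over the m points inside the crop region; the membership test is an assumption, since the patent defines the region only by its half-height t and half-width p around the apparent center point:

```python
def corrected_center(points, apparent_center, t, p):
    """Arithmetic mean of the 3D points whose X/Y coordinates fall within
    the crop region of half-width p and half-height t around the apparent
    pallet front center point. `points` is an iterable of (x, y, z).
    """
    x0, y0, _ = apparent_center
    inside = [(x, y, z) for (x, y, z) in points
              if abs(x - x0) <= p and abs(y - y0) <= t]
    m = len(inside)
    if m == 0:
        raise ValueError("no points inside the crop region")
    xa = sum(x for x, _, _ in inside) / m
    ya = sum(y for _, y, _ in inside) / m
    za = sum(z for _, _, z in inside) / m
    return (xa, ya, za)
```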
The point cloud crop region 16 is given an appropriate size in view of the shape of the pallet. For example, the vertical length t and horizontal length p take values such as those in (Equation 9) below.
Here, Hh is the height of the fork pocket 11, Hs is the width of the fork pocket 11, and Rs is the width of the rectangular region between the fork pockets.
The subsequent processing of S30, which calculates the orientation of the object (pallet) from the 2D coordinates of one of the four vertices as seen from the forklift 20 and the 3D reference point (the corrected pallet front center point), is the same as in Embodiments 1 and 2.
As in Embodiment 2, (Equation 7), in which the coordinates are replaced, is used instead of (Equation 2) of Embodiment 1.
In this embodiment as well, as in Embodiment 2, the apparent pallet front center point 14 obtained from the four pallet vertices A11, A12, A21, A22 is corrected and a more accurate 3D reference point is adopted; as a result, a more accurate pallet orientation can be obtained.
5... Access point,
10... Pallet, 11... Fork pocket, 12... Pallet side surface, 13... Rectangular region between fork pockets, 14... Apparent pallet front center point, 15... Corrected pallet front center point, 16... Point cloud crop region,
20... Forklift, 21... 2D camera (RGB camera), 22... 3D camera (depth camera), 25... Fork, 26... Fork operation unit, 27... External interface unit, 28... Drive wheels, 30... Operation command unit, 40... Radar sensor,
100... Object orientation calculation device,
101... Image vertex recognition unit, 102... 3D reference point determination unit, 103... Object angle calculation unit, 104... Image reception unit, 105... Object angle transmission unit, 106... 3D reference point correction unit, 110... Storage unit, 111... Image data DB, 112... Measurement data DB, 113... Object orientation calculation data DB
Claims (8)
1. An object orientation calculation device that receives a two-dimensional image captured by a two-dimensional camera and a three-dimensional image captured by a three-dimensional camera and calculates an angle of a target object with respect to a predetermined coordinate system, the device comprising:
an image vertex recognition unit that recognizes a plurality of vertices of the target object from the two-dimensional image;
a three-dimensional reference point determination unit that determines the three-dimensional coordinates of a three-dimensional reference point based on the coordinates of the vertices recognized by the image vertex recognition unit; and
an object orientation calculation unit that calculates the angle of one side surface of the target object with respect to a predetermined axis of the predetermined coordinate system,
wherein the object orientation calculation unit calculates the angle based on the coordinates of the vertices in three-dimensional coordinates, a constraint condition on the three-dimensional reference point in three-dimensional coordinates, and a constraint condition, based on the properties of the two-dimensional image, between the two-dimensional coordinates of the vertices and the corresponding three-dimensional coordinates.
2. The object orientation calculation device according to claim 1, wherein the target object has a rectangular parallelepiped shape, and the vertices recognized by the image vertex recognition unit include the four vertices of one side surface of the rectangular parallelepiped shape.
3. The object orientation calculation device according to claim 2, wherein the three-dimensional reference point is placed at the intersection of the line segments connecting the opposing pairs of the four vertices of the side surface.
4. The object orientation calculation device according to claim 2, wherein the three-dimensional reference point is a point obtained by correcting the intersection of the line segments connecting the opposing pairs of the four vertices of the side surface, based on the ratio of the length of the first line segment connecting one opposing pair of points to the length of the second line segment.
5. The object orientation calculation device according to claim 2, wherein image data is acquired for a fixed region centered on the intersection of the line segments connecting the opposing pairs of the four vertices of the side surface, and the three-dimensional reference point is a point whose coordinates are the arithmetic means of the respective coordinates, in three-dimensional coordinates, of the image data.
6. The object orientation calculation device according to claim 1, wherein the target object is a pallet.
7. The object orientation calculation device according to claim 5, wherein the target object is a pallet, and the fixed region includes the rectangular region between the fork pockets of the pallet.
8. An object orientation calculation method for a forklift transportation system comprising a two-dimensional camera that captures a two-dimensional image, a three-dimensional camera that captures a three-dimensional image, a forklift, and an object orientation calculation device that receives the two-dimensional image captured by the two-dimensional camera and the three-dimensional image captured by the three-dimensional camera and calculates the angle of a target object with respect to a predetermined coordinate system, the method comprising:
a step in which the object orientation calculation device acquires the two-dimensional image captured by the two-dimensional camera;
a step in which the object orientation calculation device acquires the three-dimensional image captured by the three-dimensional camera;
an image vertex recognition step in which the object orientation calculation device recognizes a plurality of vertices of a pallet from the two-dimensional image;
a three-dimensional reference point determination step in which the object orientation calculation device determines the three-dimensional coordinates of a three-dimensional reference point based on the coordinates of the vertices recognized in the image vertex recognition step;
an object orientation calculation step in which the object orientation calculation device calculates the angle of one side surface of the target object with respect to a predetermined axis of the predetermined coordinate system;
a step in which the object orientation calculation device transmits the calculated angle to the forklift; and
a step in which the forklift maneuvers the pallet based on the transmitted angle,
wherein in the object orientation calculation step the angle is calculated based on the coordinates of the vertices in three-dimensional coordinates, a constraint condition on the three-dimensional reference point in three-dimensional coordinates, and a constraint condition, based on the properties of the two-dimensional image, between the two-dimensional coordinates of the vertices and the corresponding three-dimensional coordinates.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-025085 | 2022-02-21 | | |
JP2022025085 | | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023157443A1 (en) | 2023-08-24 |
Family
ID=87578043
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/045955 WO2023157443A1 (en) | 2022-02-21 | 2022-12-14 | Object orientation calculation device and object orientation calculation method |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2023157443A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230410360A1 (en) * | 2022-06-15 | 2023-12-21 | Ai In Motion B.V. | System and Method for Load-Carrier Pose Estimation |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015056057A (en) * | 2013-09-12 | 2015-03-23 | トヨタ自動車株式会社 | Method of estimating posture and robot |
JP2016210586A (en) * | 2015-05-12 | 2016-12-15 | 株式会社豊田中央研究所 | Fork lift |
US20180089517A1 (en) * | 2016-08-10 | 2018-03-29 | Barry D. Douglas | Pallet localization systems and methods |
US20210274149A1 (en) * | 2020-02-27 | 2021-09-02 | Jungheinrich Aktiengesellschaft | Method for calibrating a sensor unit of an industrial truck |
JP2021160931A (en) * | 2020-03-31 | 2021-10-11 | 株式会社豊田自動織機 | Cargo handling system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11636605B2 (en) | Robotic system with automated package registration mechanism and minimum viable region detection | |
JP7304961B2 (en) | box detection | |
US11192250B1 (en) | Methods and apparatus for determining the pose of an object based on point cloud data | |
EP3063598B1 (en) | Systems, methods, and industrial vehicles for determining the visibility of features | |
US10102629B1 (en) | Defining and/or applying a planar model for object detection and/or pose estimation | |
KR101663977B1 (en) | Method and apparatus for sharing map data associated with automated industrial vehicles | |
CN112009812A (en) | Robot system with dynamic packaging mechanism | |
KR20210054448A (en) | A robotic system with wall-based packing mechanism and methods of operating the same | |
JP6433122B2 (en) | Enhanced mobile platform positioning | |
US20200143557A1 (en) | Method and apparatus for detecting 3d object from 2d image | |
AU2014343128A1 (en) | Systems, methods, and industrial vehicles for determining the visibility of features | |
CN110815202A (en) | Obstacle detection method and device | |
WO2023157443A1 (en) | Object orientation calculation device and object orientation calculation method | |
CN113253737A (en) | Shelf detection method and device, electronic equipment and storage medium | |
KR20180046361A (en) | Method and System for loading optimization based on depth sensor | |
US10679367B2 (en) | Methods, systems, and apparatuses for computing dimensions of an object using angular estimates | |
CN116309882A (en) | Tray detection and positioning method and system for unmanned forklift application | |
JP7312089B2 (en) | Measuring device | |
JP7192706B2 (en) | Position and orientation estimation device | |
CN116359932A (en) | Barrier distance feedback method and device based on laser radar sensor | |
CN113345023B (en) | Box positioning method and device, medium and electronic equipment | |
CN117011362A (en) | Method for calculating cargo volume and method for dynamically calculating volume rate | |
CN111498213B (en) | Robot system with dynamic packaging mechanism | |
CN110857859B (en) | Obstacle detection method and device | |
WO2024219072A1 (en) | Positional information setting method, program, and information processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22927329; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |