
JPH09105608A - Measuring method for position of load - Google Patents

Measuring method for position of load

Info

Publication number
JPH09105608A
Authority
JP
Japan
Prior art keywords
load
image
luggage
package
height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP26188695A
Other languages
Japanese (ja)
Inventor
Eiji Matsukuma
英治 松隈
Naohito Soma
直仁 相馬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Steel Corp
Original Assignee
Nippon Steel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Steel Corp filed Critical Nippon Steel Corp
Priority to JP26188695A priority Critical patent/JPH09105608A/en
Publication of JPH09105608A publication Critical patent/JPH09105608A/en
Pending legal-status Critical Current


Landscapes

  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

PROBLEM TO BE SOLVED: To obtain a measuring method in which the position of a load can be measured accurately and reliably with a simple configuration, by measuring the height of the camera above the load and compositing a mask image with an image captured while the load is illuminated from an oblique direction. SOLUTION: A palletizing computer 3 transmits measurement-position data to a robot control device 2, and the device 2 moves the hand of a robot 1 to the measuring position. On receiving a movement-completion signal from the device 2, the computer 3 turns on a laser marker 6 in order to measure the height of the camera above the load. Image data captured by a CCD camera 8 while slit light strikes the load from an oblique direction are processed by an image processing device 4, and the height position is computed. The computer 3 then turns on an illuminator, takes in an image captured while the load is lit from an oblique direction, and takes in a second image with the illuminator turned off. From these images the device 4 computes the in-plane position of the load and its angle of rotation.

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

TECHNICAL FIELD: The present invention relates to a method of measuring the position of a load, such as a cardboard box stacked on a pallet, in handling operations in which the load is picked up and unloaded.

[0002]

BACKGROUND OF THE INVENTION: Handling devices such as robots are used to automate unloading work in which loads such as cardboard boxes stacked on a pallet are taken off the pallet or transferred to another pallet. In general, the loads on a pallet are of the same type and the same dimensions and are stacked in a prescribed arrangement; the loading position and orientation of each load on the pallet are stored in the handling device in advance, and its motion is controlled from that information. In practice, however, the loads deviate from the prescribed positions in height, in horizontal translation, and in horizontal rotation because of positional shifts during transport, deflection of the pallet, and variation in placement when the pallet is loaded by hand. As a result, the handling device may fail to grip a load, or, even if it grips the load, may be unable to place it accurately on another pallet, which causes trouble when the next load is stacked.

[0003] As a conventional method of measuring the position of a load immediately before gripping it and correcting the grip position, Japanese Patent Application Laid-Open No. 2-310214 discloses a handling robot that measures the three-dimensional position of a load and takes it off the pallet by combining height measurement with a height sensor and two-dimensional horizontal measurement with an image measuring device.

[0004] Japanese Patent Application Laid-Open No. 6-241719 describes an apparatus that images an article irradiated with slit light and measures the position of the article using dimensional information about the article.

[0005]

PROBLEMS TO BE SOLVED BY THE INVENTION: In the measuring method described in Japanese Patent Application Laid-Open No. 2-310214, however, control is complicated because the height measurement and the two-dimensional measurement are performed with separate device configurations. The two-dimensional image measurement relies on contour-based measurement, pattern matching, and similar techniques, so patterns on the top surface of the load can cause erroneous detection, and the processing time is long because the computation is complex. Moreover, measurement accuracy is low because the imaging device covers a wide field of view spanning the entire pallet, and although an illumination device is provided for imaging, a stable image cannot be obtained because the illuminance on the load changes between day and night under the influence of sunlight. In addition, marks must be attached to the loads, which takes time, and the method is difficult to apply to plastic containers and other loads whose surfaces are not flat.

[0006] The measuring method described in Japanese Patent Application Laid-Open No. 6-241719 detects only the horizontal position of an article, so if the article's height deviates from the prescribed position because the article is crushed, the pallet deflects, or the like, a horizontal measurement error arises in proportion to the deviation. Because the height of the article cannot be measured, the handling device may collide with an article when the number of articles on the pallet differs from the regular number, damaging the article or the handling device. Furthermore, only a small number of measurement points can be taken, so accuracy suffers when a box is deformed, and the method is hard to apply to plastic containers whose surfaces are not flat or to cylindrical articles with curved surfaces.

[0007] The present invention therefore provides a method that measures the position of a load accurately and reliably with a simple device configuration and that is applicable to cardboard boxes, plastic containers, cylindrical loads, and the like.

[0008]

MEANS FOR SOLVING THE PROBLEMS: In the present invention, slit light is projected onto the load from an oblique direction and the height of the camera above the load is measured by triangulation. The load is then illuminated from an oblique direction, the captured image is composited with a mask image to remove disturbance elements in the load position-measurement area, the contour of the load's top surface is extracted, and the in-plane position and rotation angle of the load are obtained from the previously measured height and the dimensional information of the load.

[0009]

DESCRIPTION OF THE PREFERRED EMBODIMENT: FIG. 1 is a configuration diagram of the measurement system according to the present invention. The system comprises a robot 1, a robot control device 2, a palletizing computer 3, and an image processing device 4. The hand 5 of the robot 1 carries a laser marker 6, an illuminator 7, and a CCD camera 8, which are connected by a power cable and a signal cable 9 to a power supply 10 or to the image processing device 4. The laser light and the illumination strike the load 11 from oblique directions.

[0010] FIG. 2 shows the overall processing flow of the system according to the present invention. The palletizing computer sends measurement-position data to the robot control device, which moves the robot hand to the measuring position on the basis of the received data.

[0011] On receiving a movement-completion signal from the robot control device, the palletizing computer turns on the laser marker in order to measure the height of the camera above the load. Slit light is projected onto the load from an oblique direction, the CCD camera captures an image, and the image processing device processes the image data to compute the height position.

[0012] Next, to measure the position and orientation of the box, the palletizing computer turns on the illuminator and takes in an image captured while the load is lit from an oblique direction; it then turns the illuminator off and takes in a second image. From these image data the image processing device computes the in-plane position of the load and its orientation, that is, the rotation angle of the load in the horizontal plane.

[0013] From the height, in-plane position, and orientation data, the palletizing computer calculates the shift amounts, i.e. the relative displacement between the robot hand and the load, and sends the shift data to the robot control device; on the basis of the received shift amounts, the robot control device starts the picking motion of the robot hand.
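As an illustration of the sequence in paragraphs [0010] to [0013], the following is a minimal Python sketch of the measure-then-pick loop. All device handles (robot, laser, illuminator, camera) and the helper functions compute_height, compute_pose, and compute_shift are hypothetical placeholders introduced here for illustration; they are not part of the patent.

```python
# Minimal sketch of the measurement-and-picking sequence of paragraphs [0010]-[0013].
# Every device handle and helper function here is a hypothetical placeholder.

def measure_and_pick(robot, laser, illuminator, camera, measure_position,
                     compute_height, compute_pose, compute_shift):
    # The palletizing computer sends the measuring position; the robot moves its hand there.
    robot.move_to(measure_position)
    robot.wait_until_done()                    # stands in for the movement-completion signal

    # Height measurement: slit light on, capture one image, slit light off.
    laser.on()
    slit_image = camera.grab()
    laser.off()
    h = compute_height(slit_image)             # triangulation, see the sketch after [0017]

    # In-plane position and orientation: one image lit obliquely, one unlit.
    illuminator.on()
    lit = camera.grab()
    illuminator.off()
    unlit = camera.grab()
    x, y, theta = compute_pose(lit, unlit, h)  # contour-based measurement, later sketches

    # Shift amounts between hand center and load center, then the picking motion.
    shift = compute_shift(x, y, h, theta)
    robot.pick(shift)
```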

[0014] Next, the methods of calculating the height position, the in-plane position, and the orientation are described.

[0015] FIGS. 3 and 4 are a plan view and a front view showing the positional relationship between the load and the robot hand, illustrating the purpose of the measurement. To bring the center of the robot hand onto the center of the load, the shift amounts Δx, Δy, Δz, and Δθ shown in FIGS. 3 and 4 are measured, and the hand center is shifted by those amounts so that it coincides with the load center. For this purpose the present invention measures the height position, the in-plane position, and the orientation.

[0016] (1) Measurement in the height direction. FIG. 5 is an image-processing flowchart for detecting the height position h, and FIG. 6 schematically illustrates the processing of the slit image. Slit light from the laser marker mounted on the hand strikes the load from an oblique direction, the CCD camera images the slit light on the load, and the slit image is taken into the image processing device. The image processing device sets a threshold at which only the slit light is extracted and binarizes the image data to obtain a binary image. Next, the center-axis (centroid) coordinate P_xl of the slit image along the P_x axis is determined within the slit detection area. Because the slit light strikes the load obliquely, P_xl varies with the height of the load, and the height position h is obtained from the following equation by the principle of triangulation.
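A possible numpy reading of the binarize-and-centroid step above; the threshold value and the detection window are illustrative assumptions, since the patent only says that a threshold is set so that only the slit light is extracted.

```python
import numpy as np

def slit_centroid(slit_image, threshold=200, window=None):
    """Binarize a slit image and return the centroid coordinate P_xl along the Px axis.

    `threshold` and `window` (the slit detection area, given as a (row_slice, col_slice)
    pair) are illustrative assumptions, not values from the patent.
    """
    region = slit_image if window is None else slit_image[window]
    binary = region > threshold          # keep only the bright slit-light pixels
    ys, xs = np.nonzero(binary)
    if xs.size == 0:
        raise ValueError("no slit light found in the detection area")
    return xs.mean()                     # centroid (center axis) along the Px axis
```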

[0017] FIG. 7 illustrates how the height position h is obtained. The height position is given by h = b_h / (P_xl + a_h). The constants a_h and b_h are determined in advance from the slit-light coordinates P_xl1 and P_xl2 observed at two known heights h_1 and h_2. At the known heights, h_1 = b_h / (P_xl1 + a_h) and h_2 = b_h / (P_xl2 + a_h); solving these two equations for a_h and b_h gives a_h = (h_2·P_xl2 − h_1·P_xl1) / (h_1 − h_2) and b_h = h_1·h_2·(P_xl2 − P_xl1) / (h_1 − h_2).
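A short sketch of the two-point calibration and the height calculation, following the relations for h_1 and h_2 above; the numeric values in the usage example are illustrative only.

```python
def calibrate_height(h1, h2, p_xl1, p_xl2):
    """Solve h1 = b_h/(P_xl1 + a_h) and h2 = b_h/(P_xl2 + a_h) for a_h and b_h."""
    a_h = (h2 * p_xl2 - h1 * p_xl1) / (h1 - h2)
    b_h = h1 * h2 * (p_xl2 - p_xl1) / (h1 - h2)
    return a_h, b_h

def height_from_slit(p_xl, a_h, b_h):
    """Height of the camera above the load: h = b_h / (P_xl + a_h)."""
    return b_h / (p_xl + a_h)

# Usage (illustrative numbers): calibrate at two known heights, then measure.
a_h, b_h = calibrate_height(h1=500.0, h2=800.0, p_xl1=320.0, p_xl2=180.0)
h = height_from_slit(p_xl=250.0, a_h=a_h, b_h=b_h)
```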

[0018] (2) Calculation of the in-plane position and orientation. FIG. 8 is an image-processing flowchart for detecting the in-plane position and orientation. The load to be picked is illuminated from obliquely above by a halogen lamp or the like mounted on the robot hand so that, as seen by the CCD camera, shadows fall along two edges of the load's top surface; an image is captured in this state, and another image is captured with the illumination off. Next, to reduce the influence of ambient light, extract only the brightness contributed by the illumination, and emphasize the edges of the load, a difference image is formed by subtracting the illumination-off image from the illumination-on image. A threshold set in advance is then applied, and in the binarization the dark shadow regions are inverted to white and the bright top surface of the load to black, yielding a binary image.
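A minimal numpy sketch of the illumination-difference and inverted binarization described above; the threshold value is an assumption, since the patent only says it is set in advance.

```python
import numpy as np

def shadow_binary(lit, unlit, threshold=40):
    """Subtract the illumination-off image from the illumination-on image, then binarize
    so that dark (shadow) pixels become white (255) and the bright load surface black (0).

    `threshold` is an illustrative value, not taken from the patent.
    """
    diff = lit.astype(np.int16) - unlit.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)   # brightness contributed by the illuminator
    return np.where(diff < threshold, 255, 0).astype(np.uint8)
```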

[0019] FIG. 9 illustrates the extraction of the load edge points. The binary image (FIG. 9a) is composited with a white mask pattern image (FIG. 9b) and a black mask pattern image (FIG. 9c) to obtain the composite image (FIG. 9d), after which a hole-filling process (FIG. 9e) turns dark regions surrounded by white pixels into white. This image processing extracts the contour of the load's top surface.
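A possible sketch of the mask compositing and hole filling, assuming the white and black mask pattern images are available as boolean arrays; how those masks are generated is not specified here, and scipy's binary_fill_holes stands in for the hole-filling step.

```python
import numpy as np
from scipy import ndimage

def extract_top_contour(binary, white_mask, black_mask):
    """Composite the shadow binary image with white/black mask patterns and fill holes.

    `white_mask` and `black_mask` are boolean arrays standing in for the mask pattern
    images of FIGS. 9(b) and 9(c); treating them this way is an assumption.
    """
    composite = binary.copy()
    composite[white_mask] = 255     # region forced to white by the white mask pattern
    composite[black_mask] = 0       # region forced to black by the black mask pattern

    # Hole filling: dark regions completely surrounded by white pixels become white,
    # leaving the load's top surface as one black region whose boundary is the contour.
    filled = ndimage.binary_fill_holes(composite > 0)
    return filled.astype(np.uint8) * 255
```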

[0020] FIG. 10 illustrates the method of extracting the edge points of the load. The image of FIG. 9 is scanned several times in the horizontal and vertical directions to find the points where the pixels change from white to black, and the edge line segment in the length direction (L_1) and the edge line segment in the horizontal direction (L_2) are then calculated from these points by the method of least squares.

[0021] <Calculation of the edge line segments> From the equation of a straight line, (L_1): P_y = a_1·P_x + b_1 and (L_2): P_x = a_2·P_y + b_2, and (a_1, b_1) and (a_2, b_2) are obtained by least squares from the following equations.

[0022] a_1 = {nΣ(x_i·y_i) − Σx_i·Σy_i} / {nΣx_i² − (Σx_i)²}
b_1 = {(Σx_i²)·(Σy_i) − (Σx_i)·(Σx_i·y_i)} / {nΣx_i² − (Σx_i)²}
a_2 = {nΣ(x_i·y_i) − Σx_i·Σy_i} / {nΣy_i² − (Σy_i)²}
b_2 = {(Σy_i²)·(Σx_i) − (Σy_i)·(Σx_i·y_i)} / {nΣy_i² − (Σy_i)²}
where n is the number of scans.
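The closed-form expressions above translate directly into code; the sketch below fits both edge line segments, assuming the white-to-black transition points have already been collected as (x, y) pairs.

```python
import numpy as np

def fit_edge_lines(points_l1, points_l2):
    """Fit L1 as Py = a1*Px + b1 and L2 as Px = a2*Py + b2 by least squares,
    using the closed-form expressions of paragraph [0022].

    points_l1, points_l2: iterables of (x, y) edge points for each line.
    """
    def fit(u, v):
        # Least-squares fit of v = a*u + b over n points.
        n = u.size
        su, sv = u.sum(), v.sum()
        suu, suv = (u * u).sum(), (u * v).sum()
        denom = n * suu - su ** 2
        return (n * suv - su * sv) / denom, (suu * sv - su * suv) / denom

    x1, y1 = np.asarray(points_l1, dtype=float).T
    x2, y2 = np.asarray(points_l2, dtype=float).T
    a1, b1 = fit(x1, y1)      # L1: Py = a1*Px + b1
    a2, b2 = fit(y2, x2)      # L2: Px = a2*Py + b2 (roles of x and y swapped)
    return (a1, b1), (a2, b2)
```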

[0023] <Calculation of the rotation angle of the load> FIG. 11 illustrates the rotation angle of the load. The rotation angles θ_1 and θ_2 formed by the straight lines L_1 and L_2 are averaged, and the rotation angle θ is obtained from the following equation.

[0024] θ = (θ_1 + θ_2)/2 = {tan⁻¹(−a_1) + tan⁻¹(a_2)}/2. <Calculation of the center coordinates of the load> FIG. 12 illustrates how the center coordinates and the diagonal-point coordinates are obtained. The center coordinates of the load (P_x0, P_y0) are obtained from the intersection (P_x1, P_y1) of L_1 and L_2 and from the diagonal point (P_x2, P_y2).

[0025] The coordinates of the intersection of L_1 and L_2 are obtained from P_y1 = (a_1·b_2 + b_1)/(1 − a_1·a_2) and P_x1 = a_2·P_y1 + b_2, and the coordinates of the diagonal point from P_x2 = P_x1 + L′·cosθ − W′·sinθ and P_y2 = P_y1 + L′·sinθ + W′·cosθ. In these equations L′ and W′ (in pixels) are the apparent length and width of the load at the distance h; the length L and width W of the load (mm) are converted with the millimetre-to-pixel conversion factors F_px and F_py, determined in advance for the distance h, by the following equations.

[0026] L′ = L/F_px and W′ = W/F_py. Once the coordinates of the intersection and of the diagonal point are known, the center coordinates of the load (P_x0, P_y0) are calculated from the following equations.

[0027] P_x0 = (P_x1 + P_x2)/2 and P_y0 = (P_y1 + P_y2)/2. <Conversion to the robot coordinate system> FIG. 13 illustrates the conversion of the shift amounts from the pixel coordinate system to the robot coordinate system. The offsets εx and εy between the camera center and the load center in the x-axis and y-axis directions are converted with the previously determined correction coefficients c_x and c_y for the camera optical-axis offset and the conversion factors F_px and F_py, using the following equations.

[0028] εx = {(P_x0 − P_x) + c_x}·F_px (mm) and εy = {(P_y − P_y0) + c_y}·F_py (mm). The shift amounts are therefore calculated from the following equations.

[0029] Δx = S_x − εx and Δy = S_y − εy. In this description the relationship between the pixel coordinate system and the robot coordinate system is as shown in FIG. 13, but the combination of coordinate systems, that is, the orientation of the hand during measurement, is selected as appropriate from the combinations shown in FIGS. 14(a) to (c) according to the arrangement and orientation of the loads.
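Gathering the relations of paragraphs [0023] to [0029] into one place, the following sketch computes the rotation angle, the load center, and the shift amounts. The argument names and the sign conventions for the diagonal point follow the equations as reconstructed above and the coordinate arrangement of FIG. 13; for a different hand orientation (FIG. 14) the signs would change, so treat this as an illustration rather than the definitive conversion.

```python
import numpy as np

def pose_and_shift(a1, b1, a2, b2, L, W, Fpx, Fpy, px_cam, py_cam, cx, cy, Sx, Sy):
    """Rotation angle, load center, and shift amounts from the fitted edge lines.

    L, W: load length and width in mm; Fpx, Fpy: mm-per-pixel factors at distance h;
    (px_cam, py_cam): camera center in pixels; cx, cy: optical-axis correction terms;
    Sx, Sy: nominal offsets of the hand. All argument names are illustrative.
    """
    # Rotation angle averaged from the two edge lines.
    theta = (np.arctan(-a1) + np.arctan(a2)) / 2.0

    # Intersection of L1 (Py = a1*Px + b1) and L2 (Px = a2*Py + b2).
    py1 = (a1 * b2 + b1) / (1.0 - a1 * a2)
    px1 = a2 * py1 + b2

    # Apparent dimensions in pixels, then the diagonal point of the top face.
    Lp, Wp = L / Fpx, W / Fpy
    px2 = px1 + Lp * np.cos(theta) - Wp * np.sin(theta)
    py2 = py1 + Lp * np.sin(theta) + Wp * np.cos(theta)

    # Load center as the midpoint between the intersection and the diagonal point.
    px0, py0 = (px1 + px2) / 2.0, (py1 + py2) / 2.0

    # Pixel-to-millimetre offsets relative to the camera center, then the shift amounts.
    eps_x = ((px0 - px_cam) + cx) * Fpx
    eps_y = ((py_cam - py0) + cy) * Fpy
    return theta, (px0, py0), (Sx - eps_x, Sy - eps_y)
```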

[0030] In the present measuring method the edges of the load are detected from the outside, as shown in FIG. 15(a), so even plastic containers can be measured without being affected by their contents. Furthermore, as shown in FIGS. 15(b) and (c), if the mask patterns are composited and the edges are detected, and the center point of the load is then obtained from the equation of a circle, the method can also be applied to cylindrical loads.

[0031]

EFFECTS OF THE INVENTION: The effects of the present invention are as follows.

[0032] (1) Accurate and reliable measurement can be performed with a simple device configuration.

[0033] (2) Position measurement is possible for a wide range of loads, including cardboard boxes whose surfaces are not flat, plastic containers, and cylindrical loads.

[0034] (3) Measurement is possible that is not easily affected by the imagery surrounding the load.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a configuration diagram of the measurement system according to the present invention.

FIG. 2 is an overall processing flowchart of the present invention.

FIG. 3 is a plan view showing the positional relationship between the load and the hand.

FIG. 4 is a front view showing the positional relationship between the load and the hand.

FIG. 5 is an image-processing flowchart for detecting the height position.

FIG. 6 is an explanatory diagram for obtaining the height position (h).

FIG. 7 is an image-processing flowchart for detecting the in-plane position and orientation.

FIG. 8 is an image-processing flowchart for extracting the contour of the load.

FIG. 9 is an explanatory diagram of the contour extraction of the load.

FIG. 10 is an explanatory diagram of the edge-point extraction method for the load.

FIG. 11 is an explanatory diagram of the rotation angle of the load.

FIG. 12 is an explanatory diagram for obtaining the center coordinates and the diagonal-point coordinates.

FIG. 13 is an explanatory diagram for converting the shift amounts from the pixel coordinate system to the robot coordinate system.

FIG. 14 shows combinations of the pixel coordinate system and the robot coordinate system.

FIG. 15 is an explanatory diagram of the position measurement of a plastic container and a cylindrical load.

EXPLANATION OF REFERENCE NUMERALS

1 robot, 2 robot control device, 3 palletizing computer, 4 image processing device, 5 hand, 6 laser marker, 7 illuminator, 8 CCD camera, 9 power cable and signal cable, 10 power supply, 11 load

Claims (1)

[Claims] Claim 1: A method of measuring the position of a load, characterized in that slit light is projected onto the load from an oblique direction; the height of the camera above the load is measured by triangulation; the load is illuminated from an oblique direction and the captured image is composited with a mask image; disturbance elements in the load position-measurement area are removed; the contour of the load's top surface is extracted; and the in-plane position and rotation angle of the load are obtained from the previously determined height position and the dimensional information of the load.
JP26188695A 1995-10-09 1995-10-09 Measuring method for position of load Pending JPH09105608A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP26188695A JPH09105608A (en) 1995-10-09 1995-10-09 Measuring method for position of load

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP26188695A JPH09105608A (en) 1995-10-09 1995-10-09 Measuring method for position of load

Publications (1)

Publication Number Publication Date
JPH09105608A true JPH09105608A (en) 1997-04-22

Family

ID=17368137

Family Applications (1)

Application Number Title Priority Date Filing Date
JP26188695A Pending JPH09105608A (en) 1995-10-09 1995-10-09 Measuring method for position of load

Country Status (1)

Country Link
JP (1) JPH09105608A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010001976A (en) * 1999-06-10 2001-01-05 김형벽 Multifunctional robot hand
JP2012242281A (en) * 2011-05-20 2012-12-10 Omron Corp Method, device and program for calculating center position of detection object
JP2012242282A (en) * 2011-05-20 2012-12-10 Omron Corp Method, device and program for calculating position of head portion and direction of shaft portion of detection object
JP2013088169A (en) * 2011-10-14 2013-05-13 Hitachi-Ge Nuclear Energy Ltd Inspection apparatus and method
JP2013129034A (en) * 2011-12-22 2013-07-04 Yaskawa Electric Corp Robot system, and sorted article manufacturing method
JP2015114292A (en) * 2013-12-16 2015-06-22 川崎重工業株式会社 Workpiece position information identification apparatus and workpiece position information identification method
JP2015178141A (en) * 2014-03-19 2015-10-08 トヨタ自動車株式会社 transfer robot and transfer method
US10913150B2 (en) 2015-09-11 2021-02-09 Kabushiki Kaisha Yaskawa Denki Processing system and method of controlling robot
JP2018138318A (en) * 2017-02-24 2018-09-06 パナソニックIpマネジメント株式会社 Electronic equipment manufacturing device and electronic equipment manufacturing method
WO2020050405A1 (en) * 2018-09-07 2020-03-12 Ntn株式会社 Work device
JPWO2020050405A1 (en) * 2018-09-07 2021-09-02 Ntn株式会社 Working equipment

Similar Documents

Publication Publication Date Title
US20240177382A1 (en) Systems and Methods for Stitching Sequential Images of an Object
US6798925B1 (en) Method and apparatus for calibrating an image acquisition system
US8781159B2 (en) System and method for dimensioning objects using stereoscopic imaging
JPH09505886A (en) Grid array inspection system and method
JPH09105608A (en) Measuring method for position of load
JPS6332306A (en) Non-contact three-dimensional automatic dimension measuring method
JP2003121115A (en) Visual inspection apparatus and method therefor
JP4191295B2 (en) Semiconductor package inspection equipment
Chen et al. A smart machine vision system for PCB inspection
JP4073995B2 (en) Electronic component position detection method
JPH07113626A (en) Detecting method of image-sensing range in surface inspection
JPH08304025A (en) Position measuring method for rectangular load
CA2508595C (en) Camera calibrating apparatus and method
JP2000088555A (en) Inspection method for position deviation of character and figure and device thereof
CN106643483A (en) work piece detection method and device
JP2004325338A (en) Device and method for inspecting printed circuit board
JP2775924B2 (en) Image data creation device
CN113155097B (en) Dynamic tracking system with pose compensation function and pose compensation method thereof
Yu et al. New approach to vision-based BGA package inspection
JPH05231842A (en) Shape pattern inspection method and device
JPH0560559A (en) Height measuring apparatus
JPS6375503A (en) Positioning device for parts with lead
JP2001004334A (en) Inspecting device for semiconductor package
JP2836580B2 (en) Projection inspection device for semiconductor integrated circuit device
JP2626926B2 (en) Method and apparatus for detecting vehicle position in automobile manufacturing line

Legal Events

Date Code Title Description
A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20020607