
CN114078160A - Transportation monitoring method and system thereof - Google Patents


Info

Publication number: CN114078160A
Application number: CN202010825200.6A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: distance, processing device, image, pixel coordinate, robot
Inventors: 陈毅达, 谢文钦
Current Assignee: Winbond Electronics Corp
Original Assignee: Winbond Electronics Corp
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application filed by Winbond Electronics Corp; priority to CN202010825200.6A; publication of CN114078160A

Classifications

    • G06T7/73 — Image analysis: determining position or orientation of objects or cameras using feature-based methods
    • G06T1/0014 — General purpose image data processing: image feed-back for automatic industrial control, e.g. robot with camera
    • G06T7/90 — Image analysis: determination of colour characteristics
    • G08B21/187 — Status alarms: machine fault alarms
    • H01L21/67763 — Apparatus specially adapted for handling wafers during manufacture of semiconductor devices, for conveying between workstations, the wafers being stored in a carrier, involving loading and unloading
    • G06T2207/30232 — Indexing scheme for image analysis: surveillance


Abstract

The invention provides a transportation monitoring method comprising the following steps: capturing, by an image capture device, a monitoring image of a robot arm outside a carrying container from a fixed field of view, the robot arm being used to move articles into or out of the carrying container; obtaining, by a processing device, a sampling region from the monitoring image; and determining, by the processing device, the tilt state of the robot arm according to the sampling region, wherein the processing device issues a warning signal when it determines that the robot arm is tilted. A transportation monitoring system is also provided.

Description

Transportation monitoring method and system thereof
Technical Field
The invention relates to article-handling technology, and in particular to a transportation monitoring method and a system thereof.
Background
In a typical semiconductor manufacturing process, wafers are transferred between process modules and storage modules. When a wafer is loaded into processing equipment, a robot blade takes it out of the wafer carrier (FOUP) and transfers it into the process chamber for the processing reaction; after processing is finished, the wafer is returned to the wafer carrier. However, if the robot arm tilts or shifts, the wafer may be scratched while being picked up or put down, producing defective products. To avoid wafer damage, it is therefore important to monitor the state of the robot arm in real time during transportation.
Disclosure of Invention
In view of the above, the present invention provides a transportation monitoring method and a system thereof that can monitor in real time whether a robot arm is tilted, so as to prevent the robot arm from damaging articles during transportation.
An embodiment of the invention provides a transportation monitoring method comprising the following steps: capturing, by an image capture device, a monitoring image of a robot arm outside a carrying container from a fixed field of view, the robot arm being used to move articles into or out of the carrying container; obtaining, by a processing device, a sampling region from the monitoring image; and determining, by the processing device, the tilt state of the robot arm according to the sampling region, wherein the processing device issues a warning signal when it determines that the robot arm is tilted.
An embodiment of the invention provides a transportation monitoring system comprising an image capture device and a processing device. The image capture device captures a monitoring image of a robot arm outside a carrying container from a fixed field of view, the robot arm being used to move articles into or out of the carrying container. The processing device is electrically connected to the image capture device and the robot arm, and determines the tilt state of the robot arm according to a sampling region of the monitoring image, wherein the processing device issues a warning signal when it determines that the robot arm is tilted.
Based on the above, the transportation monitoring method and system of the embodiments of the invention obtain an image of the robot arm outside the carrying container as a monitoring image, where the field of view of the monitoring image is fixed. The monitoring image is processed and a sampling region is obtained from it. The processing device determines the tilt state of the robot arm according to the sampling region of the monitoring image and issues a corresponding warning signal to alert an operator that the robot arm has tilted.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
FIG. 1 is a schematic view of a transportation monitoring system according to an embodiment of the present invention;
FIG. 2 is a schematic view of a wafer transportation process according to an embodiment of the present invention;
FIG. 3 is a schematic illustration of a transportation monitoring method according to an embodiment of the invention;
FIG. 4 is a schematic illustration of a monitoring image according to an embodiment of the invention;
FIG. 5 is a schematic diagram of tilt types of a robot arm according to an embodiment of the present invention.
Detailed Description
FIG. 1 is a schematic view of a transportation monitoring system according to an embodiment of the invention, and FIG. 2 is a schematic view of a wafer transportation process according to an embodiment of the invention. Referring to FIG. 1 and FIG. 2 together, the transportation monitoring system 100 is used to monitor the tilt state of the robot arm 120, where the robot arm 120 moves articles into or out of a carrying container. The transportation monitoring system 100 includes a processing device 110 and an image capture device 130, with the processing device 110 electrically connected to the robot arm 120 and the image capture device 130.
In the present embodiment, the transportation monitoring system 100 is applied to the wafer transfer system of FIG. 2. In FIG. 2, the article WA is a wafer, and the carrying container 200 is a front opening unified pod (FOUP). The robot arm 120 can transfer wafers into and out of the wafer carrier. The wafer carrier (carrying container 200) has a plurality of receiving slots (for example, four receiving slots 210-240) arranged along the Z direction, each individually receiving a wafer.
The robot arm 120 can be positioned at different heights in the Z direction to align with each of the receiving slots 210-240 in the wafer carrier, so as to selectively move a wafer out of or into the corresponding receiving slot. The robot arm 120 extends in the X direction; in other words, a wafer lies flat on the X-Y plane of the robot arm 120.
The image capture device 130 includes, for example, a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) image sensor. The image capture device 130 is disposed beside the carrying container 200 to capture images of the robot arm 120 and generate at least one monitoring image. The processing device 110 includes, for example, a central processing unit (CPU), or another programmable general-purpose or special-purpose microprocessor, digital signal processor (DSP), programmable controller, application-specific integrated circuit (ASIC), or similar components or combinations thereof, but the invention is not limited thereto. The processing device 110 processes the monitoring image to determine whether the robot arm 120 is tilted and whether its position is correct.
Fig. 3 is a schematic diagram of a transportation monitoring method according to an embodiment of the invention. The transportation monitoring system 100 is adapted to implement the transportation monitoring method 300 of fig. 3. Embodiments of the transportation monitoring method 300 are described below with reference to the elements of the transportation monitoring system 100.
In step S310, the image capture device 130 captures a monitoring image of the robot arm 120 outside the carrying container 200 from the fixed field of view. In other words, both the angle and the range captured by the image capture device 130 are fixed. In one embodiment, the image capture device 130 captures the monitoring image before the robot arm 120 removes the article WA (e.g., a wafer) from the carrying container 200; that is, the monitoring image shows the robot arm 120 about to enter the carrying container 200 to take out the article WA, but not yet inside. The invention is not limited to this. Next, in step S320, the processing device 110 receives the monitoring image from the image capture device 130 and obtains the sampling region from it. The examples below further illustrate how the sampling region is obtained. In step S330, the processing device 110 determines the tilt state of the robot arm 120 according to the sampling region of the monitoring image. If the processing device 110 determines that the robot arm 120 is not tilted and its operating status is normal, the process returns to step S310 to continue monitoring the robot arm 120. When the processing device 110 determines that the robot arm 120 is tilted, the process proceeds to step S340, and the processing device 110 issues a warning signal. In the present embodiment, the processing device 110 may send the warning signal to a fault detection and classification (FDC) control host of the wafer transfer system to alert an operator that the robot arm 120 is abnormal. The robot arm 120 may also automatically stop subsequent operations in response to the warning signal to avoid accidents while removing the wafer.
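The capture-sample-judge-alarm cycle of steps S310-S340 can be sketched as a small control loop. This is a minimal illustration, not the patent's implementation; the function and parameter names are hypothetical placeholders for the devices described above.

```python
from dataclasses import dataclass

@dataclass
class MonitorResult:
    tilted: bool
    alarm_sent: bool

def monitor_cycle(capture_image, get_sampling_region, is_tilted, send_alarm):
    """One pass through steps S310-S340: capture, sample, judge, alarm."""
    image = capture_image()              # S310: fixed-view image of the arm
    region = get_sampling_region(image)  # S320: crop the fixed pixel range
    tilted = is_tilted(region)           # S330: judge tilt from the region
    if tilted:
        send_alarm()                     # S340: alert the FDC host / operator
    return MonitorResult(tilted=tilted, alarm_sent=tilted)
```

In a real system the loop would repeat on every approach of the arm to the container; here each call models one monitoring pass.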
FIG. 4 is a schematic diagram of a monitoring image according to an embodiment of the invention. Referring to FIG. 4, after receiving the monitoring image 400, the processing device 110 performs an image processing operation on the monitoring image 400 to obtain a sampling region 410 from it, along with a plurality of gray-scale values of the sampling region 410. For example, the processing device 110 performs a gray-scale conversion operation to convert the monitoring image 400 into a gray-scale image. In the present embodiment, the sampling region 410 is a fixed pixel range of the monitoring image 400, so the processing device 110 can obtain the sampling region 410 from the converted monitoring image 400 according to predetermined pixel coordinates and obtain all the gray-scale values of the sampling region 410. The processing device 110 then performs image recognition according to the gray-scale values to find the position of the robot arm 120 in the sampling region 410. Since the robot arm 120 usually has a metallic appearance, it shows higher gray-scale values, and the processing device 110 can identify the robot arm 120 by this gray-scale difference.
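The gray-scale conversion, fixed-range cropping, and bright-pixel segmentation just described can be sketched as follows. The coefficients, threshold, and function names are illustrative assumptions (the standard BT.601 luma weights and an arbitrary brightness cutoff), not values from the patent.

```python
def to_gray(rgb):
    # ITU-R BT.601 luma weights for an RGB -> gray conversion (an assumption;
    # the patent only says a gray-scale conversion operation is performed).
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def crop_region(gray, top, left, height, width):
    # The sampling region is a fixed pixel range of the monitoring image,
    # so a plain slice at predetermined coordinates suffices.
    return [row[left:left + width] for row in gray[top:top + height]]

def arm_pixels(region, threshold=180):
    # The metallic arm shows up as high gray values; keep pixels above a
    # (hypothetical) brightness threshold as candidate arm pixels.
    return [(y, x) for y, row in enumerate(region)
                   for x, v in enumerate(row) if v >= threshold]
```

A production system would use an image library for this, but the three steps map one-to-one onto the operations in the paragraph above.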
The processing device 110 obtains at least two pixel coordinates of the robot 120 according to the gray-scale values, and determines the tilt state of the robot 120 according to the at least two pixel coordinates. The invention does not limit the number of pixel coordinates obtained by the processing device 110. In fig. 4, the processing device 110 acquires four pixel coordinates P1 to P4 of the robot arm 120. In other embodiments, the processing device 110 may only obtain two pixel coordinates or more pixel coordinates of the robot 120.
The processing device 110 calculates the distances D1-D4 from the pixel coordinates P1-P4 to the boundary of the sampling region 410, and determines whether the robot 120 is tilted and the tilt type according to at least two of the distances D1-D4.
FIG. 5 is a schematic diagram of tilt types of a robot arm according to an embodiment of the present invention. Please refer to FIG. 5 together with FIG. 4. The coordinates P1 and P2 in FIG. 4 are two pixel coordinates located at different positions on the first surface S1 of the robot arm 120, hereinafter referred to as the first pixel coordinate P1 and the second pixel coordinate P2, respectively. More specifically, the first pixel coordinate P1 and the second pixel coordinate P2 represent positions at different lengths of the robot arm 120, i.e., different positions in the X direction. In another embodiment, they represent positions at different lengths of the robot arm 120 with the same position in the Y direction. The first pixel coordinate P1 is located at the front end of the robot arm 120, and the second pixel coordinate P2 at the rear end. The distance from the first pixel coordinate P1 to the boundary B1 of the sampling region 410 is referred to as the first distance D1, and the distance from the second pixel coordinate P2 to the boundary B1 is referred to as the second distance D2. The processing device 110 determines whether the robot arm 120 is tilted forward or backward according to the difference between the first distance D1 and the second distance D2. In one embodiment, when the difference between the first distance D1 and the second distance D2 is greater than a threshold, the processing device 110 determines that the robot arm 120 is tilted forward or backward and issues a warning signal.
For example, if the first distance D1 is greater than the second distance D2, the processing device 110 determines that the robot arm 120 is tilted forward-backward, with the front end tilted downward as shown in tilt type 530 of FIG. 5. Conversely, if the first distance D1 is less than the second distance D2, the processing device 110 also determines a forward-backward tilt, but with the front end tilted upward as shown in tilt type 510 of FIG. 5.
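The front-back classification from D1 and D2 reduces to a three-way comparison. This sketch assumes (as in the figures) that B1 is the upper boundary, so a front end farther from B1 than the rear end has drooped; the function name and return labels are illustrative.

```python
def front_back_tilt(d1, d2, threshold):
    """Classify front-back tilt from D1 (front point P1 to boundary B1)
    and D2 (rear point P2 to boundary B1) on the arm's upper surface."""
    if abs(d1 - d2) <= threshold:
        return "level"
    # Front end farther from the upper boundary than the rear -> front droops
    # (tilt type 530); otherwise the front end points up (tilt type 510).
    return "tilted_down" if d1 > d2 else "tilted_up"
```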
The coordinates P3 and P4 in fig. 4 are two pixel coordinates located at different positions on the second surface S2 of the robot 120, and are hereinafter referred to as a third pixel coordinate P3 and a fourth pixel coordinate P4, respectively, where the second surface S2 is opposite to the first surface S1. Here, the first surface S1 is an upper surface of the robot arm 120, and the second surface S2 is a lower surface of the robot arm 120. More specifically, the third pixel coordinate P3 and the fourth pixel coordinate P4 respectively represent positions of different lengths of the robot arm 120, i.e., different positions in the X direction. In another embodiment, the third pixel coordinate P3 and the fourth pixel coordinate P4 respectively represent positions of the robot 120 with different lengths, i.e., different positions in the X direction and the same position in the Y direction, the third pixel coordinate P3 is located at the front end of the robot 120, and the fourth pixel coordinate P4 is located at the rear end of the robot 120. The distance from the third pixel coordinate P3 to the boundary B2 of the sampling region 410 is referred to as a third distance D3, and the distance from the fourth pixel coordinate P4 to the boundary B2 of the sampling region 410 is referred to as a fourth distance D4. Boundary B2 is opposite boundary B1. The processing device 110 may also determine whether the robot arm 120 tilts back and forth according to the difference between the third distance D3 and the fourth distance D4.
For example, if the third distance D3 is greater than the fourth distance D4, the processing device 110 determines that the robot arm 120 is tilted forward-backward, with the front end tilted upward as shown in tilt type 560 of FIG. 5. Conversely, if the third distance D3 is less than the fourth distance D4, the processing device 110 also determines a forward-backward tilt, but with the front end tilted downward as shown in tilt type 580 of FIG. 5.
By combining the difference between the first distance D1 and the second distance D2 with the difference between the third distance D3 and the fourth distance D4, the processing device 110 can double-check whether the robot arm 120 is tilted and, if so, whether it is tilted upward or downward.
In the present embodiment, the fifth distance D5 is the total length of the sampling region 410 from the boundary B1 to the boundary B2. The processing device 110 may further determine whether the robot arm 120 is tilted left or right according to the difference D5 - (D1 + D3) between the fifth distance D5 and the sum of the first distance D1 and the third distance D3. In the present embodiment, the first pixel coordinate P1 and the third pixel coordinate P3 are located at the front end of the robot arm 120, and the second pixel coordinate P2 and the fourth pixel coordinate P4 at the rear end. More specifically, the first pixel coordinate P1 and the third pixel coordinate P3 may be located at the same length of the robot arm 120 (the same position in the X direction, different positions in the Y direction), and likewise the second pixel coordinate P2 and the fourth pixel coordinate P4; that is, P1 and P3 are on opposite sides of the robot arm 120, as are P2 and P4. However, in other embodiments, P1 and P3 may be located at different lengths of the robot arm 120, and so may P2 and P4.
When the difference D5 - (D1 + D3) is greater than a threshold (e.g., the thickness of the robot arm 120), the processing device 110 determines that the robot arm 120 is tilted left or right, as shown in tilt type 520 or 550 of FIG. 5. Similarly, subtracting the second distance D2 and the fourth distance D4 from the fifth distance D5 gives another difference D5 - (D2 + D4). When the difference D5 - (D2 + D4) is greater than the threshold, the processing device 110 likewise determines that the robot arm 120 is tilted left or right, as shown in tilt type 540 or 570 of FIG. 5.
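The left-right test works because D5 minus the two surface-to-boundary distances is the arm's apparent thickness in the image; a sideways tilt makes the arm project thicker than it really is. A minimal sketch, with hypothetical names and the arm thickness used as the threshold as the text suggests:

```python
def left_right_tilted(d5, d_upper, d_lower, arm_thickness):
    """D5 spans the sampling region from boundary B1 to B2; d_upper and
    d_lower are the distances from the arm's upper and lower surfaces to
    B1 and B2 at one length position (e.g. D1 and D3 at the front end).
    The residual D5 - (d_upper + d_lower) is the arm's apparent thickness;
    exceeding the real thickness indicates a left-right tilt."""
    return (d5 - (d_upper + d_lower)) > arm_thickness
```

The same function applied to (D2, D4) at the rear end gives the second check described above.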
In this embodiment, since the image capture device 130 captures images from a fixed field of view and a fixed-size portion of the monitoring image 400 is taken as the sampling region 410, the processing device 110 can obtain the spatial coordinate corresponding to each pixel coordinate through a lookup table, and thereby determine the position of the robot arm 120 in space.
The transportation monitoring system 100 may capture a pre-corrected image from the same fixed field of view in advance. That is, the pre-corrected image has the same field of view as the monitoring image, and the captured background range is also the same. The processing device 110 may build a lookup table from the pre-corrected image and the known spatial coordinates of the captured scene; the lookup table records the spatial coordinates to which the pixel coordinates of images captured by the image capture device 130 correspond.
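Because the field of view is fixed, the lookup table can be as simple as a mapping from calibrated pixel coordinates to measured spatial coordinates. This is a minimal sketch under that assumption; the data shapes and names are hypothetical, and a real table would cover every pixel of the sampling region or interpolate between calibration points.

```python
def build_lookup(calibration_samples):
    """calibration_samples: iterable of ((px, py), (x, y, z)) pairs taken
    from the pre-corrected image and the known spatial coordinates."""
    return {pixel: space for pixel, space in calibration_samples}

def pixel_to_space(lookup, pixel):
    # Fixed field of view -> each pixel coordinate maps to one spatial point;
    # None means the pixel was never calibrated.
    return lookup.get(pixel)
```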
In the present embodiment, a correction mark may be placed in the captured background, positioned within the fixed field of view of the image capture device 130 so that the monitoring image includes the correction mark. The processing device 110 can determine whether the fixed field of view has shifted according to the position of the correction mark in the monitoring image. In the embodiment of FIG. 4, two correction marks 402 and 404 are used; the invention does not limit the number of correction marks. The correction marks 402 and 404 are placed so that they appear at diagonal corners of the monitoring image 400. If either correction mark 402 or 404 does not appear at its default position in the monitoring image 400, the processing device 110 or the operator can determine that the capturing field of view of the image capture device 130 has shifted, and calibrate the image capture device 130 accordingly.
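The shift check reduces to comparing where the correction marks are found against where they should be. A small sketch, with an assumed pixel tolerance (the patent does not specify one):

```python
def view_shifted(found_marks, expected_marks, tolerance_px=2):
    """found_marks / expected_marks: pixel coordinates of the correction
    marks (e.g. the two diagonal marks 402 and 404). Returns True when any
    mark has drifted beyond tolerance_px, i.e. the camera's fixed field of
    view has moved and the image capture device needs recalibration."""
    return any(abs(fx - ex) > tolerance_px or abs(fy - ey) > tolerance_px
               for (fx, fy), (ex, ey) in zip(found_marks, expected_marks))
```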
Since the lookup table already records the pixel coordinates and their corresponding spatial coordinates, the processing device 110 can determine in advance which part of the monitoring image 400 to use as the sampling region 410. For example, when the robot arm 120 moves to the entrance of the carrying container 200 and the image capture device 130 captures an image of it, the processing device 110 may take only the image of the door of the carrying container 200 as the sampling range, in which the robot arm 120 appears. In this way, the computational burden of recognizing the robot arm 120 is reduced, and the pixel distance from the robot arm 120 to the boundary of the sampling range can easily be converted into a spatial distance.
After the processing device 110 obtains the pixel coordinates P1-P4 of the robot arm 120, it may further determine the spatial position of the robot arm 120 through the lookup table according to at least one of the pixel coordinates P1-P4, and decide whether to adjust the spatial position of the robot arm 120 relative to the carrying container 200 according to the result. In one embodiment, the field of view captured by the image capture device 130 is fixed, chosen at the entrance of the carrying container 200, and covers the heights of the receiving slots 210-240. The processing device 110 obtains the height (position in the Z direction) of the robot arm 120 from the monitoring image 400 and thereby determines whether the robot arm 120 can successfully take out the article WA. For example, if the robot arm 120 is to take out the article WA located in the receiving slot 220, but the processing device 110 determines that the robot arm 120 is currently in front of the opening of the receiving slot 240, or that the robot arm 120 would be damaged if it extended into the carrying container 200 at its current height, the processing device 110 adjusts the height of the robot arm 120 until it determines that the robot arm 120 can take out the article WA smoothly.
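The height check against the receiving slots can be sketched as a nearest-slot lookup plus a correction offset. Slot heights, tolerance, and function names are all illustrative assumptions, not values from the patent.

```python
def aligned_slot(arm_height_mm, slot_heights_mm, tolerance_mm=1.0):
    """Return the index of the receiving slot whose opening matches the
    arm's measured height, or None when the arm faces no slot (misaligned)."""
    for i, h in enumerate(slot_heights_mm):
        if abs(arm_height_mm - h) <= tolerance_mm:
            return i
    return None

def correction_mm(arm_height_mm, target_slot, slot_heights_mm):
    # Height change needed so the arm faces the target slot's opening.
    return slot_heights_mm[target_slot] - arm_height_mm
```

For instance, an arm found in front of slot 240 while targeting slot 220 would get a positive or negative correction until `aligned_slot` returns the target index.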
In particular, in the present embodiment, the carrying containers 200 and the processing devices 110 may be deployed in a one-to-one configuration, although the invention is not limited thereto. With a one-to-one configuration, each processing device 110 can process the image of the robot arm 120 outside its carrying container 200 in time, and can determine whether the robot arm 120 is fit for the transportation operation before the operation is performed, achieving a good monitoring effect.
In summary, the transportation monitoring method and system of the invention can monitor the state of the robot arm in real time. By acquiring a monitoring image of the robot arm, the system determines whether the robot arm is tilted, and can further determine the type of tilt. When a tilt is detected, the processing device issues a warning signal to stop the robot arm, avoiding damage to the articles. In addition, from the monitoring image the processing device can also determine whether the robot arm is correctly positioned, so as to avoid picking the wrong article or scratching goods during the move.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention.

Claims (14)

1. A transportation monitoring method, comprising:
acquiring a monitoring image of a mechanical arm outside a carrying container from a fixed view through an image acquisition device, wherein the mechanical arm is used for moving articles into or out of the carrying container;
obtaining a sampling region from the monitoring image by a processing device; and
determining, by the processing device, a tilt state of the robotic arm based on the sampling area,
when the processing device judges that the mechanical arm inclines, the processing device sends out a warning signal.
2. The transportation monitoring method of claim 1, wherein the step of determining, by the processing device, the tilt state of the robot arm from the sampling area of the monitoring image comprises:
performing an image processing operation to obtain a plurality of gray scale values of the sampling region;
obtaining at least two pixel coordinates of the mechanical arm according to the plurality of gray-scale values; and
determining the tilt state of the robot arm according to the at least two pixel coordinates.
3. The transportation monitoring method of claim 2, wherein the step of determining, by the processing device, the tilt state of the robot arm from the sampling area of the monitoring image further comprises:
calculating at least two distances from the at least two pixel coordinates to the boundary of the sampling region; and
comparing the at least two distances to determine the type of tilt of the robot arm.
4. The transportation monitoring method of claim 3, wherein the at least two distances comprise a first distance and a second distance, wherein the first distance and the second distance are distances from a first pixel coordinate and a second pixel coordinate on a first surface of the robot arm to a boundary of the sampling area, respectively, and whether the robot arm tilts back and forth is determined according to a difference between the first distance and the second distance, wherein the first pixel coordinate and the second pixel coordinate are located at different length positions of the robot arm, respectively.
5. The transportation monitoring method of claim 4, wherein the at least two distances further comprise a third distance, wherein the third distance is a distance from a third pixel coordinate on a second surface of the robot arm to a boundary of the sampling area, wherein the second surface is opposite to the first surface, and whether the robot arm tilts left and right is determined according to a difference between a total length of the sampling area and the first distance and the third distance, wherein the first pixel coordinate and the third pixel coordinate are respectively located at the same length position of the robot arm.
6. A transportation monitoring system comprising:
the image acquisition device is used for acquiring a monitoring image of the mechanical arm outside the carrying container from a fixed view, wherein the mechanical arm is used for moving the articles into or out of the carrying container; and
and the processing device is electrically connected with the image acquisition device and the mechanical arm and is used for judging the inclination state of the mechanical arm according to the sampling area of the monitoring image, wherein when the processing device judges that the mechanical arm is inclined, the processing device sends out a warning signal.
7. The transportation monitoring system of claim 6, wherein the sampling area is a fixed pixel range of the monitored image.
8. The transportation monitoring system of claim 6, wherein the processing device performs an image processing operation to obtain a plurality of gray scale values for the sampling area and to obtain at least two pixel coordinates of the robotic arm based on the plurality of gray scale values, and the processing device determines the tilt status of the robotic arm based on the at least two pixel coordinates.
9. The transportation monitoring system of claim 8, wherein the processing device further determines a spatial position of the robotic arm according to at least one of the at least two pixel coordinates, and determines, according to the spatial position, whether to adjust the robotic arm relative to the carrier container.
10. The transportation monitoring system of claim 8, wherein the processing device calculates at least two distances of the at least two pixel coordinates to a boundary of the sampling area and compares the at least two distances to determine a type of tilt of the robotic arm.
11. The transportation monitoring system of claim 10, wherein the at least two distances comprise a first distance and a second distance, wherein the first distance and the second distance are distances from a first pixel coordinate and a second pixel coordinate on a first surface of the robotic arm to a boundary of the sampling area, respectively, and whether the robotic arm tilts back and forth is determined according to a difference between the first distance and the second distance, wherein the first pixel coordinate and the second pixel coordinate are located at different length positions of the robotic arm, respectively.
12. The transportation monitoring system of claim 11, wherein the at least two distances further comprise a third distance, wherein the third distance is a distance from a third pixel coordinate on a second surface of the robot arm to a boundary of the sampling area, wherein the second surface is opposite to the first surface, and whether the robot arm tilts left and right is determined according to a difference between a total length of the sampling area and the first distance and the third distance, wherein the first pixel coordinate and the third pixel coordinate are respectively located at the same length position of the robot arm.
13. The transportation monitoring system of claim 6, further comprising:
a correction mark disposed at a correction position,
wherein the monitoring image further includes the correction mark, and the processing device determines whether the fixed field of view has shifted according to a position of the correction mark in the monitoring image.
14. The transportation monitoring system of claim 6, wherein the image acquisition device acquires the monitoring image before the robotic arm transports the item from the carrier container.
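Claims 8 through 12 describe the tilt check concretely enough to sketch: threshold the grayscale values of the sampling area to locate the arm's pixel coordinates, measure distances from the arm's two surfaces to the boundary of the sampling area, then compare those distances to classify the tilt. The Python sketch below is one illustrative reading of those claims, not the patented implementation; the `threshold`, `tol`, and `expected_width` parameters, and the assumption of a dark arm on a bright background, are mine.

```python
import numpy as np

def classify_tilt(sample, expected_width, threshold=128, tol=3):
    """Classify robot-arm tilt from a grayscale sampling area.

    sample         : 2-D uint8 array, the fixed pixel range cut from the
                     monitoring image (one interpretation of claim 7).
    expected_width : arm width in pixels when level (assumed parameter).
    threshold      : grayscale cutoff separating the dark arm from the
                     bright background (assumed value).
    tol            : pixel tolerance below which distance differences
                     count as "level" (assumed value).
    Returns a list of detected tilt types, or ["level"].
    """
    arm = sample < threshold                   # boolean mask of arm pixels
    rows = np.flatnonzero(arm.any(axis=1))     # rows where the arm appears
    if rows.size < 2:
        raise ValueError("arm not visible at two length positions")

    # Two different "length positions" along the arm (claim 11).
    row_a, row_b = rows[0], rows[-1]
    cols_a = np.flatnonzero(arm[row_a])
    cols_b = np.flatnonzero(arm[row_b])
    total = sample.shape[1]                    # total length of the sampling area

    d1 = cols_a[0]                             # first surface -> boundary, position A
    d2 = cols_b[0]                             # first surface -> boundary, position B
    d3 = total - 1 - cols_a[-1]                # second surface -> boundary, position A

    tilts = []
    # Claim 11: front-back tilt from the difference of the first two distances.
    if abs(d1 - d2) > tol:
        tilts.append("front-back")
    # Claim 12: left-right tilt from the total length minus the first and
    # third distances (the arm's apparent width) versus the expected width.
    if abs((total - d1 - d3) - expected_width) > tol:
        tilts.append("left-right")
    return tilts or ["level"]
```

A level arm yields equal first and second distances and an apparent width matching the expectation; a front-back tilt shifts the arm's edge between the two length positions, while a left-right tilt changes the apparent width.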
CN202010825200.6A 2020-08-17 2020-08-17 Transportation monitoring method and system thereof Pending CN114078160A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010825200.6A CN114078160A (en) 2020-08-17 2020-08-17 Transportation monitoring method and system thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010825200.6A CN114078160A (en) 2020-08-17 2020-08-17 Transportation monitoring method and system thereof

Publications (1)

Publication Number Publication Date
CN114078160A true CN114078160A (en) 2022-02-22

Family

ID=80280810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010825200.6A Pending CN114078160A (en) 2020-08-17 2020-08-17 Transportation monitoring method and system thereof

Country Status (1)

Country Link
CN (1) CN114078160A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6542839B1 (en) * 2000-09-29 2003-04-01 Taiwan Semiconductor Manufacturing Co., Ltd Apparatus and method for calibrating the position of a cassette indexer
KR20070047606A (en) * 2005-11-02 2007-05-07 삼성전자주식회사 Method of sensing a blade horizontal level and wafer transferring equipment using the same
KR20120133031A (en) * 2011-05-30 2012-12-10 세메스 주식회사 Teaching method of apparatus for manufacturing semiconductor
CN104752295A (en) * 2013-12-30 2015-07-01 北京北方微电子基地设备工艺研究中心有限责任公司 Position monitoring device, plasma processing device and method for loading and unloading workpiece
KR20160124965A (en) * 2015-04-20 2016-10-31 에스케이하이닉스 주식회사 Wafer Transferring Apparatus


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
黄自柯 (Huang Zike): "Precision Design of a Handling Robot Arm Based on a Wet-Type Cleaning Machine" (基于湿式清洗机的搬运机械臂精度设计), 机电一体化 (Mechatronics), no. 01, 15 January 2018 (2018-01-15), pages 47 - 52 *

Similar Documents

Publication Publication Date Title
US10593575B2 (en) System and method for monitoring wafer handling and a wafer handling machine
US20010055069A1 (en) One camera system for component to substrate registration
KR102313347B1 (en) Image inspection apparatus and image inspection method
JP2018056256A (en) Diagnostic system for substrate transfer hand
JP2941617B2 (en) Electronic component component data recording device and electronic component transport and assembling device using the same
JP4733001B2 (en) Component mounting apparatus, component mounting method, and program
CN111415889A (en) Wafer state detection method and semiconductor device
US20070260341A1 (en) Correcting apparatus for wafer transport equipment and correcting method for wafer transport equipment
CN114400190B (en) Device for omnibearing detecting wafers in wafer cassette
TWI582852B (en) Method and apparatus for monitoring edge bevel removal area in semiconductor apparatus and electroplating system
US20200161161A1 (en) Apparatus and methods for handling semiconductor part carriers
CN114078160A (en) Transportation monitoring method and system thereof
KR20110062522A (en) Substrate transfer apparatus and method for transferring substrates therefo
US12046492B2 (en) Robot blade tilt determination by evaluating grayscale in images
CN117524964B (en) Method and system for detecting and correcting wafer center offset in conveying process
WO2020100522A1 (en) Mark detection system, signal processing circuit, computer program, and method
KR20090114565A (en) Wafer location error indicating system and method using vision sensor
CN101092034A (en) Adjusting device for facility of handling wafers, and adjusting method for facility of handling wafers
KR20220170366A (en) Inspection method and etching system
JP4566686B2 (en) Method and apparatus for determining shape of object
JP3604587B2 (en) Image processing device
KR102289382B1 (en) Position calibrating method for semiconductor factory
CN110890288B (en) Semiconductor manufacturing system, edge detection device, and method for detecting removal region
US20240265662A1 (en) Recognition device and recognition method
CN111742625B (en) Component transfer apparatus, component transfer method, and component mounting apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination