CN107562449B - Image processing apparatus, image processing method, and image processing program - Google Patents
- Publication number: CN107562449B (application CN201710094971.0A)
- Authority: CN (China)
- Prior art keywords: processing, item, image, items, image processing
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Classifications: Image Processing; User Interface of Digital Computer; Image Analysis
Abstract
The invention provides an image processing apparatus capable of executing each set of processes so that processes that cannot overlap are not executed simultaneously. The image processing apparatus includes a user interface that accepts selection of one or more processing items to be included in a process setting, chosen from a plurality of processing items prepared in advance, and designation of an execution order of the selected processing items. The image processing apparatus can specify a parallel item and a multi-level item (352, 354), each associated with a plurality of processing item sets that each consist of a combination of one or more processing items. When the timing of the parallel item in the execution order arrives, the image processing apparatus starts execution of all of the processing item sets associated with the parallel item simultaneously; when the timing of the multi-level item (352, 354) arrives, it starts execution of the processing item sets (360, 370, 380) associated with the multi-level item one at a time, each time a predetermined signal satisfies a predetermined condition.
Description
Technical Field
The present disclosure relates to an image processing apparatus, an image processing method, and an image processing program capable of specifying an execution order of image processing.
Background
In the field of FA (Factory Automation), techniques for automatically inspecting an inspection target such as a workpiece are becoming widespread. Regarding such techniques, JP 2015-232479 A (patent document 1) discloses an inspection apparatus that inspects a workpiece by irradiating it with an illumination light source.
The inspection processing of a workpiece is realized by a combination of various kinds of image processing. Application tools that provide a user interface with which users can set such combinations of image processing themselves are being developed. Regarding such application tools, JP 2014-203309 A (patent document 2) discloses an image processing apparatus that "can realize more efficient and higher-speed image processing by using the user's own specific knowledge of image processing". According to patent document 2, a user selects desired processing items from a plurality of processing items defining different image processing, arranges them on a user interface, and the items are executed in an execution order corresponding to the arrangement order. The user can change the combination of processing items to realize an arbitrary inspection process.
Patent document 1: JP 2015-232479 A
Patent document 2: JP 2014-203309 A
Disclosure of Invention
In the FA field, a series of processes, namely an image capturing process of a workpiece and predetermined image processing performed on the obtained image, is repeated to inspect workpieces continuously. According to patent document 2, a user can combine processing items so that each such series of processes is processed in parallel. When the series of processes are run in parallel, it is desirable that a single imaging unit not perform a plurality of imaging processes at once. That is, it is desirable to execute each set of processes so that processes that cannot overlap are not executed at the same time.
According to one embodiment of the present invention, an image processing apparatus includes: an execution control unit that executes image processing in accordance with the processing setting; and a setting unit that provides a user interface for accepting selection of one or more processing items included in the processing setting from among a plurality of processing items prepared in advance and designation of an execution order of the selected processing items. The setting unit can specify a first order specifying item and a second order specifying item associated with a plurality of processing item sets each including a combination of one or more processing items. The execution control unit starts execution of the respective processes of the plurality of sets of process items associated with the first order specifying item at the same time when the timing of the execution order of the first order specifying item comes, and starts execution of the respective processes of the plurality of sets of process items associated with the second order specifying item in sequence each time a predetermined signal satisfies a predetermined condition when the timing of the execution order of the second order specifying item comes.
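The behavior of the two order-specifying items can be illustrated with a minimal Python sketch. This is not the patent's implementation; the function names (`run_parallel_item`, `run_multilevel_item`) and the use of `threading.Event` for the "predetermined signal" are assumptions made for illustration.

```python
import threading

def run_item_set(item_set):
    # A processing item set executes its items serially, in order.
    for item in item_set:
        item()

def run_parallel_item(item_sets):
    # "First order specifying item": start all associated sets simultaneously.
    threads = [threading.Thread(target=run_item_set, args=(s,)) for s in item_sets]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

def run_multilevel_item(item_sets, condition_met):
    # "Second order specifying item": start each set in sequence, each time
    # the predetermined signal satisfies the predetermined condition
    # (here modeled as an Event, e.g. "the camera is free again").
    threads = []
    for s in item_sets:
        condition_met.wait()   # block until the condition is satisfied
        condition_met.clear()  # consume the condition for the next set
        t = threading.Thread(target=run_item_set, args=(s,))
        t.start()
        threads.append(t)
    for t in threads:
        t.join()
```

With `run_parallel_item` every set begins at once; with `run_multilevel_item` the starts are staggered, but once started each set still runs concurrently with the earlier ones.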
Preferably, the user interface displays the second order specifying item and the processing item set hierarchically such that the second order specifying item is positioned higher than a plurality of processing item sets associated with the second order specifying item.
Preferably, the second order specifying item includes a start processing item and an end processing item, which together determine the range of processing item sets that can be associated with the second order specifying item. When the timing of the execution order of the second order specifying item comes, the execution control unit sequentially starts execution of the processing item sets set between the start processing item and the end processing item, each time a predetermined signal satisfies a predetermined condition.
Preferably, the plurality of processing items include processing for searching for a region indicating the inspection target in an image obtained by imaging the inspection target.
Preferably, the plurality of processing items include processing for acquiring an image from an imaging unit connectable to the image processing apparatus.
Preferably, the predetermined signal indicates whether the imaging unit is capturing an image, and the predetermined condition is satisfied when the signal transitions from a state in which the imaging unit is capturing to a state in which it is not.
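The condition just described is a falling edge of the camera-busy signal. A minimal sketch of such edge detection follows; the class name and the boolean encoding of the signal are illustrative assumptions, not from the patent.

```python
class CaptureEdgeDetector:
    """Detects the transition from 'capturing' to 'not capturing'."""

    def __init__(self):
        self.prev_busy = False  # assume the imaging unit starts idle

    def update(self, busy):
        # Returns True exactly when the signal falls from busy (True)
        # to idle (False), i.e. the imaging unit just finished a capture.
        falling_edge = self.prev_busy and not busy
        self.prev_busy = busy
        return falling_edge
```

Each time `update` reports `True`, the execution control unit could start the next processing item set of the multi-level item.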
According to another embodiment of the present invention, a method for performing image processing according to processing settings includes a step of providing a user interface for accepting selection of one or more processing items included in the processing settings from among a plurality of processing items prepared in advance and specification of an execution order of the selected processing items, the user interface being capable of specifying a first order specifying item and a second order specifying item associated with a plurality of processing item sets each including a combination of one or more processing items.
The image processing method further includes: a step of starting, when a timing of an execution order of the first order specifying item comes, a process of each of a plurality of process item sets associated with the first order specifying item at the same time; and a step of sequentially starting execution of the respective processes of the plurality of process item sets associated with the second order specifying item each time a predetermined signal satisfies a predetermined condition when timing of the execution order of the second order specifying item comes.
According to another embodiment of the present invention, an image processing program for executing image processing in accordance with processing settings causes a computer to execute a step of providing a user interface for accepting selection of one or more processing items included in the processing settings from among a plurality of processing items prepared in advance and designation of an execution order of the selected processing items, the user interface being capable of designating a first order designation item and a second order designation item associated with a plurality of processing item sets each including a combination of the one or more processing items. The image processing program further causes the computer to execute: a step of starting, when a timing of an execution order of the first order specifying item comes, a process of each of a plurality of process item sets associated with the first order specifying item at the same time; and a step of sequentially starting execution of the respective processes of the plurality of process item sets associated with the second order specifying item each time a predetermined signal satisfies a predetermined condition when timing of the execution order of the second order specifying item comes.
In one embodiment of the present invention, each set process can be executed so that the processes that cannot be repeated are not executed simultaneously.
The above and other objects, features, embodiments and advantages of the present invention will become apparent from the following detailed description of the present invention which is to be read in connection with the accompanying drawings.
Drawings
Fig. 1 is a schematic diagram showing an overall configuration of an image processing system including an image processing apparatus.
Fig. 2 is a schematic diagram showing a functional configuration of the image processing apparatus.
Fig. 3 is a diagram showing a user interface for creating process settings provided by the image processing apparatus.
Fig. 4 is a diagram showing an example of a user interface, provided by the image processing apparatus, in which the image capturing processing is multi-staged.
Fig. 5 is a diagram showing a flow of the multistage process.
Fig. 6 is a diagram showing a transition procedure of the assignment process to each core in accordance with the process flow shown in fig. 5.
Fig. 7 is a diagram showing how the acquisition process is repeatedly performed.
Fig. 8 is a diagram showing a workpiece photographed by executing the multi-stage processing shown in fig. 5 and 6.
Fig. 9 is a diagram showing an example of a user interface, provided by the image processing apparatus, in which processing is parallelized.
Fig. 10 is a diagram showing a flow of parallel processing.
Fig. 11 is a diagram showing in time series the processing assigned to each core at the time of parallel processing.
Fig. 12 is a schematic diagram showing a configuration of an image processing program installed in the image processing apparatus.
Fig. 13 is a flowchart showing a setting process executed by the image processing apparatus in the setting mode.
Fig. 14 is a flowchart showing a measurement process performed by the image processing apparatus in the measurement mode.
Fig. 15 is a diagram showing an example of a user interface, provided by the image processing apparatus according to the application example, in which the image capturing processing is multi-staged.
Fig. 16 is a diagram showing a process flow in the case where a plurality of multi-level items are set.
Fig. 17 is a diagram showing a transition procedure of the assignment process to each core in accordance with the process flow shown in fig. 16.
Fig. 18 is a diagram showing a workpiece W imaged by the multi-stage processing according to the present application example.
Wherein the reference numerals are as follows:
1 image processing system, 5 controller, 6 transport mechanism, 8A, 8B imaging unit, 31A to 31C, 33A to 33C, 51A to 51E acquisition processing, 32A to 32C, 41A, 41B, 42A, 42B, 52A to 52E image processing, 70A to 70C, 72A, 72B area, 100 image processing apparatus, 102 display unit, 104 keyboard, 106 memory card, 110 processor, 110A to 110E core, 112 RAM, 114 display controller, 116 system controller, 118 I/O controller, 120 hard disk, 122 camera interface, 124 input interface, 126 PLC interface, 128 communication interface, 130 memory card interface, 150 image processing program, 152 setting unit, 154 execution control unit, 160 library, 170 processing setting, 300 user interface, 302 setting item display area, 304 processing item selection area, 306 camera image display area, 308 add button, 310 button, 350 multi-level block, 351, 364, 374, 384 acquisition item, 352, 354 multi-level item, 356, 357, 358, 359 sub multi-level item, 360, 370, 380, 390, 400, 460, 470 processing item set, 362, 372, 382, 392, 402 multi-level task, 366, 376, 386, 464, 466, 474, 476 image processing item, 450 parallel block, 452, 454 parallel item, 462, 472 parallel task.
Detailed Description
Hereinafter, embodiments according to the present invention will be described with reference to the drawings. In the following description, the same components and constituent elements are denoted by the same reference numerals. Their names and functions are also the same. Therefore, detailed description thereof will not be repeated. Further, the modifications described below may be optionally combined as appropriate.
A. Configuration of image processing system 1
First, the overall configuration of an image processing system 1 including an image processing apparatus 100 according to an embodiment of the present invention will be described. Fig. 1 is a schematic diagram showing the overall configuration of an image processing system 1 including an image processing apparatus 100 according to the present embodiment.
Referring to fig. 1, the image processing system 1 includes, as main components, an image processing apparatus 100 also called a vision sensor, an imaging unit 8 connected to the image processing apparatus 100, and a PLC (Programmable Logic Controller) 5 capable of communicating with the image processing apparatus 100. As an example, an image processing apparatus 100 integrated with the display unit 102 is shown.
The image processing apparatus 100 is incorporated in a production line or the like and executes image processing such as checking for defects and/or dirt on an inspection target (hereinafter also referred to as "workpiece W"), measuring the size and/or orientation of the workpiece W, and recognizing characters and/or graphics on the surface of the workpiece W. That is, the image processing apparatus 100 performs image processing on image data generated by imaging the workpiece W. In the image processing system 1, workpieces W are conveyed by a conveying mechanism 6 such as a belt conveyor and are sequentially imaged by the imaging unit 8. The PLC 5 communicates with the image processing apparatus 100 to control the conveying mechanism 6 and the like.
For example, the imaging unit 8 includes, in addition to an optical system such as a lens, an imaging element partitioned into a plurality of pixels, such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor. Image data captured by the imaging unit 8 (hereinafter also referred to as a "camera image") is transmitted to the image processing apparatus 100, which then performs image processing on it. An illumination device that irradiates the workpiece W imaged by the imaging unit 8 with light may further be provided. The image processing apparatus 100 may be configured so that additional imaging units 8 can be connected.
The image processing apparatus 100 has at least a "measurement mode" for executing a preset image process on the camera image from the imaging unit 8 and a "setting mode" for setting and adjusting the content of the image process. In the "setting mode", the user can set processing items for realizing image processing, an execution order thereof, and the like to the image processing apparatus 100. Details of the image processing setting step and the like will be described later.
B. Configuration of image processing apparatus 100
The following describes an overall configuration of an image processing apparatus 100 included in the image processing system 1 shown in fig. 1. Fig. 2 is a schematic diagram showing a functional configuration of the image processing apparatus 100 according to the present embodiment.
Referring to fig. 2, the image processing apparatus 100 generally has a structure in accordance with a general-purpose computer architecture, and implements various image processing as described later by a processor executing a program installed in advance.
More specifically, the image processing apparatus 100 includes a processor 110 such as a CPU (Central Processing Unit) or MPU (Micro-Processing Unit), a RAM (Random Access Memory) 112, a display controller 114, a system controller 116, an I/O (Input/Output) controller 118, a hard disk 120, a camera interface 122, an input interface 124, a PLC interface 126, a communication interface 128, and a memory card interface 130. These units are connected, centered on the system controller 116, so as to be capable of data communication with one another.
The processor 110 includes a plurality of processor cores (cores 110A to 110D) corresponding to a plurality of processing units. The processor 110 reads programs (code) via the system controller 116 and executes them in a predetermined order, thereby realizing the target arithmetic processing.
The cores 110A to 110D can execute commands independently of each other. The number of cores installed in the processor 110 is not limited to four; any plural number is possible within technically feasible limits. Fig. 2 shows a configuration in which a plurality of cores are mounted in a single processor (a so-called multi-core processor system), but a configuration in which a plurality of processors are mounted (a so-called multiprocessor system) may be employed instead. Further, a multi-core processor may be used as some or all of the processors constituting a multiprocessor system. That is, the image processing apparatus of the present embodiment may have any architecture as long as it has a plurality of processing units capable of executing processing independently of each other.
The system controller 116 is connected to the processor 110, the RAM112, the display controller 114, and the I/O controller 118 via buses, exchanges data between the respective units, and controls the overall processing of the image processing apparatus 100.
The RAM112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds a program read from the hard disk 120, a camera image (image data) acquired by the imaging unit 8, a processing result of the camera image, workpiece data, and the like.
The display controller 114 is connected to the display unit 102 and outputs signals for displaying various information to the display unit 102 in accordance with internal commands from the system controller 116. The display unit 102 includes, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
The I/O controller 118 controls data exchange with a recording medium or an external device connected to the image processing apparatus 100. More specifically, the I/O controller 118 is connected to a hard disk 120, a camera interface 122, an input interface 124, a PLC interface 126, a communication interface 128, and a memory card interface 130.
The hard disk 120 is generally a nonvolatile magnetic storage device and stores, in addition to the image processing program 150 executed by the processor 110, various setting values and the like. The image processing program 150 installed on the hard disk 120 is distributed in a state of being stored in the memory card 106 or the like. Camera images are also saved on the hard disk 120. Instead of the hard disk 120, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be used.
The camera interface 122 corresponds to an input unit that receives image data generated by imaging the workpiece W (inspection target), and mediates data transfer between the processor 110 and the imaging unit 8. More specifically, the camera interface 122 can be connected to one or more imaging units 8, and imaging instructions from the processor 110 are output to the imaging units 8 via the camera interface 122. In response, the imaging unit 8 images the subject and outputs the generated image to the processor 110 via the camera interface 122.
The input interface 124 mediates data transfer between the processor 110 and input devices such as a keyboard 104, mouse, touch panel, dedicated console, and the like. That is, the input interface 124 receives an operation instruction given by the user operating the input device.
The PLC interface 126 mediates data transfer between the processor 110 and the PLC 5. More specifically, the PLC interface 126 transmits information on the state of the production line controlled by the PLC5, information on the workpiece W, and the like to the processor 110.
The communication interface 128 mediates data transfer between the processor 110 and other personal computers and/or server devices (not shown). The communication interface 128 is typically Ethernet (registered trademark), USB (Universal Serial Bus), or the like. As described later, instead of installing the program stored in the memory card 106, a program downloaded from a distribution server or the like via the communication interface 128 may be installed in the image processing apparatus 100. For example, the communication interface 128 receives from the imaging unit 8, the PLC 5, or the like a signal indicating the state of the imaging unit 8, that is, whether the imaging unit 8 is currently capturing an image.
The memory card interface 130 mediates data transfer between the processor 110 and the memory card 106 as a recording medium. That is, the image processing program 150 and the like executed by the image processing apparatus 100 are distributed in a state of being stored in the memory card 106, and the memory card interface 130 reads the image processing program 150 from the memory card 106. In response to an internal command from the processor 110, the memory card interface 130 also writes camera images acquired by the imaging unit 8, processing results of the image processing apparatus 100, and the like to the memory card 106. The memory card 106 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, an optical recording medium such as a CD-ROM (Compact Disk Read-Only Memory), or the like.
In the case of using a computer having a general-purpose computer architecture as described above, an OS (Operating System) for providing basic functions of the computer may be installed in addition to the application tool for providing the functions of the present embodiment. In this case, the image processing program according to the present embodiment may call necessary ones of program modules provided as a part of the OS in a predetermined order and/or timing to execute processing. That is, the program itself of the present embodiment does not include the above-described module, and may execute processing in cooperation with the OS. Therefore, the image processing program according to the present embodiment may be configured not to include such a part of the modules.
Further, the image processing program according to the present embodiment may be provided as a part of another program. In this case, the program itself does not include a module included in another program combined as described above, and executes processing in cooperation with the other program. That is, the image processing program according to the present embodiment may be incorporated in another program such as this.
Further, a dedicated hardware circuit may be installed in place of a part or all of the functions provided by executing the image processing program.
C. Summary of the invention
The image processing apparatus 100 of the present embodiment provides a user interface that accepts selection of an arbitrary processing item from a plurality of processing items in which different processes are defined in advance. The user can select and configure an arbitrary processing item in the user interface. The image processing apparatus 100 executes each processing item in an execution order according to the arrangement order of the combination and arrangement of the selected processing items. Hereinafter, the content of the image processing defined by setting the processing item is also referred to as "processing setting". That is, "process setting" indicates a combination of process items selected by the user's input. In addition, the processing items in the present specification are functional units ("processing items" or "means") having specific applications, and the processing target and the processing result can be specified for each processing item.
The image processing apparatus 100 according to the present embodiment provides a user interface that makes it easier to set the processing items constituting the target image processing for parallel execution, so that the image processing can be performed more efficiently and at higher speed using the user's knowledge. The processing items to be parallelized are specified by associating them with a specific processing item. Hereinafter, such an item for specifying the processing items to be parallelized is also referred to as a "parallel item".
In the present embodiment, the execution order of the processing items can also be specified so that the image acquisition processes are executed sequentially, ensuring that the image capturing processes for one imaging unit do not overlap, while the subsequent image processing is executed in parallel. Hereinafter, such items for specifying this execution order are also referred to as "multi-level items". By associating processing items with a multi-level item, the image capturing processes that cannot overlap are executed sequentially while the subsequent image processing is executed in parallel. Processing executed by setting such a multi-level item is also referred to as "multi-stage processing".
Because not only parallel items but also multi-level items can be set, the degree of freedom in designing image processing is further improved. In general, a series consisting of image acquisition processing followed by image processing is repeated, and since the acquisitions cannot overlap, parallel items alone cannot be applied; in that case the user would have to execute each series entirely in series. The image processing apparatus 100 according to the present embodiment, however, provides multi-level items, prepared in advance, that execute the image acquisition processes sequentially while executing each subsequent image process in parallel. Therefore, the image capturing processes can be executed at appropriate timings while the overall processing time of the subsequent image processing is shortened.
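Under the assumption of a single shared camera, the multi-stage idea can be sketched as a pipeline: acquisitions are serialized with a lock so that captures never overlap, while each subsequent image-processing step runs in parallel. This is a minimal Python illustration, not the patent's implementation; all names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor
import threading

camera_lock = threading.Lock()

def acquire(camera):
    # Only one capture at a time on the shared imaging unit.
    with camera_lock:
        return camera()

def multilevel_task(camera, process):
    image = acquire(camera)   # sequential part (cannot overlap)
    return process(image)     # parallel part (overlaps with later acquisitions)

def run_multistage(camera, processes):
    # One task per multi-level task; processing of earlier images
    # proceeds while later images are still being acquired.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(multilevel_task, camera, p) for p in processes]
        return [f.result() for f in futures]
```

The design choice mirrors the text above: the lock enforces the "cannot be repeated" constraint on acquisition, while the thread pool exploits the multiple cores for the image processing that can overlap.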
D. User interface for creating process settings
The following describes a user interface for creating process settings provided by the image processing apparatus 100 according to the present embodiment. Fig. 3 is a diagram showing a user interface 300 for creating process settings provided by the image processing apparatus 100 of the present embodiment.
As shown in fig. 3, the image processing apparatus 100 provides at least a user interface 300 for accepting selection of one or more processing items used in image processing on image data from among a plurality of processing items defined in advance, and designation of an execution order of the selected processing items. As an example, an acquisition process for acquiring an image from the image pickup unit 8 (see fig. 1) connectable to the image processing apparatus 100 is defined as one of the processing items that can be selected. In addition, image processing for searching for an area representing an inspection object within an image obtained by photographing the inspection object is defined as one of processing items available for selection.
More specifically, the procedure by which the user creates a process setting that realizes the target image processing will be described. The user interface 300 has a set item display area 302, a processing item selection area 304, a camera image display area 306, a processing item insert/add button 308, and an execution order swap button 310. In the set item display area 302, the content of the currently configured process setting is displayed graphically. In the processing item selection area 304, icons and names of the processing items available for addition are displayed as a list.
The user selects a processing item required for the target image processing in the processing item selection area 304 of the user interface 300 ((1) select a processing item), and selects the position (order) in the set item display area 302 at which the selected processing item should be added ((2) select the addition position). Then, when the user presses the processing item insert/add button 308 ((3) press the insert/add button), the processing item is added ((4) processing item added). The content of the process setting after the addition is reflected in the set item display area 302.
The user appropriately repeats the processing to create processing settings for realizing the target image processing. Further, the user can select the execution order changeover button 310 to change the execution order as appropriate after selecting the processing item in the set item display area 302 during or after the creation of the processing setting.
By such an operation, the user can create the processing setting necessary for the target image processing. The creation of the processing setting is performed in the setting mode.
Then, when an instruction is received in the measurement mode, execution of the designated process setting is started. More specifically, when creation of the process setting is completed (normally, the created process setting is saved), an instruction sequence corresponding to the saved process setting is generated. The sequence of instructions is executed by a processor to thereby implement target image processing. The instruction sequence may be native code assigned to the processor, an internal command assigned to the OS or middleware, or a mixture of the native code and the internal command. That is, the image processing apparatus 100 generates a command sequence for a plurality of processor cores (processing units) in accordance with the setting for the image processing received by the interactive processing.
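The generation of an instruction sequence from a saved process setting can be illustrated at a high level. The patent describes native code or internal commands; the sketch below is a simplified stand-in in which each processing item is a callable and the "instruction sequence" is a closure that chains them, with every name being an assumption for illustration.

```python
def compile_setting(processing_items):
    """Turn an ordered list of processing items (callables) into one
    executable sequence, mirroring the saved process setting."""
    def run(context):
        # Each item consumes the previous item's result, in arrangement order.
        for item in processing_items:
            context = item(context)
        return context
    return run
```

In the measurement mode, invoking the compiled sequence once corresponds to executing the designated process setting on one input (e.g. a camera image).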
E. Multi-stage processing
(E1. Multi-level project)
The multi-level items described above, which execute subsequent image processing in parallel while keeping the image acquisition processes from overlapping, are now described in detail. In the user interface 300, the user can explicitly set the processing items subject to multi-level processing; these are specified by associating them with a multi-level item. A multi-level item is prepared in advance as one of the processing items displayed in the processing item selection area 304 of the user interface 300.
Fig. 4 is a diagram showing an example of a user interface, provided by the image processing apparatus 100 according to the present embodiment, for making the image capturing process multi-stage. In the initial state, no processing item is set, as shown in step S1. In this state, when the user selects the icon of the multi-level items displayed in the processing item selection area 304, the multi-level block 350 is added to the set item display area 302, as shown in step S2.
The multi-level block 350 includes a pair of a multi-level item 352 indicating the start of multi-stage processing and a multi-level item 354 indicating the end of multi-stage processing. A plurality of multi-level tasks can be set between the multi-level item 352 and the multi-level item 354, and the number of multi-level tasks included in the multi-level block 350 can be changed arbitrarily by the user. That is, multi-level tasks are added to or deleted from the designated multi-level block 350 by user operations. In the example of fig. 4, three multi-level tasks 362, 372, 382 are set between the multi-level items 352, 354.
An arbitrary processing item can be added to each multi-level task. For example, as shown in step S3, an acquisition item 364 for acquiring an image from the image pickup section and an image processing item 366 for searching the image for an inspection target such as a workpiece can be added to the multi-level task 362 as the processing item set 360. Likewise, an acquisition item 374 and an image processing item 376 are added to the multi-level task 372 as the processing item set 370, and an acquisition item 384 and an image processing item 386 are added to the multi-level task 382 as the processing item set 380. In this way, a processing item set composed of a combination of one or more processing items is associated with each multi-level task. A processing item set represents a group of processing items that are executed serially, in order, by a single processor core.
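The nesting just described, a multi-level block bracketing tasks, each of which holds a serially executed processing item set, can be modeled with a small data structure. All class names below are hypothetical, chosen only to mirror the terms used in this description.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingItem:
    name: str                     # e.g. an acquisition item or image processing item

@dataclass
class ProcessingItemSet:
    # Items in one set are executed serially by a single processor core.
    items: List[ProcessingItem] = field(default_factory=list)

@dataclass
class MultiLevelBlock:
    # Corresponds to the span between the start item 352 and end item 354.
    tasks: List[ProcessingItemSet] = field(default_factory=list)

# Three multi-level tasks, each pairing an acquisition item with an image
# processing item, as in the example of fig. 4.
block = MultiLevelBlock(tasks=[
    ProcessingItemSet([ProcessingItem("acquire"), ProcessingItem("search")])
    for _ in range(3)
])
print(len(block.tasks))  # → 3
```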
As shown in fig. 4, the multi-level blocks 350 are typically displayed hierarchically in a tree structure. As an example, the user interface 300 displays the multi-level items 352, 354 at a higher level than the set of processing items 360, 370, 380. In this way, the user can easily grasp the processing item set of the multi-level object.
In addition, the processing items associated with each of the processing item sets 360, 370, 380 are shown as being at the same level. In this way, the user can easily grasp the processing items that are processed serially.
(E2. Flow of multi-stage processing)
In the measurement mode, the image processing apparatus 100 processes in multiple stages, based on the multi-level items, each processing item set associated with the multi-level items set in the user interface 300. More specifically, when the timing of the execution order of the multi-level items comes, the image processing apparatus 100 sequentially starts the processes of the plurality of processing item sets associated with the multi-level items each time a predetermined signal satisfies a predetermined condition.
The flow of such a multistage process will be described below with reference to fig. 5 and 6. Fig. 5 is a diagram showing a flow of the multistage process. Fig. 6 is a diagram showing a transition procedure of the assignment process to each core in accordance with the process flow shown in fig. 5.
As in the description of fig. 4 above, a series of processing item sets 360, 370, and 380, each consisting of a process for acquiring an image from the imaging unit 8 and a process for searching the acquired image for an inspection target, are set in the user interface 300 as multi-stage processing targets. The image processing apparatus 100 sequentially starts the processes of the processing item sets 360, 370, and 380 set between the multi-level item 352 (start processing item) and the multi-level item 354 (end processing item), each time a trigger signal indicating the start of the next image capturing is received from the PLC5 (see fig. 1).
Normally, the PLC5 periodically inquires of the imaging unit 8 whether it is currently performing imaging, and, upon detecting a transition from the state in which the imaging unit 8 is imaging to the state in which it is not, transmits a trigger signal instructing the start of the next imaging process to the image processing apparatus 100 at the necessary timing. Upon receiving the trigger signal from the PLC5, the image processing apparatus 100 starts execution of one of the processing item sets to be processed in multiple stages. That is, the image processing apparatus 100 sequentially executes the processing item sets to be processed in multiple stages each time the imaging unit 8 transitions from imaging to not imaging.
As a more specific process flow, when the timing of the execution order of the multi-level items comes at time T1, the image processing apparatus 100 allocates the acquisition process 31A of the process item set 360 to the core 110A based on the reception of the trigger signal output from the PLC5 (see fig. 1). That is, the image processing apparatus 100 starts the acquisition process 31A (first acquisition process) of the process item set 360 before starting the acquisition process 31B (second acquisition process) of the process item set 370 when the execution start timing of the externally set multi-level item comes.
At time T2, the acquisition process 31A ends. Upon completion of the acquisition process 31A, the image processing apparatus 100 starts execution of the image process 32A on the core 110A, and also starts execution of the acquisition process 31B on the core 110B. More specifically, the image processing apparatus 100 allocates the image process 32A to the core 110A based on detection of the end of the acquisition process 31A, and starts the acquisition process 31B on the core 110B based on reception of a trigger signal indicating the start of image capturing from the PLC5 as an external apparatus. By sequentially executing the acquisition processes 31A and 31B based on the trigger signal in this way, imaging processes are prevented from overlapping on the single imaging unit. In addition, since the image process 32A and the acquisition process 31B are executed in parallel, the processing time is shortened.
The trigger signal will now be described in more detail. The PLC5 monitors whether the imaging unit 8 is imaging by periodically inquiring of the imaging unit 8, and transmits a trigger signal instructing the start of the imaging process to the image processing apparatus 100 upon a transition from the state in which the imaging unit 8 is imaging to the state in which it is not. Upon receiving the trigger signal from the PLC5 as an external device, the image processing apparatus 100 starts execution of the acquisition process 31B on the core 110B.
At time T3, the acquisition process 31B ends. Upon completion of the acquisition process 31B, the image processing apparatus 100 starts execution of the subsequent image process 32B and the acquisition process 31C. That is, regardless of whether the image process 32A has completed, the image processing apparatus 100 executes the image process 32B and starts the acquisition process 31C once the acquisition process 31B has ended. More specifically, the image processing apparatus 100 allocates the image process 32B to the core 110B, and allocates the acquisition process 31C to the core 110C based on reception of the trigger signal indicating the start of imaging.
At time T4, the acquisition process 31C ends. After the acquisition process 31C is completed, the image processing apparatus 100 allocates the subsequent image process 32C to the core 110C.
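The pipelining described in the steps above can be sketched as follows. This is a simplified, hypothetical model rather than the apparatus's actual implementation: a lock stands in for the single imaging unit, so acquisition processes are serialized, while each task's subsequent image processing may overlap with the next task's acquisition.

```python
import threading
import time

camera_busy = threading.Lock()   # stands in for the single imaging unit
log = []                         # CPython list.append is atomic under the GIL

def run_task(tag: str) -> None:
    with camera_busy:            # acquisition: serialized across tasks
        log.append(f"acquire {tag} start")
        time.sleep(0.01)         # simulated exposure time
        log.append(f"acquire {tag} end")
    time.sleep(0.02)             # image processing: may overlap other acquisitions
    log.append(f"process {tag} end")

# Three multi-stage tasks, one worker per core, hypothetically.
workers = [threading.Thread(target=run_task, args=(t,)) for t in "ABC"]
for w in workers:
    w.start()
for w in workers:
    w.join()
print(len(log))  # → 9
```

Because the lock is held across each acquisition, no two "acquire" entries interleave in the log, even though image processing runs concurrently.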
Fig. 7 is a diagram showing a case in which imaging processes overlap. As shown in fig. 7, if the acquisition process 33B were started before the acquisition process 33A completed, the imaging unit would have to execute two imaging processes at once. Similarly, if the acquisition process 33C were started before the acquisition process 33B completed, the imaging processes would again overlap. In the user interface 300 according to the present embodiment, processing items for parallelizing the individual image processes while keeping the image capturing processes from overlapping are prepared in advance, so such overlap of the image capturing processes can be prevented.
Fig. 8 is a diagram showing the workpiece W imaged by executing the multi-stage processing shown in figs. 5 and 6. Since the image capturing processes are executed sequentially by the multi-stage processing, the imaging unit 8 can image different regions of the workpiece W. As an example, the region 70A of the workpiece W is imaged by the first imaging process of the multi-stage processing, the region 70B by the second imaging process, and the region 70C by the third imaging process. Because the second imaging process can be executed before the image processing that follows the first imaging process has finished, the imaging interval becomes short, and the imaging processes can be executed at arbitrary timing.
As described above, when the timing of the execution order of the multi-level items comes, the image processing apparatus 100 sequentially starts the processes of the plurality of processing item sets associated with the multi-level items each time the trigger signal is received. Although the above description used the example in which the image processing apparatus 100 sequentially starts the acquisition process of each processing item set on condition that the trigger signal is received from the PLC5, the start condition for the acquisition process of each processing item set is not limited to this. For example, the image processing apparatus 100 may periodically inquire of the imaging unit 8 whether it is performing imaging, and start execution of the acquisition process of each processing item set on condition that the signal obtained in response indicates that the imaging unit 8 is not imaging. Alternatively, the imaging unit 8 itself may issue the trigger signal when it finishes imaging.
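The polling variant just described can be sketched as follows, with `poll_is_imaging` standing in, hypothetically, for the periodic inquiry to the imaging unit 8; the scripted state sequence is an illustrative assumption.

```python
import itertools

# Scripted responses: imaging, imaging, then idle (an assumption for the sketch).
states = iter([True, True, False])

def poll_is_imaging() -> bool:
    # Stand-in for the periodic inquiry to the imaging unit 8.
    return next(states)

def wait_until_idle(max_polls: int = 10) -> int:
    """Poll until the imaging unit reports idle; return the number of inquiries."""
    for n in itertools.count(1):
        if not poll_is_imaging():
            return n             # acquisition of the next processing item set may start
        if n >= max_polls:
            raise TimeoutError("imaging unit stayed busy")

polls_needed = wait_until_idle()
print(polls_needed)  # → 3
```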
F. Parallel processing
(F1. Parallel items)
The parallel items for specifying the execution order of processing items will be described in detail below. In the user interface 300, the user can explicitly designate the processing items to be processed in parallel. The processing items to be processed in parallel are specified by associating them with a parallel item. A parallel item is prepared in advance as one of the processing items displayed in the processing item selection area 304 of the user interface 300.
Fig. 9 is a diagram showing an example of a user interface, provided by the image processing apparatus 100 according to the present embodiment, for parallelizing processing. In the initial state, or in a state in which some processing item has already been set, when the user selects the icon of the parallel items displayed in the processing item selection area 304, the parallel block 450 is added to the process setting in the set item display area 302, as shown in step S11.
The parallel block 450 includes a pair of a parallel item 452 indicating the start of parallel processing and a parallel item 454 indicating the end of parallel processing. A plurality of parallel tasks can be set between the parallel item 452 and the parallel item 454, and the number of parallel tasks included in the parallel block 450 can be changed arbitrarily by the user. That is, parallel tasks are added to or deleted from the designated parallel block 450 by user operations. In the example of fig. 9, two parallel tasks 462 and 472 are set between the parallel items 452 and 454.
An arbitrary processing item can be added to each parallel task. For example, as shown in step S13, image processing items 464 and 466, each of which searches an image for a region representing an inspection target, are added to the parallel task 462 as the processing item set 460, and image processing items 474 and 476 are added to the parallel task 472 as the processing item set 470. In this way, a processing item set composed of a combination of one or more processing items is associated with each parallel task. A processing item set represents a group of processing items that are processed sequentially by a single processor core.
As shown in fig. 9, the parallel blocks 450 are typically displayed hierarchically in a tree structure. As an example, the user interface 300 displays the parallel items 452, 454 at a higher level than the processing item sets 460, 470. In this way, the user can easily grasp the processing item set of the parallel object.
In addition, the processing items associated with the processing item sets 460, 470 are shown as being located at the same level. In this way, the user can easily grasp the processing items that are processed serially.
(F2. Flow of parallel processing)
In the measurement mode, the image processing apparatus 100 processes in parallel, based on the parallel items, each processing item set associated with the parallel items set in the user interface 300. More specifically, when the timing of the execution order of a parallel item (first order specifying item) comes, the image processing apparatus 100 simultaneously starts the processes of the plurality of processing item sets associated with that parallel item.
The flow of parallel processing will be described below with reference to fig. 10 and 11. Fig. 10 is a diagram showing a flow of parallel processing. Fig. 11 is a diagram showing in time series the processing assigned to each core at the time of parallel processing.
At time T11, when the timing of the execution order of the parallel items comes, the image processing apparatus 100 allocates the image processing 41A of the processing item set 460 to the core 110A, and the image processing 42A of the processing item set 470 to the core 110B. That is, when the execution start timing set for the parallel items comes, the image processing apparatus 100 executes the image processing 41A and the image processing 42A in parallel.
At time T12, the image processing 41A ends. Upon its completion, the image processing apparatus 100 allocates the subsequent image processing 41B to the core 110A and starts its execution. In this manner, the image processing 41A and 41B set as the one processing item set 460 are executed serially, in order.
At time T13, the image processing 42A ends. Upon its completion, the image processing apparatus 100 allocates the subsequent image processing 42B to the core 110B and starts its execution. In this manner, the image processing 42A and 42B set as the one processing item set 470 are executed serially, in order.
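The parallel-item behavior, all processing item sets started together, with each set executed serially within itself, can be sketched like this. The worker-pool model and item names are illustrative assumptions, not the apparatus's actual scheduler.

```python
from concurrent.futures import ThreadPoolExecutor

def run_set(items):
    # Items within one processing item set run serially on one worker.
    results = []
    for item in items:
        results.append(f"{item} done")
    return results

set_460 = ["image processing 41A", "image processing 41B"]
set_470 = ["image processing 42A", "image processing 42B"]

# Two workers stand in for two processor cores; both sets start at once.
with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(run_set, s) for s in (set_460, set_470)]
    results = [f.result() for f in futures]
print(results)
```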
G. Functional structure of image processing program
The functional configuration of the image processing program according to the present embodiment will be described below. Fig. 12 is a schematic diagram showing the configuration of an image processing program 150 installed in the image processing apparatus 100 of the present embodiment.
Referring to fig. 12, the image processing program 150 includes a setting unit 152, an execution control unit 154, and a library 160 as basic components.
The setting unit 152 provides a user interface that accepts selection of one or more processing items to be included in the process setting 170 from among a plurality of processing items prepared in advance, and designation of the execution order of the selected processing items. That is, the setting unit 152 provides an interactive user interface for creating, in accordance with user operations, the process setting 170 for the image processing desired by the user. More specifically, the setting unit 152 displays the user interfaces shown in figs. 3, 4, and 9 on the display unit 102 and generates the process setting 170 according to the user operations. The generated process setting 170 is temporarily stored in a work area reserved in the RAM 112 (see fig. 2) or the like.
The execution control unit 154 executes each processing item in accordance with the processing item settings (process setting 170) received by the setting unit 152. More specifically, when the timing of the execution order of a parallel item comes, the execution control unit 154 simultaneously starts the processes of the plurality of processing item sets associated with that parallel item. When the timing of the execution order of a multi-level item comes, the execution control unit 154 sequentially starts the processes of the plurality of processing item sets associated with that multi-level item each time a trigger signal indicating the start of imaging is received. Processing items not associated with an execution-order specifying item such as a parallel item or a multi-level item are executed serially, one after another, in the order set by the process setting 170.
Normally, the execution control unit 154 refers to the library 160 prepared in advance and executes the processing associated with each processing item. That is, the library 160 includes library programs corresponding to the processing items listed in the processing item selection area 304 of the user interface 300 shown in fig. 3, and the execution control unit 154 calls the necessary library program to execute the image processing corresponding to a designated processing item. By using such a library 160, processing items can be added and/or improved more easily.
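The relationship between the execution control unit 154 and the library 160 can be sketched as a name-to-program registry that is consulted at execution time. The `register` decorator and the item names are assumptions for illustration, not the library 160's actual interface.

```python
# Hypothetical registry standing in for the library 160.
library = {}

def register(name):
    def deco(fn):
        library[name] = fn       # associate a library program with an item name
        return fn
    return deco

@register("acquire")
def acquire_image():
    return "image"

@register("search")
def search_workpiece():
    return "position"

def execute(item_name: str):
    # The execution control unit calls the library program for the designated item.
    return library[item_name]()

out = [execute(n) for n in ("acquire", "search")]
print(out)  # → ['image', 'position']
```

Adding or improving a processing item then only means registering a new or replacement library program, which matches the extensibility claimed for the library 160.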
Note that fig. 12 illustrates a single image processing program 150 that performs all of the process setting, the command-sequence generation, and the execution of the image processing, but these functions may also be realized by a plurality of cooperating programs. For example, the process setting and the generation of the command sequence may be executed on a personal computer, and the command sequence generated as a result may be transferred to the image processing apparatus for execution. In this case, the personal computer executes a program including the part corresponding to the setting unit 152, and the image processing apparatus 100 executes a program including the execution control unit 154 and the library 160. Such an implementation is naturally included in the technical scope of the present invention.
H. Flow chart
The control structure of the image processing system 1 will be described with reference to fig. 13 and 14. Fig. 13 is a flowchart showing a setting process executed by the image processing apparatus 100 in the setting mode. Fig. 14 is a flowchart showing measurement processing executed by the image processing apparatus 100 in the measurement mode.
Hereinafter, a control flow of the setting process in the setting mode and a control flow of the measurement process in the measurement mode will be described in order.
[ H1. control flow of setting processing ]
First, a control flow of the setting process executed by the image processing apparatus 100 in the setting mode will be described with reference to fig. 13.
The processing shown in fig. 13 is realized by, for example, the processor 110 (see fig. 2) of the image processing apparatus 100 executing a program. Alternatively, a part or all of the processing may be executed by the PLC5 (see fig. 1) or other hardware.
In step S110, the processor 110 determines whether or not a user operation to display a setting screen of a processing item is accepted. If the processor 110 determines that the user operation to display the setting screen of the processing item is accepted (yes in step S110), the control is switched to step S112. If not (no in step S110), the processor 110 executes the process of step S110 again.
In step S112, the processor 110 provides the user interface 300 (see fig. 3) for setting the processing items as the setting unit 152 (see fig. 12). The user interface 300 is displayed on, for example, the display unit 102 (see fig. 1) of the image processing apparatus 100.
In step S114, the processor 110, as the setting unit 152 (see fig. 12), accepts, through user operations on the user interface 300, selection of one or more processing items to be included in the process setting from among the plurality of processing items prepared in advance, and designation of the execution order of the selected processing items.
In step S120, the processor 110 determines whether an operation for saving the process setting is accepted. For example, the processor 110 determines that the operation for saving process setting is accepted when it detects that the save button of the user interface 300 is pressed. If the processor 110 determines that the operation for saving the process setting is accepted (yes in step S120), the control is switched to step S122. If not (no in step S120), the processor 110 returns control to step S114.
In step S122, the processor 110 reflects the setting items set in the user interface 300 to the processing settings. In this way, the contents set in the user interface 300 are saved as the process setting.
[ H2. control flow of measurement treatment ]
With reference to fig. 14, a control flow of the measurement process executed by the image processing apparatus 100 in the measurement mode will be described.
The processing shown in fig. 14 is realized by, for example, the processor 110 (see fig. 2) of the image processing apparatus 100 executing a program. Alternatively, a part or all of the processing may be executed by the PLC5 (see fig. 1) or other hardware.
In step S150, the processor 110 determines whether or not a measurement instruction of an inspection target such as a workpiece is received. If it is determined that the measurement instruction of the inspection target has been accepted (yes in step S150), the processor 110 switches the control to step S152. If not (no in step S150), the processor 110 executes the process of step S150 again.
In step S152, the processor 110 reads out the first processing item from the processing setting saved in the setting mode. The processing item corresponds to a processing item set in the uppermost portion of the set item display area 302 (see fig. 3) of the user interface 300.
In step S160, the processor 110 determines whether the read processing item is a processing item (for example, a parallel item or a multi-level item) for specifying the execution order of the processing. If the processor 110 determines that the read processing item is a processing item for specifying the execution order of the processing (yes in step S160), the control is switched to step S170. If not (no in step S160), the processor 110 switches control to step S162.
In step S162, the processor 110 executes the processing defined by the read processing item as the execution control unit 154 (see fig. 12).
In step S170, the processor 110 determines whether the read processing item is a parallel item. If the processor 110 determines that the read processing item is a parallel item (yes in step S170), the control is switched to step S172. If not (no in step S170), the processor 110 switches control to step S180.
In step S172, the processor 110, as the execution control unit 154, allocates each processing item set associated with the parallel items to one of the cores in the processor 110 and processes the processing item sets in parallel. The method of parallel processing is the same as that described with reference to figs. 10 and 11.
In step S180, the processor 110 determines whether the read processing item is a multi-level item. If the processor 110 determines that the read processing item is a multi-level item (yes in step S180), the control is switched to step S182. If not (no in step S180), the processor 110 switches control to step S190.
In step S182, the processor 110 as the execution control unit 154 sequentially starts executing the respective processes of the plurality of process item sets associated with the multi-level items each time a trigger signal indicating that the image capturing process is not being performed is received from the PLC5 (see fig. 1). The method of the multistage processing is the same as that described with reference to fig. 5 and 6.
In step S190, the processor 110 determines whether the measurement result of the workpiece is obtained by the processing performed in steps S162, S172, and S182. If it is determined that the measurement result of the workpiece is obtained (yes in step S190), the processor 110 switches the control to step S192. If not (no in step S190), the processor 110 switches control to step S200.
In step S192, the processor 110 displays the measurement result on the display unit 102 (see fig. 1) of the image processing apparatus 100. Examples of the measurement results displayed include an inspection result relating to the size of the inspection object and an inspection result relating to the appearance of the inspection object. For example, when the workpiece is a connector, a result indicating whether the length of the pins of the connector is normal, a result indicating whether the distance between the pins of the connector is normal, and the like are output as a result of the dimension inspection. The result indicating whether or not there is a defect or stain in the needle portion of the connector is output as the inspection result of the appearance.
In step S200, the processor 110 determines whether all the processing items included in the process setting have been executed. If the processor 110 determines that all the processing items included in the process setting have been executed (yes in step S200), the measurement processing shown in fig. 14 ends. If not (no in step S200), the processor 110 switches control to step S202.
In step S202, the processor 110 reads out the processing item to be executed next from the process setting, and returns control to step S160.
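The flow of steps S152 through S202 can be sketched as a simple interpreter loop over the saved process setting: read an item, branch on whether it is an execution-order specifying item, and continue until the setting is exhausted. The dictionary-based item format below is an assumption for illustration only.

```python
def run_measurement(process_setting):
    """Walk the process setting, dispatching each item as in fig. 14."""
    executed = []
    index = 0
    while index < len(process_setting):
        item = process_setting[index]                 # S152 / S202: read item
        if item["kind"] == "parallel":                # S170 yes → S172
            executed.append(("parallel", item["sets"]))
        elif item["kind"] == "multi-level":           # S180 yes → S182
            executed.append(("multi-level", item["sets"]))
        else:                                         # S160 no → S162
            executed.append(("serial", item["name"]))
        index += 1                                    # S200: all items done?
    return executed

setting = [
    {"kind": "item", "name": "preprocess"},
    {"kind": "multi-level", "sets": 3},
    {"kind": "parallel", "sets": 2},
]
trace = run_measurement(setting)
print(trace)
```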
I. Application example
An application example of the image processing apparatus 100 according to the present embodiment will be described below.
Although the description so far has assumed that one imaging unit 8 is connected to the image processing apparatus 100, in the present application example a plurality of imaging units 8 are connected to the image processing apparatus 100. When there is only one imaging unit 8, imaging processes must not overlap; when there are a plurality of imaging units 8, however, the imaging units 8 can execute imaging processes simultaneously. In the present application example, processing items for causing each of the plurality of imaging units 8 to execute multi-stage processing are prepared in advance.
Fig. 15 is a diagram showing an example of the user interface 300, provided by the image processing apparatus 100 according to the application example, for making the image capturing process multi-stage. In the initial state, or in a state in which some processing item (for example, an acquisition item 351 for acquiring an image) has already been set, the user selects the manual parallel icon displayed in the processing item selection area 304, whereby the multi-level block 350 is added to the set item display area 302.
The multi-level block 350 includes a pair of a multi-level item 352 indicating the start of multi-stage processing and a multi-level item 354 indicating the end of multi-stage processing. Between the multi-level item 352 and the multi-level item 354, a plurality of sub multi-level items can be set, with the number of imaging units connectable to the image processing apparatus 100 as the upper limit. In the example of fig. 15, the multi-level tasks 362, 372, and 382 sandwiched between the sub multi-level items 356 and 357 are designated as the first multi-stage target, and the multi-level tasks 392 and 402 sandwiched between the sub multi-level items 358 and 359 are designated as the second multi-stage target.
Fig. 16 is a diagram showing the flow of processing in a case where a plurality of multi-level items are set. Fig. 17 is a diagram showing the transition of the processes assigned to each core in accordance with the process flow shown in fig. 16. Whereas the processor 110 (see fig. 2) described above is configured with four cores 110A to 110D, the processor 110 in this application example is configured with five cores 110A to 110E. In addition, two imaging units, a first imaging unit and a second imaging unit, are connected to the image processing apparatus 100.
The acquisition process 51A and the image process 52A shown in fig. 16 correspond to the process item set 360 shown in fig. 15 described above. The acquisition process 51B and the image process 52B correspond to the processing item set 370 shown in fig. 15. The acquisition process 51C and the image process 52C correspond to the processing item set 380 shown in fig. 15. The acquisition process 51D and the image process 52D correspond to the processing item set 390 shown in fig. 15 described above. The acquisition process 51E and the image process 52E correspond to the process item set 400 shown in fig. 15 described above.
At time T21, the image processing apparatus 100 allocates the acquisition process 51A to the core 110A based on the reception of the trigger signal indicating that the first imaging unit is not imaging from the PLC 5. In this way, the execution of the acquisition process 51A for acquiring an image from the first imaging unit is started.
At time T22, the image processing apparatus 100 receives a trigger signal indicating that the second imaging unit is not imaging from the PLC5, and allocates the acquisition process 51D to the core 110D without waiting for the end of the acquisition process 51A. In this way, the execution of the acquisition process 51D for acquiring an image from the second imaging unit is started. As a result, the acquisition process 51D of the second imaging unit is executed in parallel with the acquisition process 51A of the first imaging unit.
At time T23, the acquisition process 51A ends. Upon completion of the acquisition process 51A, the image processing apparatus 100 simultaneously starts execution of the subsequent image processing 52A and the acquisition process 51B. More specifically, the image processing apparatus 100 allocates the image processing 52A to the core 110A, and allocates the acquisition process 51B to the core 110B based on reception of the trigger signal indicating the start of imaging by the first imaging unit. By sequentially executing the acquisition processes 51A and 51B based on the trigger signal in this way, the first imaging unit does not execute overlapping imaging processes.
At time T24, the acquisition process 51D ends. Upon its completion, the image processing apparatus 100 starts the subsequent image processing 52D and the acquisition process 51E at the same time. More specifically, the image processing apparatus 100 allocates the image processing 52D to the core 110D, and allocates the acquisition process 51E to the core 110E upon receiving the trigger signal indicating that the second imaging unit is no longer imaging. By executing the acquisition processes 51D and 51E sequentially in response to the trigger signal in this way, the second imaging unit never performs two imaging operations at the same time.
At time T25, the acquisition process 51B ends. Upon its completion, the image processing apparatus 100 starts the subsequent image processing 52B and the acquisition process 51C at the same time. More specifically, the image processing apparatus 100 allocates the image processing 52B to the core 110B, and allocates the acquisition process 51C to the core 110C upon receiving the trigger signal indicating that the first imaging unit is no longer imaging.
At time T26, the acquisition process 51E ends. Upon its completion, the image processing apparatus 100 allocates the subsequent image processing 52E to the core 110E and starts its execution.
At time T27, the acquisition process 51C ends. Upon its completion, the image processing apparatus 100 allocates the subsequent image processing 52C to the core 110C and starts its execution.
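The trigger-gated dispatch illustrated by the timeline above can be sketched as follows. This is a minimal illustration with hypothetical names, not the apparatus's actual implementation: worker threads from a pool stand in for the cores 110A to 110E, and a `threading.Event` stands in for the trigger signal from the PLC indicating whether the imaging unit is imaging.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def run_multilevel(n_sets, log, log_lock):
    """Dispatch n_sets acquisition/processing chains, one per core.

    The next acquisition starts only when the imaging unit signals that it
    is idle; each completed acquisition is followed by image processing
    that runs freely on its own core.
    """
    camera_idle = threading.Event()
    camera_idle.set()                          # the imaging unit starts out idle
    with ThreadPoolExecutor(max_workers=n_sets) as pool:  # one worker per "core"
        for i in range(n_sets):
            camera_idle.wait()                 # predetermined condition: not imaging
            camera_idle.clear()                # the unit becomes busy
            def chain(i=i):
                with log_lock:
                    log.append(("acquire_start", i))
                # ... exclusive use of the imaging unit would happen here ...
                with log_lock:
                    log.append(("acquire_end", i))
                camera_idle.set()              # trigger: imaging finished, unit idle
                with log_lock:
                    log.append(("process", i)) # subsequent image processing
            pool.submit(chain)

log, lock = [], threading.Lock()
run_multilevel(3, log, lock)
```

In the resulting log, acquisitions for the same imaging unit never overlap, while the image processing of an earlier set may still be running when the next acquisition starts, mirroring the behavior at times T23 to T27 above.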
Fig. 18 is a diagram showing a workpiece W imaged by the multi-level processing according to the present application example. The areas 70A to 70C of the workpiece W are imaged in sequence by the processing item sets 360, 370, and 380 (see fig. 15) set as the targets of the first multi-level item. The areas 72A and 72B of the workpiece W are imaged in sequence by the processing item sets 390 and 400 (see fig. 15) set as the targets of the second multi-level item. By setting a plurality of multi-level items in this way, the imaging process can be executed at an arbitrary timing for each of the plurality of imaging units 8A and 8B.
J. Advantages of the invention
As described above, in the image processing apparatus 100, parallel items, which execute their associated processes in parallel, and multi-level items, which start their associated processes one at a time each time a trigger signal indicates that the imaging unit is not imaging, are prepared in advance. Because multi-level items can be set in addition to parallel items, the degree of freedom in designing image processing is improved. Ordinarily, a series of image acquisition and subsequent image processing is repeated for the same imaging unit, so a parallel item cannot be used: the acquisitions would contend for the imaging unit, and the user would have to execute the entire series of processes serially. In the image processing apparatus 100 according to the present embodiment, however, multi-level items that start each series of processes in turn on each trigger signal are prepared in advance. The imaging process can therefore be executed at an arbitrary timing, while the subsequent image processing runs in parallel and the overall processing time is shortened.
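The contrast between the two kinds of items can be made concrete with a short sketch (hypothetical names; an illustration of the scheduling idea, not the apparatus's actual implementation): a parallel item submits every processing item set at once, while a multi-level item submits the next set only after a trigger arrives.

```python
import queue
from concurrent.futures import ThreadPoolExecutor

def run_parallel_item(sets):
    """Parallel item: start every processing item set at the same time."""
    with ThreadPoolExecutor(max_workers=len(sets)) as pool:
        futures = [pool.submit(s) for s in sets]
        return [f.result() for f in futures]

def run_multilevel_item(sets, triggers):
    """Multi-level item: start the next set only when a trigger arrives."""
    with ThreadPoolExecutor(max_workers=len(sets)) as pool:
        futures = []
        for s in sets:
            triggers.get()                   # wait for an "imaging unit idle" trigger
            futures.append(pool.submit(s))   # earlier sets may still be running
        return [f.result() for f in futures]

sets = [lambda i=i: f"set{i}" for i in range(3)]
parallel_results = run_parallel_item(sets)

triggers = queue.Queue()
for _ in range(3):
    triggers.put("idle")                     # triggers pre-loaded for this demo
multilevel_results = run_multilevel_item(sets, triggers)
```

In a real system the trigger queue would be fed by the PLC's signal; here the triggers are pre-loaded so the demo runs to completion.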
The presently disclosed embodiments are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims, not by the above description, and includes all modifications equivalent in meaning to the claims and within their scope.
Claims (7)
1. An image processing apparatus, comprising:
a plurality of processing units for performing a plurality of processes,
an execution control unit that executes image processing in accordance with a processing setting, and
a setting unit that provides a user interface for accepting selection of one or more processing items included in the processing setting from among a plurality of processing items prepared in advance and designation of an execution order of the selected processing items;
the setting unit is capable of specifying a first multi-level item and a second multi-level item that are associated with a plurality of processing item sets each composed of a combination of one or more processing items and are executed in parallel,
the execution control unit starts processing of each of a plurality of sets of processing items associated with the first multi-level item when timing of an execution order of the first multi-level item comes,
the execution control unit, when timing of an execution order of the second multi-level item comes, sequentially assigns each of a plurality of processing item sets associated with the second multi-level item to any one of the plurality of processing units each time a predetermined signal satisfies a predetermined condition, and causes the processing unit to which the processing item set is assigned to sequentially execute processing of each processing item associated with the processing item set in series,
the plurality of processing items include processing for acquiring an image from an imaging unit connectable to the image processing apparatus,
the plurality of processing item sets associated with the first multi-level item and the plurality of processing item sets associated with the second multi-level item include the processing of acquiring images from the same imaging unit,
the predetermined signal indicates whether the imaging unit is performing imaging,
the predetermined condition is satisfied by a shift from a state in which the imaging unit is imaging to a state in which the imaging unit is not imaging,
the execution control unit executes the processing item sets associated with the first multi-level item and the processing item sets associated with the second multi-level item in parallel, with the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the second multi-level item being executed after the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the first multi-level item.
2. The image processing apparatus according to claim 1,
the user interface hierarchically displays the second multi-level item and the processing item sets such that the second multi-level item is hierarchically superior to the processing item sets with which the second multi-level item is associated.
3. The image processing apparatus according to claim 1 or 2,
the second multi-level item is composed of a start processing item and an end processing item, the start processing item and the end processing item determining a range of a processing item set that can be associated with the second multi-level item,
when the timing of the execution order of the second multi-level item comes, the execution control unit sequentially starts execution of the processing of each processing item set between the start processing item and the end processing item each time the predetermined signal satisfies the predetermined condition.
4. The image processing apparatus according to claim 1 or 2,
the plurality of processing items include processing for searching for a region representing an inspection object in an image obtained by imaging the inspection object.
5. The image processing apparatus according to claim 3,
the plurality of processing items include processing for searching for a region representing an inspection object in an image obtained by imaging the inspection object.
6. An image processing method, characterized in that:
the image processing method is performed by a plurality of processing units,
the image processing method performs image processing in accordance with a processing setting,
the image processing method includes a step of providing a user interface for accepting selection of one or more processing items included in the processing setting from among a plurality of processing items prepared in advance and designation of an execution order of the selected processing items,
the user interface is capable of specifying a first multi-level item and a second multi-level item that are associated with a plurality of processing item sets made up of combinations including one or more processing items and executed in parallel,
the image processing method further includes:
a step of starting the respective processes of a plurality of processing item sets associated with the first multi-level item at the same time when the timing of the execution order of the first multi-level item comes, and
a step of, when timing of an execution order of the second multi-level item comes, sequentially assigning each of a plurality of processing item sets associated with the second multi-level item to any one of the plurality of processing units each time a predetermined signal satisfies a predetermined condition, and sequentially executing processing of each processing item associated with the processing item set in series by the processing unit to which the processing item set is assigned,
the plurality of processing items include processing for acquiring an image from an imaging unit,
the plurality of processing item sets associated with the first multi-level item and the plurality of processing item sets associated with the second multi-level item include the processing of acquiring images from the same imaging unit,
the predetermined signal indicates whether the imaging unit is performing imaging,
the predetermined condition is satisfied by a shift from a state in which the imaging unit is imaging to a state in which the imaging unit is not imaging,
the image processing method executes the processing item sets associated with the first multi-level item and the processing item sets associated with the second multi-level item in parallel, with the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the second multi-level item being executed after the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the first multi-level item.
7. A computer-readable storage medium having an image processing program stored thereon, wherein the image processing program, when executed on a processor of a computer, is carried out by a plurality of processing units and performs image processing in accordance with a processing setting,
the image processing program causes the computer to execute a step of providing a user interface for accepting selection of one or more processing items included in the processing setting from among a plurality of processing items prepared in advance and designation of an execution order of the selected processing items, the user interface being capable of specifying a first multi-level item and a second multi-level item that are associated with a plurality of processing item sets each composed of a combination of one or more processing items and are executed in parallel,
the image processing program further causes the computer to execute:
a step of starting the respective processes of a plurality of processing item sets associated with the first multi-level item at the same time when the timing of the execution order of the first multi-level item comes, and
a step of, when timing of an execution order of the second multi-level item comes, sequentially assigning each of a plurality of processing item sets associated with the second multi-level item to any one of the plurality of processing units each time a predetermined signal satisfies a predetermined condition, and sequentially executing processing of each processing item associated with the processing item set in series by the processing unit to which the processing item set is assigned,
the plurality of processing items include processing for acquiring an image from an imaging unit,
the plurality of processing item sets associated with the first multi-level item and the plurality of processing item sets associated with the second multi-level item include the processing of acquiring images from the same imaging unit,
the predetermined signal indicates whether the imaging unit is performing imaging,
the predetermined condition is satisfied by a shift from a state in which the imaging unit is imaging to a state in which the imaging unit is not imaging,
the image processing program causes the computer to execute the processing item sets associated with the first multi-level item and the processing item sets associated with the second multi-level item in parallel, with the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the second multi-level item being executed after the processing of acquiring an image from the same imaging unit among the plurality of processing item sets associated with the first multi-level item.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016130485A JP7103746B2 (en) | 2016-06-30 | 2016-06-30 | Image processing equipment, image processing methods, and image processing programs |
JP2016-130485 | 2016-06-30 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107562449A CN107562449A (en) | 2018-01-09 |
CN107562449B true CN107562449B (en) | 2022-04-29 |
Family
ID=60944928
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710094971.0A Active CN107562449B (en) | 2016-06-30 | 2017-02-22 | Image processing apparatus, image processing method, and image processing program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7103746B2 (en) |
CN (1) | CN107562449B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019175407A (en) * | 2018-03-29 | 2019-10-10 | パナソニック デバイスSunx株式会社 | Program creation device and program creation method of image inspection program, and creation program |
CN108712656B (en) * | 2018-06-07 | 2021-07-20 | 滨州学院 | Remote video processing method and video service terminal |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103312959A (en) * | 2012-03-15 | 2013-09-18 | 欧姆龙株式会社 | Photographing device and photographing device controlling method |
CN104103032A (en) * | 2013-04-05 | 2014-10-15 | 欧姆龙株式会社 | Image processing device, control method, and program |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5353566B2 (en) * | 2009-08-31 | 2013-11-27 | オムロン株式会社 | Image processing apparatus and image processing program |
- 2016-06-30: JP application JP2016130485A, patent JP7103746B2, active
- 2017-02-22: CN application CN201710094971.0A, patent CN107562449B, active
Also Published As
Publication number | Publication date |
---|---|
JP7103746B2 (en) | 2022-07-20 |
CN107562449A (en) | 2018-01-09 |
JP2018005501A (en) | 2018-01-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140304637A1 (en) | Image processing device, control method, and program | |
JP2019145177A (en) | Image processing method and imaging device | |
JP5469433B2 (en) | Image processing apparatus and image processing method | |
CN107562449B (en) | Image processing apparatus, image processing method, and image processing program | |
EP3764320A1 (en) | Substrate inspection method and system | |
JP5757156B2 (en) | Method, apparatus and program for calculating center position of detection object | |
US6920180B1 (en) | Image processing inspection apparatus | |
CN107562003B (en) | Image processing apparatus, image processing method, and image processing program | |
JP2021033869A (en) | Electronic equipment, control method of the same, program and storage medium | |
US10474124B2 (en) | Image processing system, image processing device, method of reconfiguring circuit in FPGA, and program for reconfiguring circuit in FPGA | |
CN111609933B (en) | Spectroscopic inspection method, image processing apparatus, and robot system | |
CN116519597B (en) | Multi-axis system detection method, device, upper computer, medium and system | |
JP7363601B2 (en) | Image processing device, control method and program | |
US20180218491A1 (en) | Image processing system, information processing device, information processing method, and information processing program | |
US10440264B2 (en) | Image processing system, method of reconfiguring field programmable gate array, information processing device, information processing method, and non-transitory computer-readable medium | |
CN109427039B (en) | Image processing apparatus, setting support method, and computer-readable recording medium | |
JP7311360B2 (en) | Image measurement system | |
US20220261943A1 (en) | Image processing device, image processing method, production system, product manufacturing method, and storage medium | |
JP7207509B2 (en) | Image processing system, information processing device, information processing method, and information processing program | |
CN118628785A (en) | Image processing apparatus and method | |
CN103018162B (en) | A kind of system and method processed for the image data tested | |
JP7571627B2 (en) | IMAGE PROCESSING APPARATUS, EDITING METHOD, AND PROGRAM | |
WO2021131531A1 (en) | Program executing device, program executing method, and program | |
JP7222795B2 (en) | Image inspection system and image inspection method | |
CN113744180A (en) | Image processing method, non-transitory recording medium, and image processing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||