WO2016125915A1 - Evaluation device and evaluation method - Google Patents
Evaluation device and evaluation method
- Publication number
- WO2016125915A1 (PCT/JP2016/056290)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- movement
- bucket
- evaluation
- detection
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06311—Scheduling, planning or task assignment for a person or group
- G06Q10/063114—Status monitoring or status determination for a person or group
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/36—Component parts
- E02F3/42—Drives for dippers, buckets, dipper-arms or bucket-arms
- E02F3/43—Control of dipper or bucket position; Control of sequence of drive operations
- E02F3/435—Control of dipper or bucket position; Control of sequence of drive operations for dipper-arms, backhoes or the like
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/264—Sensors and their calibration for indicating the position of the work tool
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0633—Workflow analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/08—Construction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0808—Diagnosing performance data
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0816—Indicating performance data, e.g. occurrence of a malfunction
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C5/00—Registering or indicating the working of vehicles
- G07C5/08—Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
- G07C5/0841—Registering performance data
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
- H04N7/185—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F3/00—Dredgers; Soil-shifting machines
- E02F3/04—Dredgers; Soil-shifting machines mechanically-driven
- E02F3/28—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets
- E02F3/30—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom
- E02F3/32—Dredgers; Soil-shifting machines mechanically-driven with digging tools mounted on a dipper- or bucket-arm, i.e. there is either one arm or a pair of arms, e.g. dippers, buckets with a dipper-arm pivoted on a cantilever beam, i.e. boom working downwardly and towards the machine, e.g. with backhoes
Definitions
- the present invention relates to an evaluation apparatus and an evaluation method.
- Patent Document 1 discloses a technique for evaluating the skill of an operator.
- An object of an aspect of the present invention is to provide an evaluation device and an evaluation method that can objectively evaluate the skill of an operator of a work vehicle.
- according to an aspect of the present invention, an evaluation apparatus comprises: a detection data acquisition unit that acquires detection data including a detected movement trajectory of a predetermined part of a work implement of a work vehicle; a target data generation unit that generates target data including a target movement trajectory of the predetermined part of the work implement; and an evaluation data generation unit that generates evaluation data of an operator who operates the work machine based on the detection data and the target data.
- according to another aspect, an evaluation apparatus comprises: a detection data acquisition unit that acquires, based on operation data of the work implement of a work vehicle, first detection data indicating an excavation amount of the work implement and second detection data indicating an excavation time of the work implement; and an evaluation data generation unit that generates evaluation data of an operator who operates the work machine based on the first detection data and the second detection data.
- according to another aspect, an evaluation method includes: acquiring detection data including a detected movement trajectory of a predetermined part of the work implement, based on operation data of the work implement of a work vehicle from a movement start position to a movement end position of the work implement detected by a detection device that detects the operation of the work implement; generating target data including a target movement trajectory of the predetermined part of the work implement; and generating evaluation data of an operator who operates the work machine based on the detection data and the target data.
- according to another aspect, an evaluation method includes: acquiring, based on operation data of the work implement of a work vehicle, first detection data indicating an excavation amount of the work implement and second detection data indicating an excavation time of the work implement; and generating evaluation data of an operator who operates the work machine based on the first detection data and the second detection data.
- an evaluation device and an evaluation method that can objectively evaluate the skill of an operator of a work vehicle are provided.
- FIG. 1 is a diagram schematically illustrating an example of an evaluation system according to the first embodiment.
- FIG. 2 is a side view showing an example of a hydraulic excavator according to the first embodiment.
- FIG. 3 is a plan view illustrating an example of the hydraulic excavator according to the first embodiment.
- FIG. 4 is a diagram schematically illustrating an example of the operation device according to the first embodiment.
- FIG. 5 is a diagram schematically illustrating an example of a hardware configuration of the evaluation system according to the first embodiment.
- FIG. 6 is a functional block diagram illustrating an example of the mobile device according to the first embodiment.
- FIG. 7 is a flowchart illustrating an example of the evaluation method according to the first embodiment.
- FIG. 8 is a flowchart illustrating an example of a shooting preparation method according to the first embodiment.
- FIG. 9 is a diagram for explaining an example of a photographing method according to the first embodiment.
- FIG. 10 is a diagram for explaining a method of specifying the position of the upper swing body according to the first embodiment.
- FIG. 11 is a diagram for explaining the work machine position specifying method according to the first embodiment.
- FIG. 12 is a schematic diagram for explaining an example of the evaluation method according to the first embodiment.
- FIG. 13 is a flowchart illustrating an example of a shooting and evaluation method according to the first embodiment.
- FIG. 14 is a diagram for explaining a method for specifying the movement start position of the work implement according to the first embodiment.
- FIG. 15 is a diagram for explaining a method of acquiring photographing data including a detected movement locus of the work machine according to the first embodiment.
- FIG. 16 is a diagram for explaining a method for acquiring imaging data including the detected movement locus of the work machine according to the first embodiment.
- FIG. 17 is a diagram for explaining a method for specifying the movement end position of the work machine according to the first embodiment.
- FIG. 18 is a diagram for explaining a method of generating target data indicating the target movement locus of the work machine according to the first embodiment.
- FIG. 19 is a diagram for explaining the evaluation data display method according to the first embodiment.
- FIG. 20 is a diagram for explaining an example of a relative data display method according to the first embodiment.
- FIG. 21 is a diagram for explaining an example of the operator evaluation method according to the first embodiment.
- FIG. 22 is a diagram for explaining an example of an operator evaluation method according to the first embodiment.
- FIG. 23 is a functional block diagram illustrating an example of a mobile device according to the second embodiment.
- FIG. 24 is a flowchart illustrating an example of a shooting and evaluation method according to the second embodiment.
- FIG. 25 is a diagram for explaining an example of the excavation amount calculation method according to the second embodiment.
- FIG. 26 is a diagram schematically illustrating an example of a hydraulic excavator including a detection device that detects the operation of the bucket.
- FIG. 27 is a diagram for explaining an example of a method for remotely operating a hydraulic excavator.
- FIG. 28 is a diagram for explaining an example of a method for remotely operating a hydraulic excavator.
- FIG. 1 is a diagram schematically illustrating an example of an evaluation system 1 according to the present embodiment.
- the work vehicle 3 operates at the construction site 2.
- the work vehicle 3 is operated by an operator Ma who has boarded the work vehicle 3.
- the evaluation system 1 performs one or both of the evaluation of the operation of the work vehicle 3 and the evaluation of the skill of the operator Ma who operates the work vehicle 3.
- the operator Ma operates the work vehicle 3 to carry out construction work at the construction site 2.
- a worker Mb different from the operator Ma also works at the construction site 2.
- the worker Mb performs auxiliary work at the construction site 2.
- the worker Mb uses the mobile device 6.
- the evaluation system 1 includes a management device 4 including a computer system and a portable device 6 including a computer system.
- the management device 4 functions as a server.
- the management device 4 provides a service to the client.
- the client includes at least one of an operator Ma, a worker Mb, an owner of the work vehicle 3, and a rental company from which the work vehicle 3 is rented. Note that the owner of the work vehicle 3 and the operator Ma of the work vehicle 3 may be the same person or different persons.
- the portable device 6 is possessed by at least one of the operator Ma and the worker Mb.
- the portable device 6 includes a portable computer such as a smartphone or a tablet personal computer.
- the management device 4 is capable of data communication with a plurality of portable devices 6.
- FIG. 2 is a side view showing an example of the hydraulic excavator 3 according to the present embodiment.
- FIG. 3 is a plan view showing an example of the hydraulic excavator 3 according to the present embodiment.
- FIG. 3 is a plan view of the excavator 3 viewed from above with the work machine 10 in the posture shown in FIG. 2.
- the excavator 3 includes a work machine 10 that is operated by hydraulic pressure, and a vehicle body 20 that supports the work machine 10.
- the vehicle main body 20 includes an upper swing body 21 and a lower traveling body 22 that supports the upper swing body 21.
- the upper swing body 21 includes a cab 23, a machine room 24, and a counterweight 24C.
- the cab 23 includes a driver's cab.
- a driver's seat 7 on which the operator Ma sits and an operating device 8 that is operated by the operator Ma are arranged in the driver's cab.
- the operating device 8 includes a work lever for operating the work implement 10 and the upper swing body 21 and a travel lever for operating the lower travel body 22.
- the work machine 10 is operated by the operator Ma via the operation device 8.
- the upper swing body 21 and the lower traveling body 22 are operated by the operator Ma via the operation device 8.
- the operator Ma can operate the operation device 8 while sitting on the driver's seat 7.
- the lower traveling body 22 includes drive wheels 25 called sprockets, idle wheels 26 called idlers, and crawler belts 27 supported by the drive wheels 25 and idle wheels 26.
- the drive wheel 25 is operated by power generated by a drive source such as a hydraulic motor.
- the drive wheel 25 rotates by operating the travel lever of the operation device 8.
- the drive wheel 25 rotates about the rotation axis DX1 as a rotation center.
- the idler wheel 26 rotates about the rotation axis DX2.
- the rotation axis DX1 and the rotation axis DX2 are parallel. As the drive wheel 25 rotates and the crawler belt 27 circulates, the excavator 3 travels forward or backward, or turns.
- the upper turning body 21 can turn around the turning axis RX while being supported by the lower traveling body 22.
- the work machine 10 is supported by the upper turning body 21 of the vehicle body 20.
- the work machine 10 includes a boom 11 connected to the upper swing body 21, an arm 12 connected to the boom 11, and a bucket 13 connected to the arm 12.
- the bucket 13 has, for example, a plurality of convex blades.
- a plurality of cutting edges 13B, which are the tips of the blades, are provided.
- the blade edge 13B of the bucket 13 may be the tip of a straight blade provided in the bucket 13.
- the upper swing body 21 and the boom 11 are connected via a boom pin 11P.
- the boom 11 is supported by the upper swing body 21 so as to be operable with the rotation axis AX1 as a fulcrum.
- the boom 11 and the arm 12 are connected via an arm pin 12P.
- the arm 12 is supported by the boom 11 so as to be operable with the rotation axis AX2 as a fulcrum.
- the arm 12 and the bucket 13 are connected via a bucket pin 13P.
- the bucket 13 is supported by the arm 12 so as to be operable with the rotation axis AX3 as a fulcrum.
- the rotation axis AX1, the rotation axis AX2, and the rotation axis AX3 are parallel to the front-rear direction. The definition of the front-rear direction will be described later.
- the direction in which the axes of the rotation axes AX1, AX2, and AX3 extend is referred to as the vehicle width direction of the upper swing body 21 as appropriate, and the direction in which the axis of the turning axis RX extends is referred to as the vertical direction of the upper swing body 21 as appropriate.
- the direction orthogonal to both the rotation axes AX1, AX2, AX3 and the turning axis RX is appropriately referred to as the front-rear direction of the upper turning body 21.
- the direction in which the work machine 10 including the bucket 13 is present is the front, and the reverse direction of the front is the rear.
- One side in the vehicle width direction is the right side, and the opposite direction to the right side, that is, the direction in which the cab 23 is present is the left side.
- the bucket 13 is disposed in front of the upper swing body 21.
- the plurality of cutting edges 13B of the bucket 13 are arranged in the vehicle width direction.
- the upper swing body 21 is disposed above the lower traveling body 22.
- Work machine 10 is operated by a hydraulic cylinder.
- the hydraulic excavator 3 has a boom cylinder 14 for operating the boom 11, an arm cylinder 15 for operating the arm 12, and a bucket cylinder 16 for operating the bucket 13.
- when the boom cylinder 14 expands and contracts, the boom 11 operates with the rotation axis AX1 as a fulcrum, and the tip of the boom 11 moves in the vertical direction.
- when the arm cylinder 15 expands and contracts, the arm 12 operates with the rotation axis AX2 as a fulcrum, and the tip of the arm 12 moves in the vertical direction or the front-rear direction.
- when the bucket cylinder 16 expands and contracts, the bucket 13 operates with the rotation axis AX3 as a fulcrum, and the blade edge 13B of the bucket 13 moves in the vertical direction or the front-rear direction.
- the hydraulic cylinder of the work machine 10 including the boom cylinder 14, the arm cylinder 15, and the bucket cylinder 16 is operated by a work lever of the operation device 8.
- the posture of the work implement 10 changes as the hydraulic cylinder of the work implement 10 expands and contracts.
- FIG. 4 is a diagram schematically illustrating an example of the operation device 8 according to the present embodiment.
- the work lever of the operating device 8 includes a right work lever 8WR disposed to the right of the center of the driver's seat 7 in the vehicle width direction and a left work lever 8WL disposed to the left of the center of the driver's seat 7 in the vehicle width direction.
- the travel lever of the operating device 8 includes a right travel lever 8MR disposed to the right of the center of the driver's seat 7 in the vehicle width direction and a left travel lever 8ML disposed to the left of the center of the driver's seat 7 in the vehicle width direction.
- the operation pattern defining the relationship between the tilting directions of the right work lever 8WR and the left work lever 8WL, the working direction of the work implement 10, and the turning direction of the upper swing body 21 is not limited to the relationship described above.
- FIG. 5 is a diagram schematically illustrating an example of a hardware configuration of the evaluation system 1 according to the present embodiment.
- the portable device 6 includes a computer system.
- the portable device 6 includes an arithmetic processing device 60, a storage device 61, a position detection device 62 that detects the position of the portable device 6, a photographing device 63, a display device 64, an input device 65, an input/output interface device 66, and a communication device 67.
- the arithmetic processing unit 60 includes a microprocessor such as a CPU (Central Processing Unit).
- the storage device 61 includes memory and storage such as ROM (Read Only Memory) or RAM (Random Access Memory).
- the arithmetic processing device 60 performs arithmetic processing according to a computer program stored in the storage device 61.
- the position detection device 62 detects an absolute position indicating the position of the mobile device 6 in the global coordinate system by using a Global Navigation Satellite System (GNSS).
- the photographing device 63 has a video camera function capable of acquiring moving image data of a subject and a still camera function capable of acquiring still image data of the subject.
- the photographing device 63 includes an optical system and an image sensor that acquires photographing data of a subject via the optical system.
- the image sensor includes a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor.
- the photographing device 63 can photograph the excavator 3.
- the imaging device 63 functions as a detection device that detects the operation of the work machine 10 of the excavator 3.
- the photographing device 63 photographs the hydraulic excavator 3 from the outside of the hydraulic excavator 3 and detects the operation of the work machine 10.
- the imaging device 63 can acquire the shooting data of the work machine 10 and acquire the movement data of the work machine 10 including at least one of the movement trajectory, the movement speed, and the movement time of the work machine 10.
- the shooting data of the work machine 10 includes one or both of moving image data and still image data of the work machine 10.
- the display device 64 includes a flat panel display such as a liquid crystal display (LCD) or an organic EL display (OLED).
- the input device 65 generates input data when operated.
- the input device 65 includes a touch sensor provided on the display screen of the display device 64.
- Display device 64 includes a touch panel.
- the input / output interface device 66 performs data communication among the arithmetic processing device 60, the storage device 61, the position detection device 62, the photographing device 63, the display device 64, the input device 65, and the communication device 67.
- the communication device 67 performs data communication with the management device 4 wirelessly.
- the communication device 67 performs data communication with the management device 4 using a satellite communication network, a mobile phone communication network, or an Internet line. Note that the communication device 67 may perform data communication with the management device 4 in a wired manner.
- Management device 4 includes a computer system.
- the management device 4 includes, for example, a server.
- the management device 4 includes an arithmetic processing device 40, a storage device 41, an output device 42, an input device 43, an input / output interface device 44, and a communication device 45.
- the arithmetic processing unit 40 includes a microprocessor such as a CPU.
- the storage device 41 includes a memory such as a ROM or a RAM and a storage.
- the output device 42 includes a display device such as a flat panel display.
- the output device 42 may include a printing device that outputs print data.
- the input device 43 generates input data when operated.
- the input device 43 includes at least one of a keyboard and a mouse. Note that the input device 43 may include a touch sensor provided on the display screen of the display device.
- the input / output interface device 44 performs data communication among the arithmetic processing device 40, the storage device 41, the output device 42, the input device 43, and the communication device 45.
- the communication device 45 performs data communication with the mobile device 6 wirelessly.
- the communication device 45 performs data communication with the mobile device 6 using a mobile phone communication network or an Internet line.
- the communication device 45 may perform data communication with the portable device 6 by wire.
- FIG. 6 is a functional block diagram illustrating an example of the mobile device 6 according to the present embodiment.
- the portable device 6 functions as an evaluation device 600 that performs one or both of the evaluation of the operation of the excavator 3 and the evaluation of the skill of the operator Ma who operates the excavator 3.
- the functions of the evaluation device 600 are exhibited by the arithmetic processing device 60 and the storage device 61.
- the evaluation device 600 includes: a detection data acquisition unit 601 that acquires detection data including the movement state of the work implement 10 based on data of the operation of the work implement 10 of the excavator 3 detected by the photographing device 63 (hereinafter referred to as operation data as appropriate); a position data calculation unit 602 that calculates position data of the work implement 10 based on the operation data of the work implement 10 detected by the photographing device 63; a target data generation unit 603 that generates target data including a target movement condition of the work implement 10; an evaluation data generation unit 604 that generates evaluation data based on the detection data and the target data; a display control unit 605 that controls the display device 64; a storage unit 608; and an input/output unit 610. The evaluation device 600 performs data communication via the input/output unit 610.
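As an illustration of how the functional units just listed relate to one another, the following Python sketch shows positions derived from the photographing data flowing through detection data and target data into evaluation data. It is an assumed structure, not the patent's actual implementation; all class, field, and parameter names are hypothetical.

```python
# Minimal sketch of the data flow through the evaluation device 600 (assumed structure).
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float]  # blade-tip position on the display screen (x, y)

@dataclass
class DetectionData:
    trajectory: List[Point] = field(default_factory=list)   # detected movement locus
    elapsed_times: List[float] = field(default_factory=list) # seconds since movement start

@dataclass
class TargetData:
    start: Point  # movement start position SP
    end: Point    # movement end position EP

@dataclass
class EvaluationData:
    deviation_area: float  # area between detected and target trajectories
    distance: float        # movement distance SP -> EP
    duration: float        # movement time

class EvaluationDevice:
    """Ties the units together: captured frames in, evaluation data out."""
    def __init__(self, position_calculator, evaluator):
        self.position_calculator = position_calculator  # e.g. template matching on a frame
        self.evaluator = evaluator                       # detection + target -> evaluation

    def run(self, frames, timestamps) -> EvaluationData:
        detection = DetectionData()
        for frame, t in zip(frames, timestamps):
            detection.trajectory.append(self.position_calculator(frame))
            detection.elapsed_times.append(t)
        # the target line connects the detected start and end positions
        target = TargetData(start=detection.trajectory[0], end=detection.trajectory[-1])
        return self.evaluator(detection, target)
```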
- the photographing device 63 detects the operation data of the work machine 10 from the movement start position to the movement end position of the work machine 10 operated by the operator Ma via the operation device 8.
- the operation data of the work machine 10 includes shooting data of the work machine 10 shot by the shooting device 63.
- the detection data acquisition unit 601 acquires detection data including a detected movement locus of a predetermined part of the work machine 10 based on the operation data of the work machine 10 from the movement start position to the movement end position detected by the photographing device 63. Further, the detection data acquisition unit 601 acquires the elapsed time after the bucket 13 starts moving based on the photographing data.
- the position data calculation unit 602 calculates the position data of the work implement 10 from the operation data of the work implement 10 detected by the photographing device 63.
- the position data calculation unit 602 calculates the position data of the work machine 10 from the shooting data of the work machine 10 using, for example, a pattern matching method.
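Pattern matching of this kind is commonly implemented with normalized cross-correlation; the sketch below uses OpenCV's template matching as one plausible realization. The patent does not specify a particular library, and the file names are placeholders.

```python
# Illustrative pattern-matching step (not the patent's exact algorithm): locate a template,
# e.g. the upper swing body template 21T, in a captured frame and return the best match.
import cv2
import numpy as np

def locate_part(frame_gray: np.ndarray, template_gray: np.ndarray):
    """Return (top_left_xy, correlation) of the best template match in the frame."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val  # correlation value can be used to accept or reject the match

# usage (file names are placeholders)
# frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
# template = cv2.imread("upper_swing_body_template.png", cv2.IMREAD_GRAYSCALE)
# position, score = locate_part(frame, template)
```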
- the target data generation unit 603 generates target data including the target movement locus of the work implement 10 from the operation data of the work implement 10 detected by the photographing device 63. Details of the target data will be described later.
- the evaluation data generation unit 604 generates evaluation data based on the detection data acquired by the detection data acquisition unit 601 and the target data generated by the target data generation unit 603.
- the evaluation data includes one or both of evaluation data indicating evaluation of the operation of the work machine 10 and evaluation data indicating evaluation of the operator Ma who operates the work machine 10 via the operation device 8. Details of the evaluation data will be described later.
- the display control unit 605 generates display data from the detection data and target data and causes the display device 64 to display the display data. Further, the display control unit 605 generates display data from the evaluation data and causes the display device 64 to display the display data. Details of the display data will be described later.
- the storage unit 608 stores various data.
- the storage unit 608 stores a computer program for executing the evaluation method according to the present embodiment.
- FIG. 7 is a flowchart illustrating an example of the evaluation method according to the present embodiment.
- the evaluation method includes a step of preparing the photographing of the excavator 3 by the photographing device 63 (S200), and a step of photographing the hydraulic excavator 3 using the photographing device 63 and evaluating the skill of the operator Ma ( S300).
- FIG. 8 is a flowchart illustrating an example of a shooting preparation method according to the present embodiment.
- the shooting preparation method includes a step of determining the shooting position of the photographing device 63 with respect to the excavator 3 (S210), a step of specifying the position of the upper swing body 21 (S220), a step of specifying the position of the boom 11 (S230), a step of specifying the position of the arm 12 (S240), and a step of specifying the position of the bucket 13 (S250).
- in step S210, a process of determining a relative position between the hydraulic excavator 3 and the photographing device 63 that photographs the hydraulic excavator 3 is performed.
- FIG. 9 is a diagram for explaining an example of a photographing method according to the present embodiment.
- when the input device 65 of the mobile device 6 is operated by the operator Ma or the worker Mb, the computer program stored in the storage unit 608 is activated.
- the portable device 6 transitions to the shooting preparation mode.
- the zoom function of the optical system of the shooting device 63 is limited.
- the excavator 3 is photographed by a photographing device 63 having a fixed prescribed photographing magnification.
- next, a process of specifying the position of the upper swing body 21 is performed (step S220).
- the position data calculation unit 602 specifies the position of the upper swing body 21 using the pattern matching method.
- FIG. 10 is a diagram for explaining a method for specifying the position of the upper swing body 21 according to the present embodiment.
- the photographing device 63 acquires photographing data of the photographing region 73 including the hydraulic excavator 3.
- the position data calculation unit 602 calculates the position data of the work machine 10 based on the shooting data of the shooting area 73 shot by the shooting device 63.
- the position data calculation unit 602 calculates the position data by scanning the upper swing body template 21T (first template), which is a template of the upper swing body 21, over the imaging region 73 on the display screen of the display device 64.
- the upper swing body template 21T is data indicating the outer shape of the upper swing body 21 as viewed from the left side, including the cab 23, the machine room 24, and the counterweight 24C, and is stored in the storage unit 608 in advance.
- the position data calculation unit 602 calculates the position data of the vehicle main body 20 based on the correlation value between the imaging data of the vehicle main body 20 and the upper swing body template 21T.
- if the upper swing body template 21T were data indicating the outer shape of only the cab 23 or only the machine room 24, that outer shape would be close to a quadrangle, which can also occur in the natural environment, and it might be difficult to specify the position of the upper swing body 21 based on the photographing data.
- because the upper swing body template 21T is data indicating an outer shape including the cab 23 and at least the machine room 24, the outer shape becomes an L-shaped polygon, which is less likely to occur in the natural environment, and the position of the upper swing body 21 can therefore be easily identified based on the photographing data.
- the position data of the vehicle body 20 is calculated, whereby the position of the upper swing body 21 is specified.
- by specifying the position of the upper swing body 21, the position of the boom pin 11P is specified.
- the position data calculation unit 602 calculates dimension data indicating the dimensions of the vehicle body 20 based on the shooting data of the shooting area 73.
- the position data calculation unit 602 calculates the dimension (the dimension L in the front-rear direction) of the upper swing body 21 on the display screen of the display device 64 when the upper swing body 21 is viewed from the left side.
- the position data calculation unit 602 calculates the position data of the boom 11 by moving a boom template 11T (second template) that is a template of the boom 11 with respect to the imaging region 73 on the display screen of the display device 64.
- the boom template 11T is data indicating the outer shape of the boom 11, and is stored in the storage unit 608 in advance.
- the position data calculation unit 602 calculates the position data of the boom 11 based on the correlation value between the shooting data of the boom 11 and the boom template 11T.
- FIG. 11 is a diagram for explaining a method for specifying the position of the boom 11 according to the present embodiment.
- the boom 11 is operable relative to the upper swing body 21 about the rotation axis AX1. Because the boom 11 rotates about the rotation axis AX1 and can take various postures, simply scanning the boom template 11T over the imaging region 73 may, depending on the rotation angle of the boom 11, fail to match the boom indicated by the photographing data with the prepared boom template 11T.
- the position of the boom pin 11P is specified by specifying the position of the upper swing body 21.
- in step S230, the position data calculation unit 602 matches, on the display screen of the display device 64, the position of the boom pin 11P of the boom 11 with the position of the boom pin of the boom template 11T. After matching the two boom-pin positions, the position data calculation unit 602 rotates the boom template 11T so that the boom 11 indicated by the photographing data and the boom template 11T coincide on the display screen of the display device 64, and calculates the position data of the boom 11.
- the position data calculation unit 602 calculates the position data of the boom 11 based on the correlation value between the shooting data of the boom 11 and the boom template 11T.
- alternatively, boom templates 11T of various postures may be stored in the storage unit 608 in advance, the boom template 11T that matches the boom 11 indicated by the photographing data may be searched for, and the position data of the boom 11 may be calculated by selecting that boom template 11T.
- the position of the boom 11 is specified by calculating the position data of the boom 11. By specifying the position of the boom 11, the position of the arm pin 12P is specified.
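One way to realize the pin-anchored rotating match described for the boom (and, analogously, for the arm and bucket below) is to rotate the template about the pin in small angle steps and keep the best-correlating angle. The sketch below is an assumed approach: it expects grayscale images and known pin coordinates, and the angle range and step are arbitrary choices.

```python
# Sketch: match a pivoting part by rotating its template about the known pin position
# and keeping the rotation angle whose template correlates best with the frame.
import cv2

def match_rotating_template(frame_gray, template_gray, pin_in_template, pin_in_frame,
                            angles=range(-90, 91, 2)):
    """Return (best_angle_deg, correlation) for a part that pivots about a known pin."""
    h, w = template_gray.shape
    # place the template so that its pin coincides with the pin position in the frame
    x0 = int(pin_in_frame[0] - pin_in_template[0])
    y0 = int(pin_in_frame[1] - pin_in_template[1])
    best = (None, -1.0)
    if x0 < 0 or y0 < 0:
        return best
    roi = frame_gray[y0:y0 + h, x0:x0 + w]
    if roi.shape != (h, w):
        return best
    for angle in angles:
        # rotate the template about its own pin position
        rot = cv2.getRotationMatrix2D(pin_in_template, float(angle), 1.0)
        rotated = cv2.warpAffine(template_gray, rot, (w, h))
        score = float(cv2.matchTemplate(roi, rotated, cv2.TM_CCOEFF_NORMED)[0, 0])
        if score > best[1]:
            best = (angle, score)
    return best
```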
- a process for specifying the position of the arm 12 is performed (step S240).
- the position data calculation unit 602 calculates the position data of the arm 12 by moving an arm template (second template) that is a template of the arm 12 with respect to the imaging region 73 on the display screen of the display device 64.
- the position data calculation unit 602 calculates the position data of the arm 12 based on the correlation value between the imaging data of the arm 12 and the arm template.
- the arm 12 is operable relative to the boom 11 about the rotation axis AX2. Because the arm 12 rotates about the rotation axis AX2 and can take various postures, simply scanning the arm template over the imaging region 73 may, depending on the rotation angle of the arm 12, fail to match the arm indicated by the photographing data with the prepared arm template.
- the position of the arm pin 12P is specified by specifying the position of the boom 11.
- the position data calculation unit 602 specifies the position of the arm 12 in the same procedure as the procedure for specifying the position of the boom 11.
- in step S240, the position data calculation unit 602 matches, on the display screen of the display device 64, the position of the arm pin 12P of the arm 12 with the position of the arm pin of the arm template.
- after matching the arm-pin positions, the position data calculation unit 602 rotates the arm template so that the arm 12 indicated by the photographing data and the arm template coincide on the display screen of the display device 64, and calculates the position data of the arm 12.
- the position data calculation unit 602 calculates the position data of the arm 12 based on the correlation value between the imaging data of the arm 12 and the arm template.
- alternatively, arm templates of various postures may be stored in the storage unit 608 in advance, the arm template that matches the arm 12 indicated by the photographing data may be searched for, and the position data of the arm 12 may be calculated by selecting that arm template.
- the position of the arm 12 is specified by calculating the position data of the arm 12. By specifying the position of the arm 12, the position of the bucket pin 13P is specified.
- a process for specifying the position of the bucket 13 is performed (step S250).
- the position data calculation unit 602 calculates the position data of the bucket 13 by moving a bucket template (second template) that is a template of the bucket 13 with respect to the imaging region 73 on the display screen of the display device 64.
- the position data calculation unit 602 calculates the position data of the bucket 13 based on the correlation value between the shooting data of the bucket 13 and the bucket template.
- the bucket 13 is operable relative to the arm 12 about the rotation axis AX3. Because the bucket 13 rotates about the rotation axis AX3 and can take various postures, simply scanning the bucket template over the imaging region 73 may, depending on the rotation angle of the bucket 13, fail to match the bucket indicated by the photographing data with the prepared bucket template.
- the position data calculation unit 602 specifies the position of the bucket 13 in the same procedure as the procedure for specifying the position of the boom 11 and the procedure for specifying the position of the arm 12.
- in step S250, the position data calculation unit 602 matches, on the display screen of the display device 64, the position of the bucket pin 13P of the bucket 13 with the position of the bucket pin of the bucket template.
- after matching the bucket-pin positions, the position data calculation unit 602 rotates the bucket template so that the bucket 13 indicated by the photographing data and the bucket template coincide on the display screen of the display device 64, and calculates the position data of the bucket 13.
- the position data calculation unit 602 calculates the position data of the bucket 13 based on the correlation value between the shooting data of the bucket 13 and the bucket template.
- alternatively, bucket templates of various postures may be stored in the storage unit 608 in advance, the bucket template that matches the bucket 13 indicated by the photographing data may be searched for, and the position data of the bucket 13 may be calculated by selecting that bucket template.
- the position of the bucket 13 is specified by calculating the position data of the bucket 13. By specifying the position of the bucket 13, the position of the blade edge 13B of the bucket 13 is specified.
- the movement state of the work machine 10 of the excavator 3 operated by the operator Ma via the operation device 8 is photographed by the photographing device 63 of the portable device 6.
- the operation conditions of the work machine 10 by the operator Ma are determined so that the work machine 10 moves under a specific movement condition.
- FIG. 12 is a diagram schematically showing operating conditions of the work machine 10 imposed on the operator Ma in the evaluation method according to the present embodiment.
- an operating condition, namely operating the blade edge 13B of the bucket 13 in an unloaded state in the air so as to draw a linear movement trajectory along a horizontal plane, is imposed on the operator Ma of the excavator 3.
- the operator Ma operates the operating device 8 so that the cutting edge 13B of the bucket 13 draws a straight movement locus along the horizontal plane.
- the movement start position and the movement end position of the bucket 13 are arbitrarily determined by the operator Ma.
- the position at which the blade edge 13B of the bucket 13 has been stationary for at least a specified time and then starts moving is determined as the movement start position, and the time at which the stationary bucket 13 starts moving is determined as the movement start time.
- the position at which the blade edge 13B of the moving bucket 13 stops and remains stopped for at least the specified time is determined as the movement end position, and the time at which the movement ends is determined as the movement end time.
- the position at which the stationary bucket 13 starts moving is the movement start position, and the time when the bucket 13 starts moving is the movement start time.
- the position where the bucket 13 in the moving state stops is the movement end position, and the time at which it stops is the movement end time.
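A possible realization of these start/end rules, assuming a sequence of per-frame blade-tip positions and timestamps obtained from the photographing data, is sketched below; the pixel tolerance and dwell time are assumptions, not values from the patent.

```python
# Sketch: detect the movement start and end of the blade edge 13B from per-frame positions.
# The tip counts as stationary when it moves less than `tol_px` between frames; a
# stationary stretch of at least `dwell` seconds marks the start (before moving) or end.
import math

def detect_start_end(positions, timestamps, tol_px=2.0, dwell=1.0):
    """Return (start_index, end_index) of the movement; either may be None if not found."""
    def displacement(i, j):
        return math.hypot(positions[j][0] - positions[i][0],
                          positions[j][1] - positions[i][1])

    start_idx = None
    still_since_idx = 0                       # first frame of the current stationary stretch
    for i in range(1, len(positions)):
        if displacement(i - 1, i) > tol_px:   # the tip moved between frames i-1 and i
            if (start_idx is None
                    and timestamps[i - 1] - timestamps[still_since_idx] >= dwell):
                start_idx = i - 1             # stationary long enough, then started moving
            still_since_idx = i               # currently moving: stationary stretch resets
        else:                                 # the tip is stationary
            if (start_idx is not None
                    and timestamps[i] - timestamps[still_since_idx] >= dwell):
                return start_idx, still_since_idx  # stopped long enough: movement ended here
    return start_idx, None
```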
- FIG. 13 is a flowchart showing an example of the photographing and evaluation method according to this embodiment.
- FIG. 13 shows a step (S300) of photographing the excavator 3 using the photographing device 63 and evaluating the skill of the operator Ma.
- the shooting and evaluation method according to the present embodiment includes a step (S310) of specifying the movement start position of the work machine 10, a step (S320) of obtaining shooting data of the moving work machine 10, a step of specifying the movement end position of the work machine 10, a step of generating target data including the target movement locus, and a step (S350) of generating evaluation data of the operator Ma.
- FIG. 14 is a diagram for explaining a method for specifying the movement start position of the work machine 10 according to the present embodiment.
- the detection data acquisition unit 601 specifies the position of the blade edge 13B of the bucket 13 of the work machine 10 in a stationary state based on the shooting data of the work machine 10 taken by the photographing device 63. When it determines that the time during which the blade edge 13B of the bucket 13 has been stationary is equal to or longer than the specified time, the detection data acquisition unit 601 determines the position of the blade edge 13B of the bucket 13 as the movement start position of the bucket 13.
- the detection data acquisition unit 601 detects that the bucket 13 has started moving based on the shooting data of the work implement 10. The detection data acquisition unit 601 determines that the time when the blade edge 13B of the stationary bucket 13 starts moving is the time when the bucket 13 starts moving.
- the detection data acquisition unit 601 acquires shooting data that is moving image data of the work machine 10 from the shooting device 63 (step S320).
- FIGS. 15 and 16 are diagrams for explaining a method of acquiring photographing data of the work machine 10 according to the present embodiment.
- the detection data acquisition unit 601 starts acquisition of imaging data of the work machine 10 that has started moving.
- the detection data acquisition unit 601 acquires detection data including the movement locus of the work implement 10 based on the imaging data of the bucket 13 from the movement start position to the movement end position.
- the detection data includes the movement trajectory of the unloaded work machine 10 in the air from when the stationary work machine 10 starts moving at the movement start position to when it stops moving at the movement end position.
- the detection data acquisition unit 601 acquires the movement trajectory of the bucket 13 based on the imaging data. Further, the detection data acquisition unit 601 acquires an elapsed time after the bucket 13 starts moving based on the imaging data.
- FIG. 15 shows the display device 64 immediately after the movement of the bucket 13 is started.
- the position data calculation unit 602 calculates the position data of the cutting edge 13B of the bucket 13, which is included in the position data of the work implement 10, and the display control unit 605 causes the display device 64 to display display data indicating the cutting edge 13B of the bucket 13.
- the movement start position SP is displayed on the display device 64 as a round point, for example.
- the display control unit 605 displays the movement end position EP on the display device 64 as, for example, a round dot.
- the display control unit 605 displays the plot PD (SP, EP), which is display data indicating the cutting edge 13B, on the display device 64 as, for example, a round point.
- the display control unit 605 also causes the display device 64 to display elapsed-time data TD, which is display data indicating the elapsed time since the work machine 10 started moving from the movement start position, and character data MD, which is display data indicating that the work machine 10 is moving between the movement start position and the movement end position.
- the display control unit 605 causes the display device 64 to display the “Moving” character data MD.
- FIG. 16 shows the display device 64 when the bucket 13 is moving.
- the detection data acquisition unit 601 continues to detect the position of the bucket 13 based on the imaging data, the position data calculation unit 602 continues to calculate the position data of the blade edge 13B of the bucket 13, and the detected movement trajectory of the blade edge 13B of the bucket 13 is thereby acquired.
- the detection data acquisition unit 601 acquires an elapsed time indicating the movement time of the bucket 13 from the movement start time.
- the display control unit 605 generates display data indicating the detected movement locus of the bucket 13 from the detection data, and causes the display device 64 to display the display data.
- the display control unit 605 generates a plot PD indicating the position of the blade edge 13B of the bucket 13 at regular time intervals based on the detection data.
- the display control unit 605 causes the display device 64 to display the plot PD generated at regular time intervals. In FIG. 16, a short interval between plots PD indicates that the moving speed of the bucket 13 is low, and a long interval between plots PD indicates that the moving speed of the bucket 13 is high.
- the display control unit 605 causes the display device 64 to display the detection line TL indicating the detected movement locus of the bucket 13 based on the plurality of plots PD.
- the detection line TL is broken line display data connecting a plurality of plots PD.
- the detection line TL may be displayed by connecting a plurality of plots PD with a smooth curve.
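As a sketch of how the plots PD and the detection line TL could be produced from the per-frame positions, one blade-tip position can be sampled at a regular interval and the samples joined into a broken line; the 0.5 s sampling interval below is an assumption.

```python
# Sketch: generate the plots PD at a regular time interval and the detection line TL
# as the polyline joining them. Closely spaced plots then indicate a low blade-tip
# speed, widely spaced plots a high speed.
def sample_plots(positions, timestamps, interval=0.5):
    """Pick one blade-tip position every `interval` seconds."""
    plots, next_t = [], timestamps[0]
    for pos, t in zip(positions, timestamps):
        if t >= next_t:
            plots.append(pos)
            next_t += interval
    return plots

def detection_line(plots):
    """The detection line TL as a list of consecutive plot pairs (a broken line)."""
    return list(zip(plots[:-1], plots[1:]))
```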
- FIG. 17 is a diagram for explaining a method for specifying the movement end position of the work machine 10 according to the present embodiment.
- the detection data acquisition unit 601 detects that the movement of the bucket 13 is stopped based on the shooting data.
- the detection data acquisition unit 601 determines the position at which the cutting edge 13B of the moving bucket 13 has stopped moving as the movement end position of the bucket 13, and determines the time at which the cutting edge 13B of the moving bucket 13 stops moving as the movement end time.
- more precisely, when the detection data acquisition unit 601 determines that the bucket 13 in the moving state has stopped moving and that the blade edge 13B of the bucket 13 has been stationary for at least the specified time, it determines the position of the blade edge 13B of the bucket 13 as the movement end position of the bucket 13.
- the position data calculation unit 602 calculates position data of the cutting edge 13B of the bucket 13 at the movement end position.
- FIG. 17 shows the display device 64 immediately after the movement of the bucket 13 is stopped.
- the display control unit 605 erases the elapsed-time data TD and the character data MD from the display device 64. Thereby, the worker Mb who is the photographer can recognize that the movement of the bucket 13 has stopped.
- alternatively, instead of erasing the character data MD from the display device 64, character data MD indicating that the movement of the bucket 13 has stopped may be displayed.
- FIG. 18 is a diagram for explaining a method for generating target data indicating the target movement locus of the work machine 10 according to the present embodiment.
- the target data generation unit 603 generates target data indicating the target movement locus of the bucket 13.
- the target movement locus includes a straight line connecting the movement start position SP and the movement end position EP.
- the display control unit 605 generates display data to be displayed on the display device 64 from the target data, and causes the display device 64 to display the display data.
- the display control unit 605 causes the display device 64 to display a target line RL indicating a target movement locus connecting the movement start position SP and the movement end position EP.
- the target line RL is linear display data connecting the movement start position SP and the movement end position EP.
- the target line RL is generated based on the target data. That is, the target line RL indicates target data.
- the display control unit 605 displays the plot PD (SP, EP) and the detection line TL on the display device 64 together with the target line RL. As described above, the display control unit 605 generates display data including the plot PD and the detection line TL from the detection data, generates display data including the target line RL that is target data, and causes the display device 64 to display the display data.
- thereby, the worker Mb or the operator Ma can qualitatively recognize how far the actual movement trajectory of the bucket 13 (the cutting edge 13B) deviates from the target movement trajectory indicated by the straight line.
- after the detection data including the detected movement trajectory is acquired and the target data including the target movement trajectory is generated, a process of generating quantitative evaluation data of the operator Ma based on the detection data and the target data is performed (step S350).
- the shooting data of the work machine 10 acquired by the shooting device 63 is stored in the storage unit 608.
- the worker Mb selects, via the input device 65, the shooting data to be evaluated from the plurality of shooting data stored in the storage unit 608.
- the evaluation data generation unit 604 generates evaluation data from the selected shooting data.
- the evaluation data generation unit 604 generates evaluation data of the operator Ma based on the difference between the detected movement trajectory and the target movement trajectory. The smaller the difference between the detected movement trajectory and the target movement trajectory, the better the bucket 13 (the cutting edge 13B) could be moved along the target movement trajectory, and the higher the skill of the operator Ma is evaluated to be. Conversely, the larger the difference between the detected movement trajectory and the target movement trajectory, the less the bucket 13 (the cutting edge 13B) could be moved along the target movement trajectory, and the lower the skill of the operator Ma is evaluated to be.
- to move the cutting edge 13B along a straight line, both the right work lever 8WR and the left work lever 8WL of the operating device 8 must be operated simultaneously or alternately; when the skill of the operator Ma is low, it is not easy to move the cutting edge 13B linearly over a long distance in a short time.
- the evaluation data generation unit 604 generates evaluation data based on the area of the plane defined by the detection line TL indicating the detected movement trajectory and the target line RL indicating the target movement trajectory. That is, as indicated by the hatched portion in FIG. 18, the evaluation data generation unit 604 calculates the area of the plane DI defined by the detection line TL, indicated by a curve, and the target line RL, indicated by a straight line, and generates evaluation data based on that area. The smaller the area, the higher the skill of the operator Ma; the larger the area, the lower the skill of the operator Ma. The size of the area (plane DI) is also included in the evaluation data.
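One reasonable way to compute the area of the plane DI between the detection line TL and the target line RL, not necessarily the patent's exact formulation, is to integrate the perpendicular deviation of the detected trajectory from the SP-to-EP line along that line:

```python
# Sketch: area between the detected blade-tip trajectory and the straight target line,
# computed with the trapezoidal rule over the perpendicular deviation (smaller is better).
import math

def deviation_area(trajectory, sp, ep):
    ux, uy = ep[0] - sp[0], ep[1] - sp[1]
    length = math.hypot(ux, uy)              # assumes SP and EP are distinct points
    ux, uy = ux / length, uy / length        # unit vector along the target line RL

    # (s, d): arc-length position along the target line, unsigned deviation from it
    samples = []
    for x, y in trajectory:
        dx, dy = x - sp[0], y - sp[1]
        s = dx * ux + dy * uy                # projection onto the target line
        d = abs(dx * (-uy) + dy * ux)        # perpendicular distance to the line
        samples.append((s, d))
    samples.sort()

    area = 0.0
    for (s0, d0), (s1, d1) in zip(samples[:-1], samples[1:]):
        area += 0.5 * (d0 + d1) * (s1 - s0)  # trapezoidal rule
    return area
```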
- the movement start position SP and the movement end position EP are specified based on the shooting data.
- the detection data acquisition unit 601 acquires the distance between the movement start position SP and the movement end position EP based on the imaging data.
- the detection data acquired by the detection data acquisition unit 601 includes the movement distance of the bucket 13 between the movement start position SP and the movement end position EP.
- the evaluation data generation unit 604 generates evaluation data based on the distance between the movement start position SP and the movement end position EP.
- The longer the distance between the movement start position SP and the movement end position EP, the longer the distance over which the bucket 13 could be moved along the target movement trajectory, and the higher the skill of the operator Ma is evaluated to be. The shorter the distance, the shorter the distance over which the bucket 13 could be moved along the target movement trajectory, and the lower the skill of the operator Ma is evaluated to be.
- the dimension L of the vehicle body 20 in the front-rear direction on the display screen of the display device 64 is calculated in the shooting preparation mode.
- Actual dimension data indicating the actual dimension of the vehicle body 20 in the front-rear direction is stored in the storage unit 608. Therefore, by calculating the distance between the movement start position SP and the movement end position EP on the display screen of the display device 64, the detection data acquisition unit 601 can calculate the actual movement distance of the bucket 13 from the movement start position SP to the movement end position EP, based on the ratio between the dimension L and the actual dimension of the vehicle body 20 stored in the storage unit 608. The movement distance may instead be calculated by the position data calculation unit 602.
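- The scale conversion described above amounts to multiplying an on-screen distance by the ratio of the stored actual body length to its on-screen dimension L. A minimal sketch (the function name and example values are illustrative assumptions):

```python
def actual_distance(screen_distance_px, screen_body_length_px, actual_body_length_m):
    """Convert an on-screen distance [px] to an actual distance [m] using
    the ratio between the vehicle body's on-screen dimension L and its
    stored actual front-rear dimension."""
    metres_per_pixel = actual_body_length_m / screen_body_length_px
    return screen_distance_px * metres_per_pixel

# Example: the bucket moved 480 px on screen while the body appears 320 px
# long and is assumed to be 9.6 m long -> 14.4 m of actual movement.
```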
- the elapsed time after the bucket 13 starts moving and the moving time of the bucket 13 from the movement start position SP to the movement end position EP are acquired based on the imaging data.
- the detection data acquisition unit 601 has an internal timer.
- The detection data acquisition unit 601 acquires the movement start time and the movement end time of the bucket 13 based on the measurement result of the internal timer and the shooting data of the imaging device 63.
- the detection data acquired by the detection data acquisition unit 601 includes the movement time of the bucket 13 between the movement start time and the movement end time.
- The evaluation data generation unit 604 generates evaluation data based on the movement time of the bucket 13 (blade edge 13B) between the movement start time and the movement end time. The shorter the time between the movement start time and the movement end time, the shorter the time in which the bucket 13 could be moved along the target movement trajectory, and the higher the skill of the operator Ma is evaluated to be. The longer that time, the longer it took to move the bucket 13 along the target movement trajectory, and the lower the skill of the operator Ma is evaluated to be.
- As described above, the detection data acquisition unit 601 calculates the actual movement distance of the bucket 13 from the movement start position SP to the movement end position EP. Therefore, based on this actual movement distance and the movement time of the bucket 13 from the movement start time to the movement end time, the detection data acquisition unit 601 can calculate the moving speed (average moving speed) of the bucket 13 between the movement start position SP and the movement end position EP. This moving speed may instead be calculated by the position data calculation unit 602.
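- The average moving speed used as an evaluation item is simply the actual movement distance divided by the movement time; a trivial sketch for completeness:

```python
def average_moving_speed(actual_distance_m, movement_time_s):
    """Average moving speed [m/s] of the bucket between the movement
    start position SP and the movement end position EP."""
    return actual_distance_m / movement_time_s
```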
- the detection data acquired by the detection data acquisition unit 601 includes the moving speed of the bucket 13 between the movement start position SP and the movement end position EP.
- The evaluation data generation unit 604 generates evaluation data based on the moving speed of the bucket 13 (blade edge 13B) between the movement start position SP and the movement end position EP. The higher the moving speed of the bucket 13 between the movement start position SP and the movement end position EP, the faster the bucket 13 (blade edge 13B) could be moved along the target movement trajectory, and the higher the skill of the operator Ma is evaluated to be. The lower the moving speed, the more slowly the bucket 13 (blade edge 13B) was moved along the target movement trajectory, and the lower the skill of the operator Ma is evaluated to be.
- FIG. 19 is a diagram for explaining a method of displaying evaluation data according to the present embodiment.
- the display control unit 605 generates display data from the evaluation data and causes the display device 64 to display the display data.
- the display control unit 605 displays the name of the operator Ma, which is personal data, on the display device 64, for example.
- Personal data is stored in the storage unit 608 in advance.
- The display control unit 605 causes the display device 64 to display, as evaluation data, “linearity” indicating the difference between the target movement locus and the detected movement locus, “distance” indicating the movement distance of the bucket 13 from the movement start position SP to the movement end position EP, “time” indicating the movement time of the bucket 13 from the movement start position SP to the movement end position EP, and “speed” indicating the average moving speed of the bucket 13 from the movement start position SP to the movement end position EP.
- the display control unit 605 causes the display device 64 to display numerical data of each item of “linearity”, “distance”, “time”, and “speed” as quantitative evaluation data.
- The numerical data of “linearity” can be calculated, for example, by giving 100 points when the difference between the target movement locus and the detected movement locus (the area of the plane DI) is less than a predetermined amount, and deducting points from 100 as the difference grows beyond the predetermined amount.
- numerical data may be displayed on the display device 64 as points based on the difference from the reference numerical value that is a maximum of 100 points.
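- As a sketch of this scoring rule (the allowance and the deduction rate are illustrative assumptions, not values given in the patent):

```python
def linearity_score(area_di, allowance, full_marks=100.0, penalty_per_unit=1.0):
    """Full marks while the area DI between the detected and target
    trajectories stays below the allowance; deduct points for the excess."""
    excess = max(0.0, area_di - allowance)
    return max(0.0, full_marks - penalty_per_unit * excess)
```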
- Evaluation data such as “time” and “speed” indicating the average moving speed of the portion from the movement start position SP to the movement end position EP may be acquired. That is, since the imaging device 63 (detection device) detects the operation of the work machine 10 and acquires the shooting data, the operation data based on the movement of the work machine 10 included in the shooting data is used to set the predetermined unit of the work machine 10. You may acquire a movement locus
- the display control unit 605 causes the display device 64 to display the skill score of the operator Ma as quantitative evaluation data.
- The storage unit 608 stores reference data regarding skill. The reference data is, for example, evaluation data obtained by comprehensively evaluating the numerical data of each of the items “linearity”, “distance”, “time”, and “speed” for an operator having standard skill, and is obtained statistically or empirically. The skill score of the operator Ma is calculated based on this reference data.
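- One plausible way to turn the item scores into a single skill score is to compare each item with the corresponding reference value and combine the ratios; the weighting below is an assumption for illustration, since the patent only states that the reference data is obtained statistically or empirically:

```python
def skill_score(item_scores, reference_scores, weights=None):
    """Combine the "linearity", "distance", "time" and "speed" item scores
    into one skill score relative to reference data for a standard operator."""
    if weights is None:
        weights = {item: 1.0 for item in item_scores}
    total_weight = sum(weights[item] for item in item_scores)
    weighted_ratio = sum(weights[item] * item_scores[item] / reference_scores[item]
                         for item in item_scores)
    return 100.0 * weighted_ratio / total_weight

# skill_score({"linearity": 80, "distance": 90, "time": 70, "speed": 85},
#             {"linearity": 75, "distance": 80, "time": 80, "speed": 80})
```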
- The display control unit 605 may also cause the display device 64 to display frequency data indicating how many times the operator Ma has generated evaluation data in the past, and the average or maximum of the past evaluation data (skill scores).
- the evaluation data generation unit 604 outputs the generated evaluation data to an external server via the communication device 67.
- the external server may be the management device 4 or a server different from the management device 4.
- Relative data indicating a relative evaluation between the operator Ma and other operators Ma is provided from the external server to the communication device 67 of the mobile device 6.
- the evaluation data generation unit 604 acquires relative data supplied from an external server.
- the display control unit 605 generates display data regarding the relative data and causes the display device 64 to display the display data.
- the relative data indicating the relative evaluation between the operator Ma and other operators Ma includes ranking data that ranks the skills of the plurality of operators Ma.
- the external server collects evaluation data of a plurality of operators Ma existing all over the country.
- the external server aggregates and analyzes the evaluation data of the plurality of operators Ma, and generates skill ranking data for each of the plurality of operators Ma.
- the external server distributes the generated ranking data to each of the plurality of mobile devices 6.
- the ranking data is included in the evaluation data, and is relative data indicating a relative evaluation with other operators Ma.
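- Server-side, the ranking can be produced by aggregating the scores reported from the mobile devices 6 and sorting them; the record layout below (operator name and score pairs, best score kept) is an assumption for illustration:

```python
def rank_operators(evaluation_records):
    """Aggregate (operator_name, score) pairs collected from many mobile
    devices 6, keep each operator's best score, and assign nationwide ranks."""
    best = {}
    for name, score in evaluation_records:
        best[name] = max(score, best.get(name, float("-inf")))
    ordered = sorted(best.items(), key=lambda item: item[1], reverse=True)
    return [{"rank": position + 1, "operator": name, "score": score}
            for position, (name, score) in enumerate(ordered)]
```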
- FIG. 20 is a diagram for explaining an example of a relative data display method according to the present embodiment.
- the display control unit 605 generates display data from the relative data and causes the display device 64 to display the display data.
- The display control unit 605 causes the display device 64 to display, for example, the number of operators Ma nationwide who have registered personal data with the mobile device 6 and generated evaluation data using the mobile device 6, the name of the operator Ma who generated evaluation data on this mobile device 6 (the mobile device 6 on which the display data is to be displayed), that operator Ma's rank among the operators Ma nationwide based on the evaluation data (score), and the score indicating the evaluation data.
- The display control unit 605 may also receive from the external server information indicating the names and scores of the operators Ma with higher scores and display that information on the display device 64.
- the rank based on the evaluation data is also included in the evaluation data, and is relative data indicating a relative evaluation with another operator Ma.
- As described above, according to the present embodiment, the evaluation device 600, which includes the detection data acquisition unit 601 that acquires detection data including the detected movement trajectory of the work machine 10, the target data generation unit 603 that generates target data including the target movement trajectory of the work machine 10, and the evaluation data generation unit 604 that generates evaluation data of the operator Ma based on the detection data and the target data, makes it possible to evaluate the skill of the operator Ma of the hydraulic excavator 3 objectively and quantitatively. Providing the evaluation data and relative data based on the evaluation data increases the operator Ma's motivation to improve his or her skill. Furthermore, the operator Ma can improve his or her own operation based on the evaluation data.
- The detection data includes the movement trajectory of the unloaded work machine 10 in the air from when the stationary work machine 10 starts moving at the movement start position SP until it ends its movement at the movement end position EP. By imposing the operating condition that the work machine 10 be moved in the air, the evaluation conditions can be kept constant for operators Ma all over the country. For example, since the soil quality differs from one construction site 2 to another, if operators Ma in various parts of the country were evaluated by actually performing excavation operations, each operator Ma would be evaluated under different evaluation conditions, and the fairness of the evaluation could be compromised. By having the work machine 10 operated so as to move in the air, the skill of the operators Ma can be evaluated fairly under the same evaluation conditions.
- a straight line connecting the movement start position SP and the movement end position EP is adopted as the target movement locus.
- The evaluation data generation unit 604 generates evaluation data based on the difference between the detected movement locus and the target movement locus. Thereby, the skill of the operator Ma in moving the blade edge 13B of the bucket 13 straight can be evaluated appropriately. According to the present embodiment, the evaluation data generation unit 604 generates evaluation data based on the area (difference) of the plane defined by the detection line TL indicating the detected movement locus and the target line RL indicating the target movement locus. Thereby, the skill of the operator Ma in moving the blade edge 13B of the bucket 13 straight can be evaluated even more appropriately.
- The detection data includes the movement distance of the bucket 13 between the movement start position SP and the movement end position EP, and the evaluation data generation unit 604 generates evaluation data based on the movement distance of the bucket 13. Thereby, an operator Ma capable of moving the blade edge 13B of the bucket 13 over a long distance can be appropriately evaluated as having high skill.
- The detection data includes the movement time of the bucket 13 from the movement start position SP to the movement end position EP, and the evaluation data generation unit 604 generates evaluation data based on the movement time of the bucket 13. Thereby, an operator Ma capable of moving the blade edge 13B of the bucket 13 in a short time can be appropriately evaluated as having high skill.
- The detection device that detects the operation data of the work machine 10 is the imaging device 63. Thereby, the operation data of the work machine 10 can be obtained easily without using a large-scale device.
- The position data calculation unit 602 scans the upper swing body template 21T (first template) over the shooting area 73 and calculates the position data of the upper swing body 21 based on the correlation value between the shooting data of the upper swing body 21 and the upper swing body template 21T; it then moves the boom template 11T (second template) over the shooting area 73 and calculates the position data of the boom 11 based on the correlation value between the shooting data of the boom 11 and the boom template 11T. As a result, the position of the work machine 10 can be specified even in the hydraulic excavator 3, which has the characteristic structure and movement in which the work machine 10 moves relative to the vehicle body 20.
- The position of the boom 11 is accurately specified by first specifying the position of the upper swing body 21, including the boom pin 11P, by the pattern matching method, and then specifying the position of the boom 11 with reference to the boom pin 11P.
- After the position of the boom 11 is specified, the position of the arm 12 is specified with reference to the arm pin 12P.
- After the position of the arm 12 is specified, the position of the bucket 13 is specified with reference to the bucket pin 13P. Therefore, even in the hydraulic excavator 3 with its characteristic structure and movement, the position of the blade edge 13B of the bucket 13 can be accurately specified.
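- The coarse-to-fine search described above can be illustrated with OpenCV template matching; the sketch below is not the patent's implementation, and the grayscale templates, normalized-correlation method, and search margin are assumptions:

```python
import cv2

def locate_body_then_boom(frame_gray, body_template, boom_template, margin=100):
    """Two-stage pattern matching: locate the upper swing body with the
    first template, then search for the boom template only in a region
    around the matched body, since the boom moves relative to the body."""
    # stage 1: upper swing body
    res_body = cv2.matchTemplate(frame_gray, body_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, body_tl = cv2.minMaxLoc(res_body)        # top-left of best match
    body_h, body_w = body_template.shape[:2]

    # stage 2: boom, restricted to a window around the matched body
    x0 = max(0, body_tl[0] - margin)
    y0 = max(0, body_tl[1] - margin)
    window = frame_gray[y0:y0 + body_h + 2 * margin, x0:x0 + body_w + 2 * margin]
    res_boom = cv2.matchTemplate(window, boom_template, cv2.TM_CCOEFF_NORMED)
    _, _, _, boom_tl = cv2.minMaxLoc(res_boom)
    return body_tl, (boom_tl[0] + x0, boom_tl[1] + y0)
```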
- the position data calculation unit 602 calculates the dimension data of the upper swing body 21 on the display screen of the display device 64 based on the shooting data of the shooting area 73.
- The evaluation data generation unit 604 can calculate the actual distance between the movement start position SP and the movement end position EP from the ratio between the dimension data of the upper swing body 21 on the display screen of the display device 64 and the actual dimension data of the upper swing body 21.
- the display control unit 605 that generates display data from the detection data and target data and displays the display data on the display device 64 is provided.
- The operator Ma can qualitatively recognize, visually, how far his or her skill is from the target.
- Since the display data can be displayed on the display device 64 as numerical data such as linearity, distance, time, speed, and score, the operator Ma can also quantitatively recognize his or her own skill.
- The display data includes one or both of elapsed time data TD indicating the elapsed time since the work machine 10 started moving from the movement start position SP, and character data MD indicating that the work machine 10 is moving from the movement start position SP to the movement end position EP.
- the display control unit 605 generates display data from the evaluation data and causes the display device 64 to display the display data. Accordingly, the operator Ma can objectively recognize the evaluation data of his / her skill through vision.
- FIGS. 21 and 22 are diagrams for explaining examples of evaluation methods for the operator Ma according to the present embodiment.
- In the first evaluation method, as shown in FIG. 12, the operator Ma operates the work machine 10 so that the blade edge 13B of the unloaded bucket 13 in the air draws a linear movement locus along a horizontal plane, and the skill of the operator Ma is evaluated.
- The operation of the work machine 10 in the first evaluation method is assumed to correspond to work of forming the ground into a flat surface or work of spreading and leveling earth and sand.
- As shown in FIG. 21, the operator Ma may be caused to operate the work machine 10 so that the cutting edge 13B of the unloaded bucket 13 in the air draws a linear movement trajectory inclined with respect to the horizontal plane, and the skill of the operator Ma may be evaluated (hereinafter, the second evaluation method).
- The operation of the work machine 10 in the second evaluation method is assumed to correspond to work of forming a slope, and requires high skill.
- The operator Ma may also be caused to operate the work machine 10 so that the blade edge 13B of the unloaded bucket 13 in the air draws a circular movement trajectory, and the skill of the operator Ma may be evaluated (hereinafter, the third evaluation method).
- All three of the first to third evaluation methods may be performed, or any one of them may be performed.
- The first to third evaluation methods may also be implemented step by step.
- the operation data of the work machine 10 when performing the lifting work may be imaged by the imaging device 63, and the skill of the operator Ma may be evaluated based on the operation data.
- In the embodiment described above, the operator Ma is evaluated based on the movement state of the unloaded work machine 10 in the air.
- In the present embodiment, the work machine 10 is operated by the operator Ma so that the bucket 13 performs excavation, and the operator Ma is evaluated.
- the mobile device 6 having the photographing device 63 is used for the evaluation of the operator Ma.
- the excavation operation of the work machine 10 of the excavator 3 operated by the operator Ma via the operation device 8 is photographed by, for example, the photographing device 63 of the portable device 6 held by the worker Mb.
- the photographing device 63 photographs the excavation operation of the work machine 10 from the outside of the excavator 3.
- FIG. 23 is a functional block diagram illustrating an example of a portable device according to the present embodiment. Similar to the above-described embodiment, the evaluation apparatus 600 includes the detection data acquisition unit 601, the position data calculation unit 602, the evaluation data generation unit 604, the display control unit 605, the storage unit 608, and the input / output unit 610. Have.
- The detection data acquisition unit 601 performs image processing on the operation data, including the shooting data of the work machine 10 detected by the imaging device 63, and acquires first detection data indicating the excavation amount of the bucket 13 and second detection data indicating the excavation time of the bucket 13.
- the evaluation data generation unit 604 generates evaluation data for the operator Ma based on the first detection data and the second detection data.
- The evaluation device 600 includes an excavation time calculation unit 613 that performs image processing on the imaging data of the bucket 13 captured by the imaging device 63 and calculates the excavation time of one excavation operation by the bucket 13.
- The evaluation device 600 also includes an excavation amount calculation unit 614 that performs image processing on the shooting data of the bucket 13 captured by the imaging device 63 and calculates the excavation amount of the bucket 13 from the area of the excavated material protruding from the opening end portion 13K (shown in FIG. 25) when the bucket 13 is viewed from the side (left or right).
- One excavation operation by the bucket 13 is, for example, the operation from when the bucket 13 starts moving to excavate material such as earth and sand and enters the ground, through the bucket 13 moving while scooping up and holding the earth and sand, until the movement of the bucket 13 stops.
- In the evaluation of the excavation time for this operation, the shorter the excavation time, the higher the skill of the operator Ma is determined to be, and the longer the excavation time, the lower the skill of the operator Ma is determined to be.
- the excavation time and the score may be associated with each other, and in the case of a short excavation time, evaluation data with a high score may be generated.
- the evaluation apparatus 600 includes a target data acquisition unit 611 that acquires target data indicating the target excavation amount of the work machine 10.
- the evaluation data generation unit 604 generates evaluation data of the operator Ma based on the difference between the first detection data indicating the excavation amount of the work implement 10 and the target data acquired by the target data acquisition unit 611.
- FIG. 24 is a flowchart illustrating an example of a shooting and evaluation method according to the present embodiment.
- The imaging and evaluation method according to the present embodiment includes a step of acquiring target data indicating a target excavation amount of the work machine 10 (step S305B), a step of specifying the movement start position of the work machine 10 (step S310B), and the further steps described below.
- A process of acquiring target data indicating the target excavation amount of the work machine 10 is performed (step S305B).
- the operator Ma declares a target excavation amount to be excavated, and inputs the target excavation amount to the evaluation device 600 via the input device 65.
- The target data acquisition unit 611 acquires target data indicating the target excavation amount of the bucket 13. Note that a target excavation amount stored in advance in the storage unit 608 may be used instead.
- the target excavation amount may be specified by the capacity of the excavated material, or may be specified by the full rate based on the state in which the excavated material of the specified volume has come out from the opening end of the bucket 13.
- the target excavation amount is designated by the full rate.
- The full rate is a kind of heaped capacity. For example, a state in which a predetermined amount of soil (for example, 1.0 [m3]) has been scooped into the bucket 13 such that the excavated material is heaped above the opening end (upper edge) of the bucket 13 with a 1:1 gradient corresponds to a full rate of 1.0.
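- In other words, the full rate expresses the scooped volume relative to the bucket's rated heaped (1:1) capacity; a minimal sketch of this ratio, assuming both quantities are known in cubic metres:

```python
def full_rate(excavated_volume_m3, heaped_capacity_m3):
    """Full rate: ratio of the excavated volume held in the bucket to the
    bucket's rated heaped (1:1) capacity, so the rated amount gives 1.0."""
    return excavated_volume_m3 / heaped_capacity_m3

# full_rate(0.8, 1.0) -> 0.8
```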
- In step S310B, a process of specifying the movement start position and the movement start time of the bucket 13 of the work machine 10 is performed. If the position data calculation unit 602 determines, based on the shooting data of the imaging device 63, that the time during which the bucket 13 has been stationary is equal to or longer than a specified time, it determines the position of the bucket 13 as the movement start position of the bucket 13.
- the position data calculation unit 602 detects that the movement of the bucket 13 is started based on the shooting data.
- the position data calculation unit 602 determines a time point when the stationary bucket 13 starts to move as a time point when the bucket 13 starts moving.
- In step S320B, a process of acquiring operation data of the bucket 13 is performed.
- The operation data of the bucket 13 includes the shooting data of the bucket 13 from when the stationary work machine 10 starts moving at the movement start position and performs the excavation operation until the excavation operation ends and the movement ends at the movement end position.
- In step S330B, a process of specifying the movement end position and the movement end time of the bucket 13 of the work machine 10 is performed.
- the position data calculation unit 602 detects that the movement of the bucket 13 is stopped based on the shooting data.
- the position data calculation unit 602 determines the position at which the bucket 13 in the moving state has stopped moving as the movement end position of the bucket 13. Further, the position data calculation unit 602 determines the time point when the bucket 13 in the moving state stops moving as the time point when the movement of the bucket 13 ends.
- When the position data calculation unit 602 determines that the moving bucket 13 has stopped and has then remained stationary for the specified time or longer, it determines the position of the bucket 13 as the movement end position of the bucket 13.
- the excavation time calculation unit 613 calculates the excavation time of the bucket 13 based on the imaging data (step S332B).
- the excavation time is the time from the start of movement to the end of movement.
- the excavation amount calculation unit 614 identifies the open end 13K of the bucket 13 based on the shooting data of the bucket 13 shot by the shooting device 63.
- FIG. 25 is a diagram for explaining an example of the excavation amount calculation method according to the present embodiment.
- the excavated material is held in the bucket 13 when the excavation operation ends.
- the excavation operation is performed so that the excavated material goes upward from the opening end portion 13K of the bucket 13.
- the excavation amount calculation unit 614 performs image processing on the imaging data of the bucket 13 captured from the left by the imaging device 63, and specifies the opening end portion 13K of the bucket 13 that is the boundary between the bucket 13 and the excavated material.
- The excavation amount calculation unit 614 can specify the opening end portion 13K of the bucket 13 based on contrast data including at least one of a luminance difference, a brightness difference, and a chromaticity difference between the bucket 13 and the excavated material.
- The excavation amount calculation unit 614 specifies the position of the opening end portion 13K of the bucket 13, performs image processing on the imaging data of the bucket 13 and the excavated material captured by the imaging device 63, and calculates the area of the excavated material protruding from the opening end portion 13K of the bucket 13.
- The excavation amount calculation unit 614 calculates the excavation amount of the bucket 13 from the area of the excavated material protruding from the opening end portion 13K. From this area, an approximate amount of soil (excavation amount) excavated by the bucket 13 in one excavation operation is estimated. That is, the capacity [m3] of the bucket 13 in use and the dimension of the bucket 13 in the width direction are known (for example, stored in advance in the storage unit 608), and the excavation amount calculation unit 614 estimates the excavation amount using the capacity and width of the bucket 13 together with the calculated area.
- The evaluation data generation unit 604 then generates evaluation data of the operator Ma.
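- One plausible decomposition of this estimate, treating the bucket interior as holding its rated capacity and approximating the heaped part as the measured side area times the bucket width, is sketched below; the exact formula is not given in the patent, so this decomposition is an assumption:

```python
def excavation_amount(heap_area_m2, bucket_width_m, bucket_capacity_m3):
    """Rough excavation-amount estimate: heaped material above the opening
    end 13K approximated as side area x bucket width, added to the bucket's
    known capacity."""
    heaped_part_m3 = heap_area_m2 * bucket_width_m
    return bucket_capacity_m3 + heaped_part_m3
```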
- The evaluation data may be based only on the excavation amount or only on the excavation time. However, having high skill in excavation work means being able to excavate an appropriate amount with the bucket 13 in a short time in one excavation operation. Therefore, in order to evaluate quantitatively whether the operator Ma has such skill, it is desirable to generate the evaluation data using both the excavation amount and the excavation time. For example, the evaluation data generation unit 604 adds a score related to the excavation amount and a score related to the excavation time to generate a comprehensively evaluated score.
- The evaluation data generation unit 604 generates evaluation data of the operator Ma based on the difference between the first detection data indicating the excavation amount of the bucket 13 and the target data, acquired in step S305B, indicating the target excavation amount of the bucket 13. The smaller the difference between the first detection data and the target data, the higher the skill of the operator Ma is evaluated to be; the larger the difference, the lower the skill of the operator Ma is evaluated to be. Furthermore, the shorter the excavation time, the higher the skill of the operator Ma is determined to be, and the longer the excavation time, the lower the skill of the operator Ma is determined to be.
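- As a sketch of how an amount score and a time score might be combined into one number (the equal weights and the reference time are illustrative assumptions, not patent values):

```python
def excavation_score(measured_full_rate, target_full_rate,
                     excavation_time_s, reference_time_s):
    """Combine closeness of the measured full rate to the declared target
    with a time score (shorter than the reference time scores higher)."""
    amount_score = max(0.0, 100.0 - 100.0 * abs(measured_full_rate - target_full_rate))
    time_score = min(100.0, 100.0 * reference_time_s / excavation_time_s)
    return 0.5 * amount_score + 0.5 * time_score
```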
- In step S360B, a process of displaying the evaluation data on the display device 64 is performed. For example, a score indicating the evaluation data is displayed on the display device 64.
- As described above, according to the present embodiment, the operator Ma actually performs the excavation operation in the evaluation of the operator Ma, the first detection data indicating the excavation amount of the work machine 10 and the second detection data indicating the excavation time are acquired, and the evaluation data of the operator Ma is generated based on the first detection data and the second detection data, so that the skill of the operator Ma in actual excavation work can be evaluated quantitatively.
- The evaluation device 600 includes the target data acquisition unit 611 that acquires target data indicating the target excavation amount, and the evaluation data generation unit 604 generates evaluation data based on the difference between the first detection data and the target data. For example, assuming the target data corresponds to a full rate of 1.0, the full rate of the excavation amount indicated by the first detection data relative to the excavation amount corresponding to a full rate of 1.0 may be generated as the evaluation data, or evaluation data may be generated using that ratio of the first detection data as a score. Thereby, an arbitrary target excavation amount can be specified and the skill of the operator Ma regarding the excavation amount can be evaluated.
- When performing a loading operation such as loading excavated material onto a dump truck bed using the hydraulic excavator 3, the operator Ma needs to finely adjust the excavation amount of the bucket 13 so as to obtain an appropriate loading amount.
- By specifying the target excavation amount and evaluating the skill of the operator Ma based on it, the skill of the operator Ma in actual loading work can be evaluated.
- According to the present embodiment, the excavation amount of the bucket 13 is calculated by performing image processing on the imaging data of the bucket 13 captured by the imaging device 63, from the area of the excavated material protruding from the opening end portion 13K of the bucket 13. Thereby, the excavation amount of the bucket 13 can be obtained easily without complicated processing. According to the present embodiment, it is possible to evaluate whether an appropriate amount of soil could be excavated with the bucket 13 in a short time in one excavation operation, and thus to evaluate the excavation work efficiency of the operator Ma.
- the operation data of the bucket 13 is detected by the imaging device 63.
- The operation data of the bucket 13 may instead be detected by a scanner device capable of detecting the operation data of the bucket 13 by irradiating the bucket 13 with detection light such as radar, or by a radar device capable of detecting the operation data of the bucket 13 by irradiating the bucket 13 with radio waves.
- FIG. 26 is a diagram schematically illustrating an example of a hydraulic excavator 3C including a detection device 63C that detects operation data of the bucket 13.
- the detecting device 63C detects the relative position of the blade edge 13B of the bucket 13 with respect to the upper swing body 21.
- the detection device 63C includes a boom cylinder stroke sensor 14S, an arm cylinder stroke sensor 15S, and a bucket cylinder stroke sensor 16S.
- The boom cylinder stroke sensor 14S detects boom cylinder length data indicating the stroke length of the boom cylinder 14.
- The arm cylinder stroke sensor 15S detects arm cylinder length data indicating the stroke length of the arm cylinder 15.
- The bucket cylinder stroke sensor 16S detects bucket cylinder length data indicating the stroke length of the bucket cylinder 16.
- an angle sensor may be used as the detection device 63C.
- The detection device 63C calculates the inclination angle θ1 of the boom 11 with respect to the direction parallel to the turning axis RX of the upper swing body 21 based on the boom cylinder length data.
- The detection device 63C calculates the inclination angle θ2 of the arm 12 with respect to the boom 11 based on the arm cylinder length data.
- The detection device 63C calculates the inclination angle θ3 of the blade edge 13B of the bucket 13 with respect to the arm 12 based on the bucket cylinder length data.
- Based on the inclination angle θ1, the inclination angle θ2, the inclination angle θ3, and the known work machine dimensions (the length L1 of the boom 11, the length L2 of the arm 12, and the length L3 of the bucket 13), the detection device 63C calculates the relative position of the blade edge 13B of the bucket 13 with respect to the upper swing body 21. Since the detection device 63C can detect the relative position of the bucket 13 with respect to the upper swing body 21, it can detect the movement state of the bucket 13.
- With the detection device 63C, at least the position, movement trajectory, movement speed, and movement time of the bucket 13 can be detected from the operation data of the bucket 13.
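- The blade-edge position follows from simple planar forward kinematics over the three links; the sketch below assumes the angles are measured cumulatively from the direction of the turning axis RX, which is one common convention and not necessarily the patent's:

```python
import math

def blade_edge_position(theta1, theta2, theta3, L1, L2, L3):
    """Position of the blade edge 13B relative to the boom pin, in the
    swing-body frame, from the boom/arm/bucket angles (radians) derived
    from the cylinder stroke sensors and the link lengths L1, L2, L3."""
    a1 = theta1              # boom angle from the turning-axis direction
    a2 = a1 + theta2         # arm angle, accumulated
    a3 = a2 + theta3         # bucket (blade edge) angle, accumulated
    x = L1 * math.sin(a1) + L2 * math.sin(a2) + L3 * math.sin(a3)   # horizontal
    z = L1 * math.cos(a1) + L2 * math.cos(a2) + L3 * math.cos(a3)   # along axis RX
    return x, z
```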
- Alternatively, the excavation amount of the bucket 13 may be obtained by providing a weight sensor in the bucket 13 and converting the detected weight into an excavation amount [m3].
- the operator Ma sits on the driver's seat 7 and operates the work machine 10.
- the work machine 10 may be remotely operated.
- FIGS. 27 and 28 are diagrams for explaining examples of methods for remotely operating the hydraulic excavator 3.
- FIG. 27 is a diagram illustrating a method in which the excavator 3 is remotely operated from the remote operation chamber 1000.
- the remote operation chamber 1000 and the excavator 3 can communicate wirelessly via a communication device.
- the remote operation room 1000 is provided with a construction information display device 1100, a driver's seat 1200, an operation device 1300 for remotely operating the excavator 3, and a monitor device 1400.
- the construction information display device 1100 displays various types of data such as construction site image data, work machine 10 image data, construction process data, and construction control data.
- the operating device 1300 includes a right working lever 1310R, a left working lever 1310L, a right traveling lever 1320R, and a left traveling lever 1320L.
- When the operation device 1300 is operated, an operation signal based on the operation direction and the operation amount is wirelessly transmitted to the hydraulic excavator 3. Thereby, the hydraulic excavator 3 is remotely operated.
- the monitor device 1400 is installed obliquely in front of the driver seat 1200. Detection data of a sensor system (not shown) of the hydraulic excavator 3 is wirelessly transmitted to the remote operation room 1000 via a communication device, and display data based on the detection data is displayed on the monitor device 1400.
- FIG. 28 is a diagram illustrating a method in which the excavator 3 is remotely operated by the mobile terminal device 2000.
- the portable terminal device 2000 includes a construction information display device, an operation device for remotely operating the excavator 3, and a monitor device.
- By acquiring the operation data of the remotely operated hydraulic excavator 3, the skill of the operator Ma who performs the remote operation can be evaluated.
- the management device 4 may have some or all of the functions of the evaluation device 600.
- That is, the management device 4 can evaluate the skill of the operator Ma based on the operation data of the hydraulic excavator 3. Since the management device 4 includes the arithmetic processing device 40 and the storage device 41, which can store a computer program for performing the evaluation method according to the present embodiment, the management device 4 can exhibit the functions of the evaluation device 600.
- the skill of the operator Ma is evaluated based on the operation data of the work machine 10.
- the operating state of the work machine 10 may be evaluated. For example, an inspection process for determining whether or not the operating state of the work implement 10 is normal based on operation data of the work implement 10 may be performed.
- the work vehicle 3 is the hydraulic excavator 3.
- the work vehicle 3 may be a work vehicle having a work machine that can move relative to the vehicle body, such as a bulldozer, a wheel loader, and a forklift.
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- Human Resources & Organizations (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- General Engineering & Computer Science (AREA)
- Civil Engineering (AREA)
- Mining & Mineral Resources (AREA)
- Structural Engineering (AREA)
- Theoretical Computer Science (AREA)
- Marketing (AREA)
- General Business, Economics & Management (AREA)
- Tourism & Hospitality (AREA)
- Mechanical Engineering (AREA)
- Development Economics (AREA)
- Operations Research (AREA)
- Quality & Reliability (AREA)
- Game Theory and Decision Science (AREA)
- Educational Administration (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Primary Health Care (AREA)
- Component Parts Of Construction Machinery (AREA)
- Operation Control Of Excavators (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
<Evaluation system>
FIG. 1 is a diagram schematically showing an example of an evaluation system 1 according to the present embodiment. A work vehicle 3 operates at a construction site 2. The work vehicle 3 is operated by an operator Ma who rides in the work vehicle 3. The evaluation system 1 performs one or both of an evaluation of the operation of the work vehicle 3 and an evaluation of the skill of the operator Ma who operates the work vehicle 3. The operator Ma operates the work vehicle 3 to carry out construction at the construction site 2. At the construction site 2, a worker Mb different from the operator Ma also carries out work. The worker Mb performs, for example, auxiliary work at the construction site 2, and uses, for example, the mobile device 6.
Next, the work vehicle 3 according to the present embodiment will be described. In the present embodiment, an example in which the work vehicle 3 is a hydraulic excavator will be described. FIG. 2 is a side view showing an example of the hydraulic excavator 3 according to the present embodiment. FIG. 3 is a plan view showing an example of the hydraulic excavator 3 according to the present embodiment, as seen from above with the work machine 10 in the posture shown in FIG. 2.
Next, the operation device 8 according to the present embodiment will be described. FIG. 4 is a diagram schematically showing an example of the operation device 8 according to the present embodiment. The work levers of the operation device 8 include a right work lever 8WR arranged to the right of the center of the driver's seat 7 in the vehicle width direction and a left work lever 8WL arranged to the left of the center of the driver's seat 7 in the vehicle width direction. The travel levers of the operation device 8 include a right travel lever 8MR arranged to the right of the center of the driver's seat 7 in the vehicle width direction and a left travel lever 8ML arranged to the left of the center of the driver's seat 7 in the vehicle width direction.
Next, the hardware configuration of the evaluation system 1 according to the present embodiment will be described. FIG. 5 is a diagram schematically showing an example of the hardware configuration of the evaluation system 1 according to the present embodiment.
Next, the mobile device 6 shown in FIG. 5 will be described in detail. FIG. 6 is a functional block diagram showing an example of the mobile device 6 according to the present embodiment. The mobile device 6 functions as an evaluation device 600 that performs one or both of an evaluation of the operation of the hydraulic excavator 3 and an evaluation of the skill of the operator Ma who operates the hydraulic excavator 3. The functions of the evaluation device 600 are realized by the arithmetic processing device 60 and the storage device 61.
Next, a method for evaluating the operator Ma according to the present embodiment will be described. FIG. 7 is a flowchart showing an example of the evaluation method according to the present embodiment.
Preparation for imaging the hydraulic excavator 3 with the imaging device 63 is carried out (step S200). FIG. 8 is a flowchart showing an example of the imaging preparation method according to the present embodiment.
When the step of preparing for imaging of the hydraulic excavator 3 with the imaging device 63 (S200) has been executed, the position of the work machine 10 has been specified, and the movement start position of the bucket 13 described below has been specified, the mobile device 6 transitions to an imaging and evaluation mode. In the imaging and evaluation mode as well, the zoom function of the optical system of the imaging device 63 is restricted. The hydraulic excavator 3 is imaged by the imaging device 63 at a fixed prescribed imaging magnification. The prescribed imaging magnification in the imaging preparation mode and the prescribed imaging magnification in the imaging and evaluation mode are the same.
As described above, according to the present embodiment, the evaluation device 600, which includes the detection data acquisition unit 601 that acquires detection data including the detected movement trajectory of the work machine 10, the target data generation unit 603 that generates target data including the target movement trajectory of the work machine 10, and the evaluation data generation unit 604 that generates evaluation data of the operator Ma based on the detection data and the target data, makes it possible to evaluate the skill of the operator Ma of the hydraulic excavator 3 objectively and quantitatively. Providing the evaluation data and relative data based on the evaluation data increases the operator Ma's motivation to improve his or her skill. Furthermore, the operator Ma can improve his or her own operation based on the evaluation data.
A second embodiment will be described. In the following description, components that are the same as or equivalent to those of the above-described embodiment are denoted by the same reference numerals, and their description is simplified or omitted.
In the above-described embodiments, the operation data of the bucket 13 is detected by the imaging device 63. The operation data of the bucket 13 may instead be detected by a scanner device that can detect the operation data of the bucket 13 by irradiating the bucket 13 with detection light such as radar, or by a radar device that can detect the operation data of the bucket 13 by irradiating the bucket 13 with radio waves.
Claims (12)
- 1. An evaluation device comprising: a detection data acquisition unit that acquires detection data including a detected movement trajectory of a predetermined part of a work machine of a work vehicle, based on operation data of the work machine from a movement start position to a movement end position detected by a detection device that detects an operation of the work machine; a target data generation unit that generates target data including a target movement trajectory of the predetermined part of the work machine; and an evaluation data generation unit that generates evaluation data of an operator who operates the work machine based on the detection data and the target data.
- 2. The evaluation device according to claim 1, wherein the detection data includes a detected movement trajectory of the work machine in an unloaded state in the air from when the work machine, stationary at the movement start position, starts moving until it ends its movement at the movement end position.
- 3. The evaluation device according to claim 1 or 2, wherein the target movement trajectory includes a straight line connecting the movement start position and the movement end position.
- 4. The evaluation device according to any one of claims 1 to 3, wherein the evaluation data generation unit generates the evaluation data based on a difference between the detected movement trajectory and the target movement trajectory.
- 5. The evaluation device according to any one of claims 1 to 4, wherein the detection data includes a distance between the movement start position and the movement end position, and the evaluation data generation unit generates the evaluation data based on the distance.
- 6. The evaluation device according to any one of claims 1 to 5, wherein the detection data includes a movement time of the work machine from the movement start position to the movement end position, and the evaluation data generation unit generates the evaluation data based on the movement time.
- 7. The evaluation device according to any one of claims 1 to 6, wherein the detection device includes an imaging device capable of imaging the work vehicle, the operation data includes imaging data of the work machine, the work machine is supported by a vehicle body of the work vehicle, the operation data includes imaging data of an imaging region including the work vehicle imaged by the imaging device, the evaluation device comprises a position data calculation unit that calculates position data of the work machine based on the imaging data of the imaging region, and the position data calculation unit moves a first template with respect to the imaging region and calculates position data of the vehicle body based on a correlation value between the imaging data of the vehicle body and the first template, and then moves a second template with respect to the imaging region and calculates position data of the work machine based on a correlation value between the imaging data of the work machine and the second template.
- 8. An evaluation device comprising: a detection data acquisition unit that acquires, based on operation data of a work machine of a work vehicle, first detection data indicating an excavation amount of the work machine and second detection data indicating an excavation time of the work machine; and an evaluation data generation unit that generates evaluation data of an operator who operates the work machine based on the first detection data and the second detection data.
- 9. The evaluation device according to claim 8, further comprising a target data acquisition unit that acquires target data indicating a target excavation amount of the work machine, wherein the evaluation data generation unit generates the evaluation data based on a difference between the first detection data and the target data.
- 10. The evaluation device according to claim 8 or 9, wherein the detection device includes an imaging device capable of imaging the work vehicle, the operation data includes imaging data of the work machine, the work machine includes a bucket, and the evaluation device comprises an excavation amount calculation unit that performs image processing on the imaging data of the bucket imaged by the imaging device and calculates the excavation amount from an area of excavated material protruding from an opening end portion of the bucket.
- 11. An evaluation method comprising: acquiring detection data including a detected movement trajectory of a predetermined part of a work machine of a work vehicle, based on operation data of the work machine from a movement start position to a movement end position detected by a detection device that detects an operation of the work machine; generating target data including a target movement trajectory of the predetermined part of the work machine; and generating evaluation data of an operator who operates the work machine based on the detection data and the target data.
- 12. An evaluation method comprising: acquiring, based on operation data of a work machine of a work vehicle, first detection data indicating an excavation amount of the work machine and second detection data indicating an excavation time of the work machine; and generating evaluation data of an operator who operates the work machine based on the first detection data and the second detection data.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/128,210 US20170255895A1 (en) | 2016-03-01 | 2016-03-01 | Evaluation device and evaluation method |
JP2016523353A JP6259515B2 (ja) | 2016-03-01 | 2016-03-01 | 評価装置及び評価方法 |
AU2016216347A AU2016216347B2 (en) | 2016-03-01 | 2016-03-01 | Evaluation device and evaluation method |
PCT/JP2016/056290 WO2016125915A1 (ja) | 2016-03-01 | 2016-03-01 | 評価装置及び評価方法 |
CN201680000912.6A CN107343381A (zh) | 2016-03-01 | 2016-03-01 | 评价装置以及评价方法 |
DE112016000019.7T DE112016000019T5 (de) | 2016-03-01 | 2016-03-01 | Bewertungsvorrichtung und Bewertungsverfahren |
KR1020167026005A KR20170102799A (ko) | 2016-03-01 | 2016-03-01 | 평가 장치 및 평가 방법 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/056290 WO2016125915A1 (ja) | 2016-03-01 | 2016-03-01 | 評価装置及び評価方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016125915A1 true WO2016125915A1 (ja) | 2016-08-11 |
Family
ID=56564245
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/056290 WO2016125915A1 (ja) | 2016-03-01 | 2016-03-01 | 評価装置及び評価方法 |
Country Status (7)
Country | Link |
---|---|
US (1) | US20170255895A1 (ja) |
JP (1) | JP6259515B2 (ja) |
KR (1) | KR20170102799A (ja) |
CN (1) | CN107343381A (ja) |
AU (1) | AU2016216347B2 (ja) |
DE (1) | DE112016000019T5 (ja) |
WO (1) | WO2016125915A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018199069A1 (ja) * | 2017-04-28 | 2018-11-01 | 株式会社小松製作所 | 作業機械および作業機械の制御方法 |
CN109034509A (zh) * | 2017-06-08 | 2018-12-18 | 株式会社日立制作所 | 作业人员评价系统、作业人员评价装置及评价方法 |
JPWO2018043299A1 (ja) * | 2016-08-31 | 2019-06-24 | 株式会社小松製作所 | 作業機械の画像表示システム、作業機械の遠隔操作システム、作業機械及び作業機械の画像表示方法 |
JP2019159818A (ja) * | 2018-03-13 | 2019-09-19 | 矢崎総業株式会社 | 作業評価装置、及び、作業評価方法 |
JPWO2020196838A1 (ja) * | 2019-03-27 | 2020-10-01 | ||
WO2020202788A1 (ja) * | 2019-03-29 | 2020-10-08 | コベルコ建機株式会社 | 作業分析方法、作業分析装置及び作業分析プログラム |
JP2021033568A (ja) * | 2019-08-22 | 2021-03-01 | ナブテスコ株式会社 | 情報処理システム、情報処理方法、建設機械 |
US11076130B2 (en) | 2017-07-14 | 2021-07-27 | Komatsu Ltd. | Operation information transmission device, construction management system, operation information transmission method, and program |
WO2022071349A1 (ja) * | 2020-10-02 | 2022-04-07 | コベルコ建機株式会社 | 仕分先特定装置、仕分先特定方法、及びプログラム |
US11619028B2 (en) | 2017-12-11 | 2023-04-04 | Sumitomo Construction Machinery Co., Ltd. | Shovel |
WO2023189216A1 (ja) * | 2022-03-31 | 2023-10-05 | 日立建機株式会社 | 作業支援システム |
JP7540359B2 (ja) | 2021-02-09 | 2024-08-27 | コベルコ建機株式会社 | 操作対象装置 |
JP7540358B2 (ja) | 2021-02-09 | 2024-08-27 | コベルコ建機株式会社 | 操作対象装置 |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102633625B1 (ko) * | 2015-12-28 | 2024-02-02 | 스미토모 겐키 가부시키가이샤 | 쇼벨, 쇼벨용 시스템 및 쇼벨의 제어방법 |
JP2017156972A (ja) * | 2016-03-01 | 2017-09-07 | 株式会社小松製作所 | 評価装置、管理装置、評価システム、及び評価方法 |
CN107407076B (zh) * | 2016-03-11 | 2020-09-22 | 日立建机株式会社 | 工程机械的控制装置 |
KR20170113001A (ko) * | 2016-03-28 | 2017-10-12 | 가부시키가이샤 고마쓰 세이사쿠쇼 | 평가 장치 및 평가 방법 |
JP6697955B2 (ja) * | 2016-05-26 | 2020-05-27 | 株式会社クボタ | 作業車及び作業車に適用される時間ベース管理システム |
EP3571562A4 (en) * | 2017-01-23 | 2020-12-02 | Built Robotics Inc. | EXCAVATION OF SOIL FROM AN EXCAVATION SITE USING AN EXCAVATION VEHICLE |
US10408241B2 (en) | 2017-02-09 | 2019-09-10 | Deere & Company | Method of determining cycle time of an actuator and a system for determining a cycle time of a machine having an actuator |
JP6930337B2 (ja) * | 2017-09-27 | 2021-09-01 | カシオ計算機株式会社 | 電子機器、移動経路記録方法、およびプログラム |
JP7106851B2 (ja) * | 2017-12-12 | 2022-07-27 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及びプログラム |
JP7143634B2 (ja) * | 2018-05-29 | 2022-09-29 | コベルコ建機株式会社 | 技能評価システム及び技能評価方法 |
JP7059845B2 (ja) * | 2018-07-18 | 2022-04-26 | トヨタ自動車株式会社 | 車載装置 |
CN109296024B (zh) * | 2018-11-30 | 2023-04-07 | 徐州市产品质量监督检验中心 | 一种无人挖掘机采装位姿精度检测方法 |
CN109903337B (zh) * | 2019-02-28 | 2022-06-14 | 北京百度网讯科技有限公司 | 用于确定挖掘机的铲斗的位姿的方法和装置 |
JP7293822B2 (ja) * | 2019-04-05 | 2023-06-20 | コベルコ建機株式会社 | 技能評価システム及び技能評価方法 |
JP7302244B2 (ja) * | 2019-04-05 | 2023-07-04 | コベルコ建機株式会社 | スキル情報提示システム及びスキル情報提示方法 |
JP2020170474A (ja) * | 2019-04-05 | 2020-10-15 | コベルコ建機株式会社 | スキル情報提示システム及びスキル情報提示方法 |
WO2020262661A1 (ja) * | 2019-06-27 | 2020-12-30 | 住友重機械工業株式会社 | 作業機械の管理システム、作業機械の管理装置、作業員用端末、施工業者用端末 |
JPWO2021085608A1 (ja) * | 2019-10-31 | 2021-05-06 | ||
JP7490948B2 (ja) * | 2019-11-25 | 2024-05-28 | コベルコ建機株式会社 | 作業支援サーバおよび作業支援システム |
JP7392422B2 (ja) * | 2019-11-25 | 2023-12-06 | コベルコ建機株式会社 | 作業支援サーバおよび作業支援システム |
CN111557642B (zh) * | 2020-03-31 | 2021-05-11 | 广东省国土资源测绘院 | 基于轨迹评估外业作业成效的方法、系统 |
KR20220064599A (ko) * | 2020-11-12 | 2022-05-19 | 주식회사 가린시스템 | 차량의 원격시동장치를 이용한 빅데이터 기반의 능동 서비스 제공방법 및 시스템 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005011058A (ja) * | 2003-06-19 | 2005-01-13 | Hitachi Constr Mach Co Ltd | 作業機械の作業支援・管理システム |
JP2008241300A (ja) * | 2007-03-26 | 2008-10-09 | Komatsu Ltd | 油圧ショベルの作業量計測方法および作業量計測装置 |
JP2009235833A (ja) * | 2008-03-28 | 2009-10-15 | Komatsu Ltd | 建設機械の運転評価システム及び運転評価方法 |
JP2014112329A (ja) * | 2012-12-05 | 2014-06-19 | Kajima Corp | 作業内容分類システム及び作業内容分類方法 |
JP2014148893A (ja) * | 2014-05-30 | 2014-08-21 | Komatsu Ltd | 油圧ショベルの表示システム |
JP2015067990A (ja) * | 2013-09-27 | 2015-04-13 | ダイキン工業株式会社 | 建設機械 |
JP2015132090A (ja) * | 2014-01-10 | 2015-07-23 | キャタピラー エス エー アール エル | 建設機械 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3805504B2 (ja) * | 1997-11-14 | 2006-08-02 | 株式会社トプコン | 測量機の通信システム |
WO2003091856A2 (en) * | 2002-04-26 | 2003-11-06 | Emotion Mobility, Llc | System for vehicle assignment and pickup |
JP2010287069A (ja) * | 2009-06-11 | 2010-12-24 | Caterpillar Sarl | 作業機械管理システムにおける作業機械管理方法 |
JP5337220B2 (ja) * | 2011-09-29 | 2013-11-06 | 株式会社小松製作所 | 作業機械の表示装置および表示装置を搭載した作業機械 |
JP5944805B2 (ja) * | 2012-09-26 | 2016-07-05 | 株式会社クボタ | コンバイン及びコンバインの管理システム |
CN105297817A (zh) * | 2014-07-28 | 2016-02-03 | 西安众智惠泽光电科技有限公司 | 一种对挖掘机进行监控的方法 |
-
2016
- 2016-03-01 CN CN201680000912.6A patent/CN107343381A/zh active Pending
- 2016-03-01 AU AU2016216347A patent/AU2016216347B2/en active Active
- 2016-03-01 DE DE112016000019.7T patent/DE112016000019T5/de active Pending
- 2016-03-01 JP JP2016523353A patent/JP6259515B2/ja active Active
- 2016-03-01 WO PCT/JP2016/056290 patent/WO2016125915A1/ja active Application Filing
- 2016-03-01 KR KR1020167026005A patent/KR20170102799A/ko not_active Application Discontinuation
- 2016-03-01 US US15/128,210 patent/US20170255895A1/en not_active Abandoned
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2018043299A1 (ja) * | 2016-08-31 | 2019-06-24 | 株式会社小松製作所 | 作業機械の画像表示システム、作業機械の遠隔操作システム、作業機械及び作業機械の画像表示方法 |
JP2018188831A (ja) * | 2017-04-28 | 2018-11-29 | 株式会社小松製作所 | 作業機械および作業機械の制御方法 |
US11408146B2 (en) | 2017-04-28 | 2022-08-09 | Komatsu Ltd. | Work machine and method for controlling the same |
WO2018199069A1 (ja) * | 2017-04-28 | 2018-11-01 | 株式会社小松製作所 | 作業機械および作業機械の制御方法 |
CN109034509A (zh) * | 2017-06-08 | 2018-12-18 | 株式会社日立制作所 | 作业人员评价系统、作业人员评价装置及评价方法 |
US11076130B2 (en) | 2017-07-14 | 2021-07-27 | Komatsu Ltd. | Operation information transmission device, construction management system, operation information transmission method, and program |
US11619028B2 (en) | 2017-12-11 | 2023-04-04 | Sumitomo Construction Machinery Co., Ltd. | Shovel |
JP2019159818A (ja) * | 2018-03-13 | 2019-09-19 | 矢崎総業株式会社 | 作業評価装置、及び、作業評価方法 |
JP7439053B2 (ja) | 2019-03-27 | 2024-02-27 | 住友重機械工業株式会社 | ショベル及びショベルの管理装置 |
WO2020196838A1 (ja) * | 2019-03-27 | 2020-10-01 | 住友重機械工業株式会社 | ショベル及びショベルの管理装置 |
JPWO2020196838A1 (ja) * | 2019-03-27 | 2020-10-01 | ||
US11941562B2 (en) | 2019-03-29 | 2024-03-26 | Kobelco Construction Machinery Co., Ltd. | Operation analysis method, operation analysis device, and operation analysis program |
JP2020166462A (ja) * | 2019-03-29 | 2020-10-08 | コベルコ建機株式会社 | 作業分析方法、作業分析装置及びプログラム |
CN113474526A (zh) * | 2019-03-29 | 2021-10-01 | 神钢建机株式会社 | 作业分析方法、作业分析装置及作业分析程序 |
WO2020202788A1 (ja) * | 2019-03-29 | 2020-10-08 | コベルコ建機株式会社 | 作業分析方法、作業分析装置及び作業分析プログラム |
JP7163235B2 (ja) | 2019-03-29 | 2022-10-31 | コベルコ建機株式会社 | 作業分析方法、作業分析装置及びプログラム |
CN113474526B (zh) * | 2019-03-29 | 2023-05-09 | 神钢建机株式会社 | 作业分析方法、作业分析装置及存储介质 |
JP7383255B2 (ja) | 2019-08-22 | 2023-11-20 | ナブテスコ株式会社 | 情報処理システム、情報処理方法、建設機械 |
JP2021033568A (ja) * | 2019-08-22 | 2021-03-01 | ナブテスコ株式会社 | 情報処理システム、情報処理方法、建設機械 |
WO2022071349A1 (ja) * | 2020-10-02 | 2022-04-07 | コベルコ建機株式会社 | 仕分先特定装置、仕分先特定方法、及びプログラム |
US11987307B2 (en) | 2020-10-02 | 2024-05-21 | Kobelco Construction Machinery Co., Ltd. | Sorting destination identification device, sorting destination identification method, and program |
JP7496278B2 (ja) | 2020-10-02 | 2024-06-06 | コベルコ建機株式会社 | 仕分先特定装置、仕分先特定方法及びプログラム |
JP7540359B2 (ja) | 2021-02-09 | 2024-08-27 | コベルコ建機株式会社 | 操作対象装置 |
JP7540358B2 (ja) | 2021-02-09 | 2024-08-27 | コベルコ建機株式会社 | 操作対象装置 |
WO2023189216A1 (ja) * | 2022-03-31 | 2023-10-05 | 日立建機株式会社 | 作業支援システム |
Also Published As
Publication number | Publication date |
---|---|
AU2016216347B2 (en) | 2019-05-23 |
US20170255895A1 (en) | 2017-09-07 |
DE112016000019T5 (de) | 2016-12-01 |
AU2016216347A1 (en) | 2018-02-08 |
KR20170102799A (ko) | 2017-09-12 |
JPWO2016125915A1 (ja) | 2017-04-27 |
CN107343381A (zh) | 2017-11-10 |
JP6259515B2 (ja) | 2018-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6259515B2 (ja) | 評価装置及び評価方法 | |
JP6002873B1 (ja) | 評価装置及び評価方法 | |
KR102102133B1 (ko) | 쇼벨의 화상표시장치 | |
WO2017150298A1 (ja) | 評価装置、管理装置、評価システム、及び評価方法 | |
CN106460373B (zh) | 评价装置 | |
AU2016336318B2 (en) | Construction machine and construction management system | |
JP7151392B2 (ja) | 建設機械の遠隔操作装置 | |
CN111868335B (zh) | 远程操作系统以及主操作装置 | |
JP2018059400A (ja) | 施工管理システム | |
CN114127745A (zh) | 工程机械的作业信息生成系统以及作业信息生成方法 | |
JP2017071914A (ja) | 施工方法、作業機械の制御システム及び作業機械 | |
JP2013133631A (ja) | ショベル | |
CN113785091A (zh) | 施工机械用图像获取装置、信息管理系统、信息终端、施工机械用图像获取程序 | |
JP7092714B2 (ja) | 作業機械の制御装置及び作業機械の制御方法 | |
JP2020045634A (ja) | 作業機械の表示システムおよびその制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2016523353 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16746743 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 20167026005 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15128210 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112016000019 Country of ref document: DE |
|
ENP | Entry into the national phase |
Ref document number: 2016216347 Country of ref document: AU Date of ref document: 20160301 Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16746743 Country of ref document: EP Kind code of ref document: A1 |