WO2023190671A1 - Object Detection System - Google Patents
- Publication number
- WO2023190671A1 (PCT/JP2023/012771)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- object detection
- detection system
- distribution density
- video data
- vehicle body
- Prior art date
Classifications
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/24—Safety devices, e.g. for preventing overload
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
-
- E—FIXED CONSTRUCTIONS
- E02—HYDRAULIC ENGINEERING; FOUNDATIONS; SOIL SHIFTING
- E02F—DREDGING; SOIL-SHIFTING
- E02F9/00—Component parts of dredgers or soil-shifting machines, not restricted to one of the kinds covered by groups E02F3/00 - E02F7/00
- E02F9/26—Indicating devices
- E02F9/261—Surveying the work-site to be treated
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Description
- The present invention relates to an object detection system applied to sites where construction machinery operates.
- Construction machinery is equipped with sensors that detect objects and people around the vehicle body, together with functions that warn the operator and restrict the machine's movement when a person or object is detected close to the vehicle body.
- An object of the present invention is to provide an object detection system for construction machinery that makes it easy to determine where and under what circumstances near-misses have occurred at a construction machinery operation site.
- The object detection system includes a communication unit that receives detection data indicating that an object detection sensor attached to the vehicle body of a construction machine has detected an object, together with position information of the vehicle body; a recording unit that records at least the position information; and a detection result output unit that outputs, to an external monitor, the distribution density of the positions and frequencies at which the object detection sensor detected objects, based on the detection data and the position information.
- Compared with the bare information that the vehicle body detected an object in its surroundings, this configuration makes it possible to display the locations and frequencies of detections schematically and visually on the external monitor. It therefore becomes easy to understand where and under what circumstances near-misses occurred at the site, and easy for the site manager to propose safety improvements.
- FIG. 1 is a diagram showing the structure of a construction machine having a camera and an object detection sensor.
- FIG. 2 is a block diagram showing the functional configuration of a camera and an object detection sensor included in the construction machine.
- FIG. 4 is a flowchart showing the process of saving video data captured by a camera of the construction machine in the recording device.
- FIG. 5 is a flowchart showing the process of transmitting data from the construction machine to the object detection system.
- FIG. 1 shows a construction machine 100, to which the present invention is applied, having a camera and an object detection sensor.
- The construction machine 100 is equipped with an engine 1 as a prime mover and a main pump 2 driven by the engine 1; hydraulic oil delivered by the main pump 2 powers the lower traveling body 3, the upper rotating body 4, and the front device 5, which operate independently.
- The lower traveling body 3 drives and controls a pair of crawler tracks 6 (only one side is shown in FIG. 1) with a traveling hydraulic motor 7.
- The upper rotating body 4 is mounted so as to be able to rotate relative to the lower traveling body 3, and is driven and controlled by the swing hydraulic motor 8.
- The front device 5 is mounted on the upper rotating body 4 and is composed of a boom 9, a boom cylinder 10 for driving the boom 9, an arm 11, an arm cylinder 12 for driving the arm 11, a bucket 13, and a bucket cylinder 14 for driving the bucket 13.
- Each cylinder expands and contracts with hydraulic oil delivered from the main pump 2, driving the upper rotating body 4 about its rotating shaft 15, the boom 9 about its rotating shaft 16, the arm 11, and the bucket 13 about its rotating shaft 17, to perform work such as excavation and land leveling.
- FIG. 2 is a block diagram showing the functional configuration of the camera and object detection sensor mounted on the construction machine 100.
- A plurality of cameras 18 and object detection sensors 19 for monitoring the surroundings are installed on the exterior of the construction machine 100, and a monitor controller 20, a vehicle body controller 21, a monitor 22, and a recording device 23 are installed inside the cab.
- Each controller and the recording device 23 can communicate with each other via an in-vehicle network 24 such as CAN.
- The cameras 18 are connected to the monitor controller 20, and the object detection sensors 19 are connected to the vehicle body controller 21.
- The monitor controller 20 synthesizes the surrounding images input from the cameras 18 and outputs them to the monitor 22, either as separate per-camera views or as a surround view in which the boundaries of adjacent cameras 18 are blended.
- The monitor 22 displays the input composite video and also outputs the displayed video to the recording device 23.
- A communication terminal 25 is connected to the recording device 23. The recording device 23 stores the video shown on the monitor 22 and the vehicle body information flowing through the in-vehicle network 24, and the communication terminal 25 connects to the Internet and uploads this data to the server 26, which makes the contents of the recording device 23 accessible remotely. The communication terminal 25 also acquires position information from a GPS receiver (not shown).
- The vehicle body controller 21 transmits the detection signal of the object detection sensor 19 to the recording device 23 via the in-vehicle network 24.
- Upon receiving the detection signal, the recording device 23 stores video and vehicle body data for a preset period before and after the detection. The vehicle body position acquired by the communication terminal 25 at the time of detection is also saved, linked to the video and vehicle body data.
- The vehicle body data here includes operating information obtained from sensors provided on the construction machine, such as engine rotation speed, input values of various levers, position information, and time.
- The vehicle body controller 21 also performs hydraulic control of the machine and acquires the operation amounts of the swing lever 27 and travel lever 28 operated by the operator, as well as the locked and unlocked states of the gate lock device 29, which permits or prohibits operation of the vehicle body.
- FIG. 3 is a block diagram showing the functional configuration of the object detection system implemented in the server 26. The object detection system is simply referred to as the "system" below.
- The system includes a communication unit 30, a video processing unit 31, a detection result output unit 32, a camera control unit 33, and a recording unit 34.
- The communication unit 30 transmits and receives data to and from the communication terminal 25 on the construction machine 100 side via the network 39.
- The video processing unit 31 processes the video captured by the camera 18 of the construction machine 100 and received by the communication unit 30, stores it as necessary, and passes it to the detection result output unit 32.
- Each time the communication unit 30 receives detection data, the detection result output unit 32 records the detection position of the corresponding object and, from the accumulated positions, calculates a distribution density indicating where and how often objects were detected. Distribution density information indicating these positions and frequencies is output to and displayed on an external monitor 40 connected via the network 39. The display of the distribution density is described in detail later.
- The camera control unit 33 controls activation of the camera 18 of the construction machine 100; specifically, the camera 18 is activated using the release of the gate lock device 29 of the construction machine 100 as a trigger.
- The recording unit 34 stores the data 35 transmitted to the communication unit 30 via the communication terminal 25 of the construction machine 100.
- The received data 35 is saved for each machine (35A, 35B, ...).
- Each data set 35 includes time-series data 37 and constant data 38.
- The time-series data 37 includes recorded video, operation information, detection sensor information, and the like within a predetermined period before and after the point at which the object detection sensor 19 detected an object.
- The predetermined period here can be set, for example, to 30 seconds before and after the detection; it need not be fixed in advance, and can instead be set to any value, such as the interval from when the object detection sensor 19 starts detecting an object until detection ends.
- The constant data 38 includes data such as the time and vehicle body position at the moment the object detection sensor 19 detected the object.
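As a rough sketch, the per-machine record described above (received data 35 holding time-series data 37 and constant data 38) could be modeled as follows; every field name here is an illustrative assumption, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ConstantData:
    # Fixed values captured at the moment of detection (constant data 38)
    detected_at: str   # ISO-8601 timestamp of the detection
    latitude: float    # vehicle body position from the communication terminal
    longitude: float

@dataclass
class TimeSeriesSample:
    # One sample of operation/sensor information within the recording window
    t_offset_s: float  # seconds relative to the detection time (negative = before)
    engine_rpm: float
    lever_inputs: dict  # e.g. {"swing": 0.4, "travel": 0.0}
    sensor_flags: dict  # per-direction detection flags of sensor 19

@dataclass
class ReceivedData:
    # One record 35 stored per machine in the recording unit 34
    machine_id: str  # e.g. "35A"
    video_path: str  # recorded video clip covering the window
    constant: ConstantData
    time_series: List[TimeSeriesSample] = field(default_factory=list)
```

Grouping records by machine in this way is what lets the report screen later aggregate detections per selected vehicle body.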
- FIG. 4 shows a flowchart of the process by which the server 26 saves video data and vehicle body data.
- The camera control unit 33 determines whether the gate lock device 29 of the vehicle body is released (step A3). If it is released, the camera 18 is activated to start shooting and recording (step A4).
- The communication unit 30 determines whether a detection signal from the object detection sensor 19 on the vehicle body has been received from the vehicle body controller 21 (step A5).
- When a detection signal from the object detection sensor 19 is received, the communication unit 30 additionally receives, from the recording device 23, the video data generated by the monitor controller 20 starting from a preset time T1 seconds before the detection (step A6). The vehicle body data at the time the object detection sensor 19 detected the object is also saved.
- The communication unit 30 then waits until a predetermined time T2 seconds has elapsed from the moment the object detection sensor 19 detected the object, after which the recording unit 34 records the video data up to that point (steps A7 and A8).
- After recording the video data in step A8, it is determined whether the power to the vehicle body is turned off (step A9). If the power is not off, the process returns to step A3, and the detection signal reception determination and the data generation and storage are repeated. The determination in step A9 is also performed when the gate lock device 29 is found locked in step A3 and when no detection signal is received in step A5.
- If it is determined in step A9 that the vehicle power is off, the power to the recording device 23 is turned off (step A10), and the control flow ends. Although the vehicle power is off at this time, the recording device 23 is driven by a separate battery power source (not shown) and does not turn off its own power until the last data storage operation is complete.
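The saving flow of steps A3 to A10 can be sketched as an event loop. This is a minimal illustration, not the patented implementation: `vehicle` and `recorder` stand for assumed interfaces to the machine and the recording device 23, and counting frames to approximate the T1/T2 windows is a simplification.

```python
import collections

def recording_loop(vehicle, recorder, t1=10.0, t2=10.0, frame_rate=10):
    """Sketch of the FIG. 4 flow: recording is gated on the gate lock,
    a rolling buffer keeps the last t1 seconds of frames, and on a
    detection signal the clip is extended by t2 seconds and saved."""
    pre_buffer = collections.deque(maxlen=int(t1 * frame_rate))
    while not vehicle.power_off():                    # step A9: stop on power off
        if not vehicle.gate_lock_released():          # step A3: locked -> no recording
            continue
        pre_buffer.append(vehicle.camera_frame())     # step A4: camera running
        if vehicle.detection_signal():                # step A5: sensor 19 fired
            clip = list(pre_buffer)                   # step A6: last t1 seconds
            for _ in range(int(t2 * frame_rate)):     # step A7: t2 more seconds
                clip.append(vehicle.camera_frame())
            recorder.save(clip, vehicle.body_data())  # step A8: clip + body data
    recorder.shutdown()                               # step A10: power down last
```

The rolling `deque(maxlen=...)` is the natural way to keep "the last T1 seconds" without storing the whole stream.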
- FIG. 5 shows a flowchart of the process by which the communication terminal 25 transmits the video data and vehicle body data recorded in the recording device 23 to the server 26. This control is executed in parallel with the data generation flow shown in FIG. 4.
- When the vehicle body is powered on, the communication terminal 25 is powered on (step B1).
- The communication terminal 25 determines whether it can communicate with the server 26 (step B2). If communication is not established, it retries, and no further control is performed until communication is established.
- The communication terminal 25 refers to the recording device 23 and determines whether unsent data exists in it (step B3). If there is no unsent data, the process returns to step B2.
- It is then determined whether there is one piece of unsent data or more than one (step B4). If only one piece is stored, that data is sent to the server 26 (step B5). If two or more pieces are stored, the oldest of them is sent to the server 26 (step B6). This is because, as described later, saved video data is displayed in the video list in chronological order; if the oldest data were not sent first, the order of the videos in the list would be disturbed, with the risk of complicating their handling.
- After the transmission processing of step B5 or B6, it is determined whether the transmission has been reliably completed (step B7).
- It is also determined whether the communication terminal 25 can still communicate with the server 26 (step B8). If communication is possible, the process returns to step B7 and waits until the data transmission completes; if not, the process returns to step B2 and waits until communication becomes possible.
- If it is determined in step B7 that the data transmission has completed, it is determined whether the power of the vehicle body is turned off (step B9). If the power is not off, the process returns to step B2, and the checking and sending of unsent data is repeated.
- If the vehicle power is off, the power of the communication terminal 25 is turned off (step B10), and the control flow ends. Although the vehicle power is off at this time, the communication terminal 25 is powered by a separate battery power source (not shown) and does not turn off its own power until the last data transmission is complete.
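The oldest-first policy of steps B2 to B10 can be sketched as a simple retry loop. The `terminal`, `recorder`, and `server` interfaces are illustrative assumptions, not part of the patent:

```python
def transmit_loop(terminal, recorder, server):
    """Sketch of the FIG. 5 flow: unsent records are sent oldest-first so
    the server-side video list stays in chronological order, and a record
    is marked sent only after the server confirms the transfer."""
    while not terminal.power_off():                      # step B9: stop on power off
        if not server.reachable():                       # steps B2/B8: retry later
            continue
        unsent = recorder.unsent()                       # step B3: anything pending?
        if not unsent:
            continue
        oldest = min(unsent, key=lambda d: d.timestamp)  # steps B4-B6: oldest first
        if server.send(oldest):                          # steps B5/B6: transmit
            recorder.mark_sent(oldest)                   # step B7: confirmed complete
    terminal.shutdown()                                  # step B10: power down last
```

Marking a record sent only after confirmation is what makes the interrupted-connection cases (steps B7/B8) safe: an unconfirmed record simply stays in the unsent set and is retried.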
- FIG. 6 is an example of a report screen displayed by the system on the external monitor.
- The report screen shown in FIG. 6 is output by the detection result output unit 32 of the system to the external monitor 40, based on the vehicle body data the communication unit 30 received from the communication terminal 25 of the construction machine 100, and is displayed on the screen of the external monitor 40.
- On the report screen, an operating machine list window 41, a distribution density window 43, a detection time window 45, a detection date window 46, and a detection direction window 47 are displayed.
- The operating machine list window 41 displays the list of machine type information 36 explained with FIG. 3.
- A changeover switch 42 is assigned to each vehicle; the vehicle bodies for which the changeover switch 42 is enabled become the targets of aggregation, and their received data 35 are referenced from the recording unit 34.
- The detection result output unit 32 aggregates, from the received data 35 within the aggregation target period, the vehicle body positions at which the object detection sensor 19 detected objects during that period, and draws a distribution density 44 on the map of the distribution density window 43, with the color becoming darker in proportion to the density of detection locations.
- Link information to the video data shot within the area represented by the distribution density 44 is attached to the distribution density 44. The distribution density 44 and the videos in that range are thus linked and, as described later, clicking on any part of the distribution density 44 brings up the videos for the range (area) covered by that distribution density 44.
- The density of detection locations here indicates how frequently objects were detected at each location within the aggregation target period.
- The distribution density can be generated, for example, by assigning a fixed value to every point within a predetermined radius around each detected position and, after aggregating all the data, determining the color density from the sum of the values assigned to each position.
- The distribution density 44 may be displayed in any manner that lets the user intuitively grasp its magnitude, such as the shade of a color, the size of a figure, or a combination of both.
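The radius-based accumulation just described can be sketched as follows. The grid representation, radius units, and function shape are illustrative assumptions; the patent only specifies the idea of summing a fixed value per detection within a radius:

```python
def distribution_density(detections, width, height, radius, value=1.0):
    """Sum a fixed `value` into every grid cell within `radius` of each
    detected position; the per-cell totals then drive the display color
    (darker = denser), as described for distribution density 44."""
    r2 = radius * radius
    density = [[0.0] * width for _ in range(height)]
    for (dx, dy) in detections:  # detection positions in grid coordinates
        for y in range(height):
            for x in range(width):
                if (x - dx) ** 2 + (y - dy) ** 2 <= r2:
                    density[y][x] += value
    return density
```

Overlapping radii add up, so clusters of near-misses naturally darken faster than isolated detections, which is exactly the visual cue the report screen relies on.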
- The detection time window 45 refers to the times at which objects were detected during the aggregation target period and displays the ratio of detections for each time period.
- The detection date window 46 refers to the times at which objects were detected during the aggregation target period and displays the number of detections per day together with a breakdown of the operations being performed at the time of detection.
- The detection direction window 47 refers to the directions of the object detection sensors 19 that transmitted detection signals to the recording device 23 during the aggregation target period and displays the ratio of detections for each direction. The time at which the recording device 23 received the detection signal may be used as the reference instead.
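The aggregation behind the detection time window 45 and detection direction window 47 is essentially counting detections by hour of day and by sensor direction and normalizing to ratios. The event dictionary format below is an assumption for illustration:

```python
from collections import Counter
from datetime import datetime

def report_breakdowns(events):
    """Share of detections per hour of day (window 45) and per sensor
    direction (window 47) over the aggregation target period."""
    by_hour = Counter(datetime.fromisoformat(e["time"]).hour for e in events)
    by_dir = Counter(e["direction"] for e in events)
    total = len(events)
    hour_ratio = {h: n / total for h, n in by_hour.items()}
    dir_ratio = {d: n / total for d, n in by_dir.items()}
    return hour_ratio, dir_ratio
```

The same `Counter` pattern extends to the per-day counts and operation breakdown of the detection date window 46.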
- FIG. 7 is a configuration diagram of the video list screen linked from the distribution density 44.
- FIG. 8 is a configuration diagram of the video playback screen linked from the video list screen of FIG. 7.
- The information listed in the video list is displayed as a video title 53 at the top of the video playback screen.
- A site map similar to the distribution density window 43 is displayed in the map window 54, and a machine icon 55 is displayed at the point referenced from the vehicle body position data.
- The recorded video is displayed in the video window 56.
- The video can be played and stopped with the play button 57 and stop button 58, and the playback position can be adjusted with the seek bar 59.
- In the operation window 62, operation information such as the engine speed, traveling operation (travel lever operation), turning operation (swing lever operation), and the unlocking and locking of the gate lock device is displayed as a graph.
- The graph is synchronized with the video data, and the playback position 63 on the graph slides in accordance with the position of the seek bar 59.
- In FIG. 8, the video corresponding to the "rear approach warning" entry of FIG. 7 is being played, and a tree 60, the detected object, is surrounded by a detection target recognition frame 61.
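Keeping the playback position 63 on the operation graph in step with the seek bar 59 amounts to mapping the video timestamp onto the index of the operation-data samples. This helper is an illustrative sketch; the sample-period parameterization is an assumption:

```python
def graph_index_for_seek(seek_s, sample_period_s, n_samples):
    """Map a seek-bar position (seconds into the clip) to the index of the
    corresponding operation-data sample, clamped to the recorded range so
    the marker never runs off either end of the graph."""
    idx = int(seek_s / sample_period_s)
    return max(0, min(n_samples - 1, idx))
```

Because the video clip and the time-series data cover the same T1/T2 window around the detection, a single linear mapping like this is enough to keep them synchronized.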
- As explained above, the object detection system has a communication unit that receives video data captured by a camera attached to the vehicle body of a construction machine, detection data indicating that an object detection sensor attached to the vehicle body has detected an object, and position information of the vehicle body; a recording unit that records at least the video data and the position information; and a detection result output unit that outputs, to an external monitor, the distribution density of the positions and frequencies at which the object detection sensor detected objects, based on the detection data and the position information.
- Compared with the bare information that the vehicle body detected a surrounding object, this configuration makes it possible to display the detected positions and their frequency distribution density schematically and visually. It therefore becomes easy to understand where and under what circumstances near-misses occurred at the site, and easy for the site manager to propose safety improvements.
- The detection result output unit displays on the external monitor, in addition to the distribution density of positions and frequencies, the time-of-day distribution of detections, the dates on which the object detection sensor detected objects, the type of vehicle body operation at the time of detection, and the directions in which the object detection sensor detected objects. This allows more diverse information to be shown on the external monitor, letting site managers take safety measures from various perspectives.
- The detection result output unit generates a position and frequency distribution density for each of a plurality of vehicle bodies, displays a list of the plurality of vehicle bodies on the external monitor, and allows selection of the vehicle bodies whose distribution density is displayed. This makes it possible to collect data from a plurality of operating machines and improves the accuracy of the calculated distribution density.
- The detection result output unit calculates the position and frequency distribution density for an arbitrary period. This allows flexible responses, such as shortening the aggregation period to obtain short-term information for a period in which many near-misses occurred, or lengthening it to obtain long-term information for a period in which near-misses were few.
- The system further includes a video processing unit that processes the video data. The video processing unit stores the video data received by the communication unit within a predetermined time before and after the object detection sensor detected the object, associated with the point on the position and frequency distribution density display at which the object detection sensor detected the object.
- The system further includes a camera control unit that activates the camera when the gate lock of the construction machine is released. This makes it possible to reliably operate the camera while the construction machine is in operation.
- The present invention is not limited to the embodiments described above, and various modifications are possible.
- The embodiments above have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to embodiments having all of the described configurations.
- The present invention is used at sites where construction machinery equipped with the object detection system operates.
Claims (9)
1. An object detection system comprising: a communication unit that receives detection data indicating that an object detection sensor attached to the vehicle body of a construction machine has detected an object, and position information of the vehicle body; a recording unit that records at least the position information; and a detection result output unit that outputs, based on the detection data and the position information, the distribution density of the positions and frequencies at which the object detection sensor detected the object to an external monitor.
2. The object detection system according to claim 1, wherein the communication unit receives video data captured by a camera attached to the vehicle body of the construction machine, the recording unit records the video data, and the detection result output unit outputs the video data to the external monitor linked to the distribution density of the positions and frequencies.
3. The object detection system according to claim 1, wherein the detection result output unit displays, on the external monitor, the distribution of time periods in which the object detection sensor detected objects, together with the distribution density of the positions and frequencies.
4. The object detection system according to claim 1, wherein the detection result output unit further displays, on the external monitor, the date on which the object detection sensor detected an object and the type of operation of the vehicle body.
5. The object detection system according to claim 1, wherein the detection result output unit further displays, on the external monitor, a breakdown of the directions in which the object detection sensor detected objects.
6. The object detection system according to claim 1, wherein the detection result output unit generates the distribution density of the positions and frequencies for each of a plurality of the vehicle bodies, displays a list of the plurality of vehicle bodies on the external monitor, and makes selectable the vehicle body whose distribution density of the positions and frequencies is displayed on the external monitor.
7. The object detection system according to claim 1, wherein the detection result output unit generates the distribution density of the positions and frequencies for an arbitrary period.
8. The object detection system according to claim 1, wherein the communication unit receives video data captured by a camera attached to the vehicle body of the construction machine, the recording unit records the video data, the system further comprises a video processing unit that processes the video data, and the video processing unit stores the video data received by the communication unit within a predetermined time before and after the time when the object detection sensor detected the object, in association with the point, on the display of the distribution density of the positions and frequencies, at which the object detection sensor detected the object.
9. The object detection system according to claim 1, wherein the communication unit receives video data captured by a camera attached to the vehicle body of the construction machine, the recording unit records the video data, and the system further comprises a camera control unit that activates the camera when a gate lock of the construction machine is released.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP23780680.7A EP4450332A1 (en) | 2022-03-30 | 2023-03-29 | Object detection system |
CN202380018095.7A CN118574969A (zh) | 2022-03-30 | 2023-03-29 | Object detection system |
KR1020247024197A KR20240121329A (ko) | 2022-03-30 | 2023-03-29 | Object detection system |
JP2024512668A JPWO2023190671A5 (ja) | | 2023-03-29 | Object detection system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022-055797 | 2022-03-30 | ||
JP2022055797 | 2022-03-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023190671A1 (ja) | 2023-10-05 |
Family
ID=88202050
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/012771 WO2023190671A1 (ja) | Object detection system |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4450332A1 (ja) |
KR (1) | KR20240121329A (ja) |
CN (1) | CN118574969A (ja) |
WO (1) | WO2023190671A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017201114A (ja) * | 2016-04-28 | 2017-11-09 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP2018141314A (ja) * | 2017-02-28 | 2018-09-13 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP2019175096A (ja) * | 2018-03-28 | 2019-10-10 | Kobelco Construction Machinery Co., Ltd. | Work information management system for construction machine |
- 2023
  - 2023-03-29 CN CN202380018095.7A patent/CN118574969A/zh active Pending
  - 2023-03-29 WO PCT/JP2023/012771 patent/WO2023190671A1/ja active Application Filing
  - 2023-03-29 EP EP23780680.7A patent/EP4450332A1/en active Pending
  - 2023-03-29 KR KR1020247024197A patent/KR20240121329A/ko status unknown
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017201114A (ja) * | 2016-04-28 | 2017-11-09 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP2019056301A (ja) * | 2016-04-28 | 2019-04-11 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP2018141314A (ja) * | 2017-02-28 | 2018-09-13 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP6805883B2 (ja) | 2020-12-23 | Kobelco Construction Machinery Co., Ltd. | Construction machine |
JP2019175096A (ja) * | 2018-03-28 | 2019-10-10 | Kobelco Construction Machinery Co., Ltd. | Work information management system for construction machine |
Also Published As
Publication number | Publication date |
---|---|
CN118574969A (zh) | 2024-08-30 |
KR20240121329A (ko) | 2024-08-08 |
JPWO2023190671A1 (ja) | 2023-10-05 |
EP4450332A1 (en) | 2024-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11623517B2 (en) | Vehicle exception event management systems | |
KR102454612B1 | Safety management system and management device for construction machinery | |
US8139820B2 (en) | Discretization facilities for vehicle event data recorders | |
JP4847913B2 | Work machine surroundings monitoring device | |
US20080147267A1 (en) | Methods of Discretizing data captured at event data recorders | |
CN114729522B | Excavator management system, portable terminal for excavator, and program for portable terminal for excavator | |
JP6373393B2 | Work vehicle, remote diagnosis system, and remote diagnosis method | |
US20150195483A1 (en) | Event recorder playback with integrated gps mapping | |
WO2023190671A1 (ja) | Object detection system | |
CN112053464A | Remote fault diagnosis method and system for a roadheader | |
JP7073146B2 | Construction machine, display device for construction machine, and management device for construction machine | |
KR101725760B1 | Traffic violation reporting system using a vehicle black box, and method therefor | |
KR101391909B1 | Method for linking video data of a vehicle black box with GIS services | |
JP2021103840A | Work support server and method for selecting an imaging device | |
GB2573509A (en) | Method and system for providing display redundancy on a machine | |
CN209946960U (zh) | 车辆监控系统和车辆 | |
JP2024139044A | Video recording system and work machine | |
WO2024202693A1 | Operation history management system | |
WO2024070448A1 | Video recording system | |
KR200387268Y1 | Elevator operation monitoring system | |
WO2023190047A1 | Video recording device | |
JP2023175157A | Image display system, image display method, and image display program for a work machine | |
JP2022155699A | Vehicle body information collection system | |
US20230150358A1 (en) | Collision avoidance system and method for avoiding collision of work machine with obstacles | |
JP7580184B2 | Work machine, information management system, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23780680; Country of ref document: EP; Kind code of ref document: A1 |
 | DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | |
 | WWE | Wipo information: entry into national phase | Ref document number: 2024512668; Country of ref document: JP |
 | ENP | Entry into the national phase | Ref document number: 20247024197; Country of ref document: KR; Kind code of ref document: A |
 | WWE | Wipo information: entry into national phase | Ref document number: 202380018095.7; Country of ref document: CN |
 | ENP | Entry into the national phase | Ref document number: 2023780680; Country of ref document: EP; Effective date: 20240716 |