
KR101656642B1 - Group action analysis method by image - Google Patents

Group action analysis method by image

Info

Publication number
KR101656642B1
KR101656642B1 (Application KR1020160028943A)
Authority
KR
South Korea
Prior art keywords
image
space
unit
control server
camera
Prior art date
Application number
KR1020160028943A
Other languages
Korean (ko)
Inventor
강성진
Original Assignee
(주)디지탈라인
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by (주)디지탈라인 filed Critical (주)디지탈라인
Priority to KR1020160028943A (Critical)
Application granted granted Critical
Publication of KR101656642B1 (Critical)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/14Central alarm receiver or annunciator arrangements
    • H04N5/232
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Tourism & Hospitality (AREA)
  • Emergency Management (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Alarm Systems (AREA)

Abstract

A method for analyzing group behavior using images is disclosed. The group behavior analysis method of the present invention comprises the steps of: separating the output image of an image detection unit into a fixed background image and moving objects, and matching it against the continuously generated moving-object images; transmitting the image data for the confirmation-required space and the X and Y coordinate values of the specific location so that the corresponding image is popped up (POP-UP); and displaying enlarged images of the area. Thus, an incident or accident occurring in a place such as a crowded area can be handled promptly using CCTV images in real time.

Description

GROUP ACTION ANALYSIS METHOD BY IMAGE

The present invention relates to a group behavior analysis method, and more particularly to a method for analyzing group behavior from images in which the fixed background image is first separated and stored, real-time pattern classification and analysis is performed on the moving subjects, and a space that requires confirmation can be identified quickly.

A security system generally consists of fixed cameras, which maintain a constant angle of view, and PTZ (Pan-Tilt-Zoom) rotating cameras; when a suspicious condition is noticed, the PTZ rotating camera is used to examine the scene more closely.

Thanks to many technological advances, systems exist in which, when a person or a vehicle is detected by analyzing the image from a fixed camera, a PTZ rotating camera automatically tracks it. Such tracking is most effective in open areas such as a large playground; in crowded areas such as ordinary sidewalks, subway platforms, stairways, and intersections, however, an automatic tracking system cannot be installed and operated because too many people and vehicles are moving at once.

When a group fight or accident occurs at a certain place, when a crowd is threatened by a criminal, or when there is danger from explosives or the like, the usual response is for the incident to be reported and for personnel to be dispatched to the site. Early detection and rapid response are important, yet relying on manpower alone has clear limits.

In particular, most public places are exposed to incidents caused by terrorism and by antisocial groups, and when an incident does occur, a delayed response increases the damage.

KR Patent Registration No. 10-0767065 (October 10, 2007)

In order to solve this problem, it is an object of the present invention to provide a group behavior analysis method that analyzes CCTV images of a crowded area, detects movement patterns that differ from the usual pattern, and automatically identifies the space requiring confirmation.

It is another object of the present invention to provide a method for analyzing group behavior using images that, when an incident or accident occurs, analyzes movement differing from the usual pattern in the CCTV image of a crowded area and pops up an enlarged image captured by a nearby PTZ rotating camera.

To achieve these objects, the group behavior analysis method of the present invention uses an image analysis module that analyzes group behavior from the output image of an image detection unit consisting of an object detection camera for capturing images and an object tracking camera having a tracking function, and comprises the steps of:

(a) receiving, in the subject generating unit of the image analysis module, a real-time image from the object detection camera; storing the fixed background in the background image storage unit as background image data, obtained either through pattern learning for a predetermined time or from coordinates the operator designates directly on the image through the control server; analyzing the image received from the object detection camera against the fixed background image data in units of 1/30 second and matching it with the continuously generated moving-object data of the 2/30 second image; and, when a change is detected, determining the changed region to be an object to be analyzed;

(b) individually assigning a color and a number to each object to be analyzed in the direction analyzing unit of the image analysis module; analyzing the motion of each object in X and Y coordinates based on the stored background image data and, when motion differing from the direction of the learned pattern is detected, transmitting the information to the behavior analysis processing unit; and calculating, in the moving speed calculation unit of the image analysis module, the moving speed of each object to which a color and a number have been assigned;

(c) when the information received from the direction analyzing unit and the moving speed calculation unit reaches or exceeds a preset count, calculating the coordinates in the behavior analysis processing unit and transmitting a signal to the alarm generating unit identifying the area as a confirmation-required space. In this step, the direction and speed analyzed in step (b) are matched against the values of the normal environment; when the number of objects avoiding a specific place exceeds the set value, or the number of objects showing no movement at a specific place during the set time exceeds the set value, it is judged that an incident or accident has occurred and that the place is being avoided or surrounded by the group. Likewise, when no object passes through a specific space for a predetermined time, or an object shows no movement, or an object moves more than 30% faster than the speed of the learned pattern, the space is determined to be a confirmation-required space. The operator may also set a confirmation-required space in advance so that it is monitored intensively, or designate a space as not requiring confirmation, for example because of previously approved construction;

(d) transmitting the image data for the confirmation-required space and the X, Y coordinate values of the specific location to the alarm generating unit of the image analysis module; and

(e) receiving, in the alarm generating unit, the image data and X, Y coordinates transmitted in step (d); popping up (POP-UP) the corresponding image on the control server; transmitting the received X, Y coordinates to a nearby object tracking camera so that an enlarged image is captured; receiving the enlarged image, displaying it on the control server, and sending an alarm signal.

In step (a), the control server may additionally designate dangerous spaces on the background image, such as a road reserved for automobiles, a construction section, or a falling-rock danger zone. When a moving object enters a dangerous space, or stays in a dangerous space for a certain time, an alarm is generated on the control server immediately, the X and Y coordinates of the dangerous space are transmitted to a nearby object tracking camera, and the camera is controlled to capture an enlarged, zoomed image and display it on the control server.

Therefore, according to the group behavior analysis method using images of the present invention, incidents and accidents occurring in crowded areas can be grasped accurately from CCTV images in real time and dealt with quickly, and because the alarm function operates automatically, the system can be managed efficiently and valuable lives and property can be protected.

FIG. 1 is a schematic diagram of a system for implementing a group behavior analysis method using CCTV images;
FIG. 2 is a detailed configuration diagram of an image analysis module according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a method of automatically detecting an incident or the like by acquiring and analyzing an image according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a normal environment in a densely populated place according to an embodiment of the present invention; and
FIG. 5 is a diagram illustrating an environment in which a confirmation-required alarm is generated in a densely populated space according to an embodiment of the present invention.

The words and terms used in the present specification and claims are not to be construed in a conventional or dictionary sense; since the inventor may properly define the concept of a term in order to describe the invention in the best possible way, they should be construed in light of the meanings and concepts consistent with the technical idea of the present invention.

Throughout the specification, when an element is said to "comprise" a component, this means that it can also include other components, not that other components are excluded, unless specifically stated otherwise. Terms such as "unit" and "module" used herein refer to a unit that processes at least one function or operation, which may be implemented in hardware, in software, or in a combination of the two.

The term "and/or" as used throughout the specification includes any and all combinations of one or more of the associated listed items. For example, "a first item, a second item and/or a third item" means not only the first, second, or third item individually, but also any combination of two or more of the first, second, and third items.

Identification codes (e.g., a, b, c, ...) attached to the steps throughout the specification are used for convenience of description and do not limit the order of the steps; unless the context clearly dictates a particular order, the steps may be performed in an order different from the stated one. That is, the steps may be performed in the order described, substantially concurrently, or in reverse order.

Hereinafter, an embodiment of the present invention will be described with reference to the drawings.

The present invention separates the fixed background from subjects having continuous motion by receiving the scene image and either learning its pattern over a predetermined time or having the operator designate it directly on the image. The fixed background image is stored, the images having continuous motion are extracted separately, and the two are repeatedly matched in units of 1/25 second. The direction and the moving speed of each detected subject are then calculated to analyze the group behavior of the subject or subjects; the system comprises an image analysis module 130 that generates an alarm when its behavior analysis processing unit 136 recognizes a confirmation-required space, and a control server 140 that can remotely control and manage the system.

FIG. 1 is a block diagram of a system for implementing the group behavior analysis method using CCTV images. As shown in FIG. 1, the group behavior analysis system of the present invention includes a CCTV image detection unit 110 that is installed in the field and acquires images; an image analysis module 130 that receives the images and X, Y coordinate values from the CCTV image detection unit 110, distinguishes fixed background features such as buildings, roads, stairways, trees, and lighting facilities from objects having continuous motion, analyzes group behavior, and automatically generates an alarm; a network 120 that connects the CCTV image detection unit 110 and the image analysis module 130 by wire or wirelessly; and a control server 140 that can remotely control and manage the system.

The image detection unit 110 includes an object detection camera 111 installed at the scene to detect moving objects, and an object tracking camera 112 having pan, tilt, and zoom-in/zoom-out functions.

The object detecting camera 111 comprises at least one camera for capturing a fixed image in a fixed direction and outputting image data.

The object tracking camera 112 can be driven by a pan, tilt, zoom (PTZ), and is configured to automatically track an object to perform close-up and enlarged photographing.

The object detection camera 111 and the object tracking camera 112 may each be a digital video camera, a color camera, a monochrome camera, a closed-circuit television (CCTV) camera, a pan-tilt-zoom (PTZ) camera, a camcorder, a PC camera, a web camera, an infrared (IR) video camera, a low-light video camera, a thermal video camera, or another video sensing device.

In addition, the image analysis module 130 sets the background image through pattern learning on the CCTV images captured at the scene by the object detection camera 111, detects each moving object individually, and analyzes its direction, speed, and so on. When behavior such as staying at a specific place for a certain period of time, or moving away to avoid a specific place, is detected, an alarm is generated on the control server 140 operated at the operator's terminal so that the object can be confirmed promptly; coordinate values are also commanded to the object tracking camera 112 so that the exact situation can be grasped, and the enlarged image is popped up (POP-UP) so that group behavior analysis can be performed.
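
For illustration only, the following Python/OpenCV sketch shows that overall loop in miniature: learn a background, pick out moving regions, and report a candidate confirmation-required place when enough of them cluster. The use of OpenCV's MOG2 background subtractor, the thresholds, and the stream source are assumptions standing in for the patent's pattern-learned background and its unit structure, not the patented implementation.

    # Compact end-to-end sketch of the analysis loop (illustrative only).
    import cv2

    def monitor(source, min_area=500, alarm_count=3):
        cap = cv2.VideoCapture(source)
        backsub = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=False)
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            mask = backsub.apply(frame)                       # background vs. motion
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            moving = [cv2.boundingRect(c) for c in contours
                      if cv2.contourArea(c) >= min_area]
            if len(moving) >= alarm_count:                    # crude stand-in for the
                xs = [x + w // 2 for x, y, w, h in moving]    # direction/speed analysis
                ys = [y + h // 2 for x, y, w, h in moving]
                cx, cy = sum(xs) // len(xs), sum(ys) // len(ys)
                print(f"confirmation-required space near ({cx}, {cy})")
        cap.release()

    # monitor("rtsp://camera/stream")   # hypothetical fixed-camera stream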

Referring to FIG. 2, the image analysis module 130 includes:

  • a subject generating unit 131 that generates an analysis target image from the image received from the object detection camera 111;
  • an image separating unit 132 that separates the analysis target image into a fixed background image and moving-object images, and stores the fixed image separately in a background image storage unit 133a;
  • a background image matching unit 133 that matches the stored background image data against the subsequent object image data having motion;
  • a direction analyzing unit 134 that analyzes the direction of each subject having continuous motion and transmits the data to the behavior analysis processing unit 136;
  • a moving speed analyzing unit 135 that calculates the speed of each object and delivers the individual moving speed to the behavior analysis processing unit 136;
  • a behavior analysis processing unit 136 that matches the received direction and speed of each individual object against fixed values or values learned in advance for the normal environment; when the number of objects avoiding a specific place exceeds the set count, or the number of objects without motion at a specific place during the set time exceeds the set value, it judges that an incident or accident has occurred and that the place is being avoided or surrounded by the group, recognizes it as a confirmation-required space, and transmits the image data and the X, Y coordinate values of the specific location to the alarm generating unit 137;
  • an alarm generating unit 137 that receives the alarm data for the confirmation-required space, pops up the corresponding image for the operator, transmits the coordinates to the nearby object tracking camera 112 over a wired link such as RS-422, RS-232C, or hardwire, or a wireless link such as Zigbee, Wifi, or Lora, and delivers the enlarged image to the operator; and
  • a command input unit 138 that allows the operator to set a confirmation-required space in advance so that it is monitored intensively, or to designate a specific space as not subject to analysis for reasons such as previously approved construction.

Although the enlarged image could be obtained by a direct close-up, it is preferable to first photograph one third of the entire image at a rate of 75 frames over 3 seconds, set the points excluding the lower 43% of the image as the reference points of the X and Y coordinates, and then magnify the image at a rate of 50 frames over 2 seconds.

The subject generating unit 131 receives a real-time image from the fixed-angle object detection camera 111 installed in the field and generates an analysis target image either by learning the pattern for a set time or from coordinates that the operator designates directly through the control server 140.

Specifically, the image received from the on-site object detection camera 111 is analyzed in units of 1/30 second and matched against the 2/30 second image; when a change is detected, the changed image is set apart as an analysis target image and transmitted to the image separating unit 132.
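
As a concrete illustration of this frame-matching step, the following Python/OpenCV sketch flags the regions that changed between two consecutive frames (1/30 second apart at 30 fps). The difference threshold, minimum region size, and camera stream URL are illustrative assumptions, not values specified in the patent.

    import cv2

    def changed_regions(prev_frame, curr_frame, min_area=500, diff_thresh=25):
        """Return bounding boxes of regions that changed between two frames."""
        prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
        curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(prev_gray, curr_gray)
        _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(mask, None, iterations=2)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours
                if cv2.contourArea(c) >= min_area]

    cap = cv2.VideoCapture("rtsp://camera/stream")   # hypothetical camera URL
    ok, prev = cap.read()
    while ok:
        ok, curr = cap.read()
        if not ok:
            break
        if changed_regions(prev, curr):
            pass    # a change was detected: 'curr' becomes an analysis target image
        prev = curr
    cap.release()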

The image separating unit 132 separates the analysis target image transmitted from the subject generating unit 131 into a fixed background and moving objects, and the moving objects are individually distinguished by color and number.

That is, the image separating unit 132 sends the fixed background image extracted from the analysis target image output by the subject generating unit 131 to the background image storage unit 133a for storage, and transmits the images having motion to the background image matching unit 133.

The background image matching unit 133 successively compares the background image data stored in the background image storage unit 133a with the subsequent image data of objects having motion, and transmits the moving-object image data to the direction analyzing unit 134.

The direction analyzing unit 134 individually assigns a color and a number to each moving object, analyzes the motion of each object in X and Y coordinates based on the background image data stored in the background image storage unit 133a, and transmits the information to the behavior analysis processing unit 136 when motion differing from the direction of the learned pattern is detected.
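
A minimal sketch of such a direction check is given below: each tracked object keeps an identifier (the "color and number" above) and its centroid history in X, Y coordinates, and an object whose dominant heading differs from the learned flow direction is reported. The TrackedObject class, the 90-degree tolerance, and the sample coordinates are illustrative assumptions.

    import math

    class TrackedObject:
        def __init__(self, obj_id, x, y):
            self.obj_id = obj_id
            self.path = [(x, y)]

        def update(self, x, y):
            self.path.append((x, y))

        def heading(self):
            """Angle (radians) from the first to the last observed position."""
            (x0, y0), (x1, y1) = self.path[0], self.path[-1]
            return math.atan2(y1 - y0, x1 - x0)

    def deviates(obj, learned_heading, tolerance=math.pi / 2):
        """True if the object's heading differs from the learned flow direction."""
        if len(obj.path) < 2:
            return False
        delta = abs(obj.heading() - learned_heading)
        delta = min(delta, 2 * math.pi - delta)    # wrap-around
        return delta > tolerance

    # usage: the learned pattern says subjects normally move in the +X direction
    learned = 0.0
    a = TrackedObject(1, 10, 50); a.update(40, 52); a.update(70, 51)   # with the flow
    b = TrackedObject(2, 90, 50); b.update(60, 48); b.update(20, 46)   # against the flow
    print(deviates(a, learned), deviates(b, learned))                  # False True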

The moving speed calculation unit 135 calculates the moving speed of each object to which a color and a number have been assigned, and transmits the information to the behavior analysis processing unit 136 when the object shows no motion for a preset time or moves more than 30% faster than the speed of the learned pattern.
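
The speed rule can be sketched as follows. The 30% margin comes from the description above; the pixel-per-second units, the 2-pixel stillness threshold, and the function name are illustrative assumptions.

    def flag_by_speed(path, timestamps, learned_speed, still_time=10.0):
        """path: list of (x, y) positions; timestamps: matching times in seconds."""
        if len(path) < 2:
            return None
        (x0, y0), (x1, y1) = path[-2], path[-1]
        dt = timestamps[-1] - timestamps[-2]
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speed = dist / dt if dt > 0 else 0.0
        if speed > 1.3 * learned_speed:            # more than 30% faster than the learned pattern
            return "too_fast"
        elapsed = timestamps[-1] - timestamps[0]
        total_dist = ((path[-1][0] - path[0][0]) ** 2 +
                      (path[-1][1] - path[0][1]) ** 2) ** 0.5
        if total_dist < 2.0 and elapsed >= still_time:   # effectively stationary
            return "no_motion"
        return None

    # usage
    print(flag_by_speed([(0, 0), (50, 0)], [0.0, 1.0], learned_speed=30))   # too_fast
    print(flag_by_speed([(5, 5), (6, 5)], [0.0, 12.0], learned_speed=30))   # no_motion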

When the information received from the direction analyzing unit 134 and the moving speed calculation unit 135 reaches or exceeds a preset count, the behavior analysis processing unit 136 calculates the coordinates, outputs them to the alarm generating unit 137, and causes the corresponding image to pop up (POP-UP) on the operator's control server 140.

For the confirmation-required place signal received from the behavior analysis processing unit 136, the alarm generating unit 137 sends the coordinate value of the place to the neighboring object tracking camera 112, delivers the enlarged image to the control server 140, and, if necessary, sends a warning signal to a physical device such as a warning light.
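
The hand-off can be sketched as below. PtzCamera and OperatorConsole are hypothetical stand-ins: the patent specifies only the transports (RS-422, RS-232C, hardwire, Zigbee, Wifi, Lora) and not a camera control API, so the calls here are placeholders rather than a real protocol.

    class PtzCamera:
        def __init__(self, name):
            self.name = name

        def point_at(self, x, y, zoom=2.0):
            # in a real system this would issue a PTZ command over the chosen link
            print(f"[{self.name}] pan/tilt to ({x}, {y}), zoom x{zoom}")

    class OperatorConsole:
        def popup(self, image_ref, x, y):
            print(f"POP-UP image {image_ref} at ({x}, {y}) on control server")

        def warn(self):
            print("warning light / emergency bell triggered")

    def raise_alarm(console, ptz, image_ref, x, y, physical_warning=False):
        console.popup(image_ref, x, y)    # show the confirmation-required space
        ptz.point_at(x, y)                # request an enlarged image nearby
        if physical_warning:
            console.warn()

    raise_alarm(OperatorConsole(), PtzCamera("tracking-cam-1"),
                image_ref="frame_001.jpg", x=320, y=180, physical_warning=True)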

The command input unit 138 provides a function that allows the operator to set a confirmation-required space in advance so that it can be monitored intensively, or to designate a specific space as not subject to analysis on the grounds of approved work or the like.

The command input unit 138 may be operated through a terminal, may be connected to the control server 140 by wire or wirelessly, and commands may also be entered directly on the control server 140.

Hereinafter, a group behavior analysis method will be described using the above-described group behavior analyzing apparatus of the present invention.

FIG. 3 is a flowchart for explaining a group behavior analysis method using an image for automatically detecting an incident or the like by acquiring and analyzing an image according to an embodiment of the present invention.

First, the object detection camera 111 of the image detection unit 110 captures an image of the scene and transmits it in real time (S110). The subject generating unit 131 compares the received image with the coordinate values obtained from pattern learning or from the operator's manual designation on the image, generates the analysis target image data, and transmits it to the image separating unit 132 (S111).

The image separating unit 132 divides the data into a fixed background image and moving-object images, stores the fixed background image in the background image storage unit 133a, matches the object images against the continuously generated data in the background image matching unit 133, and divides the moving images into object frames for transmission (S112).

The distinction between the background image and the moving image in step S112 can be made by matching the change rate against the previous frame of the obtained image and by matching the absolute value of light based on additive color mixture, thereby separating the background image from the object frames.

That is, the frame captured 1/25 second earlier is matched with the frame photographed in real time to detect a primary frame change rate, from which the fixed buildings and background within the angle of view of the object detection camera are recognized (with no light at all, the entire screen is black); then, when an arbitrary object enters the angle of view, the change in the absolute value of light in that region caused by the additive mixing of light, that is, the rate of change of the absolute light value, is detected secondarily.
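
The two-stage check might look roughly like the sketch below: a primary frame-difference pass against the frame from 1/25 second earlier, followed by a secondary check on the change of the absolute light value inside each changed region. Summing the B, G, and R channels as the "absolute light value" and the threshold values are illustrative assumptions.

    import cv2
    import numpy as np

    def two_stage_detect(prev, curr, diff_thresh=25, light_thresh=15.0, min_area=400):
        gray_prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
        gray_curr = cv2.cvtColor(curr, cv2.COLOR_BGR2GRAY)

        # primary: rate of change between the frame 1/25 s earlier and the current frame
        diff = cv2.absdiff(gray_prev, gray_curr)
        _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)

        detected = []
        for c in contours:
            if cv2.contourArea(c) < min_area:
                continue
            x, y, w, h = cv2.boundingRect(c)
            # secondary: change of the absolute light value inside the region
            light_prev = prev[y:y+h, x:x+w].astype(np.float32).sum(axis=2).mean()
            light_curr = curr[y:y+h, x:x+w].astype(np.float32).sum(axis=2).mean()
            if abs(light_curr - light_prev) > light_thresh:
                detected.append((x, y, w, h))
        return detected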

For the moving object frames obtained in step S112, the direction analyzing unit 134 and the moving speed calculation unit 135 analyze the direction and the moving speed of each individual object (S113).

The behavior analysis processing unit 136 receives the direction and moving-speed data of the object frames and individual objects from step S113 in real time and determines whether the values fall within the sampling error range of the set values (S114).

If the values are within the standard error range in step S114, an analysis-exclusion command is sent back to the image separating unit 132; if they fall outside the standard error range of the set values, the space is judged to require confirmation and its coordinates are calculated and transmitted to the alarm generating unit 137 (S115).

In step S115, for the confirmation-required place signal received from the behavior analysis processing unit 136, the alarm generating unit 137 sends the coordinate value of the place to the nearby object tracking camera 112 and, when necessary, sends an alarm signal to a physical device such as a warning light or emergency bell.

In addition, step S112 may include a function of setting a confirmation-required space in advance through the command input unit 138 so that it can be monitored intensively, or of designating a specific space as not to be analyzed for reasons such as previously approved construction.

On the other hand, in step S114, when the number of moving objects that stay still for a preset time, or the number of moving objects that move in a direction avoiding a specific place, is equal to or greater than a predetermined value, the specific space is detected as a space requiring confirmation, and the coordinates are commanded to the nearby object tracking camera 112 so that an enlarged image can be photographed.

That is, the case in which moving objects stay still for the set time at a specific place, or move in a direction that avoids a specific place, should be judged as a confirmation-required place by comparison with the normal image.
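
A minimal sketch of this decision is given below, assuming places are grouped into grid cells and that three subjects (as in the examples of FIGS. 4 and 5) is the triggering count; the cell size and event representation are illustrative.

    from collections import defaultdict

    def confirmation_spaces(events, threshold=3, cell=50):
        """events: list of (x, y, kind) with kind in {'no_motion', 'avoidance'}.
        Returns the centers of grid cells where the count reaches the threshold."""
        counts = defaultdict(int)
        for x, y, kind in events:
            if kind in ("no_motion", "avoidance"):
                counts[(x // cell, y // cell)] += 1
        return [((cx * cell) + cell // 2, (cy * cell) + cell // 2)
                for (cx, cy), n in counts.items() if n >= threshold]

    # usage: three subjects stopped around the same spot -> one confirmation space
    events = [(120, 80, "no_motion"), (130, 75, "no_motion"),
              (115, 90, "no_motion"), (400, 300, "avoidance")]
    print(confirmation_spaces(events))   # [(125, 75)]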

An example of such determination is illustrated in FIGS. 4 and 5.

FIG. 4 shows a normal environment in a densely populated place according to an embodiment of the present invention. When a real-time image is captured by an object detection camera 111 installed on the spot, with the rightward direction taken as X and the upward direction as Y, subjects A, B, C, D, E, and F are each detected moving in a constant direction, which constitutes the normal environment. For example, in the case of subject A, the first detection A1 is followed by detections A2 and A3 moving in the X direction from the reference point; since the Y value does not exceed the range of road Z, which is recognized as part of the fixed background, the movement can be understood to be within the standard error range.

Likewise, in the case of subject E, the first detection E1 is followed by E2, E3, E4, and E5 moving in the X direction with a high absolute value; since the movement does not exceed the range of road Z, it is judged to be a normal flow within the standard error range and no alarm is generated.
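
Assuming "does not exceed the range of road Z" is read as "does not stray onto road Z", the FIG. 4 test might be sketched as follows; the Y boundary chosen for road Z and the sample coordinates are invented for illustration.

    ROAD_Z_Y_MIN = 60      # assumed: road Z occupies the image region with y >= 60

    def is_normal_flow(path):
        """path: list of (x, y) samples such as A1, A2, A3 in FIG. 4."""
        xs = [p[0] for p in path]
        ys = [p[1] for p in path]
        moving_along_x = all(x1 > x0 for x0, x1 in zip(xs, xs[1:]))
        stays_off_road = all(y < ROAD_Z_Y_MIN for y in ys)
        return moving_along_x and stays_off_road

    print(is_normal_flow([(10, 40), (40, 42), (70, 41)]))   # True  (like subject A)
    print(is_normal_flow([(10, 40), (40, 42), (70, 65)]))   # False (strays onto road Z)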

FIG. 5 shows an environment in which a confirmation-required alarm is generated in a densely populated space according to an embodiment of the present invention. Subjects A, B, C, D, E, and F are detected under the same conditions as in FIG. 4, but when the direction or speed of each subject exceeds, or falls below, the predetermined values, the place is recognized as a space requiring confirmation and an alarm is generated.

For example, in the case of subject A, the first detection A1 moves in the X direction from the reference point and then remains without movement in space A2 for longer than the set time; in the case of subject B, the first detection B1 moves in the X direction from the reference point and likewise stops; and in the case of subject C, the first detection C1 moves toward the reference point from the X direction with a high absolute value and then remains without movement in space C3 for longer than the set time. When three or more subjects remain motionless at the same place in this way, the place is recognized as an environment in which the subjects are gathered around an accident or the like, and an alarm is generated for the confirmation space 210.

In the case of subject D, the first detection D1 moves in the X direction from the reference point, but D3 is then detected on road Z, which is part of the fixed background, so an alarm is generated.

In the case of subject E, the first detection E1 is detected moving from the X side toward the reference point at a speed faster than usual; in the case of subject F, the first detection F1 moves toward the reference point in the X direction with a high absolute value and then turns back in front of the specific place. Even for subject D described above, a change of direction into the space 220 in front of the specific place, away from the fixed background Z, is detected. When three or more subjects detour around, or change direction away from, the same place in a way that differs from the normal flow, that place is recognized as an environment from which the subjects are fleeing because of an accident or the like, and an alarm is generated for the confirmation space 230.

In addition, the control server 140 may designate dangerous spaces on the background image, such as a road reserved for automobiles, a construction section, or a falling-rock danger zone. When a moving object enters a dangerous space, or stays in one for a predetermined time, the behavior analysis processing unit 136 immediately generates an alarm on the control server 140, transmits the X and Y coordinates of the dangerous space to the nearby object tracking camera 112, and controls the camera to capture an enlarged image and display it on the control server 140.
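
A sketch of this rule, assuming dangerous zones are stored as rectangles on the background image; the zone name, coordinates, and dwell limit are illustrative values, not part of the patent.

    import time

    DANGER_ZONES = {"construction": (200, 100, 320, 180)}   # x1, y1, x2, y2

    def in_zone(x, y, zone):
        x1, y1, x2, y2 = zone
        return x1 <= x <= x2 and y1 <= y <= y2

    def check_dangerous_space(obj_id, x, y, entered_at, dwell_limit=5.0):
        """Returns an alarm message or None; entered_at maps obj_id -> entry time."""
        hit = next((name for name, z in DANGER_ZONES.items() if in_zone(x, y, z)), None)
        if hit is None:
            entered_at.pop(obj_id, None)     # left all zones: reset the dwell timer
            return None
        entered_at.setdefault(obj_id, time.time())
        dwell = time.time() - entered_at[obj_id]
        if dwell >= dwell_limit:
            return f"object {obj_id} staying in {hit} zone ({dwell:.0f}s)"
        return f"object {obj_id} entered {hit} zone at ({x}, {y})"

    entered = {}
    print(check_dangerous_space(7, 250, 150, entered))   # entering the construction zone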

As described above, according to the group behavior analysis method using images of the present invention, incidents and accidents occurring in crowded areas can be handled promptly using CCTV images in real time, and because the pop-up and alarm functions operate automatically, the system can be managed effectively and valuable lives and property can be protected.

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art.

110: video detecting unit 111: object detecting camera
112: object tracking camera 120: network
130: image analysis module 131: subject generating unit
132: image separating unit 133: background image matching unit
133a: background image storage unit 134: direction analysis unit
135: moving speed analysis unit 136: behavior analysis processing unit
137: alarm generating unit 138: command input unit
140: control server

Claims (1)

1. A group behavior analysis method using an image analysis module for analyzing group behavior from an output image of an image detection unit composed of an object detection camera for capturing images and an object tracking camera having a tracking function, the method comprising:
(a) receiving, at a subject generating unit of the image analysis module, a real-time image from the object detection camera; storing the fixed background in a background image storage unit as background image data, obtained either through pattern learning for a predetermined time or from coordinates designated directly on the image by an operator through a control server; analyzing the image received from the object detection camera against the fixed background image data in units of 1/30 second and matching it with the continuously generated moving-object data of the 2/30 second image; and, when a change is detected, determining the changed region to be an object to be analyzed and transmitting it to a direction analyzing unit;
(b) individually assigning, at the direction analyzing unit of the image analysis module, a color and a number to each object to be analyzed from step (a); analyzing the motion of each object in X and Y coordinates based on the stored background image data; transmitting the corresponding information to a behavior analysis processing unit of the image analysis module when motion differing from the direction of the learned pattern is detected; and calculating, at a moving speed calculation unit of the image analysis module, the moving speed of each object to which a color and a number have been assigned;
(c) when the information received from the direction analyzing unit and the moving speed calculation unit reaches or exceeds a preset count, calculating the coordinates at the behavior analysis processing unit and transmitting a signal to an alarm generating unit identifying the area as a confirmation-required space, wherein, by matching the direction and speed analyzed in step (b) against the values of the normal environment, the space is judged to be one in which an incident or accident has occurred and which is being avoided or surrounded by the group when the number of objects avoiding a specific place exceeds the set count or the number of objects without motion at the specific place during the set time exceeds the set value; wherein the space is likewise determined to be a confirmation-required space when objects with motion exist on the left and right of the specific space but no object passes through it for a predetermined time, when an object shows no movement, or when an object moves more than 30% faster than the speed of the learned pattern; and wherein a confirmation-required space may be set in advance so as to be monitored intensively, or a space may be set as not requiring confirmation, for example because of previously approved construction;
(d) transmitting the image data for the confirmation-required space and the X, Y coordinate values of the specific location to the alarm generating unit of the image analysis module; and
(e) receiving, at the alarm generating unit, the image data and X, Y coordinates transmitted in step (d); popping up (POP-UP) the corresponding image on the control server; transmitting the X, Y coordinates to a nearby object tracking camera so that an enlarged image is captured; receiving the enlarged image and displaying it on the control server; and sending an alarm signal,
wherein, in step (a), at least one of a road reserved for automobiles, a construction section, and a falling-rock danger zone is set as a dangerous space on the background image at the control server, and when a moving object enters a dangerous space or stays in a dangerous space for a predetermined time, an alarm is generated on the control server immediately, the X and Y coordinates of the dangerous space are transmitted to a nearby object tracking camera, and the camera is controlled to capture an enlarged image of the dangerous space and display it on the control server.















KR1020160028943A 2016-03-10 2016-03-10 Group action analysis method by image KR101656642B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160028943A KR101656642B1 (en) 2016-03-10 2016-03-10 Group action analysis method by image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160028943A KR101656642B1 (en) 2016-03-10 2016-03-10 Group action analysis method by image

Publications (1)

Publication Number Publication Date
KR101656642B1 true KR101656642B1 (en) 2016-09-09

Family

ID=56939401

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160028943A KR101656642B1 (en) 2016-03-10 2016-03-10 Group action analysis method by image

Country Status (1)

Country Link
KR (1) KR101656642B1 (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0118674B2 (en) * 1984-02-23 1989-04-06 Akai Electric
KR100813936B1 (en) * 2006-04-14 2008-03-14 텔미정보통신 주식회사 Method for extracting subject and image synthesizing in moving picture
KR100767065B1 (en) 2007-03-26 2007-10-17 (주)홈시큐넷 Remote monitoring system and method thereof
KR20150034398A (en) * 2013-09-26 2015-04-03 서진이엔에스(주) A Parking Event Detection System Based on Object Recognition

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101879444B1 (en) * 2018-03-05 2018-07-17 (주)아성정보 Method and apparatus for operating CCTV(closed circuit television)
US20210192225A1 (en) * 2019-12-24 2021-06-24 Uif (University Industry Foundation), Yonsei University Apparatus for real-time monitoring for construction object and monitoring method and computer program for the same
KR20210081618A (en) * 2019-12-24 2021-07-02 연세대학교 산학협력단 Apparatus for real-time monitoring for construction object and monitoring method and and computer program for the same
KR102416227B1 (en) * 2019-12-24 2022-07-01 연세대학교 산학협력단 Apparatus for real-time monitoring for construction object and monitoring method and and computer program for the same
US11568648B2 (en) 2019-12-24 2023-01-31 Uif (University Industry Foundation), Yonsei University Apparatus for real-time monitoring for construction object and monitoring method and computer program for the same
KR102296037B1 (en) * 2020-03-17 2021-09-01 주식회사 건강나눔 Sports Facility Member and Energy Integrated Processing System
KR102661202B1 (en) 2023-10-27 2024-04-26 글로벌정보통신(주) PTZ camera with signal function


Legal Events

Date Code Title Description
E701 Decision to grant or registration of patent right
GRNT Written decision to grant