US10021311B2 - Camera control apparatus - Google Patents
Camera control apparatus
- Publication number
- US10021311B2 (application US14/509,230 / US201414509230A)
- Authority
- US
- United States
- Prior art keywords
- camera
- image
- center coordinates
- control
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- H04N5/23296—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/103—Static body considered as a whole, e.g. static pedestrian or occupant recognition
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19602—Image analysis to detect motion of the intruder, e.g. by frame subtraction
- G08B13/19608—Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6815—Motion detection by distinguishing pan or tilt from motion
-
- H04N5/232—
-
- H04N5/23219—
-
- H04N5/23261—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/183—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
-
- H04N5/23251—
-
- H04N5/23254—
Definitions
- Apparatuses and methods consistent with exemplary embodiments relate to a camera control apparatus which makes an object included in a camera image acquired by capturing an image of a monitoring area be positioned in a center of the camera image.
- a pan-tilt-zoom (PTZ) camera is used to accurately capture an image of an object in a monitoring area.
- the PTZ camera includes pan, tilt, zoom, and focus functions to track an object through a movement of a camera or to enlarge or precisely capture an image of the object.
- since the PTZ camera in the related art tracks or precisely captures the image of the object by executing any one of the pan, tilt, zoom, and focus functions only after forming a camera image acquired by capturing the image of the monitoring area, a delay occurs in tracking the object because the tracking is performed after the image of the object is captured.
- as a result, the object included in the camera image is positioned at a point that deviates from the center of the camera image.
- Exemplary embodiments address at least the above problems and/or disadvantages and other disadvantages not described above. Also, exemplary embodiments are not required to overcome the disadvantages described above, and an exemplary embodiment may not overcome any of the problems described above.
- One or more exemplary embodiments include a camera control apparatus, which makes an object included in a camera image acquired by capturing an image of a monitoring area be positioned in a center of the camera image.
- a camera control apparatus including: a processor configured to detect a partial image including an object from a first image of a camera, and to generate, based on center coordinates of the partial image and center coordinates of the first image, a control value for controlling a camera capturing area to position a specific point of the object corresponding to specific coordinates of the partial image on center coordinates of a second image which is captured by the camera subsequent to the first image; and a camera drive controller configured to change the camera capturing area based on the control value output from the processor.
- the processor may include: an image reception module configured to receive a camera image in a frame unit; a target detection module configured to detect the partial image through a user's selection or a predetermined image detection process; a control value calculation module configured to determine control center coordinates for the camera capturing area as a specific point in an opposite direction to the center coordinates of the partial image based on a difference between the center coordinates of the partial image and the center coordinates of the first image, to change a weight of a control acceleration for changing the camera capturing area according to a difference between the center coordinates of the partial image and the control center coordinates, and to generate the control value based on the changed weight; and a control execution module configured to output the control value to the camera drive controller.
- the camera control apparatus may further include an input unit to select the partial image.
- the control value calculation module is configured to determine whether a speed reduction pattern of the acceleration of the object exists, and to set a camera speed reduction section before an expected stop point of the object if it is determined that the speed reduction pattern exists.
- the control value calculation module is configured to set the camera speed reduction section based on a stop point of the object during a stop or direction change of the object, and to move the camera in a reverse direction at a predetermined reference speed over a distance equal to the distance between the center coordinates of the second image and the specific point of the object.
- the target detection module is configured to select any one of the plurality of objects by a user selection.
- the target detection module is configured to automatically select any one of the plurality of objects based on a predetermined selection priority, in which an object that is highly likely to leave the monitoring area is set as a priority object.
- the control value calculation module is further configured to include parameters for applying a field of view (FOV) value corresponding to a zoom magnification of the second image to the weight change of the control acceleration.
- a method of controlling a camera control apparatus including: receiving a camera image in a frame unit from a camera; detecting a partial image including an object from a first image of a camera; generating a control value for controlling a camera capturing area to position a specific point of the object corresponding to specific coordinates of the partial image on center coordinates of a second image of the camera which is captured by the camera subsequent to the first image based on center coordinates of the partial image and center coordinates of the first image; and changing the camera capturing area based on the generated control value.
- a non-transitory computer readable medium having recorded thereon a program, which, when executed by a computer, performs the above-recited method.
- FIG. 1 is a block diagram illustrating a camera control apparatus according to an exemplary embodiment
- FIG. 2 is a block diagram illustrating a configuration of a processor shown in FIG. 1, according to an exemplary embodiment
- FIG. 3 is a block diagram illustrating a camera control apparatus according to an exemplary embodiment
- FIG. 4 is a diagram illustrating a camera image captured by a camera, according to an exemplary embodiment
- FIG. 5 is an exemplary diagram illustrating an image screen in the process of moving an object in the camera image of FIG. 4 to the center of a screen, according to an exemplary embodiment
- FIGS. 6 to 8 are exemplary diagrams illustrating the image screen of FIG. 5 by time zones
- FIG. 9 is a flowchart illustrating an operation of the processor shown in FIG. 1 according to an exemplary embodiment
- FIG. 10 is a flowchart of operation S104 shown in FIG. 9, according to an exemplary embodiment
- FIG. 11 is an exemplary diagram illustrating an image screen of an object which is in a standstill state or of which the direction is changed in the camera image of FIG. 4, according to an exemplary embodiment
- FIG. 12 is a flowchart illustrating a process subsequent to “K” shown in FIG. 10, according to an exemplary embodiment
- FIG. 13 is an exemplary diagram illustrating camera images captured by a camera, which are discriminated by control directions;
- FIG. 14 is an exemplary diagram illustrating an example in which an object is specified if a plurality of objects exist in a camera image captured by a camera;
- FIG. 15 is a flowchart illustrating an operation of the camera control apparatus of FIG. 1 according to an exemplary embodiment
- FIG. 16 is an exemplary diagram illustrating another example in which an object is specified if a plurality of objects exist in a camera image captured by a camera;
- FIG. 17 is a flowchart illustrating an operation of the camera control apparatus according to an exemplary embodiment
- FIG. 18 is a flowchart illustrating an operation of the camera control apparatus according to an exemplary embodiment
- FIG. 19 is an exemplary diagram illustrating a case where objects recede in the distance in camera images captured by a camera
- FIG. 20 is an exemplary diagram illustrating zoom-in camera images of FIG. 19;
- FIGS. 21A and 21B are exemplary diagrams illustrating respective images that are obtained by discriminating camera images of FIG. 4 according to fields of view.
- FIG. 22 is a graph illustrating an example of an acceleration change according to a zoom change.
- Although the terms “first”, “second”, etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, a first element, component, region, layer or section discussed below could be termed a second element, component, region, layer or section without departing from the teachings of the inventive concept.
- spatially relative terms such as “beneath”, “below”, “lower”, “above”, “upper”, and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. It will be understood that the spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the exemplary term “below” can encompass both an orientation of above and below. In other words, the device may be otherwise reoriented (e.g., rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
- Exemplary embodiments are described herein with reference to cross-section illustrations that are schematic illustrations of idealized embodiments (and intermediate structures). As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, the exemplary embodiments should not be construed as limited to the particular shapes of regions illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, an implanted region illustrated as a rectangle will, typically, have rounded or curved features and/or a gradient of implant concentration at its edges rather than a binary change from implanted to non-implanted region.
- a buried region formed by implantation may result in some implantation in the region between the buried region and the surface through which the implantation takes place.
- the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the actual shape of a region of a device and are not intended to limit the scope of the inventive concept.
- a camera control apparatus 100 is configured so that an object included in a camera image acquired by capturing an image of a monitoring area is always positioned in the center of the camera image, which greatly reduces the possibility that an object positioned at an edge of the camera image is lost due to a failure to react to an abrupt speed change of the object or the like.
- the camera control apparatus 100 is also configured to remove the screen vibration phenomenon that occurs in the center portion of the camera image due to the camera's inertia of movement when the camera stops because the tracked object stops or changes its direction; this control follows the movement control in which the moving speed of the camera is increased so that the object is always positioned in the camera image.
- the camera control apparatus 100 may include a processor 120 and a camera drive controller 130 .
- a camera that interlocks with the camera control apparatus 100 may be provided as a PTZ camera having pan, tilt, zoom, and focus functions.
- the processor 120 detects a partial image including the object from a first image provided from the camera, and generates a control value for positioning a specific point of the object corresponding to specific coordinates of the detected partial image on the center coordinates of a second image which is captured by the camera subsequent to the first image.
- the partial image including the object may be detected from the first image by a user's selection or through a predetermined image detection process.
- the partial image including the object may be automatically detected from the first image through the predetermined image detection process without user's participation.
- a camera image of a current frame and a camera image of a previous frame may be compared with each other, a cell area including an object as a target of observation may be specified based on different points between the images as a result of comparison, and a specific cell area may be determined as the partial image as described above.
- a face recognition algorithm may be executed with respect to a camera image of the current frame, a cell area including an object as a target of observation may be specified based on a result of the face recognition, and a specific cell area may be determined as the partial image as described above.
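- As a concrete illustration of the frame-differencing variant above, the following is a minimal sketch assuming OpenCV is available; the function name, the fixed threshold, and the minimum-area filter are illustrative assumptions, not values given in the patent. A face detector could be substituted for the differencing step in the same structure.

```python
import cv2

def detect_partial_image(prev_frame, curr_frame, min_area=500):
    """Detect a cell area (bounding box) containing a moving object by
    comparing the current frame with the previous frame."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)

    # "Different points between the images": absolute difference + threshold
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)

    # OpenCV 4.x: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None  # no moving object found in this frame

    # Take the largest changed region as the partial image (cell area)
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    center_a = (x + w // 2, y + h // 2)  # center coordinates "A"
    return (x, y, w, h), center_a
```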
- the processor 120 and the camera drive controller 130 may be integrally provided with the camera or may be provided as a separate configuration from the camera.
- a single camera may be controlled in direct interlocking with the camera through a wired or wireless connection, and a plurality of cameras including the camera may be controlled through network connections (e.g., home network, Internet, and heterogeneous network).
- the camera drive controller 130 changes a capturing area of the camera based on the control value output from the processor 120 .
- FIG. 2 is a block diagram illustrating a configuration of the processor 120 shown in FIG. 1.
- the processor 120 may include: an image reception module 121 configured to receive a camera image in a frame unit from the camera; a target detection module 123 configured to detect a partial image from a first image of the camera through a predetermined image detection process; a control value calculation module 125 configured to determine, as control center coordinates of the camera, a specific point of the first image corresponding to the center coordinates of the partial image based on the center coordinates of the first image, to change a weight of a control acceleration for movement control of the camera according to a difference between the center coordinates of the partial image and the control center coordinates of the camera, and to generate the control value for movement of the camera based on the changed weight; and a control execution module 127 configured to output the generated control value to the camera drive controller 130.
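- Read as software, the four modules form a simple per-frame pipeline. The skeleton below (Python; all class and method names are illustrative, not from the patent) shows how one frame might flow from reception to the drive controller:

```python
def frame_center(frame):
    """Center coordinates "B" of a camera image (NumPy-style array)."""
    h, w = frame.shape[:2]
    return (w // 2, h // 2)

class Processor:
    """Per-frame pipeline mirroring modules 121, 123, 125, and 127."""

    def __init__(self, receiver, detector, calculator, executor):
        self.receiver = receiver      # image reception module 121
        self.detector = detector      # target detection module 123
        self.calculator = calculator  # control value calculation module 125
        self.executor = executor      # control execution module 127

    def step(self):
        frame = self.receiver.next_frame()           # frame-unit input
        roi, center_a = self.detector.detect(frame)  # partial image + "A"
        value = self.calculator.compute(center_a,
                                        frame_center(frame))
        self.executor.output(value)                  # to drive controller 130
```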
- FIG. 3 is a block diagram illustrating a camera control apparatus 200 according to an exemplary embodiment.
- the camera control apparatus 200 may include an input unit 240 , a processor 220 , and a camera drive controller 230 .
- the input unit 240 may enable a user to select a partial image including an object from a camera image.
- the processor 220 detects a partial image including the object from a first image provided from the camera through a user setting signal that is transferred from the input unit 240 , and generates a control value for positioning a specific point of the object corresponding to specific coordinates of the partial image on the center coordinates of a second image of the camera which is captured by the camera subsequent to the first image.
- the detection of the partial image from the first image through a user's selection is performed by selecting, through a user's touch or a user input means such as a cursor, a specific point of the first image at which the object is positioned, and designating a surrounding group of cells of a predetermined cell size around the selected point as the partial image.
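- One possible reading of designating a surrounding group cell is to snap the selected point to a cell grid and take a fixed block of cells around it as the partial image, as in the sketch below (the cell size and group size are illustrative assumptions):

```python
def partial_image_from_selection(point, frame_shape, cell=16, cells=5):
    """Designate a group of cells around a user-selected point as the
    partial image. `cell` is the cell size in pixels; `cells` is the
    group's width/height in cells (both illustrative values)."""
    px, py = point
    h, w = frame_shape[:2]
    half = (cells * cell) // 2

    # Snap the selected point to the center of its cell, then clamp
    # the surrounding block to the frame boundaries.
    cx = (px // cell) * cell + cell // 2
    cy = (py // cell) * cell + cell // 2
    x0, y0 = max(0, cx - half), max(0, cy - half)
    x1, y1 = min(w, cx + half), min(h, cy + half)
    return (x0, y0, x1 - x0, y1 - y0)  # ROI enclosing the object

# e.g. a touch at (427, 262) on a 640x480 frame
roi = partial_image_from_selection((427, 262), (480, 640))
```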
- FIG. 4 is a diagram illustrating a camera image captured by a camera.
- the camera image illustrated in FIG. 4 is a first image of the camera, i.e., one frame of the camera image stream provided from the camera.
- a cell area in which the object is positioned within the first image may be a partial image.
- the center coordinates B of the first image and the center coordinates A of the partial image detected from the first image are positioned at different points in the first image, and thus do not coincide with each other.
- the camera control apparatus 100 operates to make a specific point of the object corresponding to the center coordinates A of the partial image be positioned on the center coordinates of the second image corresponding to the subsequent frame.
- FIG. 5 is an exemplary diagram illustrating an image screen in the process of moving an object in the camera image of FIG. 4 to the center of a screen.
- the processor 120 applies, to the control acceleration of the camera, a weight that makes the control acceleration for movement control of the camera exceed the acceleration of the object.
- as a result, the center coordinates B′ of the second image may be directed toward the specific point of the object corresponding to the center coordinates A of the partial image, i.e., toward an image capturing area different from the center coordinates B of the first image.
- FIGS. 6 to 8 are exemplary diagrams illustrating the image screen of FIG. 5 by time zones.
- FIG. 6 shows a camera image of a frame at time “t”
- FIG. 7 shows a camera image of a frame at time “t+1”
- FIG. 8 shows a camera image of a frame at time “t+2”.
- the control center coordinates C of the camera may be determined as a specific point in an opposite direction to the center coordinates A of the partial image based on the center coordinates B of the first image.
- the processor 120 may determine the acceleration of the object using the distance between the center coordinates A of the partial image and the center coordinates B of the first image, and may determine the control acceleration of the camera using the distance between the center coordinates A of the partial image and the control center coordinates C of the camera. If the acceleration of the object using the distance between the center coordinates A of the partial image and the center coordinates B of the first image is “a”, the control acceleration of the camera using the center coordinates A of the partial image and the control center coordinates C of the camera becomes “2a”.
- since the control acceleration “2a” of the camera is higher than the acceleration “a” of the object, the control speed of the camera may be set to be higher than the moving speed of the object so that the object is positioned in the center of the camera image as illustrated in FIG. 8.
- once the object is positioned in the center of the camera image, the processor 120 changes the control acceleration of the camera from “2a” to “a”. Through this, the processor 120 keeps the control acceleration of the camera and the acceleration of the object equal to each other.
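- The geometry above can be checked numerically; a minimal sketch (plain Python, illustrative coordinate values) of determining C as the point opposite A with respect to B, and of the resulting 2:1 acceleration ratio:

```python
import math

def control_center(a, b):
    """Control center C: the point opposite the partial-image center A
    with respect to the first image's center B, i.e. C = 2B - A."""
    return (2 * b[0] - a[0], 2 * b[1] - a[1])

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

A = (420, 260)  # center coordinates of the partial image
B = (320, 240)  # center coordinates of the first image
C = control_center(A, B)  # (220, 220)

object_accel = distance(A, B)  # proportional to "a"
camera_accel = distance(A, C)  # proportional to "2a", exactly double
assert abs(camera_accel - 2 * object_accel) < 1e-9

# Once the object reaches the screen center, the weight is removed and
# the control acceleration is held equal to the object's ("a").
```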
- FIG. 9 is a flowchart illustrating an operation of the processor 120 according to an exemplary embodiment.
- the processor 120 receives camera images in a frame unit from the camera (operation S100).
- a camera image currently being received is a first image.
- a camera image subsequently being received is a second image.
- the processor 120 detects the partial image including the object in the first image through an image detection process (operation S102).
- the processor 120 generates a control value for positioning the specific point of the object corresponding to the specific coordinates of the partial image detected in operation S102 on the center coordinates of the second image which is captured by the camera subsequent to the first image (operation S104).
- the processor 120 outputs the generated control value to the camera drive controller 130 to make the camera drive controller 130 change the image capturing area of the camera (operation S106).
- FIG. 10 is a flowchart illustrating operation S104 shown in FIG. 9, according to an exemplary embodiment.
- the processor 120 compares the center coordinates of the first image with the center coordinates of the partial image including the object in the first image (operation S104-1).
- the processor 120 determines whether the two coordinates coincide with each other (operation S104-3).
- if the two coordinates do not coincide, the processor 120 sets, as the control center coordinates of the camera, the specific point of the first image corresponding to the center coordinates of the partial image in the direction opposite to the moving direction of the object (operation S104-5).
- the processor 120 gives a weight to the control acceleration of the camera using the distance between the center coordinates of the partial image and the control center coordinates of the camera (operation S104-7).
- the processor 120 generates the control value for controlling the driving of the camera based on the weight determined in operation S104-7, and outputs the generated control value to the camera drive controller 130 (operation S104-9).
- if the two coordinates coincide, the processor 120 maintains the control acceleration of the camera to be equal to the acceleration of the object (operation S104-11).
- FIG. 11 is an exemplary diagram illustrating an image screen of an object which is in a standstill state or of which the direction is changed in the camera image of FIG. 4, according to an exemplary embodiment.
- the processor 120 controls the movement of the camera by increasing the moving speed of the camera so that the object is always positioned in the camera image, and then performs a control to remove the screen vibration phenomenon that occurs in the center portion of the camera image due to the camera's inertia of movement when the camera stops because the tracked object stops or changes its direction.
- two control methods may be performed for the processor 120 to remove the screen vibration phenomenon (a sketch of both follows below).
- first, the processor 120 may detect a speed reduction pattern of the acceleration of the object in advance, set a speed reduction section of the camera before an expected stop point of the object, and execute a speed reduction process to minimize the inertia of movement of the camera in the set speed reduction section.
- second, the processor 120 may set the speed reduction section of the camera based on the stop point of the object during the stop or direction change of the object, and move the camera in a reverse direction at a predetermined reference speed over a distance equal to the distance between the center coordinates B of the camera image as illustrated in FIG. 11 and the specific point of the object.
- the predetermined reference speed is a moving speed for positioning the object in the center of the screen after temporarily stopping the movement of the camera that tracks the object; it is preferable, but not necessary, that the reference speed minimizes the inertia of movement of the camera even when the camera stops after moving over the distance between the center coordinates B of the camera image and the specific point of the object.
- the processor 120 can operate to remove the screen vibration phenomenon as described above.
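- The two anti-vibration controls can be sketched as a per-frame speed command (plain Python; the window length, taper factor, and reference speed are illustrative assumptions, not values from the patent):

```python
def anti_vibration_speed(object_speeds, dist_center_to_object,
                         reference_speed=5.0, decel_window=3):
    """Return a camera speed override, or None to keep normal tracking.

    object_speeds: recent per-frame speeds of the tracked object.
    dist_center_to_object: pixels between image center B and the object.
    """
    recent = object_speeds[-decel_window:]

    # Method 1: a monotonically decreasing speed pattern marks a speed
    # reduction section before the expected stop point, so taper the
    # camera speed instead of stopping abruptly.
    if len(recent) >= 2 and all(b < a for a, b in zip(recent, recent[1:])):
        return max(0.0, recent[-1] * 0.5)

    # Method 2: after the object stops (or reverses), command a reverse
    # move at the reference speed; the caller limits the travel to the
    # distance between image center B and the object's specific point.
    if recent and recent[-1] == 0.0 and dist_center_to_object > 0:
        return -reference_speed

    return None
```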
- FIG. 12 is a flowchart illustrating a process subsequent to “K” shown in FIG. 10, according to an exemplary embodiment.
- the processor 120 determines whether a speed reduction pattern of the acceleration of the object exists during movement control of the camera to track the object (operation S104-13).
- if the speed reduction pattern exists, the processor 120 sets a speed reduction section of the camera in the moving direction of the object (operation S104-15).
- the processor 120 removes the weight that is given to the control acceleration of the camera (operation S104-17).
- the processor 120 sets the control acceleration of the camera to be equal to the acceleration of the object to minimize the inertia of movement of the camera (operation S104-19).
- the processor 120 determines whether the object is stopped or changes its direction (operation S104-21).
- if so, the processor 120 sets the speed reduction section of the camera based on the stopping point of the object (operation S104-23).
- the processor 120 controls the camera (not illustrated) to be gradually stopped in the speed reduction section of operation S104-23 (operation S104-25).
- the processor 120 then moves the camera in a reverse direction at the reference speed over a distance equal to the distance between the center coordinates of the partial image including the object, which is positioned in the direction opposite to the traveling direction of the camera, and the center coordinates of the camera image (operation S104-27).
- FIG. 13 is an exemplary diagram illustrating camera images captured by a camera, which are discriminated by control directions.
- control center coordinates of the camera are changed with directivity according to the control direction of the camera to track the object.
- FIG. 14 is an exemplary diagram illustrating an example in which an object is specified if a plurality of objects exist in a camera image captured by a camera.
- a plurality of objects may exist in the first image.
- the processor 120 may set a moving object in the first image as a target.
- FIG. 15 is a flowchart illustrating an operation of the camera control apparatus 100 of FIG. 1 according to an exemplary embodiment.
- the processor 120 receives camera images in a frame unit from the camera (operation S200).
- a camera image currently being received is a first image.
- a camera image subsequently being received is a second image.
- the processor 120 detects the partial image including the object from the first image through an image detection process, and if a plurality of objects exist in the first image, the processor 120 sets a moving object as a target (operation S202).
- the processor 120 generates a control value for positioning the specific point of the object corresponding to the specific coordinates of the partial image detected in operation S202 on the center coordinates of the second image which is captured by the camera subsequent to the first image (operation S204).
- the processor 120 outputs the generated control value to the camera drive controller 130 to make the camera drive controller 130 change the image capturing area of the camera (operation S206).
- FIG. 16 is an exemplary diagram illustrating another example in which an object is specified if a plurality of objects exist in a camera image captured by a camera.
- the processor 120 may specify an object selected by a user or may select any one object of the plurality of objects according to a predetermined selection priority.
- FIG. 17 is a flowchart illustrating an operation of the camera control apparatus 100 of FIG. 1 according to an exemplary embodiment.
- the processor 120 receives camera images in a frame unit from the camera (operation S300).
- a camera image currently being received is a first image.
- a camera image subsequently being received is a second image.
- the user, who is a manager, selects any one of the plurality of objects in the first image (operation S306).
- the processor 120 sets the object selected by the user as a target (operation S308).
- the processor 120 generates a control value for positioning the specific point of the object corresponding to the specific coordinates of the partial image detected in operation S308 on the center coordinates of the second image which is captured by the camera subsequent to the first image (operation S310).
- the processor 120 outputs the generated control value to the camera drive controller 130 to make the camera drive controller 130 change the image capturing area of the camera (operation S312).
- FIG. 18 is a flowchart illustrating an operation of the camera control apparatus 100 of FIG. 1 according to an exemplary embodiment.
- the processor 120 receives camera images in a frame unit from the camera (operation S400).
- a camera image currently being received is a first image.
- a camera image subsequently being received is a second image.
- the processor 120 detects the partial image including the object in the first image through an image detection process, and if a plurality of objects exist in the first image, the processor 120 may select an object that is highly likely to leave the monitoring area as a priority object (operations S402 to S406).
- the processor 120 generates a control value for positioning the specific point of the object corresponding to the specific coordinates of the partial image detected in operation S406 on the center coordinates of the second image which is captured by the camera subsequent to the first image (operation S408).
- the processor 120 outputs the generated control value to the camera drive controller 130 to make the camera drive controller 130 change the image capturing area of the camera (operation S410).
- FIG. 19 is an exemplary diagram illustrating a case where objects recede in the distance in camera images captured by a camera.
- the camera control apparatus 100 is configured so that objects included in camera images obtained by capturing a monitoring area are always positioned in the center of the camera images even if a zoom magnification is changed.
- the fields of view of the respective camera images are kept as they are.
- the fields of view of the respective camera images are all 55.5°.
- if zoom-in is performed, the field of view (FOV) with respect to the screen is narrowed, while if zoom-out is performed, the FOV with respect to the screen is widened.
- FOV = 2 arctan(0.5h/f), [Equation 1] where h is the sensor size and f is the focal length.
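- Equation 1 is straightforward to evaluate. For instance (illustrative lens values, not figures from the patent), a 6.4 mm sensor behind a 6.08 mm lens gives roughly the 55.5° field of view used in FIG. 19, while a long focal length gives a narrow zoomed-in FOV:

```python
import math

def fov_degrees(sensor_size_mm, focal_length_mm):
    """FOV = 2 * arctan(0.5 * h / f)  [Equation 1]"""
    return math.degrees(2 * math.atan(0.5 * sensor_size_mm / focal_length_mm))

print(fov_degrees(6.4, 6.08))   # ~55.5 degrees (zoomed out)
print(fov_degrees(6.4, 115.0))  # ~3.2 degrees (zoomed in)
```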
- if the change of the FOV is not reflected in the control, the control is performed at an improper speed (e.g., too fast or too slow).
- in the exemplary embodiment, since the actual acceleration is not accumulatively averaged but the accumulated average is calculated through movement of the control center on the screen, a change of the zoom magnification is reflected in the FOV, and the acceleration per pixel is changed immediately.
- FIG. 20 is an exemplary diagram illustrating zoom-in camera images of FIG. 19 .
- the respective camera images are images obtained by performing, or not performing, zoom-in with respect to the camera images of FIG. 19.
- the second camera image corresponds to a case where the FOV is changed from 55.5° to 35° as a result of performing the zoom-in with respect to the second camera image of FIG. 19 .
- the fifth camera image corresponds to a case where the FOV is changed from 55.5° to 1.59° as a result of performing the zoom-in with respect to the fifth camera image of FIG. 19 .
- FIGS. 21A and 21B are exemplary diagrams illustrating respective images that are obtained by discriminating camera images of FIG. 4 according to fields of view.
- FIG. 21A shows a camera image obtained by moving the camera image of FIG. 4 as it is, without changing the zoom magnification. That is, the FOV is 55°.
- FIG. 21B shows a case where the zoom magnification is changed through performing of the zoom-in with respect to the camera image of FIG. 4. That is, the FOV is 1.59°.
- in both cases, the distances between the screen center and the control center are the same.
- however, since the control speed includes the accumulated average, the speed change observed in the corresponding camera image becomes too fast when the zoom magnification is changed (e.g., when the zoom-in is performed) as shown in FIG. 21B.
- the control value calculation module makes the object in the camera image always be positioned in the center of the screen even if the zoom magnification is changed, by generating a control value that includes the FOV value according to the zoom magnification applied to the camera image as a parameter to be applied to the weight change of the control acceleration for movement of the camera.
- FIG. 22 is a graph illustrating an example of an acceleration change according to a zoom change.
- the acceleration may be changed as in Equation 2 and Equation 3 below according to the zoom magnification.
- P_Spd1 = arctan(avg_spd/D1), [Equation 2] where P_Spd1 is the acceleration in the case of a first zoom magnification, avg_spd is the average moving speed of the object, and D1 is a first distance between the camera and the object.
- P_Spd2 = arctan(avg_spd/D2), [Equation 3] where P_Spd2 is the acceleration in the case of a second zoom magnification, avg_spd is the average moving speed of the object, and D2 is a second distance between the camera and the object.
- the first acceleration P_Spd 1 and the second acceleration P_Spd 2 are calculated as described above.
- the control value calculation module generates the control value including the FOV value according to the zoom magnification applied to the camera image as a parameter to be applied to the weight change of the control acceleration for the movement of the camera.
- the control value calculation module can set the control acceleration corresponding to FIG. 21B, which is different from the control acceleration corresponding to FIG. 21A, using the deviation between the FOV corresponding to FIG. 21A and the FOV corresponding to FIG. 21B.
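- A hedged sketch of how Equations 2 and 3 might feed the weight change: the control acceleration is rescaled by the ratio of the current FOV to a reference FOV so that the on-screen speed stays consistent across zoom levels. The linear scaling rule and the parameter values below are assumptions for illustration; the patent states only that the FOV value is included as a parameter.

```python
import math

def zoom_accel(avg_spd, distance):
    """P_Spd = arctan(avg_spd / D)  [Equations 2 and 3]"""
    return math.atan(avg_spd / distance)

def fov_weighted_accel(base_accel, fov_current_deg, fov_reference_deg=55.5):
    # Assumption: shrink the per-pixel acceleration in proportion to the
    # FOV so that a zoomed-in view (small FOV) is not over-steered.
    return base_accel * (fov_current_deg / fov_reference_deg)

p_spd1 = zoom_accel(avg_spd=1.5, distance=10.0)  # first zoom magnification
p_spd2 = zoom_accel(avg_spd=1.5, distance=40.0)  # second zoom magnification
print(fov_weighted_accel(p_spd1, fov_current_deg=55.5))  # unchanged
print(fov_weighted_accel(p_spd2, fov_current_deg=1.59))  # strongly damped
```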
- At least one of the components, elements, modules or units represented by a block as illustrated in FIGS. 1-3 may be embodied as various numbers of hardware, software and/or firmware structures that execute respective functions described above, according to an exemplary embodiment.
- at least one of these components, elements, modules or units may use a direct circuit structure, such as a memory, processing, logic, a look-up table, etc. that may execute the respective functions through controls of one or more microprocessors or other control apparatuses.
- at least one of these components, elements, modules or units may be specifically embodied by a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions.
- At least one of these components, elements, modules or units may further include a processor such as a central processing unit (CPU) that performs the respective functions, a microprocessor, or the like.
- although a bus is not illustrated in the above block diagrams, communication between the components, elements, modules or units may be performed through the bus.
- the inventive concept can be manufactured and traded, and is thus industrially applicable.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
Claims (16)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140065108A KR102152725B1 (en) | 2014-05-29 | 2014-05-29 | Control apparatus for camera |
KR10-2014-0065108 | 2014-05-29 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150350556A1 (en) | 2015-12-03
US10021311B2 (en) | 2018-07-10
Family
ID=54703286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/509,230 Active 2035-05-25 US10021311B2 (en) | 2014-05-29 | 2014-10-08 | Camera control apparatus |
Country Status (3)
Country | Link |
---|---|
US (1) | US10021311B2 (en) |
KR (1) | KR102152725B1 (en) |
CN (1) | CN105245770B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10659676B2 (en) * | 2015-12-08 | 2020-05-19 | Canon Kabushiki Kaisha | Method and apparatus for tracking a moving subject image based on reliability of the tracking state |
CN105718887A (en) * | 2016-01-21 | 2016-06-29 | 惠州Tcl移动通信有限公司 | Shooting method and shooting system capable of realizing dynamic capturing of human faces based on mobile terminal |
KR102596487B1 (en) * | 2016-04-06 | 2023-11-01 | 한화비전 주식회사 | Display Control System, Method and Computer Readable Record Medium Thereof |
US10078908B2 (en) * | 2016-08-12 | 2018-09-18 | Elite Robotics | Determination of relative positions |
RU2019117210A (en) * | 2016-11-03 | 2020-12-03 | Конинклейке Филипс Н.В. | AUTOMATIC PAN, TILT AND ZOOM ADJUSTMENT FOR IMPROVED LIFE PERFORMANCE |
CN107426497A (en) * | 2017-06-15 | 2017-12-01 | 深圳天珑无线科技有限公司 | The method, apparatus and computer-readable recording medium of a kind of recording image |
US10810751B2 (en) * | 2017-06-23 | 2020-10-20 | Panasonic Intellectual Property Corporation Of America | Distance measuring apparatus and distance measuring method |
CN113841087A (en) * | 2019-05-27 | 2021-12-24 | 索尼集团公司 | Composition control device, composition control method, and program |
KR20210128736A (en) | 2020-04-17 | 2021-10-27 | 삼성전자주식회사 | Electronic device including multi-cameras and shooting method |
CN114942657B (en) * | 2022-04-21 | 2023-10-27 | 国网山东省电力公司建设公司 | Internal temperature control system and control method in concrete pouring process |
KR102600418B1 (en) * | 2023-04-28 | 2023-11-09 | 주식회사 아로텍 | Apparatus and method for deciding of area |
Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5111288A (en) * | 1988-03-02 | 1992-05-05 | Diamond Electronics, Inc. | Surveillance camera system |
US6215519B1 (en) * | 1998-03-04 | 2001-04-10 | The Trustees Of Columbia University In The City Of New York | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring |
KR20020015637A (en) | 2001-05-28 | 2002-02-28 | 최종욱 | System for automatic installation of detection area using panning, tilting and zooming transformation of cctv cameras |
US6359647B1 (en) * | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
US20040119819A1 (en) * | 2002-10-21 | 2004-06-24 | Sarnoff Corporation | Method and system for performing surveillance |
US20050073585A1 (en) * | 2003-09-19 | 2005-04-07 | Alphatech, Inc. | Tracking systems and methods |
US20060126737A1 (en) * | 2004-12-15 | 2006-06-15 | International Business Machines Corporation | Method, system and program product for a camera to track an object using motion vector data |
US20070291104A1 (en) * | 2006-06-07 | 2007-12-20 | Wavetronex, Inc. | Systems and methods of capturing high-resolution images of objects |
KR20080009878A (en) | 2006-07-25 | 2008-01-30 | 권현구 | Variable focusing camera system using manual focusing |
JP2008252331A (en) | 2007-03-29 | 2008-10-16 | Victor Co Of Japan Ltd | Digital monitoring system |
US20090102924A1 (en) * | 2007-05-21 | 2009-04-23 | Masten Jr James W | Rapidly Deployable, Remotely Observable Video Monitoring System |
US20100026809A1 (en) * | 2008-07-29 | 2010-02-04 | Gerald Curry | Camera-based tracking and position determination for sporting events |
KR20100104194A (en) | 2009-03-17 | 2010-09-29 | 엘지전자 주식회사 | Apparatus and method for controlling camera photographing |
US20100265394A1 (en) * | 2009-04-16 | 2010-10-21 | Sony Corporation | Image processing apparatus, image processing method, and recording medium |
US20100329582A1 (en) * | 2009-06-29 | 2010-12-30 | Tessera Technologies Ireland Limited | Adaptive PSF Estimation Technique Using a Sharp Preview and a Blurred Image |
US20120020524A1 (en) * | 2009-03-31 | 2012-01-26 | Nec Corporation | Tracked object determination device, tracked object determination method and tracked object determination program |
KR101111503B1 (en) | 2010-02-17 | 2012-02-22 | (주)서광시스템 | Apparatus for controlling Pan/Tilt/Zoom camera in omnidirectional and method for the same |
KR101136366B1 (en) | 2010-02-26 | 2012-04-18 | 주식회사 비츠로시스 | Auto object tracking system |
US20130070091A1 (en) * | 2011-09-19 | 2013-03-21 | Michael Mojaver | Super resolution imaging and tracking system |
KR20130031016A (en) | 2011-09-20 | 2013-03-28 | 주식회사 미동전자통신 | A method for saving of moving picture in car blackbox |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3704045B2 (en) * | 2001-01-15 | 2005-10-05 | 株式会社ニコン | Target object tracking device |
-
2014
- 2014-05-29 KR KR1020140065108A patent/KR102152725B1/en active IP Right Grant
- 2014-10-08 US US14/509,230 patent/US10021311B2/en active Active
- 2014-11-18 CN CN201410658081.4A patent/CN105245770B/en active Active
Patent Citations (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5111288A (en) * | 1988-03-02 | 1992-05-05 | Diamond Electronics, Inc. | Surveillance camera system |
US6215519B1 (en) * | 1998-03-04 | 2001-04-10 | The Trustees Of Columbia University In The City Of New York | Combined wide angle and narrow angle imaging system and method for surveillance and monitoring |
US6359647B1 (en) * | 1998-08-07 | 2002-03-19 | Philips Electronics North America Corporation | Automated camera handoff system for figure tracking in a multiple camera system |
KR20020015637A (en) | 2001-05-28 | 2002-02-28 | 최종욱 | System for automatic installation of detection area using panning, tilting and zooming transformation of cctv cameras |
US20040119819A1 (en) * | 2002-10-21 | 2004-06-24 | Sarnoff Corporation | Method and system for performing surveillance |
US20050073585A1 (en) * | 2003-09-19 | 2005-04-07 | Alphatech, Inc. | Tracking systems and methods |
US20060126737A1 (en) * | 2004-12-15 | 2006-06-15 | International Business Machines Corporation | Method, system and program product for a camera to track an object using motion vector data |
US20070291104A1 (en) * | 2006-06-07 | 2007-12-20 | Wavetronex, Inc. | Systems and methods of capturing high-resolution images of objects |
KR20080009878A (en) | 2006-07-25 | 2008-01-30 | 권현구 | Variable focusing camera system using manual focusing |
JP2008252331A (en) | 2007-03-29 | 2008-10-16 | Victor Co Of Japan Ltd | Digital monitoring system |
US20090102924A1 (en) * | 2007-05-21 | 2009-04-23 | Masten Jr James W | Rapidly Deployable, Remotely Observable Video Monitoring System |
US20100026809A1 (en) * | 2008-07-29 | 2010-02-04 | Gerald Curry | Camera-based tracking and position determination for sporting events |
KR20100104194A (en) | 2009-03-17 | 2010-09-29 | 엘지전자 주식회사 | Apparatus and method for controlling camera photographing |
US20120020524A1 (en) * | 2009-03-31 | 2012-01-26 | Nec Corporation | Tracked object determination device, tracked object determination method and tracked object determination program |
US20100265394A1 (en) * | 2009-04-16 | 2010-10-21 | Sony Corporation | Image processing apparatus, image processing method, and recording medium |
US20100329582A1 (en) * | 2009-06-29 | 2010-12-30 | Tessera Technologies Ireland Limited | Adaptive PSF Estimation Technique Using a Sharp Preview and a Blurred Image |
KR101111503B1 (en) | 2010-02-17 | 2012-02-22 | (주)서광시스템 | Apparatus for controlling Pan/Tilt/Zoom camera in omnidirectional and method for the same |
KR101136366B1 (en) | 2010-02-26 | 2012-04-18 | 주식회사 비츠로시스 | Auto object tracking system |
US20130070091A1 (en) * | 2011-09-19 | 2013-03-21 | Michael Mojaver | Super resolution imaging and tracking system |
KR20130031016A (en) | 2011-09-20 | 2013-03-28 | 주식회사 미동전자통신 | A method for saving of moving picture in car blackbox |
Non-Patent Citations (2)
Title |
---|
"Computer." Merriam-Webster.com. 2016. http://www.merriam-webster.com (Accessed on Jul. 6, 2016). * |
Mohammed A. Taha & Sharief F. Babiker, Object Video Tracking using a Pan-Tilt-Zoom System, 4 University of Khartoum Engineering J. 12, 13-19 (Feb. 2014). * |
Also Published As
Publication number | Publication date |
---|---|
US20150350556A1 (en) | 2015-12-03 |
CN105245770B (en) | 2018-11-06 |
KR20150137368A (en) | 2015-12-09 |
KR102152725B1 (en) | 2020-09-07 |
CN105245770A (en) | 2016-01-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10021311B2 (en) | Camera control apparatus | |
US11102417B2 (en) | Target object capturing method and device, and video monitoring device | |
JP4241742B2 (en) | Automatic tracking device and automatic tracking method | |
JP5656567B2 (en) | Video processing apparatus and method | |
JP6532229B2 (en) | Object detection apparatus, object detection system, object detection method and program | |
WO2014155979A1 (en) | Tracking processing device and tracking processing system provided with same, and tracking processing method | |
CN106780550B (en) | Target tracking method and electronic equipment | |
JP4558696B2 (en) | Automatic body tracking device | |
WO2018228413A1 (en) | Method and device for capturing target object and video monitoring device | |
CN105979143B (en) | Method and device for adjusting shooting parameters of dome camera | |
RU2668782C1 (en) | Device for detection of traffic light and method for detection of traffic light | |
CN102231798A (en) | Method for controlling PTZ (Pan/Tilt/Zoom) camera to zoom automatically and system thereof | |
JP6381313B2 (en) | Control device, control method, and program | |
JP2014089687A (en) | Obstacle detection apparatus and method for avm system | |
CN112365522A (en) | Method for tracking personnel in park across borders | |
CN106657777A (en) | Automatic focusing method and system for infrared thermal imager | |
WO2017101292A1 (en) | Autofocusing method, device and system | |
CN109685062A (en) | Target detection method, device, equipment and storage medium | |
JP5127692B2 (en) | Imaging apparatus and tracking method thereof | |
WO2016152316A1 (en) | Surveillance system, surveillance method, surveillance device, and surveillance device control program | |
JP2020088416A (en) | Electronic apparatus and control apparatus | |
JP2006033188A (en) | Supervisory apparatus and supervisory method | |
TWI453698B (en) | The method of automatic tracking of ball camera | |
CN102469247A (en) | Photographic device and dynamic focusing method thereof | |
CN107959767B (en) | Focusing and dimming method using television tracking result as guide |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG TECHWIN CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LEE, HYUN JIN;REEL/FRAME:033910/0064 Effective date: 20140930 |
|
AS | Assignment |
Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF Free format text: CHANGE OF NAME;ASSIGNOR:SAMSUNG TECHWIN CO., LTD.;REEL/FRAME:036205/0923 Effective date: 20150701 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD;REEL/FRAME:046927/0019 Effective date: 20180401 |
|
AS | Assignment |
Owner name: HANWHA AEROSPACE CO., LTD., KOREA, REPUBLIC OF Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE APPLICATION NUMBER 10/853,669. IN ADDITION PLEASE SEE EXHIBIT A PREVIOUSLY RECORDED ON REEL 046927 FRAME 0019. ASSIGNOR(S) HEREBY CONFIRMS THE CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:048496/0596 Effective date: 20180401 |
|
AS | Assignment |
Owner name: HANWHA TECHWIN CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HANWHA AEROSPACE CO., LTD.;REEL/FRAME:049013/0723 Effective date: 20190417 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |
|
AS | Assignment |
Owner name: HANWHA VISION CO., LTD., KOREA, REPUBLIC OF Free format text: CHANGE OF NAME;ASSIGNOR:HANWHA TECHWIN CO., LTD.;REEL/FRAME:064549/0075 Effective date: 20230228 |