
GB2499427A - Video tracking apparatus having two cameras mounted on a moveable unit - Google Patents


Info

Publication number
GB2499427A
GB2499427A (application GB1202692.8A / GB201202692A)
Authority
GB
United Kingdom
Prior art keywords
image acquisition
acquisition device
image
image data
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1202692.8A
Other versions
GB201202692D0 (en)
Inventor
David Watkins
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Overview Ltd
Original Assignee
Overview Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Overview Ltd filed Critical Overview Ltd
Priority to GB1202692.8A
Publication of GB201202692D0
Priority to PCT/GB2013/050368 (published as WO2013121215A1)
Publication of GB2499427A
Status: Withdrawn


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782 Systems for determining direction or deviation from predetermined direction
    • G01S3/785 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system, the desired condition being maintained automatically
    • G01S3/7864 T.V. type tracking systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B13/19608 Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and or velocity to predict its new position
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19617 Surveillance camera constructional details
    • G08B13/19632 Camera support structures, e.g. attachment means, poles

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An object tracking apparatus is provided comprising a moveable unit 106; a first image acquisition device, or camera, 102 mounted on the moveable unit; a second image acquisition device 104 mounted on the moveable unit and pointing in substantially the same direction as the first image acquisition device; and a control device 108. The control device is configured to determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information. The control means might also use information about the relative separation of the first and second cameras.

Description

Video tracking apparatus and method

Field of the Invention
The present invention relates to an apparatus for tracking an object and, in particular, to a video tracking apparatus.
Background of the Invention
Video tracking arrangements are known in which a PTZ (Pan/Tilt/Zoom) camera system follows a particular object, such as a person or a vehicle, and such arrangements are becoming popular for other applications, for example CCTV speed domes. These domes (or PTZ systems) track by referencing the area or object that is to be followed against its background, and this works well if the tracked area or object is small compared to the background. If this is not the case, for instance if a person is being tracked and the camera is zoomed in to see more of the person so that the background becomes relatively small, then there is not enough background to give meaningful reference points and the ability to track is lost.
The above problem can be remedied by using two cameras installed separately, alongside each other or some distance from each other: a static wide angle camera to give a reference frame, and a narrow angle (possibly zoom) PTZ camera whose tracking input commands are derived from the wide angle view. This system works well, but it depends on the distance between the cameras being known accurately, so calibration of the system on initial installation is necessary.
Speed dome solutions are known in which a 360° fish eye lens is suspended from the bottom of a viewing bubble and can provide tracking information to the dome PTZ system. This approach denies the view immediately beneath the dome to the PTZ camera, but the fish eye lens is close enough there to provide all the information needed; in any case it is unlikely that the area immediately beneath the dome will contain a target. Alternative implementations provide multiple cameras arranged around the periphery of the PTZ unit and merge the individual views into a single view, which can also provide tracking commands. An advantage of these solutions is that the wide angle view camera is aware of the whole scene, independently of the PTZ camera, which has a limited view.
It is desirable to provide a tracking apparatus that can be easily installed and used to track an object accurately.
Summary of the Invention
According to a first aspect of the invention, there is provided an object tracking apparatus comprising a moveable unit; a first image acquisition device mounted on the moveable unit; a second image acquisition device mounted on the moveable unit; and a control device configured to: determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information. Advantageously, the first and second acquisition devices of the apparatus will move together when tracking an object.
Preferably, the second image acquisition device is mounted on the moveable unit in a fixed position relative to the first image acquisition device. Advantageously, this avoids the need for calibration of the relative locations of the first and second image acquisition devices on installation.
The control means may be configured to determine the positional information from both image data received from the first image acquisition device, and the position of the first image acquisition device relative to the second image acquisition device.
The first image acquisition device and the second image acquisition device may be configured to point in substantially the same direction, or at the same scene. In essence, this means that the optical axes of the first and second image acquisition devices are substantially parallel.
The predefined field of view of the first image acquisition device may be wider than that of the second image acquisition device.
The field of view of the first image acquisition device may be less than 360°.
The first image acquisition device may comprise a zoom lens operable to adjust the first predefined field of view when tracking the object.
The control unit may be configured to generate control signals to control the zoom lens to adjust the field of view so as to maximise accuracy of tracking of the object.
The control means may be further configured to determine the positional information based on a known angle of motion of the movable unit.
The first image acquisition device and the second image acquisition device may be housed within a closed circuit television (CCTV) camera dome.
The second image acquisition device and the mounting unit may combine to form a pan tilt zoom (PTZ) camera.
The movable unit may comprise a pan and tilt gimbal mechanism.
The movable unit may comprise at least one motor controlled by the control unit.
The first and second image acquisition devices may be video acquisition devices, and the image data may comprise image data of frames of video data.
In one embodiment, one or other, or both, of the first and second image acquisition devices is a digital camera which comprises a lens having an optical axis, and an image acquisition element, such as a charge coupled device, onto which an image is focussed by the lens which, for the first image acquisition device, may be a zoom lens operable to adjust the first predefined field of view when tracking the object. The charge coupled device is connected to image processing circuitry in the camera which generates and outputs image data. For video, the image data may be output as a series of video frames, possibly encoded and compressed in a digital format, such as MPEG video. In an alternative embodiment, an analogue camera may be used.
According to a second aspect of the invention, there is provided a method for tracking objects, the method comprising: determining, from image data received from a first image acquisition device
mounted on a moveable unit, positional information of an object represented in the image data to be tracked; and outputting a control signal to the moveable unit to control its movement and cause a second image acquisition device mounted on the moveable unit to track the object in accordance with the determined positional information.
Determining positional information may comprise determining the positional information from both image data received from the first image acquisition device, and a position of the first image acquisition device relative to the second image acquisition device.
According to a third aspect of the invention, there is provided a computer-readable medium comprising computer-executable instructions which, when executed, cause a processor to perform the above method for tracking objects.
Brief Description of the Drawings
The invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic view of an apparatus according to an embodiment of the invention;
Figure 2a is a schematic view of an apparatus according to an embodiment of the invention;
Figure 2b is a schematic view of an apparatus according to an embodiment of the invention;
Figure 3 is a flow chart describing the steps performed when tracking an object according to an embodiment of the invention;

Figure 4 is a flow chart describing the step of determining whether an image contains a target according to an embodiment of the invention;

Figure 5a is a flow chart describing the step of determining whether an image contains a target according to an embodiment of the invention;
Figure 5b is a schematic view of an apparatus according to an embodiment of the invention; and

Figure 6 is a flow chart describing the steps performed when detecting motion of a target according to an embodiment of the invention.
Detailed Description of the Drawings
The invention is now described with reference to an exemplary embodiment, as depicted in figures 1 to 6.
Figure 1 is a schematic view of an apparatus 100 according to an embodiment of the invention. The apparatus 100 includes a first camera or image acquisition device 102, a second camera or image acquisition device 104, a movable unit 106 and a control unit 108. The first camera 102 and the second camera 104 are mounted on the movable unit 106, and the first camera 102, the second camera 104 and the movable unit 106 are electrically connected to the control unit 108. The first camera 102 and the second camera 104 point in substantially the same direction, or at the same scene. In essence, this means that the optical axes of the first and second image acquisition devices are substantially parallel.
In the embodiment of the invention depicted in figure 2, the movable unit 106 is a pan and tilt gimbal mechanism (or pan and tilt shaft). The pan and tilt gimbal mechanism 106 is operated by motor 202 and another motor (not shown) through belts (not shown).
In the disclosed embodiment, the first camera or image acquisition device 102 is a wide angle camera, or is a camera set to a wider zoom than the second camera or image acquisition device 104, which has a high optical zoom (e.g. greater than 18, 15, 10, 9, 8, 7, 6, 5, 4, 3 or 2 times zoom). The wide angle camera 102 and the zoom camera 104 are mounted on the same gimbal mechanism 106 so that both the wide angle camera 102 and the zoom camera 104 move by the same amount when the gimbal mechanism 106 moves.
Whilst the first camera 102 (the "wide angle" camera) and the second camera 104 (the "zoom" camera) are depicted as being of a similar size in figure 2, it will be appreciated that the wide angle camera 102 can be a different size to the zoom camera 104.
The control unit 108 is a personal computer or any unit comprising a processor configured to control the movable unit. The control unit 108 is connected to both cameras and to the movable unit by a wired and/or a wireless connection.
Operation of the apparatus in accordance with an embodiment of the invention is now described with reference to figure 3.
At step S300, a series of images or frames I0, ..., In is acquired by the wide angle camera 102. It will be understood that in what follows the term 'image' refers to image data, i.e. a numeric representation of a two-dimensional image, for example a bitmap image. Similarly, a series or sequence of images means a series of frames of image data which might make up a sequence of moving video represented in video data. An image comprising/including an object should be understood to mean that a subset of the image data represents the specified object. The acquired image data are provided to the control unit 108. The image data are provided to the control unit sequentially as they are received. Alternatively, multiple images are provided to the control unit at the same time. The image data may be provided to the control unit in a compressed form, in which case the control unit decompresses the image data before performing further processing.
At step S302, the control unit 108 determines whether any one of the received images or frames in the image or video data acquired from the first image acquisition device 102 comprises one or more targets to be tracked. A target may, for example, comprise a person, a vehicle, an animal etc. This step may be performed automatically or manually by a user of the apparatus, as discussed in more detail with respect to figures 4 and 5. If it is determined that none of the received images comprises a target to be tracked, the control unit 108 causes the gimbal mechanism 106 to return to a reference position or to remain at the current position, and processing returns to step S300. Alternatively, the control unit 108 may cause the gimbal mechanism 106 to move to any other position, for example a position specified by a user input.
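The acquire/detect/act cycle of figure 3 can be sketched as a single pass of a control loop. This is an illustrative outline only, not the patent's implementation; the four callables (`acquire`, `find_target`, `track`, `home`) are hypothetical stand-ins for the camera interface, the detection step and the gimbal commands.

```python
def tracking_loop(acquire, find_target, track, home):
    """One pass of the control loop: acquire a frame (S300), test it for
    a target (S302), then either track it (S304/S306) or return the
    gimbal to its reference position."""
    frame = acquire()
    target = find_target(frame)
    if target is None:
        home()              # no target: return to the reference position
        return "homed"
    track(target)           # determine motion and move the gimbal
    return "tracking"

# Stub run with no target present: the loop homes the gimbal.
log = []
state = tracking_loop(acquire=lambda: "frame",
                      find_target=lambda f: None,
                      track=lambda t: log.append("track"),
                      home=lambda: log.append("home"))
```

In a real system this pass would be repeated for every frame received from the wide angle camera.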
If an image or frame of the series of images or video data is determined to comprise a target to be tracked, processing proceeds to step S304, at which the motion of the target is determined. The motion of the target is determined using any motion detection means, such as (but not limited to) appearance based matching (i.e. colour, image gradients and other higher order visual features). The motion of the target is determined by comparing the relative position of the target in the first image Ii in which a target is identified (which should be considered to be a template window) to the relative position of the target in a subsequent image Ii+1 received from the wide angle camera 102. The step of determining the motion of the target is described in more detail with reference to figure 6.
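As an illustration only (the patent does not prescribe a specific matcher), comparing a template window between consecutive frames can be done with a brute-force sum-of-squared-differences search; a production system would more likely use appearance features or a library routine.

```python
import numpy as np

def locate_template(frame: np.ndarray, template: np.ndarray):
    """Exhaustive sum-of-squared-differences search: return the (row, col)
    where `template` best matches a window of `frame`."""
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(frame.shape[0] - th + 1):
        for c in range(frame.shape[1] - tw + 1):
            ssd = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

# Synthetic frames: a bright 2x2 target moves 1 px down and 3 px right.
f_i = np.zeros((12, 12)); f_i[2:4, 2:4] = 1.0
f_next = np.zeros((12, 12)); f_next[3:5, 5:7] = 1.0
template = f_i[2:4, 2:4]          # template window around the target in I_i
r, c = locate_template(f_next, template)
motion = (r - 2, c - 2)           # displacement between I_i and I_{i+1}
```

The displacement recovered here plays the role of the "determined target motion" passed to step S306.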
At step S306, the control unit 108 causes the movable unit 106 to move in accordance with the determined target motion in order to track the identified target. The control unit generates input commands, based on the determined motion, to tracking software which then causes the gimbal mechanism to move a required amount (e.g. by a specified angular distance) to ensure that the target is in view of the wide angle camera 102. The control unit instructs the gimbal mechanism to move such that the target is centred in the image acquired by the wide angle camera 102. In this manner the wide angle camera 102 tracks the motion of the target, whilst the zoom camera 104 acquires a detailed image of the target. If the target is determined to be stationary (i.e. the motion of the target is determined to be less than a threshold amount), the gimbal mechanism is maintained at a fixed position.
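One way to turn a pixel position into the "specified angular distance" for the gimbal is a pinhole-camera model. This is a geometric sketch under assumed fields of view, not the patent's control algorithm; the function name and parameter values are illustrative.

```python
import math

def offset_to_pan_tilt(target_px, frame_size, fov_deg):
    """Convert a target's pixel position into the pan/tilt corrections
    (in degrees) that would re-centre it, under a pinhole-camera model."""
    (x, y), (w, h) = target_px, frame_size
    hfov, vfov = fov_deg
    fx = (w / 2) / math.tan(math.radians(hfov) / 2)   # focal length in px
    fy = (h / 2) / math.tan(math.radians(vfov) / 2)
    pan = math.degrees(math.atan((x - w / 2) / fx))
    tilt = math.degrees(math.atan((y - h / 2) / fy))
    return pan, tilt

# A target at the frame centre needs no correction; one to the right of
# centre needs a positive pan.
centred = offset_to_pan_tilt((960, 540), (1920, 1080), (90.0, 60.0))
right = offset_to_pan_tilt((1440, 540), (1920, 1080), (90.0, 60.0))
```

Because both cameras share the gimbal, a correction computed from the wide angle image re-centres the zoom image as well.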
In a particular embodiment of the invention, the control unit implements a high resolution 2-axis motor control algorithm based on the determined motion of the object to drive the movable unit 106 so that the target is maintained in the view of both the mounted cameras.
As discussed above, both the wide angle camera 102 and the zoom camera 104 are mounted on the gimbal mechanism 106. Accordingly, movement of the gimbal mechanism to track the identified target based on the image received by the wide angle camera 102 ensures that the target is also in the view of the zoom camera 104.
The zoom camera 104 then acquires a narrow field image containing the identified object. This narrow field, or zoom, image provides a more detailed or closer view of the object to be tracked. This is because the zoom camera 104 is focused (or zoomed) to see more of the object and less of the background. Generally, this focussing or zooming is achieved through optical means, such as a zoom lens (see below), on the zoom camera 104, in comparison to a wider angle type lens on the wide angle camera 102. The wide angle camera 102 has a narrow field of view compared to a 360° fish eye lens camera. Additionally or alternatively, the wide angle camera 102 may comprise a zoom lens that can be used to adjust the field of view of the wide angle camera 102 for use at distance. The wide angle camera 102 therefore allows for accurate tracking of an object, even when the target is distant from the tracking apparatus 100.
In this manner, the control unit 108 can determine, from the image acquired by the wide angle camera 102, the background features that are stationary relative to the object being tracked. Based on this information, the control unit 108 can 'unwrap' the image and move the gimbal mechanism 106 in order to substantially centre the wide angle camera 102 and the zoom camera 104 on the identified target. The term 'unwrap' means that the control unit makes continual adjustments for the movement
of the reference background in the field of view of the first image acquisition device, and then searches for one or more reference points (new, similar or identical) so as to continue to implement the control process, as previous reference points disappear from the first image acquisition device's wider field of view.
Since the wide angle camera 102 and the zoom camera 104 are fixed to the same gimbal mechanism 106, calibration of the cameras can be performed during manufacture, thereby avoiding the need for calibration during installation of the tracking apparatus 100.
Step S302 is now described in more detail with reference to figures 4 and 5. In an embodiment of the invention, one or both of the methods described with reference to figures 4 and 5 are used to determine if the image received from the wide angle camera 102 comprises a target to be tracked.
At step S400 of figure 4, the control unit 108 determines whether a received image comprises an object to be tracked using image processing means, for example: single image object recognition algorithms, motion based background subtraction algorithms, edge detection, or feature extraction. Additionally or alternatively, the target may be identified in accordance with a user input, in which a display unit and input means (see below) are connected to the control unit 108 and the input means can be manipulated by a user to identify an object and generate object recognition data for the control unit, corresponding to the reference point, in the field of view of the first image acquisition device, of the object to be tracked. The object recognition data stores identifying characteristics of the object to be tracked and is continually applied to the image data from the zoom camera 104 to determine the object's position in the image represented by the image data. The control unit then outputs control signals to the movable unit 106 to cause the object to be centred in the image. This way, the object will be tracked as it moves across the scene which is being acquired by the wide angle camera 102.
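Of the automatic options listed, motion based background subtraction is the simplest to sketch: flag pixels that differ from a reference background image. This is a minimal illustration under assumed threshold values, not the patent's detection algorithm.

```python
import numpy as np

def detect_target(frame, background, threshold=0.5, min_pixels=4):
    """Simple background subtraction: flag pixels that differ from a
    reference background image and return the detection mask, or None
    if too few pixels changed to be a plausible target."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    mask = diff > threshold
    return mask if mask.sum() >= min_pixels else None

background = np.zeros((10, 10))
frame = background.copy()
frame[4:7, 4:7] = 1.0                      # a 3x3 intruder appears
mask = detect_target(frame, background)    # detection mask, 9 pixels set
no_target = detect_target(background, background)  # unchanged scene
```

A `None` result corresponds to the "no target" branch of step S302, where the gimbal returns to its reference position.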
In the embodiment of the invention described with reference to figure 5a, the apparatus 100 additionally comprises a display unit 500 and input means 502 connected to the control unit, as shown in figure 5b. The display unit 500 and the input means 502 are connected to the control unit via one or both of a wired and a wireless connection. The display unit comprises any means suitable for displaying image data, for example a monitor of a personal computer, a television screen or a touch screen monitor. The input means 502 comprises any means for inputting a user selection, e.g. a mouse, a keyboard, etc.
At step S500, the control unit 108 causes the initial image data I0 to be displayed on the display unit 500. At step S502, the control unit 108 receives an input from a user via the input means 502. This
input identifies a subset of the image data I0, the identified subset of data corresponding to the target to be tracked. The input may, for example, comprise a user using the input means to draw a box around the image data corresponding to the target, or using the input means to identify or select the image data corresponding to the target by some other means. Additionally or alternatively, the input may comprise the user visually identifying a target in the image data and using the input means to input a command to move the gimbal mechanism in order to centre the wide angle camera 102 on the identified target.
The control unit 108 determines (or causes a processor to determine) the motion of the target to be tracked as described in figure 6. At step S600, the control unit 108 identifies a reference object R0 in the background of the image data Ii. In this case, the background of the image data comprises image data other than the data identified as corresponding to the target. The reference object R0 is a subset of the background image data corresponding to a stationary identifiable object in the background of the image, for example data corresponding to a road, a kerb, a tree, a part of a building etc.
At step S602, the data corresponding to the target is separated or extracted from the background data of image Ii. This extraction is performed using a segmentation technique, e.g. appearance based matching based on colour, image gradients and/or other higher order features. At step S604 the location of the target relative to the reference object R0 is determined.
Steps S602 and S604 are then repeated using the subsequently received image data Ii+1. If it is determined that the subsequently received image data Ii+1 does not comprise a target, then processing continues at step S306, at which the control unit may cause the gimbal mechanism to remain stationary or to return to a central or initial position.
At S606, the motion of the target is determined from the difference between the relative position of the target to the reference object R0 in image data Ii+1 and the relative position of the target to the reference object R0 in image data Ii. The motion detection algorithm can additionally take into account physical constraints such as continuous motion (i.e. predicting where the target will be in the next frame given the previously observed motion), for example through the use of particle filtering or bootstrap techniques.
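The S606 computation reduces to a pair of vector subtractions. The sketch below (names and coordinates illustrative) shows why measuring against R0 matters: a shift common to the whole image, such as the camera panning between frames, cancels out, leaving only the target's own motion.

```python
import numpy as np

def relative_motion(target_i, ref_i, target_next, ref_next):
    """Target motion between two frames, measured relative to the
    stationary background reference object R0, so that any common shift
    of the whole image (e.g. camera movement) cancels out."""
    rel_i = np.subtract(target_i, ref_i)          # target - R0 in I_i
    rel_next = np.subtract(target_next, ref_next)  # target - R0 in I_i+1
    return tuple(int(v) for v in rel_next - rel_i)

# Between frames the camera pans by (5, 5), shifting target and R0 alike;
# the target itself also moves by (10, -2), which is what we recover.
motion = relative_motion(target_i=(120, 60), ref_i=(100, 50),
                         target_next=(135, 63), ref_next=(105, 55))
```

This is the quantity the control unit would feed into the S306 gimbal command, with filtering or prediction layered on top as described.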
The entire tracking apparatus 100 is enclosed in a housing 170, for example a CCTV speed dome 'bubble'. The control unit 108 may also be enclosed (partially, but preferably wholly) within the housing 170, and preferably attached to the movable unit 106 (e.g. gimbal mechanism). The control unit 108 may preferably be integrated into one or other of the image acquisition devices and be embedded with the image processing circuitry of the one or other image acquisition devices. When
the apparatus is placed inside a speed dome 'bubble', the wide angle camera view will be distorted because it will not be looking through a central axis of the bubble. However, as described, the image data acquired by the wide angle camera 102 is only required to allow the control unit 108 to determine tracking input (or commands) for causing the gimbal mechanism 106 to track the target. The image data acquired by the wide angle camera 102 is not required to provide a detailed view of the target because such a view is instead obtained by the image data acquired by the zoom camera 104. As described above, the control unit 108 only requires relative positional information in order to determine the movement required by the gimbal mechanism 106. The distortion of the image acquired by the wide angle camera 102 is constant and therefore does not prevent the control unit from accurately determining the required relative positional information.
The present invention has been described above in exemplary form with reference to the accompanying figures, which represent embodiments of the invention. It will be understood that there are further embodiments of the invention that also fall within the scope of the invention as defined by the following claims.

Claims (19)

  1. An object tracking apparatus comprising:
    a moveable unit;
    a first image acquisition device mounted on the moveable unit;
    a second image acquisition device mounted on the moveable unit; and
    a control device configured to:
    determine, from image data received from the first image acquisition device, positional information of an object represented in the image data to be tracked; and
    output a control signal to the moveable unit to control its movement and cause the second image acquisition device to track the object in accordance with the determined positional information.
  2. The apparatus of claim 1, wherein the second image acquisition device is mounted on the moveable unit in a fixed position relative to the first image acquisition device.
  3. The apparatus of claim 2, wherein the control means is configured to determine the position information from both image data received from the first image acquisition device, and the position of the first image acquisition device relative to the second image acquisition device.
    20
  4. 4. The apparatus of any one of the preceding claims, wherein the first image acquisition device and the second image acquisition device are configured to point in substantially the same direction towards a scene for image acquisition.
    25
  5. 5. The apparatus of any one of the preceding claims, wherein the first image acquisition device has a first predefined field of view, and the second image acquisition device has a second predefined field of view, the first predefined field of view being wider the second predefined field of view.
  6. 6. The apparatus of claim 5, wherein the first predefined field of view is less than 360°.
    30
    1.
    5
    10
    12
  7. 7. The apparatus of any one of the preceding claims, wherein the first image acquisition device comprises a zoom lens operable to adjust the first predefined field of view when tracking the object.
  8. 8. The apparatus of claim 7, wherein the control unit is configured to generate control signals to 5 control the zoom lens to adjust the field of view so as to maximise accuracy of tracking of the object.
  9. 9. The apparatus of any of the preceding claims, wherein the control means is further configured to determine the positional information based on a known angle of motion of the movable unit.
  10. 10 10. The apparatus of any of the preceding claims, wherein the first image acquisition device and the second image acquisition device are housed within a closed circuit camera (CCTV) dome.
  11. 11. The apparatus of any of the preceding claims, wherein a combination of the second image acquisition device and the mounting unit is a pan tilt zoom (PTZ) camera.
    15
  12. 12. The apparatus of any of the preceding claims, wherein the movable unit comprises a pan and tilt gimbal mechanism.
  13. 13. The apparatus of any one of the preceding claims, wherein the movable unit comprises at 20 least one motor controlled by the control unit.
  14. 14. The apparatus of any one of the preceding claims, wherein the first and second image acquisition devices are video acquisition devices, and the image data comprises image data of frames of video data.
    25
  15. 15. A method for tracking objects, comprising:
    determining, from image data received from a first image acquisition device mounted on a moveable unit, positional information of an object represented in the image data to be tracked; and outputting a control signal to the moveable unit to control its movement and cause a second 30 image acquisition device mounted on the moveable unit to track the object in accordance with the determined positional information.
    13
  16. 16. The method of claim 15, wherein determining positional information comprises determining the positional information from both image data received from the first image acquisition device, and a position of the first image acquisition device relative to the second image acquisition device.
    5
  17. 17. A computer-readable medium comprising computer-executable instructions which, when executed, cause a processor to perform the steps of claim 15 or claim 16.
  18. 18. An apparatus substantially as hereinbefore described with reference to any one of figures 1 to 10 6.
  19. 19. A method substantially as hereinbefore described with reference to any one of figures 1 to 6.
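The two steps of the method of claim 15 (determining positional information, then outputting a movement command) can be sketched as a single iteration of a control loop. The `Gimbal` class and the detection tuple below are hypothetical stand-ins introduced for illustration; the patent does not prescribe any particular software interface.

```python
# Hedged sketch of the method of claim 15. The Gimbal class and the
# detection tuple are hypothetical stand-ins, not apparatus from the patent.

class Gimbal:
    """Minimal stand-in for the moveable unit carrying both cameras."""

    def __init__(self):
        self.pan = 0.0
        self.tilt = 0.0

    def move_by(self, dpan, dtilt):
        # Apply a relative movement command (the "control signal").
        self.pan += dpan
        self.tilt += dtilt


def track_step(detection, gimbal):
    """One iteration: take positional information derived from the first
    (wide angle) camera's image data and output a movement command so the
    second (zoom) camera, mounted on the same unit, follows the object."""
    if detection is None:
        return False  # no object found in this frame
    dpan, dtilt = detection  # relative angles from the frame centre
    gimbal.move_by(dpan, dtilt)
    return True


g = Gimbal()
track_step((5.0, -2.0), g)
print(g.pan, g.tilt)  # → 5.0 -2.0
```

Because both cameras are mounted on the same moveable unit, a single movement command re-aims them together, which is why the loop never needs an absolute position for the object.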
GB1202692.8A 2012-02-16 2012-02-16 Video tracking apparatus having two cameras mounted on a moveable unit Withdrawn GB2499427A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1202692.8A GB2499427A (en) 2012-02-16 2012-02-16 Video tracking apparatus having two cameras mounted on a moveable unit
PCT/GB2013/050368 WO2013121215A1 (en) 2012-02-16 2013-02-15 Video tracking apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1202692.8A GB2499427A (en) 2012-02-16 2012-02-16 Video tracking apparatus having two cameras mounted on a moveable unit

Publications (2)

Publication Number Publication Date
GB201202692D0 GB201202692D0 (en) 2012-04-04
GB2499427A true GB2499427A (en) 2013-08-21

Family

ID=45939740

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1202692.8A Withdrawn GB2499427A (en) 2012-02-16 2012-02-16 Video tracking apparatus having two cameras mounted on a moveable unit

Country Status (2)

Country Link
GB (1) GB2499427A (en)
WO (1) WO2013121215A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104580687A (en) * 2013-10-11 2015-04-29 LG Electronics Inc. Mobile terminal and controlling method thereof
EP3328057A1 (en) * 2016-11-29 2018-05-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd Camera assembly, method for portrait tracking based on the same, and electronic device

Families Citing this family (8)

Publication number Priority date Publication date Assignee Title
WO2016007962A1 (en) 2014-07-11 2016-01-14 ProSports Technologies, LLC Camera feed distribution from event venue virtual seat cameras
US9760572B1 (en) 2014-07-11 2017-09-12 ProSports Technologies, LLC Event-based content collection for network-based distribution
WO2016007965A1 (en) 2014-07-11 2016-01-14 ProSports Technologies, LLC Ball tracker camera
US9571903B2 (en) 2014-07-11 2017-02-14 ProSports Technologies, LLC Ball tracker snippets
US9655027B1 (en) 2014-07-11 2017-05-16 ProSports Technologies, LLC Event data transmission to eventgoer devices
US9729644B1 (en) 2014-07-28 2017-08-08 ProSports Technologies, LLC Event and fantasy league data transmission to eventgoer devices
US9699523B1 (en) 2014-09-08 2017-07-04 ProSports Technologies, LLC Automated clip creation
CN107749978A (en) * 2017-12-15 2018-03-02 苏州雷目电子科技有限公司 A real-time data monitoring device based on wireless transmission

Citations (3)

Publication number Priority date Publication date Assignee Title
EP1765014A2 (en) * 2005-09-20 2007-03-21 Fujinon Corporation Surveillance camera apparatus and surveillance camera system
WO2009142332A1 (en) * 2008-05-23 2009-11-26 Advas Co., Ltd. Hybrid video camera system
EP2328341A1 (en) * 2008-08-20 2011-06-01 Tokyo Institute of Technology Long-distance target detection camera system

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JPH10210506A (en) * 1997-01-22 1998-08-07 Sony Corp Three-dimensional image information input device and three-dimensional image information input output device
US6215519B1 (en) * 1998-03-04 2001-04-10 The Trustees Of Columbia University In The City Of New York Combined wide angle and narrow angle imaging system and method for surveillance and monitoring
CN1219397C (en) * 2002-10-22 2005-09-14 张晓林 Bionic automatic vision and sight control system and method


Cited By (5)

Publication number Priority date Publication date Assignee Title
CN104580687A (en) * 2013-10-11 2015-04-29 LG Electronics Inc. Mobile terminal and controlling method thereof
EP2860961B1 (en) * 2013-10-11 2018-08-01 Lg Electronics Inc. Mobile terminal and controlling method thereof
CN104580687B (en) * 2013-10-11 2020-01-14 LG Electronics Inc. Mobile terminal and control method thereof
EP3328057A1 (en) * 2016-11-29 2018-05-30 Guangdong Oppo Mobile Telecommunications Corp., Ltd Camera assembly, method for portrait tracking based on the same, and electronic device
US10937184B2 (en) 2016-11-29 2021-03-02 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Camera assembly, method for tracking target portion based on the same, and electronic device

Also Published As

Publication number Publication date
WO2013121215A1 (en) 2013-08-22
GB201202692D0 (en) 2012-04-04

Similar Documents

Publication Publication Date Title
GB2499427A (en) Video tracking apparatus having two cameras mounted on a moveable unit
JP4915655B2 (en) Automatic tracking device
US9210336B2 (en) Automatic extraction of secondary video streams
JP4699040B2 (en) Automatic tracking control device, automatic tracking control method, program, and automatic tracking system
EP2851880B1 (en) Method, system and storage medium for controlling an image capture device, in particular of a door station
KR101530255B1 (en) Cctv system having auto tracking function of moving target
CN114745498B (en) Method and system for capturing subregions and notifying whether altered by camera movement
US9256324B2 (en) Interactive operation method of electronic apparatus
JP3440916B2 (en) Automatic tracking device, automatic tracking method, and recording medium recording automatic tracking program
KR20160062880A (en) road traffic information management system for g using camera and radar
US11037013B2 (en) Camera and image processing method of camera
CN103688292A (en) Image display apparatus and image display method
CN101568020A (en) Camera system, control device and method
WO2012054830A1 (en) Method and system of video object tracking
JP4575829B2 (en) Display screen position analysis device and display screen position analysis program
KR20130130544A (en) Method and system for presenting security image
KR101096157B1 (en) watching apparatus using dual camera
JP2012003364A (en) Person movement determination device and program for the same
CN112514366A (en) Image processing method, image processing apparatus, and image processing system
JP2019149621A (en) Information processing device, information processing method, and program
CN103168461A (en) Device for assisting focusing of a camera
EP3557386B1 (en) Information processing device and information processing method
KR101738514B1 (en) Monitoring system employing fish-eye thermal imaging camera and monitoring method using the same
JP2006244272A (en) Hand position tracking method, device and program
KR101272631B1 (en) Apparatus for detecting a moving object and detecting method thereof

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)