US20110234825A1 - Accelerometer / gyro-facilitated video stabilization - Google Patents
- Publication number
- US20110234825A1 (application No. US12/755,620)
- Authority
- US
- United States
- Prior art keywords
- motion
- frame
- threshold
- detector
- video sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
Abstract
Description
- The present application claims the benefit of U.S. Provisional application Ser. No. 61/317,642, filed Mar. 25, 2010, entitled “Accelerometer/Gyro-Facilitated Video Stabilization,” the disclosure of which is incorporated herein by reference in its entirety.
- Video stabilization is a class of video processing that removes unwanted shakiness from videos captured from portable camera devices such as smart phones, personal entertainment systems, laptop computers and/or camcorders. The goal of video stabilization is to revise an original video sequence to mimic a sequence that would have been obtained if a camera captured the video from an ideal or a specified motion trajectory. Specifically, video stabilization techniques generate an idealized motion vector of a captured video sequence and then introduce motion compensation to a sequence of captured video to replicate the idealized motion vector. If, for example, a video stabilization algorithm estimated that a video sequence should exhibit no motion (e.g., ideally a camera would have been perfectly still during motion capture), then the motion compensation processes would estimate a global motion vector on each frame and perform processes to remove the global motion. Although video stabilization can improve the perceptual quality of a video sequence, it has consequences. First, it can consume considerable resources at a capture device or processing device. Second, it can reduce the field of view of the final video sequence. Third, and perhaps most importantly, video stabilization can impair perceived quality if the algorithm generates an incorrect estimate of idealized motion or an incorrect estimate of the source motion vector.
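- To make the mechanics concrete, the following sketch (not taken from the patent) shows how per-frame motion compensation might be applied once a global motion vector has been estimated for each frame; the frame representation, the helper name, and the zero-motion target trajectory are illustrative assumptions. The crop at the end also illustrates the reduced field of view noted above.

```python
import numpy as np

def stabilize_to_still_camera(frames, frame_motion, margin=16):
    """Shift each frame so that the accumulated camera motion is cancelled.

    frames       : list of H x W (or H x W x C) numpy arrays
    frame_motion : per-frame (dx, dy) global motion vectors, each measured
                   from the preceding frame
    margin       : pixels cropped from every edge to hide the shifted border
    """
    # Cumulative camera trajectory; the idealized trajectory here is "no motion".
    path = np.cumsum(np.asarray(frame_motion, dtype=float), axis=0)
    stabilized = []
    for frame, (cx, cy) in zip(frames, path):
        dx, dy = int(round(-cx)), int(round(-cy))            # correction offset
        shifted = np.roll(frame, shift=(dy, dx), axis=(0, 1))
        stabilized.append(shifted[margin:-margin, margin:-margin])
    return stabilized
```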
- Increasingly, consumer capture devices are provisioned with motion detection devices such as accelerometers and/or gyroscopes. The motion detection devices can provide metadata that indicates motion of a camera during video capture. However, even though the motion detectors provide data relating to global motion of the camera, the level of shakiness between frames often is comparable to the noise level of the motion detector data. Such a high noise level prevents direct use of accelerometer data in video stabilization.
- FIG. 1 is a simplified block diagram of a portable video capture device according to an embodiment of the present invention.
- FIG. 2 is a flow diagram illustrating data flow according to an embodiment of the present invention.
- FIG. 3 is a functional block diagram of a processing system according to an embodiment of the present invention.
- FIGS. 4-5 are graphs illustrating exemplary motion data according to an embodiment of the present invention.
- FIG. 6 is a functional block diagram of a processing system according to another embodiment of the present invention.
- Embodiments of the present invention provide a control system for video processes that selectively controls the operation of motion stabilization processes. According to the present invention, motion sensor data indicative of motion of a mobile device may be received and processed. A determination may be made by comparing processed motion sensor data to a threshold. Based on the determination, motion stabilization may be suspended on select portions of a captured video sequence.
- FIG. 1 is a simplified block diagram of a portable video device 100 according to an embodiment of the present invention. The video device 100 may include a processing system 110, a camera 120, and a motion detector 130 such as an accelerometer or gyroscope. The processing system 110 may include various microprocessors and memory systems (not shown) to execute an operating system of the video device, to manage device operations and to store and process video data captured by the camera 120. The camera 120 may include a lens system and imaging device to convert incident light into a digital data stream. The motion detector 130 may generate electrical data indicating a direction of movement of the mobile device.
- FIG. 1 further illustrates the mobile device 100 as having a transceiver 140 and/or communication port 150 to exchange data between the mobile device 100 and a remote device (not shown). The transceiver 140 may support communication between the mobile device 100 and the remote device by a wireless communication link, such as those formed by 3G, 4G, Wi-Fi or Wi-MAX communication networks. The communication port 150 may support communication between the mobile device 100 and a remote device by wired communication links, for example, those provided by a Universal Serial Bus (USB) communication link.
- FIG. 2 is a process flow diagram 200 illustrating data flow according to an embodiment of the present invention. In this embodiment, captured video data 210 and motion data 220 may be processed by a global motion estimation stage 230, a motion detector processing stage 240, a scene change detection stage 250, a motion smoothing stage 260 and a motion stabilization stage 270.
- In the global motion estimation stage 230, a video device may calculate motion of video content on a frame-by-frame basis across a field of view. The global motion estimation stage 230 may output metadata identifying, for each frame in the captured video, a motion vector representing average motion of the frame, measured from a preceding frame. The motion estimate metadata may be output to the motion smoothing stage 260 and the scene change detection stage 250.
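- The patent does not prescribe a particular estimator for this stage; one common approach, an exhaustive translational search over downsampled grayscale frames, might look like the sketch below (the search range, downsampling step, and SAD cost are illustrative choices, not requirements of the patent).

```python
import numpy as np

def global_motion(prev, curr, search=8, step=4):
    """Estimate a single (dx, dy) translation of curr measured from prev.

    prev, curr : 2-D grayscale frames (numpy arrays of equal shape)
    search     : maximum displacement tested in each direction, in pixels
    step       : downsampling factor that keeps the exhaustive search cheap
    """
    a = prev[::step, ::step].astype(np.float32)
    b = curr[::step, ::step].astype(np.float32)
    h, w = a.shape
    r = max(1, search // step)
    best, best_cost = (0, 0), np.inf
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            # Overlapping region of prev and the shifted curr for this candidate.
            py0, py1 = max(0, -dy), min(h, h - dy)
            px0, px1 = max(0, -dx), min(w, w - dx)
            if py1 <= py0 or px1 <= px0:
                continue
            cost = np.abs(a[py0:py1, px0:px1] -
                          b[py0 + dy:py1 + dy, px0 + dx:px1 + dx]).mean()
            if cost < best_cost:
                best_cost, best = cost, (dx * step, dy * step)
    return best   # motion vector in full-resolution pixels
```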
- The motion smoothing stage 260 may generate new motion vectors for each frame according to average motion observable in the motion vectors output from the global motion estimation stage 230. For example, the motion smoothing stage 260 may generate motion vectors for each frame i representing an average of multiple motion vectors (say, 10 frames) from the global motion estimation stage 230 surrounding and including frame i. Alternatively, the motion smoothing stage 260 may generate motion vectors representing a low pass filtering of multiple motion vectors from the global motion estimation stage 230 (again, perhaps 10 frames). Motion smoothing helps remove jitter and other high frequency artifacts from the motion vectors output by the global motion estimation stage 230. The motion smoothing stage 260 may output motion vectors to the motion stabilization stage 270.
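- A minimal sketch of the averaging variant described above, assuming the per-frame global motion vectors are collected into an N x 2 array; the 10-frame window mirrors the example figure in the text.

```python
import numpy as np

def smooth_motion(motion_vectors, window=10):
    """Moving-average smoothing of per-frame global motion vectors.

    motion_vectors : N x 2 array of (dx, dy) vectors from global motion estimation
    window         : number of neighboring frames averaged around each frame i
    """
    mv = np.asarray(motion_vectors, dtype=float)
    half = window // 2
    smoothed = np.empty_like(mv)
    for i in range(len(mv)):
        lo, hi = max(0, i - half), min(len(mv), i + half + 1)
        smoothed[i] = mv[lo:hi].mean(axis=0)   # vectors surrounding and including frame i
    return smoothed
```

A low-pass FIR filter could be substituted for the boxcar average without changing the structure of the loop.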
- The motion detector processing stage 240 may receive motion data from a motion detector device. The motion detector outputs motion data at a rate in excess of one sample per video frame. In some implementations, motion detector samples may be output erratically to the motion detector processing stage 240; some frames may have a relatively large number of motion detector samples provided therefor whereas other frames may have a relatively small number of samples (or none at all). The motion detector processing stage 240 may aggregate and normalize samples on a per-frame basis to generate a motion value per frame. The motion data may be output from the motion detector processing stage 240 to the scene change detection stage 250.
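- One possible realization of the per-frame aggregation is sketched below; the timestamped sample format and the nominal frame rate are assumptions, and frames that receive no samples are simply marked for later treatment (see the normalization discussion that follows).

```python
def motion_value_per_frame(samples, num_frames, frame_rate=30.0):
    """Aggregate irregular motion-detector samples into one value per frame.

    samples    : iterable of (timestamp_seconds, magnitude) pairs, with time
                 measured from the start of capture
    num_frames : number of video frames in the captured sequence
    frame_rate : nominal capture rate used to bucket samples by frame index
    """
    buckets = [[] for _ in range(num_frames)]
    for t, magnitude in samples:
        idx = int(t * frame_rate)
        if 0 <= idx < num_frames:
            buckets[idx].append(magnitude)
    # Average the samples that landed in each frame; None marks frames with no samples.
    return [sum(b) / len(b) if b else None for b in buckets]
```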
- The scene change detector 250 may selectively enable and disable operation of the motion stabilization stage 270 based on motion values provided by the motion detector processing stage 240. The scene change detector 250 may identify region(s) of a captured video sequence for which the camera was moving so fast that the camera effectively was in a scene change. During such times, the scene change detector 250 may disable operation of the motion stabilization stage 270. Optionally, as part of its processing, the scene change detector 250 may make scene change decisions based on motion vectors output by the global motion estimation stage 230 or the motion detector processing stage 240.
- FIG. 3 is a functional block diagram of a processing system 300 for controlling video stabilization processes according to an embodiment of the present invention. FIG. 3 illustrates a motion detector processing stage 310, a scene change detector 320 and a video coder 330. The motion detector processing stage 310 may include an accumulator 312 that receives motion samples from a motion detector, such as a gyroscope or accelerometer. The accumulator 312 may output accumulated motion values ACC to a normalization unit 314. The accumulator 312 may be cleared at the onset of each new frame. The motion detector processing stage 310 may output normalized motion values ACC to the scene change detector 320.
- The scene change detector 320 may include a comparator 322 and a codec controller 324. The comparator 322 may compare normalized ACC values from the motion detector processor 310 to a predetermined threshold. It may output a signal representing results of the comparison to the video coder 330 and, specifically, to the video stabilization unit 332. In an embodiment, when the normalized ACC values exceed the threshold, the codec controller 324 may disable the video stabilization unit 332. Optionally, when the codec controller 324 disables the video stabilization unit 332, it may keep the video stabilization unit 332 disabled thereafter for at least a predetermined number of frames (say, 6 frames).
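- The comparator-plus-hold-off behavior described for the codec controller 324 might be implemented roughly as follows; the class and method names are illustrative, and the 6-frame hold mirrors the example in the text.

```python
class StabilizationController:
    """Comparator plus hold-off: stabilization is disabled whenever the
    per-frame ACC value exceeds the threshold, and stays disabled for at
    least `hold_frames` subsequent frames."""

    def __init__(self, threshold, hold_frames=6):
        self.threshold = threshold
        self.hold_frames = hold_frames
        self._remaining = 0

    def stabilization_enabled(self, acc_value):
        """Return True if the video stabilization unit should run on this frame."""
        if acc_value > self.threshold:
            self._remaining = self.hold_frames   # disable now and for the next hold_frames frames
            return False
        if self._remaining > 0:
            self._remaining -= 1
            return False
        return True
```

Per-frame decisions would then be obtained by calling stabilization_enabled() once for each normalized ACC value in capture order.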
- FIG. 3 illustrates a video coder 330 that includes a video stabilization unit 332 and a motion smoothing unit 334. The motion smoothing unit 334 may receive motion samples from the motion detector or, optionally, from the motion detector processor 310 in addition to motion vectors from a global estimation processor (not shown). The motion smoothing unit 334 may output revised motion vectors to the video stabilization unit 332 for use in stabilization processing. In an embodiment, the motion smoothing unit 334 may perform a weighted average of motion vectors from a global estimation processor and from the motion detector processor 310 to generate revised motion vectors for the video stabilization unit 332.
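- The weighted combination mentioned above could be as simple as the sketch below; the weight is an assumed tuning parameter, and both inputs are taken to be per-frame (dx, dy) vectors expressed in the same units.

```python
import numpy as np

def fuse_motion(global_mv, sensor_mv, weight=0.7):
    """Weighted average of image-based and sensor-based per-frame motion vectors.

    global_mv : N x 2 array of vectors from the global motion estimator
    sensor_mv : N x 2 array of vectors derived from accelerometer/gyro samples
    weight    : confidence placed on the image-based estimate (0..1)
    """
    g = np.asarray(global_mv, dtype=float)
    s = np.asarray(sensor_mv, dtype=float)
    return weight * g + (1.0 - weight) * s
```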
- FIG. 3 presents a simplified video coder 330, illustrating only the blocks 332, 334 that are material to the present discussion. The video coder 330 may include other processing blocks to code input video data. For example, a video coder may operate according to any of a plurality of known video compression algorithms that exploit temporal and spatial redundancies in video data to reduce the bandwidth of a video signal. The compression algorithms may be lossy or lossless. In one implementation, the video compression algorithms may perform motion-compensated prediction in combination with spatial transforms, such as discrete cosine transforms or wavelet decomposition. The known H.263 and H.264 families of video coders are examples of such modern algorithms. Further coding efficiencies may be obtained by performing entropy coding of resulting data. Video coders also can perform various pre-processing operations to adapt input video data for compression operations, including video stabilization among others. The embodiments of the present invention find application with such video coders.
- The operation of the normalization unit 314 may be tailored to fit the implementation of the motion detector. In some applications, for example, data may be read from the motion detector via an operating system service executing on a processing system at a mobile device. In such an embodiment, motion detector data may be provided to the motion detector processor 310 on an irregular basis. Each frame may have a different number of motion samples associated with it. Some frames may have a relatively larger number of samples associated with them whereas other frames may have many fewer associated samples, possibly no samples at all. Accordingly, the normalization unit may perform a variety of processes to generate uniform ACC values for the scene change detector 320.
- In one embodiment, the normalization unit 314 may perform a low pass filter over samples available in each frame. Equivalently, the normalization unit 314 may average samples presented to the motion detector processor 310 in each frame. The normalization unit 314 further may determine whether motion samples are missing entirely from individual frames and, in such an event, the normalization unit 314 may interpolate an ACC value from ACC values of neighboring frames.
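- Continuing the aggregation sketch given earlier, missing ACC values might be filled from neighboring frames along the following lines (a hypothetical helper, not the patent's implementation).

```python
def interpolate_missing(acc_values):
    """Fill None entries (frames with no sensor samples) by linear interpolation
    between the nearest neighboring frames that do have values."""
    known = [i for i, v in enumerate(acc_values) if v is not None]
    if not known:
        return list(acc_values)
    filled = list(acc_values)
    for i, v in enumerate(filled):
        if v is not None:
            continue
        left = max((k for k in known if k < i), default=None)
        right = min((k for k in known if k > i), default=None)
        if left is None:
            filled[i] = filled[right]            # no earlier frame: copy backward from the right
        elif right is None:
            filled[i] = filled[left]             # no later frame: carry the last value forward
        else:
            t = (i - left) / (right - left)
            filled[i] = filled[left] * (1 - t) + filled[right] * t
    return filled
```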
- In a hardware environment or other implementation where the motion detector processor 310 receives a regular number of motion detection samples on each frame such that normalization processes are not required, the normalization unit 314 may be omitted.
- FIG. 4 illustrates a graph of exemplary motion change values over time. As illustrated, the motion change values exceed the predetermined threshold for all frames prior to time t0. At time t0, the motion change values fall below the motion threshold TH. Accordingly, the codec controller 324 (FIG. 3) may engage the video stabilization unit 332 of the video coder 330.
- After time t0, the motion change values are lower than the threshold TH for all frames until time t1, at which time the motion change value exceeds TH. Due to the latency of the codec controller 324, it may maintain the video stabilization unit 332 in an active mode for at least N frames. After time t1, the motion change value drops below the threshold TH again, resetting the controller. The motion change values exceed the threshold at times t2 and t3 but, again, drop below the threshold TH within the N-frame window that starts at each time. Thereafter, the motion change values remain lower than the threshold TH for the remainder of the time interval shown in FIG. 4. In this example, because the motion change values did not exceed the threshold TH and remain above it for at least N frames at any time following t0, the video stabilization unit remained engaged for all times following t0.
- Optionally, a codec controller may employ an N-frame latency any time the ACC values cross the TH threshold. FIG. 5 illustrates such operation with the same set of exemplary motion data as is illustrated in FIG. 4. As illustrated, the motion change values exceed the predetermined threshold for all frames prior to time t0. At time t0, the motion change values fall below the motion threshold TH. In this embodiment, the codec controller (FIG. 3) may engage the video stabilization unit 332 if the motion values remain under the TH threshold for at least N frames. Thus, the video stabilization unit 332 would be engaged at time t4.
- The motion change values exceed the threshold at times t1, t2 and t3 but, again, drop below the threshold TH within the N-frame window that starts at each time. Thereafter, the motion change values remain lower than the threshold TH for the remainder of the time interval shown in FIG. 5. Thus, the video stabilization unit 332 may remain engaged for all times following t0.
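- The symmetric N-frame latency illustrated by FIG. 5 might be modeled as the small state machine below: the stabilization unit toggles only after the motion value has stayed on one side of the threshold TH for at least N consecutive frames (the threshold, N, and the initial state are placeholders).

```python
class LatchedStabilizationController:
    """Toggles stabilization only after the motion value has stayed on one
    side of the threshold TH for at least `latency` consecutive frames."""

    def __init__(self, threshold, latency):
        self.threshold = threshold
        self.latency = latency
        self.enabled = False      # assume stabilization starts disengaged
        self._run = 0             # consecutive frames arguing for a state change

    def update(self, motion_value):
        wants_enabled = motion_value <= self.threshold
        if wants_enabled != self.enabled:
            self._run += 1
            if self._run >= self.latency:
                self.enabled = wants_enabled   # crossing persisted for N frames
                self._run = 0
        else:
            self._run = 0                      # brief crossing: ignore it
        return self.enabled
```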
- FIG. 6 is a functional block diagram of a processing system 600 for controlling video stabilization processes according to another embodiment of the present invention. FIG. 6 illustrates a motion detector processing stage 610, a scene change detector 620, a video coder 630 and a global motion estimator 640. The motion detector processing stage 610 may include an accumulator 612 that receives motion samples from a motion detector, such as a gyroscope or accelerometer. The accumulator may output accumulated motion values ACC to a normalization unit 614. The motion detector processing stage 610 may output normalized motion values ACC to the scene change detector 620.
- The scene change detector 620 may include a pair of comparators 622, 626 and a codec controller 624. A first comparator 622 may compare normalized ACC values from the motion detector processor 610 to a first threshold TH1. It may output a signal representing results of the comparison to the codec controller 624. The second comparator 626 may compare motion vectors from the global motion estimator to a second threshold TH2. It may output a second signal representing results of this comparison to the controller 624. The codec controller 624 may disable the video stabilization unit 632 based on these comparisons. For example, the codec controller 624 may disable the video stabilization unit 632 when either of these two comparison signals indicates motion has exceeded its respective threshold. Alternatively, the codec controller 624 may disable the video stabilization unit 632 when both of the comparison signals indicate motion has exceeded the thresholds. Optionally, when the codec controller 624 disables the video stabilization unit 632, it may keep the video stabilization unit 632 disabled thereafter for at least a predetermined number of frames (say, 6 frames).
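- The two-comparator decision of FIG. 6 might be sketched as follows; TH1, TH2, and the choice of OR versus AND combination are parameters, and the hold-off mirrors the 6-frame example in the text.

```python
def stabilization_decisions(acc_values, global_motion_mags, th1, th2,
                            hold_frames=6, require_both=False):
    """Per-frame enable/disable decisions for the video stabilization unit.

    acc_values         : normalized ACC values from the motion detector processor
    global_motion_mags : magnitudes of motion vectors from the global motion estimator
    th1, th2           : thresholds applied to the two inputs, respectively
    require_both       : True disables only when BOTH comparators fire (AND);
                         False disables when EITHER fires (OR)
    hold_frames        : frames stabilization stays disabled after a trigger
    """
    decisions, remaining = [], 0
    for acc, gmv in zip(acc_values, global_motion_mags):
        c1, c2 = acc > th1, gmv > th2
        triggered = (c1 and c2) if require_both else (c1 or c2)
        if triggered:
            remaining = hold_frames
            decisions.append(False)
        elif remaining > 0:
            remaining -= 1
            decisions.append(False)
        else:
            decisions.append(True)
    return decisions
```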
- FIG. 6 illustrates a video coder 630 that includes a video stabilization unit 632 and a motion smoothing unit 634. The motion smoothing unit 634 may receive motion samples from the motion detector or, optionally, from the motion detector processor 610 in addition to motion vectors from the global estimation processor 640 (connection not shown). The motion smoothing unit 634 may output revised motion vectors to the video stabilization unit 632 for use in stabilization processing. Optionally, revised motion vectors may be output to the second comparator 626 in lieu of the motion vectors from the global estimation processor 640. In an embodiment, the motion smoothing unit 634 may perform a weighted average of motion vectors from a global estimation processor and from the motion detector processor 610 to generate revised motion vectors for the video stabilization unit 632.
- As in FIG. 3, FIG. 6 presents a simplified video coder 630, illustrating only the blocks 632, 634 that are material to the present discussion. The video coder 630 may include other processing blocks (not shown) to code input video data and reduce the bandwidth of the video signal.
- As discussed above, the foregoing embodiments provide a coding/control system that estimates motion of a video capture device, estimates the presence of scene changes in video and selectively engages video stabilization processes based thereon. The techniques described above find application in both software- and hardware-based control systems. In a software-based control system, the functional units described hereinabove may be implemented on a computer system (commonly, a server, personal computer or mobile computing platform) executing program instructions corresponding to the functional blocks and methods listed above. The program instructions themselves may be stored in a storage device, such as an electrical, optical or magnetic storage medium, and executed by a processor of the computer system. In a hardware-based system, the functional blocks illustrated above may be provided in dedicated functional units of processing hardware, for example, digital signal processors, application specific integrated circuits, field programmable logic arrays and the like. The processing hardware may include state machines that perform the methods described in the foregoing discussion. The principles of the present invention also find application in hybrid systems of mixed hardware and software designs.
- Several embodiments of the invention are specifically illustrated and/or described herein. However, it will be appreciated that modifications and variations of the invention are covered by the above teachings and within the purview of the appended claims without departing from the spirit and intended scope of the invention.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/755,620 US8558903B2 (en) | 2010-03-25 | 2010-04-07 | Accelerometer / gyro-facilitated video stabilization |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US31764210P | 2010-03-25 | 2010-03-25 | |
US12/755,620 US8558903B2 (en) | 2010-03-25 | 2010-04-07 | Accelerometer / gyro-facilitated video stabilization |
Publications (2)
Publication Number | Publication Date |
---|---|
US20110234825A1 (en) | 2011-09-29 |
US8558903B2 (en) | 2013-10-15 |
Family
ID=44656001
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/755,620 Active 2031-01-13 US8558903B2 (en) | 2010-03-25 | 2010-04-07 | Accelerometer / gyro-facilitated video stabilization |
Country Status (1)
Country | Link |
---|---|
US (1) | US8558903B2 (en) |
Cited By (60)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120075182A1 (en) * | 2010-08-24 | 2012-03-29 | Lg Electronics Inc. | Mobile terminal and displaying method thereof |
WO2012088403A2 (en) * | 2010-12-22 | 2012-06-28 | Seyyer, Inc. | Video transmission and sharing over ultra-low bitrate wireless communication channel |
US20130044228A1 (en) * | 2011-08-15 | 2013-02-21 | Apple Inc. | Motion-Based Video Stabilization |
WO2013090097A1 (en) * | 2011-12-15 | 2013-06-20 | Apple Inc. | Motion sensor based virtual tripod method for video stabilization |
US20130342714A1 (en) * | 2012-06-22 | 2013-12-26 | Apple Inc. | Automated tripod detection and handling in video stabilization |
WO2014046867A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
US20140085492A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
WO2014046868A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
WO2014085090A1 (en) * | 2012-11-27 | 2014-06-05 | Qualcomm Incorporated | System and method for adjusting orientation of captured video |
US20140160021A1 (en) * | 2012-12-07 | 2014-06-12 | Wen-Chieh Geoffrey Lee | Optical Mouse with Cursor Rotating Ability |
US20150036006A1 (en) * | 2013-07-31 | 2015-02-05 | Spreadtrum Communications (Shanghai) Co., Ltd. | Video anti-shaking method and video anti-shaking device |
US20150149328A1 (en) * | 2013-11-26 | 2015-05-28 | Viscovery Pte. Ltd. | Image recognition method for offline and online synchronous operation |
US9082400B2 (en) | 2011-05-06 | 2015-07-14 | Seyyer, Inc. | Video generation based on text |
CN104885441A (en) * | 2012-12-26 | 2015-09-02 | 索尼公司 | Image processing device and method, and program |
WO2015183824A1 (en) * | 2014-05-26 | 2015-12-03 | Pelican Imaging Corporation | Autofocus system for a conventional camera that uses depth information from an array camera |
US20150380055A1 (en) * | 2012-09-12 | 2015-12-31 | Intel Corporation | Techniques for indexing video files |
US9300880B2 (en) | 2013-12-31 | 2016-03-29 | Google Technology Holdings LLC | Methods and systems for providing sensor data and image data to an application processor in a digital image format |
US9313343B2 (en) * | 2014-02-20 | 2016-04-12 | Google Inc. | Methods and systems for communicating sensor data on a mobile device |
US9413963B2 (en) * | 2014-05-30 | 2016-08-09 | Apple Inc. | Video image stabilization |
US9454245B2 (en) | 2011-11-01 | 2016-09-27 | Qualcomm Incorporated | System and method for improving orientation data |
US9536161B1 (en) * | 2014-06-17 | 2017-01-03 | Amazon Technologies, Inc. | Visual and audio recognition for scene change events |
US9596411B2 (en) | 2014-08-25 | 2017-03-14 | Apple Inc. | Combined optical and electronic image stabilization |
US9633442B2 (en) | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
US9674438B2 (en) | 2014-07-06 | 2017-06-06 | Apple Inc. | Low light video image stabilization strength modulation |
US9787902B1 (en) | 2016-06-10 | 2017-10-10 | Apple Inc. | Video image stabilization with enforced stabilization constraints |
US20180048821A1 (en) * | 2016-04-27 | 2018-02-15 | Gopro, Inc. | Electronic Image Stabilization Frequency Estimator |
EP2742680B1 (en) * | 2011-09-26 | 2018-03-28 | Skype | Video stabilisation |
US9953431B2 (en) | 2016-04-04 | 2018-04-24 | Sony Corporation | Image processing system and method for detection of objects in motion |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10187554B2 (en) | 2014-10-01 | 2019-01-22 | Samsung Electronics Co., Ltd. | Cover photography apparatus, portable terminal apparatus and control method of cover photography apparatus |
US10455218B2 (en) | 2013-03-15 | 2019-10-22 | Fotonation Limited | Systems and methods for estimating depth using stereo array cameras |
US10462370B2 (en) | 2017-10-03 | 2019-10-29 | Google Llc | Video stabilization |
US10467674B2 (en) | 2013-12-02 | 2019-11-05 | A9.Com, Inc. | Visual search in a controlled shopping environment |
US10638099B2 (en) | 2013-03-15 | 2020-04-28 | Fotonation Limited | Extended color processing on pelican array cameras |
US10674156B2 (en) * | 2016-11-03 | 2020-06-02 | Ujet, Inc. | Image management |
US10694114B2 (en) | 2008-05-20 | 2020-06-23 | Fotonation Limited | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US10708492B2 (en) | 2013-11-26 | 2020-07-07 | Fotonation Limited | Array camera configurations incorporating constituent array cameras and constituent cameras |
US10735635B2 (en) | 2009-11-20 | 2020-08-04 | Fotonation Limited | Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps |
US10812717B2 (en) * | 2018-05-04 | 2020-10-20 | Google Llc | Stabilizing video by accounting for a location of a feature in a stabilized view of a frame |
US10839485B2 (en) | 2010-12-14 | 2020-11-17 | Fotonation Limited | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
EP3745705A4 (en) * | 2018-03-23 | 2020-12-30 | Huawei Technologies Co., Ltd. | Video image anti-shake method and terminal |
US10909707B2 (en) | 2012-08-21 | 2021-02-02 | Fotonation Limited | System and methods for measuring depth using an array of independently controllable cameras |
US10944961B2 (en) | 2014-09-29 | 2021-03-09 | Fotonation Limited | Systems and methods for dynamic calibration of array cameras |
US10958892B2 (en) | 2013-03-10 | 2021-03-23 | Fotonation Limited | System and methods for calibration of an array camera |
US10984276B2 (en) | 2011-09-28 | 2021-04-20 | Fotonation Limited | Systems and methods for encoding image files containing depth maps stored as metadata |
US11022725B2 (en) | 2012-06-30 | 2021-06-01 | Fotonation Limited | Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors |
US11190689B1 (en) | 2020-07-29 | 2021-11-30 | Google Llc | Multi-camera video stabilization |
US11270110B2 (en) | 2019-09-17 | 2022-03-08 | Boston Polarimetrics, Inc. | Systems and methods for surface modeling using polarization cues |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US11302012B2 (en) | 2019-11-30 | 2022-04-12 | Boston Polarimetrics, Inc. | Systems and methods for transparent object segmentation using polarization cues |
US11525906B2 (en) | 2019-10-07 | 2022-12-13 | Intrinsic Innovation Llc | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
US11580667B2 (en) | 2020-01-29 | 2023-02-14 | Intrinsic Innovation Llc | Systems and methods for characterizing object pose detection and measurement systems |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10284794B1 (en) | 2015-01-07 | 2019-05-07 | Car360 Inc. | Three-dimensional stabilized 360-degree composite image capture |
US9998663B1 (en) | 2015-01-07 | 2018-06-12 | Car360 Inc. | Surround image capture and processing |
US10055852B2 (en) | 2016-04-04 | 2018-08-21 | Sony Corporation | Image processing system and method for detection of objects in motion |
US10547784B2 (en) | 2017-06-19 | 2020-01-28 | SighTour Technologies, Inc. | Image stabilization |
US11748844B2 (en) | 2020-01-08 | 2023-09-05 | Carvana, LLC | Systems and methods for generating a virtual display of an item |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010014124A1 (en) * | 1998-01-30 | 2001-08-16 | Tsuyoshi Nishikawa | Motion vector estimation circuit and method |
US20080170125A1 (en) * | 2005-01-18 | 2008-07-17 | Shih-Hsuan Yang | Method to Stabilize Digital Video Motion |
US20070104479A1 (en) * | 2005-11-10 | 2007-05-10 | Akihiro Machida | Correcting photographs obtained by a handheld device |
US20080166115A1 (en) * | 2007-01-05 | 2008-07-10 | David Sachs | Method and apparatus for producing a sharp image from a handheld device containing a gyroscope |
US20080252736A1 (en) * | 2007-04-16 | 2008-10-16 | Stmicroelectronics (Research & Development) Limite | Image stabilization method and apparatus |
Cited By (115)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10694114B2 (en) | 2008-05-20 | 2020-06-23 | Fotonation Limited | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US11412158B2 (en) | 2008-05-20 | 2022-08-09 | Fotonation Limited | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US12022207B2 (en) | 2008-05-20 | 2024-06-25 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US12041360B2 (en) | 2008-05-20 | 2024-07-16 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US10735635B2 (en) | 2009-11-20 | 2020-08-04 | Fotonation Limited | Capturing and processing of images captured by camera arrays incorporating cameras with telephoto and conventional lenses to generate depth maps |
US20120075182A1 (en) * | 2010-08-24 | 2012-03-29 | Lg Electronics Inc. | Mobile terminal and displaying method thereof |
US11875475B2 (en) | 2010-12-14 | 2024-01-16 | Adeia Imaging Llc | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US10839485B2 (en) | 2010-12-14 | 2020-11-17 | Fotonation Limited | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
US11423513B2 (en) | 2010-12-14 | 2022-08-23 | Fotonation Limited | Systems and methods for synthesizing high resolution images using images captured by an array of independently controllable imagers |
WO2012088403A2 (en) * | 2010-12-22 | 2012-06-28 | Seyyer, Inc. | Video transmission and sharing over ultra-low bitrate wireless communication channel |
US10375534B2 (en) | 2010-12-22 | 2019-08-06 | Seyyer, Inc. | Video transmission and sharing over ultra-low bitrate wireless communication channel |
WO2012088403A3 (en) * | 2010-12-22 | 2012-10-11 | Seyyer, Inc. | Video transmission and sharing over ultra-low bitrate wireless communication channel |
US9082400B2 (en) | 2011-05-06 | 2015-07-14 | Seyyer, Inc. | Video generation based on text |
US8896713B2 (en) * | 2011-08-15 | 2014-11-25 | Apple Inc. | Motion-based video stabilization |
US20130044228A1 (en) * | 2011-08-15 | 2013-02-21 | Apple Inc. | Motion-Based Video Stabilization |
EP2742680B1 (en) * | 2011-09-26 | 2018-03-28 | Skype | Video stabilisation |
US10984276B2 (en) | 2011-09-28 | 2021-04-20 | Fotonation Limited | Systems and methods for encoding image files containing depth maps stored as metadata |
US12052409B2 (en) | 2011-09-28 | 2024-07-30 | Adela Imaging LLC | Systems and methods for encoding image files containing depth maps stored as metadata |
US11729365B2 (en) | 2011-09-28 | 2023-08-15 | Adela Imaging LLC | Systems and methods for encoding image files containing depth maps stored as metadata |
US9495018B2 (en) | 2011-11-01 | 2016-11-15 | Qualcomm Incorporated | System and method for improving orientation data |
US9995575B2 (en) | 2011-11-01 | 2018-06-12 | Qualcomm Incorporated | System and method for improving orientation data |
US9785254B2 (en) | 2011-11-01 | 2017-10-10 | Qualcomm Incorporated | System and method for improving orientation data |
US9454245B2 (en) | 2011-11-01 | 2016-09-27 | Qualcomm Incorporated | System and method for improving orientation data |
WO2013090097A1 (en) * | 2011-12-15 | 2013-06-20 | Apple Inc. | Motion sensor based virtual tripod method for video stabilization |
US9628711B2 (en) | 2011-12-15 | 2017-04-18 | Apple Inc. | Motion sensor based virtual tripod method for video stabilization |
US9300873B2 (en) * | 2012-06-22 | 2016-03-29 | Apple Inc. | Automated tripod detection and handling in video stabilization |
US20130342714A1 (en) * | 2012-06-22 | 2013-12-26 | Apple Inc. | Automated tripod detection and handling in video stabilization |
US11022725B2 (en) | 2012-06-30 | 2021-06-01 | Fotonation Limited | Systems and methods for manufacturing camera modules using active alignment of lens stack arrays and sensors |
US10909707B2 (en) | 2012-08-21 | 2021-02-02 | Fotonation Limited | System and methods for measuring depth using an array of independently controllable cameras |
US12002233B2 (en) | 2012-08-21 | 2024-06-04 | Adeia Imaging Llc | Systems and methods for estimating depth and visibility from a reference viewpoint for pixels in a set of images captured from different viewpoints |
US9576608B2 (en) * | 2012-09-12 | 2017-02-21 | Intel Corporation | Techniques for indexing video files |
US20150380055A1 (en) * | 2012-09-12 | 2015-12-31 | Intel Corporation | Techniques for indexing video files |
US20140085492A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
US9554042B2 (en) * | 2012-09-24 | 2017-01-24 | Google Technology Holdings LLC | Preventing motion artifacts by intelligently disabling video stabilization |
US20140085493A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
WO2014046868A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
US8941743B2 (en) * | 2012-09-24 | 2015-01-27 | Google Technology Holdings LLC | Preventing motion artifacts by intelligently disabling video stabilization |
WO2014046867A1 (en) * | 2012-09-24 | 2014-03-27 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization |
KR102147300B1 (en) * | 2012-09-24 | 2020-08-24 | Google Technology Holdings LLC | Preventing motion artifacts by intelligently disabling video stabilization
CN104737530A (en) * | 2012-09-24 | 2015-06-24 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization
KR20150065717A (en) * | 2012-09-24 | 2015-06-15 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization
JP2015534370A (en) * | 2012-09-24 | 2015-11-26 | Motorola Mobility Llc | Preventing motion artifacts by intelligently disabling video stabilization
WO2014085090A1 (en) * | 2012-11-27 | 2014-06-05 | Qualcomm Incorporated | System and method for adjusting orientation of captured video |
US9516229B2 (en) | 2012-11-27 | 2016-12-06 | Qualcomm Incorporated | System and method for adjusting orientation of captured video |
US9733727B2 (en) * | 2012-12-07 | 2017-08-15 | Wen-Chieh Geoffrey Lee | Optical mouse with cursor rotating ability |
US20140160021A1 (en) * | 2012-12-07 | 2014-06-12 | Wen-Chieh Geoffrey Lee | Optical Mouse with Cursor Rotating Ability |
CN104885441A (en) * | 2012-12-26 | 2015-09-02 | Sony Corporation | Image processing device and method, and program
EP2940986A4 (en) * | 2012-12-26 | 2016-07-13 | Sony Corp | Image processing device and method, and program |
US11985293B2 (en) | 2013-03-10 | 2024-05-14 | Adeia Imaging Llc | System and methods for calibration of an array camera |
US11570423B2 (en) | 2013-03-10 | 2023-01-31 | Adeia Imaging Llc | System and methods for calibration of an array camera |
US11272161B2 (en) | 2013-03-10 | 2022-03-08 | Fotonation Limited | System and methods for calibration of an array camera |
US10958892B2 (en) | 2013-03-10 | 2021-03-23 | Fotonation Limited | System and methods for calibration of an array camera |
US10455218B2 (en) | 2013-03-15 | 2019-10-22 | Fotonation Limited | Systems and methods for estimating depth using stereo array cameras |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10674138B2 (en) | 2013-03-15 | 2020-06-02 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US10638099B2 (en) | 2013-03-15 | 2020-04-28 | Fotonation Limited | Extended color processing on pelican array cameras |
US9633442B2 (en) | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
US20150036006A1 (en) * | 2013-07-31 | 2015-02-05 | Spreadtrum Communications (Shanghai) Co., Ltd. | Video anti-shaking method and video anti-shaking device |
US9253402B2 (en) * | 2013-07-31 | 2016-02-02 | Spreadtrum Communications (Shanghai) Co., Ltd. | Video anti-shaking method and video anti-shaking device |
US9817849B2 (en) * | 2013-11-26 | 2017-11-14 | Viscovery Pte. Ltd. | Image recognition method for offline and online synchronous operation |
US20150149328A1 (en) * | 2013-11-26 | 2015-05-28 | Viscovery Pte. Ltd. | Image recognition method for offline and online synchronous operation |
US10708492B2 (en) | 2013-11-26 | 2020-07-07 | Fotonation Limited | Array camera configurations incorporating constituent array cameras and constituent cameras |
US10095714B2 (en) * | 2013-11-26 | 2018-10-09 | Viscovery Pte. Ltd. | Mobile device capable of offline and online synchronous image identifying, an image identifying system, and a storage medium for the same |
US10467674B2 (en) | 2013-12-02 | 2019-11-05 | A9.Com, Inc. | Visual search in a controlled shopping environment |
US9596443B2 (en) | 2013-12-31 | 2017-03-14 | Google Inc. | Methods and systems for providing sensor data and image data to an application processor in a digital image format |
US9300880B2 (en) | 2013-12-31 | 2016-03-29 | Google Technology Holdings LLC | Methods and systems for providing sensor data and image data to an application processor in a digital image format |
US9313343B2 (en) * | 2014-02-20 | 2016-04-12 | Google Inc. | Methods and systems for communicating sensor data on a mobile device |
US9485366B2 (en) | 2014-02-20 | 2016-11-01 | Google Inc. | Methods and systems for communicating sensor data on a mobile device |
WO2015183824A1 (en) * | 2014-05-26 | 2015-12-03 | Pelican Imaging Corporation | Autofocus system for a conventional camera that uses depth information from an array camera |
US9413963B2 (en) * | 2014-05-30 | 2016-08-09 | Apple Inc. | Video image stabilization |
US9706123B2 (en) | 2014-05-30 | 2017-07-11 | Apple Inc. | Video image stabilization |
US9536161B1 (en) * | 2014-06-17 | 2017-01-03 | Amazon Technologies, Inc. | Visual and audio recognition for scene change events |
US9674438B2 (en) | 2014-07-06 | 2017-06-06 | Apple Inc. | Low light video image stabilization strength modulation |
US9596411B2 (en) | 2014-08-25 | 2017-03-14 | Apple Inc. | Combined optical and electronic image stabilization |
US10944961B2 (en) | 2014-09-29 | 2021-03-09 | Fotonation Limited | Systems and methods for dynamic calibration of array cameras |
US11546576B2 (en) | 2014-09-29 | 2023-01-03 | Adeia Imaging Llc | Systems and methods for dynamic calibration of array cameras |
US10187554B2 (en) | 2014-10-01 | 2019-01-22 | Samsung Electronics Co., Ltd. | Cover photography apparatus, portable terminal apparatus and control method of cover photography apparatus |
US9979889B2 (en) | 2015-06-07 | 2018-05-22 | Apple Inc. | Combined optical and electronic image stabilization |
US9953431B2 (en) | 2016-04-04 | 2018-04-24 | Sony Corporation | Image processing system and method for detection of objects in motion |
US10205879B2 (en) * | 2016-04-27 | 2019-02-12 | Gopro, Inc. | Electronic image stabilization frequency estimator |
US10506162B2 (en) * | 2016-04-27 | 2019-12-10 | Gopro, Inc. | Electronic image stabilization frequency estimator |
US20180048821A1 (en) * | 2016-04-27 | 2018-02-15 | Gopro, Inc. | Electronic Image Stabilization Frequency Estimator |
US11317024B2 (en) | 2016-04-27 | 2022-04-26 | Gopro, Inc. | Electronic image stabilization frequency estimator |
US20190109990A1 (en) * | 2016-04-27 | 2019-04-11 | Gopro, Inc. | Electronic Image Stabilization Frequency Estimator |
US9787902B1 (en) | 2016-06-10 | 2017-10-10 | Apple Inc. | Video image stabilization with enforced stabilization constraints |
US10148881B2 (en) | 2016-06-10 | 2018-12-04 | Apple Inc. | Video image stabilization with enforced stabilization constraints |
US10674156B2 (en) * | 2016-11-03 | 2020-06-02 | Ujet, Inc. | Image management |
US11064119B2 (en) | 2017-10-03 | 2021-07-13 | Google Llc | Video stabilization |
US11683586B2 (en) | 2017-10-03 | 2023-06-20 | Google Llc | Video stabilization |
US10462370B2 (en) | 2017-10-03 | 2019-10-29 | Google Llc | Video stabilization |
US11539887B2 (en) | 2018-03-23 | 2022-12-27 | Huawei Technologies Co., Ltd. | Video image anti-shake method and terminal |
EP3745705A4 (en) * | 2018-03-23 | 2020-12-30 | Huawei Technologies Co., Ltd. | Video image anti-shake method and terminal |
RU2758460C1 (en) * | 2018-03-23 | 2021-10-28 | Хуавей Текнолоджиз Ко., Лтд. | Terminal apparatus and method for video image stabilisation |
US10812717B2 (en) * | 2018-05-04 | 2020-10-20 | Google Llc | Stabilizing video by accounting for a location of a feature in a stabilized view of a frame |
US11227146B2 (en) | 2018-05-04 | 2022-01-18 | Google Llc | Stabilizing video by accounting for a location of a feature in a stabilized view of a frame |
US11699273B2 (en) | 2019-09-17 | 2023-07-11 | Intrinsic Innovation Llc | Systems and methods for surface modeling using polarization cues |
US11270110B2 (en) | 2019-09-17 | 2022-03-08 | Boston Polarimetrics, Inc. | Systems and methods for surface modeling using polarization cues |
US11982775B2 (en) | 2019-10-07 | 2024-05-14 | Intrinsic Innovation Llc | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
US12099148B2 (en) | 2019-10-07 | 2024-09-24 | Intrinsic Innovation Llc | Systems and methods for surface normals sensing with polarization |
US11525906B2 (en) | 2019-10-07 | 2022-12-13 | Intrinsic Innovation Llc | Systems and methods for augmentation of sensor systems and imaging systems with polarization |
US11842495B2 (en) | 2019-11-30 | 2023-12-12 | Intrinsic Innovation Llc | Systems and methods for transparent object segmentation using polarization cues |
US11302012B2 (en) | 2019-11-30 | 2022-04-12 | Boston Polarimetrics, Inc. | Systems and methods for transparent object segmentation using polarization cues |
US11580667B2 (en) | 2020-01-29 | 2023-02-14 | Intrinsic Innovation Llc | Systems and methods for characterizing object pose detection and measurement systems |
US11797863B2 (en) | 2020-01-30 | 2023-10-24 | Intrinsic Innovation Llc | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US11856295B2 (en) | 2020-07-29 | 2023-12-26 | Google Llc | Multi-camera video stabilization |
US11190689B1 (en) | 2020-07-29 | 2021-11-30 | Google Llc | Multi-camera video stabilization |
US12020455B2 (en) | 2021-03-10 | 2024-06-25 | Intrinsic Innovation Llc | Systems and methods for high dynamic range image reconstruction |
US12069227B2 (en) | 2021-03-10 | 2024-08-20 | Intrinsic Innovation Llc | Multi-modal and multi-spectral stereo camera arrays |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11683594B2 (en) | 2021-04-15 | 2023-06-20 | Intrinsic Innovation Llc | Systems and methods for camera exposure control |
US12067746B2 (en) | 2021-05-07 | 2024-08-20 | Intrinsic Innovation Llc | Systems and methods for using computer vision to pick up small objects |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
Also Published As
Publication number | Publication date |
---|---|
US8558903B2 (en) | 2013-10-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8558903B2 (en) | Accelerometer / gyro-facilitated video stabilization | |
US8811661B2 (en) | Monitoring camera system, monitoring camera, and monitoring camera control apparatus | |
US9723315B2 (en) | Frame encoding selection based on frame similarities and visual quality and interests | |
JP6016332B2 (en) | Image processing apparatus and image processing method | |
US8493499B2 (en) | Compression-quality driven image acquisition and processing system | |
US10999577B2 (en) | Quantization parameter determination method and image capture apparatus | |
US11044477B2 (en) | Motion adaptive encoding of video | |
US10623744B2 (en) | Scene based rate control for video compression and video streaming | |
US9924097B2 (en) | Apparatus, method and recording medium for image stabilization | |
US20140307112A1 (en) | Motion Adaptive Cropping for Video Stabilization | |
US12087048B2 (en) | Video analysis method and system, and information processing device, transmits image frame to cloud server based on difference between analysis result on the edge side and result predicted on a cloud server | |
US20130235931A1 (en) | Masking video artifacts with comfort noise | |
US20150062371A1 (en) | Encoding apparatus and method | |
US9336460B2 (en) | Adaptive motion instability detection in video | |
WO2019162969A1 (en) | System for computationally efficient analysis of traffic details in traffic video stream and a method thereof | |
US8699575B2 (en) | Motion vector generation apparatus, motion vector generation method, and non-transitory computer-readable storage medium | |
US11716475B2 (en) | Image processing device and method of pre-processing images of a video stream before encoding | |
WO2023053166A1 (en) | Video processing system, information processing device, video processing method, and recording medium | |
Fernandez et al. | Integrated H.264 region-of-interest detection, tracking and compression for surveillance scenes |
JP6388613B2 (en) | Image processing apparatus, design support system, and program | |
US20140269906A1 (en) | Moving image encoding apparatus, method for controlling the same and image capturing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: APPLE INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, YUXIN;SHI, XIAOJIN;NORMILE, JAMES OLIVER;AND OTHERS;REEL/FRAME:024198/0545
Effective date: 20100402
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
FPAY | Fee payment |
Year of fee payment: 4 |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8