US20130343732A1 - Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation - Google Patents
- Publication number
- US20130343732A1 (U.S. application Ser. No. 13/954,888)
- Authority: United States (US)
- Prior art keywords
- zoom
- zoom operation
- frame rate
- video signal
- recorded
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N 9/87: Regeneration of colour television signals
- H04N 5/2621: Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
- H04N 23/69: Control of means for changing the angle of the field of view, e.g. optical zoom objectives or electronic zooming
- H04N 5/772: Interface circuits between a recording apparatus and a television camera, the recording apparatus and the television camera being placed in the same enclosure
- H04N 23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N 5/765: Interface circuits between an apparatus for recording and another apparatus
- H04N 5/781: Television signal recording using magnetic recording on disks or drums
- H04N 5/783: Adaptations for reproducing at a rate different from the recording rate
- H04N 5/85: Television signal recording using optical recording on discs or drums
- H04N 5/907: Television signal recording using static stores, e.g. storage tubes or semiconductor memories
- H04N 9/7921: Processing of colour television signals in connection with recording for more than one processing mode
- H04N 9/8042: Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components involving data reduction
- H04N 9/8063: Pulse code modulation of the colour picture signal components with processing of the sound signal using time division multiplex of the PCM audio and PCM video signals
- H04N 9/8205: Recording of the individual colour picture signal components simultaneously only, involving the multiplexing of an additional signal and the colour video signal
Definitions
- the present invention relates to an imaging apparatus that records moving images, and to a reproducing apparatus that reproduces the moving images, such as a video camera or a digital camera.
- Japanese Patent Application Laid-Open No. 2003-274360 discusses a technique for controlling the frame rate of captured video signals without requiring a photographer's control. More specifically, based on this technique, when an image is captured, a surrounding audio volume level is monitored, and if it is detected that the monitored volume level exceeds a predetermined level, the frame rate is increased, so that important scenes are recorded with high quality. Further, based on this technique, when the available memory capacity of a recording medium reaches less than or equal to a predetermined level, the frame rate is decreased to suppress consumption of the recording medium. In addition, when the available power of a battery reaches less than or equal to a predetermined level, the frame rate is decreased to suppress consumption of the battery.
- Japanese Patent Application Laid-Open No. 2007-134991 discusses a technique in which motion of a captured image is detected. Based on this technique, when the detected motion is more than or equal to a threshold, the captured image is recorded at a high frame rate, and when the detected motion is less than the threshold, the captured image is recorded at a low frame rate.
- the technique discussed in Japanese Patent Application Laid-Open No. 2007-134991 uses motion information acquired from an image signal as an index. For example, when the ratio of monotonous scenes including the sky or night scenes is increased, it is determined that the amount of motion is less, and as a result, the image is recorded at a low frame rate. On the other hand, in a wide image whose focal length is short, the motion amount tends to be detected as being large because of camera shake, and consequently, the image is recorded at a high frame rate. In either case, the frame rate is determined irrespective of photographer's intention.
- TV receivers or video monitors support only a fixed frame rate.
- The video may be displayed at an unnatural rendering speed that is misaligned with actual time. For example, if a monitor receives an input video signal and displays it at a certain fixed rate, images captured at a frame rate higher than the standard frame rate are rendered in slow motion, and images captured at a lower frame rate are rendered at high speed.
- Since compatibility with these viewing and reproduction environments is not taken into consideration, use of the technique of variable frame rate recording is limited.
- According to the present invention, videos including zoom operations can be reproduced at a standard frame rate, so compatibility with existing video display apparatuses can be ensured. During periods including zoom operation periods, videos are recorded at a frame rate higher than the normal frame rate, and when the videos are reproduced, the frame rate is reduced to the standard frame rate. Thus, high-quality reproduction videos can be displayed.
- FIG. 1 is a block diagram illustrating a schematic configuration of an exemplary embodiment of the present invention.
- FIGS. 2A to 2C illustrate a relationship among a zoom operation, captured frames, and recorded frames of the exemplary embodiment.
- FIG. 3 is a flow chart illustrating a recording operation of the exemplary embodiment.
- FIG. 4 illustrates a relationship between a zoom speed and a recording frame rate of the exemplary embodiment.
- FIG. 5 illustrates a metadata structure and data examples of the present exemplary embodiment.
- FIG. 6 illustrates a relationship between zoom magnification and recorded frames of the exemplary embodiment.
- FIG. 7 illustrates recording areas each storing frames at different frame rates.
- FIG. 10 illustrates frame interpolation in reproduction mode 1 of the exemplary embodiment.
- FIG. 11 illustrates frame interpolation in reproduction mode 2 of the exemplary embodiment.
- FIG. 13 illustrates frame interpolation in reproduction mode 5 of the exemplary embodiment.
- FIG. 14 is a block diagram illustrating functions of a system control unit 50 .
- FIG. 16 illustrates a relationship between zoom magnification and recorded frames of the second exemplary embodiment.
- FIG. 1 is a block diagram illustrating a schematic configuration of an exemplary embodiment of the present invention. A configuration and a basic operation of the present exemplary embodiment will be described with reference to FIG. 1 .
- An imaging unit 10 (camera unit) has the following configuration and functions.
- the imaging optical system of the imaging unit 10 includes a front lens 12 , a zoom lens 14 , a diaphragm 16 , and a focus lens 18 .
- the front lens 12 is fixed to a lens barrel, and the zoom lens 14 and the focus lens 18 can move along the optical axis.
- the diaphragm 16 is arranged between the zoom lens 14 and the focus lens 18 .
- An exposure control unit 20 receives a control signal from a camera control unit 30 and controls the aperture of the diaphragm 16 based on the control signal.
- A zoom position sensor 22 detects the position of the zoom lens 14.
- Based on the position of the zoom lens 14 detected by the zoom position sensor 22, a lens control unit 24 carries out feedback control on the zoom lens 14.
- The lens control unit 24 also controls the zoom position of the zoom lens 14 and the focus position of the focus lens 18, based on instruction signals from the camera control unit 30 to be described later.
- An image sensor driving unit 28 drives an image sensor 26 that converts an optical image formed by the imaging optical system into an electric image signal.
- the image sensor driving unit 28 drives the image sensor 26 based on a timing signal supplied from the camera control unit 30 and controls the image sensor 26 to output each pixel signal for the image signal.
- By increasing the driving signal frequency of the image sensor driving unit 28, a frame rate higher than the standard frame rate can be handled.
- Based on parameters set by the camera control unit 30, the camera signal processing unit 36 carries out known camera signal processing, such as color separation, gradation correction, and white balance adjustment, on the image data output from the A/D converter 34.
- the camera signal processing unit 36 supplies the processed image data to a compression/expansion unit 44 via a data bus 38 and a memory 40 .
- the camera control unit 30 controls operations of the lens control unit 24 , the exposure control unit 20 , the image sensor driving unit 28 , and the camera signal processing unit 36 .
- the system control unit 50 includes a central processing unit (CPU) and controls individual units according to instructions from an operation unit 52 operated by a user, setting values, or an operation status.
- the system control unit 50 controls an imaging operation via the camera control unit 30 and controls recording/reproducing data in the recording medium 48 via the recording medium control unit 46 .
- the system control unit 50 comprehensively controls the entire imaging apparatus by executing predetermined programs.
- An external interface (I/F) 54 sends and receives signals in a predetermined format to and from an external device connected to an external terminal 56 .
- Examples of the external terminal 56 include terminals that comply with standards such as the Institute of Electrical and Electronics Engineers (IEEE) 1394, universal serial bus (USB), and the high-definition multimedia interface (HDMI).
- Examples of the signal format include analog video signals (audio/video) and digital video signals of various resolutions and frame rates.
- the operation unit 52 is used as a user interface and includes a shutter button, a zoom button, a power-source on/off button, and a record start/stop button used during imaging or recording.
- the operation unit 52 also includes a select button, a determination button, and a cancel button used to make selection on the menu, set parameters, and the like.
- a display control unit 58 drives a display apparatus 60 , which is used as a monitor displaying captured images and reproduced images and is used as a display unit displaying various types of management information to users.
- the memory 40 is shared by each function block via the data bus 38 , and is formed by a read only memory (ROM) and/or a random-access memory (RAM). Each function block connected to the data bus 38 can write and read data in and from the memory 40 via a memory control unit 42 .
- Image and audio data recorded in the recording medium 48 is reproduced as follows.
- the recording medium control unit 46 supplies compressed data read from the recording medium 48 to the compression/expansion unit 44 .
- the compression/expansion unit 44 decompresses the input compressed data to restore video data and audio data.
- the display control unit 58 supplies the restored video data to the display apparatus 60 , which displays a reproduced image using the supplied video data.
- Examples of the display apparatus 60 include a liquid crystal display panel and an organic light-emitting device.
- a speaker (not illustrated) outputs reproduced audio. When necessary, reproduced video data and audio data is output to the outside via the external I/F 54 and the external terminal 56 .
- FIG. 2 illustrates a relationship between captured frames temporarily stored in the memory 40 and recorded frames recorded in the recording medium 48 .
- FIG. 2A illustrates a zoom operation period during imaging carried out by the operation unit 52 .
- FIG. 2B illustrates captured frames temporarily stored in the memory 40 .
- FIG. 2C illustrates recorded frames recorded in the recording medium 48 .
- FIG. 2A illustrates a zoom-on period, which is when a zoom operation is carried out by a photographer, and a zoom-off period, which is when no zoom operation is carried out by a photographer.
- the camera control unit 30 drives the image sensor 26 at a rate of 1/240 second, irrespective of on/off of the zoom operation.
- the image data of each frame is temporarily stored in the memory 40 at a frame rate of 1/240 second.
- FIG. 3 is a flow chart illustrating a control operation of a recording frame rate.
- the system control unit 50 monitors a zoom operation carried out by the operation unit 52 .
- In step S1, if the system control unit 50 detects a zoom operation during a recording operation (YES in step S1), the system control unit 50 carries out the processing in steps S2 to S5. If not (NO in step S1), the processing proceeds to step S6.
- In step S2, the system control unit 50 determines the zoom speed. For example, the zoom speed is determined from the force applied by the user to the zoom switch on the operation unit 52 or from the length of time the zoom switch is held down. If a zoom speed has been set in advance on the menu screen or the like, the system control unit 50 refers to that setting. In other words, the system control unit 50 also functions as a zoom speed detection unit.
- In step S3, based on the zoom speed determined in step S2, the system control unit 50 sets a recording frame rate.
- FIG. 4 illustrates a relationship between the recording frame rate and the zoom speed. As illustrated in FIG. 4 , a higher recording frame rate is set for a higher zoom speed, and a lower recording frame rate is set for a lower zoom speed.
- the recording frame rate may be changed on a step-by-step basis depending on the zoom speed.
- the recording frame rate may be changed linearly.
- Such a relationship between the zoom speed and the recording frame rate is set in advance in the form of a table and stored in the memory 40 (in the ROM therein).
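The step-wise mapping from zoom speed to recording frame rate described above can be pictured as a small lookup table. The following Python sketch is illustrative only: the speed thresholds, rate values, and function name are assumptions, not values taken from the patent.

```python
# Hypothetical sketch: map a detected zoom speed to a recording frame rate.
# The step thresholds and rates below are illustrative, not values from the patent.

ZOOM_SPEED_TO_FPS = [
    # (minimum zoom speed in x/second, recording frame rate in frames/second)
    (0.0, 60),    # no zoom or very slow zoom -> normal frame rate
    (0.5, 120),   # moderate zoom speed -> intermediate frame rate
    (1.5, 240),   # fast zoom -> highest frame rate
]

def recording_frame_rate(zoom_speed: float) -> int:
    """Return the recording frame rate for a zoom speed (step-wise mapping)."""
    rate = ZOOM_SPEED_TO_FPS[0][1]
    for threshold, fps in ZOOM_SPEED_TO_FPS:
        if zoom_speed >= threshold:
            rate = fps
    return rate

# Example: a zoom speed of 0.8x/s would be recorded at 120 frames/second here.
assert recording_frame_rate(0.8) == 120
```

A linear mapping between the same endpoints would work equally well; the table form mirrors the step-by-step variant mentioned above.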
- In step S4, the system control unit 50 carries out a thinning process on the image frame data in the memory 40 based on the recording frame rate set in step S3, and records the remaining frames in the recording medium 48.
- In step S5, the system control unit 50 records zoom magnification information and frame rate management information in the recording medium 48 as metadata, along with the video data of each frame.
- In step S6, if an instruction to stop recording is input, the processing ends. If not, steps S1 to S5 are repeated.
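Steps S4 and S5 amount to keeping every Nth buffered frame (N being the ratio of the capture rate to the recording frame rate) and attaching per-frame metadata. A minimal sketch under the assumption that the sensor always captures at 240 fps; the frame, recorder, and zoom_magnification objects are hypothetical stand-ins, not the patent's implementation.

```python
CAPTURE_FPS = 240  # sensor is always driven at 1/240-second intervals

def thin_and_record(captured_frames, recording_fps, zoom_magnification, recorder):
    """Keep every (CAPTURE_FPS // recording_fps)-th frame and record it with metadata."""
    step = CAPTURE_FPS // recording_fps           # e.g. 240 // 60 -> keep every 4th frame
    for index, frame in enumerate(captured_frames):
        if index % step != 0:
            continue                               # frame is thinned out (step S4)
        metadata = {
            "recording_frame_rate": recording_fps,    # frame rate management information
            "zoom_magnification": zoom_magnification(frame.timestamp),
        }
        recorder.write(frame, metadata)            # step S5: video data plus metadata
```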
- FIG. 5 illustrates a structure of metadata added to the video data of each frame, and data examples.
- the video data includes a header portion, a metadata portion, and a data portion.
- the header portion stores basic information that does not change depending on time, such as imaging apparatus information, format information, and thumbnail information.
- the metadata portion stores information that is updated depending on various imaging conditions, such as a date, time, a recording frame rate, and a shutter speed.
- the data portion stores image data and audio data encoded in a predetermined format.
- The metadata is embedded in the video data using a description language such as extensible markup language (XML) or hypertext markup language (HTML).
- the metadata may be added as binary data or a watermark.
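Since the metadata portion is described as XML- or HTML-style text, one way to picture it is as a small per-frame XML fragment. The tag names below are illustrative assumptions, not the patent's actual schema.

```python
import xml.etree.ElementTree as ET

def build_frame_metadata(date, time, recording_fps, shutter_speed, zoom_magnification):
    """Build an XML metadata fragment for one frame (illustrative tag names)."""
    meta = ET.Element("metadata")
    ET.SubElement(meta, "date").text = date
    ET.SubElement(meta, "time").text = time
    ET.SubElement(meta, "recording_frame_rate").text = str(recording_fps)
    ET.SubElement(meta, "shutter_speed").text = shutter_speed
    ET.SubElement(meta, "zoom_magnification").text = str(zoom_magnification)
    return ET.tostring(meta, encoding="unicode")

# e.g. <metadata><date>2009-05-20</date>...<zoom_magnification>2.5</zoom_magnification></metadata>
print(build_frame_metadata("2009-05-20", "10:15:30", 240, "1/500", 2.5))
```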
- FIG. 6 schematically illustrates a relationship between the recording frame rate and the zoom magnification according to the present exemplary embodiment.
- the horizontal axis represents time
- the vertical axis represents zoom magnification (angle of view of the image).
- Double circles represent frames captured at 1/240-second intervals
- single circles represent frames captured at 1/60-second intervals.
- the frames represented by the single circles and captured at 1/60-second intervals are recorded in the recording medium 48 , irrespective of on/off of the zoom operation.
- FIG. 7 illustrates an example of how image data illustrated in FIG. 6 is allocated and recorded in the recording medium 48 .
- Frames captured at t1, t2, t3, and the like at a normal recording frame rate of 1/60 second are stored in a main storage area A of the recording medium 48 .
- the frames captured at a higher frame rate between t3, t3a, t3b, t3c, t4, . . . , and t9 are stored in a sub-storage area Sub of the recording medium 48 .
- the main storage area A and the sub-storage area Sub are distinguished logically in terms of file management, and thus physical recording locations of these areas may be identical.
- The frames captured at t3, t4, t5, and the like are recorded in the sub-storage area Sub as well as in the main storage area A. This is for decoding processing carried out during reproduction. If the frames captured at t3, t4, t5, and the like in the main storage area can be used, these frames do not need to be stored in the sub-storage area Sub.
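The split between the main storage area A (frames on the normal 1/60-second grid) and the sub-storage area Sub (the extra high-rate frames) can be expressed as a routing rule on frame timestamps. A sketch assuming that normal-rate frames fall exactly on 1/60-second boundaries; apart from the area names A and Sub taken from FIG. 7, everything here is illustrative.

```python
from fractions import Fraction

NORMAL_INTERVAL = Fraction(1, 60)   # main area A holds frames on the 1/60-second grid

def allocate_frame(timestamp, duplicate_normal_frames=True):
    """Return the logical area(s) a frame belongs to: 'A', 'Sub', or both."""
    on_normal_grid = (Fraction(timestamp).limit_denominator(240) % NORMAL_INTERVAL) == 0
    if on_normal_grid:
        # Frames at t3, t4, t5, ... go to A, and optionally also to Sub so that the
        # high-rate sequence can be decoded on its own during reproduction.
        return ["A", "Sub"] if duplicate_normal_frames else ["A"]
    return ["Sub"]                   # t3a, t3b, t3c, ... exist only in the sub area

# A frame captured at 3/60 s goes to both areas; one at 3/60 + 1/240 s only to Sub.
print(allocate_frame(Fraction(3, 60)), allocate_frame(Fraction(3, 60) + Fraction(1, 240)))
```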
- In step S11, based on reproduction instructions from the operation unit 52, the system control unit 50 reads metadata about a specified video file from the file management information of the recording medium 48.
- In step S12, the system control unit 50 refers to the metadata and determines whether videos of different frame rates are recorded in the video file to be reproduced.
- If images are not recorded at different recording frame rates (NO in step S12), the processing proceeds to step S16.
- In step S16, normal reproduction processing is carried out.
- In step S13, the system control unit 50 reads special reproduction effects previously set by the user.
- In steps S14 and S15, the system control unit 50 controls the displayed frames and the audio output as described below.
- FIG. 9 is a table illustrating special reproduction effects (modes) that involve zoom operations and the effects of the individual modes.
- a linear smoothing mode is a reproduction interpolation mode in which, even when the speed of a zoom operation carried out by the user is not constant, the zoom speed is changed during reproduction processing and reproduced images are displayed (rendered) at a constant zoom speed.
- the system control unit 50 selects frames so that the zoom magnification of the selected frames changes proportionally. Next, the selected frames are displayed on the display apparatus 60 at a normal frame rate.
- FIG. 10 schematically illustrates frames displayed in the linear smoothing mode.
- the horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image) .
- Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals.
- During the zoom operation, captured frames are recorded at 1/240-second intervals.
- During other periods, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals.
- the zoom speed is not constant during imaging, i.e., the zoom speed fluctuates during a zoom operation from the wide end to the telephoto end.
- the linear smoothing mode will be described in detail with reference to FIG. 10 .
- Actually recorded frames are shifted so that the zoom magnification changes linearly as set in the linear smoothing mode, and the frames to be reproduced and displayed at the normal frame rate are selected and determined.
- the frames at time t5c, t6c, t7c, t8a, and t8c are selected to be displayed.
- the frame at time t5c is shifted as a frame at time t4 (frame 70-1 represented as a black circle) to be displayed.
- the frame at time t6c is shifted as a frame at time t5 (frame 70-2 represented as a black circle) to be displayed.
- the frame at time t7c is shifted as a frame at time t6 (frame 70-3 represented as a black circle) to be displayed.
- the frame at time t8a is shifted as a frame at time t7 (frame 70-4 represented as a black circle) to be displayed.
- the frame at time t8c is shifted as a frame at time t8 (frame 70-5 represented as a black circle) to be displayed.
- captured frames (recorded frames) whose zoom magnification changes linearly from the wide end to the telephoto end are re-arranged on the time axis so that the frames are displayed at a constant zoom speed.
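In other words, the linear smoothing mode picks, for each output display time, the recorded frame whose zoom magnification is closest to a straight line drawn between the magnification at the start of the zoom and the magnification at its end. A minimal sketch assuming each recorded frame carries its zoom magnification as metadata; the function and variable names are hypothetical.

```python
def linear_smoothing(frames, num_output_frames):
    """Select frames so the displayed zoom magnification changes linearly.

    `frames` is a list of (timestamp, zoom_magnification) tuples recorded during
    the zoom operation, in capture order; the result has `num_output_frames`
    entries (assumed >= 2) to be shown at the normal frame rate.
    """
    start_mag = frames[0][1]
    end_mag = frames[-1][1]
    selected = []
    for i in range(num_output_frames):
        # Target magnification on the straight line between start and end.
        target = start_mag + (end_mag - start_mag) * i / (num_output_frames - 1)
        # Pick the recorded frame whose magnification is closest to the target.
        best = min(frames, key=lambda f: abs(f[1] - target))
        selected.append(best)
    return selected
```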
- a dynamic shooting mode is a reproduction interpolation mode in which an image is zoomed rapidly as the object comes closer, irrespective of the zoom operation speed by the user.
- Namely, the dynamic shooting mode provides a zoom effect of gradually increasing the zoom speed.
- the system control unit 50 selects frames so that the speed, at which the zoom magnification changes, is gradually increased.
- the selected frames are displayed on the display apparatus 60 at a normal frame rate.
- FIG. 11 schematically illustrates frames displayed in the dynamic shooting mode.
- the horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image) .
- Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals.
- During the zoom operation, captured frames are recorded at 1/240-second intervals.
- During other periods, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals.
- The speed at which the zoom magnification changes is constant during imaging.
- the dynamic shooting mode will be described in detail with reference to FIG. 11 .
- Actually recorded frames are shifted so that the zoom magnification changes along the curve set in the dynamic shooting mode, and the frames to be reproduced and displayed at the normal frame rate are selected and determined.
- the frames at time t3, t3a, t3b, t4, and t5a are selected to be displayed.
- the frame at time t3 is temporally shifted as a frame at time t4 (frame 72-1 represented as a black circle) to be displayed.
- the frame at time t3a is temporally shifted as a frame at time t5 (frame 72-2 represented as a black circle) to be displayed.
- the frame at time t3b is temporally shifted as a frame at time t6 (frame 72-3 represented as a black circle) to be displayed.
- the frame at time t4 is temporally shifted as a frame at time t7 (frame 72-4 represented as a black circle) to be displayed.
- the frame at time t5a is temporally shifted as a frame at time t8 (frame 72-5 represented as a black circle) to be displayed.
- a soft landing mode (mode 3) is a reproduction interpolation mode in which an image is zoomed in a decreasing speed as the object comes closer, irrespective of the zoom operation speed by the user. Namely, the soft landing mode provides a zoom effect of gradually decreasing the zoom speed.
- the system control unit 50 selects frames so that the zoom speed gradually decreases as the object comes closer.
- the selected frames are displayed on the display apparatus 60 at a normal frame rate.
- FIG. 12 schematically illustrates frames displayed in the soft landing mode.
- the horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image).
- Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals.
- During the zoom operation, captured frames are recorded at 1/240-second intervals.
- During other periods, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals.
- The speed at which the zoom magnification changes is constant during imaging.
- the soft landing mode will be described in detail with reference to FIG. 12 .
- Actually recorded frames are shifted so that the zoom magnification changes along the curve set in the soft landing mode, and the frames to be reproduced and displayed at the normal frame rate are selected and determined.
- the frames at time t6c, t8, t8b, t8c, and t9 are selected to be displayed.
- the frame at time t6c is temporally shifted as a frame at time t4 (frame 74-1 represented as a black circle) to be displayed.
- the frame at time t8 is temporally shifted as a frame at time t5 (frame 74-2 represented as a black circle) to be displayed.
- the frame at time t8b is temporally shifted as a frame at time t6 (frame 74-3 represented as a black circle) to be displayed.
- the frame at time t8c is temporally shifted as a frame at time t7 (frame 74-4 represented as a black circle) to be displayed.
- the frame at time t9 is temporally shifted as a frame at time t8 (frame 74-5 represented as a black circle) to be displayed.
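Modes 2 and 3 differ from linear smoothing only in the target curve: dynamic shooting accelerates the displayed zoom, while soft landing decelerates it. The nearest-frame selection sketched above can be reused with a non-linear easing curve, as below; the easing exponent values are illustrative assumptions, not from the patent.

```python
def eased_zoom_selection(frames, num_output_frames, exponent):
    """Select frames so the displayed zoom magnification follows an easing curve.

    exponent > 1 accelerates the zoom toward the end (dynamic shooting mode);
    exponent < 1 makes it decelerate as the object comes closer (soft landing mode).
    `frames` is a list of (timestamp, zoom_magnification) tuples in capture order.
    """
    start_mag = frames[0][1]
    end_mag = frames[-1][1]
    selected = []
    for i in range(num_output_frames):
        progress = i / (num_output_frames - 1)           # 0.0 .. 1.0 along output time
        target = start_mag + (end_mag - start_mag) * (progress ** exponent)
        selected.append(min(frames, key=lambda f: abs(f[1] - target)))
    return selected

# dynamic shooting: zoom speeds up; soft landing: zoom eases out near the telephoto end
# dynamic = eased_zoom_selection(recorded_frames, 5, exponent=2.0)
# soft    = eased_zoom_selection(recorded_frames, 5, exponent=0.5)
```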
- A slow motion mode is a reproduction mode in which all the frame images recorded at a high frame rate during a zoom operation are reproduced at a preset slow-motion ratio. For example, frame images captured and recorded at a high frame rate of 1/240-second intervals during a zoom period are reproduced at the normal frame rate of 1/60-second intervals. As a result, the images are displayed at a slow speed (1/4 of the original speed). Namely, the video recorded during a zoom operation is reproduced slowly and can be observed easily.
- A skip mode is a reproduction mode that may be used when a user makes a mistake in a zoom operation. For example, when a user increases the zoom magnification excessively and then hurriedly decreases it, such an erroneous operation is automatically detected, and reproduction of the images captured during the erroneous operation is skipped. When a zoom-in operation or a zoom-out operation is repeated in a short period of time and an erroneous operation is made, this mode is effective in removing the images captured during the erroneous operation.
- FIG. 13 schematically illustrates frames displayed in the skip mode.
- the horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image).
- Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals.
- During the zoom operation, captured frames are recorded at 1/240-second intervals.
- During other periods, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals.
- In FIG. 13, the zoom magnification changes at a constant speed from the wide end at time t3 to the telephoto end at t6.
- After that, because of an erroneous operation, the zoom magnification is somewhat decreased toward the wide end.
- In the skip mode, the frames captured during the erroneous operation are not displayed on the screen.
- the skip mode will be described in detail with reference to FIG. 13 .
- In the skip mode, changes in an intermediate area of the zoom magnification are ignored, and the zoom magnification is set as illustrated in FIG. 13. More specifically, the starting point and the end point of the zoom operation are connected based on a certain rule (a straight line in FIG. 13). Next, actually recorded frames are shifted to correspond to the zoom magnification set in the skip mode, and the frames to be reproduced and displayed at the normal frame rate are selected and determined.
- the frames at time t3b, t4, t4b, and t5 are selected to be displayed.
- the frame at time t3b is temporally shifted as a frame at time t4 (frame 76-1 represented as a black circle) to be displayed.
- the frame at time t4 is temporally shifted as a frame at time t5 (frame 76-2 represented as a black circle) to be displayed.
- the frame at time t4b is temporally shifted as a frame at time t6 (frame 76-3 represented as a black circle) to be displayed.
- the frame at time t5 is temporally shifted as a frame at time t7 (frame 76-4 represented as a black circle) to be displayed.
- In this way, a zoom operation period that deviates from the average zoom magnification can be removed.
- Frame interpolation during the zoom operation is carried out by re-arranging the frames during the zoom operation so that the change of the zoom magnification is constant between the average zoom magnifications A and B.
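The skip mode can be pictured the same way: a straight line is drawn between the magnifications at the start and end of the zoom operation, and for each output slot the earliest recorded frame reaching the target magnification is selected, so frames from the overshoot are never displayed. A rough sketch assuming a zoom-in operation; the detection of the erroneous reversal itself is left out, and all names are hypothetical.

```python
def skip_mode_selection(frames, num_output_frames):
    """Skip an erroneous zoom overshoot by following a straight start-to-end line.

    `frames` is a list of (timestamp, zoom_magnification) tuples for the whole
    zoom-in operation, including the overshoot; the start and end magnifications
    are connected by a straight line, and for each output slot the earliest
    recorded frame that has reached the target magnification is displayed, so
    frames captured during the overshoot are never selected.
    """
    start_mag = frames[0][1]
    end_mag = frames[-1][1]
    selected = []
    for i in range(num_output_frames):
        target = start_mag + (end_mag - start_mag) * i / (num_output_frames - 1)
        for frame in frames:                   # earliest frame reaching the target
            if frame[1] >= target:
                selected.append(frame)
                break
        else:
            selected.append(frames[-1])        # fall back to the last frame
    return selected
```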
- any one of these special reproduction modes may be specified for each scene as a display attribute.
- A plurality of these special reproduction modes may be used in combination.
- the above special zoom reproduction effects have been described based on examples where a video is zoomed in from the wide end to the telephoto end. However, needless to say, the same processing is possible when a video is zoomed out from the telephoto end to the wide end.
- In step S15, the system control unit 50 continuously controls, reproduces, and outputs the recorded audio, irrespective of the selection of reproduced and output frames.
- the system control unit 50 appropriately adjusts the time axis and/or deletes unnecessary portions (silent or prolonged portions, for example).
- In step S16, the reproduction and output processing is carried out.
- the external I/F 54 supplies the reproduction signal to the external device connected thereto.
- In step S17, if the system control unit 50 receives an instruction to stop reproduction from the operation unit 52 or completes reproduction of the object to be reproduced (YES in step S17), the system control unit 50 ends the reproduction processing. If not (NO in step S17), the processing returns to step S11.
- FIG. 14 is a block diagram illustrating functions of the system control unit 50 .
- the system control unit 50 is connected to the operation unit 52 and also to the compression/expansion unit 44 , the recording medium control unit 46 , the memory 40 , and the memory control unit 42 via the data bus 38 .
- The system control unit 50 includes a zoom operation detection unit 50-1, a frame rate control unit 50-2, and a file management data generation unit 50-3 as a recording system. Further, the system control unit 50 includes a file management data detection unit 50-4, a display frame control unit 50-5, and an audio control unit 50-6 as a reproducing system.
- The zoom operation detection unit 50-1 detects whether a zoom operation is carried out by the user and the zoom speed during a zoom operation.
- The zoom operation detection unit 50-1 supplies the detected zoom operation information to the frame rate control unit 50-2.
- The frame rate control unit 50-2 controls the recording frame rate, based on the zoom operation information supplied from the zoom operation detection unit 50-1.
- frames captured during a zoom operation are recorded at a high frame rate, and during reproduction, based on a specified effect, certain frames are re-arranged on the time axis. In this way, videos during zoom operations can be reproduced and displayed with various display effects.
- Since videos are captured at a high frame rate, frame images of a desired zoom magnification can be extracted, and high-quality, smooth interpolation can be realized. Additionally, even when a video captured during a zoom operation is visually undesirable because of a human error or the like during the zoom operation, the video can be reproduced without the undesirable portion by carrying out an interpolation process. By processing the video and the audio separately during such an interpolation process, the continuity of the audio can be maintained and the audio can be reproduced without a break even while some frames are skipped in the video.
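Because only the video frames are re-selected while the recorded audio is left untouched, a reproduction routine can treat the two streams independently. A minimal sketch assuming demuxed video and audio; the player interface and function names are hypothetical.

```python
def reproduce_with_effect(video_frames, audio_samples, select_frames, player):
    """Play re-selected video frames while keeping the recorded audio continuous."""
    display_frames = select_frames(video_frames)   # e.g. linear_smoothing(...) above
    player.queue_audio(audio_samples)              # audio is reproduced without a break
    for frame in display_frames:                   # video follows the chosen zoom effect
        player.show(frame)
```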
- FIG. 15 illustrates a reproduction operation of the present exemplary embodiment.
- the horizontal axis represents time.
- FIG. 15A illustrates on/off of a zoom operation during recording.
- FIG. 15B illustrates a frame control signal indicating a special reproduction period formed by a zoom operation period and the periods before and after the zoom operation period.
- the special reproduction period is formed by the zoom operation period illustrated in FIG. 15A , a period D1 before the zoom operation period, and a period D2 after the zoom operation period.
- FIG. 15C illustrates captured frames.
- FIG. 15D illustrates frames recorded in the recording medium 48 .
- The frames captured during periods D1 and D2, which are before and after the zoom operation, are also recorded in the recording medium 48 at a high frame rate of 1/240-second intervals.
- the lengths of periods D1 and D2 may be different from each other.
- a buffer memory capable of storing video data output from the camera signal processing unit 36 for more than period D1 is set in the memory 40 in advance.
- When the start of a zoom operation is detected, the frames captured during period D1, the zoom operation period, and period D2 and stored in the buffer memory are encoded without changing the frame rate and are recorded in the recording medium 48. If the start of a zoom operation is not detected, the frames stored in the buffer memory after period D1 has elapsed are thinned to a frame rate of 1/60-second intervals. The resultant frames are then encoded and recorded in the recording medium 48.
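The second exemplary embodiment requires the camera to already hold period D1 of high-rate frames when the zoom starts, which maps naturally onto a ring buffer in the memory 40. A sketch assuming 240 fps capture and an illustrative one-second D1; the class and method names are not from the patent.

```python
from collections import deque

CAPTURE_FPS = 240
D1_SECONDS = 1.0                       # illustrative length of period D1

class PreRollBuffer:
    """Holds roughly period D1 of 1/240-second frames in the memory 40."""

    def __init__(self):
        self.frames = deque(maxlen=int(CAPTURE_FPS * D1_SECONDS))

    def push(self, frame):
        self.frames.append(frame)      # the oldest frame drops out automatically

    def on_zoom_start(self, recorder):
        """Record the whole buffered period D1 at the high frame rate."""
        for frame in self.frames:
            recorder.write(frame)      # no thinning: 1/240-second intervals
        self.frames.clear()
```

When no zoom start is detected, frames leaving the buffer would instead be thinned to the normal 1/60-second rate before being encoded, as described above.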
- FIG. 16 schematically illustrates a relationship between the recording frame rate and zoom magnification according to the present exemplary embodiment.
- the horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image).
- Double circles represent frames captured at 1/240-second intervals
- single circles represent frames captured at 1/60-second intervals.
- the frames represented by the single circles and captured at 1/60-second intervals are recorded in the recording medium 48 , irrespective of on/off of the zoom operation.
- Video data captured during the period between time t1 and time t3, which is before the zoom operation, and during the period between time t9 and time t11, which is after the zoom operation, is also recorded in the recording medium 48 at a frame rate of 1/240-second intervals, which is higher than the normal rate.
- video is recorded in the recording medium 48 at a high frame rate during certain periods before and after the zoom operation period.
- video captured before and after a zoom-in operation or a zoom-out operation includes a target object.
- the video includes important scenes that reflect the photographer's intention, and examples of such scenes include goal scenes in sports games and close-up facial expressions.
- The present invention is similarly applicable to an imaging apparatus that uses an electronic zoom, which electronically zooms the image signal generated by the image sensor 26.
- the rate of a video signal has been described as a frame rate in the above description. However, in the case of an interlace signal, by replacing the frame rate with a field rate, similar effects can be obtained.
Abstract
Description
- This application is a continuation of U.S. Pat. No. 8,508,627 filed Sep. 6, 2012, which is a continuation of U.S. Pat. No. 8,264,573 filed May 11, 2010, which claims the benefit of and priority to Japanese Patent Application No. 2009-121535 filed May 20, 2009, the contents of each of which are hereby incorporated by reference herein in their entirety.
- 1. Field of the Invention
- The present invention relates to an imaging apparatus that records moving images, and to a reproducing apparatus that reproduces the moving images, such as a video camera or a digital camera.
- 2. Description of the Related Art
- In recent years, recording and reproducing apparatuses capable of recording/reproducing high-quality video signals such as high-definition (HD) video signals, and video display apparatuses capable of displaying such signals as images, have become widespread. Background techniques of the image processing systems used in these apparatuses include miniaturizing image sensors, increasing the number of pixels, increasing the speed of transfer processing, and improving the efficiency of coding, as in the H.264 standard. In addition, various types of video display apparatuses have been made available, such as those of a liquid crystal type, a plasma type, and an electroluminescence (EL) type. Further, video display apparatuses having a further decreased thickness, a higher definition, and/or a larger screen are being developed. Mobile video display apparatuses are also being developed.
- Since display apparatuses of such various types have become available, users' viewing styles have changed significantly. Users handle images in various manners depending on the purpose; for example, some users prefer high-quality images while others wish to exchange short movies with ease.
- Techniques for storing large amounts of video data play an important role in addressing the needs of users. Particularly, remarkable progress has been made in the increase of the capacity of hard disks and in the decrease of costs thereof, followed by semiconductor memories. Additionally, large-capacity removable media or servers on the order of terabytes have been made available at low cost.
- Under the background described above, while the frame rate of the national television system committee (NTSC) is 29.97 frames/second in the field of imaging techniques, higher frame rates such as 120 frames/second and 240 frames/second are proposed. Further, techniques for multi-screen reproduction and reproduction with multi-audio channels are also being developed. Based on these techniques, more realistic and dynamic images can be displayed.
- While video signals are conventionally recorded at a fixed frame rate, some techniques are proposed for appropriately changing the frame rate, efficiently suppressing the total amount of recording data, and recording high-quality videos when necessary.
- Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875) discusses a technique for controlling the frame rate of captured video signals without requiring a photographer's control. More specifically, based on this technique, when an image is captured, a surrounding audio volume level is monitored, and if it is detected that the monitored volume level exceeds a predetermined level, the frame rate is increased, so that important scenes are recorded with high quality. Further, based on this technique, when the available memory capacity of a recording medium reaches less than or equal to a predetermined level, the frame rate is decreased to suppress consumption of the recording medium. In addition, when the available power of a battery reaches less than or equal to a predetermined level, the frame rate is decreased to suppress consumption of the battery.
- Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462) discusses a technique in which motion of a captured image is detected. Based on this technique, when the detected motion is more than or equal to a threshold, the captured image is recorded at a high frame rate, and when the detected motion is less than the threshold, the captured image is recorded at a low frame rate.
- However, based on the above techniques discussed in Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875) and Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462), change of the frame rate does not reflect photographer's intention. Namely, based on the technique discussed in Japanese Patent Application Laid-Open No. 2003-274360 (U.S. Pat. No. 7,456,875), the frame rate is changed depending on conditions surrounding an imaging apparatus, such as the surrounding audio volume level or the available power of a recording medium or a battery. Thus, the imaging condition is changed without photographer's intention.
- In addition, the technique discussed in Japanese Patent Application Laid-Open No. 2007-134991 (USPA 2007/0104462) uses motion information acquired from an image signal as an index. For example, when the ratio of monotonous scenes including the sky or night scenes is increased, it is determined that the amount of motion is less, and as a result, the image is recorded at a low frame rate. On the other hand, in a wide image whose focal length is short, the motion amount tends to be detected as being large because of camera shake, and consequently, the image is recorded at a high frame rate. In either case, the frame rate is determined irrespective of photographer's intention.
- Normally, TV receivers or video monitors support only a fixed frame rate. Thus, when a video signal is input and its frame rate is changed in the middle of processing, even if the change of the frame rate can be followed, the video may be displayed at an unnatural rendering speed that is misaligned with actual time. For example, if a monitor receives an input video signal and displays it at a certain fixed rate, images captured at a frame rate higher than the standard frame rate are rendered in slow motion, and images captured at a lower frame rate are rendered at high speed.
- Since compatibility between these viewing and reproduction environments is not taken into consideration, use of the technique of variable frame rate recording is limited.
- The present invention is directed to an imaging apparatus capable of changing a frame rate in view of a photographer's intention and maintaining compatibility with existing viewing and reproduction environments.
- According to an aspect of the present invention, an imaging apparatus includes an imaging unit including an image sensor configured to convert an optical image to an image signal, and a zoom unit configured to optically zoom an optical image incident on the image sensor or electronically zoom an image signal output from the image sensor, a recording unit configured to record a video signal including an image signal captured by the imaging unit in a recording medium, and configured to record a recording frame rate, at which the video signal is recorded in the recording medium, and zoom operation information for the zoom unit in the recording medium, a zoom operation unit configured to operate the zoom unit, a control unit configured to control the recording frame rate at which the video signal is recorded in the recording medium based on an operation of the zoom operation unit, and configured to increase the recording frame rate to be higher than a normal frame rate during a period including a period when the zoom operation unit is operated, and a reproducing unit configured to reproduce the video signal from the recording medium based on a set reproduction mode, and configured to carry out thinning processing on the video signal during the period including the period when the zoom operation unit is operated based on the zoom operation information and reproduce the processed video signal at the normal frame rate.
- Based on an imaging apparatus and a reproducing apparatus according to the present invention, videos including zoom operations can be reproduced. Since videos are reproduced at a standard frame rate, compatibility with existing video display apparatuses can be ensured. During periods including zoom operation periods, videos are recorded at a frame rate higher than a normal frame rate, and when the videos are reproduced, the frame rate is reduced to the standard frame rate. Thus, high-quality reproduction videos can be displayed.
- Further features and aspects of the present invention will become apparent from the following detailed description of exemplary embodiments with reference to the attached drawings.
- The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate exemplary embodiments, features, and aspects of the invention and, together with the description, serve to explain the principles of the invention.
- FIG. 1 is a block diagram illustrating a schematic configuration of an exemplary embodiment of the present invention.
- FIGS. 2A to 2C illustrate a relationship among a zoom operation, captured frames, and recorded frames of the exemplary embodiment.
- FIG. 3 is a flow chart illustrating a recording operation of the exemplary embodiment.
- FIG. 4 illustrates a relationship between a zoom speed and a recording frame rate of the exemplary embodiment.
- FIG. 5 illustrates a metadata structure and data examples of the present exemplary embodiment.
- FIG. 6 illustrates a relationship between zoom magnification and recorded frames of the exemplary embodiment.
- FIG. 7 illustrates recording areas each storing frames at different frame rates.
- FIG. 8 is a flow chart illustrating a reproduction operation of the exemplary embodiment.
- FIG. 9 is a table illustrating reproduction modes that involve zoom operations.
- FIG. 10 illustrates frame interpolation in reproduction mode 1 of the exemplary embodiment.
- FIG. 11 illustrates frame interpolation in reproduction mode 2 of the exemplary embodiment.
- FIG. 12 illustrates frame interpolation in reproduction mode 3 of the exemplary embodiment.
- FIG. 13 illustrates frame interpolation in reproduction mode 5 of the exemplary embodiment.
- FIG. 14 is a block diagram illustrating functions of a system control unit 50.
- FIGS. 15A to 15D illustrate a relationship among a zoom operation, captured frames, and recorded frames of a second exemplary embodiment.
- FIG. 16 illustrates a relationship between zoom magnification and recorded frames of the second exemplary embodiment.
- Various exemplary embodiments, features, and aspects of the invention will be described in detail below with reference to the drawings.
FIG. 1 is a block diagram illustrating a schematic configuration of an exemplary embodiment of the present invention. A configuration and a basic operation of the present exemplary embodiment will be described with reference toFIG. 1 . - An imaging unit 10 (camera unit) has the following configuration and functions. The imaging optical system of the
imaging unit 10 includes afront lens 12, azoom lens 14, adiaphragm 16, and a focus lens 18. Thefront lens 12 is fixed to a lens barrel, and thezoom lens 14 and the focus lens 18 can move along the optical axis. Thediaphragm 16 is arranged between thezoom lens 14 and the focus lens 18. - An
exposure control unit 20 receives a control signal from acamera control unit 30 and controls the aperture of thediaphragm 16 based on the control signal. Azoom position sensor 22 detects the position of thezoom lens 14. Based on the position of thezoom lens 14 detected by thezoom position sensor 22, alens control unit 24 carries out feedback control on thezoom lens 14. Thelens control unit 24 also controls the zoom position of thezoom lens 14 and the focus position of the focus lens 18, based on instruction signals from thecamera control unit 30 to be described later. - An image
sensor driving unit 28 drives animage sensor 26 that converts an optical image formed by the imaging optical system into an electric image signal. The imagesensor driving unit 28 drives theimage sensor 26 based on a timing signal supplied from thecamera control unit 30 and controls theimage sensor 26 to output each pixel signal for the image signal. By increasing the driving signal frequency of the imagesensor driving unit 28, a frame rate higher than a standard frame rate can be handled. - A sample-and-hold (S/H)
unit 32 amplifies the image signal output from theimage sensor 26, and samples and holds the signal based on a timing signal supplied from the camera control unit 30 (or the image sensor driving unit 28). An analog-to-digital (A/D)converter 34 converts the signal supplied from the S/H unit 32 into a digital signal, and supplies the digital signal to a camerasignal processing unit 36. - Based on parameters set by the
camera control unit 30, the camerasignal processing unit 36 carries out known camera signal processing, such as color separation, gradation correction, and white balance adjustment, on the image data output from the A/D converter 34. The camerasignal processing unit 36 supplies the processed image data to a compression/expansion unit 44 via adata bus 38 and amemory 40. - While an audio input unit and a system for processing an audio signal supplied from the audio input unit are not illustrated in
FIG. 1 , input audio data is supplied to the compression/expansion unit 44 via thememory 40. The compression/expansion unit 44 carries out compressing, coding, and multiplexing processing on the input image and audio data, based on a moving image compression method such as moving picture experts group (MPEG) 2 or H.264. The compression/expansion unit 44 outputs the data in a predetermined format. - A recording
medium control unit 46 writes the image and audio data compressed by the compression/expansion unit 44 in arecording medium 48. Examples of therecording medium 48 include a nonvolatile semiconductor memory such as a flash memory, a hard disk apparatus, and a recordable optical disk. - Based on instructions from the
system control unit 50, thecamera control unit 30 controls operations of thelens control unit 24, theexposure control unit 20, the imagesensor driving unit 28, and the camerasignal processing unit 36. - The
system control unit 50 includes a central processing unit (CPU) and controls individual units according to instructions from anoperation unit 52 operated by a user, setting values, or an operation status. For example, thesystem control unit 50 controls an imaging operation via thecamera control unit 30 and controls recording/reproducing data in therecording medium 48 via the recordingmedium control unit 46. Needless to say, thesystem control unit 50 comprehensively controls the entire imaging apparatus by executing predetermined programs. - An external interface (I/F) 54 sends and receives signals in a predetermined format to and from an external device connected to an
external terminal 56. Examples of the external terminal 56 include terminals that comply with standards such as the institute of electrical and electronics engineers (IEEE) 1394, the universal serial bus (USB), and/or the high-definition multimedia interface (HDMI). Examples of the signal format include analog video signals (audio/video) and digital video signals of various resolutions and frame rates. - The
operation unit 52 is used as a user interface and includes a shutter button, a zoom button, a power-source on/off button, and a record start/stop button used during imaging or recording. The operation unit 52 also includes a select button, a determination button, and a cancel button used to make selections on the menu, set parameters, and the like. - A
display control unit 58 drives a display apparatus 60, which is used as a monitor for displaying captured and reproduced images and as a display unit for presenting various types of management information to users. - The
memory 40 is shared by each function block via the data bus 38, and is formed by a read only memory (ROM) and/or a random-access memory (RAM). Each function block connected to the data bus 38 can write data to and read data from the memory 40 via a memory control unit 42. - Image and audio data recorded in the
recording medium 48 is reproduced as follows. The recording medium control unit 46 supplies compressed data read from the recording medium 48 to the compression/expansion unit 44. The compression/expansion unit 44 decompresses the input compressed data to restore video data and audio data. - The
display control unit 58 supplies the restored video data to the display apparatus 60, which displays a reproduced image using the supplied video data. Examples of the display apparatus 60 include a liquid crystal display panel and an organic light-emitting device. A speaker (not illustrated) outputs reproduced audio. When necessary, reproduced video data and audio data are output to the outside via the external I/F 54 and the external terminal 56. - An operation for controlling the frame rate at which video data is recorded in the
recording medium 48 will be described below. FIG. 2 illustrates a relationship between captured frames temporarily stored in the memory 40 and recorded frames recorded in the recording medium 48. FIG. 2A illustrates a zoom operation period during imaging carried out by the operation unit 52. FIG. 2B illustrates captured frames temporarily stored in the memory 40. FIG. 2C illustrates recorded frames recorded in the recording medium 48. -
FIG. 2A illustrates a zoom-on period, which is when a zoom operation is carried out by a photographer, and a zoom-off period, which is when no zoom operation is carried out by a photographer. As illustrated in FIG. 2B, the camera control unit 30 drives the image sensor 26 at a rate of 1/240 second, irrespective of on/off of the zoom operation. The image data of each frame is temporarily stored in the memory 40 at a frame rate of 1/240 second. - The
camera control unit 30 controls the recording medium control unit 46 to write frame images temporarily stored in the memory 40 in the recording medium 48, as illustrated in FIG. 2C. More specifically, the frame images captured during the zoom-off period and temporarily stored in the memory 40 are thinned to ¼ of their number, and the resultant frame images are written in the recording medium 48 at a normal frame rate of 1/60 second. In the zoom-on period, the frame images temporarily stored in the memory 40 at a frame rate of 1/240 second are written in the recording medium 48 without change.
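- The relationship between the captured frames of FIG. 2B and the recorded frames of FIG. 2C can be summarized in a minimal sketch (Python is used here purely for illustration; the buffer and frame objects are assumed placeholders, not elements of the disclosure):

```python
CAPTURE_FPS = 240           # the image sensor 26 is always driven at 1/240 second
NORMAL_RECORD_FPS = 60      # normal recording rate of 1/60 second
THINNING_FACTOR = CAPTURE_FPS // NORMAL_RECORD_FPS   # keep every 4th frame

def frames_to_record(buffered_frames, zoom_on):
    """Return the subset of temporarily stored frames to write to the
    recording medium 48 for the current interval."""
    if zoom_on:
        # Zoom-on period: every captured frame is written without change.
        return list(buffered_frames)
    # Zoom-off period: thin to 1/4, i.e. the normal 1/60-second rate.
    return [f for i, f in enumerate(buffered_frames) if i % THINNING_FACTOR == 0]
```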
- FIG. 3 is a flow chart illustrating a control operation of a recording frame rate. The system control unit 50 monitors a zoom operation carried out by the operation unit 52. In step S1, if the system control unit 50 detects a zoom operation during a recording operation (YES in step S1), the system control unit 50 carries out the processing in steps S2 to S5. If not (NO in step S1), the processing proceeds to step S6. - If the
system control unit 50 detects a zoom operation (YES in step S1), in step S2, the system control unit 50 determines the zoom speed. For example, the zoom speed is determined based on the force applied by the user to the zoom switch on the operation unit 52 or the length of time for which the zoom switch is pushed. If a zoom speed has been set in advance on the menu screen or the like, the system control unit 50 refers to that setting. In other words, the system control unit 50 also functions as a zoom speed detection unit. - In step S3, based on the zoom speed determined in step S2, the
system control unit 50 sets a recording frame rate. FIG. 4 illustrates a relationship between the recording frame rate and the zoom speed. As illustrated in FIG. 4, a higher recording frame rate is set for a higher zoom speed, and a lower recording frame rate is set for a lower zoom speed. - As the relationship between the zoom speed and the recording frame rate of
FIG. 4 illustrates, the recording frame rate may be changed on a step-by-step basis depending on the zoom speed. Alternatively, the recording frame rate may be changed linearly. Such a relationship between the zoom speed and the recording frame rate is set in advance in the form of a table and stored in the memory 40 (in the ROM therein). - In step S4, the
system control unit 50 carries out a thinning process on the image frame data in the memory 40 based on the recording frame rate set in step S3, and records the remaining frames in the recording medium 48. In step S5, the system control unit 50 records zoom magnification information and frame rate management information in the recording medium 48 as metadata, along with video data of each frame. In step S6, if instructions to stop recording are input, the processing ends. If not, steps S1 to S5 are repeated.
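- A rough sketch of the recording loop of FIG. 3 follows. The stepwise zoom-speed-to-frame-rate table mirrors the relationship of FIG. 4, but the particular thresholds and the camera/recorder interfaces are illustrative assumptions only:

```python
RATE_TABLE = [           # (minimum zoom speed, recording frame rate in fps); assumed breakpoints
    (0.0, 60),
    (1.0, 120),
    (2.0, 240),
]

def recording_frame_rate(zoom_speed):
    """Step S3: choose the highest rate whose threshold the zoom speed reaches."""
    rate = RATE_TABLE[0][1]
    for threshold, fps in RATE_TABLE:
        if zoom_speed >= threshold:
            rate = fps
    return rate

def record_loop(camera, recorder):
    while not camera.stop_requested():                        # step S6
        if camera.zoom_operation_detected():                  # step S1
            fps = recording_frame_rate(camera.zoom_speed())   # steps S2-S3
        else:
            fps = 60                                          # normal rate when not zooming
        frames = camera.thin_buffered_frames(fps)             # step S4: thinning process
        recorder.write(frames,                                # step S5: frames plus metadata
                       metadata={"frame_rate": fps,
                                 "zoom_magnification": camera.zoom_magnification()})
```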
- Since the recording frame rate is changed based on the zoom speed, when images are captured at a low zoom speed, the images are recorded at a low frame rate. Thus, since unnecessary captured frames are not recorded, consumption of the available capacity of the recording medium 48 can be suppressed. While an object moves rapidly during a zoom-on period at a high zoom speed, the object is followed at a high frame rate, so captured scenes in which the angle of view changes greatly can be recorded with good quality at a high sampling rate. -
FIG. 5 illustrates a structure of metadata added to the video data of each frame, and data examples. The video data includes a header portion, a metadata portion, and a data portion. The header portion stores basic information that does not change depending on time, such as imaging apparatus information, format information, and thumbnail information. - The metadata portion stores information that is updated depending on various imaging conditions, such as a date, time, a recording frame rate, and a shutter speed. The data portion stores image data and audio data encoded in a predetermined format. The metadata is embedded in the video data in the metadata format using a description language such as extensible markup language (XML) or hypertext markup language (HTML). The metadata may be added as binary data or a watermark.
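- Purely as an illustration, the per-frame metadata described above might be expressed and read back as follows; the XML element names and sample values are hypothetical, since the description fixes the kinds of fields but not a concrete schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-frame metadata in XML form (element names are assumptions).
FRAME_METADATA_XML = """\
<frame_metadata>
  <date>2010-05-11</date>
  <time>10:15:30.250</time>
  <recording_frame_rate>240</recording_frame_rate>
  <shutter_speed>1/500</shutter_speed>
  <zoom_magnification>3.2</zoom_magnification>
</frame_metadata>
"""

def read_frame_metadata(xml_text):
    """Parse the fields that are later used to control reproduction."""
    root = ET.fromstring(xml_text)
    return {
        "recording_frame_rate": int(root.findtext("recording_frame_rate")),
        "zoom_magnification": float(root.findtext("zoom_magnification")),
    }

print(read_frame_metadata(FRAME_METADATA_XML))
# {'recording_frame_rate': 240, 'zoom_magnification': 3.2}
```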
-
FIG. 6 schematically illustrates a relationship between the recording frame rate and the zoom magnification according to the present exemplary embodiment. In FIG. 6, the horizontal axis represents time, and the vertical axis represents zoom magnification (angle of view of the image). Double circles represent frames captured at 1/240-second intervals, and single circles represent frames captured at 1/60-second intervals. According to the present exemplary embodiment, the frames represented by the single circles and captured at 1/60-second intervals are recorded in the recording medium 48, irrespective of on/off of the zoom operation. - In the example illustrated in
FIG. 6, during the zoom operation, namely, during the period between time t3 and time t9 when the zoom magnification changes, frames captured at 1/240-second intervals, a rate higher than the normal rate, are recorded in the recording medium 48. Namely, captured frames are recorded at 1/240-second intervals during the zoom operation, and when the zoom operation is not carried out, the captured frames are thinned and recorded at 1/60-second intervals. - To ensure compatibility for reproduction, according to the present exemplary embodiment, storage of the video data in the
recording medium 48 is controlled as follows. FIG. 7 illustrates an example of how the image data illustrated in FIG. 6 is allocated and recorded in the recording medium 48. - Frames captured at t1, t2, t3, and the like at a normal recording frame rate of 1/60 second are stored in a main storage area A of the
recording medium 48. On the other hand, if a zoom operation is carried out and the frame rate is changed, the frames captured at a higher frame rate between t3, t3a, t3b, t3c, t4, ..., and t9 are stored in a sub-storage area Sub of the recording medium 48. - Thus, by separating video data depending on the frame rate and separately storing the data in different areas of the
recording medium 48, even when an apparatus without special reproduction functions is used, the video data stored in the main storage area A can be reproduced. Thus, compatibility for reproduction can be ensured. The main storage area A and the sub-storage area Sub are distinguished logically in terms of file management, and thus physical recording locations of these areas may be identical. - In
FIG. 7, the frames captured at t3, t4, t5, and the like are recorded in the sub-storage area Sub as well as in the main storage area A. This is for decoding processing carried out during reproduction. If the frames captured at t3, t4, t5, and the like in the main storage area can be used, these frames do not need to be stored in the sub-storage area Sub.
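- Assuming the two areas are simple logical containers, the allocation rule of FIG. 7 can be sketched as follows; how area A and area Sub map to files on the recording medium 48 is left open here:

```python
def allocate_frame(frame, is_normal_rate_frame, zoom_on, main_area, sub_area):
    """Decide where a recorded frame is stored, following FIG. 7."""
    if is_normal_rate_frame:
        main_area.append(frame)       # 1/60-second frames: playable on any apparatus
        if zoom_on:
            sub_area.append(frame)    # duplicated so the Sub stream decodes on its own
    elif zoom_on:
        sub_area.append(frame)        # extra 1/240-second frames during the zoom
    # other frames were removed by the thinning step and never reach this point
```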
- A reproduction operation of the present exemplary embodiment will be described below. According to the present exemplary embodiment, before reproducing videos of different frame rates, an interpolation or thinning process is carried out. In this way, scenes captured during a zoom operation can be provided with special reproduction effects. FIG. 8 is a flow chart illustrating a reproduction operation. - In step S11, based on reproduction instructions from the
operation unit 52, the system control unit 50 reads metadata about a specified video file from the file management information of the recording medium 48. In step S12, the system control unit 50 refers to the metadata and determines whether videos of different frame rates are recorded in the video file to be reproduced. - If images are not recorded at different recording frame rates (NO in step S12), the processing proceeds to step S16. In step S16, normal reproduction processing is carried out.
- If images are recorded at different recording frame rates (YES in step S12), in step S13, the
system control unit 50 reads special reproduction effects previously set by the user. In steps S14 and S15, the system control unit 50 controls the displayed frames and the audio output as described below. FIG. 9 is a table illustrating special reproduction effects (modes) that involve zoom operations and the effects of the individual modes. - A linear smoothing mode (mode 1) is a reproduction interpolation mode in which, even when the speed of a zoom operation carried out by the user is not constant, the zoom speed is changed during reproduction processing and reproduced images are displayed (rendered) at a constant zoom speed. In the linear smoothing mode, among the frames recorded in the
recording medium 48, the system control unit 50 selects frames so that the zoom magnification of the selected frames changes proportionally. Next, the selected frames are displayed on the display apparatus 60 at a normal frame rate. -
FIG. 10 schematically illustrates frames displayed in the linear smoothing mode. The horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image). Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals. During the period between time t3 and time t9 when the zoom magnification is changing, captured frames are recorded at 1/240-second intervals. - When the zoom operation is not carried out, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals. As illustrated in
FIG. 10, the zoom speed is not constant during imaging, i.e., the zoom speed fluctuates during a zoom operation from the wide end to the telephoto end. - The linear smoothing mode will be described in detail with reference to
FIG. 10. Actually recorded frames are shifted so that the zoom magnification changes linearly as set in the linear smoothing mode, and the frames to be reproduced and displayed at a normal frame rate are selected and determined. In FIG. 10, the frames at time t5c, t6c, t7c, t8a, and t8c are selected to be displayed. - The frame at time t5c is shifted as a frame at time t4 (frame 70-1 represented as a black circle) to be displayed. The frame at time t6c is shifted as a frame at time t5 (frame 70-2 represented as a black circle) to be displayed. The frame at time t7c is shifted as a frame at time t6 (frame 70-3 represented as a black circle) to be displayed. The frame at time t8a is shifted as a frame at time t7 (frame 70-4 represented as a black circle) to be displayed. The frame at time t8c is shifted as a frame at time t8 (frame 70-5 represented as a black circle) to be displayed.
- Thus, in the linear smoothing mode, captured frames (recorded frames) whose zoom magnification changes linearly from the wide end to the telephoto end are re-arranged on the time axis so that the frames are displayed at a constant zoom speed.
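- A minimal sketch of this selection, assuming each recorded frame carries its zoom magnification in the metadata, is given below; the exact frame choices of FIG. 10 may differ slightly from what such a nearest-magnification rule produces:

```python
def linear_smoothing(zoom_frames, n_slots):
    """Pick frames whose zoom magnification follows a straight line.

    zoom_frames: list of (magnification, frame) recorded during the zoom,
    in capture order; n_slots: number of output positions at the normal
    display rate (e.g. the slots for frames 70-1 to 70-5 in FIG. 10)."""
    start_mag, end_mag = zoom_frames[0][0], zoom_frames[-1][0]
    denom = max(n_slots - 1, 1)
    selected = []
    for k in range(n_slots):
        target = start_mag + (end_mag - start_mag) * k / denom
        # recorded frame whose magnification is nearest to the linear target
        _, frame = min(zoom_frames, key=lambda mf: abs(mf[0] - target))
        selected.append(frame)
    return selected
```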
- A dynamic shooting mode (mode 2) is a reproduction interpolation mode in which an image is zoomed rapidly as the object comes closer, irrespective of the zoom operation speed by the user. In other words, the dynamic shooting mode provides a zoom effect of gradually increasing the zoom speed.
- Among the recorded frames in the
recording medium 48, the system control unit 50 selects frames so that the speed at which the zoom magnification changes is gradually increased. The selected frames are displayed on the display apparatus 60 at a normal frame rate. -
FIG. 11 schematically illustrates frames displayed in the dynamic shooting mode. The horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image). Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals. - During the period between time t3 and time t9 when the zoom magnification changes, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals. In the example illustrated in
FIG. 11, the speed at which the zoom magnification changes is constant during imaging. - The dynamic shooting mode will be described in detail with reference to
FIG. 11. Actually recorded frames are shifted so that the zoom magnification changes in a curve as set in the dynamic shooting mode, and the frames to be reproduced and displayed at a normal frame rate are selected and determined. In FIG. 11, the frames at time t3, t3a, t3b, t4, and t5a are selected to be displayed. - The frame at time t3 is temporally shifted as a frame at time t4 (frame 72-1 represented as a black circle) to be displayed. The frame at time t3a is temporally shifted as a frame at time t5 (frame 72-2 represented as a black circle) to be displayed. The frame at time t3b is temporally shifted as a frame at time t6 (frame 72-3 represented as a black circle) to be displayed. The frame at time t4 is temporally shifted as a frame at time t7 (frame 72-4 represented as a black circle) to be displayed. The frame at time t5a is temporally shifted as a frame at time t8 (frame 72-5 represented as a black circle) to be displayed.
- Thus, in the dynamic shooting mode, frames whose zoom magnification changes nonlinearly from the wide end to the telephoto end are displayed at a normal frame rate, so that the zoom speed is gradually increased.
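- The same selection idea extends to curved targets. The quadratic curves below are assumed examples only: an accelerating profile for the dynamic shooting mode, and a decelerating one usable for the soft landing mode described next:

```python
def eased_selection(zoom_frames, n_slots, ease):
    """Like the linear case, but the target magnification follows a curve."""
    start_mag, end_mag = zoom_frames[0][0], zoom_frames[-1][0]
    denom = max(n_slots - 1, 1)
    selected = []
    for k in range(n_slots):
        progress = ease(k / denom)                       # 0..1, shaped by the curve
        target = start_mag + (end_mag - start_mag) * progress
        _, frame = min(zoom_frames, key=lambda mf: abs(mf[0] - target))
        selected.append(frame)
    return selected

ease_in = lambda t: t * t               # dynamic shooting: zoom speed gradually increases
ease_out = lambda t: 1 - (1 - t) ** 2   # soft landing: zoom speed gradually decreases
```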
- A soft landing mode (mode 3) is a reproduction interpolation mode in which an image is zoomed at a decreasing speed as the object comes closer, irrespective of the zoom operation speed by the user. Namely, the soft landing mode provides a zoom effect of gradually decreasing the zoom speed.
- Among the recorded frames in the
recording medium 48, the system control unit 50 selects frames so that the zoom speed gradually decreases as the object comes closer. The selected frames are displayed on the display apparatus 60 at a normal frame rate. -
FIG. 12 schematically illustrates frames displayed in the soft landing mode. The horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image). Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals. - During the period between time t3 and time t9 when the zoom magnification changes, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals. In the example illustrated in
FIG. 12, the speed at which the zoom magnification is changing is constant during imaging. - The soft landing mode will be described in detail with reference to
FIG. 12. Actually recorded frames are shifted so that the zoom magnification changes in a curve as set in the soft landing mode, and the frames to be reproduced and displayed at a normal frame rate are selected and determined. In FIG. 12, the frames at time t6c, t8, t8b, t8c, and t9 are selected to be displayed. - The frame at time t6c is temporally shifted as a frame at time t4 (frame 74-1 represented as a black circle) to be displayed. The frame at time t8 is temporally shifted as a frame at time t5 (frame 74-2 represented as a black circle) to be displayed. The frame at time t8b is temporally shifted as a frame at time t6 (frame 74-3 represented as a black circle) to be displayed. The frame at time t8c is temporally shifted as a frame at time t7 (frame 74-4 represented as a black circle) to be displayed. The frame at time t9 is temporally shifted as a frame at time t8 (frame 74-5 represented as a black circle) to be displayed.
- Thus, in the soft landing mode, frames whose zoom magnification changes nonlinearly from the wide end to the telephoto end are interpolated and displayed, so that the zoom speed is gradually decreased.
- A slow motion mode (mode 4) is a reproduction mode in which all the frame images recorded at a high frame rate during a zoom operation are rendered at a preset magnification. For example, frame images captured and recorded at a high frame rate of 1/240-second intervals during a zoom period are reproduced at a normal frame rate of 1/60-second intervals. As a result, the images are displayed at a slow speed (¼ of the original speed). Namely, the video recorded during a zoom operation is reproduced slowly and can be observed easily.
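- A sketch of this playback behaviour, with a hypothetical display interface, is:

```python
NORMAL_INTERVAL = 1.0 / 60      # normal display interval of 1/60 second

def slow_motion_playback(zoom_frames, display):
    """Show every high-rate frame at the normal display interval.

    Frames recorded at 1/240-second intervals and shown every 1/60 second
    play back at 1/4 of the original speed; `display` is an assumed
    output interface, not part of the disclosure."""
    for frame in zoom_frames:
        display.show(frame)
        display.wait(NORMAL_INTERVAL)
```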
- A skip mode (mode 5) is a reproduction mode that may be used when a user makes a mistake in a zoom operation. For example, when a user increases the zoom magnification excessively and then hurriedly decreases it, such an erroneous operation is automatically detected, and reproduction of the images captured by the erroneous operation is skipped. When a zoom-in operation or a zoom-out operation is repeated in a short period of time and an erroneous operation is made, this mode is effective in removing the images captured by the erroneous operation.
-
FIG. 13 schematically illustrates frames displayed in the skip mode. The horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image). Double circles represent frames recorded at 1/240-second intervals during a zoom operation, and single circles represent frames recorded at 1/60-second intervals. - During the period between time t3 and time t9 when the zoom magnification is changing, captured frames are recorded at 1/240-second intervals. When the zoom operation is not carried out, the captured frames are thinned and recorded at a frame rate of 1/60-second intervals. In the example illustrated in
FIG. 13, the zoom magnification changes at a constant speed from the wide end at time t3 to the telephoto end at t6. However, after time t6, the zoom magnification is somewhat decreased toward the wide end. In FIG. 13, the frames captured during the erroneous operation are not displayed on the screen. - The skip mode will be described in detail with reference to
FIG. 13. In the skip mode, changes in an intermediate area of the zoom magnification are ignored, and the skip mode sets the changes in the zoom magnification as illustrated in FIG. 13. More specifically, the starting point and the end point of a zoom operation are connected based on a certain rule (a straight line in FIG. 13). Next, actually recorded frames are shifted to correspond to the zoom magnification set in the skip mode, and the frames to be reproduced and displayed at a normal frame rate are selected and determined. - In
FIG. 13, the frames at time t3b, t4, t4b, and t5 are selected to be displayed. The frame at time t3b is temporally shifted as a frame at time t4 (frame 76-1 represented as a black circle) to be displayed. The frame at time t4 is temporally shifted as a frame at time t5 (frame 76-2 represented as a black circle) to be displayed. The frame at time t4b is temporally shifted as a frame at time t6 (frame 76-3 represented as a black circle) to be displayed. The frame at time t5 is temporally shifted as a frame at time t7 (frame 76-4 represented as a black circle) to be displayed. - Thus, when a user carries out an erroneous zoom operation or adjustment, if the user captures a video in which the zoom magnification is overshot, the user can reproduce and display the video without the overshoot portion by using the skip mode.
- In the case of the example illustrated in
FIG. 13, by monitoring the amount of the zoom operation within a certain time based on the metadata and determining an average zoom magnification A or B during a certain period, the zoom operation period that deviates from the average zoom magnification (the overshoot portion) can be removed. Further, frame interpolation during a zoom operation is carried out by re-arranging the frames during the zoom operation so that the changes of the zoom magnification are constant between the average zoom magnifications A and B.
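- Under the assumption that the averages A and B are taken over short windows at the start and end of the operation, the skip-mode selection can be sketched as follows (the window length is an assumed value):

```python
def skip_mode(zoom_frames, n_slots, window=8):
    """Drop an overshoot by aiming at a straight line between the average
    magnification near the start (A) and near the end (B) of the operation.

    zoom_frames: list of (magnification, frame) for the zoom period."""
    head = zoom_frames[:window]
    tail = zoom_frames[-window:]
    avg_a = sum(m for m, _ in head) / len(head)
    avg_b = sum(m for m, _ in tail) / len(tail)
    denom = max(n_slots - 1, 1)
    selected = []
    for k in range(n_slots):
        target = avg_a + (avg_b - avg_a) * k / denom
        # frames far above the A-to-B line (the overshoot) are never nearest
        _, frame = min(zoom_frames, key=lambda mf: abs(mf[0] - target))
        selected.append(frame)
    return selected
```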
- While five special reproduction modes have thus been described, any one of these special reproduction modes may be specified for each scene as a display attribute. A plurality of these special reproduction modes may be used in combination. Further, the above special zoom reproduction effects have been described based on examples where a video is zoomed in from the wide end to the telephoto end. However, needless to say, the same processing is possible when a video is zoomed out from the telephoto end to the wide end. - In step S15, the
system control unit 50 continuously controls, reproduces, and outputs recorded audio, irrespective of selection of reproduced and output frames. For example, the system control unit 50 appropriately adjusts the time axis and/or deletes unnecessary portions (silent or prolonged portions, for example).
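- One possible way to keep the audio continuous while trimming silence, with assumed threshold values and a hypothetical chunk representation, is:

```python
def continuous_audio(chunks, silence_threshold=0.01, max_silent_chunks=5):
    """Keep the audio continuous while trimming overly long silent stretches.

    chunks: list of (rms_level, samples); the threshold and run length are
    assumed values. Video frames may be re-arranged or skipped elsewhere,
    but the audio itself is played straight through without a break."""
    kept, silent_run = [], 0
    for level, samples in chunks:
        if level < silence_threshold:
            silent_run += 1
            if silent_run > max_silent_chunks:
                continue                 # drop only the excess silence
        else:
            silent_run = 0
        kept.append(samples)
    return kept
```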
- Referring back to FIG. 8, after these output frames and audio are controlled as described above, in step S16, the reproduction and outputting processing is carried out. When necessary, the external I/F 54 supplies the reproduction signal to the external device connected thereto. - In step S17, if the
system control unit 50 receives instructions to stop reproduction from the operation unit 52 or completes reproduction of the object to be reproduced (YES in step S17), the system control unit 50 ends the reproduction processing. If not (NO in step S17), the processing returns to step S11. -
FIG. 14 is a block diagram illustrating functions of the system control unit 50. - The
system control unit 50 is connected to the operation unit 52 and also to the compression/expansion unit 44, the recording medium control unit 46, the memory 40, and the memory control unit 42 via the data bus 38. The system control unit 50 includes a zoom operation detection unit 50-1, a frame rate control unit 50-2, and a file management data generation unit 50-3 as a recording system. Further, the system control unit 50 includes a file management data detection unit 50-4, a display frame control unit 50-5, and an audio control unit 50-6 as a reproducing system. - During recording, the zoom operation detection unit 50-1 detects whether a zoom operation is carried out by the user and the zoom speed during a zoom operation. The zoom operation detection unit 50-1 supplies the detected zoom operation information to the frame rate control unit 50-2. The frame rate control unit 50-2 controls the recording frame rate, based on the zoom operation information supplied from the zoom operation detection unit 50-1.
- The file management data generation unit 50-3 stores frame rate and zoom operation information about video data stored in the
recording medium 48 in the file management data as metadata. Needless to say, the file management data generation unit 50-3 may store the recording frame rate and zoom operation information in another file different from an image file, as long as each frame of video data is associated with the recording frame rate and zoom operation information. - During reproduction, the file management data detection unit 50-4 reads the metadata about video data to be reproduced from the
recording medium 48, and supplies the metadata to the display frame control unit 50-5 and the audio control unit 50-6. The display frame control unit 50-5 controls display frames as described above, based on the metadata supplied from the file management data detection unit 50-4 and a specified reproduction mode. Similarly, the audio control unit 50-6 controls output audio as described above, based on the metadata supplied from the file management data detection unit 50-4 and a specified reproduction mode. - As described above in detail, according to the present exemplary embodiment, frames captured during a zoom operation are recorded at a high frame rate, and during reproduction, based on a specified effect, certain frames are re-arranged on the time axis. In this way, videos during zoom operations can be reproduced and displayed with various display effects.
- Since videos are captured at a high frame rate, frame images of a desired zoom magnification can be extracted, and high-quality and smooth interpolation can be realized. Additionally, even when a video captured during a zoom operation is visually undesirable because of a human error or the like during the zoom operation, by carrying out an interpolation process, the video can be reproduced without the undesirable portion. By separately processing the video and audio during such interpolation process, while some frames are skipped in the video, continuity of the audio can be maintained and reproduced without a break.
- In the above exemplary embodiments, the camera unit constantly captures frames at a high frame rate, and the frame rate at which the frames are recorded in the
recording medium 48 is decreased when necessary. However, the present invention is not limited to such an example. For example, by changing the frequency of the timing signal output from the image sensor driving unit 28, the frame rate at which the camera unit captures frames can itself be changed flexibly, and similar effects can be obtained. - While a video captured during a zoom operation often includes important scenes, videos captured before and after a zoom operation similarly often include important scenes. Thus, irrespective of on/off of a zoom operation, frames are captured at a high frame rate, so that video data captured during certain periods before and after the zoom operation can be recorded in the
recording medium 48 at a high frame rate. During reproduction, special reproduction effects similar to those of the first exemplary embodiment are applied to a zoom operation period and the periods before and after the zoom operation period. In this way, a video can be reproduced smoothly during a zoom operation period and the periods before and after the zoom operation period. -
FIG. 15 illustrates a reproduction operation of the present exemplary embodiment. The horizontal axis represents time. -
FIG. 15A illustrates on/off of a zoom operation during recording. FIG. 15B illustrates a frame control signal indicating a special reproduction period formed by a zoom operation period and the periods before and after the zoom operation period. Thus, the special reproduction period is formed by the zoom operation period illustrated in FIG. 15A, a period D1 before the zoom operation period, and a period D2 after the zoom operation period. FIG. 15C illustrates captured frames. FIG. 15D illustrates frames recorded in the recording medium 48. - As seen from
FIGS. 15B and 15D, according to the present exemplary embodiment, in addition to the frames captured during a zoom operation, the frames captured during periods D1 and D2, which are before and after the zoom operation, are also recorded in the recording medium 48 at a high frame rate of 1/240-second intervals. The lengths of periods D1 and D2 may be different from each other. - To record frames captured during period D1, which is before the zoom operation, in the
recording medium 48, a buffer memory capable of storing video data output from the camera signal processing unit 36 for more than period D1 is set in the memory 40 in advance. - If the start of a zoom operation is detected, frames captured during period D1, the zoom operation, and period D2 and recorded in the
buffer memory 40 are encoded without changing the frame rate, and recorded in the recording medium 48. If the start of a zoom operation is not detected, frames captured after period D1 and stored in the buffer memory 40 are thinned to a frame rate of 1/60-second intervals. The resultant frames are then encoded and recorded in the recording medium 48.
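- A minimal pre-roll sketch for period D1 follows; the assumed one-second length of D1 and the recorder interface are illustrative only, and the handling of the zoom period itself and of period D2 is omitted:

```python
from collections import deque

CAPTURE_FPS = 240
THINNING = 4                    # 1/240 second -> 1/60 second
D1_SECONDS = 1.0                # assumed length of period D1
BUFFER_LEN = int(CAPTURE_FPS * D1_SECONDS)

pre_roll = deque()
popped = 0

def on_captured_frame(frame, zoom_started, recorder):
    """Buffer period D1 and decide how frames leave the buffer.

    `recorder` stands for the encode-and-write path to the recording medium 48."""
    global popped
    if zoom_started:
        recorder.write_many(list(pre_roll))   # the whole D1 pre-roll at 1/240-second intervals
        pre_roll.clear()
    pre_roll.append(frame)
    if len(pre_roll) > BUFFER_LEN:            # frame older than D1 leaves the buffer
        old = pre_roll.popleft()
        if popped % THINNING == 0:
            recorder.write(old)               # thinned to the normal 1/60-second rate
        popped += 1
```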
- FIG. 16 schematically illustrates a relationship between the recording frame rate and zoom magnification according to the present exemplary embodiment. The horizontal axis represents time and the vertical axis represents zoom magnification (angle of view of the image). - Double circles represent frames captured at 1/240-second intervals, and single circles represent frames captured at 1/60-second intervals. According to the present exemplary embodiment, the frames represented by the single circles and captured at 1/60-second intervals are recorded in the
recording medium 48, irrespective of on/off of the zoom operation. In addition to the period between time t3 and time t9 when the zoom magnification changes, video data captured during the period between time t1 and time t3, which is before the zoom operation, and during the period between time t9 and time t11, which is after the zoom operation, is also recorded in the recording medium 48 at a frame rate of 1/240-second intervals, which is higher than the normal rate. - Thus, according to the present exemplary embodiment, in addition to a zoom operation period, video is recorded in the
recording medium 48 at a high frame rate during certain periods before and after the zoom operation period. Generally, when a user carries out a zoom operation, video captured before and after a zoom-in operation or a zoom-out operation includes a target object. Thus, the video includes important scenes that reflect the photographer's intention, and examples of such scenes include goal scenes in sports games and close-up facial expressions. By recording the video captured before and after a zoom operation at a high frame rate as described above, the video captured before and after the zoom operation can be displayed effectively. - While the above exemplary embodiments use an optical zoom, which optically zooms an optical image incident on the
image sensor 26, the present invention may similarly be applicable to an imaging apparatus which uses an electronic zoom that electronically zooms an image signal generated by the image sensor 26. - The rate of a video signal has been described as a frame rate in the above description. However, in the case of an interlace signal, by replacing the frame rate with a field rate, similar effects can be obtained.
- While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all modifications, equivalent structures, and functions.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/954,888 US9270968B2 (en) | 2009-05-20 | 2013-07-30 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009121535A JP5414357B2 (en) | 2009-05-20 | 2009-05-20 | Imaging device and playback device |
JP2009-121535 | 2009-05-20 | ||
US12/778,017 US8264573B2 (en) | 2009-05-20 | 2010-05-11 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
US13/605,065 US8508627B2 (en) | 2009-05-20 | 2012-09-06 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
US13/954,888 US9270968B2 (en) | 2009-05-20 | 2013-07-30 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/605,065 Continuation US8508627B2 (en) | 2009-05-20 | 2012-09-06 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
Publications (2)
Publication Number | Publication Date |
---|---|
US20130343732A1 true US20130343732A1 (en) | 2013-12-26 |
US9270968B2 US9270968B2 (en) | 2016-02-23 |
Family
ID=43124350
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/778,017 Expired - Fee Related US8264573B2 (en) | 2009-05-20 | 2010-05-11 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
US13/605,065 Expired - Fee Related US8508627B2 (en) | 2009-05-20 | 2012-09-06 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
US13/954,888 Expired - Fee Related US9270968B2 (en) | 2009-05-20 | 2013-07-30 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/778,017 Expired - Fee Related US8264573B2 (en) | 2009-05-20 | 2010-05-11 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
US13/605,065 Expired - Fee Related US8508627B2 (en) | 2009-05-20 | 2012-09-06 | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation |
Country Status (2)
Country | Link |
---|---|
US (3) | US8264573B2 (en) |
JP (1) | JP5414357B2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130300900A1 (en) * | 2012-05-08 | 2013-11-14 | Tomas Pfister | Automated Recognition Algorithm For Detecting Facial Expressions |
US20130329090A1 (en) * | 2012-06-08 | 2013-12-12 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US10917563B2 (en) * | 2018-12-07 | 2021-02-09 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium |
WO2022159328A1 (en) * | 2021-01-22 | 2022-07-28 | Qualcomm Incorporated | Zoom in or zoom out with slow-motion video capture |
Families Citing this family (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8730301B2 (en) * | 2010-03-12 | 2014-05-20 | Sony Corporation | Service linkage to caption disparity data transport |
US9256361B2 (en) | 2011-08-03 | 2016-02-09 | Ebay Inc. | Control of search results with multipoint pinch gestures |
JP2013219556A (en) * | 2012-04-09 | 2013-10-24 | Olympus Imaging Corp | Imaging apparatus |
JP6137877B2 (en) * | 2013-03-05 | 2017-05-31 | オリンパス株式会社 | Image processing apparatus, image processing method, and program thereof |
JP2015122734A (en) * | 2013-11-25 | 2015-07-02 | パナソニックIpマネジメント株式会社 | Imaging apparatus and imaging method |
DE102014102689A1 (en) | 2014-02-28 | 2015-09-03 | Arnold & Richter Cine Technik Gmbh & Co. Betriebs Kg | Motion picture camera arrangement and method for operating a motion picture camera arrangement |
JP2015186235A (en) * | 2014-03-26 | 2015-10-22 | ソニー株式会社 | Image sensor and electronic apparatus |
EP3165955B1 (en) * | 2014-07-02 | 2021-01-13 | Sony Corporation | Zoom control device, zoom control method, and program |
JP6341815B2 (en) * | 2014-09-08 | 2018-06-13 | キヤノン株式会社 | Imaging device, control method thereof, and program |
JP6371656B2 (en) * | 2014-09-26 | 2018-08-08 | キヤノン株式会社 | Image reproduction apparatus, image reproduction method and program, and imaging apparatus |
US10991393B2 (en) | 2016-01-07 | 2021-04-27 | Samsung Electronics Co., Ltd. | Electronic device and method of managing a playback rate of a plurality of images |
WO2018079390A1 (en) | 2016-10-27 | 2018-05-03 | ソニー株式会社 | Video signal processing device, imaging device, method for checking for flickering in imaging device, and server |
KR102295526B1 (en) | 2017-04-10 | 2021-08-30 | 삼성전자 주식회사 | Image sensor and image processing device including the same |
CN107343155B (en) * | 2017-07-10 | 2019-03-29 | Oppo广东移动通信有限公司 | Inhibit method and device, the terminal device of AEC jump |
JP6980470B2 (en) * | 2017-09-19 | 2021-12-15 | キヤノン株式会社 | Electronic devices and their control methods |
JP2019086701A (en) | 2017-11-08 | 2019-06-06 | キヤノン株式会社 | Imaging control apparatus and control method thereof |
KR20190052615A (en) | 2017-11-08 | 2019-05-16 | 캐논 가부시끼가이샤 | Imaging apparatus |
WO2019227324A1 (en) * | 2018-05-30 | 2019-12-05 | 深圳市大疆创新科技有限公司 | Method and device for controlling video playback speed and motion camera |
US20220327718A1 (en) * | 2021-04-13 | 2022-10-13 | Qualcomm Incorporated | Techniques for enhancing slow motion recording |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5963203A (en) * | 1997-07-03 | 1999-10-05 | Obvious Technology, Inc. | Interactive video icon with designated viewing position |
JP2001004903A (en) * | 1999-06-23 | 2001-01-12 | Fuji Photo Optical Co Ltd | Television lens system |
JP3819669B2 (en) * | 2000-04-13 | 2006-09-13 | 三洋電機株式会社 | Digital camera |
JP2001358984A (en) * | 2000-06-15 | 2001-12-26 | Mitsubishi Heavy Ind Ltd | Moving picture processing camera |
JP2002300457A (en) | 2001-03-30 | 2002-10-11 | Ricoh Co Ltd | Digital camera |
JP2002354329A (en) | 2001-05-30 | 2002-12-06 | Minolta Co Ltd | Photographing device and photographing system |
JP2003274360A (en) | 2002-03-14 | 2003-09-26 | Sony Corp | Apparatus and method for imaging, and device and system for imaging management |
US7456875B2 (en) * | 2002-03-14 | 2008-11-25 | Sony Corporation | Image pickup apparatus and method, signal processing apparatus and method, and wearable signal processing apparatus |
GB2388265B (en) * | 2002-04-30 | 2005-10-12 | Hewlett Packard Co | Improvements in and relating to processing of images |
JP2004064334A (en) * | 2002-07-26 | 2004-02-26 | Mitsubishi Electric Corp | Image pick-up apparatus |
JP4276452B2 (en) * | 2003-02-18 | 2009-06-10 | パナソニック株式会社 | Network camera |
JP4178076B2 (en) | 2003-05-30 | 2008-11-12 | 株式会社日立製作所 | Mobile phone |
JP2005086499A (en) * | 2003-09-09 | 2005-03-31 | Minolta Co Ltd | Imaging apparatus |
US6956589B2 (en) * | 2003-09-29 | 2005-10-18 | Beon Media Inc. | Method and system for specifying zoom size |
US6989848B2 (en) * | 2003-09-29 | 2006-01-24 | Beon Media Inc. | Method and system for specifying zoom speed |
US20050081247A1 (en) * | 2003-09-29 | 2005-04-14 | Lipsky Scott E. | Method and system for generating image display plans |
US7532753B2 (en) * | 2003-09-29 | 2009-05-12 | Lipsky Scott E | Method and system for specifying color of a fill area |
JP4687404B2 (en) | 2005-11-10 | 2011-05-25 | ソニー株式会社 | Image signal processing apparatus, imaging apparatus, and image signal processing method |
JP4262263B2 (en) * | 2006-06-07 | 2009-05-13 | キヤノン株式会社 | Imaging apparatus and control method thereof |
JP2008099110A (en) * | 2006-10-13 | 2008-04-24 | Olympus Corp | Imaging apparatus |
EP2079231B1 (en) * | 2006-10-24 | 2014-04-30 | Sony Corporation | Imaging device and reproduction control device |
EP2099213A4 (en) * | 2006-11-28 | 2011-02-16 | Nec Corp | Moving image pickup apparatus with zooming function, image processing, displaying method and program |
JP4861234B2 (en) * | 2007-04-13 | 2012-01-25 | 株式会社エルモ社 | Exposure control method and imaging apparatus |
JP5018332B2 (en) * | 2007-08-17 | 2012-09-05 | ソニー株式会社 | Image processing apparatus, imaging apparatus, image processing method, and program |
US20100039536A1 (en) * | 2008-08-14 | 2010-02-18 | Sony Ericsson Mobile Communications Ab | Video recording device and method |
JP2010087778A (en) * | 2008-09-30 | 2010-04-15 | Casio Computer Co Ltd | Imaging apparatus, variable speed imaging method, and program |
JP2011119854A (en) * | 2009-12-01 | 2011-06-16 | Canon Inc | Imaging device |
JP5764740B2 (en) * | 2010-10-13 | 2015-08-19 | パナソニックIpマネジメント株式会社 | Imaging device |
- 2009-05-20: JP JP2009121535A patent/JP5414357B2/en not_active Expired - Fee Related
- 2010-05-11: US US12/778,017 patent/US8264573B2/en not_active Expired - Fee Related
- 2012-09-06: US US13/605,065 patent/US8508627B2/en not_active Expired - Fee Related
- 2013-07-30: US US13/954,888 patent/US9270968B2/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008005051A (en) * | 2006-06-20 | 2008-01-10 | Ricoh Co Ltd | Imaging apparatus, program, and recording medium |
US20110199496A1 (en) * | 2010-02-16 | 2011-08-18 | Casio Computer Co., Ltd. | Image capturing apparatus, image capturing control method, and storage medium |
US8577161B2 (en) * | 2011-06-13 | 2013-11-05 | Canon Kabushiki Kaisha | Reproduction apparatus |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130300900A1 (en) * | 2012-05-08 | 2013-11-14 | Tomas Pfister | Automated Recognition Algorithm For Detecting Facial Expressions |
US8848068B2 (en) * | 2012-05-08 | 2014-09-30 | Oulun Yliopisto | Automated recognition algorithm for detecting facial expressions |
US20130329090A1 (en) * | 2012-06-08 | 2013-12-12 | Canon Kabushiki Kaisha | Image capturing apparatus and control method thereof |
US9210333B2 (en) * | 2012-06-08 | 2015-12-08 | Canon Kabushiki Kaisha | Image capturing apparatus for generating composite image and control method thereof |
US10917563B2 (en) * | 2018-12-07 | 2021-02-09 | Canon Kabushiki Kaisha | Image processing apparatus, image processing method, and non-transitory computer-readable storage medium |
WO2022159328A1 (en) * | 2021-01-22 | 2022-07-28 | Qualcomm Incorporated | Zoom in or zoom out with slow-motion video capture |
US11425306B2 (en) | 2021-01-22 | 2022-08-23 | Qualcomm Incorporated | Zoom in or zoom out with slow-motion video capture |
Also Published As
Publication number | Publication date |
---|---|
US8508627B2 (en) | 2013-08-13 |
US20100295970A1 (en) | 2010-11-25 |
US8264573B2 (en) | 2012-09-11 |
JP5414357B2 (en) | 2014-02-12 |
JP2010272999A (en) | 2010-12-02 |
US20120327273A1 (en) | 2012-12-27 |
US9270968B2 (en) | 2016-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9270968B2 (en) | Imaging apparatus and reproducing apparatus which changes frame rate based on zoom operation | |
US8081255B2 (en) | Image converting apparatus, image signal processing apparatus, camera system and image signal processing method | |
US8428423B2 (en) | Reproducing apparatus for video data | |
JP5036410B2 (en) | Imaging apparatus and control method thereof | |
US8743227B2 (en) | Imaging apparatus and control method for reducing a load of writing image data on a recording medium | |
EP1744554B1 (en) | Imaging device and method | |
JP4556195B2 (en) | Imaging device, moving image playback device, and program | |
JP2009111518A (en) | Imaging apparatus, image reproducing unit and program thereof, and data structure of image file | |
JP5164610B2 (en) | Imaging apparatus and control method thereof | |
US10348957B2 (en) | Image capturing apparatus, method of controlling the same, and storage medium for shooting a still image without interrupting shooting of moving images | |
WO2011021445A1 (en) | Image processing apparatus and image processing method | |
JP2010021710A (en) | Imaging device, image processor, and program | |
JP5076457B2 (en) | Video signal processing apparatus and video signal processing method | |
JP2004120364A (en) | Digital still camera | |
JP5012985B2 (en) | Imaging apparatus and program thereof | |
US20120219264A1 (en) | Image processing device | |
KR101480407B1 (en) | Digital image processing apparatus, method for controlling the same and medium of recording the method | |
JP2004120367A (en) | Digital camera | |
JP5320186B2 (en) | Recording apparatus and reproducing apparatus | |
US20060023083A1 (en) | Method of controlling digital photographing apparatus for efficient reproduction operation and digital photographing apparatus adopting the same | |
JP4974841B2 (en) | Video data recording device | |
JP2021064903A (en) | Imaging apparatus, control method, and program | |
JP2010034933A (en) | Image processor, image processing method, and program | |
JP2005136794A (en) | Method for recording data of camera | |
JP2004236290A (en) | Image recording device |
Legal Events
Code | Title | Description
---|---|---
ZAAA | Notice of allowance and fees due | Free format text: ORIGINAL CODE: NOA
ZAAB | Notice of allowance mailed | Free format text: ORIGINAL CODE: MN/=.
ZAAA | Notice of allowance and fees due | Free format text: ORIGINAL CODE: NOA
STCF | Information on status: patent grant | Free format text: PATENTED CASE
MAFP | Maintenance fee payment | Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4
FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362
FP | Lapsed due to failure to pay maintenance fee | Effective date: 20240223