
US20090160969A1 - Image capturing apparatus - Google Patents

Image capturing apparatus

Info

Publication number: US20090160969A1
Application number: US12/314,664
Authority: US (United States)
Prior art keywords: image, processing, fields, post, resolution
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Inventor: Toshihisa Kuroiwa
Current Assignee: Nikon Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Nikon Corp
Application filed by Nikon Corp; assigned to NIKON CORPORATION (assignor: KUROIWA, TOSHIHISA)
Related application: US13/791,439, published as US8767086B2, claims priority from this application

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628 Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/40 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled
    • H04N25/44 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array
    • H04N25/445 Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled by partially reading an SSIS array by skipping some contiguous pixels within the read portion of the array
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2209/00 Details of colour television systems
    • H04N2209/04 Picture signal generators
    • H04N2209/041 Picture signal generators using solid-state devices
    • H04N2209/042 Picture signal generators using solid-state devices having a single pick-up sensor
    • H04N2209/045 Picture signal generators using solid-state devices having a single pick-up sensor using mosaic colour filter
    • H04N2209/046 Colour interpolation to calculate the missing colour values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders

Definitions

  • the present application relates to an image capturing apparatus which obtains an image by capturing an image of a subject.
  • An electronic camera having an image sensor which separates a captured image of a subject into fields and reads the fields has been popularized.
  • the applicant of the present invention has already proposed an electronic camera as an invention described in Japanese Unexamined Patent Application Publication No. 2004-135225.
  • the proposed electronic camera can shorten the capturing interval by generating an image suited to be displayed on a display device for checking a captured result (hereinafter referred to as “quick view image”) or an image suited for list display (hereinafter referred to as “thumbnail image”) before completion of reading of all the fields.
  • the number of fields to be read, however, has increased with the recent increase in the number of pixels used in the image sensor. For this reason, the time up to completion of reading of all the fields has become longer. As a result, the user's waiting time has become longer, which is a problem.
  • a proposition of the present embodiments is to use the reading time from an image sensor efficiently.
  • the image capturing apparatus includes an image capturing unit which separates a color image of a subject being captured by an image sensor having pixels of colors into three or more fields and outputs said three or more fields successively, and an image processing unit which generates a low-resolution image which is lower in resolution than the color image obtained by the image capturing unit, based on output of one or more fields among said three or more fields, said one or more fields being able to extract color information of all the colors, wherein the image processing unit starts generation of the low-resolution image in a period in which fields other than said one or more fields for generating the low-resolution image are read.
  • the low-resolution image is an image for checking a result of capturing
  • the image capturing apparatus may further include a display unit which displays the low-resolution image when the low-resolution image is generated by the image processing unit.
  • the image capturing apparatus may further include a field selecting unit which selects one field from the fields, wherein the image processing unit includes a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which directly receives an output of the pre-processing part and performs post-processing on the pre-processed color image, and when the low-resolution image is to be generated, the color image of one field selected by the field selecting unit is directly transferred from the pre-processing part to the post-processing part to thereby perform the pre-processing and the post-processing integrally and sequentially.
  • the image processing unit may generate a first image lower in resolution and a second image lower in resolution, and the field selecting unit may select one field for generating the first image and one field for generating the second image, respectively.
  • the image processing unit may include a pre-processing part which performs pre-processing on the color image output from the image capturing unit, a post-processing part which performs post-processing on the pre-processed color image, and a pixel averaging part which directly receives an output of the post-processing part and averages any pixels of the post-processed color image, and when the low-resolution image is to be generated, a first low-resolution image is generated by the post-processing part and the generated first low-resolution image is directly transferred from the post-processing part to the pixel averaging part to thereby generate a second low-resolution image which is lower in resolution than the first low-resolution image simultaneously.
  • the image processing unit may include a white balance adjusting part, and when the low-resolution image is to be generated, the white balance adjusting part performs white balance adjustment in accordance with a white balance adjustment value decided in advance.
  • the image processing unit may include a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which performs post-processing on the pre-processed color image
  • the image capturing apparatus may further include a plurality of buffer memory areas which store the color image pre-processed by the pre-processing part and a high-speed continuous capturing mode which performs, as parallel processing, a process of performing the pre-processing on the color image of one frame output from the image capturing unit and storing the pre-processed color image in one of the buffer memory areas and a process of performing the post-processing on the color image of a previous frame stored in another of the buffer memory areas, and the image processing unit does not start generation of the low-resolution image in a period of reading of fields other than said one or more fields for generating the low-resolution image while image capturing is executed in the high-speed continuous capturing mode.
  • FIG. 1 is a block diagram showing the configuration of an electronic camera 1 according to an embodiment.
  • FIG. 2 is a view for explaining a Bayer arrangement.
  • FIG. 3 is a block diagram showing the details of an image processing part 13.
  • FIG. 4 is a view for explaining a view operation.
  • FIG. 5 is a view for explaining a still image capturing operation.
  • FIG. 6 is a flow of image data during the view operation.
  • FIG. 7 is a diagram for explaining an image buffer of an SDRAM 19.
  • FIG. 8 is a view for explaining generation of a main image.
  • FIG. 9 is a flow of image data during the still image capturing operation.
  • FIG. 10 is another diagram for explaining an image buffer of the SDRAM 19.
  • FIG. 11 is another diagram for explaining an image buffer of the SDRAM 19.
  • FIGS. 12A to 12D are timing charts showing still image capturing sequences respectively.
  • FIGS. 13A and 13B are timing charts of an image signal output of a CCD 11 during the still image capturing operation.
  • FIGS. 14A and 14B are views for explaining generation of a quick view image.
  • FIG. 15 is a flow of data during generation of a quick view image.
  • FIG. 16 is a flow of data during generation of a thumbnail image.
  • FIG. 17 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIGS. 18A and 18B are other timing charts showing still image capturing sequences respectively.
  • FIG. 19 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 20 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 21 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 22 is a view for explaining a high-speed continuous image capturing mode.
  • FIG. 23 is another diagram for explaining an image buffer of the SDRAM 19.
  • the electronic camera 1 includes respective parts, i.e. an image-capturing lens 10 , a CCD 11 , an AFE (Analog Front End) 12 , and an image processing part 13 .
  • the image-capturing lens 10 includes a focus lens, a zoom lens, a lens drive motor, etc. which are not shown.
  • the CCD 11 has a Bayer arrangement color filter.
  • the CCD 11 is not limited to this example.
  • the CCD 11 may have another filter arrangement such as a stripe arrangement, or may be replaced with an image sensor other than a CCD.
  • An image of a subject formed on the CCD 11 through the image-capturing lens 10 is transformed into an image signal by the CCD 11.
  • the image signal is output to the AFE 12 .
  • the output image signal is converted into digital data (hereinafter referred to as “image data”) by the AFE 12 .
  • the image data is output to the image processing part 13 .
  • the electronic camera 1 further includes respective parts, i.e. a TG (Timing Generator) 14 , an MDIC (Motor Driver IC) 15 , an SIO (Serial Input/Output) 16 , and a PIO (Parallel Input/Output) 17 .
  • the TG 14 drives the CCD 11 and the AFE 12 to perform exposure, image signal output, etc.
  • the MDIC 15 drives the lens drive motor of the image-capturing lens 10 .
  • the SIO 16 controls the TG 14 and the MDIC 15 .
  • the PIO 17 controls the MDIC 15 .
  • the electronic camera 1 further includes respective parts, i.e. a JPEG compression part 18 , an SDRAM 19 , an SDRAM controller 20 , an LCD 21 , and a display controller 22 .
  • the JPEG compression part 18 compresses and expands image data subjected to image processing by the image processing part 13 .
  • the SDRAM 19 temporarily stores image data when the image data is subjected to image processing or image compression.
  • the SDRAM controller 20 is an interface with the SDRAM 19 .
  • the LCD 21 displays image data and various kinds of information.
  • the display controller 22 controls the LCD 21 .
  • the respective parts, i.e. the image processing part 13 , the JPEG compression part 18 , the SDRAM controller 20 and the display controller 22 are coupled to one another by an image bus.
  • the electronic camera 1 further includes respective parts, i.e. a memory card 23 , a card I/F part 24 , a USB I/F part 25 and a clock generator 26 , and a CPU 27 .
  • the memory card 23 is removable and used for recording image data, etc.
  • the card I/F part 24 is an interface with the memory card 23 .
  • the USB I/F part 25 can be coupled to a host PC, etc.
  • the clock generator 26 supplies operating clocks to the respective parts.
  • the CPU 27 controls the respective parts.
  • the respective parts, i.e. the image processing part 13, the SIO 16, the PIO 17, the JPEG compression part 18, the SDRAM controller 20, the display controller 22, the card I/F part 24, the USB I/F part 25, the clock generator 26 and the CPU 27, are coupled to one another by a CPU bus.
  • FIG. 3 is a block diagram showing the details of the image processing part 13 .
  • the image processing part 13 has a pre-processing part 30 , and a post-processing part 31 .
  • the pre-processing part 30 has respective parts, i.e. a defect correcting part 32 , an OB clamp processing part 33 , a sensitivity-ratio adjusting part 34 , a 3A-evaluated value calculating part 35 , and an output buffer 36 .
  • the defect correcting part 32 applies defect pixel correction to image data input from the AFE 12 .
  • the OB clamp processing part 33 decides the black level of the image data corrected by the defect correcting part 32 .
  • the sensitivity-ratio adjusting part 34 corrects the signal levels of R, G and B by applying sensitivity ratio adjustment to the image data processed by the OB clamp processing part 33 .
  • the 3A-evaluated value calculating part 35 calculates respective evaluated values of AWB (Auto White Balance) in addition to the aforementioned AE and AF based on the output of the sensitivity-ratio adjusting part 34 . Calculation results of the 3A-evaluated value calculating part 35 are output to the CPU 27 through the CPU bus.
  • the output of the sensitivity-ratio adjusting part 34 is output to the post-processing part 31 and output to the image bus via the output buffer 36 .
  • the post-processing part 31 has respective parts, i.e. a horizontal decimation part 40, a WB adjusting part 41, a γ correcting part 42, a color interpolating part 43, a color converting & color correcting part 44, a resolution converting part 45, a spatial filtering part 46, a CbCr decimation part 47, an input buffer 48, and an output buffer 49.
  • the horizontal decimation part 40 reduces the number of horizontal pixels by applying horizontal decimation to the image data pre-processed by the pre-processing part 30 .
  • the WB adjusting part 41 applies white balance adjustment to the image data decimated by the horizontal decimation part 40 , based on the AWB evaluated value, etc. calculated by the 3A-evaluated value calculating part 35 .
  • the γ correcting part 42 applies γ correction to the image data white-balance-adjusted by the WB adjusting part 41.
  • the color interpolating part 43 generates image data having three colors per pixel from Bayer arrangement image data having one color per pixel by applying color interpolation to the image data corrected by the γ correcting part 42.
  • the color converting & color correcting part 44 generates image data in a target color space (e.g. sRGB) by applying color conversion and color correction to the image data interpolated by the color interpolating part 43 .
  • the resolution converting part 45 generates image data with a target size by applying a resolution conversion process to the image data corrected by the color converting & color correcting part 44 .
  • image data with a QVGA (320×240) size or a VGA (640×480) size is generated.
  • the spatial filtering part 46 applies a spatial filtering process to the image data converted by the resolution converting part 45 .
  • the spatial filtering part 46 applies an edge emphasizing process to a Y signal and applies a low-pass filtering process to color-difference signals (a Cb signal and a Cr signal).
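  • As a minimal illustration of this split treatment, the following Python/NumPy sketch sharpens the luma plane and low-pass filters the chroma planes. The 3×3 kernels are assumptions for illustration; the patent does not disclose the actual filter coefficients.

```python
import numpy as np
from scipy.ndimage import convolve

def spatial_filter(y, cb, cr):
    # Edge emphasis on Y: identity plus Laplacian (an assumed kernel).
    sharpen = np.array([[ 0, -1,  0],
                        [-1,  5, -1],
                        [ 0, -1,  0]], dtype=np.float32)
    # Low-pass on Cb/Cr: a simple 3x3 box blur (also assumed).
    lowpass = np.full((3, 3), 1.0 / 9.0, dtype=np.float32)
    y_out  = convolve(y.astype(np.float32),  sharpen, mode="nearest")
    cb_out = convolve(cb.astype(np.float32), lowpass, mode="nearest")
    cr_out = convolve(cr.astype(np.float32), lowpass, mode="nearest")
    return y_out, cb_out, cr_out
```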
  • the output of the output buffer 49 is coupled to the image bus. While the output from the image bus is coupled to the input buffer 48 , the output of the input buffer 48 is coupled to the horizontal decimation part 40 and the color converting & color correcting part 44 .
  • the view operation is an operation of generating and displaying a through image to check a composition in real time.
  • the still image capturing operation is an operation of generating an image (hereinafter referred to as “main image”) by main image capturing.
  • a high frame rate (e.g. 30 fps) is obtained because a decimated image signal is output from the CCD 11 as shown in FIG. 4 .
  • the view operation is suited for real-time observation of a subject on the LCD 21 , photometric measurement for AE (Auto Exposure) or execution of AF (Auto Focusing).
  • an image signal with all pixels is output from the CCD 11 as shown in FIG. 5. Accordingly, the image signal is high in resolution and is output separated into a plurality of fields.
  • although FIG. 5 shows an example where four fields are output, the number of fields tends to increase as the number of pixels used in the CCD 11 increases.
  • post-processing due to the post-processing part 31 can be directly applied to the image data pre-processed by the pre-processing part 30 because adjacent lines of an image signal are output sequentially from the CCD 11 as shown in FIG. 4 . That is, in the view operation, the pre-processing part 30 inputs the pre-processed image data to the post-processing part 31 directly. Then, the image data post-processed by the post-processing part 31 is temporarily stored in the SDRAM 19 via the image bus and the SDRAM controller 20 . Further, the image data from the SDRAM 19 passes through the SDRAM controller 20 , the image bus and the display controller 22 successively and is displayed as a through image on the LCD 21 .
  • an interpolating process or the like in the post-processing due to the post-processing part 31 cannot be executed because the image signal is output from the CCD 11 in the condition that the image signal is separated into a plurality of fields as shown in FIG. 5 .
  • a line n+4 is output next to a line n. Since lines n+1, n+2 and n+3 are inserted between the lines n and n+4, a process such as color interpolation, resolution conversion or spatial filtering using adjacent lines of image data cannot be applied to image data of the first field.
  • image data of the fields are pre-processed by the pre-processing part 30 respectively, temporarily stored in the SDRAM 19 and combined into a frame image on the SDRAM 19 and then post-processed by the post-processing part 31 .
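  • The line interleave behind this constraint can be illustrated with a short Python/NumPy sketch; the 4-field readout of FIG. 5 and the toy frame size are assumptions for illustration.

```python
import numpy as np

def split_into_fields(frame, n_fields=4):
    """Field k holds sensor lines k, k+n, k+2n, ... (line interleave)."""
    return [frame[k::n_fields] for k in range(n_fields)]

def combine_fields(fields):
    """Recombine the fields into one frame in progressive line order."""
    n = len(fields)
    height = sum(f.shape[0] for f in fields)
    frame = np.empty((height,) + fields[0].shape[1:], dtype=fields[0].dtype)
    for k, f in enumerate(fields):
        frame[k::n] = f
    return frame

frame = np.arange(16 * 8).reshape(16, 8)       # toy 16-line frame
fields = split_into_fields(frame)              # 4 fields of 4 lines each
assert np.array_equal(combine_fields(fields), frame)
# Inside the first field, consecutive rows are sensor lines 0, 4, 8, ...
# so color interpolation, resolution conversion and spatial filtering,
# which need adjacent lines, must wait until the fields are recombined.
```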
  • FIG. 6 shows a flow of image data during the view operation.
  • the CPU 27 performs image processing along an arrow (1) and displays a through image along an arrow (2).
  • the two image buffers of a V1 buffer 60 and a V2 buffer 61 as shown in FIG. 7 are prepared so that the two image buffers are switched alternately every frame.
  • the CPU 27 performs an AE operation using an AE evaluated value and AF using an AF evaluated value, both calculated by the 3A-evaluated value calculating part 35, in preparation for the still image capturing operation.
  • when the release button is fully pushed after being half-pushed, the CPU 27 performs exposure for still image capturing based on a result of the aforementioned AE operation after completion of AF, and goes to the still image capturing operation.
  • the exposure for still image capturing is terminated by the closure of a mechanical shutter not shown, so that an image signal separated into a plurality of fields as shown in FIG. 5 is output from the CCD 11 .
  • the CPU 27 calculates an AWB evaluated value in the 3A-evaluated value calculating part 35 in accordance with each field.
  • the CPU 27 sets a WB adjustment value obtained from the aforementioned AWB evaluated values in the WB adjusting part 41 .
  • the CPU 27 reads image data stored in the SDRAM 19 in (progressive) order of lines so that the image data is post-processed by the post-processing part 31 .
  • the three images of a quick view image for checking capturing, a thumbnail image suited for list display, and a main image are generated in still image capturing. These images are generated by post-processing the pre-processed image data respectively.
  • the size of the main image is so large that the main image cannot usually be generated by one post-process. Therefore, as shown in FIG. 8, the main image is generated in such a manner that pre-processed image data is separated into narrow strip blocks, each of the blocks is post-processed, and the post-processed blocks are combined. However, the size of each post-processed block is reduced because surrounding pixels are cut off. Therefore, as shown in FIG. 8, boundary portions of adjacent blocks are made to overlap with each other so that the post-processed images are combined correctly, as sketched below.
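  • The overlap scheme can be illustrated with a short Python/NumPy sketch. The strip width, the margin of 8 pixels and the identity placeholder filter are illustrative assumptions; the patent does not specify block sizes or the post-processing internals.

```python
import numpy as np

def process_in_strips(image, strip_width, margin, post_process):
    """Post-process a wide image in vertical strips with overlapping edges.

    Each strip is read with `margin` extra columns on both sides; after
    `post_process` runs, only the centre columns (unaffected by the strip
    boundary) are kept, so the outputs butt together seamlessly. For
    simplicity this sketch assumes `post_process` preserves block size
    (e.g. a convolution with edge padding).
    """
    h, w = image.shape[:2]
    out_columns = []
    for x0 in range(0, w, strip_width):
        lo = max(x0 - margin, 0)                  # extend strip to the left
        hi = min(x0 + strip_width + margin, w)    # and to the right
        block = post_process(image[:, lo:hi])
        keep_lo = x0 - lo                         # drop the left margin
        keep_hi = keep_lo + min(strip_width, w - x0)
        out_columns.append(block[:, keep_lo:keep_hi])
    return np.concatenate(out_columns, axis=1)

# With an identity "post-process", the reassembled image is unchanged:
img = np.random.rand(64, 1024)
assert np.allclose(process_in_strips(img, 256, 8, lambda b: b), img)
```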
  • the horizontal decimation part 40 is generally bypassed so that full-resolution pixels are fed to the subsequent stages.
  • the size of each of the quick view image and the thumbnail image is small so that each of the quick view image and the thumbnail image can be generated by one post-process (without separation into strip blocks as described with reference to FIG. 8 ) if it has been initially subjected to horizontal decimation.
  • FIG. 9 shows a flow of image data during the still image capturing operation.
  • the CPU 27 performs pre-processing along an arrow ( 1 ) and performs post-processing along arrows ( 2 ) and ( 3 ).
  • Each of the pre-processing part 30 and the post-processing part 31 has one image processing pipeline. Accordingly, each of the quick view image, the thumbnail image and the main image is generated sequentially.
  • Each of the three images is stored in the SDRAM 19 .
  • the quick view image is generated first so that the user can check the contents of the captured main image quickly. It is preferable that the thumbnail image is then generated and the main image is finally generated.
  • Exif/DCF uses a data arrangement in which the thumbnail image is recorded in the header portion of a JPEG file and the main image is recorded in the tail portion of the JPEG file.
  • the CPU 27 displays image data of the quick view image on the LCD 21 along an arrow (6).
  • the CPU 27 further records the thumbnail image and the main image both in a JPEG compressed format.
  • the CPU 27 performs JPEG compression along arrows (4) and (5). Because only one JPEG compression part 18 is provided, compression of the thumbnail image and compression of the main image are performed sequentially. In this case, it is preferable that the thumbnail image and the main image are compressed in this order.
  • the CPU 27 combines compressed data of the thumbnail image and compressed data of the main image into one file on the SDRAM 19 in accordance with the Exif/DCF format, and records the file on the memory card 23 through the card I/F part 24 along an arrow (7).
  • the SDRAM 19 has respective image buffers, i.e. an R buffer 62 , a T buffer 63 , an M buffer 64 , a T-J buffer 65 , an M-J buffer 66 , and a Q buffer 67 .
  • the CPU 27 stores pre-processed image data in the R buffer 62 .
  • the image data stored in the R buffer 62 is post-processed so that image data of the thumbnail image thus generated is stored in the T buffer 63 .
  • the image data stored in the R buffer 62 is post-processed so that image data of the main image thus generated is stored in the M buffer 64 .
  • the image data stored in the T buffer 63 is JPEG-compressed by the JPEG compression part 18 so that image data of the compressed image thus generated is stored in the T-J buffer 65.
  • the image data stored in the M buffer 64 is JPEG-compressed by the JPEG compression part 18 so that image data of the compressed image thus generated is stored in the M-J buffer 66 .
  • the image data stored in the T-J buffer 65 and the image data stored in the M-J buffer 66 are combined into one file on the SDRAM 19 as described above, so that the file is recorded on the memory card 23.
  • the image data stored in the R buffer 62 is post-processed so that image data of the quick view image thus generated is stored in the Q buffer 67 .
  • the image data stored in the Q buffer 67 is displayed on the LCD 21 through the display controller 22 as described above.
  • the SDRAM 19 has respective image buffers, i.e. an R1 buffer 68 , an R2 buffer 69 , an R3 buffer 70 and an R4 buffer 71 in place of the R buffer 62 shown in FIG. 10 .
  • These image buffers may be configured so that the R buffers (R1 to R4) of the respective fields have discrete addresses, so that a one-frame image can be read in (progressive) order of lines regardless of the number of original fields, as in the sketch below.
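  • A sketch of such progressive-order reading follows; the list-based buffers and the 4-field interleave are illustrative assumptions matching FIG. 5, not the patent's actual addressing.

```python
# Read a frame in progressive line order out of discretely addressed
# per-field buffers (R1 to R4): field k, row r holds sensor line k + n*r.
def read_line(field_buffers, line):
    n = len(field_buffers)                     # number of fields, e.g. 4
    return field_buffers[line % n][line // n]

# Toy buffers: field k stores sensor lines k, k+4, k+8, ...
field_buffers = [[k + 4 * r for r in range(4)] for k in range(4)]
frame = [read_line(field_buffers, ln) for ln in range(16)]
assert frame == list(range(16))                # progressive order recovered
```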
  • Image data of the fourth field is input to the R4 buffer.
  • the CPU 27 starts post-processing due to the post-processing part 31 after all image data of the four fields are stored in the image buffers respectively.
  • FIG. 12A is a timing chart showing a sequence for capturing a still image as described above.
  • “Q image”, “T image” and “M image” express a quick view image, a thumbnail image and a main image respectively.
  • outputting of an image signal from the CCD 11 and pre-processing due to the pre-processing part 30 are performed in parallel so that image data of fields subjected to pre-processing are stored in the R buffers (the R1 buffer 68 to the R4 buffer 71 ) successively.
  • the thumbnail image and the main image are generated successively and JPEG-compressed successively. Therefore, when the thumbnail image is JPEG-compressed during generation of the main image, and compressed data of the thumbnail image is recorded during JPEG compression of the main image as shown in FIG. 12B , a plurality of processes can be executed while overlapping with one another so that the total processing time (capturing time) can be shortened. Further, when odd fields are read from the CCD 11 , a low-resolution color image such as a quick view image or a thumbnail image can be generated from image data of only one field in parallel with outputting of an image signal from the CCD 11 as described above in the Related Art (see FIG. 12C ).
  • FIG. 12D is a timing chart showing a high-speed still image capturing sequence obtained by combination of FIG. 12B and FIG. 12C .
  • generation of a quick view image from image data of only one field is performed in parallel with outputting of an image signal from the CCD 11 .
  • JPEG compression of a thumbnail image is started in the middle of generation of a main image. As a result, there is a large merit that the quick view image and the thumbnail image can be generated earlier.
  • an example in which generation of only a quick view image is performed in parallel with outputting of an image signal from the CCD 11 is shown in FIGS. 12C and 12D.
  • when the CCD 11 is of a 3-field output type, an image signal of the three fields is output successively in synchronization with a vertical synchronizing signal (VD signal) from the TG 14, as shown in FIG. 13A.
  • FIGS. 14A and 14B are views for explaining generation of a quick view image in this embodiment.
  • FIG. 14A shows an example where the CCD 11 is of a 3-field output type.
  • FIG. 14B shows an example where the CCD 11 is of a 4-field output type.
  • when the CCD 11 is of a 3-field output type, one field contains all color signal components (R, G and B) as shown in FIG. 14A. Accordingly, a quick view image and a thumbnail image can be generated from image data of only one field (the first field in the example shown in FIG. 14A).
  • when the CCD 11 is of a 4-field output type, each field lacks one color signal component as shown in FIG. 14B.
  • the first field does not contain any B signal.
  • the second field does not contain any R signal. Therefore, in such a case, a quick view image and a thumbnail image are generated from image data of two fields (the first and second fields in the example shown in FIG. 14B) from which all color information can be extracted. Although all color information can be extracted from the image signal of two fields in the example shown in FIG. 14B, the smallest number of fields from which all color information can be extracted may be used for a CCD etc. using color filters containing three or more color components; see the sketch below.
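  • A sketch of this minimum-field reasoning follows. It assumes a Bayer CFA whose even lines carry R/G and odd lines carry G/B, and a readout where field k holds lines k, k+n, k+2n, ...; for other CFAs the per-line color sets would differ.

```python
def colors_in_line(line):
    # Bayer assumption: even lines are R/G, odd lines are G/B.
    return {"R", "G"} if line % 2 == 0 else {"G", "B"}

def colors_in_field(k, n_fields, height=16):
    # Field k of an n-field readout holds lines k, k+n, k+2n, ...
    return set().union(*(colors_in_line(ln) for ln in range(k, height, n_fields)))

def min_fields_for_all_colors(n_fields):
    seen = set()
    for k in range(n_fields):
        seen |= colors_in_field(k, n_fields)
        if seen == {"R", "G", "B"}:
            return k + 1
    return n_fields

print(min_fields_for_all_colors(3))  # 1: field 1 holds lines 0, 3, 6, ...
                                     #    (both parities, hence R, G and B)
print(min_fields_for_all_colors(4))  # 2: each field holds one parity only
```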
  • a quick view image and a thumbnail image can be generated from image data of the first and second fields without waiting for completion of outputting of an image signal of all the fields from the CCD 11 , at the point of time of completion of outputting of an image signal of the second field. Accordingly, the period of reading of the third and fourth fields can be used efficiently.
  • generation of the quick view image has priority over generation of the thumbnail image. This is because the user can check capturing based on the quick view image earlier, while the quick view image can also be used for generating the thumbnail image. Comparing an image obtained by combining the first and second fields with a quick view image generated from that image, the size (resolution) of the quick view image is generally smaller (lower). Accordingly, when the quick view image is generated first and used for generating the thumbnail image, processing can be accelerated.
  • FIG. 15 shows a flow of data during generation of a quick view image.
  • the CPU 27 reads image data of the first and second fields stored in the SDRAM 19 (the R1 buffer 68 and the R2 buffer 69 ) while storing image data of the third and fourth fields in the SDRAM 19 (the R3 buffer 70 and the R4 buffer 71 ). Then, the read image data of the first and second fields are post-processed by the post-processing part 31 to generate a quick view image.
  • FIG. 16 shows a flow of data during generation of a thumbnail image from the quick view image generated by the data flow of FIG. 15 .
  • the CPU 27 reads image data of the quick view image stored in the SDRAM 19 (the Q buffer 67 ) and applies a reduction process, etc. due to the resolution converting part 45 of the post-processing part 31 to the read image data of the quick view image to thereby generate a thumbnail image.
  • the total processing time can be shortened.
  • when the total reading time of the third and fourth fields is required for generating the quick view image, the whole processing is performed with the same timing as in FIG. 12C or 12D.
  • the total processing time can be further shortened because generation of the thumbnail image can be executed ahead of schedule. For example, this can be achieved when the frequency of processing clocks concerned with the post-processing part 31 is set at a high frequency.
  • if generation of the quick view image is completed early in the reading period of the fourth field, generation of the thumbnail image can be completed before the end of the reading period of the fourth field. In this case, generating the thumbnail image from the quick view image as described above with reference to FIG. 16 is effective.
  • the CPU 27 reads image data of the first field stored in the SDRAM 19 (the R1 buffer 68 ) and applies post-processing due to the post-processing part 31 to the read image data of the first field to generate a quick view image and a thumbnail image.
  • the CPU 27 transfers image data of one field pre-processed by the pre-processing part 30 from the pre-processing part 30 to the post-processing part 31 directly. Then, the image data is post-processed by the post-processing part 31 to generate a quick view image.
  • the quick view image is stored in the SDRAM 19 (the Q buffer 67 ).
  • the same processing as in the data flow during the view operation shown in FIGS. 6 and 7 can be used to transfer image data from the pre-processing part 30 to the post-processing part 31 directly.
  • a plurality of quick view images are however generated when image data is simply transferred from the pre-processing part 30 to the post-processing part 31 directly.
  • a field selecting unit for selecting one of the fields is therefore provided so that image data of only one field set by the field selecting unit in advance can be directly transferred to the post-processing part 31 to thereby generate one quick view image.
  • the image processing part 13 performs control automatically (without the necessity of the CPU 27 controlling the operation/suspension of the post-processing part 31) so as to operate the post-processing part 31 in the reading period of only the one selected field and suspend the post-processing part 31 in the reading period of the other fields.
  • the horizontal decimation part 40 can be used efficiently in the same manner as in the view operation described above with reference to FIGS. 6 and 7 .
  • image data of all the fields pre-processed by the pre-processing part 30 must be stored in the SDRAM 19 so that a high-resolution main image can be generated afterward.
  • in FIG. 17, there are hence provided two data flows, corresponding to the quick view image output from the post-processing part 31 and the image data output from the pre-processing part 30.
  • the thumbnail image can be generated from the quick view image as described above with reference to FIG. 16 .
  • the processing time can be shortened as shown in a timing chart of FIG. 18A .
  • the quick view image is generated from the image data of the first field in the reading period of the first field and the thumbnail image is then generated from the image data of the second field in the reading period of the second field.
  • the field used for generating the quick view image and the field used for generating the thumbnail image can be set respectively by the aforementioned selection unit. According to this configuration, the processing time can be shortened at the same level as that in the case where the thumbnail image is generated from the quick view image.
  • the size of the quick view image is a QVGA (320×240) size or a VGA (640×480) size.
  • the size of the thumbnail image is defined as a standard (160×120) size in the Exif/DCF format. It is found from size comparison that the standard thumbnail image (160×120) of the Exif/DCF format can be generated if the size of the quick view image is reduced to 1/N (in which N is an integer). This good compatibility is, of course, effective in the case where the thumbnail image is generated from the quick view image as described above with reference to FIG. 16.
  • a pixel averaging part 50 is provided downstream of the post-processing part 31.
  • Post-processed image data is input from the post-processing part 31 to the pixel averaging part 50 directly.
  • FIG. 19 shows an example where the CCD 11 is of a 4-field output type.
  • the pixel averaging part 50 calculates an average of the pixel values in each block of 4 pixels (horizontal) by 4 pixels (vertical) (16 pixels in total) in the quick view image, and outputs the average as the value of one pixel of the thumbnail image, as in the sketch below.
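  • A minimal sketch of the 4×4 averaging, assuming NumPy and the VGA and 160×120 sizes mentioned in the text:

```python
import numpy as np

def block_average(img, n=4):
    """Average each n x n pixel block, as the pixel averaging part 50 does:
    a 640x480 quick view image becomes a 160x120 Exif/DCF thumbnail."""
    h = img.shape[0] - img.shape[0] % n        # trim to a multiple of n
    w = img.shape[1] - img.shape[1] % n
    blocks = img[:h, :w].reshape(h // n, n, w // n, n, -1)
    return blocks.mean(axis=(1, 3)).astype(img.dtype)

quick_view = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
print(block_average(quick_view).shape)         # (120, 160, 3)
```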
  • the CPU 27 stores image data of the quick view image generated by the post-processing part 31 in the SDRAM 19 (the Q buffer 67 ) and stores image data of the thumbnail image generated by the pixel averaging part 50 in the SDRAM 19 (the T buffer 63 ). That is, the processing time can be shortened as shown in a timing chart of FIG. 18B .
  • the averaging process executed by the pixel averaging part 50 is effective for generating the quick view image from image data of one field
  • the averaging process is particularly effective for generating the quick view image from image data of two or more fields.
  • the start of generation of the quick view image from image data of two or more fields is always delayed compared with the start of generation of the quick view image from image data of one field.
  • the delay can be however compensated when the quick view image and the thumbnail image are generated simultaneously as described above.
  • generation of the quick view image can be started at the point of time of completion of outputting of the image signal of the second field and generation of the thumbnail image can be started at the same time as described above with reference to FIG. 13B . Accordingly, both the quick view image and the thumbnail image can be generated before the reading period of all the fields is terminated.
  • image data reduced for the quick view image in advance may be stored in the SDRAM 19 . That is, as shown in a data flow of FIG. 20 , while image data of all fields pre-processed by the pre-processing part 30 are stored in the SDRAM 19 (the R1 buffer 68 to the R4 buffer 71 ), image data of fields used for generating the quick view image are transferred from the pre-processing part 30 to the post-processing part 31 directly, size-reduced by the post-processing part 31 and stored in the SDRAM 19 (a Q-R1 buffer 82 and a Q-R2 buffer 83 ).
  • a reduction ratio for the size reduction process in the horizontal decimation part 40 in the post-processing part 31 is set so that an image slightly larger than the quick view image can be obtained.
  • the horizontally size-reduced image data of the first and second fields are stored in the SDRAM 19 (the Q-R1 buffer 82 and the Q-R2 buffer 83) via the output buffer 49. Therefore, configuration is made so that the output of the horizontal decimation part 40 can be connected to the output buffer 49.
  • while the horizontal size reduction permits the quick view image to be generated at a high speed, combining the horizontal size reduction with vertical size reduction permits the quick view image to be generated at a still higher speed when the total number of first and second field lines is considerably larger than the number of lines in the quick view image (e.g. when the total number of first and second field lines is more than twice the number of lines in the quick view image).
  • the data stored in the Q-R1 and Q-R2 buffers 82 and 83 in FIG. 20 are Bayer arrangement image data.
  • the color arrangement in one field is, however, common to all the lines as described above with reference to FIG. 14B. Accordingly, even when, for example, the lines added up in FIG. 14B are first field lines n and n+4 or second field lines n+1 and n+5, the color arrangement is unchanged. That is, this is the same as the case where two lines are added up as described for the view operation in FIG. 4. Accordingly, the aforementioned vertical size reduction can be achieved by use of line averaging (addition and division). A line memory is, however, required when the line averaging is performed by both addition and division.
  • image data of one line horizontally size-reduced by the horizontal decimation part 40 is stored in the line memory so that vertical size reduction can be performed by calculating an average (addition and division by bit shift) of that image data and the image data of the next line, horizontally size-reduced in the same manner, as in the sketch below.
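  • A sketch of this integer-only line averaging follows; the plain-Python line lists stand in for the hardware line memory, and the pixel values are illustrative.

```python
# Average each pair of horizontally decimated lines using addition and a
# one-bit right shift (division by 2), holding one line in a line memory.
def vertical_reduce(lines):
    line_memory = None
    for line in lines:                  # `line` is one row of integer pixels
        if line_memory is None:
            line_memory = line          # store the first line of the pair
        else:
            yield [(a + b) >> 1 for a, b in zip(line_memory, line)]
            line_memory = None          # pair consumed, await the next line

rows = [[0, 10, 20], [2, 12, 22], [4, 14, 24], [6, 16, 26]]
print(list(vertical_reduce(rows)))      # [[1, 11, 21], [5, 15, 25]]
```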
  • the time required for image processing is proportional to the size of an image which is a subject of image processing. Therefore, the processing time can be shortened when the quick view image is generated from the low-resolution RAW data (reduced Bayer arrangement image data) as described above with reference to FIG. 20 .
  • the quick view image can be generated at a high speed because even horizontal size reduction executed by the horizontal decimation part 40 can dispense with the separation of image data into strip blocks as shown in FIG. 8 in addition to reduction in number of horizontal pixels.
  • a data flow shown in FIG. 21, which is a modification of FIG. 20, is likewise effective. That is, as shown in FIG. 21, a quick view image is generated first and then a thumbnail image is generated from the quick view image. In this case, the processing time becomes longer than that of the data flow shown in FIG. 20. The elongation of the processing time can, however, be suppressed because the thumbnail image is generated from the quick view image. In addition, an increase of hardware can be avoided because it is unnecessary to provide any new structure such as a pixel averaging circuit.
  • White balance adjustment is an adjusting process for reproducing an appropriate color by correcting change of the color temperature of illuminating light on a subject when the color temperature changes.
  • the CPU 27 sets the WB adjusting part 41 at the WB adjustment value obtained from the AWB evaluated value calculated by the 3A-evaluated value calculating part 35 as described above with reference to FIG. 3 .
  • the AWB evaluated value is updated at a constant rate (e.g. at a rate of 30 fps) in accordance with new image data which is continuously input to the pre-processing part 30 .
  • the WB adjustment value can be updated at a moderate rate.
  • the WB adjustment value is obtained based on the AWB evaluated value generally extracted from image data per se of the still image because the still image is an independent image of one frame.
  • if the WB adjustment value is obtained based on the AWB evaluated value extracted from image data of the quick view image or the thumbnail image itself, generation of the quick view image or the thumbnail image is delayed. It is therefore preferable that the WB adjustment value used for generating the quick view image or the thumbnail image is decided in advance.
  • the WB adjustment value may be the latest WB adjustment value used in the view operation just before, or, if the color temperature of the illuminating light can be found in advance (e.g. as in flash photography), may be obtained from a database in the electronic camera 1 as a WB adjustment value corresponding to that color temperature. The CPU 27 then performs white balance adjustment in the WB adjusting part 41 in accordance with the WB adjustment value decided in advance, as in the sketch below.
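  • A sketch of applying such a preset WB adjustment value follows; the gain pairs and preset names are illustrative assumptions, not values from the patent.

```python
import numpy as np

# WB adjustment values decided in advance: the latest value from the view
# operation, or a database preset for a known illuminant such as flash.
PRESET_WB_GAINS = {"flash": (1.9, 1.4), "last_view": (1.7, 1.6)}  # (R, B)

def apply_preset_wb(rgb, preset="last_view"):
    r_gain, b_gain = PRESET_WB_GAINS[preset]
    out = rgb.astype(np.float32)
    out[..., 0] *= r_gain              # red channel gain
    out[..., 2] *= b_gain              # blue channel gain (green is reference)
    return np.clip(out, 0, 255).astype(np.uint8)
```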
  • if the WB adjustment value can be calculated from the AWB evaluated value at a high speed, the WB adjustment value may instead be obtained based on the AWB evaluated value extracted from image data of the quick view image or the thumbnail image itself.
  • when the CCD 11 is of a 3-field output type as described above with reference to FIG. 13A, an AWB evaluated value is extracted from image data of the first field in the reading period of the first field, a WB adjustment value is calculated based on that AWB evaluated value in the reading period of the second field, and a quick view image is finally generated in the reading period of the third field. Because it is conceived that there is a high correlation between the respective fields, there is no large problem when the quick view image is generated from image data of the third field based on the AWB evaluated value extracted in the reading period of the first field.
  • when the CCD 11 is of a 4-field output type, an AWB evaluated value is extracted from image data of the first and second fields in the reading period of the first and second fields, a WB adjustment value is calculated based on that AWB evaluated value in the reading period of the third field, and a quick view image and a thumbnail image are generated in the reading period of the fourth field.
  • the WB adjustment value may be calculated in the reading period of the third and fourth fields so that the quick view image and the thumbnail image can be generated simultaneously after final reading of the fourth field is completed.
  • processing may be made in accordance with the data flow described with reference to FIG. 20 .
  • the high-speed continuous capturing mode is a capturing mode where capturing of one frame of the latest still image is made in parallel with image processing (post-processing) of a previous frame, in parallel with JPEG compression of a further previous frame and in parallel with recording of a further previous frame.
  • the three images of a quick view image, a thumbnail image and a main image are generated in image processing (post-processing), the thumbnail image and the main image are compressed in JPEG compression processing, and the compressed data of the thumbnail image and the main image are recorded in recording processing. That is, in each step, images are processed in parallel.
  • an R1 buffer 68 and an R2 buffer 69 are provided in place of the R buffer 62 shown in FIG. 10
  • a Q1 buffer 72 and a Q2 buffer 73 are provided in place of the Q buffer 67 shown in FIG. 10
  • a T1 buffer 74 and a T2 buffer 75 are provided in place of the T buffer 63 shown in FIG. 10
  • an M1 buffer 76 and an M2 buffer 77 are provided in place of the M buffer 64 shown in FIG. 10.
  • a T-J1 buffer 78 and a T-J2 buffer 79 are provided in place of the T-J buffer 65 shown in FIG. 10
  • an M-J1 buffer 80 and an M-J2 buffer 81 are provided in place of the M-J buffer 66 shown in FIG. 10. That is, as shown in FIG. 23, all the image buffers are doubled. With this configuration of the SDRAM 19, the four steps described with reference to FIG. 22 can overlap with one another perfectly. Incidentally, each step includes smaller sub-steps which are executed sequentially.
  • the processing time of each step becomes longer than that in image capturing in a single capturing mode but the total processing time can be shortened because new frames are captured successively.
  • at least the R buffer may be doubled (as an R1 buffer 68 and an R2 buffer 69 ) so that image capturing of the next frame can be continued.
  • processing modes corresponding to the respective cases may be provided so that one of the processing modes, selected in accordance with switching of the reading mode, can be executed. That is, configuration may be made so that a quick view image and a thumbnail image are generated from image data of one field when one field contains all color signal components (R, G and B), whereas a quick view image and a thumbnail image are generated from image data of two or more fields when one field does not contain all color signal components (R, G and B); see the sketch below. With this configuration, optimum processing can be performed regardless of the configuration of the CCD 11, the number of fields to be read and the switching of the reading mode.
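  • A sketch of the mode selection follows, under the same Bayer line-interleave assumption as above; the mode names are illustrative.

```python
# With a Bayer CFA read out line-interleaved, an odd field count puts both
# even and odd sensor lines into the first field (lines 0, n, 2n, ...
# alternate parity), so one field already carries R, G and B; with an even
# field count at least two fields are needed.
def quick_view_source(n_fields: int) -> str:
    if n_fields % 2 == 1:
        return "one_field"             # e.g. 3-field readout, FIG. 14A
    return "first_two_fields"          # e.g. 4-field readout, FIG. 14B

print(quick_view_source(3))            # one_field
print(quick_view_source(4))            # first_two_fields
```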
  • a display unit is further provided for displaying an image for checking a result of capturing as a lower-resolution image when the image is generated. Accordingly, the waiting time of the user can be shortened.
  • a field selecting unit for selecting one of fields is further provided so that a color image of one field selected by the field selecting unit to generate a low-resolution image is transferred from the pre-processing part to the post-processing part directly to thereby unify pre-processing and post-processing. Accordingly, the low-resolution image can be generated efficiently and quickly.
  • a first low-resolution image and a second low-resolution image are generated in such a manner that the field selecting unit selects a field used for generating the first image and a field used for generating the second image, respectively. Accordingly, low-resolution images can be generated sequentially and quickly without the necessity of adding any special configuration.
  • a first image with a low resolution is generated by the post-processing part and directly transferred from the post-processing part to the image averaging part to thereby simultaneously generate a second image with a lower resolution than that of the first image. Accordingly, low-resolution images can be generated efficiently and quickly.
  • the white balance adjusting part performs white balance adjustment in accordance with a white balance adjustment value decided in advance. Accordingly, low-resolution images can be generated quickly while white balance adjustment can be performed appropriately.
  • a high-speed continuous capturing mode is provided so that generation of any low-resolution image is not started in the reading period of the remaining fields during execution of image capturing in the high-speed continuous capturing mode. Accordingly, the reading time from the image sensor can be used efficiently regardless of the image capturing mode.
  • the lengths of the respective sub-steps have no correlation with actual processing lengths.
  • the processing time required for processing such as image processing, JPEG compression processing, data recording, etc. is proportional to the amount of data used for the processing. That is, the longest processing time is required for generation and JPEG compression processing of the main image compared with generation of the quick view image and the thumbnail image. With respect to JPEG compression, the longest time is required for compression of the main image.
  • the three sub-steps of image processing (post-processing), JPEG compression processing and data recording are dedicated to processing of the main image so that reduction of the image capturing time can be achieved.
  • generation of the quick view image and the thumbnail image can be quickened to thereby achieve reduction of the image capturing time.
  • when the quick view image, which was heretofore used only for checking image capturing, is JPEG-compressed and recorded on the memory card 23 while associated with the main image, the quick view image can be displayed on the LCD 21 at the time of reproduction of the main image.
  • JPEG compression processing may be applied to the quick view image as well as the main image.
  • a private area (maker note area) of Exif/DCF can be used for recording the quick view image.
  • bit rate control has been heretofore made to keep the size of data of one frame almost constant.
  • the bit rate control can keep the number of image capturing frames constant in accordance with the recording capacity of the memory card 23 .
  • JPEG compression must be repeated several times under the bit rate control, so that the image capturing time may become remarkably long. It is therefore desired that the amount of data is brought close to a target value while the number of repetitions of compression is kept as small as possible.
  • the number of repetitions of compression can be reduced when a quantization table for JPEG compression of the main image is decided with reference to information concerning the JPEG compression of the quick view image or the thumbnail image generated before the main image; see the sketch below.
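  • One way to use that information is to seed the main image's quantization scale from the compression ratio the thumbnail actually achieved. The proportional model below is purely an illustrative heuristic; the patent does not give a formula, and the sizes in the example are assumed.

```python
def initial_q_scale(thumb_raw, thumb_jpeg, main_raw, main_target,
                    thumb_q_scale=1.0):
    achieved = thumb_raw / thumb_jpeg   # ratio reached at thumb_q_scale
    needed = main_raw / main_target     # ratio the main image must reach
    # Proportional first guess: coarser quantization (larger scale) when the
    # main image needs more compression than the thumbnail achieved.
    return thumb_q_scale * needed / achieved

# e.g. 160x120x3-byte thumbnail -> 4 KB; ~10 MP main image -> 2 MB target
print(initial_q_scale(160*120*3, 4096, 3872*2592*3, 2*1024*1024))
```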
  • the present invention is not limited to such an example.
  • the number of fields of the image signal used for generating low-resolution images may be any number, as long as it is equal to or more than the minimum number of fields and less than the number of all the fields of the CCD.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

An image capturing apparatus can use a reading time from an image sensor efficiently. To achieve the object, the image capturing apparatus includes an image capturing unit which separates a color image of a subject being captured by an image sensor having pixels of colors into three or more fields and outputs said three or more fields successively, and an image processing unit which generates a low-resolution image which is lower in resolution than the color image obtained by the image capturing unit, based on output of one or more fields among said three or more fields, said one or more fields being able to extract color information of all the colors, wherein the image processing unit starts generation of the low-resolution image in a period in which fields other than said one or more fields for generating the low-resolution image are read.

Description

    CROSS REFERENCE TO THE RELATED APPLICATIONS
  • This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2007-332281, filed on Dec. 25, 2007, the entire contents of which are incorporated herein by reference.
  • BACKGROUND
  • 1. Field
  • The present application relates to an image capturing apparatus which obtains an image by capturing an image of a subject.
  • 2. Description of the Related Art
  • An electronic camera having an image sensor which separates a captured image of a subject into fields and reads the fields has been popularized. The applicant of the present invention has already proposed an electronic camera as an invention described in Japanese Unexamined Patent Application Publication No. 2004-135225. The proposed electronic camera can shorten the capturing interval by generating an image suited to be displayed on a display device for checking a captured result (hereinafter referred to as “quick view image”) or an image suited for list display (hereinafter referred to as “thumbnail image”) before completion of reading of all the fields.
  • The number of fields to be read, however, has increased with the recent increase in the number of pixels used in the image sensor. For this reason, the time up to completion of reading of all the fields has become longer. As a result, the user's waiting time has become longer, which is a problem.
  • SUMMARY
  • A proposition of the present embodiments is to use the reading time from an image sensor efficiently.
  • To achieve the proposition, the image capturing apparatus includes an image capturing unit which separates a color image of a subject being captured by an image sensor having pixels of colors into three or more fields and outputs said three or more fields successively, and an image processing unit which generates a low-resolution image which is lower in resolution than the color image obtained by the image capturing unit, based on output of one or more fields among said three or more fields, said one or more fields being able to extract color information of all the colors, wherein the image processing unit starts generation of the low-resolution image in a period in which fields other than said one or more fields for generating the low-resolution image are read.
  • Incidentally, the low-resolution image is an image for checking a result of capturing, and the image capturing apparatus may further include a display unit which displays the low-resolution image when the low-resolution image is generated by the image processing unit.
  • The image capturing apparatus may further include a field selecting unit which selects one field from the fields, wherein the image processing unit includes a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which directly receives an output of the pre-processing part and performs post-processing on the pre-processed color image, and when the low-resolution image is to be generated, the color image of one field selected by the field selecting unit is directly transferred from the pre-processing part to the post-processing part to thereby perform the pre-processing and the post-processing integrally and sequentially.
  • The image processing unit may generate a first image lower in resolution and a second image lower in resolution, and the field selecting unit may select one field for generating the first image and one field for generating the second image, respectively.
  • The image processing unit may include a pre-processing part which performs pre-processing on the color image output from the image capturing unit, a post-processing part which performs post-processing on the pre-processed color image, and a pixel averaging part which directly receives an output of the post-processing part and averages any pixels of the post-processed color image, and when the low-resolution image is to be generated, a first low-resolution image is generated by the post-processing part and the generated first low-resolution image is directly transferred from the post-processing part to the pixel averaging part to thereby generate a second low-resolution image which is lower in resolution than the first low-resolution image simultaneously.
  • The image processing unit may include a white balance adjusting part, and when the low-resolution image is to be generated, the white balance adjusting part performs white balance adjustment in accordance with a white balance adjustment value decided in advance.
  • The image processing unit may include a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which performs post-processing on the pre-processed color image, the image capturing apparatus may further include a plurality of buffer memory areas which store the color image pre-processed by the pre-processing part and a high-speed continuous capturing mode which performs, as parallel processing, a process of performing the pre-processing on the color image of one frame output from the image capturing unit and storing the pre-processed color image in one of the buffer memory areas and a process of performing the post-processing on the color image of a previous frame stored in another of the buffer memory areas, and the image processing unit does not start generation of the low-resolution image in a period of reading of fields other than said one or more fields for generating the low-resolution image while image capturing is executed in the high-speed continuous capturing mode.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram showing the configuration of an electronic camera 1 according to an embodiment.
  • FIG. 2 is a view for explaining a Bayer arrangement.
  • FIG. 3 is a block diagram showing the details of an image processing part 13.
  • FIG. 4 is a view for explaining a view operation.
  • FIG. 5 is a view for explaining a still image capturing operation.
  • FIG. 6 is a flow of image data during the view operation.
  • FIG. 7 is a diagram for explaining an image buffer of an SDRAM 19.
  • FIG. 8 is a view for explaining generation of a main image.
  • FIG. 9 is a flow of image data during the still image capturing operation.
  • FIG. 10 is another diagram for explaining an image buffer of the SDRAM 19.
  • FIG. 11 is another diagram for explaining an image buffer of the SDRAM 19.
  • FIGS. 12A to 12D are timing charts showing still image capturing sequences respectively.
  • FIGS. 13A and 13B are timing charts of an image signal output of a CCD 11 during the still image capturing operation.
  • FIGS. 14A and 14B are views for explaining generation of a quick view image.
  • FIG. 15 is a flow of data during generation of a quick view image.
  • FIG. 16 is a flow of data during generation of a thumbnail image.
  • FIG. 17 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIGS. 18A and 18B are other timing charts showing still image capturing sequences respectively.
  • FIG. 19 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 20 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 21 is another flow of data during generation of a quick view image and a thumbnail image.
  • FIG. 22 is a view for explaining a high-speed continuous image capturing mode.
  • FIG. 23 is another diagram for explaining an image buffer of SDRAM 19.
  • DETAILED DESCRIPTION OF THE EMBODIMENTS
  • Embodiments of the present invention will be described below with reference to the drawings.
  • The configuration of an electronic camera 1 according to an embodiment will be described first with reference to FIG. 1.
  • As shown in FIG. 1, the electronic camera 1 includes respective parts, i.e. an image-capturing lens 10, a CCD 11, an AFE (Analog Front End) 12, and an image processing part 13. The image-capturing lens 10 includes a focus lens, a zoom lens, a lens drive motor, etc. which are not shown. As shown in FIG. 2, the CCD 11 has a Bayer arrangement color filter. Incidentally, the CCD 11 is not limited to this example. The CCD 11 may have another filter arrangement such as a stripe arrangement or may be replaced with an image sensor other than the CCD. An image of a subject formed through the image-capturing lens 10 is captured by the CCD 11 and transformed into an image signal. The image signal is output to the AFE 12. The output image signal is converted into digital data (hereinafter referred to as “image data”) by the AFE 12. The image data is output to the image processing part 13.
  • The electronic camera 1 further includes respective parts, i.e. a TG (Timing Generator) 14, an MDIC (Motor Driver IC) 15, an SIO (Serial Input/Output) 16, and a PIO (Parallel Input/Output) 17. The TG 14 drives the CCD 11 and the AFE 12 to perform exposure, image signal output, etc. The MDIC 15 drives the lens drive motor of the image-capturing lens 10. The SIO 16 controls the TG 14 and the MDIC 15. The PIO 17 controls the MDIC 15.
  • The electronic camera 1 further includes respective parts, i.e. a JPEG compression part 18, an SDRAM 19, an SDRAM controller 20, an LCD 21, and a display controller 22. The JPEG compression part 18 compresses and expands image data subjected to image processing by the image processing part 13. The SDRAM 19 temporarily stores image data when the image data is subjected to image processing or image compression. The SDRAM controller 20 is an interface with the SDRAM 19. The LCD 21 displays image data and various kinds of information. The display controller 22 controls the LCD 21. Incidentally, the respective parts, i.e. the image processing part 13, the JPEG compression part 18, the SDRAM controller 20 and the display controller 22 are coupled to one another by an image bus.
  • The electronic camera 1 further includes respective parts, i.e. a memory card 23, a card I/F part 24, a USB I/F part 25 and a clock generator 26, and a CPU 27. The memory card 23 is removable and used for recording image data, etc. The card I/F part 24 is an interface with the memory card 23. The USB I/F part 25 can be coupled to a host PC, etc. The clock generator 26 supplies operating clocks to the respective parts. The CPU 27 controls the respective parts. Incidentally, the respective parts, i.e. the image processing part 13, the SIO 16, the PIO 17, the JPEG compression part 18, the SDRAM controller 20, the display controller 22, the card I/F part 24, the USB I/F part 25, the clock generator 26 and the CPU 27 are coupled to one another by a CPU bus.
  • FIG. 3 is a block diagram showing the details of the image processing part 13. As shown in FIG. 3, the image processing part 13 has a pre-processing part 30, and a post-processing part 31. The pre-processing part 30 has respective parts, i.e. a defect correcting part 32, an OB clamp processing part 33, a sensitivity-ratio adjusting part 34, a 3A-evaluated value calculating part 35, and an output buffer 36. The defect correcting part 32 applies defect pixel correction to image data input from the AFE 12. The OB clamp processing part 33 decides the black level of the image data corrected by the defect correcting part 32. The sensitivity-ratio adjusting part 34 corrects the signal levels of R, G and B by applying sensitivity ratio adjustment to the image data processed by the OB clamp processing part 33. The 3A-evaluated value calculating part 35 calculates respective evaluated values of AWB (Auto White Balance) in addition to the aforementioned AE and AF based on the output of the sensitivity-ratio adjusting part 34. Calculation results of the 3A-evaluated value calculating part 35 are output to the CPU 27 through the CPU bus. The output of the sensitivity-ratio adjusting part 34 is output to the post-processing part 31 and output to the image bus via the output buffer 36.
  • The post-processing part 31 has respective parts, i.e. a horizontal decimation part 40, a WB adjusting part 41, a γ correcting part 42, a color interpolating part 43, a color converting & color correcting part 44, a resolution converting part 45, a spatial filtering part 46, a CbCr decimation part 47, an input buffer 48, and an output buffer 49.
  • The horizontal decimation part 40 reduces the number of horizontal pixels by applying horizontal decimation to the image data pre-processed by the pre-processing part 30. The WB adjusting part 41 applies white balance adjustment to the image data decimated by the horizontal decimation part 40, based on the AWB evaluated value, etc. calculated by the 3A-evaluated value calculating part 35. The γ correcting part 42 applies γ correction to the image data white-balance-adjusted by the WB adjusting part 41. The color interpolating part 43 generates image data having three colors per pixel from Bayer arrangement image data having one color per pixel by applying color interpolation to the image data corrected by the γ correcting part 42. The color converting & color correcting part 44 generates image data in a target color space (e.g. sRGB) by applying color conversion and color correction to the image data interpolated by the color interpolating part 43. The image data is generally image data with YCbCr=4:4:4.
  • The resolution converting part 45 generates image data with a target size by applying a resolution conversion process to the image data corrected by the color converting & color correcting part 44. For example, for a view operation which will be described later, image data with a QVGA (320×240) size or a VGA (640×480) size is generated. The spatial filtering part 46 applies a spatial filtering process to the image data converted by the resolution converting part 45. Specifically, the spatial filtering part 46 applies an edge emphasizing process to a Y signal and applies a low-pass filtering process to color-difference signals (a Cb signal and a Cr signal). The CbCr decimation part 47 applies a decimation process to color-difference signals (a Cb signal and a Cr signal) to generate image data, for example, with YCbCr=4:2:2 and output the image data to the output buffer 49. The output of the output buffer 49 is coupled to the image bus. While the output from the image bus is coupled to the input buffer 48, the output of the input buffer 48 is coupled to the horizontal decimation part 40 and the color converting & color correcting part 44.
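  • As a rough illustration of this 4:4:4-to-4:2:2 conversion, the following Python/NumPy sketch averages each horizontal pair of chroma samples while leaving the Y plane untouched. The function name and the simple pair-averaging filter are assumptions for illustration; the patent does not specify the decimation filter used by the CbCr decimation part 47.

```python
import numpy as np

def decimate_cbcr_422(y, cb, cr):
    """Turn YCbCr 4:4:4 planes into 4:2:2 by halving the horizontal
    chroma resolution (image width assumed even). The pair average
    stands in for whatever filter the CbCr decimation part 47 uses."""
    h, w = cb.shape
    cb422 = cb.reshape(h, w // 2, 2).mean(axis=2).astype(cb.dtype)
    cr422 = cr.reshape(h, w // 2, 2).mean(axis=2).astype(cr.dtype)
    return y, cb422, cr422  # Y keeps its full resolution
```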
  • In the electronic camera 1 having the aforementioned configuration, there are a view operation and a still image capturing operation as capturing operations. The view operation is an operation of generating and displaying a through image to check a composition in real time. The still image capturing operation is an operation of generating an image (hereinafter referred to as “main image”) by main image capturing.
  • In the view operation, a high frame rate (e.g. 30 fps) is obtained because a decimated image signal is output from the CCD 11 as shown in FIG. 4. The view operation is suited for real-time observation of a subject on the LCD 21, photometric measurement for AE (Auto Exposure) or execution of AF (Auto Focusing).
  • On the other hand, in the still image capturing operation, an image signal with all pixels is output from the CCD 11 as shown in FIG. 5. Accordingly, the image signal is high in resolution and is output in the condition that the image signal is separated into a plurality of fields. Although FIG. 5 shows an example where 4 fields are output, the number of fields tends to increase as the number of pixels used in the CCD 11 increases.
  • In the aforementioned view operation, post-processing due to the post-processing part 31 can be directly applied to the image data pre-processed by the pre-processing part 30 because adjacent lines of an image signal are output sequentially from the CCD 11 as shown in FIG. 4. That is, in the view operation, the pre-processing part 30 inputs the pre-processed image data to the post-processing part 31 directly. Then, the image data post-processed by the post-processing part 31 is temporarily stored in the SDRAM 19 via the image bus and the SDRAM controller 20. Further, the image data from the SDRAM 19 passes through the SDRAM controller 20, the image bus and the display controller 22 successively and is displayed as a through image on the LCD 21.
  • On the other hand, in the aforementioned still image capturing operation, an interpolating process or the like in the post-processing due to the post-processing part 31 cannot be executed because the image signal is output from the CCD 11 in the condition that the image signal is separated into a plurality of fields as shown in FIG. 5. For example, in a first field shown in FIG. 5, a line n+4 is output next to a line n. Since lines n+1, n+2 and n+3 are inserted between the lines n and n+4, a process such as color interpolation, resolution conversion or spatial filtering using adjacent lines of image data cannot be applied to image data of the first field. In the aforementioned still image capturing operation, therefore, image data of the fields are pre-processed by the pre-processing part 30 respectively, temporarily stored in the SDRAM 19 and combined into a frame image on the SDRAM 19 and then post-processed by the post-processing part 31.
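  • To make the line interleaving of FIG. 5 concrete, the sketch below separates a progressive frame into line-interleaved fields and re-interleaves stored field buffers back into one frame, the form in which the SDRAM 19 holds the data before post-processing. This is a minimal sketch under the readout pattern described above (field k carries lines k, k+4, k+8, ...); the helper names are ours.

```python
import numpy as np

NUM_FIELDS = 4  # the 4-field readout of FIG. 5

def split_into_fields(frame):
    """Field k carries lines k, k + NUM_FIELDS, k + 2*NUM_FIELDS, ..."""
    return [frame[k::NUM_FIELDS] for k in range(NUM_FIELDS)]

def recombine_fields(fields):
    """Re-interleave the field buffers so that the one-frame image can
    be read in (progressive) order of lines, as done on the SDRAM 19."""
    height = sum(f.shape[0] for f in fields)
    frame = np.empty((height,) + fields[0].shape[1:], fields[0].dtype)
    for k, field in enumerate(fields):
        frame[k::NUM_FIELDS] = field
    return frame
```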
  • FIG. 6 shows a flow of image data during the view operation. The CPU 27 performs image processing along an arrow (1) and displays a through image along an arrow (2). Incidentally, in order to display the through image continuously, the two image buffers of a V1 buffer 60 and a V2 buffer 61 as shown in FIG. 7 are prepared so that the two image buffers are switched alternately every frame. When a release button is half-pushed, the CPU 27 performs AE operation using an AE evaluated value and AF using an AF evaluated value based on the 3A-evaluated value calculating part 35 in preparation for the still image capturing operation. When the release button is full-pushed after it is half-pushed, the CPU 27 performs exposure for still image capturing based on a result of the aforementioned AE operation after completion of AF and goes to the still image capturing operation.
  • The exposure for still image capturing is terminated by the closure of a mechanical shutter not shown, so that an image signal separated into a plurality of fields as shown in FIG. 5 is output from the CCD 11. While image data of each field is pre-processed by the pre-processing part 30 and then stored in the SDRAM 19, the CPU 27 calculates an AWB evaluated value in the 3A-evaluated value calculating part 35 in accordance with each field. When all the pre-processed image data of the fields are stored in the SDRAM 19, the CPU 27 sets a WB adjustment value obtained from the aforementioned AWB evaluated values in the WB adjusting part 41. Then, the CPU 27 reads image data stored in the SDRAM 19 in (progressive) order of lines so that the image data is post-processed by the post-processing part 31.
  • Incidentally, the three images of a quick view image for checking capturing, a thumbnail image suited for list display and a main image are generated in the still image capturing. These images are generated by post-processing the pre-processed image data respectively. The size of the main image is so large that the main image cannot usually be generated by one post-process. Therefore, as shown in FIG. 8, the main image is generated in such a manner that pre-processed image data is separated into narrow strip blocks, each of the blocks is post-processed, and the post-processed blocks are combined. However, the size of each post-processed block is reduced because surrounding pixels are cut off. Therefore, as shown in FIG. 8, boundary portions of adjacent blocks are made to overlap with each other so that post-processed images are combined correctly. Incidentally, when the main image is to be generated, the horizontal decimation part 40 is generally bypassed so that full-resolution pixels are fed to the subsequent stage. On the other hand, the size of each of the quick view image and the thumbnail image is small so that each of the quick view image and the thumbnail image can be generated by one post-process (without separation into strip blocks as described with reference to FIG. 8) if it has been initially subjected to horizontal decimation.
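  • The overlapping strip blocks of FIG. 8 can be pictured with the following sketch, which reads each strip widened by a margin on both sides, filters it, and keeps only the centre so the re-joined strips show no seams. The strip width, margin and the assumption of a shape-preserving filter are illustrative choices, not values from the patent.

```python
import numpy as np

def postprocess_in_strips(raw, strip_width, margin, filt):
    """Post-process a large image in vertical strip blocks whose
    boundaries overlap by `margin` pixels (cf. FIG. 8). `filt` is
    assumed to preserve shape; the widened read gives it valid
    context, so the cropped centres join without seams."""
    h, w = raw.shape[:2]
    out = np.empty_like(raw)
    for x0 in range(0, w, strip_width):
        x1 = min(x0 + strip_width, w)
        lo, hi = max(x0 - margin, 0), min(x1 + margin, w)
        block = filt(raw[:, lo:hi])                  # filter the widened strip
        out[:, x0:x1] = block[:, x0 - lo : x1 - lo]  # keep only the centre
    return out

# e.g. postprocess_in_strips(image, 512, 8, my_spatial_filter)
```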
  • A still image capturing operation according to the related art will be described first for the sake of comparison. FIG. 9 shows a flow of image data during the still image capturing operation. The CPU 27 performs pre-processing along an arrow (1) and performs post-processing along arrows (2) and (3). Each of the pre-processing part 30 and the post-processing part 31 has one image processing pipeline. Accordingly, the quick view image, the thumbnail image and the main image are generated sequentially. Each of the three images is stored in the SDRAM 19. Incidentally, it is preferable that the quick view image is generated first so that the user can check the contents of the captured main image quickly. It is preferable that the thumbnail image is then generated and the main image is finally generated. This is because a general file format standard called Exif/DCF uses a data arrangement in which the thumbnail image is recorded in the header portion of a JPEG file and the main image is recorded in the tail portion of the JPEG file. The CPU 27 displays image data of the quick view image on the LCD 21 along an arrow (6).
  • The CPU 27 further records the thumbnail image and the main image both in a JPEG compressed format. The CPU 27 performs JPEG compression along arrows (4) and (5). Because only one JPEG compression part 18 is provided, compression of the thumbnail image and compression of the main image are performed sequentially. In this case, it is preferable that the thumbnail image and the main image are compressed in this order. The CPU 27 combines compressed data of the thumbnail image and compressed data of the main image into one file on the SDRAM 19 in accordance with the Exif/DCF format and records the file on the memory card 23 through the card I/F part 24 along an arrow (7).
  • As described above, a plurality of data flows appear during the still image capturing operation. Therefore, a plurality of image buffers are prepared in the SDRAM 19 correspondingly to the data flows. As shown in FIG. 10, the SDRAM 19 has respective image buffers, i.e. an R buffer 62, a T buffer 63, an M buffer 64, a T-J buffer 65, an M-J buffer 66, and a Q buffer 67.
  • As shown in FIG. 10, the CPU 27 stores pre-processed image data in the R buffer 62. The image data stored in the R buffer 62 is post-processed so that image data of the thumbnail image thus generated is stored in the T buffer 63. The image data stored in the R buffer 62 is post-processed so that image data of the main image thus generated is stored in the M buffer 64. The image data stored in the T buffer 63 is JPEG-compressed by the JPEG compression part 18 so that image data of the compressed image thus generated is stored in the T-J buffer 65. The image data stored in the M buffer 64 is JPEG-compressed by the JPEG compression part 18 so that image data of the compressed image thus generated is stored in the M-J buffer 66. The image data stored in the T-J buffer 65 and the image data stored in the M-J buffer 66 are combined into one file on the SDRAM 19 as described above, so that the file is recorded on the memory card 23. In addition, the image data stored in the R buffer 62 is post-processed so that image data of the quick view image thus generated is stored in the Q buffer 67. The image data stored in the Q buffer 67 is displayed on the LCD 21 through the display controller 22 as described above.
  • Although the data flow reaching the R buffer 62 is drawn as one line in FIG. 10, a plurality of data flows as shown in FIG. 11 are actually provided because image data separated into fields as shown in FIG. 5 are input to the R buffer 62 successively. As shown in FIG. 11, the SDRAM 19 has respective image buffers, i.e. an R1 buffer 68, an R2 buffer 69, an R3 buffer 70 and an R4 buffer 71 in place of the R buffer 62 shown in FIG. 10. These image buffers may be configured so that the R buffers (the R1 buffer 68 to the R4 buffer 71) of the respective fields have discrete addresses so that a one-frame image can be read in (progressive) order of lines regardless of the number of original fields. Image data of the fourth field, for example, is input to the R4 buffer 71. The CPU 27 starts post-processing due to the post-processing part 31 after all image data of the four fields are stored in the image buffers respectively.
  • FIG. 12A is a timing chart showing a sequence for capturing a still image as described above. In FIG. 12A, “Q image”, “T image” and “M image” express a quick view image, a thumbnail image and a main image respectively. As shown in FIG. 12A, outputting of an image signal from the CCD 11 and pre-processing due to the pre-processing part 30 are performed in parallel so that image data of fields subjected to pre-processing are stored in the R buffers (the R1 buffer 68 to the R4 buffer 71) successively. When image data of all the fields are stored in the R buffers, post-processing due to the post-processing part 31 is applied to the image data to generate the three images of a quick view image, a thumbnail image and a main image successively. Incidentally, the CPU 27 displays the quick view image on the LCD 21 immediately after the CPU 27 generates the quick view image. Then, the CPU 27 applies JPEG compression due to the JPEG compression part 18 to the thumbnail image and the main image successively. Finally, the CPU 27 records compressed data of the thumbnail image and compressed data of the main image on the memory card 23 successively.
  • Incidentally, the thumbnail image and the main image are generated successively and JPEG-compressed successively. Therefore, when the thumbnail image is JPEG-compressed during generation of the main image, and compressed data of the thumbnail image is recorded during JPEG compression of the main image as shown in FIG. 12B, a plurality of processes can be executed while overlapping with one another so that the total processing time (capturing time) can be shortened. Further, when odd fields are read from the CCD 11, a low-resolution color image such as a quick view image or a thumbnail image can be generated from image data of only one field in parallel with outputting of an image signal from the CCD 11 as described above in the Related Art (see FIG. 12C).
  • FIG. 12D is a timing chart showing a high-speed still image capturing sequence obtained by combination of FIG. 12B and FIG. 12C. As shown in FIG. 12D, generation of a quick view image from image data of only one field is performed in parallel with outputting of an image signal from the CCD 11. Further, JPEG compression of a thumbnail image is started in the middle of generation of a main image. As a result, there is a large merit that the quick view image and the thumbnail image can be generated earlier.
  • Further, an example in which generation of only a quick view image is performed in parallel with outputting of an image signal from the CCD 11 is shown in FIGS. 12C and 12D. However, if the CCD 11 is of a 3-field output type, an image signal of the three fields is output successively in synchronization with a vertical synchronizing signal (VD signal) from the TG 14 as shown in FIG. 13A. Accordingly, when a quick view image is generated during reading of the first or second field and a thumbnail image is generated during reading of the second or third field, the generation of the quick view image and the thumbnail image can be completed before the outputting of the image signal of all the fields from the CCD 11 is completed.
  • Incidentally, as described above, the time required for completion of reading of all the fields from the CCD has become longer with the advance of increase in number of fields to be read. Therefore, in this invention, generation of the quick view image and the thumbnail image is started earlier in order to use the reading time efficiently.
  • FIGS. 14A and 14B are views for explaining generation of a quick view image in this embodiment. FIG. 14A shows an example where the CCD 11 is of a 3-field output type. FIG. 14B shows an example where the CCD 11 is of a 4-field output type. When the CCD 11 is of a 3-field output type, one field contains all color signal components (R, G and B) as shown in FIG. 14A. Accordingly, a quick view image and a thumbnail image can be generated from image data of only one field (the first field in the example shown in FIG. 14A). On the other hand, when the CCD 11 is of a 4-field output type, each field lacks one color signal component as shown in FIG. 14B. For example, the first field does not contain any B signal, and the second field does not contain any R signal. Therefore, in such a case, a quick view image and a thumbnail image are generated from image data of two fields (the first and second fields in the example shown in FIG. 14B) from which all color information can be extracted. Although all color information can be extracted from the image signal of two fields in the example shown in FIG. 14B, the smallest number of fields from which all color information can be extracted may be used as appropriate for a CCD or the like using color filters containing three or more color components.
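  • Which fields cover all colors follows from the parity of their line numbers: with a Bayer arrangement, even lines hold R and G, odd lines hold G and B, and field k of an F-field readout carries lines k, k+F, k+2F, and so on. The helper below checks the coverage under that readout assumption; it is an illustration, not part of the patent text.

```python
def colors_in_field(field_index, num_fields):
    """Bayer colors present in one field of a line-interleaved
    readout (even lines: R and G; odd lines: G and B)."""
    colors = set()
    for row in (field_index, field_index + num_fields):
        colors.update({'R', 'G'} if row % 2 == 0 else {'G', 'B'})
    return colors

# 3-field readout: each field spans even and odd lines -> all colors.
assert colors_in_field(0, 3) == {'R', 'G', 'B'}
# 4-field readout: the first field lacks B and the second lacks R
# (FIG. 14B), so two fields are needed to cover all the colors.
assert colors_in_field(0, 4) == {'R', 'G'}
assert colors_in_field(0, 4) | colors_in_field(1, 4) == {'R', 'G', 'B'}
```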
  • For example, as shown in FIG. 13B, when the CCD 11 is of a 4-field output type, a quick view image and a thumbnail image can be generated from image data of the first and second fields without waiting for completion of outputting of an image signal of all the fields from the CCD 11, at the point of time of completion of outputting of an image signal of the second field. Accordingly, the period of reading of the third and fourth fields can be used efficiently.
  • On this occasion, it is preferable that generation of the quick view image has priority over generation of the thumbnail image. This is because the user can check capturing based on the quick view image earlier while the quick view image can be used for generating the thumbnail image. In comparison between an image obtained by combining the first and second fields and a quick view image generated from the image, the size (resolution) of the quick view image is generally smaller (lower). Accordingly, when the quick view image is generated first and used for generating the thumbnail image, processing can be accelerated.
  • FIG. 15 shows a flow of data during generation of a quick view image. As shown in FIG. 15, the CPU 27 reads image data of the first and second fields stored in the SDRAM 19 (the R1 buffer 68 and the R2 buffer 69) while storing image data of the third and fourth fields in the SDRAM 19 (the R3 buffer 70 and the R4 buffer 71). Then, the read image data of the first and second fields are post-processed by the post-processing part 31 to generate a quick view image.
  • FIG. 16 shows a flow of data during generation of a thumbnail image from the quick view image generated by the data flow of FIG. 15. As shown in FIG. 16, the CPU 27 reads image data of the quick view image stored in the SDRAM 19 (the Q buffer 67) and applies a reduction process, etc. due to the resolution converting part 45 of the post-processing part 31 to the read image data of the quick view image to thereby generate a thumbnail image.
  • As described above with reference to FIGS. 15 and 16, even when the quick view image and the thumbnail image are generated sequentially, the total processing time can be shortened. When, for example, the total reading time of the third and fourth fields is required for generating the quick view image, the whole processing is performed in the same timing as in FIG. 12C or 12D. When the quick view image is generated in a shorter time, the total processing time can be further shortened because generation of the thumbnail image can be executed ahead of schedule. For example, this can be achieved when the frequency of processing clocks concerned with the post-processing part 31 is set at a high frequency. When generation of the quick view image is completed at an early timing of the reading period of the fourth field, generation of the thumbnail image can be completed before an end of the reading period of the fourth field. In this case, generation of the thumbnail image from the quick view image as described above with reference to FIG. 16 is effective.
  • On the other hand, as described above with reference to FIG. 13A, when the CCD 11 is of a 3-field output type, a quick view image and a thumbnail image can be generated from image data of only the first field. Accordingly, the reading period of the second and third fields can be used efficiently. In this case, as described above with reference to FIG. 15, the CPU 27 reads image data of the first field stored in the SDRAM 19 (the R1 buffer 68) and applies post-processing due to the post-processing part 31 to the read image data of the first field to generate a quick view image and a thumbnail image.
  • As described above with reference to FIG. 14A, when one field contains all color signal components (R, G and B), generation of a quick view image and a thumbnail image by a data flow shown in FIG. 17 can make processing more efficient. That is, as shown in FIG. 17, the CPU 27 transfers image data of one field pre-processed by the pre-processing part 30 from the pre-processing part 30 to the post-processing part 31 directly. Then, the image data is post-processed by the post-processing part 31 to generate a quick view image. The quick view image is stored in the SDRAM 19 (the Q buffer 67).
  • Image data can be transferred from the pre-processing part 30 to the post-processing part 31 directly by the same processing as in the data flow during the view operation shown in FIGS. 6 and 7. However, a plurality of quick view images are generated when image data is simply transferred from the pre-processing part 30 to the post-processing part 31 directly. A field selecting unit for selecting one of the fields is therefore provided so that image data of only the one field set by the field selecting unit in advance is directly transferred to the post-processing part 31 to thereby generate one quick view image. According to this configuration, the image processing part 13 performs control automatically (without the CPU 27 needing to control the operation/suspension of the post-processing part 31) so as to operate the post-processing part 31 in the reading period of only the selected field and suspend it in the reading periods of the other fields.
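  • The gating behaviour of the field selecting unit can be sketched as below: every pre-processed field is kept for the later main image, but only the selected field is also pushed straight through post-processing, leaving the post-processing part suspended for the other fields. The function and parameter names are hypothetical; this is a scheduling sketch only.

```python
def route_fields(fields, selected, postprocess):
    """Store every pre-processed field for the main image, but run
    post-processing only on the field chosen by the field selecting
    unit, mimicking the automatic operate/suspend control."""
    sdram, quick_view = [], None
    for k, field in enumerate(fields):
        sdram.append(field)                  # kept for the main image
        if k == selected:
            quick_view = postprocess(field)  # direct pre -> post path
    return sdram, quick_view
```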
  • Because a reduction process is generally performed when a quick view image is generated, the horizontal decimation part 40 can be used efficiently in the same manner as in the view operation described above with reference to FIGS. 6 and 7. On the other hand, image data of all the fields pre-processed by the pre-processing part 30 must be stored in the SDRAM 19 so that a high-resolution main image can be generated afterward. As shown in FIG. 17, there are hence provided two data flows corresponding to the quick view image output from the post-processing part 31 and the image data output from the pre-processing part 30.
  • According to the configuration described with reference to FIG. 17, it is possible to expect a merit of generating the quick view image in real time and a merit of reducing data traffic on the SDRAM 19. For example, if generation of the quick view image is completed in the reading period of the first field in FIG. 13A, the two reading periods of the second and third fields are sufficiently free to generate the thumbnail image. In this case, the thumbnail image can be generated from the quick view image as described above with reference to FIG. 16. When the thumbnail image is generated from the quick view image, its generation can be completed in the reading period of the second field. That is, the processing time can be shortened as shown in a timing chart of FIG. 18A. When the processing clock is further adjusted appropriately, processing such as starting or terminating compression of the thumbnail image in the reading period of the third field can be performed further ahead of schedule.
  • Instead of generation of the thumbnail image from the quick view image as described above with reference to FIG. 16, there may be used a method in which the quick view image is generated from the image data of the first field in the reading period of the first field and the thumbnail image is then generated from the image data of the second field in the reading period of the second field. In this case, the field used for generating the quick view image and the field used for generating the thumbnail image can be set respectively by the aforementioned selection unit. According to this configuration, the processing time can be shortened at the same level as that in the case where the thumbnail image is generated from the quick view image.
  • Alternatively, configuration may be made so that the quick view image and the thumbnail image are generated simultaneously in parallel. In most cases, the size of the quick view image is a QVGA (320×240) size or a VGA (640×480) size. On the other hand, the size of the thumbnail image is defined as a standard (160×120) size in the Exif/DCF format. It is found from size comparison between the quick view image and the thumbnail image that the standard thumbnail image (160×120) in the Exif/DCF format can be generated if the size of the quick view image is reduced to “1/N” times (in which N is an integer). It is a matter of course that this good compatibility is effective in the case where the thumbnail image is generated from the quick view image as described above with reference to FIG. 16. On the other hand, the size reduction process is equivalent to a process of calculating an average of pixel values in a block “N pixels” (horizontal) by “N pixels” (vertical). Therefore, a bit shift technique which is a known technique can be used for generating the quick view image and the thumbnail image simultaneously in parallel. Specifically, 2 bits are shifted to the right for “¼” times (N=4) and 1 bit is shifted to the right for “½” times (N=2). Because such a circuit for calculating an average of pixels has a very simple structure, increase in cost and power consumption caused by the provision of this circuit is allowable.
  • Therefore, as shown in a data flow of FIG. 19, a pixel averaging part 50 is provided in the rear of the post-processing part 31. Post-processed image data is input from the post-processing part 31 to the pixel averaging part 50 directly. FIG. 19 shows an example where the CCD 11 is of a 4-field output type. When the quick view image has been generated in the post-processing part 31, the pixel averaging part 50 calculates an average of pixel values in a block 4 pixels (horizontal) by 4 pixels (vertical) (16 pixels in total) in the quick view image (i.e. multiplies an integrated value of 16 pixels by 1/16 (a 4-bit shift)) or calculates an average of pixel values in a block 2 pixels (horizontal) by 2 pixels (vertical) (4 pixels in total) in the quick view image (i.e. multiplies an integrated value of 4 pixels by ¼ (a 2-bit shift)). Then, the pixel averaging part 50 outputs the average as the value of one pixel of the thumbnail image.
  • Then, the CPU 27 stores image data of the quick view image generated by the post-processing part 31 in the SDRAM 19 (the Q buffer 67) and stores image data of the thumbnail image generated by the pixel averaging part 50 in the SDRAM 19 (the T buffer 63). That is, the processing time can be shortened as shown in a timing chart of FIG. 18B.
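  • The bit-shift averaging described above reduces to summing each N×N block and shifting the sum right, as in this Python/NumPy sketch (a single 2-D plane with dimensions that are multiples of N is assumed; the names are ours):

```python
import numpy as np

def average_blocks_bitshift(img, n_shift):
    """Reduce a 2-D plane by 1/N per axis (N = 2**n_shift) by summing
    each N x N block and dividing with a right shift: N = 4 sums 16
    pixels and shifts right by 4 bits, N = 2 sums 4 pixels and shifts
    right by 2 bits, matching the pixel averaging part 50."""
    n = 1 << n_shift
    h, w = img.shape
    blocks = img.astype(np.uint32).reshape(h // n, n, w // n, n)
    sums = blocks.sum(axis=(1, 3))                    # integrate each block
    return (sums >> (2 * n_shift)).astype(img.dtype)  # divide by N*N
```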
  • Although the averaging process executed by the pixel averaging part 50 is effective for generating the quick view image from image data of one field, the averaging process is particularly effective for generating the quick view image from image data of two or more fields. The start of generation of the quick view image from image data of two or more fields is always delayed compared with the start of generation of the quick view image from image data of one field. The delay can be however compensated when the quick view image and the thumbnail image are generated simultaneously as described above. For example, when the CCD 11 is of a 4-field output type, generation of the quick view image can be started at the point of time of completion of outputting of the image signal of the second field and generation of the thumbnail image can be started at the same time as described above with reference to FIG. 13B. Accordingly, both the quick view image and the thumbnail image can be generated before the reading period of all the fields is terminated.
  • To further improve the processing speed for generating the quick view image from image data of two or more fields, image data reduced for the quick view image in advance (i.e. low-resolution RAW data) may be stored in the SDRAM 19. That is, as shown in a data flow of FIG. 20, while image data of all fields pre-processed by the pre-processing part 30 are stored in the SDRAM 19 (the R1 buffer 68 to the R4 buffer 71), image data of fields used for generating the quick view image are transferred from the pre-processing part 30 to the post-processing part 31 directly, size-reduced by the post-processing part 31 and stored in the SDRAM 19 (a Q-R1 buffer 82 and a Q-R2 buffer 83). In this case, a reduction ratio for the size reduction process in the horizontal decimation part 40 in the post-processing part 31 is set so that an image slightly larger than the quick view image can be obtained.
  • Then, the horizontally size-reduced image data of the first and second fields are stored in the SDRAM 19 (the Q-R1 buffer 82 and the Q-R2 buffer 83) via the output buffer 49. Therefore, configuration is made so that the output of the horizontal decimation part 40 can be connected to the output buffer 49. Although the horizontal size reduction alone permits the quick view image to be generated at a high speed, combining horizontal and vertical size reduction permits the quick view image to be generated at a higher speed when the total number of first and second field lines is considerably larger than the number of lines in the quick view image (e.g. when it is more than twice the number of lines in the quick view image).
  • The data stored in the Q-R1 and Q-R2 buffers 82 and 83 in FIG. 20 are Bayer arrangement image data. The color arrangement in one field is however common to all the lines as described above with reference to FIG. 14B. Accordingly, even when, for example, the lines added up in FIG. 14B are first field lines n and n+4 or second field lines n+1 and n+5, the color arrangement is unchanged. That is, this is the same as the case where two lines are added up as described in the view operation in FIG. 4. Accordingly, the aforementioned vertical size reduction can be achieved by use of line averaging (addition and division). A line memory is however required when the line averaging is performed by both addition and division. That is, image data of one line horizontally size-reduced by the horizontal decimation part 40 is stored in the line memory so that vertical size reduction can be performed by calculating an average (addition, and division by bit shift) of that image data and the image data of the next line horizontally size-reduced in the same manner.
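  • A minimal sketch of that line averaging with a one-line memory follows, under the assumption that paired lines share the same Bayer phase (as FIG. 14B shows they do); the names are ours.

```python
import numpy as np

def average_line_pairs(lines):
    """Average successive pairs of horizontally decimated lines,
    emitting one output line per two input lines. `lines` is any
    iterable of equal-length integer rows; the held row plays the
    role of the line memory, and the division is a 1-bit shift."""
    line_memory = None
    for line in lines:
        if line_memory is None:
            line_memory = line.astype(np.uint32)   # fill the line memory
        else:
            yield ((line_memory + line) >> 1).astype(line.dtype)
            line_memory = None                     # memory is free again
```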
  • Generally, the time required for image processing is proportional to the size of an image which is a subject of image processing. Therefore, the processing time can be shortened when the quick view image is generated from the low-resolution RAW data (reduced Bayer arrangement image data) as described above with reference to FIG. 20. Incidentally, if it is difficult to provide the aforementioned line memory, the quick view image can be generated at a high speed because even horizontal size reduction executed by the horizontal decimation part 40 can dispense with the separation of image data into strip blocks as shown in FIG. 8 in addition to reduction in number of horizontal pixels.
  • A data flow shown in FIG. 21 which is a modification of FIG. 20 is effective likewise. That is, as shown in FIG. 21, a quick view image is generated first and then a thumbnail image is generated from the quick view image. In this case, the processing time becomes longer than that in the data flow shown in FIG. 20. Elongation of the processing time can be however suppressed because the thumbnail image is generated from the quick view image. In addition, increase of hardware can be suppressed because it is unnecessary to provide any new structure such as a pixel averaging circuit.
  • Alternatively, the following measures may be taken in consideration of accuracy in white balance adjustment executed by the WB adjusting part 41 of the post-processing part 31. As described above with reference to FIG. 3, it is necessary to set an appropriate WB adjustment value in the WB adjusting part 41 when image data is post-processed by the post-processing part 31. White balance adjustment is an adjusting process for reproducing an appropriate color by correcting change of the color temperature of illuminating light on a subject when the color temperature changes. For the white balance adjustment, the CPU 27 sets the WB adjusting part 41 at the WB adjustment value obtained from the AWB evaluated value calculated by the 3A-evaluated value calculating part 35 as described above with reference to FIG. 3. In the view operation in which a through image is displayed on the LCD 21, the AWB evaluated value is updated at a constant rate (e.g. at a rate of 30 fps) in accordance with new image data which is continuously input to the pre-processing part 30. Generally, it is unnecessary to change the WB adjustment value frequently because the color temperature of illuminating light does not change rapidly. Accordingly, in the view operation, the WB adjustment value can be updated at a moderate rate.
  • On the other hand, in the still image capturing operation in which a still image is generated, the WB adjustment value is obtained based on the AWB evaluated value generally extracted from image data per se of the still image because the still image is an independent image of one frame. However, if the WB adjustment value is obtained based on the AWB evaluated value extracted from image data per se of a quick view image or a thumbnail image to be generated, generation of the quick view image or the thumbnail image is delayed. It is therefore preferable that the WB adjustment value used for generating the quick view image or the thumbnail image is decided in advance. For example, the WB adjustment value may be the latest WB adjustment value used in the view operation just before or may be obtained, as a WB adjustment value corresponding to the color temperature of illuminating light, from a database in the electronic camera 1 if the color temperature of illuminating light can be found in advance (e.g. as in flash photography). Then, the CPU 27 performs white balance adjustment due to the WB adjusting part 41 in accordance with the WB adjustment value decided in advance.
  • However, if the WB adjustment value can be calculated based on the AWB evaluated value at a high speed, the WB adjustment value may be obtained based on the AWB evaluated value extracted from image data per se of the quick view image or the thumbnail image to be generated. For example, when the CCD 11 is of a 3-field output type as described above with reference to FIG. 13A, an AWB evaluated value is extracted from image data of the first field during the reading period of the first field, a WB adjustment value is calculated from that AWB evaluated value during the reading period of the second field, and a quick view image is finally generated during the reading period of the third field. Because it is conceived that there is a high correlation between the respective fields, there is no large problem when the quick view image is generated from image data of the third field based on the AWB evaluated value extracted in the reading period of the first field.
  • On the other hand, for example, when the CCD 11 is of a 4-field output type as described above with reference to FIG. 13B, an AWB evaluated value is extracted from image data of the first and second fields during their reading periods, a WB adjustment value is calculated from that AWB evaluated value during the reading period of the third field, and a quick view image and a thumbnail image are generated during the reading period of the fourth field. Incidentally, when a long time is required for calculating the WB adjustment value, the WB adjustment value may be calculated in the reading period of the third and fourth fields so that the quick view image and the thumbnail image can be generated simultaneously after final reading of the fourth field is completed. Although the quick view image and the thumbnail image cannot then be generated in the period of reading of an image signal from the CCD 11, the total processing time can be shortened because the WB adjustment value can be calculated ahead of schedule. In this case, processing may be made in accordance with the data flow described with reference to FIG. 20.
  • Exceptional processing will be described below when the electronic camera 1 has a high-speed continuous capturing mode. As shown in FIG. 22, the high-speed continuous capturing mode is a capturing mode where capturing of one frame of the latest still image is made in parallel with image processing (post-processing) of a previous frame, in parallel with JPEG compression of a further previous frame and in parallel with recording of a further previous frame. In the high-speed continuous capturing mode, the three images of a quick view image, a thumbnail image and a main image are generated in image processing (post-processing), the thumbnail image and the main image are compressed in JPEG compression processing, and compressed data of the thumbnail image and the main image are recorded in recording processing. That is, in each step, images are processed in parallel.
  • It is necessary to provide a plurality of image buffers so that the four steps shown in FIG. 22 can overlap with one another. For example, as shown in FIG. 23, an R1 buffer 68 and an R2 buffer 69 are provided in place of the R buffer 62 shown in FIG. 10, and a Q1 buffer 72 and a Q2 buffer 73 are provided in place of the Q buffer 67 shown in FIG. 10. Moreover, a T1 buffer 74 and a T2 buffer 75 are provided in place of the T buffer 63 shown in FIG. 10, and an M1 buffer 76 and an M2 buffer 77 are provided in place of the M buffer 64 shown in FIG. 10. In addition, a T-J1 buffer 78 and a T-J2 buffer 79 are provided in place of the T-J buffer 65 shown in FIG. 10, and an M-J1 buffer 80 and an M-J2 buffer 81 are provided in place of the M-J buffer 66 shown in FIG. 10. That is, as shown in FIG. 23, all the image buffers are doubled. With this configuration of the SDRAM 19, the four steps described with reference to FIG. 22 can overlap with one another perfectly. Incidentally, each step includes smaller sub-steps which are executed sequentially.
  • In execution of image capturing in the high-speed continuous capturing mode described above, outputting of an image signal corresponding to one frame of the latest still image from the CCD 11 is performed in parallel with image processing (post-processing) of a previous frame performed by the post-processing part 31. Accordingly, a quick view image or a thumbnail image corresponding to one frame of the latest still image cannot be generated in parallel with the outputting of the image signal from the CCD 11. Therefore, in execution of image capturing in the high-speed continuous capturing mode, generation of the quick view image and the thumbnail image is not started during outputting of the image signal from the CCD 11. The quick view image and the thumbnail image are generated collectively and subsequently. In this case, the processing time of each step becomes longer than that in image capturing in a single capturing mode but the total processing time can be shortened because new frames are captured successively. When it is difficult to provide doubled image buffers as shown in FIG. 23 in terms of memory capacity, etc., at least the R buffer may be doubled (as an R1 buffer 68 and an R2 buffer 69) so that image capturing of the next frame can be continued.
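  • The doubled R buffers amount to a ping-pong scheme: pre-processing of the latest frame fills one buffer while post-processing of the previous frame drains the other. A minimal scheduling sketch with hypothetical names, not camera firmware:

```python
class PingPongBuffers:
    """Alternate two pre-processed-image buffers (R1 and R2) so that
    capture of frame k overlaps post-processing of frame k - 1."""
    def __init__(self, make_buffer):
        self.buffers = [make_buffer(), make_buffer()]
        self.frame = 0

    def for_preprocessing(self):
        return self.buffers[self.frame % 2]        # latest frame fills this

    def for_postprocessing(self):
        return self.buffers[(self.frame - 1) % 2]  # previous frame drains

    def next_frame(self):
        self.frame += 1
```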
  • An example of combination of the aforementioned processes will be described finally. When the CCD 11 is configured so that the reading mode can be switched in accordance with whether one field contains all color signal components (R, G and B) or not, processing modes corresponding to the respective cases may be provided so that one of the processing modes selected in accordance with switching of the reading mode can be executed. That is, configuration may be made so that a quick view image and a thumbnail image are generated from image data of one field when one field contains all color signal components (R, G and B), whereas a quick view image and a thumbnail image are generated from image data of two or more fields when one field does not contain all color image components (R, G and B). With this configuration, optimum processing can be performed regardless of the configuration of the CCD 11, the number of fields to be read and the switching of the reading mode.
  • As described above, according to this embodiment, generation of a low-resolution image is started in the reading period of the remaining fields, i.e. the fields other than those used for generating the low-resolution image. Accordingly, the reading time from the image sensor can be used efficiently.
  • Moreover, according to this embodiment, a display unit is further provided which displays the low-resolution image, used for checking a result of capturing, when the image is generated. Accordingly, the waiting time of the user can be shortened.
  • Moreover, according to this embodiment, a field selecting unit for selecting one of fields is further provided so that a color image of one field selected by the field selecting unit to generate a low-resolution image is transferred from the pre-processing part to the post-processing part directly to thereby unify pre-processing and post-processing. Accordingly, the low-resolution image can be generated efficiently and quickly.
  • Moreover, according to this embodiment, a first low-resolution image and a second low-resolution image are generated in such a manner that the field selecting unit selects a field used for generating the first image and a field used for generating the second image, respectively. Accordingly, low-resolution images can be generated sequentially and quickly without the necessity of adding any special configuration.
  • Moreover, according to this embodiment, for generation of low-resolution images, a first image with a low resolution is generated by the post-processing part and directly transferred from the post-processing part to the pixel averaging part to thereby simultaneously generate a second image with a lower resolution than that of the first image. Accordingly, low-resolution images can be generated efficiently and quickly.
  • Moreover, according to this embodiment, for generation of the low-resolution images, the white balance adjusting part performs white balance adjustment in accordance with a white balance adjustment value decided in advance. Accordingly, low-resolution images can be generated quickly while white balance adjustment can be performed appropriately.
  • Moreover, according to this embodiment, a high-speed continuous capturing mode is provided so that generation of any low-resolution image is not started in the reading period of the remaining fields during execution of image capturing in the high-speed continuous capturing mode. Accordingly, the reading time from the image sensor can be used efficiently regardless of the image capturing mode.
  • Although the respective sub-steps are drawn in the timing charts of FIGS. 12A to 12D and FIGS. 18A and 18B so as to have equalized lengths for the sake of simplification in the aforementioned embodiment, the drawn lengths of the respective sub-steps have no correlation with the actual processing times. Generally, the processing time required for processing such as image processing, JPEG compression processing, data recording, etc. is proportional to the amount of data used for the processing. That is, the longest processing time is required for generation and JPEG compression processing of the main image compared with generation of the quick view image and the thumbnail image. With respect to JPEG compression, the longest time is required for compression of the main image. Accordingly, the three sub-steps of image processing (post-processing), JPEG compression processing and data recording are dedicated to processing of the main image so that reduction of the image capturing time can be achieved. As described above in this embodiment, generation of the quick view image and the thumbnail image can be quickened to thereby achieve reduction of the image capturing time.
  • When the quick view image which was heretofore used only for checking image capturing is JPEG-compressed and recorded on the memory card 23 in association with the main image, the quick view image can be displayed on the LCD 21 at the time of reproduction of the main image. As a result, in comparison with the case where the main image was heretofore size-reduced before being displayed on the LCD 21, the waiting time of the user can be shortened. In this case, JPEG compression processing may be applied to the quick view image as well as the main image. A private area (maker note area) of Exif/DCF can be used for recording the quick view image.
  • In JPEG compression processing, bit rate control has heretofore been performed to keep the data size of one frame almost constant. Such control keeps the number of frames that can be captured constant in accordance with the recording capacity of the memory card 23. In some cases, however, JPEG compression must be repeated several times under bit rate control, so that the image capturing time can become remarkably long. It is therefore desirable to bring the amount of data close to the target value while repeating compression as few times as possible. Here, the number of repetitions can be reduced if the quantization table for JPEG compression of the main image is decided with reference to information on the JPEG compression of the quick view image or the thumbnail image generated before the main image (see the sketch below).
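One plausible heuristic, offered purely as an assumption since the patent does not specify the algorithm: scale the main image's quantization table by the ratio between the bits per pixel the quick view or thumbnail actually achieved and the target bits per pixel.

```python
def scale_quant_table(base_table, achieved_bpp: float, target_bpp: float):
    """Coarser quantization (larger values) when the scene compresses worse
    than the target; finer when it compresses better. Values stay in [1, 255]."""
    ratio = max(0.25, min(4.0, achieved_bpp / target_bpp))
    return [min(255, max(1, round(q * ratio))) for q in base_table]

# Example: the quick view came out at 2.4 bits/pixel against a 1.6 bits/pixel
# target, so the main image starts from a ~1.5x coarser table and is less
# likely to need a second compression pass.
```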
  • In each of the above-described embodiments, an image signal having the minimum number of fields (for example, two fields in the embodiment) from which color information of all the colors in the CCD can be extracted is used to generate low-resolution images such as quick view images. However, the present invention is not limited to such an example. The number of fields of the image signal used for generating the low-resolution images may be any number that is more than the minimum number of fields but less than the total number of fields of the CCD.
  • The many features and advantages of the embodiments are apparent from the detailed specification and, thus, it is intended by the appended claims to cover all such features and advantages of the embodiments that fall within the true spirit and scope thereof. Further, since numerous modifications and changes will readily occur to those skilled in the art, it is not desired to limit the inventive embodiments to the exact construction and operation illustrated and described, and accordingly all suitable modifications and equivalents may be resorted to, falling within the scope thereof.

Claims (7)

1. An image capturing apparatus comprising:
an image capturing unit which separates a color image of a subject being captured by an image sensor having pixels of colors into three or more fields and outputs said three or more fields successively; and
an image processing unit which generates a low-resolution image which is lower in resolution than the color image obtained by the image capturing unit, based on output of one or more fields among said three or more fields, said one or more fields being able to extract color information of all the colors, wherein
the image processing unit starts generation of the low-resolution image in a period in which fields other than said one or more fields for generating the low-resolution image are read.
2. The image capturing apparatus according to claim 1, wherein:
the low-resolution image is an image for checking a result of capturing; and
the image capturing apparatus further comprises a display unit which displays the low-resolution image when the low-resolution image is generated by the image processing unit.
3. The image capturing apparatus according to claim 1, further comprising a field selecting unit which selects one field from the fields, wherein:
the image processing unit includes a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which directly receives an output of the pre-processing part and performs post-processing on the pre-processed color image; and
when the low-resolution image is to be generated, the color image of one field selected by the field selecting unit is directly transferred from the pre-processing part to the post-processing part to thereby perform the pre-processing and the post-processing integrally and sequentially.
4. The image capturing apparatus according to claim 3, wherein:
the image processing unit generates a first image lower in resolution and a second image lower in resolution; and
the field selecting unit selects one field for generating the first image and one field for generating the second image, respectively.
5. The image capturing apparatus according to claim 1, wherein:
the image processing unit includes a pre-processing part which performs pre-processing on the color image output from the image capturing unit, a post-processing part which performs post-processing on the pre-processed color image, and a pixel averaging part which directly receives an output of the post-processing part and averages any pixels of the post-processed color image; and
when the low-resolution image is to be generated, a first low-resolution image is generated by the post-processing part and the generated first low-resolution image is directly transferred from the post-processing part to the pixel averaging part to thereby generate a second low-resolution image which is lower in resolution than the first low-resolution image simultaneously.
6. The image capturing apparatus according to claim 1, wherein:
the image processing unit includes a white balance adjusting part; and
when the low-resolution image is to be generated, the white balance adjusting part performs white balance adjustment in accordance with a white balance adjustment value decided in advance.
7. The image capturing apparatus according to claim 1, wherein:
the image processing unit includes a pre-processing part which performs pre-processing on the color image output from the image capturing unit and a post-processing part which performs post-processing on the pre-processed color image;
the image capturing apparatus further comprises a plurality of buffer memory areas which store the color image pre-processed by the pre-processing part and a high-speed continuous capturing mode which performs, as parallel processing, a process of performing the pre-processing on the color image of one frame output from the image capturing unit and storing the pre-processed color image in one of the buffer memory areas and a process of performing the post-processing on the color image of a previous frame stored in another of the buffer memory areas; and
the image processing unit does not start generation of the low-resolution image in a period of reading of fields other than said one or more fields for generating the low-resolution image while image capturing is executed in the high-speed continuous capturing mode.
US12/314,664 2007-12-25 2008-12-15 Image capturing apparatus Abandoned US20090160969A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/791,439 US8767086B2 (en) 2007-12-25 2013-03-08 Image capturing apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007332281A JP5256725B2 (en) 2007-12-25 2007-12-25 Imaging device
JP2007-332281 2007-12-25

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/791,439 Continuation US8767086B2 (en) 2007-12-25 2013-03-08 Image capturing apparatus

Publications (1)

Publication Number Publication Date
US20090160969A1 true US20090160969A1 (en) 2009-06-25

Family

ID=40788134

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/314,664 Abandoned US20090160969A1 (en) 2007-12-25 2008-12-15 Image capturing apparatus
US13/791,439 Expired - Fee Related US8767086B2 (en) 2007-12-25 2013-03-08 Image capturing apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/791,439 Expired - Fee Related US8767086B2 (en) 2007-12-25 2013-03-08 Image capturing apparatus

Country Status (2)

Country Link
US (2) US20090160969A1 (en)
JP (1) JP5256725B2 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100321514A1 (en) * 2009-06-19 2010-12-23 Asia Optical Co., Inc. Methods and systems for image capture and processing
US20110007184A1 (en) * 2008-03-10 2011-01-13 Jung Won Lee Image-signal processor capable of supporting a plurality of ccd image sensors and method for processing image signals using the image-signal processor
EP2485476A1 (en) * 2009-09-28 2012-08-08 Sony Corporation Imaging device
US20130162678A1 (en) * 2006-07-21 2013-06-27 Jerry G. Harris Progressive refinement of an edited image using secondary high resolution image processing
US20130162672A1 (en) * 2011-12-21 2013-06-27 Sony Corporation Image processing device, image processing method, and program
US8487963B1 (en) 2008-05-30 2013-07-16 Adobe Systems Incorporated Preview representation of pixels effected by a brush tip area
US20130293736A1 (en) * 2012-04-16 2013-11-07 Sony Corporation Image sensor, control method of image sensor, and imaging apparatus
US20140211037A1 (en) * 2013-01-29 2014-07-31 Samsung Electronics Co., Ltd. Electronic Apparatus, Method for Controlling the Same, and Computer-Readable Recording Medium
US20150271355A1 (en) * 2014-03-18 2015-09-24 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20180152611A1 (en) * 2015-11-25 2018-05-31 Huawei Technologies Co., Ltd. Photographing Method, Photographing Apparatus, and Terminal
US20180176539A1 (en) * 2011-10-04 2018-06-21 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
US10115223B2 (en) * 2017-04-01 2018-10-30 Intel Corporation Graphics apparatus including a parallelized macro-pipeline
CN109196858A (en) * 2016-03-29 2019-01-11 株式会社尼康 Capturing element and filming apparatus
US20190020851A1 (en) * 2017-07-11 2019-01-17 Canon Kabushiki Kaisha Image encoding apparatus, and control method thereof

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130097833A1 (en) 2011-04-15 2013-04-25 Tac-Fast Georgia L.L.C. Methods and systems for engagement of decorative covering
JP5884604B2 (en) * 2012-03-30 2016-03-15 株式会社ニコン Image recording apparatus, imaging apparatus, image recording program, and image display program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030169346A1 (en) * 2002-01-30 2003-09-11 Noriaki Ojima Photographing apparatus and photographing method
US20030206319A1 (en) * 2002-05-01 2003-11-06 Hiroshi Kondo Image sensing apparatus, image sensing method, program, and storage medium
US20080062272A1 (en) * 1999-09-28 2008-03-13 Nikon Corporation Electronic camera that reduces processing time by performing different processes in parallel
US20080266410A1 (en) * 2004-07-09 2008-10-30 Takahiro Fukuhara Imaging Apparatus, Integrated Circuit for Image Pickup Device and Image Data Processing Method

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10136244A (en) * 1996-11-01 1998-05-22 Olympus Optical Co Ltd Electronic image pickup device
JP3988461B2 (en) * 2001-12-28 2007-10-10 株式会社ニコン Electronic camera
JP4077247B2 (en) * 2002-06-04 2008-04-16 株式会社リコー Imaging method, imaging apparatus, and image information processing apparatus
JP2003244714A (en) * 2002-02-19 2003-08-29 Mega Chips Corp Image processing apparatus and digital still camera
JP2004032713A (en) * 2002-05-01 2004-01-29 Canon Inc Imaging apparatus, imaging method, program, and storage medium
JP2004135225A (en) * 2002-10-15 2004-04-30 Nikon Corp Electronic camera
JP2007074475A (en) * 2005-09-08 2007-03-22 Sanyo Electric Co Ltd Photographic apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080062272A1 (en) * 1999-09-28 2008-03-13 Nikon Corporation Electronic camera that reduces processing time by performing different processes in parallel
US20030169346A1 (en) * 2002-01-30 2003-09-11 Noriaki Ojima Photographing apparatus and photographing method
US20030206319A1 (en) * 2002-05-01 2003-11-06 Hiroshi Kondo Image sensing apparatus, image sensing method, program, and storage medium
US7433099B2 (en) * 2002-05-01 2008-10-07 Canon Kabushiki Kaisha Image sensing apparatus, image sensing method, program, and storage medium
US20080266410A1 (en) * 2004-07-09 2008-10-30 Takahiro Fukuhara Imaging Apparatus, Integrated Circuit for Image Pickup Device and Image Data Processing Method

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130162678A1 (en) * 2006-07-21 2013-06-27 Jerry G. Harris Progressive refinement of an edited image using secondary high resolution image processing
US8885208B2 (en) * 2006-07-21 2014-11-11 Adobe Systems Incorporated Progressive refinement of an edited image using secondary high resolution image processing
US20110007184A1 (en) * 2008-03-10 2011-01-13 Jung Won Lee Image-signal processor capable of supporting a plurality of ccd image sensors and method for processing image signals using the image-signal processor
US8368778B2 (en) * 2008-03-10 2013-02-05 Semisolution Inc. Image-signal processor capable of supporting a plurality of CCD image sensors and method for processing image signals using the image-signal processor
US8487963B1 (en) 2008-05-30 2013-07-16 Adobe Systems Incorporated Preview representation of pixels effected by a brush tip area
US20100321514A1 (en) * 2009-06-19 2010-12-23 Asia Optical Co., Inc. Methods and systems for image capture and processing
EP2485476A4 (en) * 2009-09-28 2014-10-01 Sony Corp Imaging device
EP2485476A1 (en) * 2009-09-28 2012-08-08 Sony Corporation Imaging device
US10587860B2 (en) * 2011-10-04 2020-03-10 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
US20180176539A1 (en) * 2011-10-04 2018-06-21 Canon Kabushiki Kaisha Imaging apparatus and method for controlling same
US9251765B2 (en) * 2011-12-21 2016-02-02 Sony Corporation Image processing device, image processing method, and program for generating composite image
US20130162672A1 (en) * 2011-12-21 2013-06-27 Sony Corporation Image processing device, image processing method, and program
US10542227B2 (en) 2012-04-16 2020-01-21 Sony Corporation Image sensor, control method of image sensor, and imaging apparatus
CN108055488A (en) * 2012-04-16 2018-05-18 索尼公司 Imaging sensor, the control method of imaging sensor and imaging device
US20130293736A1 (en) * 2012-04-16 2013-11-07 Sony Corporation Image sensor, control method of image sensor, and imaging apparatus
US10009563B2 (en) * 2012-04-16 2018-06-26 Sony Corporation Image sensor, control method of image sensor, and imaging apparatus
US9313407B2 (en) * 2013-01-29 2016-04-12 Samsung Electronics Co., Ltd. Electronic apparatus, method for controlling the same, and computer-readable recording medium
KR102004989B1 (en) 2013-01-29 2019-07-29 삼성전자주식회사 Photographing apparatus, method for controlling the same, and computer-readable recording medium
KR20140096921A (en) * 2013-01-29 2014-08-06 삼성전자주식회사 Photographing apparatus, method for controlling the same, and computer-readable recording medium
US20140211037A1 (en) * 2013-01-29 2014-07-31 Samsung Electronics Co., Ltd. Electronic Apparatus, Method for Controlling the Same, and Computer-Readable Recording Medium
US9769377B2 (en) 2014-03-18 2017-09-19 Canon Kabushiki Kaisha Imaging apparatus and control method for handling a raw image of a moving image or a still image
US9413921B2 (en) * 2014-03-18 2016-08-09 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20150271355A1 (en) * 2014-03-18 2015-09-24 Canon Kabushiki Kaisha Imaging apparatus and control method thereof
US20180152611A1 (en) * 2015-11-25 2018-05-31 Huawei Technologies Co., Ltd. Photographing Method, Photographing Apparatus, and Terminal
US10742889B2 (en) * 2015-11-25 2020-08-11 Huawei Technologies Co., Ltd. Image photographing method, image photographing apparatus, and terminal
CN109196858A (en) * 2016-03-29 2019-01-11 株式会社尼康 Capturing element and filming apparatus
US10115223B2 (en) * 2017-04-01 2018-10-30 Intel Corporation Graphics apparatus including a parallelized macro-pipeline
US20190020851A1 (en) * 2017-07-11 2019-01-17 Canon Kabushiki Kaisha Image encoding apparatus, and control method thereof
US10674110B2 (en) * 2017-07-11 2020-06-02 Canon Kabushiki Kaisha Image encoding apparatus, and control method thereof

Also Published As

Publication number Publication date
US20130194450A1 (en) 2013-08-01
US8767086B2 (en) 2014-07-01
JP2009159056A (en) 2009-07-16
JP5256725B2 (en) 2013-08-07

Similar Documents

Publication Publication Date Title
US8767086B2 (en) Image capturing apparatus
US7787026B1 (en) Continuous burst mode digital camera
JP4727457B2 (en) Imaging device
US20110050950A1 (en) Image device and imaging method
US20080043133A1 (en) Method of continuously capturing images in single lens reflex digital camera
US7830447B2 (en) Imaging apparatus including a plurality of image pickup elements
AU2013201746A1 (en) Image processing apparatus and method of camera device
JP4372686B2 (en) Imaging system
US7583280B2 (en) Image display device
US8908060B2 (en) Imaging apparatus generating evaluation values at a high frame rate and having a live view function of displaying a video smoothly at a low frame rate
US8610792B2 (en) Imaging apparatus and imaging method
JP2010021710A (en) Imaging device, image processor, and program
JP2011223146A (en) Electronic camera
JP2004260265A (en) Pixel extracting circuit having pixel turning over function, and image pickup apparatus
JP2009033385A (en) Image processor, image processing method, image processing program and imaging device
JP2018056652A (en) Imaging device and control method for imaging device
JP7545238B2 (en) Imaging device and control method thereof
JP2002359856A (en) Data conversion circuit and digital camera
JP2005191711A (en) Imaging system and method therefor, regeneration system and program thereof, and imaging regeneration system and program thereof
JP2003244714A (en) Image processing apparatus and digital still camera
JP2005142707A (en) Imaging apparatus
JP2006134030A (en) Image processor
JP5471004B2 (en) Focus adjustment apparatus, focus adjustment method, and program
JP3603683B2 (en) Video encoder circuit and television system conversion method
JP4322448B2 (en) Digital camera and digital camera control method

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIKON CORPORATION,JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KUROIWA, TOSHIHISA;REEL/FRAME:022036/0554

Effective date: 20081209

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION