
WO2016129404A1 - Image processing device, image processing method, and program - Google Patents


Info

Publication number
WO2016129404A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
marker
light component
unit
light
Application number
PCT/JP2016/052580
Other languages
French (fr)
Japanese (ja)
Inventor
源吾 森年
星野 和弘
Original Assignee
Sony Corporation
Application filed by Sony Corporation
Publication of WO2016129404A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10 - Methods or arrangements for sensing record carriers by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/12 - Methods or arrangements for sensing record carriers using a selected wavelength, e.g. to sense red marks and ignore blue marks
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 - Circuitry of solid-state image sensors for transforming different wavelengths into image signals
    • H04N25/11 - Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 - Arrangement of colour filter arrays characterised by the spectral characteristics of the filter elements
    • H04N25/131 - Arrangement of colour filter arrays including elements passing infrared wavelengths
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 - Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 - Circuitry of solid-state image sensors for transforming different wavelengths into image signals
    • H04N25/11 - Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 - Arrangement of colour filter arrays characterised by the spectral characteristics of the filter elements
    • H04N25/135 - Arrangement of colour filter arrays based on four or more different wavelength filter elements

Definitions

  • The present technology relates to an image processing device, an image processing method, and a program, and more particularly to an image processing device, an image processing method, and a program suitable for detecting an AR (Augmented Reality) marker.
  • One AR (Augmented Reality) technology is marker-type AR, which superimposes and displays additional information at the position where an AR marker of a predetermined pattern is detected (see, for example, Patent Document 1).
  • When an AR marker is placed in real space, its position may be restricted because it can spoil the scenery or the design.
  • The present technology has been made in view of such a situation, and aims to increase the freedom of AR marker placement without degrading AR marker detection accuracy.
  • An image processing apparatus according to one aspect of the present technology includes a separation unit that separates an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal unit that removes noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection unit that detects, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.
  • The separation unit may further separate a visible light component from the input image, and the noise removal unit may set a subtraction amount, the amount by which the pixel values of the first image are reduced, based on at least one of the luminance around the AR marker in a second image based on the visible light component of the input image and the ambient temperature of the image processing apparatus.
  • The AR marker may show the pattern by reflecting invisible light, and an irradiation control unit may further be provided that controls the intensity of the invisible light irradiating the AR marker based on at least one of the luminance around the AR marker in the second image, the subtraction amount, and the detection result of the AR marker.
  • An imaging element that includes first pixels detecting a visible light component and an invisible light component and second pixels detecting only an invisible light component, and that captures the input image, may further be provided.
  • An imaging element that includes a plurality of types of pixels detecting an invisible light component and visible light components of mutually different wavelength bands, and that captures the input image, may further be provided.
  • The separation unit may further separate a visible light component from the input image, and an additional information superimposing unit may further be provided that superimposes additional information corresponding to the AR marker on a second image based on the visible light component of the input image.
  • An image processing method according to one aspect of the present technology includes a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection step of detecting, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.
  • A program according to one aspect of the present technology causes a computer to execute processing including a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component, a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image, and a marker detection step of detecting, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.
  • In one aspect of the present technology, an invisible light component is separated from an input image containing a visible light component and an invisible light component, noise is removed by reducing pixel values of a first image based on the invisible light component of the input image, and an AR (Augmented Reality) marker showing a predetermined pattern with invisible light is detected in the noise-removed first image.
  • FIG. 1 shows a configuration example of a camera system 1 according to the first embodiment of the present technology.
  • The camera system 1 is a system that can superimpose additional information on a captured image using AR technology. That is, the camera system 1 detects a colorless, transparent AR marker that reflects IR light by irradiating it with IR light, and displays additional information superimposed on the captured image based on the detected AR marker.
  • The content of the additional information is not particularly limited; for example, it includes information about the photographed subject, the location, and so on.
  • The camera system 1 is configured to include a bandpass filter 11, an image sensor 12, a temperature sensor 13, an image processing unit 14, and an IR light irradiation device 15.
  • The bandpass filter 11 transmits light in a predetermined wavelength band including the visible light region and the IR light region, and blocks light in other wavelength bands of the light from the subject.
  • The image sensor 12 receives the light from the subject that has passed through the bandpass filter 11 and generates an image (hereinafter referred to as the input image) composed of pixel signals indicating the intensity of the light incident on each pixel.
  • The image sensor 12 supplies the generated input image to the separation unit 21 of the image processing unit 14.
  • FIG. 2 shows an arrangement example of the pixels of the image sensor 12.
  • In the image sensor 12, pixels are arranged in a grid with a pattern of 2 vertical × 2 horizontal pixels as one unit. In this example, the pixels are arranged in a ratio of G:R:B:IR = 1:1:1:1.
  • Compared with a conventional Bayer-array image sensor, the major difference is that IR pixels are arranged in place of one of the G pixels.
  • A G pixel mainly receives and photoelectrically converts a component in the G (green) wavelength band (hereinafter referred to as the G component) and a component in the IR light wavelength band (hereinafter referred to as the IR light component). That is, a G pixel can detect the G component and the IR light component.
  • An R pixel mainly receives and photoelectrically converts a component in the R (red) wavelength band (hereinafter referred to as the R component) and the IR light component. That is, an R pixel can detect the R component and the IR light component.
  • A B pixel mainly receives and photoelectrically converts a component in the B (blue) wavelength band (hereinafter referred to as the B component) and the IR light component. That is, a B pixel can detect the B component and the IR light component.
  • An IR pixel receives and photoelectrically converts the IR light component. That is, an IR pixel can detect only the IR light component. The layout is sketched below.
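As a concrete illustration of this layout, the following sketch tiles the 2 × 2 unit into a full color-filter-array map. The placement of the IR pixel within the unit is an assumption made for illustration; the text only states that an IR pixel replaces one G pixel of the Bayer unit.

```python
import numpy as np

# Hypothetical 2x2 unit: a Bayer RGGB unit with one G replaced by IR.
# The exact position of the IR pixel is an assumption, not given in the text.
UNIT = np.array([["R", "G"],
                 ["IR", "B"]])

def cfa_pattern(height: int, width: int) -> np.ndarray:
    """Tile the 2x2 unit into a height x width color-filter-array map."""
    return np.tile(UNIT, (height // 2, width // 2))

print(cfa_pattern(4, 4))
```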
  • The temperature sensor 13 measures the ambient temperature of the camera system 1 and supplies the measurement result to the coring unit 24 of the image processing unit 14.
  • The temperature sensor 13 is disposed, for example, in the vicinity of the image sensor 12.
  • The image processing unit 14 performs image processing such as AR processing on the input image supplied from the image sensor 12.
  • The image processing unit 14 is configured to include a separation unit 21, a visible light image processing unit 22, a luminance calculation unit 23, a coring unit (noise removal unit) 24, a threshold determination unit 25, a marker detection unit 26, an additional information superimposing unit 27, an IR light control unit 28, and a buffer 29.
  • The separation unit 21 separates the visible light component and the IR light component of each pixel of the input image. The separation unit 21 then supplies a mosaic image based on the visible light components to the visible light image processing unit 22, and stores an image based on the IR light components (hereinafter referred to as the IR light image) in the buffer 29.
  • The visible light image processing unit 22 performs predetermined image processing such as demosaic processing on the mosaic image, thereby generating an RGB image in which each pixel has R, G, and B component pixel values.
  • The visible light image processing unit 22 supplies the generated RGB image to the luminance calculation unit 23 and the additional information superimposing unit 27.
  • The luminance calculation unit 23 calculates the luminance around the AR marker in the RGB image (hereinafter referred to as the peripheral luminance) and stores the calculation result in the buffer 29.
  • The coring unit 24 reads the peripheral luminance calculation result and the IR light image from the buffer 29.
  • The coring unit 24 removes noise from the IR light image by performing a coring process on it based on the peripheral luminance and the ambient temperature of the camera system 1.
  • The coring unit 24 stores the IR light image after the coring process in the buffer 29.
  • The threshold determination unit 25 compares the pixel value of each pixel of the cored IR light image stored in the buffer 29 with a predetermined threshold. The threshold determination unit 25 then sets the pixel value of each pixel determined to be at or above the threshold to 1, and the pixel value of each pixel determined to be below the threshold to 0, thereby generating a binarized image, which it stores in the buffer 29.
  • The marker detection unit 26 detects the AR marker using the binarized image stored in the buffer 29.
  • The marker detection unit 26 stores the AR marker detection result in the buffer 29 and supplies it to the IR light control unit 28.
  • The additional information superimposing unit 27 superimposes additional information corresponding to the AR marker on the RGB image based on the AR marker detection result stored in the buffer 29.
  • The additional information superimposing unit 27 outputs the RGB image on which the additional information is superimposed to a downstream device or the like.
  • The IR light control unit 28 controls the amount of IR light emitted by the IR light irradiation device 15 based on the AR marker detection result and the peripheral luminance calculation result stored in the buffer 29.
  • The buffer 29 is configured by a memory such as a RAM, for example.
  • In step S1, the camera system 1 performs shooting. Specifically, the IR light irradiation device 15 irradiates IR light in the shooting direction of the camera system 1 under the control of the IR light control unit 28. Meanwhile, light from the subject enters the bandpass filter 11. This incident light includes reflected light produced when the subject reflects the IR light from the IR light irradiation device 15.
  • The bandpass filter 11 transmits light in a predetermined wavelength band including the visible light region and the IR light region among the incident light from the subject, and blocks light in other wavelength bands.
  • The image sensor 12 photoelectrically converts the light formed on its light-receiving surface into an electrical signal corresponding to the intensity of the light at each pixel, and generates an input image composed of pixel signals obtained by AD conversion of the electrical signal of each pixel.
  • The image sensor 12 supplies the generated input image to the separation unit 21.
  • As described above, R pixels, G pixels, B pixels, and IR pixels are arranged in a grid in the image sensor 12. Accordingly, the pixel signal output from an R pixel includes the R component and the IR light component, the pixel signal output from a G pixel includes the G component and the IR light component, the pixel signal output from a B pixel includes the B component and the IR light component, and the pixel signal output from an IR pixel includes only the IR light component.
  • Here, assume that the AR marker 101 shown in FIG. 4 is arranged within the shooting range of the camera system 1.
  • The AR marker 101 is divided into a reflective portion 101A, filled with black in the drawing, and a non-reflective portion 101B, filled with white.
  • The reflective portion 101A is, for example, a portion coated with a colorless, transparent paint that has a high reflectivity for IR light.
  • The non-reflective portion 101B is a portion where the colorless, transparent paint is not applied.
  • Accordingly, when the AR marker 101 is irradiated with IR light, the IR light is reflected only by the reflective portion 101A and enters the camera system 1. Therefore, as will be described later, the camera system 1 recognizes the shapes of the reflective portion 101A and the non-reflective portion 101B as the pattern of the AR marker 101.
  • Both the reflective portion 101A and the non-reflective portion 101B of the AR marker 101 are colorless and transparent, and cannot be seen by the human eye. Therefore, even if the AR marker 101 is placed in various locations, it does not spoil the scenery or the design.
  • In step S2, the separation unit 21 separates the visible light component and the IR light component. That is, the separation unit 21 separates each pixel signal of the input image into a visible light component and an IR light component.
  • For example, the separation unit 21 extracts the R component by subtracting the IR light component contained in the pixel signal of a neighboring IR pixel from the pixel signal of an R pixel. The separation unit 21 then extracts the IR light component by subtracting the extracted R component from the pixel signal of the R pixel. In this way, the R component (visible light component) and the IR light component of the R pixel signal are separated.
  • Similarly, the G component (visible light component) and the IR light component of each G pixel signal are separated, and the B component (visible light component) and the IR light component of each B pixel signal are separated.
  • The separation unit 21 then generates an IR light image whose pixel values correspond to the intensities of the IR light components of the R, G, B, and IR pixel signals, and stores it in the buffer 29.
  • The separation unit 21 also generates a mosaic image whose pixel values correspond to the intensities of the R component of each R pixel signal, the G component of each G pixel signal, the B component of each B pixel signal, and the visible light component of each IR pixel signal (in practice, 0), and supplies it to the visible light image processing unit 22. A sketch of this separation appears below.
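The following is a minimal sketch of this separation step, under the layout assumed in the earlier snippet (IR pixels at the lower-left of each 2 × 2 unit). The nearest-neighbor IR estimate is an illustrative simplification; the text only says that the IR light component of a neighboring IR pixel is subtracted.

```python
import numpy as np

def separate_components(raw: np.ndarray, cfa: np.ndarray):
    """Split a RAW mosaic into a visible-light mosaic and an IR light image.

    raw : H x W float array of pixel signals (every pixel contains IR).
    cfa : H x W label array ("R", "G", "B", "IR") as tiled by cfa_pattern().
    Each non-IR pixel borrows the IR estimate of the IR pixel in its own
    2x2 unit (nearest-neighbor; a real design might interpolate instead).
    """
    h, w = raw.shape
    ir_samples = raw[1::2, 0::2]            # IR pixels sit at (odd row, even col)
    ir = np.repeat(np.repeat(ir_samples, 2, axis=0), 2, axis=1)[:h, :w]
    visible = np.clip(raw - ir, 0.0, None)  # R/G/B component at each pixel
    visible[cfa == "IR"] = 0.0              # IR pixels carry no visible signal
    return visible, ir
```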
  • In step S3, the visible light image processing unit 22 performs visible light image processing.
  • For example, the visible light image processing unit 22 performs demosaic processing on the mosaic image and generates an RGB image by interpolating the R, G, and B component pixel values of each pixel.
  • The visible light image processing unit 22 also performs, on the generated RGB image, processing applied to ordinary RGB images, such as white balance processing and noise removal processing. The visible light image processing unit 22 then supplies the RGB image to the luminance calculation unit 23 and the additional information superimposing unit 27.
  • In step S4, the luminance calculation unit 23 calculates the luminance around the AR marker 101. Specifically, the luminance calculation unit 23 calculates the luminance component (Y component) of each pixel from the R, G, and B pixel values of each pixel of the RGB image using a predetermined conversion formula. The luminance calculation unit 23 then calculates the peripheral luminance around the AR marker 101 based on the luminance components of the pixels of the RGB image.
  • For example, the luminance calculation unit 23 calculates the average of the luminance components of all pixels of the RGB image as the peripheral luminance (overall luminance).
  • Alternatively, for example, as shown in FIG. 5, the luminance calculation unit 23 calculates the average of the luminance components of the pixels within a rectangular luminance detection region 112 around the AR marker 101 in the RGB image 111 as the peripheral luminance. This makes it possible to detect the brightness around the AR marker 101 more accurately.
  • The position of the luminance detection region 112 (for example, the coordinates (X, Y), width W, and height H of the luminance detection region 112) may be set by the user, or may be set automatically by the camera system 1.
  • For example, when the object or place where the AR marker 101 is located is known in advance, the AR marker 101 can be reliably included within the luminance detection region 112 by setting the region so as to contain that object or place.
  • Alternatively, for example, only the position of the AR marker 101 may be detected first, and a region containing the AR marker 101 may then be set as the luminance detection region 112.
  • Also, for example, the luminance calculation unit 23 may set the luminance detection region 112 based on the detection result of the AR marker 101 in the previous frame.
  • The luminance calculation unit 23 stores the peripheral luminance calculation result in the buffer 29. A sketch of this calculation follows.
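A minimal sketch of the peripheral-luminance calculation, assuming the common ITU-R BT.601 luma weights as the "predetermined conversion formula" (the text does not specify one) and a hypothetical (x, y, w, h) rectangle for the luminance detection region 112:

```python
import numpy as np

def peripheral_luminance(rgb: np.ndarray, region=None) -> float:
    """Mean luminance (Y component) used as the peripheral luminance.

    rgb    : H x W x 3 float array (R, G, B planes).
    region : (x, y, w, h) luminance detection rectangle, or None to
             average over the whole image (the 'overall luminance' case).
    """
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    if region is None:
        return float(y.mean())
    x, top, w, h = region
    return float(y[top:top + h, x:x + w].mean())
```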
  • In step S5, the temperature sensor 13 measures the ambient temperature of the camera system 1.
  • The temperature sensor 13 supplies the measurement result to the coring unit 24.
  • In step S6, the coring unit 24 performs a coring process. Specifically, the coring unit 24 reads the IR light image and the peripheral luminance calculation result from the buffer 29. The coring unit 24 then corrects each pixel value of the IR light image based on the peripheral luminance and the ambient temperature of the camera system 1.
  • Specifically, the coring unit 24 calculates a subtraction amount for the pixel value of each pixel of the IR light image based on the peripheral luminance (hereinafter referred to as the luminance-based subtraction amount), using, for example, the graph shown in FIG. 6, in which the horizontal axis represents the peripheral luminance and the vertical axis represents the luminance-based subtraction amount.
  • In the range where the peripheral luminance is below YL, the luminance-based subtraction amount is fixed at the minimum value SYL. In the range between YL and a higher value YH, the luminance-based subtraction amount increases linearly from the minimum SYL to the maximum SYH in proportion to the peripheral luminance. In the range above YH, the luminance-based subtraction amount is fixed at the maximum value SYH. That is, since the noise component of the IR light image increases as the peripheral luminance increases, the luminance-based subtraction amount is set larger.
  • Similarly, the coring unit 24 calculates a subtraction amount for the pixel value of each pixel of the IR light image based on the ambient temperature of the camera system 1 (hereinafter referred to as the temperature-based subtraction amount), using, for example, the graph shown in FIG. 7, in which the horizontal axis represents the ambient temperature of the camera system 1 and the vertical axis represents the temperature-based subtraction amount.
  • In the range where the ambient temperature is below TL, the temperature-based subtraction amount is fixed at the minimum value STL. In the range between TL and a higher value TH, the temperature-based subtraction amount increases linearly from the minimum STL to the maximum STH in proportion to the ambient temperature. In the range above TH, the temperature-based subtraction amount is fixed at the maximum value STH. That is, since the noise component of the IR light image increases as the ambient temperature increases, the temperature-based subtraction amount is set larger.
  • Next, the coring unit 24 calculates a final subtraction amount (hereinafter referred to as the total subtraction amount) based on the luminance-based subtraction amount and the temperature-based subtraction amount. For example, the coring unit 24 calculates the total subtraction amount by adding the luminance-based subtraction amount and the temperature-based subtraction amount, or by a weighted addition of the two.
  • The coring unit 24 then corrects the pixel value of each pixel of the IR light image by subtracting the total subtraction amount from it.
  • In this way, the pixel values are reduced according to the amount of noise arising in the IR light image, and the noise component of the IR light image is appropriately reduced.
  • The coring unit 24 stores the IR light image after the coring process in the buffer 29. A sketch of the whole coring step follows.
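The coring step can be sketched as follows. The breakpoint names (YL, YH, SYL, SYH, TL, TH, STL, STH) follow the figures, but their concrete values are design choices not given in the text; np.interp reproduces the flat-minimum / linear / flat-maximum shape of FIGS. 6 and 7.

```python
import numpy as np

def piecewise_subtraction(x, lo, hi, s_min, s_max):
    """Piecewise-linear curve of FIGS. 6 and 7: fixed at s_min below lo,
    rising linearly to s_max between lo and hi, fixed at s_max above hi."""
    return float(np.interp(x, [lo, hi], [s_min, s_max]))

def coring(ir_image, luminance, temperature, p):
    """Step S6 sketch: subtract a luminance- and temperature-dependent
    noise floor from every pixel of the IR light image.

    p is a dict of breakpoints, e.g. {"YL": ..., "YH": ..., "SYL": ...,
    "SYH": ..., "TL": ..., "TH": ..., "STL": ..., "STH": ...}.
    """
    s_lum = piecewise_subtraction(luminance, p["YL"], p["YH"], p["SYL"], p["SYH"])
    s_tmp = piecewise_subtraction(temperature, p["TL"], p["TH"], p["STL"], p["STH"])
    total = s_lum + s_tmp          # or a weighted sum, as the text allows
    return np.clip(ir_image - total, 0.0, None)
```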
  • In step S7, the threshold determination unit 25 performs a threshold determination process. Specifically, the threshold determination unit 25 reads the cored IR light image from the buffer 29 and determines whether the pixel value of each pixel of the IR light image is at or above a predetermined threshold.
  • The threshold determination unit 25 sets to 1, for example, the pixel value of each pixel whose pixel value is at or above the predetermined threshold, in other words, each pixel whose IR light component after the coring process is at or above a predetermined intensity.
  • The threshold determination unit 25 sets to 0, for example, the pixel value of each pixel whose pixel value is below the predetermined threshold, in other words, each pixel whose IR light component after the coring process is below the predetermined intensity. A binarized image is thereby generated.
  • The threshold determination unit 25 stores the generated binarized image in the buffer 29.
  • The threshold used for this process may be adjusted based on, for example, the peripheral luminance, the ambient temperature of the camera system 1, and the like, as sketched below.
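A minimal sketch of the binarization, with an optional threshold adjustment. The adjustment rule (a hypothetical linear bias in luminance and temperature) is an assumption, since the text only says the threshold "may be adjusted":

```python
import numpy as np

def binarize(ir_cored, threshold, luminance=None, temperature=None,
             k_lum=0.0, k_tmp=0.0):
    """Step S7 sketch: pixels at or above the threshold become 1, others 0.

    k_lum and k_tmp are hypothetical coefficients that raise the threshold
    in bright or hot conditions, where residual noise is larger.
    """
    if luminance is not None:
        threshold = threshold + k_lum * luminance
    if temperature is not None:
        threshold = threshold + k_tmp * temperature
    return (ir_cored >= threshold).astype(np.uint8)
```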
  • In step S8, the marker detection unit 26 performs a process of detecting the AR marker 101. Specifically, the marker detection unit 26 reads the binarized image from the buffer 29. The marker detection unit 26 then detects the AR marker 101 by searching the binarized image for an image of the same shape as the reflective portion 101A of the AR marker 101, using a predetermined method such as pattern matching.
  • In step S9, the marker detection unit 26 determines whether the AR marker 101 has been detected based on the result of the processing in step S8. If it is determined that the AR marker 101 has been detected, the process proceeds to step S10.
  • In step S10, the image processing unit 14 performs additional information superimposition processing.
  • Specifically, the marker detection unit 26 stores the detection result of the AR marker 101 in the buffer 29.
  • This detection result includes, for example, the coordinates of the position at which the AR marker 101 was detected in the binarized image.
  • The additional information superimposing unit 27 reads the detection result of the AR marker 101 from the buffer 29. The additional information superimposing unit 27 then superimposes additional information corresponding to the AR marker 101 near the position in the RGB image at which the AR marker 101 was detected, and outputs the resulting RGB image to a downstream device or the like. Thereby, for example, the RGB image with the additional information superimposed is displayed on the downstream device.
  • Thereafter, the process returns to step S1, and the processing from step S1 onward is executed. That is, the AR processing described above is performed on each frame of the input image.
  • On the other hand, if it is determined in step S9 that the AR marker 101 could not be detected, the process proceeds to step S11.
  • In step S11, the image processing unit 14 adjusts the irradiation amount of IR light. Specifically, the marker detection unit 26 notifies the IR light control unit 28 that the AR marker 101 has not been detected.
  • The IR light control unit 28 reads the peripheral luminance calculation result from the buffer 29.
  • The IR light control unit 28 calculates an appropriate amount of IR light based on the peripheral luminance. For example, the IR light control unit 28 decreases the amount of IR light when the peripheral luminance is high, and increases it when the peripheral luminance is low. Note that it is desirable not to increase the amount of IR light too much, in order to suppress increased power consumption and noise caused by irregular reflection of the IR light. The IR light control unit 28 then controls the IR light irradiation device 15 so that the amount of IR light it emits becomes the calculated amount.
  • In this way, the amount of IR light is set to an appropriate value based on the peripheral luminance, and the detection accuracy of the AR marker 101 improves. A sketch of such a control rule follows.
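A minimal sketch of this adjustment under stated assumptions: the text only says to decrease the IR output when the surroundings are bright and increase it when they are dark, so the step size, the luminance threshold, and the cap are hypothetical parameters.

```python
def adjust_ir_amount(current, peripheral_lum, lum_threshold, step, max_amount):
    """Step S11 sketch: lower IR output in bright scenes, raise it in dark
    ones, capped to limit power draw and stray-reflection noise."""
    if peripheral_lum > lum_threshold:
        return max(current - step, 0.0)
    return min(current + step, max_amount)
```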
  • Note that the coring process in step S6 and the IR light irradiation amount adjustment in step S11 may be limited in the number of executions or in execution timing so that hunting does not occur.
  • Thereafter, the process returns to step S1, and the processing from step S1 onward is executed.
  • As described above, the detection accuracy of the AR marker 101 can be improved.
  • As a result, additional information can be superimposed at an appropriate position in the RGB image and presented to the user.
  • Also, since the AR marker 101 is colorless and transparent, it can be placed at a desired position without spoiling the scenery or the design. That is, the freedom of AR marker placement improves. Moreover, since the AR marker 101 is colorless and transparent, the confidentiality of the information improves.
  • In addition, the load and data required for processing can be reduced.
  • FIG. 8 illustrates a configuration example of a camera system 201 according to the second embodiment of the present technology. In the figure, portions corresponding to those in FIG. 1 are denoted by the same reference numerals.
  • The camera system 201 is the same as the camera system 1 in that it includes the bandpass filter 11, the image sensor 12, and the temperature sensor 13. On the other hand, the camera system 201 differs from the camera system 1 in that it includes an image processing unit 214 instead of the image processing unit 14, and in that the IR light irradiation device 15 is not provided.
  • The image processing unit 214 is the same as the image processing unit 14 in that it includes the visible light image processing unit 22 and the additional information superimposing unit 27.
  • The image processing unit 214 differs from the image processing unit 14 of the camera system 1 in that it includes a separation unit 221, a luminance calculation unit 223, a coring unit 224, a threshold determination unit 225, and a marker detection unit 226 instead of the separation unit 21, the luminance calculation unit 23, the coring unit 24, the threshold determination unit 25, and the marker detection unit 26, and in that the IR light control unit 28 is not provided.
  • The separation unit 221, the luminance calculation unit 223, the coring unit 224, the threshold determination unit 225, and the marker detection unit 226 basically perform the same processing as the separation unit 21, the luminance calculation unit 23, the coring unit 24, the threshold determination unit 25, and the marker detection unit 26 of the image processing unit 14. However, they exchange data with one another directly, without going through a buffer.
  • Like the separation unit 21, the separation unit 221 separates the visible light component and the IR light component of each pixel of the input image. The separation unit 221 then supplies the mosaic image of the visible light components to the visible light image processing unit 22, and supplies the IR light image of the IR light components to the luminance calculation unit 223.
  • Like the luminance calculation unit 23, the luminance calculation unit 223 calculates the luminance around the AR marker (the peripheral luminance) in the RGB image supplied from the visible light image processing unit 22.
  • The luminance calculation unit 223 supplies the IR light image and the peripheral luminance calculation result to the coring unit 224.
  • Like the coring unit 24, the coring unit 224 performs the coring process on the IR light image based on the peripheral luminance and the ambient temperature of the camera system 201 measured by the temperature sensor 13.
  • The coring unit 224 supplies the IR light image after the coring process to the threshold determination unit 225.
  • The threshold determination unit 225 performs the threshold determination process on the cored IR light image.
  • The threshold determination unit 225 supplies the binarized image generated by the threshold determination process to the marker detection unit 226.
  • Like the marker detection unit 26, the marker detection unit 226 detects the AR marker using the binarized image.
  • The marker detection unit 226 supplies the AR marker detection result to the additional information superimposing unit 27.
  • As described above, the IR light irradiation device 15 is not provided in the camera system 201. The camera system 201 therefore detects an AR marker that emits IR light itself.
  • For example, an AR marker in which the reflective portion 101A of the AR marker 101 in FIG. 4 described above is replaced with a light-emitting portion that emits IR light is detected.
  • For example, the total subtraction amount may be set for each pixel by setting the luminance-based subtraction amount for each pixel based on the luminance component of each pixel of the IR light image.
  • Also, for example, the subtraction amount may be expressed not as an absolute value as described above but as a ratio by which the pixel value is reduced.
  • For example, the pixel value of each pixel of the IR light image may be corrected so as to be reduced by 10%, as in the snippet below.
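In the ratio form, the coring correction becomes a single scaling, here with the 10% reduction mentioned above as an illustrative value:

```python
# Ratio-based coring: scale every pixel down by a fraction instead of
# subtracting an absolute amount; 0.10 (10%) is the example from the text.
ir_cored = ir_image * (1.0 - 0.10)
```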
  • The graphs of FIGS. 6 and 7 for calculating the subtraction amount are merely examples, and other graphs may be used.
  • For example, a curved graph as shown in FIG. 9 may be used.
  • Also, for example, a plurality of representative points may be set, and a pseudo-curve connecting the representative points with straight lines, a curve obtained by interpolation using the representative points, or the like may be used.
  • Furthermore, the subtraction amount may be set based on only one of the peripheral luminance and the ambient temperature.
  • In the above description, the IR light control unit 28 controls the amount of IR light from the IR light irradiation device 15 based on the peripheral luminance; however, it may instead control the amount of IR light based on, for example, the subtraction amount of the coring process. For example, when the subtraction amount is large, the peripheral luminance can be assumed to be high and the noise amount large, so the amount of IR light may be decreased; when the subtraction amount is small, the peripheral luminance can be assumed to be low and the noise amount small, so the amount of IR light may be increased.
  • Alternatively, the IR light control unit 28 may control the amount of IR light based on both the peripheral luminance and the subtraction amount of the coring process.
  • Also, for example, the IR light control unit 28 may simply increase or decrease the amount of IR light by a predetermined value, without using the peripheral luminance or the subtraction amount of the coring process.
  • Furthermore, the IR light control unit 28 may control the amount of IR light based on at least one of the peripheral luminance and the subtraction amount of the coring process even when the AR marker is detected.
  • In the above embodiments, pixels capable of receiving and photoelectrically converting the IR light component are used for all of the R, G, and B pixels, and IR light is allowed to enter the image sensor without being blocked by an IR cut filter or the like as in an ordinary camera.
  • Instead, for example, it is also possible to perform frequency analysis on the pixel signals of the R, G, and B pixels using a technique such as the FFT (Fast Fourier Transform), and to separate the visible light component and the IR light component based on the result.
  • The present technology can also use an image sensor capable of detecting invisible light of a wavelength band different from that of IR light (for example, ultraviolet light). In that case, an AR marker that reflects or emits that invisible light is used.
  • It is also possible to use an image sensor capable of detecting visible light in a wavelength band other than the R, G, and B wavelength bands (for example, a yellow wavelength band).
  • In the above description, the camera system 1 superimposes the additional information on the RGB image and then outputs it. Alternatively, for example, the RGB image and the AR marker detection result may be output, and the additional information may be superimposed on the RGB image in a downstream apparatus.
  • Also, for example, before the marker detection process, the threshold determination unit 25 may determine whether the pixel values of the IR light image after the coring process satisfy values at which the AR marker 101 can be detected. For example, the threshold determination unit 25 determines that the AR marker 101 can be detected when the average pixel value of the IR light image after the coring process is at or above a predetermined threshold, and determines that it cannot be detected when the average is below that threshold.
  • In this determination, the threshold determination unit 25 may use the average pixel value of the entire IR light image, or the average pixel value of each predetermined region. In the former case, whether the AR marker 101 can be detected is judged for the IR light image as a whole; in the latter case, it is judged for each region of the IR light image, as in the sketch below.
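A minimal sketch of this detectability test; the whole-image and per-region variants follow the two cases above, with the tile size as a hypothetical parameter:

```python
import numpy as np

def marker_detectable(ir_cored, threshold, tile=None):
    """Return whether the mean pixel value after coring reaches the
    threshold: one bool for the whole image (tile=None), or a bool per
    tile x tile region otherwise (edge remainders are ignored here)."""
    if tile is None:
        return bool(ir_cored.mean() >= threshold)
    h, w = ir_cored.shape
    cropped = ir_cored[:h - h % tile, :w - w % tile]
    tiles = cropped.reshape(h // tile, tile, w // tile, tile)
    return tiles.mean(axis=(1, 3)) >= threshold
```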
  • Also, for example, in the camera system 1 of FIG. 1, each unit may exchange data directly without going through the buffer 29. Conversely, for example, in the camera system 201 of FIG. 8, each unit may exchange data via a buffer, as in the camera system 1 of FIG. 1.
  • Furthermore, the present technology can be applied to both moving images and still images.
  • The series of processes described above can be executed by hardware or by software.
  • When the series of processes is executed by software, a program constituting the software is installed in a computer.
  • Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.
  • FIG. 10 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processes using a program.
  • In the computer, a CPU (Central Processing Unit) 401, a ROM (Read Only Memory) 402, and a RAM (Random Access Memory) 403 are connected to one another via a bus 404.
  • An input/output interface 405 is further connected to the bus 404.
  • An input unit 406, an output unit 407, a storage unit 408, a communication unit 409, and a drive 410 are connected to the input/output interface 405.
  • The input unit 406 includes a keyboard, a mouse, a microphone, and the like.
  • The output unit 407 includes a display, a speaker, and the like.
  • The storage unit 408 includes a hard disk, a nonvolatile memory, and the like.
  • The communication unit 409 includes a network interface.
  • The drive 410 drives a removable medium 411 such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory.
  • In the computer configured as described above, the CPU 401 loads, for example, a program stored in the storage unit 408 into the RAM 403 via the input/output interface 405 and the bus 404 and executes it, whereby the above-described series of processes is performed.
  • The program executed by the computer (CPU 401) can be provided by being recorded on the removable medium 411 as a package medium, for example.
  • The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • In the computer, the program can be installed in the storage unit 408 via the input/output interface 405 by mounting the removable medium 411 on the drive 410.
  • The program can also be received by the communication unit 409 via a wired or wireless transmission medium and installed in the storage unit 408.
  • Alternatively, the program can be installed in the ROM 402 or the storage unit 408 in advance.
  • Note that the program executed by the computer may be a program whose processing is performed in time series in the order described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
  • In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and a single device in which a plurality of modules are housed in one housing, are both systems.
  • For example, the present technology can take a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.
  • Each step described in the above flowchart can be executed by one device or shared among a plurality of devices.
  • Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared among a plurality of devices.
  • Note that the present technology can also take the following configurations.
  • (1) An image processing apparatus including: a separation unit that separates an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal unit that removes noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection unit that detects, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.
  • (2) The image processing apparatus according to (1), in which the separation unit further separates a visible light component from the input image, and the noise removal unit sets a subtraction amount, the amount by which the pixel values of the first image are reduced, based on at least one of the luminance around the AR marker in a second image based on the visible light component of the input image and the ambient temperature of the image processing apparatus.
  • (3) The image processing apparatus according to (2), in which the AR marker shows the pattern by reflecting invisible light, the apparatus further including an irradiation control unit that controls the intensity of the invisible light irradiating the AR marker based on at least one of the luminance around the AR marker in the second image, the subtraction amount, and the detection result of the AR marker.
  • (5) The image processing apparatus according to any one of (1) to (3), further including an imaging element that includes a plurality of types of pixels detecting an invisible light component and visible light components of mutually different wavelength bands, and that captures the input image.
  • (6) The image processing apparatus according to any one of (1) to (5), in which the separation unit further separates a visible light component from the input image, the apparatus further including an additional information superimposing unit that superimposes additional information corresponding to the AR marker on a second image based on the visible light component of the input image.
  • (8) A program for causing a computer to execute processing including: a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection step of detecting, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • General Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Studio Devices (AREA)

Abstract

This technology relates to an image processing device, an image processing method, and a program which make it possible to increase the flexibility of placement of an AR marker without decreasing the detection accuracy of the AR marker. An image processing unit is provided with: a separation unit for separating an invisible light component from an input image including a visible light component and the invisible light component; a noise elimination unit for eliminating noise by decreasing the pixel value of a first image based on the invisible light component of the input image; and a marker detection unit for detecting an augmented reality (AR) marker indicating a predetermined pattern by invisible light in the first image from which the noise is eliminated. This technology is applicable, for example, to a camera that uses AR technology.

Description

Image processing apparatus, image processing method, and program

The present technology relates to an image processing device, an image processing method, and a program, and more particularly to an image processing device, an image processing method, and a program suitable for detecting an AR (Augmented Reality) marker.

One AR (Augmented Reality) technology is marker-type AR, which superimposes and displays additional information at the position where an AR marker of a predetermined pattern is detected (see, for example, Patent Document 1).

JP 2009-266096 A

When an AR marker is placed in real space, its position may be restricted because it can spoil the scenery or the design.

The present technology has been made in view of such a situation, and aims to increase the freedom of AR marker placement without degrading AR marker detection accuracy.
An image processing apparatus according to one aspect of the present technology includes a separation unit that separates an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal unit that removes noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection unit that detects, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.

The separation unit may further separate a visible light component from the input image, and the noise removal unit may set a subtraction amount, the amount by which the pixel values of the first image are reduced, based on at least one of the luminance around the AR marker in a second image based on the visible light component of the input image and the ambient temperature of the image processing apparatus.

The AR marker may show the pattern by reflecting invisible light, and an irradiation control unit may further be provided that controls the intensity of the invisible light irradiating the AR marker based on at least one of the luminance around the AR marker in the second image, the subtraction amount, and the detection result of the AR marker.

An imaging element that includes first pixels detecting a visible light component and an invisible light component and second pixels detecting only an invisible light component, and that captures the input image, may further be provided.

An imaging element that includes a plurality of types of pixels detecting an invisible light component and visible light components of mutually different wavelength bands, and that captures the input image, may further be provided.

The separation unit may further separate a visible light component from the input image, and an additional information superimposing unit may further be provided that superimposes additional information corresponding to the AR marker on a second image based on the visible light component of the input image.

An image processing method according to one aspect of the present technology includes a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component; a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image; and a marker detection step of detecting, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.

A program according to one aspect of the present technology causes a computer to execute processing including a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component, a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image, and a marker detection step of detecting, in the noise-removed first image, an AR (Augmented Reality) marker showing a predetermined pattern with invisible light.

In one aspect of the present technology, an invisible light component is separated from an input image containing a visible light component and an invisible light component, noise is removed by reducing pixel values of a first image based on the invisible light component of the input image, and an AR (Augmented Reality) marker showing a predetermined pattern with invisible light is detected in the noise-removed first image.
According to one aspect of the present technology, the freedom of AR marker placement can be increased without degrading AR marker detection accuracy.

Note that the effects described here are not necessarily limiting, and any of the effects described in the present disclosure may be obtained.
FIG. 1 is a block diagram showing a first embodiment of a camera system to which the present technology is applied.
FIG. 2 is a diagram showing an arrangement example of the pixels of an image sensor.
FIG. 3 is a flowchart for explaining AR processing.
FIG. 4 is a diagram showing an example of an AR marker.
FIG. 5 is a diagram showing an example of a region in which luminance is detected.
FIG. 6 is an example of a graph for calculating the subtraction amount of pixel values with respect to the peripheral luminance.
FIG. 7 is an example of a graph for calculating the subtraction amount of pixel values with respect to the ambient temperature.
FIG. 8 is a block diagram showing a second embodiment of a camera system to which the present technology is applied.
FIG. 9 is an example of a graph for calculating the subtraction amount of pixel values with respect to the peripheral luminance or the ambient temperature.
FIG. 10 is a block diagram showing a configuration example of a computer.
Hereinafter, modes for carrying out the present technology (hereinafter referred to as embodiments) will be described. The description will be given in the following order.

1. First embodiment (in which the AR marker is irradiated with IR light)
2. Second embodiment (in which the AR marker emits IR light itself)
3. Modifications
<1. First Embodiment>

First, a first embodiment of the present technology will be described with reference to FIGS. 1 to 7. In the first embodiment, a marker-type AR using a colorless, transparent AR marker that reflects IR light (infrared light) is realized.

{Configuration example of camera system 1}

FIG. 1 shows a configuration example of a camera system 1 according to the first embodiment of the present technology.
The camera system 1 is a system capable of superimposing additional information on a captured image using AR technology. Specifically, the camera system 1 irradiates a colorless, transparent AR marker that reflects IR light with IR light to detect the marker, and superimposes additional information on the captured image based on the detected marker.

Note that the content of the additional information is not particularly limited. For example, the additional information includes information about the photographed subject, the shooting location, and so on.

The camera system 1 includes a band-pass filter 11, an image sensor 12, a temperature sensor 13, an image processing unit 14, and an IR light irradiation device 15.

Of the light from the subject, the band-pass filter 11 transmits light in a predetermined wavelength band including the visible light region and the IR light region, and blocks light in other wavelength bands.

The image sensor 12 receives the light from the subject that has passed through the band-pass filter 11 and generates an image composed of pixel signals indicating the intensity of the light incident on each pixel (hereinafter referred to as the input image). The image sensor 12 supplies the generated input image to the separation unit 21 of the image processing unit 14.

FIG. 2 shows an example arrangement of the pixels of the image sensor 12. In the image sensor 12, the pixels are arranged in a grid, with a pattern of 2 × 2 pixels as one unit. In this example, the pixels are arranged at a ratio of G : R : B : IR = 1 : 1 : 1 : 1. The major difference from a conventional Bayer-array image sensor is that an IR pixel takes the place of one of the two G pixels in each unit.

A G pixel mainly receives and photoelectrically converts a component in the G (green) wavelength band (hereinafter referred to as the G component) and a component in the IR wavelength band (hereinafter referred to as the IR light component); that is, a G pixel can detect the G component and the IR light component. Likewise, an R pixel can detect a component in the R (red) wavelength band (the R component) together with the IR light component, and a B pixel can detect a component in the B (blue) wavelength band (the B component) together with the IR light component. An IR pixel receives and photoelectrically converts only the IR light component; that is, an IR pixel can detect only the IR light component.
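To make the layout concrete, the following is a minimal sketch of such an RGB-IR color filter array in Python with NumPy. Which corner of the 2 × 2 unit holds which filter is not specified in the text, so the particular placement below is an assumption.

import numpy as np

# Assumed placement within the 2 x 2 unit; only the 1:1:1:1 ratio is
# given in the text, so this particular arrangement is illustrative.
UNIT = np.array([["R", "G"],
                 ["IR", "B"]])

def cfa_pattern(height: int, width: int) -> np.ndarray:
    """Tile the 2 x 2 unit so each pixel is labeled with its filter type."""
    reps = ((height + 1) // 2, (width + 1) // 2)
    return np.tile(UNIT, reps)[:height, :width]

print(cfa_pattern(4, 4))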
The temperature sensor 13 measures the temperature around the camera system 1 and supplies the measurement result to the coring unit 24 of the image processing unit 14. The temperature sensor 13 is disposed, for example, in the vicinity of the image sensor 12.

The image processing unit 14 performs image processing, such as AR processing, on the input image supplied from the image sensor 12. The image processing unit 14 includes a separation unit 21, a visible light image processing unit 22, a luminance calculation unit 23, a coring unit (noise removal unit) 24, a threshold determination unit 25, a marker detection unit 26, an additional information superimposing unit 27, an IR light control unit 28, and a buffer 29.

The separation unit 21 separates the visible light component and the IR light component of each pixel of the input image. The separation unit 21 then supplies a mosaic image based on the visible light components to the visible light image processing unit 22, and stores an image based on the IR light components (hereinafter referred to as the IR light image) in the buffer 29.

The visible light image processing unit 22 performs predetermined image processing, such as demosaic processing, on the mosaic image to generate an RGB image in which each pixel has R, G, and B component pixel values. The visible light image processing unit 22 supplies the generated RGB image to the luminance calculation unit 23 and the additional information superimposing unit 27.

The luminance calculation unit 23 calculates the luminance around the AR marker in the RGB image (hereinafter referred to as the peripheral luminance) and stores the result in the buffer 29.

The coring unit 24 reads the peripheral luminance and the IR light image from the buffer 29, and removes noise from the IR light image by performing coring processing on it based on the peripheral luminance and the ambient temperature of the camera system 1. The coring unit 24 stores the cored IR light image in the buffer 29.

The threshold determination unit 25 compares the pixel value of each pixel of the cored IR light image stored in the buffer 29 with a predetermined threshold. It then generates a binarized image in which the value of each pixel determined to be greater than or equal to the threshold is set to 1 and the value of each pixel determined to be below the threshold is set to 0, and stores the binarized image in the buffer 29.

The marker detection unit 26 detects the AR marker using the binarized image stored in the buffer 29. The marker detection unit 26 stores the detection result in the buffer 29 and also supplies it to the IR light control unit 28.

The additional information superimposing unit 27 superimposes additional information corresponding to the AR marker on the RGB image based on the detection result stored in the buffer 29, and outputs the resulting RGB image to a subsequent device or the like.

The IR light control unit 28 controls the amount of IR light emitted from the IR light irradiation device 15 based on the AR marker detection result and the peripheral luminance stored in the buffer 29.

The buffer 29 is configured by a memory such as a RAM.

Under the control of the IR light control unit 28, the IR light irradiation device 15 irradiates a region including the imaging range of the camera system 1 with IR light in a predetermined wavelength band.
{Shooting process}
Next, the AR processing executed by the camera system 1 will be described with reference to the flowchart of FIG. 3.
In step S1, the camera system 1 performs shooting. Specifically, under the control of the IR light control unit 28, the IR light irradiation device 15 emits IR light in the shooting direction of the camera system 1. Meanwhile, light from the subject enters the band-pass filter 11. This incident light includes the IR light from the IR light irradiation device 15 reflected by the subject.

Of the incident light from the subject, the band-pass filter 11 transmits light in a predetermined wavelength band including the visible light region and the IR light region, and blocks light in other wavelength bands. The light that has passed through the band-pass filter 11, that is, the light containing only the visible light components and IR light components of the light from the subject, enters the image sensor 12 and forms an image on its light receiving surface.

The image sensor 12 photoelectrically converts the light imaged on its light receiving surface into an electrical signal corresponding to the light intensity at each pixel, and generates an input image composed of pixel signals obtained by AD-converting the electrical signal of each pixel. The image sensor 12 supplies the generated input image to the separation unit 21.

As described above with reference to FIG. 2, the image sensor 12 has R, G, B, and IR pixels arranged in a grid. The pixel signal output from an R pixel contains the R component and the IR light component, the pixel signal output from a G pixel contains the G component and the IR light component, the pixel signal output from a B pixel contains the B component and the IR light component, and the pixel signal output from an IR pixel contains only the IR light component.

In the following, it is assumed that the AR marker 101 shown in FIG. 4 is placed within the shooting range of the camera system 1. The AR marker 101 is divided into a reflective portion 101A, shown filled in black in the figure, and a non-reflective portion 101B, shown in white. The reflective portion 101A is, for example, a portion coated with a colorless, transparent paint that has a high reflectance for IR light. The non-reflective portion 101B is a portion not coated with that paint.

Accordingly, when the AR marker 101 is irradiated with IR light, the IR light is reflected only by the reflective portion 101A and enters the camera system 1. As described later, the camera system 1 therefore recognizes the shapes of the reflective portion 101A and the non-reflective portion 101B as the pattern of the AR marker 101.

Moreover, since both the reflective portion 101A and the non-reflective portion 101B are colorless and transparent, the AR marker 101 cannot be seen by the human eye. Therefore, even if AR markers 101 are placed in various locations, the scenery and design are not impaired.

In step S2, the separation unit 21 separates the visible light components and the IR light components. That is, the separation unit 21 separates each pixel signal component of the input image into a visible light component and an IR light component.

For example, the separation unit 21 extracts the R component by subtracting the IR light component contained in the pixel signal of a neighboring IR pixel from the pixel signal of an R pixel. The separation unit 21 then extracts the IR light component by subtracting the extracted R component from the pixel signal of the R pixel. In this way, the R component (visible light component) and the IR light component of the R pixel's signal are separated.

In the same way, the G component (visible light component) and the IR light component of each G pixel's signal are separated, and the B component (visible light component) and the IR light component of each B pixel's signal are separated.

The separation unit 21 then generates an IR light image whose pixel values correspond to the intensities of the IR light components of the R, G, B, and IR pixel signals, and stores it in the buffer 29. The separation unit 21 also generates a mosaic image whose pixel values correspond to the intensities of the R component of each R pixel's signal, the G component of each G pixel's signal, the B component of each B pixel's signal, and the visible light component of each IR pixel's signal (which is, in practice, 0), and supplies it to the visible light image processing unit 22.
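As an illustrative sketch of this separation step, the code below splits a raw RGB-IR mosaic into a visible-light mosaic and an IR light image. The nearest_ir() helper is hypothetical: the text says the IR component of a neighboring IR pixel is used but does not specify the interpolation, so a nearest-neighbor lookup stands in for it here.

import numpy as np

def nearest_ir(raw, cfa, y, x):
    """Value of the IR pixel closest to (y, x): a crude stand-in for
    whatever neighborhood interpolation the actual device performs."""
    ys, xs = np.where(cfa == "IR")
    i = np.argmin((ys - y) ** 2 + (xs - x) ** 2)
    return raw[ys[i], xs[i]]

def separate_components(raw, cfa):
    """Split a raw RGB-IR mosaic into a visible mosaic and an IR light image.

    raw : H x W float array of pixel signals (visible + IR mixed)
    cfa : H x W array labeling each pixel "R", "G", "B", or "IR"
    """
    visible = np.zeros_like(raw)
    ir_image = np.zeros_like(raw)
    h, w = raw.shape
    for y in range(h):
        for x in range(w):
            if cfa[y, x] == "IR":
                ir_image[y, x] = raw[y, x]  # an IR pixel carries IR only
            else:
                ir_est = nearest_ir(raw, cfa, y, x)
                color = max(raw[y, x] - ir_est, 0.0)  # e.g. R = (R + IR) - IR
                visible[y, x] = color
                ir_image[y, x] = raw[y, x] - color    # remainder is IR
    return visible, ir_image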
In step S3, the visible light image processing unit 22 performs visible light image processing. For example, it performs demosaic processing on the mosaic image, generating an RGB image by interpolating the R, G, and B component pixel values of each pixel. The visible light image processing unit 22 also applies, for example, processing ordinarily applied to RGB images, such as white balance processing and noise removal. The visible light image processing unit 22 then supplies the RGB image to the luminance calculation unit 23 and the additional information superimposing unit 27.

In step S4, the luminance calculation unit 23 calculates the luminance around the AR marker 101. Specifically, the luminance calculation unit 23 calculates the luminance component (Y component) of each pixel from its R, G, and B pixel values using a predetermined conversion formula, and then calculates the peripheral luminance around the AR marker 101 based on the luminance components of the pixels of the RGB image.

For example, the luminance calculation unit 23 may calculate the average luminance component over all pixels of the RGB image as the peripheral luminance (global luminance).

Alternatively, as shown in FIG. 5, the luminance calculation unit 23 may calculate the average luminance component of the pixels within a rectangular luminance detection region 112 around the AR marker 101 in the RGB image 111 as the peripheral luminance. The latter approach makes it possible to detect the luminance around the AR marker 101 more accurately.

The position of the luminance detection region 112 (for example, its coordinates (X, Y), width W, and height H) may be set by the user or set automatically by the camera system 1.

When the user sets the region, the AR marker 101 is invisible as described above, so it is usually difficult to place the luminance detection region 112 so that it contains the marker. However, if, for example, the object or place where the AR marker 101 is located is known in advance, setting the luminance detection region 112 to contain that object or place ensures that the region contains the AR marker 101.

When the camera system 1 sets the region automatically, the outer edge of the AR marker 101 is rectangular, so it is comparatively easy to detect the marker's outline, and thus its position, using edge components or the like. Therefore, before the detailed detection processing of the AR marker 101 in step S8 described later, the luminance calculation unit 23 may, for example, detect only the position of the AR marker 101 and set a region containing it as the luminance detection region 112.

Alternatively, the luminance calculation unit 223 in the second embodiment aside, the luminance calculation unit 23 may set the luminance detection region 112 based on the detection result of the AR marker 101 in the previous frame.

The luminance calculation unit 23 then stores the calculated peripheral luminance in the buffer 29.
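A minimal sketch of the peripheral luminance calculation: the ITU-R BT.601 luma weights are assumed for the "predetermined conversion formula" (the text does not name a specific one), and the detection region is a caller-supplied rectangle.

import numpy as np

def peripheral_luminance(rgb, region=None):
    """rgb: H x W x 3 float array; region: (x, y, w, h) or None for the
    global (whole-image) luminance."""
    # Assumed conversion formula: BT.601 luma weights.
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    if region is not None:
        x0, y0, w, h = region
        y = y[y0:y0 + h, x0:x0 + w]  # restrict to detection region 112
    return float(y.mean())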
In step S5, the temperature sensor 13 measures the temperature around the camera system 1 and supplies the measurement result to the coring unit 24.

In step S6, the coring unit 24 performs coring processing. Specifically, the coring unit 24 reads the IR light image and the peripheral luminance from the buffer 29, and corrects each pixel value of the IR light image based on the peripheral luminance and the ambient temperature of the camera system 1.

Specifically, the coring unit 24 calculates the amount to be subtracted from each pixel value of the IR light image (hereinafter referred to as the luminance-based subtraction amount) from the peripheral luminance, using, for example, the graph shown in FIG. 6. In this graph, the horizontal axis represents the luminance (peripheral luminance) and the vertical axis represents the subtraction amount (luminance-based subtraction amount).

In this example, where the peripheral luminance is below YL, the luminance-based subtraction amount is fixed at the minimum value SYL. Where the peripheral luminance is between YL and YH inclusive, the luminance-based subtraction amount increases linearly from the minimum SYL to the maximum SYH, in proportion to the peripheral luminance. Where the peripheral luminance exceeds YH, the luminance-based subtraction amount is fixed at the maximum value SYH. That is, because the noise component of the IR light image increases as the peripheral luminance rises, the luminance-based subtraction amount is set to grow accordingly.

The coring unit 24 likewise calculates a subtraction amount for each pixel value of the IR light image based on the ambient temperature of the camera system 1 (hereinafter referred to as the temperature-based subtraction amount), using, for example, the graph shown in FIG. 7. In this graph, the horizontal axis represents the ambient temperature of the camera system 1 and the vertical axis represents the subtraction amount (temperature-based subtraction amount).

In this example, where the ambient temperature is below TL, the temperature-based subtraction amount is fixed at the minimum value STL. Where the ambient temperature is between TL and TH inclusive, the temperature-based subtraction amount increases linearly from the minimum STL to the maximum STH, in proportion to the ambient temperature. Where the ambient temperature exceeds TH, the temperature-based subtraction amount is fixed at the maximum value STH. That is, because the noise component of the IR light image increases as the ambient temperature rises, the temperature-based subtraction amount is set to grow accordingly.

Next, the coring unit 24 calculates the final subtraction amount (hereinafter referred to as the total subtraction amount) from the luminance-based subtraction amount and the temperature-based subtraction amount. For example, the coring unit 24 may calculate the total subtraction amount by adding the two amounts, or by adding them with weights.

The coring unit 24 then corrects the IR light image by subtracting the total subtraction amount from each pixel value, thereby reducing each pixel value. In this way, the pixel values are reduced according to the amount of noise arising in the IR light image, and the noise component of the IR light image is appropriately reduced. The coring unit 24 stores the cored IR light image in the buffer 29.
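A minimal sketch of this coring step, assuming the piecewise-linear curves of FIGS. 6 and 7 and weighted addition of the two amounts. The breakpoints YL/YH/TL/TH, the minimum and maximum amounts, and the weights are device-tuning parameters, so the defaults below are placeholders; clamping the result at zero is also an assumption, since the text only says the pixel values are reduced.

import numpy as np

def ramp(x, lo, hi, s_min, s_max):
    """Piecewise-linear curve of FIGS. 6/7: flat at s_min below lo,
    linear between lo and hi, flat at s_max above hi."""
    return float(np.clip(s_min + (x - lo) * (s_max - s_min) / (hi - lo),
                         s_min, s_max))

def coring(ir_image, luminance, temperature,
           YL=32.0, YH=192.0, SYL=1.0, SYH=8.0,   # placeholder tuning values
           TL=10.0, TH=50.0, STL=1.0, STH=8.0,
           w_lum=1.0, w_temp=1.0):
    s_lum = ramp(luminance, YL, YH, SYL, SYH)     # luminance-based amount
    s_temp = ramp(temperature, TL, TH, STL, STH)  # temperature-based amount
    total = w_lum * s_lum + w_temp * s_temp       # (weighted) total amount
    return np.clip(ir_image - total, 0.0, None)   # subtract; clamp at 0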
In step S7, the threshold determination unit 25 performs threshold determination processing. Specifically, it reads the cored IR light image from the buffer 29 and determines, for each pixel, whether the pixel value is greater than or equal to a predetermined threshold.

The threshold determination unit 25 then sets to 1 the value of each pixel whose value is greater than or equal to the threshold, in other words, each pixel whose cored IR light component has at least the predetermined intensity. It sets to 0 the value of each pixel whose value is below the threshold, in other words, each pixel whose cored IR light component is weaker than the predetermined intensity. This produces a binarized image, which the threshold determination unit 25 stores in the buffer 29.

Note that the threshold used in this processing may be adjusted based on, for example, the peripheral luminance, the ambient temperature of the camera system 1, and the like.
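The binarization itself is a one-line operation; in the sketch below the threshold value is a placeholder, and the optional luminance/temperature adjustment mentioned above is left to the caller.

import numpy as np

def binarize(cored_ir, threshold=16.0):
    """Return a 0/1 image: 1 where the cored IR light component is at
    least the (placeholder) threshold intensity, 0 elsewhere."""
    return (cored_ir >= threshold).astype(np.uint8)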
In step S8, the marker detection unit 26 performs detection processing for the AR marker 101. Specifically, it reads the binarized image from the buffer 29 and detects the AR marker 101 by searching the binarized image for an image with the same shape as the reflective portion 101A of the AR marker 101, using a predetermined technique such as pattern matching.
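As one concrete reading of "pattern matching", the sketch below slides a binary template of the reflective portion over the binarized image and reports positions where the fraction of matching pixels reaches a placeholder tolerance. A real detector would also have to handle scale, rotation, and perspective, which are omitted here.

import numpy as np

def find_marker(binary, template, min_score=0.95):
    """Exhaustive binary template matching; returns (x, y, score) hits."""
    th, tw = template.shape
    h, w = binary.shape
    hits = []
    for y in range(h - th + 1):
        for x in range(w - tw + 1):
            score = np.mean(binary[y:y + th, x:x + tw] == template)
            if score >= min_score:
                hits.append((x, y, float(score)))
    return hits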
In step S9, the marker detection unit 26 determines whether the AR marker 101 has been detected, based on the result of step S8. If it is determined that the AR marker 101 has been detected, the processing proceeds to step S10.

In step S10, the image processing unit 14 performs additional information superimposition processing. Specifically, the marker detection unit 26 stores the detection result of the AR marker 101 in the buffer 29. This detection result includes, for example, the coordinates of the position at which the AR marker 101 was detected in the binarized image.

The additional information superimposing unit 27 reads the detection result from the buffer 29 and superimposes the additional information corresponding to the AR marker 101 near the position in the RGB image at which the marker was detected. The additional information superimposing unit 27 then outputs the RGB image with the additional information superimposed and supplies it to a subsequent device or the like. As a result, the RGB image with the additional information superimposed is displayed on, for example, the subsequent device.

Thereafter, the processing returns to step S1, and the processing from step S1 onward is executed again. That is, the AR processing described above is performed for each frame of the input image.

If, on the other hand, it is determined in step S9 that the AR marker 101 could not be detected, the processing proceeds to step S11.

In step S11, the image processing unit 14 adjusts the IR light irradiation amount. Specifically, the marker detection unit 26 notifies the IR light control unit 28 that the AR marker 101 could not be detected.

The IR light control unit 28 reads the peripheral luminance from the buffer 29 and calculates an appropriate IR light amount based on it. For example, the IR light control unit 28 decreases the IR light amount when the peripheral luminance is high and increases it when the peripheral luminance is low. Note that it is desirable not to raise the IR light amount too far, in order to suppress increased power consumption and the noise caused by diffuse reflection of the IR light. The IR light control unit 28 then controls the IR light irradiation device 15 so that the amount of IR light it emits matches the calculated amount.

In this way, the IR light amount is set to an appropriate value based on the peripheral luminance, and the detection accuracy of the AR marker 101 improves.
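One way the adjustment could look: a simple inverse mapping from peripheral luminance to lamp power, capped below full power against over-driving. The mapping and its constants are not specified in the text and are placeholders.

def ir_light_amount(luminance,
                    lum_min=0.0, lum_max=255.0,
                    power_min=0.2, power_max=0.8):
    """Map high peripheral luminance to low IR power and vice versa.
    power_max < 1.0 caps the output, limiting power draw and the noise
    from diffuse reflection, as recommended above."""
    t = (luminance - lum_min) / (lum_max - lum_min)
    t = min(max(t, 0.0), 1.0)
    return power_max - t * (power_max - power_min)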
Note that the coring processing in step S6 and the IR light amount adjustment in step S11 may be limited in how many times, or for how long, they are executed, so that hunting does not occur.

Thereafter, the processing returns to step S1, and the processing from step S1 onward is executed.

As described above, the detection accuracy of the AR marker 101 can be improved. As a result, additional information can be superimposed at an appropriate position in the RGB image and presented to the user.

In addition, since the AR marker 101 is colorless and transparent, it can be placed at a desired position without spoiling the scenery or design. That is, the freedom of AR marker placement improves. Moreover, because the AR marker 101 is colorless and transparent, the confidentiality of the information improves.

Furthermore, there is no need to capture images solely for detecting the AR marker; the AR marker can be detected while ordinary shooting is performed.

Also, compared with a markerless method that does not use AR markers, the processing load and the data required can be reduced.
<2. Second Embodiment>
Next, a second embodiment of the present technology will be described with reference to FIG. 8. In the second embodiment, marker-based AR is realized using an AR marker that itself emits IR light.
{Configuration example of camera system 201}
FIG. 8 shows a configuration example of a camera system 201 according to the second embodiment of the present technology. In the figure, parts corresponding to those in FIG. 1 are denoted by the same reference numerals, or by reference numerals with the same last two digits.
The camera system 201 has in common with the camera system 1 that it includes the band-pass filter 11, the image sensor 12, and the temperature sensor 13. It differs from the camera system 1 in that an image processing unit 214 is provided in place of the image processing unit 14, and in that no IR light irradiation device 15 is provided.

The image processing unit 214 has in common with the image processing unit 14 of the camera system 1 that it includes the visible light image processing unit 22 and the additional information superimposing unit 27. It differs in that a separation unit 221, a luminance calculation unit 223, a coring unit 224, a threshold determination unit 225, and a marker detection unit 226 are provided in place of the separation unit 21, the luminance calculation unit 23, the coring unit 24, the threshold determination unit 25, and the marker detection unit 26, and in that no IR light control unit 28 is provided.

The separation unit 221, the luminance calculation unit 223, the coring unit 224, the threshold determination unit 225, and the marker detection unit 226 perform basically the same processing as the separation unit 21, the luminance calculation unit 23, the coring unit 24, the threshold determination unit 25, and the marker detection unit 26 of the image processing unit 14. However, they exchange data with one another directly, without going through a buffer.

Specifically, like the separation unit 21, the separation unit 221 separates the visible light component and the IR light component of each pixel of the input image. The separation unit 221 then supplies the mosaic image made of the visible light components to the visible light image processing unit 22, and supplies the IR light image made of the IR light components to the luminance calculation unit 223.

Like the luminance calculation unit 23, the luminance calculation unit 223 calculates the luminance around the AR marker (the peripheral luminance) in the RGB image supplied from the visible light image processing unit 22. The luminance calculation unit 223 supplies the IR light image and the calculated peripheral luminance to the coring unit 224.

Like the coring unit 24, the coring unit 224 performs coring processing on the IR light image based on the peripheral luminance and the ambient temperature of the camera system 201 measured by the temperature sensor 13. The coring unit 224 supplies the cored IR light image to the threshold determination unit 225.

Like the threshold determination unit 25, the threshold determination unit 225 performs threshold determination processing on the cored IR light image. The threshold determination unit 225 supplies the binarized image generated by the threshold determination processing to the marker detection unit 226.

Like the marker detection unit 26, the marker detection unit 226 detects the AR marker using the binarized image. The marker detection unit 226 supplies the AR marker detection result to the additional information superimposing unit 27.

Note that, unlike the camera system 1, the camera system 201 is not provided with the IR light irradiation device 15. The camera system 201 therefore detects AR markers that emit IR light themselves, for example, an AR marker obtained by replacing the reflective portion 101A of the AR marker 101 in FIG. 4 described above with a light emitting portion that emits IR light.
<3. Modifications>
{Modifications relating to the coring processing}
The description above gave an example in which a common total subtraction amount is set for all pixels, but multiple total subtraction amounts may instead be used for different regions or pixels of the image. For example, the total subtraction amount may be set per region by dividing the IR light image into multiple regions, calculating the peripheral luminance for each region, and setting a luminance-based subtraction amount for each.
Also, for example, the total subtraction amount may be set per pixel by setting a luminance-based subtraction amount for each pixel based on the luminance component of that pixel of the IR light image.

Furthermore, the subtraction amount may be expressed not as an absolute value, as described above, but as a ratio by which the pixel value is reduced. For example, if the total subtraction amount is set to 10%, each pixel value of the IR light image may be corrected to be 10% smaller.

The subtraction amount graphs of FIGS. 6 and 7 are only examples, and other graphs may be used. For example, instead of the linearly varying graphs shown in FIGS. 6 and 7, a curved graph such as the one shown in FIG. 9 can be used. Such a curve may be, for example, a pseudo-curve obtained by setting several representative points and connecting them with straight lines, or a curve obtained by neighborhood interpolation using those representative points.
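A sketch of the representative-point variant, using NumPy's piecewise-linear interpolation as the pseudo-curve; the representative points themselves are tuning parameters, so the values below are placeholders.

import numpy as np

# Placeholder representative points: (luminance, subtraction amount) pairs.
REP_X = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
REP_Y = np.array([1.0, 1.5, 3.0, 6.0, 8.0])

def subtraction_amount(luminance):
    """Pseudo-curve of FIG. 9: straight segments between representative
    points, flat beyond the first and last points."""
    return float(np.interp(luminance, REP_X, REP_Y))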
Furthermore, for example, the subtraction amount may be set based on only one of the peripheral luminance and the ambient temperature.
{Modifications relating to the control of the IR light amount}
In the description above, the IR light control unit 28 controls the amount of IR light from the IR light irradiation device 15 based on the peripheral luminance, but the IR light amount may instead be controlled based on, for example, the subtraction amount of the coring processing. For example, when the subtraction amount is large, the peripheral luminance is presumably high and the noise large, so the IR light control unit 28 may lower the IR light amount; when the subtraction amount is small, the peripheral luminance is presumably low and the noise small, so it may raise the IR light amount.

Also, for example, the IR light control unit 28 may control the IR light amount based on both the peripheral luminance and the subtraction amount of the coring processing.

Furthermore, for example, when the AR marker is not detected, the IR light control unit 28 may raise or lower the IR light amount by a predetermined value, without using the peripheral luminance or the coring subtraction amount.

Also, for example, the IR light control unit 28 may control the IR light amount based on at least one of the peripheral luminance and the coring subtraction amount even when the AR marker is detected.
{Modifications relating to the image sensor 12}
The description above used the image sensor 12 containing IR pixels, as shown in FIG. 2, but the present technology can also use an image sensor that does not contain IR pixels. For example, a conventional Bayer-array image sensor can be used.

When a conventional Bayer-array image sensor is used, the R, G, and B pixels are all pixels capable of receiving and photoelectrically converting the IR light component. In addition, the system is configured so that IR light enters the image sensor, rather than being blocked by an IR cut filter or the like as in an ordinary camera.

When a Bayer-array image sensor is used, it is also possible to separate the visible light components and the IR light components by performing frequency analysis on the R, G, and B pixel signals using a technique such as the FFT (fast Fourier transform), and separating the components based on the results of the analysis.

Furthermore, it is also possible to use, for example, an image sensor capable of detecting invisible light in a wavelength band different from IR light (for example, ultraviolet light). In that case, an AR marker that reflects or emits that invisible light is used.

It is also possible to use, for example, an image sensor capable of detecting visible light in wavelength bands other than the R, G, and B bands (for example, a yellow wavelength band).
{Other modifications}
In the description above, the camera system 1 superimposes the additional information on the RGB image before outputting it, but the RGB image and the AR marker detection result may instead be output so that a subsequent apparatus superimposes the additional information on the RGB image.

Also, for example, the threshold determination unit 25 (or the threshold determination unit 225) may determine whether the pixel values of the cored IR light image reach a level at which the AR marker 101 can be detected. For example, the threshold determination unit 25 determines that the AR marker 101 can be detected when the average pixel value of the cored IR light image is greater than or equal to a predetermined threshold, and that it cannot be detected when the average is below the threshold.

In this case, the threshold determination unit 25 may use the average pixel value of the entire IR light image or the average pixel value of each predetermined region. In the former case, whether the AR marker 101 is detectable is determined for the IR light image as a whole; in the latter case, it is determined for each region of the IR light image.
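A sketch of this detectability check under the assumptions above; the threshold and the tiling grid are placeholders.

import numpy as np

def marker_detectable(cored_ir, threshold=4.0, grid=None):
    """Whole-image check when grid is None; otherwise a per-region check
    on a (rows, cols) grid of tiles, returning a boolean per tile."""
    if grid is None:
        return bool(cored_ir.mean() >= threshold)
    rows, cols = grid
    h, w = cored_ir.shape
    tiles = cored_ir[:h - h % rows, :w - w % cols].reshape(
        rows, h // rows, cols, w // cols)
    return tiles.mean(axis=(1, 3)) >= threshold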
Furthermore, for example, in the camera system 1 of FIG. 1, the units may exchange data with one another directly, without going through the buffer 29, as in the camera system 201 of FIG. 8. Conversely, in the camera system 201 of FIG. 8, the units may exchange data via a buffer, as in the camera system 1 of FIG. 1.

The present technology can also be applied when shooting either moving images or still images.
{Configuration example of a computer}
The series of processing described above can be executed by hardware or by software. When the series of processing is executed by software, the program constituting the software is installed in a computer. Here, the computer includes a computer incorporated in dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed in it.
FIG. 10 is a block diagram showing an example hardware configuration of a computer that executes the series of processing described above by means of a program.

In the computer, a CPU (Central Processing Unit) 401, a ROM (Read Only Memory) 402, and a RAM (Random Access Memory) 403 are connected to one another by a bus 404.

An input/output interface 405 is further connected to the bus 404. An input unit 406, an output unit 407, a storage unit 408, a communication unit 409, and a drive 410 are connected to the input/output interface 405.

The input unit 406 includes a keyboard, a mouse, a microphone, and the like. The output unit 407 includes a display, a speaker, and the like. The storage unit 408 includes a hard disk, a nonvolatile memory, and the like. The communication unit 409 includes a network interface and the like. The drive 410 drives a removable medium 411 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.

In the computer configured as described above, the CPU 401 performs the series of processing described above by, for example, loading a program stored in the storage unit 408 into the RAM 403 via the input/output interface 405 and the bus 404 and executing it.

The program executed by the computer (CPU 401) can be provided by, for example, being recorded on the removable medium 411 as packaged media or the like. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.

In the computer, the program can be installed in the storage unit 408 via the input/output interface 405 by loading the removable medium 411 into the drive 410. The program can also be received by the communication unit 409 via a wired or wireless transmission medium and installed in the storage unit 408. Alternatively, the program can be installed in advance in the ROM 402 or the storage unit 408.

Note that the program executed by the computer may be a program whose processing is performed chronologically in the order described in this specification, or a program whose processing is performed in parallel or at necessary timings, such as when a call is made.

In this specification, a system means a set of multiple components (devices, modules (parts), and so on), and it does not matter whether all the components are in the same housing. Accordingly, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in one housing, are both systems.
Furthermore, the embodiments of the present technology are not limited to those described above, and various modifications can be made without departing from the gist of the present technology.

For example, the present technology can adopt a cloud computing configuration in which one function is shared and processed jointly by multiple devices via a network.

Each step described in the flowchart above can be executed by a single device or shared among multiple devices.

Furthermore, when a single step includes multiple processes, the multiple processes included in that step can be executed by a single device or shared among multiple devices.

The effects described in this specification are merely examples and are not limiting; other effects may also be obtained.

Furthermore, for example, the present technology can also adopt the following configurations.
(1)
An image processing device including:
a separation unit that separates an invisible light component from an input image containing a visible light component and an invisible light component;
a noise removal unit that removes noise by reducing the pixel values of a first image based on the invisible light component of the input image; and
a marker detection unit that detects, in the first image from which the noise has been removed, an AR (Augmented Reality) marker that shows a predetermined pattern by means of invisible light.
(2)
The image processing device according to (1), in which
the separation unit further separates the visible light component from the input image, and
the noise removal unit sets a subtraction amount, which is the amount by which the pixel values of the first image are reduced, based on at least one of the luminance around the AR marker in a second image based on the visible light component of the input image and the temperature around the image processing device.
(3)
The image processing device according to (2), in which
the AR marker shows the pattern by reflecting invisible light,
the image processing device further including an irradiation control unit that controls the intensity of the invisible light with which the AR marker is irradiated, based on at least one of the luminance around the AR marker in the second image, the subtraction amount, and the detection result of the AR marker.
(4)
The image processing device according to any one of (1) to (3), further including an imaging element that captures the input image and includes first pixels that detect a visible light component and an invisible light component and second pixels that detect only an invisible light component.
(5)
The image processing device according to any one of (1) to (3), further including an imaging element that captures the input image and includes multiple types of pixels that detect an invisible light component and visible light components of mutually different wavelength bands.
(6)
The image processing device according to any one of (1) to (5), in which
the separation unit further separates the visible light component from the input image,
the image processing device further including an additional information superimposing unit that superimposes additional information corresponding to the AR marker on a second image based on the visible light component of the input image.
(7)
An image processing method including:
a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component;
a noise removal step of removing noise by reducing the pixel values of a first image based on the invisible light component of the input image; and
a marker detection step of detecting, in the first image from which the noise has been removed, an AR (Augmented Reality) marker that shows a predetermined pattern by means of invisible light.
(8)
A program for causing a computer to execute processing including:
a separation step of separating an invisible light component from an input image containing a visible light component and an invisible light component;
a noise removal step of removing noise by reducing the pixel values of a first image based on the invisible light component of the input image; and
a marker detection step of detecting, in the first image from which the noise has been removed, an AR (Augmented Reality) marker that shows a predetermined pattern by means of invisible light.
1 camera system, 11 band-pass filter, 12 image sensor, 13 temperature sensor, 14 image processing unit, 15 IR light irradiation device, 21 separation unit, 22 visible light image processing unit, 23 luminance calculation unit, 24 coring unit, 25 threshold determination unit, 26 marker detection unit, 27 additional information superimposing unit, 28 IR light control unit, 101 AR marker, 101A reflective portion, 101B non-reflective portion, 201 camera system, 214 image processing unit, 221 separation unit, 223 luminance calculation unit, 224 coring unit, 225 threshold determination unit, 226 marker detection unit

Claims (8)

  1.  可視光成分と不可視光成分を含む入力画像から不可視光成分を分離する分離部と、
     前記入力画像の不可視光成分に基づく第1の画像の画素値を減らすことによりノイズを除去するノイズ除去部と、
     ノイズが除去された前記第1の画像において、不可視光により所定のパターンを示すAR(Augmented Reality)マーカを検出するマーカ検出部と
     を備える画像処理装置。
    A separation unit that separates the invisible light component from the input image including the visible light component and the invisible light component;
    A noise removing unit for removing noise by reducing a pixel value of the first image based on an invisible light component of the input image;
    An image processing apparatus comprising: a marker detection unit configured to detect an AR (Augmented Reality) marker indicating a predetermined pattern with invisible light in the first image from which noise is removed.
  2.  前記分離部は、前記入力画像から可視光成分をさらに分離し、
     前記ノイズ除去部は、前記入力画像の可視光成分に基づく第2の画像における前記ARマーカの周囲の輝度、又は、前記画像処理装置の周囲の温度の少なくとも一方に基づいて、前記第1の画像の画素値を減らす量である減算量を設定する
     請求項1に記載の画像処理装置。
    The separation unit further separates a visible light component from the input image;
    The noise removal unit is configured to determine the first image based on at least one of luminance around the AR marker in the second image based on a visible light component of the input image or temperature around the image processing device. The image processing apparatus according to claim 1, wherein a subtraction amount, which is an amount by which the pixel value of is reduced, is set.
  3. The image processing device according to claim 2, wherein
     the AR marker shows the pattern by reflecting invisible light, and
     the image processing device further includes an irradiation control unit that controls the intensity of the invisible light with which the AR marker is irradiated, based on at least one of the luminance around the AR marker in the second image, the subtraction amount, and the detection result of the AR marker.
  4. The image processing device according to claim 1, further including
     an image sensor that captures the input image and includes a first pixel that detects the visible light component and the invisible light component, and a second pixel that detects only the invisible light component.
  5. The image processing device according to claim 1, further including
     an image sensor that captures the input image and includes a plurality of types of pixels that detect an invisible light component and visible light components of mutually different wavelength bands.
  6. The image processing device according to claim 1, wherein
     the separation unit further separates the visible light component from the input image, and
     the image processing device further includes an additional information superimposing unit that superimposes additional information corresponding to the AR marker on a second image based on the visible light component of the input image.
  7. An image processing method including:
     a separation step of separating an invisible light component from an input image including a visible light component and the invisible light component;
     a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image; and
     a marker detection step of detecting, in the first image from which the noise has been removed, an AR (Augmented Reality) marker that shows a predetermined pattern with invisible light.
  8. A program for causing a computer to execute processing including:
     a separation step of separating an invisible light component from an input image including a visible light component and the invisible light component;
     a noise removal step of removing noise by reducing pixel values of a first image based on the invisible light component of the input image; and
     a marker detection step of detecting, in the first image from which the noise has been removed, an AR (Augmented Reality) marker that shows a predetermined pattern with invisible light.
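Claims 2 and 3 make the pipeline adaptive: the coring subtraction amount tracks the luminance around the marker and the device temperature, and the IR irradiation intensity is adjusted from the detection outcome. The following sketch shows one plausible shape of that control logic; every threshold and gain constant is an assumption for illustration, not a value from the publication.

```python
def choose_subtraction_amount(ambient_luminance, sensor_temperature):
    """Set the coring subtraction amount (cf. claim 2).

    Brighter surroundings leak more ambient IR into the sensor, and a hotter
    sensor produces more dark-current noise, so both push the amount up.
    All constants are illustrative assumptions.
    """
    amount = 4                                            # base for a dark, cool scene
    amount += int(ambient_luminance / 32)                 # more ambient light -> more leakage
    amount += max(0, int((sensor_temperature - 25) / 5))  # hotter sensor -> more noise
    return amount

def adjust_ir_intensity(current_intensity, marker_found, subtraction_amount,
                        max_intensity=255):
    """Feedback control of the IR emitter (cf. claim 3).

    If the marker was not detected, or heavy coring is eating into the
    signal, raise the emitted IR intensity; otherwise back off to save power.
    """
    if not marker_found or subtraction_amount > 16:
        return min(current_intensity + 10, max_intensity)
    return max(current_intensity - 2, 0)
```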
PCT/JP2016/052580 2015-02-09 2016-01-29 Image processing device, image processing method, and program WO2016129404A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015023414 2015-02-09
JP2015-023414 2015-02-09

Publications (1)

Publication Number Publication Date
WO2016129404A1 true WO2016129404A1 (en) 2016-08-18

Family

ID=56614361

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052580 WO2016129404A1 (en) 2015-02-09 2016-01-29 Image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2016129404A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008182350A (en) * 2007-01-23 2008-08-07 Fujifilm Corp Imaging apparatus and method
JP2010050757A (en) * 2008-08-21 2010-03-04 Kwansei Gakuin Information providing method and information providing system
JP2012239103A (en) * 2011-05-13 2012-12-06 Sony Corp Image processing device, image processing method, and program

Similar Documents

Publication Publication Date Title
EP1528797B1 (en) Image processing apparatus, image-taking system and image processing method
US10560670B2 (en) Imaging apparatus and imaging control method
WO2014185064A1 (en) Image processing method and system
US11099008B2 (en) Capture device assembly, three-dimensional shape measurement device, and motion detection device
US8199228B2 (en) Method of and apparatus for correcting contour of grayscale image
TW201544848A (en) Includes structured stereo imaging components for independent imagers of different wavelengths
JP2013013043A (en) Method of calculating lens shading correction factor and method and apparatus for correcting lens shading by using the method
JP2008099218A (en) Target detector
US20110310014A1 (en) Image pickup apparatus and projection type image display apparatus
EP3270586A1 (en) Image processing device, image processing method, and program
JP2004222231A (en) Image processing apparatus and image processing program
JP5771677B2 (en) Image processing apparatus, imaging apparatus, program, and image processing method
JP5718138B2 (en) Image signal processing apparatus and program
JP2011239067A (en) Image processor
CN108574796B (en) Digital camera method and apparatus optimized for computer vision applications
WO2018216500A1 (en) Device and method for image processing
WO2016129404A1 (en) Image processing device, image processing method, and program
JP6346431B2 (en) Image processing apparatus and image processing method
US11688046B2 (en) Selective image signal processing
JP5256236B2 (en) Image processing apparatus and method, and image processing program
JP2014158165A (en) Image processing device, image processing method, and program
US10812765B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP2012133587A (en) Image analysis device, image analysis method and program
US10306153B2 (en) Imaging apparatus, image sensor, and image processor
JP7324329B1 (en) Image processing method, device, electronic device and readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16749048

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16749048

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP