
WO2011158343A1 - Image processing method, program, image processing device, and imaging device - Google Patents

Image processing method, program, image processing device, and imaging device

Info

Publication number
WO2011158343A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual projection
projection plane
coordinate system
image processing
pixel
Prior art date
Application number
PCT/JP2010/060184
Other languages
French (fr)
Japanese (ja)
Inventor
上田 滋之
央樹 坪井
Original Assignee
コニカミノルタオプト株式会社 (Konica Minolta Opto, Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by コニカミノルタオプト株式会社 (Konica Minolta Opto, Inc.)
Priority to JP2012520202A priority Critical patent/JPWO2011158343A1/en
Priority to PCT/JP2010/060184 priority patent/WO2011158343A1/en
Publication of WO2011158343A1 publication Critical patent/WO2011158343A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/12 Panospheric to cylindrical image transformations

Definitions

  • The present invention relates to an image processing method, a program, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an image sensor via an optical system including a condensing lens.
  • Patent Document 1 discloses a prior-art correction method that corrects the distortion arising in an image captured with a short-focal-length lens, using correction parameters for that lens.
  • In the display device of Patent Document 2, which displays the periphery of a vehicle, image processing for showing data photographed with a wide-angle lens on the display corrects the image so that the rate of change of the image height increases when the incident angle is at or above a predetermined value and decreases when it is below that value.
  • Image processing of captured images obtained with lenses such as those of Patent Documents 1 and 2 requires many correction steps, such as shading correction and distortion correction. When the image processing apparatus is implemented in hardware, this lengthens the processing time, enlarges the circuit scale, and raises the cost.
  • In view of these problems, the present invention aims to provide an image processing method, a program, an image processing apparatus, and an imaging apparatus capable of shortening the processing time with a relatively small circuit.
  • An image processing method for obtaining distortion-corrected image data using a plurality of pixel data obtained by light received, via an optical system, on an image sensor having a plurality of pixels
  • a first step of setting the position and size of the virtual projection plane in the world coordinate system based on a user instruction;
  • An image processing method comprising:
  • Image data are calculated by computing the corresponding position on the image sensor surface from the incident angle θ with respect to the optical axis of the optical system at each pixel position on the virtual projection plane; the image data of the pixel on the virtual projection plane are obtained from the pixel data of the pixel at the calculated position. (The image processing method according to any one of items 1 to 3.)
  • An image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by light received, via an optical system, on an image sensor having a plurality of pixels, comprising: an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of a virtual projection plane whose position and size have been set into the camera coordinate system using a distortion correction coefficient of the optical system, and that calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • An image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by light received, via an optical system, on an image sensor having a plurality of pixels, comprising:
  • a setting unit capable of setting the position and size of a virtual projection plane in the world coordinate system; and
  • an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using a distortion correction coefficient of the optical system, and that calculates image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • The image processing unit calculates the corresponding position on the image sensor surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data of the pixel on the virtual projection plane from the pixel data of the pixel at the calculated position. (The image processing apparatus according to item 6 or 7.)
  • A program for an image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by light received, via an optical system, on an image sensor having a plurality of pixels, the program causing a computer to function as an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of a virtual projection plane whose position and size have been set into the camera coordinate system using a distortion correction coefficient of the optical system, and that calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • A program for an image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by light received, via an optical system, on an image sensor having a plurality of pixels, the program causing a computer to function as: a setting unit capable of setting the position and size of a virtual projection plane in the world coordinate system; and an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using a distortion correction coefficient of the optical system, and that calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • An imaging apparatus comprising: an optical system; an image sensor having a plurality of pixels; a setting unit for setting the position and size of a virtual projection plane in the world coordinate system; and an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into the camera coordinate system using a distortion correction coefficient of the optical system, and that calculates image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
  • The imaging apparatus further has an operation unit operated by a user and a display unit, and the setting unit sets the position and size of the virtual projection plane in the world coordinate system based on an operation of the operation unit.
  • The image processing unit calculates the corresponding position on the image sensor surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, and obtains the image data of the pixel on the virtual projection plane from the pixel data of the pixel at the calculated position. (The imaging apparatus according to any one of items 11 to 13.)
  • The virtual projection plane set in the world coordinate system is converted into the camera coordinate system using the distortion correction coefficient, and the image data of the virtual projection plane are calculated based on the converted camera-coordinate-system coordinates and the pixel data of the image sensor.
  • FIG. 4A shows a main control flow
  • FIG. 4B shows a subroutine of step S20.
  • FIG. 5 is a diagram explaining the coordinates of the virtual projection plane VP.
  • FIG. 6 is a diagram showing the correspondence between the camera coordinate system xy and the image sensor surface IA.
  • An example in which two virtual projection planes VP are set is shown. This is an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the rotation center.
  • An example in which the virtual projection plane VP0 is rotated by roll is shown.
  • An example in which the virtual projection plane VP0 is rotated by pitch is shown.
  • An example in which the virtual projection plane VP0 is rotated by yaw is shown. This is an example in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the rotation center.
  • An example in which the virtual projection plane VP0 is rotated by a virtual pitch is shown.
  • An example in which the virtual projection plane VP0 is rotated by a virtual yaw is shown.
  • This is an example in which the position of the virtual projection plane VP is changed with the image center o in the camera coordinate system as the movement center.
  • An example in which the virtual projection plane VP0 is offset in the x direction is shown.
  • An example in which the virtual projection plane VP0 is offset in the y direction is shown.
  • FIG. 1 is a schematic diagram for explaining distortion correction according to the present embodiment.
  • X, Y, and Z are the axes of the world coordinate system, and the origin O is the lens center.
  • The Z axis coincides with the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O.
  • Point P is an object point of the object in the world coordinate system XYZ.
  • θ is the incident angle with respect to the optical axis (which coincides with the Z axis).
  • x and y are the axes of the camera coordinate system, and the xy plane corresponds to the image sensor surface IA.
  • o is the center of the image and is the intersection of the optical axis Z and the image sensor surface.
  • The point p is a point on the image sensor surface in the camera coordinate system; it is the object point P converted into the camera coordinate system using a distortion correction coefficient based on parameters derived from the lens characteristics (hereinafter, "lens parameters").
  • VP is a virtual projection plane.
  • the virtual projection plane VP is set on the opposite side of the imaging element (and imaging element surface IA) with respect to the lens position (lens center plane LC) of the optical system.
  • the virtual projection plane VP can be changed in position and size based on an instruction from the user to the operation unit 130 (see FIG. 3).
  • position change is a concept that includes not only the case where the virtual projection plane VP is translated on the XY plane, but also an angle change (also referred to as an attitude change) with respect to the XY plane.
  • In the initial state, the virtual projection plane VP is arranged with a predetermined size at a predetermined position (in the Z direction) parallel to the lens center plane LC (the XY directions), and the center ov of the virtual projection plane VP is located on the Z axis.
  • Gv is the point where the object point P is projected onto the virtual projection plane VP: the intersection of the straight line passing through the object point P and the lens center O with the virtual projection plane VP.
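The projection point Gv can be sketched as a ray-plane intersection. The following is a minimal illustration of this geometry, not the patent's implementation; NumPy and the function name are assumptions, and the lens center O is taken as the origin of the world coordinate system:

```python
import numpy as np

def project_to_virtual_plane(P, plane_point, plane_normal):
    """Intersect the ray from the lens center O (the origin) through the
    object point P with the virtual projection plane VP, giving Gv."""
    P = np.asarray(P, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    denom = P.dot(n)
    if abs(denom) < 1e-12:
        raise ValueError("ray is parallel to the virtual projection plane")
    t = np.asarray(plane_point, dtype=float).dot(n) / denom
    return t * P  # Gv = t * P lies on the plane

# Example: VP parallel to the XY plane at Z = 2 (normal along the optical axis)
Gv = project_to_virtual_plane(P=[1.0, 0.5, 4.0],
                              plane_point=[0.0, 0.0, 2.0],
                              plane_normal=[0.0, 0.0, 1.0])
```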
  • The virtual projection plane VP1 in FIG. 2 shows a state in which the virtual projection plane VP0 has been rotated in the XZ plane based on input from the operation unit 130.
  • FIG. 3 is a block diagram illustrating a schematic configuration of the imaging apparatus.
  • the imaging apparatus includes an imaging unit 110, a control device 100, a display unit 120, and an operation unit 130.
  • the imaging unit 110 includes a lens, an imaging element, and the like.
  • examples of the lens include a wide-angle lens and a fisheye lens.
  • the control device 100 includes an image processing unit 101, a setting unit 102, and a storage unit 103.
  • the setting unit 102 sets the position and size of the virtual projection plane VP based on an input instruction to the operation unit 130.
  • The image processing unit 101 creates a conversion table mapping each coordinate on the virtual projection plane into the camera coordinate system based on the set position and size of the virtual projection plane VP, processes the pixel data captured by the imaging unit 110 using this conversion table, and creates the image data to be displayed on the display unit 120.
  • The storage unit 103 stores a distortion correction coefficient calculated from the lens parameters of the lens, as well as the position and size of the virtual projection plane VP and the created conversion table.
  • the display unit 120 includes a display screen such as a liquid crystal display, and sequentially displays the image data created by the image processing unit 101 based on the pixel data captured by the imaging unit 110 on the display screen.
  • the operation unit 130 includes a keyboard, a mouse, or a touch panel arranged so as to be superimposed on the liquid crystal display of the display unit, and receives a user's input operation.
  • FIG. 4 is a diagram showing a control flow of the present embodiment.
  • FIG. 4A shows a main control flow
  • FIG. 4B shows a subroutine of step S20.
  • In step S10, the virtual projection plane VP is set. As described above, the setting unit 102 sets the position and size of the virtual projection plane VP in the world coordinate system in response to an input instruction to the operation unit 130.
  • Changing the position as shown in FIG. 2 changes the camera viewpoint displayed on the display unit 120 (corresponding to pan and tilt). If the distance from the lens center O is changed along with the position of the virtual projection plane VP, zooming in or out is performed; zooming can also be achieved by changing the size of the virtual projection plane VP. Specific examples of changing the position of the virtual projection plane VP are described later.
  • The virtual projection plane VP is divided into a number of pixels n based on the input size setting (or the default size value).
  • The number of pixels is preferably equal to or greater than the total number of display pixels (screen resolution) of the display unit 120.
  • In the following, both the number of pixels n and the total number of display pixels of the display unit 120 are fixed at 640 × 480 pixels (307,200 pixels in total).
  • the size of the virtual projection plane VP is fixed when the number of pixels n is fixed.
  • In step S20, distortion correction processing is performed, mainly by the image processing unit 101, based on the state of the virtual projection plane VP set in step S10.
  • This distortion correction process will be described with reference to FIG. 4B.
  • In step S21, the coordinates Gv(X, Y, Z) in the world coordinate system are acquired for each pixel Gv on the virtual projection plane VP.
  • FIG. 5 is a schematic diagram explaining the coordinate system. As shown in FIG. 5, the virtual projection plane VP, whose four corners are point A (0, 0, Za), point B (0, 479, Zb), point C (639, 479, Zc), and point D (639, 0, Zd), is divided at equal intervals into 640 × 480 pixels Gv (307,200 pixels in total), and the world coordinates of all the pixels Gv are obtained.
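The equal-interval division of the virtual projection plane can be sketched as bilinear interpolation between the four corner points. A minimal illustration (NumPy assumed; the corner coordinate values are made up for the example):

```python
import numpy as np

def virtual_plane_grid(A, B, C, D, cols=640, rows=480):
    """World coordinates of every pixel Gv on the virtual projection plane,
    interpolated at equal intervals from the four corner points
    A (top-left), B (bottom-left), C (bottom-right), D (top-right)."""
    A, B, C, D = (np.asarray(p, dtype=float) for p in (A, B, C, D))
    u = np.linspace(0.0, 1.0, cols)[None, :, None]   # left -> right
    v = np.linspace(0.0, 1.0, rows)[:, None, None]   # top -> bottom
    top = (1 - u) * A + u * D       # edge A -> D
    bottom = (1 - u) * B + u * C    # edge B -> C
    return (1 - v) * top + v * bottom  # shape (rows, cols, 3)

grid = virtual_plane_grid(A=[-3.2, -2.4, 5.0], B=[-3.2, 2.4, 5.0],
                          C=[3.2, 2.4, 5.0], D=[3.2, -2.4, 5.0])
```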
  • In step S22, the coordinates Gi(x′, y′) in the corresponding camera coordinate system on the image sensor surface IA are calculated from the world coordinates of each pixel Gv and the distortion correction coefficient of the imaging unit 110 stored in the storage unit 103. Specifically, the distortion correction coefficient calculated from the lens parameters of the optical system is stored in the storage unit 103, and the camera coordinates are calculated from this coefficient and the incident angle θ with respect to the optical axis Z obtained from the coordinates of each pixel Gv (see International Publication No. 2010/032720).
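Step S22 can be sketched as follows, assuming a radially symmetric lens whose image height is modeled as an odd polynomial in the incident angle θ. The coefficient values, pixel pitch, and function name are hypothetical stand-ins; the actual correction coefficients would come from the lens parameters:

```python
import numpy as np

def world_to_sensor(Gv, dist_coeffs, pixel_pitch, center):
    """Map a world-coordinate pixel Gv on the virtual projection plane to
    camera coordinates Gi = (x', y') on the image sensor surface IA."""
    X, Y, Z = (float(c) for c in Gv)
    theta = np.arctan2(np.hypot(X, Y), Z)  # incident angle vs. optical axis Z
    phi = np.arctan2(Y, X)                 # azimuth, preserved by the lens
    # Image height r(theta) as an odd polynomial: k1*theta + k2*theta^3 + ...
    r = sum(k * theta ** (2 * i + 1) for i, k in enumerate(dist_coeffs))
    x = center[0] + (r / pixel_pitch) * np.cos(phi)
    y = center[1] + (r / pixel_pitch) * np.sin(phi)
    return x, y

# On-axis ray maps to the image center o; off-axis rays move outward.
cx, cy = world_to_sensor([0.0, 0.0, 1.0], dist_coeffs=[2.0, -0.1],
                         pixel_pitch=0.01, center=(320.0, 240.0))
```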
  • FIG. 6 is a diagram showing a correspondence relationship between the camera coordinate system xy and the imaging element surface IA.
  • Points a to d are the points A to D of FIG. 5 converted into the camera coordinate system. Whereas the virtual projection plane VP surrounded by the points A to D is a rectangular plane, the area surrounded by the points a to d after conversion to the camera coordinate system has a distorted shape (corresponding to the position of the virtual projection plane VP).
  • The figure shows an example of barrel-shaped distortion, but depending on the characteristics of the optical system the distortion may instead be of pincushion type or jingasa type (a shape that is barrel-like at the center and straight or pincushion-like toward the edges).
  • In step S23, the pixel of the image sensor to be referenced is determined from the coordinates Gi(x′, y′) in the camera coordinate system.
  • While x and y in the coordinates (x, y) of each pixel of the image sensor are integers, x′ and y′ in the coordinates Gi(x′, y′) calculated in step S22 are not necessarily integers and can take real values with a fractional part.
  • When x′ and y′ are integers and the coordinates Gi(x′, y′) coincide with a pixel position on the image sensor, the pixel data of the corresponding sensor pixel are used directly as the image data of the pixel Gv on the virtual projection plane.
  • When they do not coincide, the image data of the pixel Gv are obtained by interpolation from the pixel data of the pixels surrounding the calculated coordinates Gi(x′, y′).
  • The number of surrounding pixels referred to is not limited to 4; it may be 1, or 16 or more.
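A four-point interpolation of the kind described above can be sketched as standard bilinear sampling (NumPy assumed; clamping at the sensor border is a simplification of this sketch, not something the patent specifies):

```python
import numpy as np

def sample_bilinear(img, x, y):
    """Pixel data at a non-integer sensor coordinate Gi = (x', y'),
    interpolated from the four surrounding pixels of the image sensor."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    x1 = min(x0 + 1, img.shape[1] - 1)  # clamp at the sensor border
    y1 = min(y0 + 1, img.shape[0] - 1)
    return ((1 - fx) * (1 - fy) * img[y0, x0] + fx * (1 - fy) * img[y0, x1]
            + (1 - fx) * fy * img[y1, x0] + fx * fy * img[y1, x1])
```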
  • Steps S21 to S23 are repeated pixel by pixel from the starting point A (0, 0, Za) of FIG. 5 to the lower-right end point C (639, 479, Zc), so that distortion-corrected image data can be acquired for all pixels. This is the control of the subroutine of step S20 shown in FIG. 4B.
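Taken together, the subroutine amounts to precomputing a conversion table once per virtual-projection-plane setting and then applying it to every captured frame. A rough sketch under those assumptions (nearest-neighbor lookup is used for brevity, and `world_to_camera` is a hypothetical stand-in for the step-S22 conversion):

```python
import numpy as np

def build_conversion_table(grid, world_to_camera):
    """Once per VP setting (steps S21-S22): sensor coordinates for every Gv."""
    rows, cols, _ = grid.shape
    table = np.empty((rows, cols, 2), dtype=np.float32)
    for r in range(rows):
        for c in range(cols):
            table[r, c] = world_to_camera(grid[r, c])
    return table

def apply_conversion_table(sensor_img, table):
    """Per frame (step S23): look up each pixel's sensor position
    (nearest-neighbor rounding here) to form the corrected image."""
    h, w = sensor_img.shape[:2]
    xs = np.clip(np.rint(table[..., 0]).astype(int), 0, w - 1)
    ys = np.clip(np.rint(table[..., 1]).astype(int), 0, h - 1)
    return sensor_img[ys, xs]
```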
  • In step S40 (corresponding to the "fourth step"), the image data acquired in step S20 are displayed on the display screen of the display unit 120. Steps S20 and S40 are performed sequentially, and the distortion-corrected image data based on the captured pixel data are displayed on the display unit 120 in real time.
  • In the present embodiment, by setting the position and size of the virtual projection plane VP, all processes, including panning, tilting, and zooming as well as distortion correction, can be handled in a single batch process. Consequently, the processing becomes lighter, and the processing time can be shortened with a relatively small circuit.
  • ASIC (Application-Specific Integrated Circuit)
  • FIG. 7 is a schematic diagram for explaining distortion correction according to the second embodiment.
  • In the embodiment above, a single virtual projection plane is used, but the number of virtual projection planes is not limited to one and may be two or more.
  • In that case, the image data obtained at the positions of the respective virtual projection planes are switched and displayed by time division or by user instruction, or the display screen is divided and they are displayed simultaneously.
  • The second embodiment shown in FIG. 7 is an example in which two virtual projection planes are set. Except for the configuration shown in FIG. 7, it is the same as the embodiment described above.
  • FIG. 7 shows an example in which two virtual projection planes VPh and VPj are set. Both can set the position and size independently.
  • The ranges on the image sensor surface IA corresponding to the virtual projection planes VPh and VPj are the areas h and j, and the points corresponding to the object points P1 and P2 are the points p1 and p2.
  • Image data is calculated by the flow shown in FIG. 4B for each of the set virtual projection planes VPh and VPj, and each of them is individually displayed on the display unit 120 by the process of step S40 of FIG. 4A. Displayed on the screen.
  • [Specific Examples of Changing the Position of the Virtual Projection Plane VP] FIGS. 8 to 11 are examples in which the position of the virtual projection plane VP0 is changed with the image center o of the camera coordinate system as the rotation center (or movement center). As shown in FIG. 8, with the image center o as the rotation center, rotation about the x axis is pitch (also called tilt), rotation about the y axis is yaw (also called pan), and rotation about the Z axis is roll.
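These rotations can be sketched with standard rotation matrices about the camera axes. This is a common convention, not necessarily the patent's exact parameterization; every pixel Gv of VP0 would be transformed as Gv_new = R @ Gv:

```python
import numpy as np

def rotation_matrix(pitch=0.0, yaw=0.0, roll=0.0):
    """Rotation about the camera axes: pitch about x, yaw about y,
    roll about the optical axis Z (angles in radians)."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch (tilt)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw (pan)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
    return Rz @ Ry @ Rx
```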
  • FIGS. 9, 10, and 11 show examples in which the virtual projection plane VP0 is rotated by roll, pitch, and yaw, respectively, based on an input set value of the rotation amount.
  • Ca0 and Ca1 are virtual cameras, and cav is the camera viewpoint of the virtual camera Ca1 after the position change. Note that the camera viewpoint cav of the virtual camera Ca0 before the position change coincides with the Z axis.
  • In FIG. 9, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by roll rotation; the camera viewpoint cav coincides before and after the position change.
  • In FIG. 10, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by pitch rotation, and the camera viewpoint cav moves in the upward-looking direction in the figure.
  • In FIG. 11, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by yaw rotation, and the camera viewpoint cav rotates clockwise in the figure.
  • FIGS. 12 to 14 are examples in which the position of the virtual projection plane VP is changed about the center ov of the virtual projection plane VP0 based on an input set value of the rotation amount. In this case, viewpoint conversion corresponding to rotating or repositioning the virtual camera Ca0 is performed.
  • One of the two mutually orthogonal axes on the virtual projection plane VP0 is called the Yaw-axis and the other the P-axis. Both pass through the center ov; rotation about the Yaw-axis with ov as the rotation center is called virtual yaw rotation, and rotation about the P-axis is called virtual pitch rotation.
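Because the P-axis and Yaw-axis pass through ov rather than the lens center, virtual pitch/yaw rotation amounts to translating to ov, rotating, and translating back. A minimal sketch (NumPy assumed; R would be built for the P-axis or Yaw-axis):

```python
import numpy as np

def rotate_about_plane_center(points, center_ov, R):
    """Virtual pitch/yaw: rotate pixels of the virtual projection plane
    about its own center ov instead of the lens center O."""
    pts = np.asarray(points, dtype=float)
    ov = np.asarray(center_ov, dtype=float)
    # Shift ov to the origin, rotate, then shift back.
    return (pts - ov) @ np.asarray(R).T + ov
```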
  • In the initial state, the virtual projection plane VP0 is parallel to the xy plane of the camera coordinate system, the center ov lies on the Z axis, the P-axis is parallel to the x axis, and the Yaw-axis is parallel to the y axis.
  • In FIG. 13, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by virtual pitch rotation; after the position change, the virtual camera Ca1 is positioned upward and the camera viewpoint cav looks downward.
  • In FIG. 14, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 by virtual yaw rotation, and the virtual camera Ca1 and the camera viewpoint cav rotate counterclockwise after the position change.
  • FIGS. 15 to 18 are examples in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the movement center. In this case, viewpoint conversion corresponding to translating the virtual camera Ca0 together with the virtual projection plane VP is performed.
  • FIGS. 16, 17, and 18 show examples in which the virtual projection plane VP0 is offset (translated) in the X, Y, and Z directions, respectively, based on an input set value of the offset movement amount.
  • The offset movement in the Z direction is the same movement as zooming in and out.
  • The offset movement in each direction is effective for moving a dark area (see the examples described later), which lies outside the imaging area of the optical system, out of the displayed image area.
  • FIGS. 19 to 27 are examples of display images displayed on the display unit 120.
  • FIG. 19 shows an example of a distorted image when no distortion correction processing is performed.
  • FIGS. 20 to 27 are examples of display images that have undergone distortion correction processing.
  • FIGS. 26 and 27 are examples in which a plurality of virtual projection planes VP are set and the images corresponding to the respective virtual projection planes VP are displayed on the display unit 120.
  • FIG. 20 shows an example in which the virtual projection plane VP is set parallel to the lens center plane LC, and the center of the virtual projection plane VP is substantially coincident with the optical axis.
  • the display image corresponds to the virtual projection plane VP0 in the initial state.
  • FIG. 21 shows an example in which the virtual projection plane VP0 is offset in the Z direction.
  • FIG. 22 shows the position of the virtual projection plane VP0 rotated by yaw (corresponding to FIG. 11).
  • In FIG. 22, the right end of the image lies outside the imaging area and therefore appears as a dark part.
  • FIG. 23 shows the virtual projection plane VP0 rotated by pitch (corresponding to FIG. 10). In this figure as well, a dark part appears at the lower end.
  • FIG. 24 shows the virtual projection plane VP0 rotated by roll (corresponding to FIG. 9).
  • FIG. 25 shows the virtual projection plane VP0 rotated by virtual yaw (see FIG. 14; note, however, that the rotation direction is opposite to that in FIG. 14). In this figure as well, a dark part appears at the right end.
  • FIG. 26 is an example in which two virtual projection planes VP are set, and FIG. 27 is an example in which three virtual projection planes VP are set; the display images corresponding to the respective virtual projection planes VP are displayed on the display unit 120 in split-screen form.
  • each of the plurality of virtual projection planes VP can be rotated and moved independently.
  • In FIG. 27, the center image and the two images on both sides differ in the Z-direction offset amount, and the center image is enlarged.
  • The center image and the images on both sides partially overlap in their shooting areas.
  • In the above embodiments, an optical system including a single condensing lens has been described as an example, but the condensing lens may be composed of a plurality of lenses, and an optical element other than the condensing lens may also be provided; the present invention remains applicable. In such cases, the distortion correction coefficient may be a value determined for the optical system as a whole.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

In order to reduce distortion correction processing time in a relatively small-scale circuit, the disclosed image processing method converts a virtual projection surface to a camera coordinate system using distortion correction coefficients, and calculates image data of the virtual projection surface set in a world coordinate system on the basis of the converted camera coordinate system coordinates and the pixel data of an imaging element.

Description

Image processing method, program, image processing apparatus, and imaging apparatus

The present invention relates to an image processing method, a program, an image processing apparatus, and an imaging apparatus that perform distortion correction processing on an image captured by an image sensor via an optical system including a condensing lens.

In general, an image taken through an optical system having a short-focal-length lens or a large-angle-of-view lens, such as a wide-angle lens or a fish-eye lens, is distorted, so image processing that corrects the distortion is performed. Patent Document 1 discloses a prior-art correction method that corrects the distortion arising in an image captured with a short-focal-length lens, using correction parameters for that lens.

In the display device of Patent Document 2, which displays the periphery of a vehicle, image processing for showing data photographed with a wide-angle lens on the display corrects the image so that the rate of change of the image height increases when the incident angle is at or above a predetermined value and decreases when it is below that value.

Patent Document 1: JP 2009-140066 A / Patent Document 2: JP 2010-3014 A

Image processing of captured images obtained with lenses such as those of Patent Documents 1 and 2 requires many correction steps, such as shading correction and distortion correction. When the image processing apparatus is implemented in hardware, this lengthens the processing time, enlarges the circuit scale, and raises the cost.

In particular, when a captured image is displayed on a monitor in real time, as with a surveillance camera or the in-vehicle camera disclosed in Patent Document 2, new image processing is required every time zoom, pan, tilt, or similar processing is added in response to a user instruction to change the shooting area.

For this reason, a hardware implementation of the image processing apparatus suffers from long processing times, an enlarged circuit scale, and increased cost. In view of these problems, the present invention aims to provide an image processing method, a program, an image processing apparatus, and an imaging apparatus capable of shortening the processing time with a relatively small circuit.

The above object is achieved by the invention described below.
1. An image processing method for obtaining distortion-corrected image data using a plurality of pixel data obtained by receiving light, via an optical system, on an image sensor having a plurality of pixels, the method comprising:
a first step of setting the position and size of a virtual projection plane in a world coordinate system based on a user instruction;
a second step of converting the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set in the first step into a camera coordinate system using a distortion correction coefficient of the optical system; and
a third step of calculating image data of the virtual projection plane set in the first step based on the plurality of pixel data and the coordinates converted into the camera coordinate system in the second step.
2. The image processing method according to item 1, further comprising a fourth step of displaying the image data calculated in the third step on a display unit.
3. The image processing method according to item 1 or 2, wherein a plurality of virtual projection planes are set in the first step.
4. The image processing method according to any one of items 1 to 3, wherein, in the third step, the image data is calculated by computing, from the incident angle θ with respect to the optical axis of the optical system at the position of each pixel of the virtual projection plane, the corresponding position on the image sensor surface, and the image data of that pixel of the virtual projection plane is obtained from the pixel data of the pixel at the calculated position.
5. An image processing apparatus for obtaining distortion-corrected image data using a plurality of pixel data obtained by receiving light, via an optical system, on an image sensor having a plurality of pixels, the apparatus comprising:
an image processing unit that converts the coordinates, in a world coordinate system, of each pixel of a virtual projection plane whose position and size have been set into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
6. An image processing apparatus for obtaining distortion-corrected image data using a plurality of pixel data obtained by receiving light, via an optical system, on an image sensor having a plurality of pixels, the apparatus comprising:
a setting unit capable of setting the position and size of a virtual projection plane in a world coordinate system; and
an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
7. The image processing apparatus according to item 6, wherein a plurality of virtual projection planes are set by the setting unit.
8. The image processing apparatus according to item 6 or 7, wherein the image processing unit calculates the image data by computing, from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, the corresponding position on the image sensor surface, and obtains the image data of the pixel of the virtual projection plane from the pixel data of the pixel at the calculated position.
9. A program for an image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by receiving light, via an optical system, on an image sensor having a plurality of pixels, the program causing a computer to function as:
an image processing unit that converts the coordinates, in a world coordinate system, of each pixel of a virtual projection plane whose position and size have been set into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
10. A program for an image processing apparatus that obtains distortion-corrected image data using a plurality of pixel data obtained by receiving light, via an optical system, on an image sensor having a plurality of pixels, the program causing a computer to function as:
a setting unit capable of setting the position and size of a virtual projection plane in a world coordinate system; and
an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
11. An imaging apparatus comprising:
an optical system;
an image sensor having a plurality of pixels;
a setting unit that sets the position and size of a virtual projection plane in a world coordinate system; and
an image processing unit that converts the coordinates, in the world coordinate system, of each pixel of the virtual projection plane set by the setting unit into a camera coordinate system using a distortion correction coefficient of the optical system, and calculates image data on the virtual projection plane set by the setting unit based on the coordinates converted into the camera coordinate system and the plurality of pixel data.
12. The imaging apparatus according to item 11, further comprising:
an operation unit operated by a user; and
a display unit,
wherein the setting unit sets the position and size of the virtual projection plane in the world coordinate system based on operations on the operation unit, and
the display unit displays the image data on the set virtual projection plane.
13. The imaging apparatus according to item 11 or 12, wherein a plurality of virtual projection planes are set by the setting unit.
14. The imaging apparatus according to any one of items 11 to 13, wherein the image processing unit calculates the image data by computing, from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane, the corresponding position on the image sensor surface, and obtains the image data of the pixel of the virtual projection plane from the pixel data of the pixel at the calculated position.
According to the present invention, the virtual projection plane set in the world coordinate system is converted into the camera coordinate system using the distortion correction coefficient, and the image data of the virtual projection plane is calculated based on the converted camera-coordinate-system coordinates and the pixel data of the image sensor. This makes it possible to shorten the processing time with a relatively small-scale circuit.
[Brief Description of the Drawings]
FIG. 1 is a schematic diagram illustrating distortion correction according to the present embodiment.
FIG. 2 shows an example in which the position of the virtual projection plane VP is moved.
FIG. 3 is a block diagram showing a schematic configuration of the imaging apparatus.
FIG. 4(a) shows the main control flow, and FIG. 4(b) shows the subroutine of step S20.
FIG. 5 is a diagram illustrating the coordinates of the virtual projection plane VP.
FIG. 6 shows the correspondence between the camera coordinate system xy and the image sensor surface IA.
FIG. 7 shows an example in which two virtual projection planes VP are set.
FIG. 8 shows an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the center of rotation.
FIG. 9 shows an example in which the virtual projection plane VP0 is roll-rotated.
FIG. 10 shows an example in which the virtual projection plane VP0 is pitch-rotated.
FIG. 11 shows an example in which the virtual projection plane VP0 is yaw-rotated.
FIG. 12 shows an example in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the center of rotation.
FIG. 13 shows an example in which the virtual projection plane VP0 is virtual-pitch-rotated.
FIG. 14 shows an example in which the virtual projection plane VP0 is virtual-yaw-rotated.
FIG. 15 shows an example in which the position of the virtual projection plane VP is changed with the image center o of the camera coordinate system as the center of movement.
FIG. 16 shows an example in which the virtual projection plane VP0 is offset-moved in the x direction.
FIG. 17 shows an example in which the virtual projection plane VP0 is offset-moved in the y direction.
FIG. 18 shows an example in which the virtual projection plane VP0 is offset-moved in the Z direction.
FIGS. 19 to 25 are examples of display images displayed on the display unit 120.
FIGS. 26 and 27 are examples of display images divided and displayed on the display unit 120.
The present invention will be described based on an embodiment, but the present invention is not limited to this embodiment.
FIG. 1 is a schematic diagram illustrating distortion correction according to the present embodiment. In FIG. 1, X, Y, and Z form the world coordinate system, and the origin O is the lens center. Z is the optical axis, and the XY plane contains the lens center plane LC passing through the lens center O. Point P is an object point in the world coordinate system XYZ. θ is the angle of incidence with respect to the optical axis (coincident with the Z axis).
x and y form the camera coordinate system, and the xy plane corresponds to the image sensor surface IA. o is the image center, i.e., the intersection of the optical axis Z with the image sensor surface. Point p is a point on the image sensor surface in the camera coordinate system, obtained by transforming the object point P into the camera coordinate system using a distortion correction coefficient based on parameters derived from the lens characteristics (hereinafter, "lens parameters").
VP is the virtual projection plane. The virtual projection plane VP is set on the side of the lens position (lens center plane LC) of the optical system opposite to the image sensor (and the image sensor surface IA). The position and size of the virtual projection plane VP can be changed based on user instructions given through the operation unit 130 (see FIG. 3). In the present application, "position change" is a concept that includes not only translating the virtual projection plane VP parallel to the XY plane but also changing its angle with respect to the XY plane (also referred to as changing its attitude).
In the initial state (that is, the initial position setting; the same applies hereinafter), the virtual projection plane VP is arranged with a predetermined size at a predetermined position (in the Z direction) parallel to the lens center plane LC (the XY directions), and the center ov of the virtual projection plane VP lies on the Z axis. Gv is the point at which the object point P is projected onto the virtual projection plane VP, i.e., the intersection of the virtual projection plane VP with the straight line passing through the object point P and the lens center O. The virtual projection plane VP1 in FIG. 2 shows a state in which the virtual projection plane VP0 has been rotated in the XZ plane based on input from the operation unit 130.
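As a minimal sketch of this projection geometry (an illustration only, assuming a virtual projection plane parallel to the XY plane at distance d from the lens center, with a hypothetical function name), the point Gv is the intersection of the ray from the lens center O through the object point P with that plane:

```python
def project_to_virtual_plane(P, d):
    """Intersect the ray from the lens center O=(0,0,0) through object
    point P=(X,Y,Z) with the plane Z=d (a virtual projection plane
    parallel to the lens center plane LC). Returns the point Gv."""
    X, Y, Z = P
    if Z == 0:
        raise ValueError("object point lies in the lens center plane")
    t = d / Z                      # scale factor along the ray O -> P
    return (t * X, t * Y, d)       # Gv lies on the plane Z = d

# Example: an object point at Z=10 projected onto a plane at Z=2
Gv = project_to_virtual_plane((5.0, -2.5, 10.0), 2.0)
```

For a tilted virtual projection plane the same ray-plane intersection applies, only with a different plane equation.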
[Block Diagram]
FIG. 3 is a block diagram showing a schematic configuration of the imaging apparatus. The imaging apparatus includes an imaging unit 110, a control device 100, a display unit 120, and an operation unit 130.
The imaging unit 110 includes a lens, an image sensor, and the like. In the present embodiment, the lens is, for example, a wide-angle lens or a fisheye lens.
The control device 100 includes an image processing unit 101, a setting unit 102, and a storage unit 103.
The setting unit 102 sets the position and size of the virtual projection plane VP based on input instructions given through the operation unit 130.
The image processing unit 101 creates, based on the set position and size of the virtual projection plane VP, a conversion table that maps each coordinate on the virtual projection plane into the camera coordinate system, and uses this table to process the pixel data captured by the imaging unit 110 into image data to be displayed on the display unit 120. The storage unit 103 stores the distortion correction coefficient calculated from the lens parameters of the lens, and also stores the position and size of the virtual projection plane VP and the created conversion table.
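The table-driven design described above can be sketched as follows (a hedged illustration: the function names, the nearest-neighbor lookup, and the nested-list image representation are assumptions, not the embodiment's actual implementation). The camera-coordinate map is computed once when the virtual projection plane is set, and each captured frame is then remapped with simple lookups:

```python
def build_conversion_table(vp_coords, world_to_camera):
    """Precompute, for every virtual-projection-plane pixel, its
    camera-coordinate-system position (done once per VP setting)."""
    return [[world_to_camera(P) for P in row] for row in vp_coords]

def apply_table(table, frame):
    """Per-frame remap: nearest-neighbor lookup into the sensor image
    (the embodiment may interpolate instead; see step S23)."""
    h_s, w_s = len(frame), len(frame[0])
    out = []
    for row in table:
        out_row = []
        for (x, y) in row:
            xi = min(max(int(round(x)), 0), w_s - 1)  # clamp to sensor
            yi = min(max(int(round(y)), 0), h_s - 1)
            out_row.append(frame[yi][xi])
        out.append(out_row)
    return out

# Toy example: an identity mapping on a 2x2 "sensor" frame
vp = [[(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)],
      [(0.0, 1.0, 1.0), (1.0, 1.0, 1.0)]]
table = build_conversion_table(vp, lambda P: (P[0], P[1]))
frame = [[10, 20], [30, 40]]
remapped = apply_table(table, frame)
```

The split between a one-time table build and a cheap per-frame apply is what allows pan, tilt, zoom, and distortion correction to share a single remapping pass.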
The display unit 120 has a display screen such as a liquid crystal display, and sequentially displays on it the image data created by the image processing unit 101 from the pixel data captured by the imaging unit 110.
The operation unit 130 includes a keyboard, a mouse, or a touch panel superimposed on the liquid crystal display of the display unit, and receives input operations from the user.
[Control Flow]
FIG. 4 shows the control flow of the present embodiment. FIG. 4(a) shows the main control flow, and FIG. 4(b) shows the subroutine of step S20.
In step S10 (corresponding to the "first step"), the virtual projection plane VP is set. Starting from the virtual projection plane in its initial position and size, the setting unit 102 sets the position and size of the virtual projection plane VP in the world coordinate system in response to the user's input instruction to the operation unit 130, as described above. Changing the position, as shown in FIG. 2, changes the camera viewpoint displayed on the display unit 120 (corresponding to pan and tilt). If the distance from the lens center O changes as the position of the virtual projection plane VP changes, the view zooms in or out; zooming in and out can also be achieved by changing the size of the virtual projection plane VP. Specific examples of changing the position of the virtual projection plane VP are described later.
In step S10, the virtual projection plane VP is also divided into n pixels based on the input size setting (or the default size value). The number of pixels is preferably equal to or greater than the total number of display pixels (screen resolution) of the display unit 120. In the following, as an example, both the pixel count n and the total number of display pixels of the display unit 120 are fixed at 640 × 480 pixels (307,200 pixels in total). In the present embodiment, adjacent pixels on the virtual projection plane VP are spaced at equal intervals, so the size of the virtual projection plane VP is fixed when the pixel count n is fixed.
In step S20 (corresponding to the "second and third steps"), distortion correction processing is performed, mainly by the image processing unit 101, based on the state of the virtual projection plane VP set in step S10. The distortion correction processing is described with reference to FIG. 4(b).
In step S21, the world-coordinate-system coordinates Gv(X, Y, Z) are acquired for each pixel Gv on the virtual projection plane VP. FIG. 5 is a schematic diagram illustrating the coordinate system. As shown in FIG. 5, the plane bounded by the four corner points A(0, 0, Za), B(0, 479, Zb), C(639, 479, Zc), and D(639, 0, Zd) of the virtual projection plane VP is divided at equal intervals into 640 × 480 pixels Gv (307,200 pixels in total), and the world-coordinate-system coordinates of every pixel Gv are acquired.
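The corner-spanned grid of step S21 can be sketched as follows (a simplified illustration assuming the pixel positions are obtained by bilinear interpolation between the four corner points; the function names are hypothetical):

```python
def vp_pixel_grid(A, B, C, D, cols, rows):
    """Generate world coordinates for a cols x rows grid of pixels Gv
    spanning the virtual projection plane with corners A (top-left),
    D (top-right), B (bottom-left), and C (bottom-right)."""
    def lerp(p, q, t):
        # linear interpolation between two 3D points
        return tuple(pi + (qi - pi) * t for pi, qi in zip(p, q))
    grid = []
    for j in range(rows):
        v = j / (rows - 1) if rows > 1 else 0.0
        left = lerp(A, B, v)       # down the left edge A -> B
        right = lerp(D, C, v)      # down the right edge D -> C
        grid.append([lerp(left, right,
                          i / (cols - 1) if cols > 1 else 0.0)
                     for i in range(cols)])
    return grid

# A 3x2 toy grid on a plane whose corners all sit at Z = 5
g = vp_pixel_grid((0, 0, 5), (0, 4, 5), (8, 4, 5), (8, 0, 5), 3, 2)
```

With cols = 640 and rows = 480 this yields the 307,200 equally spaced pixels Gv of the embodiment.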
In step S22, the corresponding camera-coordinate-system coordinates Gi(x, y) on the image sensor surface IA are calculated from the world-coordinate-system coordinates of the pixel Gv and the distortion correction coefficient of the imaging unit 110 stored in the storage unit 103. Specifically, the distortion correction coefficient calculated from the lens parameters of the optical system is stored in the storage unit 103, and the camera coordinates are calculated from this coefficient and the incident angle θ, with respect to the optical axis Z, obtained from the coordinates of each pixel Gv (reference: International Publication No. WO 2010/032720).
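The exact mapping from θ to an image-plane radius depends on the lens-specific correction coefficient, which the document does not spell out. As an illustrative stand-in, the sketch below uses the common equidistant fisheye model r = f·θ (an assumption, not the embodiment's actual coefficient) to show the shape of the step S22 computation:

```python
import math

def world_to_camera_equidistant(P, f):
    """Map a world point P=(X,Y,Z) to camera coordinates (x', y') via
    the incident angle theta against the optical axis Z. The radial
    distance r = f * theta is the equidistant fisheye model, used here
    only as a stand-in for the lens-specific distortion coefficients."""
    X, Y, Z = P
    R = math.hypot(X, Y)           # off-axis distance in the XY plane
    theta = math.atan2(R, Z)       # incident angle against the optical axis
    r = f * theta                  # image-plane radius (model-specific)
    if R == 0.0:
        return (0.0, 0.0)          # on-axis point maps to the image center o
    return (r * X / R, r * Y / R)  # keep the azimuth, scale to radius r

# A point on the optical axis lands at the image center o
center = world_to_camera_equidistant((0.0, 0.0, 10.0), f=100.0)
# A 45-degree off-axis point along +X lands at r = f * pi/4 on the x axis
p45 = world_to_camera_equidistant((10.0, 0.0, 10.0), f=100.0)
```

Substituting the stored correction coefficient for the r = f·θ line would reproduce the embodiment's actual mapping.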
FIG. 6 shows the correspondence between the camera coordinate system xy and the image sensor surface IA. In FIG. 6, points a to d are the points A to D of FIG. 5 transformed into the camera coordinate system. While the virtual projection plane VP bounded by points A to D in FIG. 5 is a rectangular plane, the region bounded by points a to d after the coordinate transformation into the camera coordinate system in FIG. 6 has a distorted shape (corresponding to the position of the virtual projection plane VP). The figure shows an example of barrel distortion, but depending on the characteristics of the optical system the distortion may instead be pincushion-shaped or mustache-shaped (barrel-shaped at the center, changing to straight or pincushion-shaped at the edges).
In step S23, the pixel of the image sensor to be referenced is determined from the camera-coordinate-system coordinates Gi(x', y'). While x and y in the coordinates (x, y) of each pixel of the image sensor are integers, x' and y' in the coordinates Gi(x', y') calculated in step S22 are not necessarily integers and can take real values with fractional parts. When x' and y' are integers and the coordinates Gi(x', y') coincide with a pixel position of the image sensor, the pixel data of that image sensor pixel is used as the pixel data of the pixel Gv(X, Y, Z) on the virtual projection plane VP. When x' and y' are not integers and do not coincide with any pixel position, the pixel data of the pixel Gv may be obtained from the pixels surrounding the calculated coordinates Gi(x', y'), for example the four pixels closest to the position of Gi(x', y'): either their simple average, or a value calculated by weighting the four neighboring pixels according to their distance from Gi(x', y'). The number of surrounding pixels is not limited to four; one pixel, or sixteen or more pixels, may be used.
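The distance-weighted four-neighbor scheme described above can be sketched as standard bilinear interpolation (one possible weighting; as the text notes, a simple average or a larger neighborhood could be used instead, and the function name is illustrative):

```python
def sample_bilinear(frame, x, y):
    """Interpolate the pixel value at non-integer camera coordinates
    (x, y) from the four surrounding sensor pixels, each weighted by
    its proximity to (x, y)."""
    x0, y0 = int(x), int(y)                 # top-left neighbor
    x1 = min(x0 + 1, len(frame[0]) - 1)     # clamp at the right edge
    y1 = min(y0 + 1, len(frame) - 1)        # clamp at the bottom edge
    fx, fy = x - x0, y - y0                 # fractional parts = weights
    top = frame[y0][x0] * (1 - fx) + frame[y0][x1] * fx
    bottom = frame[y1][x0] * (1 - fx) + frame[y1][x1] * fx
    return top * (1 - fy) + bottom * fy

# Halfway between four pixels the result is their plain average
frame = [[0.0, 10.0],
         [20.0, 30.0]]
v = sample_bilinear(frame, 0.5, 0.5)
```

At integer coordinates the weights collapse so that the sensor pixel's value is returned unchanged, matching the exact-coincidence case in the text.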
By executing steps S21 to S23 for each pixel, moving one pixel at a time from the starting point A(0, 0, Za) in FIG. 5 to the lower-right end point C(639, 479, Zc), image data in which distortion correction has been performed for all pixels can be acquired. This concludes the control of the subroutine of step S20 shown in FIG. 4(b).
Returning to the control flow of FIG. 4(a): in step S40 (corresponding to the "fourth step"), the image data acquired in step S20 is displayed on the display screen of the display unit 120. Steps S20 and S40 are executed repeatedly, so that the distortion-corrected image data based on the captured pixel data is displayed on the display unit 120 in real time.
According to the present embodiment, setting the position and size of the virtual projection plane VP makes it possible to handle all processing, including pan, tilt, and zoom as well as distortion correction, in a single unified pass. The processing therefore becomes lighter, and the processing time can be shortened with a relatively small-scale circuit. As a result, real-time processing becomes possible even with a relatively small-scale circuit configuration such as an ASIC (Application Specific Integrated Circuit).
[Second Embodiment]
FIG. 7 is a schematic diagram illustrating distortion correction according to the second embodiment. In the description of FIGS. 1 to 6 the virtual projection plane was single, but the number of virtual projection planes is not limited to one and may be two or more. The image data obtained at the position of each virtual projection plane can be displayed by switching between them, either by time division or by user instruction, or by dividing the display screen and displaying them side by side simultaneously. The second embodiment shown in FIG. 7 is an example in which two virtual projection planes are set. Except for the configuration shown in FIG. 7, it is the same as the embodiment described with reference to FIGS. 3 to 6, and duplicate description is omitted.
FIG. 7 shows an example in which two virtual projection planes VPh and VPj are set. Their positions and sizes can be set independently. In the figure, the ranges on the image sensor surface IA corresponding to the virtual projection planes VPh and VPj are the regions h and j, and the points corresponding to the object points P1 and P2 are the points p1 and p2.
Image data is calculated for each of the set virtual projection planes VPh and VPj by the flow shown in FIG. 4(b), and each is individually displayed on the display unit 120 as a two-screen display by the processing of step S40 in FIG. 4(a).
[Specific Examples of Changing the Position of the Virtual Projection Plane VP]
FIGS. 8 to 11 show examples in which the position of the virtual projection plane VP0 is changed with the image center o of the camera coordinate system as the center of rotation (or of movement). As shown in FIG. 8, with the image center o as the center of rotation, rotation about the x axis is pitch (also called tilt), rotation about the y axis is yaw (also called pan), and rotation about the Z axis is roll.
FIGS. 9, 10, and 11 show examples in which the virtual projection plane VP0 is roll-rotated, pitch-rotated, and yaw-rotated, respectively, based on input rotation amount settings. In these figures (and hereafter), Ca0 and Ca1 are virtual cameras, and cav is the camera viewpoint of the virtual camera Ca1 after the position change. The camera viewpoint cav of the virtual camera Ca0 before the position change coincides with the Z axis.
In FIG. 9, the virtual projection plane VP1 is the result of roll-rotating the virtual projection plane VP0. The camera viewpoint cav is the same before and after the position change.
In FIG. 10, the virtual projection plane VP1 is the result of pitch-rotating the virtual projection plane VP0; the camera viewpoint cav moves to look upward in the figure.
In FIG. 11, the virtual projection plane VP1 is the result of yaw-rotating the virtual projection plane VP0; the camera viewpoint cav rotates clockwise in the figure.
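The pitch, yaw, and roll position changes above can be sketched as ordinary rotation matrices applied to each virtual-projection-plane point around the camera-coordinate origin (an illustration with a hypothetical function name; pitch rotates about the x axis, yaw about the y axis, roll about the Z axis):

```python
import math

def rotate_point(P, pitch=0.0, yaw=0.0, roll=0.0):
    """Rotate world point P=(X, Y, Z) about the image center o:
    pitch about the x axis, yaw about the y axis, and roll about the
    Z axis, applied in that order."""
    X, Y, Z = P
    # pitch: rotation about the x axis
    c, s = math.cos(pitch), math.sin(pitch)
    Y, Z = c * Y - s * Z, s * Y + c * Z
    # yaw: rotation about the y axis
    c, s = math.cos(yaw), math.sin(yaw)
    X, Z = c * X + s * Z, -s * X + c * Z
    # roll: rotation about the Z axis
    c, s = math.cos(roll), math.sin(roll)
    X, Y = c * X - s * Y, s * X + c * Y
    return (X, Y, Z)

# A 90-degree roll maps the x axis onto the y axis
q = rotate_point((1.0, 0.0, 0.0), roll=math.pi / 2)
```

Applying such a rotation to the four corner points A to D (and hence to every pixel Gv) realizes the position changes of FIGS. 9 to 11.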
FIGS. 12 to 14 show examples in which the position of the virtual projection plane VP is changed with the center ov of the virtual projection plane VP0 as the center of rotation, based on input rotation amount settings. In the following, a viewpoint conversion corresponding to rotating or repositioning the virtual camera Ca0 is performed.
As shown in FIG. 12, one of two orthogonal axes on the virtual projection plane VP0 is set as the Yaw-axis and the other as the P-axis. Both axes pass through the center ov; rotation about the Yaw-axis with the center ov as the center of rotation is called virtual yaw rotation, and rotation about the P-axis is called virtual pitch rotation. In the initial state, the virtual projection plane VP0 is parallel to the xy plane of the camera coordinate system, and the center ov lies on the Z axis. In this initial state, the P-axis is parallel to the x axis, and the Yaw-axis is parallel to the y axis.
 図13において、仮想投影面VP0を位置変更して仮想pitch回転させたものが仮想投影面VP1であり、位置変更後において仮想カメラCa1は上方に位置し、カメラ視点cavは見下げる方向となる。 In FIG. 13, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the virtual pitch, and after the position change, the virtual camera Ca1 is positioned upward and the camera viewpoint cav is in a direction to look down.
 図14において、仮想投影面VP0を位置変更して仮想yaw回転させたものが仮想投影面VP1であり、位置変更後において仮想カメラCa1、及びカメラ視点cavは反時計方向に回転する。 14, the virtual projection plane VP1 is obtained by changing the position of the virtual projection plane VP0 and rotating the virtual yaw, and the virtual camera Ca1 and the camera viewpoint cav rotate counterclockwise after the position change.
 図15から図18は、カメラ座標系の画像中心oを移動中心として仮想投影面VPの位置を変更する例である。これらの例においては仮想カメラCa0を仮想投影面VPとともに平行移動させたことに相当する視点変換が行われる。 15 to 18 are examples in which the position of the virtual projection plane VP is changed with the image center o in the camera coordinate system as the movement center. In these examples, viewpoint conversion corresponding to the parallel movement of the virtual camera Ca0 together with the virtual projection plane VP is performed.
 図16、図17、図18は、入力されたオフセット移動量の設定値に基づいて仮想投影面VP0をX方向、Y方向、Z方向にそれぞれオフセット移動(平行移動)させた例を示すものである。初期状態においては、Z方向へのオフセット移動は、ズームイン、ズームアウトと同様の動きとなる。初期状態以外においては、各方向へのオフセット移動は光学系の撮影領域外の暗部(後述例参照)を画像領域外に移動させる際に有効である。 FIGS. 16, 17, and 18 show examples in which the virtual projection plane VP0 is offset (translated) in the X, Y, and Z directions based on the input offset movement amount setting value. is there. In the initial state, the offset movement in the Z direction is the same movement as zooming in and zooming out. Except in the initial state, the offset movement in each direction is effective when moving a dark part (see an example described later) outside the imaging area of the optical system outside the image area.
 [実施例]
 本実施形態の撮像装置により歪み補正処理を行った例について説明する。図19から図26は表示部120に表示させた表示画像の例である。図19は歪み補正処理を行わない場合の歪曲画像の例である。図20から図27は、歪み補正処理を行った表示画像の例である。またその中で図26、図27では仮想投影面VPを複数設定し、それぞれの仮想投影面VPに対応する画像を表示部120に表示させた例である。
[Example]
An example in which distortion correction processing is performed by the imaging apparatus of the present embodiment will be described. 19 to 26 are examples of display images displayed on the display unit 120. FIG. FIG. 19 shows an example of a distorted image when the distortion correction processing is not performed. 20 to 27 are examples of display images that have been subjected to distortion correction processing. 26 and 27 are examples in which a plurality of virtual projection planes VP are set and images corresponding to the respective virtual projection planes VP are displayed on the display unit 120.
 図20は、仮想投影面VPをレンズ中心面LCと平行で、その中心が光軸と略一致した位置設定としている例である。図8等の例では初期状態の仮想投影面VP0に対応する表示画像である。図21は、仮想投影面VP0をZ方向へオフセット移動させた例である。 FIG. 20 shows an example in which the virtual projection plane VP is set parallel to the lens center plane LC, and the center of the virtual projection plane VP is substantially coincident with the optical axis. In the example of FIG. 8 and the like, the display image corresponds to the virtual projection plane VP0 in the initial state. FIG. 21 shows an example in which the virtual projection plane VP0 is offset in the Z direction.
 図22は、仮想投影面VP0の位置を、yaw回転させたものである(図11に対応する)。なお同図において右端は撮影領域外であるために暗部となっている。 FIG. 22 shows the position of the virtual projection plane VP0 rotated by yaw (corresponding to FIG. 11). In the figure, the right end is outside the imaging area and is therefore a dark part.
 図23は、仮想投影面VP0の位置を、pitch回転させたものである(図10に対応する)。なお同図においても下端に暗部が生じている。 FIG. 23 shows the position of the virtual projection plane VP0 rotated by pitch (corresponding to FIG. 10). Also in the figure, a dark portion is generated at the lower end.
 図24は、仮想投影面VP0をroll回転させたものである(図9に対応する)。 FIG. 24 shows the virtual projection plane VP0 rotated by roll (corresponding to FIG. 9).
 図25は、仮想投影面VP0の位置を、仮想yaw回転させたものである(図14参照:但し図14とは回転方向が反対である)。同図においても右端に暗部が生じている。 FIG. 25 shows the position of the virtual projection plane VP0 rotated by a virtual yaw (see FIG. 14; however, the direction of rotation is opposite to that in FIG. 14). Also in the figure, a dark part is generated at the right end.
 図26は、2つの仮想投影面VPを設定した例であり、図27は、3つの仮想投影面VPを設定した例であり、それぞれの仮想投影面VPに対応する表示画像を表示部120に分割表示させている。同図に示すように複数の仮想投影面VPのそれぞれは独立に回転、移動が可能であり、図27においては中央の画像と両脇の2画像とは、Z方向のオフセット量が異なり両脇では拡大表示されている。また中央の画像と両脇の画像は、撮影領域の一部が重複している。 FIG. 26 is an example in which two virtual projection planes VP are set, and FIG. 27 is an example in which three virtual projection planes VP are set, and display images corresponding to the respective virtual projection planes VP are displayed on the display unit 120. It is divided and displayed. As shown in the figure, each of the plurality of virtual projection planes VP can be rotated and moved independently. In FIG. 27, the center image and the two images on both sides differ in the offset amount in the Z direction. Is enlarged. In addition, the center image and the images on both sides overlap a part of the shooting area.
 以上に説明した実施の形態では、集光レンズを含む光学系として、単玉レンズからなる光学系を例示して説明したが、集光レンズは複数枚のレンズで構成されてもよく、光学系には集光レンズ以外の光学素子を備えていてもよいことは勿論であり、本願の発明を適用できるものである。その場合には、歪み補正係数は、光学系全体としての値を利用すればよい。しかしながら、撮像装置の小型化や低コスト化の上では、以上の実施の形態で例示したように、単玉のレンズを用いることが最も好ましいものである。 In the embodiment described above, the optical system including a single lens is exemplified as the optical system including the condensing lens. However, the condensing lens may be composed of a plurality of lenses. Of course, an optical element other than the condenser lens may be provided, and the invention of the present application can be applied. In that case, the distortion correction coefficient may be a value for the entire optical system. However, in terms of downsizing and cost reduction of the imaging device, it is most preferable to use a single lens as exemplified in the above embodiment.
 100 制御装置
 101 画像処理部
 102 設定部
 103 記憶部
 110 撮像ユニット
 120 表示部
 130 操作部
 VP 仮想投影面
 LC レンズ中心面
 IA 撮像素子面
 O レンズ中心
 o 画像中心
DESCRIPTION OF SYMBOLS 100 Control apparatus 101 Image processing part 102 Setting part 103 Storage part 110 Imaging unit 120 Display part 130 Operation part VP Virtual projection surface LC Lens center plane IA Image sensor surface O Lens center o Image center

Claims (14)

  1.  光学系を介して複数の画素を有する撮像素子に受光して得られた複数の画素データを用いて歪み補正処理した画像データを得る画像処理方法において、
     ワールド座標系における仮想投影面の位置及びサイズを、ユーザの指示に基づいて設定する第1ステップと、
     前記第1ステップで設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換する第2ステップと、
     前記複数の画素データと前記第2ステップで変換したカメラ座標系における座標とに基づいて、前記第1ステップで設定された仮想投影面の画像データを算出する第3ステップと、
     を有することを特徴とする画像処理方法。
    In an image processing method for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving an image sensor having a plurality of pixels via an optical system,
    A first step of setting the position and size of the virtual projection plane in the world coordinate system based on a user instruction;
    A second step of converting coordinates in the world coordinate system of each pixel of the virtual projection plane set in the first step into a camera coordinate system using a distortion correction coefficient of the optical system;
    A third step of calculating image data of the virtual projection plane set in the first step based on the plurality of pixel data and the coordinates in the camera coordinate system converted in the second step;
    An image processing method comprising:
  2.  前記第3ステップで算出した画像データを表示部に表示させる第4ステップを有することを特徴とする請求項1に記載の画像処理方法。 The image processing method according to claim 1, further comprising a fourth step of displaying the image data calculated in the third step on a display unit.
  3.  前記第1ステップで設定される仮想投影面は、複数の仮想投影面であることを特徴とする、請求項1又は2に記載の画像処理方法。 3. The image processing method according to claim 1, wherein the virtual projection plane set in the first step is a plurality of virtual projection planes.
  4.  前記第3ステップでは、画像データの算出を前記仮想投影面の各画素の位置における前記光学系の光軸に対する入射角度θから、対応する前記撮像素子面上の位置を算出し、算出した位置の画素の画素データから前記仮想投影面の前記画素の画像データを得ることを特徴とする請求項1から3のいずれか一項に記載の画像処理方法。 In the third step, calculation of image data is performed by calculating a corresponding position on the imaging element surface from an incident angle θ with respect to the optical axis of the optical system at each pixel position on the virtual projection plane. The image processing method according to claim 1, wherein image data of the pixel on the virtual projection plane is obtained from pixel data of the pixel.
  5.  光学系を介して複数の画素を有する撮像素子に受光して得られた複数の画素データを用いて歪み補正処理された画像データを得る画像処理装置であって、
     位置及びサイズが設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換し、カメラ座標系に変換した座標と前記複数の画素データに基づいて、前記仮想投影面での画像データを算出する画像処理部と、
     を有することを特徴とする画像処理装置。
    An image processing apparatus that obtains image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging device having a plurality of pixels via an optical system,
    The coordinates in the world coordinate system of each pixel of the virtual projection plane for which the position and size are set are converted into the camera coordinate system using the distortion correction coefficient of the optical system, the coordinates converted into the camera coordinate system and the plurality of pixel data An image processing unit that calculates image data on the virtual projection plane,
    An image processing apparatus comprising:
  6.  光学系を介して複数の画素を有する撮像素子に受光して得られた複数の画素データを用いて歪み補正処理された画像データを得る画像処理装置であって、
     ワールド座標系における仮想投影面の位置及びサイズを設定可能な設定部と、
     前記設定部で設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換し、カメラ座標系に変換した座標と前記複数の画素データに基づいて、前記設定部で設定された仮想投影面での画像データを算出する画像処理部と、
     を有することを特徴とする画像処理装置。
    An image processing apparatus that obtains image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving light on an imaging device having a plurality of pixels via an optical system,
    A setting unit capable of setting the position and size of the virtual projection plane in the world coordinate system;
    The coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit are converted into the camera coordinate system using the distortion correction coefficient of the optical system, the coordinates converted into the camera coordinate system and the plurality of pixel data An image processing unit that calculates image data on the virtual projection plane set by the setting unit, and
    An image processing apparatus comprising:
  7.  前記設定部で設定された仮想投影面は、複数の仮想投影面であることを特徴とする請求項6に記載の画像処理装置。 The image processing apparatus according to claim 6, wherein the virtual projection plane set by the setting unit is a plurality of virtual projection planes.
  8.  前記画像処理部は、画像データの算出を前記仮想投影面の各位置における前記光学系の光軸に対する入射角度θから、対応する前記撮像素子面上の位置を算出し、算出した位置の画素の画素データから前記仮想投影面の前記画素における画像データを得ることを特徴とする請求項6又は7に記載の画像処理装置。 The image processing unit calculates the position of the pixel at the calculated position by calculating the corresponding position on the imaging element surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane. The image processing apparatus according to claim 6, wherein image data at the pixels on the virtual projection plane is obtained from pixel data.
  9.  光学系を介して複数の画素を有する撮像素子に受光して得られた複数の画素データを用いて歪み補正処理された画像データを得る画像処理装置のプログラムであって、コンピュータを、
     位置及びサイズが設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換し、カメラ座標系に変換した座標と前記複数の画素データに基づいて、前記仮想投影面での画像データを算出する画像処理部、
     として機能させるプログラム。
    An image processing apparatus program for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving an image sensor having a plurality of pixels via an optical system, the computer comprising:
    The coordinates in the world coordinate system of each pixel of the virtual projection plane for which the position and size are set are converted into the camera coordinate system using the distortion correction coefficient of the optical system, the coordinates converted into the camera coordinate system and the plurality of pixel data Based on the image processing unit for calculating the image data on the virtual projection plane,
    Program to function as.
  10.  光学系を介して複数の画素を有する撮像素子に受光して得られた複数の画素データを用いて歪み補正処理された画像データを得る画像処理装置のプログラムであって、コンピュータを、
     ワールド座標系における仮想投影面の位置及びサイズを設定可能な設定部と、
     前記設定部で設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換し、カメラ座標系に変換した座標と前記複数の画素データに基づいて、前記設定部で設定された仮想投影面での画像データを算出する画像処理部、
     として機能させるプログラム。
    An image processing apparatus program for obtaining image data subjected to distortion correction processing using a plurality of pixel data obtained by receiving an image sensor having a plurality of pixels via an optical system, the computer comprising:
    A setting unit capable of setting the position and size of the virtual projection plane in the world coordinate system;
    The coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit are converted into the camera coordinate system using the distortion correction coefficient of the optical system, the coordinates converted into the camera coordinate system and the plurality of pixel data An image processing unit for calculating image data on the virtual projection plane set by the setting unit,
    Program to function as.
  11.  光学系と、
     複数の画素を有する撮像素子と、
     ワールド座標系における仮想投影面の位置及びサイズを設定する設定部と、
     前記設定部で設定された仮想投影面の各画素のワールド座標系における座標を前記光学系の歪み補正係数を用いてカメラ座標系に変換し、カメラ座標系に変換した座標と前記複数の画素データに基づいて、前記設定部で設定された仮想投影面での画像データを算出する画像処理部と
     を有することを特徴とする撮像装置。
    Optical system,
    An imaging device having a plurality of pixels;
    A setting unit for setting the position and size of the virtual projection plane in the world coordinate system;
    The coordinates in the world coordinate system of each pixel of the virtual projection plane set by the setting unit are converted into the camera coordinate system using the distortion correction coefficient of the optical system, the coordinates converted into the camera coordinate system and the plurality of pixel data And an image processing unit that calculates image data on the virtual projection plane set by the setting unit.
  12.  ユーザが操作する操作部と、
     表示部と、
     を有し、
     前記設定部は、前記操作部への操作に基づいて前記仮想投影面のワールド座標系における位置及びサイズの設定を行い、
     前記表示部は、設定された前記仮想投影面での画像データを表示することを特徴とする請求項11に記載の撮像装置。
    An operation unit operated by a user;
    A display unit;
    Have
    The setting unit sets the position and size of the virtual projection plane in the world coordinate system based on an operation to the operation unit,
    The imaging apparatus according to claim 11, wherein the display unit displays image data on the set virtual projection plane.
  13.  前記設定部で設定される仮想投影面は、複数の仮想投影面であることを特徴とする請求項11又は12に記載の撮像装置。 The imaging apparatus according to claim 11 or 12, wherein the virtual projection plane set by the setting unit is a plurality of virtual projection planes.
  14.  前記画像処理部は、画像データの算出を前記仮想投影面の各位置における前記光学系の光軸に対する入射角度θから、対応する前記撮像素子面上の位置を算出し、算出した位置の画素の画素データから前記仮想投影面の前記画素の画像データを得ることを特徴とする請求項11から13の何れか一項に記載の撮像装置。 The image processing unit calculates the position of the pixel at the calculated position by calculating the corresponding position on the imaging element surface from the incident angle θ with respect to the optical axis of the optical system at each position on the virtual projection plane. The image pickup apparatus according to any one of claims 11 to 13, wherein image data of the pixel on the virtual projection plane is obtained from pixel data.
PCT/JP2010/060184 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device WO2011158343A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2012520202A JPWO2011158343A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing apparatus, and imaging apparatus
PCT/JP2010/060184 WO2011158343A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2010/060184 WO2011158343A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Publications (1)

Publication Number Publication Date
WO2011158343A1 true WO2011158343A1 (en) 2011-12-22

Family

ID=45347765

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/060184 WO2011158343A1 (en) 2010-06-16 2010-06-16 Image processing method, program, image processing device, and imaging device

Country Status (2)

Country Link
JP (1) JPWO2011158343A1 (en)
WO (1) WO2011158343A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013094588A1 (en) * 2011-12-19 2013-06-27 大日本印刷株式会社 Image processing device, image processing method, program for image processing device, storage medium, and image display device

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11261868A (en) * 1998-03-13 1999-09-24 Fujitsu Ltd Fisheye lens camera device and image distortion correction method and image extraction method thereof
JP2000083242A (en) * 1993-02-08 2000-03-21 Interactive Pictures Corp Full field still camera and surveillance system
JP2000242773A (en) * 1999-02-19 2000-09-08 Fitto:Kk Image data converting device
JP2006148767A (en) * 2004-11-24 2006-06-08 Canon Inc Video image distribution system, video image distributing apparatus, video image receiving apparatus, communication method for video image distributing apparatus, display method of video image receiving apparatus, program, and storage medium
JP2007228531A (en) * 2006-02-27 2007-09-06 Sony Corp Camera apparatus and monitor system
JP2008052589A (en) * 2006-08-25 2008-03-06 Konica Minolta Holdings Inc Method for correcting distortion of wide angle image
JP2008311890A (en) * 2007-06-14 2008-12-25 Fujitsu General Ltd Image data converter, and camera device provided therewith
WO2009014075A1 (en) * 2007-07-20 2009-01-29 Techwell Japan K.K. Image processing device and camera system
JP2009043060A (en) * 2007-08-09 2009-02-26 Canon Inc Image processing method for performing distortion correction to image data, program, and recording medium

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000083242A (en) * 1993-02-08 2000-03-21 Interactive Pictures Corp Full field still camera and surveillance system
JPH11261868A (en) * 1998-03-13 1999-09-24 Fujitsu Ltd Fisheye lens camera device and image distortion correction method and image extraction method thereof
JP2000242773A (en) * 1999-02-19 2000-09-08 Fitto:Kk Image data converting device
JP2006148767A (en) * 2004-11-24 2006-06-08 Canon Inc Video image distribution system, video image distributing apparatus, video image receiving apparatus, communication method for video image distributing apparatus, display method of video image receiving apparatus, program, and storage medium
JP2007228531A (en) * 2006-02-27 2007-09-06 Sony Corp Camera apparatus and monitor system
JP2008052589A (en) * 2006-08-25 2008-03-06 Konica Minolta Holdings Inc Method for correcting distortion of wide angle image
JP2008311890A (en) * 2007-06-14 2008-12-25 Fujitsu General Ltd Image data converter, and camera device provided therewith
WO2009014075A1 (en) * 2007-07-20 2009-01-29 Techwell Japan K.K. Image processing device and camera system
JP2009043060A (en) * 2007-08-09 2009-02-26 Canon Inc Image processing method for performing distortion correction to image data, program, and recording medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013094588A1 (en) * 2011-12-19 2013-06-27 大日本印刷株式会社 Image processing device, image processing method, program for image processing device, storage medium, and image display device
US9269124B2 (en) 2011-12-19 2016-02-23 Dai Nippon Printing Co., Ltd. Image processing device, image processing method, program for image processing device, recording medium, and image display device

Also Published As

Publication number Publication date
JPWO2011158343A1 (en) 2013-08-15

Similar Documents

Publication Publication Date Title
US9196022B2 (en) Image transformation and multi-view output systems and methods
JP6960238B2 (en) Image stabilization device and its control method, program, storage medium
CN107770433B (en) Image acquisition device and image smooth scaling method thereof
US8134608B2 (en) Imaging apparatus
TWI393072B (en) Multi-sensor array module with wide viewing angle; image calibration method, operating method and application for the same
EP3438919B1 (en) Image displaying method and head-mounted display apparatus
US20130300875A1 (en) Correction of image distortion in ir imaging
JP6253280B2 (en) Imaging apparatus and control method thereof
US8913162B2 (en) Image processing method, image processing apparatus and image capturing apparatus
CN111800589B (en) Image processing method, device and system and robot
US20140184837A1 (en) Image capturing apparatus, control method thereof, and storage medium
JP6236908B2 (en) Imaging apparatus, imaging system, and imaging method
WO2011158344A1 (en) Image processing method, program, image processing device, and imaging device
WO2011161746A1 (en) Image processing method, program, image processing device and image capturing device
WO2012056982A1 (en) Image processing method, image processing device, and imaging device
JP2013005393A (en) Image processing method having wide-angle distortion correction processing, image processing apparatus and imaging apparatus
JP5393877B2 (en) Imaging device and integrated circuit
JP2009123131A (en) Imaging apparatus
JP5682473B2 (en) Image processing method having wide-angle distortion correction processing, image processing apparatus, and imaging apparatus
WO2011158343A1 (en) Image processing method, program, image processing device, and imaging device
JP7510264B2 (en) Image processing device, image processing method, and imaging device
JP2013005392A (en) Image processing method having wide-angle distortion correction processing, image processing apparatus and imaging apparatus
WO2012060271A1 (en) Image processing method, image processing device, and imaging device
JP4211653B2 (en) Video generation system
JP2020061662A (en) Video processing device, video processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10853225

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2012520202

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10853225

Country of ref document: EP

Kind code of ref document: A1