WO2018033952A1 - Panoramic image synthesis analysis system, panoramic image synthesis analysis method, and program - Google Patents
Panoramic image synthesis analysis system, panoramic image synthesis analysis method, and program
- Publication number
- WO2018033952A1 (PCT/JP2016/073855)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- panoramic image
- analysis
- panoramic
- subject
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
Definitions
- the present invention relates to a panorama image synthesis analysis system, a panorama image synthesis analysis method, and a program for creating a panorama image.
- in Patent Document 1, when image analysis is performed, it is performed separately on the image captured by each imaging device.
- as a result, when the accuracy of image analysis for a subject that is entirely reflected in one image is compared with the accuracy for a subject that is not entirely reflected, the latter is lower. That is, a subject may be entirely reflected in the image of one imaging device but only partially reflected in the image of another imaging device, and image analysis on the partially reflected subject is less accurate.
- in view of this, the present invention uses a plurality of cameras and, even when image analysis is to be performed on a subject that is not entirely reflected in the image of any one camera, combines the images captured by the plurality of cameras into a panoramic image.
- an object of the present invention is to provide a panoramic image synthesis analysis system, a panoramic image synthesis analysis method, and a program that improve the accuracy of image analysis by performing the image analysis after the combined image has been created.
- the present invention provides the following solutions.
- the invention according to the first feature provides a panoramic image synthesis analysis system comprising: panoramic image creation means for creating a panoramic image by combining captured images captured by a plurality of cameras; image analysis means for performing image analysis on a subject reflected in the synthesized panoramic image; and analysis result display means for displaying the result of the image analysis.
- according to the first feature, the panoramic image synthesis analysis system creates a panoramic image by synthesizing captured images taken by a plurality of cameras, performs image analysis on the subject reflected in the synthesized panoramic image, and displays the result of the image analysis.
- the invention according to the first feature belongs to the category of panoramic image synthesis analysis systems, but the same actions and effects are exhibited in the other categories, namely the method and the program.
- the invention according to the second feature provides the panoramic image synthesis analysis system according to the first feature, wherein the image analysis means performs image analysis on a subject reflected in a joint portion of the synthesized panoramic image.
- according to the second feature, the panoramic image synthesis analysis system performs image analysis on the subject reflected in the joint portion of the synthesized panoramic image.
- the invention according to the third feature provides the panoramic image synthesis analysis system according to the first feature, wherein the analysis result display means displays the result of the image analysis in association with the subject.
- according to the third feature, the panoramic image synthesis analysis system displays the result of the image analysis in association with the subject.
- the invention according to the fourth feature provides the panoramic image synthesis analysis system according to the first feature, further comprising complement display means for complementing and displaying a missing portion of the subject based on the result of the image analysis.
- according to the fourth feature, the panoramic image synthesis analysis system complements and displays the missing portion of the subject based on the result of the image analysis.
- the invention according to the fifth feature provides the panoramic image synthesis analysis system according to the first feature, wherein the panoramic image creation means creates the panoramic image in accordance with the captured image having the largest number of pixels among the plurality of cameras.
- according to the fifth feature, the panoramic image synthesis analysis system creates the panoramic image in accordance with the captured image having the largest number of pixels among the plurality of cameras.
- the invention according to the sixth feature provides the panoramic image synthesis analysis system according to the first feature, wherein the panoramic image creation means creates the panoramic image in accordance with the captured image having the smallest number of pixels among the plurality of cameras.
- according to the sixth feature, the panoramic image synthesis analysis system creates the panoramic image in accordance with the captured image having the smallest number of pixels among the plurality of cameras.
- the invention according to the seventh feature provides a panoramic image synthesis analysis method comprising: a step of creating a panoramic image by combining captured images captured by a plurality of cameras; a step of performing image analysis on a subject reflected in the synthesized panoramic image; and a step of displaying the result of the image analysis.
- the invention according to the eighth feature provides a program for causing a panoramic image synthesis analysis system to execute: a step of creating a panoramic image by combining captured images captured by a plurality of cameras; a step of performing image analysis on a subject reflected in the synthesized panoramic image; and a step of displaying the result of the image analysis.
- according to the present invention, it is possible to provide a panoramic image synthesis analysis system, a panoramic image synthesis analysis method, and a program that improve the accuracy of image analysis.
- FIG. 1 is a diagram showing an overview of a panoramic image synthesis analysis system 1.
- FIG. 2 is an overall configuration diagram of the panoramic image synthesis analysis system 1.
- FIG. 3 is a functional block diagram of the user terminal 100 and the camera 200.
- FIG. 4 is a flowchart showing panoramic image synthesis analysis processing executed by the user terminal 100 and the camera 200.
- FIG. 5 is a diagram illustrating an example of the arrangement of the cameras 201, 202, and 203 constituting the camera 200.
- FIG. 6 is a diagram illustrating an example of one image synthesized by the user terminal 100.
- FIG. 7 is a diagram illustrating an example of a panoramic image area created by the user terminal 100.
- FIG. 8 is a diagram illustrating an example of a panoramic image area created by the user terminal 100.
- FIG. 9 is a diagram illustrating an example of a panoramic image created by the user terminal 100.
- FIG. 10 is a diagram illustrating an example of a panoramic image created by the user terminal 100.
- FIG. 11 is a diagram illustrating an example of a panoramic image displayed on the user terminal 100.
- FIG. 1 is a diagram for explaining an overview of a panoramic image synthesis analysis system 1 which is a preferred embodiment of the present invention.
- the panoramic image synthesis analysis system 1 includes a user terminal 100 and a camera 200.
- the camera 200 includes a camera 201, a camera 202, and a camera 203. It should be noted that the camera 200 merely represents the cameras 201, 202, and 203 as one group for convenience, and the cameras 201, 202, and 203 do not exist as one imaging device.
- the number of user terminals 100 is not limited to one and may be plural. Further, the number of cameras 200 is not limited to three, and may be more or less. Further, the user terminal 100 may be realized by either or both of a real device and a virtual device. Each process described below may be realized by either or both of the user terminal 100 and the camera 200.
- the user terminal 100 is a terminal device possessed by the user that can perform data communication with the camera 200.
- the user terminal 100 is, for example, a mobile phone, a portable information terminal, a tablet terminal, or a personal computer, as well as an electronic product such as a netbook terminal, a slate terminal, an electronic book terminal, or a portable music player, or a wearable terminal such as smart glasses or a head-mounted display.
- the user terminal 100 may be a terminal device or a virtual device that the user does not possess, such as cloud computing.
- the camera 200 is an imaging device that captures images such as moving images and still images, each of which is capable of data communication with the user terminal 100.
- the cameras 201, 202, and 203 are arranged in a line at a predetermined interval. Note that the cameras 201, 202, and 203 may instead be arranged in a circle at a predetermined interval, or in some other arrangement.
- the captured image means an image such as a moving image or a still image captured by the camera 200.
- the camera 200 captures an image (step S01).
- the camera 200 captures an image by receiving an imaging instruction from the user terminal 100 or an operation input of a switch included in the camera 200.
- Each of the cameras 201, 202, and 203 captures a different image.
- the cameras 201, 202, and 203 capture images simultaneously.
- the camera 200 transmits captured image data that is data of the captured image to the user terminal 100 (step S02). At this time, each of the cameras 201, 202, and 203 transmits its own positional information and data such as the positional relationship of the cameras 201, 202, and 203 to the user terminal 100 together with the captured image data.
- User terminal 100 receives captured image data.
- the user terminal 100 combines the received plurality of captured image data to create a panoramic image (step S03).
- the user terminal 100 may create the panoramic image in accordance with the captured image having the largest number of pixels among the images from the cameras 200.
- the user terminal 100 may create the panoramic image in accordance with the captured image having the smallest number of pixels among the images from the cameras 200.
- the user terminal 100 performs image analysis on the subject reflected in the synthesized panoramic image (step S04). Note that the user terminal 100 may perform image analysis on the subject reflected in the joint portion of the synthesized panoramic image. At this time, for example, when a part of the subject reflected in the joint part is missing, the missing part may be complemented to create a panoramic image.
- the user terminal 100 displays the result of image analysis (step S05).
- the user terminal 100 displays the panoramic image and the image analysis result together. Note that the user terminal 100 may display the result of the image analysis in association with the subject reflected in the panoramic image.
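- The flow of steps S01 to S05 can be summarized as follows. This is a minimal structural sketch only; the camera and terminal objects and all method names (capture, receive, stitch, analyze_subjects, show_results) are assumed placeholders for the processing described above, not part of the disclosure.

```python
# Illustrative sketch of the S01-S05 flow; all object methods are placeholders.
def run_panorama_analysis(cameras, terminal):
    # S01: each camera captures an image (simultaneously)
    captured = [cam.capture() for cam in cameras]

    # S02: each camera sends its image together with its own position data
    payloads = [{"image": img, "position": cam.position}
                for cam, img in zip(cameras, captured)]
    terminal.receive(payloads)

    # S03: the user terminal stitches the received images into one panorama
    panorama = terminal.stitch([p["image"] for p in payloads],
                               [p["position"] for p in payloads])

    # S04: image analysis on the subjects reflected in the panorama
    results = terminal.analyze_subjects(panorama)

    # S05: display the panorama together with the analysis results
    terminal.show_results(panorama, results)
```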
- FIG. 2 is a diagram showing a system configuration of a panoramic image synthesis analysis system 1 which is a preferred embodiment of the present invention.
- the panoramic image synthesis analysis system 1 includes a user terminal 100, a plurality of cameras (cameras 201, 202, 203) 200, and a public network (Internet network, third generation, fourth generation communication network, etc.) 5.
- the number of user terminals 100 is not limited to one and may be plural.
- the number of cameras 200 is not limited to three, and the number may be more or less.
- the user terminal 100 may be realized by either or both of a real device and a virtual device. Each process described below may be realized by either or both of the user terminal 100 and the camera 200.
- the user terminal 100 is the above-described terminal device having the functions described later.
- the camera 200 is the above-described imaging device having the functions described below.
- the camera 200 is a generic name for a plurality of cameras 201, 202, and 203. In the following description, the configurations of the cameras 201, 202, and 203 are the same, and thus will be described as the camera 200.
- FIG. 3 is a functional block diagram of the user terminal 100 and the camera 200.
- the user terminal 100 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), etc. as the control unit 110, and a device for enabling communication with other devices as the communication unit 120.
- the user terminal 100 includes, as the input/output unit 140, a display unit that outputs and displays data and images controlled by the control unit 110, and an input unit, such as a touch panel, keyboard, or mouse, that receives input from the user.
- the data reception module 150 is realized in cooperation with the communication unit 120 by the control unit 110 reading a predetermined program.
- in the user terminal 100, the control unit 110 reads a predetermined program, thereby realizing, in cooperation with the input/output unit 140, the image synthesis module 170, the panorama image creation module 171, the image analysis module 172, the image complementation module 173, and the display module 174.
- the camera 200 includes a CPU, RAM, ROM, and the like as the control unit 210, and a device for enabling communication with other devices as the communication unit 220.
- the camera 200 includes, as the imaging unit 240, an imaging device comprising a lens, an image sensor, various buttons, a flash, and the like.
- in the camera 200, when the control unit 210 reads a predetermined program, the data transmission module 250 is realized in cooperation with the communication unit 220. Likewise, the control unit 210 reads a predetermined program, thereby realizing the imaging module 270 in cooperation with the imaging unit 240.
- FIG. 4 is a diagram illustrating a flowchart of panoramic image synthesis analysis processing executed by the user terminal 100 and the camera 200. The processing executed by the modules of each device described above will be described together with this processing.
- FIG. 5 is a diagram illustrating an example of the arrangement of the cameras 201, 202, and 203 constituting the camera 200.
- the regions imaged by the cameras 201, 202, and 203 are shown schematically.
- the cameras 201, 202, and 203 are arranged in a horizontal row at a predetermined interval.
- An area captured by the camera 201 is an imaging area 301.
- An area captured by the camera 202 is an imaging area 302.
- An area captured by the camera 203 is an imaging area 303.
- an imaging region 301 of the camera 201 and an imaging region 302 of the camera 202 have an overlapping region 310 that is an overlapping region.
- An imaging region 302 of the camera 202 and an imaging region 303 of the camera 203 have an overlapping region 311 that is an overlapping region.
- the camera 201 has a non-imaging area 320 that is an area that cannot be imaged.
- a non-imaging area 321 that is an area that cannot be imaged exists between the camera 201 and the camera 202.
- the camera 203 has a non-imaging area 323 that is an area that cannot be imaged.
- the positions and shapes of the imaging regions 301 to 303, the non-imaging regions 320 to 323, and the overlapping regions 310 and 311 can be changed as appropriate. That is, the non-imaging area may be eliminated by increasing the number of cameras. Further, the non-imaging area may be eliminated by adjusting the position of the camera. Further, the overlapping area may be increased by increasing the number of cameras. Further, the overlapping area may be increased by adjusting the position of the camera.
- the user terminal 100 creates a panoramic image by synthesizing images taken by the cameras 201, 202, and 203 in the process described later.
- by performing image analysis on the panoramic image, the user terminal 100 can analyze subjects existing in the imaging regions 301, 302, and 303 with higher accuracy than in the case where the images captured by the cameras 201, 202, and 203 are analyzed separately. Further, based on the result of the panoramic image analysis, a panoramic image can be created in which a subject existing in the non-imaging regions 320 and 321 is complemented.
- the imaging module 270 captures an image (step S10).
- in step S10, the imaging module 270 captures an image by accepting an imaging instruction from the user terminal 100 or an input operation on the camera 200, for example.
- in step S10, the cameras 201 to 203 respectively capture the imaging areas 301 to 303 shown in FIG. 5.
- the data transmission module 250 transmits captured image data that is image data captured by the imaging module 270 to the user terminal 100 (step S11).
- the data transmission module 250 adds position information of each of the cameras 201, 202, and 203 and arrangement data such as a positional relationship to the captured image data and transmits the captured image data to the user terminal 100.
- the data reception module 150 receives captured image data transmitted by the camera 200.
- the data receiving module 150 receives captured image data from the cameras 201 to 203.
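- As one concrete illustration of the data handled in steps S11 and S12, the captured image data with attached position and arrangement information might be structured as below. The field names and the JSON encoding are assumptions for illustration only; the embodiment does not specify a transmission format.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical payload sent from each camera to the user terminal 100 in step S11.
@dataclass
class CapturedImagePayload:
    camera_id: str        # e.g. "camera201"
    image_file: str       # path or reference to the captured image data
    position: list        # the camera's own position information, e.g. [x, y]
    arrangement: str      # positional relationship of the cameras 201-203

payload = CapturedImagePayload(
    camera_id="camera201",
    image_file="cam201.jpg",
    position=[0.0, 0.0],
    arrangement="horizontal row, fixed interval",
)
message = json.dumps(asdict(payload))  # body received by the data reception module 150
```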
- the image composition module 170 synthesizes one image based on the received captured image data and arrangement data (step S12).
- FIG. 6 is a diagram illustrating an example of one image synthesized by the image synthesis module 170.
- the image composition module 170 synthesizes one image 400 based on the arrangement data of the cameras 201 to 203 and the captured image data captured by each of the cameras 201 to 203.
- the image composition module 170 arranges the image 401 captured by the camera 201, the image 402 captured by the camera 202, and the image 403 captured by the camera 203 in a horizontal row based on the arrangement of the cameras 201 to 203, and synthesizes them.
- the image composition module 170 determines the joint portion between the image 401 and the image 402 based on the image of the overlapping area 310 described above, and composes them.
- the image composition module 170 determines and synthesizes the joint portion between the image 402 and the image 403 based on the image of the overlapping region 311 described above.
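- As an illustration of how a joint portion can be determined from the overlapping regions 310 and 311, OpenCV's high-level stitcher can be used. This is only one possible realization, assuming OpenCV is available; the embodiment does not prescribe a particular stitching method or library, and the file names below are illustrative.

```python
import cv2

def stitch_images(images):
    # Feature matching inside the overlapping regions determines the joint
    # portions, and the images are blended into one combined image.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, combined = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return combined

# Usage sketch: combine the images 401-403 captured by the cameras 201-203.
images = [cv2.imread(p) for p in ("cam201.jpg", "cam202.jpg", "cam203.jpg")]
combined_image = stitch_images(images)
cv2.imwrite("combined.jpg", combined_image)
```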
- the panorama image creation module 171 creates a panorama image area based on the synthesized one image (step S13).
- FIG. 7 is a diagram showing an example of a panorama image area created by the panorama image creation module 171.
- the panorama image creation module 171 creates a panorama image region 500 surrounded by a thick line in accordance with the image 402 having the largest number of pixels.
- the panorama image creation module 171 creates a panorama image region 500 that includes all of the images 401, 402, and 403 with reference to the image 402 having the largest number of pixels.
- the panoramic image area 500 includes blank areas 510 to 513 that are areas corresponding to the non-imaging areas 320 to 323 described above.
- FIG. 8 is a diagram illustrating an example of a panorama image area created by the panorama image creation module 171.
- the panorama image creation module 171 creates a panorama image region 600 surrounded by a thick line in accordance with the image 403 having the smallest number of pixels.
- the panorama image creation module 171 creates a panorama image region 600 that includes a part of the image 401, a part of the image 402, and the whole of the image 403 on the basis of the image 403 having the smallest number of pixels.
- the panoramic image area 600 includes a blank area 610 that is an area corresponding to the non-imaging area 321 described above.
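- A simple sketch of how the panorama image area can be aligned to the captured image with the largest or smallest number of pixels is shown below. It assumes the images are laid out in a horizontal row with pixel offsets derived from the arrangement data, and it leaves uncovered pixels at zero as the blank areas; the actual module may work differently.

```python
import numpy as np

def choose_reference(images, mode="largest"):
    # Pick the captured image that the panorama area is aligned to (FIG. 7 / FIG. 8).
    pixel_count = lambda img: img.shape[0] * img.shape[1]
    return max(images, key=pixel_count) if mode == "largest" else min(images, key=pixel_count)

def make_panorama_area(images, x_offsets, mode="largest"):
    # x_offsets: assumed horizontal pixel offsets derived from the camera arrangement data.
    reference = choose_reference(images, mode)
    height = reference.shape[0]
    width = max(off + img.shape[1] for img, off in zip(images, x_offsets))
    canvas = np.zeros((height, width, 3), dtype=np.uint8)  # uncovered pixels stay blank
    for img, off in zip(images, x_offsets):
        h = min(height, img.shape[0])
        canvas[:h, off:off + img.shape[1]] = img[:h]
    return canvas
```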
- the image analysis module 172 performs image analysis on the subject reflected in the synthesized panoramic image area (step S14).
- in step S14, the image analysis module 172 performs image analysis on the subject reflected in the joint portion of each image in the synthesized panoramic image area.
- the joint portion of each image in the synthesized panoramic image area means, for example, the boundary between the image 401 and the image 402 or the boundary between the image 402 and the image 403 in FIG. 7 or FIG.
- the image analysis module 172 extracts the feature amount of each subject and identifies each subject by comparing the extracted feature amount with feature amounts stored in an external computer or in the user terminal 100 itself.
- the image analysis module 172 acquires the specified subject name, various information, and the like based on a database or the like stored in the external computer or the user terminal 100. For example, when the subject is a vehicle, the image analysis module 172 acquires information such as the name, model, manufacturer, and the like of the vehicle. If the subject is a person, the image analysis module 172 acquires information such as the name, age, occupation, address, telephone number, and mail address of the person. Also, the image analysis module 172 acquires various types of information in the same manner for other subjects.
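- The comparison of extracted feature amounts against stored feature amounts can be illustrated as a nearest-neighbour lookup. Everything here is an assumption for illustration: the feature vectors, the database contents, and the distance threshold are placeholders, and the feature extraction itself is outside the sketch.

```python
import numpy as np

# Hypothetical feature database: subject name -> (feature vector, associated information).
SUBJECT_DB = {
    "car_model_x": (np.array([0.12, 0.80, 0.33]), {"type": "vehicle", "model": "X"}),
    "person_a":    (np.array([0.90, 0.10, 0.55]), {"type": "person"}),
}

def identify_subject(feature_vec, db=SUBJECT_DB, threshold=0.5):
    # Compare the extracted feature amount with each stored feature amount
    # and return the closest subject and its stored information.
    best_name, best_dist = None, float("inf")
    for name, (ref_vec, _info) in db.items():
        dist = float(np.linalg.norm(feature_vec - ref_vec))
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_dist < threshold:
        return best_name, db[best_name][1]
    return None, None
```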
- the image analysis module 172 determines whether or not the identified subject has a missing portion (step S15). In step S15, the image analysis module 172 determines whether a missing portion exists based on whether a part of the identified subject is absent from the panoramic image area.
- when the image analysis module 172 determines in step S15 that a missing portion exists (step S15: YES), the image complementing module 173 complements the missing portion (step S16).
- in step S16, the image complementation module 173 acquires, for example, an image relating to the identified subject from an external computer or from a database stored in the user terminal 100, and complements the missing portion by synthesizing the corresponding portion of the acquired image into it.
- the image complementing module 173 may also complement the missing portion by repeating a portion that is not missing. For example, when the image analysis module 172 identifies a wall surface as the subject and determines that a portion of the wall surface is missing, the image complementing module 173 may complement the missing portion by repeating the pattern or texture present in the portion of the wall surface that is not missing. The image complementing module 173 may also complement the missing portion by a configuration other than those described above.
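- One way to realize this complementing step in code is image inpainting, which fills the missing (masked) pixels from the surrounding texture and thus corresponds to repeating the part that is not missing. OpenCV's inpainting is used here purely as an illustrative assumption; the embodiment does not prescribe a particular algorithm, and the mask location in the usage sketch is made up.

```python
import cv2
import numpy as np

def complement_missing(panorama, missing_mask):
    # missing_mask: uint8 array, non-zero where the subject or area is missing
    # (e.g. the blank areas 510-513 / 610 or the subject complement area 820).
    return cv2.inpaint(panorama, missing_mask, 5, cv2.INPAINT_TELEA)

# Usage sketch: mark an assumed blank region and complement it.
# pano = cv2.imread("panorama.jpg")
# mask = np.zeros(pano.shape[:2], dtype=np.uint8)
# mask[:, 1200:1300] = 255          # assumed location of a blank area
# filled = complement_missing(pano, mask)
```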
- after executing the process of step S16, the user terminal 100 executes the process of step S17 described later.
- when the image analysis module 172 determines in step S15 that there is no missing portion (step S15: NO), the panorama image creation module 171 creates a panorama image (step S17).
- FIG. 9 is a diagram showing an example of a panorama image created by the panorama image creation module 171 by complementing a part of the panorama image area 500 in FIG. 7.
- the panoramic image 700 includes a composite image 730 and blank complement regions 710 to 713.
- the subject 731 is reflected in the composite image 730.
- the subject 731 is an article such as a vehicle or a sign, a person, nature, or the like.
- the composite image 730 is an image obtained by combining the images 401 to 403 in FIG.
- Blank complement areas 710 to 713 are areas where the blank areas 510 to 513 in FIG. 7 described above are complemented by the image complement module 173 in step S16 described above.
- FIG. 10 is a diagram showing an example of a panoramic image created by the panoramic image creation module 171 by complementing a part of the panoramic image area 600 in FIG. 8.
- the panoramic image 800 includes a composite image 830 and a blank complement region 810.
- the subject 831 is reflected in the composite image 830.
- the subject 831 is an article such as a vehicle or a sign, a person, nature, or the like.
- the composite image 830 is an image obtained by combining the images 401 to 403 in FIG. 8 described above.
- the blank complement region 810 is a region in which the blank region 610 in FIG. 8 described above is complemented by the image complement module 173 in step S16 described above.
- because a part of the subject 831 falls within the blank area 610 and is therefore missing from the captured images, there is a subject complement region 820 in which the image complementing module 173 complements that part of the subject 831.
- the blank complement region 810 and the subject complement region 820 are complemented by the processing in step S16 described above.
- in this way, the user terminal 100 can create a panoramic image that includes a subject that was not fully captured, by estimating and complementing the part of the subject that lies in the blank area.
- the display module 174 displays the result of the image analysis (step S18).
- step S18 the display module 174 displays the result of image analysis on the created panoramic image.
- the display module 174 displays the result of the image analysis in association with the subject reflected in the created panoramic image.
- in step S18, for example, when the subject is a vehicle, the display module 174 displays information such as the name, model, and owner name of the vehicle in association with the subject.
- when the subject is a person, the display module 174 displays information such as the name, age, occupation, address, telephone number, and mail address of the person in association with the subject.
- for other types of subject as well, the display module 174 displays various information in association with the subject.
- FIG. 11 is a diagram illustrating an example of a panoramic image displayed by the display module 174.
- the display module 174 displays a car 910 as a subject on the panoramic image 900.
- for simplicity, subjects other than the car 910 are omitted from the display, and the car 910 is shown schematically.
- the display module 174 displays the information display area 920 associated with the car 910 so as to be superimposed on the panoramic image 900.
- a leader line is drawn from the information display area 920 to the car 910 to indicate which subject the displayed information is associated with.
- the display module 174 displays the result of the image analysis performed in step S14 described above in the information display area 920.
- the display module 174 displays the model name, model, and owner in the information display area 920.
- the display module 174 may be configured to display the information display area 920 around the panoramic image or at another location instead of superimposing it on the panoramic image. In this case, for example, in order to clearly indicate the association, a sign may be attached to the subject and the same sign may be attached to the information display area 920.
- the display module 174 may be configured to display the result of the image analysis on the subject that has received an input operation from the user. In this case, for example, the display module 174 accepts an input operation on any subject, and displays the result of image analysis regarding the accepted subject.
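- The display of step S18 can be sketched as drawing the information display area 920 with a leader line to the subject. All coordinates, colors, and label contents below are illustrative assumptions.

```python
import cv2

def draw_analysis_result(panorama, subject_box, info_lines):
    # subject_box: (x, y, w, h) of the subject, e.g. the car 910
    # info_lines: result of the image analysis, e.g. ["Name: ...", "Model: ...", "Owner: ..."]
    x, y, w, h = subject_box
    cv2.rectangle(panorama, (x, y), (x + w, y + h), (0, 255, 0), 2)
    area_origin = (x + w + 40, max(y - 40, 20))                  # information display area 920
    cv2.line(panorama, (x + w, y), area_origin, (0, 255, 0), 1)  # leader line to the subject
    for i, line in enumerate(info_lines):
        cv2.putText(panorama, line, (area_origin[0], area_origin[1] + 20 * (i + 1)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    return panorama
```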
- through the processing described above, even when image analysis is to be performed on a subject that is not entirely reflected in the captured image of any one camera 200, the panoramic image synthesis analysis system 1 combines the images from the plurality of cameras 200 into a panoramic image and performs the image analysis on it, thereby improving the accuracy of the image analysis. Furthermore, based on the result of the image analysis, a portion that is not shown can be complemented and displayed.
- the means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
- the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, CD (CD-ROM, etc.), DVD (DVD-ROM, DVD-RAM, etc.).
- the computer reads the program from the recording medium, transfers it to the internal storage device or the external storage device, stores it, and executes it.
- the program may be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to a computer via a communication line.
Claims (8)
- 1. A panoramic image synthesis analysis system comprising: panoramic image creation means for creating a panoramic image by combining captured images captured by a plurality of cameras; image analysis means for performing image analysis on a subject reflected in the synthesized panoramic image; and analysis result display means for displaying a result of the image analysis.
- 2. The panoramic image synthesis analysis system according to claim 1, wherein the image analysis means performs image analysis on a subject reflected in a joint portion of the synthesized panoramic image.
- 3. The panoramic image synthesis analysis system according to claim 1, wherein the analysis result display means displays the result of the image analysis in association with the subject.
- 4. The panoramic image synthesis analysis system according to claim 1, further comprising complement display means for complementing and displaying a missing portion of the subject based on the result of the image analysis.
- 5. The panoramic image synthesis analysis system according to claim 1, wherein the panoramic image creation means creates the panoramic image in accordance with the captured image having the largest number of pixels among the plurality of cameras.
- 6. The panoramic image synthesis analysis system according to claim 1, wherein the panoramic image creation means creates the panoramic image in accordance with the captured image having the smallest number of pixels among the plurality of cameras.
- 7. A panoramic image synthesis analysis method comprising: a step of creating a panoramic image by combining captured images captured by a plurality of cameras; a step of performing image analysis on a subject reflected in the synthesized panoramic image; and a step of displaying a result of the image analysis.
- 8. A program for causing a panoramic image synthesis analysis system to execute: a step of creating a panoramic image by combining captured images captured by a plurality of cameras; a step of performing image analysis on a subject reflected in the synthesized panoramic image; and a step of displaying a result of the image analysis.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016572853A JP6267809B1 (ja) | 2016-08-15 | 2016-08-15 | パノラマ画像合成解析システム、パノラマ画像合成解析方法及びプログラム |
PCT/JP2016/073855 WO2018033952A1 (ja) | 2016-08-15 | 2016-08-15 | パノラマ画像合成解析システム、パノラマ画像合成解析方法及びプログラム |
US15/544,089 US10430925B2 (en) | 2016-08-15 | 2016-08-15 | System, method, and program for synthesizing panoramic image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/073855 WO2018033952A1 (ja) | 2016-08-15 | 2016-08-15 | パノラマ画像合成解析システム、パノラマ画像合成解析方法及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018033952A1 true WO2018033952A1 (ja) | 2018-02-22 |
Family
ID=61020796
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/073855 WO2018033952A1 (ja) | 2016-08-15 | 2016-08-15 | パノラマ画像合成解析システム、パノラマ画像合成解析方法及びプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US10430925B2 (ja) |
JP (1) | JP6267809B1 (ja) |
WO (1) | WO2018033952A1 (ja) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018136098A1 (en) * | 2017-01-23 | 2018-07-26 | Huami Inc. | System and Method for Generating Digital Content for a Composite Camera |
US11347303B2 (en) * | 2018-11-30 | 2022-05-31 | Sony Interactive Entertainment Inc. | Systems and methods for determining movement of a controller with respect to an HMD |
KR20200081527A (ko) | 2018-12-19 | 2020-07-08 | 삼성전자주식회사 | 전자 장치 및 그 제어 방법 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001320616A (ja) * | 2000-02-29 | 2001-11-16 | Matsushita Electric Ind Co Ltd | 撮像システム |
JP2008254710A (ja) * | 2007-04-09 | 2008-10-23 | Fujitsu Ten Ltd | 障害物検知装置 |
JP2012160904A (ja) * | 2011-01-31 | 2012-08-23 | Sony Corp | 情報処理装置、情報処理方法、プログラム、及び撮像装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6549681B1 (en) * | 1995-09-26 | 2003-04-15 | Canon Kabushiki Kaisha | Image synthesization method |
DE60106997T2 (de) * | 2000-02-29 | 2005-12-01 | Matsushita Electric Industrial Co., Ltd., Kadoma | Bildaufnahmesystem und fahrzeugmontiertes Sensorsystem |
JP5383576B2 (ja) * | 2010-03-31 | 2014-01-08 | 富士フイルム株式会社 | 撮像装置、撮像方法およびプログラム |
JP6016662B2 (ja) | 2013-02-13 | 2016-10-26 | 三菱電機株式会社 | 画像合成装置及び画像合成方法 |
-
2016
- 2016-08-15 JP JP2016572853A patent/JP6267809B1/ja active Active
- 2016-08-15 US US15/544,089 patent/US10430925B2/en not_active Expired - Fee Related
- 2016-08-15 WO PCT/JP2016/073855 patent/WO2018033952A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001320616A (ja) * | 2000-02-29 | 2001-11-16 | Matsushita Electric Ind Co Ltd | 撮像システム |
JP2008254710A (ja) * | 2007-04-09 | 2008-10-23 | Fujitsu Ten Ltd | 障害物検知装置 |
JP2012160904A (ja) * | 2011-01-31 | 2012-08-23 | Sony Corp | 情報処理装置、情報処理方法、プログラム、及び撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
US10430925B2 (en) | 2019-10-01 |
JP6267809B1 (ja) | 2018-01-24 |
US20190180413A1 (en) | 2019-06-13 |
JPWO2018033952A1 (ja) | 2018-08-16 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2016572853 Country of ref document: JP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16913482 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29.05.2019) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 16913482 Country of ref document: EP Kind code of ref document: A1 |