WO2014017392A1 - Information processing apparatus, data processing method thereof, and program - Google Patents
Information processing apparatus, data processing method thereof, and program
Info
- Publication number
- WO2014017392A1 (PCT/JP2013/069623)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- additional data
- image
- information processing
- information
- processing apparatus
- Prior art date
Classifications
- G06Q50/12 — ICT specially adapted for business processes of specific sectors; Services; Hotels or restaurants
- G06F16/5838 — Retrieval of still image data characterised by metadata automatically derived from the content, using colour
- G06F16/783 — Retrieval of video data characterised by metadata automatically derived from the content
- G06F16/7837 — Retrieval of video data using objects detected or recognised in the video content
- G06F16/7867 — Retrieval of video data using manually generated metadata, e.g. tags, keywords, comments, user ratings
- G06F3/002 — Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005 — Input arrangements through a video camera
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06Q30/0643 — Electronic shopping; Graphical representation of items or shoppers
- G06T11/60 — 2D image generation; Editing figures and text; Combining figures or text
- G06V20/20 — Scene-specific elements in augmented reality scenes
- G06T2210/41 — Indexing scheme for image generation or computer graphics; Medical
Definitions
- the present invention relates to an information processing apparatus, a data processing method thereof, and a program.
- Patent Document 1 discloses a self-order system in which a terminal is placed at each table of a store such as a restaurant, and the customer operates the terminal to place an order himself or herself.
- The order system described in Patent Document 2 uses the user's own electronic device, in this case a car navigation system, to have the user create order information, which is then received by wireless communication.
- The product information search system described in Patent Document 3 captures, with a terminal, a display image reproduced on an image playback device, searches for product information over a network using the captured image, and orders products based on the acquired product information.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an information processing apparatus that improves the listability of information of a plurality of images using a mobile terminal, a data processing method thereof, and a program.
- The information processing apparatus of the present invention includes: video data acquisition means for sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit;
- image recognition means for detecting that the video data includes at least a part of a pre-registered image and recognizing the image;
- additional data extraction means for extracting the additional data of each image recognized by the image recognition means from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and
- additional data display means for displaying the additional data of each image extracted by the additional data extraction means at a position corresponding to each image in the video data.
- The data processing method of the information processing apparatus of the present invention includes: the information processing apparatus sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit; detecting that the video data includes at least a part of a pre-registered image and recognizing the image; extracting the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and displaying the additional data of each extracted image at a position corresponding to each image in the video data.
- The computer program of the present invention causes a computer that implements an information processing apparatus to execute: a procedure for sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit; a procedure for detecting that the video data includes at least a part of a pre-registered image and recognizing the image; a procedure for extracting the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and a procedure for displaying the additional data of each extracted image at a position corresponding to each image in the video data.
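- To make these procedures concrete, the following is a minimal, self-contained Python sketch of the acquire, recognize, extract, and display steps. The toy frame data, function names, and in-memory additional data store are illustrative assumptions, not the patented implementation.

```python
# A minimal, self-contained sketch of the four claimed procedures.
# The recognizer and data store below are toy stand-ins (hypothetical), not the patented implementation.
from typing import Dict, List, Tuple

ADDITIONAL_DATA: Dict[str, Dict[str, str]] = {          # additional data storage unit (source for step 3)
    "pasta_a": {"name": "Pasta A", "price": "780 yen", "note": "manager's recommendation"},
    "salad_b": {"name": "Salad B", "price": "450 yen", "note": "contains egg (allergen)"},
}

def acquire_video_frames() -> List[List[Tuple[str, Tuple[int, int]]]]:
    """Step 1: sequentially acquire frames; here each frame is pre-annotated with
    (registered image id, on-screen position) pairs to keep the sketch self-contained."""
    return [[("pasta_a", (40, 120))], [("pasta_a", (35, 110)), ("salad_b", (200, 130))]]

def recognize_images(frame) -> List[Tuple[str, Tuple[int, int]]]:
    """Step 2: detect which pre-registered images are (at least partly) contained in the frame."""
    return [(image_id, pos) for image_id, pos in frame if image_id in ADDITIONAL_DATA]

def extract_additional_data(recognitions) -> Dict[str, Dict[str, str]]:
    """Step 3: pull each recognized image's additional data from the storage unit."""
    return {image_id: ADDITIONAL_DATA[image_id] for image_id, _ in recognitions}

def display_additional_data(recognitions, extracted) -> None:
    """Step 4: show each image's additional data at the position of that image."""
    for image_id, pos in recognitions:
        print(f"at {pos}: {extracted[image_id]['name']} - {extracted[image_id]['note']}")

for frame in acquire_video_frames():
    found = recognize_images(frame)
    display_additional_data(found, extract_additional_data(found))
```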
- A plurality of components may be formed as a single member, a single component may be formed of a plurality of members, a certain component may be a part of another component, a part of one component may overlap with a part of another component, and so on.
- Although the data processing method and the computer program of the present invention describe a plurality of procedures in order, the described order does not limit the order in which the procedures are executed. Therefore, when implementing the data processing method and the computer program of the present invention, the order of the procedures can be changed to the extent that the content is not affected.
- The plurality of procedures of the data processing method and the computer program of the present invention are not limited to being executed at mutually different times. Therefore, another procedure may start during the execution of a certain procedure, or the execution timing of one procedure and that of another may partially or entirely overlap.
- According to the present invention, there are provided an information processing apparatus that improves the listability of information of a plurality of images using a mobile terminal, a data processing method thereof, and a program.
- FIG. 1 is a block diagram showing a configuration example of an order system 1 using an information processing apparatus according to an embodiment of the present invention.
- The order system 1 of the present embodiment includes a smartphone 10, which is a mobile terminal used by a user, an order receiving device 80 provided in the store 5, a printer 82 connected to the order receiving device 80, a server device 60 capable of communicating with the smartphone 10 and the order receiving device 80 via the network 3, and a database 50 (shown as "DB" in the figure) connected to the server device 60.
- In the order system 1, a user holds a mobile terminal such as the smartphone 10 over a menu 7 and can order products on a preview screen 9 on which the captured video is displayed in real time.
- Although this embodiment describes a case where the information processing apparatus is used for the order system 1 in a restaurant or the like, the present invention is not limited to this.
- The system according to the present embodiment may be any system that recognizes images from video data obtained by photographing an imaging target on which a plurality of arbitrary items are presented, and that presents additional data related to the images to the user.
- the smartphone 10 is described as an example of a mobile terminal used by a user, but the present invention is not limited to this.
- The mobile terminal of the present invention may be, in addition to a smartphone, a mobile radio communication terminal such as a mobile phone, a PDA (Personal Digital Assistant), a tablet terminal, a game machine, or another electronic device.
- In addition to a terminal carried by the user, the mobile terminal of the present invention may be a terminal deployed in a store, a product presentation venue, or the like, which can be shared by the users who visit that place.
- The object photographed by the user holding the smartphone 10 is not limited to the menu 7 or a paper medium such as a product catalog, a flyer, or a magazine advertisement; it may also be a product or product model displayed in a show window or on a product shelf, a screen of a terminal such as a personal computer on which a digital product catalog or the like is displayed, or product information displayed on digital signage installed on a street or in a store.
- Although the order receiving device 80 is installed in the store in this embodiment, the present invention is not limited to this.
- The order receiving means of the present invention may be a virtual store provided on the server device 60 or an online shop on a website.
- In particular, when an imaging target on which a plurality of items are arranged adjacent to each other is photographed part by part while the direction and position of the camera are changed, and the result is browsed sequentially on a mobile-sized screen such as that of the smartphone 10, the information processing apparatus can improve the listability of the additional information of the plurality of items.
- The items presented on the imaging target can include not-for-sale items such as exhibits and prototypes, in addition to items that are products.
- The items can also include product options, such as sushi served without wasabi, selectable parts that make up a product, such as aero parts of an automobile, and combinations thereof.
- The items may include services provided by various types of businesses, options for those services, designation of the date and time for receiving a service, designation of a person in charge, and the like.
- The items may also be a plurality of choices presented to the user, from which the user designates one or more, for example the answer choices of a questionnaire or a quiz.
- FIG. 2 is a functional block diagram showing a logical configuration of the information processing apparatus 100 according to the embodiment of the present invention.
- The information processing apparatus 100 includes a video data acquisition unit 102 that sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit, an image recognition unit 104 that detects that the video data includes at least a part of a pre-registered image and recognizes the image, an additional data storage unit 106 that stores additional data corresponding to each of the plurality of images presented on the imaging target, an additional data extraction unit 108 that extracts from the additional data storage unit 106 the additional data of each image recognized by the image recognition unit 104, and an additional data display unit 110 that displays the additional data of each image extracted by the additional data extraction unit 108 at a position corresponding to each image in the video data.
- The information processing apparatus 100 can be realized by the user's portable terminal (smartphone 10), by the server device 60 that can communicate with the user's portable terminal (smartphone 10), or by a combination thereof.
- At least a part of the functions of the information processing apparatus 100 can be realized on the smartphone 10 by installing in advance an application program that implements the information processing apparatus 100 according to the embodiment of the present invention.
- Alternatively, a web page may be provided on a web server (not shown) and accessed by the user with the smartphone 10, so that the smartphone 10 can use the functions of the information processing apparatus 100.
- FIG. 4 is a block diagram showing a hardware configuration of the smartphone 10 that realizes the user portable terminal according to the embodiment of the present invention.
- The smartphone 10 of this embodiment includes a CPU (Central Processing Unit) 12, a ROM (Read Only Memory) 14, a RAM (Random Access Memory) 16, a mobile phone network communication unit 18, a wireless LAN (Local Area Network) communication unit 20, an operation unit 22, an operation reception unit 24, a display unit 26, a display control unit 28, an imaging unit 30, a speaker 32, a microphone 34, and an audio control unit 36.
- the CPU 12 is connected to each element of the smartphone 10 via the bus 40 and controls the entire smartphone 10 together with each element.
- the ROM 14 stores programs and various application programs for operating the smartphone 10, various setting data used when these programs operate, and user data including address data and various content data.
- the RAM 16 has an area for temporarily storing data, such as a work area for operating the program.
- Each component of the smartphone 10 is realized by an arbitrary combination of hardware and software of an arbitrary computer including the CPU 12, the RAM 16, a program loaded into the RAM 16 that realizes at least a part of the components of FIG. 2, the ROM 14 that stores the program, and network connection interfaces (the mobile phone network communication unit 18 and the wireless LAN communication unit 20). It will be understood by those skilled in the art that there are various modifications to the implementation method and apparatus.
- the functional block diagram of each embodiment described below shows a logical functional unit block, not a hardware unit configuration.
- The ROM 14 and the RAM 16 described above may be replaced by other devices having a function of storing application programs, setting data used when the programs operate, temporarily stored data, user data, and the like, such as a flash memory or a disk drive.
- the operation unit 22 includes operation keys, operation buttons, switches, a jog dial, a touch pad, a touch panel integrated with the display unit 26, and the like.
- the operation reception unit 24 receives an operation of the operation unit 22 by the user and notifies the CPU 12 of the operation.
- the display unit 26 includes an LED (Light Emitting Diode) display, a liquid crystal display, an organic EL (ElectroLuminescence) display, and the like.
- the display control unit 28 displays various screens on the display unit 26 in accordance with instructions from the CPU 12.
- the voice control unit 36 performs voice output from the speaker 32 and voice input from the microphone 34 in accordance with instructions from the CPU 12.
- The smartphone 10 connects to and communicates with a base station (not shown) of a mobile communication network (not shown), for example a 3G (3rd generation) mobile phone system, via the mobile phone network communication unit 18 and a mobile phone network antenna 19.
- the smartphone 10 is connected to the network 3 (FIG. 1) such as the Internet from the mobile communication network, and can communicate with the server device 60 (FIG. 1).
- the wireless LAN communication unit 20 performs wireless LAN communication with a relay device (not shown) via a wireless LAN antenna 21 by a method compliant with, for example, IEEE 802.11 standard.
- The smartphone 10 performs wireless LAN communication with a relay device installed in the store 5 using the wireless LAN communication unit 20, connects to a premises network (not shown), and can communicate with the order receiving device 80 connected to that network.
- FIG. 5 is a block diagram showing a hardware configuration of server device 60 of ordering system 1 according to the embodiment of the present invention.
- the server device 60 of the present embodiment can be realized by a server computer or a personal computer connected to the database 50 (FIG. 1), or a device corresponding to them.
- the server device 60 may be configured with a virtual server or the like.
- Each component of the server device 60 of the order system 1 is realized by an arbitrary combination of hardware and software of an arbitrary computer including a CPU 62, a RAM 66, a program loaded into the RAM 66 that realizes at least a part of the components of FIG. 2, a ROM 64 that stores the program, and an I/O (Input/Output) 68 including a network connection interface.
- The CPU 62 is connected to each element of the server device 60 via a bus 69 and controls the entire server device 60 together with each element. It will be understood by those skilled in the art that there are various modifications to the implementation method and apparatus.
- the functional block diagram of each embodiment described below shows a logical functional unit block, not a hardware unit configuration.
- the server device 60 can also be connected to an input / output device (not shown) via the I / O 68.
- The video data acquisition unit 102 sequentially acquires video data in which at least a part of an imaging target presenting images of a plurality of products is captured by an imaging unit (the imaging unit 30 of the smartphone 10 in FIG. 4).
- The user holds the smartphone 10 over the menu 7 (FIG. 3), and at least a part of the images of the plurality of products presented on the menu 7 or the like is displayed as a real-time live view on the preview screen 9 (FIG. 3) of the display unit 26 (FIG. 4) of the smartphone 10.
- The video data acquisition unit 102 captures at least a part of the imaging target and sequentially acquires video data of a size that can be displayed on a mobile-terminal-sized screen.
- the video data acquisition unit 102 is configured to be realized by a camera (such as the imaging unit 30 in FIG. 4) built in or connected to the smartphone 10, but is not limited thereto.
- the video data acquisition unit 102 can also be realized by the server device 60.
- the video data acquired by the video data acquisition unit 102 of the server device 60 may be streamed to the user's smartphone 10 and displayed on the display unit 26 (FIG. 4) of the smartphone 10.
- the server device 60 may be remotely operated from the smartphone 10 side, and the video data captured by the video data acquisition unit 102 of the server device 60 may be streamed to the smartphone 10 and displayed on the display unit 26 of the smartphone 10.
- For example, a shop show window may be photographed remotely from the smartphone 10 using a live camera to acquire video data, and the acquired video may be streamed to the smartphone 10 via the server device 60 and displayed on the display unit 26 of the smartphone 10.
- The image recognition unit 104 specifies a plurality of products included in the video data from the product images included in the video data, and acquires identification information of each product. For example, the image recognition unit 104 recognizes the images of a plurality of products from the video data and extracts feature points of the recognized product images. The image recognition unit 104 then identifies each product based on the extracted feature point data. It is desirable that the image recognition unit 104 can simultaneously identify the images of a plurality of products from the video data. Further, the image recognition unit 104 specifies the position of each product image on the display screen of the video data. In this embodiment, an example of specifying products has been described, but in principle the image recognition unit 104 detects that the video data includes at least a part of a pre-registered image and recognizes the image.
- In the first method, the image recognition unit 104 collates feature quantities in the video image with the feature quantities (feature point data) of the product images registered in advance in the image table 460 (FIG. 27) by pattern recognition or the like, detects areas where at least a part matches, specifies the position and image of each such area, and specifies the product corresponding to the image.
- In the second method, the image recognition unit 104 detects by pattern recognition or the like that a plurality of product images are included in the video image, separates the areas considered to be individual product images, extracts feature quantities for each separated area, compares them with the feature quantities (feature point data) in the image table 460 (FIG. 27), specifies the position and image of the areas that at least partially match, and specifies the product corresponding to each image.
- The feature quantities (feature point data) of the product images in the image table 460 (FIG. 27) and the feature quantities in the video image do not need to match completely; if at least a part of them matches, the image recognition unit 104 can specify the image. Note that the first method is preferable because it does not require the process of dividing the video image into individual product image areas.
- the image recognition unit 104 can simultaneously identify a plurality of images from video data. Further, the image recognition unit 104 specifies the position on the display screen of the image on the video data.
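- As a rough illustration of the first approach (matching registered feature point data directly against the whole frame), the sketch below uses OpenCV ORB features; the product numbers, file names, and match thresholds are assumptions for the example, not values defined by the patent.

```python
# Sketch of direct feature-point matching: registered product descriptors vs. the whole video frame.
import cv2
import numpy as np

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# "Image table": product number -> (keypoints, descriptors) of the registered product image.
registered = {}
for product_no, path in [("0001", "pasta_a.jpg"), ("0002", "salad_b.jpg")]:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue                                          # placeholder file names; skip if absent
    registered[product_no] = orb.detectAndCompute(img, None)

def recognize(frame_gray, min_matches=25):
    """Return [(product_no, (x, y))] for registered images at least partly present in the frame."""
    frame_kp, frame_des = orb.detectAndCompute(frame_gray, None)
    hits = []
    if frame_des is None:
        return hits
    for product_no, (_, prod_des) in registered.items():
        if prod_des is None:
            continue
        matches = matcher.match(prod_des, frame_des)
        good = [m for m in matches if m.distance < 50]    # a partial match is enough
        if len(good) >= min_matches:
            pts = np.float32([frame_kp[m.trainIdx].pt for m in good])
            x, y = pts.mean(axis=0)                       # rough on-screen position of the image
            hits.append((product_no, (int(x), int(y))))
    return hits
```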
- the image recognition unit 104 can be realized by either the smartphone 10 or the server device 60.
- the information processing apparatus 100 holds an image table 460 (FIG. 27) that associates feature points of product images with product identification information.
- The image table 460 can be stored in the database 50 of the server device 60, in the ROM 14 of the smartphone 10, or in a recording medium attached to and readable by the smartphone 10 (hereinafter these are also collectively abbreviated as the "memory of the smartphone 10").
- the information processing apparatus of the present invention may have a configuration in which the constituent elements of the image recognition unit 104 are allocated to the smartphone 10 and the server apparatus 60 in any combination.
- The image recognition unit 104 realizes the following functions.
- (a) A function for recognizing that a plurality of product images are included in the captured image
- (b) A function for dividing the captured image into individual product images
- (c) A function for extracting feature points for each product image
- (d) A function for identifying the corresponding product from the extracted feature points
- the following five function sharing combinations can be considered.
- (1) All functions are realized by the smartphone 10.
- (2) Function (a) is realized by the smartphone 10, the result is transmitted to the server device 60, and functions (b) to (d) are realized by the server device 60.
- (3) Functions (a) to (b) are realized by the smartphone 10, the result is transmitted to the server device 60, and functions (c) to (d) are realized by the server device 60.
- (4) Functions (a) to (c) are realized by the smartphone 10, the result is transmitted to the server device 60, and function (d) is realized by the server device 60.
- (5) All functions are realized by the server device 60.
- When the image recognition unit 104 is configured to detect images using the second method, it realizes the following functions.
- (e) A function for extracting feature points from the captured image
- (f) A function for collating the extracted feature points with the feature point data registered in advance for each product image
- (g) A function for specifying, from the collation result, the position of each product image included in the captured image and the type of product
- the following four combinations of function sharing can be considered.
- (1) All functions are realized by the smartphone 10.
- (2) The function (e) is realized by the smartphone 10, the result is transmitted to the server device 60, and the functions (f) to (g) are realized by the server device 60.
- (3) The functions (e) to (f) are realized by the smartphone 10, the result is transmitted to the server device 60, and the function (g) is realized by the server device 60.
- (4) All functions are realized by the server device 60.
- the image table 460 stores product numbers, which are product identification information, and feature point data of product images in association with each other.
- Thereby, the image recognition unit 104 can acquire the product identification information corresponding to the feature points of a product image extracted from the video data. Further, the image table 460 may hold image feature point data together with the additional data corresponding to that feature point data. In that configuration, the image recognition unit 104 or the additional data extraction unit 108 can extract the additional data corresponding to the feature point data of an image by referring to the image table 460, without referring to the product master 430 (FIG. 24, described later) in the additional data extraction process (step S107 in FIG. 7, described later).
- the additional data storage unit 106 stores additional data corresponding to each of a plurality of images presented to the photographing target.
- a product master 430 as shown in FIG. 24 is held in the database 50 of the server device 60.
- The product master 430 can include, for example, product identification information, a product number, a product name, a product unit price, a tax-inclusive price, allergic substance information for the ingredients contained in a food product, coupon information related to the product, and the like.
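- As an illustration, one possible in-memory shape for such a product master record is sketched below; the field names and sample values are hypothetical.

```python
# An illustrative (hypothetical) shape for a product master 430 record; field names are assumptions.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProductMasterRecord:
    product_no: str                                        # product identification information / number
    name: str                                              # product name
    unit_price: int                                        # product unit price (yen)
    price_with_tax: int                                    # tax-inclusive price (yen)
    allergens: List[str] = field(default_factory=list)     # allergic substances in the ingredients
    coupon: Optional[str] = None                           # coupon information related to the product

product_master = {
    "0001": ProductMasterRecord("0001", "Seasonal pasta", 780, 842, allergens=["wheat", "egg"]),
    "0002": ProductMasterRecord("0002", "Carrot salad", 450, 486, coupon="10% off on weekdays"),
}
```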
- the additional data storage unit 106 is included in the database 50 of the server device 60, but may be included in the memory of the smartphone 10.
- the update information of the additional data may be transmitted from the server device 60 to the smartphone 10, and the information processing device 100 may be configured to update the additional data held by the smartphone 10.
- the information processing apparatus 100 may be configured to selectively download additional data required by the user from the server apparatus 60 to the smartphone 10 and store it in the additional data storage unit 106.
- the additional data extraction unit 108 extracts additional data of each product specified by the image recognition unit 104 from the additional data storage unit 106 based on the product identification information acquired by the image recognition unit 104.
- Alternatively, the additional data storage unit 106 may be included in both the smartphone 10 and the server device 60. In that case, the additional data extraction unit 108 may first search the additional data storage unit 106 in the smartphone 10 for the additional data of a product and, if it cannot be extracted there, access the server device 60 to acquire the additional data.
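- A minimal sketch of this local-first lookup with a server fallback follows; the local store contents and the server query are hypothetical stand-ins.

```python
# Local-first extraction with a server fallback (hypothetical stores and query).
from typing import Dict, Optional

local_store: Dict[str, dict] = {"0001": {"name": "Seasonal pasta", "price_with_tax": 842}}

def fetch_from_server(product_no: str) -> Optional[dict]:
    """Placeholder for querying the additional data storage unit held by the server device 60."""
    server_store = {"0002": {"name": "Carrot salad", "price_with_tax": 486}}
    return server_store.get(product_no)

def extract_additional_data(product_no: str) -> Optional[dict]:
    data = local_store.get(product_no)            # 1) try the smartphone-side storage first
    if data is None:
        data = fetch_from_server(product_no)      # 2) fall back to the server device 60
        if data is not None:
            local_store[product_no] = data        # cache locally for subsequent frames
    return data
```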
- the additional data display unit 110 displays the additional data of each product extracted by the additional data extraction unit 108 at a position corresponding to each product in the video data.
- the additional data display unit 110 expresses the additional data of the product with, for example, various objects such as an icon, a balloon including a text, a pop-up window, or a replacement image, and displays the data on the video data.
- In this way, the additional data storage unit 106 stores additional data corresponding to each of a plurality of images, and the additional data extraction unit 108 can extract the additional data corresponding to each image from the additional data storage unit 106.
- the additional data display unit 110 displays the additional data of each image at a position corresponding to each image of the video data. Further, the additional data display unit 110 may be configured to display the additional data at a predetermined position in the screen or the entire screen regardless of the position corresponding to each image of the video data.
- The image recognition unit 104, the additional data extraction unit 108, and the additional data display unit 110 can use, for example, augmented reality (AR) technology, which can additionally present computer-generated information in a real environment photographed by a camera or the like. If AR is used, the information processing apparatus 100 recognizes a three-dimensional coordinate system whose XY plane is the area in which a specific pre-registered image, such as a product image, appears in the image captured by a camera such as that of the smartphone 10, and can display the corresponding additional data on the display unit 26, for example as a 3D object.
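- As a simplified stand-in for such an overlay (a flat text balloon rather than a full AR 3D object), the following sketch draws each product's additional data at its recognized on-screen position using OpenCV drawing calls; the label contents are illustrative.

```python
# Draw a text balloon with each product's additional data at its recognized position.
import cv2

def overlay_additional_data(frame_bgr, recognitions, additional_data):
    """recognitions: [(product_no, (x, y))]; additional_data: product_no -> dict."""
    for product_no, (x, y) in recognitions:
        info = additional_data.get(product_no)
        if not info:
            continue
        label = f"{info['name']} {info['price_with_tax']} yen"
        (w, h), _ = cv2.getTextSize(label, cv2.FONT_HERSHEY_SIMPLEX, 0.5, 1)
        cv2.rectangle(frame_bgr, (x, y - h - 10), (x + w + 8, y), (255, 255, 255), -1)  # balloon body
        cv2.putText(frame_bgr, label, (x + 4, y - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 0), 1, cv2.LINE_AA)
    return frame_bgr
```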
- The additional data to be displayed may include recommendation information, discount information, coupon information, calories, raw materials (allergy information), the URL of a web page on which detailed product information is posted, product review information, evaluations by customers or critics, or optional service information.
- the additional data may include, as information unique to the store, an area-limited menu, a store manager recommended menu, a limited-time store visit point, product sold-out information, and the like.
- The additional data may also include information related to the date and the weather, such as daily specials, time-limited sales (by day of week, week, or month), seasonal sales, and rainy-day sales.
- the additional data may include a product name and price that are basic data of the product.
- the additional data may be any combination of data as shown in the above example.
- the additional data displayed in the video data may be an icon indicating that additional data exists.
- The additional data display unit 110 may display detailed information on a product when an operation on its icon is received. That is, the additional data display unit 110 can display additional data about products that cannot be printed in a product catalog or the like. In this way, the information processing apparatus 100 publishes on the imaging target only the minimum necessary information, or only the information about products to which attention should be drawn, and presents detailed information only for products the user is interested in, which enables effective advertising and sales promotion.
- The information processing apparatus 100 of the order system 1 of this embodiment further includes a user identification information acquisition unit 112 and an order data transmission unit 114, as shown in FIG.
- the user identification information acquisition unit 112 acquires user identification information.
- For example, after an application that implements the information processing apparatus 100 of the present invention is started on the smartphone 10, a login screen 120 as shown in FIG. 11 may be displayed.
- The user identification information acquisition unit 112 may acquire the user identification information by accepting a pre-registered user ID and password (PW) on the login screen 120.
- Alternatively, the user identification information acquisition unit 112 may read and acquire terminal identification information separately assigned to the body of the user's portable terminal.
- The order data transmission unit 114 transmits order data including identification information of a product selected from the plurality of products included in the video displayed on the mobile-terminal-sized display unit 26, identified based on the composite video data obtained by the additional data display unit 110, together with the user identification information. Note that the product identification information and the user identification information are not strictly required; the order data transmission unit 114 may transmit at least data identifying the selected image.
- The user can select a product to order from among the plurality of products included in the video displayed on the display unit 26 (FIG. 4) of the smartphone 10 by operating the touch panel (the operation unit 22 of FIG. 4).
- the selection operation can be received by the operation receiving unit 24 of the smartphone 10 of FIG.
- an operation button for placing an order is displayed beside the product image, and the user can easily place an order for the product by pressing the button.
- The user points the imaging unit 30 at at least a part of an imaging target on which a large number of products are posted, such as a product catalog or the menu 7, displays the images of a plurality of products on the portable-sized display unit 26 of the smartphone 10, and selects a product from them. Details of the selection operation will be described later.
- While panning the imaging unit 30 of the smartphone 10 across the range of the imaging target, the user can check the additional data displayed on the real-time video on the display unit 26, review the products of the entire imaging target, and select a product from the list.
- Particularly when a product is selected and ordered from an imaging target, such as a catalog or menu, in which images of a plurality of products are presented in a list, the information processing apparatus 100 of the present invention can display additional data so that a product to be promoted stands out in the list and attracts attention.
- The information processing apparatus 100 can further include a confirmation receiving unit (not shown) that temporarily stores the information (product orders) of the plurality of images selected via the order data transmission unit 114 in a storage unit (tray), presents the stored orders to the user, and receives a confirmation operation.
- In that case, the order data transmission unit 114 transmits the information (orders) of the plurality of images after the confirmation receiving unit accepts the confirmation operation. That is, the order data transmission unit 114 can accept orders for a plurality of products (images) and then transmit the orders (image information) in the tray all at once.
- When the order data transmission unit 114 is realized by the smartphone 10, for example, the order data is transmitted from the smartphone 10 to the order receiving device 80 (FIG. 1). The order receiving device 80 can then print out the order data on the printer 82 (FIG. 1) installed in the kitchen of the store 5 or the like. By printing out the order data, the user's order is automatically conveyed from the smartphone 10 to the kitchen.
- the order data transmitting unit 114 can transmit the order data to the order receiving device 80 via a relay terminal installed in the store 5 using, for example, the wireless LAN communication unit 20 of the smartphone 10 of FIG.
- the order data transmission unit 114 can be realized by the server device 60.
- the order data transmission unit 114 may transmit the order data ordered by the smartphone 10 to the server device 60 and transmit the order data from the server device 60 to the order reception device 80.
- the order data transmission unit 114 can upload the order data to the server device 60 using the mobile phone network communication unit 18 of the smartphone 10 of FIG.
- the order data transmitting unit 114 can transmit order data from the server device 60 to the order receiving device 80 via the network 3.
- the network 3 between the server device 60 and the order receiving device 80 is not particularly limited.
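- To make the order data concrete, the sketch below shows one possible JSON payload and upload path (for example, from the smartphone 10 to the server device 60, which then forwards it to the order receiving device 80); the field names and the endpoint URL are assumptions, not defined by the patent.

```python
# Illustrative order data payload and upload (hypothetical JSON fields and endpoint).
import json
import urllib.request

order_data = {
    "user_id": "user001",                 # user identification information
    "table_no": "12",
    "items": [                            # orders accumulated in the tray
        {"product_no": "0001", "quantity": 2},
        {"product_no": "0002", "quantity": 1},
    ],
}

def send_order(order: dict, endpoint: str = "https://example.com/api/orders") -> int:
    """Upload the order data; the receiving side can forward it to the order receiving device 80."""
    body = json.dumps(order).encode("utf-8")
    req = urllib.request.Request(endpoint, data=body,
                                 headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```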
- The computer program according to the present embodiment is written to cause a computer that implements the information processing apparatus 100 to execute a procedure for sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit, a procedure for detecting that the video data includes at least a part of a pre-registered image and recognizing the image, a procedure for extracting the additional data of each recognized image from the additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target, and a procedure for displaying the additional data of each extracted image.
- the computer program of this embodiment may be recorded on a computer-readable recording medium.
- the recording medium is not particularly limited, and various forms can be considered.
- the program may be loaded from a recording medium into a computer memory, or downloaded to a computer through a network and loaded into the memory.
- FIG. 10 is a diagram illustrating an example of a screen configuration of the smartphone 10 of the information processing apparatus 100 according to the embodiment of the present invention.
- The screen configuration of the information processing apparatus 100 includes a login screen 120, a main screen 130, a table number screen 140, an order screen 150, a tray screen 170, an order confirmation screen 190, and a deletion confirmation screen 192.
- the login screen 120 is displayed on the display unit 26 (FIG. 4) of the smartphone 10.
- the login screen 120 includes a user ID input field 122, a password input field 124, a login button 126, and an end button 128, as shown in FIG.
- the user ID and password registered in advance by the user are input to the user ID input field 122 and the password input field 124 of the login screen 120 and the login button 126 is pressed, so that the display screen changes to the main screen 130.
- The password is checked by a user authentication process (step S401). If there is an error (Y in step S401), a message indicating that the password is incorrect is displayed (step S403), and the display screen returns to the login screen 120.
- If there is no error, the display screen transitions to the main screen 130.
- the user authentication process can be performed by the server device 60.
- the server device 60 holds a user master 400 shown in FIG.
- the server device 60 can perform user authentication processing with reference to the user master 400. If the end button 128 is pressed on the login screen 120, an end confirmation message is displayed (step S405). When the end confirmation is obtained (Y in step S407), this application is ended. When the end is canceled (N in step S407), the display screen returns to the login screen 120.
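- A minimal sketch of this authentication check follows; the user master contents and the hashing scheme are illustrative assumptions rather than details given in the patent.

```python
# Illustrative user authentication against a user master table (hypothetical contents and hashing).
import hashlib

USER_MASTER_400 = {  # user ID -> salted password hash (illustrative)
    "user001": hashlib.sha256(b"salt:correct-horse").hexdigest(),
}

def authenticate(user_id: str, password: str) -> bool:
    """Return True when the ID/password pair matches the user master 400 held by the server device 60."""
    expected = USER_MASTER_400.get(user_id)
    supplied = hashlib.sha256(f"salt:{password}".encode()).hexdigest()
    return expected is not None and expected == supplied

if not authenticate("user001", "wrong-password"):
    print("password error: return to the login screen 120 (step S403)")   # Y in step S401
else:
    print("transition to the main screen 130")
```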
- the main screen 130 includes a store name display unit 132, a table number display unit 133, a user name display unit 134, a point number display unit 135, a table number button 137, and an order button 138. , A tray button 139 and an end button 128.
- The information shown in the store name display unit 132, the user name display unit 134, and the point number display unit 135 of the main screen 130 can be displayed by referring to the user master 400 of FIG. 22, the store master 420, and the like stored in the database 50 of the server device 60.
- When the table number button 137 is pressed, the display screen transitions to the table number screen 140.
- When a table number has been entered, the display screen returns to the main screen 130, where the information of the table number display unit 133 may be displayed.
- When the pressing of the order button 138 is accepted, the display screen transitions to the order screen 150.
- When the return button 144 is pressed on the table number screen 140, the display screen returns to the main screen 130.
- If the end button 128 is pressed on the main screen 130, an end confirmation message is displayed (step S409).
- When the end is confirmed (Y in step S411), the application is terminated.
- When the end is canceled (N in step S411), the display screen returns to the main screen 130.
- FIG. 13 is a diagram illustrating an example of an order screen 150 when the smartphone 10 is held over the menu 7.
- additional data such as a store manager's recommended menu obtained by recognizing images of a plurality of products included in the video of the menu 7 is displayed by a balloon 145.
- an order completion mark 146 is displayed on the product for which the order operation has been completed.
- the ordered quantity may be displayed here.
- The order screen 150 can include a product image recognition area 152 in which the images of a plurality of products included in the video of the menu 7 (FIG. 1) are individually recognized, an additional data display section 153 that displays the additional data of the specified products, and a determination button 154 for confirming the order of a product.
- Operations on the operation section 155, which has various operation buttons provided beside the display of the smartphone 10, are also accepted by the operation reception unit 24 (see FIG. 4).
- Here, the return button 156 and the menu button 157 of the operation section 155 are used.
- When the determination button 154 is pressed, the product order is accepted and the order information is stored in the tray table 440 of FIG. 25.
- the tray table 440 in FIG. 25 can be held in either the smartphone 10 or the server device 60.
- When the return button 156 is pressed, the display screen returns to the main screen 130.
- When the menu button 157 is pressed, the display transitions to the submenu window 151, and a tray button 158 and a main button 159 are displayed at the bottom of the order screen 150.
- When the tray button 158 is pressed, the display screen transitions to the tray screen 170.
- When the main button 159 is pressed, the display screen returns to the main screen 130. Alternatively, the display screen transitions to the tray screen 170 when the tray button 139 is pressed on the main screen 130.
- the tray screen 170 includes a menu image field 171 and a menu content field 172 as shown in FIG.
- the tray stores the current order reservation status.
- the current order reservation status stored in the tray is displayed. Note that the tray can be provided in the memory of the smartphone 10 or the server device 60 as described above.
- In the menu image field 171, menu images 173 of the products ordered on the order screen 150 are displayed.
- The menu content field 172 includes the menu name 174 of each ordered product, a quantity display section 175 that displays the ordered quantity, an amount display section 178 that displays the price, and a delete button 179 operated when canceling the menu item.
- FIG. 15 shows an example in which three menus A, B, and C are selected. Further, the order quantity can be increased or decreased by an increase button 177 (+: plus) and a decrease button 176 (-: minus).
- the menu image column 171 and the menu content column 172 can scroll a plurality of lines.
- the tray screen 170 includes a menu total number display unit 182, a total quantity display unit 183, and a total amount display unit 184.
- the total menu number display area 182 displays the total number of menus in the tray.
- the total quantity display portion 183 displays the total quantity of all menus in the tray.
- the total amount display portion 184 displays the total amount of all menus in the tray.
- The tray screen 170 includes an order button 186, an order button 187, a main button 188, and an end button 128. The tray screen 170 may also be provided with buttons for registering an ordered menu item as a favorite ("My Menu") or as a standing item to be ordered automatically. Each registration may be accepted on the tray screen 170, and the smartphone 10 may transmit the registration information to the server device 60 to be registered in the database 50. These pieces of information can be held in the database 50 as the user attribute information 450 in FIG.
- A screen for final confirmation of the order contents is displayed (step S413). If the order is confirmed (Y in step S415), the order data is accepted and the display screen returns to the main screen 130. At this time, the order data transmission unit 114 transmits the order data to the order receiving device 80, and the order contents are printed out on the printer 82 (kitchen printer output 194 in FIG. 10). If the transmission succeeds, the order contents in the tray are cleared. If the order is not confirmed (N in step S415), the display screen returns to the tray screen 170. In this example, the order data is output to the kitchen printer, but output by the kitchen printer is not always necessary; any means of conveying the order contents to the staff in the store, such as display on a screen or announcement by voice, may be used.
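- A compact sketch of this confirmation-and-dispatch flow (steps S413 to S415) is shown below; `confirm_with_user` and `kitchen_output` are hypothetical stand-ins for the confirmation screen and the kitchen printer.

```python
# Final confirmation flow: confirm the tray, transmit the order, output it for the kitchen, clear the tray.
tray = [{"product_no": "0001", "quantity": 2}, {"product_no": "0002", "quantity": 1}]

def confirm_with_user(items) -> bool:
    """Stand-in for the final confirmation screen (steps S413/S415)."""
    return bool(items)   # assume the user approves when the tray is not empty

def kitchen_output(items) -> None:
    """Stand-in for the kitchen printer 82; a screen display or voice announcement would also do."""
    for item in items:
        print(f"ORDER  product {item['product_no']}  x{item['quantity']}")

if confirm_with_user(tray):             # Y in step S415
    kitchen_output(tray)                # transmit to the order receiving device 80 and print
    tray.clear()                        # clear the tray after successful transmission
else:                                   # N in step S415
    print("back to the tray screen 170")
```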
- When the delete button 179 is pressed, a deletion confirmation message is displayed (step S417), the corresponding menu item is deleted, and the display screen returns to the tray screen 170.
- the menu may be deleted after receiving a deletion confirmation operation.
- When the main button 188 is pressed, the display screen returns to the main screen 130. At this time, it is desirable to retain the contents of the tray.
- When the end button 128 is pressed, an end confirmation message is displayed (step S419).
- When the end is confirmed (Y in step S421), the application is terminated.
- When the end is canceled (N in step S421), the display screen returns to the tray screen 170.
- FIG. 6 is a flowchart illustrating an example of the operation of the information processing apparatus 100 according to the present embodiment.
- As shown in FIG. 6, the information processing apparatus 100 sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit (steps S101 and S103), detects that the video data includes at least a part of a pre-registered image and recognizes the image (step S105), extracts the additional data of each recognized image from the additional data storage unit 106 that stores additional data corresponding to each of the plurality of images presented on the imaging target (step S107), and displays the additional data of each extracted image (step S109).
- More specifically, the video data acquisition unit 102 captures at least a part of the imaging target on which the images of a plurality of products are presented using the imaging unit (step S101), and sequentially acquires the captured video data (step S103).
- the image recognition unit 104 specifies a plurality of products included in the video data from the product image included in the video data, and acquires identification information of each product (step S105).
- The additional data extraction unit 108 extracts the additional data of each product specified by the image recognition unit 104 from the additional data storage unit 106 based on the product identification information acquired by the image recognition unit 104 (step S107). Then, the additional data display unit 110 displays the additional data of each product extracted by the additional data extraction unit 108 at a position corresponding to each product in the video data (step S109).
- the information processing apparatus 100 can be realized by the smartphone 10 and the server apparatus 60.
- Various ways of sharing functions between the smartphone 10 and the server device 60 can be considered.
- Below, examples employing the second of the image detection methods described above are shown.
- FIG. 7 is an example in which all of the configuration of FIG. 2 is realized by the smartphone 10. In this example, all of steps S101 to S109 of the processing procedure shown in FIG. 6 are performed by the smartphone 10. Then, the user identification information acquisition unit 112 acquires the user identification information (step S111).
- The order data transmission unit 114 transmits order data including the identification information of the product selected from among the plurality of products included in the video displayed on the mobile-terminal-sized display unit, based on the composite video data obtained by the additional data display unit 110, together with the user identification information (step S113).
- The order receiving device 80 receives the order data transmitted by the smartphone 10 (step S121), and the order process is completed. At this time, the smartphone 10 can transmit the order data to the order receiving device 80 using the wireless LAN communication unit 20 of FIG. 4.
- FIG. 8 is an example in which the smartphone 10 side performs the product specifying process, and the server device 60 side extracts the product additional data.
- In this example, the server device 60 performs the extraction of the additional data of the products in step S107 of the order processing procedure described above.
- The order data is not transmitted from the smartphone 10 directly to the order receiving device 80, but is transmitted to the order receiving device 80 via the server device 60.
- Compared with the procedure of FIG. 6, steps S111, S113, and S121 and steps S201 to S211 are further included.
- the smartphone 10 transmits the product identification information acquired in step S105 to the server device 60 (step S201).
- The server device 60 receives the product identification information from the smartphone 10 (step S203), and extracts the additional data of each product from the additional data storage unit 106 based on the received product identification information (step S107).
- The server device 60 transmits the extracted additional data of the products to the smartphone 10 (step S205).
- The smartphone 10 receives the additional data of the products from the server device 60 (step S207), and the additional data display unit 110 displays the received additional data of each product at a position of the video data corresponding to that product (step S109).
- the user identification information acquisition unit 112 acquires the user identification information (step S111).
- The order data transmission unit 114 transmits order data including the identification information of the product selected from among the plurality of products included in the video displayed on the mobile-terminal-sized display unit, based on the composite video data obtained by the additional data display unit 110, together with the user identification information (step S113).
- In step S113, the order data is transmitted from the smartphone 10 to the server device 60.
- The server device 60 receives the order data (step S209) and transfers it to the order receiving device 80 (step S211).
- The smartphone 10 can transmit the order data to the server device 60 using the mobile phone network communication unit 18 of FIG. 4.
- the smartphone 10 uploads order data to the web server of the server device 60.
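- The division of work in FIG. 8 can be sketched as a simple client/server exchange. The code below is only a sketch: the REST endpoints, payload keys, and server URL are assumptions introduced for illustration and are not specified in this document.

```python
# Sketch of the FIG. 8 split: the smartphone sends product identification
# information to the server device 60 and relays order data through it.
import requests

SERVER = "https://server.example"  # placeholder for the server device 60

def fetch_additional_data(product_ids):
    """Smartphone side of steps S201/S207: send identification info, get additional data back."""
    resp = requests.post(f"{SERVER}/additional-data", json={"product_ids": product_ids})
    resp.raise_for_status()
    return resp.json()              # additional data per product (server steps S203/S107/S205)

def submit_order(selected_ids, user_id):
    """Smartphone side of step S113: order data is relayed via the server (steps S209/S211)."""
    payload = {"user_id": user_id, "product_ids": selected_ids}
    resp = requests.post(f"{SERVER}/orders", json=payload)
    resp.raise_for_status()
    return resp.json()
```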
- FIG. 9 is an example in which the smartphone 10 side performs up to the feature point extraction process of the product image, and the server device 60 side identifies the product and extracts the additional data from the feature point.
- In this example, the process of step S105 in FIG. 6 is divided between the smartphone 10 and the server device 60 and is performed as steps S301 and S307, and step S107 of FIG. 6 is performed by the server device 60.
- The order data is transmitted to the order receiving device 80 via the server device 60.
- Compared with the procedure of FIG. 6, steps S301 to S307, steps S205 to S211, and steps S111, S113, and S121 are further included.
- The smartphone 10 extracts the feature points of each product image from the product images included in the video data acquired in step S103 (step S301). The smartphone 10 then transmits the feature points of each product image to the server device 60 (step S303).
- The server device 60 receives the feature points of each product image from the smartphone 10 (step S305), identifies the plurality of products included in the video data from the received feature points, and acquires the identification information of each product (step S307).
- the additional data extraction unit 108 extracts the additional data of each identified product from the additional data storage unit 106 based on the acquired product identification information (step S107).
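- The feature-point split of FIG. 9 can be illustrated with a small sketch. OpenCV's ORB detector and brute-force matcher are used here only as one plausible choice; the document does not name a library, and the match thresholds below are illustrative assumptions.

```python
# Sketch of the FIG. 9 split: the phone extracts feature points (step S301) and the
# server matches them against registered product images (step S307).
import cv2

orb = cv2.ORB_create()

def extract_feature_points(frame_gray):
    """Smartphone side: keypoints/descriptors to be sent to the server (steps S301/S303)."""
    keypoints, descriptors = orb.detectAndCompute(frame_gray, None)
    return keypoints, descriptors

def identify_products(descriptors, registered):
    """Server side: match against pre-registered descriptors per product (step S307).
    `registered` maps product_id -> reference descriptors; thresholds are illustrative."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    results = []
    for product_id, ref_desc in registered.items():
        matches = matcher.match(descriptors, ref_desc)
        good = [m for m in matches if m.distance < 40]
        if len(good) > 20:
            results.append(product_id)
    return results
```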
- In the example of FIG. 7, the order data is transmitted directly from the smartphone 10 to the order receiving device 80, but it may instead be transmitted via the server device 60.
- In the examples of FIGS. 8 and 9, the order data is transmitted from the smartphone 10 to the order receiving device 80 via the server device 60, but it may instead be transmitted directly from the smartphone 10 to the order receiving device 80.
- As described above, with the information processing apparatus 100 of the embodiment of the present invention, it is possible to display information useful to the user for a plurality of products at a glance and to accept an order. The information processing apparatus 100 of this embodiment can be implemented in various configurations, for example on the smartphone 10 alone or by dividing its functions between the smartphone 10 and the server device 60.
- FIG. 17 is a functional block diagram showing a main configuration of the information processing apparatus 200 according to the embodiment of the present invention.
- the information processing apparatus 200 according to the present embodiment is different from the information processing apparatus 100 according to the above-described embodiment in that additional data corresponding to user circumstances (attributes) is extracted.
- In addition to the configuration of the above embodiment, the information processing apparatus 200 of the present embodiment further includes an attribute acquisition unit 202 and a user attribute information storage unit (denoted "user attribute information" in the figure) 204.
- The user attribute information storage unit 204 can hold, for example, the number of visits to the store and allergy information, as shown in the user attribute information 450.
- The user attribute information storage unit 204 also stores the number of points held by the user, the order history, products registered as favorites or for automatic ordering, other profile information (gender, age group, residential area, family structure, whether the user is on a diet, fields of interest, hobbies), the usage status of other stores, and the like.
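- One way to picture the contents of the user attribute information storage unit 204 is a simple record per user. The field names in the sketch below are assumptions chosen for illustration; the document only lists the kinds of information stored, not a schema.

```python
# Illustrative record held by the user attribute information storage unit 204.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class UserAttributes:
    user_id: str
    visit_count: int = 0                               # number of visits to the store
    allergies: Set[str] = field(default_factory=set)   # allergy information
    points: int = 0                                     # loyalty points held by the user
    order_history: List[str] = field(default_factory=list)
    favorites: List[str] = field(default_factory=list)  # favorite / automatic-order products
    profile: dict = field(default_factory=dict)          # gender, age group, area, hobbies, etc.
```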
- the attribute acquisition unit 202 acquires attribute information about the user.
- the attribute acquisition unit 202 refers to the user attribute information storage unit 204, for example, and identifies the user by the identification information acquired by the user identification information acquisition unit 112.
- The attribute acquisition unit 202 may also refer, as user attribute information, to information about the user registered in other databases, such as a product purchase history, and to information about the store currently being visited.
- the additional data extraction unit 108 extracts, for each image recognized by the image recognition unit 104, additional data corresponding to the user attribute information acquired by the attribute acquisition unit 202.
- the user attribute information includes allergy information.
- the additional data storage unit 106 stores allergen substance information included in each of the plurality of images as additional data.
- the additional data extraction unit 108 extracts allergic substance information included in the user's allergy information for each image recognized by the image recognition unit 104.
- Based on the allergic substance information extracted by the additional data extraction unit 108, the additional data display unit 110 identifies, from among the plurality of images recognized by the image recognition unit 104, the images for which allergic substance information that may cause an allergy in the user has been extracted, and displays a predetermined display element at a position of the video data corresponding to each identified image.
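- The allergy check described above can be summarized in a small sketch. Representing the allergic substances as sets and the mapping from the result to the display units is an assumption made for illustration; the document does not prescribe this structure.

```python
# Minimal sketch of the allergy check: compare a product's allergic substances
# against the user's allergy information and pick a display element.
def classify_allergy(product_substances, user_allergies):
    """Return 'NG' if the product contains a substance the user is allergic to,
    'OK' if substance data exists and nothing matches, 'UNKNOWN' if no data."""
    if product_substances is None:
        return "UNKNOWN"
    return "NG" if product_substances & user_allergies else "OK"

# Possible mapping from the classification to the display elements in the text;
# purely illustrative.
DISPLAY_ELEMENT = {
    "NG": "shaded overlay with x mark (allergy NG display unit 224)",
    "OK": "conspicuous highlight frame (allergy OK display unit 223)",
    "UNKNOWN": "lightly skeletonized overlay (allergy unknown display unit 226)",
}
```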
- For each image recognized by the image recognition unit 104, the additional data display unit 110 changes the visibility of the additional data according to the user attribute information acquired by the attribute acquisition unit 202.
- For a product that is allergy NG, the allergy NG display unit 224 is displayed superimposed on the product image.
- For a product that is allergy OK, the allergy OK display unit 223 is displayed superimposed on the product image.
- For a product whose allergy status is unknown, the allergy unknown display unit 226 is displayed superimposed on the product image.
- The allergy NG display unit 224 may, for example, be shaded so as to hide the original product image. The allergy NG display unit 224 may also show, with characters such as "NG" or a cross (x) mark 225, an image that lets the user recognize at a glance that the product should not be ordered.
- The allergy OK display unit 223 may display, for example, a frame that makes the original product image particularly conspicuous, or an image that draws attention.
- The allergy OK display unit 223 may also display the original product image three-dimensionally by rendering it in 3D.
- The allergy unknown display unit 226 may superimpose on the original product image a skeleton-processed version that is slightly harder to identify than the allergy OK display unit 223.
- Alternatively, the additional data display unit 110 may show, in a balloon display 241, marks such as a circle (○) in the case of allergy OK, a cross (×) in the case of allergy NG, or a triangle (△) when the allergy status is unknown.
- The additional data display unit 110 may also superimpose a blue frame in the case of allergy OK, a red frame in the case of allergy NG, or a gray frame when it is unknown whether the allergic substance is contained.
- The additional data extraction unit 108 may extract low-calorie products from among the products, and the additional data display unit 110 may display additional data recommending them to the user.
- the additional data extraction unit 108 may extract the user's preference from the user's order history or browsing history, and the additional data display unit 110 may display additional data that recommends a menu that matches the user's preference.
- The additional data display unit 110 may display, on a product image, additional data including an operation button for accepting a favorite registration from the user, or additional data indicating that the product is a registered or related product. For products already purchased, the additional data display unit 110 may display additional data indicating that they have been purchased. For a repeat product, the additional data display unit 110 may display additional data recommending repurchase.
- the visibility of the image may be changed step by step according to the user's needs.
- The product image may be made conspicuous by applying 3D processing to it as additional data, so that the more closely a product matches the user's needs, the more it appears to pop out.
- Conversely, the product image may be skeletonized as additional data, with the shading density increased as the user's needs decrease, making the product image harder to see.
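- This stepwise control of visibility can be expressed as a simple mapping. The 0 to 1 "need score" and the opacity formula below are assumptions used only to illustrate the idea of grading visibility by user needs.

```python
# Sketch of stepwise visibility control tied to how well a product matches the
# user's needs: higher need means less shading, lower need means more shading.
def shading_density(need_score: float) -> float:
    """Return a shading density in [0.2, 1.0]; 1.0 nearly hides the product image."""
    need_score = min(max(need_score, 0.0), 1.0)
    return 1.0 - 0.8 * need_score
```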
- A menu containing ingredients the user dislikes can be displayed in the same way as the allergy information described above, to alert the user.
- A note such as "can be served without pickles" may be added.
- Furthermore, when an "encouragement campaign" is held to help small children overcome their dislikes, a mark such as "Carrot-conquering support!" may be displayed on menus containing a registered disliked ingredient, and when the mark is selected, an explanation such as "If you do your best and eat it all, you will get an extra toy with the kids' lunch!" may be added.
- According to the present embodiment, the same effects as those of the above embodiment are obtained, and the at-a-glance presentation of a plurality of products is further improved. Because items suited to the user's circumstances can be presented in an easily visible way, the user can readily identify products suitable for him or her. Products to be promoted with particular priority can also be recognized at a glance. Conversely, products with low relevance to the user can be excluded from the options in the product list, so the necessary products can be identified quickly.
- FIG. 20 is a functional block diagram showing a main configuration of the information processing apparatus 300 according to the embodiment of the present invention.
- The information processing apparatus 300 according to the present embodiment differs from the information processing apparatus 100 of the above embodiment in that, according to a predetermined condition, switching additional data that differs, for example, by time zone or by day is displayed at the position of the product image.
- the information processing apparatus 300 of the present embodiment further includes an additional data storage unit 306 having switching additional data 308.
- The present embodiment can also include the configuration of the information processing apparatus 200 of the above embodiment shown in FIG. 17.
- the additional data storage unit 306 stores, as additional data for at least one image, switching additional data 308 that differs depending on preset conditions.
- the additional data extraction unit 108 extracts the switching additional data 308 corresponding to the condition for at least one image recognized by the image recognition unit 104.
- the additional data display unit 110 displays the switching additional data 308 corresponding to the condition extracted by the additional data extraction unit 108 at a position corresponding to the image corresponding to the additional data of the video data.
- The additional data display unit 110 can display a daily menu image 321 and a balloon image 323 as the switching additional data 308 on the product image of the menu 7. In this way, a generic pasta image is printed on the menu 7, and the additional data display unit 110 displays, at the position of the product image, the image and product name of the specific pasta offered in the current time zone, on the current day of the week, or on the current date.
- the additional data display unit 110 may display the switching additional data 308 prepared for each store, region, season, and weather.
- The product name may be hidden in the catalog, and the additional data display unit 110 can reveal the product at a given processing timing.
- In this way, the additional data display unit 110 can promote sales of products best suited to the processing timing, such as seasonal products.
- the additional data display unit 110 may designate a period and switch to an image of a sales promotion target product during a time sale or a campaign, and can effectively promote a specific product.
- Alternatively, additional data suited to each processing timing may be prepared in the additional data storage unit 106, and the additional data extraction unit 108 may extract the additional data by referring to the additional data storage unit 106 after the additional data has been switched to the data suited to the processing timing.
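- Selecting switching additional data 308 by condition can be sketched as a rule lookup keyed on the current time. The rule structure (weekday set and hour range) and field names below are assumptions for illustration only; the document leaves the conditions open (time zone, day, store, region, season, weather, campaign period, and so on).

```python
# Sketch of condition-based selection of switching additional data 308.
from datetime import datetime

def select_switching_data(rules, now=None):
    """`rules` is a list of dicts such as
    {"weekdays": {0, 1}, "start": 11, "end": 14, "data": "pasta of the day A"}.
    Returns the data of the first rule whose weekday and hour range match `now`."""
    now = now or datetime.now()
    for rule in rules:
        if now.weekday() in rule.get("weekdays", set(range(7))) and \
           rule.get("start", 0) <= now.hour < rule.get("end", 24):
            return rule["data"]
    return None
```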
- According to the present embodiment, the same effects as those of the above embodiment are obtained, and additional data corresponding to the processing timing can be displayed; products can therefore be switched and sold at the time of order processing without fixing the product when the menu or catalog is printed, and sales of specific products can be promoted effectively.
- 1. A data processing method of an information processing apparatus, wherein the information processing apparatus sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit; detects that the video data includes at least a part of a pre-registered image and recognizes the image; extracts the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and displays the extracted additional data of each image.
- 2. The data processing method of the information processing apparatus according to 1., wherein the information processing apparatus displays the extracted additional data of each image at a position of the video data corresponding to that image.
- 3. The data processing method of the information processing apparatus according to 1. or 2., wherein the information processing apparatus acquires attribute information about a user and extracts, for each recognized image, additional data corresponding to the acquired attribute information of the user.
- 4. The data processing method of the information processing apparatus according to 3., wherein the attribute information of the user includes allergy information, the additional data storage unit stores, as the additional data, allergic substance information contained in each of the plurality of images, and the information processing apparatus extracts, for each recognized image, the allergic substance information included in the allergy information of the user, identifies, based on the extracted allergic substance information, an image from among the plurality of recognized images for which allergic substance information that may cause an allergy in the user has been extracted, and displays a predetermined display element at a position of the video data corresponding to the identified image.
- 5. The data processing method of the information processing apparatus according to 3. or 4., wherein the information processing apparatus changes the visibility of the additional data for each recognized image according to the acquired attribute information of the user.
- 6. The data processing method of the information processing apparatus according to any one of 1. to 5., wherein the additional data storage unit stores, as additional data for at least one image, switching additional data that differs according to a preset condition, and the information processing apparatus extracts, for at least one recognized image, the switching additional data corresponding to the condition and displays the extracted switching additional data at a position of the video data corresponding to the image associated with the additional data.
- 7. The data processing method of the information processing apparatus according to any one of 1. to 6., wherein the additional data includes recommendation information, discount information, coupon information, allergy information, evaluation information, or optional service information related to the image.
- 8. The data processing method of the information processing apparatus according to any one of 1. to 7., wherein the information processing apparatus transmits data of an image selected from among a plurality of images included in the video displayed on a display unit based on the obtained composite video data.
- 9. The data processing method of the information processing apparatus according to any one of 1. to 8., wherein the information processing apparatus temporarily stores an order for a plurality of selected products in a storage unit, presents the stored order to the user, accepts a confirmation operation, and transmits information on the order after accepting the confirmation operation.
- 10. The data processing method of the information processing apparatus according to any one of 1. to 9., wherein the information processing apparatus is a user mobile terminal or a server apparatus capable of communicating with the user mobile terminal.
- 14. In the program according to 13., the attribute information of the user includes allergy information, and the additional data storage unit stores, as the additional data, allergic substance information contained in each of the plurality of images; the program causes a computer to execute a procedure of extracting, for each recognized image, the allergic substance information included in the allergy information of the user, and a procedure of identifying, based on the extracted allergic substance information, an image from among the plurality of recognized images for which allergic substance information that may cause an allergy in the user has been extracted, and displaying a predetermined display element at a position of the video data corresponding to the identified image.
- 16. In the program according to any one of 11. to 15., the additional data storage unit stores, as additional data for at least one image, switching additional data that differs according to a preset condition; the program causes a computer to execute a procedure of extracting, for at least one recognized image, the switching additional data corresponding to the condition, and a procedure of displaying the extracted switching additional data at a position of the video data corresponding to the image associated with the additional data.
- 17. In the program according to any one of 11. to 16., the additional data includes recommendation information, discount information, coupon information, allergy information, evaluation information, or optional service information related to the image.
- 18. The program according to any one of 11. to 17., causing a computer to execute a procedure of transmitting data of an image selected from among a plurality of images included in the video displayed on a display unit based on the obtained composite video data.
- 19. The program according to any one of 11. to 18., causing a computer to execute a procedure of temporarily storing information on a plurality of selected images in a storage unit, presenting the stored information on the plurality of images to a user, and accepting a confirmation operation, and a procedure of transmitting the information on the plurality of images after accepting the confirmation operation.
- 20. In the program according to any one of 11. to 19., the information processing apparatus realized by the computer is a user mobile terminal or a server apparatus capable of communicating with the user mobile terminal.
Abstract
Description
The order system described in Patent Document 2 has the user create order information using the user's own electronic device, in this case a car navigation system, and receives the order information by wireless communication.
Video data acquisition means for sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit;
image recognition means for detecting that the video data includes at least a part of a pre-registered image and recognizing the image;
additional data extraction means for extracting the additional data of each image, recognized by the image recognition means, from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and
additional data display means for displaying the additional data of each image extracted by the additional data extraction means at a position of the video data corresponding to that image; the apparatus includes these means.
The information processing apparatus:
sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit,
detects that the video data includes at least a part of a pre-registered image and recognizes the image,
extracts the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target, and
displays the extracted additional data of each image at a position of the video data corresponding to that image; this is the data processing method of the information processing apparatus.
A program causes a computer implementing an information processing apparatus to execute:
a procedure of sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit,
a procedure of detecting that the video data includes at least a part of a pre-registered image and recognizing the image,
a procedure of extracting the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target, and
a procedure of displaying the extracted additional data of each image at a position of the video data corresponding to that image.
FIG. 1 is a block diagram showing a configuration example of an order system 1 using an information processing apparatus according to an embodiment of the present invention.
The order system 1 of the present embodiment includes a smartphone 10, which is a mobile terminal used by a user, an order receiving device 80 installed in a store 5, a printer 82 connected to the order receiving device 80, a server device 60 capable of communicating with the smartphone 10 and the order receiving device 80 via a network 3, and a database 50 (denoted "DB" in the figure) connected to the server device 60.
The information processing apparatus 100 according to the embodiment of the present invention includes a video data acquisition unit 102 that sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit, an image recognition unit 104 that detects that the video data includes at least a part of a pre-registered image and recognizes the image, an additional data extraction unit 108 that extracts the additional data of each image, recognized by the image recognition unit 104, from an additional data storage unit 106 that stores additional data corresponding to each of the plurality of images presented on the imaging target, and an additional data display unit 110 that displays the additional data of each image extracted by the additional data extraction unit 108.
The server device 60 of the present embodiment can be realized by a server computer, a personal computer, or an equivalent device connected to the database 50 (FIG. 1). The server device 60 may also be configured as a virtual server or the like.
For example, the video data acquisition unit 102 of the present embodiment captures at least a part of the imaging target and sequentially acquires video data of a size displayed on a mobile-terminal-sized screen.
Although an example of identifying products has been described in the present embodiment, in principle the image recognition unit 104 detects that the video data includes at least a part of a pre-registered image and recognizes that image.
In the first method, the image recognition unit 104 uses pattern recognition or the like to match feature quantities in the video image against the feature quantities (feature point data) of product images registered in advance in the image table 460 (FIG. 27), detects a region that at least partially matches, identifies the position of that region and the image, and identifies the product corresponding to the image.
Since the first method does not require processing to divide the video image into regions of individual product images, the first method is the more preferable of the two.
(a) A function of recognizing that a captured image contains a plurality of product images
(b) A function of dividing the captured image into individual product images
(c) A function of extracting feature points for each product image
(d) A function of identifying the corresponding product from the extracted feature points
(1) All of the functions are realized on the smartphone 10.
(2) Function (a) is realized on the smartphone 10, the result is transmitted to the server device 60, and functions (b) to (d) are realized on the server device 60.
(3) Functions (a) to (b) are realized on the smartphone 10, the result is transmitted to the server device 60, and functions (c) to (d) are realized on the server device 60.
(4) Functions (a) to (c) are realized on the smartphone 10, the result is transmitted to the server device 60, and function (d) is realized on the server device 60.
(5) All of the functions are realized on the server device 60.
(e) A function of extracting feature points from the captured image
(f) A function of matching the extracted feature points against feature point data registered in advance for each product image
(g) A function of identifying, from the matching result, the positions of the product images contained in the captured image and the types of the products
(1) All of the functions are realized on the smartphone 10.
(2) Function (e) is realized on the smartphone 10, the result is transmitted to the server device 60, and functions (f) to (g) are realized on the server device 60.
(3) Functions (e) to (f) are realized on the smartphone 10, the result is transmitted to the server device 60, and function (g) is realized on the server device 60.
(4) All of the functions are realized on the server device 60.
The image table 460 may also hold the feature point data of each image in association with the corresponding additional data. In that case, the image recognition unit 104 or the additional data extraction unit 108 may be configured to extract the additional data corresponding to the feature point data of an image by referring to the image table 460, without performing the process of extracting additional data by referring to the product master 430 (FIG. 24, described later) (step S107 of FIG. 7 and the like, described later).
In a configuration that detects and recognizes images contained in the video data, as in the second image detection method described above, the additional data storage unit 106 stores additional data corresponding to each of the plurality of images, and the additional data extraction unit 108 can extract the additional data corresponding to each image from the additional data storage unit 106. The additional data display unit 110 then displays the additional data of each image at a position of the video data corresponding to that image.
The additional data display unit 110 may also be configured to display the additional data at a predetermined position within the screen, or over the entire screen, irrespective of the position of each image in the video data.
The additional data may be any combination of the kinds of data shown in the above examples.
Note that the product identification information and the user identification information are not strictly necessary; the order data transmission unit 114 only needs to transmit at least the data of the selected image.
The network 3 between the server device 60 and the order receiving device 80 is not particularly limited.
The computer program of the present embodiment is written so as to cause a computer implementing the information processing apparatus 100 to execute a procedure of sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit, a procedure of detecting that the video data includes at least a part of a pre-registered image and recognizing the image, a procedure of extracting the additional data of each identified image from an additional data storage unit that stores additional data corresponding to each of the plurality of recognized images presented on the imaging target, and a procedure of displaying the extracted additional data of each image.
The screen configuration of the information processing apparatus 100 includes a login screen 120, a main screen 130, a table number screen 140, an order screen 150, a tray screen 170, an order confirmation screen 190, and a deletion confirmation screen 192.
When the application of the information processing apparatus 100 of the present embodiment is started on the smartphone 10, the flow of FIG. 16 starts.
When the end button 128 is pressed on the login screen 120, an end confirmation message is displayed (step S405). If the end is confirmed (Y in step S407), this application is terminated. If the end is canceled (N in step S407), the display screen returns to the login screen 120.
Then, as shown in FIG. 16, when pressing of the order button 138 is accepted, the display screen transitions to the order screen 150. When the back button 144 is pressed on the table number screen 140, the display screen returns to the main screen 130.
In this example the order data is output to the kitchen printer, but output by the kitchen printer is not strictly necessary; any means of conveying the order contents to the staff in the store, such as display on a screen or a voice announcement, may be used.
When the main button 188 is pressed, the display screen returns to the main screen 130. At this time, it is desirable to keep the contents of the tray. When the contents of the tray are to be cleared, it is desirable to present a confirmation message to the user and clear them only after confirmation.
In the data processing method of the information processing apparatus 100 according to the embodiment of the present invention, the information processing apparatus 100 sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit (steps S101 and S103), detects that the video data includes at least a part of a pre-registered image and recognizes the image (step S105), extracts the additional data of each identified image from the additional data storage unit 106 that stores additional data corresponding to each of the plurality of recognized images presented on the imaging target (step S107), and displays the extracted additional data of each image (step S109).
Then, the additional data display unit 110 displays the additional data of each product extracted by the additional data extraction unit 108 at a position of the video data corresponding to that product (step S109).
Then, the user identification information acquisition unit 112 acquires the user's identification information (step S111).
The order receiving device 80 receives the order data transmitted by the smartphone 10 (step S121), and the order process is completed. At this time, the smartphone 10 can transmit the order data to the order receiving device 80 using the wireless LAN communication unit 20 of FIG. 4.
Then, the user identification information acquisition unit 112 acquires the user's identification information (step S111).
Then, the additional data extraction unit 108 extracts the additional data of each identified product from the additional data storage unit 106 based on the acquired product identification information (step S107). The subsequent steps are the same as in FIG. 8.
In the example of FIG. 7, the order data is transmitted directly from the smartphone 10 to the order receiving device 80, but it may also be transmitted via the server device 60. In the examples of FIGS. 8 and 9, the order data is transmitted from the smartphone 10 to the order receiving device 80 via the server device 60, but it may also be transmitted directly from the smartphone 10 to the order receiving device 80.
FIG. 17 is a functional block diagram showing the main configuration of the information processing apparatus 200 according to the embodiment of the present invention. The information processing apparatus 200 of the present embodiment differs from the information processing apparatus 100 of the above embodiment in that it extracts additional data according to the user's circumstances (attributes).
In addition to the configuration of the information processing apparatus 100 of the above embodiment shown in FIG. 2, the information processing apparatus 200 of the present embodiment further includes an attribute acquisition unit 202 and a user attribute information storage unit (denoted "user attribute information" in the figure) 204.
The additional data extraction unit 108 extracts, for each image recognized by the image recognition unit 104, additional data corresponding to the user attribute information acquired by the attribute acquisition unit 202.
The additional data storage unit 106 stores, as additional data, allergic substance information contained in each of the plurality of images.
The additional data extraction unit 108 extracts, for each image recognized by the image recognition unit 104, the allergic substance information included in the user's allergy information.
Based on the allergic substance information extracted by the additional data extraction unit 108, the additional data display unit 110 identifies, from among the plurality of images recognized by the image recognition unit 104, the images for which allergic substance information that may cause an allergy in the user has been extracted, and displays a predetermined display element at a position of the video data corresponding to each identified image.
Alternatively, as shown in FIG. 19, the additional data display unit 110 may display marks in a balloon display 241, such as a circle (○) for allergy OK, a cross (×) for allergy NG, or a triangle (△) when the allergy status is unknown.
The additional data display unit 110 may also superimpose a blue frame for allergy OK, a red frame for allergy NG, or a gray frame when it is unknown whether the allergic substance is contained.
Furthermore, when an "encouragement campaign" is held to help small children overcome their dislikes, a mark such as "Carrot-conquering support!" may be displayed on menus containing a registered disliked ingredient, and when the mark is selected, an explanation such as "If you do your best and eat it all, you will get an extra toy with the kids' lunch!" may be added.
FIG. 20 is a functional block diagram showing the main configuration of the information processing apparatus 300 according to the embodiment of the present invention. The information processing apparatus 300 of the present embodiment differs from the information processing apparatus 100 of the above embodiment in that, according to a predetermined condition, it displays switching additional data that differs, for example, by time zone or by day at the position of the product image.
In addition to the configuration of the information processing apparatus 100 of the above embodiment shown in FIG. 2, the information processing apparatus 300 of the present embodiment further includes an additional data storage unit 306 having switching additional data 308. The present embodiment can also include the configuration of the information processing apparatus 200 of the above embodiment shown in FIG. 17.
In the present embodiment, the additional data extraction unit 108 extracts, for at least one image recognized by the image recognition unit 104, the switching additional data 308 corresponding to the condition.
The additional data display unit 110 displays the switching additional data 308 corresponding to the condition, extracted by the additional data extraction unit 108, at a position of the video data corresponding to the image associated with the additional data.
When information about a user is acquired and used in the present invention, this shall be done lawfully.
1. A data processing method of an information processing apparatus, wherein the information processing apparatus
sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit,
detects that the video data includes at least a part of a pre-registered image and recognizes the image,
extracts the additional data of each image from an additional data storage unit that stores additional data corresponding to each of the plurality of recognized images presented on the imaging target, and
displays the extracted additional data of each image.
2. In the data processing method of the information processing apparatus according to 1.,
the information processing apparatus
displays the extracted additional data of each image at a position of the video data corresponding to that image.
3. In the data processing method of the information processing apparatus according to 1. or 2.,
the information processing apparatus
acquires attribute information about a user, and
extracts, for each recognized image, additional data corresponding to the acquired attribute information of the user.
4. In the data processing method of the information processing apparatus according to 3.,
the attribute information of the user includes allergy information,
the additional data storage unit stores, as the additional data, allergic substance information contained in each of the plurality of images, and
the information processing apparatus
extracts, for each recognized image, the allergic substance information included in the allergy information of the user, and
identifies, based on the extracted allergic substance information, an image from among the plurality of recognized images for which allergic substance information that may cause an allergy in the user has been extracted, and displays a predetermined display element at a position of the video data corresponding to the identified image.
5. In the data processing method of the information processing apparatus according to 3. or 4.,
the information processing apparatus
changes, for each recognized image, the visibility of the additional data according to the acquired attribute information of the user.
6. In the data processing method of the information processing apparatus according to any one of 1. to 5.,
the additional data storage unit stores, as additional data for at least one image, switching additional data that differs according to a preset condition, and
the information processing apparatus
extracts, for at least one recognized image, the switching additional data corresponding to the condition, and
displays the extracted switching additional data corresponding to the condition at a position of the video data corresponding to the image associated with the additional data.
7. In the data processing method of the information processing apparatus according to any one of 1. to 6.,
the additional data includes recommendation information, discount information, coupon information, allergy information, evaluation information, or optional service information related to the image.
8. In the data processing method of the information processing apparatus according to any one of 1. to 7.,
the information processing apparatus
transmits data of an image selected from among a plurality of images included in the video displayed on a display unit based on the obtained composite video data.
9. In the data processing method of the information processing apparatus according to any one of 1. to 8.,
the information processing apparatus
temporarily stores an order for a plurality of selected products in a storage unit, presents the stored order to the user, accepts a confirmation operation, and
transmits information on the order after accepting the confirmation operation.
10. In the data processing method of the information processing apparatus according to any one of 1. to 9.,
the information processing apparatus is a user mobile terminal or a server apparatus capable of communicating with the user mobile terminal.
11. A program for causing a computer that implements an information processing apparatus to execute:
a procedure of sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit,
a procedure of detecting that the video data includes at least a part of a pre-registered image and recognizing the image,
a procedure of extracting the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target, and
a procedure of displaying the extracted additional data of each image.
12. The program according to 11., for causing a computer to execute
a procedure of displaying the extracted additional data of each image at a position of the video data corresponding to that image.
13. The program according to 11. or 12., for causing a computer to execute
a procedure of acquiring attribute information about a user, and
a procedure of extracting, for each recognized image, additional data corresponding to the acquired attribute information of the user.
14. In the program according to 13.,
the attribute information of the user includes allergy information, and
the additional data storage unit stores, as the additional data, allergic substance information contained in each of the plurality of images;
the program causes a computer to execute
a procedure of extracting, for each recognized image, the allergic substance information included in the allergy information of the user, and
a procedure of identifying, based on the extracted allergic substance information, an image from among the plurality of recognized images for which allergic substance information that may cause an allergy in the user has been extracted, and displaying a predetermined display element at a position of the video data corresponding to the identified image.
15. The program according to 13. or 14., for causing a computer to execute
a procedure of changing, for each image recognized by the image recognition procedure, the visibility of the additional data according to the attribute information of the user acquired by the attribute acquisition procedure.
16. In the program according to any one of 11. to 15.,
the additional data storage unit stores, as additional data for at least one image, switching additional data that differs according to a preset condition;
the program causes a computer to execute
a procedure of extracting, for at least one recognized image, the switching additional data corresponding to the condition, and
a procedure of displaying the extracted switching additional data corresponding to the condition at a position of the video data corresponding to the image associated with the additional data.
17. In the program according to any one of 11. to 16.,
the additional data includes recommendation information, discount information, coupon information, allergy information, evaluation information, or optional service information related to the image.
18. The program according to any one of 11. to 17., for causing a computer to execute
a procedure of transmitting data of an image selected from among a plurality of images included in the video displayed on a display unit based on the obtained composite video data.
19. The program according to any one of 11. to 18., for causing a computer to execute
a procedure of temporarily storing information on a plurality of selected images in a storage unit, presenting the stored information on the plurality of images to a user, and accepting a confirmation operation, and
a procedure of transmitting the information on the plurality of images after accepting the confirmation operation.
20. In the program according to any one of 11. to 19.,
the information processing apparatus realized by the computer is a user mobile terminal or a server apparatus capable of communicating with the user mobile terminal.
Claims (12)
- 1. An information processing apparatus comprising: video data acquisition means for sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit; image recognition means for detecting that the video data includes at least a part of a pre-registered image and recognizing the image; additional data extraction means for extracting the additional data of each image, recognized by the image recognition means, from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and additional data display means for displaying the additional data of each image extracted by the additional data extraction means.
- 2. The information processing apparatus according to claim 1, wherein the additional data display means displays the additional data of each image extracted by the additional data extraction means at a position of the video data corresponding to that image.
- 3. The information processing apparatus according to claim 1 or 2, further comprising attribute acquisition means for acquiring attribute information about a user, wherein the additional data extraction means extracts, for each image recognized by the image recognition means, additional data corresponding to the attribute information of the user acquired by the attribute acquisition means.
- 4. The information processing apparatus according to claim 3, wherein the attribute information of the user includes allergy information, the additional data storage unit stores, as the additional data, allergic substance information contained in each of the plurality of images, the additional data extraction means extracts, for each image recognized by the image recognition means, the allergic substance information included in the allergy information of the user, and the additional data display means identifies, based on the allergic substance information extracted by the additional data extraction means, an image from among the plurality of images recognized by the image recognition means for which allergic substance information that may cause an allergy in the user has been extracted, and displays a predetermined display element at a position of the video data corresponding to the identified image.
- 5. The information processing apparatus according to claim 3 or 4, wherein the additional data display means changes, for each image recognized by the image recognition means, the visibility of the additional data according to the attribute information of the user acquired by the attribute acquisition means.
- 6. The information processing apparatus according to any one of claims 1 to 5, wherein the additional data storage unit stores, as additional data for at least one image, switching additional data that differs according to a preset condition, the additional data extraction means extracts, for at least one image recognized by the image recognition means, the switching additional data corresponding to the condition, and the additional data display means displays the switching additional data corresponding to the condition, extracted by the additional data extraction means, at a position of the video data corresponding to the image associated with the additional data.
- 7. The information processing apparatus according to any one of claims 1 to 6, wherein the additional data includes recommendation information, discount information, coupon information, allergy information, evaluation information, or optional service information related to the image.
- 8. The information processing apparatus according to any one of claims 1 to 7, further comprising data transmission means for transmitting data of an image selected from among a plurality of images included in the video displayed on a display unit based on the composite video data obtained by the additional data display means.
- 9. The information processing apparatus according to claim 8, further comprising confirmation acceptance means for temporarily storing, in a storage unit, information on the plurality of images selected via the data transmission means, presenting the stored information on the plurality of images to the user, and accepting a confirmation operation, wherein the data transmission means transmits the information on the plurality of images after the confirmation acceptance means accepts the confirmation operation.
- 10. The information processing apparatus according to any one of claims 1 to 9, wherein the information processing apparatus is a user mobile terminal or a server apparatus capable of communicating with the user mobile terminal.
- 11. A data processing method of an information processing apparatus, wherein the information processing apparatus sequentially acquires video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit, detects that the video data includes at least a part of a pre-registered image and recognizes the image, extracts the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target, and displays the extracted additional data of each image.
- 12. A program for causing a computer that implements an information processing apparatus to execute: a procedure of sequentially acquiring video data in which at least a part of an imaging target presenting a plurality of items is captured by an imaging unit; a procedure of detecting that the video data includes at least a part of a pre-registered image and recognizing the image; a procedure of extracting the additional data of each recognized image from an additional data storage unit that stores additional data corresponding to each of the plurality of images presented on the imaging target; and a procedure of displaying the extracted additional data of each image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/416,370 US9489702B2 (en) | 2012-07-24 | 2013-07-19 | Information processing device, data processing method thereof, and program |
JP2014526891A JP6070705B2 (ja) | 2012-07-24 | 2013-07-19 | 情報処理装置、そのデータ処理方法、およびプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012164130 | 2012-07-24 | ||
JP2012-164130 | 2012-07-24 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014017392A1 true WO2014017392A1 (ja) | 2014-01-30 |
Family
ID=49997205
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/069623 WO2014017392A1 (ja) | 2012-07-24 | 2013-07-19 | 情報処理装置、そのデータ処理方法、およびプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US9489702B2 (ja) |
JP (1) | JP6070705B2 (ja) |
WO (1) | WO2014017392A1 (ja) |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014056504A (ja) * | 2012-09-13 | 2014-03-27 | Dainippon Printing Co Ltd | 注文情報作成装置、注文情報作成方法、およびプログラム、並びに、注文情報作成システム |
JP2016071471A (ja) * | 2014-09-29 | 2016-05-09 | 京セラドキュメントソリューションズ株式会社 | 物品情報提供装置、物品情報提供システム、物品情報提供方法及び物品情報提供プログラム |
WO2016093106A1 (ja) * | 2014-12-11 | 2016-06-16 | 恵比寿十四株式会社 | 情報提示装置、情報提示システム、情報提示方法および情報提示プログラム |
JP2016115325A (ja) * | 2014-12-11 | 2016-06-23 | 恵比寿十四株式会社 | 情報提示装置、情報提示システム、情報提示方法および情報提示プログラム |
JP6353118B1 (ja) * | 2017-05-10 | 2018-07-04 | ヤフー株式会社 | 表示プログラム、情報提供装置、表示装置、表示方法、情報提供方法および情報提供プログラム |
JP6396568B1 (ja) * | 2017-09-19 | 2018-09-26 | ヤフー株式会社 | 提供プログラム、提供装置、提供方法、端末装置および情報提供装置 |
JP2019061518A (ja) * | 2017-09-27 | 2019-04-18 | 株式会社Nttドコモ | 情報処理装置及びプログラム |
JP2019175193A (ja) * | 2018-03-28 | 2019-10-10 | 東京瓦斯株式会社 | オーダーシステム、情報処理装置およびプログラム |
JP6798741B1 (ja) * | 2020-05-07 | 2020-12-09 | eBASE株式会社 | サーバ装置、情報処理方法、及びプログラム |
JP7534080B2 (ja) | 2019-11-08 | 2024-08-14 | 東芝テック株式会社 | 取引処理システム |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015121991A1 (ja) * | 2014-02-14 | 2015-08-20 | 楽天株式会社 | 表示制御装置、表示制御装置の制御方法、プログラム、及び情報記憶媒体 |
AU2016236850A1 (en) * | 2015-03-26 | 2017-11-09 | Preventative Enterprises Pty Ltd | An identification and exclusion system for restricting access to recreation venues |
US10664902B2 (en) * | 2015-10-09 | 2020-05-26 | Rakuten, Inc. | Setting and displaying allocation quantities for allocating amounts of a food product to multiple users while meeting user restriction and demand conditions |
US20170134698A1 (en) * | 2015-11-11 | 2017-05-11 | Vivint, Inc | Video composite techniques |
US20180268503A1 (en) * | 2017-03-15 | 2018-09-20 | Manav C. PARIKH | System and method for selective choice |
US10949667B2 (en) | 2017-09-14 | 2021-03-16 | Ebay Inc. | Camera platform and object inventory control |
CN111967430B (zh) * | 2020-08-28 | 2024-08-06 | 维沃移动通信有限公司 | 消息处理方法、装置、电子设备及可读存储介质 |
US12079394B2 (en) * | 2020-10-14 | 2024-09-03 | Aksor | Interactive contactless ordering terminal |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006148618A (ja) * | 2004-11-22 | 2006-06-08 | Olympus Corp | 情報重畳端末及び付加画像重畳システム |
JP2010238118A (ja) * | 2009-03-31 | 2010-10-21 | Promise Co Ltd | オーダリングシステム |
JP2012507761A (ja) * | 2008-09-02 | 2012-03-29 | エコール ポリテクニーク フェデラル ドゥ ローザンヌ(エーペーエフエル) | ポータブル・デバイス上での画像アノテーション |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006085498A (ja) * | 2004-09-16 | 2006-03-30 | Tm Second:Kk | アレルギー物質チェック方法、そのための装置及びプログラム |
JP4963365B2 (ja) | 2005-03-07 | 2012-06-27 | エスアイアイ・データサービス株式会社 | セルフオーダシステムおよび注文入力端末 |
JP2006285654A (ja) | 2005-03-31 | 2006-10-19 | Dainippon Printing Co Ltd | 商品情報検索システム |
JP4555853B2 (ja) | 2007-10-29 | 2010-10-06 | 株式会社ケンウッド | オーダシステム |
US9204050B2 (en) * | 2008-12-25 | 2015-12-01 | Panasonic Intellectual Property Management Co., Ltd. | Information displaying apparatus and information displaying method |
JP2011081556A (ja) | 2009-10-06 | 2011-04-21 | Sony Corp | 情報処理装置、情報処理方法、プログラムおよびサーバ |
JP5521621B2 (ja) * | 2010-02-19 | 2014-06-18 | 日本電気株式会社 | 携帯端末、拡張現実システム、及び拡張現実情報表示方法 |
CN103415849B (zh) | 2010-12-21 | 2019-11-15 | 高通股份有限公司 | 用于标注视图图像的至少一个特征的计算机化方法和设备 |
EP2824591A4 (en) * | 2012-03-08 | 2015-11-25 | Omron Tateisi Electronics Co | OUTPUT DEVICE, OUTPUT SYSTEM, AND PROGRAM |
- 2013-07-19 US US14/416,370 patent/US9489702B2/en active Active
- 2013-07-19 JP JP2014526891A patent/JP6070705B2/ja active Active
- 2013-07-19 WO PCT/JP2013/069623 patent/WO2014017392A1/ja active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006148618A (ja) * | 2004-11-22 | 2006-06-08 | Olympus Corp | 情報重畳端末及び付加画像重畳システム |
JP2012507761A (ja) * | 2008-09-02 | 2012-03-29 | エコール ポリテクニーク フェデラル ドゥ ローザンヌ(エーペーエフエル) | ポータブル・デバイス上での画像アノテーション |
JP2010238118A (ja) * | 2009-03-31 | 2010-10-21 | Promise Co Ltd | オーダリングシステム |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014056504A (ja) * | 2012-09-13 | 2014-03-27 | Dainippon Printing Co Ltd | 注文情報作成装置、注文情報作成方法、およびプログラム、並びに、注文情報作成システム |
JP2016071471A (ja) * | 2014-09-29 | 2016-05-09 | 京セラドキュメントソリューションズ株式会社 | 物品情報提供装置、物品情報提供システム、物品情報提供方法及び物品情報提供プログラム |
WO2016093106A1 (ja) * | 2014-12-11 | 2016-06-16 | 恵比寿十四株式会社 | 情報提示装置、情報提示システム、情報提示方法および情報提示プログラム |
JP2016115325A (ja) * | 2014-12-11 | 2016-06-23 | 恵比寿十四株式会社 | 情報提示装置、情報提示システム、情報提示方法および情報提示プログラム |
JP2018190294A (ja) * | 2017-05-10 | 2018-11-29 | ヤフー株式会社 | 表示プログラム、情報提供装置、表示装置、表示方法、情報提供方法および情報提供プログラム |
JP6353118B1 (ja) * | 2017-05-10 | 2018-07-04 | ヤフー株式会社 | 表示プログラム、情報提供装置、表示装置、表示方法、情報提供方法および情報提供プログラム |
JP6396568B1 (ja) * | 2017-09-19 | 2018-09-26 | ヤフー株式会社 | 提供プログラム、提供装置、提供方法、端末装置および情報提供装置 |
JP2019057259A (ja) * | 2017-09-19 | 2019-04-11 | ヤフー株式会社 | 提供プログラム、提供装置、提供方法、端末装置および情報提供装置 |
JP2019061518A (ja) * | 2017-09-27 | 2019-04-18 | 株式会社Nttドコモ | 情報処理装置及びプログラム |
JP2019175193A (ja) * | 2018-03-28 | 2019-10-10 | 東京瓦斯株式会社 | オーダーシステム、情報処理装置およびプログラム |
JP7534080B2 (ja) | 2019-11-08 | 2024-08-14 | 東芝テック株式会社 | 取引処理システム |
JP6798741B1 (ja) * | 2020-05-07 | 2020-12-09 | eBASE株式会社 | サーバ装置、情報処理方法、及びプログラム |
JP2021177280A (ja) * | 2020-05-07 | 2021-11-11 | eBASE株式会社 | サーバ装置、情報処理方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20150206257A1 (en) | 2015-07-23 |
JPWO2014017392A1 (ja) | 2016-07-11 |
US9489702B2 (en) | 2016-11-08 |
JP6070705B2 (ja) | 2017-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6070705B2 (ja) | 情報処理装置、そのデータ処理方法、およびプログラム | |
JP6687051B2 (ja) | 画像認識装置、その処理方法、およびプログラム | |
US9418293B2 (en) | Information processing apparatus, content providing method, and computer program | |
WO2014017393A1 (ja) | 情報処理装置、そのデータ処理方法、およびプログラム | |
JP4203502B2 (ja) | 商品情報提供システム、ユーザメモ管理装置、端末装置、及び情報提供装置等 | |
CN106982240B (zh) | 信息的显示方法和装置 | |
JP6120467B1 (ja) | サーバ装置、端末装置、情報処理方法、およびプログラム | |
US10872324B2 (en) | Shopping support computing device | |
JP2018156478A (ja) | コンピュータプログラム | |
TW201807632A (zh) | 圖像化交易方法及其交易系統 | |
US20150108213A1 (en) | Shopping support device and shopping support method | |
WO2014027433A1 (ja) | 情報提供装置、情報提供方法、及び、プログラム | |
JP6532555B1 (ja) | 販売支援装置、販売支援方法及びプログラム | |
KR20110125866A (ko) | 증강현실을 통한 정보제공 방법 및 장치 | |
KR20140105059A (ko) | 통합 위시리스트 관리 시스템 및 방법 | |
JP2019079269A (ja) | 遠隔接客プログラム、遠隔接客方法及び遠隔接客装置 | |
KR101259926B1 (ko) | 동영상전자메뉴판을 이용한 주문방법 | |
JP7148950B2 (ja) | サーバー装置、商業施設内情報システムおよび行動履歴の提示方法 | |
JP2011170797A (ja) | 注文支援システム、注文支援方法 | |
KR101907885B1 (ko) | 단말 및 그의 제어 방법 | |
JP6299530B2 (ja) | 情報提供システム | |
WO2018235318A1 (ja) | 情報処理装置、情報処理方法及びプログラム | |
JP2022087419A (ja) | サーバ装置及びプログラム | |
US10592972B2 (en) | Graphic transaction method and system for utilizing the same | |
CN118735508A (zh) | 商品组合交易的方法、电子设备、存储介质及程序产品 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13822956; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 2014526891; Country of ref document: JP; Kind code of ref document: A
| WWE | Wipo information: entry into national phase | Ref document number: 14416370; Country of ref document: US
| NENP | Non-entry into the national phase | Ref country code: DE
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13822956; Country of ref document: EP; Kind code of ref document: A1