WO2018150758A1 - Information processing device, information processing method, and storage medium - Google Patents
- Publication number
- WO2018150758A1 (PCT/JP2017/047385)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- user
- information processing
- processing apparatus
- processing system
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
Definitions
- the present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
- Devices such as smartphones and tablet terminals, which display various information in response to user operations on a touch panel, are widely used.
- In tablet terminals, the screen size has been increasing, and a usage in which a plurality of users operate the device at the same time is being considered.
- a projector is conventionally used as a device for displaying information.
- Patent Document 1 discloses a technique for displaying information according to the environment in which the information is to be displayed and the status of the information being displayed.
- In Patent Document 1, a method of providing content to each user when there are a plurality of users has been studied. However, study of user input when there are a plurality of users has not been sufficient.
- the present disclosure provides a mechanism that makes it possible to appropriately handle input from a plurality of users.
- According to the present disclosure, an information processing apparatus is provided that includes a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information indicating a drawing process on a real object by the user obtained through a second sensing process, and drawing object information indicating a drawing object used for drawing in the drawing process obtained through a third sensing process.
- According to the present disclosure, an information processing method is provided that includes performing, with a processor, a process of associating user information about a user obtained through a first sensing process, drawing process information indicating a drawing process on a real object by the user obtained through a second sensing process, and drawing object information indicating a drawing object used for drawing in the drawing process obtained through a third sensing process.
- According to the present disclosure, a storage medium is provided that stores a program for causing a computer to function as a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information indicating a drawing process on a real object by the user obtained through a second sensing process, and drawing object information indicating a drawing object used for drawing in the drawing process obtained through a third sensing process.
- According to the present disclosure, a process of associating user information, drawing process information, and drawing object information with respect to a certain user's drawing operation is performed. Therefore, even when a plurality of users perform drawing operations, it is possible to appropriately handle the input of each user based on the associated information.
- a mechanism is provided that makes it possible to appropriately handle inputs from a plurality of users.
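As an illustration of the association described above, the three kinds of information can be modeled as one simple record per drawing operation. This is a hypothetical sketch; the field names and types are invented here, since the disclosure only requires that the three kinds of information be linked with one another.

```python
from dataclasses import dataclass, field

@dataclass
class UserInfo:
    user_id: str                                   # identification information
    attributes: dict = field(default_factory=dict)  # attribute information

@dataclass
class DrawingObjectInfo:
    object_id: str        # e.g. the ID of a specific pen
    color: str = "black"  # drawing color
    style: str = "solid"  # drawing style (line type, blur level, ...)

@dataclass
class DrawingProcessInfo:
    trajectory: list      # time-ordered points of the representative point

@dataclass
class Association:
    user: UserInfo
    process: DrawingProcessInfo
    drawing_object: DrawingObjectInfo

# One drawing stroke by one user, sensed and then associated:
record = Association(
    UserInfo("user-A"),
    DrawingProcessInfo([(0, 0), (1, 2), (2, 3)]),
    DrawingObjectInfo("pen-1", color="red"),
)
print(record.user.user_id, record.drawing_object.color)
```

With records of this shape, input from a plurality of users can later be handled per user by filtering on the user information.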
- Note that the above effects are not necessarily limiting; together with or in place of the above effects, any of the effects described in the present specification, or other effects that can be grasped from the present specification, may be exhibited.
- FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure.
- In the present specification, the system may mean a configuration for executing a predetermined process; the system as a whole can be regarded as one device, or the system can be regarded as being configured by a plurality of devices.
- The information processing system according to the present embodiment illustrated in FIG. 1 need only be configured to execute predetermined processing (for example, processing realized by the functional configuration illustrated in FIG. 4) as the entire information processing system; which of its components is regarded as one device may be arbitrary.
- an information processing system 100a includes an input unit 110a and an output unit 130a.
- the output unit 130a visually notifies the user of the information by displaying various types of information on the table 140a.
- a projector is used as the output unit 130a.
- the output unit 130a is disposed above the table 140a, for example, spaced from the table 140a by a predetermined distance while being suspended from the ceiling, and projects information on the top surface of the table 140a.
- a method for displaying information on the top surface of the table 140a from above is also referred to as a “projection type”.
- the entire area where information is displayed by the output unit 130a is also referred to as a display screen.
- the output unit 130a displays information presented to the user as the application is executed by the information processing system 100a on the display screen.
- the displayed information is, for example, an operation screen of each application.
- each display area in which the operation screen of such an application is displayed on the display screen is also referred to as a display object.
- the display object may be a so-called GUI (Graphical User Interface) component (widget).
- the output unit 130a may include a lighting device.
- For example, the information processing system 100a may control states of the lighting device, such as turning it on or off, based on the content of the information input by the input unit 110a and/or the content of the information displayed by the output unit 130a.
- the output unit 130a may include a speaker and may output various kinds of information as sound.
- the number of speakers may be one or plural.
- When the output unit 130a includes a plurality of speakers, the information processing system 100a may limit which speakers output sound, or may adjust the direction in which sound is output.
- the output unit 130a may include a plurality of output devices, and may include, for example, a projector, a lighting device, and a speaker.
- the input unit 110a is a device that inputs operation details of a user who uses the information processing system 100a.
- the input unit 110a is provided above the table 140a, for example, in a state suspended from the ceiling.
- the input unit 110a is provided apart from the table 140a on which information is displayed.
- the input unit 110a may be configured by an imaging device that can capture the top surface of the table 140a, that is, the display screen.
- As the imaging device, a camera that images the table 140a with one lens, a stereo camera that can record depth-direction information by imaging the table 140a with two lenses, or the like can be used.
- When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera can be used.
- By analyzing the image (captured image) captured by the camera, the information processing system 100a can detect an object physically located on the table 140a (hereinafter also referred to as a real object), for example, the position of a user's hand.
- When a stereo camera is used as the input unit 110a, the information processing system 100a can analyze the captured image from the stereo camera and acquire, in addition to the position information of an object located on the table 140a, depth information of the object (in other words, three-dimensional information).
- the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction and separation of the hand from the table 140a based on the depth information.
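A minimal sketch of how contact, proximity, and separation might be classified from such depth information. The table height and the threshold values are illustrative assumptions, not values from the disclosure.

```python
# Heights are measured from the floor; all values are assumptions.
TABLE_HEIGHT_MM = 700.0
CONTACT_THRESHOLD_MM = 10.0    # within this gap counts as "contact"
PROXIMITY_THRESHOLD_MM = 50.0  # within this gap counts as "proximity"

def classify_hand(hand_height_mm: float) -> str:
    """Classify the hand's state relative to the table top from depth."""
    gap = hand_height_mm - TABLE_HEIGHT_MM
    if gap <= CONTACT_THRESHOLD_MM:
        return "contact"
    if gap <= PROXIMITY_THRESHOLD_MM:
        return "proximity"
    return "separated"

print(classify_hand(705.0))  # contact
print(classify_hand(730.0))  # proximity
print(classify_hand(900.0))  # separated
```

In practice the hand height would come from the stereo camera's depth map rather than a single number, but the thresholding idea is the same.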
- In the following description, the act of a user bringing an operating body such as a hand into contact with, or into proximity to, information on the display screen is also simply referred to as "contact".
- In the present embodiment, the position of the operating body, for example the user's hand, on the display screen (that is, the top surface of the table 140a) is detected, and various information is input based on the detected position of the operating body.
- That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a display object is detected, an operation input for that display object is performed.
- In the following, a case where a user's hand is used as the operating body will be described as an example; however, the present embodiment is not limited to this example, and various operating members such as a stylus may be used as the operating body.
- the input unit 110a may capture not only the top surface of the table 140a but also a user existing around the table 140a.
- the information processing system 100a can detect the position of the user around the table 140a based on the captured image.
- The information processing system 100a may perform personal recognition of the user by extracting physical features that can identify the individual user, such as the size of the user's face or body included in the captured image.
- the present embodiment is not limited to such an example, and user operation input may be executed by other methods.
- the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by contact of the user's finger or the like with respect to the touch panel.
- the touch panel may be realized by various methods such as a pressure-sensitive type, a capacitance type, and an optical type.
- Alternatively, the input unit 110a may perform spatial position recognition of an object using ultrasonic reflection, or may detect the contact position between an object and another object by detecting and analyzing the vibration of the object.
- In this way, a user operation on the top surface of the table 140a may be detected.
- The input unit 110a may employ any one of these techniques, or any combination of them, for detecting a user operation on the top surface of the table 140a. Further, the user's operation input may be detected as a gesture by the imaging device constituting the input unit 110a.
- the input unit 110a may include a voice input device such as a microphone that picks up sounds produced by the user and environmental sounds of the surrounding environment.
- a microphone array for collecting sound in a specific direction can be suitably used. Further, the microphone array can be configured such that the sound collection direction can be adjusted to an arbitrary direction.
- an operation input may be performed using the collected voice.
- the information processing system 100a may perform individual recognition based on the voice by analyzing the collected voice.
- the input unit 110a may be configured by a remote control device (so-called remote control).
- The remote controller may be one in which a predetermined instruction is input by operating a predetermined button arranged on it, or one in which the movement or posture of the remote controller is detected by a sensor such as an acceleration sensor or a gyro sensor mounted on it, so that a predetermined instruction is input by the user's operation of moving the remote controller.
- The information processing system 100a may include other input devices such as a mouse, a keyboard, buttons, switches, and levers (not shown) as the input unit 110a, and user operations may be input through these input devices.
- the configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. Although not shown in FIG. 1, another device may be connected to the information processing system 100a.
- an illumination device for illuminating the table 140a may be connected to the information processing system 100a.
- the information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
- the configuration of the information processing system is not limited to that shown in FIG.
- The information processing system according to the present embodiment only needs to include an output unit that displays various types of information on the display screen and an input unit that can accept at least an operation input for the displayed information; its specific configuration is not limited.
- With reference to FIG. 2 and FIG. 3, other configuration examples of the information processing system according to the present embodiment will be described.
- 2 and 3 are diagrams showing another configuration example of the information processing system according to the present embodiment.
- an output unit 130a is provided below the table 140b.
- the output unit 130a is a projector, for example, and projects information from below toward the top plate of the table 140b.
- the top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b.
- A method in which the output unit 130a projects information from below the table 140b in this way and the information is displayed on the top surface of the table 140b is also referred to as a "rear projection type".
- the input unit 110b is provided on the top surface (front surface) of the table 140b.
- the input unit 110b is configured by, for example, a touch panel, and the operation input by the user is performed when the touch of the operating body on the display screen on the top surface of the table 140b is detected by the touch panel.
- the configuration of the input unit 110b is not limited to this example, and the input unit 110b may be provided below the table 140b and separated from the table 140b, similarly to the information processing system 100a shown in FIG.
- the input unit 110b is configured by an imaging device, for example, and can detect the position of the operation body on the top surface of the table 140b through a top plate formed of a transparent material.
- a touch panel display is installed on a table with its display screen facing upward.
- the input unit 110c and the output unit 130c can be integrally configured as the touch panel display. That is, various types of information are displayed on the display screen of the display, and the operation input by the user is performed by detecting the touch of the operating body on the display screen of the display by the touch panel.
- an imaging device may be provided above the touch panel display as the input unit 110c. The position of the user around the table can be detected by the imaging device.
- the information processing system according to the present embodiment can be realized by various configurations.
- the present embodiment will be described by taking as an example the configuration of the information processing system 100a in which the input unit 110a and the output unit 130a are provided above the table 140a shown in FIG.
- the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130.
- FIG. 4 is a block diagram illustrating an example of a functional configuration of the information processing system 100 according to the present embodiment.
- the information processing system 100 includes an input unit 110, a processing unit 120, an output unit 130, a storage unit 150, and a communication unit 160 as its functions.
- the input unit 110 is an input interface for inputting various information to the information processing system 100. A user can input various types of information to the information processing system 100 via the input unit 110.
- the input unit 110 corresponds to the input units 110a to 110c shown in FIGS.
- the input unit 110 can include various sensors.
- the input unit 110 performs sensing on the user in the sensing target range, user actions, real objects, and the relationship between these and display objects, generates sensing information indicating the sensing result, and outputs the sensing information to the processing unit 120.
- the sensing target range may not be limited to the top surface of the table 140, and may include, for example, the periphery of the table 140.
- the input unit 110 includes an imaging device and captures a captured image including a user's body, a user's face, a user's hand, an object positioned on the top surface of the table 140, and the like.
- Information (for example, information about the captured image) input via the input unit 110 is provided to the processing unit 120 described later, and a user is identified, a user operation input is recognized, or an object is detected.
- the imaging device may be a visible light camera or an infrared camera, for example.
- the input unit 110 may be configured as an imaging device including a function as a depth sensor capable of acquiring depth information such as a stereo camera.
- the depth sensor may be configured separately from the imaging device as a sensor using an arbitrary method such as a time of flight method or a structured light method.
- The input unit 110 may include a touch sensor. In that case, the touch sensor detects a touch on the display screen, and the function of detecting a user's hand that is not touching the display screen and objects on the display screen may be provided by a depth sensor and/or an imaging device that images the display screen from above.
- the input unit 110 includes a sound collection device and collects a user's voice, a sound accompanying a user's operation, an environmental sound, and the like. Information input via the input unit 110 is provided to the processing unit 120 described later, and a user's voice input is recognized, or movement of an object is detected.
- the sound collection device may be an array microphone, and the sound source direction may be detected by the processing unit 120.
- the sound collection device is typically configured by a microphone, but may be configured as a sensor that detects sound by vibration of light.
- the processing unit 120 includes various processors such as a CPU and a DSP, and controls the operation of the information processing system 100 by executing various arithmetic processes. For example, the processing unit 120 processes various types of information obtained from the input unit 110 or the communication unit 160 and stores the information in the storage unit 150 or causes the output unit 130 to output the information.
- the processing unit 120 may be regarded as an information processing apparatus that processes various types of information.
- the processing unit 120 includes an association processing unit 121, a storage control unit 123, and a display control unit 125 as its functions. Note that the processing unit 120 may have functions other than these functions.
- each function of the processing unit 120 is realized by a processor constituting the processing unit 120 operating according to a predetermined program.
- the processing unit 120 can function as a first processing unit and a second processing unit.
- As the first processing unit, the association processing unit 121 performs a process of associating user information about the user obtained through the first sensing process, drawing process information indicating a drawing process on a real object by the user obtained through the second sensing process, and drawing object information indicating the drawing object used for drawing in the drawing process obtained through the third sensing process.
- That is, the association processing unit 121 generates information that associates the user information, the drawing process information, and the drawing object information.
- the sensing process is information detection processing performed by the input unit 110.
- The first sensing process, the second sensing process, and the third sensing process may be different sensing processes, at least two of them may be the same, or all of the first to third sensing processes may be the same.
- the storage control unit 123 and the display control unit 125 perform various processes using the information on which the associating process is performed as the second processing unit. Specifically, the storage control unit 123 causes the storage unit 150 to store information that associates user information, drawing process information, and drawing object information. The display control unit 125 causes the output unit 130 to display display information based on information associated with user information, drawing process information, and drawing object information.
- the first processing unit and the second processing unit may be included in different information processing apparatuses. That is, the association process and the process using the associated information may be executed by different information processing apparatuses. Further, the association processing unit 121 and the storage control unit 123 may be included in the same information processing apparatus, and the display control unit 125 may be included in another information processing apparatus. That is, the process of storing in association with each other and the process using the stored associated information may be executed by different information processing apparatuses.
- the association processing unit 121, the storage control unit 123, and the display control unit 125 will be described as being included in the information processing system 100.
- the output unit 130 is an output interface for notifying the user of various types of information processed by the information processing system 100.
- the output unit 130 includes a display device such as a display, a touch panel, or a projector, and displays various types of information on the display screen under the control of the display control unit 125.
- the output unit 130 corresponds to the output units 130a to 130c shown in FIGS. 1 to 3, and displays a display object on the display screen as described above.
- the present embodiment is not limited to this example, and the output unit 130 may further include an audio output device such as a speaker, and may output various types of information as audio.
- the storage unit 150 is a storage device that temporarily or permanently stores information for the operation of the information processing system 100.
- the storage unit 150 stores information that associates user information, drawing process information, and drawing object information.
- The communication unit 160 is a communication interface for transmitting and receiving data to and from an external device by wire or wirelessly.
- The communication unit 160 communicates with an external device directly, or via a network access point, by a method such as wireless LAN (Local Area Network), Wi-Fi (registered trademark), infrared communication, or Bluetooth (registered trademark). For example, the communication unit 160 communicates information with a user device such as a user's smartphone or a wearable device attached to the user.
- the communication unit 160 may acquire information from an SNS (Social Networking Service) or the like by communicating with a server on the Web, for example.
- FIG. 5 and 6 are diagrams for explaining the outline of the information processing system 100 according to the present embodiment.
- As shown in FIG. 5, there are users 10A, 10B, and 10C around the table 140, and the illustration 30 is drawn on the imitation paper 20 placed on the table 140.
- the illustration 30 is classified into illustrations 30A, 30B, and 30C.
- the illustration 30A is a drawing result by the user 10A
- the illustration 30B is a drawing result by the user 10B
- the illustration 30C is a drawing result by the user 10C.
- the illustration 30 is drawn by overlapping the illustrations 30A, 30B, and 30C.
- the information processing system 100 senses and stores a drawing process by the users 10A, 10B, and 10C using the input unit 110.
- The drawing process is the state in which drawing is performed, and is a concept that includes the state of the user who is drawing, the surrounding state, the time-series change of the drawing result, and the like.
- the information processing system 100 can separately store which part of the illustration 30 is drawn by which user by sensing the drawing process in real time.
- the information processing system 100 stores information about the illustration 30A in association with the user 10A, stores information about the illustration 30B in association with the user 10B, and stores information about the illustration 30C in association with the user 10C.
- The information processing system 100 can reproduce the illustration 30 by the output unit 130 based on the stored information. At this time, the information processing system 100 can display only one of the illustrations 30A, 30B, and 30C, change the way they are superimposed, or otherwise process them as appropriate. In addition to the illustration 30, the information processing system 100 may reproduce the state of the user at the time of drawing, the environmental sound, and so on; in this case, a browsing user can relive the past drawing experience.
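The per-user reproduction described above amounts to filtering the stored associations by user identification information. The record layout below is a hypothetical simplification of the stored data.

```python
# Stored associations between a user and the strokes they drew
# (assumed layout; the real system stores richer information).
stored = [
    {"user": "10A", "stroke": [(0, 0), (1, 1)]},
    {"user": "10B", "stroke": [(2, 2), (3, 3)]},
    {"user": "10A", "stroke": [(4, 4), (5, 5)]},
]

def strokes_for(user_id):
    """Return only the strokes drawn by the given user, e.g. to display
    only illustration 30A out of the jointly drawn illustration 30."""
    return [r["stroke"] for r in stored if r["user"] == user_id]

print(strokes_for("10A"))
```

Changing how the filtered stroke sets are layered or rendered corresponds to changing the way the illustrations are superimposed.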
- the drawing result may be a character or a sentence, or may be a three-dimensional shaped object.
- Processing by the information processing system 100 is roughly divided into storage processing and reproduction processing.
- the storage process and the reproduction process will be described in detail in order.
- FIG. 7 is a flowchart showing an example of the flow of storage processing executed in the information processing system 100 according to the present embodiment.
- the information processing system 100 acquires sensing information (step S102).
- the information processing system 100 acquires various types of information based on the sensing information (step S104).
- the information processing system 100 acquires user information, drawing object information, and drawing process information based on sensing information.
- the information processing system 100 stores the acquired various types of information in association with each other (step S106).
- the storage process ends.
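The flow of steps S102 to S106 can be sketched as follows. The sensing and recognition functions are placeholders standing in for the system's actual sensor processing, and the returned values are invented for illustration.

```python
def acquire_sensing_info():
    """Step S102: obtain raw sensing information (image, sound, ...)."""
    return {"image": "...", "sound": "..."}

def extract(sensing):
    """Step S104: derive the three kinds of information from sensing."""
    user = {"id": "user-A"}                     # user information
    obj = {"id": "pen-1", "color": "red"}       # drawing object information
    process = {"trajectory": [(0, 0), (1, 1)]}  # drawing process information
    return user, obj, process

def store(user, obj, process, storage):
    """Step S106: store the three kinds of information in association."""
    storage.append({"user": user, "object": obj, "process": process})

storage = []
store(*extract(acquire_sensing_info()), storage)
print(len(storage))
```

In the real system this loop would run continuously while drawing is sensed, appending one associated record per recognized drawing operation.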
- the information processing system 100 can acquire various sensing information.
- the information processing system 100 acquires sensing information such as a captured image and sound by the input unit 110.
- the information processing system 100 may acquire the sensing information of the wearable device worn by the user through the communication unit 160.
- the information processing system 100 can acquire a variety of information based on the sensing information. An example will be described below.
- the drawing object information includes at least one of drawing object identification information, information indicating the drawing color of the drawing object in the drawing process, information indicating the drawing style, and information indicating the state.
- The information processing system 100 can acquire the drawing object information based on a captured image of the drawing object. Specifically, the information processing system 100 may acquire the drawing object information by querying a database with the image recognition result of identification information, such as a QR code (registered trademark), attached to the drawing object, or with its feature point information.
- the information processing system 100 may acquire drawing object information by performing image recognition on a drawing result actually drawn by a drawing object.
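One possible shape of the database query: resolving recognized identification information (for example a decoded QR code value) to drawing object information. The table contents and key names are invented for illustration.

```python
# Hypothetical drawing object database, keyed by recognized ID.
OBJECT_DB = {
    "pen-001": {"name": "ballpoint pen", "color": "blue", "style": "solid"},
    "pencil-002": {"name": "pencil", "color": "gray", "style": "blurred"},
}

def lookup_drawing_object(recognized_id: str) -> dict:
    """Resolve identification info to drawing object information."""
    return OBJECT_DB.get(recognized_id, {"name": "unknown"})

print(lookup_drawing_object("pen-001")["color"])  # blue
```

When no identification info is attached, the fallback of recognizing the drawing result itself (as described above) would populate the same fields.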
- a drawing object is an object used for drawing.
- the drawing object may be an object that the user can freely draw, such as a pencil, a ballpoint pen, a fountain pen, a crayon, or a brush.
- the drawing object may be an object capable of performing predetermined drawing, such as a stamp.
- The drawing object may also be something other than a so-called tool, such as a user's hand.
- The drawing can be realized by various methods, and is not limited to an additive method such as placing ink on paper.
- the drawing may be realized by a subtractive method.
- In that case, the drawing object is, for example, a carving knife that cuts away a real object, or an eraser that erases drawn content.
- drawing may be realized in three dimensions in addition to being realized in two dimensions.
- the drawing object may be a hand-held 3D printer (so-called 3D pen) capable of three-dimensional drawing.
- the drawing object identification information may be an ID number uniquely assigned to each drawing object, a drawing object name, an image, or the like.
- Information indicating the drawing color of the drawing object is information indicating, for example, the color of the pencil core or the ink color of the ballpoint pen.
- Information indicating a drawing style is information indicating a line type, a blurring level, or the like corresponding to a type of a drawing object such as a pencil or a ballpoint pen.
- The information indicating the drawing style may also be information indicating the shape of a pen tip, the shape of the blade of a carving knife, or the like.
- the drawing object information may include information indicating the transparency of the drawing result as one of the information indicating the drawing style.
- the drawing result is a real object generated as a result of drawing, and is, for example, a drawn line, an illustration, or a stamp impression.
- For example, when the drawing object is a highlighter, the drawing result (that is, the line) has transparency.
- the illustration 30 corresponds to the drawing result.
- FIG. 8 a process for detecting the transparency of the drawing result will be described in detail.
- FIG. 8 is a diagram for explaining the process of detecting the transparency of the drawing result according to this embodiment.
- the left diagram of FIG. 8 shows a state in which a line 31 is drawn first and then a transparent line 32 is drawn.
- the right diagram in FIG. 8 shows a state in which the line 33 is drawn first, and then the non-transparent line 34 is drawn.
- the transparency of the drawing result can be detected based on the difference between the portion of the drawing result overwritten with the other drawing result and the portion not overlapping with the other drawing result.
- The information processing system 100 compares the region C with the region B; it detects that the drawing result has transparency when the colors are different, as shown in the left figure, and that it has no transparency when the colors are the same, as shown in the right figure.
- the information processing system 100 may detect the degree of transparency (0% to 100%) in addition to the presence or absence of transparency.
- The information processing system 100 may further improve the accuracy of the transparency detection by recognizing, for example, the degree of color mixing in the region C based on the color of the region A, which is not overwritten by the other drawing result.
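The comparison of regions B and C can be sketched as a color-distance test on their sampled RGB colors. The RGB values and tolerance here are illustrative assumptions.

```python
def color_distance(c1, c2):
    """Sum of absolute per-channel RGB differences."""
    return sum(abs(a - b) for a, b in zip(c1, c2))

def has_transparency(region_b, region_c, tolerance=10):
    """Compare the line's own color (region B, not overlapping) with the
    overlapped region (region C). If the overlapped region keeps the
    line's own color, the line is opaque; if the underlying color shows
    through and shifts the color, the line is transparent."""
    return color_distance(region_b, region_c) > tolerance

opaque_case = has_transparency((255, 0, 0), (255, 0, 0))        # same color
transparent_case = has_transparency((255, 255, 0), (128, 200, 0))
print(opaque_case, transparent_case)
```

Estimating the degree of transparency (0% to 100%) would extend this by comparing region C against a blend of the region B color and the underlying region A color.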
- Information indicating the state of the drawing object in the drawing process is information that can change dynamically, such as the posture of the drawing object, the writing pressure, the ink amount, or the ink viscosity.
- drawing process information includes at least one of information indicating the locus of the representative point of the drawing object being drawn, and information indicating the drawing result of the drawing object.
- the information processing system 100 acquires the drawing process information based on sensing information in a period from when the drawing object comes into contact with the real object to be drawn such as paper until the contact is released.
- the information processing system 100 may acquire the drawing process information based on the time-series change of the position of the drawing object and the time-series change of the drawing result, obtained by recognizing the time-series change of the captured image.
- the representative point of the drawing object is a point where drawing is performed.
- the representative point of the drawing object is the pen tip.
- the information indicating the trajectory of the representative point of the drawing object is information indicating the trajectory in the two-dimensional space (that is, a plane) when the drawing is performed two-dimensionally.
- the information indicating the trajectory of the representative point of the drawn object is information indicating the trajectory in the three-dimensional space when the drawing is performed three-dimensionally.
- the information indicating the locus of the representative point of the drawn object is typically vector data.
- Information indicating the drawing result by the drawing object is information indicating the drawing result itself. For example, regarding a complicated drawing result that is difficult to represent as the locus of the representative point of the drawing object, information indicating the drawing result by the drawing object can be acquired.
- the information indicating the drawing result by the drawing object is typically pixel data (in other words, an image).
- the drawing process information may be vector data or pixel data, or a combination thereof.
- the drawing process information is usually acquired as vector data, which keeps the data amount small, and may be acquired as pixel data for drawing results that are difficult to represent as vector data.
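- The two representations described above can be sketched as follows; the container name and fields are illustrative assumptions, not from the patent:

```python
# Illustrative container for drawing process information, holding either
# vector data (trajectory of the representative point) or pixel data
# (the drawing result as an image), or both.
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class DrawingProcessInfo:
    # Vector data: time-stamped positions of the representative point.
    # 2-D (x, y) here; a z coordinate could be added for 3-D drawing.
    trajectory: List[Tuple[float, float, float]] = field(default_factory=list)  # (t, x, y)
    # Pixel data: drawing result as a 2-D grid of RGB tuples.
    pixels: Optional[list] = None

    def add_sample(self, t: float, x: float, y: float) -> None:
        self.trajectory.append((t, x, y))

info = DrawingProcessInfo()
info.add_sample(0.00, 10.0, 20.0)
info.add_sample(0.02, 11.5, 20.4)
```

A pen stroke would thus accumulate samples while the pen tip is in contact, while a complicated result could instead be kept in `pixels`.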
- User information includes at least one of user identification information, attribute information, and biometric information.
- the user identification information may be an ID number uniquely assigned to each user, a person name, a face image, or the like.
- the user attribute information is, for example, information such as age, sex, occupation, family structure, and friendship.
- the information processing system 100 acquires the user information based on the sensing result. For example, the information processing system 100 acquires user identification information by face recognition or voice recognition based on a captured image or voice obtained from the input unit 110. Further, the information processing system 100 acquires user attribute information by image recognition or voice recognition based on a captured image or sound obtained from the input unit 110, or acquires it from a database using the user identification information as a search condition.
- the search condition is information associated with target information such as a search keyword, and information including information that is the same as or similar to the search condition is obtained as a search result.
- the user information includes at least one of a captured image of the user or the user's surroundings in the drawing process, a voice of the user or the user's surroundings, biometric information of the user, or information obtained by processing these pieces of information.
- the information processing system 100 acquires these pieces of information based on the sensing result.
- the captured image and sound are obtained by the input unit 110.
- the biological information is acquired based on a sensing result by a wearable device worn by the user, for example.
- the biometric information of the user is information such as heartbeat, sweating, body temperature, and emotion.
- the information obtained by processing the captured image, voice, or biometric information is, for example, a face recognition result or a gesture recognition result based on the captured image, or text data obtained by applying speech-to-text processing to the voice.
- the information processing system 100 may further associate time information with user information, drawing process information, and drawing object information.
- the time information is information indicating time at the time of sensing, for example.
- the drawing process information becomes information indicating time-series changes of drawing by being associated with time information.
- the information processing system 100 may further associate information related to real objects existing around the user with the user information, the drawing process information, and the drawn object information. For example, the information processing system 100 may associate a captured image of a paper material placed in the sensing target range (for example, the top surface of the table 140). Alternatively, the information processing system 100 may acquire the source data of the paper material (for example, a text file when the paper material is a document) from the network and associate it. Further, the information processing system 100 may associate both the captured image of the paper material and the source data. By associating the real object information in this way, the information processing system 100 can reproduce, during the reproduction process, not only the user's drawing result but also the real objects that existed around it.
- the real object information may include information indicating a real object including a drawing result by the user, such as the imitation paper 20 illustrated in FIG. In this case, as will be described later with reference to FIG. 16, a real object including a drawing result by the user can be used in the reproduction process.
- the information processing system 100 stores the associated information in the storage unit 150.
- the information processing system 100 stores user information, drawing process information, and drawing object information in association with each other.
- the information processing system 100 may further store time information and/or real object information in association with them. Hereinafter, a set of information subjected to the association process is also referred to as a drawing information set.
- the information processing system 100 stores a drawing information set related to the illustration 30A drawn by the user 10A.
- the information processing system 100 associates the user information of the user 10A, the drawing process information obtained when the user 10A drew the illustration 30A, the drawing object information regarding the drawing object used for drawing the illustration 30A, and the real object information and time information of the surroundings of the user 10A with each other, and stores them as a drawing information set.
- the information processing system 100 stores a drawing information set related to the illustration 30B drawn by the user 10B and a drawing information set related to the illustration 30C drawn by the user 10C.
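- The association and storage described above might be sketched as follows; the class and field names are illustrative assumptions, and `storage` stands in for the storage unit 150:

```python
# Minimal sketch of the association process: user information, drawing
# process information, and drawing object information are bound into one
# "drawing information set", optionally with time information and real
# object information, and stored.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DrawingInformationSet:
    user_info: dict
    drawing_process_info: dict
    drawing_object_info: dict
    time_info: Optional[str] = None
    real_object_info: Optional[dict] = None

storage: List[DrawingInformationSet] = []   # stands in for the storage unit 150

def associate_and_store(user_info, process_info, object_info,
                        time_info=None, real_object_info=None):
    info_set = DrawingInformationSet(user_info, process_info, object_info,
                                     time_info, real_object_info)
    storage.append(info_set)
    return info_set

# Example: a set for user 10A drawing illustration 30A.
associate_and_store(
    {"name": "10A"},
    {"trajectory": [(0.0, 1.0, 2.0)]},
    {"tool": "crayon"},
    time_info="2017-12-27T14:00",
)
```

Because each set keeps per-user information together, the drawing results of several simultaneous users stay separable later.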
- FIG. 9 is a flowchart illustrating an example of the flow of reproduction processing executed in the information processing system 100 according to the present embodiment.
- the information processing system 100 accepts input of search conditions (step S202).
- the information processing system 100 displays information indicating a search result based on the input search condition, and accepts a reproduction target selection operation (step S204).
- the information processing system 100 reproduces the designated reproduction target (step S206).
- the reproduction process ends.
- the information processing system 100 accepts input of search conditions.
- the search condition here is information that specifies at least a part of the information stored in association in the storage unit 150.
- the user who has drawn in the storage process described above and the user who browses the drawing result reproduced in the reproduction process may be the same or different.
- the information processing system 100 searches for a drawing information set including information that is the same as or similar to the search condition based on the input search condition.
- the information processing system 100 can search the drawing information set using various information as search conditions.
- the information processing system 100 may search for a drawing information set based on a person.
- a drawing information set relating to a drawing result drawn by a specific user is searched. For example, when a user inputs a person name, a search using the person name as a search condition can be performed.
- further, when a user inputs a face image, a search using the face image as a search condition can be performed.
- a search can be performed using a person name or ID number recognized based on the face image as a search condition.
- the information processing system 100 may search for a drawing information set based on a drawing object.
- a drawing information set relating to a drawing result drawn using a specific drawing object is searched.
- a search using the drawing object identification information as a search condition may be performed.
- a search can be performed using the drawing object image as a search condition.
- a search can be performed using the name or identification information of the drawing object recognized based on the drawing object image as a search condition.
- the information processing system 100 may search for a drawing information set based on time information.
- a drawing information set relating to a drawing result drawn in a specific time zone is searched.
- a search can be performed using time information belonging to the input time zone as a search condition.
- a drawing information set related to the drawing result drawn in the same time period can be searched.
- the information processing system 100 may search for a drawing information set based on the position information.
- a drawing information set relating to a drawing result drawn at a specific position is searched.
- when the user specifies a desired range within the sensing target range (for example, the top surface of the table 140), a drawing information set related to a drawing result drawn in that range can be searched. This makes it possible to satisfy a request such as "Please reproduce the pictures I drew here."
- the information processing system 100 may search for a drawing information set based on the drawing result.
- a drawing information set that is the same as or similar to the input drawing result is searched.
- a drawing information set related to a picture that is the same as or similar to the picture drawn by the user as a search condition is searched.
- search may be performed using one search condition, or may be performed using a search expression in which a plurality of search conditions are combined.
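- A minimal sketch of such a search, where each drawing information set is represented as a flat dictionary and a search expression is an AND-combination of conditions (the field names are assumptions for illustration):

```python
# Hypothetical search over stored drawing information sets: each search
# condition names a field and a value; a set matches when every condition's
# value equals the corresponding stored information.

def search(storage, conditions):
    """conditions: dict such as {"user": "10A", "tool": "highlighter"}.
    Returns the drawing information sets matching ALL conditions
    (a simple AND-combined search expression)."""
    hits = []
    for info_set in storage:
        if all(info_set.get(key) == value for key, value in conditions.items()):
            hits.append(info_set)
    return hits

storage = [
    {"user": "10A", "tool": "highlighter", "time": "14:00"},
    {"user": "10B", "tool": "pen", "time": "14:00"},
    {"user": "10A", "tool": "pen", "time": "15:00"},
]
hits_person = search(storage, {"user": "10A"})            # search by person
hits_both = search(storage, {"user": "10A", "tool": "pen"})  # combined expression
```

A single condition corresponds to the person-, drawing-object-, time-, or position-based searches above; passing several conditions corresponds to the combined search expression.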
- Search conditions can be input by various input methods such as gesture operation, touch operation on the display screen, or voice input.
- the user inputs a search condition on the search condition input screen displayed on the display screen.
- FIG. 10 is a diagram for explaining an example of a search condition input screen according to the present embodiment.
- the input screen 40 is displayed on a display screen, for example.
- the user selects an input target from the input forms 41 to 45 and inputs a search condition by a gesture operation or voice input.
- For example, in the input form 41, handwritten input is accepted.
- the drawing result is input as a search condition. Thereby, for example, a picture that is the same as or similar to the picture drawn in the input form 41 is searched.
- the input form 42 accepts input of user information
- the input form 43 accepts input of drawing object information
- the input form 44 accepts input of time information
- the input form 45 accepts input of real object information.
- the input screen 40 may include an input form for receiving input of other search conditions.
- the information processing system 100 searches the storage unit 150 for a drawing information set including information that is the same as or similar to the input search condition, and displays information based on the search result. For example, the information processing system 100 displays a search result screen showing a list of search results. Then, the information processing system 100 accepts a reproduction target selection operation on the search result screen.
- FIG. 11 is a diagram for explaining an example of a search result screen according to the present embodiment.
- the search result screen 50 is displayed on a display screen, for example.
- the search result screen 50 is a screen that displays a list of the drawing information sets hit by the search according to the input search conditions. For example, six drawing information sets out of 100 hits are displayed in blocks 51A to 51F.
- Each of the blocks 51 includes a thumbnail image 52 and an area 53 in which meta information is described.
- as the meta information, for example, drawing object information, drawing process information, and user information can be described in whole or in part as representative information.
- the block 51 may include an icon 54 indicating the presence or absence and type of the access restriction.
- the thumbnail image 52 is, for example, an image showing a drawing result, and is a thumbnail image of the illustration 30A, 30B, or 30C in the example shown in FIGS. The user selects a block 51 to be reproduced from the blocks 51 displayed on the search result screen 50. Thereby, the reproduction process based on the drawing information set to be reproduced is performed.
- the search condition input screen and search result screen described above may be displayed simultaneously. An incremental search may be performed. In this case, the search result screen is updated each time the search condition is input or changed on the search condition input screen.
- when there is no search hit, a message to that effect may be displayed.
- a search result screen may be displayed when a search is performed with some search conditions disabled. In this case, it is desirable that the invalidated search condition is specified.
- the information processing system 100 reproduces the drawing result based on the drawing information set.
- the information processing system 100 reproduces a display object (for example, a reproduction screen) indicating a past drawing result based on a drawing information set selected as a reproduction target by the user on the search result screen.
- the information processing system 100 reproduces the drawing result by performing a simulation of moving the drawing object indicated by the drawing object information according to the information indicating the trajectory included in the drawing process information.
- the information processing system 100 may reproduce a completed drawing result of an illustration or the like that has been written, or may reproduce a time-series change in a drawing result from the start of drawing to the end of writing.
- the information processing system 100 may reproduce the drawing result by displaying the pixel data included in the drawing process information as it is. Further, the information processing system 100 may consider the drawing color, drawing style, state, and the like of the drawing object.
- the information processing system 100 can also display an image showing the real objects around the user at the time of drawing and an image showing the state of the user at the time of drawing, and reproduce the sound that was heard around the user at that time. Thereby, the browsing user can relive the past drawing experience.
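- The trajectory-driven simulation described above might look like the following sketch, where `draw_segment` stands in for the actual rendering by the output unit (the timestamped sample format and names are assumptions):

```python
# Sketch of trajectory-based reproduction: the stored vector data is replayed
# in time order so the viewer sees the drawing re-appear as it was drawn.

def reproduce(trajectory, draw_segment, speed=1.0):
    """trajectory: list of (t, x, y) samples sorted by time.
    draw_segment(p0, p1): renders one stroke segment (e.g. via a projector).
    speed > 1.0 replays faster than real time."""
    segments = 0
    for (t0, x0, y0), (t1, x1, y1) in zip(trajectory, trajectory[1:]):
        # A real system would wait (t1 - t0) / speed here before drawing,
        # so the time-series change from start to finish is reproduced.
        draw_segment((x0, y0), (x1, y1))
        segments += 1
    return segments

drawn = []
reproduce([(0.0, 0, 0), (0.1, 1, 1), (0.2, 2, 1)],
          lambda p0, p1: drawn.append((p0, p1)))
```

Reproducing only the completed result corresponds to skipping the waits and drawing all segments at once; displaying stored pixel data directly bypasses this replay entirely.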
- FIGS. 5 and 6 are diagrams for explaining an example of the reproduction screen according to the present embodiment.
- in the example shown in FIGS. 5 and 6, the left diagram of each figure shows the illustration 30 drawn on the imitation paper 20, and the information processing system 100 stores the drawing information set related to the illustration 30. Then, as shown in the right diagram of each figure, the information processing system 100 displays a reproduction screen on the top surface of the table 140 by the output unit 130 based on the stored drawing information set.
- the information processing system 100 selectively reproduces only the illustration 30A drawn by the user 10A among the illustrations 30 drawn by the users 10A, 10B, and 10C.
- the information processing system 100 may display information indicating a drawing result by a specific user among a plurality of users based on the drawing information set. Thereby, the browsing user can selectively browse only the drawing result of the desired user.
- the information processing system 100 reproduces the illustrations 30A, 30B, and 30C while changing how they overlap. Specifically, in the original illustration 30, the back is the illustration 30C, the middle is the illustration 30A, and the front is the illustration 30B, whereas on the reproduction screen 61, the back is the illustration 30A, the middle is the illustration 30C, and the front is the illustration 30B.
- the information processing system 100 may display the drawing results by a plurality of users while changing the overlapping manner based on the drawing information set. Thereby, the browsing user can browse the drawing results of other users as the background while making the drawing results of the desired user stand out.
- the information processing system 100 reproduces the illustrations 30A, 30B, and 30C by changing their arrangement. Specifically, on the reproduction screen 62, the illustrations 30A, 30B, and 30C are respectively arranged and reproduced at positions apart from the original illustration 30. As described above, the information processing system 100 may change the arrangement of the drawing results by a plurality of users based on the drawing information set. Thereby, the user who browses can browse separately the drawing results of a plurality of users.
- the information processing system 100 reproduces the illustration 30B by changing the line type while maintaining the arrangement of the illustrations 30A, 30B, and 30C. Specifically, regarding the illustration 30B, the information processing system 100 changes the information indicating the drawing style of the drawing object information in the drawing information set from the original line type to, for example, the line type instructed by the viewing user, Reproduce. As described above, the information processing system 100 may process and display at least a part of a drawing result by the user based on the drawing information set. Thereby, the user who browses can perform desired processing and reproduce the drawing result.
- the information processing system 100 displays the reproduction screen based on the drawing information set.
- the information processing system 100 may perform the reproduction process by projecting a real object including a drawing result by the user using a projection apparatus (that is, a projector).
- FIG. 16 is a diagram for explaining an example of the reproduction process according to the present embodiment.
- the imitation paper 20 on which the illustration 30 is drawn is placed on the top surface of the table 140, and the information processing system 100 projects an image onto the imitation paper 20 using the output unit (projector) 130.
- the information processing system 100 projects different images onto the illustration 30 drawn on the imitation paper 20 by the users 10A, 10B, and 10C, depending on whether the area includes the drawing result by the user 10A. Specifically, the information processing system 100 projects a white image onto the area including the drawing result by the user 10A, that is, the illustration 30A, to make it stand out, and projects a black image onto the area not including the illustration 30A.
- as described above, based on the drawing information set, the information processing system 100 may use the projection device to project, onto the real object including the drawing result by the user, different images for the area including the drawing result by the specific user and the area not including it. Thereby, the browsing user can view only the drawing result of the desired user in a conspicuous manner.
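- The projection just described can be sketched as building a per-pixel image from an ownership mask; the mask representation and names are illustrative assumptions:

```python
# Sketch of building the projection image described above: white is projected
# onto pixels belonging to the target user's drawing result so it stands out,
# and black onto everything else. The per-pixel ownership mask is assumed to
# be derived from the stored drawing information sets.

WHITE, BLACK = (255, 255, 255), (0, 0, 0)

def build_projection_image(owner_mask, target_user):
    """owner_mask: 2-D list where each cell holds the ID of the user whose
    drawing occupies that pixel (or None). Returns the image to project."""
    return [[WHITE if owner == target_user else BLACK for owner in row]
            for row in owner_mask]

mask = [["10A", "10B", None],
        [None, "10A", "10C"]]
image = build_projection_image(mask, "10A")
```

Projecting `image` onto the imitation paper lights only user 10A's strokes, which is the white-over-30A, black-elsewhere behavior described in the text.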
- the information processing system 100 has been described as performing reproduction processing based on information stored in the storage unit 150, but the present technology is not limited to such an example.
- the storage process and the reproduction process may be performed in parallel at different places. In that case, for example, the state of a child drawing at school can be reproduced in real time on a parent's smartphone.
- the information processing system 100 may impose access restrictions during processing using the associated information.
- the information processing system 100 may control whether or not at least a part of information included in the drawing information set can be stored for each user who performs drawing. Further, the information processing system 100 may control whether to display at least a part of information included in the drawing information set for each user who browses. Thereby, the information processing system 100 can appropriately protect various rights such as copyrights and portrait rights.
- the information processing system 100 may store the captured image of a paper material placed in the sensing target range (for example, the top surface of the table 140), the source data of the paper material, or both the captured image and the source data.
- the information processing system 100 can control which information is used to perform the reproduction process according to the operation of the user who browses.
- based on who is browsing (for example, a child, a parent, or a teacher), in other words, depending on the access restrictions (or granted access rights) imposed on the browsing user, the information processing system 100 can control which information is used to perform the reproduction process.
- access restrictions may be switched for each user. For example, any one of a first access restriction in which display is not restricted but storage is restricted, a second access restriction in which neither display nor storage is restricted, and a third access restriction in which both display and storage are restricted can be set for each user.
- the access restriction may be switched for each piece of information included in the drawing information set.
- for example, storage of drawing object information may be restricted. Specifically, during the storage process, whether the information obtained as a sensing result or default information is stored may be switched. In addition, during the reproduction process, whether the reproduction includes colors or is performed in black and white may be switched.
- Access restrictions may be switched for each content.
- Examples of the contents include drawing results drawn by the user, real objects such as paper materials, and text data of paper materials.
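- The three per-user access restrictions described above might be encoded as in the following sketch (the level names and the example user table are assumptions):

```python
# Hypothetical encoding of the three access restrictions, switched per user:
# whether display and/or storage of information in the drawing information
# set is permitted for that user.

RESTRICTIONS = {
    "first":  {"display": True,  "store": False},  # display ok, storage restricted
    "second": {"display": True,  "store": True},   # neither restricted
    "third":  {"display": False, "store": False},  # both restricted
}

user_restriction = {"child": "second", "parent": "first", "guest": "third"}

def is_allowed(user, action):
    """action is "display" or "store"; unknown users get the most
    restrictive level as a safe default."""
    level = user_restriction.get(user, "third")
    return RESTRICTIONS[level][action]
```

Per-information or per-content switching would replace the single level per user with a level per (user, information) or (user, content) pair.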
- FIG. 17 is a block diagram illustrating an example of a hardware configuration of the information processing apparatus according to the present embodiment.
- the information processing apparatus 900 illustrated in FIG. 17 can realize the information processing system 100 illustrated in FIG. 4, for example.
- Information processing by the information processing system 100 according to the present embodiment is realized by cooperation between software and hardware described below.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a.
- the information processing apparatus 900 includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913.
- the information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing unit and a control unit, and controls the overall operation in the information processing apparatus 900 according to various programs. Further, the CPU 901 may be a microprocessor.
- the ROM 902 stores programs used by the CPU 901, calculation parameters, and the like.
- the RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901 can form the processing unit 120 illustrated in FIG. 4.
- the CPU 901, ROM 902, and RAM 903 are connected to each other by a host bus 904a including a CPU bus.
- the host bus 904 a is connected to an external bus 904 b such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 904.
- the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately, and these functions may be mounted on one bus.
- the input device 906 is realized by a device in which information is input by the user, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever.
- the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device such as a mobile phone or a PDA that supports the operation of the information processing device 900.
- the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by the user using the above-described input means and outputs the input signal to the CPU 901.
- a user of the information processing apparatus 900 can input various data and instruct a processing operation to the information processing apparatus 900 by operating the input device 906.
- the input device 906 can be formed by a device that detects information about the user.
- the input device 906 can include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance sensor, and a force sensor.
- the input device 906 may obtain information related to the state of the information processing device 900 itself, such as the posture and movement speed of the information processing device 900, and information related to the surrounding environment of the information processing device 900, such as brightness and noise around the information processing device 900.
- the input device 906 may include a GNSS module that receives a GNSS (Global Navigation Satellite System) signal from a GNSS satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device.
- the input device 906 may detect the position by transmission / reception with Wi-Fi (registered trademark), a mobile phone / PHS / smartphone, or the like, or near field communication.
- the input device 906 can form, for example, the input unit 110 shown in FIG.
- the output device 907 is formed of a device that can notify the user of the acquired information visually or audibly. Examples of such devices include CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, display devices such as laser projectors, LED projectors and lamps, audio output devices such as speakers and headphones, printer devices, and the like.
- the output device 907 outputs results obtained by various processes performed by the information processing device 900. Specifically, the display device visually displays results obtained by various processes performed by the information processing device 900 in various formats such as text, images, tables, and graphs.
- the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it aurally.
- the output device 907 can form, for example, the output unit 130 shown in FIG.
- the storage device 908 is a data storage device formed as an example of a storage unit of the information processing device 900.
- the storage apparatus 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
- the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
- the storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like.
- the storage device 908 can form, for example, the storage unit 150 shown in FIG.
- the drive 909 is a storage medium reader / writer, and is built in or externally attached to the information processing apparatus 900.
- the drive 909 reads information recorded on a removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903.
- the drive 909 can also write information to a removable storage medium.
- connection port 911 is an interface connected to an external device, and is a connection port with an external device capable of transmitting data by USB (Universal Serial Bus), for example.
- the communication device 913 is a communication interface formed by a communication device or the like for connecting to the network 920, for example.
- the communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
- the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various communication, or the like.
- the communication device 913 can transmit and receive signals and the like according to a predetermined protocol such as TCP / IP, for example, with the Internet and other communication devices.
- the communication device 913 can form, for example, the communication unit 160 illustrated in FIG.
- the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
- the network 920 may include a public line network such as the Internet, a telephone line network, and a satellite communication network, various LANs including the Ethernet (registered trademark), a wide area network (WAN), and the like.
- the network 920 may include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
- each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to change the hardware configuration to be used as appropriate according to the technical level at the time of carrying out this embodiment.
- a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be produced and mounted on a PC or the like.
- a computer-readable recording medium storing such a computer program can be provided.
- the recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, a flash memory, or the like.
- the above computer program may be distributed via a network, for example, without using a recording medium.
- the information processing system 100 performs a process of associating the user information about the user obtained through the first sensing process, the drawing process information obtained through the second sensing process and indicating the process of drawing on a real object by the user, and the drawing object information obtained through the third sensing process and indicating the drawing object used for drawing in the drawing process. Since user information, drawing process information, and drawing object information are associated for each drawing operation of a given user, input can be handled appropriately for each user even when a plurality of users perform drawing operations.
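The associating process described above can be sketched as a simple record structure. The following is a minimal illustration (the `DrawingRecord` fields and the `associate` helper are hypothetical names, not part of the disclosure) of binding the three kinds of sensed information together for one drawing operation:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DrawingRecord:
    # User information obtained through the first sensing process.
    user_id: str
    # Drawing process information obtained through the second sensing process:
    # here, the trajectory of the drawing object's tip in three-dimensional space.
    trajectory: List[Tuple[float, float, float]]
    # Drawing object information obtained through the third sensing process.
    pen_id: str
    pen_color: str

def associate(user_id, trajectory, pen_id, pen_color):
    # Perform the associating process: link the three kinds of information.
    return DrawingRecord(user_id, trajectory, pen_id, pen_color)

# Example: one user draws a short stroke with a red pen.
record = associate("user_A", [(0.0, 0.0, 0.0), (0.1, 0.1, 0.0)], "pen_01", "red")
```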
- the information processing system 100 can store information regarding a plurality of drawing results drawn by a plurality of users for each user.
- the information processing system 100 can display only one of a plurality of drawing results drawn by a plurality of users, change how the results overlap, change their arrangement, or otherwise process them appropriately.
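Because each drawing result is associated with the user who drew it, per-user handling reduces to filtering the stored records. A minimal sketch (the dictionary layout is a hypothetical example, not the disclosed format):

```python
# Each record associates a drawing result (stroke) with the user who drew it.
records = [
    {"user": "A", "stroke": [(0, 0), (1, 1)]},
    {"user": "B", "stroke": [(2, 2), (3, 3)]},
    {"user": "A", "stroke": [(4, 4), (5, 5)]},
]

def strokes_by_user(records, user):
    # Select only the drawing results associated with the given user.
    return [r["stroke"] for r in records if r["user"] == user]

only_a = strokes_by_user(records, "A")
```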
- the processing unit 120 and the storage unit 150 may be provided in an apparatus such as a server connected to the input unit 110, the output unit 130, and the communication unit 160 via a network or the like. In that case, information obtained by the input unit 110 or the communication unit 160 is transmitted to the apparatus such as the server via the network or the like, the processing unit 120 processes the drawing information set, and the information to be output by the output unit 130 is sent from the apparatus such as the server to the output unit 130 via the network or the like.
- (1) An information processing apparatus comprising: a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
- (2) The information processing apparatus according to (1), wherein information indicating a drawing result by a specific user among the plurality of users is displayed based on the information on which the associating process has been performed.
- (3) The information processing apparatus according to (1) or (2), wherein a manner of overlapping the drawing results by the plurality of users is displayed based on the information on which the associating process has been performed.
- the information processing apparatus according to any one of (1) to (5), wherein whether or not at least a part of information included in the information subjected to the association processing is displayed is controlled for each browsing user.
- the information processing apparatus according to any one of (1) to (7), wherein the information on which the associating process is performed is stored in a storage unit.
- the information processing apparatus wherein information on which the associating process has been performed, including information identical or similar to an input search condition, is retrieved from the storage unit, and information based on the search result is displayed.
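The retrieval described here, matching stored associated records against an input search condition, can be sketched as a simple filter. The `search` helper and the record layout below are illustrative assumptions, not the disclosed implementation:

```python
def search(records, **conditions):
    # Return records on which the associating process was performed and
    # which contain information equal to every given search condition.
    return [r for r in records
            if all(r.get(key) == value for key, value in conditions.items())]

records = [
    {"user": "A", "pen": "pen_01", "color": "red"},
    {"user": "B", "pen": "pen_02", "color": "blue"},
    {"user": "A", "pen": "pen_02", "color": "blue"},
]

# Search condition: drawings by user A made in blue.
hits = search(records, user="A", color="blue")
```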
- the information processing apparatus according to (8) or (9), wherein whether or not at least a part of information included in the information subjected to the association process is stored is controlled for each user who draws.
- the drawing process information includes at least one of information indicating a locus of a representative point of the drawing object being drawn and information showing a drawing result by the drawing object.
- the information processing apparatus according to (11), wherein the information indicating the trajectory is information indicating the trajectory in a three-dimensional space.
- the information processing apparatus according to any one of (1) to (12), wherein the drawing object information includes at least one of identification information of the drawing object, information indicating a drawing color of the drawing object in the drawing process, information indicating a drawing style, and information indicating a state.
- the information processing apparatus according to any one of the above items, wherein the drawing object information includes information indicating transparency of a drawing result by the drawing object, and the transparency of the drawing result is detected based on a difference between a portion of the drawing result that is overwritten by another drawing result and a portion that does not overlap another drawing result.
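The transparency detection described above can be illustrated with standard alpha compositing: if the stroke's own color is observed where it overlaps nothing, and the underlying result's color is known, the alpha can be solved from the observed color where the stroke overwrites the other result. A hedged sketch in grayscale (the numeric values are illustrative assumptions):

```python
def estimate_alpha(overlap_value, under_value, stroke_value):
    # Solve the compositing equation
    #   overlap = alpha * stroke + (1 - alpha) * under
    # for alpha, i.e. the opacity of the overwriting stroke.
    # stroke_value:  stroke color observed where it overlaps nothing.
    # under_value:   color of the drawing result underneath.
    # overlap_value: observed color where the stroke overwrites it.
    return (overlap_value - under_value) / (stroke_value - under_value)

# A half-transparent black stroke (0.0) over a white result (1.0) reads 0.5.
alpha = estimate_alpha(overlap_value=0.5, under_value=1.0, stroke_value=0.0)
```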
- the information processing apparatus according to any one of (1) to (14), wherein the user information includes at least one of identification information and attribute information of the user.
- the information processing apparatus wherein the user information is at least one of a captured image of the user or of the user during the drawing process, a voice of the user, biological information of the user, or information obtained by processing such information.
- (17) The information processing apparatus according to any one of (1) to (16), wherein the processing unit performs a process of further associating time information with the user information, the drawing process information, and the drawing object information.
- (18) The information processing apparatus according to any one of (1) to (17), wherein the processing unit performs a process of further associating the user information, the drawing process information, and the drawing object information with information relating to a real object existing around the user.
- (19) An information processing method comprising performing, by a processor, a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
- (20) A storage medium storing a program for causing a computer to function as a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
[Problem] To provide a structure which makes it possible to appropriately deal with inputs by a plurality of users.
[Solution] An information processing device provided with a processing unit for performing a process for associating, with one another: user information which is acquired through a first sensing process and which relates to users; drawing process information which is acquired through a second sensing process and which indicates a drawing process with respect to an actual object by the users; and drawing object information which is acquired through a third sensing process and which indicates a drawing object used to perform drawing in the drawing process.
Description
The present disclosure relates to an information processing apparatus, an information processing method, and a storage medium.
Devices that display various types of information in response to user operations on a touch panel, such as smartphones and tablet terminals, are in widespread use. Tablet terminals have also come to feature larger screens, and usage in which a plurality of users operate a terminal at the same time is increasingly being considered. In addition, projectors have long been used as devices for displaying information.
Many techniques for efficiently displaying information in response to user interaction have been proposed. For example, Patent Document 1 below discloses a technique for displaying information according to the environment in which the information is to be displayed and the state of the information already being displayed.
In recent years, an increasing number of devices, including the various information processing apparatuses described above, provide services based on interaction with a user. Such devices typically provide a service to a single user based on interaction with that single user. Regarding extension to a plurality of users, Patent Document 1 studies a method of providing content to each user when there are a plurality of users. However, user input in the presence of a plurality of users has not been sufficiently studied.
Therefore, the present disclosure provides a mechanism that makes it possible to appropriately handle input from a plurality of users.
According to the present disclosure, there is provided an information processing apparatus including a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
Further, according to the present disclosure, there is provided an information processing method including performing, by a processor, a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
Further, according to the present disclosure, there is provided a storage medium storing a program for causing a computer to function as a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
According to the present disclosure, user information, drawing process information, and drawing object information are associated with one another for the drawing operation of a given user. Therefore, even when a plurality of users perform drawing operations, the input of each user can be handled appropriately based on the associated information.
As described above, according to the present disclosure, a mechanism is provided that makes it possible to appropriately handle input from a plurality of users. Note that the above effect is not necessarily limiting; together with or instead of the above effect, any of the effects described in this specification, or other effects that can be grasped from this specification, may be achieved.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be given in the following order.
1. Overview of the information processing system
2. Functional configuration
3. Technical features
3.1. Overview
3.2. Storage processing
3.2.1. Acquisition of sensing information
3.2.2. Acquisition of various information based on sensing information
3.2.3. Storage
3.3. Reproduction processing
3.3.1. Accepting search condition input and selection operations
3.3.2. Reproduction of the reproduction target
3.4. Access restriction
4. Hardware configuration example
5. Conclusion
<< 1. Overview of the Information Processing System >>
A configuration of an information processing system according to an embodiment of the present disclosure will be described with reference to FIG. 1. FIG. 1 is a diagram illustrating a configuration example of an information processing system according to an embodiment of the present disclosure. In this specification, a system may mean a configuration for executing predetermined processing; the system as a whole may be regarded as a single apparatus, or the system may be regarded as being configured by a plurality of apparatuses. The information processing system according to the present embodiment illustrated in FIG. 1 likewise only needs to be able to execute predetermined processing as a whole (for example, the processing realized by the functional configuration illustrated in FIG. 4), and which configuration within the information processing system is regarded as a single apparatus may be arbitrary.
Referring to FIG. 1, an information processing system 100a according to an embodiment of the present disclosure includes an input unit 110a and an output unit 130a.
The output unit 130a visually notifies the user of various types of information by displaying the information on a table 140a. For example, a projector is used as the output unit 130a. As illustrated, the output unit 130a is disposed above the table 140a, for example suspended from the ceiling, at a predetermined distance from the table 140a, and projects information onto the top surface of the table 140a. Such a method of displaying information on the top surface of the table 140a from above is also referred to as a "projection type".
In the following description, the entire area in which information is displayed by the output unit 130a is also referred to as a display screen. For example, the output unit 130a displays, on the display screen, information to be presented to the user as an application is executed by the information processing system 100a. The displayed information is, for example, the operation screen of each application. Hereinafter, each display area on the display screen in which such an application operation screen is displayed is also referred to as a display object. A display object may be a so-called GUI (Graphical User Interface) component (widget).
Here, when the information processing system 100a is a projection type, the output unit 130a may include a lighting device. In that case, the information processing system 100a may control the state of the lighting device, such as turning it on or off, based on the content of the information input by the input unit 110a and/or the content of the information displayed by the output unit 130a.
The output unit 130a may also include a speaker and output various types of information as sound. When the output unit 130a includes speakers, the number of speakers may be one or more. When the output unit 130a includes a plurality of speakers, the information processing system 100a may limit which speakers output sound, or adjust the direction in which sound is output. Of course, the output unit 130a may include a plurality of output devices, for example a projector, a lighting device, and a speaker.
The input unit 110a is a device that inputs the operation content of a user who uses the information processing system 100a. In the example illustrated in FIG. 1, the input unit 110a is provided above the table 140a, for example suspended from the ceiling. The input unit 110a is thus provided apart from the table 140a on which information is displayed. The input unit 110a may be configured by an imaging device capable of capturing the top surface of the table 140a, that is, the display screen. As the input unit 110a, for example, a camera that images the table 140a with one lens, or a stereo camera that images the table 140a with two lenses and can record information in the depth direction, may be used. When the input unit 110a is a stereo camera, for example, a visible light camera or an infrared camera may be used.
When a camera that images the table 140a with one lens is used as the input unit 110a, the information processing system 100a can detect the position of a physical object located on the table 140a (hereinafter also referred to as a real object), for example the user's hand, by analyzing the image captured by the camera (the captured image). When a stereo camera is used as the input unit 110a, the information processing system 100a can acquire, by analyzing the image captured by the stereo camera, depth information (in other words, three-dimensional information) of an object located on the table 140a in addition to its position information. Based on the depth information, the information processing system 100a can detect contact or proximity of the user's hand to the table 140a in the height direction, and separation of the hand from the table 140a. In the following description, the user bringing an operating body such as a hand into contact with, or into proximity to, information on the display screen is collectively referred to simply as "contact".
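The depth-based contact and proximity detection described above can be sketched as a simple threshold test on the hand's height above the table surface. The threshold values and function name below are illustrative assumptions, not values from the disclosure:

```python
CONTACT_THRESHOLD_MM = 10.0    # at or below this height: treated as touching
PROXIMITY_THRESHOLD_MM = 50.0  # at or below this height: treated as near

def classify_hand(height_above_table_mm):
    # Classify the hand state from its height above the table surface,
    # as derived from the stereo camera's depth information.
    if height_above_table_mm <= CONTACT_THRESHOLD_MM:
        return "contact"
    if height_above_table_mm <= PROXIMITY_THRESHOLD_MM:
        return "proximity"
    return "released"
```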
In the present embodiment, the position of an operating body, for example the user's hand, on the display screen (that is, on the top surface of the table 140a) is detected based on the image captured by the input unit 110a, and various types of information are input based on the detected position of the operating body. That is, the user can perform various operation inputs by moving the operating body on the display screen. For example, when contact of the user's hand with a display object is detected, an operation input for that display object is performed. In the following description, the case where the user's hand is used as the operating body will be described as an example, but the present embodiment is not limited to this example, and various operating members such as a stylus may be used as the operating body.
When the input unit 110a is configured by an imaging device, the input unit 110a may capture not only the top surface of the table 140a but also users present around the table 140a. For example, the information processing system 100a can detect the position of a user around the table 140a based on the captured image. The information processing system 100a may also perform personal recognition of a user by extracting physical features that can identify the individual, such as the size of the user's face or body included in the captured image.
Here, the present embodiment is not limited to these examples, and the user's operation input may be performed by other methods. For example, the input unit 110a may be provided as a touch panel on the top surface of the table 140a, and the user's operation input may be detected by the contact of the user's finger or the like with the touch panel. The touch panel may be realized by various methods such as a pressure-sensitive type, a capacitance type, or an optical type. The input unit 110a may also detect a user operation on the table top surface 140a by recognizing the spatial position of an object using ultrasonic reflection, or by detecting and analyzing the vibration of an object to determine the contact position between the object and another object. The input unit 110a may employ any one of these techniques, or any combination of them, for detecting a user operation on the table top surface 140a. The user's operation input may also be detected by a gesture toward the imaging device constituting the input unit 110a. Alternatively, the input unit 110a may include a voice input device such as a microphone that picks up the user's voice and the ambient sound of the surrounding environment. As the voice input device, a microphone array for collecting sound from a specific direction can suitably be used, and the microphone array may be configured so that its sound collection direction can be adjusted to an arbitrary direction. When a voice input device is used as the input unit 110a, operation input may be performed using the collected voice. The information processing system 100a may also perform personal recognition based on the collected voice by analyzing it. Alternatively, the input unit 110a may be configured by a remote control device (a so-called remote controller). The remote controller may be one through which a predetermined instruction is input by operating a predetermined button arranged on it, or one through which a predetermined instruction is input by the user moving the remote controller, with its movement and attitude detected by sensors such as an acceleration sensor or a gyro sensor mounted on it. Furthermore, the information processing system 100a may include other input devices, such as a mouse, keyboard, buttons, switches, and levers (not shown), as the input unit 110a, and user operations may be input via these input devices.
The configuration of the information processing system 100a according to the present embodiment has been described above with reference to FIG. 1. Although not illustrated in FIG. 1, other devices may be connected to the information processing system 100a. For example, a lighting device for illuminating the table 140a may be connected to the information processing system 100a, and the information processing system 100a may control the lighting state of the lighting device according to the state of the display screen.
Here, in the present embodiment, the configuration of the information processing system is not limited to that shown in FIG. 1. The information processing system according to the present embodiment only needs to include an output unit that displays various types of information on a display screen and an input unit that can accept at least operation inputs for the displayed information; its specific configuration is not limited. Other configuration examples of the information processing system according to the present embodiment will be described with reference to FIGS. 2 and 3. FIGS. 2 and 3 are diagrams illustrating other configuration examples of the information processing system according to the present embodiment.
In the information processing system 100b shown in FIG. 2, the output unit 130a is provided below a table 140b. The output unit 130a is, for example, a projector, and projects information from below toward the top plate of the table 140b. The top plate of the table 140b is formed of a transparent material such as a glass plate or a transparent plastic plate, and the information projected by the output unit 130a is displayed on the top surface of the table 140b. Such a method of projecting information from the output unit 130a below the table 140b onto the top surface of the table 140b is also referred to as a "rear projection type".
In the example shown in FIG. 2, an input unit 110b is provided on the top surface of the table 140b. The input unit 110b is configured by, for example, a touch panel, and an operation input by the user is performed when contact of an operating body with the display screen on the top surface of the table 140b is detected by the touch panel. Note that the configuration of the input unit 110b is not limited to this example; as in the information processing system 100a shown in FIG. 1, the input unit 110b may be provided below the table 140b, apart from the table 140b. In this case, the input unit 110b is configured by, for example, an imaging device, and can detect the position of the operating body on the top surface of the table 140b through the top plate formed of a transparent material.
In the information processing system 100c shown in FIG. 3, a touch-panel display is installed on a table with its display screen facing upward. In the information processing system 100c, the input unit 110c and the output unit 130c can be integrally configured as the touch-panel display. That is, various types of information are displayed on the display screen of the display, and a user's operation input is performed when the touch panel detects contact of an operating body with the display screen. In the information processing system 100c as well, an imaging device may be provided above the touch-panel display as the input unit 110c, similarly to the information processing system 100a shown in FIG. 1. The positions of users around the table and the like can be detected by the imaging device.
Other configuration examples of the information processing system according to the present embodiment have been described above with reference to FIGS. 2 and 3. As described above, the information processing system according to the present embodiment can be realized by various configurations. In the following, the present embodiment will be described taking as an example the configuration of the information processing system 100a shown in FIG. 1, in which the input unit 110a and the output unit 130a are provided above the table 140a. However, functions similar to those described below can also be realized by other configurations that can implement the information processing system according to the present embodiment, such as the configurations shown in FIG. 2 or FIG. 3 described above. In the following description, for the sake of simplicity, the information processing system 100a, the input unit 110a, and the output unit 130a are simply referred to as the information processing system 100, the input unit 110, and the output unit 130, respectively.
<< 2. Functional configuration >>
Hereinafter, a functional configuration capable of realizing the information processing system 100 according to the present embodiment described above will be described with reference to FIG. 4. FIG. 4 is a block diagram illustrating an example of the functional configuration of the information processing system 100 according to the present embodiment.
Referring to FIG. 4, the information processing system 100 according to the present embodiment includes, as its functions, an input unit 110, a processing unit 120, an output unit 130, a storage unit 150, and a communication unit 160.
(1) Input unit 110
The input unit 110 is an input interface for inputting various types of information to the information processing system 100. A user can input various types of information to the information processing system 100 via the input unit 110. The input unit 110 corresponds to the input units 110a to 110c shown in FIGS. 1 to 3.
The input unit 110 can include various sensors. The input unit 110 senses users in the sensing target range, user actions, real objects, and the relationships between these and display objects, generates sensing information indicating the sensing results, and outputs the sensing information to the processing unit 120. The sensing target range is not necessarily limited to the top surface of the table 140, and may include, for example, the area around the table 140.
For example, the input unit 110 includes an imaging device, and captures images including a user's body, a user's face, a user's hands, objects positioned on the top surface of the table 140, and the like. Information input via the input unit 110 (for example, information about such captured images) is provided to the processing unit 120 described later, which identifies users, recognizes users' operation inputs, and detects objects. The imaging device may be, for example, a visible-light camera or an infrared camera. As described above, the input unit 110 may also be configured as an imaging device that includes a function as a depth sensor capable of acquiring depth information, such as a stereo camera. Alternatively, the depth sensor may be configured separately from the imaging device, as a sensor using an arbitrary method such as a time-of-flight method or a structured-light method. The input unit 110 may also include a touch sensor. In that case, the touch sensor detects touches on the display screen, while the detection of a user's hand not touching the display screen, and of objects on the display screen, may be ensured by the depth sensor and/or an imaging device that images the display screen from above.
For example, the input unit 110 includes a sound collection device, and collects a user's voice, sounds accompanying a user's actions, environmental sounds, and the like. Information input via the input unit 110 is provided to the processing unit 120 described later, which recognizes a user's voice input and detects movement of objects, for example. The sound collection device may be an array microphone, in which case the sound source direction may be detected by the processing unit 120. The sound collection device is typically configured by a microphone, but may also be configured as a sensor that detects sound from vibrations of light.
(2) Processing unit 120
The processing unit 120 includes various processors such as a CPU or a DSP, and controls the operation of the information processing system 100 by executing various kinds of arithmetic processing. For example, the processing unit 120 processes various types of information obtained from the input unit 110 or the communication unit 160, stores information in the storage unit 150, and causes the output unit 130 to output information. The processing unit 120 may be regarded as an information processing apparatus that processes various types of information. The processing unit 120 includes, as its functions, an association processing unit 121, a storage control unit 123, and a display control unit 125. Note that the processing unit 120 may have functions other than these. Each function of the processing unit 120 is realized by the processors constituting the processing unit 120 operating according to a predetermined program.
The processing unit 120 can function as a first processing unit and a second processing unit.
As the first processing unit, the association processing unit 121 performs processing for associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating the process of the user's drawing on a real object, and drawing object information obtained through a third sensing process and indicating the drawing object used for drawing in the drawing process. In other words, the association processing unit 121 generates information in which these pieces of information are associated, that is, information associating the user information, the drawing process information, and the drawing object information. Here, a sensing process is an information detection process performed by the input unit 110. The first, second, and third sensing processes may each be different sensing processes, at least two of them may be the same, or all three may be the same.
As the second processing unit, the storage control unit 123 and the display control unit 125 perform various kinds of processing using the associated information. Specifically, the storage control unit 123 causes the storage unit 150 to store the information associating the user information, the drawing process information, and the drawing object information. The display control unit 125 causes the output unit 130 to display display information based on the information associating the user information, the drawing process information, and the drawing object information.
Note that the first processing unit and the second processing unit may be included in different information processing apparatuses. That is, the association processing and the processing using the associated information may be executed by separate information processing apparatuses. Alternatively, the association processing unit 121 and the storage control unit 123 may be included in the same information processing apparatus, while the display control unit 125 is included in another information processing apparatus. That is, the processing of storing information in association and the processing using the stored associated information may be executed by separate information processing apparatuses. Hereinafter, the association processing unit 121, the storage control unit 123, and the display control unit 125 will be described as being included in the information processing system 100.
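As a minimal sketch (all names are hypothetical, not from the patent), the association performed by the association processing unit 121 can be pictured as building one record that ties the three kinds of information together, which the second processing unit then consumes, for example for storage keyed by user:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AssociatedRecord:
    """One record associating the three kinds of information described above."""
    user_info: dict        # e.g. {"user_id": "10A", ...} from the first sensing process
    drawing_process: dict  # e.g. {"stroke": [(x, y, t), ...]} from the second
    drawing_object: dict   # e.g. {"object_id": "pen-3", "color": "red"} from the third

def associate(user_info: dict, drawing_process: dict, drawing_object: dict) -> AssociatedRecord:
    """Association processing: bundle the results of the three sensing processes."""
    return AssociatedRecord(user_info, drawing_process, drawing_object)

def store(record: AssociatedRecord, db: dict) -> None:
    """Storage-control step: keep associated records grouped per user."""
    db.setdefault(record.user_info["user_id"], []).append(record)
```

The point of the record is only that the three pieces travel together; the actual storage layout in the storage unit 150 is left open by the description above.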
(3) Output unit 130
The output unit 130 is an output interface for notifying users of various types of information processed by the information processing system 100. The output unit 130 includes a display device such as a display, a touch panel, or a projector, and displays various types of information on the display screen under the control of the display control unit 125. The output unit 130 corresponds to the output units 130a to 130c shown in FIGS. 1 to 3, and displays display objects on the display screen as described above. Note that the present embodiment is not limited to this example; the output unit 130 may further include an audio output device such as a speaker, and may output various types of information as audio.
(4) Storage unit 150
The storage unit 150 is a storage device that temporarily or permanently stores information for the operation of the information processing system 100. For example, the storage unit 150 stores the information associating the user information, the drawing process information, and the drawing object information.
(5) Communication unit 160
The communication unit 160 is a communication interface for transmitting and receiving data to and from external devices in a wired or wireless manner. For example, the communication unit 160 communicates with external devices directly or via a network access point, using a scheme such as wireless LAN (Local Area Network), Wi-Fi (Wireless Fidelity, registered trademark), infrared communication, or Bluetooth (registered trademark). For example, the communication unit 160 communicates information with a user device such as a user's smartphone or a wearable device worn by the user. The communication unit 160 may also acquire information from an SNS (Social Networking Service) or the like, for example by communicating with a server on the Web.
<< 3. Technical features >>
<3.1. Overview>
First, an overview of the operation of the information processing system 100 according to the present embodiment will be described with reference to FIGS. 5 and 6.
FIGS. 5 and 6 are diagrams for explaining an overview of the information processing system 100 according to the present embodiment. As shown in FIG. 5, users 10A, 10B, and 10C are around the table 140, drawing an illustration 30 on imitation paper 20 placed on the table 140. As shown in FIG. 6, the illustration 30 is classified into illustrations 30A, 30B, and 30C. For example, the illustration 30A is a drawing result by the user 10A, the illustration 30B is a drawing result by the user 10B, and the illustration 30C is a drawing result by the user 10C. The illustration 30 is drawn with the illustrations 30A, 30B, and 30C overlapping one another.
The information processing system 100 senses and stores the drawing process of the users 10A, 10B, and 10C using the input unit 110. The drawing process is the manner in which drawing takes place, a concept that includes the state of the drawing users and their surroundings, the time-series change of the drawing result, and the like. By sensing the drawing process in real time, the information processing system 100 can separately store which part of the illustration 30 was drawn by which user. For example, the information processing system 100 stores information about the illustration 30A in association with the user 10A, information about the illustration 30B in association with the user 10B, and information about the illustration 30C in association with the user 10C.
The information processing system 100 can reproduce the illustration 30 with the output unit 130 based on the stored information. In doing so, the information processing system 100 can display only one of the illustrations 30A, 30B, and 30C, change the way they are superimposed, or process them as appropriate. In addition to the illustration 30, the information processing system 100 may also reproduce the state of the users at the time of drawing, the environmental sounds, and the like, in which case a viewing user can relive the past drawing experience.
Although FIGS. 5 and 6 show an illustration as an example of a drawing result, the present technology is not limited to this example. For example, the drawing result may be characters or text, or may be a three-dimensional shaped object.
Processing by the information processing system 100 is roughly divided into storage processing and reproduction processing. Hereinafter, the storage processing and the reproduction processing will be described in detail in order.
<3.2. Storage processing>
FIG. 7 is a flowchart showing an example of the flow of the storage processing executed in the information processing system 100 according to the present embodiment. As shown in FIG. 7, the information processing system 100 acquires sensing information (step S102). Next, the information processing system 100 acquires various types of information based on the sensing information (step S104). For example, the information processing system 100 acquires user information, drawing object information, and drawing process information based on the sensing information. Then, the information processing system 100 stores the acquired information in association with each other (step S106). With this, the storage processing ends.
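As a minimal sketch (all names, and the placeholder derivations, are hypothetical), the three steps of FIG. 7 can be lined up as a small pipeline:

```python
def derive_user_info(sensing: dict) -> dict:
    """Placeholder for S104: the real system recognizes users from images etc."""
    return {"user_id": sensing.get("user_id", "unknown")}

def derive_drawing_object_info(sensing: dict) -> dict:
    """Placeholder for S104: the real system recognizes QR codes / feature points."""
    return {"object_id": sensing.get("object_id", "unknown")}

def derive_drawing_process_info(sensing: dict) -> dict:
    """Placeholder for S104: the real system extracts trajectories / results over time."""
    return {"stroke": sensing.get("stroke", [])}

def storage_processing(sensing: dict, storage: list) -> None:
    """Storage processing following FIG. 7.

    S102 has already produced `sensing`; S104 derives the three kinds of
    information from it; S106 stores them in association with each other.
    """
    storage.append({
        "user": derive_user_info(sensing),
        "drawing_object": derive_drawing_object_info(sensing),
        "drawing_process": derive_drawing_process_info(sensing),
    })
```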
Hereinafter, each step of the storage processing will be described in detail.
<3.2.1. Acquisition of sensing information>
The information processing system 100 can acquire various kinds of sensing information. For example, the information processing system 100 acquires sensing information such as captured images and sound via the input unit 110. The information processing system 100 may also acquire, via the communication unit 160, sensing information from a wearable device worn by a user.
<3.2.2. Acquisition of various information based on sensing information>
The information processing system 100 can acquire various kinds of information based on the sensing information. Examples are described below.
(1) Drawing object information
The drawing object information includes at least one of identification information of the drawing object, information indicating the drawing color of the drawing object in the drawing process, information indicating the drawing style, and information indicating the state of the drawing object.
For example, the information processing system 100 can acquire the drawing object information based on a captured image of the drawing object. Specifically, the information processing system 100 may acquire the drawing object information by querying a database with the image recognition result of identification information attached to the drawing object, such as a QR code (registered trademark), or with feature point information. The information processing system 100 may also acquire the drawing object information by performing image recognition or the like on a drawing result actually drawn by the drawing object.
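As a minimal sketch (the registry contents and recognizer are hypothetical), the database query described above might reduce to resolving a recognized identifier against a table of known drawing objects:

```python
from typing import Optional

# Hypothetical registry of known drawing objects, keyed by the identifier
# recognized from a QR code or from feature points in the captured image.
DRAWING_OBJECT_DB = {
    "pen-001": {"name": "ballpoint pen", "color": "blue", "style": "solid line"},
    "pen-002": {"name": "highlighter", "color": "yellow", "style": "translucent line"},
}

def lookup_drawing_object(recognized_id: str) -> Optional[dict]:
    """Return the drawing object information for a recognized identifier,
    or None if the object is unknown to the database."""
    return DRAWING_OBJECT_DB.get(recognized_id)
```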
- Drawing object
A drawing object is an object used for drawing. The drawing object may be an object with which the user can draw freely, such as a pencil, a ballpoint pen, a fountain pen, a crayon, or a brush. The drawing object may also be an object capable of performing a predetermined drawing, such as a stamp. Further, the drawing object may be an object other than a so-called tool, such as a user's hand.
Drawing can also be realized by various methods other than additive methods such as placing ink on paper. For example, drawing may be realized by a subtractive method, in which case the drawing object is, for example, a carving knife that cuts away a real object, or an eraser that removes drawn content.
Drawing may also be realized three-dimensionally rather than two-dimensionally. For example, the drawing object may be a handheld 3D printer (a so-called 3D pen) capable of three-dimensional drawing.
- Identification information of the drawing object
The identification information of the drawing object may be an ID number uniquely assigned to each drawing object, the name or an image of the drawing object, or the like.
- Information indicating the drawing color of the drawing object
The information indicating the drawing color of the drawing object is, for example, information indicating the color of a pencil lead, the color of ballpoint pen ink, or the like.
- Information indicating the drawing style
The information indicating the drawing style is, for example, information indicating the line type, the degree of blurring, or the like corresponding to the type of drawing object, such as a pencil or a ballpoint pen. The information indicating the drawing style may also be information indicating the shape of a pen tip, the shape of the blade of a carving knife, or the like.
The drawing object information may include, as one kind of information indicating the drawing style, information indicating the transparency of the drawing result. A drawing result is a real object generated as a result of drawing, such as a drawn line, an illustration, or the imprint of a stamp. For example, when the drawing object is a highlighter, the drawing result (that is, the line) has transparency. In the example shown in FIG. 5, the illustration 30 corresponds to a drawing result. A process for detecting the transparency of a drawing result will now be described in detail with reference to FIG. 8.
FIG. 8 is a diagram for explaining the process of detecting the transparency of a drawing result according to the present embodiment. The left part of FIG. 8 shows a state in which a line 31 is drawn first and a transparent line 32 is then drawn over it. The right part of FIG. 8 shows a state in which a line 33 is drawn first and a non-transparent line 34 is then drawn over it. The transparency of a drawing result can be detected based on the difference between the portion of the drawing result that overwrites another drawing result and the portion that does not overlap any other drawing result. Specifically, the information processing system 100 compares region C and region B: when their colors differ, as in the left part, it detects that the drawing result is transparent, and when their colors are the same, as in the right part, it detects that it is not transparent. In addition to the presence or absence of transparency, the information processing system 100 may detect the degree of transparency (0% to 100%). The information processing system 100 may further improve the accuracy of transparency detection based on the color of region A, the portion of the overwritten drawing result that is not itself overwritten, for example by recognizing the degree of color mixing in region C.
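As a minimal sketch (the region sampling and the blending model are assumptions, not stated in the patent), the comparison above can be expressed over averaged RGB colors of the three regions: if the overlap region C matches the new stroke's own color B, the stroke is opaque; otherwise an alpha value can be estimated from how far C lies between the underlying color A and B, assuming simple alpha blending C = alpha*B + (1 - alpha)*A:

```python
def estimate_transparency(region_a, region_b, region_c, tol=2.0):
    """Estimate the alpha (opacity) of the overwriting stroke.

    region_a: average RGB of the underlying stroke where it is NOT overwritten
    region_b: average RGB of the overwriting stroke where it does NOT overlap
    region_c: average RGB where the two strokes overlap
    Returns alpha in [0, 1]; 1.0 means fully opaque (no transparency).
    Assumes the simple blending model C = alpha*B + (1 - alpha)*A.
    """
    # Right part of FIG. 8: C matches B, so the new stroke fully covers the old one.
    if all(abs(c - b) <= tol for c, b in zip(region_c, region_b)):
        return 1.0
    # Left part of FIG. 8: least-squares estimate of alpha over the three channels.
    num = sum((c - a) * (b - a) for a, b, c in zip(region_a, region_b, region_c))
    den = sum((b - a) ** 2 for a, b in zip(region_a, region_b))
    if den == 0:
        return 1.0  # A and B are identical; alpha is unobservable, treat as opaque
    return max(0.0, min(1.0, num / den))
```

This also illustrates the refinement mentioned last: using region A makes it possible to go beyond a transparent/opaque decision and estimate the degree of transparency from the color mixing in region C.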
- Information indicating the state
The information indicating the state of the drawing object in the drawing process is information that can change dynamically, such as the posture of the drawing object, the writing pressure, the ink amount, or the ink viscosity.
(2) Drawing process information
The drawing process information includes at least one of information indicating the trajectory of a representative point of the drawing object during drawing, and information indicating the drawing result produced by the drawing object.
For example, the information processing system 100 acquires the drawing process information based on sensing information for the period from when the drawing object comes into contact with the real object to be drawn on, such as paper, until the contact is released. The information processing system 100 may also acquire the drawing process information based on the time-series change of the position of the drawing object, or the time-series change of the drawing result, obtained by recognizing time-series changes in the captured images.
・Information indicating the trajectory of the representative point of the drawing object
The representative point of the drawing object is the point at which drawing is performed. For example, the representative point of the drawing object is the pen tip.
The information indicating the trajectory of the representative point of the drawing object is information indicating a trajectory in two-dimensional space (that is, on a plane) when drawing is performed two-dimensionally, and information indicating a trajectory in three-dimensional space when drawing is performed three-dimensionally.
The information indicating the trajectory of the representative point of the drawing object is typically vector data.
・Information indicating the drawing result by the drawing object
The information indicating the drawing result by the drawing object is information indicating the drawing result itself. For example, for a complicated drawing result that is difficult to express as a trajectory of the representative point of the drawing object, information indicating the drawing result itself can be acquired.
The information indicating the drawing result by the drawing object is typically pixel data (in other words, an image).
Note that the drawing process information may be vector data, pixel data, or a combination thereof. For example, it may normally be acquired as vector data, and acquired as pixel data when doing so reduces the data amount.
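The two representations and the size-based choice between them can be sketched as follows. This is an illustrative sketch only; the type names, the byte estimate, and the selection rule are assumptions made for the example, not definitions from the publication.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class VectorStroke:
    """Trajectory of the drawing object's representative point (e.g. a pen tip)."""
    points: List[Tuple[float, float]]  # 2-D here; a z coordinate could be added for 3-D drawing

@dataclass
class PixelPatch:
    """The drawing result itself, as pixel data (an image fragment)."""
    width: int
    height: int
    rgba: bytes  # width * height * 4 bytes

def estimated_size(stroke: VectorStroke) -> int:
    # Assume 2 coordinates of 8 bytes per recorded point.
    return len(stroke.points) * 2 * 8

def choose_representation(stroke: VectorStroke, patch: PixelPatch):
    """Normally keep vector data; fall back to pixel data if it is smaller."""
    return patch if len(patch.rgba) < estimated_size(stroke) else stroke
```

A long, densely sampled stroke may be cheaper to store as a small image patch, while a short stroke stays as vector data.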
(3) User information
・Static user information
The user information includes at least one of identification information, attribute information, and biometric information of the user. The identification information of the user may be an ID number uniquely assigned to each user, a personal name, a face image, or the like. The attribute information of the user is, for example, information such as age, sex, occupation, family structure, and friendships.
The information processing system 100 acquires such user information based on sensing results. For example, the information processing system 100 acquires the identification information of a user by face recognition or voice recognition based on a captured image or sound obtained from the input unit 110. The information processing system 100 also acquires the attribute information of a user by image recognition or voice recognition based on a captured image or sound obtained from the input unit 110, or acquires it from a database using the identification information of the user as a search condition. Note that a search condition is information tied to target information, such as a search keyword, and information including information that is the same as or similar to the search condition is obtained as a search result.
・Dynamic user information
The user information includes at least one of a captured image of the user or the user's surroundings in the drawing process, sound of the user or the user's surroundings, biometric information of the user, or information obtained by processing such information.
The information processing system 100 acquires these pieces of information based on sensing results. Captured images and sound are obtained by the input unit 110. Biometric information is acquired based on, for example, sensing results from a wearable device worn by the user.
Note that the biometric information of the user is information such as heartbeat, perspiration, body temperature, and emotion. The information obtained by processing a captured image, sound, or biometric information is, for example, a face recognition result or gesture recognition result based on a captured image, or text data obtained by applying speech recognition (speech-to-text) processing to the sound.
(4) Other information
・Time information
The information processing system 100 may further associate time information with the user information, the drawing process information, and the drawing object information. The time information is, for example, information indicating the time at which sensing was performed. In particular, by being associated with time information, the drawing process information becomes information indicating the time-series change of the drawing.
・Real object information
The information processing system 100 may further associate information about real objects existing around the user with the user information, the drawing process information, and the drawing object information. For example, the information processing system 100 may associate a captured image of a paper document placed in the sensing target range (for example, the top surface of the table 140). Alternatively, the information processing system 100 may acquire the source data of the paper document (for example, a text file when the paper document is text) from a network and associate it. The information processing system 100 may also associate both the captured image and the source data of the paper document. By associating real object information in this way, the information processing system 100 can, in the reproduction process, reproduce in a simulated manner not only the user's drawing result but also the real objects that existed around it. Note that the real object information may include information indicating a real object containing a drawing result by the user, such as the imitation paper 20 shown in FIG. 5. In that case, as described later with reference to FIG. 16, the real object containing the drawing result by the user can be used in the reproduction process.
<3.2.3. Storage>
The information processing system 100 stores the associated information in the storage unit 150. For example, the information processing system 100 stores the user information, the drawing process information, and the drawing object information in association with one another. As described above, the information processing system 100 may further associate and store time information and/or real object information. A set of information subjected to this association process is hereinafter also referred to as a drawing information set.
For example, in the example shown in FIGS. 5 and 6, the information processing system 100 stores a drawing information set related to the illustration 30A drawn by the user 10A. Specifically, the information processing system 100 associates the user information of the user 10A, the drawing process information from when the user 10A drew the illustration 30A, the drawing object information about the drawing object used to draw the illustration 30A, the real object information about the real objects that existed around the user 10A, and the time information, and stores them as a drawing information set. Similarly, the information processing system 100 stores a drawing information set related to the illustration 30B drawn by the user 10B and a drawing information set related to the illustration 30C drawn by the user 10C.
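A drawing information set of this kind could be sketched as a simple record type. The field names and the in-memory list standing in for the storage unit 150 are hypothetical choices for illustration; the publication specifies only which kinds of information are associated, not a schema.

```python
import time
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class DrawingInfoSet:
    """One associated set: user, drawing process, drawing object,
    optional real object information, and time information."""
    user_info: Dict[str, Any]              # identification / attribute / biometric info
    drawing_process: List[dict]            # trajectory (vector) and/or result (pixel) records
    drawing_object: Dict[str, Any]         # e.g. type, color, style, state of the pen
    real_objects: Optional[List[dict]] = None  # captured images / source data around the user
    timestamp: float = field(default_factory=time.time)  # time of sensing

class Storage:
    """Stand-in for the storage unit 150."""
    def __init__(self):
        self._sets: List[DrawingInfoSet] = []

    def store(self, info_set: DrawingInfoSet) -> None:
        self._sets.append(info_set)

    def all(self) -> List[DrawingInfoSet]:
        return list(self._sets)
```

One such set would be stored per drawn illustration (30A, 30B, 30C in the example above).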
<3.3. Reproduction process>
FIG. 9 is a flowchart illustrating an example of the flow of the reproduction process executed in the information processing system 100 according to this embodiment. As shown in FIG. 9, the information processing system 100 accepts input of a search condition (step S202). Next, the information processing system 100 displays information indicating the search results for the input search condition and accepts an operation of selecting a reproduction target (step S204). The information processing system 100 then reproduces the selected reproduction target (step S206). The reproduction process thus ends.
Each step of the reproduction process is described in detail below.
<3.3.1. Accepting search condition input and selection operations>
The information processing system 100 accepts input of a search condition. The search condition here is information that specifies at least part of the information stored in association in the storage unit 150. Note that the user who drew in the storage process described above and the user who views the drawing result reproduced in the reproduction process may be the same or different.
Based on the input search condition, the information processing system 100 searches for drawing information sets that include information that is the same as or similar to the search condition. The information processing system 100 can search for drawing information sets using a variety of information as search conditions.
For example, the information processing system 100 may search for drawing information sets based on a person. In that case, drawing information sets related to drawing results drawn by a specific user are retrieved. For example, when the user inputs a personal name, a search using the personal name as the search condition can be performed. When the user captures and inputs a face image, a search using the face image itself as the search condition can be performed, or a search using a personal name or ID number recognized from the face image as the search condition can be performed.
For example, the information processing system 100 may search for drawing information sets based on a drawing object. In that case, drawing information sets related to drawing results drawn with a specific drawing object are retrieved. For example, when the user inputs the identification information of a drawing object, a search using that identification information as the search condition can be performed. When the user captures and inputs an image of a drawing object, a search using the image itself as the search condition can be performed, or a search using the name or identification information of the drawing object recognized from the image as the search condition can be performed.
For example, the information processing system 100 may search for drawing information sets based on time information. In that case, drawing information sets related to drawing results drawn in a specific time period are retrieved. For example, when the user inputs a time period, a search using time information belonging to the input time period as the search condition can be performed. When the user inputs information indicating a specific drawing result, drawing information sets related to drawing results drawn in the same time period can be retrieved.
For example, the information processing system 100 may search for drawing information sets based on position information. In that case, drawing information sets related to drawing results drawn at a specific position are retrieved. For example, when the user inputs a desired range within the sensing target range of the information processing system 100 (for example, the top surface of the table 140), drawing information sets related to drawing results drawn in that range can be retrieved. This makes it possible to satisfy requests such as "I want the picture I drew around here to be reproduced."
For example, the information processing system 100 may search for drawing information sets based on a drawing result. In that case, drawing information sets that are the same as or similar to the input drawing result are retrieved. For example, drawing information sets related to pictures that are the same as or similar to a picture drawn by the user as the search condition are retrieved.
Note that a search may be performed using a single search condition, or using a search expression in which a plurality of search conditions are combined.
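A combined search of this kind reduces to filtering the stored sets by an AND of the given conditions. The sketch below is purely illustrative: the flat dictionary representation and the substring match for text fields are assumptions, not the publication's matching method (which also covers similarity search over images and drawings).

```python
def matches(info_set: dict, conditions: dict) -> bool:
    """AND-combination of search conditions: every given condition must
    equal (or, for text fields, be contained in) the corresponding field."""
    for key, wanted in conditions.items():
        value = info_set.get(key)
        if isinstance(value, str) and isinstance(wanted, str):
            if wanted not in value:
                return False
        elif value != wanted:
            return False
    return True

def search(info_sets, conditions):
    """Return every drawing information set hit by the search expression."""
    return [s for s in info_sets if matches(s, conditions)]
```

For example, a single condition ("person") and a combined expression ("person AND drawing object") both use the same filter.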
A search condition can be input by various input methods, such as a gesture operation, a touch operation on the display screen, or voice input. For example, the user inputs a search condition on a search condition input screen displayed on the display screen. An example of the search condition input screen is described below with reference to FIG. 10.
FIG. 10 is a diagram for explaining an example of the search condition input screen according to this embodiment. The input screen 40 is displayed on, for example, the display screen. For example, the user selects an input target from the input forms 41 to 45 and inputs a search condition by a gesture operation, voice input, or the like. For example, the input form 41 accepts handwritten input. When the user draws in the input form 41 by moving a finger while keeping it in contact with the display screen, the drawing result is input as a search condition. Thereby, for example, a picture that is the same as or similar to the picture drawn in the input form 41 is searched for. Further, for example, the input form 42 accepts input of user information, the input form 43 accepts input of drawing object information, the input form 44 accepts input of time information, and the input form 45 accepts input of real object information. Although not shown in FIG. 10, the input screen 40 may include input forms for accepting other search conditions.
When a search condition is input, the information processing system 100 searches the storage unit 150 for drawing information sets that include information that is the same as or similar to the input search condition, and displays information based on the search results. For example, the information processing system 100 displays a search result screen showing a list of the search results. The information processing system 100 then accepts an operation of selecting a reproduction target on the search result screen. An example of the search result screen is described below with reference to FIG. 11.
FIG. 11 is a diagram for explaining an example of the search result screen according to this embodiment. The search result screen 50 is displayed on, for example, the display screen. The search result screen 50 is a screen that displays a list of the drawing information sets hit by a search under the input search conditions; for example, six of 100 hit drawing information sets are displayed in blocks 51A to 51F. Each of the blocks 51 (blocks 51A to 51F) includes a thumbnail image 52 and a region 53 in which meta information is described. As the meta information, for example, all or a representative part of the drawing object information, the drawing process information, and the user information can be described. Further, when an access restriction, described later, is imposed on a drawing information set, the block 51 may include an icon 54 indicating the presence and type of the access restriction. The thumbnail image 52 is, for example, an image showing the drawing result; in the example shown in FIGS. 5 and 6, it is a thumbnail image of the illustration 30A, 30B, or 30C. The user selects the block 51 to be reproduced from among the blocks 51 displayed on the search result screen 50. Thereby, the reproduction process based on the drawing information set selected as the reproduction target is performed.
The search condition input screen and the search result screen described above may be displayed simultaneously. An incremental search may then be performed, in which case the search result screen is updated each time a search condition is input or changed on the search condition input screen.
If no drawing information set meets the search conditions, a message to that effect may be displayed. Further, a search result screen obtained by searching with some of the search conditions disabled may be displayed. In this case, it is desirable that the disabled search conditions be clearly indicated.
<3.3.2. Reproduction of the reproduction target>
The information processing system 100 reproduces a drawing result based on a drawing information set. For example, based on the drawing information set selected by the user as the reproduction target on the search result screen, the information processing system 100 performs reproduction by, for example, displaying a display object (for example, a reproduction screen) showing the past drawing result.
For example, the information processing system 100 reproduces a drawing result by performing a simulation that moves the drawing object indicated by the drawing object information according to the trajectory information included in the drawing process information. The information processing system 100 may reproduce a completed drawing result, such as a finished illustration, or may reproduce the time-series change of the drawing result from the start of drawing to the end. The information processing system 100 may also reproduce a drawing result by displaying the pixel data included in the drawing process information as it is. Further, the information processing system 100 may take into account the drawing color, drawing style, state, and the like of the drawing object.
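The trajectory-based simulation above can be sketched roughly as follows. This is a deliberately simplified illustration: the trajectory format, the tiny integer canvas, and marking one pixel per recorded point are assumptions standing in for the actual pen simulation, and the list of intermediate frames stands in for reproducing the time-series change from start to end of drawing.

```python
def replay(trajectory, width=8, height=8):
    """Re-render a stroke by 'moving' the drawing object along the stored
    trajectory of its representative point, marking each visited pixel.

    Returns the list of intermediate canvases (the time-series change);
    the last frame is the completed drawing result.
    """
    canvas = [[0] * width for _ in range(height)]
    frames = []
    for x, y in trajectory:
        canvas[y][x] = 1                          # the pen tip touches (x, y)
        frames.append([row[:] for row in canvas])  # snapshot after each step
    return frames
```

Displaying only the final frame corresponds to reproducing the finished illustration; playing the frames in order corresponds to reproducing the drawing process.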
In addition to the user's drawing result, the information processing system 100 may display an image showing the real objects that were around the user at the time of drawing, display an image showing how the user looked at the time of drawing, or play back the sounds that were heard around the user at the time of drawing. This allows the viewing user to relive the past drawing experience.
Various reproduction methods are conceivable. Examples of the reproduction screen are described below with reference to FIGS. 12 to 15.
FIGS. 12 to 15 are diagrams for explaining examples of the reproduction screen according to this embodiment. The left diagram of each figure shows the illustration 30 drawn on the imitation paper 20 in the example shown in FIGS. 5 and 6, and the information processing system 100 is assumed to have stored the drawing information sets related to the illustration 30. Then, as shown in the right diagram of each figure, the information processing system 100 displays a reproduction screen on the top surface of the table 140 with the output unit 130, based on the stored drawing information sets.
On the reproduction screen 60 shown in FIG. 12, of the illustrations 30 drawn by the users 10A, 10B, and 10C, the information processing system 100 selectively reproduces only the illustration 30A drawn by the user 10A. In this way, based on a drawing information set, the information processing system 100 may display information indicating the drawing result by a specific user among a plurality of users. This allows the viewing user to selectively view only the desired user's drawing result.
On the reproduction screen 61 shown in FIG. 13, the information processing system 100 reproduces the illustrations 30A, 30B, and 30C with their overlapping order changed. Specifically, in the original illustration 30, the illustration 30C was at the back, the illustration 30A in the middle, and the illustration 30B at the front; on the reproduction screen 61, the illustration 30A is at the back, the illustration 30C in the middle, and the illustration 30B at the front. In this way, based on drawing information sets, the information processing system 100 may display the drawing results by a plurality of users with their overlapping order changed. This allows the viewing user to view other users' drawing results as a background while making the desired user's drawing result stand out.
On the reproduction screen 62 shown in FIG. 14, the information processing system 100 reproduces the illustrations 30A, 30B, and 30C with their arrangement changed. Specifically, on the reproduction screen 62, compared with the original illustration 30, the illustrations 30A, 30B, and 30C are reproduced at locations separated from one another. In this way, based on drawing information sets, the information processing system 100 may display the drawing results by a plurality of users with their arrangement changed. This allows the viewing user to view each of the plurality of users' drawing results separately.
On the reproduction screen 63 shown in FIG. 15, the information processing system 100 reproduces the illustrations with the arrangement of the illustrations 30A, 30B, and 30C unchanged but with the line type of the illustration 30B changed. Specifically, for the illustration 30B, the information processing system 100 changes the information indicating the drawing style in the drawing object information of the drawing information set from the original line type to, for example, a line type specified by the viewing user, and then performs the reproduction. In this way, based on a drawing information set, the information processing system 100 may process and display at least part of a user's drawing result. This allows the viewing user to reproduce the drawing result with the desired processing applied.
In the examples described above with reference to FIGS. 12 to 15, the information processing system 100 displays a reproduction screen based on drawing information sets. Alternatively, the information processing system 100 may perform the reproduction process by projecting, with a projection device (that is, a projector), onto a real object containing the drawing results by the users. An example is described below with reference to FIG. 16.
FIG. 16 is a diagram for explaining an example of the reproduction process according to the present embodiment. In the example illustrated in FIG. 16, the imitation paper 20 on which the illustration 30 is drawn is placed on the top surface of the table 140, and the information processing system 100 projects an image onto the imitation paper 20 with the output unit (projector) 130. For the illustration 30 drawn on the imitation paper 20 by the users 10A, 10B, and 10C, the information processing system 100 projects different images onto the region that includes the drawing result by the user 10A and the region that does not. Specifically, the information processing system 100 projects a white image onto the region including the drawing result by the user 10A, that is, the illustration 30A, to make it stand out, and projects a black image onto the region not including the illustration 30A to make it inconspicuous. In this way, based on the drawing information set, the information processing system 100 may use the projection apparatus to project, onto a real object including drawing results by the users, different images for the region that includes the drawing result by a specific user and the region that does not. This allows a viewing user to browse only the drawing result of a desired user, highlighted against the rest.
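One way such a highlighting projection could be realized is to build a mask image from the stored strokes: white around the chosen user's strokes, black everywhere else. The following is a minimal sketch under that assumption; the function, the grid representation, and the sample stroke data are hypothetical stand-ins for the actual projector pipeline.

```python
def highlight_mask(height, width, strokes, user_id, radius=1):
    """Build a projection image as a 2D grid: 255 (white) around the chosen
    user's stroke points, 0 (black) elsewhere -- a hypothetical simplification
    of the region-based projection described above."""
    img = [[0] * width for _ in range(height)]   # black = inconspicuous
    for uid, points in strokes:
        if uid != user_id:
            continue                             # other users' strokes stay dark
        for y, x in points:
            for yy in range(max(0, y - radius), min(height, y + radius + 1)):
                for xx in range(max(0, x - radius), min(width, x + radius + 1)):
                    img[yy][xx] = 255            # white = highlighted
    return img

# Strokes by users 10A and 10B; only 10A's region is brightened.
strokes = [("10A", [(2, 2), (2, 3)]), ("10B", [(7, 7)])]
mask = highlight_mask(10, 10, strokes, "10A")
```

In an actual system the mask would be warped to the projector's coordinate frame before output; the sketch only shows the per-region selection logic.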
Note that the reproduction processing methods described above with reference to FIGS. 12 to 16 may be used in appropriate combination.
・Supplementary note
In the present embodiment, the information processing system 100 has been described as performing the reproduction process based on the information stored in the storage unit 150, but the present technology is not limited to such an example. For example, the storage process and the reproduction process may be performed in parallel at different places. In that case, for example, a child's drawing at school can be reproduced in real time on a parent's smartphone.
<3.4. Access restrictions>
The information processing system 100 may impose access restrictions when performing processing that uses the associated information.
For example, the information processing system 100 may control, for each drawing user, whether at least a part of the information included in the drawing information set can be stored. The information processing system 100 may also control, for each viewing user, whether at least a part of the information included in the drawing information set can be displayed. This enables the information processing system 100 to appropriately protect various rights such as copyrights and portrait rights.
For example, as described above with regard to real object information, the information processing system 100 can associate a captured image of a paper document placed in the sensing target range (for example, the top surface of the table 140), the source data of that document, or both the captured image and the source data. When both the captured image and the source data of the paper document are associated, the information processing system 100 can control which of them the reproduction process is based on, according to the operation of the viewing user. The information processing system 100 can also control which information the reproduction process is based on according to who the viewing user is (for example, a child, a parent, or a teacher), in other words, according to the access restrictions imposed on (or the access rights granted to) the viewing user.
The access restriction may be switched for each user. For example, one of the following may be set for each user: a first access restriction under which display is unrestricted but storage is restricted, a second access restriction under which neither display nor storage is restricted, and a third access restriction under which both display and storage are restricted.
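The three per-user levels described above could be modeled as a small policy table. The following is a minimal sketch under that assumption; the enum values, helper functions, and user IDs are illustrative names, not taken from the disclosure.

```python
from enum import Enum

class AccessLevel(Enum):
    DISPLAY_ONLY = 1       # first restriction: display allowed, storage restricted
    DISPLAY_AND_STORE = 2  # second restriction: neither display nor storage restricted
    RESTRICTED = 3         # third restriction: both display and storage restricted

def may_display(level):
    """True if the user's drawing information may be shown to viewers."""
    return level in (AccessLevel.DISPLAY_ONLY, AccessLevel.DISPLAY_AND_STORE)

def may_store(level):
    """True if the user's drawing information may be written to the storage unit."""
    return level is AccessLevel.DISPLAY_AND_STORE

# Hypothetical per-user settings for users 10A, 10B, and 10C.
policy = {
    "10A": AccessLevel.DISPLAY_AND_STORE,
    "10B": AccessLevel.DISPLAY_ONLY,
    "10C": AccessLevel.RESTRICTED,
}
```

The storage process and the reproduction process would each consult this table before handling a given user's part of the drawing information set.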
The access restriction may be switched for each piece of information included in the drawing information set. For example, storage of drawing object information may be restricted. Specifically, during the storage process, the system may switch between storing the information obtained by sensing and storing default information. Likewise, during the reproduction process, the system may switch between reproducing in full color and reproducing in black and white.
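Switching at the level of an individual field could look like the following sketch, where either the sensed value or a default is stored, and either the stored color or monochrome is reproduced. The function names and values are hypothetical illustrations of the per-field restriction, not part of the disclosure.

```python
def store_drawing_color(sensed_color, storage_allowed, default="black"):
    """During the storage process: keep the sensed drawing color if storage
    of drawing object information is allowed, otherwise store a default."""
    return sensed_color if storage_allowed else default

def reproduce_drawing_color(stored_color, display_allowed):
    """During the reproduction process: reproduce in color if allowed,
    otherwise fall back to black-and-white reproduction."""
    return stored_color if display_allowed else "monochrome"
```

The same pattern would apply to any other restricted field of the drawing information set, with one pair of checks per field.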
The access restriction may also be switched for each piece of content. Examples of such content include drawing results drawn by users, real objects such as paper documents, and the text data of those documents.
<<4. Hardware configuration example>>
Finally, the hardware configuration of the information processing apparatus according to the present embodiment is described with reference to FIG. 17. FIG. 17 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to the present embodiment. The information processing apparatus 900 illustrated in FIG. 17 can realize, for example, the information processing system 100 illustrated in FIG. 4. Information processing by the information processing system 100 according to the present embodiment is realized by cooperation between software and the hardware described below.
As shown in FIG. 17, the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, a RAM (Random Access Memory) 903, and a host bus 904a. The information processing apparatus 900 also includes a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913. The information processing apparatus 900 may include a processing circuit such as an electric circuit, a DSP, or an ASIC instead of, or in addition to, the CPU 901.
The CPU 901 functions as an arithmetic processing device and a control device, and controls the overall operation of the information processing apparatus 900 according to various programs. The CPU 901 may be a microprocessor. The ROM 902 stores programs used by the CPU 901, calculation parameters, and the like. The RAM 903 temporarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during that execution, and the like. The CPU 901 can form, for example, the processing unit 120 illustrated in FIG. 4.
The CPU 901, the ROM 902, and the RAM 903 are connected to one another by the host bus 904a, which includes a CPU bus and the like. The host bus 904a is connected via the bridge 904 to the external bus 904b, such as a PCI (Peripheral Component Interconnect/Interface) bus. Note that the host bus 904a, the bridge 904, and the external bus 904b do not necessarily have to be configured separately; their functions may be implemented on a single bus.
The input device 906 is realized by a device through which the user inputs information, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, or a lever. The input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or an externally connected device such as a mobile phone or a PDA that supports operation of the information processing apparatus 900. Furthermore, the input device 906 may include, for example, an input control circuit that generates an input signal based on the information input by the user using the above input means and outputs the input signal to the CPU 901. By operating the input device 906, the user of the information processing apparatus 900 can input various data to the information processing apparatus 900 and instruct it to perform processing operations.
Alternatively, the input device 906 can be formed by a device that detects information about the user. For example, the input device 906 can include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measurement sensor, and a force sensor. The input device 906 may also acquire information about the state of the information processing apparatus 900 itself, such as its posture and movement speed, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around it. Further, the input device 906 may include a GNSS module that receives a GNSS (Global Navigation Satellite System) signal from a GNSS satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the apparatus. As for position information, the input device 906 may detect the position through transmission and reception with Wi-Fi (registered trademark), a mobile phone, a PHS, a smartphone, or the like, or through near field communication. The input device 906 can form, for example, the input unit 110 shown in FIG. 4.
The output device 907 is formed of a device that can visually or audibly notify the user of the acquired information. Examples of such a device include display devices such as a CRT display device, a liquid crystal display device, a plasma display device, an EL display device, a laser projector, an LED projector, and a lamp; audio output devices such as a speaker and headphones; and a printer device. The output device 907 outputs, for example, results obtained by various processes performed by the information processing apparatus 900. Specifically, the display device visually displays the results obtained by the various processes performed by the information processing apparatus 900 in various formats such as text, images, tables, and graphs. The audio output device, on the other hand, converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly. The output device 907 can form, for example, the output unit 130 shown in FIG. 4.
The storage device 908 is a data storage device formed as an example of the storage unit of the information processing apparatus 900. The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901, various data, various data acquired from the outside, and the like. The storage device 908 can form, for example, the storage unit 150 shown in FIG. 4.
The drive 909 is a reader/writer for storage media, and is built into or externally attached to the information processing apparatus 900. The drive 909 reads information recorded on a mounted removable storage medium such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903. The drive 909 can also write information to the removable storage medium.
The connection port 911 is an interface connected to external devices, and is a connection port for external devices capable of data transmission via, for example, USB (Universal Serial Bus).
The communication device 913 is a communication interface formed by, for example, a communication device for connecting to the network 920. The communication device 913 is, for example, a communication card for wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like. The communication device 913 can transmit and receive signals and the like to and from, for example, the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP. The communication device 913 can form, for example, the communication unit 160 illustrated in FIG. 4.
Note that the network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920. For example, the network 920 may include public networks such as the Internet, a telephone network, and a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks). The network 920 may also include a dedicated network such as an IP-VPN (Internet Protocol-Virtual Private Network).
An example of a hardware configuration capable of realizing the functions of the information processing apparatus 900 according to the present embodiment has been shown above. Each of the above components may be realized using general-purpose members, or may be realized by hardware specialized for the function of that component. Accordingly, the hardware configuration to be used can be changed as appropriate according to the technical level at the time the present embodiment is carried out.
Note that a computer program for realizing each function of the information processing apparatus 900 according to the present embodiment as described above can be produced and installed on a PC or the like. A computer-readable recording medium storing such a computer program can also be provided. The recording medium is, for example, a magnetic disk, an optical disk, a magneto-optical disk, or a flash memory. The computer program may also be distributed, for example, via a network without using a recording medium.
<<5. Summary>>
An embodiment of the present disclosure has been described above in detail with reference to FIGS. 1 to 17. As described above, the information processing system 100 performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating the process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating the drawing object used for drawing in the drawing process. Because user information, drawing process information, and drawing object information are associated with each user's drawing operation, the input of each individual user can be handled appropriately even when a plurality of users perform drawing operations. Specifically, the information processing system 100 can store, for each user, information about the drawing results drawn by a plurality of users. The information processing system 100 can also display the drawing results drawn by a plurality of users with only some of them shown, with their overlapping changed, with their arrangement changed, or processed as appropriate.
The preferred embodiments of the present disclosure have been described in detail above with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally also belong to the technical scope of the present disclosure.
Note that the apparatus described in this specification may be realized as a single apparatus, or part or all of it may be realized as separate apparatuses. For example, in the functional configuration example of the information processing system 100 illustrated in FIG. 4, the processing unit 120 and the storage unit 150 may be provided in an apparatus such as a server connected to the input unit 110, the output unit 130, and the communication unit 160 via a network or the like. When the processing unit 120 and the storage unit 150 are provided in an apparatus such as a server, information obtained by the input unit 110 or the communication unit 160 is transmitted to that apparatus via the network or the like, the processing unit 120 performs the processing that associates the drawing information set, and the information to be output by the output unit 130 is sent from that apparatus to the output unit 130 via the network or the like.
The processes described in this specification using flowcharts do not necessarily have to be executed in the illustrated order. Some processing steps may be performed in parallel, additional processing steps may be employed, and some processing steps may be omitted.
The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure can exhibit, in addition to or instead of the above effects, other effects that are apparent to those skilled in the art from the description of this specification.
Note that the following configurations also belong to the technical scope of the present disclosure.
(1)
An information processing apparatus comprising:
a processing unit configured to perform a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
(2)
The information processing apparatus according to (1), wherein information indicating a drawing result by a specific user among a plurality of users is displayed based on the information subjected to the associating process.
(3)
The information processing apparatus according to (1) or (2), wherein the manner in which drawing results by a plurality of users overlap is changed and displayed based on the information subjected to the associating process.
(4)
The information processing apparatus according to any one of (1) to (3), wherein the arrangement of drawing results by a plurality of users is changed and displayed based on the information subjected to the associating process.
(5)
The information processing apparatus according to any one of (1) to (4), wherein at least a part of a drawing result by the user is processed and displayed based on the information subjected to the associating process.
(6)
The information processing apparatus according to any one of (1) to (5), wherein, based on the information subjected to the associating process, a projection device projects, onto a real object including a drawing result by the user, different images for a region including a drawing result by a specific user and a region not including it.
(7)
The information processing apparatus according to any one of (1) to (5), wherein whether at least a part of the information included in the information subjected to the associating process can be displayed is controlled for each viewing user.
(8)
The information processing apparatus according to any one of (1) to (7), wherein the information subjected to the associating process is stored in a storage unit.
(9)
The information processing apparatus according to (8), wherein information subjected to the associating process that includes information identical or similar to an input search condition is retrieved from the storage unit, and information based on the search result is displayed.
(10)
The information processing apparatus according to (8) or (9), wherein whether at least a part of the information included in the information subjected to the associating process can be stored is controlled for each drawing user.
(11)
The information processing apparatus according to any one of (1) to (10), wherein the drawing process information includes at least one of information indicating a trajectory of a representative point of the drawing object during drawing and information indicating a drawing result by the drawing object.
(12)
The information processing apparatus according to (11), wherein the information indicating the trajectory is information indicating a trajectory in a three-dimensional space.
(13)
The information processing apparatus according to any one of (1) to (12), wherein the drawing object information includes at least one of identification information of the drawing object, information indicating a drawing color of the drawing object in the drawing process, information indicating a drawing style, and information indicating a state.
(14)
The information processing apparatus according to any one of (1) to (13), wherein the drawing object information includes information indicating transparency of a drawing result by the drawing object, and the transparency of the drawing result is detected based on a difference between a portion of the drawing result that overwrites another drawing result and a portion that does not overlap another drawing result.
(15)
The information processing apparatus according to any one of (1) to (14), wherein the user information includes at least one of identification information and attribute information of the user.
(16)
The information processing apparatus according to any one of (1) to (15), wherein the user information includes at least one of a captured image of the user or of the surroundings of the user in the drawing process, sound of the user or of the surroundings of the user, biological information of the user, and information obtained by processing any of these.
(17)
The information processing apparatus according to any one of (1) to (16), wherein the processing unit performs processing for further associating time information with the user information, the drawing process information, and the drawing object information.
(18)
The information processing apparatus according to any one of (1) to (17), wherein the processing unit performs a process of further associating information about real objects existing around the user with the user information, the drawing process information, and the drawing object information.
(19)
An information processing method including:
performing, by a processor, a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
(20)
A storage medium storing a program for causing a computer to function as:
a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
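The associating process that runs through items (1), (17), and (18) links three separately sensed information streams into one record. A minimal Python sketch of such a record (all field names and types are illustrative assumptions, not the disclosed data model):

```python
from dataclasses import dataclass, field
import time

@dataclass
class UserInfo:                      # item (15): identification / attribute info
    user_id: str
    attributes: dict = field(default_factory=dict)

@dataclass
class DrawingProcessInfo:            # items (11)-(12): 3D trajectory and result
    trajectory: list                 # [(x, y, z), ...] of the pen's representative point
    result_image: bytes = b""

@dataclass
class DrawingObjectInfo:             # item (13): pen identity, color, style, state
    object_id: str
    color: str = "black"
    style: str = "pen"
    state: str = "in_use"

@dataclass
class AssociatedRecord:
    user: UserInfo
    process: DrawingProcessInfo
    drawing_object: DrawingObjectInfo
    timestamp: float = field(default_factory=time.time)  # time info, item (17)
    nearby_objects: list = field(default_factory=list)   # surrounding objects, item (18)

def associate(user, process, obj, nearby_objects=()):
    """The associating process: bind the three sensed streams into one record."""
    return AssociatedRecord(user, process, obj, nearby_objects=list(nearby_objects))

rec = associate(UserInfo("u1"),
                DrawingProcessInfo([(0.0, 0.0, 0.0), (1.0, 2.0, 0.0)]),
                DrawingObjectInfo("pen-3", color="red"),
                nearby_objects=["coffee cup"])
```

Records shaped like this can then back the per-user display and storage control of items (7) and (10), and the search over stored records of item (9).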
10 user
20 imitation paper
30 illustration
100 information processing system
110 input unit
120 processing unit
121 association processing unit
123 storage control unit
125 display control unit
130 output unit
140 table
150 storage unit
160 communication unit
Claims (20)
- An information processing apparatus comprising:
a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
- The information processing apparatus according to claim 1, wherein information indicating a drawing result by a specific user among a plurality of users is displayed based on the information subjected to the associating process.
- The information processing apparatus according to claim 1, wherein the way drawing results by a plurality of users overlap one another is changed when they are displayed, based on the information subjected to the associating process.
- The information processing apparatus according to claim 1, wherein the arrangement of drawing results by a plurality of users is changed when they are displayed, based on the information subjected to the associating process.
- The information processing apparatus according to claim 1, wherein at least a part of a drawing result by the user is processed and displayed based on the information subjected to the associating process.
- The information processing apparatus according to claim 1, wherein, based on the information subjected to the associating process, a projection device projects different images onto a real object containing the drawing results, distinguishing a region containing the drawing result of a specific user from a region that does not.
- The information processing apparatus according to claim 1, wherein whether at least part of the information subjected to the associating process is displayed is controlled for each viewing user.
- The information processing apparatus according to claim 1, wherein the information subjected to the associating process is stored in a storage unit.
- The information processing apparatus according to claim 8, wherein associated information containing information identical or similar to an input search condition is retrieved from the storage unit, and information based on the search result is displayed.
- The information processing apparatus according to claim 8, wherein whether at least part of the information subjected to the associating process is stored is controlled for each drawing user.
- The information processing apparatus according to claim 1, wherein the drawing process information includes at least one of information indicating a trajectory of a representative point of the drawing object during drawing and information indicating a drawing result by the drawing object.
- The information processing apparatus according to claim 11, wherein the information indicating the trajectory indicates a trajectory in three-dimensional space.
- The information processing apparatus according to claim 1, wherein the drawing object information includes at least one of identification information of the drawing object, information indicating a drawing color of the drawing object in the drawing process, information indicating a drawing style, and information indicating a state.
- The information processing apparatus according to claim 1, wherein the drawing object information includes information indicating the transparency of a drawing result by the drawing object, and the transparency is detected based on a difference between a portion of the drawing result that overwrites another drawing result and a portion that does not overlap another drawing result.
- The information processing apparatus according to claim 1, wherein the user information includes at least one of identification information and attribute information of the user.
- The information processing apparatus according to claim 1, wherein the user information includes at least one of a captured image of the user or the user's surroundings during the drawing process, sound of the user or the user's surroundings, biological information of the user, or information obtained by processing any of these.
- The information processing apparatus according to claim 1, wherein the processing unit performs a process of further associating time information with the user information, the drawing process information, and the drawing object information.
- The information processing apparatus according to claim 1, wherein the processing unit performs a process of further associating information about real objects existing around the user with the user information, the drawing process information, and the drawing object information.
- An information processing method comprising: performing, by a processor, a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
- A storage medium storing a program for causing a computer to function as: a processing unit that performs a process of associating user information about a user obtained through a first sensing process, drawing process information obtained through a second sensing process and indicating a process of drawing on a real object by the user, and drawing object information obtained through a third sensing process and indicating a drawing object used for drawing in the drawing process.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-025769 | 2017-02-15 | ||
JP2017025769 | 2017-02-15 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018150758A1 true WO2018150758A1 (en) | 2018-08-23 |
Family
ID=63169777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2017/047385 WO2018150758A1 (en) | 2017-02-15 | 2017-12-28 | Information processing device, information processing method, and storage medium |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018150758A1 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006277167A (en) * | 2005-03-29 | 2006-10-12 | Fuji Xerox Co Ltd | Annotation data processing program, system and method |
JP2007233698A (en) * | 2006-03-01 | 2007-09-13 | Just Syst Corp | Web display terminal and annotation processing module |
2017-12-28: PCT/JP2017/047385 filed, published as WO2018150758A1 (active Application Filing)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10922570B1 (en) | 2019-07-29 | 2021-02-16 | NextVPU (Shanghai) Co., Ltd. | Entering of human face information into database |
JP2021022351A (en) * | 2019-07-29 | 2021-02-18 | ネクストヴイピーユー(シャンハイ)カンパニー リミテッドNextvpu(Shanghai)Co.,Ltd. | Method and device for entering face information into database |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11093045B2 (en) | Systems and methods to augment user interaction with the environment outside of a vehicle | |
CN105706456B (en) | Method and apparatus for reproducing content | |
KR101730759B1 (en) | Manipulation of virtual object in augmented reality via intent | |
EP3183640B1 (en) | Device and method of providing handwritten content in the same | |
US10564712B2 (en) | Information processing device, information processing method, and program | |
CN102906671A (en) | Gesture input device and gesture input method | |
JP2013037675A5 (en) | ||
JP2015533003A (en) | Multi-user collaboration with smart pen system | |
US9195697B2 (en) | Correlation of written notes to digital content | |
US11107287B2 (en) | Information processing apparatus and information processing method | |
US20170053449A1 (en) | Apparatus for providing virtual contents to augment usability of real object and method using the same | |
US11367416B1 (en) | Presenting computer-generated content associated with reading content based on user interactions | |
US10290120B2 (en) | Color analysis and control using an electronic mobile device transparent display screen | |
CN107122113A (en) | Generate the method and device of picture | |
CN113168221A (en) | Information processing apparatus, information processing method, and program | |
CN110378318B (en) | Character recognition method and device, computer equipment and storage medium | |
CN110675473B (en) | Method, device, electronic equipment and medium for generating GIF dynamic diagram | |
WO2018150758A1 (en) | Information processing device, information processing method, and storage medium | |
JP7242674B2 (en) | Electronic devices, methods of driving electronic devices, and methods of controlling data recording applications | |
US20200065604A1 (en) | User interface framework for multi-selection and operation of non-consecutive segmented information | |
CN109582203A (en) | Method and apparatus for reproducing content | |
US10565898B2 (en) | System for presenting items | |
KR20180071492A (en) | Realistic contents service system using kinect sensor | |
KR102594106B1 (en) | Control method of application for recording data and recording medium thereof | |
KR20170022860A (en) | Apparatus for providing virtual contents to augment usability of real object and method using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17896376; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17896376; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |