WO2021227918A1 - Interaction method and augmented reality device - Google Patents
Interaction method and augmented reality device
- Publication number
- WO2021227918A1 (PCT/CN2021/091864)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- augmented reality
- action
- reality device
- information
- target
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- This application belongs to the field of communication technology, and specifically relates to an interaction method and an augmented reality device.
- An augmented reality (AR) device can capture the environment in which a video participant is located, thereby enabling face-to-face communication among multiple users. For example, when a user conducts a multi-person video conference through an AR device, each participant can view, through the AR device, the environment in which the other participants are located.
- However, AR devices can only display the collected video images of users and cannot otherwise interact with them, so the interactivity of AR devices is poor.
- The purpose of the embodiments of the present application is to provide an interaction method and an augmented reality device, which can solve the problem of poor interactivity of AR devices.
- In a first aspect, an embodiment of the present application provides an interaction method applied to a first augmented reality device. The method includes: detecting a line of sight direction of a first user and a first action of the first user; and, in a case that the line of sight of the first user is projected on an area where a target identifier is located and the first action meets a first preset condition, sending target information corresponding to the first action to a second augmented reality device.
- The target identifier is an identifier in the projection area of the first augmented reality device.
- The second augmented reality device is an electronic device associated with the target identifier.
- In a second aspect, an embodiment of the present application provides an interaction method applied to a second augmented reality device. The method includes: receiving target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device.
- an embodiment of the present application provides an interaction device, which is applied to a first augmented reality device, and the interaction device includes:
- the detection module is used to detect the line of sight direction of the first user and the first action of the first user;
- The first sending module is configured to, in a case that the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user meets the first preset condition, send target information corresponding to the first action to the second augmented reality device.
- the target identifier is an identifier in the projection area of the first augmented reality device
- the second augmented reality device is an electronic device associated with the target identifier
- an embodiment of the present application provides an interaction device, which is applied to a second augmented reality device, and the interaction device includes:
- a first receiving module configured to receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device;
- The third output module is configured to, in a case that it is detected within a third preset time that a second action of the second user meets a second preset condition, output third prompt information and send response information corresponding to the target information to the first augmented reality device;
- the fourth output module is configured to output fourth prompt information when the second action of the second user that satisfies the second preset condition is not detected within the third preset time.
- An embodiment of the present application provides an augmented reality device. The augmented reality device includes a processor, a memory, and a program or instruction stored in the memory and executable on the processor. When the program or instruction is executed by the processor, the steps of the method according to the first aspect or the second aspect are implemented.
- An embodiment of the present application provides a readable storage medium. The readable storage medium stores a program or instruction, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect or the second aspect are implemented.
- An embodiment of the present application provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is configured to run a program or instruction to implement the method according to the first aspect or the second aspect.
- an embodiment of the present application provides an interaction device, wherein the interaction device is configured to execute the method according to the first aspect or the second aspect.
- In the embodiments of the present application, the line of sight direction of the first user and the first action of the first user are detected; in a case that the line of sight of the first user is projected on the area where the target identifier is located and the first action meets the first preset condition, target information corresponding to the first action is sent to the second augmented reality device, where the target identifier is an identifier in the projection area of the first augmented reality device and the second augmented reality device is an electronic device associated with the target identifier.
- In this way, the first augmented reality device can detect the user's actions, realize interaction between the augmented reality device and the user, participate in the interaction between users, and improve the interaction effect.
- FIG. 1 is one of the flowcharts of the interaction method provided by the embodiment of the present invention.
- Figures 2 to 6 are images displayed by an augmented reality device provided by an embodiment of the present invention.
- FIG. 7 is the second flowchart of the interaction method provided by the embodiment of the present invention.
- FIG. 8 is a structural diagram of a first augmented reality device provided by an embodiment of the present invention.
- FIG. 9 is a structural diagram of a second augmented reality device provided by an embodiment of the present invention.
- FIG. 10 is a structural diagram of a first augmented reality device or a second augmented reality device provided by an embodiment of the present invention.
- the interaction method of the embodiment of the present invention may be applied to augmented reality (Augmented Reality, AR) devices, which may specifically be electronic devices with augmented reality functions or head-mounted augmented reality devices, etc., which are not limited here.
- In an augmented reality video call, each user can view the scene of the other users through the augmented reality device, and each user can also interact with the other users through the augmented reality device.
- The first augmented reality device may collect the first action and determine whether the first preset condition is satisfied, such as whether the action matches a preset action or whether the intensity of the first action is greater than a preset value.
- The first augmented reality device can obtain the action parameters of the first action, including the direction of the action, the force generated, and the speed of the action, generate target information including the action parameters, and send the target information to the second augmented reality device.
- the second augmented reality device may prompt the second user according to the received target information, so that the second user can make an interactive action, thereby realizing the interaction between the first user and the second user.
- When the second augmented reality device receives the prompt information including the action parameters sent by the first augmented reality device, it can output a prompt sound according to the prompt information to simulate a real stereo effect: the volume of the prompt sound corresponds to the magnitude of the action parameters, and the type of prompt sound corresponds to the type of action.
- If the second augmented reality device detects that the second user makes a responsive action within a preset time after receiving the prompt, such as projecting the line of sight on the account identifier corresponding to the first augmented reality device or waving a hand, the account identifier can display a green halo, indicating that the second user is aware of this interactive action and the interaction is successful.
- If no response is detected within the preset time, the prompt is deemed to have been missed or not heard, and the account identifiers on the AR devices of both parties can display a red halo, or a further prompt can be output to prompt the second user to respond. If the second user still does not respond, a red halo may be displayed, indicating that the status of the receiving user is busy.
- In this way, the first user and the second user can interact through the augmented reality devices, and the augmented reality devices can prompt and participate in the interaction between users, improving the interaction effect.
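- The end-to-end flow above can be summarized in pseudocode. The following is a minimal sketch in Python, assuming hypothetical device helpers (`detect_gaze_area`, `detect_first_action`, `poll_response`, `show_halo`) and an assumed 5-second response window; it illustrates the logic only and is not the claimed implementation.

```python
import time

PRESET_ACTIONS = {"handshake", "tap", "wave", "hug"}  # assumed preset actions
RESPONSE_TIMEOUT_S = 5.0                              # assumed "first preset time"

def first_action_meets_condition(action: dict) -> bool:
    # e.g. the action matches a preset action; intensity/speed checks could be added
    return action.get("type") in PRESET_ACTIONS

def interact(first_device, second_device, target_id):
    gaze_area = first_device.detect_gaze_area()       # where the first user's line of sight falls
    action = first_device.detect_first_action()       # e.g. {"type": "tap", "params": {...}}
    if gaze_area != target_id or not first_action_meets_condition(action):
        return
    target_info = {"type": action["type"], "params": action.get("params", {})}
    first_device.send(second_device, target_info)     # send the target information
    deadline = time.monotonic() + RESPONSE_TIMEOUT_S
    while time.monotonic() < deadline:
        response = first_device.poll_response(timeout_s=0.1)
        if response is not None:
            first_device.show_halo(target_id, "green")  # first prompt: interaction succeeded
            return
    first_device.show_halo(target_id, "red")            # second prompt: missed / busy
```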
- the first augmented reality device may also send the file to other augmented reality devices according to the user's first action, thereby simulating the actual file sending effect.
- the interaction method on the side of the first augmented reality device may include the following steps:
- Step 101 Detect the line of sight direction of the first user and the first action of the first user.
- When the first user uses the first augmented reality device, the first augmented reality device can recognize the user's line of sight direction by means of an eye locator or camera. In addition, it can recognize the first action of the first user by means of sensors such as a depth camera, an infrared camera, or an RGB (red, green, blue) camera.
- The above-mentioned first action may specifically include a gesture action, a head action, or a body action, such as shaking hands, shaking the head, or hugging.
- Step 102: In a case where the line of sight of the first user is projected on the area where the target identifier is located, and the first action of the first user meets a first preset condition, send target information corresponding to the first action to the second augmented reality device.
- the target identifier is an identifier in the projection area of the first augmented reality device
- the second augmented reality device is an electronic device associated with the target identifier
- the first augmented reality device can project the content that needs to be displayed to a specific area, which is convenient for the user to watch.
- The above-mentioned projection area may be the area in which the first augmented reality device projects the content to be displayed. By projecting the content, the displayed content can be presented to the user, and the user using the first augmented reality device can view the content of the projection area.
- the projection area of the first augmented reality device may include a target identifier.
- the target identifier may be an account identifier associated with other electronic devices, and may also be a file identifier, an application program identifier, or other identifiers used to implement specific functions.
- The above-mentioned first action meeting the first preset condition can be understood as: the first action matches a preset action, or the moving speed of the first action is within a preset speed range, or the force generated by the first action is within a preset force range, and so on.
- The target information corresponding to the first action can be sent according to the action type and action parameters of the first action, and the target information can be used to indicate the type of information output by the second augmented reality device, including tactile information (such as vibration), display information (such as a yellow halo), sound information (such as a prompt sound), and so on.
- For example, when the first action indicates a handshake, the target information includes prompt information for indicating a handshake; when the target identifier is a file identifier, the target information includes the file corresponding to the file identifier.
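- For illustration, the target information can be thought of as a small structured payload. The sketch below (Python, illustrative field names only; the embodiments do not prescribe a format) shows one possible shape carrying the action type, its parameters, the output types for the second device, and an optional file payload.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class TargetInfo:
    """Illustrative payload for the 'target information' of the embodiments."""
    action_type: str                              # e.g. "handshake", "tap", "file_send"
    direction: Optional[Tuple[float, float, float]] = None  # movement direction of the first action
    speed: Optional[float] = None                 # movement speed
    force: Optional[float] = None                 # force generated by the action
    output_kinds: List[str] = field(default_factory=lambda: ["sound"])  # "tactile", "display", "sound"
    file_payload: Optional[bytes] = None          # present when the target identifier is a file identifier

# Example: a handshake request that should be rendered as sound plus a haptic cue
handshake_info = TargetInfo(action_type="handshake", force=0.6, output_kinds=["sound", "tactile"])
```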
- the augmented reality device can participate in the interaction between users and send the interaction information of the first user to the second user, and the first user and the second user can interact through the augmented reality device, which can improve the interaction effect.
- the present invention acquires the user's line of sight direction and behavior and transmits them to the augmented reality device of the other party, which can bring an interactive communication mode close to the real scene.
- Optionally, the target identifier is a second account identifier, and the method further includes: displaying a prompt identifier in the area where the second account identifier is located.
- the projection area of the first augmented reality device may include one or more account identifiers, and each account identifier may correspond to an electronic device.
- the second account identifier may be an account identifier corresponding to the second augmented reality device.
- a prompt identifier may be displayed in the area where the second account identifier is located, so that the user confirms that the user needs to interact with the user corresponding to the account identifier.
- The prompt identifier can be displayed as flashing, with enhanced color, or as a focus indicator as shown in the right figure in FIG. 2.
- When the first augmented reality device detects that the line of sight of the first user is projected on the area where the second account identifier is located and the duration is longer than a preset duration, the first augmented reality device can display the prompt identifier in the area where the second account identifier is located, thereby reducing errors in account identification.
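- A gaze-dwell check of this kind can be sketched as follows (Python; the 1.5-second threshold and the detector API are assumptions, not values from the application). When `update` returns `True`, the prompt identifier would be shown on the second account identifier.

```python
import time

DWELL_THRESHOLD_S = 1.5  # assumed "preset duration"

class GazeDwellDetector:
    """Report True once the gaze has rested on the second account identifier long enough."""

    def __init__(self) -> None:
        self._start = None  # time when the gaze first landed on the identifier

    def update(self, gaze_on_account: bool, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        if not gaze_on_account:
            self._start = None          # gaze left the area: reset the dwell timer
            return False
        if self._start is None:
            self._start = now           # gaze just arrived: start timing
        return (now - self._start) >= DWELL_THRESHOLD_S
```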
- Optionally, after the first augmented reality device sends the target information to the second augmented reality device, the method further includes: if response information to the target information sent by the second augmented reality device is received within a first preset time, outputting first prompt information to indicate that the interaction succeeded, such as displaying a green halo on the target identifier; or, if the response information sent by the second augmented reality device is not received within the first preset time, outputting second prompt information, such as displaying a red halo on the target identifier, to indicate that the interaction failed.
- the first user taps the target identifier through the first AR device, and the first AR device sends prompt information corresponding to the tap action to the second AR device.
- The second AR device outputs prompt information according to the received prompt. If, within 5 seconds after the prompt information is output, the eye locator recognizes that the second user is looking at the account identifier of the first user, the account identifier of the first user and the account identifier of the second user on both devices each display a green halo, indicating that the second user is aware of this simulated tapping action and the interaction is successful.
- If the second user does not look at the account identifier of the first user within 5 seconds, the prompt is deemed to have been missed, and the account identifier of the first user and the account identifier of the second user on the devices of both parties each display a red halo, indicating that the status of the receiving user is busy.
- the first augmented reality device outputs different prompt messages to prompt the user of the success or failure of the interaction, which is convenient for the user to perform other processing or interaction according to the result of the interaction.
- Optionally, before outputting the second prompt information, the method further includes: if the response information sent by the second augmented reality device is not received within a second preset time and the second augmented reality device is connected to a second wearing glove, sending first information to the second augmented reality device.
- The first information is used to control the second wearing glove to output a tactile sensation.
- In this embodiment, before outputting the second prompt information, the first augmented reality device may further prompt the second augmented reality device, that is, send the first information.
- the second augmented reality device may control the second wearing glove to output tactile sensations according to the first information, so as to prompt the second user to make a response action.
- The second augmented reality device can also prompt the first augmented reality device in the above-mentioned manner, thereby causing the first augmented reality device to control the first wearing glove connected to it to output a tactile sensation.
- the prompts can be made more intuitively and effectively, and the effect of interaction can be improved.
- Optionally, the sending of the target information corresponding to the first action to the second augmented reality device includes: acquiring the action parameters of the first action, and sending corresponding target information to the second augmented reality device according to the action parameters.
- The first action may include actions such as clapping, shaking hands, waving, and hugging.
- The action parameters may include at least one of the following: the movement direction of the action, the movement speed, the strength of the action, and the like.
- the first augmented reality device can obtain the movement direction and speed of the action through the depth camera, RGB camera, etc.
- The strength of the action can be obtained through a smart glove or other sensing device connected to the first augmented reality device, and can also be calculated by using the movement parameters.
- the first augmented reality device may send different information to the second augmented reality device according to the type and size of the action parameter, so as to instruct the second augmented reality device to output the prompt information in a different manner.
- For example, when the action parameters of the first action satisfy a first parameter condition, such as a small intensity, the second augmented reality device can be instructed to prompt with a softer prompt sound; when the action parameters of the first action satisfy a second parameter condition, such as a greater intensity, the second augmented reality device can be instructed to prompt with a louder prompt sound. In both cases, the type and volume of the prompt sound can simulate the sound produced by the first action.
- When the first action is an eye action, the second augmented reality device can be instructed to display an image of the first action; when the first action is a gesture action, the second augmented reality device can be instructed to control the wearing glove to output a tactile sensation.
- the prompts are output in different ways, so that the second user can quickly obtain the interactive actions made by the first user, so as to respond quickly and improve the efficiency and effect of the interaction.
- The first augmented reality device also sends the above-mentioned action parameters to the second augmented reality device, so that the second augmented reality device outputs a corresponding prompt. For example, the prompt sound is loud for a heavy pat and soft for a light tap, which is convenient for the second user to understand the type of action or other information conveyed by the action according to the prompt.
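- The parameter-dependent prompting can be illustrated as follows (Python; the thresholds and the instruction dictionary are assumptions used only to show the branching between the first and second parameter conditions).

```python
def build_prompt_instruction(action_type: str, intensity: float) -> dict:
    """Choose how the second device should prompt, scaled by the action parameters."""
    LIGHT, HEAVY = 0.3, 0.7            # hypothetical first/second parameter conditions
    if intensity <= LIGHT:
        volume = 0.2                   # soft prompt sound for a light tap
    elif intensity >= HEAVY:
        volume = 0.9                   # loud prompt sound for a heavy pat
    else:
        volume = 0.5
    instruction = {"sound": action_type, "volume": volume}
    if action_type in ("handshake", "hug"):
        instruction["tactile"] = {"strength": intensity}   # render through the wearing glove
    elif action_type in ("blink", "wink"):
        instruction["display"] = action_type               # show an image of the eye action
    return instruction
```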
- Optionally, after detecting the first action of the first user, the method further includes: displaying, in the projection area, a first image used to represent the first action.
- After the sending of the target information corresponding to the first action to the second augmented reality device, the method further includes: in a case that response information to the target information sent by the second augmented reality device is received and the response information is used to represent a second action, displaying a third image in the projection area according to the response information, where the action represented by the third image includes the first action and the second action; or, in a case that the response information is not received, switching the first image in the projection area to a fourth image, where the action represented by the fourth image is the opposite of the first action.
- the first action may include interactive actions such as hugging, shaking hands, blinking, blowing kisses, and taking files.
- the first augmented reality device may display a first image representing the first action in the projection area, and the image may be a picture or a video image, so that the first user can obtain the interactive action he is making.
- If the second user makes a second action in response, the second augmented reality device may send response information to the first augmented reality device.
- The response information may include an image used to indicate the second action, or it may be an instruction message used to instruct the first augmented reality device to display, in the projection area, a stored interactive image matching the first action.
- the third image may be displayed according to the response information, and the third image may include images of the first action and the second action that match each other.
- the first augmented reality device sends target information to the second augmented reality device.
- the second augmented reality device displays the image as shown in FIG. 4 according to the target information.
- the first augmented reality device and the second augmented reality device may simultaneously display the handshake image as shown in FIG. 5, indicating that the interaction is successful.
- For example, the first user sends, through the first augmented reality device, target information including a file sending request to the second augmented reality device, and both the first augmented reality device and the second augmented reality device display an image of the file sending request.
- If the second augmented reality device detects a nodding action of the second user (that is, the nodding action serves as the response information to the target information), the first augmented reality device and the second augmented reality device can simultaneously display an image of the file being received, indicating agreement to receive the file.
- If the response information is not received, the action represented by the first image can be withdrawn, that is, the first image is switched to a fourth image representing the withdrawal action.
- For example, if the action represented by the first image is a handshake, the action represented by the fourth image is an action of retracting the hand.
- the first augmented reality device can display the interactive actions of the first user and the second user, and can make a corresponding response according to whether the second user responds, and can further restore the interactive effect in the real scene.
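- The image switching described above can be sketched as follows (Python; the image identifiers and the `display` call are placeholders for whatever rendering the projection area uses).

```python
def update_projection(first_device, response: dict = None) -> None:
    """Switch the interaction image according to whether the second device responded."""
    first_device.display("first_image_hand_extended")        # first image: the first action
    if response and response.get("second_action"):
        first_device.display("third_image_hands_clasped")    # third image: both actions completed
    else:
        first_device.display("fourth_image_hand_withdrawn")  # fourth image: opposite of the first action
```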
- Optionally, the first action is a gesture action, the first augmented reality device is connected to a first wearing glove worn by the first user, and the method further includes:
- in a case that the response information to the target information sent by the second augmented reality device includes tactile information, controlling the first wearing glove to output a tactile sensation corresponding to the tactile information;
- or, in a case that the first action is an action of taking a target item, detecting the distance between a first finger cuff and a second finger cuff of the first wearing glove, and when the distance is less than a preset distance, controlling the first finger cuff to output a first contact force and controlling the second finger cuff to output a second contact force, where the preset distance is determined according to the size of the target item.
- the first augmented reality device is connected with the first wearable glove, so that information interaction can be realized.
- the response information sent by the second augmented reality device includes tactile information, such as touch strength, touch position, etc.
- The first augmented reality device can control the first wearing glove to output the tactile sensation, and the strength and position of the tactile sensation can correspond to the information carried in the tactile information sent by the second augmented reality device.
- For example, the response information includes handshake action information and the strength of the handshake, and the first wearing glove outputs a tactile sensation to the first user according to that strength. In this way, the real scene in which the first user and the second user make gesture actions can be restored, and the interaction effect can be improved.
- the first augmented reality device can determine whether to control the first wearing glove to output tactile sensations according to the response information, which improves the flexibility of interactive reminders.
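- Rendering the received tactile information on the glove might look like the following sketch (Python; the glove API `vibrate` and the field names are assumptions).

```python
def apply_tactile_response(glove, response_info: dict) -> None:
    """Drive the first wearing glove from tactile information carried in the response."""
    tactile = response_info.get("tactile")
    if tactile is None:
        return                                         # no tactile information: skip the haptic prompt
    glove.vibrate(
        position=tactile.get("position", "palm"),      # touch position
        strength=tactile.get("strength", 0.5),         # touch strength, e.g. grip force of a handshake
    )
```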
- the first augmented reality device may also detect the user's first action and the user's line of sight. For example, when the line of sight is projected on the file identifier and the first action meets a preset condition, it is determined to be an action of taking the target item. It is also possible to determine whether to pick up the target item according to parameters such as the action mode and posture. For example, as shown in FIG. 6, the eye locator 1 can identify the direction of the user's eyeballs, and the wearable sensor and the image recognition system 2 can identify the user's gestures. Combining the recognition of the direction of the line of sight and the gesture action can improve the accuracy of the recognition of the user's taking action.
- When the distance between the first finger cuff and the second finger cuff is less than the preset distance, it can be considered that the first finger cuff and the second finger cuff have contacted the target item, and the first finger cuff and the second finger cuff can be controlled to output contact forces to simulate the user's true feeling when picking up an item.
- wearing gloves can detect the pressure of the taking action on each finger cuff, that is, the pressure generated by the taking action.
- the target item is located between the first finger and the second finger.
- According to the pressure on each finger cuff, the first augmented reality device can output to the finger a force that is opposite in direction to the pressure and equal in strength.
- the aforementioned target item may be a virtual item displayed in the projection area.
- The user's action of taking a virtual item is identified by combining the line-of-sight projection parameters with the gesture action, which can improve identification accuracy; and a tactile sensation can be output through the wearing glove, which can present the online interactive action more intuitively and achieve a more realistic effect.
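- The finger-cuff behaviour can be sketched as below (Python; `distance`, `read_pressure`, and `apply_force` are hypothetical glove calls, and deriving the preset distance directly from the item size is an assumption).

```python
def simulate_grasp(glove, item_size_m: float) -> None:
    """Output contact forces once the two finger cuffs close to within the item's size."""
    preset_distance = item_size_m                      # preset distance determined by the item size
    if glove.distance("index_cuff", "thumb_cuff") < preset_distance:
        for cuff in ("index_cuff", "thumb_cuff"):
            pressure = glove.read_pressure(cuff)       # pressure produced by the taking action
            glove.apply_force(cuff, magnitude=pressure, direction="opposing")  # equal and opposite force
```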
- It should be noted that the execution subject of the interaction method may be an interaction device corresponding to the first augmented reality device, or a control module in the interaction device for executing the interaction method. In the embodiments of the present application, the interaction device executing the interaction method is taken as an example to illustrate the interaction method provided in the embodiments of the present application.
- the interaction method on the second augmented reality device side may include the following steps:
- Step 701 Receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device.
- Step 702: When it is detected within the third preset time that the second action of the second user satisfies the second preset condition, output third prompt information, and send response information corresponding to the target information to the first augmented reality device.
- the second action meeting the second preset condition can be understood as a match between the second action and the first action, or that the parameters of the second action meet a specific condition, and so on.
- The second action of the second user can be recognized by the second augmented reality device through devices such as an eyeball locator, a depth camera, an infrared camera, and an RGB camera.
- the second augmented reality device may output third prompt information to prompt the second user that the interaction is successful.
- the second augmented reality device receives prompt information for prompting a handshake request, and when the second user makes a handshake action within a predetermined time, the second augmented reality device outputs a prompt message indicating that the handshake is successful.
- the second augmented reality device may also send response information of the target information to the first augmented reality device, so that the first augmented reality device obtains the interaction result.
- the response information may include image information, or only prompt information used to indicate the second action, and may also include other information.
- the first prompt information on the first augmented reality device is the same as the third prompt information on the second augmented reality device, and the second prompt information is the same as the fourth prompt information.
- step 702 can also be replaced with step 703.
- Step 703 If the second user's second action meeting the second preset condition is not detected within the third preset time, output fourth prompt information.
- the second augmented reality device may output fourth prompt information to prompt the interaction failure.
- the second augmented reality device may also send response information of the target information to the first augmented reality device, so that the first augmented reality device obtains the interaction result.
- the second augmented reality device can obtain the action of the second user to determine whether the interaction is successful, and can send the interaction result to the first augmented reality device, which can improve the interaction effect.
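- On the second device, the response-detection branch (steps 702/703) can be sketched as follows (Python; the device helpers and the 5-second third preset time are assumptions).

```python
import time

THIRD_PRESET_TIME_S = 5.0  # assumed "third preset time"

def handle_target_info(second_device, first_device, target_info: dict) -> None:
    """Wait for a matching second action, then report the interaction result back."""
    second_device.output_prompt(target_info)                       # prompt according to the target information
    deadline = time.monotonic() + THIRD_PRESET_TIME_S
    while time.monotonic() < deadline:
        second_action = second_device.detect_second_action(timeout_s=0.1)
        if second_action and second_device.matches(second_action, target_info):
            second_device.output_prompt("interaction_succeeded")   # third prompt information
            second_device.send(first_device, {"second_action": second_action})
            return
    second_device.output_prompt("interaction_missed")              # fourth prompt information
```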
- Optionally, the second augmented reality device is connected to a second wearing glove worn by the second user, and before the sending of the response information corresponding to the target information to the first augmented reality device, the method further includes: receiving the first information sent by the first augmented reality device; and, according to the first information, controlling the second wearing glove to output a tactile sensation.
- The third prompt information can be output according to the gesture parameters; specifically, a sound simulating the first action can be output according to the type of the first action, so that the second user can obtain a more realistic effect. For example, if the first action is clapping hands, the second augmented reality device can output a stereo clapping sound, and the color of the account identifier can change with the sound; if the first action is a handshake, the second augmented reality device can output a corresponding grip strength to the second user through the smart glove.
- the second augmented reality device can restore the effect of the first action more realistically, thereby improving the interaction effect between the first user and the second user.
- Optionally, when the target information includes first target information, the third prompt information is output in a first manner; or, when the target information includes second target information, the third prompt information is output in a second manner.
- the second augmented reality device may output the prompt information in different ways according to the type of information included in the target information.
- the first target information and the second target information may be information for indicating the output mode of the prompt information, and the first mode and the second mode may be voice output, display screen output, or vibration output, respectively. This can be understood in conjunction with the description on the side of the first augmented reality device, and will not be repeated here.
- the prompt information is output in different ways, which is convenient for the user to obtain the information of the first action according to the prompt information.
- Optionally, the method further includes: displaying, in the projection area of the second augmented reality device, a first target image used to represent the first action.
- After the second action of the second user is detected, a third target image is displayed in the projection area of the second augmented reality device, and the action represented by the third target image includes the first action and the second action.
- the first target image may be the same or different from the first image in the foregoing embodiment
- the third target image may be the same or different from the third image in the foregoing embodiment.
- the first image is a gesture of reaching toward the other party ( Figure 3)
- the first target image is a gesture of reaching toward oneself (Figure 4).
- the second augmented reality device may display the third target image in the projection area to indicate that the interaction is successful.
- participating in user interaction through image presentation can improve the interaction effect, and can prompt the user to improve the interaction efficiency.
- The second augmented reality device can output corresponding prompt content according to the response action made by the second user to indicate that the interaction is successful, which can simulate real interactive scenes more intuitively and improve the interaction effect.
- It should be noted that the execution subject of the interaction method provided by the embodiments of the present application may be the interaction device corresponding to the second augmented reality device, or a control module in the interaction device for executing the interaction method. In the embodiments of the present application, the interaction device executing the interaction method is taken as an example to illustrate the interaction method provided in the embodiments of the present application.
- FIG. 8 is a structural diagram of an interaction device provided by an embodiment of the present invention, which can be applied to a first augmented reality device.
- the interaction device 800 includes:
- the detection module 801 is configured to detect the line of sight direction of the first user and the first action of the first user;
- The first sending module 802 is configured to, in a case that the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user meets the first preset condition, send target information corresponding to the first action to the second augmented reality device.
- the target identifier is an identifier in the projection area of the first augmented reality device
- the second augmented reality device is an electronic device associated with the target identifier
- the interaction device further includes:
- the first output module is configured to output first prompt information when the response information sent by the second augmented reality device is received within the first preset time;
- the second output module is configured to output second prompt information when the response information sent by the second augmented reality device is not received within the first preset time.
- the interaction device further includes:
- The second sending module is configured to, in a case that the response information sent by the second augmented reality device is not received within the second preset time and the second augmented reality device is connected to the second wearing glove, send the first information to the second augmented reality device.
- The first information is used to control the second wearing glove to output a tactile sensation.
- the first sending module includes:
- a sending submodule, configured to: in a case that the action parameter satisfies the first parameter condition, send first target information to the second augmented reality device, where the first target information is used to instruct the second augmented reality device to output prompt information in a first manner; or, in a case that the action parameter satisfies the second parameter condition, send second target information to the second augmented reality device, where the second target information is used to instruct the second augmented reality device to output prompt information in a second manner;
- the interaction device further includes:
- a first display module configured to display a first image for representing the first action in the projection area
- The second display module is configured to, in a case that response information to the target information sent by the second augmented reality device is received and the response information is used to indicate a second action, display a third image in the projection area according to the response information, where the action represented by the third image includes the first action and the second action.
- the first action is a gesture action
- the first augmented reality device is connected to a first wearing glove worn by the first user
- the interaction device further includes:
- the first control module is configured to control the first wearing glove to output the tactile sensation corresponding to the tactile sensation information in the case that the response information to the target information sent by the second augmented reality device includes tactile sensation information;
- The second control module is configured to, in a case that the first action is an action of taking a target item, detect the distance between the first finger cuff and the second finger cuff of the first wearing glove, and, when the distance is less than a preset distance, control the first finger cuff to output a first contact force and control the second finger cuff to output a second contact force, where the preset distance is determined according to the size of the target item.
- the interaction apparatus 800 can implement the interaction method on the first augmented reality device side in the foregoing method embodiment and achieve the same beneficial effects. To avoid repetition, details are not described herein again.
- FIG. 9 is a structural diagram of an interaction device provided by an embodiment of the present invention.
- the interaction device 900 can be applied to a second augmented reality device.
- the interaction device 900 includes:
- the first receiving module 901 is configured to receive target information sent by a first augmented reality device, where the target information corresponds to a first action detected by the first augmented reality device;
- The third output module 902 is configured to, in a case that it is detected within the third preset time that the second action of the second user meets the second preset condition, output third prompt information and send response information corresponding to the target information to the first augmented reality device.
- the fourth output module 903 is configured to output fourth prompt information when the second user's second action that satisfies the second preset condition is not detected within the third preset time.
- the second augmented reality device is connected to a second wearable glove worn by the second user, and the interaction device further includes:
- the second receiving module is configured to receive the first information sent by the first augmented reality device
- the third control module is configured to control the second wearing glove to output tactile sensations according to the first information.
- the interaction device further includes:
- a fifth output module configured to output the third prompt information in a first manner when the target information includes the first target information
- the sixth output module is configured to output the third prompt information in a second manner when the target information includes the second target information.
- the interaction device further includes:
- a third display module configured to display a first target image used to represent the first action in the projection area of the second augmented reality device
- the interaction device further includes:
- the fourth display module is configured to display a third target image in the projection area of the second augmented reality device, and the action represented by the third target image includes the first action and the second action.
- the interaction device 900 can implement the interaction method on the second augmented reality device side in the foregoing method embodiment and achieve the same beneficial effects. To avoid repetition, details are not described herein again.
- the interaction device in the embodiments of the present application may be a device, or a component, integrated circuit, or chip in a terminal.
- the device may be an electronic device.
- The electronic device may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), etc., and the embodiments of this application are not specifically limited thereto.
- the interaction device in the embodiment of the present application may be a device with an operating system.
- The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
- The interaction device provided by the embodiments of the present application can implement each process implemented by the interaction device in the method embodiments of FIG. 1 to FIG. 7. To avoid repetition, details are not described herein again.
- The first augmented reality device can detect the user's actions, so as to enhance the interaction between the augmented reality device and the user, participate in the interaction between users, and improve the interaction effect.
- Optionally, an embodiment of the present application further provides an augmented reality device, including a processor, a memory, and a program or instruction stored in the memory and executable on the processor. When the program or instruction is executed by the processor, each process of the foregoing interaction method embodiments is implemented, and the same technical effect can be achieved. To avoid repetition, details are not described herein again.
- FIG. 10 is a schematic diagram of the hardware structure of an augmented reality device implementing an embodiment of the present application.
- The augmented reality device 1000 includes, but is not limited to: a radio frequency unit 1001, a network module 1002, an audio output unit 1003, an input unit 1004, a sensor 1005, a display unit 1006, a user input unit 1007, an interface unit 1008, a memory 1009, a processor 1010, and other components.
- The augmented reality device 1000 may also include a power source (such as a battery) for supplying power to the various components. The power source may be logically connected to the processor 1010 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
- The structure of the augmented reality device shown in FIG. 10 does not constitute a limitation on the augmented reality device. The augmented reality device may include more or fewer components than shown in the figure, combine some components, or have a different component arrangement, which will not be repeated here.
- The processor 1010 is configured to detect the line of sight direction of the first user and the first action of the first user, and, in a case that the line of sight of the first user is projected on the area where the target identifier is located and the first action of the first user satisfies the first preset condition, send target information corresponding to the first action to the second augmented reality device.
- the target identifier is an identifier in the projection area of the first augmented reality device
- the second augmented reality device is an electronic device associated with the target identifier
- In this way, the first augmented reality device can detect the user's actions, thereby enhancing the interaction between the augmented reality device and the user, participating in the interaction between users, and improving the interaction effect.
- Optionally, the target identifier is a second account identifier, and the processor 1010 is further configured to: when it is detected that the line of sight of the first user is projected on the area where the second account identifier is located for longer than the preset duration, display a prompt identifier in the area where the second account identifier is located.
- Optionally, the first action is a first gesture action, the target identifier is a second account identifier, and the processor 1010 is further configured to: in a case that the response information sent by the second augmented reality device is not received within the second preset time and the second augmented reality device is connected to the second wearing glove, send the first information to the second augmented reality device.
- The first information is used to control the second wearing glove to output a tactile sensation.
- Optionally, the processor 1010 executing the sending of the target information corresponding to the first action to the second augmented reality device includes: acquiring the action parameters of the first action, and sending corresponding target information to the second augmented reality device according to the action parameters.
- processor 1010 is further configured to:
- control the display unit 1006 to display the first image representing the first action in the projection area; and, in a case that response information to the target information sent by the second augmented reality device is received and the response information is used to indicate a second action, display a third image in the projection area according to the response information, where the action represented by the third image includes the first action and the second action.
- the first action is a gesture action
- the first augmented reality device is connected to a first wearing glove worn by the first user
- the processor 1010 is further configured to:
- in a case that the response information to the target information sent by the second augmented reality device includes tactile information, control the first wearing glove to output the tactile sensation corresponding to the tactile information;
- or, in a case that the first action is an action of taking a target item, detect the distance between the first finger cuff and the second finger cuff of the first wearing glove, and, when the distance is less than a preset distance, control the first finger cuff to output a first contact force and control the second finger cuff to output a second contact force, where the preset distance is determined according to the size of the target item.
- the processor 1010 is configured to receive target information sent by the first augmented reality device, where the target information corresponds to the first action detected by the first augmented reality device;
- Optionally, the second augmented reality device is connected to a second wearing glove worn by the second user, and the processor 1010 is further configured to: receive the first information sent by the first augmented reality device, and control, according to the first information, the second wearing glove to output a tactile sensation.
- Optionally, the processor 1010 is further configured to: output the third prompt information in the first manner when the target information includes the first target information; or output the third prompt information in the second manner when the target information includes the second target information.
- Optionally, the processor 1010 is further configured to: display a third target image in the projection area of the second augmented reality device, where the action represented by the third target image includes the first action and the second action.
- the second user can interact with the first user through the second augmented reality device, which can improve the interaction effect.
- the second augmented reality device can make a corresponding prompt according to the received target information, which is convenient for the user to quickly obtain the content of the first action.
- The embodiment of the present application also provides a readable storage medium. The readable storage medium stores a program or instruction, and when the program or instruction is executed by a processor, each process of the foregoing interaction method embodiments on the first augmented reality device side and the second augmented reality device side is implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- the processor is the processor in the augmented reality device described in the foregoing embodiment.
- the readable storage medium includes a computer readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk, or optical disk.
- An embodiment of the present application further provides a chip.
- The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instruction to implement each process of the foregoing interaction method embodiments on the first augmented reality device side and the second augmented reality device side, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- An embodiment of the present application further provides an interaction device configured to execute the processes of the foregoing interaction method embodiments on the first augmented reality device side and the second augmented reality device side, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-level chips, system chips, chip systems, or system-on-chip chips.
- The technical solution of this application, in essence or in the part that contributes to the existing technology, can be embodied in the form of a software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions to enable a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to execute the methods described in the various embodiments of the present application.
Claims (24)
- 1. An interaction method, applied to a first augmented reality device, comprising: detecting a line of sight direction of a first user and a first action of the first user; and, in a case that the line of sight of the first user is projected on an area where a target identifier is located and the first action of the first user meets a first preset condition, sending target information corresponding to the first action to a second augmented reality device; wherein the target identifier is an identifier in a projection area of the first augmented reality device, and the second augmented reality device is an electronic device associated with the target identifier.
- 2. The method according to claim 1, wherein after the sending of the target information corresponding to the first action to the second augmented reality device, the method further comprises: in a case that response information sent by the second augmented reality device is received within a first preset time, outputting first prompt information, wherein the first prompt information is used to indicate that the interaction succeeded; or, in a case that the response information sent by the second augmented reality device is not received within the first preset time, outputting second prompt information, wherein the second prompt information is used to indicate that the interaction failed.
- 3. The method according to claim 2, wherein before the outputting of the second prompt information in the case that the response information sent by the second augmented reality device is not received within the first preset time, the method further comprises: in a case that the response information sent by the second augmented reality device is not received within a second preset time and the second augmented reality device is connected to a second wearing glove, sending first information to the second augmented reality device; wherein the first information is used to control the second wearing glove to output a tactile sensation.
- 4. The method according to claim 1, wherein the sending of the target information corresponding to the first action to the second augmented reality device comprises: acquiring an action parameter of the first action; in a case that the action parameter satisfies a first parameter condition, sending first target information to the second augmented reality device, wherein the first target information is used to instruct the second augmented reality device to output prompt information in a first manner, and the first manner is associated with the action parameter that satisfies the first parameter condition; or, in a case that the action parameter satisfies a second parameter condition, sending second target information to the second augmented reality device, wherein the second target information is used to instruct the second augmented reality device to output prompt information in a second manner, and the second manner is associated with the action parameter that satisfies the second parameter condition.
- 5. The method according to claim 1, wherein after the detecting of the first action of the first user, the method further comprises: displaying, in the projection area, a first image used to represent the first action; and after the sending of the target information corresponding to the first action to the second augmented reality device, the method further comprises: in a case where response information to the target information sent by the second augmented reality device is received and the response information is used to represent a second action, displaying a third image in the projection area according to the response information, the action represented by the third image comprising the first action and the second action; or, in a case where no response information to the target information sent by the second augmented reality device is received, switching the first image in the projection area to a fourth image, the action represented by the fourth image being opposite to the first action.
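Claim 5's image switching can be pictured with a handshake-style exchange: the first image shows the sender's extended hand, the third image shows both hands meeting, and the fourth image shows the hand being withdrawn. The sketch below is only one plausible reading; the image identifiers are placeholders.

```python
# Illustrative rule for which image the first device projects (claim 5);
# the identifiers and the handshake example are assumptions.
from typing import Optional

def pick_projection_image(first_action: str, response_action: Optional[str]) -> str:
    if response_action is not None:
        # Third image: a composite representing both users' actions.
        return f"composite:{first_action}+{response_action}"
    # Fourth image: represents the opposite of the first action (e.g. hand withdrawn).
    return f"reverse:{first_action}"

print(pick_projection_image("extend_hand", "extend_hand"))  # composite image
print(pick_projection_image("extend_hand", None))           # reversed image
```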
- 6. The method according to claim 5, wherein the first action is a gesture action, the first augmented reality device is connected to a first wearable glove worn by the first user, and the method further comprises: in a case where the response information to the target information sent by the second augmented reality device comprises tactile information, controlling the first wearable glove to output a tactile sensation corresponding to the tactile information; or, in a case where the first action is an action of picking up a target item, detecting a distance between a first finger cuff and a second finger cuff of the first wearable glove, and in a case where the distance is less than a preset distance, controlling the first finger cuff to output a first contact force and controlling the second finger cuff to output a second contact force, wherein the preset distance is determined according to the size of the target item.
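For the grip feedback in claim 6, one way to read the preset distance is as a function of the virtual item's size: once the two finger cuffs close to within that distance, each cuff outputs a contact force so the grasp feels solid. The sketch below makes that assumption explicit; the units and force values are invented.

```python
# Hedged model of the claim-6 grip feedback; distances in millimetres and the
# 1.5 N force are assumptions, not values from the patent.
def update_grip_feedback(finger_gap_mm: float, item_size_mm: float, set_cuff_force) -> None:
    preset_distance = item_size_mm            # preset distance derived from the item's size
    if finger_gap_mm < preset_distance:
        set_cuff_force("first_cuff", 1.5)     # first contact force
        set_cuff_force("second_cuff", 1.5)    # second contact force
    else:
        set_cuff_force("first_cuff", 0.0)     # fingers still too far apart to "touch" the item
        set_cuff_force("second_cuff", 0.0)

update_grip_feedback(finger_gap_mm=40.0, item_size_mm=65.0,
                     set_cuff_force=lambda cuff, force: print(cuff, force))
```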
- 7. An interaction method, applied to a second augmented reality device, comprising: receiving target information sent by a first augmented reality device, the target information corresponding to a first action detected by the first augmented reality device; and in a case where it is detected within a third preset time that a second action of a second user satisfies a second preset condition, outputting third prompt information and sending response information corresponding to the target information to the first augmented reality device; or, in a case where no second action of the second user satisfying the second preset condition is detected within the third preset time, outputting fourth prompt information.
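On the receiving side, claim 7 amounts to a bounded wait for a matching gesture from the second user. The polling loop, prompt strings, and callback names below are assumptions used to illustrate that flow, not the patent's implementation.

```python
# Receiver-side sketch of claim 7 (hypothetical names and transport).
import time

def handle_incoming_interaction(detect_action, matches_condition, third_preset: float,
                                prompt, send_response) -> bool:
    """Within `third_preset` seconds, look for a second action that satisfies the
    preset condition; prompt the second user and answer the first device accordingly."""
    start = time.monotonic()
    while time.monotonic() - start < third_preset:
        action = detect_action()               # e.g. a gesture from the second user
        if action is not None and matches_condition(action):
            prompt("interaction completed")    # third prompt information
            send_response({"action": action})  # response corresponding to the target info
            return True
        time.sleep(0.05)
    prompt("no matching action in time")       # fourth prompt information
    return False

# Example: the second user waves back on the first poll.
handle_incoming_interaction(detect_action=lambda: "wave",
                            matches_condition=lambda a: a == "wave",
                            third_preset=2.0, prompt=print,
                            send_response=lambda r: print("reply to first AR device:", r))
```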
- 8. The method according to claim 7, wherein the second augmented reality device is connected to a second wearable glove worn by the second user, and before the sending of the response information corresponding to the target information to the first augmented reality device, the method further comprises: receiving first information sent by the first augmented reality device; and controlling, according to the first information, the second wearable glove to output a tactile sensation.
- 9. The method according to claim 7, wherein: in a case where the target information comprises first target information, the third prompt information is output in a first manner; or, in a case where the target information comprises second target information, the third prompt information is output in a second manner.
- 10. The method according to claim 7, wherein after the receiving of the target information sent by the first augmented reality device, the method further comprises: displaying, in a projection area of the second augmented reality device, a first target image used to represent the first action; and after it is detected within the third preset time that the second action of the second user satisfies the second preset condition, the method further comprises: displaying a third target image in the projection area of the second augmented reality device, the action represented by the third target image comprising the first action and the second action.
- 11. An interaction apparatus, applied to a first augmented reality device, comprising: a detection module, configured to detect a line-of-sight direction of a first user and a first action of the first user; and a first sending module, configured to send target information corresponding to the first action to a second augmented reality device in a case where the line of sight of the first user falls on a region where a target identifier is located and the first action of the first user satisfies a first preset condition; wherein the target identifier is an identifier in a projection area of the first augmented reality device, and the second augmented reality device is an electronic device associated with the target identifier.
- 12. The interaction apparatus according to claim 11, further comprising: a first output module, configured to output first prompt information in a case where response information sent by the second augmented reality device is received within a first preset time; or a second output module, configured to output second prompt information in a case where no response information sent by the second augmented reality device is received within the first preset time.
- 13. The interaction apparatus according to claim 12, further comprising: a second sending module, configured to send first information to the second augmented reality device in a case where no response information sent by the second augmented reality device is received within a second preset time and the second augmented reality device is connected to a second wearable glove; wherein the first information is used to control the second wearable glove to output a tactile sensation.
- 14. The interaction apparatus according to claim 11, wherein the first sending module comprises: an acquiring submodule, configured to acquire an action parameter of the first action; and a sending submodule, configured to send first target information to the second augmented reality device in a case where the action parameter satisfies a first parameter condition, the first target information being used to instruct the second augmented reality device to output prompt information in a first manner; or to send second target information to the second augmented reality device in a case where the action parameter satisfies a second parameter condition, the second target information being used to instruct the second augmented reality device to output prompt information in a second manner.
- 15. The interaction apparatus according to claim 11, further comprising: a first display module, configured to display, in the projection area, a first image used to represent the first action; and a second display module, configured to, in a case where response information to the target information sent by the second augmented reality device is received and the response information is used to represent a second action, display a third image in the projection area according to the response information, the action represented by the third image comprising the first action and the second action.
- 16. The interaction apparatus according to claim 15, wherein the first action is a gesture action, the first augmented reality device is connected to a first wearable glove worn by the first user, and the interaction apparatus further comprises: a first control module, configured to control the first wearable glove to output a tactile sensation corresponding to tactile information in a case where the response information to the target information sent by the second augmented reality device comprises the tactile information; or a second control module, configured to, in a case where the first action is an action of picking up a target item, detect a distance between a first finger cuff and a second finger cuff of the first wearable glove, and in a case where the distance is less than a preset distance, control the first finger cuff to output a first contact force and control the second finger cuff to output a second contact force, wherein the preset distance is determined according to the size of the target item.
- 17. An interaction apparatus, applied to a second augmented reality device, comprising: a first receiving module, configured to receive target information sent by a first augmented reality device, the target information corresponding to a first action detected by the first augmented reality device; and a third output module, configured to, in a case where it is detected within a third preset time that a second action of a second user satisfies a second preset condition, output third prompt information and send response information corresponding to the target information to the first augmented reality device; or a fourth output module, configured to output fourth prompt information in a case where no second action of the second user satisfying the second preset condition is detected within the third preset time.
- 18. The interaction apparatus according to claim 17, wherein the second augmented reality device is connected to a second wearable glove worn by the second user, and the interaction apparatus further comprises: a second receiving module, configured to receive first information sent by the first augmented reality device; and a third control module, configured to control, according to the first information, the second wearable glove to output a tactile sensation.
- 19. The interaction apparatus according to claim 17, further comprising: a fifth output module, configured to output the third prompt information in a first manner in a case where the target information comprises first target information; or a sixth output module, configured to output the third prompt information in a second manner in a case where the target information comprises second target information.
- 20. The interaction apparatus according to claim 17, further comprising: a third display module, configured to display, in a projection area of the second augmented reality device, a first target image used to represent the first action; and a fourth display module, configured to display a third target image in the projection area of the second augmented reality device, the action represented by the third target image comprising the first action and the second action.
- 21. An augmented reality device, comprising a processor, a memory, and a program or instructions stored in the memory and executable on the processor, wherein the program or instructions, when executed by the processor, implement the steps of the interaction method according to any one of claims 1 to 6, or implement the steps of the interaction method according to any one of claims 7 to 10.
- 22. A readable storage medium, wherein a program or instructions are stored on the readable storage medium, and the program or instructions, when executed by a processor, implement the steps of the interaction method according to any one of claims 1 to 6, or implement the steps of the interaction method according to any one of claims 7 to 10.
- 23. A chip, wherein the chip comprises a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement the steps of the interaction method according to any one of claims 1 to 6, or the steps of the interaction method according to any one of claims 7 to 10.
- 24. An interaction apparatus, wherein the interaction apparatus is configured to perform the steps of the interaction method according to any one of claims 1 to 6, or the steps of the interaction method according to any one of claims 7 to 10.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010389104.1 | 2020-05-09 | ||
CN202010389104.1A CN111580661A (en) | 2020-05-09 | 2020-05-09 | Interaction method and augmented reality device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021227918A1 (en) | 2021-11-18 |
Family
ID=72122887
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/091864 WO2021227918A1 (en) | 2020-05-09 | 2021-05-06 | Interaction method and augmented reality device |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111580661A (en) |
WO (1) | WO2021227918A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114338897A (en) * | 2021-12-16 | 2022-04-12 | 杭州逗酷软件科技有限公司 | Object sharing method and device, electronic equipment and storage medium |
CN114385004A (en) * | 2021-12-15 | 2022-04-22 | 北京五八信息技术有限公司 | Interaction method and device based on augmented reality, electronic equipment and readable medium |
CN114578966A (en) * | 2022-03-07 | 2022-06-03 | 北京百度网讯科技有限公司 | Interaction method and device, head-mounted display equipment, electronic equipment and medium |
CN115550886A (en) * | 2022-11-29 | 2022-12-30 | 蔚来汽车科技(安徽)有限公司 | Vehicle-mounted augmented reality equipment control method and system and vehicle-mounted interaction system |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111580661A (en) * | 2020-05-09 | 2020-08-25 | 维沃移动通信有限公司 | Interaction method and augmented reality device |
US11516618B2 (en) | 2020-10-09 | 2022-11-29 | Lemon Inc. | Social networking using augmented reality |
CN114489331A (en) * | 2021-12-31 | 2022-05-13 | 上海米学人工智能信息科技有限公司 | Method, apparatus, device and medium for interaction of separated gestures distinguished from button clicks |
CN115191788B (en) * | 2022-07-14 | 2023-06-23 | 慕思健康睡眠股份有限公司 | Somatosensory interaction method based on intelligent mattress and related products |
CN116540872B (en) * | 2023-04-28 | 2024-06-04 | 中广电广播电影电视设计研究院有限公司 | VR data processing method, device, equipment, medium and product |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108829247A (en) * | 2018-06-01 | 2018-11-16 | 北京市商汤科技开发有限公司 | Exchange method and device based on eye tracking, computer equipment |
CN108933723A (en) * | 2017-05-19 | 2018-12-04 | 腾讯科技(深圳)有限公司 | message display method, device and terminal |
CN109937394A (en) * | 2016-10-04 | 2019-06-25 | 脸谱公司 | Control and interface for user's interaction in Virtual Space |
CN110298925A (en) * | 2019-07-04 | 2019-10-01 | 珠海金山网络游戏科技有限公司 | A kind of augmented reality image processing method, calculates equipment and storage medium at device |
CN110413109A (en) * | 2019-06-28 | 2019-11-05 | 广东虚拟现实科技有限公司 | Generation method, device, system, electronic equipment and the storage medium of virtual content |
CN111580661A (en) * | 2020-05-09 | 2020-08-25 | 维沃移动通信有限公司 | Interaction method and augmented reality device |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10445935B2 (en) * | 2017-05-26 | 2019-10-15 | Microsoft Technology Licensing, Llc | Using tracking to simulate direct tablet interaction in mixed reality |
CN110716647A (en) * | 2019-10-17 | 2020-01-21 | 广州大西洲科技有限公司 | Augmented reality interaction method, device and system |
- 2020-05-09 CN CN202010389104.1A patent/CN111580661A/en active Pending
- 2021-05-06 WO PCT/CN2021/091864 patent/WO2021227918A1/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109937394A (en) * | 2016-10-04 | 2019-06-25 | 脸谱公司 | Control and interface for user's interaction in Virtual Space |
CN108933723A (en) * | 2017-05-19 | 2018-12-04 | 腾讯科技(深圳)有限公司 | message display method, device and terminal |
CN108829247A (en) * | 2018-06-01 | 2018-11-16 | 北京市商汤科技开发有限公司 | Exchange method and device based on eye tracking, computer equipment |
CN110413109A (en) * | 2019-06-28 | 2019-11-05 | 广东虚拟现实科技有限公司 | Generation method, device, system, electronic equipment and the storage medium of virtual content |
CN110298925A (en) * | 2019-07-04 | 2019-10-01 | 珠海金山网络游戏科技有限公司 | A kind of augmented reality image processing method, calculates equipment and storage medium at device |
CN111580661A (en) * | 2020-05-09 | 2020-08-25 | 维沃移动通信有限公司 | Interaction method and augmented reality device |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114385004A (en) * | 2021-12-15 | 2022-04-22 | 北京五八信息技术有限公司 | Interaction method and device based on augmented reality, electronic equipment and readable medium |
CN114338897A (en) * | 2021-12-16 | 2022-04-12 | 杭州逗酷软件科技有限公司 | Object sharing method and device, electronic equipment and storage medium |
CN114338897B (en) * | 2021-12-16 | 2024-01-16 | 杭州逗酷软件科技有限公司 | Method and device for sharing objects, electronic equipment and storage medium |
CN114578966A (en) * | 2022-03-07 | 2022-06-03 | 北京百度网讯科技有限公司 | Interaction method and device, head-mounted display equipment, electronic equipment and medium |
CN114578966B (en) * | 2022-03-07 | 2024-02-06 | 北京百度网讯科技有限公司 | Interaction method, interaction device, head-mounted display device, electronic device and medium |
CN115550886A (en) * | 2022-11-29 | 2022-12-30 | 蔚来汽车科技(安徽)有限公司 | Vehicle-mounted augmented reality equipment control method and system and vehicle-mounted interaction system |
CN115550886B (en) * | 2022-11-29 | 2023-03-28 | 蔚来汽车科技(安徽)有限公司 | Vehicle-mounted augmented reality equipment control method and system and vehicle-mounted interaction system |
Also Published As
Publication number | Publication date |
---|---|
CN111580661A (en) | 2020-08-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021227918A1 (en) | Interaction method and augmented reality device | |
WO2015188614A1 (en) | Method and device for operating computer and mobile phone in virtual world, and glasses using same | |
WO2020078319A1 (en) | Gesture-based manipulation method and terminal device | |
US10701316B1 (en) | Gesture-triggered overlay elements for video conferencing | |
US11703941B2 (en) | Information processing system, information processing method, and program | |
CN104813642A (en) | Methods, apparatuses and computer readable medium for triggering a gesture recognition mode and device pairing and sharing via non-touch gestures | |
CN103955275B (en) | Application control method and apparatus | |
CN105338238B (en) | A kind of photographic method and electronic equipment | |
WO2021227916A1 (en) | Facial image generation method and apparatus, electronic device, and readable storage medium | |
CN111970456B (en) | Shooting control method, device, equipment and storage medium | |
WO2023273372A1 (en) | Gesture recognition object determination method and apparatus | |
CN107888965A (en) | Image present methods of exhibiting and device, terminal, system, storage medium | |
CN112199016A (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
CN108616712A (en) | A kind of interface operation method, device, equipment and storage medium based on camera | |
CN111601064A (en) | Information interaction method and information interaction device | |
US20220291752A1 (en) | Distributed Application Platform Projected on a Secondary Display for Entertainment, Gaming and Learning with Intelligent Gesture Interactions and Complex Input Composition for Control | |
CN112287767A (en) | Interaction control method, device, storage medium and electronic equipment | |
CN114327197B (en) | Message sending method, device, equipment and medium | |
CN108537149B (en) | Image processing method, image processing device, storage medium and electronic equipment | |
Sarkar et al. | Augmented reality-based virtual smartphone | |
WO2022151687A1 (en) | Group photo image generation method and apparatus, device, storage medium, computer program, and product | |
JP2014194675A (en) | Program and communication device | |
CN113096193A (en) | Three-dimensional somatosensory operation identification method and device and electronic equipment | |
CN107968742B (en) | Image display method, device and computer readable storage medium | |
CN105262676A (en) | Method and apparatus for transmitting message in instant messaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21804647, Country of ref document: EP, Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 21804647, Country of ref document: EP, Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A 04.05.2023) |