
CN112286400A - Projection interaction method and device, electronic equipment and computer storage medium - Google Patents


Info

Publication number
CN112286400A
CN112286400A
Authority
CN
China
Prior art keywords
navigation bar
state
area
user
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011014857.0A
Other languages
Chinese (zh)
Inventor
杨晨炜
邱波
虞崇军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Yixian Advanced Technology Co ltd
Original Assignee
Hangzhou Yixian Advanced Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Yixian Advanced Technology Co ltd filed Critical Hangzhou Yixian Advanced Technology Co ltd
Priority to CN202011014857.0A
Publication of CN112286400A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
              • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
                • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
                  • G06F 3/042: Digitisers characterised by opto-electronic transducing means
                    • G06F 3/0425: Digitisers using a single imaging device, e.g. a video camera, for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. a video camera imaging a display or a projection screen, a table or a wall surface, on which a computer-generated image is displayed or projected
              • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
                • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
                  • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
                  • G06F 3/04815: Interaction with a metaphor-based environment or an interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
                • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
                  • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
                    • G06F 3/04883: Input of data by handwriting on a touch-screen or digitiser, e.g. gesture or text
        • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
            • G06V 40/20: Movements or behaviour, e.g. gesture recognition
              • G06V 40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a projection interaction method and apparatus, an electronic device, and a computer storage medium. The projection interaction method comprises: detecting a motion trajectory of a user's hand in a preset sensing area, wherein the sensing area is the imaging area of a projector or the area between the imaging area and the projector; and converting a navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display. The method and apparatus solve the problem of low navigation-bar switching efficiency in AR intelligent projection systems and improve that switching efficiency.

Description

Projection interaction method and device, electronic equipment and computer storage medium
Technical Field
The present invention relates to the field of computer interaction technologies, and in particular, to a projection interaction method and apparatus, an electronic device, and a computer storage medium.
Background
The virtual navigation bar is a function commonly used in intelligent terminal systems; it links the pages of a website or application and is generally displayed across the full screen width or below the interface on the screen. In the related art, when a virtual navigation bar is applied to an interactive Augmented Reality (AR) intelligent projection system, users often want the content in the projector's imaging area to be displayed completely on the imaging plane, so the navigation bar is usually collapsed into a hidden state. When a user needs the navigation bar, the user calls it out by operating in the sensing area so that it is displayed in the imaging area. However, switching the navigation bar between the displayed and hidden states can only be performed after the user pauses other operations, so the switching efficiency of the navigation bar is low.
For the problem of low navigation-bar switching efficiency in AR intelligent projection systems in the related art, no effective solution has been proposed so far.
Disclosure of Invention
The embodiments of the present application provide a projection interaction method and apparatus, an electronic device, and a computer storage medium, so as to at least solve the problem of low navigation-bar switching efficiency in AR intelligent projection systems in the related art.
In a first aspect, an embodiment of the present application provides a projection interaction method, where the method includes:
detecting a motion track of a hand of a user in a preset sensing area, wherein the sensing area is an imaging area of a projector or an area between the imaging area and the projector;
and converting the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display.
In some of these embodiments, in both the first state and the second state, the navigation bar is displayed in the middle region of the top end of the imaging area.
In some embodiments, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
reducing the display area in which the navigation bar is displayed in the imaging area according to the display positions of the display main body and the operation buttons in the imaging area, wherein the display area does not intersect the display positions.
In some embodiments, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
in a case where the display positions intersect a preset navigation bar reserved area, reducing the display area to the navigation bar reserved area and setting the transparency of the navigation bar to a preset transparency threshold.
In some embodiments, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
hiding the navigation bar in a case where the display positions intersect a preset navigation bar reserved area.
In some embodiments, the motion trajectory includes a motion direction and a motion displacement amount of the user's hand, wherein the motion displacement amount is the distance between the current position of the user's hand and the bottom of the imaging area, and converting the navigation bar from the first state to the second state according to the motion trajectory comprises:
and converting the navigation bar from the first state to the second state according to the movement direction and the movement displacement of the hand of the user.
In some embodiments, converting the navigation bar from the first state to the second state according to the movement direction and movement displacement of the user's hand comprises:
converting the navigation bar from the first state to the second state in a case where the motion direction meets a preset direction condition and the motion displacement reaches a preset displacement threshold.
In a second aspect, an embodiment of the present application provides a projection interaction apparatus, which comprises a detection module and a state conversion module.
the detection module is used for detecting a motion track of a hand of a user in a preset sensing area, wherein the sensing area is an imaging area of a projector or an area between the imaging area and the projector;
the state conversion module is configured to convert the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display.
In a third aspect, an embodiment of the present application provides an electronic device, which comprises a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the projection interaction method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the projection interaction method according to the first aspect.
Compared with the related art, the projection interaction method provided by the embodiments of the present application detects the motion trajectory of the user's hand in a preset sensing area, wherein the sensing area is the imaging area of the projector or the area between the imaging area and the projector, and converts the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display. This solves the problem of low navigation-bar switching efficiency in AR intelligent projection systems and improves that switching efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic diagram of an application environment of a projection interaction method according to an embodiment of the application;
FIG. 2 is a flow chart of a projection interaction method according to an embodiment of the application;
FIG. 3 is a diagram illustrating a display mode of a navigation bar in the related art;
FIG. 4 is a schematic diagram of a display mode of a navigation bar according to an embodiment of the application;
FIG. 5 is a schematic structural diagram of a projection interaction device according to an embodiment of the present application;
fig. 6 is an internal structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the present application, and that it is also possible for a person skilled in the art to apply the present application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein shall have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a", "an", "the", and similar words throughout this application are not to be construed as limiting in number and may refer to the singular or the plural. The terms "including", "comprising", "having", and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. References to "connected", "coupled", and the like are not limited to physical or mechanical connections but may include electrical connections, whether direct or indirect. The term "plurality" means two or more. "And/or" describes an association relationship between associated objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. The character "/" generally indicates an "or" relationship between the preceding and following associated objects. The terms "first", "second", "third", and the like merely distinguish similar objects and do not denote a particular ordering.
In existing intelligent terminal systems, the navigation bar is usually located at the bottom of the operation interface, for example the several buttons at the bottom of an Android system interface, which implement functions such as back, home, and the task manager. In the navigation-bar-free mode of existing intelligent terminal systems, the user's intention is inferred from gesture actions in the absence of a navigation bar, for example a swipe from the edge of the phone screen toward the middle, so that the terminal system can perform the corresponding function. When the navigation-bar-free mode is applied to an AR intelligent projection system, the design requirements on mid-air gestures are high in order to prevent the low operation success rate caused by gesture misrecognition, which imposes a heavy learning burden on user groups such as children. Even if users learn to perform various gestures accurately, the navigation bar can be switched between the displayed and hidden states only after other operations are paused, so the switching efficiency of the navigation bar is low.
Fig. 1 is a schematic diagram of an application environment of a projection interaction method according to an embodiment of the present application. The projection interaction method provided in the present application may be applied to the application environment shown in Fig. 1: a terminal 12 communicates with a projector 14 through a network, and the display content of the terminal 12 is projected by the projector 14 into an imaging area on an imaging plane. The projector 14 detects the motion trajectory of a user's hand in a preset sensing area, wherein the sensing area is the imaging area or the area between the imaging area and the projector 14; the projector 14 then converts the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display. The terminal 12 may be, but is not limited to, a personal computer, notebook computer, smartphone, tablet computer, or portable wearable device, and the imaging plane may be, but is not limited to, the surface of an imaging carrier 16 such as a desk.
The embodiment provides a projection interaction method. Fig. 2 is a flowchart of a projection interaction method according to an embodiment of the present application, and as shown in fig. 2, the flowchart includes the following steps:
s210, detecting the motion track of the hand of the user in a preset sensing area. The sensing region is an imaging region of the projector or a region between the imaging region and the projector. The imaging area is in an imaging plane of the imaging support, such as on a solid table directly opposite the projector.
S220, converting the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display. After the projector is started, the navigation bar is in the first state by default, i.e., it remains displayed in the imaging area; when the user needs to operate the navigation bar, the user can do so directly without first switching it from hidden to displayed, which is convenient. When the user wants the display content in the imaging area to be shown completely, the projector converts the navigation bar into the second state according to the motion trajectory of the user's hand detected in the sensing area. In the second state, the navigation bar is retracted and hidden for the preset time period and then automatically pops up and is displayed in the imaging area. For example, if the time period is 5 seconds, the navigation bar automatically pops up and remains displayed in the imaging area after being retracted and kept hidden for 5 seconds.
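As a concrete illustration of the two states, consider the following minimal Python sketch (illustrative only, not part of the original disclosure; the class name NavigationBar and the parameter hide_period_s are hypothetical). It models a bar that stays displayed in the first state and, in the second state, retracts for a preset period before automatically popping back up:

```python
import threading

FIRST_STATE = "kept_displayed"    # navigation bar remains shown in the imaging area
SECOND_STATE = "hide_then_popup"  # hidden for a preset period, then pops up again

class NavigationBar:
    """Hypothetical model of the navigation bar's two display states."""

    def __init__(self, hide_period_s: float = 5.0):
        self.hide_period_s = hide_period_s  # preset time period, e.g. 5 seconds
        self.state = FIRST_STATE            # default state after the projector starts
        self.visible = True

    def enter_second_state(self) -> None:
        """Retract and hide the bar, then pop it up after the preset period."""
        self.state = SECOND_STATE
        self.visible = False  # retract and hide
        threading.Timer(self.hide_period_s, self._popup).start()

    def _popup(self) -> None:
        self.visible = True   # automatically pops up and remains displayed

    def enter_first_state(self) -> None:
        self.state = FIRST_STATE
        self.visible = True
```

With hide_period_s set to 5.0, calling enter_second_state() hides the bar and schedules it to reappear 5 seconds later, matching the behaviour described above.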
Through the above steps, the projector can switch the navigation bar in the imaging area, according to the motion trajectory of the user's hand detected in the sensing area, from the state of remaining displayed to the state of being retracted and hidden for the preset time period and then automatically popping up again. This avoids the situation in which the hidden navigation bar can only be switched to the displayed state by additional manual operations after the user pauses other operations, thereby solving the problem of low navigation-bar switching efficiency in AR intelligent projection systems and improving that efficiency; for user groups such as children in particular, this greatly improves the user experience.
In some embodiments, in both the first state and the second state the navigation bar is displayed in the middle region of the top end of the imaging area. Fig. 3 is a schematic diagram of a navigation bar display mode in the related art. As shown in Fig. 3, to match people's usage habits and for convenience, intelligent terminal systems typically place the virtual navigation bar at the bottom end of the interface, filling the entire screen width. However, when this display mode is applied to an AR intelligent projection system, the user's operations are all performed on an imaging carrier such as a physical desktop, and especially when the users are children, other normal operations may be misrecognized as calling out the navigation bar or clicking a button on it, causing false touches. Fig. 4 is a schematic diagram of a navigation bar display mode according to an embodiment of the present application. As shown in Fig. 4, in this embodiment the navigation bar is placed in the middle region at the top end of the imaging area instead of spanning the full screen width as in the related art. Because the middle region at the top end of the imaging area is not a common operation area for users, this "bang navigation bar" effectively prevents false touches and improves the user's operation and visual experience.
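To make the placement concrete, the following sketch (assuming a coordinate system with the origin at the top-left of the imaging area; the width and height ratios are hypothetical, not values from the patent) computes a top-centred rectangle for the "bang navigation bar" rather than a full-width strip:

```python
def bang_navbar_rect(imaging_w: int, imaging_h: int,
                     bar_w_ratio: float = 0.3, bar_h_ratio: float = 0.08):
    """Return (x, y, w, h) of a bar centred in the middle region of the
    imaging area's top edge instead of spanning the full screen width."""
    bar_w = int(imaging_w * bar_w_ratio)
    bar_h = int(imaging_h * bar_h_ratio)
    x = (imaging_w - bar_w) // 2  # horizontally centred
    y = 0                         # flush with the top edge
    return (x, y, bar_w, bar_h)

# Example: bang_navbar_rect(1920, 1080) returns (672, 0, 576, 86),
# leaving the user's common operation area below it untouched.
```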
In some embodiments, before or after detecting the motion trajectory of the user's hand in the preset sensing area, the projector may reduce the display area of the navigation bar in the imaging area according to the display positions of the display main body and the operation buttons in the imaging area, such that the display area does not intersect the display positions. Reducing the navigation bar's display area ensures that the display main body and the operation buttons in the imaging area are displayed completely, makes the user's other normal operations more efficient, reduces the probability of false touches in the operation area, and improves the user experience.
In some embodiments, before or after detecting the motion trajectory of the user's hand in the preset sensing area, in a case where the display positions of the display main body and the operation buttons in the imaging area intersect the preset navigation bar reserved area, the projector reduces the display area of the navigation bar to the navigation bar reserved area and sets the transparency of the navigation bar to a preset transparency threshold. The navigation bar reserved area is the smallest display area that ensures the buttons on the navigation bar can still be used normally. Reducing the navigation bar to its reserved area and making its background semi-transparent ensures that the navigation bar does not block the display main body or the operation buttons in the imaging area while its own buttons remain clearly visible; neither operation of the navigation bar nor the user's other normal operations are affected, the viewing experience is preserved, and the user experience is improved.
In some embodiments, before or after detecting the motion trajectory of the user's hand in the preset sensing area, the projector hides the navigation bar in a case where the display positions intersect the preset navigation bar reserved area. The navigation bar reserved area is the smallest display area that ensures the buttons on the navigation bar can still be used normally. Collapsing the navigation bar into the hidden state prevents it from blocking the display main body and the operation buttons in the imaging area, so the user's other normal operations and viewing experience are unaffected and the user experience is improved.
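The three layout behaviours just described (shrinking the bar away from the content, falling back to the semi-transparent reserved area, or hiding the bar entirely) might be combined as in the following sketch. Rectangles are (x, y, w, h) tuples; the helper names, the shrink step, and the transparency convention (0.0 opaque, 1.0 fully transparent) are all assumptions, not the patent's specification:

```python
def intersects(a, b) -> bool:
    """Axis-aligned overlap test for rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def layout_navbar(navbar_rect, display_positions, reserved_rect,
                  transparency_threshold: float = 0.5,
                  hide_on_conflict: bool = False):
    """Return (rect, transparency) for the navigation bar, or (None, 1.0)
    when the bar is hidden."""
    # If the content reaches the reserved area itself, either hide the bar
    # or shrink it to the reserved (minimum usable) area, semi-transparent.
    if any(intersects(reserved_rect, pos) for pos in display_positions):
        if hide_on_conflict:
            return None, 1.0
        return reserved_rect, transparency_threshold
    # Otherwise shrink the bar toward its centre until it no longer
    # intersects the display main body or the operation buttons.
    rect = navbar_rect
    while any(intersects(rect, pos) for pos in display_positions) and rect[2] > reserved_rect[2]:
        x, y, w, h = rect
        step = max(1, w // 10)        # guarantees progress on each iteration
        rect = (x + step // 2, y, w - step, h)
    return rect, 0.0                  # fully opaque
```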
In some embodiments, the motion trajectory includes a motion direction and a motion displacement amount of the user's hand, wherein the motion displacement amount is the vertical distance from the current position of the user's hand in the imaging area to the bottom of the imaging area. The projector converting the navigation bar from the first state to the second state according to the motion trajectory includes: converting the navigation bar from the first state to the second state according to the movement direction and movement displacement of the user's hand.
In some embodiments, the projector converting the navigation bar from the first state to the second state according to the movement direction and movement displacement of the user's hand comprises: converting the navigation bar from the first state to the second state in a case where the motion direction meets the preset direction condition and the motion displacement reaches the preset displacement threshold. For example, when the movement direction is from the bottom of the imaging area toward its top, and the vertical distance from the current position of the user's hand in the imaging area to the bottom of the imaging area reaches the displacement threshold, the projector converts the navigation bar from the first state to the second state. This trigger condition for the state conversion is simple, making the method particularly suitable for user groups such as children.
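A possible check for this trigger is sketched below (coordinates assume y grows downward from the top of the imaging area; the tolerance and threshold values are hypothetical illustrations, not values from the disclosure):

```python
def should_switch_state(track_start, track_end, imaging_bottom_y: float,
                        displacement_threshold: float,
                        direction_tolerance: float = 0.5) -> bool:
    """True when the hand moves from the bottom of the imaging area toward
    its top and the vertical distance from the hand's current position to
    the bottom reaches the preset displacement threshold."""
    dx = track_end[0] - track_start[0]
    dy = track_start[1] - track_end[1]  # positive when the hand moves upward
    moving_up = dy > 0 and abs(dx) <= direction_tolerance * dy
    displacement = imaging_bottom_y - track_end[1]  # distance to the bottom edge
    return moving_up and displacement >= displacement_threshold

# Example: a hand moving from (400, 700) to (410, 300) in a 720-pixel-tall
# imaging area gives a displacement of 420; with a 300-pixel threshold the
# navigation bar would switch from the first state to the second.
```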
In some embodiments, the navigation bar is set to the first state by default after the projector is turned on, i.e., the navigation bar remains displayed in the imaging area. When the projector detects that the motion trajectory formed by an upward sliding operation of the user's hand meets a preset operation condition (for example, that the motion direction is upward), it converts the navigation bar from the first state to the second state: the navigation bar switches from the kept-displayed state to the retracted and hidden state and automatically returns to the kept-displayed state after N seconds.
In some embodiments, the navigation bar is set to the first state by default after the projector is turned on, i.e., the navigation bar remains displayed in the imaging area. When the projector receives an operation instruction from an App on the terminal, it calls a system API to retract and hide the navigation bar; in this case, that App or another app must call the system API to trigger the redisplay event.
The embodiment of the present application provides a projection interaction apparatus. Fig. 5 is a schematic structural diagram of a projection interaction apparatus according to an embodiment of the present application. As shown in Fig. 5, the apparatus includes a detection module 510 and a state transition module 520. The detection module 510 is configured to detect the motion trajectory of a user's hand in a preset sensing area, wherein the sensing area is the imaging area of a projector or the area between the imaging area and the projector. The state transition module 520 is configured to transition the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display.
For the specific definition of the projection interaction apparatus, reference may be made to the definition of the projection interaction method above, which is not repeated here. The modules in the projection interaction apparatus may be implemented wholly or partially in software, hardware, or a combination thereof. Each module may be embedded in hardware in, or independent of, a processor of the computer device, or stored in software in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
An embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform the steps in any of the method embodiments described above.
Optionally, the electronic device may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the projection interaction method in the foregoing embodiments, the embodiments of the present application provide a storage medium for its implementation. The storage medium has a computer program stored thereon; the computer program, when executed by a processor, implements any of the projection interaction methods in the above embodiments.
In an embodiment, an electronic device is provided, which may be a server; Fig. 6 is a schematic diagram of its internal structure according to an embodiment of the present application. The electronic device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the electronic device provides computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the electronic device stores data. The network interface of the electronic device is used to connect and communicate with an external terminal through a network. The computer program, when executed by the processor, implements a projection interaction method.
Those skilled in the art will appreciate that the structure shown in Fig. 6 is a block diagram of only part of the structure relevant to the present application and does not limit the electronic device to which the present application is applied; a particular electronic device may include more or fewer components than shown, combine certain components, or arrange components differently.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the method embodiments above. Any reference to memory, storage, database, or other media used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
Those skilled in the art will understand that the features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of features are described, but any combination of features that is not self-contradictory should be considered within the scope of this disclosure.
The above embodiments express only several implementations of the present application; their description is specific and detailed but should not be construed as limiting the scope of the invention. A person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A method of projection interaction, the method comprising:
detecting a motion track of a hand of a user in a preset sensing area, wherein the sensing area is an imaging area of a projector or an area between the imaging area and the projector;
and converting the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display.
2. The method of claim 1, wherein, in both the first state and the second state, the navigation bar is displayed in the middle region of the top end of the imaging area.
3. The method of claim 2, wherein, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
reducing the display area in which the navigation bar is displayed in the imaging area according to the display positions of the display main body and the operation buttons in the imaging area, wherein the display area does not intersect the display positions.
4. The method of claim 3, wherein, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
in a case where the display positions intersect a preset navigation bar reserved area, reducing the display area to the navigation bar reserved area and setting the transparency of the navigation bar to a preset transparency threshold.
5. The method of claim 3, wherein, before or after the motion trajectory of the user's hand in the preset sensing area is detected, the method further comprises:
hiding the navigation bar in a case where the display positions intersect a preset navigation bar reserved area.
6. The method of claim 1, wherein the motion trajectory comprises a motion direction and a motion displacement of the user's hand, wherein the motion displacement is the distance between the current position of the user's hand and the bottom of the imaging area, and wherein converting the navigation bar from the first state to the second state according to the motion trajectory comprises:
and converting the navigation bar from the first state to the second state according to the movement direction and the movement displacement of the hand of the user.
7. The method of claim 6, wherein converting the navigation bar from the first state to the second state according to the movement direction and movement displacement of the user's hand comprises:
converting the navigation bar from the first state to the second state in a case where the motion direction meets a preset direction condition and the motion displacement reaches a preset displacement threshold.
8. A projection interaction device, the device comprising: the device comprises a detection module and a state conversion module;
the detection module is used for detecting a motion track of a hand of a user in a preset sensing area, wherein the sensing area is an imaging area of a projector or an area between the imaging area and the projector;
the state conversion module is configured to convert the navigation bar in the imaging area from a first state to a second state according to the motion trajectory, wherein the first state is that the navigation bar remains displayed in the imaging area, and the second state is that the navigation bar is hidden for a preset time period and then pops up for display.
9. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the projection interaction method of any one of claims 1 to 7 when executing the computer program.
10. A computer storage medium on which a computer program is stored which, when executed by a processor, implements the projection interaction method as claimed in any one of claims 1 to 7.
CN202011014857.0A (filed 2020-09-24, priority 2020-09-24): Projection interaction method and device, electronic equipment and computer storage medium (Pending; published as CN112286400A)

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011014857.0A | 2020-09-24 | 2020-09-24 | Projection interaction method and device, electronic equipment and computer storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202011014857.0A | 2020-09-24 | 2020-09-24 | Projection interaction method and device, electronic equipment and computer storage medium

Publications (1)

Publication Number | Publication Date
CN112286400A | 2021-01-29

Family

ID=74422916

Family Applications (1)

Application Number | Status | Title | Priority Date | Filing Date
CN202011014857.0A | Pending | Projection interaction method and device, electronic equipment and computer storage medium | 2020-09-24 | 2020-09-24

Country Status (1)

Country | Publication
CN | CN112286400A

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6037943A (en) * 1994-08-12 2000-03-14 International Business Machines Corporation Multimedia digital foil presentation system
CN1808364A (en) * 2005-12-20 2006-07-26 无锡永中科技有限公司 Method for displaying electronic lantern navigation information on computer
CN102223508A (en) * 2010-04-14 2011-10-19 鸿富锦精密工业(深圳)有限公司 Front projection control system and method thereof
CN106951153A (en) * 2017-02-21 2017-07-14 联想(北京)有限公司 A kind of display methods and electronic equipment
CN107132962A (en) * 2017-05-02 2017-09-05 青岛海信移动通信技术股份有限公司 Navigation bar control method and device



Legal Events

Code | Title | Description
PB01 | Publication |
SE01 | Entry into force of request for substantive examination |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210129