CN115904129A - Touch device, touch object identification method and device - Google Patents
- Publication number: CN115904129A (application number CN202111163469.3A)
- Authority: CN (China)
- Prior art keywords: touch, touch object, identifier, display screen, infrared
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Landscapes
- Position Input By Displaying (AREA)
Abstract
The embodiments provide a touch device and a touch object identification method and apparatus, belonging to the field of touch technology. The touch device includes a touch display screen, an infrared emitting element, and an infrared receiving element; an optical path is formed between the infrared emitting element and each infrared receiving element that can receive the infrared light it emits. A processor coupled to the touch display screen, the infrared emitting element, and the infrared receiving element is configured to: acquire the number of blocked optical paths in at least one direction when a touch object touches the touch display screen; determine the identifier of the touch object according to that number and a mapping relationship between per-direction optical path counts and touch object identifiers; and control the touch device to enter a working mode matching the identifier of the touch object. The method and apparatus improve the accuracy of multi-level identification of touch objects.
Description
Technical Field
The embodiments of the application relate to touch technology, and more particularly to a touch device and a touch object identification method and apparatus.
Background
A user can touch a touch device with a stylus, or with touch objects of different shapes such as a finger or a palm. A touch device based on infrared technology can identify the shape of the touch object and display the corresponding drawn handwriting according to that shape.
A number of infrared emitting tubes and infrared receiving tubes are arranged around the touch display screen of an infrared-based touch device; each infrared receiving tube receives infrared light emitted by an infrared emitting tube. When a touch object touches the touch device, it blocks part of the optical paths, so that some infrared receiving tubes no longer receive infrared light. At present, existing touch object identification methods mainly determine the shape of the touch object from the area of the optical path region it blocks.
However, existing touch object identification methods suffer from poor multi-level identification accuracy.
Disclosure of Invention
Exemplary embodiments of the application provide a touch device and a touch object identification method and apparatus, which can improve the user experience of operating the touch device.
In a first aspect, the present application provides a touch device, including:
a touch display screen for displaying;
an infrared emitting element disposed on at least one side of the touch display screen and configured to emit infrared light;
an infrared receiving element disposed on the side of the touch display screen opposite the infrared emitting element and configured to receive infrared light, an optical path being formed between the infrared emitting element and each infrared receiving element that can receive the infrared light it emits;
a processor coupled to the touch display screen, the infrared emitting element, and the infrared receiving element, the processor configured to:
acquire the number of blocked optical paths in at least one direction when a touch object touches the touch display screen;
determine the identifier of the touch object according to the number of optical paths in the at least one direction and a mapping relationship between per-direction optical path counts and touch object identifiers;
and control the touch device to enter a working mode matching the identifier of the touch object.
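The three processor steps above can be sketched as a table lookup followed by a mode switch. The entries, identifiers, and mode names below are hypothetical placeholders, since the patent does not fix concrete values:

```python
# Hypothetical sketch of the first-aspect flow: per-direction blocked-path
# counts -> touch-object identifier -> working mode. All values invented.

# Mapping relationship: (count in direction 1, count in direction 2) -> identifier.
PATH_COUNT_MAP = {
    (2, 3): "fine_tip",
    (5, 6): "thick_tip",
    (9, 10): "finger",
}

# Working mode matched to each identifier.
WORKING_MODE = {"fine_tip": "draw", "thick_tip": "erase", "finger": "draw"}

def identify_touch_object(counts):
    """Return the identifier whose stored counts are nearest to `counts`."""
    nearest = min(PATH_COUNT_MAP,
                  key=lambda ref: sum(abs(a - b) for a, b in zip(ref, counts)))
    return PATH_COUNT_MAP[nearest]

def select_mode(counts):
    """Enter the working mode matched to the identified touch object."""
    return WORKING_MODE[identify_touch_object(counts)]
```

A nearest-entry match (rather than an exact lookup) is one way to tolerate measurement noise in the blocked-path counts; the patent itself does not specify the matching rule.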
In some embodiments, the processor is configured to:
acquiring, when the touch object touches the touch display screen, the area of the optical path region blocked by the touch object and the brightness value corresponding to each blocked optical path;
and determining the identifier of the touch object according to the number of optical paths in the at least one direction, the area of the blocked optical path region, and the average of the brightness values of the blocked paths, using a mapping relationship among four items: the per-direction optical path counts, the area of the region blocked by the touch object, the average brightness of the paths it blocks, and the identifier of the touch object.
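The four-way mapping described above can be sketched as a joint match over all three features. The entries, distance metric, and weights are illustrative assumptions, not values from the patent:

```python
# Sketch of the multi-feature variant: per-direction counts, blocked-region
# area, and mean brightness of the blocked paths are matched jointly against
# a four-way mapping. All entries and weights are invented.
FEATURE_MAP = [
    # (per-direction counts, area, mean brightness, identifier)
    ((2, 3), 4.0, 10.0, "fine_tip"),
    ((5, 6), 4.0, 35.0, "thick_tip"),  # same area as the fine tip; other features differ
    ((9, 10), 120.0, 60.0, "palm"),
]

def identify(counts, area, mean_brightness, weights=(1.0, 0.1, 0.05)):
    """Return the identifier of the entry with the smallest weighted distance."""
    def distance(entry):
        ref_counts, ref_area, ref_lum, _ = entry
        d_counts = sum(abs(a - b) for a, b in zip(ref_counts, counts))
        return (weights[0] * d_counts
                + weights[1] * abs(ref_area - area)
                + weights[2] * abs(ref_lum - mean_brightness))
    return min(FEATURE_MAP, key=distance)[3]
```

Note that the two stylus entries share the same area: the extra features (counts and brightness) are what make them separable, which is the point of this embodiment.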
In some embodiments, the touch display screen includes a plurality of display areas, each of the display areas corresponds to one of the mapping relationships, and the processor is configured to:
acquiring a first touch position at which the touch object touches the touch display screen;
and determining the identifier of the touch object according to the number of optical paths in at least one direction blocked by the touch object and the mapping relationship corresponding to the display area in which the first touch position is located.
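A minimal sketch of the per-area variant, where the display area containing the first touch position selects which mapping is consulted. The two-region split, screen width, and table entries are invented for illustration:

```python
# Each display area keeps its own mapping relationship; the first touch
# position selects the area, hence the mapping. Regions/values hypothetical.
AREA_MAPS = {
    "left":  {(2, 3): "fine_tip", (5, 6): "thick_tip"},
    "right": {(3, 3): "fine_tip", (6, 6): "thick_tip"},
}

def display_area(x, screen_width=1920):
    """Map the first touch position's x coordinate to a display area."""
    return "left" if x < screen_width // 2 else "right"

def identify_at(x, counts):
    return AREA_MAPS[display_area(x)][counts]
```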
In some embodiments, the processor is configured to:
acquiring the number of optical paths in each direction within each initial display area of the touch display screen; and, for any two adjacent initial display areas, if the number of optical paths in each direction is the same between the two, merging them into one display area.
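The merging of adjacent initial display areas can be sketched as a single pass over the areas in adjacency order. The initial partition and counts below are invented:

```python
# Adjacent initial display areas whose per-direction path counts coincide
# are merged into one display area. Data shapes are assumptions.
def merge_areas(initial_areas):
    """initial_areas: per-direction count dicts, listed in adjacency order."""
    merged = [initial_areas[0]]
    for area in initial_areas[1:]:
        if area != merged[-1]:  # counts differ: start a new display area
            merged.append(area)
    return merged

initial = [{"dir1": 10, "dir2": 12}, {"dir1": 10, "dir2": 12}, {"dir1": 8, "dir2": 9}]
```

Only adjacent areas are compared, so two non-adjacent areas with identical counts remain separate display areas, matching the claim's wording.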
In some embodiments, the touch object is a stylus, and the processor is configured to:
for the identifier of the stylus in the mapping relationship and the identifier of any other touch object, comparing the per-direction optical path counts corresponding to the two identifiers to obtain the similarity between the counts corresponding to the identifier of the stylus and those corresponding to the other identifier;
and if the similarity is greater than or equal to a preset similarity threshold, outputting first prompt information through the touch display screen, the first prompt information indicating that the stylus is unqualified.
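The stylus-qualification check can be sketched as follows. The similarity metric (smaller-over-larger ratio averaged over directions), the 0.9 threshold, and the table entries are assumptions, since the patent does not specify them:

```python
# The stylus entry's per-direction counts are compared with every other
# identifier's; too-similar counts mean the stylus is unqualified.
MAPPING = {
    "stylus": (11, 12),
    "finger": (12, 12),
    "palm":   (40, 42),
}

def similarity(a, b):
    """Ratio of the smaller to the larger count, averaged over directions."""
    return sum(min(x, y) / max(x, y) for x, y in zip(a, b)) / len(a)

def stylus_unqualified(mapping, threshold=0.9):
    """True if any non-stylus entry is too similar to the stylus entry."""
    ref = mapping["stylus"]
    return any(similarity(ref, counts) >= threshold
               for name, counts in mapping.items() if name != "stylus")
```

With these invented numbers, the stylus and finger entries are nearly identical, so the device would display the first prompt information.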
In some embodiments, the processor is configured to:
in response to a mapping relationship update request triggered at a second touch position where the touch object touches the touch display screen, displaying a display area division interface and second prompt information on the touch display screen, the second prompt information prompting the user to draw at least one handwriting stroke in each display area;
receiving the identifier of at least one touch object input by the user, and the number of optical paths in at least one direction blocked by each touch object while the user draws the at least one handwriting stroke in each display area;
and updating the mapping relationship according to the identifier of the at least one touch object input by the user and the number of optical paths in the at least one direction blocked by each touch object.
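The update step can be sketched as writing the captured counts into each display area's mapping under the user-entered identifier. The data shapes are assumptions:

```python
# For each identifier the user enters, the per-direction counts captured
# while drawing in each display area overwrite that area's mapping entry.
def update_mapping(mapping, identifier, captured):
    """captured: {display_area: (count_dir1, count_dir2)} recorded during drawing."""
    for area, counts in captured.items():
        mapping.setdefault(area, {})[counts] = identifier
    return mapping
```

For example, after the user draws in the hypothetical "left" area with a fine tip, `update_mapping({}, "fine_tip", {"left": (2, 3)})` seeds that area's table.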
In some embodiments, the touch object is any one of: stylus, finger, palm.
In some embodiments, the touch device is a touch all-in-one machine.
In a second aspect, the present application provides a touch object identification method applied to a touch device, the touch device including: a touch display screen for displaying; an infrared emitting element disposed on at least one side of the touch display screen and configured to emit infrared light; and an infrared receiving element disposed on the side of the touch display screen opposite the infrared emitting element and configured to receive infrared light, an optical path being formed between the infrared emitting element and each infrared receiving element that can receive the infrared light it emits. The method includes:
acquiring the number of blocked optical paths in at least one direction when a touch object touches the touch display screen;
determining the identifier of the touch object according to the number of optical paths in the at least one direction and a mapping relationship between per-direction optical path counts and touch object identifiers;
and controlling the touch device to enter a working mode matching the identifier of the touch object.
In a third aspect, the present application provides a touch object identification apparatus applied to a touch device, the touch device including: a touch display screen for displaying; an infrared emitting element disposed on at least one side of the touch display screen and configured to emit infrared light; and an infrared receiving element disposed on the side of the touch display screen opposite the infrared emitting element and configured to receive infrared light, an optical path being formed between the infrared emitting element and each infrared receiving element that can receive the infrared light it emits. The apparatus includes:
an acquisition module, configured to acquire the number of blocked optical paths in at least one direction when a touch object touches the touch display screen;
a processing module, configured to determine the identifier of the touch object according to the number of optical paths in the at least one direction and a mapping relationship between per-direction optical path counts and touch object identifiers;
and a control module, configured to control the touch device to enter a working mode matching the identifier of the touch object.
According to the touch device, touch object identification method, and apparatus provided herein, the identifier of a touch object is determined from the number of optical paths in at least one direction blocked when the touch object touches the touch display screen, together with the mapping relationship between per-direction optical path counts and touch object identifiers. When different touch objects touch the touch display screen, the numbers of optical paths they block generally differ more than the areas of the optical path regions they block. Therefore, compared with the existing method of determining the identifier from the blocked area alone, this approach improves the accuracy of determining the identifier, i.e., the accuracy of multi-level identification of the touch object. Once the identifier is determined, the touch device can more reliably enter the working mode matching it, further improving the user experience.
Drawings
To more clearly illustrate the embodiments of the present application or the related art, the drawings required for their description are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present application, and those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of a touch device based on an infrared technology;
FIG. 2a is a schematic diagram of a touch object identification method in the related art;
FIG. 2b is a schematic diagram of another touch object identification method in the related art;
FIG. 3a is a schematic diagram of light paths blocked by a stylus in different directions;
FIG. 3b is a schematic diagram of light paths blocked by another stylus in different directions;
fig. 4 is a schematic flowchart of a touch object identification method provided in the present application;
fig. 5 is a schematic structural diagram of another infrared technology-based touch device provided in the present application;
fig. 6 is a schematic diagram of a display interface of a touch display screen provided in the present application;
fig. 7 is a schematic flowchart of another touch object identification method provided in the present application;
FIG. 8 is a schematic view of another touch display screen display interface provided in the present application;
fig. 9 is a schematic structural diagram of a touch object recognition device according to the present application;
fig. 10 is a schematic structural diagram of a touch device provided in the present application.
Detailed Description
To make the objects, embodiments, and advantages of the present application clearer, exemplary embodiments are described below clearly and completely with reference to the accompanying drawings. It should be understood that the described exemplary embodiments are only some, not all, of the embodiments of the present application.
All other embodiments that a person skilled in the art can derive from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, while the disclosure is presented in terms of one or more exemplary examples, it should be appreciated that individual aspects of the disclosure may each constitute a complete embodiment.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first", "second", "third", and the like in the description, claims, and drawings of this application are used to distinguish similar objects or entities and do not necessarily imply a particular order or sequence, unless otherwise indicated. It should be understood that terms so used are interchangeable under appropriate circumstances, such that the embodiments described herein can, for example, be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "module" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
Fig. 1 is a schematic structural diagram of a touch device based on an infrared technology. As shown in fig. 1, the touch device may include a touch display screen, an infrared emitting element, an infrared receiving element, and a processor. Wherein,
the touch display screen is used for displaying, e.g., images, video, user interfaces, and handwriting drawn by a user on the touch device with a stylus or finger. The touch display screen includes a display panel, which may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED), or the like.
The infrared emitting element is disposed on at least one side of the touch display screen and emits infrared light. The infrared receiving element is disposed on the side of the touch display screen opposite the infrared emitting element and receives infrared light. Both belong to the infrared touch system, which is independent of the touch display screen. It should be understood that the following description takes as an example infrared emitting elements disposed on the first side and the second side of the touch display screen. In some embodiments, the positions of the infrared emitting elements and the infrared receiving elements correspond one to one, and the infrared light emitted by an infrared emitting element is received by its uniquely corresponding infrared receiving element. In some embodiments, the infrared emitting elements may also emit infrared light in different scanning directions; as shown in fig. 1, when the emission direction of an infrared emitting element changes, its corresponding infrared receiving element also changes. Illustratively, the infrared emitting element may be an infrared emitting tube, and the infrared receiving element an infrared receiving tube.
The processor is connected to the touch display screen, the infrared emitting element, and the infrared receiving element, and controls the touch display screen to display related content or performs other operations on the touch device according to the received data. For example, the processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present application. In some embodiments, the processor may include one or more processing units, e.g., an application processor, a controller, a digital signal processor (DSP), a display processing unit (DPU), and the like. The controller may be the neural center and command center of the touch device: it generates operation control signals from instruction operation codes and timing signals to control instruction fetching and execution. The different processing units may be separate devices or integrated into one or more processors. In some embodiments, the touch device may include one or more processors, and a memory may be provided in the processor for storing instructions and data.
It should be understood that fig. 1 only shows, by way of example, a partial structure of a touch device to which the present application relates, and the present application does not limit whether the touch device further includes other structures.
A user may touch the touch device with touch objects of different shapes, such as a stylus, a finger, or a palm. The stylus may have a single usable end, or be a dual-ended stylus whose two ends can both touch the touch device; the two ends may have the same or different shapes. When a user touches the device with touch objects of different shapes, the device can recognize the shape of the touch object through multi-level identification and display the corresponding drawn handwriting.
The multi-level identification may be, for example, three-level or four-level. As an example, three-level identification may take the fine tip of a dual-ended stylus as the first level, a finger and the thick tip as the second level, and a palm as the third level. Four-level identification may take the fine tip as the first level, the thick tip as the second level, a finger as the third level, and a palm as the fourth level.
For example, with a dual-ended stylus, if the touch device recognizes that the user is touching with the fine tip, it may display the handwriting the user draws on the device with the stylus. If it recognizes the thick tip, it may erase the drawn handwriting at the positions the thick tip touches (similar to using the thick tip as an eraser).
Fig. 2a is a schematic diagram of a touch object identification method in the related art, and fig. 2b of another. As shown in figs. 2a and 2b, the related-art identification method mainly works as follows: when a user touches the touch device with a touch object, the touch device receives from the infrared touch system the length and width of the optical path region blocked by the touch object. The touch device then determines the area of the blocked region as the product of the length and width, identifies the touch object corresponding to that area, and displays the drawn handwriting corresponding to the touch object.
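The related-art approach just described can be sketched in a few lines. The thresholds are invented; the point is that a single length-times-width scalar cannot separate objects that block equal areas:

```python
# Related-art identification: classify purely by the product of the blocked
# region's length and width. Thresholds are hypothetical.
def identify_by_area(length, width):
    area = length * width
    if area < 10:
        return "fine_tip"
    if area < 60:
        return "thick_tip_or_finger"  # the ambiguity figs. 2a/2b illustrate
    return "palm"
```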
However, different users have different habits when touching the touch device with a touch object; for example, the angle between the stylus and the touch device may vary between users. As shown in figs. 2a and 2b, assume that when the user holds the thinner stylus 1, the angle between stylus 1 and the touch device is small, and when the user holds the thicker stylus 2, the angle is large. Although stylus 1 and stylus 2 differ in thickness, the area of the optical path region blocked by stylus 1 can then equal the area blocked by stylus 2. In other words, the touch device may be unable to distinguish stylus 1 from stylus 2 and may treat them as touch objects of the same shape.
That is to say, existing multi-level touch object identification methods suffer from poor accuracy.
The inventors found that touch objects of different shapes block different numbers of optical paths when touching the touch device. Even if the areas of the optical path regions blocked by differently shaped touch objects are the same, the numbers of paths they block in different directions differ. For example, still taking stylus 1 and stylus 2 of figs. 2a and 2b: fig. 3a is a schematic diagram of the optical paths blocked by one stylus in different directions, and fig. 3b is a schematic diagram of those blocked by the other. As shown in figs. 3a and 3b, the numbers of paths blocked by stylus 1 and stylus 2 in the first and second directions are given in table 1 below. The first direction shown in figs. 3a and 3b forms an angle greater than 90 degrees with the horizontal-right direction, and the second direction forms an angle less than 90 degrees with it.
TABLE 1
That is, even if the areas of the optical path regions blocked by stylus 1 and stylus 2 are the same, the numbers of paths they block in the first and second directions differ.
It should be understood that figs. 3a and 3b show only one example of the optical paths; in practical applications, the paths in the first direction need not be parallel to one another, and likewise for the second direction.
In view of this, the present application provides a method for identifying the type of a touch object based on the number of optical paths in at least one direction that are blocked when the touch object touches the touch display screen. Distinguishing touch objects by these per-direction counts increases the separability between different touch objects compared with existing identification methods and improves the accuracy of multi-level identification.
The method may be executed by a touch device or by the processor of a touch device. For example, the touch device may be a touch all-in-one machine (e.g., a smart interactive tablet) or a touch device that must be connected to a host to work (e.g., a display with touch functionality).
The technical solutions provided in the present application are described in detail below with reference to specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 4 is a flowchart illustrating a touch object identification method provided in the present application. As shown in fig. 4, the method comprises the steps of:
s101, obtaining the number of light paths in at least one shielded direction when a touch object touches a touch display screen.
For example, the touch object may be any one of: a stylus, a finger, a palm. As mentioned above, the stylus may have a single usable end, or be a dual-ended stylus whose two ends can both touch the touch display screen. In a dual-ended stylus, the two ends may differ in thickness or in shape; for example, one end may be circular and the other elliptical.
Illustratively, the at least one direction may be at least one of the first direction, the second direction, the horizontal direction, and the vertical direction shown in fig. 3a or 3b. As mentioned above, in some embodiments, all optical paths forming an acute angle with the horizontal-right direction may be taken as paths in the second direction, and all paths forming an obtuse angle with it as paths in the first direction.
As one possible implementation, the infrared touch system may collect the number of optical paths in at least one direction blocked when the touch object touches the touch display screen and send it to the processor of the touch device. That is to say, the touch device can obtain the per-direction blocked path counts directly from the infrared touch system.
As another possible implementation, the path counts the infrared touch system sends to the processor may cover all directions between each infrared receiving element and each infrared emitting element. In this implementation, the touch device may select the counts for a preset at-least-one direction from among them. Alternatively, the touch device may merge the per-direction counts, treating all paths that fall within the same preset direction range as paths of one direction and taking their total as that direction's count.
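The merging by preset direction ranges can be sketched as follows. The degree ranges (measured against the horizontal-right direction) and sample angles are assumptions:

```python
# Group raw emitter-receiver paths into preset direction ranges: each path's
# angle selects a range, and the range's total is that direction's count.
RANGES = {"second": (0, 90), "first": (90, 180)}  # degrees vs. horizontal-right

def group_by_direction(path_angles):
    counts = {name: 0 for name in RANGES}
    for angle in path_angles:
        for name, (lo, hi) in RANGES.items():
            if lo <= angle < hi:
                counts[name] += 1
    return counts
```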
As yet another possible implementation, the touch device may receive the brightness value corresponding to each optical path when the touch object touches the touch display screen. The touch device may then treat each path whose brightness value is below a preset brightness threshold as blocked by the touch object and obtain the direction of each blocked path. From those directions, the touch device can derive the number of blocked paths in at least one direction.
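The brightness-threshold variant can be sketched as a filter followed by a per-direction tally. The threshold and sample values are assumptions:

```python
# Paths whose received brightness falls below a preset threshold are treated
# as blocked, then tallied by direction.
from collections import Counter

def blocked_counts(paths, threshold=50):
    """paths: iterable of (direction, brightness) pairs."""
    return Counter(direction for direction, lum in paths if lum < threshold)

samples = [("first", 20), ("first", 90), ("second", 10), ("second", 30)]
```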
S102, determining the identification of the touch object according to the number of the optical paths in at least one direction and the mapping relation between the number of the optical paths in each direction and the identification of the touch object.
The mapping relationship between the number of light paths in each direction and the identifier of the touch object may, for example, be calibrated offline by a user and pre-stored in the touch device. In some embodiments, the user may further update this mapping relationship, so as to improve the accuracy of determining the identifier of the touch object from it.
For example, suppose the touch device acquires the numbers of light paths in a first direction and a second direction that are blocked when the touch object touches the touch display screen. The mapping relationship between the number of light paths in each direction and the identifier of the touch object may then be, for example, as shown in Table 2 below:
TABLE 2
Taking the mapping relationship shown in Table 2 as an example, suppose that when the touch object touches the touch display screen of the touch device, the number of blocked light paths in the first direction is 11 and that in the second direction is 12; the touch device may then determine that the identifier of the touch object is identifier 1.
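As a rough sketch of the Table 2 lookup — the concrete counts and identifiers below are illustrative assumptions, since the table contents are not reproduced here:

```python
# Hypothetical mapping between per-direction blocked path counts and touch
# object identifiers, in the spirit of Table 2.
MAPPING = {
    (11, 12): "identifier 1",  # e.g. thin stylus pen point
    (14, 15): "identifier 2",  # e.g. thick stylus pen point
    (18, 20): "identifier 3",  # e.g. finger
}

def identify(first_dir_count, second_dir_count):
    """Look up the touch object identifier from the two direction counts."""
    return MAPPING.get((first_dir_count, second_dir_count))

result = identify(11, 12)
```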
As a possible implementation manner, the touch device may determine the identifier of the touch object based on only the mapping relationship between the number of light paths in each direction and the identifier of the touch object, so as to improve the efficiency of determining the identifier of the touch object.
As another possible implementation manner, the touch device may determine the identifier of the touch object based on a four-way mapping relationship among the number of light paths in each direction, the area of the light path region blocked by the touch object, the average value of the brightness values corresponding to the blocked light paths, and the identifier of the touch object, so as to further improve the accuracy of determining the identifier of the touch object.
In this implementation manner, the touch device may further obtain the area of the light path region blocked when the touch object touches the touch display screen, and the brightness value corresponding to each blocked light path. From those brightness values, the touch device can compute the average brightness value of the blocked light paths. Then, the touch device may determine the identifier of the touch object according to the number of blocked light paths in at least one direction, the area of the blocked light path region, the average brightness value of the blocked light paths, and the four-way mapping relationship among the number of light paths in each direction, the area of the light path region blocked by the touch object, the average brightness value of the blocked light paths, and the identifier of the touch object.
In some embodiments, the touch device may acquire, for example, the length and width of a light path region that is blocked when the touch object acquired by the infrared touch system touches the touch display screen. And then, the touch equipment determines the area of the shielded light path area when the touch object touches the touch display screen according to the product of the length and the width. Or, in some embodiments, the touch device may further directly obtain an area of a light path region that is shielded when the touch object acquired by the infrared touch system touches the touch display screen.
In some embodiments, the touch device may, for example, acquire a brightness value corresponding to each light path that is blocked when the touch object touches the touch display screen and is collected by the infrared receiving element.
For example, again taking the case where the touch device acquires the numbers of light paths in the first direction and the second direction blocked when the touch object touches the touch display screen, the mapping relationship among the number of light paths in each direction, the area of the light path region blocked by the touch object, the average brightness value of the blocked light paths, and the identifier of the touch object may be, for example, as shown in Table 3 below:
TABLE 3
Taking the mapping relationship shown in table 3 as an example, assuming that when a touch object touches the touch display screen of the touch device, the number of blocked light paths in the first direction is 11, the number of blocked light paths in the second direction is 12, the area of the blocked light path region is 1, and the average value of the brightness values corresponding to the blocked light paths is 1, the touch device may determine that the identifier of the touch object is identifier 1.
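A minimal sketch of the four-way Table 3 lookup follows. The feature values are illustrative assumptions, and resolving inexact feature values by nearest match is my own assumption — the application does not specify how non-exact measurements are matched against the table:

```python
# Four-way mapping of Table 3: (paths in first direction, paths in second
# direction, blocked region area, average brightness) -> identifier.
MAPPING3 = {
    (11, 12, 1.0, 1.0): "identifier 1",
    (15, 16, 2.0, 3.0): "identifier 2",
}

def identify(features):
    """Return the identifier of the nearest stored feature tuple."""
    def distance(key):
        # squared Euclidean distance between stored and measured features
        return sum((a - b) ** 2 for a, b in zip(key, features))
    return MAPPING3[min(MAPPING3, key=distance)]

result = identify((11, 12, 1.0, 1.0))
```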
As another possible implementation manner, the touch device may instead determine the identifier of the touch object based on a three-way mapping relationship among the number of light paths in each direction, the area of the light path region blocked by the touch object, and the identifier of the touch object; or based on a three-way mapping relationship among the number of light paths in each direction, the average brightness value of the blocked light paths, and the identifier of the touch object. For specific implementations, reference may be made to the foregoing embodiments, which are not repeated here.
S103, controlling the touch equipment to enter a working mode matched with the identification of the touch object according to the identification of the touch object.
The operating mode may be, for example, controlling the touch display screen of the touch device to display drawing handwriting matched with the identifier of the touch object. Alternatively, the operating mode may be controlling the touch display screen to cancel the display of handwriting at the touch position of the touch object (equivalent to using the touch object as an eraser). Alternatively, the operating mode may be an electronic whiteboard mode or an electronic book reading mode of the touch device. In some embodiments, different drawing handwriting may differ in color, thickness, texture, and the like.
In some embodiments, the touch device may pre-store a mapping relationship between the identifier of the touch object and the drawing handwriting. After determining the identifier of the touch object, the touch device may determine the drawing handwriting matched with the identifier according to this mapping relationship, and then control the touch display screen to display that drawing handwriting.
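The identifier-to-working-mode step can be sketched as a simple pre-stored table; the identifier names and style attributes below are illustrative assumptions:

```python
# Sketch: pre-stored mapping between touch object identifiers and working
# modes (drawing handwriting styles, or eraser behaviour).
HANDWRITING_STYLES = {
    "thin pen point": {"color": "black", "width": 1},
    "thick pen point": {"color": "black", "width": 4},
    "finger": {"mode": "eraser"},  # finger cancels displayed handwriting
}

def working_mode(identifier):
    """Return the working mode matched with the identifier, if any."""
    return HANDWRITING_STYLES.get(identifier)

mode = working_mode("finger")
```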
In this embodiment, the identifier of the touch object is determined according to the number of light paths in at least one direction that are blocked when the touch object touches the touch display screen, together with the mapping relationship between the number of light paths in each direction and the identifier of the touch object. When different touch objects touch the touch display screen, the numbers of blocked light paths are usually more distinguishable than the areas of the blocked light path regions. Therefore, compared with the existing method of determining the identifier from the area of the blocked light path region alone, this method improves the accuracy of determining the identifier of the touch object, that is, the accuracy of multi-level identification of the touch object. With the identifier determined more accurately, the touch device can more reliably enter the working mode matched with the identifier, further improving the user experience.
As a possible implementation manner, the touch display screen may further include a plurality of display areas. Each display area can correspond to one mapping relation, so that when the touch object touches different positions of the touch display screen, the touch equipment can determine the identifier of the touch object according to the touch position of the touch object, and the accuracy of determining the identifier of the touch object is further improved. The mapping relationship may be the mapping relationship described in any of the foregoing embodiments (except the mapping relationship between the identifier of the touch object and the drawing handwriting).
In this implementation, the touch device may first obtain a first touch position at which the touch object touches the touch display screen. And then, determining the identifier of the touch object according to the number of the optical paths in at least one direction shielded by the touch object and the mapping relation corresponding to the display area where the first touch position is located.
In some embodiments, for example, a second mapping relationship between the display area and the mapping relationship may be stored in the touch device in advance. The touch device may determine a mapping relationship corresponding to the display area where the first touch position is located according to the display area where the first touch position is located and a second mapping relationship between the display area and the mapping relationship.
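The two-level lookup described above can be sketched as follows. The area boundary, screen width, and count values are illustrative assumptions:

```python
# Sketch: the first touch position selects a display area, and a second
# mapping selects that area's own count-to-identifier mapping.
AREA_MAPPINGS = {
    "area 1": {(11, 12): "identifier 1"},
    "area 2": {(9, 10): "identifier 1"},  # fewer paths near the screen edge
}

def area_of(position):
    # assumed partition: left half of a 1920-px-wide screen is area 1
    x, _ = position
    return "area 1" if x < 960 else "area 2"

def identify(position, counts):
    """Resolve the identifier using the mapping of the touched area."""
    return AREA_MAPPINGS[area_of(position)].get(counts)

result = identify((100, 200), (11, 12))
```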
It should be understood that the display area dividing manner of the touch display screen is not limited in the present application.
In some embodiments, the touch device may first divide the touch display screen into a plurality of initial display areas, whose areas may be the same or different. Then, the touch device may obtain the number of light paths in each direction in each initial display area. For any two adjacent initial display areas, if the number of light paths is the same in every direction, a touch object blocks the same number of light paths in each direction in either area; that is, the two initial display areas share the same mapping relationship, and the touch device may merge them into one display area.
For any two adjacent initial display areas, if the number of optical paths in any direction between the two initial display areas is different, which indicates that the mapping relationship between the two initial display areas is different, the touch device cannot use the two initial display areas as one display area.
In some embodiments, the touch device may further divide the display areas according to the total number of light paths in each direction in each display area. For example, the touch device may treat an initial display area whose number of light paths is less than or equal to a first number threshold (typically at the edge of the screen) as a target display area on its own. For initial display areas whose number of light paths exceeds the first number threshold, the merging described above is applied: two adjacent initial display areas with the same number of light paths in each direction are combined into one display area.
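The merging of adjacent initial display areas with identical per-direction counts can be sketched as below. Areas are modelled as a one-dimensional list for simplicity, and the count values are assumptions:

```python
# Sketch: group adjacent initial display areas whose per-direction
# light path counts are identical; each merged group shares one mapping.
def merge_areas(initial_counts):
    """Merge adjacent initial areas that have identical direction counts."""
    merged = []
    for index, counts in enumerate(initial_counts):
        if merged and merged[-1]["counts"] == counts:
            merged[-1]["members"].append(index)  # same counts: merge
        else:
            merged.append({"counts": counts, "members": [index]})
    return merged

areas = merge_areas([
    {"h": 8, "v": 8},    # sparse edge area, kept on its own
    {"h": 12, "v": 14},  # centre areas with identical counts...
    {"h": 12, "v": 14},  # ...merge into one display area
    {"h": 10, "v": 9},
])
```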
Alternatively, in some embodiments, the touch device may also directly divide the touch display screen into a plurality of display areas with the same size.
It should be understood that how the touch device obtains the number of light paths in each direction in each display area is not limited in the present application. The touch device may acquire the number of optical paths in each direction in each display area from an infrared touch system, for example.
In addition, it should be understood that, how the touch device obtains the first touch position where the touch object touches the touch display screen is not limited in the present application. In specific implementation, the existing method for determining the touch position of the touch object by using the touch device based on the infrared technology may be referred to, and details are not repeated herein.
Further, the mapping relationships corresponding to the display areas of the touch display screen may be different. In some embodiments, when there is the same mapping relationship between any two adjacent display regions, the two adjacent display regions may be merged into the same display region.
In some embodiments, if the infrared emitting elements of the touch device are uniformly disposed on the first side and the second side of the touch display screen, and the infrared receiving elements are uniformly disposed on the sides opposite to the infrared emitting elements, the light path distribution and directions in a first area of the touch display screen may be the same as those in a second area. In that case, the touch device only needs to obtain the mapping relationship corresponding to each display area in the first area. If the display area containing the first touch position of the touch object lies in the second area, the touch device may determine the identifier of the touch object based on the mapping relationship of the symmetric display area in the first area. In this way, when calibrating the mapping relationships offline, the user only needs to calibrate the display areas in the first area, which shortens the offline calibration time and improves its efficiency.
Exemplarily, taking that a plurality of infrared emitting elements are uniformly deployed on a second side of the touch display screen and a plurality of infrared receiving elements are uniformly deployed on an opposite side of the second side as an example, fig. 5 is a schematic structural diagram of another touch device based on an infrared technology provided in the present application. As shown in fig. 5, by designing the deployment positions of the infrared transmitting element and the infrared receiving element, the distribution and the directions of the light paths in the area 1, the area 3, the area 7, and the area 9 of the touch display screen can be the same; the optical path distribution and direction in the region 2 and the region 8 are the same; the optical path distribution and direction in both regions 4 and 6 are the same.
In this example, when calibrating the mapping relationship of each display area, the user only needs to calibrate the display areas in area 1, area 2, area 4, and area 5. Further, if the light path distribution and directions are the same in the left half and the right half (or the upper half and the lower half) of area 5, the user may calibrate only the left half (or the right half) of area 5. Further, in some embodiments, considering that area 1 and area 6 lie at diagonal positions of the touch display screen (where the flatness of the touch display screen differs most), the user may calibrate the display areas in area 1, area 6, area 2, area 4, and area 5 to improve the accuracy of the mapping relationships. Areas 3 and 7 can be handled with reference to areas 1 and 6, and are not described again.
For example, if the display area at the first touch position of the touch object is in the area 3, the touch device may determine the identifier of the touch object based on a mapping relationship corresponding to the display areas symmetric to the display area at the first touch position in the area 1.
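The symmetry shortcut of fig. 5 can be sketched as below. The area pairings follow the fig. 5 description (areas 1, 3, 7, 9 share one light path layout; 2 and 8 share another; 4 and 6 a third); the stored mappings themselves are assumptions:

```python
# Sketch: only areas in the calibrated half store a mapping; a touch in a
# mirrored area is resolved through its symmetric counterpart.
SYMMETRIC_AREA = {"area 3": "area 1", "area 7": "area 1",
                  "area 9": "area 1", "area 8": "area 2",
                  "area 6": "area 4"}
CALIBRATED = {"area 1": {(11, 12): "identifier 1"},
              "area 2": {(10, 11): "identifier 1"},
              "area 4": {(12, 13): "identifier 1"}}

def mapping_for(area):
    """Return the area's mapping, falling back to its symmetric area."""
    return CALIBRATED.get(area) or CALIBRATED[SYMMETRIC_AREA[area]]

mapping = mapping_for("area 3")  # resolved via area 1
```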
In this embodiment, each display area corresponds to one mapping relationship, so that when the touch object touches different display areas of the touch display screen, the touch device may determine the identifier of the touch object based on the mapping relationship corresponding to the display area. By the method, when the flatness of the touch display screen is poor, so that the touch objects enter different heights of the light path plane between the infrared receiving element and the infrared emitting element, and further the number of light paths in the same direction, which are shielded by the same touch object when the touch display screen is touched by the same touch object in different areas of the touch display screen, is different, the touch device can determine the identification of the touch object according to the mapping relation corresponding to the display area touched by the touch object, and the accuracy of multi-level identification of the touch object is further improved.
As a possible implementation manner, taking the above-mentioned touch object as a stylus pen as an example, the touch device may also be configured to detect whether the design of the stylus pen is qualified. If the touch control equipment determines that the design of the touch control pen is unqualified, first prompt information used for indicating that the touch control pen is unqualified can be output, so that whether the design of the touch control pen is qualified or not is automatically detected, and the efficiency of detecting whether the design of the touch control pen is qualified or not is improved.
In this implementation, for the identifier of the stylus and the identifier of any touch object other than the stylus in the mapping relationship between the number of light paths in each direction and the identifiers of touch objects, the touch device may compare the numbers of light paths in each direction corresponding to the two identifiers, so as to obtain a similarity rate between the number of light paths in each direction corresponding to the identifier of the stylus and that corresponding to the identifier of the other touch object. The touch device may then determine, based on the similarity rate, whether the stylus is qualified.
In some embodiments, taking a stylus usable at a single end (a single-ended stylus for short) as an example, the touch object other than the stylus may be, for example: another stylus, a finger, or a palm. Taking a stylus usable at both ends (a double-ended stylus for short) as an example, the identifier of the stylus may be the identifier corresponding to the first end of the double-ended stylus, and the touch object other than the stylus may be, for example: the second end of the double-ended stylus, another stylus, a finger, or a palm.
In some embodiments, the touch device may use the number of light paths in each direction corresponding to the identifier of the stylus as the first set of data, and use the number of light paths in each direction corresponding to the identifier of any touch object except the identifier of the stylus as the second set of data. For example, the touch device may compare the light path numbers of the two touch objects in each direction by calculating a covariance between the first set of data and the second set of data. Then, the touch device may use the covariance between the first set of data and the second set of data as a similarity ratio between the number of optical paths in each direction corresponding to the identifier of the stylus and the number of optical paths in each direction corresponding to the identifier of any touch object except the identifier of the stylus.
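The covariance comparison can be sketched as follows. The application uses the covariance between the two sets of counts as the similarity rate; the threshold value and the sample counts below are assumptions for illustration:

```python
# Sketch: covariance between the stylus's per-direction counts (first set)
# and another object's counts (second set), used as the similarity rate.
SIMILARITY_THRESHOLD = 2.0  # assumed preset similarity-rate threshold

def covariance(xs, ys):
    """Sample covariance of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

def stylus_qualified(stylus_counts, other_counts,
                     threshold=SIMILARITY_THRESHOLD):
    # similarity rate at or above the threshold -> objects too alike ->
    # the stylus design is unqualified
    return covariance(stylus_counts, other_counts) < threshold

thin_pen_point = [11, 12, 10, 13]  # per-direction counts for the stylus id
finger = [18, 18, 18, 18]          # per-direction counts for a finger id
qualified = stylus_qualified(thin_pen_point, finger)
```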
It should be understood that the present application does not limit how the touch device calculates the similarity rate between the number of light paths in each direction corresponding to the identifier of the stylus and that corresponding to the identifier of any other touch object. The foregoing is merely one possible implementation provided by the present application; in specific implementations, the similarity rate may also be obtained in other ways.
If the similarity rate is greater than or equal to a preset similarity-rate threshold, the difference between the number of light paths in each direction corresponding to the identifier of the stylus and that corresponding to the identifier of the other touch object is small. For example, taking the other touch object as a finger: if the similarity rate between the per-direction light path counts of the stylus and those of the finger reaches the threshold, then when the stylus touches the touch device, the probability of the stylus being recognized as a finger is high, and the design of the stylus is unqualified.

Therefore, if the similarity rate is greater than or equal to the preset similarity-rate threshold, the touch device may output, on the touch display screen, first prompt information indicating that the stylus is unqualified, so that whether the stylus design is qualified is detected automatically and efficiently. The preset similarity-rate threshold may, for example, be pre-stored in the touch device by the user.

If the similarity rate is smaller than the preset similarity-rate threshold, the difference between the two sets of per-direction light path counts is large, and the mapping relationship can guarantee the accuracy of multi-level recognition of the touch object. The touch device may therefore output, on the touch display screen, second prompt information indicating that the stylus is qualified, further improving the efficiency of detecting whether the stylus design is qualified.
In some embodiments, the touch device may begin performing the method of this embodiment in response to, for example, the user enabling the touch device's function of detecting whether the stylus design is qualified.
As a possible implementation manner, the touch device may further update the mapping relationship between the number of light paths in each direction and the identifier of the touch object, or the mapping relationship between the number of light paths in each direction, the area of the light path region blocked by the touch object, the average value of the brightness values corresponding to the light paths blocked by the touch object, and the identifier of the touch object, so as to further improve the accuracy of determining the identifier of the touch object.
In some embodiments, in response to a mapping relationship update request triggered by the touch object touching a second touch position of the touch display screen, the touch device may display, on the touch display screen, the division of the display areas together with second prompt information prompting the user to draw at least one handwriting in each display area.
Exemplarily, fig. 6 is a schematic view of a display interface of the touch display screen provided in the present application. As shown in fig. 6, in response to the user clicking the control "yes", the touch device may switch to an interface displaying the division of the display areas and the second prompt information. After the second prompt information has been displayed for a preset duration (e.g., three seconds), the touch device may control the touch display screen to hide it and display only the division of the display areas. The user may then draw at least one handwriting in each display area using a stylus or a finger.
It should be understood that fig. 6 is an example of a display interface of a touch display screen, and the application is not limited to whether each display interface of the touch display screen further includes other components.
The user may first input the currently used touch object into the touch device, and then use that touch object to draw at least one handwriting in each display area. In some embodiments, the touch device may display, on the touch display screen, third prompt information prompting the user to input the currently used touch object, together with an input interface for the identifier of the touch object. Correspondingly, the touch device may receive the identifier of at least one touch object input by the user, and the number of light paths in at least one direction blocked by each touch object as the user draws at least one handwriting in each display area.
Then, the touch device may update the mapping relationship according to the identifier of the at least one touch object input by the user and the number of optical paths in the at least one direction blocked by each touch object.
The following describes the technical solution of the present application by way of example, where the touch objects include a double-ended stylus (having a thin pen point and a thick pen point) and a finger, and the touch device determines the identifier of the touch object according to the number of light paths blocked in the first direction, the second direction, the horizontal leftward direction, and the vertical upward direction when the touch object touches the touch display screen, together with the mapping relationship between the number of light paths in each direction and the identifier of the touch object. Fig. 7 is a flowchart illustrating another touch object identification method provided in the present application. As shown in fig. 7, the method includes the following steps:
Step 1: divide the touch display screen into a plurality of display areas. In general, the light paths near the edge of the touch display screen are sparse, while those near the center are dense. The touch device may treat each initial display area near the edge of the touch display screen as a display area on its own, and merge a plurality of initial display areas near the center that have the same number of light paths in each direction into one display area.
By using a plurality of initial display areas with the same number of light paths in each direction as one display area, the number of partitions for the touch display screen can be reduced, the time for acquiring the mapping relation corresponding to each display area is further reduced, and the efficiency for acquiring the mapping relation is improved. By dividing the screen into different regions, the possibility of uneven glass in one region is reduced, and the accuracy of determining the identity of the touch object is further improved.
Step 2: the user draws at least one handwriting in each display area at a plurality of tilt angles, using the thin pen point of the double-ended stylus, the thick pen point of the double-ended stylus, and a finger in turn; for each handwriting, the touch device acquires the number of light paths blocked by the touch object in the first direction, the second direction, the horizontal leftward direction, and the vertical upward direction, as well as the area of the blocked light path region.
After drawing one handwriting, the user may lift the touch object off the touch display screen before drawing the next. The plurality of tilt angles may be, for example, included angles of 30 degrees, 45 degrees, or 60 degrees between the touch object and the touch display screen.
For example, fig. 8 is a schematic view of another touch display screen display interface provided in the present application. As shown in fig. 8, the user may write the character "米" nine times in each display area.
For example, for any display area, taking a tilt angle of 30 degrees between the touch object and the touch display screen as an example, when the user draws a plurality of handwriting in the display area with the thin pen point of the double-ended stylus, the thick pen point of the double-ended stylus, and a finger respectively, the number of light paths blocked by each touch object in the first direction, the second direction, the horizontal leftward direction, and the vertical upward direction for each handwriting, together with the area of the blocked light path region, acquired by the touch device may be, for example, as shown in Table 4 below:
TABLE 4
Step 3: for any touch object in any display area, take the average of the light path counts corresponding to each direction as the number of light paths corresponding to the identifier of that touch object in that direction in the area, and take the average of the areas corresponding to all handwriting as the area of the blocked light path region corresponding to the identifier of that touch object, so as to obtain, for each display area, the mapping relationship among the number of light paths in each direction, the area of the blocked light path region, and the identifier of the touch object.
For example, taking Table 4 above as an example, the average of areas such as area 11 and area 12 (including the areas at different tilt angles) may be used as the area of the blocked light path region corresponding to the thin pen point in the display area, and the average of counts such as number 111 and number 121 (including the counts at different tilt angles) may be used as the number of light paths blocked in the horizontal leftward direction corresponding to the thin pen point in the display area.
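Step 3 can be sketched as a per-object averaging over all calibration samples. The field names and sample values are illustrative assumptions in the spirit of Table 4:

```python
# Sketch: per display area and touch object, average the per-direction
# light path counts and blocked-region areas over all drawn handwriting
# samples (all tilt angles pooled) to form that object's mapping entry.
def build_entry(samples):
    """samples: list of dicts of per-direction counts plus 'area'."""
    keys = samples[0].keys()
    return {k: sum(s[k] for s in samples) / len(samples) for k in keys}

thin_pen_point_samples = [
    {"horizontal_left": 111, "vertical_up": 120, "area": 11},
    {"horizontal_left": 121, "vertical_up": 118, "area": 12},
]
entry = build_entry(thin_pen_point_samples)
```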
Step 4: acquire the first touch position at which the touch object touches the touch display screen, and determine the identifier of the touch object according to the number of light paths in at least one direction blocked by the touch object and the mapping relationship, corresponding to the display area of the first touch position, among the number of light paths in each direction, the area of the blocked light path region, and the identifier of the touch object.
After performing step 4, the touch device may perform step 5, so that the touch display screen displays the drawn handwriting matching the identifier of the touch object. In addition, the touch device may further perform step 6 to detect whether the design of the stylus is qualified.
Step 5: according to the identifier of the touch object, control the touch display screen to display the drawing handwriting matched with the identifier of the touch object.
Step 6: the touch device compares the number of light paths in each direction corresponding to the identifier of the thick pen tip with the number of light paths in each direction corresponding to the identifier of the thin pen tip, obtaining a similarity between the two.
If the similarity is greater than or equal to a preset similarity threshold, step 7 is executed; if the similarity is smaller than the preset similarity threshold, step 8 is executed.
Step 7: output first prompt information indicating that the stylus is unqualified.
Step 8: output second prompt information indicating that the stylus is qualified.
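Steps 6-8 can be sketched as below. The patent does not define the similarity formula, so this sketch assumes a simple per-direction ratio of the smaller count to the larger; the function names, the threshold value, and the prompt strings are all illustrative assumptions.

```python
def direction_similarity(counts_a, counts_b):
    """Hypothetical similarity measure: average, over all directions, of the
    ratio of the smaller blocked-path count to the larger (1.0 = identical)."""
    directions = set(counts_a) | set(counts_b)
    ratios = []
    for d in directions:
        a, b = counts_a.get(d, 0), counts_b.get(d, 0)
        ratios.append(1.0 if a == b == 0 else min(a, b) / max(a, b))
    return sum(ratios) / len(ratios)

def check_stylus(thick_counts, thin_counts, threshold=0.8):
    """Steps 6-8: if the two tips block light paths too similarly, the device
    cannot tell them apart, so the stylus design is reported as unqualified."""
    if direction_similarity(thick_counts, thin_counts) >= threshold:
        return "first prompt: stylus unqualified"
    return "second prompt: stylus qualified"
```

The design rationale follows from step 4: identification relies on the two tips producing distinguishable blocked-path profiles, so near-identical profiles defeat recognition.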
In some embodiments, the touch device may further update the mapping relationship: if the touch device receives a mapping relationship update request triggered by a user, it may perform the foregoing steps 1-3 again to update the mapping relationship.
Fig. 9 is a schematic structural diagram of a touch object recognition apparatus provided in the present application. As shown in Fig. 9, the apparatus 200 includes an acquisition module 21, a processing module 22, and a control module 23.
the obtaining module 21 is configured to obtain the number of light paths in at least one direction that are blocked when the touch object touches the touch display screen.
The processing module 22 is configured to determine the identifier of the touch object according to the number of light paths in the at least one direction and the mapping relationship between the number of light paths in each direction and the identifier of the touch object.
The control module 23 is configured to control the touch device to enter a working mode matching the identifier of the touch object, according to that identifier.
In some embodiments, the processing module 22 is specifically configured to: obtain the area of the light path region blocked when the touch object touches the touch display screen, and the brightness value corresponding to each light path blocked when the touch object touches the touch display screen; and determine the identifier of the touch object according to the number of light paths in the at least one direction, the area of the blocked light path region, and the average of the brightness values of the blocked light paths, using the mapping relationship among four quantities: the number of light paths in each direction, the area of the light path region blocked by a touch object, the average brightness value of the light paths blocked by that touch object, and the identifier of the touch object.
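One plausible reading of this embodiment is a nearest-match lookup over the three observed features. The patent does not specify the matching rule; the squared-distance metric, the feature dictionary layout, and the name `identify_touch_object` below are assumptions for illustration only.

```python
def identify_touch_object(observed, mapping):
    """Return the identifier whose stored feature profile is closest to the
    observation. `observed` and each mapping entry are hypothetical dicts:
    {"paths": {direction: count}, "area": float, "brightness": float}.
    """
    def distance(a, b):
        # squared differences over per-direction counts, blocked area,
        # and average brightness of the blocked light paths
        dirs = set(a["paths"]) | set(b["paths"])
        d = sum((a["paths"].get(k, 0) - b["paths"].get(k, 0)) ** 2 for k in dirs)
        d += (a["area"] - b["area"]) ** 2
        d += (a["brightness"] - b["brightness"]) ** 2
        return d

    return min(mapping, key=lambda obj_id: distance(observed, mapping[obj_id]))
```

A practical implementation would normalize the three feature scales before comparing; this sketch omits that for brevity.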
In some embodiments, the touch display screen includes a plurality of display areas, each corresponding to one mapping relationship. In this implementation, the processing module 22 is specifically configured to: obtain a first touch position at which a touch object touches the touch display screen; and determine the identifier of the touch object according to the number of light paths in at least one direction blocked by the touch object and the mapping relationship corresponding to the display area containing the first touch position.
In some embodiments, the processing module 22 is specifically configured to: obtain the number of light paths in each direction in each initial display area of the touch display screen; and, for any two adjacent initial display areas, merge the two into one display area when the number of light paths in each direction is the same between them.
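The merge rule can be sketched as a single pass over an ordered run of adjacent areas. The list-of-tuples input format and the function name `merge_display_areas` are assumptions introduced here, not the patent's data structures.

```python
def merge_display_areas(initial_areas):
    """Merge adjacent initial display areas whose per-direction light path
    counts are identical. `initial_areas` is an ordered list of
    (area_id, {direction: path_count}) for one run of adjacent areas."""
    merged = []
    for area_id, counts in initial_areas:
        if merged and merged[-1][1] == counts:
            # same counts in every direction: fold into the previous area
            merged[-1][0].append(area_id)
        else:
            merged.append(([area_id], dict(counts)))
    return merged
```

Merging identical areas shrinks the number of stored mapping relationships without losing any discriminative information, since the folded areas would have produced identical lookups anyway.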
In some embodiments, the touch object is a stylus. In this implementation, the processing module 22 is specifically configured to: for the identifier of the stylus in the mapping relationship and the identifier of any touch object other than the stylus, compare the numbers of light paths in each direction corresponding to the two identifiers to obtain a similarity between the number of light paths in each direction corresponding to the identifier of the stylus and that corresponding to the identifier of the other touch object; and, when the similarity is greater than or equal to a preset similarity threshold, output first prompt information through the touch display screen. The first prompt information indicates that the stylus is unqualified.
In some embodiments, the processing module 22 is further configured to: in response to a mapping relationship update request triggered at a second touch position where the touch object touches the touch display screen, display a display area division interface and second prompt information on the touch display screen, the second prompt information prompting the user to draw at least one piece of handwriting in each display area; receive the identifier of at least one touch object input by the user, and the number of light paths in at least one direction blocked by each touch object while the user draws in each display area; and update the mapping relationship according to the received identifiers and the numbers of blocked light paths.
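The update flow above can be sketched as re-running the step 3 averaging over the newly drawn strokes and overwriting the stored entries. The record format and the name `handle_update_request` are illustrative assumptions.

```python
from collections import defaultdict

def handle_update_request(mapping, drawn_strokes):
    """Sketch of the mapping update: after the user draws handwriting in each
    display area with each touch object, average the blocked path counts per
    (area, object, direction) and overwrite the stored mapping entries.

    `drawn_strokes`: hypothetical list of (area_id, object_id,
    {direction: blocked_path_count}) tuples from the calibration session.
    """
    acc = defaultdict(lambda: defaultdict(list))
    for area_id, obj_id, dir_counts in drawn_strokes:
        for direction, n in dir_counts.items():
            acc[(area_id, obj_id)][direction].append(n)
    for key, dirs in acc.items():
        mapping[key] = {d: sum(v) / len(v) for d, v in dirs.items()}
    return mapping
```

Only the (area, object) pairs actually redrawn are overwritten, so a partial calibration session leaves the remaining entries untouched.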
In some embodiments, the touch object is any one of: stylus, finger, palm.
In some embodiments, the touch device is a touch-all-in-one machine.
The touch object recognition apparatus provided in the present application is used to execute the embodiments of the touch object recognition method; its implementation principle and technical effect are similar, and are not repeated here.
Taking the above touch device as an intelligent interactive tablet as an example, Fig. 10 is a schematic structural diagram of a touch device provided in the present application. As shown in Fig. 10, the touch device 1000 may include at least one processor 1001, at least one network interface 1004, a user interface 1003, a memory 1005, and at least one communication bus 1002.
The communication bus 1002 is used to enable communication between these components.
The user interface 1003 may include a display screen and a camera; optionally, the user interface 1003 may further include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wi-Fi interface).
The memory 1005 may include random access memory (RAM) or read-only memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable medium. The memory 1005 may be used to store instructions, programs, code, sets of code, or sets of instructions. The memory 1005 may include a program storage area and a data storage area: the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), and instructions for implementing the method embodiments described above; the data storage area may store the data referred to in the method embodiments above. The memory 1005 may optionally be at least one storage device located remotely from the processor 1001. As shown in Fig. 10, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an operating application program of the touch device.
In the touch device 1000 shown in fig. 10, the user interface 1003 is mainly used for providing an input interface for a user and acquiring data input by the user; the processor 1001 may be configured to call an operation application of the touch device stored in the memory 1005, and specifically execute the touch object recognition method according to any of the embodiments.
In an embodiment, the operating system of the touch device is the Android system, in which the processor 1001 may execute the touch object identification method described in any of the above embodiments.
The present application also provides a computer-readable storage medium, which may include various media that can store program code, such as a USB flash disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk. In particular, the computer-readable storage medium stores program instructions used for the methods in the foregoing embodiments.
The present application also provides a program product comprising execution instructions stored in a readable storage medium. The at least one processor of the electronic device may read the execution instruction from the readable storage medium, and the execution of the execution instruction by the at least one processor causes the electronic device to implement the touch object recognition method provided in the above-described various embodiments.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.
Claims (10)
1. A touch device, comprising:
the touch display screen is used for displaying;
the infrared emission element is arranged on at least one side edge of the touch display screen and used for emitting infrared rays;
the infrared receiving element is arranged on the side edge of the touch display screen opposite to the infrared emitting element and used for receiving infrared light; an optical path is formed between the infrared transmitting element and an infrared receiving element which can receive infrared rays emitted by the infrared transmitting element;
a processor coupled to the touch display screen, the infrared emitting element, and the infrared receiving element, the processor configured to:
acquiring the number of shielded optical paths in at least one direction when a touch object touches the touch display screen;
determining the identifier of the touch object according to the number of the optical paths in the at least one direction and the mapping relation between the number of the optical paths in each direction and the identifier of the touch object;
and controlling the touch equipment to enter a working mode matched with the identifier of the touch object according to the identifier of the touch object.
2. The touch device of claim 1, wherein the processor is configured to:
acquiring the area of a light path region shielded when the touch object touches the touch display screen and the brightness value corresponding to each light path shielded when the touch object touches the touch display screen;
and determining the identifier of the touch object according to the number of the light paths in the at least one direction, the area of the blocked light path region, and the average of the brightness values corresponding to the blocked light paths, using the mapping relationship among the number of light paths in each direction, the area of the light path region blocked by a touch object, the average brightness value of the light paths blocked by the touch object, and the identifier of the touch object.
3. The touch device of claim 1 or 2, wherein the touch display screen includes a plurality of display areas, each display area corresponding to one of the mapping relationships, and the processor is configured to:
acquiring a first touch position where a touch object touches the touch display screen;
and determining the identifier of the touch object according to the number of light paths in at least one direction shielded by the touch object and the mapping relation corresponding to the display area where the first touch position is located.
4. The touch device of claim 3, wherein the processor is configured to:
acquiring the number of light paths in each direction in each initial display area of the touch display screen;
for any two adjacent initial display areas, if the number of optical paths in each direction is the same between the two initial display areas, taking the two initial display areas as one display area.
5. The touch device of claim 1 or 2, wherein the touch object is a stylus, and wherein the processor is configured to:
for the identifier of the stylus in the mapping relationship and the identifier of any touch object other than the stylus, comparing the numbers of light paths in each direction corresponding to the two identifiers to obtain a similarity between the number of light paths in each direction corresponding to the identifier of the stylus and the number of light paths in each direction corresponding to the identifier of the other touch object;
and if the similarity is greater than or equal to a preset similarity threshold, outputting first prompt information by using the touch display screen, wherein the first prompt information is used for indicating that the stylus is unqualified.
6. The touch device of claim 1 or 2, wherein the processor is configured to:
responding to a mapping relation updating request triggered by a second touch position where the touch object touches the touch display screen, and displaying each display area division interface and second prompt information on the touch display screen; the second prompt information is used for prompting a user to draw at least one handwriting in each display area;
receiving an identifier of at least one touch object input by the user, and the number of optical paths in at least one direction shielded by each touch object when the user draws at least one handwriting in each display area;
and updating the mapping relation according to the identification of the at least one touch object input by the user and the number of the optical paths in the at least one direction shielded by each touch object.
7. The touch device of claim 1 or 2, wherein the touch object is any one of: stylus, finger, palm.
8. The touch device of claim 1 or 2, wherein the touch device is a touch all-in-one machine.
9. A touch object identification method is applied to touch equipment, and the touch equipment comprises a touch display screen for displaying; the infrared emission element is arranged on at least one side edge of the touch display screen and used for emitting infrared rays; the infrared receiving element is arranged on the side edge of the touch display screen opposite to the infrared emitting element and used for receiving infrared light; an optical path is formed between the infrared transmitting element and an infrared receiving element capable of receiving infrared light emitted by the infrared transmitting element, and the method comprises the following steps:
acquiring the number of shielded optical paths in at least one direction when a touch object touches the touch display screen;
determining the identifier of the touch object according to the number of the optical paths in the at least one direction and the mapping relation between the number of the optical paths in each direction and the identifier of the touch object;
and controlling the touch equipment to enter a working mode matched with the identifier of the touch object according to the identifier of the touch object.
10. A touch object recognition device is characterized in that the device is applied to touch equipment, and the touch equipment comprises a touch display screen for displaying; the infrared emission element is arranged on at least one side edge of the touch display screen and used for emitting infrared rays; the infrared receiving element is arranged on the side edge of the touch display screen opposite to the infrared emitting element and used for receiving infrared light; an optical path is formed between the infrared transmitting element and an infrared receiving element capable of receiving infrared light emitted from the infrared transmitting element, and the apparatus includes:
the acquisition module is used for acquiring the number of light paths in at least one direction that are blocked when a touch object touches the touch display screen;
the processing module is used for determining the identifier of the touch object according to the number of the optical paths in the at least one direction and the mapping relation between the number of the optical paths in each direction and the identifier of the touch object;
and the control module is used for controlling the touch equipment to enter a working mode matched with the identifier of the touch object according to the identifier of the touch object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111163469.3A CN115904129A (en) | 2021-09-30 | 2021-09-30 | Touch device, touch object identification method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115904129A true CN115904129A (en) | 2023-04-04 |
Family
ID=86493816
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |