
KR20160129406A - Wearable device - Google Patents

Wearable device Download PDF

Info

Publication number
KR20160129406A
Authority
KR
South Korea
Prior art keywords
optical signal
wearable device
blood vessel
information
unit
Prior art date
Application number
KR1020150061522A
Other languages
Korean (ko)
Inventor
박준호
Original Assignee
박준호
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 박준호 filed Critical 박준호
Priority to KR1020150061522A priority Critical patent/KR20160129406A/en
Priority to US15/517,923 priority patent/US10474191B2/en
Priority to PCT/KR2015/010825 priority patent/WO2016060461A1/en
Publication of KR20160129406A publication Critical patent/KR20160129406A/en
Priority to US16/601,359 priority patent/US10908642B2/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a wearable device including an optical signal transmitting unit for transmitting an optical signal, an optical signal detecting unit for receiving a reflected optical signal generated when the optical signal is reflected from a target object, a data processing unit for processing the received reflected optical signal, and a key determination unit that detects a key input operation of the user and generates an input value matched to the key input operation.

Description

WEARABLE DEVICE

The present invention relates to a wearable device.

Electronic devices have become essential in everyday life, and each of them includes its own input means. However, these conventional input means have not advanced far beyond two-dimensional devices such as the keyboard and the mouse, and they still leave much to be desired in terms of portability and convenience.

Accordingly, an input means that satisfies both portability and convenience is needed. In particular, given the trend toward miniaturization of electronic devices, such a new input means must be able to handle a wide variety of input values, in addition to being portable and convenient, so that the full functionality of the electronic device can be used.

SUMMARY OF THE INVENTION The present invention has been made to solve the above problems, and it is an object of the present invention to allow a user to input data conveniently through a portable input means.

Another object of the present invention is to enable various kinds of data input so that the wearable device can replace a keyboard and a mouse as input means.

Yet another object of the present invention is to maintain the accuracy of input data while maintaining portability which is an advantage of wearable devices.

It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not intended to limit the invention to the particular form disclosed.

According to an aspect of the present invention, there is provided a wearable device including an optical signal transmitter for transmitting an optical signal, an optical signal detector for receiving a reflected optical signal generated when the optical signal is reflected from a target object, a data processor for processing the received reflected optical signal, and a key determiner for detecting a key input operation of the user based on the data obtained by processing the reflected optical signal and generating an input value matched to the key input operation. The optical signal transmitter transmits one or more optical signals, the optical signal detector receives the one or more reflected optical signals produced by those optical signals, the data processor generates pattern information on the blood vessels of the target object based on the one or more reflected optical signals, and the key determiner detects the key input operation by comparing the information on the blood vessels, which changes with the key input operation, against the stored pattern information.

When the optical signal transmitting unit transmits a single optical signal, the data processing unit can acquire blood vessel data from the contrast difference between the blood vessels and the surrounding tissue in the data obtained by processing the single reflected optical signal.

The data processing unit can generate the pattern information on the blood vessel using the blood vessel data.
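The contrast-based extraction described above can be illustrated with a short sketch. The following Python snippet is a minimal, illustrative example only: it assumes the reflected optical signal has already been processed into a grayscale intensity image, and the normalization and threshold used here (a hypothetical `contrast_margin`) are not specified by the present disclosure.

```python
import numpy as np

def extract_vessel_pattern(nir_image: np.ndarray, contrast_margin: float = 0.15) -> np.ndarray:
    """Segment the darker (vessel) pixels of a near-infrared reflection image.

    Veins absorb more near-infrared light than the surrounding tissue, so they
    appear darker in the reflected image. This sketch marks every pixel that is
    darker than the image mean by more than `contrast_margin`.
    """
    img = nir_image.astype(np.float64)
    img = (img - img.min()) / (img.max() - img.min() + 1e-9)  # normalize to [0, 1]
    return img < (img.mean() - contrast_margin)               # True where a vessel is assumed

# Synthetic 4x4 "image" with one darker (vessel) column
frame = np.array([[0.8, 0.3, 0.8, 0.8],
                  [0.8, 0.2, 0.8, 0.8],
                  [0.8, 0.3, 0.8, 0.8],
                  [0.8, 0.2, 0.8, 0.8]])
print(extract_vessel_pattern(frame))
```

The resulting binary mask can then serve as the pattern information on the blood vessels referred to above.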

One optical signal may be an optical signal in the near infrared region.

When the optical signal transmitting unit transmits two or more optical signals, the data processing unit can acquire the blood vessel data by comparing the data obtained by processing the two or more corresponding reflected optical signals.

The two or more optical signals are optical signals in the near-infrared or infrared region, and may differ from each other in at least one of wavelength, transmission time, reception time, frequency, and polarization state.

The wearable device may further include a depth sensor that scans the target object three-dimensionally and generates three-dimensional scan information, and an image processing unit that generates a three-dimensional model of the target object based on the three-dimensional scan information and adds the blood vessel pattern to the model.

The wearable device can detect a key input operation by comparing information about a blood vessel changing according to a key input operation to a pattern added to the three-dimensional model.

The information on the blood vessel can be obtained by detecting the distribution of at least one of hue, saturation, and brightness due to the blood vessel in the object.

The key determining unit determines the three-dimensional positions of a first joint, which connects the user's palm and the first phalanx of a finger, and a second joint, which connects the first phalanx and the second phalanx of the finger, and can generate an input value based on the three-dimensional positions of the first joint and the second joint.

The key determining unit may determine the three-dimensional positions of the first joint and the second joint together with the angles at which the first and second joints are bent, and may calculate the three-dimensional position of the fingertip from the three-dimensional positions and bend angles of the two joints in order to detect the key input operation.
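The following sketch illustrates the geometric idea of estimating a fingertip position from the two joint positions and a bend angle. It is only an illustration under simplifying assumptions not stated in the disclosure: the distal part of the finger is treated as a single rigid link of assumed length `tip_len`, and the bend axis is derived from an assumed palm-normal direction.

```python
import numpy as np

def fingertip_position(joint1, joint2, bend_deg, tip_len, palm_normal=(0.0, 0.0, 1.0)):
    """Estimate the 3D fingertip position from two finger joints and a bend angle.

    joint1, joint2 : 3D positions of the first joint (palm/first phalanx) and
                     the second joint (first/second phalanx).
    bend_deg       : bend angle of the distal segment at joint2 (0 = straight).
    tip_len        : assumed length of the segment from joint2 to the fingertip.
    palm_normal    : assumed normal of the palm plane, defining the bend axis.
    """
    j1, j2, n = (np.asarray(v, dtype=float) for v in (joint1, joint2, palm_normal))
    d = j2 - j1
    d /= np.linalg.norm(d)                      # direction of the first segment
    axis = np.cross(d, n)
    axis /= np.linalg.norm(axis)                # bend axis, perpendicular to the finger
    theta = np.radians(bend_deg)
    # Rodrigues' rotation of d about `axis` by theta gives the distal direction
    d_tip = (d * np.cos(theta)
             + np.cross(axis, d) * np.sin(theta)
             + axis * np.dot(axis, d) * (1 - np.cos(theta)))
    return j2 + tip_len * d_tip

# Straight finger along +x, bent 45 degrees at the second joint
print(fingertip_position([0, 0, 0], [4, 0, 0], bend_deg=45, tip_len=3))
```

Comparing the resulting fingertip position against the positions of virtual keys is one way the key input operation could then be resolved into an input value.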

The optical signal detection unit can detect the first reflected optical signal and the second reflected optical signal, respectively, by separating the received reflected optical signal according to the wavelength.

The optical signal sensing unit may sense the first reflected optical signal and the second reflected optical signal separately received in the time domain or the frequency domain.

According to another aspect of the present invention, there is provided a wearable device including an optical signal transmitter for transmitting an optical signal, an optical signal detector for receiving a reflected optical signal generated when the optical signal is reflected from a target object, a data processing unit for processing the received reflected optical signal, a position determining unit for measuring the distance and angle to the target object based on the data obtained by processing the reflected optical signal, and an image output unit for outputting an image to the outside. The optical signal transmitter transmits one or more optical signals, the optical signal detector receives the one or more reflected optical signals produced by those optical signals, the data processing unit generates pattern information on the blood vessels of the target object based on the one or more reflected optical signals, the position determining unit measures the distance and angle to the target object by comparing the sensed blood vessel information with the stored pattern information, and the image output unit outputs the image at a fixed size and a fixed position based on the measured distance and angle.

The wearable device may further comprise a finger recognition unit for sensing the user's finger skin lines and generating pattern information about the skin lines. The image output unit may output the image at a fixed position by comparing the generated skin line pattern information with previously stored skin line pattern information.

According to the embodiments of the present invention, the following effects can be expected.

First, users can input data in an improved manner through a wearable device that provides both portability and convenience.

Second, since the wearable device can replace a keyboard and a mouse, various kinds of data can be input with the wearable device alone, without any additional input means.

Third, the accuracy of data input can be maintained without sacrificing the portability of the wearable device, providing the user with an improved data input environment.

The effects obtainable from the embodiments of the present invention are not limited to those mentioned above; other effects not mentioned here will be clearly understood by those skilled in the art from the following description of the embodiments. In other words, effects that were not specifically anticipated when implementing the present invention may also be derived by those skilled in the art from the embodiments described below.

BRIEF DESCRIPTION OF THE DRAWINGS The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention. It is to be understood, however, that the technical features of the present invention are not limited to the specific drawings, and the features disclosed in the drawings may be combined with each other to constitute a new embodiment. Reference numerals in the drawings refer to structural elements.
FIG. 1 is a block diagram showing the configuration of a wearable device according to an embodiment of the present invention.
FIG. 2 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention.
FIG. 3 is a diagram illustrating an operation process of a wearable device according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an operation process of a wearable device according to an embodiment of the present invention.
FIG. 5 is a view for explaining the operation of a wearable device according to an embodiment of the present invention.
FIG. 6 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention.
FIG. 7 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention.
FIG. 8 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention.
FIG. 9 is a view for explaining the operation of a wearable device according to an embodiment of the present invention.
FIG. 10 is a view showing an embodiment of a wearable device according to another embodiment of the present invention.

While the present invention has been described in connection with what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. Also, in certain cases, a term may have been selected arbitrarily by the applicant, in which case its meaning is described in detail in the corresponding part of the description. Therefore, the terms used in the present invention should be interpreted based on their meaning and on the overall content of the specification, not simply on their literal names.

The following embodiments combine elements and features of the present invention in predetermined forms. Each element or feature may be considered optional unless otherwise expressly stated. Each element or feature may be implemented without being combined with other elements or features, and some elements and/or features may be combined to form an embodiment of the present invention. The order of the operations described in the embodiments of the present invention may be changed. Some configurations or features of one embodiment may be included in another embodiment, or may be replaced with corresponding configurations or features of another embodiment.

In the description of the drawings, procedures or steps that might obscure the gist of the present invention are not described, nor are procedures or steps that can readily be understood by those skilled in the art.

Throughout the specification, when an element is described as "comprising" or "including" a component, this means that the element may also include other components and does not exclude them, unless stated otherwise. In addition, terms such as "unit" and "module" refer to a unit that processes at least one function or operation, and may be implemented in hardware, in software, or in a combination of hardware and software. Also, throughout the specification, when a configuration is described as being "connected" to another configuration, this includes not only a physical connection but also an electrical connection, and furthermore a logical connection.

Also, terms such as "a", "an", "one", and "the" may be used, in the context of describing the invention (particularly in the context of the following claims), in a sense that includes both the singular and the plural, unless the context clearly dictates otherwise.

In this specification, the term "user" may refer to a wearer of the wearable device or a person using it, and may also include a technician repairing the wearable device, but the present invention is not limited thereto.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. The following detailed description, together with the accompanying drawings, is intended to illustrate exemplary embodiments of the invention and is not intended to represent the only embodiments in which the invention may be practiced.

In addition, the specific terminology used in the embodiments of the present invention is provided to help understanding of the present invention, and the use of such specific terminology can be changed into other forms without departing from the technical idea of the present invention.

Before describing embodiments of the present invention, Korean Patent Application No. 10-2014-0108341 and Korean Patent Application No. 10-2014-0139081, both filed by the same inventor and applicant as the present application, are cited here by reference. Patent Application No. 10-2014-0108341 proposes an invention in which a target object is scanned three-dimensionally by a wearable device to generate a three-dimensional model, and a pattern is added to the three-dimensional model to detect the user's movement. Patent Application No. 10-2014-0139081 proposes an invention in which the user's movement is detected by transmitting and receiving optical signals of different wavelengths, analyzing the user's blood vessel pattern, and comparing the results.

FIG. 1 is a block diagram showing the configuration of a wearable device according to an embodiment of the present invention.

The wearable device 100 may include other general components in addition to those shown in FIG. 1, and may also be implemented with fewer components than those shown in FIG. 1. That is, the embodiments and the scope of rights of the wearable device 100 are not limited to what is shown in FIG. 1.

The wearable device 100 is an input/output means that is mounted on a part of the user's body (e.g., a hand). The wearable device 100 senses the movement of the user's body by various means and generates data and signals corresponding to the action that the sensed movement forms. The wearable device 100 may transmit the generated data and signals to an external device or a server, thereby operating as an input means for the external device.

Hereinafter, various components of the wearable device 100 will be described. The wearable device 100 according to one embodiment includes an optical signal transmitting unit 105, an optical signal sensing unit 110, a data processing unit 115, a key determining unit 120, a depth sensor 125, an image processing unit 130, a position determining unit 135, an image output unit 140, a gyroscope sensor 145, an acceleration sensor 150, a feedback unit 155, a communication unit 160, a storage unit 165, a power supply unit 170, and a controller 175. The depicted components may be connected to each other by wire or wirelessly to exchange data and signals. As described above, the components shown in FIG. 1 are merely an example of how the wearable device 100 may be configured, so the wearable device 100 may be implemented with fewer or more components than those shown.

The optical signal transmitting unit 105 generates and transmits an optical signal. For example, the optical signal transmitting unit 105 can generate an optical signal having a wavelength in the visible range (about 300 to 700 nm) or in the infrared range (about 700 to 3000 nm). However, the wavelength band of the optical signal generated by the optical signal transmitting unit 105 is not limited to these examples; the optical signal transmitting unit 105 can also generate optical signals of various other wavelengths, such as far-infrared light of longer wavelength, in addition to the near-infrared light mentioned above. Alternatively, the optical signal transmitting unit 105 may generate an optical signal with a continuous spectrum instead of an optical signal having a wavelength in a specific band. A specific example of the wavelength of the optical signal generated by the optical signal transmitting unit 105 will be described with reference to FIG. 2.

The optical signal transmitting unit 105 then transmits the generated optical signal. When transmitting the optical signal, the optical signal transmitting unit 105 may output it in the form of a continuous wave or in the form of discontinuous pulses.

Also, the optical signal transmitting unit 105 may transmit an optical signal in which a pattern is formed. The pattern means a predetermined shape that is formed when the emitted optical signal is projected onto an external surface. For example, the optical signal transmitting unit 105 can transmit an optical signal having a stripe pattern. The pattern of such an optical signal can be programmed and stored in advance, and can be any pattern that the wearable device 100 can identify.

As described above, the optical signal transmitting unit 105 generates and transmits optical signals in various ways. The optical signal transmitting unit 105 may combine the wavelength, type, and output mode of the optical signal to generate and transmit optical signals in various manners. Meanwhile, the wearable device 100 may include one or more optical signal transmitting units 105, and a specific example will be described with reference to FIG.

The optical signal sensing unit 110 senses an optical signal received from the outside. When the optical signal transmitted by the optical signal transmitting unit 105 is reflected by an external object (for example, an object or a part of the human body), various physical property values of the optical signal, such as its intensity, wavelength, frequency band, and energy, are changed. Hereinafter, an optical signal that has been reflected from an external surface and thus has changed physical properties is referred to as a reflected optical signal. In other words, the optical signal sensing unit 110 senses the reflected optical signal generated by reflection from the external object.

Since the optical signal transmitting unit 105 transmits optical signals of various wavelengths, the optical signal sensing unit 110 can sense optical signals of any wavelength that the optical signal transmitting unit 105 can transmit. That is, the optical signal sensing unit 110 can sense optical signals in the visible, near-infrared, and far-infrared bands, and, as described above for the transmitting unit, the wavelengths sensed by the optical signal sensing unit 110 are not limited to these example bands.

The data processing unit 115 processes the optical signal received by the optical signal sensing unit 110 to generate reception data. The process of data processing by the data processing unit 115 may include a process of digitizing an optical signal, which is an analog signal received by the optical signal sensing unit 110. The process of generating received data through data processing may be performed at regular intervals or in accordance with a control command from the controller 175 of the wearable device 100.

On the other hand, when the target object is a part of the body, the received data generated by the data processing unit 115 may include information on the blood vessels of the target object. Specifically, as will be described later, the optical signal transmitted by the optical signal transmitting unit 105 is reflected, scattered, and absorbed by the blood vessels of the target object, which changes its physical property values. Accordingly, by processing the reflected optical signal received by the optical signal sensing unit 110, the data processing unit 115 can acquire information on the arrangement and distribution of the blood vessels of the target object. For convenience of explanation, received data that includes such information on blood vessels is hereinafter referred to as blood vessel data.

As described above, when the optical signal transmitting unit 105, the optical signal sensing unit 110, and the data processing unit 115 sense blood vessels and generate blood vessel data, the depth sensor 125, described later, can operate together with them. Specifically, as will be described later, the depth sensor 125 scans a target object to sense its three-dimensional structure, shape, and position. Accordingly, the blood vessel data for the detected blood vessels can be configured as two-dimensional information about the distribution and arrangement of the blood vessels, or, with the additional operation of the depth sensor 125, as three-dimensional information.

The key determining unit 120 detects the key input operation of the user and generates an input value matching the key input operation. The key input operation means an operation in which a user wearing the wearable device 100 presses a predetermined key on a virtual keyboard. That is, even without an actual keyboard, the user can perform the motion of pressing a key while wearing the wearable device 100, and this motion is the key input operation. The key input operation may also refer to an operation in which the user's finger touches or presses a predetermined surface, for example when the user's finger touches a part of the body such as another finger or the palm. In addition, the key input operation can include the case where the finger does not touch any outer surface but is bent beyond a certain angle. That is, when the user performs, in the air, a motion similar to that of a finger touching an outer surface, this may also correspond to a key input operation even though the finger does not actually touch a surface.

The key determining unit 120 can detect a key input operation performed by the user by using the received data generated by the data processing unit 115. More specifically, the received data includes information on blood vessels, as described above. Accordingly, when the arrangement and distribution of the blood vessels change according to the key input operation of the user, the key determining unit 120 compares the information on the detected blood vessels with pre-stored blood vessel data. This comparison can be performed on at least one of the changes in hue, saturation, brightness, size, and shape caused by the arrangement and distribution of the blood vessels.

The key determining unit 120 can determine which key corresponds to the user's key input operation according to the comparison result. In other words, the key determining unit 120 generates an input value by comparing the information on the blood vessels sensed during the key input operation with the pre-stored information; detailed embodiments will be described later.
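As an illustration of how a sensed blood vessel pattern might be matched against pre-stored patterns to produce an input value, the following sketch compares binary vessel masks with a simple overlap score. The per-key reference masks, the use of intersection-over-union, and the tiny array sizes are all assumptions made for the example; the disclosure itself only states that changes in hue, saturation, brightness, size, or shape are compared.

```python
import numpy as np

def determine_key(sensed_mask: np.ndarray, stored_patterns: dict) -> str:
    """Match a sensed vessel pattern against stored per-key reference patterns.

    stored_patterns maps a key label (e.g. 'A') to the binary vessel mask that
    was recorded while the user pressed that key. The sensed mask is compared
    to each reference with an intersection-over-union score, and the
    best-scoring key label is returned as the input value.
    """
    def iou(a, b):
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        return inter / union if union else 0.0

    return max(stored_patterns, key=lambda k: iou(sensed_mask, stored_patterns[k]))

# Hypothetical example with tiny 2x2 masks
refs = {'A': np.array([[1, 0], [1, 0]], bool),
        'B': np.array([[0, 1], [0, 1]], bool)}
sensed = np.array([[1, 0], [1, 1]], bool)
print(determine_key(sensed, refs))  # -> 'A'
```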

Meanwhile, the generated input value is a value indicating a key pressing operation of the user, and can be transmitted to an external device or server connected to the wearable device 100 or processed internally in the wearable device 100.

The depth sensor 125 scans the object three-dimensionally to generate three-dimensional scan information. That is, the depth sensor 125 transmits various kinds of signals to a target object, detects a change in a signal transmitted from the target surface, or detects a signal reflected from the target object. The depth sensor 125 may analyze the sensed signal to generate three-dimensional scan information for a target object. For example, if the object is a user's hand, the depth sensor 125 may sense the hand three-dimensionally and generate three-dimensional scan information on the contour of the hand.

The depth sensor 125, which scans the object in three dimensions, may include various types of sensors or devices. For example, the depth sensor 125 may include an infrared camera that transmits an infrared signal to the target object and senses changes on its surface, a time-of-flight (ToF) camera that measures the time difference between an ultrasonic or optical signal and its reflection from the target object, a laser transceiver that transmits a laser signal to the target object and senses the reflected signal, and a stereo camera that analyzes the difference between images of the object captured from two positions.
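For the time-of-flight principle mentioned above, the distance follows directly from the round-trip time of the emitted signal. The sketch below shows that relation for an optical signal; the nanosecond-scale example values are illustrative only.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the target computed from the round-trip time of a ToF signal.

    The signal travels to the object and back, so the one-way distance is
    half the round-trip time multiplied by the propagation speed.
    """
    round_trip = receive_time_s - emit_time_s
    return SPEED_OF_LIGHT * round_trip / 2.0

# A reflection received 2 nanoseconds after emission is about 0.3 m away
print(tof_distance(0.0, 2e-9))
```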

In addition, a LIDAR (LIght Detection And Ranging) method, in which pulsed laser light is emitted into the air and the light returned by reflectors or scattering bodies is measured, a speckle interferometry method that detects changes in the pattern of coherent light reflected from the surface of the object, an infrared proximity array (IPA) sensing method using two LEDs, an RGB camera, and the like can also be applied to implement the depth sensor 125.

Meanwhile, when the depth sensor 125 generates the three-dimensional scan information by using a patterned optical signal of a specific wavelength, it can be realized with the same configuration as the optical signal transmitting unit 105 described above. That is, the optical signal transmitting unit 105 can use the patterned optical signal both for detecting blood vessels and for generating three-dimensional scan information. In this case, the optical signal transmitting unit 105 may output optical signals of different wavelengths and additionally output a patterned optical signal, thereby also performing the role of the depth sensor 125, or it may perform only the role of the depth sensor 125. The patterned optical signal that the optical signal transmitting unit 105 outputs in its role as the depth sensor 125 may use any of the wavelengths of the optical signals used for detecting blood vessels, or a different wavelength may be used.

In addition, similarly to what was described for the optical signal transmitting unit 105, the depth sensor 125 may operate in one of two ways when generating scan information. That is, in transmitting the (patterned) optical signal to the object and generating the three-dimensional scan information, the depth sensor 125 may or may not know in advance the time point and the frequency band of the optical signal to be received. Specifically, when the depth sensor 125 knows the transmission time and the wavelength band (or frequency band) of the optical signal to be transmitted, it can calculate in advance the time point at which the optical signal will be received, and generate the three-dimensional scan information from the optical signal received at that time point and in that frequency band. In this case, the depth sensor 125 can transmit the optical signal for generating the three-dimensional scan information in between the transmissions, by the optical signal transmitting unit 105, of the optical signal of the specific wavelength used to acquire information about the blood vessels of the object.

Conversely, even if the depth sensor 125 is not aware of information about the optical signal to be received, it can generate the three-dimensional scan information if it has means for selectively sensing the optical signal to be received. That is, the depth sensor 125 may include a filter or the like for detecting a specific wavelength band of the optical signal. In this case, the depth sensor 125 may selectively sense the received optical signal.

The depth sensor 125 for scanning the object three-dimensionally is not limited to the example of the above-described configurations, and various other configurations may be included in the depth sensor 125. In addition, the depth sensor 125 may be implemented in a combination of two or more of the above-described configurations.

In addition, after the depth sensor 125 scans the target object three-dimensionally, the accuracy of the three-dimensional scan information may be improved by using computer vision techniques. Computer vision techniques are used to improve the accuracy of depth information obtained from two-dimensional image analysis, and include methods such as depth-from-focus, depth-from-stereo, depth-from-shape, and depth-from-motion. By utilizing these various methods, the depth sensor 125 can generate accurate three-dimensional scan information for the target object.
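As one example of such a technique, depth-from-stereo recovers depth from the disparity between two camera views. The following sketch shows the standard pinhole-camera relation; the focal length, baseline, and disparity values are illustrative assumptions, not parameters of the wearable device.

```python
def depth_from_stereo(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point computed from its disparity between two camera views.

    focal_px     : focal length of the cameras in pixels.
    baseline_m   : distance between the two camera centres in metres.
    disparity_px : horizontal shift of the same point between the two images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# With a 700 px focal length and a 5 cm baseline, a 70 px disparity is 0.5 m away
print(depth_from_stereo(700, 0.05, 70))
```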

In the above description, the target object is a hand, which is a part of the user's body, but the present invention is not limited thereto. That is, the target object can be not only a body part but also various things such as objects, spaces, and structures. For example, when the target object is an object such as a mobile phone, a notebook computer, or a desk, the depth sensor 125 can scan the mobile phone or the notebook computer three-dimensionally to generate three-dimensional scan information. In addition, when the wearable device 100 is located in a room, the depth sensor 125 can scan the room and its wall surfaces as the target object. Accordingly, the depth sensor 125 can recognize the three-dimensional space bounded by the wall surfaces of the room and can generate three-dimensional scan information about the wall surfaces. In such a case, the depth sensor 125 can determine the absolute coordinates of the wearable device 100 within the given space, that is, where the wearable device 100 is located.

The image processing unit 130 is connected to the depth sensor 125 to receive and process the 3D scan information. Specifically, the image processor 130 generates a three-dimensional image using the three-dimensional scan information received from the depth sensor 125, and generates a three-dimensional model of the object through the three-dimensional rendering process. For example, if the object is a user's hand, the image processing unit 130 may generate a three-dimensional model of the hand as the depth sensor 125 senses the user's hand. As another example, when the object is an object such as a mobile phone, the image processing unit 130 can generate a three-dimensional model of the mobile phone. The three-dimensional model can be expressed in black and white or color.

In addition, the image processing unit 130 can add a pattern for the blood vessels to the created three-dimensional model by using the received data generated by the data processing unit 115. As described above, the data processing unit 115 processes the optical signal received by the optical signal sensing unit 110 to generate information on the blood vessels of the target object. The image processing unit 130 may process the information about the blood vessels to generate a pattern that can be visually confirmed, and may add the generated pattern to the three-dimensional model generated from the three-dimensional scan information. That is, the image processing unit 130 can generate a three-dimensional model in which the blood vessel pattern is imaged onto a three-dimensional model that previously contained only the outer shape of the object.

Meanwhile, the key determining unit 120 may analyze the pattern of the three-dimensional model generated by the image processing unit 130 in the process of generating the input value by comparing the information about the blood vessel with the stored information. That is, the key determining unit 120 can determine which input value matches the key input operation of the user by comparing the information on the detected pattern with the information on the pattern of the blood vessel added to the three-dimensional model.

The position determining unit 135 determines the distance from an external reference point and the angle at which the wearable device 100 is inclined with respect to it. When the data processing unit 115 generates the received data including information about the blood vessels as described above, the position determining unit 135 analyzes the received data to determine how far the wearable device 100 is from a predetermined external reference point (for example, the target object) and how much it is tilted. This process can be performed by comparing previously stored blood vessel information with the sensed received data, similarly to the process in which the key determining unit 120 generates an input value according to a key input operation. Alternatively, the position determining unit 135 can determine the distance and angle to the reference point by comparing the received data with the three-dimensional model to which the blood vessel pattern has been added.

Alternatively, the position determining unit 135 may determine the position of, and distance to, the target object by using the depth sensor 125 described above. That is, the position determining unit 135 can recognize the position and distance of the target object by analyzing the three-dimensional model of the target object generated via the depth sensor 125 and the sensed outer shape of the target object. As another example, the position determining unit 135 may determine the positions of the target object and of the wearable device itself by using the gyroscope sensor 145 and the acceleration sensor 150, which are described later.

The image output unit 140 projects an image to the outside. The image output unit 140 can output an image onto an external surface such as an object or a part of the body. For example, the image output unit 140 may project an image onto the palm, the hand, or the arm, or onto an object such as a desk or a wall. The image projected by the image output unit 140 may be of any kind, including still images, moving images, and three-dimensional (stereoscopic) images.

Meanwhile, the image output unit 140 may use the information determined by the position determining unit 135 in the process of projecting the image. More specifically, the image output unit 140 uses the distance and angle to the target object measured by the position determining unit 135 so that, even if the wearable device 100 moves, the image continues to be projected at the same position and size. In other words, the position determining unit 135 measures and analyzes its own motion in three-dimensional space to calculate the distance and angle between itself and the reference point (for example, the surface onto which the image is projected). The image output unit 140 may then change the angle and position at which it outputs the image so that the image is projected uniformly, taking the calculated distance and angle into account.
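The idea of compensating the output so that the projected image keeps a fixed physical size and shape can be sketched as follows. This is a simplified illustration under assumed parameters (a projector field of view, a desired image width, and a single tilt angle); the disclosure does not specify how the image output unit performs this compensation internally.

```python
import math

def projection_settings(distance_m: float, tilt_deg: float,
                        target_width_m: float = 0.08, fov_deg: float = 40.0):
    """Compute an output scale and keystone factor so the projected image keeps
    a fixed physical size and shape as the device moves.

    distance_m     : measured distance from the projector to the surface.
    tilt_deg       : measured tilt of the projector relative to the surface normal.
    target_width_m : desired physical width of the image on the surface (assumed).
    fov_deg        : projector's horizontal field of view (assumed).
    """
    full_width_m = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    scale = min(1.0, target_width_m / full_width_m)    # shrink output as distance grows
    keystone = 1.0 / math.cos(math.radians(tilt_deg))  # stretch to undo tilt foreshortening
    return scale, keystone

print(projection_settings(distance_m=0.15, tilt_deg=20))
```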

Alternatively, the image output unit 140 may take information about the user's skin lines into account in the process of projecting the image. That is, the image output unit 140 may project the image at a fixed position and angle relative to the skin lines by using information about the user's skin lines (for example, on the palm) obtained by various methods. When this embodiment is combined with the preceding embodiment in which the position determining unit 135 uses information on the blood vessels, the angle and position at which the image output unit 140 outputs the image can be fixed even more reliably.

On the other hand, the wearable device 100 can grasp and manage the biometric information such as the skin line of the user in advance in order to consider the above-described information about the skin line. That is, as described in the aforementioned cited patent application 10-2014-0108341, the wearable device 100 senses the skin line of the user through the finger recognition unit including the infrared camera, the RGB camera, the ToF camera, and the like. The information about the sensed skin line is processed and stored in the wearable device 100 as information about the skin line pattern. Then, the wearable device 100 can detect the skin line of the user in the process of projecting the image and compare / analyze the information with the information about the skin line pattern previously stored. The wearable device 100 can grasp the position and motion of a user's body part through such a process, and can output the image at a fixed position and angle as described above.

In the above description, the optical signal transmitting unit 105 and the image output unit 140 are shown as separate components, but the two may be implemented as a single component. That is, the optical signal transmitting unit 105 can transmit not only optical signals of various wavelengths but also the image output by the wearable device 100. An optical signal transmitting unit 105 implemented to also perform the role of the image output unit 140 may output the image while periodically or aperiodically transmitting the optical signals of different wavelengths in alternation. When an optical signal for detecting blood vessels is output in the middle of outputting the image, it can be handled by the selective detection of the optical signal of the specific wavelength described above. Moreover, since the image output unit outputs the image at visible-light wavelengths that the user can see, the visible-light signal output as the image can itself be used as one of the optical signals for detecting the blood vessels. Accordingly, if the optical signal transmitting unit 105 additionally transmits only an optical signal of an infrared wavelength, the same or a similar effect as transmitting two optical signals of different wavelengths can be obtained.

In other words, the optical signal transmitting unit 105 may sequentially output the first optical signal, the second optical signal, and the image, and the optical signals may be output in relatively very short time intervals. In this case, the user cannot visually perceive the optical signals output for such a short time, and sees only the image.
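One simple way to realize such time multiplexing is to build an output schedule in which brief optical-signal pulses are interleaved between image frames. The sketch below is only an illustration; the frame count and pulse spacing are assumed values, not part of the disclosure.

```python
def output_schedule(n_frames: int, pulse_every: int = 6):
    """Interleave visible-light image frames with brief near-infrared pulses.

    Every `pulse_every` frames a short NIR pulse is inserted between image
    frames; the pulse is too brief for the user to perceive, so only the
    projected image remains visible.
    """
    schedule = []
    for frame in range(n_frames):
        schedule.append("image_frame")    # visible output, doubles as the second signal
        if frame % pulse_every == 0:
            schedule.append("nir_pulse")  # first optical signal, for vessel detection
    return schedule

print(output_schedule(12))
```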

The optical signal transmitting unit 105 may likewise be implemented in the same configuration as the depth sensor 125 described above. That is, when the optical signal transmitting unit 105 performs both the role of the depth sensor 125 and the role of the image output unit 140, the three components may be implemented as a single component. In this embodiment, the optical signal transmitting unit 105 may transmit the optical signal in the middle of outputting the image, and may also serve as the depth sensor 125 by transmitting the patterned optical signal.

The gyroscope sensor 145 senses the inclination of the wearable device 100 by measuring the angular velocity. The kind and function of the gyroscope sensor 145 are obvious to those of ordinary skill in the art, and a detailed description thereof will be omitted. The acceleration sensor 150 measures the acceleration and the tilt of the wearable device 100 by measuring a change in velocity. Acceleration sensor 150 is also known in terms of its type and function, and a detailed description thereof will be omitted.

Meanwhile, the gyroscope sensor 145 and the acceleration sensor 150 measure the movement of the wearable device 100 in three-dimensional space. That is, the gyroscope sensor 145 and the acceleration sensor 150 sense the mouse input operation by measuring in which direction, at what speed, and at what inclination the wearable device 100 moves in three-dimensional space. The mouse input operation refers to an input in which the user, while wearing the wearable device 100, moves it through space to operate a mouse cursor. The key determining unit 120 detects the movement of the wearable device 100 in space using the values measured by the gyroscope sensor 145 and the acceleration sensor 150, and can generate a cursor value corresponding to that movement.
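The mapping from a measured angular velocity to a cursor movement can be sketched as follows. This is a minimal illustration of an "air mouse" style mapping; the axis convention and the sensitivity constant are assumptions, and the disclosure does not prescribe a particular formula.

```python
def cursor_delta(gyro_dps, dt_s, sensitivity_px_per_deg=8.0):
    """Convert one angular-velocity sample into a 2D cursor movement.

    gyro_dps : (yaw_rate, pitch_rate) in degrees per second from the gyroscope.
    dt_s     : time elapsed since the previous sample.
    Yaw moves the cursor horizontally and pitch moves it vertically;
    the sensitivity constant is an assumed tuning value.
    """
    yaw_rate, pitch_rate = gyro_dps
    dx = yaw_rate * dt_s * sensitivity_px_per_deg
    dy = pitch_rate * dt_s * sensitivity_px_per_deg
    return dx, dy

# Device rotated 30 deg/s to the right for a 10 ms sample -> ~2.4 px to the right
print(cursor_delta((30.0, 0.0), 0.01))
```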

That is, the wearable device 100 may operate as a 'spatial mouse' that transmits the cursor value to the outside to perform the role of the input device. In addition, the wearable device 100 can be implemented to perform a role of a spatial mouse for a body or an external object by generating a three-dimensional model of a body or an external object using the depth sensor 125 described above.

In addition, the mouse click operation, which is part of the mouse input operation, will now be described. The mouse click operation refers to an input corresponding to clicking the left or right button of a mouse, performed by touching two or more fingers together while the user wears the wearable device 100 and performs a mouse input operation. For example, the wearable device 100 may recognize the case where the user touches the thumb with the index finger as a click of the left mouse button, and the case where the user touches the thumb with the middle finger as a click of the right mouse button. The click operation is generated as a mouse click value and can be transmitted to an external device or a server.

The feedback unit 155 is a means by which the wearable device 100 delivers tactile feedback to the user. The tactile feedback can be generated and delivered to the user in various situations: for example, when the wearable device 100 is located at, or passes through, specific coordinates in space, or when the wearable device 100 receives an external signal indicating that tactile feedback should be delivered to the user.

The feedback unit 155 may deliver the tactile feedback to the user in various ways. For example, the feedback unit 155 may include a vibration module that delivers a vibration signal to the user, or a pressure module that lets the user wearing the wearable device 100 feel pressure. In addition, the feedback unit 155 may provide tactile feedback through a shear stress module, or may deliver, through a current module, a microcurrent weak enough not to harm the user's body.

The communication unit 160 performs data communication and performs transmission and reception with the outside. For example, the communication unit 160 may include one or more communication modules for communicating with an external device, a server, and the like, and performing communication by being connected to an external network wirelessly.

As modules for short-range communication, the communication unit 160 may include wireless LAN, Wi-Fi, Bluetooth, ZigBee, Wi-Fi Direct, ultra wideband (UWB), infrared data association (IrDA), Bluetooth Low Energy (BLE), Near Field Communication (NFC), and the like.

The communication unit 160 can transmit an input value, a cursor value, a click value, and the like generated by the key determination unit 120 to the outside using the communication module. Also, the communication unit 160 may receive three-dimensional position information from an external device through the communication modules described above.

The storage unit 165 may store data and information input to and output from the wearable device 100. For example, the storage unit 165 may store an input value, a cursor value, and a click value generated by the key determination unit 120. In addition, the storage unit 165 may store various types of program data or algorithm data that the wearable device 100 can execute.

The storage unit 165 may include at least one type of storage medium among a flash memory type, a multimedia card micro type, a card type memory (for example, SD or XD memory), a random access memory (RAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), and a programmable read-only memory (PROM). Also, the wearable device 100 may operate a web storage or a cloud server on the Internet that performs the storage function of the storage unit 165.

The power supply unit 170 supplies power for the operation of the wearable device 100. The power supply unit 170 may include various types of power supply means such as a Li-ion battery and a Li-polymer battery, and the wearable device 100 may include a plurality of power supply units 170. The power supply unit 170 may be connected by wire to the other components of the wearable device 100 to supply power, or may receive external power wirelessly through a wireless power transfer technique or the like. In addition, the power supply unit 170 may include a flexible battery that can be bent or unbent to a certain degree or more.

The controller 175 is connected to the components described above and controls the overall operation of the wearable device 100. For example, when the optical signal transmitted by the optical signal transmitting unit 105 is sensed as a reflected optical signal by the optical signal sensing unit 110, the controller 175 controls the data processing unit 115 to process the reflected optical signal and generate received data. The controller 175 may also control the key determining unit 120 to generate an input value based on the received data. Furthermore, the controller 175 may control the image output unit 140 to output an image of a fixed size at a fixed position by using the position determining unit 135, or may control the image processing unit 130 to generate a three-dimensional model that assists the key determining unit 120 and the image output unit 140 in performing their functions. That is, the controller 175 can control the various functions that allow the wearable device 100 to operate as an input means or an output means according to the user's actions.

Hereinafter, embodiments in which the wearable device 100 operates according to the movement of the user's body will be described. In the following, unless otherwise specified, the wearable device is illustrated as being worn in the form of a ring on the user's thumb. The wearable device 100 implemented in ring form may be mounted on the left hand, the right hand, or both hands, which can be achieved through simple design and structural changes. However, the wearable device 100 may also be implemented in various other forms, such as a glove, a bracelet, or a clip, and a specific example will be described with reference to FIG. 10.

In addition, the wearable device 100 may be implemented as two or more separate devices. That is, the components described in FIG. 1 may be distributed over any one, or two or more, of these separate wearable devices 100, and the separate devices may operate in conjunction with each other to exchange data. In other words, each wearable device 100 may include some or all of the components described in FIG. 1, and when it includes only some of them, it may operate in cooperation with another wearable device 100.

FIG. 2 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention. The graph shown in Fig. 2 shows the output of the optical signal transmitted by the wearable device according to the wavelength band.

As described above, the wearable device can generate and transmit optical signals of various wavelengths. Hereinafter, in connection with an embodiment of the present invention, a process in which the wearable device transmits optical signals of two different wavelengths will be described. For convenience of explanation, the wavelengths of the two optical signals are referred to as the first wavelength and the second wavelength. The first wavelength means a wavelength in the first frequency band BW1, and the second wavelength means a wavelength in the second frequency band BW2. For example, the first frequency band BW1 may be a near-infrared band, and the second frequency band BW2 may be a visible-light band. That is, the wearable device can generate and transmit a first optical signal having a first wavelength in the near-infrared range and a second optical signal having a second wavelength in the visible range. As another example, the first frequency band BW1 and the second frequency band BW2 may both be near-infrared bands; that is, the wearable device may generate and transmit two optical signals both having near-infrared wavelengths.

Meanwhile, in order to output the first optical signal and the second optical signal, the wearable device may generate an optical signal having a continuous spectrum of wavelengths, or may generate optical signals each having its own wavelength or wavelength band. Specifically, the wearable device can generate the first optical signal and the second optical signal, each having a different wavelength, as shown by the solid lines in FIG. 2. Alternatively, the wearable device may generate an optical signal having a relatively wide continuous spectrum of wavelengths, as shown by the dotted line in FIG. 2, and use a predetermined filter (for example, a band-pass filter) to output optical signals having the first wavelength and the second wavelength.

In the former case, the wearable device may include only one optical signal transmitting unit that generates both optical signals, or may include two or more optical signal transmitting units that each generate an optical signal of a different wavelength. In the latter case, the wearable device may likewise include a single optical signal transmitting unit or be implemented to include two or more optical signal transmitting units.

FIG. 3 is a diagram illustrating an operation process of a wearable device according to an embodiment of the present invention. FIG. 3 illustrates a process in which the wearable device transmits an optical signal to a target object that is a part of the body and detects the reflected optical signal. In FIG. 3, the target object 300 may be, for example, the skin of the hand, and the darkened area inside the target object 300 may be a blood vessel 310 inside the body.

First, as described with reference to FIG. 2, the wearable device generates and transmits optical signals having two or more different wavelengths. In the embodiment of FIG. 3, A1 and B1, having a wavelength of λ1, denote the first optical signal transmitted by the wearable device, and A2, having a wavelength of λ2, denotes the second optical signal. The wearable device generates these two optical signals of different wavelengths and transmits them to the skin of the body, which is the target object 300. In FIG. 3, the first optical signal has a wavelength in the near-infrared band and the second optical signal has a wavelength in the visible-light band.

Inside the human body there are skin tissues and blood vessels, each composed of different components. In particular, blood vessels contain red blood cells, which contain hemoglobin and give blood its red color. Hemoglobin is divided into oxygenated hemoglobin and deoxygenated hemoglobin. Oxygenated hemoglobin is abundant in the arteries and transports oxygen to the body tissues, while deoxygenated hemoglobin is abundant in the veins, which carry blood that has already delivered its oxygen to the tissues. In other words, arteries and veins have different physical properties because of the different types of hemoglobin they contain. In particular, the absorption rate of the hemoglobin contained in the veins varies with the wavelength of the light. Since a vein containing deoxygenated hemoglobin has a relatively high absorption rate for wavelengths in the near-infrared region (about 700 to 900 nm) compared with the surrounding tissue, the amount of scattering and reflection of a near-infrared optical signal differs greatly between the vein and its surroundings. That is, a near-infrared optical signal is strongly absorbed by the hemoglobin in the blood vessels, whereas in the surrounding tissue it is scattered rather than absorbed. Therefore, when a near-infrared optical signal irradiated onto the skin is reflected and received, the contrast difference between the blood vessels and the surrounding tissue caused by this difference in absorption can be observed, and this contrast difference can be processed into information on the vein pattern.

Meanwhile, the wearable device can detect the blood vessels of the body by using this difference in physical characteristics between the blood vessels (veins) and the other surrounding tissue. That is, the first optical signals A1 and B1 and the second optical signal A2 transmitted by the wearable device have different wavelengths. A part A1 of the first optical signal is reflected and scattered at the skin, which is the target object 300, while the remaining part B1 of the first optical signal passes through the skin and reaches the blood vessel 310 inside the target object 300, where it is scattered and reflected by the deoxygenated hemoglobin in the vein. The second optical signal A2 is reflected and scattered by the skin, similarly to the part A1 of the first optical signal. In other words, the first optical signal A1, B1 has a wavelength that can penetrate the target object 300, passing through the layers of the skin and being reflected, scattered, and absorbed at each layer, whereas the second optical signal A2 penetrates less, is mostly reflected and scattered at the skin surface, and therefore behaves similarly to the part A1 of the first optical signal.

The wearable device transmits the first optical signal (A1, B1) and the second optical signal (A2) to the object, and then receives the reflected optical signals returned from the object. The reflected optical signals include both the optical signal (A1 + B1), in which the first optical signal has been reflected from the skin and the vein, and the optical signal (A2), in which the second optical signal has been reflected from the skin. For convenience of explanation, the reflected optical signal (A1 + B1) produced by the first optical signal is referred to as the first reflected optical signal, and the reflected optical signal produced by the second optical signal is referred to as the second reflected optical signal (A2).

The wearable device generates reception data by processing the first reflected optical signal (A1 + B1), and this received data contains information about both the skin and the blood vessels of the object.

Then, the wearable device transmits the second optical signal A2, an optical signal having a wavelength different from that of the first optical signals A1 and B1, to the object. Because of its wavelength, the second optical signal behaves like the part A1 of the first optical signal and yields information about the skin surface only. That is, the second optical signal A2 is reflected by the skin of the object and received by the wearable device, and the reflected optical signal produced by the second optical signal is similar to part of the information contained in the first reflected optical signal (A1 + B1).

The wearable device generates received data by processing the second reflected optical signal A2; unlike the received data for the first reflected optical signal (A1 + B1), this received data contains information only about the skin of the object.

The wearable device compares the received data generated based on the first reflected optical signal (A1 + B1) with the received data generated based on the second reflected optical signal (A2). This comparison may consist of subtracting the data of the second reflected optical signal (A2) from the data of the first reflected optical signal (A1 + B1), i.e., taking the difference between the two sets of received data. By removing the influence of the second reflected optical signal (A2) from the data of the first reflected optical signal (A1 + B1), the wearable device can obtain only the information carried by the part (B1) of the first reflected optical signal. In other words, the wearable device removes the information about the skin from the first reflected optical signal (A1 + B1) and obtains the information about the blood vessel from its part (B1); the data generated by subtracting the data of the second reflected optical signal may therefore be treated as blood vessel data.
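The subtraction described above can be illustrated with a minimal sketch, assuming the two reflected signals have already been converted into co-registered two-dimensional intensity arrays; the function and variable names below are illustrative and not part of the original disclosure.

```python
import numpy as np

def extract_vessel_data(first_reflected: np.ndarray,
                        second_reflected: np.ndarray) -> np.ndarray:
    """Isolate blood-vessel information by removing the skin-only component.

    first_reflected  : intensity image from the NIR signal (skin + vessels, A1 + B1)
    second_reflected : intensity image from the visible signal (skin only, A2)
    Both arrays are assumed to have the same shape and to be co-registered.
    """
    f = first_reflected.astype(np.float64)
    s = second_reflected.astype(np.float64)
    # Normalize each channel so the skin contribution has a comparable scale.
    f /= f.max() if f.max() > 0 else 1.0
    s /= s.max() if s.max() > 0 else 1.0

    # Subtract the skin-only signal; what remains is dominated by the veins (B1).
    vessel = f - s
    # Clip negative residuals produced by noise or imperfect registration.
    return np.clip(vessel, 0.0, None)
```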

Specifically, as will be described later, the wearable device senses a key input operation of the user and generates an input value by using the information on the blood vessels contained in the blood vessel data. Accordingly, the wearable device must be able to extract the information about the blood vessels accurately. By transmitting optical signals of two different wavelengths as described above and calculating the difference between them, the wearable device can efficiently acquire only the information about the blood vessels.

In the above description, the wearable device receives the first reflected optical signal and the second reflected optical signal, respectively. Hereinafter, how the wearable device separates and detects two reflected optical signals of different wavelengths will be described in detail. The wearable device can receive the first reflected optical signal of the first wavelength and the second reflected optical signal of the second wavelength in three ways.

First, the wearable device can separate and detect the received reflected optical signals according to their wavelength. Since the wavelengths of the first reflected optical signal and the second reflected optical signal differ, the wearable device can receive the two reflected optical signals together and still process each of them separately. In other words, the wearable device can transmit the optical signals of the two wavelengths together and process the reflected optical signals by wavelength, even if the two reflected optical signals arrive together. For example, the wearable device may include a photodetector capable of discriminating and detecting an optical signal for each wavelength.

In this first example, the wearable device can selectively detect reflected optical signals of different wavelengths. Accordingly, the reflected optical signals can be distinguished and detected however the optical signals are transmitted: the first optical signal of the first wavelength and the second optical signal of the second wavelength may be transmitted alternately or simultaneously, or one optical signal may be transmitted continuously while the other is transmitted periodically or aperiodically.

Second, the wearable device can distinguish and detect the reflected optical signals in the time domain or the frequency domain. That is, the wearable device can transmit the optical signals of different wavelengths with a time difference, or transmit them with a different intensity for each wavelength. Unlike the first example, even if the received reflected optical signals cannot be distinguished by wavelength, the wearable device knows in advance at what time the optical signal of each wavelength is transmitted, so it can infer to which wavelength a received reflected optical signal belongs.

In this second example, the wearable device can transmit the first optical signal of the first wavelength and the second optical signal of the second wavelength alternately. Since the wearable device knows in advance which wavelength each sequentially received reflected optical signal corresponds to, the burden of distinguishing the reflected optical signals by wavelength is reduced. In this embodiment, the wearable device may also transmit one optical signal continuously and the other optical signal periodically or aperiodically instead of alternating the two.
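A minimal sketch of the time-domain separation described here, assuming the transmitter alternates the two wavelengths on a strict sample-by-sample schedule known to the receiver; the sampling convention is an assumption for illustration.

```python
import numpy as np

def demultiplex_by_schedule(samples: np.ndarray, period: int = 2):
    """Split an interleaved photodetector stream into per-wavelength streams.

    Assumes the transmitter alternates the two wavelengths every sample, so
    even-indexed samples belong to the first optical signal and odd-indexed
    samples to the second (period = 2).
    """
    first = samples[0::period]   # reflections of the first (e.g. near-infrared) signal
    second = samples[1::period]  # reflections of the second (e.g. visible) signal
    return first, second
```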

Third, the optical signals may be transmitted with different intensities. The wearable device can transmit the different optical signals with different output intensities, and this embodiment can be applied in combination with the first and second embodiments described above. Because the intensity difference between the reflected optical signals is then relatively large, the wearable device can detect the reflected optical signals in the time or frequency domain more efficiently.

In the above description, the wearable device transmits the first optical signal and the second optical signal and analyzes the reflected optical signals. However, the optical signals generated and received by the wearable device may be influenced by the ambient and natural light of the environment in which the wearable device operates. For example, when the wearable device generates a second optical signal with a visible-light wavelength and transmits it to the object, the reflected optical signal of the second optical signal may be mixed with an optical signal produced by sunlight reflecting off the object. Therefore, a process for removing such noise may be required.

There may be various embodiments for eliminating the influence of external light. First, the wearable device can operate so as to exclude external factors such as natural light, indoor light, and light from a beam projector. That is, before the optical signal transmitting unit transmits an optical signal, the wearable device records the light sensed by the optical signal sensing unit as external light. Then, by removing this external-light contribution from the reflected optical signal detected after the optical signal is transmitted, the wearable device can obtain only the reflected optical signal produced by its own optical signal.
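A hedged sketch of this baseline approach, assuming the device can capture one frame with its emitter off immediately before each measurement; the frame representation and names are assumptions.

```python
import numpy as np

def remove_ambient(reflected: np.ndarray, ambient: np.ndarray) -> np.ndarray:
    """Subtract an ambient-light baseline captured while the emitter was off.

    reflected : frame captured while the device's optical signal was on
    ambient   : frame captured just before transmission (external light only)
    """
    corrected = reflected.astype(np.float64) - ambient.astype(np.float64)
    # Negative values are measurement noise; keep only the device's own signal.
    return np.clip(corrected, 0.0, None)
```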

Second, the wearable device may use external light instead of eliminating its influence. That is, when the wearable device uses an optical signal of near-infrared wavelength as the first optical signal and an optical signal of visible wavelength as the second optical signal, it may either generate the first and second optical signals directly or selectively receive external light of those wavelengths. More specifically, the wavelengths that the wearable device would otherwise generate can also be supplied by external light. In this case, the wearable device can filter the reflected optical signal produced when external light reflects off the object and selectively receive only the reflected optical signal of the predetermined wavelength. Accordingly, the wearable device can obtain the same or a similar result by using external light even without generating the optical signal itself. However, when external light is used, the optical signal of the desired wavelength may not be received in sufficient quantity, so the wearable device may analyze the received external light and additionally generate and transmit an optical signal of the required wavelength to supplement it.

As a result, when the wearable device needs to receive a reflected optical signal of a specific wavelength, it can either generate the optical signal directly and transmit it to the object, or achieve the same result by selectively receiving external light.

In the foregoing, embodiments of the present invention have been described using terms such as a first optical signal, a second optical signal, a first reflected optical signal, and a second reflected optical signal. However, names such as 'first' and 'second' are merely terms for distinguishing the respective concepts, and the contents of the invention are not limited by these terms.

Meanwhile, the first optical signal and the second optical signal described above may be optical signals of the near-infrared region and the visible-light region, respectively. However, the present invention is not limited to this embodiment, and both the first optical signal and the second optical signal may be optical signals of the near-infrared region or the infrared region. That is, as long as the wavelength bands of the two optical signals differ, the wearable device can transmit the two optical signals, receive the reflected optical signals, and obtain information about the blood vessel. Since the absorbance/scattering/reflectance of the skin, the blood vessel, and the surrounding tissue depend on the wavelength of the optical signals, the first and second reflected optical signals in the near-infrared or infrared region contain different biometric information, and data on the blood vessel pattern can be obtained by comparing/analyzing/combining this information. In other words, the wearable device acquires information about the blood vessel by transmitting two or more optical signals, and the frequency band and kind of the optical signals are not limited. Therefore, although the near-infrared region and the visible-light region are used as examples above and hereinafter, the same contents may be applied to an embodiment in which optical signals of other frequency bands are used.

According to another embodiment, the wearable device can acquire information on the blood vessel with only one optical signal instead of two. As described above, an optical signal in the near-infrared region (700 nm to 900 nm) is absorbed and scattered differently by blood vessels and by surrounding tissues, and the reflectance also varies by skin tissue layer according to the spectrum and wavelength of the optical signal. The wearable device can therefore compare and analyze/combine this information, identify the contrast difference between the blood vessel and the surrounding tissue, and grasp the pattern of the blood vessel. Of course, a method in which the wearable device transmits and receives three or more optical signals is also applicable. When two or more optical signals are used, the optical signals may differ in wavelength, spectrum, transmission time (point in time), reception time (point in time), frequency, or polarization state.
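As an illustration of the single-signal case, a near-infrared reflectance image could be binarized by local contrast, since veins appear darker than the surrounding tissue; this sketch uses an adaptive mean threshold and assumed tuning values, not the patent's actual algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def vessel_pattern_from_nir(nir_image: np.ndarray, window: int = 15,
                            offset: float = 0.02) -> np.ndarray:
    """Binarize a single NIR reflectance image into a vein pattern.

    Veins absorb more NIR light than surrounding tissue, so they appear as
    locally darker pixels; an adaptive threshold on the local mean brightness
    marks them. The window size and offset are tuning assumptions.
    """
    img = nir_image.astype(np.float64)
    img /= img.max() if img.max() > 0 else 1.0
    local_mean = uniform_filter(img, size=window)
    # True where the pixel is darker than its neighbourhood, i.e. a likely vein.
    return img < (local_mean - offset)
```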

In general, the wearable device can transmit a plurality of optical signals having different wavelengths in the visible-light and infrared regions, receive the reflected optical signals, and compare/analyze/combine them by wavelength to obtain image data for the blood vessel and the surrounding tissue.

Alternatively, the wearable device may acquire image data of the blood vessel and the surrounding tissue by transmitting and receiving only one optical signal in the near-infrared region, and thereby identify the pattern of the blood vessel. As a result, the wearable device can transmit and receive one or more optical signals to obtain information about the blood vessel.

In a preferred embodiment, the wearable device transmits and receives two or more optical signals in the process of acquiring blood vessel data for the first time, and then transmits and receives one optical signal in the process of sensing the key input operation of the user. Conversely, the wearable device may generate pattern information for the blood vessel using only one optical signal when acquiring the blood vessel data for the first time, and then use two or more different optical signals while tracking the movement of the user.

Referring to FIGS. 4 to 7, the process in which the wearable device acquires information on the blood vessels and generates an input value according to the procedure described above will now be described. FIG. 4 is a diagram illustrating an operation process of a wearable device according to an embodiment of the present invention. FIG. 4 illustrates an embodiment in which the wearable device 100 is mounted on the first node of the user's left thumb.

As described above, the wearable device 100 includes the optical signal transmitting unit 105 and the optical signal sensing unit 110. In FIG. 4, the optical signal transmitting unit 105 and the optical signal sensing unit 110 are arranged adjacent to each other, and the other functional modules, not labeled with reference numerals, are arranged in a line. However, this is merely an example for convenience of explanation, and the arrangement of the components included in the wearable device 100 is not limited to this configuration.

That is, the optical signal transmitting unit 105 and the optical signal sensing unit 110 may be adjacent within the wearable device 100 or may be disposed apart from each other by a predetermined distance. When the wearable device 100 is implemented as two separate parts, the optical signal transmitting unit 105 and the optical signal sensing unit 110 may each be included in one of the parts. In addition, although the wearable device 100 shown in FIG. 4 is implemented in a ring shape, the wearable device 100 is not limited to this embodiment, as described above.

The wearable device 100 is mounted on the user's body, senses the user's hand as the object 410, and acquires information on the blood vessel 400 inside the object 410. That is, the wearable device 100 transmits an optical signal toward the object 410, which is a part of the user's body, receives the reflected optical signal reflected from the outside or the inside of the object 410, and thereby obtains information about the blood vessel 400.

FIG. 5 is a view for explaining the operation of the wearable device according to an embodiment of the present invention. FIG. 5 illustrates a process in which the wearable device creates a three-dimensional model of the user's hand.

First, the embodiment shown on the left side of FIG. 5 will be described. The depth sensor of the wearable device senses the user's hand, which is the object, three-dimensionally and generates three-dimensional scan information. As shown, the wearable device may be mounted on another body part (e.g., the right thumb) instead of the left thumb in order to scan the user's entire left hand. The user can move the right hand equipped with the wearable device around the left hand and let the depth sensor scan the left hand in three dimensions.

Meanwhile, for the wearable device, information about the palm surface is more important than information about the back of the hand. Accordingly, in order for the wearable device to acquire accurate three-dimensional scan information on the user's palm surface, the user may scan slowly while the depth sensor faces the palm of the left hand and scan more quickly while it faces the back of the hand. Alternatively, if accurate three-dimensional information on the back of the hand is not needed, the user may omit the three-dimensional scan of the back of the hand.

The depth sensor generates three-dimensional scan information for the user's hand and transmits the corresponding information to the image processing unit. The image processing unit analyzes and processes the three-dimensional scan information to generate a three-dimensional model 500 of the user's left hand. The three-dimensional model 500 may be a three-dimensional image or may be generated through a three-dimensional rendering process.

Meanwhile, the three-dimensional model 500 generated from the three-dimensional scan information produced by the depth sensor may not contain enough information about the blood vessels required by the wearable device. Blood vessels appear only faintly on the surface of the palm, so the scan alone may not capture them well enough for the wearable device to generate input values. That is, the depth sensor can accurately measure the external shape of the user's hand, but may not be able to detect the distribution and arrangement of the blood vessels.

Accordingly, a process of adding a pattern to the three-dimensional model 500 is performed, as shown on the right side of FIG. 5. As described with reference to FIGS. 2 and 3, the wearable device acquires information on the blood vessels of the palm using the optical signal transmitting unit and the optical signal sensing unit, and the obtained information is turned into a pattern 520 for the blood vessels.

The process in which the wearable device senses the blood vessels and generates the pattern information may be performed simultaneously with, or separately from, the process in which the depth sensor generates the three-dimensional scan information. That is, while the depth sensor recognizes the hand of the object in three dimensions and generates the three-dimensional scan information, the optical signal sensing unit can sense the blood vessels and the data processing unit can generate the pattern information. In this case, the three-dimensional scan information and the pattern information about the blood vessels are transmitted to the image processing unit, which processes the two kinds of information in turn to generate the three-dimensional model. In this embodiment, the three-dimensional model 510 to which the pattern 520 is added can be generated in a single scanning process.

Alternatively, after the depth sensor generates the three-dimensional scan information by scanning the hand of the object and the image processing unit generates the three-dimensional model from it, the optical signal sensing unit and the data processing unit may additionally perform the process of generating the pattern information. In this case, the wearable device must scan the object twice. That is, in the former case both the three-dimensional scan information and the pattern information are generated in one scanning process, whereas in the latter case the three-dimensional scan information is generated in the first scan and the pattern information in the second scan. In the latter case, the image processing unit generates the three-dimensional model first and then processes the pattern information it receives.

The pattern information generated by the data processing unit is transferred to the image processing unit, and the process of applying the pattern 520 to the three-dimensional model 500 is performed. That is, the image processing unit applies the pattern 520 to the three-dimensional model 500 generated from the three-dimensional scan information, thereby generating the three-dimensional model 510 to which the pattern is added. Meanwhile, the pattern 520 contains information on the blood vessels. Blood vessels (for example, veins) can be regarded as three-dimensional structures whose depth and thickness vary with their position in the body, such as the palm and the fingers. Accordingly, the pattern 520 for the blood vessels added to the three-dimensional model 500 can carry three-dimensional information (depth, thickness, directionality, etc.) beneath and on the skin surface.
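One way the pattern information might be attached to the three-dimensional model is sketched below, assuming the pattern has been decomposed into vein segments carrying depth, thickness, and direction; the class and field names are illustrative assumptions, not the patent's data structures.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class VeinSegment:
    surface_uv: Tuple[float, float]         # position on the model's skin surface
    depth: float                            # depth below the skin surface
    thickness: float                        # vessel thickness
    direction: Tuple[float, float, float]   # local 3-D direction of the vessel

@dataclass
class HandModel:
    vertices: list = field(default_factory=list)          # 3-D scan geometry
    vein_pattern: List[VeinSegment] = field(default_factory=list)

    def add_pattern(self, segments: List[VeinSegment]) -> None:
        """Attach the vein pattern produced by the data processing unit."""
        self.vein_pattern.extend(segments)
```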

FIGS. 6 and 7 illustrate an embodiment in which the wearable device analyzes the user's operation using the three-dimensional model and the blood vessel pattern. FIG. 6 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention.

As described with reference to FIG. 5, the image processing unit generates a three-dimensional model of the object (for example, the user's left hand) using the three-dimensional scan information and the pattern information. After this initial process of generating the three-dimensional model, the wearable device 100 continuously senses the user's blood vessels. That is, the optical signal sensing unit continuously receives the reflected optical signals reflected from the object, and the data processing unit continuously analyzes the reflected optical signals of different wavelengths to detect changes in the external shape of the blood vessels. For example, the wearable device 100 senses a change in the external shape of a blood vessel located at the palm or a node of a finger, or of a blood vessel located at a joint connecting the palm and a finger.

When the user performs a key input operation (i.e., typing), the position and arrangement of the blood vessels of the hand in space change according to the movement of the fingers. For example, as a finger is bent, the angle between the first and second nodes of the finger becomes smaller and the two nodes come to lie adjacent to each other. Accordingly, the distribution and arrangement of the blood vessels detected by the optical signal sensing unit of the wearable device 100 change.

Meanwhile, the wearable device 100 can compare the detected physical characteristics with the pattern of the three-dimensional model. That is, the wearable device 100 has already generated the three-dimensional model using the pattern information on the user's blood vessels. Accordingly, when the positions of the blood vessels change due to the user's key input operation, the information on the detected blood vessels can be compared with the blood vessel pattern of the generated three-dimensional model.

For example, the wearable device 100 stores the blood vessel pattern of the user's palm created in FIGS. 4 and 5 together with the three-dimensional model while detecting the user's key input operation. When the user moves the left hand 600 and presses a certain key, the wearable device 100 senses the change of the blood vessels of the left hand 600. The optical signal sensing unit of the wearable device 100 can continuously track the position and arrangement of the blood vessels by continuously sensing the reflected optical signal reflected from the blood vessels of the left hand 600, and can compare them with the pattern stored in the three-dimensional model.

From the change of the blood vessels, the wearable device 100 can determine how the left hand 600 is moving and can calculate the change of the angle formed by the first and second nodes of a finger as the finger is bent. This calculation is performed by comparing the change of the blood vessels with the pattern of the pre-stored three-dimensional model. As a result, the position of the fingertip in three-dimensional space according to the key input operation is calculated. The three-dimensional position of the finger determines which predetermined key is matched with the key input operation, and the wearable device 100 generates that key as an input value. This will be described in detail with reference to FIG. 7.

In summary, the wearable device 100 continuously detects the user's blood vessels and compares their detected arrangement, distribution, and position with the pre-stored three-dimensional model. From this comparison, the wearable device 100 can calculate the predetermined input value matching the key input operation.
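A hedged sketch of such a comparison, assuming the sensed vessel arrangement and the stored reference patterns are fixed-size numeric arrays collected during calibration; the labels and the distance metric are illustrative assumptions, not the patent's matching method.

```python
import numpy as np
from typing import Dict, Optional

def match_pose(observed_pattern: np.ndarray,
               reference_patterns: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the label whose stored vessel pattern is closest to the observation.

    reference_patterns maps a label (e.g. 'key_f', 'key_j') to the vessel
    pattern recorded for that finger pose during calibration. All patterns
    are assumed to be arrays of identical shape.
    """
    best_label, best_dist = None, float("inf")
    for label, reference in reference_patterns.items():
        dist = float(np.linalg.norm(observed_pattern - reference))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label
```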

FIG. 7 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention. In FIG. 7, the x/y/z axes represent three-dimensional space, and the lines connecting the origin, P1, P2, P3, and P4 represent the skeleton of the user's wrist and finger when the object is the user's hand. P1 denotes the joint between the palm and the first node of the finger, P2 the joint between the first node and the second node, P3 the joint between the second node and the third node, and P4 the fingertip.

As described with reference to FIG. 6, the wearable device can calculate the three-dimensional position and bend angle of the joint connecting the first and second nodes of the user's finger. That is, the wearable device can calculate the angle θ2 and the three-dimensional position of P2 in FIG. 7. Meanwhile, since the wearable device has generated and stored a three-dimensional model of the user's hand, calculating the three-dimensional position of P2 amounts to calculating the distance d1 from the center of the wrist to P2.

Similarly to the case of P2, the wearable device can calculate θ1 and the three-dimensional position of the joint P1 between the palm and the first node. Alternatively, the wearable device may have calculated the distance from the center of the wrist to the joint between the palm and the first node, that is, the position of P1, in advance while generating the three-dimensional model. In this case θ1 can be calculated through comparison of the blood vessel patterns in a manner similar to θ2. That is, the wearable device can calculate the position and bend angle of each joint by comparing the distribution, position, size, and shape change of the blood vessels at each joint with the pre-stored pattern.

Meanwhile, assuming that the user's hand bends according to natural motion, if the coordinates of P1 and P2 and the angles θ1 and θ2 are known, the coordinates of P3 and P4 and the angle θ3 can all be calculated. This is an empirical method and can be regarded as estimation based on experience. However, as long as the user does not consciously bend the finger joints at an unnatural angle, the coordinates of P3 and the angle θ3 can be determined with high accuracy from the relationship of P1, P2, θ1, and θ2, and likewise the positional information of P4 can be accurately estimated from P1, P2, P3, θ1, θ2, and θ3.
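A simplified planar sketch of this estimation, assuming known segment lengths from the three-dimensional model and an empirical coupling between the second and third joint angles; the 2/3 coupling factor and the angle convention (0 meaning fully extended) are assumptions for illustration, not the patent's estimation rule.

```python
import numpy as np

def estimate_fingertip(p1: np.ndarray, l1: float, l2: float, l3: float,
                       theta1: float, theta2: float) -> np.ndarray:
    """Estimate the fingertip position P4 in the finger's bending plane.

    p1             : 2-D position of the palm/first-node joint (P1)
    l1, l2, l3     : lengths of the three finger segments (from the 3-D model)
    theta1, theta2 : measured bend angles at P1 and P2, in radians,
                     where 0 means fully extended
    theta3 is not measured; it is approximated by the empirical coupling
    theta3 ~= (2/3) * theta2 often assumed for natural finger motion.
    """
    theta3 = (2.0 / 3.0) * theta2

    # Accumulate bend angles along the finger; the extended finger points along +x.
    a1 = theta1
    a2 = theta1 + theta2
    a3 = theta1 + theta2 + theta3

    p2 = p1 + l1 * np.array([np.cos(a1), np.sin(a1)])
    p3 = p2 + l2 * np.array([np.cos(a2), np.sin(a2)])
    p4 = p3 + l3 * np.array([np.cos(a3), np.sin(a3)])
    return p4
```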

In the above process, the range of θ1, θ2, and θ3 may be an issue. That is, θ1, θ2, and θ3 should be measured within 180 degrees. When the user lifts a finger high, the joint connecting the palm and the first node may be measured at 180 degrees or more, but such an angle is far from a normal key input operation. Accordingly, in measuring the angles θ1, θ2, and θ3 of the finger joints, the wearable device may accept as meaningful only values within 180 degrees. The wearable device may be implemented to ignore values measured at 180 degrees or more, or conversely, measurements of 180 degrees or more may be mapped to a specific operation and processed separately.

Meanwhile, there are various methods for improving the accuracy of the estimation process. For example, after the three-dimensional model of the hand is first generated, the wearable device can instruct the user to perform an operation for inputting a specific key. By sensing this operation and estimating P3, P4, and θ3 for it, the wearable device learns in advance which values need to be compensated. That is, software compensation can be applied in the process of calculating the input value for the user's key input operation.

Alternatively, the wearable device may measure the three-dimensional position of P3 and the angle θ3 directly. That is, the optical signal sensing unit and the data processing unit can compare the blood vessels near the joint connecting the second and third nodes of the finger with the blood vessel pattern of the three-dimensional model and measure the three-dimensional position of that joint. In this case, since the wearable device directly measures P1, P2, P3, θ1, θ2, θ3, and d2, the accuracy of estimating P4 increases greatly. The software compensation described above may also be combined with this method of directly measuring P3 and θ3.

As a result, the wearable device senses the key input operation as the user types and generates an input value by determining which key the key input operation matches. The input value can be transmitted to an external device or a server connected to the wearable device, so that the wearable device operates as an input means.

Hereinabove, an embodiment has been described in which the key input operations of the user's index, middle, ring, and little fingers are detected. Meanwhile, the wearable device should also be able to detect the key input operation of the thumb. First, the case where the wearable device is mounted on the thumb will be described. The wearable device can directly measure the position of the thumb, the finger on which it is mounted, or estimate it indirectly.

When the wearable device is mounted on the thumb and directly measures the key input operation of the thumb, the optical signal sensing unit and the data processing unit sense the blood vessels at an angle sufficient to recognize the position of the thumb tip. Accordingly, the wearable device can calculate the three-dimensional position information of the tip of the thumb on which it is mounted. In addition, the wearable device can also calculate, from the position of the thumb tip, how far along the thumb it is mounted.

Alternatively, when the wearable device indirectly measures the key input operation of the thumb, it can estimate the position of the thumb on which it is mounted from the positions of the joints of the other four fingers. That is, the wearable device can estimate its own three-dimensional position from the P1 and P2 positions of the other four fingers. When the wearable device estimates its position using P1 or P2, it uses four pieces of position information, one per finger; when it uses both P1 and P2, it uses eight pieces of position information. Since this gives the wearable device enough information to specify its position in three-dimensional space, it can estimate the position of the thumb on which it is mounted from the position information of the other four fingers' joints. These two ways of measuring/estimating its own position on the thumb can be applied similarly when the wearable device is mounted on the index, middle, ring, or little finger. That is, even then the wearable device can determine the position of the tip of the finger on which it is mounted.

Meanwhile, when the wearable device is mounted on a finger other than the thumb and detects the thumb, a separate process for measuring the position and angle of the thumb is required, because the thumb has a different structure from the other four fingers.

Unlike the other four fingers, the thumb has only the joint where the palm and the first node are connected and the joint where the first and second nodes are connected. That is, the wearable device can determine the position of the thumb tip from the positions of just these two joints. Accordingly, when the wearable device is mounted on a finger other than the thumb, the point P3 computed from P1, P2, θ1, and θ2 for the thumb is already the position of the fingertip. The wearable device can therefore measure the position of the thumb tip with higher accuracy than for the other four fingers.

In the above description, the wearable device senses the blood vessels of the joint parts of the fingers, compares them with the previously stored pattern, detects the key input operation, and calculates the three-dimensional position of the fingertip. As described above, the three-dimensional position of the fingertip is matched with a specific input value, so the wearable device can identify which key was pressed by the key input operation from the three-dimensional position of the finger and generate the identified key as an input value.

Hereinafter, unlike the above description, an embodiment in which the wearable device senses the blood vessels near the nodes of the fingers will be described. That is, the wearable device can determine the three-dimensional position of the fingertip by detecting the blood vessels of the finger nodes as well as the finger joints. For example, the wearable device can sense the position and angle (P1, θ1) of the joint connecting the palm and the first node of the finger by detecting the arrangement and distribution of the blood vessels of the palm and the first node, and can sense the position and angle (P2, θ2) of the joint connecting the first and second nodes by detecting the blood vessels of those nodes. The process of estimating the position of the fingertip from the positions of the two joints is the same as in the embodiment described above.

Furthermore, the wearable device can sense the positions of the joints even if only one finger node is detected. Since the pattern information of the blood vessels added to the three-dimensional model can be three-dimensional, it can include information about the thickness and slope of the blood vessels. Accordingly, the wearable device can recognize the position of another joint by sensing the blood vessels in a single finger node and comparing them with the previously stored pattern. In this regard, as the finger joints bend, not only the arrangement and position of the blood vessels change, but also their brightness and saturation: as the fingers bend, the skin folds and wrinkles form. As a result, the wearable device can determine the position of the fingertip by also considering the transparency, lightness, and saturation of the sensed blood vessels.

FIG. 8 is a view for explaining an operation procedure of a wearable device according to an embodiment of the present invention. FIG. 8 illustrates an embodiment in which the wearable device detects a mouse input operation and a mouse click operation of the user and generates a cursor value and a click value.

When the wearable device 100 moves in space, the gyroscope sensor 150 and the acceleration sensor 160 measure the positional change, acceleration change, and tilt change of the wearable device 100 in space. Accordingly, the wearable device 100 can detect a mouse input operation in which the user moves the hand in space while wearing the wearable device 100. In addition, the wearable device 100 can also detect a mouse click operation in which the index finger or the middle finger touches the thumb during a mouse input operation.

First, an embodiment in which the wearable device 100 detects a mouse input operation and generates a cursor value will be described. The gyroscope sensor 150 and the acceleration sensor 160 sense the user's mouse input operation when a change in the position of the wearable device 100 in space is measured. The user's mouse input operation is an operation by which the user moves the mouse cursor from one position to another, and it is matched with a predetermined cursor value. The cursor value can be transferred to an external device or server to specify the movement direction and movement amount of the mouse cursor.

Meanwhile, the reference position for the cursor value may be the position 860 of the center of the user's palm. That is, the cursor value according to the mouse input operation can be determined based on the position 860 of the center of the palm. This is because the center of the palm is affected only slightly even when the user moves the hand and fingers. That is, when the index and middle fingers are bent in the mouse click operation described later, other parts of the hand are poor reference positions because the muscles connected to the fingers also move, whereas the position 860 of the center of the palm remains relatively constant even when a mouse click operation occurs during the mouse input operation. Accordingly, the wearable device 100 generates the cursor value based on the change of the position 860 of the center of the palm. Alternatively, the wearable device 100 may use the position of the center of the back of the hand or the position of the thumb as the reference position for the cursor value instead of the position 860 of the center of the palm.
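A minimal sketch of how a cursor value could be derived from the palm-center reference position; the gain constant and the mapping onto two screen axes are illustrative assumptions.

```python
import numpy as np

def cursor_delta(prev_palm_center: np.ndarray,
                 palm_center: np.ndarray,
                 gain: float = 1000.0) -> np.ndarray:
    """Convert the change of the palm-center position into a 2-D cursor value.

    The palm center is used as the reference because it moves little during
    click gestures. Only the x/y components are mapped onto the screen; the
    gain (cursor counts per metre of hand motion) is an assumed tuning value.
    """
    delta = (palm_center - prev_palm_center)[:2]
    return gain * delta
```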

Next, the mouse click operation will be described. The finger recognition unit 120 of the wearable device 100 can continuously detect the positions 810, 820, 830, and 840 of the fingertips while the mouse input operation is being detected. This process can be performed according to the embodiments described with reference to FIGS. 5 to 7.

Meanwhile, while the wearable device 100 moves in space, it senses whether the index finger or the middle finger touches the thumb during the mouse input operation. This can be determined from whether the position 810 of the index fingertip or the position 820 of the middle fingertip comes within a predetermined distance of the wearable device 100. That is, when the wearable device 100 is mounted on the thumb, the index finger or the middle finger touching the thumb means that the position 810 or 820 of that fingertip is within the predetermined distance of the wearable device 100. Accordingly, the wearable device 100 can detect a mouse click operation based on the positions 810 and 820 of the two fingertips.

Alternatively, the wearable device 100 may detect a mouse click operation by comparing the positions 810 and 820 of the two fingertips with the position 850 of the thumb tip. That is, as described with reference to FIG. 7, the wearable device 100 can also determine the position of the thumb on which it is mounted. Accordingly, the wearable device 100 may detect the mouse click operation by comparing the three fingertip positions 810, 820, and 850.

Then, the wearable device 100 generates a click value matching the mouse click operation. The click value may include a left-click value matched with the mouse click operation in which the index finger touches the thumb, and a right-click value matched with the mouse click operation in which the middle finger touches the thumb. In addition, the wearable device 100 may treat the case where both fingers touch the thumb as a separate click value.
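A hedged sketch of how the click values could be derived from fingertip-to-thumb distances; the distance threshold and the label strings are illustrative assumptions.

```python
import numpy as np

def classify_click(index_tip: np.ndarray, middle_tip: np.ndarray,
                   thumb_tip: np.ndarray, threshold: float = 0.015) -> str:
    """Map fingertip-to-thumb distances onto a click value.

    A fingertip closer to the thumb tip than `threshold` (metres, an assumed
    tuning value) counts as touching. Index -> left click, middle -> right
    click, both -> a separate combined click value, neither -> no click.
    """
    index_touch = np.linalg.norm(index_tip - thumb_tip) < threshold
    middle_touch = np.linalg.norm(middle_tip - thumb_tip) < threshold
    if index_touch and middle_touch:
        return "both_click"
    if index_touch:
        return "left_click"
    if middle_touch:
        return "right_click"
    return "no_click"
```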

The wearable device 100 transmits the generated click value to an external device or server. Since the mouse click operation is detected while the mouse input operation is being detected, the click value is transmitted together with the cursor value of the mouse input operation. Accordingly, the external device or server can move the mouse cursor and register clicks at the same time. That is, the wearable device 100 can serve as a 'space mouse', an input means that controls the mouse through movement in space.

FIG. 9 is a view for explaining the operation of the wearable device according to an embodiment of the present invention. Referring to FIG. 9, an embodiment in which the wearable device 100 outputs images to the outside will be described in detail.

As described with reference to FIG. 1, the wearable device 100 includes an image output unit 140, and the image output unit 140 can output an image at a fixed position and size. As shown in FIG. 9, the wearable device 100 mounted on the user's finger, which is the object 900, may output the image 910 to the outside, and the image 910 may be output onto various surfaces, such as a part of the body or an external object.

Meanwhile, when the wearable device 100 outputs the image 910, it may analyze the blood vessel information and output the image 910 at an externally fixed position and size. That is, the wearable device 100 compares the blood vessel pattern of the previously stored three-dimensional model with the information on the blood vessels sensed as the user moves, and can thereby measure the distance and angle between itself and the object. In other words, the wearable device 100 calculates its relative positional relationship to the external object 900 onto which the image 910 is output, and from this can calculate at what size and angle the output image 910 lands on the object 900.

Then, the wearable device 100 continuously detects the user's motion and outputs the image 910 at a fixed position and size. That is, when the arrangement and distribution of the blood vessels detected on the object 900 change according to the user's movement, the wearable device 100 continuously corrects the angle and size of the image 910 projected onto the object 900. Accordingly, the wearable device 100 can output the image 910 at a fixed position and size even if its relative positional relationship with the object 900 changes as the user moves.
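As a rough illustration of this correction, the projected image grows with distance and stretches with surface tilt, so the device could pre-scale the content accordingly; the first-order relations below are assumptions for illustration, not the patent's actual correction method.

```python
import numpy as np

def projection_correction(distance: float, tilt_deg: float,
                          ref_distance: float = 0.10) -> dict:
    """Compute simple scale and keystone corrections for the image output unit.

    The projected image grows roughly linearly with distance, so the content
    is pre-scaled by ref_distance / distance to keep a constant apparent size;
    a tilt of the target surface is compensated by stretching the image along
    the tilt axis by 1 / cos(tilt). Both relations are first-order sketches.
    """
    scale = ref_distance / distance
    keystone_stretch = 1.0 / np.cos(np.radians(tilt_deg))
    return {"scale": scale, "keystone_stretch": keystone_stretch}
```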

In the embodiment described above, the wearable device 100 outputs the image 910 at a fixed position using its pre-stored three-dimensional model. However, the wearable device 100 can also output the image 910 in a fixed manner without generating a three-dimensional model.

That is, the wearable device 100 may analyze only the two-dimensional pattern of the sensed blood vessels without using the previously stored three-dimensional model. Even when only the blood vessel pattern is analyzed, the wearable device 100 can output the image 910 by comparing it with previously stored blood vessel pattern information. Furthermore, the wearable device 100 can determine its own position in space using the gyroscope sensor and the acceleration sensor, and can also output a fixed image based on that position.

Meanwhile, in the process of outputting the image at a fixed position, the wearable device 100 can use the reflected-optical-signal sensing methods described above. That is, the wearable device 100 can measure the distance and angle to the external object by transmitting a patterned optical signal while outputting the image and receiving its reflection. The optical signal transmitted during image output may have the same wavelength as the image (for example, visible light) or a different wavelength. When the wearable device 100 uses the same wavelength as the image, it can receive the reflected optical signal by using its prior knowledge of when the optical signal is output.

As another example, the wearable device 100 knows the content of the image 910 it outputs and can make use of that knowledge. That is, the wearable device 100 can calculate how the image 910 it outputs would appear when projected at a given distance and angle, treat the image 910 actually projected onto the object 900 as a visible-light reflection signal, and calculate the difference between the two. By correcting for this difference, the wearable device 100 can determine the relative position and angle between itself and the object 900.

As another example, the wearable device 100 may output the image at a fixed position using the depth sensor described above instead of sensing the reflected optical signal. That is, the wearable device 100 can use the patterned optical signal in two ways in the process of detecting the object with the depth sensor. Specifically, the wearable device 100 may know the timing and frequency band of the patterned optical signal beforehand, and thus know in advance when to receive it. In this case, the depth sensor of the wearable device 100 can calculate the distance and angle to the object from the stored timing and frequency-band information. Conversely, the wearable device 100 may receive the optical signal without such prior knowledge by using means that selectively sense the wavelength band of the patterned optical signal.

As another example, the wearable device 100 may output a predetermined pattern 920 along the border of the image 910, as shown in FIG. 9. This pattern 920 is different from the blood vessel pattern described above and is simply a marker or identifiable mark added to the periphery of the image 910. By adding the pattern 920 and analyzing the shape, size, and degree of inclination of the pattern 920 projected onto the object 900, the wearable device 100 can determine the distance, angle, and positional relationship between itself and the object 900.

FIG. 10 is a view showing a wearable device according to another embodiment of the present invention. Although the embodiments described above implement the wearable device 100 in a ring form, the wearable device 100 is not limited to this example, as noted earlier. That is, the wearable device 100 may take the form of a glove worn on the user's palm, a bracelet surrounding the wrist or arm as shown in FIG. 10(a), a clip shape, or a necklace shape. Regardless of its form, the wearable device 100 can detect a key input operation, a mouse input operation, a mouse click operation, and the like of the user. That is, although not shown, the wearable device 100 may also be implemented in the form of a necklace or the like and still perform the same operations as the other implementations.

In the embodiment of FIG. 10(a), the wearable device 100 is implemented as a bracelet wrapped around the user's wrist and detects the user's key input operation, mouse input operation, mouse click operation, and the like. In this embodiment, the wearable device 100 may output images 1010 and 1020 onto the user's palm, hand, or arm.

In the embodiment of FIG. 10(b), the wearable device 100 is implemented in the form of a clip mounted on a sleeve. The wearable device 100 may output the image onto the user's palm, the back of the hand, or the like, and may also output the image onto an object (e.g., clothes) that is not a part of the user's body. Although not shown, the wearable device 100 may also be mounted on a necktie, a collar, or the like.

As described above, regardless of the implementation of the wearable device 100, the wearable device 100 can operate in the same or similar form as the embodiments described above.

In the foregoing, embodiments have been described in which the wearable device 100 senses the part of the body on which it is mounted (for example, the hand wearing it). In addition, however, the wearable device 100 may also detect a part of the body on which it is not mounted (e.g., the opposite hand).

For convenience of explanation, assume that the ring-shaped wearable device 100 is mounted on the user's left hand; the wearable device 100 can then also detect the right hand, the opposite hand on which it is not mounted. That is, the wearable device 100 can detect a key input operation of the right hand, a space-mouse movement of the right hand, and project an image onto the right hand.

However, for the wearable device 100 to detect the right hand on which it is not mounted, it needs to know the distance and positional relationship to the right hand. In other words, when movement occurs, the wearable device 100 cannot by itself tell precisely whether the left hand wearing the device has moved or the right hand without the device has moved.

There are several ways to solve this problem. For example, the wearable device 100 can define a three-dimensional relative coordinate system with itself as the origin and recognize the right hand as a single object. In this case, the fingers of the left hand are also recognized as objects, and their positions are specified in the coordinate system centered on the wearable device 100.

As another example, the wearable device 100 may use the acceleration sensor or the gyroscope sensor. When the right hand, which does not wear the device, moves, the measured values of the two sensors do not change, whereas when the left hand wearing the wearable device 100 moves, they do. The wearable device 100 can therefore distinguish the two cases from the change in the measured values of the two sensors.
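A minimal sketch of this IMU-based distinction, assuming gravity-compensated accelerometer readings and illustrative threshold values.

```python
import numpy as np

def device_hand_moved(accel: np.ndarray, gyro: np.ndarray,
                      accel_thresh: float = 0.3,
                      gyro_thresh: float = 0.1) -> bool:
    """Return True when the hand wearing the device is itself moving.

    accel : gravity-compensated acceleration vector (m/s^2)
    gyro  : angular-rate vector (rad/s)
    If both magnitudes stay below small thresholds while the sensed scene
    changes, the motion came from the opposite hand instead.
    """
    return (np.linalg.norm(accel) > accel_thresh or
            np.linalg.norm(gyro) > gyro_thresh)
```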

In another example, an external sensor may be present to sense the movement of the wearable device 100. That is, when the external device measures the position on the three-dimensional space of the wearable device 100 and transmits the measured position to the wearable device 100, the wearable device 100 can easily determine whether the left hand equipped with the wearable device 100 has moved.

Regardless of which example is used, the wearable device 100 can easily determine the relative distance and positional relationship between the left hand and the right hand, so that it can continuously detect the right hand in the same manner as the left hand.

It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. Therefore, the disclosed methods should be considered in an illustrative rather than a restrictive sense. It is intended that the present invention cover the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.

Claims (15)

In a wearable device,
An optical signal transmitter for transmitting an optical signal;
An optical signal detection unit receiving the reflected optical signal generated by reflecting the optical signal to a target object;
A data processing unit for processing the received reflected optical signal; And
And a key determiner for detecting a key input operation of the user based on the processed data of the reflected optical signal and generating an input value matched with the key input operation,
Wherein the optical signal transmitting unit transmits one or more optical signals,
Wherein the optical signal sensing unit receives one or more reflected optical signals by the at least one optical signal,
Wherein the data processing unit generates pattern information on the blood vessel of the object based on the at least one reflected optical signal,
Wherein the key determining unit detects the key input operation by comparing information on the blood vessel that changes according to a key input operation of the user with the pattern information.
The method according to claim 1,
When the optical signal transmitting unit transmits one optical signal,
Wherein the data processing unit obtains the blood vessel data by comparing light and dark differences between the blood vessel and the surrounding tissue in the data obtained by processing one reflected optical signal by the one optical signal.
3. The method of claim 2,
Wherein the data processing unit generates pattern information on the blood vessel using the blood vessel data.
3. The method of claim 2,
Wherein the one optical signal is an optical signal in a near-infrared region.
The method according to claim 1,
When the optical signal transmitting unit transmits two or more optical signals,
Wherein the data processing unit obtains the blood vessel data by comparing the processed data of the two or more reflected optical signals by the at least two optical signals.
6. The method of claim 5,
Wherein the at least two optical signals are optical signals in a near-infrared region or an infrared region, and differ in at least one of a wavelength, a transmission time, a reception time, a frequency, and a polarization state.
The method according to claim 1,
The wearable device
A depth sensor for detecting the object in three dimensions and generating three-dimensional scan information; And
Further comprising a video processing unit for generating a three-dimensional model of the object based on the three-dimensional scan information, and adding a pattern representing the blood vessel to the three-dimensional model based on the pattern information.
8. The method of claim 7,
The wearable device
Wherein the key input operation is detected by comparing information on the blood vessel that changes according to the key input operation to the pattern added to the three-dimensional model.
9. The method of claim 8,
Wherein the information about the blood vessel is obtained by detecting at least one distribution of hue, saturation, and brightness caused by the blood vessel in the object by the optical signal sensing unit.
The method according to claim 1,
Wherein the key determiner determines, based on the sensed key input operation, a first joint connecting the palm of the user with the first node of the finger and a second joint connecting the first node of the finger with the second node of the finger, and generates the input value based on the three-dimensional positions of the first joint and the second joint.
11. The method of claim 10,
Wherein the key determiner determines the three-dimensional positions of the first joint and the second joint and the angles at which the first joint and the second joint are bent, and determines, from the positions and angles of the two joints, the three-dimensional position of the end of the finger at which the key input operation is sensed.
The method according to claim 1,
Wherein the optical signal sensing unit senses the first reflected optical signal and the second reflected optical signal, respectively, by separating the received reflected optical signal according to a wavelength.
The method according to claim 1,
Wherein the optical signal sensing unit senses the first reflected optical signal and the second reflected optical signal separately received in a time domain or a frequency domain, respectively.
A wearable device comprising:
An optical signal transmitter for transmitting an optical signal;
An optical signal sensing unit for receiving a reflected optical signal generated by the optical signal being reflected from a target object;
A data processing unit for processing the received reflected optical signal;
A position determining unit for measuring a distance and an angle to the object based on the data obtained by processing the reflected optical signal; and
An image output unit for outputting an image to the outside,
Wherein the optical signal transmitting unit transmits one or more optical signals,
Wherein the optical signal sensing unit receives one or more reflected optical signals produced by the one or more optical signals,
Wherein the data processing unit generates pattern information on the blood vessel of the object based on the at least one reflected optical signal,
Wherein the position determining unit measures a distance and an angle to the target object by comparing the pattern information with previously stored blood vessel information,
Wherein the image output unit outputs the image at a fixed position and a fixed size based on the distance and the angle.
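To make the geometry concrete, a minimal sketch of keeping the projected image at a fixed physical size once distance and tilt to the surface are known (for example, recovered by matching the live vessel pattern against the stored one); the throw-ratio model and parameter names are assumptions:

```python
import numpy as np

def projection_scale(distance_mm: float, tilt_rad: float,
                     target_width_mm: float, throw_ratio: float) -> float:
    """Digital zoom factor that keeps the projected image at target_width_mm."""
    # Width the projector naturally covers at this distance, widened by surface tilt.
    natural_width = (distance_mm / throw_ratio) / max(np.cos(tilt_rad), 1e-3)
    return target_width_mm / natural_width   # < 1 shrinks the rendered image, > 1 enlarges it
```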
15. The wearable device of claim 14,
Wherein the wearable device further comprises a finger recognition unit for generating pattern information on a skin line by sensing the skin line of a finger of the user,
Wherein the image output unit fixes and outputs the image by comparing the pattern information on the skin line with pattern information on a previously stored skin line.
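A sketch of re-anchoring the output image from the skin-line pattern, assuming the live and stored skin-line patches are same-sized grayscale arrays; the circular cross-correlation below is one generic way to find the alignment offset, not the method prescribed by the patent:

```python
import numpy as np

def skin_line_offset(live_patch: np.ndarray, stored_patch: np.ndarray):
    """Return the (dy, dx) shift that best aligns the live skin-line patch with
    the stored one, using circular cross-correlation computed via the FFT."""
    a = live_patch.astype(np.float32) - live_patch.mean()
    b = stored_patch.astype(np.float32) - stored_patch.mean()
    corr = np.fft.irfft2(np.fft.rfft2(a) * np.conj(np.fft.rfft2(b)), a.shape)
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = a.shape
    if dy > h // 2:
        dy -= h                    # wrap large circular shifts back to negative offsets
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)        # shift to apply so the image stays locked to the finger
```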
KR1020150061522A 2014-10-15 2015-04-30 Wearable device KR20160129406A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020150061522A KR20160129406A (en) 2015-04-30 2015-04-30 Wearable device
US15/517,923 US10474191B2 (en) 2014-10-15 2015-10-14 Wearable device
PCT/KR2015/010825 WO2016060461A1 (en) 2014-10-15 2015-10-14 Wearable device
US16/601,359 US10908642B2 (en) 2014-10-15 2019-10-14 Movement-based data input device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150061522A KR20160129406A (en) 2015-04-30 2015-04-30 Wearable device

Publications (1)

Publication Number Publication Date
KR20160129406A true KR20160129406A (en) 2016-11-09

Family

ID=57528980

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150061522A KR20160129406A (en) 2014-10-15 2015-04-30 Wearable device

Country Status (1)

Country Link
KR (1) KR20160129406A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190059443A (en) 2017-11-23 2019-05-31 주식회사 라온즈 Optical sensing module for wearable smart device and portable smart device

Similar Documents

Publication Publication Date Title
US10908642B2 (en) Movement-based data input device
US10747260B2 (en) Methods, devices, and systems for processing blood vessel data
KR101524575B1 (en) Wearable device
US20210011560A1 (en) Wearable Device
US10817594B2 (en) Wearable electronic device having a light field camera usable to perform bioauthentication from a dorsal side of a forearm near a wrist
EP3035164B1 (en) Wearable sensor for tracking articulated body-parts
EP2984541B1 (en) Near-plane segmentation using pulsed light source
US10264998B2 (en) Blood vessel imaging apparatus and personal authentication system
KR101235432B1 (en) Remote control apparatus and method using virtual touch of electronic device modeled in three dimension
US11132057B2 (en) Use of light transmission through tissue to detect force
US20090174578A1 (en) Operating apparatus and operating system
KR102609766B1 (en) Skin care device
KR101552134B1 (en) Wearable device
TWI596378B (en) Portable virtual reality system
KR20200137830A (en) Electronic device and method for correcting biometric data based on distance between electronic device and user measured using at least one sensor
US20190339768A1 (en) Virtual reality interaction system and method
KR20160129406A (en) Wearable device
KR101524576B1 (en) Wearable device
Gonzalez et al. A 2-D infrared instrumentation for close-range finger position sensing
KR20210111619A (en) Method, system and non-transitory computer-readable recording medium for estimating user's gesture from 2d images
JP2016115310A (en) Electronic apparatus
KR20200120467A (en) Head mounted display apparatus and operating method thereof
WO2015160589A1 (en) Fingerprint based input device