US20130022220A1 - Wearable Computing Device with Indirect Bone-Conduction Speaker - Google Patents
Wearable Computing Device with Indirect Bone-Conduction Speaker
- Publication number
- US20130022220A1 (U.S. application Ser. No. 13/269,935)
- Authority
- US
- United States
- Prior art keywords
- wearer
- side section
- vibration transducer
- support structure
- vibration
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/02—Casings; Cabinets ; Supports therefor; Mountings therein
- H04R1/028—Casings; Cabinets ; Supports therefor; Mountings therein associated with devices performing functions other than acoustics, e.g. electric candles
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0176—Head mounted characterised by mechanical features
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1058—Manufacture or assembly
- H04R1/1066—Constructional aspects of the interconnection between earpiece and earpiece support
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R5/00—Stereophonic arrangements
- H04R5/02—Spatial or constructional arrangements of loudspeakers
- H04R5/023—Spatial or constructional arrangements of loudspeakers in a chair, pillow
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2460/00—Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
- H04R2460/13—Hearing devices using bone conduction transducers
Definitions
- Computing devices such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are increasingly prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
- wearable computing The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.”
- wearable displays In the area of image and visual processing and production, in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal sized image, such as might be displayed on a traditional image display device.
- the relevant technology may be referred to as “near-eye displays.”
- Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs).
- a head-mounted display places a graphic display or displays close to one or both eyes of a wearer.
- a computer processing system may be used to generate the images on a display.
- Such displays may occupy a wearer's entire field of view, or only occupy part of a wearer's field of view.
- head-mounted displays may be as small as a pair of glasses or as large as a helmet.
- an exemplary wearable-computing system may include: (a) one or more optical elements; (b) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (c) an audio interface configured to receive an audio signal; and (d) at least one vibration transducer located on the at least one side section, wherein the at least one vibration transducer is configured to vibrate at least a portion of the support structure based on the audio signal.
- the vibration transducer is configured such that when the support structure is worn, the vibration transducer vibrates the support structure without directly vibrating a wearer.
- the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- an exemplary wearable-computing system may include: (a) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (b) a means for receiving an audio signal; and (c) a means for vibrating at least a portion of the support structure based on the audio signal, wherein the means for vibrating is located on the at least one side section.
- the means for vibrating is configured such that when the support structure is worn, the means for vibrating vibrates the support structure without directly vibrating a wearer.
- the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- FIG. 1 illustrates a wearable computing system according to an exemplary embodiment.
- FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1 .
- FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system.
- FIG. 4 is a simplified illustration of a head-mounted display configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- the disclosure generally involves a wearable computing system with a head-mounted display (HMD), and in particular, an HMD having at least one vibration transducer that functions as a speaker.
- An exemplary HMD may employ vibration transducers that are commonly referred to as bone-conduction transducers.
- standard applications of bone-conduction transducers involve direct transfer of sound to the inner ear by attaching the transducer directly to the bone (or a pad that is adjacent to the bone).
- An exemplary HMD may include a bone-conduction transducer (or another type of vibration transducer) that transfers sound to the wearer's ear via “indirect bone conduction.”
- an exemplary HMD may include a vibration transducer that does not vibrationally couple to the wearer's bone structure (e.g., a vibration transducer that is located so as to avoid substantial contact with the wearer when the HMD is worn). Instead, the vibration transducer is configured to vibrate the frame of the HMD. The HMD frame is in turn vibrationally coupled to the wearer's bone structure. As such, the HMD frame transfers vibration to the wearer's bone structure such that sound is perceived in the wearer's inner ear. In this arrangement, the vibration transducer does not directly vibrate the wearer, and thus may be said to function as an “indirect” bone conduction speaker.
- the vibration transducer may be placed at a location on the HMD that does not contact the wearer.
- a vibration transducer may be located on a side-arm of the HMD, near where the side-arm connects to the front of the HMD.
- the HMD may be configured such that when worn, there is space (e.g., air) between the portion of the HMD where the vibration transducer is located and the wearer. As such, the portion of the HMD that contacts and vibrationally couples to the wearer may be located away from the vibration transducer.
- the frame may transmit the audio signal through the air as well.
- the airborne audio signal may be heard by the wearer, and may actually enhance the sound perceived via indirect bone conduction. At the same time, this airborne audio signal may be much quieter than airborne audio signals emanating from traditional diaphragm speakers, and thus may provide more privacy to the wearer.
- one or more couplers may be attached to the HMD frame to enhance the fit of the HMD to the wearer and help transfer of vibrations from the frame to the wearer's bone structure.
- a fitting piece which may be moldable and/or made of rubber or silicone gel, for example, may be attached to the HMD frame.
- the fitting piece may be attached to the HMD frame in various ways. For instance, a fitting piece may be located behind the wearer's temple and directly above their ear, or in the pit behind the wearer's ear lobe, among other locations.
- an exemplary system may be implemented in or may take the form of a wearable computer (i.e., a wearable-computing device).
- a wearable computer takes the form of or includes an HMD.
- an exemplary system may also be implemented in or take the form of other devices, such as a mobile phone, among others.
- an exemplary system may take the form of a non-transitory computer readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein.
- An exemplary system may also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer readable medium having such program instructions stored thereon.
- an HMD may generally be any display device that is worn on the head and places a display in front of one or both eyes of the wearer.
- An HMD may take various forms such as a helmet or eyeglasses.
- references to “eyeglasses” herein should be understood to refer to an HMD that generally takes the form of eyeglasses. Further, features and functions described in reference to “eyeglasses” herein may apply equally to any other kind of HMD.
- FIG. 1 illustrates a wearable computing system according to an exemplary embodiment.
- the wearable computing system is shown in the form of eyeglasses 102.
- eyeglasses 102 include a support structure that is configured to support the one or more optical elements.
- the support structure of an exemplary HMD may include a front section and at least one side section.
- the support structure has a front section that includes lens-frames 104 and 106 and a center frame support 108 .
- side-arms 114 and 116 serve as a first and a second side section of the support structure for eyeglasses 102 .
- the front section and the at least one side section may vary in form, depending upon the implementation.
- the support structure of an exemplary HMD may also be referred to as the “frame” of the HMD.
- the support structure of eyeglasses 102 which includes lens-frames 104 and 106 , center frame support 108 , and side-arms 114 and 116 , may also be referred to as the “frame” of eyeglasses 102 .
- the frame of the eyeglasses 102 may function to secure eyeglasses 102 to a user's face via a user's nose and ears. More specifically, the side-arms 114 and 116 are each projections that extend away from the frame elements 104 and 106 , respectively, and are positioned behind a user's ears to secure the eyeglasses 102 to the user. The side-arms 114 and 116 may further secure the eyeglasses 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the system 100 may connect to or be affixed within a head-mounted helmet structure. Other possibilities exist as well.
- each of the frame elements 104 , 106 , and 108 and the side-arms 114 and 116 may be formed of a solid structure of plastic or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the eyeglasses 102 .
- Other materials or combinations of materials are also possible.
- the size, shape, and structure of eyeglasses 102 , and the components thereof, may vary depending upon the implementation.
- each of the optical elements 110 and 112 may be formed of any material that can suitably display a projected image or graphic.
- Each of the optical elements 110 and 112 may also be sufficiently transparent to allow a user to see through the optical element. Combining these features of the optical elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the optical elements.
- the system 100 may also include an on-board computing system 118 , a video camera 120 , a sensor 122 , and finger-operable touchpads 124 , 126 .
- the on-board computing system 118 is shown to be positioned on the side-arm 114 of the eyeglasses 102 ; however, the on-board computing system 118 may be provided on other parts of the eyeglasses 102 .
- the on-board computing system 118 may include a processor and memory, for example.
- the on-board computing system 118 may be configured to receive and analyze data from the video camera 120 and the finger-operable touchpads 124 , 126 (and possibly from other sensory devices, user interfaces, or both) and generate images for output from the optical elements 110 and 112 .
- the video camera 120 is shown to be positioned on the side-arm 114 of the eyeglasses 102 ; however, the video camera 120 may be provided on other parts of the eyeglasses 102 .
- the video camera 120 may be configured to capture images at various resolutions or at different frame rates. Many video cameras with a small form-factor, such as those used in cell phones or webcams, for example, may be incorporated into an example of the system 100 .
- FIG. 1 illustrates one video camera 120 , more video cameras may be used, and each may be configured to capture the same view, or to capture different views.
- the video camera 120 may be forward facing to capture at least a portion of the real-world view perceived by the user. This forward facing image captured by the video camera 120 may then be used to generate an augmented reality where computer generated images appear to interact with the real-world view perceived by the user.
- the sensor 122 is shown mounted on the side-arm 116 of the eyeglasses 102 ; however, the sensor 122 may be provided on other parts of the eyeglasses 102 .
- the sensor 122 may include one or more of a gyroscope or an accelerometer, for example. Other sensing devices may be included within the sensor 122 or other sensing functions may be performed by the sensor 122 .
- sensors such as sensor 122 may be configured to detect head movement by a wearer of eyeglasses 102 .
- a gyroscope and/or accelerometer may be arranged to detect head movements, and may be configured to output head-movement data. This head-movement data may then be used to carry out functions of an exemplary method, such as method 100 , for instance.
- the finger-operable touchpads 124 , 126 are shown mounted on the side-arms 114 , 116 of the eyeglasses 102 . Each of finger-operable touchpads 124 , 126 may be used by a user to input commands.
- the finger-operable touchpads 124 , 126 may sense at least one of a position and a movement of a finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities.
- the finger-operable touchpads 124 , 126 may be capable of sensing finger movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and may also be capable of sensing a level of pressure applied.
- the finger-operable touchpads 124, 126 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touchpads 124, 126 may be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge of the finger-operable touchpads 124, 126. Each of the finger-operable touchpads 124, 126 may be operated independently, and may provide a different function.
- FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1 .
- the optical elements 110 and 112 may act as display elements.
- the eyeglasses 102 may include a first projector 128 coupled to an inside surface of the side-arm 116 and configured to project a display 130 onto an inside surface of the optical element 112 .
- a second projector 132 may be coupled to an inside surface of the side-arm 114 and configured to project a display 134 onto an inside surface of the optical element 110 .
- the optical elements 110 and 112 may act as a combiner in a light projection system and may include a coating that reflects the light projected onto them from the projectors 128 and 132 . In some embodiments, a special coating may not be used (e.g., when the projectors 128 and 132 are scanning laser devices).
- the optical elements 110 , 112 themselves may include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in focus near-to-eye image to the user.
- a corresponding display driver may be disposed within the frame elements 104 and 106 for driving such a matrix display.
- a laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
- FIGS. 1 and 2 show two touchpads and two display elements, it should be understood that many exemplary methods and systems may be implemented in wearable computing devices with only one touchpad and/or with only one optical element having a display element. It is also possible that exemplary methods and systems may be implemented in wearable computing devices with more than two touchpads.
- FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system.
- a computing device 138 communicates using a communication link 140 (e.g., a wired or wireless connection) to a remote device 142 .
- the computing device 138 may be any type of device that can receive data and display information corresponding to or associated with the data.
- the computing device 138 may be a heads-up display system, such as the eyeglasses 102 described with reference to FIGS. 1 and 5 .
- the computing device 138 may include a display system 144 comprising a processor 146 and a display 148 .
- the display 148 may be, for example, an optical see-through display, an optical see-around display, or a video see-through display.
- the processor 146 may receive data from the remote device 142 , and configure the data for display on the display 148 .
- the processor 146 may be any type of processor, such as a micro-processor or a digital signal processor, for example.
- the computing device 138 may further include on-board data storage, such as memory 150 coupled to the processor 146 .
- the memory 150 may store software that can be accessed and executed by the processor 146 , for example.
- the remote device 142 may be any type of computing device or transmitter including a laptop computer, a mobile telephone, etc., that is configured to transmit data to the device 138 .
- the remote device 142 and the device 138 may contain hardware to enable the communication link 140 , such as processors, transmitters, receivers, antennas, etc.
- the communication link 140 is illustrated as a wireless connection; however, wired connections may also be used.
- the communication link 140 may be a wired link via a serial bus such as a universal serial bus or a parallel bus.
- a wired connection may be a proprietary connection as well.
- the communication link 140 may also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), Cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities.
- the remote device 142 may be accessible via the Internet and may comprise a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
- FIG. 4 is a simplified illustration of an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- HMD 400 includes two optical elements 402 , 404 .
- the frame of HMD 400 includes two side arms 408 -L and 408 -R, two lens-frames 409 -L and 409 -R, and a nose bridge 407 .
- the nose bridge 407 and side arms 408 -L and 408 -R are arranged to fit behind a wearer's ears and hold the optical elements 402 and 404 in front of the wearer's eyes via attachments to the lens-frames 409 -L and 409 -R.
- HMD 400 may include various audio sources, from which audio signals may be acquired.
- HMD 400 includes a microphone 410 .
- HMD 400 may additionally or alternatively include an internal audio playback device.
- an on-board computing system (not shown) may be configured to play digital audio files.
- HMD 400 may be configured to acquire an audio signal from an auxiliary audio playback device 412 , such as a portable digital audio player, smartphone, home stereo, car stereo, and/or personal computer.
- Other audio sources are also possible.
- An exemplary HMD may also include one or more audio interfaces for receiving audio signals from various audio sources, such as those described above.
- HMD 400 may include an interface for receiving an audio signal from microphone 410 .
- HMD 400 may include an interface 411 for receiving an audio signal from auxiliary audio playback device 412 (e.g., an “aux in” input).
- the interface to the auxiliary audio playback device 412 may be a tip, ring, sleeve (TRS) connector, or may take another form.
- HMD 400 may additionally or alternatively include an interface to an internal audio playback device.
- an on-board computing system (not shown) may be configured to process digital audio files and output audio signals to a speaker or speakers. Other audio interfaces are also possible.
- HMD 400 also includes a vibration transducer 414 located on side arm 408 -R, which functions as an indirect bone-conduction speaker.
- BCTs: bone-conduction transducers
- Any component that is arranged to vibrate the HMD 400 may be incorporated as a vibration transducer, without departing from the scope of the invention.
- Vibration transducer 414 is configured to vibrate at least a portion of the eyeglass frame 406 based on an audio signal received via one of the audio interfaces.
- the side arm 408 -R is configured such that when a user wears HMD 400 , an inner wall of a first portion of side arm 408 -R contacts the wearer so as to vibrationally couple to a bone structure of the wearer.
- side arm 408 -R may contact the wearer at or near where the side-arm is placed between the wearer's ear and the side of the wearer's head, such as at the wearer's mastoid. Other points of contact are also possible.
- Eyeglass frame 406 may be arranged such that when a user wears HMD 400 , the eyeglass frame contacts the wearer. As such, when the vibration transducer 414 vibrates the eyeglass frame 406 , the eyeglass frame can transfer vibration to the bone structure of the wearer. In particular, vibration of the eyeglass frame 406 can be transferred at areas where the eyeglass frame contacts the wearer directly. For instance, the eyeglass frame 406 may transfer vibration, via contact points on the wearer's ear, the wearer's nose, the wearer's temple, the wearer's eyebrow, or any other point where the eyeglass frame 406 directly contacts the wearer.
- vibration transducer 414 is located on a second portion of the side-arm 408-R, away from the first portion of the side-arm 408-R that vibrationally couples to the wearer. Further, in an exemplary embodiment, vibration transducer 414 vibrates the support structure without directly vibrating the wearer. To achieve this result, the second portion of the side-arm 408-R, at which vibration transducer 414 is located, may have an inner wall that does not contact the wearer.
- This configuration may leave a space between the second portion of the side-arm 408 -R and the wearer, such that the vibration transducer indirectly vibrates the wearer by transferring vibration from the second portion to the first portion of the side-arm 408 -R, which in turn is vibrationally coupled to the wearer.
- the spacing of the vibration transducer may be accomplished by housing the transducer in or attaching the transducer to the side-arm in various ways.
- the vibration transducer 414 is attached to and protrudes from the exterior wall of side arm 408-R.
- a vibration transducer may be attached to an inner wall of the side arm (while still configured to leave space between the vibration transducer and the wearer).
- a vibration transducer may be enclosed within the side arm itself.
- a vibration transducer may be partially or wholly embedded in an exterior or interior wall of a side arm.
- vibration transducers may be arranged in other locations on or within side-arm 408-L and/or 408-R. Additionally, vibration transducers may be arranged on or within other parts of the frame, such as the nose bridge 407 and/or lens frames 409-L and 409-R.
- the location of the vibration transducer may enhance the vibration of the side-arm 408 -R.
- side-arm 408-R may contact and be held in place by the lens-frame 409-R on one end, and may contact and be held in place by the wearer's ear on the other end (e.g., at the wearer's mastoid), allowing the portion of the side-arm between these points of contact to vibrate more freely. Therefore, placing vibration transducer 414 between these points of contact may help the transducer vibrate the side-arm 408-R more efficiently. This in turn may result in more efficient transfer of vibration from the eyeglass frame to the wearer's bone structure.
- vibrations from the vibration transducer may also be transmitted through the air, and thus may be heard by the wearer over the air.
- the user may perceive the sound from vibration transducer 414 using both tympanic hearing and bone-conduction hearing.
- the sound that is transmitted through the air and perceived using tympanic hearing may complement the sound perceived via bone-conduction hearing.
- while the sound transmitted through the air may enhance the sound perceived by the wearer, it may be unintelligible to others nearby.
- the sound transmitted through the air by the vibration transducer may be inaudible (possibly depending upon the volume level).
- FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 5 shows an HMD 500 , in which a vibration transducer 514 is arranged on an outer wall of side-arm 508 -R.
- a portion of side-arm 508 -R on which vibration transducer 514 is located is proximate to the pit behind the ear lobe of the wearer.
- When located as such, a gap may exist between the wearer and the portion of side-arm 508-R to which vibration transducer 514 is attached. As a result, driving vibration transducer 514 with an audio signal vibrates side-arm 508-R. While the portion of side-arm 508-R to which vibration transducer 514 is attached does not contact the wearer, side-arm 508-R may contact the wearer elsewhere. In particular, side-arm 508-R may contact the wearer in between the ear and the head, such as at location 516, for example. Accordingly, vibrations of side-arm 508-R may be transferred to a wearer's bone structure at location 516. Therefore, by vibrating side-arm 508-R, which in turn vibrates the wearer's bone structure, vibration transducer 514 may serve as an indirect bone conduction speaker.
- FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment.
- FIG. 6 shows an HMD 600 , which includes two vibration transducers 614 and 615 .
- vibration transducers 614 and 615 are arranged on the side arms 608 -L and 608 -R, respectively.
- vibration transducers 614 and 615 are typically located on portions of side-arms 608-L and 608-R, respectively, which are proximate to a wearer's left and right temple, respectively.
- vibration transducer 614 may be located on the outer wall of side-arm 608 -L, in between the wearer's left eye and left ear.
- vibration transducer 615 may be located on the outer wall of side-arm 608 -R, in between the wearer's right eye and right ear.
- an exemplary HMD may include one or more couplers arranged on the HMD. These couplers may help to enhance the fit of the HMD frame to the wearer, such that the HMD fits in a more secure and/or a more comfortable manner. Further, in an exemplary embodiment, these couplers may help to more efficiently transfer vibration of the HMD frame to the bone structure of the wearer.
- HMD 600 includes two fitting pieces 616 -L and 616 -R.
- fitting piece 616 -L is located on an inner wall of side-arm 608 -L and extends from a portion of the inner wall that is proximate to the pit behind the left ear lobe of the wearer. Configured as such, fitting piece 616 -L may serve to fill any gap between the wearer's body and the side-arm 608 -L.
- fitting piece 616 -R may be configured in a similar manner as fitting piece 616 -L, but with respect to the right side of the wearer's body.
- the fitting pieces or any type of couplers may be attached to, embedded in, and/or enclosed in the HMD frame at various locations.
- fitting pieces may be located in various locations so as to fill space between an HMD frame and wearer's body, and thus help the HMD frame vibrate the wearer's bone structure.
- a fitting piece may be configured to contact a wearer's ear, temple, eyebrow, nose (e.g., at the bridge of the nose), or neck (e.g., at the pit behind the ear lobe), among other locations.
- an exemplary embodiment may include only one coupler, or may include two or more couplers. Further, it should be understood that an exemplary embodiment need not include any couplers.
- an HMD may be configured with multiple vibration transducers, which may be individually customizable. For instance, as the fit of an HMD may vary from user-to-user, the volume of speakers may be adjusted individually to better suit a particular user.
- an HMD frame may contact different users in different locations, such that a behind-ear vibration transducer (e.g., vibration transducer 514 of FIG. 5 ) may provide more-efficient indirect bone conduction for a first user, while a vibration transducer located near the temple (e.g., vibration transducer 414 of FIG. 4 ) may provide more-efficient indirect bone conduction for a second user.
- an HMD may be configured with one or more behind-ear vibration transducers and one or more vibration transducers near the temple, which are individually adjustable.
- the first user may choose to lower the volume or turn off the vibration transducers near the temple, while the second user may choose to lower the volume or turn off the behind-ear vibration transducers.
- Other examples are also possible.
- different vibration transducers may be driven by different audio signals.
- a first vibration transducer may be configured to vibrate a left side-arm of an HMD based on a first audio signal
- a second vibration transducer may be configured to vibrate a second portion of the support structure based on a second audio signal.
- the above configuration may be used to deliver stereo sound.
- two individual vibration transducers (or possibly two groups of vibration transducers) may be driven by separate left and right audio signals.
- vibration transducer 614-L may vibrate side-arm 608-L based on a “left” audio signal
- vibration transducer 614-R may vibrate side-arm 608-R based on a “right” audio signal (an illustrative sketch of this left/right routing appears at the end of this list).
- Other examples of vibration transducers configured for stereo sound are also possible.
- an HMD may include more than two vibration transducers (or possibly more than two groups of vibration transducers), which each are driven by a different audio signal.
- multiple vibration transducers may be individually driven by different audio signals in order to provide a surround sound experience.
- vibration transducers may be configured for different purposes, and thus driven by different audio signals.
- one or more vibration transducers may be configured to deliver music, while another vibration transducer may be configured for voice (e.g., for phone calls, speech-based system messages, etc.).
- Other examples are also possible.
- an exemplary HMD may include one or more vibration dampeners that are configured to substantially isolate vibration of a particular vibration transducer or transducers.
- a first vibration transducer may be configured to vibrate a left side-arm based on a “left” audio signal
- a second vibration transducer may be configured to vibrate a right side-arm based on a second audio signal.
- one or more vibration dampeners may be configured to (a) substantially reduce vibration of the right side-arm by the first vibration transducer and (b) substantially reduce vibration of the left side-arm by the second vibration transducer. By doing so, the left audio signal may be substantially isolated on the left side-arm, while the right audio signal may be substantially isolated on the right side-arm.
- Vibration dampeners may vary in location on an HMD. For instance, continuing the above example, a first vibration dampener may be coupled to the left side-arm and a second vibration dampener may be coupled to the right side-arm, so as to substantially isolate the vibrational coupling of the first vibration transducer to the left side-arm and the vibrational coupling of the second vibration transducer to the right side-arm. To do so, the vibration dampener or dampeners on a given side-arm may be attached at various locations along the side-arm. For instance, referring to FIG. 4, vibration dampeners may be attached at or near where side-arms 408-L and 408-R are attached to lens-frames 409-L and 409-R, respectively.
- vibration transducers may be located on the left and right lens-frames, as illustrated in FIG. 6 by vibration transducers 618 -L and 618 -R.
- HMD 600 may include vibration dampeners (not shown) that help to isolate vibration of the left side of HMD 600 from the right side of HMD 600 .
- vibration dampeners may be attached at or near where lens-frames 609-L and 609-R couple to nose bridge 607.
- a vibration dampener may be located on or attached to nose bridge 607, in order to help prevent: (a) vibration transducer 618-L from vibrating the right side of HMD 600 (e.g., lens frame 609-R and/or side-arm 608-R) and (b) vibration transducer 618-R from vibrating the left side of HMD 600 (e.g., lens frame 609-L and/or side-arm 608-L).
- vibration dampeners may vary in size and/or shape, depending upon the particular implementation. Further, vibration dampeners may be attached to, partially or wholly embedded in, and/or enclosed within the frame of an exemplary HMD. Yet further, vibration dampeners may be made of various types of materials. For instance, vibration dampeners may be made of silicone, rubber, and/or foam, among other materials. More generally, a vibration dampener may be constructed from any material suitable for absorbing and/or dampening vibration. Furthermore, in some embodiments, a simple air gap between parts of the HMD may function as a vibration dampener (e.g., an air gap where a side arm connects to a lens frame).
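As referenced in the stereo discussion above, the sketch below shows one hypothetical way separate left and right audio signals could be routed to left- and right-side transducers with individually adjustable volumes (for example, to suit how the HMD happens to fit a particular wearer). The data structure and names here are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch: routing a stereo signal to left/right frame transducers,
# each with its own user-adjustable volume setting.
from typing import Dict, List, Tuple

StereoBlock = List[Tuple[float, float]]  # (left, right) sample pairs, -1.0 .. 1.0

def route_stereo(block: StereoBlock, volumes: Dict[str, float]) -> Dict[str, List[float]]:
    """Split a stereo block into per-transducer drive signals, applying the
    per-transducer volume settings and clipping to the drive range."""
    def scale(sample: float, vol: float) -> float:
        return max(-1.0, min(1.0, sample * vol))

    return {
        "left_transducer":  [scale(l, volumes.get("left_transducer", 1.0)) for l, _ in block],
        "right_transducer": [scale(r, volumes.get("right_transducer", 1.0)) for _, r in block],
    }

# Example: a wearer who perceives the right side too strongly turns it down.
drive = route_stereo([(0.5, 0.5), (-0.25, 1.0)],
                     volumes={"left_transducer": 1.0, "right_transducer": 0.5})
print(drive)  # {'left_transducer': [0.5, -0.25], 'right_transducer': [0.25, 0.5]}
```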
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Manufacturing & Machinery (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Details Of Audible-Bandwidth Transducers (AREA)
- Headphones And Earphones (AREA)
Abstract
Exemplary wearable computing systems may include a head-mounted display that is configured to provide indirect bone-conduction audio. For example, an exemplary head-mounted display may include at least one vibration transducer that is configured to vibrate at least a portion of the head-mounted display based on the audio signal. The vibration transducer is configured such that when the head-mounted display is worn, the vibration transducer vibrates the head-mounted display without directly vibrating a wearer. However, the head-mounted display structure vibrationally couples to a bone structure of the wearer, such that vibrations from the vibration transducer may be indirectly transferred to the wearer's bone structure.
Description
- Unless otherwise indicated herein, the materials described in this section are not prior art to the claims in this application and are not admitted to be prior art by inclusion in this section.
- Computing devices such as personal computers, laptop computers, tablet computers, cellular phones, and countless types of Internet-capable devices are increasingly prevalent in numerous aspects of modern life. Over time, the manner in which these devices are providing information to users is becoming more intelligent, more efficient, more intuitive, and/or less obtrusive.
- The trend toward miniaturization of computing hardware, peripherals, as well as of sensors, detectors, and image and audio processors, among other technologies, has helped open up a field sometimes referred to as “wearable computing.” In the area of image and visual processing and production, in particular, it has become possible to consider wearable displays that place a very small image display element close enough to a wearer's (or user's) eye(s) such that the displayed image fills or nearly fills the field of view, and appears as a normal sized image, such as might be displayed on a traditional image display device. The relevant technology may be referred to as “near-eye displays.”
- Near-eye displays are fundamental components of wearable displays, also sometimes called “head-mounted displays” (HMDs). A head-mounted display places a graphic display or displays close to one or both eyes of a wearer. To generate the images on a display, a computer processing system may be used. Such displays may occupy a wearer's entire field of view, or only occupy part of a wearer's field of view. Further, head-mounted displays may be as small as a pair of glasses or as large as a helmet.
- In one aspect, an exemplary wearable-computing system may include: (a) one or more optical elements; (b) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (c) an audio interface configured to receive an audio signal; and (d) at least one vibration transducer located on the at least one side section, wherein the at least one vibration transducer is configured to vibrate at least a portion of the support structure based on the audio signal. In this exemplary wearable-computing system, the vibration transducer is configured such that when the support structure is worn, the vibration transducer vibrates the support structure without directly vibrating a wearer. Further, the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- In another aspect, an exemplary wearable-computing system may include: (a) a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements; (b) a means for receiving an audio signal; and (c) a means for vibrating at least a portion of the support structure based on the audio signal, wherein the means for vibrating is located on the at least one side section. In this exemplary wearable-computing system, the means for vibrating is configured such that when the support structure is worn, the means for vibrating vibrates the support structure without directly vibrating a wearer. Further, the support structure is configured such that when worn, the support structure vibrationally couples to a bone structure of the wearer.
- These as well as other aspects, advantages, and alternatives, will become apparent to those of ordinary skill in the art by reading the following detailed description, with reference where appropriate to the accompanying drawings.
-
FIG. 1 illustrates a wearable computing system according to an exemplary embodiment. -
FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1. -
FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system. -
FIG. 4 is a simplified illustration of a head-mounted display configured for indirect bone-conduction audio, according to an exemplary embodiment. -
FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. -
FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. - Exemplary methods and systems are described herein. It should be understood that the word “exemplary” is used herein to mean “serving as an example, instance, or illustration.” Any embodiment or feature described herein as “exemplary” is not necessarily to be construed as preferred or advantageous over other embodiments or features. The exemplary embodiments described herein are not meant to be limiting. It will be readily understood that certain aspects of the disclosed systems and methods can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
- The disclosure generally involves a wearable computing system with a head-mounted display (HMD), and in particular, an HMD having at least one vibration transducer that functions as a speaker. An exemplary HMD may employ vibration transducers that are commonly referred to as bone-conduction transducers. However, standard applications of bone-conduction transducers involve direct transfer of sound to the inner ear by attaching the transducer directly to the bone (or a pad that is adjacent to the bone). An exemplary HMD, on the other hand, may include a bone-conduction transducer (or another type of vibration transducer) that transfers sound to the wearer's ear via “indirect bone conduction.”
- More specifically, an exemplary HMD may include a vibration transducer that does not vibrationally couple to the wearer's bone structure (e.g., a vibration transducer that is located so as to avoid substantial contact with the wearer when the HMD is worn). Instead, the vibration transducer is configured to vibrate the frame of the HMD. The HMD frame is in turn vibrationally coupled to the wearer's bone structure. As such, the HMD frame transfers vibration to the wearer's bone structure such that sound is perceived in the wearer's inner ear. In this arrangement, the vibration transducer does not directly vibrate the wearer, and thus may be said to function as an “indirect” bone conduction speaker.
- In an exemplary embodiment, the vibration transducer may be placed at a location on the HMD that does not contact the wearer. For example, on a glasses-style HMD, a vibration transducer may be located on a side-arm of the HMD, near where the side-arm connects to the front of the HMD. Further, in an exemplary embodiment, the HMD may be configured such that when worn, there is space (e.g., air) between the portion of the HMD where the vibration transducer is located and the wearer. As such, the portion of the HMD that contacts and vibrationally couples to the wearer may be located away from the vibration transducer.
- In another aspect, because the vibration transducer vibrates the frame of the HMD instead of directly vibrating a wearer, the frame may transmit the audio signal through the air as well. In some embodiments, the airborne audio signal may be heard by the wearer, and may actually enhance the sound perceived via indirect bone conduction. At the same time, this airborne audio signal may be much quieter than airborne audio signals emanating from traditional diaphragm speakers, and thus may provide more privacy to the wearer.
- In a further aspect, one or more couplers may be attached to the HMD frame to enhance the fit of the HMD to the wearer and help transfer of vibrations from the frame to the wearer's bone structure. For example, a fitting piece, which may be moldable and/or made of rubber or silicone gel, for example, may be attached to the HMD frame. The fitting piece may be attached to the HMD frame in various ways. For instance, a fitting piece may be located behind the wearer's temple and directly above their ear, or in the pit behind the wearer's ear lobe, among other locations.
- Systems and devices in which exemplary embodiments may be implemented will now be described in greater detail. In general, an exemplary system may be implemented in or may take the form of a wearable computer (i.e., a wearable-computing device). In an exemplary embodiment, a wearable computer takes the form of or includes an HMD. However, an exemplary system may also be implemented in or take the form of other devices, such as a mobile phone, among others. Further, an exemplary system may take the form of a non-transitory computer readable medium, which has program instructions stored thereon that are executable by a processor to provide the functionality described herein. An exemplary system may also take the form of a device such as a wearable computer or mobile phone, or a subsystem of such a device, which includes such a non-transitory computer readable medium having such program instructions stored thereon.
- In a further aspect, an HMD may generally be any display device that is worn on the head and places a display in front of one or both eyes of the wearer. An HMD may take various forms such as a helmet or eyeglasses. As such, references to “eyeglasses” herein should be understood to refer to an HMD that generally takes the form of eyeglasses. Further, features and functions described in reference to “eyeglasses” herein may apply equally to any other kind of HMD.
-
FIG. 1 illustrates a wearable computing system according to an exemplary embodiment. The wearable computing system is shown in the form of eyeglasses 102. However, other types of wearable computing devices could additionally or alternatively be used. The eyeglasses 102 include a support structure that is configured to support the one or more optical elements.
- In general, the support structure of an exemplary HMD may include a front section and at least one side section. In FIG. 1, the support structure has a front section that includes lens-frames 104 and 106 and a center frame support 108. Further, in the illustrated embodiment, side-arms 114 and 116 serve as a first and a second side section of the support structure for eyeglasses 102. It should be understood that the front section and the at least one side section may vary in form, depending upon the implementation.
- Herein, the support structure of an exemplary HMD may also be referred to as the “frame” of the HMD. For example, the support structure of eyeglasses 102, which includes lens-frames 104 and 106, center frame support 108, and side-arms 114 and 116, may also be referred to as the “frame” of eyeglasses 102.
- The frame of the eyeglasses 102 may function to secure eyeglasses 102 to a user's face via a user's nose and ears. More specifically, the side-arms 114 and 116 are each projections that extend away from the frame elements 104 and 106, respectively, and are positioned behind a user's ears to secure the eyeglasses 102 to the user. The side-arms 114 and 116 may further secure the eyeglasses 102 to the user by extending around a rear portion of the user's head. Additionally or alternatively, for example, the system 100 may connect to or be affixed within a head-mounted helmet structure. Other possibilities exist as well.
- In an exemplary embodiment, each of the frame elements 104, 106, and 108 and the side-arms 114 and 116 may be formed of a solid structure of plastic or metal, or may be formed of a hollow structure of similar material so as to allow wiring and component interconnects to be internally routed through the eyeglasses 102. Other materials or combinations of materials are also possible. Further, the size, shape, and structure of eyeglasses 102, and the components thereof, may vary depending upon the implementation.
- Further, each of the optical elements 110 and 112 may be formed of any material that can suitably display a projected image or graphic. Each of the optical elements 110 and 112 may also be sufficiently transparent to allow a user to see through the optical element. Combining these features of the optical elements can facilitate an augmented reality or heads-up display where the projected image or graphic is superimposed over a real-world view as perceived by the user through the optical elements.
- The system 100 may also include an on-board computing system 118, a video camera 120, a sensor 122, and finger-operable touchpads 124, 126. The on-board computing system 118 is shown to be positioned on the side-arm 114 of the eyeglasses 102; however, the on-board computing system 118 may be provided on other parts of the eyeglasses 102. The on-board computing system 118 may include a processor and memory, for example. The on-board computing system 118 may be configured to receive and analyze data from the video camera 120 and the finger-operable touchpads 124, 126 (and possibly from other sensory devices, user interfaces, or both) and generate images for output from the optical elements 110 and 112.
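As a purely illustrative sketch (not taken from the patent), the following shows one way an on-board processing loop of this kind might combine a camera frame and touchpad input into draw commands for the optical elements; the class and function names are hypothetical placeholders.

```python
# Hypothetical sketch of an on-board processing step: sensor data in,
# draw commands for the optical elements out. Names are illustrative only.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CameraFrame:
    width: int
    height: int
    pixels: bytes  # raw data from the forward-facing camera

@dataclass
class TouchEvent:
    x: float       # normalized touchpad position, 0.0 .. 1.0
    y: float
    pressure: float

def generate_overlay(frame: CameraFrame, touches: List[TouchEvent]) -> List[Tuple[int, int, str]]:
    """Analyze the latest sensor data and return (x, y, glyph) draw commands."""
    commands: List[Tuple[int, int, str]] = []
    if touches:
        # Map the most recent touch onto display coordinates as a cursor glyph.
        t = touches[-1]
        commands.append((int(t.x * frame.width), int(t.y * frame.height), "cursor"))
    return commands

# Example: one 640x360 frame and a touch near the pad's center.
frame = CameraFrame(width=640, height=360, pixels=b"")
print(generate_overlay(frame, [TouchEvent(x=0.5, y=0.5, pressure=0.2)]))  # [(320, 180, 'cursor')]
```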
- The video camera 120 is shown to be positioned on the side-arm 114 of the eyeglasses 102; however, the video camera 120 may be provided on other parts of the eyeglasses 102. The video camera 120 may be configured to capture images at various resolutions or at different frame rates. Many video cameras with a small form-factor, such as those used in cell phones or webcams, for example, may be incorporated into an example of the system 100. Although FIG. 1 illustrates one video camera 120, more video cameras may be used, and each may be configured to capture the same view, or to capture different views. For example, the video camera 120 may be forward facing to capture at least a portion of the real-world view perceived by the user. This forward facing image captured by the video camera 120 may then be used to generate an augmented reality where computer generated images appear to interact with the real-world view perceived by the user.
- The sensor 122 is shown mounted on the side-arm 116 of the eyeglasses 102; however, the sensor 122 may be provided on other parts of the eyeglasses 102. The sensor 122 may include one or more of a gyroscope or an accelerometer, for example. Other sensing devices may be included within the sensor 122 or other sensing functions may be performed by the sensor 122.
- In an exemplary embodiment, sensors such as sensor 122 may be configured to detect head movement by a wearer of eyeglasses 102. For instance, a gyroscope and/or accelerometer may be arranged to detect head movements, and may be configured to output head-movement data. This head-movement data may then be used to carry out functions of an exemplary method, such as method 100, for instance.
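To make the idea of head-movement data concrete, here is a minimal, hypothetical sketch that integrates gyroscope yaw-rate samples into an approximate head-turn angle; the sample rate, axis convention, and threshold are assumptions rather than details from the patent.

```python
def detect_head_turn(yaw_rates_dps, sample_rate_hz=100.0, threshold_deg=20.0):
    """Integrate yaw angular-rate samples (deg/s) into a yaw angle and report a
    head turn once the accumulated angle exceeds the threshold.

    Illustrative only: a real implementation would typically fuse gyroscope and
    accelerometer data and compensate for drift."""
    dt = 1.0 / sample_rate_hz
    yaw_deg = sum(rate * dt for rate in yaw_rates_dps)
    if abs(yaw_deg) >= threshold_deg:
        return "left" if yaw_deg > 0 else "right"
    return None

# Example: 0.5 s of a steady 60 deg/s rotation accumulates about 30 degrees.
print(detect_head_turn([60.0] * 50))  # 'left'
```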
- The finger-operable touchpads 124, 126 are shown mounted on the side-arms 114, 116 of the eyeglasses 102. Each of finger-operable touchpads 124, 126 may be used by a user to input commands. The finger-operable touchpads 124, 126 may sense at least one of a position and a movement of a finger via capacitive sensing, resistance sensing, or a surface acoustic wave process, among other possibilities. The finger-operable touchpads 124, 126 may be capable of sensing finger movement in a direction parallel or planar to the pad surface, in a direction normal to the pad surface, or both, and may also be capable of sensing a level of pressure applied. The finger-operable touchpads 124, 126 may be formed of one or more translucent or transparent insulating layers and one or more translucent or transparent conducting layers. Edges of the finger-operable touchpads 124, 126 may be formed to have a raised, indented, or roughened surface, so as to provide tactile feedback to a user when the user's finger reaches the edge of the finger-operable touchpads 124, 126. Each of the finger-operable touchpads 124, 126 may be operated independently, and may provide a different function.
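As an illustration of the kind of touch data such a pad might report, the sketch below defines a simple touch-sample structure and a helper that flags when a finger reaches the raised edge of the pad; the field names and the edge margin are assumptions made for illustration.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float         # normalized position across the pad, 0.0 .. 1.0
    y: float         # normalized position along the pad, 0.0 .. 1.0
    pressure: float  # normalized applied pressure, 0.0 .. 1.0

def at_raised_edge(sample: TouchSample, margin: float = 0.05) -> bool:
    """Return True when the finger is within `margin` of the pad border, i.e.,
    where a raised or roughened edge would give tactile feedback."""
    return (sample.x <= margin or sample.x >= 1.0 - margin or
            sample.y <= margin or sample.y >= 1.0 - margin)

# Example: a touch near the front edge of the pad.
print(at_raised_edge(TouchSample(x=0.02, y=0.5, pressure=0.3)))  # True
```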
- FIG. 2 illustrates an alternate view of the wearable computing system of FIG. 1. As shown in FIG. 2, the optical elements 110 and 112 may act as display elements. The eyeglasses 102 may include a first projector 128 coupled to an inside surface of the side-arm 116 and configured to project a display 130 onto an inside surface of the optical element 112. Additionally or alternatively, a second projector 132 may be coupled to an inside surface of the side-arm 114 and configured to project a display 134 onto an inside surface of the optical element 110.
- The optical elements 110 and 112 may act as a combiner in a light projection system and may include a coating that reflects the light projected onto them from the projectors 128 and 132. In some embodiments, a special coating may not be used (e.g., when the projectors 128 and 132 are scanning laser devices).
- In alternative embodiments, other types of display elements may also be used. For example, the optical elements 110, 112 themselves may include: a transparent or semi-transparent matrix display, such as an electroluminescent display or a liquid crystal display, one or more waveguides for delivering an image to the user's eyes, or other optical elements capable of delivering an in-focus near-to-eye image to the user. A corresponding display driver may be disposed within the frame elements 104 and 106 for driving such a matrix display. A laser or LED source and scanning system could be used to draw a raster display directly onto the retina of one or more of the user's eyes. Other possibilities exist as well.
- While FIGS. 1 and 2 show two touchpads and two display elements, it should be understood that many exemplary methods and systems may be implemented in wearable computing devices with only one touchpad and/or with only one optical element having a display element. It is also possible that exemplary methods and systems may be implemented in wearable computing devices with more than two touchpads.
- FIG. 3 illustrates an exemplary schematic drawing of a wearable computing system. In particular, a computing device 138 communicates using a communication link 140 (e.g., a wired or wireless connection) to a remote device 142. The computing device 138 may be any type of device that can receive data and display information corresponding to or associated with the data. For example, the computing device 138 may be a heads-up display system, such as the eyeglasses 102 described with reference to FIGS. 1 and 5.
- Thus, the computing device 138 may include a display system 144 comprising a processor 146 and a display 148. The display 148 may be, for example, an optical see-through display, an optical see-around display, or a video see-through display. The processor 146 may receive data from the remote device 142, and configure the data for display on the display 148. The processor 146 may be any type of processor, such as a micro-processor or a digital signal processor, for example.
- The computing device 138 may further include on-board data storage, such as memory 150 coupled to the processor 146. The memory 150 may store software that can be accessed and executed by the processor 146, for example.
- The remote device 142 may be any type of computing device or transmitter, including a laptop computer, a mobile telephone, etc., that is configured to transmit data to the device 138. The remote device 142 and the device 138 may contain hardware to enable the communication link 140, such as processors, transmitters, receivers, antennas, etc.
- In FIG. 3, the communication link 140 is illustrated as a wireless connection; however, wired connections may also be used. For example, the communication link 140 may be a wired link via a serial bus such as a universal serial bus or a parallel bus. A wired connection may be a proprietary connection as well. The communication link 140 may also be a wireless connection using, e.g., Bluetooth® radio technology, communication protocols described in IEEE 802.11 (including any IEEE 802.11 revisions), cellular technology (such as GSM, CDMA, UMTS, EV-DO, WiMAX, or LTE), or Zigbee® technology, among other possibilities. The remote device 142 may be accessible via the Internet and may comprise a computing cluster associated with a particular web service (e.g., social-networking, photo sharing, address book, etc.).
A. Exemplary HMD with Vibration Transducer
-
FIG. 4 is a simplified illustration of an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. As shown, HMD 400 includes two optical elements 402, 404. Further, the frame of HMD 400 includes two side arms 408-L and 408-R, two lens-frames 409-L and 409-R, and a nose bridge 407. The nose bridge 407 and side arms 408-L and 408-R are arranged to fit behind a wearer's ears and hold the optical elements 402 and 404 in front of the wearer's eyes via attachments to the lens-frames 409-L and 409-R.
HMD 400 may include various audio sources, from which audio signals may be acquired. For example,HMD 400 includes amicrophone 410. Further,HMD 400 may additionally or alternatively an internal audio playback device. For example, an on-board computing system (not shown) may be configured to play digital audio files. Yet further,HMD 400 may be configured to acquire an audio signal from an auxiliaryaudio playback device 412, such as a portable digital audio player, smartphone, home stereo, car stereo, and/or personal computer. Other audio sources are also possible. - An exemplary HMD may also include one or more audio interfaces for receiving audio signals from various audio sources, such as those described above. For example,
HMD 400 may include an interface for receiving an audio signal frommicrophone 410. Further,HMD 400 may include aninterface 411 for receiving an audio signal from auxiliary audio playback device 412 (e.g., an “aux in” input). The interface to the auxiliaryaudio playback device 412 may be a tip, ring, sleeve (TRS) connector, or may take another form.HMD 412 may additionally or alternatively include an interface to an internal audio playback device. For example, an on-board computing system (not shown) may be configured to process digital audio files and output audio signals to a speaker or speakers. Other audio interfaces are also possible. -
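As a rough illustration of the audio-interface idea above, the sketch below models several sources (microphone, internal playback, auxiliary "aux in") feeding a single output stage that drives a vibration transducer. The names AudioSource, SourceKind, and TransducerDriver are hypothetical and chosen only for illustration.

```python
# Hedged sketch: several audio sources feed one output stage that drives the
# vibration transducer. Names are assumptions, not identifiers from the patent.
from enum import Enum, auto
from typing import Dict, List


class SourceKind(Enum):
    MICROPHONE = auto()
    INTERNAL_PLAYBACK = auto()
    AUX_IN = auto()


class AudioSource:
    def __init__(self, kind: SourceKind, samples: List[float]) -> None:
        self.kind = kind
        self._samples = samples

    def read(self) -> List[float]:
        return list(self._samples)


class TransducerDriver:
    def play(self, samples: List[float]) -> None:
        print(f"driving transducer with {len(samples)} samples")


def route_audio(sources: Dict[SourceKind, AudioSource],
                selected: SourceKind,
                driver: TransducerDriver) -> None:
    # Pull from whichever interface is currently selected and hand the signal
    # to the transducer driver.
    driver.play(sources[selected].read())


if __name__ == "__main__":
    sources = {
        SourceKind.MICROPHONE: AudioSource(SourceKind.MICROPHONE, [0.0, 0.1]),
        SourceKind.AUX_IN: AudioSource(SourceKind.AUX_IN, [0.2, 0.3, 0.4]),
    }
    route_audio(sources, SourceKind.AUX_IN, TransducerDriver())
```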
HMD 400 also includes a vibration transducer 414 located on side arm 408-R, which functions as an indirect bone-conduction speaker. Various types of bone-conduction transducers (BCTs) may be used, depending upon the particular implementation. Further, it should be understood that any component that is arranged to vibrate HMD 400 may be incorporated as a vibration transducer, without departing from the scope of the invention.

Vibration transducer 414 is configured to vibrate at least a portion of the eyeglass frame 406 based on an audio signal received via one of the audio interfaces. In an exemplary embodiment, the side arm 408-R is configured such that when a user wears HMD 400, an inner wall of a first portion of side arm 408-R contacts the wearer so as to vibrationally couple to a bone structure of the wearer. For example, side arm 408-R may contact the wearer at or near where the side-arm is placed between the wearer's ear and the side of the wearer's head, such as at the wearer's mastoid. Other points of contact are also possible.

Eyeglass frame 406 may be arranged such that when a user wears HMD 400, the eyeglass frame contacts the wearer. As such, when the vibration transducer 414 vibrates the eyeglass frame 406, the eyeglass frame can transfer vibration to the bone structure of the wearer. In particular, vibration of the eyeglass frame 406 can be transferred at areas where the eyeglass frame contacts the wearer directly. For instance, the eyeglass frame 406 may transfer vibration via contact points on the wearer's ear, the wearer's nose, the wearer's temple, the wearer's eyebrow, or any other point where the eyeglass frame 406 directly contacts the wearer.

In an exemplary embodiment, vibration transducer 414 is located on a second portion of the side-arm 408-R, away from the first portion of the side-arm 408-R that vibrationally couples to the wearer. Further, in an exemplary embodiment, vibration transducer 414 vibrates the support structure without directly vibrating the wearer. To achieve this result, the second portion of the side-arm 408-R, at which vibration transducer 414 is located, may have an inner wall that does not contact the wearer. This configuration may leave a space between the second portion of the side-arm 408-R and the wearer, such that the vibration transducer indirectly vibrates the wearer by transferring vibration from the second portion to the first portion of the side-arm 408-R, which in turn is vibrationally coupled to the wearer.

In practice, the spacing of the vibration transducer may be accomplished by housing the transducer in, or attaching the transducer to, the side-arm in various ways. For instance, as shown, the vibration transducer 414 is attached to and protrudes from the exterior wall of side arm 408-R. Other arrangements are possible as well. For example, a vibration transducer may be attached to an inner wall of the side arm (while still configured to leave space between the vibration transducer and the wearer). As another example, a vibration transducer may be enclosed within the side arm. As yet another example, a vibration transducer may be partially or wholly embedded in an exterior or interior wall of a side arm. Furthermore, vibration transducers may be arranged in other locations on or within side-arm 408-L and/or 408-R. Additionally, vibration transducers may be arranged on or within other parts of the frame, such as the nose bridge 407 and/or lens frames 409-L and 409-R.

In a further aspect of the illustrated arrangement, the location of the vibration transducer may enhance the vibration of the side-arm 408-R. In particular, side-arm 408-R may contact and be held in place by the lens-frame 409-R on one end, and may contact and be held in place by the wearer's ear on the other end (e.g., at the wearer's mastoid), allowing the portion of the side-arm between these points of contact to vibrate more freely. Therefore, placing vibration transducer 414 between these points of contact may help the transducer vibrate the side-arm 408-R more efficiently. This in turn may result in more efficient transfer of vibration from the eyeglass frame to the wearer's bone structure.

Further, when there is space between vibration transducer 414 and the wearer, some vibrations from the vibration transducer may also be transmitted through the air, and thus may be heard by the wearer over the air. In other words, the user may perceive the sound from vibration transducer 414 using both tympanic hearing and bone-conduction hearing. In such an embodiment, the sound that is transmitted through the air and perceived using tympanic hearing may complement the sound perceived via bone-conduction hearing. Furthermore, while the sound transmitted through the air may enhance the sound perceived by the wearer, the sound transmitted through the air may be unintelligible to others nearby. Further, in some arrangements, the sound transmitted through the air by the vibration transducer may be inaudible (possibly depending upon the volume level).

B. Other Arrangements of Vibration Transducers on an HMD
Other arrangements of a vibration transducer or transducers on a side-arm and elsewhere on the HMD frame are also possible. For example, FIG. 5 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. In particular, FIG. 5 shows an HMD 500, in which a vibration transducer 514 is arranged on an outer wall of side-arm 508-R. As such, when HMD 500 is worn, a portion of side-arm 508-R on which vibration transducer 514 is located is proximate to the pit behind the ear lobe of the wearer.

When located as such, a gap may exist between the wearer and the portion of side-arm 508-R to which vibration transducer 514 is attached. As a result, driving vibration transducer 514 with an audio signal vibrates side-arm 508-R. While the portion of side-arm 508-R to which vibration transducer 514 is attached does not contact the wearer, side-arm 508-R may contact the wearer elsewhere. In particular, side-arm 508-R may contact the wearer in between the ear and the head, such as at location 516, for example. Accordingly, vibrations of side-arm 508-R may be transferred to a wearer's bone structure at location 516. Therefore, by vibrating side-arm 508-R, which in turn vibrates the wearer's bone structure, vibration transducer 514 may serve as an indirect bone-conduction speaker.

In a further aspect, some embodiments may include two or more vibration transducers in various locations on the eyeglass frame. For example, FIG. 6 is another block diagram illustrating an HMD configured for indirect bone-conduction audio, according to an exemplary embodiment. In particular, FIG. 6 shows an HMD 600, which includes two vibration transducers 614 and 615. As shown, vibration transducers 614 and 615 are arranged on the side arms 608-L and 608-R, respectively. Arranged as such, when HMD 600 is worn, vibration transducers 614 and 615 are typically located on portions of side-arms 608-L and 608-R, respectively, that are proximate to the wearer's left and right temples. For example, vibration transducer 614 may be located on the outer wall of side-arm 608-L, in between the wearer's left eye and left ear. Similarly, vibration transducer 615 may be located on the outer wall of side-arm 608-R, in between the wearer's right eye and right ear.

C. Fitting Pieces
In a further aspect, an exemplary HMD may include one or more couplers arranged on the HMD. These couplers may help to enhance the fit of the HMD frame to the wearer, such that the HMD fits in a more secure and/or more comfortable manner. Further, in an exemplary embodiment, these couplers may help to more efficiently transfer vibration of the HMD frame to the bone structure of the wearer.

As an example of such couplers, HMD 600 includes two fitting pieces 616-L and 616-R. As shown, fitting piece 616-L is located on an inner wall of side-arm 608-L and extends from a portion of the inner wall that is proximate to the pit behind the left ear lobe of the wearer. Configured as such, fitting piece 616-L may serve to fill any gap between the wearer's body and the side-arm 608-L. Further, fitting piece 616-R may be configured in a similar manner as fitting piece 616-L, but with respect to the right side of the wearer's body.

In exemplary embodiments, the fitting pieces or any type of couplers may be attached to, embedded in, and/or enclosed in the HMD frame at various locations. For example, fitting pieces may be located in various locations so as to fill space between an HMD frame and the wearer's body, and thus help the HMD frame vibrate the wearer's bone structure. For instance, a fitting piece may be configured to contact a wearer's ear, temple, eyebrow, nose (e.g., at the bridge of the nose), or neck (e.g., at the pit behind the ear lobe), among other locations.

Generally, it should be understood that an exemplary embodiment may include only one coupler, or may include two or more couplers. Further, it should be understood that an exemplary embodiment need not include any couplers.

D. Additional Features of an HMD with Vibration Transducer Speaker
In some embodiments, an HMD may be configured with multiple vibration transducers, which may be individually customizable. For instance, as the fit of an HMD may vary from user to user, the volume of the speakers may be adjusted individually to better suit a particular user. As an example, an HMD frame may contact different users in different locations, such that a behind-ear vibration transducer (e.g., vibration transducer 514 of FIG. 5) may provide more-efficient indirect bone conduction for a first user, while a vibration transducer located near the temple (e.g., vibration transducer 414 of FIG. 4) may provide more-efficient indirect bone conduction for a second user. Accordingly, an HMD may be configured with one or more behind-ear vibration transducers and one or more vibration transducers near the temple, which are individually adjustable. As such, the first user may choose to lower the volume or turn off the vibration transducers near the temple, while the second user may choose to lower the volume or turn off the behind-ear vibration transducers. Other examples are also possible.

In some embodiments, different vibration transducers may be driven by different audio signals. For example, in an embodiment with two vibration transducers, a first vibration transducer may be configured to vibrate a left side-arm of an HMD based on a first audio signal, and a second vibration transducer may be configured to vibrate a second portion of the support structure based on a second audio signal.
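One plausible way to realize the per-transducer adjustment described above is to keep an independent gain for each transducer and scale the audio signal accordingly. The short sketch below assumes hypothetical transducer labels such as "behind_ear_right" and "temple_right" purely for illustration; the patent itself does not prescribe any particular software scheme.

```python
# Illustrative sketch of per-transducer volume control: each transducer gets
# its own gain, so one wearer can mute the temple transducer while another
# mutes the behind-ear one.
from typing import Dict, List


def apply_gains(samples: List[float], gains: Dict[str, float]) -> Dict[str, List[float]]:
    """Return one scaled copy of the signal per transducer."""
    return {name: [g * s for s in samples] for name, g in gains.items()}


if __name__ == "__main__":
    signal = [0.5, -0.25, 0.75]
    # First user: rely on the behind-ear transducer, silence the temple one.
    user_a = apply_gains(signal, {"behind_ear_right": 1.0, "temple_right": 0.0})
    # Second user: the opposite preference.
    user_b = apply_gains(signal, {"behind_ear_right": 0.0, "temple_right": 0.8})
    print(user_a["behind_ear_right"], user_b["temple_right"])
```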
In some embodiments, the above configuration may be used to deliver stereo sound. In particular, two individual vibration transducers (or possibly two groups of vibration transducers) may be driven by separate left and right audio signals. As a specific example, referring to FIG. 6, vibration transducer 614 may vibrate side-arm 608-L based on a "left" audio signal, while vibration transducer 615 may vibrate side-arm 608-R based on a "right" audio signal. Other examples of vibration transducers configured for stereo sound are also possible.

Furthermore, an HMD may include more than two vibration transducers (or possibly more than two groups of vibration transducers), each of which is driven by a different audio signal. For example, multiple vibration transducers may be individually driven by different audio signals in order to provide a surround-sound experience.
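A minimal sketch of the stereo arrangement, assuming an interleaved [L, R, L, R, ...] sample buffer: the buffer is split into separate channels, and each channel is sent to the transducer on its own side-arm. The helper names below are illustrative only.

```python
# Split an interleaved stereo buffer and drive one transducer per side-arm.
from typing import List, Tuple


def split_stereo(interleaved: List[float]) -> Tuple[List[float], List[float]]:
    left = interleaved[0::2]    # even indices -> left channel
    right = interleaved[1::2]   # odd indices  -> right channel
    return left, right


def drive(side: str, channel: List[float]) -> None:
    print(f"{side} side-arm transducer: {channel}")


if __name__ == "__main__":
    left, right = split_stereo([0.1, -0.1, 0.2, -0.2, 0.3, -0.3])
    drive("left", left)    # e.g., the transducer on side-arm 608-L
    drive("right", right)  # e.g., the transducer on side-arm 608-R
```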
Further, in some embodiments, different vibration transducers may be configured for different purposes, and thus driven by different audio signals. For example, one or more vibration transducers may be configured to deliver music, while another vibration transducer may be configured for voice (e.g., for phone calls, speech-based system messages, etc.). Other examples are also possible.
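Purpose-based routing of this kind could be sketched as a simple mapping from stream type to a group of transducers. The tags and group names below are hypothetical illustrations, not part of the disclosure.

```python
# Route streams tagged "music" or "voice" to different transducer groups.
from typing import Dict, List

ROUTING: Dict[str, List[str]] = {
    "music": ["temple_left", "temple_right"],
    "voice": ["behind_ear_right"],
}


def route(stream_kind: str, samples: List[float]) -> Dict[str, List[float]]:
    """Return a copy of the signal for each transducer assigned to this stream type."""
    return {transducer: list(samples) for transducer in ROUTING.get(stream_kind, [])}


if __name__ == "__main__":
    print(route("voice", [0.2, 0.4]))   # phone-call audio -> behind-ear transducer
    print(route("music", [0.1, 0.3]))   # music -> both temple transducers
```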
E. Vibration Dampeners
In a further aspect, an exemplary HMD may include one or more vibration dampeners that are configured to substantially isolate vibration of a particular vibration transducer or transducers. For example, when two vibration transducers are arranged to provide stereo sound, a first vibration transducer may be configured to vibrate a left side-arm based on a "left" audio signal, while a second vibration transducer may be configured to vibrate a right side-arm based on a "right" audio signal. In such an embodiment, one or more vibration dampeners may be configured to (a) substantially reduce vibration of the right side-arm by the first vibration transducer and (b) substantially reduce vibration of the left side-arm by the second vibration transducer. By doing so, the left audio signal may be substantially isolated on the left side-arm, while the right audio signal may be substantially isolated on the right side-arm.

Vibration dampeners may vary in location on an HMD. For instance, continuing the above example, a first vibration dampener may be coupled to the left side-arm and a second vibration dampener may be coupled to the right side-arm, so as to substantially isolate the vibrational coupling of the first vibration transducer to the left side-arm and the vibrational coupling of the second vibration transducer to the right side-arm. To do so, the vibration dampener or dampeners on a given side-arm may be attached at various locations along the side-arm. For instance, referring to FIG. 4, vibration dampeners may be attached at or near where side-arms 408-L and 408-R attach to lens-frames 409-L and 409-R, respectively.

As another example, vibration transducers may be located on the left and right lens-frames, as illustrated in FIG. 6 by vibration transducers 618-L and 618-R. In such an embodiment, HMD 600 may include vibration dampeners (not shown) that help to isolate vibration of the left side of HMD 600 from the right side of HMD 600. For instance, to help vibrationally isolate vibration transducers 618-L and 618-R, vibration dampeners may be attached at or near where lens-frames 609-L and 609-R couple to nose bridge 607. Additionally or alternatively, a vibration dampener (not shown) may be located on or attached to nose bridge 607, in order to help prevent (a) vibration transducer 618-L from vibrating the right side of HMD 600 (e.g., lens frame 609-R and/or side-arm 608-R) and (b) vibration transducer 618-R from vibrating the left side of HMD 600 (e.g., lens frame 609-L and/or side-arm 608-L).

In an exemplary embodiment, vibration dampeners may vary in size and/or shape, depending upon the particular implementation. Further, vibration dampeners may be attached to, partially or wholly embedded in, and/or enclosed within the frame of an exemplary HMD. Yet further, vibration dampeners may be made of various different types of materials. For instance, vibration dampeners may be made of silicone, rubber, and/or foam, among other materials. More generally, a vibration dampener may be constructed from any material suitable for absorbing and/or dampening vibration. Furthermore, in some embodiments, a simple air gap between parts of the HMD may function as a vibration dampener (e.g., an air gap where a side arm connects to a lens frame).
While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope and spirit being indicated by the following claims.
Claims (23)
1. A wearable-computing system comprising:
one or more optical elements;
a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements;
an audio interface configured to receive an audio signal; and
at least one vibration transducer located on the at least one side section, wherein the at least one vibration transducer is configured to vibrate the at least one side section based on the audio signal;
wherein the support structure is configured such that when worn, a first portion of the at least one side section has an inner wall that contacts the wearer so as to vibrationally couple to a bone structure of the wearer;
wherein the vibration transducer is located on a second portion of the at least one side section having an inner wall that does not contact the wearer, such that when the support structure is worn, the vibration transducer vibrates the support structure without directly vibrating a wearer.
2. The system of claim 1, wherein the second portion of the at least one side section is proximate to a connection between the at least one side section and the front section.
3. The system of claim 1 , wherein the system comprises a head-mounted display (HMD) that includes the support structure, the one or more optical elements, and a touchpad interface located on at least one of the side sections of the support structure.
4. The system of claim 3 , wherein the one or more optical elements comprise one or more graphic displays.
5. The system of claim 1, wherein the support structure is configured, when worn, to provide a separation distance between the vibration transducer and the wearer such that the vibration transducer does not directly transfer vibration to the wearer's bone structure.
6. The system of claim 1, further comprising at least one coupler located on the inner wall of the first portion of the at least one side section such that the at least one coupler contacts the wearer, wherein the at least one coupler is configured to enhance the vibrational coupling of the support structure to the bone structure of the wearer.
7. The system of claim 6, wherein the at least one coupler is configured to contact a wearer at a given location, wherein the given location comprises at least one of: (a) a location on the back of the wearer's ear, (b) a location on or above the wearer's nose, (c) a location near the wearer's temple, (d) a location on the wearer's eyebrow, and (e) a location on the wearer's neck.
8. The system of claim 6 , wherein the at least one coupler is configured to contact a wearer at a mastoid of the wearer.
9. The system of claim 1 , wherein the vibration transducer is attached to an outer wall of the at least one side section.
10. The system of claim 1 , wherein the vibration transducer is partially embedded in an outer wall of the at least one side section.
11. The system of claim 1 , wherein the vibration transducer is enclosed within the at least one side section.
12. The system of claim 1 , wherein the vibration transducer is partially embedded in an inner wall of the at least one side section.
13. The system of claim 1 , wherein the vibration transducer is attached to an inner wall of the at least one side section.
14. The system of claim 1 , wherein the audio interface comprises at least one of: (a) an interface to an internal audio playback device, (b) an interface to an auxiliary audio playback device, and (c) a microphone interface.
15. The system of claim 1 , wherein the at least one side section comprises a first side section and a second side section, and wherein the vibration transducer is included on only the first side section.
16. The system of claim 1 , wherein the at least one side section comprises a first side section and a second side section, and wherein the at least one vibration transducer comprises a first vibration transducer located on the first side section and a second vibration transducer located on the second side section.
17. The system of claim 16 , wherein the first vibration transducer is configured to vibrate a first portion of the support structure based on a first audio signal, and wherein the second vibration transducer is configured to vibrate a second portion of the support structure based on a second audio signal.
18. The system of claim 16 , further comprising at least one vibration dampener configured to: (a) substantially reduce vibration of the second portion of the support structure by the first vibration transducer and (b) substantially reduce vibration of the first portion of the support structure by the second vibration transducer.
19. The system of claim 18 :
wherein the first vibration transducer is configured to vibrate at least a portion of the first side section based on a first audio signal, and wherein the second vibration transducer is configured to vibrate at least a portion of the second side section based on a second audio signal;
wherein the at least one vibration dampener comprises a first vibration dampener vibrationally coupled to the first side section and a second vibration dampener vibrationally coupled to the second side section, wherein the first vibration dampener is configured to substantially isolate the vibrational coupling of the first vibration transducer to the first side section from other portions of the support structure, and wherein the second vibration dampener is configured to substantially isolate the vibrational coupling of the second vibration transducer to the second side section from other portions of the support structure.
20. A wearable-computing system comprising:
a support structure comprising a front section and at least one side section, wherein the support structure is configured to support the one or more optical elements;
a means for receiving an audio signal; and
a means for vibrating at least a portion of the support structure based on the audio signal, wherein the means for vibrating is located on the at least one side section;
wherein the support structure is configured such that when worn, a first portion of the at least one side section has an inner wall that contacts the wearer so as to vibrationally couple to a bone structure of the wearer; and
wherein the means for vibrating is located on a second portion of the at least one side section having an inner wall that does not contact the wearer, such that when the support structure is worn, the means for vibrating vibrates the support structure without directly vibrating a wearer.
21. The system of claim 20, further comprising means for coupling the support structure to the wearer, wherein the means for coupling is located on the inner wall of the first portion of the at least one side section such that the means for coupling contacts the wearer, and wherein the means for coupling is configured to enhance the vibrational coupling of the support structure to the bone structure of the wearer.
22. The system of claim 21 , wherein the means for coupling is configured to contact the wearer at a given location, wherein the given location comprises at least one of: (a) a location behind the wearer's ear, (b) a location on or above the wearer's nose, (c) a location near the wearer's temple, and (d) a location near the wearer's eyebrow.
23. The system of claim 21 , wherein the means for coupling is configured to contact the wearer at a mastoid of the wearer.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/269,935 US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
PCT/US2012/047618 WO2013013158A2 (en) | 2011-07-20 | 2012-07-20 | Wearable computing device with indirect bone-conduction speaker |
CN201280045795.7A CN103988113B (en) | 2011-07-20 | 2012-07-20 | wearable computing device with indirect bone-conduction speaker |
US15/066,253 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161509997P | 2011-07-20 | 2011-07-20 | |
US13/269,935 US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/066,253 Continuation US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Publications (1)
Publication Number | Publication Date |
---|---|
US20130022220A1 true US20130022220A1 (en) | 2013-01-24 |
Family
ID=47555766
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/269,935 Abandoned US20130022220A1 (en) | 2011-07-20 | 2011-10-10 | Wearable Computing Device with Indirect Bone-Conduction Speaker |
US15/066,253 Active 2031-12-29 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/066,253 Active 2031-12-29 US9900676B2 (en) | 2011-07-20 | 2016-03-10 | Wearable computing device with indirect bone-conduction speaker |
Country Status (3)
Country | Link |
---|---|
US (2) | US20130022220A1 (en) |
CN (1) | CN103988113B (en) |
WO (1) | WO2013013158A2 (en) |
Cited By (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130201082A1 (en) * | 2008-06-11 | 2013-08-08 | Honeywell International Inc. | Method and system for operating a near-to-eye display |
US20140029762A1 (en) * | 2012-07-25 | 2014-01-30 | Nokia Corporation | Head-Mounted Sound Capture Device |
US20140184550A1 (en) * | 2011-09-07 | 2014-07-03 | Tandemlaunch Technologies Inc. | System and Method for Using Eye Gaze Information to Enhance Interactions |
US20140247951A1 (en) * | 2013-03-01 | 2014-09-04 | Lalkrushna Malaviya | Animal Headphone Apparatus |
WO2015009539A1 (en) * | 2013-07-15 | 2015-01-22 | Google Inc. | Isolation of audio transducer |
US20150149092A1 (en) * | 2013-11-25 | 2015-05-28 | National Oilwell Varco, L.P. | Wearable interface for drilling information system |
WO2015143018A1 (en) * | 2014-03-18 | 2015-09-24 | Google Inc. | Adaptive piezoelectric array for bone conduction receiver in wearable computers |
US20150334486A1 (en) * | 2012-12-13 | 2015-11-19 | Samsung Electronics Co., Ltd. | Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus |
US9323983B2 (en) | 2014-05-29 | 2016-04-26 | Comcast Cable Communications, Llc | Real-time image and audio replacement for visual acquisition devices |
US9430921B2 (en) * | 2014-09-24 | 2016-08-30 | Taction Technology Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US9471101B2 (en) | 2013-09-11 | 2016-10-18 | Lg Electronics Inc. | Wearable computing device and user interface method |
US9690326B2 (en) | 2015-02-02 | 2017-06-27 | Samsung Display Co., Ltd. | Wearable display device |
US9720083B2 (en) * | 2013-06-05 | 2017-08-01 | Google Inc. | Using sounds for determining a worn state of a wearable computing device |
US9806795B2 (en) | 2013-08-05 | 2017-10-31 | Microsoft Technology Licensing, Llc | Automated earpiece cache management |
CN107526432A (en) * | 2016-06-15 | 2017-12-29 | 意美森公司 | System and method for providing touch feedback via case |
US9895110B2 (en) | 2014-09-11 | 2018-02-20 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US9924265B2 (en) * | 2015-09-15 | 2018-03-20 | Intel Corporation | System for voice capture via nasal vibration sensing |
US9936273B2 (en) | 2015-01-20 | 2018-04-03 | Taction Technology, Inc. | Apparatus and methods for altering the appearance of wearable devices |
US9999396B2 (en) | 2014-09-11 | 2018-06-19 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US10241583B2 (en) | 2016-08-30 | 2019-03-26 | Intel Corporation | User command determination based on a vibration pattern |
US10298282B2 (en) | 2016-06-16 | 2019-05-21 | Intel Corporation | Multi-modal sensing wearable device for physiological context measurement |
US10324494B2 (en) | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
WO2019143864A1 (en) * | 2018-01-17 | 2019-07-25 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US20190238971A1 (en) * | 2018-01-31 | 2019-08-01 | Bose Corporation | Eyeglass Headphones |
US10390139B2 (en) | 2015-09-16 | 2019-08-20 | Taction Technology, Inc. | Apparatus and methods for audio-tactile spatialization of sound and perception of bass |
US10491739B2 (en) | 2017-03-16 | 2019-11-26 | Microsoft Technology Licensing, Llc | Opportunistic timing of device notifications |
US10573139B2 (en) | 2015-09-16 | 2020-02-25 | Taction Technology, Inc. | Tactile transducer with digital signal processing for improved fidelity |
US10721594B2 (en) | 2014-06-26 | 2020-07-21 | Microsoft Technology Licensing, Llc | Location-based audio messaging |
US10873800B1 (en) * | 2019-05-17 | 2020-12-22 | Facebook Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11106034B2 (en) * | 2019-05-07 | 2021-08-31 | Apple Inc. | Adjustment mechanism for head-mounted display |
CN113691914A (en) * | 2017-12-22 | 2021-11-23 | 谷歌有限责任公司 | Two-dimensional distributed mode actuator |
US11290706B2 (en) | 2018-01-17 | 2022-03-29 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US20220382382A1 (en) * | 2021-06-01 | 2022-12-01 | tooz technologies GmbH | Calling up a wake-up function and controlling a wearable device using tap gestures |
US11567336B2 (en) | 2018-07-24 | 2023-01-31 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
US11656472B2 (en) * | 2017-10-22 | 2023-05-23 | Lumus Ltd. | Head-mounted augmented reality device employing an optical bench |
US11662311B2 (en) | 2018-04-08 | 2023-05-30 | Lumus Ltd. | Optical sample characterization |
US11729359B2 (en) | 2019-12-08 | 2023-08-15 | Lumus Ltd. | Optical systems with compact image projector |
US11747137B2 (en) | 2020-11-18 | 2023-09-05 | Lumus Ltd. | Optical-based validation of orientations of internal facets |
US11762169B2 (en) | 2017-12-03 | 2023-09-19 | Lumus Ltd. | Optical device alignment methods |
US11768538B1 (en) | 2019-04-26 | 2023-09-26 | Apple Inc. | Wearable electronic device with physical interface |
US11927734B2 (en) | 2016-11-08 | 2024-03-12 | Lumus Ltd. | Light-guide device with optical cutoff edge and corresponding production methods |
US12019249B2 (en) | 2019-12-25 | 2024-06-25 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
US12135430B2 (en) | 2021-05-19 | 2024-11-05 | Lumus Ltd. | Active optical engine |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9596536B2 (en) * | 2015-07-22 | 2017-03-14 | Google Inc. | Microphone arranged in cavity for enhanced voice isolation |
DE202016105934U1 (en) | 2016-10-21 | 2017-08-22 | Krones Ag | Docking station for a labeling unit |
WO2018230790A1 (en) | 2017-06-13 | 2018-12-20 | 주식회사 비햅틱스 | Head mounted display |
CN107280956A (en) * | 2017-07-28 | 2017-10-24 | 马国华 | A kind of electronic audio frequency Physiotherapy instrument |
CN109270710A (en) * | 2018-12-13 | 2019-01-25 | 张�浩 | Osteoacusis spectacle frame |
US10659869B1 (en) * | 2019-02-08 | 2020-05-19 | Facebook Technologies, Llc | Cartilage transducer |
Family Cites Families (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5282253A (en) | 1991-02-26 | 1994-01-25 | Pan Communications, Inc. | Bone conduction microphone mount |
US5457751A (en) | 1992-01-15 | 1995-10-10 | Such; Ronald W. | Ergonomic headset |
US6301367B1 (en) * | 1995-03-08 | 2001-10-09 | Interval Research Corporation | Wearable audio system with acoustic modules |
JPH1065996A (en) | 1996-08-23 | 1998-03-06 | Olympus Optical Co Ltd | Head wearable display device |
EP1027627B1 (en) * | 1997-10-30 | 2009-02-11 | MYVU Corporation | Eyeglass interface system |
US6215655B1 (en) | 1997-10-31 | 2001-04-10 | Lacerta Enterprises, Inc. | Drive-in ordering apparatus |
US6463157B1 (en) | 1998-10-06 | 2002-10-08 | Analytical Engineering, Inc. | Bone conduction speaker and microphone |
US7150526B2 (en) | 2000-06-02 | 2006-12-19 | Oakley, Inc. | Wireless interactive headset |
US7461936B2 (en) | 2000-06-02 | 2008-12-09 | Oakley, Inc. | Eyeglasses with detachable adjustable electronics module |
US8482488B2 (en) | 2004-12-22 | 2013-07-09 | Oakley, Inc. | Data input management system for wearable electronically enabled interface |
US20020039427A1 (en) | 2000-10-04 | 2002-04-04 | Timothy Whitwell | Audio apparatus |
US20020124295A1 (en) | 2000-10-30 | 2002-09-12 | Loel Fenwick | Clothing apparatus, carrier for a biophysical sensor, and patient alarm system |
US7494216B2 (en) | 2002-07-26 | 2009-02-24 | Oakely, Inc. | Electronic eyewear with hands-free operation |
US7310427B2 (en) | 2002-08-01 | 2007-12-18 | Virginia Commonwealth University | Recreational bone conduction audio device, system |
US7233684B2 (en) | 2002-11-25 | 2007-06-19 | Eastman Kodak Company | Imaging method and system using affective information |
US7762665B2 (en) | 2003-03-21 | 2010-07-27 | Queen's University At Kingston | Method and apparatus for communication between humans and devices |
US7792552B2 (en) | 2003-04-15 | 2010-09-07 | Ipventure, Inc. | Eyeglasses for wireless communications |
JP2005352024A (en) * | 2004-06-09 | 2005-12-22 | Murata Mfg Co Ltd | Glasses type interface device and security system |
US7555136B2 (en) | 2004-06-25 | 2009-06-30 | Victorion Technology Co., Ltd. | Nasal bone conduction wireless communication transmitting device |
US20060034478A1 (en) | 2004-08-11 | 2006-02-16 | Davenport Kevin E | Audio eyeglasses |
US7580540B2 (en) | 2004-12-29 | 2009-08-25 | Motorola, Inc. | Apparatus and method for receiving inputs from a user |
US20070069976A1 (en) | 2005-09-26 | 2007-03-29 | Willins Bruce A | Method and system for interface between head mounted display and handheld device |
WO2007107985A2 (en) | 2006-03-22 | 2007-09-27 | David Weisman | Method and system for bone conduction sound propagation |
US7543934B2 (en) | 2006-09-20 | 2009-06-09 | Ipventures, Inc. | Eyeglasses with activity monitoring and acoustic dampening |
US7740353B2 (en) | 2006-12-14 | 2010-06-22 | Oakley, Inc. | Wearable high resolution audio visual interface |
JP2008165063A (en) * | 2006-12-28 | 2008-07-17 | Scalar Corp | Head mounted display |
KR20080090720A (en) | 2007-04-05 | 2008-10-09 | 최성식 | Headphone with vibration speaker |
US8086288B2 (en) | 2007-06-15 | 2011-12-27 | Eric Klein | Miniature wireless earring headset |
WO2009101622A2 (en) * | 2008-02-11 | 2009-08-20 | Bone Tone Communications Ltd. | A sound system and a method for providing sound |
US20090259090A1 (en) * | 2008-03-31 | 2009-10-15 | Cochlear Limited | Bone conduction hearing device having acoustic feedback reduction system |
US20100149073A1 (en) * | 2008-11-02 | 2010-06-17 | David Chaum | Near to Eye Display System and Appliance |
US20110158444A1 (en) | 2008-08-29 | 2011-06-30 | Phonak Ag | Hearing instrument and method for providing hearing assistance to a user |
WO2010062481A1 (en) * | 2008-11-02 | 2010-06-03 | David Chaum | Near to eye display system and appliance |
CN101753221A (en) * | 2008-11-28 | 2010-06-23 | 新兴盛科技股份有限公司 | Butterfly temporal bone conductive communication and/or hear-assisting device |
JP2010224472A (en) * | 2009-03-25 | 2010-10-07 | Olympus Corp | Spectacle mount type image display apparatus |
US8094858B2 (en) | 2009-04-27 | 2012-01-10 | Joseph Adam Thiel | Eyewear retention device |
JP5385387B2 (en) | 2009-06-29 | 2014-01-08 | パイオニア株式会社 | Speaker damper |
US8964298B2 (en) * | 2010-02-28 | 2015-02-24 | Microsoft Corporation | Video display modification based on sensor input for a see-through near-to-eye display |
CN104902037B (en) * | 2010-12-27 | 2018-08-28 | 罗姆股份有限公司 | Mobile phone |
US8223088B1 (en) | 2011-06-09 | 2012-07-17 | Google Inc. | Multimode input field for a head-mounted display |
2011
- 2011-10-10 US US13/269,935 patent/US20130022220A1/en not_active Abandoned
2012
- 2012-07-20 WO PCT/US2012/047618 patent/WO2013013158A2/en active Application Filing
- 2012-07-20 CN CN201280045795.7A patent/CN103988113B/en active Active
2016
- 2016-03-10 US US15/066,253 patent/US9900676B2/en active Active
Cited By (67)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130201082A1 (en) * | 2008-06-11 | 2013-08-08 | Honeywell International Inc. | Method and system for operating a near-to-eye display |
US9594248B2 (en) * | 2008-06-11 | 2017-03-14 | Honeywell International Inc. | Method and system for operating a near-to-eye display |
US20140184550A1 (en) * | 2011-09-07 | 2014-07-03 | Tandemlaunch Technologies Inc. | System and Method for Using Eye Gaze Information to Enhance Interactions |
US20140029762A1 (en) * | 2012-07-25 | 2014-01-30 | Nokia Corporation | Head-Mounted Sound Capture Device |
US9094749B2 (en) * | 2012-07-25 | 2015-07-28 | Nokia Technologies Oy | Head-mounted sound capture device |
US20150334486A1 (en) * | 2012-12-13 | 2015-11-19 | Samsung Electronics Co., Ltd. | Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus |
US9712910B2 (en) * | 2012-12-13 | 2017-07-18 | Samsung Electronics Co., Ltd. | Glasses apparatus and method for controlling glasses apparatus, audio apparatus and method for providing audio signal and display apparatus |
US20140247951A1 (en) * | 2013-03-01 | 2014-09-04 | Lalkrushna Malaviya | Animal Headphone Apparatus |
US9628895B2 (en) * | 2013-03-01 | 2017-04-18 | Lalkrushna Malaviya | Animal headphone apparatus |
US9720083B2 (en) * | 2013-06-05 | 2017-08-01 | Google Inc. | Using sounds for determining a worn state of a wearable computing device |
WO2015009539A1 (en) * | 2013-07-15 | 2015-01-22 | Google Inc. | Isolation of audio transducer |
US9143848B2 (en) | 2013-07-15 | 2015-09-22 | Google Inc. | Isolation of audio transducer |
CN105518516A (en) * | 2013-07-15 | 2016-04-20 | 谷歌公司 | Isolation of audio transducer |
US9806795B2 (en) | 2013-08-05 | 2017-10-31 | Microsoft Technology Licensing, Llc | Automated earpiece cache management |
US9471101B2 (en) | 2013-09-11 | 2016-10-18 | Lg Electronics Inc. | Wearable computing device and user interface method |
US20150149092A1 (en) * | 2013-11-25 | 2015-05-28 | National Oilwell Varco, L.P. | Wearable interface for drilling information system |
US9547175B2 (en) | 2014-03-18 | 2017-01-17 | Google Inc. | Adaptive piezoelectric array for bone conduction receiver in wearable computers |
WO2015143018A1 (en) * | 2014-03-18 | 2015-09-24 | Google Inc. | Adaptive piezoelectric array for bone conduction receiver in wearable computers |
US9323983B2 (en) | 2014-05-29 | 2016-04-26 | Comcast Cable Communications, Llc | Real-time image and audio replacement for visual acquisition devices |
US10721594B2 (en) | 2014-06-26 | 2020-07-21 | Microsoft Technology Licensing, Llc | Location-based audio messaging |
US9999396B2 (en) | 2014-09-11 | 2018-06-19 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US9895110B2 (en) | 2014-09-11 | 2018-02-20 | Industrial Technology Research Institute | Exercise physiological sensing system, motion artifact suppression processing method and device |
US10820117B2 (en) | 2014-09-24 | 2020-10-27 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US20170171666A1 (en) * | 2014-09-24 | 2017-06-15 | Taction Technology Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US10812913B2 (en) | 2014-09-24 | 2020-10-20 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US9430921B2 (en) * | 2014-09-24 | 2016-08-30 | Taction Technology Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US10659885B2 (en) | 2014-09-24 | 2020-05-19 | Taction Technology, Inc. | Systems and methods for generating damped electromagnetically actuated planar motion for audio-frequency vibrations |
US9936273B2 (en) | 2015-01-20 | 2018-04-03 | Taction Technology, Inc. | Apparatus and methods for altering the appearance of wearable devices |
US9690326B2 (en) | 2015-02-02 | 2017-06-27 | Samsung Display Co., Ltd. | Wearable display device |
US9924265B2 (en) * | 2015-09-15 | 2018-03-20 | Intel Corporation | System for voice capture via nasal vibration sensing |
US10390139B2 (en) | 2015-09-16 | 2019-08-20 | Taction Technology, Inc. | Apparatus and methods for audio-tactile spatialization of sound and perception of bass |
US10573139B2 (en) | 2015-09-16 | 2020-02-25 | Taction Technology, Inc. | Tactile transducer with digital signal processing for improved fidelity |
US11263879B2 (en) | 2015-09-16 | 2022-03-01 | Taction Technology, Inc. | Tactile transducer with digital signal processing for improved fidelity |
US10324494B2 (en) | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
US10444844B2 (en) | 2016-06-15 | 2019-10-15 | Immersion Corporation | Systems and methods for providing haptic feedback via a case |
US10095311B2 (en) * | 2016-06-15 | 2018-10-09 | Immersion Corporation | Systems and methods for providing haptic feedback via a case |
CN107526432A (en) * | 2016-06-15 | 2017-12-29 | 意美森公司 | System and method for providing touch feedback via case |
US10298282B2 (en) | 2016-06-16 | 2019-05-21 | Intel Corporation | Multi-modal sensing wearable device for physiological context measurement |
US10241583B2 (en) | 2016-08-30 | 2019-03-26 | Intel Corporation | User command determination based on a vibration pattern |
US11927734B2 (en) | 2016-11-08 | 2024-03-12 | Lumus Ltd. | Light-guide device with optical cutoff edge and corresponding production methods |
US10491739B2 (en) | 2017-03-16 | 2019-11-26 | Microsoft Technology Licensing, Llc | Opportunistic timing of device notifications |
US11966062B2 (en) * | 2017-10-22 | 2024-04-23 | Lumus Ltd. | Head-mounted augmented reality device employing an optical bench |
US11656472B2 (en) * | 2017-10-22 | 2023-05-23 | Lumus Ltd. | Head-mounted augmented reality device employing an optical bench |
US11762169B2 (en) | 2017-12-03 | 2023-09-19 | Lumus Ltd. | Optical device alignment methods |
CN113691914A (en) * | 2017-12-22 | 2021-11-23 | 谷歌有限责任公司 | Two-dimensional distributed mode actuator |
WO2019143864A1 (en) * | 2018-01-17 | 2019-07-25 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US12102388B2 (en) | 2018-01-17 | 2024-10-01 | Magic Leap, Inc. | Eye center of rotation determination, depth plane selection, and render camera positioning in display systems |
US11883104B2 (en) | 2018-01-17 | 2024-01-30 | Magic Leap, Inc. | Eye center of rotation determination, depth plane selection, and render camera positioning in display systems |
US11880033B2 (en) | 2018-01-17 | 2024-01-23 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US11290706B2 (en) | 2018-01-17 | 2022-03-29 | Magic Leap, Inc. | Display systems and methods for determining registration between a display and a user's eyes |
US11451901B2 (en) * | 2018-01-31 | 2022-09-20 | Bose Corporation | Eyeglass headphones |
US20190238971A1 (en) * | 2018-01-31 | 2019-08-01 | Bose Corporation | Eyeglass Headphones |
US10555071B2 (en) * | 2018-01-31 | 2020-02-04 | Bose Corporation | Eyeglass headphones |
US11662311B2 (en) | 2018-04-08 | 2023-05-30 | Lumus Ltd. | Optical sample characterization |
US11880043B2 (en) | 2018-07-24 | 2024-01-23 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
US11567336B2 (en) | 2018-07-24 | 2023-01-31 | Magic Leap, Inc. | Display systems and methods for determining registration between display and eyes of user |
US11768538B1 (en) | 2019-04-26 | 2023-09-26 | Apple Inc. | Wearable electronic device with physical interface |
US11106034B2 (en) * | 2019-05-07 | 2021-08-31 | Apple Inc. | Adjustment mechanism for head-mounted display |
US10873800B1 (en) * | 2019-05-17 | 2020-12-22 | Facebook Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11902735B2 (en) * | 2019-05-17 | 2024-02-13 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US20220408177A1 (en) * | 2019-05-17 | 2022-12-22 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11445288B2 (en) * | 2019-05-17 | 2022-09-13 | Meta Platforms Technologies, Llc | Artificial-reality devices with display-mounted transducers for audio playback |
US11729359B2 (en) | 2019-12-08 | 2023-08-15 | Lumus Ltd. | Optical systems with compact image projector |
US12019249B2 (en) | 2019-12-25 | 2024-06-25 | Lumus Ltd. | Optical systems and methods for eye tracking based on redirecting light from eye using an optical arrangement associated with a light-guide optical element |
US11747137B2 (en) | 2020-11-18 | 2023-09-05 | Lumus Ltd. | Optical-based validation of orientations of internal facets |
US12135430B2 (en) | 2021-05-19 | 2024-11-05 | Lumus Ltd. | Active optical engine |
US20220382382A1 (en) * | 2021-06-01 | 2022-12-01 | tooz technologies GmbH | Calling up a wake-up function and controlling a wearable device using tap gestures |
Also Published As
Publication number | Publication date |
---|---|
CN103988113A (en) | 2014-08-13 |
US9900676B2 (en) | 2018-02-20 |
WO2013013158A2 (en) | 2013-01-24 |
WO2013013158A3 (en) | 2013-04-18 |
US20160192048A1 (en) | 2016-06-30 |
CN103988113B (en) | 2017-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9900676B2 (en) | Wearable computing device with indirect bone-conduction speaker | |
US9210494B1 (en) | External vibration reduction in bone-conduction speaker | |
US9031273B2 (en) | Wearable computing device with behind-ear bone-conduction speaker | |
US20140064536A1 (en) | Thin Film Bone-Conduction Transducer for a Wearable Computing System | |
US9609412B2 (en) | Bone-conduction anvil and diaphragm | |
US9547175B2 (en) | Adaptive piezoelectric array for bone conduction receiver in wearable computers | |
US20160161748A1 (en) | Wearable computing device | |
US9456284B2 (en) | Dual-element MEMS microphone for mechanical vibration noise cancellation | |
US9100732B1 (en) | Hertzian dipole headphone speaker | |
US9002020B1 (en) | Bone-conduction transducer array for spatial audio | |
US9143848B2 (en) | Isolation of audio transducer | |
US8965012B1 (en) | Smart sensing bone conduction transducer | |
US9998817B1 (en) | On head detection by capacitive sensing BCT | |
US10734706B1 (en) | Antenna assembly utilizing space between a battery and a housing | |
US9525936B1 (en) | Wireless earbud communications using magnetic induction | |
US11675200B1 (en) | Antenna methods and systems for wearable devices |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GOOGLE INC., CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DONG, JIANCHUN;CHI, LIANG-YU TOM;HEINRICH, MITCHELL;AND OTHERS;SIGNING DATES FROM 20110907 TO 20110909;REEL/FRAME:027040/0777 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- AFTER EXAMINER'S ANSWER OR BOARD OF APPEALS DECISION |
AS | Assignment |
Owner name: GOOGLE LLC, CALIFORNIA Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044142/0357 Effective date: 20170929 |