CN118591344A - Method for integrating stretchable conductive textile traces and textile-type sensors into a woven structure
- Publication number: CN118591344A
- Application number: CN202380018664.8A
- Authority: CN (China)
- Legal status: Pending
Abstract
Example wearable devices that include an electrically conductive deformable fabric are described herein. The conductive deformable fabric has non-extensible, fixed-length conductive traces along a first axis, and the conductive traces are knitted into the fabric structure to create the conductive deformable material. The fabric structure includes a stitch pattern that allows the conductive trace to expand and collapse in an oscillating manner, so that the conductive trace can expand and contract along the first axis without exceeding the fixed length of the conductive trace. The conductive deformable material is positioned within the wearable device such that the stitch pattern is located over a joint of a user when the wearable device is worn, allowing the stitch pattern to stretch or contract with movement of the joint.
Description
Cross Reference to Related Applications
The present application claims priority to U.S. Provisional Application Ser. Nos. 63/485,878, 63/485,875, 63/485,880, and 63/485,882, each filed in 2023. Each of these applications is incorporated herein by reference in its entirety.
The present application also claims priority to U.S. Provisional Application Ser. No. 63/314,199, filed on February 25, 2022, which is incorporated herein by reference in its entirety (the contents of that provisional application are also included in Appendix A of the present application).
Technical Field
This patent relates generally to fabrics for use in wearable devices that include embedded electronics, and to corresponding manufacturing processes for such fabrics. The embedded electronics can be configured to provide input information about the wearer, and other information, to an artificial reality headset for interacting with an artificial reality environment. These fabrics are made using specialized hardware that creates lightweight, seamless materials that are comfortable to wear for extended periods of time.
Background
When interacting with an artificial reality environment viewed through an artificial reality headset, input devices and sensors are needed to interact with that environment. While controllers and other devices can be used to interact with these environments, they tend to reduce the sense of immersion in the artificial reality environment. Thus, there is a need for input apparatus that provides immersive interaction without detracting from the artificial reality environment. While glove-type wearable devices attempt to improve these interactions, traditional glove wearable devices can be large and cumbersome and can impede movement, which can also reduce the sense of immersion. For example, a glove wearable device can include multiple layers, one for each different subset of the components of the glove wearable device. Multiple layers can also prove uncomfortable during long-term use (e.g., when interacting with an artificial reality environment for extended periods).
Furthermore, integrating electronic components with a flexible wearable device can be challenging. As a result, some wearable devices use electronic components that are separately attached to, rather than integrated with or embedded in, the soft components of the wearable device. This can increase the bulk of the wearable device and can also lead to latency problems and other performance drawbacks.
Accordingly, there is a need to address one or more of the challenges described above. A brief overview of solutions to these problems is provided below.
Disclosure of Invention
The devices, methods, systems, and manufacturing processes described herein address one or more of the drawbacks or disadvantages described above by allowing wearable devices configured to interact with an artificial reality environment to be as lightweight and as comfortable as possible. The techniques described herein also allow some electronic devices (e.g., integrated circuits for detecting and/or processing inputs provided by a user) to be integrated directly into a fabric (e.g., by making the electronic components an integral part of the fabric), producing a more comfortable and lighter wearable device. Manufacturing these types of fabrics can also be difficult, particularly at scale, which is one reason the manufacturing methods described herein, which use multi-dimensional knitting machines, are beneficial in facilitating wider adoption and acceptance of artificial reality systems.
One example of a garment-integrated capacitive sensor that can be used to detect an input (e.g., to detect a force-based or contact-based input from a change in capacitance at the garment-integrated capacitive sensor) is described. The garment-integrated capacitive sensor includes a first knitted conductive electrode layer that is constructed using an insulated conductive fabric (e.g., the insulated conductive fabric can be made with a compressive/tensile core, such as an elastic fiber or thermoplastic polyurethane (TPU)), which achieves deformation at the yarn level and enhances the performance of the capacitive sensor. In some embodiments, wrapping a high-surface-area insulated conductor (e.g., enamel-coated copper foil) around the core can further improve sensor performance. In some embodiments, among pure copper, tin-copper alloys, and silver-copper alloys, silver-copper alloy wire/foil provides the most balanced performance when conductivity, cost, and fatigue resistance are considered. The first knitted conductive electrode layer has a first surface. The garment-integrated capacitive sensor also includes a second knitted conductive electrode layer constructed using a non-insulated conductive fabric and having a second surface configured to be in direct contact with the first surface to create the garment-integrated capacitive sensor. In some embodiments, the garment-integrated capacitive sensor is configured to communicate with a processor, and the processor receives sensed values from the garment-integrated capacitive sensor.
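For readers thinking about the sensing side, the following is a minimal sketch, not taken from this disclosure, of how a processor might convert raw capacitance readings from such a garment-integrated capacitive sensor into a contact and force estimate. The baseline value, calibration constant, and linear force model are illustrative assumptions.

    # Minimal sketch (illustrative assumptions, not the patent's algorithm).
    BASELINE_PF = 12.0   # assumed unloaded capacitance of one sensor, in picofarads
    K_FORCE = 0.85       # assumed linear calibration constant, newtons per picofarad

    def detect_contact(raw_capacitance_pf, threshold_pf=0.5):
        """Return (is_contact, estimated_force_N) for one capacitance reading."""
        delta = raw_capacitance_pf - BASELINE_PF
        if delta < threshold_pf:
            return False, 0.0
        # Pressing the fingertip compresses the deformable knit between the two
        # electrode layers, which raises capacitance; a simple linear model maps
        # that increase to an estimated force.
        return True, K_FORCE * delta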
Having summarized the first aspect generally associated with the use of a garment-integrated capacitive sensor that can be used to detect inputs, the second aspect generally associated with a method of manufacturing a woven fabric comprising a non-woven structure is now summarized.
One example of a method of manufacturing a woven fabric that includes a non-woven structure includes, while knitting a fabric structure according to a programmed knitting sequence for a V-bed knitting machine (or any other suitable multi-dimensional knitting machine): providing the non-woven structure to the V-bed knitting machine at a point in time at which the fabric structure has a first knitted portion. The first knitted portion is formed based on a first type of stitch pattern, and, after the non-woven structure is provided, the method includes automatically adjusting the V-bed knitting machine, following the programmed knitting sequence, to use a second type of stitch pattern that is different from the first type of stitch pattern in order to accommodate the non-woven structure within a second knitted portion adjacent to the first knitted portion within the fabric structure.
Having summarized the second aspect, generally associated with methods of making woven fabrics that include the above-described non-woven structures, the third aspect, generally associated with knitting dual-density fabrics that include overmolded structures, is now summarized.
In an example method of knitting a dual-density fabric, the method includes, while knitting a fabric structure according to a programmed knitting sequence for a V-bed knitting machine (or other multi-dimensional knitting machine): knitting a first portion of the fabric structure having a first fabric density to include a three-dimensional pocket, and, based on automated adjustment of the V-bed knitting machine according to the programmed knitting sequence, knitting a second portion of the fabric structure, adjacent to the first portion within the fabric structure, having a second fabric density different from the first fabric density. In some embodiments, the second portion is knitted first. For example, the second portion of the fabric structure having the second fabric density is knitted, and the V-bed knitting machine is automatically adjusted based on the programmed knitting sequence to knit the first portion of the fabric structure, adjacent to the second portion within the fabric structure, to include a three-dimensional pocket having a first fabric density different from the second fabric density. The method further includes overmolding a polymeric structure into the three-dimensional pocket, wherein the second portion of the fabric structure is temporarily secured to a device configured to attach the overmolded structure into the three-dimensional pocket. The method further includes removing the second portion of the fabric structure.
Having summarized the third aspect generally associated with the use of dual density fabrics comprising an overmolded structure, the fourth aspect generally associated with wearable devices comprising an electrically conductive deformable fabric is now summarized.
An example wearable device includes an electrically conductive deformable fabric, and the electrically conductive deformable fabric includes a conductive trace that has a fixed, non-extensible length along a first axis. The conductive trace is knitted into the fabric structure to create a conductive deformable material. The fabric structure includes a stitch pattern that allows the conductive trace to expand and collapse in an oscillating manner, so that the conductive trace can expand and contract along the first axis without exceeding the fixed length of the conductive trace (or substantially without exceeding the fixed length, such that the conductive trace is not subjected to tensile or torsional forces), and the conductive deformable material is positioned within the wearable device such that the stitch pattern is positioned over a joint of a user when the wearable device is worn, allowing the stitch pattern to expand or contract with movement of the joint.
The description provided herein focuses on a glove-type wearable device that can be used to control an artificial reality environment, but those skilled in the art will appreciate upon reading this disclosure that many other wearable devices will benefit from the techniques described herein, including other articles of apparel (headbands, shirts, jerseys, athletic pants, socks, etc.). Those skilled in the art will also appreciate upon reading this disclosure that, while the primary example used in connection with the manufacturing or assembly processes is a V-bed knitting machine, the techniques described herein can be applied to any multi-dimensional knitting machine.
The features and advantages described in the specification are not necessarily all inclusive, and certain additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes.
Having summarized the above example aspects, a brief description of the drawings will now be given.
Drawings
For a better understanding of the various embodiments described, reference should be made to the following detailed description taken in conjunction with the accompanying drawings in which like reference numerals refer to corresponding parts throughout the drawings.
Fig. 1A-1E illustrate a knitted wearable glove device according to some embodiments that includes one or more garment-integrated capacitive sensors (e.g., that can be configured to detect force-based and contact-based inputs from a user's finger, and can do so in various quadrants to detect this input more finely).
Fig. 2 illustrates a multi-dimensional knitting machine configured to produce multi-dimensionally knitted garments in an automated manner (e.g., without requiring any hand knitting or other user intervention after a knitting process is initiated, including for electronic components that are automatically knitted into the multi-dimensionally knitted garment as integrated components), according to some embodiments.
Fig. 3A illustrates a sequence of knitting a knitted wearable structure (e.g., glove) along a vertical axis, according to some embodiments.
Fig. 3B illustrates a sequence of knitting a knitted wearable structure (e.g., another glove) along a horizontal axis, according to some embodiments.
Fig. 4 illustrates a non-woven structure being inserted into a multi-dimensional knitting machine while a knitted structure is being knitted (e.g., in an automated fashion such that no user intervention is required to integrate the non-woven structure after the knitting sequence is initiated), according to some embodiments.
Fig. 5A and 5B illustrate a woven structure having a non-woven structure with a first woven portion surrounding the non-woven structure and a second woven portion surrounding the first woven portion, according to some embodiments.
Fig. 6A-6B illustrate a first type of stitch pattern (e.g., a plain stitch (jersey stitch) pattern) for accommodating a conductive trace, according to some embodiments.
Fig. 6C-6D illustrate a second type of stitch pattern (e.g., different from the plain stitch pattern depicted and described with reference to fig. 6A-6B) for accommodating conductive traces, according to some embodiments.
Fig. 6E illustrates another example of a stitch pattern in which the stitch spacing (i.e., the stitch gauge) is adjusted to adjust the stretch properties of the resulting fabric, according to some embodiments.
Fig. 6F illustrates an example of a fabric including larger pitch knit stitches 622 (e.g., larger pitch knit plain stitches) that allow for accommodating additional stretch characteristics, according to some embodiments.
Fig. 6G illustrates that the conductive yarn according to some embodiments can be stitched in a vertical direction, as opposed to a horizontal direction for stitching the conductive yarn as described with reference to the examples of fig. 6A-6F.
Fig. 6H illustrates that the conductive yarn 626 (shaded) according to some embodiments can be woven in another manner than plain stitch.
Fig. 7A-7G illustrate a sequence for producing a portion of an actuator configured to be placed at a fingertip, according to some embodiments.
Fig. 8A-8B illustrate a fabric structure including one or more portions made of an electrically conductive deformable fabric and favorable strain characteristics accommodated by the fabric structure, according to some embodiments.
Fig. 9A-9C illustrate a fabric structure comprising one or more portions made of an electrically conductive deformable fabric, and configured to have biaxial stretching with favorable strain characteristics as illustrated by each of the graphs in fig. 9A-9C, according to some embodiments.
Fig. 10A illustrates two views of a woven fabric including a solid weave that can be configured to accommodate one or more non-woven structures, according to some embodiments.
Fig. 10B illustrates an embodiment in which multiple solid portions are placed on a single braided structure, according to some embodiments.
FIG. 11 illustrates a flow chart of a method for detecting a force received at a garment, according to some embodiments.
Fig. 12 illustrates a process flow diagram for manufacturing a woven fabric including a non-woven structure, according to some embodiments.
Fig. 13 illustrates a process flow diagram for knitting a dual density fabric including an overmolded structure in accordance with some embodiments.
Fig. 14A-14E illustrate an exemplary wrist-wearable device, according to some embodiments.
Fig. 15A-15B illustrate an example AR system that can be controlled using a knitted structure (e.g., a wearable glove or other wearable structure formed according to the knitting techniques described herein), according to some embodiments.
Fig. 16A and 16B are block diagrams illustrating an exemplary artificial reality system according to some embodiments.
Fig. 17 is a schematic diagram illustrating additional components (e.g., additional components that allow for the use of aspects of the braided structures described herein to provide haptic feedback) that can be used with the artificial reality systems of fig. 16A and 16B, according to some embodiments.
The various features shown in the drawings may not be drawn to scale in accordance with common practice. Thus, the dimensions of the various features may be arbitrarily expanded or reduced for clarity. Moreover, some of the figures may not depict all of the components of a given system, method, or apparatus. Finally, the same reference numerals may be used throughout the description and the drawings to designate the same features.
Brief Description of the Appendix Figures
The specification is accompanied by Appendix A, which includes figures and related descriptive text concerning conductive yarns (and woven fabrics formed in part using conductive yarns or other yarns), making electrical connections with textile electrodes, and laser cutting certain fabrics (and other manufacturing processes). These aspects can be combined with, substituted for, or otherwise integrated with other aspects described herein.
Detailed Description
Numerous details are described herein to provide a thorough understanding of the example embodiments shown in the drawings. However, embodiments may be practiced without many specific details. In addition, well-known processes, components, and materials need not be described in detail in order to avoid obscuring aspects of the embodiments described herein.
Embodiments of the present disclosure can include or incorporate various types or embodiments of artificial reality systems. As described herein, an artificial reality is any overlaid functional and/or sensorially detectable presentation provided by an artificial reality system within a user's physical environment. Such artificial realities (AR) can include and/or present virtual reality (VR), augmented reality, mixed artificial reality (MAR), or some variant and/or combination thereof. For example, the user can perform an in-air gesture (which can be detected using aspects of the knitted structures described herein) such that a song being played at, for example, a home speaker is skipped via a song-providing API. In some implementations of the AR system, ambient light (e.g., a live feed of the surrounding environment that the user would otherwise see) can pass through display elements of the respective head-wearable device that presents aspects of the AR system. In some implementations, ambient light can pass through respective aspects of the AR system. For example, a visual user interface element (e.g., a notification user interface element) can be presented at the head-wearable device, and an amount of ambient light (e.g., 15% to 50% of the ambient light) can pass through the user interface element such that the user can distinguish at least a portion of the physical environment over which the user interface element is displayed.
The artificial reality content can include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content can include video, audio, haptic events, or some combination thereof, any of which can be presented in a single channel or in multiple channels (such as stereoscopic video that produces a three-dimensional effect for a viewer). Further, in some embodiments, the artificial reality can also be associated with applications, products, accessories, services, or some combination thereof that are used, for example, to create content in the artificial reality and/or are otherwise used in the artificial reality (e.g., to perform activities in the artificial reality).
As described herein, multi-dimensional knitting machines can be leveraged to produce complex knitted structures, including integrating non-knitted structures, adjusting stitch patterns and spacing without producing seams, producing complex garments (e.g., gloves) without requiring the garments to be reoriented, and the like. Although much of the description provided herein refers to knitted fabric structures produced using yarns, the same techniques applied to these knitted fabric structures can also be applied to woven-fabric structures.
Fig. 1A-1E illustrate a knitted wearable glove device, according to some embodiments, that includes one or more garment-integrated capacitive sensors (e.g., sensors that can be configured to detect force-based and contact-based inputs from a finger of a user 118, and that can do so in various quadrants to detect this input more finely). The knitted wearable glove device 100 includes one or more garment-integrated capacitive sensor assemblies 102A-102E in each respective fingertip of the knitted wearable glove device (and in the thumb tip; hereinafter, "fingers" and "fingertips" also refer to the thumb and thumb tip). Each of the one or more garment-integrated capacitive sensor assemblies 102A-102E includes a plurality of contact areas on the respective garment-integrated capacitive sensor (e.g., each respective capacitive sensor includes four different contact quadrants, although various other numbers of contact areas are also contemplated). For example, fig. 1A shows an exploded view 104 showing a garment-integrated capacitive sensor assembly 102D having four different garment-integrated capacitive sensor contact areas 106A-106D, where each respective contact area of each respective garment-integrated capacitive sensor can be used to measure a capacitance value. These garment-integrated capacitive sensors can be used to determine a finely applied force (e.g., a finger rolling on a surface), which can be used to provide input to control an artificial reality environment.
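Because each fingertip sensor reports several contact areas (four quadrants in the example above), a downstream processor could combine the per-quadrant readings into a single contact position, for instance to track a finger rolling on a surface. The sketch below is a plain quadrant-detector calculation offered only as an illustration; the disclosure does not specify this computation, and the quadrant naming is an assumption.

    # Illustrative only: estimate a normalized contact position from the change in
    # capacitance at the four contact areas of one fingertip sensor.
    def contact_centroid(q_tl, q_tr, q_bl, q_br):
        """Return (x, y) in [-1, 1] from top-left, top-right, bottom-left, and
        bottom-right capacitance deltas, or None when no contact is detected."""
        total = q_tl + q_tr + q_bl + q_br
        if total <= 0:
            return None
        x = ((q_tr + q_br) - (q_tl + q_bl)) / total   # positive x toward the right quadrants
        y = ((q_tl + q_tr) - (q_bl + q_br)) / total   # positive y toward the top quadrants
        return x, y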
As will be explained in further detail in connection with the description of subsequent figures, the one or more garment-integrated capacitive sensor assemblies 102A-102E in each respective fingertip are seamlessly integrated with the knitted wearable glove device 100. This seamless nature is shown in exploded view 104, which shows that the one or more garment-integrated capacitive sensor assemblies are each constructed from two knitted layers. The first knitted conductive electrode layer 108 is constructed using an insulated conductive fabric, and the second knitted conductive electrode layer 110 is constructed using a non-insulated conductive fabric. When combined, the first knitted conductive electrode layer 108 is configured to be in direct contact with the second knitted conductive electrode layer 110 to create a garment-integrated capacitive sensor. Although in the example embodiment of fig. 1A the second knitted conductive electrode layer 110 is shown as the outer layer (i.e., on the exterior of glove 100), in some other embodiments the opposite arrangement is also possible (i.e., with the insulated layer as the outer layer).
Turning now to fig. 1B, a pair of knitted wearable glove devices 100 is shown, with one glove shown from a perspective depicting the palmar side of the knitted wearable glove device 100 and the other glove shown from a perspective depicting the dorsal side of the knitted wearable glove device 100. The first glove 112, discussed above with reference to fig. 1A, shows the palmar side. The second glove 114 is now described to illustrate the dorsal side of the hand. The second glove 114, shown from the back of the hand, has additional garment-integrated capacitive sensors 116A-116L disposed on one of its surfaces. As shown in this example, the capacitive sensors can be located near (e.g., within 0.2-5 mm of, or directly on) a user's joints (e.g., the knuckles) such that the capacitive sensors can be used to measure bending/stretching occurring at or near the joints. As will be discussed further below, the knitted wearable glove device 100 (including the garment-integrated capacitive sensors) can be produced by a multi-dimensional knitting machine (e.g., a V-bed knitting machine or an X-bed knitting machine), allowing the knitted wearable glove device 100, including the one or more garment-integrated capacitive sensors, to be produced in a single knitting process (e.g., without requiring the wearable glove to be removed from the knitting machine to be reoriented or completed). In some embodiments, additional garment-integrated capacitive sensors are located on the thumb of the glove, as well as on the palm and wrist, to provide additional input areas or sensor detection areas, allowing further flexibility in interacting with the artificial reality environment.
In one example, a soft capacitive sensor integrated with the glove of fig. 1A and 1B can be used to aid in typing operations. This is illustrated in fig. 1C and 1D, which illustrate examples of using data provided by one or more garment-integrated capacitive sensors to provide input to a user interface presented in an artificial reality environment. Fig. 1C shows a user 118 wearing knitted wearable glove device 100 pressing down on a surface 120 (e.g., a table). One or more garment-integrated capacitive sensor assemblies 102A-102E provide data (e.g., corresponding capacitance measurements resulting from a user pressing down on surface 120, which can be divided into individual contact areas of each of the capacitive sensors, as described above) that indicates that force is being applied to fingertips 122 of knitted wearable glove device 100. The measured data can then be used to calculate a force value, as shown by curve 123 in graph 121, which illustrates the detected force received in response to a touch event. Fig. 1C also shows that input on virtual keys of virtual keyboard 124 is provided to a virtual display (e.g., indicated by the displayed letter "H" 125) in response to a force being applied to fingertips 122 of knitted wearable glove device 100.
Fig. 1D shows that user 118 wearing knitted wearable glove device 100 is no longer pressing down on surface 120 (e.g., a table). In response to no longer pressing down on surface 120, one or more garment-integrated capacitive sensor assemblies 102A-102E provide data (e.g., corresponding capacitance measurements) indicating that a force is not being applied to fingertips 122 of knitted wearable glove device 100. The measured data can then be used to calculate a force value, as shown in graph 126, which illustrates the detected force received in response to the touch event ending. Thus, the detected force is now less than the calculated force shown in graph 121 in fig. 1C.
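One way to turn the force curves of figs. 1C and 1D into discrete keyboard events is a simple threshold with hysteresis, sketched below. The thresholds and the state-machine structure are assumptions for illustration; the disclosure only states that a force value is calculated and used to provide input.

    # Illustrative press/release detection with hysteresis (assumed thresholds).
    PRESS_N = 1.5     # force above this registers a press, in newtons
    RELEASE_N = 0.5   # force below this registers a release, in newtons

    def update_key_state(is_pressed, force_n):
        """Return (new_is_pressed, event) where event is 'press', 'release', or None."""
        if not is_pressed and force_n >= PRESS_N:
            return True, "press"      # e.g., emit the virtual key under this fingertip
        if is_pressed and force_n <= RELEASE_N:
            return False, "release"
        return is_pressed, None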
Fig. 1E illustrates a lock-and-key knitting technique that can be used to increase the contact surface area of one or more adjacent garment-integrated capacitive sensor assemblies (e.g., 102A-102E shown in fig. 1A). Fig. 1E shows two different garment-integrated capacitive sensor assemblies (e.g., a first garment-integrated capacitive sensor 128 and a second garment-integrated capacitive sensor 130), wherein the first garment-integrated capacitive sensor 128 terminates in a first pattern 132 (e.g., a key) and the second garment-integrated capacitive sensor 130 terminates in a second pattern 134 (e.g., a lock) that corresponds to the first pattern (e.g., a pattern complementary to the first pattern). Although described in this example as being used to knit two sensor assemblies together, it should be understood that the lock-and-key knitting technique can also be used to knit two contact areas together within a single sensor assembly.
Fig. 1E also shows the first garment-integrated capacitive sensor 128 and the second garment-integrated capacitive sensor 130 joined together in a combined garment-integrated capacitive sensor assembly 136. In some embodiments, such a lock-and-key structure improves knitting resolution by taking advantage of the three-dimensional nature of knitting. In some embodiments, such a lock-and-key structure increases the contact surface area between the first knitted conductive electrode layer 108 and the second knitted conductive electrode layer 110 in the one or more garment-integrated capacitive sensor assemblies 102A-102E.
Attention is now directed to fig. 2, which illustrates a multi-dimensional knitting machine configured to produce multi-dimensionally knitted garments in an automated manner (e.g., without requiring any manual knitting or other user intervention after the knitting process is initiated, including for electronic components that are automatically knitted into the multi-dimensionally knitted garment as integrated components), according to some embodiments. Multi-dimensional knitting machine 200 is a garment-production device that is computer controlled and user programmable to allow production of complex knitted structures (e.g., gloves, tubular fabrics, fabrics with embedded electronics, complex knit patterns, special stretch characteristics, unique pattern structures, multi-thread structures, etc.). Multi-dimensional knitting machine 200 includes a first-axis needle bed 202, a second-axis needle bed 208, and an nth-axis needle bed (indicating that additional needle beds are possible). Each of these needle beds (e.g., using needles 204, 210, and 218) is configured to use a plurality of different types of knitting patterns (e.g., plain knits, rib knits, interlock knits, French terry knits, fleece knits, etc.) based on a programmed sequence provided to multi-dimensional knitting machine 200, and variations of these knits can be used to form a single continuous garment (e.g., a combination of a plain knit and a French terry knit, and/or a first variation of a plain knit and a second variation of a plain knit). In some embodiments, these knit variations can be accomplished without creating seams in the single continuous garment (e.g., a seamless wearable device can be created). In some embodiments, the knitting machine is further configured to laminate fabrics to create a multi-layered wearable structure (e.g., one housing one or more electronic components). In some embodiments, each of the multiple layers of the wearable structure can be made of a different fabric, which in one example is produced using conductive yarns. For example, a double-layer knitted capacitive sensor can be produced using multi-dimensional knitting machine 200, wherein the first and second layers use different yarns (e.g., coated and uncoated conductive yarns). Multiple fabric spools (e.g., fabric spool 204, fabric spool 212, and fabric spool 220) can be included for each of the needle beds. Multiple types of fabric spools can be used for each needle bed to allow even more complex knitted structures (also referred to as garments) to be produced. In some embodiments, the fabric spools can also include elastic strands that allow for the production of stretchable fabrics and/or fabrics with shape memory.
Each of the needle beds discussed above can also include one or more non-woven insert assemblies (e.g., non-woven insert assembly 206, non-woven insert assembly 214, and non-woven insert assembly 222) configured to allow a non-woven structure to be inserted into the needle bed such that the non-woven structure can be integrated into the knitted structure while the knitted structure (e.g., garment) is being produced. For example, the non-woven structure can include a flexible printed circuit board, a rigid circuit board, conductive wires, structural ribs, sensors (e.g., a neuromuscular-signal sensor, a light sensor, a PPG sensor, etc.), and the like. In some embodiments, stitch patterns can be adjusted by the multi-dimensional knitting machine (e.g., according to a programmed sequence of knitting instructions provided to the machine) to accommodate these structures, which in some embodiments means that these structures are knitted into the fabric rather than sewn on top of the knitted fabric. This allows the garment to be lighter, thinner, and more comfortable to wear (e.g., by having fewer protrusions that can exert uneven pressure on the wearer's skin). In some embodiments, these multi-dimensional knitting machines are also capable of knitting a knitted structure along one or both of a vertical axis and a horizontal axis, depending on the desired characteristics of the knitted structure. Knitting along a horizontal axis means that the garment will be produced from left to right (e.g., a glove will be produced starting with the little finger, then moving to the ring finger, then moving to the middle finger, etc., as shown in the exemplary sequence of fig. 3B). Knitting along a vertical axis means that the garment is produced in a top-down fashion (e.g., a glove will be produced starting from the top of the tallest finger and moving down to the wrist portion of the glove, as shown at 228 in fig. 2). With respect to the glove example, the reverse manufacturing orders (e.g., knitting the thumb first when knitting horizontally, and knitting the wrist portion first when knitting vertically) are also contemplated. In some embodiments, the insert assembly feeds the non-woven structure into the knitting machine; in some other embodiments, the insert assembly, carrying the non-woven structure, is itself fed through the knitting machine. In the latter case, the insert assembly is not integrated into the garment and is discarded. In some embodiments, the insertion component is not fed at all, but rather is an integrated component of the multi-dimensional knitting machine that is activated based on a programmed knitting sequence, allowing the non-woven component to be inserted into the knitted structure.
Multi-dimensional knitting machine 200 also includes a knitting logic module 224, which is a user-programmable module that allows a user (e.g., a manufacturing entity that mass produces wearable structures) to define a knitting sequence to produce a garment using any of the materials, stitch patterns, knitting techniques, etc., described above. As described above, knitting logic module 224 allows any of the techniques described above to be combined seamlessly, allowing unique, complex knitted structures to be produced in a single knitting sequence (e.g., a user need not remove the knitted structure and then reinsert and reorient it to complete knitting of the knitted structure). Multi-dimensional knitting machine 200 also includes an insertion logic module 226 that cooperates with knitting logic module 224 to allow a non-knitted component to be seamlessly inserted into a knitted structure while the knitted structure is being knitted. The insertion logic module communicates with the knitting logic module to allow the stitch pattern to be adjusted based on the location of the inserted non-woven structure. In some embodiments, the user need only indicate where the non-woven structure is to be inserted in their design mock-up (e.g., at a user interface associated with the multi-dimensional knitting machine that allows a programmed knitting sequence to be created and edited), and knitting logic module 224 and insertion logic module 226 work together automatically to allow the knitted structure to be produced.
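As one way to picture how the knitting logic module and the insertion logic module might cooperate, the sketch below represents a programmed knitting sequence as an ordered list of segments, with an insertion entry that the insertion logic acts on when the corresponding rows are reached. The field names and the machine interface are hypothetical; they are not the machine's actual programming format.

    # Hypothetical representation of a programmed knitting sequence (illustration only).
    knitting_sequence = [
        {"rows": (0, 39),   "stitch": "jersey",       "gauge": 14},
        {"rows": (40, 43),  "stitch": "jersey_loose", "gauge": 10,
         "insert": {"component": "flexible_pcb", "bed": "front"}},  # looser stitches around the insert
        {"rows": (44, 120), "stitch": "jersey",       "gauge": 14},
    ]

    def run(sequence, machine):
        """Drive a hypothetical machine object through the sequence."""
        for segment in sequence:
            machine.set_stitch(segment["stitch"], segment["gauge"])   # knitting logic
            if "insert" in segment:
                machine.feed_nonwoven(**segment["insert"])            # insertion logic
            machine.knit_rows(*segment["rows"])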
Fig. 3A illustrates a sequence of knitting a knitted wearable structure (e.g., a glove) along a vertical axis, according to some embodiments. Knitting along a vertical axis means that the garment is produced in a top-down fashion (e.g., the glove will be produced starting from the top of the tallest finger and moving down to the wrist portion of the glove). For some knitted wearable structures, it is desirable to knit the structure vertically. For certain three-dimensional knitted structures (e.g., pockets with certain openings), it can be desirable to knit along a particular axis. Fig. 3A shows a sequence 300 with three snapshots (302A-302C) over time of knitting along a vertical axis.
Fig. 3B illustrates a sequence of knitting a knitted wearable structure (e.g., another glove) along a horizontal axis, according to some embodiments. Knitting along a horizontal axis means that the garment is produced from the left side to the right side (e.g., the glove will be produced starting with the little finger, then moving to the ring finger, then moving to the middle finger). For some knitted wearable structures, it is desirable to knit the structure horizontally rather than vertically. Fig. 3B shows a sequence 304 with two snapshots (306A-306B) over time of knitting along a horizontal axis. It should be appreciated that certain multi-dimensional knitting machines can be programmed to combine horizontal knitting and vertical knitting, even for a single wearable structure, such that certain aspects of a wearable glove (e.g., a three-dimensional pocket) can be knitted horizontally and other aspects of the wearable glove (e.g., a non-woven structure, such as a printed circuit board) can be knitted vertically.
Fig. 4 illustrates insertion of a non-woven structure into a multi-dimensional knitting machine while a knitted structure is being knitted (e.g., in an automated fashion, such that no user intervention is required to integrate the non-woven structure after the knitting sequence is initiated), according to some embodiments. Fig. 4 shows a schematic view similar to fig. 2, showing an insert assembly 402 (analogous to insert assemblies 206, 214, and 222 described with reference to fig. 2) configured to operate with a multi-dimensional knitting machine 400 (analogous to multi-dimensional knitting machine 200 discussed with reference to fig. 2).
Fig. 4 shows a sequence illustrating how a non-woven structure 405 can be inserted into the multi-dimensional knitting machine while a knitted fabric (e.g., glove 408) is being knitted. In a first pane 406A, indicating a first point in time, the fingertips of glove 408 are being produced, for example knitted along a vertical axis. The second pane 406B, indicating a second point in time, shows the knitted structure 405 being knitted, and also shows non-woven structures 408A and 408B (which in this example may be separate conductive traces or printed circuit boards that can be used to route data from sensor lines, such as the one or more soft capacitive sensors previously described as an example) being knitted into (i.e., inserted into) the knitted structure 405. The second pane 406B also shows that, in some embodiments, the yarns of the knitted structure 405 are knitted alternately above and below the non-woven structures to ensure that the non-woven structures 408A and 408B are integrated into a single layer of fabric. The third pane 406C, indicating a third point in time, shows continued knitting of the knitted structure 405 and continued knitting of the non-woven structures 408A and 408B into the knitted structure 405. Ultimately, multi-dimensional knitting machine 400, along with insert assembly 402, produces a complete glove 408 (e.g., a three-dimensional glove) with embedded non-woven structures 408A and 408B. In some embodiments, non-woven structures 408A and 408B can include cutouts that allow yarns to pass through, thereby further securing non-woven structures 408A and 408B to the knitted fabric 405.
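The alternating above/below capture of the non-woven strips described for pane 406B can be thought of as an interlacing schedule across the needles that cross each strip. The helper below, which is purely an illustrative assumption and not part of the disclosure, generates such a schedule.

    # Illustrative only: alternate which yarns pass over and under an inserted strip
    # so it is captured within a single fabric layer rather than floating on one face.
    def interlace_schedule(num_needles, period=2):
        """Return 'over' or 'under' for each needle position crossing the insert."""
        return ["over" if (n // period) % 2 == 0 else "under" for n in range(num_needles)]

    # interlace_schedule(8) -> ['over', 'over', 'under', 'under', 'over', 'over', 'under', 'under']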
Fig. 4 also illustrates how multi-dimensional knitting machine 200 can adjust the stitch pattern to a different stitch pattern, according to some embodiments, while still allowing the non-knitted structure to be integrated into the knitted structure. Fig. 4 shows, in a fourth pane 406D indicating a fourth point in time, that a second stitch pattern 412 can be switched to mid-knit without a seam between the two stitch patterns (e.g., the stitch pattern can be changed while still producing a seamless knit). Changing the stitch pattern in the middle of knitting can be beneficial to accommodate the different bending requirements of the wearable structure (e.g., a location on the glove corresponding to a joint can require a different stitch pattern, one that accommodates more movement, than a location corresponding to the phalanges). In some embodiments, the first stitch pattern 414 is a denser (e.g., tighter) knit (e.g., a higher number of individual stitches per unit of area) than the second stitch pattern 416, with the looser pattern accommodating additional movement (e.g., bending of the user's joint). In some embodiments, the non-woven structure 405 is constructed of a different material than the fabric structure. For example, the non-woven structure can be a printed circuit board, a wire, a strand, a semi-rigid support, or the like.
Fig. 5A and 5B illustrate a woven structure having a non-woven structure, with a first woven portion surrounding the non-woven structure and a second woven portion surrounding the first woven portion, according to some embodiments. Fig. 5A shows a structure 500 comprising a non-woven structure 502 and a woven structure 501. In the example depicted in figs. 5A-5B, the non-woven structure 502 does not stretch the way the woven structure 501 does, so a technique is needed that allows the woven structure to stretch without damaging the non-woven structure 502. Fig. 5A shows that the woven structure 501 straightens (uncoils) the non-woven structure 502 as the woven structure 501 is stretched, which effectively allows the non-woven structure 502 to match, rather than interfere with, the stretching of the woven structure 501. Fig. 5A shows a first woven portion 506 having a first stitch pattern and a second woven portion 508 having a second stitch pattern (different from the first stitch pattern) that accommodates the non-woven structure 502. When combined with the non-woven structure 502, the second woven portion 508 is configured to stretch in substantially the same manner as the first woven portion 506. For example, a looser stitch pattern may be used in the second woven portion 508 to compensate for the reduced stretch capability created by the non-woven structure 502. So that the non-woven structure 502 is not unduly stressed when stretched, the non-woven structure 502 can be oversized for a given area and placed in a serpentine (meandering) pattern. The meandering pattern allows the non-woven structure 502 to move as the fabric is stretched without undue stress/strain being applied to the non-woven structure (e.g., such that the maximum stretched length equals the length of the non-woven structure when laid straight). When the non-woven structure is used for sensing purposes, excessive stress and/or strain on the non-woven structure can damage its components or interfere with accurate measurements.
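As a rough, back-of-the-envelope illustration (an assumption for explanatory purposes, not a calculation from this disclosure): if the non-woven structure has straightened length L and is laid in a serpentine over a relaxed fabric span x_0, the fabric can stretch until the structure is pulled straight, so the largest strain the structure can follow without carrying load is approximately

    \varepsilon_{\max} \approx \frac{L - x_0}{x_0}

For example, a 60 mm trace meandered over a 40 mm relaxed span can follow roughly (60 - 40)/40 = 50% fabric strain before it would begin to experience tension.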
Fig. 5B illustrates the structure 500 described with reference to fig. 5A in a stretched state (e.g., as indicated by opposing arrows 510A and 510B), showing the woven structure 501 stretched and the non-woven structure 502 in its straightened state. Fig. 5B also shows that the first woven portion 506 and the second woven portion 508 are stretched in a horizontal direction. Although stretching in one direction is shown, in some embodiments, the fabric can also be configured for bi-directional stretching.
Fig. 6A-6B illustrate a first type of stitch pattern (e.g., a plain (jersey) stitch pattern) for accommodating conductive traces, according to some embodiments. In some embodiments, the conductive traces can be knitted in as yarns that mimic the adjacent yarns in the fabric. In some embodiments, such yarns can be any of the conductive yarns described in Appendix A, including the yarns shown and described with reference to figs. 3-7 of Appendix A. This alternative approach provides another way of producing a seamless textile structure that includes one or more electronic components. Fig. 6A illustrates a knitting technique 600 for producing a plain-stitch fabric. Fig. 6A also shows that the conductive yarn 602 has a different appearance than the surrounding yarn 604, i.e., it is visually distinguished from the surrounding yarn. However, in some embodiments, the conductive yarns can have the same appearance. The needle 605 shown in fig. 6A (and in fig. 6C) corresponds to the needles 204, 210, and 218 described with reference to fig. 2. Fig. 6B shows a fabric 606 constructed using single-needle plain stitches that include conductive yarn 602 and surrounding yarn 604.
Fig. 6C-6D illustrate a second type of stitch pattern (e.g., different from the plain stitch pattern depicted and described with reference to fig. 6A-6B) that allows for receiving conductive traces according to some embodiments. In some embodiments, the conductive traces can be knitted in as yarns that mimic the adjacent yarns in the fabric. This alternative approach provides another way of producing a seamless textile structure comprising one or more electronic components. Fig. 6C shows a knitting technique 608 for producing a modified plain stitch fabric. Fig. 6C also shows that conductive yarn 610 has a different appearance than surrounding yarn 612, i.e., it is distinguishable from the surrounding yarn. However, in some embodiments, the conductive yarns can have the same appearance. Fig. 6C also shows some of the needles 605 knitting conductive yarn 610, while other needles 605 knit surrounding yarn 612. Fig. 6D shows a fabric 614 constructed using the modified plain stitch, including conductive yarn 602 and surrounding yarn 604. The modified plain stitch can allow for additional stretch (i.e., having different stitches around the conductive yarn can improve the overall stretchability of the fabric 614).
Fig. 6E illustrates another example of a stitch pattern that adjusts stitch spacing to adjust the stretch properties of the resulting fabric, according to some embodiments. Fig. 6E shows three different stitch sizes (e.g., small stitch spacing 616, medium stitch spacing 618, large stitch spacing 620, etc.). For example, the stitch pitch can be as high as 18 or more. In some embodiments, different stitch pitches may be used in the same garment, depending on the requirements of the area to be stitched (e.g., high-motion areas (e.g., joints) may require a large pitch to allow greater stretch than low-activity areas).
Fig. 6F illustrates an example of a fabric including larger-pitch knit stitches 622 (e.g., larger-pitch plain stitches) that accommodate additional stretch, according to some embodiments. The larger pitch is apparent when compared to the plain stitch shown in fig. 6B. Although a few stitch pitches have been described, any pitch can be used based on the stretch requirements of the garment. Although plain stitches have primarily been shown, other stitches such as those mentioned above with reference to fig. 2 can be used. In some embodiments, the knitted fabric of appendix a, such as the fabrics described with reference to figs. 3-7 of appendix a, can be formed using such larger-pitch plain stitches.
Fig. 6G shows that the conductive yarn 624 can be stitched in a vertical direction (e.g., in a wale direction instead of a weft direction), as opposed to the horizontal direction used for stitching the conductive yarn in the examples of fig. 6A-6F. In some embodiments, there can be both vertical and horizontal stitching, depending on the requirements of the garment. In some embodiments, the conductive yarns are coated such that the conductive yarns can contact each other without interfering with their respective signals. Fig. 6H illustrates that the conductive yarn 626 (shaded) can be knitted in a manner other than plain stitch, according to some embodiments.
Fig. 7A-7G illustrate a sequence for producing a portion of an actuator configured to be placed at a fingertip, according to some embodiments. Fig. 7A to 7C show the progress of the woven structure produced over time. The woven structure 700 being produced is made up of two different fabric components. The first fabric component 702 has a first weave pattern and, in the depicted example, is the desired finished fabric product. In some embodiments, the first fabric component 702 is also stitched in a manner that allows for the production of a three-dimensional pocket (i.e., one that better forms around the user's fingertip when completed). The second fabric assembly 704 can be a temporary piece configured to be removed at a later point in production. The second fabric assembly 704 can be used primarily as a guide during the overmolding step, as will be described in more detail later.
Fig. 7B better illustrates the second fabric assembly 704, with its stitch pattern changed in certain locations to add guide holes 706A-706D for aligning the woven structure in the overmold machine so that the overmolded structure is consistently placed in the correct location. Fig. 7C further illustrates that the knitting process continues in a manner that adds more guide holes as the woven structure 700 continues to be produced. Fig. 7C also shows that one or more stress relief holes 707 can also be produced by the multi-dimensional knitting machine. In some embodiments, these one or more stress relief holes 707 can be used to route cables (e.g., electronic, fluid, or pneumatic lines) or to allow bending of the fabric (e.g., bending around a fingertip).
Fig. 7D illustrates that the woven structure 700 is inserted into the overmolding machine 708 to integrate one or more haptic feedback generator components 710 into the first fabric component 702. As discussed, the second fabric assembly 704 includes guide holes 706A-706L corresponding to locating pins (dowels) 712A-712L. The locating pins 712A-712L are inserted into the guide holes 706A-706L of the second fabric assembly 704 to ensure that the overmold machine 708 properly places the overmolded structure onto the first fabric component 702.
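The guide holes and locating pins function as a simple registration check before the overmold is applied; a minimal sketch of that check is shown below, where the coordinates, the tolerance value, and the routine itself are hypothetical illustrations rather than part of the disclosed machine.

def holes_aligned(guide_holes, locating_pins, tolerance_mm=0.5):
    # Return True if every guide hole knitted into the temporary second fabric
    # assembly sits within tolerance of its corresponding locating pin, so the
    # overmolded structure will be deposited at the correct location on the
    # first fabric component. Values are illustrative only.
    if len(guide_holes) != len(locating_pins):
        return False
    return all(
        abs(hx - px) <= tolerance_mm and abs(hy - py) <= tolerance_mm
        for (hx, hy), (px, py) in zip(guide_holes, locating_pins)
    )

# Hypothetical positions (in mm) for four of the guide holes and pins.
holes = [(0.0, 0.0), (20.0, 0.0), (0.0, 15.0), (20.0, 15.0)]
pins = [(0.1, -0.2), (19.9, 0.3), (0.0, 15.1), (20.2, 14.8)]
print(holes_aligned(holes, pins))  # True -> safe to close the overmold machine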
Fig. 7E shows the overmold machine 708 compressed down on the woven structure 700 (not visible in fig. 7E because it is enclosed by the machine 708) to inject the overmolded structure. While the machine is compressed down on the woven structure 700, an injectable material with flexible properties (e.g., silicone, rubber, etc.) can flow onto/into the fabric in the shape of a mold provided by the overmold machine 708 to create the overmolded structure. In some embodiments, an additional component is added to the overmolding machine, and the overmolding machine secures the additional component to the woven structure 700 via the molded structure.
Fig. 7F shows the post-overmold structure, which now comprises an overmolded structure 714 embedded in the first fabric component 702 of the woven structure 700 to create a complete tactile fingertip structure 716. The overmolded structure 714 can be configured as a matrix of tactile feedback generators (e.g., as shown by the array of blisters 717, wherein each individual blister can be used to provide tactile feedback and/or sensing input), wherein each tactile feedback generator can be individually controlled (e.g., by inflation or deflation) to provide tactile sensations to a user wearing the completed tactile fingertip structure 716. In some embodiments, the overmolded structure 714 includes one or more sensors (e.g., neuromuscular signal sensors that are fixed in place during the overmolding process). In some embodiments, the one or more sensors are configured to detect both neuromuscular and non-neuromuscular signals. Fig. 7F also shows two threads 718A and 718B, which in this exemplary embodiment are configured as the only threads that attach the first fabric component 702 to the second fabric component 704. In some embodiments, only a single thread attaches the first fabric component 702 to the second fabric component 704.
Fig. 7G shows the two threads 718A and 718B pulled from the woven structure 700, with the result that the first fabric component 702 and the second fabric component 704 become separable from each other. In some embodiments, a single thread can be configured to separate the first fabric component 702 from the second fabric component 704. In some embodiments, the second fabric component is a continuous piece rather than two separate pieces. As previously described, the second fabric component 704 is a temporary piece configured to be removed at a later point in production and used only during the manufacturing process. Fig. 7G also illustrates the completion of the process, producing a fabric with integrated overmolded structure 720. In some embodiments, the threads 718A and 718B are loose threads from the second fabric component 704 that allow the second fabric component 704 to be unraveled (e.g., by hand or machine) in order to separate the second fabric component 704 from the first fabric component 702.
Fig. 8A-8B illustrate a fabric structure (e.g., glove 800) including one or more portions made of an electrically conductive deformable fabric (e.g., electrically conductive deformable fabric portion 802) and the advantageous strain characteristics accommodated by the fabric structure, according to some embodiments. In some embodiments, the conductive deformable fabric has a different (e.g., more restrictive) amount of stretch along certain axes than the surrounding material, but still needs to stretch. To integrate the conductive deformable fabric, certain folding techniques can be used, such as fold techniques derived from origami (e.g., the fabric having alternating folds along at least one axis to reduce its footprint (e.g., a first footprint) when in an unstretched state, and substantially unfolding to increase its footprint (e.g., to a second footprint greater than the first footprint along at least one axis) when in a stretched state). In some embodiments, the fabric structure includes elasticity, allowing the fabric to return to a default unstretched state.
In some embodiments, the conductive deformable fabric portion 802 can be configured as a strain sensor (i.e., as the fabric unfolds, the resistance of the fabric changes, which can be used to determine the strain that occurs). In some embodiments, strain information can be used to determine the pose of the hand (e.g., strain can be used to determine whether the finger is in a curled/fist state (e.g., the higher the strain, the tighter the finger curl)). In some embodiments, the conductive deformable fabric can also be configured to couple with a neuromuscular signal sensor, and the conductive deformable fabric can be configured to power and/or transmit signal data from the neuromuscular signal sensor.
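One simple way to interpret readings from such a resistive strain element is to track the relative resistance change against a calibrated gauge factor and map the result to a finger-curl estimate; the sketch below illustrates this idea, and the gauge factor, baseline resistance, and full-fist strain are hypothetical calibration values rather than parameters from this disclosure.

def strain_from_resistance(r_measured, r_unstretched, gauge_factor=2.0):
    # Estimate mechanical strain from the relative resistance change of the
    # conductive deformable fabric: strain ~ (dR / R0) / GF, where GF is a
    # hypothetical calibration constant.
    return (r_measured - r_unstretched) / (r_unstretched * gauge_factor)

def finger_curl(strain, strain_at_full_fist=0.30):
    # Map strain to a 0..1 curl value (0 = open hand, 1 = closed fist),
    # assuming a hypothetical full-fist strain of 30%.
    return max(0.0, min(1.0, strain / strain_at_full_fist))

r0 = 120.0                       # ohms, trace at rest (assumed)
for r in (120.0, 132.0, 150.0):  # readings as the hand closes
    s = strain_from_resistance(r, r0)
    print(f"R={r:6.1f} ohm  strain={s:5.3f}  curl={finger_curl(s):.2f}")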
Fig. 8A also shows user 801 curling their hand into a fist, with the result that portions of glove 800 are stretched into tension. Graph 804 shows a prophetic plot with curve 806 indicating the measured/calculated strain (shown on y-axis 808) occurring at conductive deformable fabric portion 802 over time (shown on x-axis 810). Curve 806 shows that as the fist is tightened further (i.e., the conductive deformable fabric portion 802 is placed further into its stretched state), the strain increases. In some embodiments, multiple discrete strain sensors can be placed in different areas, as opposed to the continuous band shown (e.g., a single strain sensor placed on each joint or other flexible portion of the hand), to make multiple strain measurements and provide an even better picture of the hand pose. In some embodiments, multiple strain sensors may be placed at a single location (e.g., a joint) to provide even more detailed (e.g., higher resolution) measurements. In some embodiments, information provided by the one or more strain sensors can be used to provide input to an artificial reality environment displayed at the artificial reality headset 803.
Fig. 8B shows the user now opening their hand, with the result that glove 800 returns to its unstretched state. Graph 804 shows the prophetic curve 806 now indicating the measured/calculated strain occurring at the conductive deformable fabric portion 802 over time. The graph shows that as the hand unfolds further (i.e., the conductive deformable fabric portion 802 is placed further into its unstretched state), the strain decreases.
Fig. 9A-9C illustrate a fabric structure 900 according to some embodiments that includes one or more portions made of an electrically conductive deformable fabric 902, and the fabric structure 900 is configured to have bi-directional stretching with favorable strain characteristics as shown by the curves in each of fig. 9A-9C. As discussed with reference to fig. 8A-8B, the fabric structure 900 is made stretchable by using a series of folds, and the unstretched state is a substantially folded state and the stretched state is a substantially unfolded state. As will be discussed, the fabric structure 900 can have a folding pattern that allows it to unfold in both the x-direction and the y-direction to allow bi-directional stretching.
Fig. 9A shows the fabric structure 900 in a default unstretched state. Graph 904 shown in fig. 9A shows, by dashed x-axis curve 906 and solid y-axis curve 908, respectively, that at time t1 no measured/calculated strain occurs along either the x-axis or the y-axis of the fabric structure.
Fig. 9B shows the fabric structure 900 in an extended state along the y-axis, and the graph 904 shown in fig. 9B indicates that there is a measured/calculated strain along the y-axis at time t2 by a solid y-axis curve 908. Fig. 9B also shows that at time t2, dashed x-axis curve 906 indicates that no measured/calculated strain has occurred along the x-axis.
Fig. 9C shows the fabric structure 900 in an extended state along both the x-axis and the y-axis, and the graph 904 shown in fig. 9C shows the measured/calculated strain occurring along both the x-axis and the y-axis of the fabric structure at time t3, respectively, by the dashed x-axis curve 906 and the solid y-axis curve 908.
Fig. 10A illustrates two views of a woven fabric including a solid weave that can be configured to accommodate one or more non-woven structures, according to some embodiments. First view 1000 shows a top-down view of braided structure 1002 including solid portion 1004. The solid portion 1004 acts as a pocket, allowing placement of a non-woven structure (not shown) within the cavity of the solid portion. In some embodiments, the non-woven structure is inserted via an insertion assembly. In some embodiments, the non-woven structure is a neuromuscular signal sensor (e.g., an electromyography (EMG) sensor).
Fig. 10A also depicts a second view 1006 that shows a side view of the braided structure 1002 including the solid portion 1004. In some embodiments, the braided structure 1002 is produced on a multi-dimensional knitting machine configured to adjust the weave pattern of the structure while producing it, in order to produce the solid portion 1004. In some embodiments, the solid portion has no seams or boundaries with adjacent portions because it is created solely by changing the weave pattern (e.g., a denser weave pattern surrounded by portions having a looser weave pattern can create a three-dimensional pocket).
Fig. 10B illustrates an embodiment in which multiple solid portions are placed on a single braided structure 1008, according to some embodiments. The plurality of solid portions 1010A-1010C can allow a plurality of non-woven structures to be placed in close proximity to one another. In some embodiments, the solid portions are placed in a grid array that spans both the x and y directions. In some embodiments, the solid portions are offset from one another along one or more axes.
The above description supplements the numerous manufacturing processes and yarn types described in appendix a, so that various yarns (e.g., different yarn materials that can be used as described with reference to fig. 3-7 of appendix a) and manufacturing processes (e.g., laser cutting, die cutting and making electrical connections as generally discussed with reference to fig. 8-55 of appendix a) can be used in conjunction with the textile structures and manufacturing processes discussed elsewhere herein, and appendix a is appended to this specification.
Fig. 11 illustrates a method flowchart 1100 for detecting a force received at a garment, according to some embodiments.
(A1) According to some embodiments, a method (1102) of detecting a force received at a garment includes receiving (1104) a force at a capacitive sensor integrated into the garment, wherein the capacitive sensor includes: a first woven conductive electrode layer constructed using an insulated conductive fabric, wherein the first woven conductive electrode layer has a first surface; and a second woven conductive electrode layer constructed using a non-insulated conductive fabric comprising a second surface, wherein the second surface is configured to be in direct contact with the first surface (e.g., woven onto the same layer as the first layer, wherein the first layer is a structural component of a wearable device (e.g., a glove)) to create the sensor. The method also includes, in response to the received force at the sensor, transmitting (1106) a value corresponding to the received force to a processor. The method then includes determining (1108), via the processor, a calculated force value. Further details regarding the capacitive sensor of A1 are provided below with reference to B1 to B17. Appendix a provides further details of example materials for producing textile-type electrodes, such that any of the example materials shown and described in appendix a may be used in combination with the other textile structures described herein and/or in combination with, in addition to, or instead of the manufacturing processes and techniques described herein. For example, conductive yarns (e.g., Silvertech+150 22Tex or Statex Shieldex 235/36 1-ply yarns), as described with reference to figs. 3-7 of appendix a, can be used.
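For illustration only, the force determination in A1 can be pictured with a parallel-plate approximation in which pressing the fingertip compresses the electrode stack and raises the capacitance; the sketch below inverts that model to recover a force estimate. The permittivity, electrode area, rest separation, and stiffness are hypothetical calibration constants, not values from this disclosure.

EPS_0 = 8.854e-12   # F/m, vacuum permittivity
EPS_R = 3.0         # assumed relative permittivity of the insulating coating
AREA = 1.0e-4       # assumed electrode overlap of 1 cm^2, in m^2
D_REST = 200e-6     # assumed electrode separation at rest, in m
STIFFNESS = 5.0e8   # assumed effective stack stiffness, in N/m^3

def capacitance(separation_m):
    # Parallel-plate approximation: C = eps0 * epsR * A / d.
    return EPS_0 * EPS_R * AREA / separation_m

def force_from_capacitance(c_measured):
    # Invert the parallel-plate model to recover the applied force.
    d = EPS_0 * EPS_R * AREA / c_measured   # back out the separation
    compression = max(0.0, D_REST - d)      # how far the stack was compressed
    return STIFFNESS * AREA * compression   # pressure times area gives force (N)

c_rest = capacitance(D_REST)
c_pressed = capacitance(150e-6)             # reading while pressing (assumed)
print(f"rest {c_rest * 1e12:.1f} pF, pressed {c_pressed * 1e12:.1f} pF")
print(f"estimated force: {force_from_capacitance(c_pressed):.2f} N")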
(B1) According to some embodiments, a garment-integrated capacitive sensor includes a first woven conductive electrode layer constructed using an insulated conductive fabric (e.g., the insulated conductive fabric can be made with a compressible/stretchable core (e.g., elastane or thermoplastic polyurethane (TPU)), which enables deformation at the yarn level and enhances capacitive sensor performance). In some embodiments, a high-surface-area insulated conductor (e.g., enamel-coated copper foil, etc.) wrapped around the core further improves sensor performance. In some embodiments, silver-copper alloy wire/foil provides balanced performance in terms of conductivity, cost, and fatigue resistance as compared to alternatives such as pure copper and tin-copper alloys. The first woven conductive electrode layer has a first surface. The garment-integrated capacitive sensor also includes a second woven conductive electrode layer constructed using a non-insulated conductive fabric that includes a second surface configured to be in direct contact with the first surface to create the garment-integrated capacitive sensor. In some embodiments, the garment-integrated capacitive sensor is configured to communicate with a processor, and the processor is configured to receive a sensed value from the garment-integrated capacitive sensor.
For example, fig. 1A-1D illustrate examples of garment-integrated capacitive sensors integrated into a wearable device and uses thereof, according to some embodiments.
In some embodiments, the second woven conductive electrode layer is constructed using a material such as silver, platinum, gold, or the like. In some embodiments, a coating/plating is applied at the individual fiber level (e.g., coating/plating each fiber of the woven conductive electrode). In some embodiments, solderable yarns enable easier electrical interconnection. In some embodiments, the second woven conductive electrode layer is constructed using conductive yarns made of silver-plated nylon. In some embodiments, the first and second woven conductive electrode layers are made of yarns/wires having TPU cores, and the TPU cores provide tunable compressibility. In some embodiments, the electrical interconnects are fabricated using ultrasonic bonding. In some embodiments, a yarn wrapping/twisting machine is used to wrap the conductive or insulated conductive wire/foil around the core.
Garment-integrated capacitive sensors without a separate dielectric can more easily conform to the contours of the human body (e.g., curved portions such as fingertips). In some embodiments, textile sensors having customized shapes are seamlessly woven as part of a substrate (e.g., glove fingertips, a wristband) that is built in a single manufacturing step (e.g., a single weaving sequence). One drawback of using a dielectric film in the sensor construction (such as a three-layer sensor geometry) is that each time a sensor needs to be woven, the machine must be stopped and the dielectric film must be manually inserted between the electrodes. Another disadvantage of the three-layer design is that, because the space into which the dielectric film is inserted is only a few millimeters, the dielectric film may not be inserted properly. When the dielectric film is not inserted properly, the sensor may short. In addition, it is difficult to detect an incorrectly constructed tri-layer design before the entire glove/sensor sample weave is completed. Furthermore, this step requires the preparation of custom-sized dielectric films to accommodate different sensor shapes/sizes. Finally, the production of a three-layer sensor arrangement is more time consuming and more difficult to automate.
(B2) In some embodiments of B1, the sensed value, when processed by the processor, can be used to infer the force received at the garment-integrated capacitive sensor. For example, fig. 1C and 1D show the determined forces received at glove 100 in graphs 121 and 126, respectively.
(B3) In some embodiments of any of B1-B2, the sensed value, when processed by the processor, can be used to determine whether the garment-integrated capacitive sensor is in contact with a surface. For example, fig. 1C shows that in response to glove 100 contacting surface 120 at a location corresponding to a virtual key of virtual keyboard 124, the letter "H" 125 is displayed on a display (e.g., a real display or a display shown in artificial reality).
(B4) In some implementations of any of B1-B3, the processor is further in communication with an artificial reality headset displaying artificial reality, and the sensed values from the garment-integrated capacitive sensor are used to alter visual aspects of the artificial reality. FIG. 1C illustrates an example of a glove providing input to a display (e.g., displaying an "H" letter 125 on the display) via a virtual keyboard.
(B5) In some embodiments of B1-B4, the garment-integrated capacitive sensor is seamlessly woven into a fabric that is not a capacitive sensor. For example, fig. 1A and 1B illustrate a knitted wearable glove device 100 that includes one or more garment-integrated capacitive sensors, wherein the garment-integrated capacitive sensors are seamlessly integrated (e.g., at least on the surface of the glove without raised beads/stitches for binding the glove's threads with the one or more garment-integrated capacitive sensors).
(B6) In some embodiments of B1-B5, the garment-integrated capacitive sensor is integrated into a wearable device (e.g., glove 100 shown in fig. 1A-1D), wherein the wearable device includes a plurality of garment-integrated capacitive sensors. In some embodiments, the plurality of garment-integrated capacitive sensors can be divided into a plurality of quadrants, wherein the plurality of quadrants are configured to wrap around a fingertip in a three-dimensional manner. In some embodiments, a plurality of garment-integrated capacitive sensors are continuously woven together.
(B7) In some implementations of B1-B6, each of the plurality of garment-integrated capacitive sensors is capable of detecting pressure over an area of between 0.5 cm² and 15 cm².
(B8) In some embodiments of B1-B7, the second surface is configured to directly contact the first surface without the need for a separate dielectric sheet. For example, fig. 1A shows a two-layer capacitive sensor having a first woven conductive electrode layer 108 constructed using an insulated conductive fabric and a second woven conductive electrode layer 110 constructed using a non-insulated conductive fabric.
(B9) In some embodiments of B1-B8, the garment-integrated capacitive sensor is integrated into a wearable glove (e.g., glove 100 in fig. 1A-1D).
(B10) In some embodiments of B9, additional garment-integrated capacitive sensors are integrated into the wearable glove (e.g., fig. 1A and 1B illustrate a plurality of garment-integrated capacitive sensor assemblies (e.g., assemblies 102A-102E in fig. 1A and garment-integrated capacitive sensors 116A-116L in fig. 1B)).
(B11) In some embodiments of B10, the garment-integrated capacitive sensor and the additional garment-integrated capacitive sensor are located on separate fingertips of the wearable glove (e.g., garment-integrated capacitive sensor assemblies 102A-102E). In some embodiments, a sensor is located at each fingertip of the glove. In some embodiments, the sensor is located on the palm side or on the back side of the hand.
(B12) In some embodiments of B1-B9, the garment integrated capacitive sensor is knitted with the non-sensor portion of the garment using a V-bed knitting machine (e.g., fig. 2 shows the production of gloves using a multi-dimensional knitting machine).
(B13) In some embodiments of B12, a plurality of garment-integrated capacitive sensors are woven together with non-sensor portions of the garment using a V-bed knitting machine (e.g., fig. 2 illustrates the production of gloves using a multi-dimensional knitting machine).
(B14) In some embodiments of B13, a plurality of garment-integrated capacitive sensors are woven together using a lock-and-key knit pattern (e.g., the lock-and-key knit method increases the effective surface area of the plurality of garment-integrated capacitive sensors, thereby improving performance). In some embodiments, a lock-and-key knit pattern can be applied to improve energy storage of parallel electrodes, woven assemblies for energy harvesting, and the like. Fig. 1E shows a lock-and-key structure for connecting a plurality of integrated capacitive sensors.
(B15) In some embodiments of B1-B9, the insulating conductive fabric is constructed of a conductor coated with an insulating material. For example, a first braided conductive electrode layer 108 constructed using an insulated conductive fabric is discussed with reference to fig. 1A.
(B16) In some embodiments of B15, the insulating material does not change the flexibility (pliability) of the conductive fabric.
(B17) In some embodiments of B1-B9, the insulated conductive fabric is constructed of a conductor having an insulating covering (insulated shroud) surrounding the conductive fabric.
Fig. 12 illustrates a process flow diagram 1200 for manufacturing a woven fabric including a non-woven structure, according to some embodiments.
(C1) According to some embodiments, a method (1200) of making a woven fabric comprising a non-woven structure includes, while knitting a fabric structure according to a programmed knitting sequence for a V-bed knitting machine (or any other suitable multi-dimensional knitting machine) (1200): providing (1204) a non-woven structure to the V-bed knitting machine at a point in time when the fabric structure has a first woven portion, wherein the first woven portion is formed based on a first type of weave pattern; and, after providing the non-woven structure and following (1208) the programmed knitting sequence, automatically adjusting the V-bed knitting machine to use a second type of weave pattern, different from the first type of weave pattern, to accommodate the non-woven structure within a second woven portion adjacent to the first woven portion within the fabric structure. For example, fig. 2 shows a multi-dimensional knitting machine 200 that includes a plurality of non-woven insert assemblies for inserting non-woven assemblies into a woven fabric. Fig. 4 also shows an example of how non-woven structures 408A and 408B can be woven (i.e., inserted) into woven structure 405 (e.g., a glove).
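The programmed knitting sequence of C1 can be pictured as an ordered list of knitting instructions in which one step pauses to receive the non-woven structure and the following steps switch to the second weave pattern; the representation below is a hypothetical sketch of that control flow, not the instruction format of any actual V-bed knitting machine.

from dataclasses import dataclass

@dataclass
class KnitStep:
    rows: int
    pattern: str               # e.g., "first_pattern_dense" or "second_pattern_loose"
    insert_nonwoven: bool = False

# Hypothetical programmed sequence: knit the first woven portion with the first
# pattern, pause to receive the non-woven structure from the insertion device,
# then continue with the looser second pattern around it before resuming.
sequence = [
    KnitStep(rows=40, pattern="first_pattern_dense"),
    KnitStep(rows=1, pattern="first_pattern_dense", insert_nonwoven=True),
    KnitStep(rows=12, pattern="second_pattern_loose"),
    KnitStep(rows=40, pattern="first_pattern_dense"),
]

def run(sequence):
    for step in sequence:
        if step.insert_nonwoven:
            print("-> pause carriage and feed the non-woven structure from the inserter")
        print(f"knit {step.rows} rows using '{step.pattern}'")

run(sequence)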
(C2) In some embodiments of C1, the non-woven structure is provided to a V-bed knitting machine (e.g., fig. 4 shows an insert assembly 402 corresponding to multi-dimensional knitting machine 400) via an insert apparatus that is different from the V-bed knitting machine.
(C3) In some embodiments of any of C1-C2, the insertion device passes through a V-bed knitting machine.
(C4) In some embodiments of any of C1-C3, the insertion device is attached to the V-bed knitting machine and feeds the non-woven structure into the V-bed knitting machine according to the programmed knitting sequence (e.g., fig. 4 shows that the insertion assembly 402 can be mounted over one of the knitting beds of the multi-dimensional knitting machine 400).
(C5) In some embodiments of any of C1-C4, the first type of weave pattern has a higher weave density than the second type of weave pattern.
(C6) In some embodiments of any of C1-C5, the first type of weave pattern uses one weave pattern that stretches more (or less) than the second type of weave pattern (e.g., fig. 4 shows in fourth pane 406D that first weave pattern 414 has a tighter (e.g., denser) weave than second weave pattern 416 to accommodate additional movement).
(C7) In some embodiments of any of C1-C6, the non-woven structure is a flexible circuit board (e.g., fig. 4 shows non-woven structure 405 (e.g., a printed circuit board) being inserted into the woven structure).
(C8) In some embodiments of any of C1-C7, the non-woven structure is a wire or wire bundle (e.g., fig. 4 shows the non-woven structure 405 (e.g., wire or wire bundle) being inserted into the woven structure).
(C9) In some embodiments of any of C1-C8, the non-woven structure is a semi-rigid support for providing rigidity to the fabric structure (e.g., fig. 4 shows non-woven structure 405 (e.g., a semi-rigid support) being inserted into the woven structure).
(C10) In some embodiments of any of C1-C9, the first woven portion and the non-woven structure within the second woven portion have substantially the same stretchability (e.g., unidirectional or bi-directional stretch). For example, fig. 5A-5B illustrate that a first woven portion that does not include a non-woven structure can stretch at the same rate as a second woven portion that includes a non-woven structure.
(C11) In some embodiments of any of C1-C10, the method includes automatically adjusting the V-bed knitting machine to use the second type of weave pattern after providing the non-woven structure to the V-bed knitting machine at a point in time when the fabric structure has a first woven portion formed based on the first type of weave pattern, and before continuing to follow the programmed knitting sequence. In some embodiments, the method further comprises, following the programmed knitting sequence, automatically creating a transition region in which the fabric has the second type of weave pattern, wherein the second type of weave pattern allows for more movement of the non-woven structure. For example, fig. 4 shows a modification of the weave pattern to accommodate non-woven structures 408A and 408B.
(C12) In some embodiments of any of C1-C11, the nonwoven structure is inserted such that it follows a meandering pattern along the axis, wherein the meandering pattern allows the nonwoven structure to stretch along the axis with the woven portion of the fabric structure. For example, fig. 5A-5B illustrate a meandering pattern that allows the non-woven structure 502 to move as the fabric is stretched without imparting undue stress/strain to the non-woven structure.
(C13) In some embodiments of any of C1-C12, the second type of weave pattern can be a solid weave to allow placement of the non-woven structure in the volume of the solid weave. Fig. 10A-10B illustrate the ability to create solid portions 1004 and 1010A-1010C to accommodate one or more nonwoven structures.
(C14) In some embodiments of any of C1-C13, the programmed knitting sequence for the V-bed knitting machine is configured to accommodate multiple non-woven structures in the case of a knitted fabric structure (e.g., fig. 4 shows non-woven structures 408A and 408B being knitted (i.e., inserted) into knitted structure 405).
(C15) In some embodiments of C14, one of the plurality of non-woven structures is a different material than the non-woven structure (e.g., as discussed with reference to fig. 4, non-woven structure 405 can be a printed circuit board, wire bundle, semi-rigid support, etc., which are different materials).
(C16) In some embodiments of C14, one of the plurality of non-woven structures is shaped differently than the non-woven structure (e.g., the non-woven (e.g., wire, flexible printed circuit board, etc.) structure shown in fig. 4 is shaped differently than the woven structure (e.g., glove made with wire)).
(C17) According to some embodiments, the woven fabric device comprising a non-woven structure is configured according to any one of C1-C16.
(D1) According to some embodiments, a method of manufacturing a knitting machine includes providing a V-bed knitting machine and attaching an insertion mechanism to the V-bed knitting machine. The method further includes interconnecting the V-bed knitting machine and the insertion mechanism with a processor, wherein the processor is configured to cause performance of the following method. The method includes, while knitting a fabric structure according to a programmed knitting sequence for the V-bed knitting machine: at a point in time when the fabric structure has a first woven portion formed based on a first type of weave pattern, providing a non-woven structure to the V-bed knitting machine via the insertion mechanism, and, after providing the non-woven structure and following the programmed knitting sequence, automatically adjusting the V-bed knitting machine to use a second type of weave pattern, different from the first type of weave pattern, to accommodate the non-woven structure within a second woven portion adjacent to the first woven portion within the fabric structure.
Fig. 13 illustrates a flowchart of a method 1300 for knitting a dual-density fabric including an overmolded structure, according to some embodiments.
(E1) According to some embodiments, a method (1300) of knitting a dual-density fabric (1302) comprises, while knitting a fabric structure using a programmed knitting sequence for a V-bed knitting machine (1304): knitting (1306) a first portion of the fabric structure having a first fabric density to include a three-dimensional pocket (e.g., stitching the first fabric component 702 in a manner that produces a three-dimensional pocket is described with reference to fig. 7A-7G), and automatically adjusting (1308) the V-bed knitting machine based on the programmed knitting sequence to knit a second portion of the fabric structure, adjacent to the first portion within the fabric structure, having a second fabric density that is different from the first fabric density (e.g., fig. 7A-7G show the second fabric component 704, which is a temporary piece). In some embodiments, the second portion is knitted first. For example, the second portion of the fabric structure is knitted with the second fabric density, and the V-bed knitting machine is automatically adjusted based on the programmed knitting sequence to knit the first portion of the fabric structure to include a three-dimensional pocket, the first portion having a first fabric density different from the second fabric density and being adjacent to the second portion within the fabric structure. The method further includes overmolding (1310) a polymer overmolded structure into the three-dimensional pocket (e.g., fig. 7F-7G illustrate that the overmolded structure 714 can be configured to include a matrix of tactile feedback generators (e.g., as illustrated by the bubble array 717)), wherein the second portion of the fabric structure is temporarily secured to a device configured to attach the overmolded structure into the three-dimensional pocket. The method further includes removing (1312) the second portion of the fabric structure (e.g., fig. 7G shows two threads 718A and 718B pulled from the woven structure 700, with the result that the first fabric component 702 and the second fabric component 704 are separated from each other).
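Purely as an illustrative restatement of flowchart 1300, the sketch below orders the knit-knit-overmold-remove steps in code; the class, the method names, and the guide-hole count are hypothetical stand-ins rather than a real knitting-machine or molding-machine API.

class FabricPortion:
    def __init__(self, name, density, pocket=False, guide_holes=0):
        self.name = name
        self.density = density
        self.pocket = pocket
        self.guide_holes = guide_holes

def knit_dual_density_and_overmold():
    # Hypothetical orchestration of flowchart 1300 (steps 1306-1312).
    first = FabricPortion("first", density="high", pocket=True)       # step 1306
    second = FabricPortion("second", density="low", guide_holes=12)   # step 1308
    print(f"seating the {second.guide_holes} guide holes on the locating pins")
    print(f"overmolding polymer into the pocket of the {first.name} portion")  # step 1310
    print("pulling the attachment threads to remove the second portion")       # step 1312
    return first

knit_dual_density_and_overmold()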
(E2) In some embodiments of E1, the three-dimensional pocket is configured to house one or more sensors. For example, fig. 7F includes an overmolded structure 714 that can include one or more sensors (e.g., embedded sensors).
(E3) In some embodiments of E2, the one or more sensors are neuromuscular sensors, and the neuromuscular sensors are configured to detect one or more neuromuscular signals of a user. For example, fig. 7F includes an overmolded structure 714 that can include one or more sensors, wherein the sensors are neuromuscular signal sensors.
(E4) In some embodiments of E2, the one or more sensors are non-neuromuscular sensors, and the non-neuromuscular sensors are configured to detect one or more non-neuromuscular signals associated with the user. For example, fig. 7F includes an overmolded structure 714 that can include one or more sensors, wherein the sensors are not neuromuscular signal sensors (e.g., temperature sensors, inertial measurement sensors, etc.).
(E5) In some embodiments of any of E1-E2, the polymer overmolded structure is a component of a haptic feedback generation system. For example, fig. 7F includes an overmolded structure 714 that can include one or more haptic feedback generators (e.g., as shown by bubble array 717).
(E6) In some embodiments of E5, the haptic feedback generation system is a pressure activated system (e.g., a pneumatic or hydraulic system).
(E7) In some embodiments of E5, the haptic feedback generation system is an electrically actuated system (e.g., a dielectric elastomer actuator (DEA)).
(E8) In some embodiments of E5, the haptic feedback generation system includes a matrix of haptic feedback generators (e.g., stretchable bubbles for applying pressure to the user's skin). For example, fig. 7F shows a bubble array 717.
(E9) In some embodiments of any of E1-E2, the fabric density is determined by a combination of material weight and stitch. For example, fig. 6A to 6H show several types of stitches having different pitches.
(E10) In some embodiments of any of E1-E2, the method includes, prior to overmolding the polymer overmolded structure into the three-dimensional pocket, placing the fabric structure (e.g., automatically) in an injection molding machine (e.g., fig. 7D shows the woven structure 700 being inserted into the overmolding machine 708).
(E11) In some embodiments of E10, placing the fabric structure in the injection molding machine is accomplished based on a knitting position guide (e.g., a hole in the fabric) integrated into the second portion of the fabric structure. For example, FIG. 7D shows second fabric assembly 704 including guide holes 706A-706L corresponding to locating pins 712A-712L.
(E12) In some embodiments of E11, the guides are holes (or marks (e.g., different colored lines) or fabric protrusions) for securing the fabric structure in a particular position within the injection molding machine. In some embodiments, the apertures are woven automatically into the second fabric structure. For example, FIG. 7D shows locating pins 712A-712L being inserted into guide holes 706A-706L of second fabric assembly 704 to ensure that overmold machine 708 properly places the overmolded structure on first fabric assembly 702.
(E13) In some embodiments of any of E1-E2, removing the second portion of the textile structure does not damage the first portion of the textile structure.
(E14) In some embodiments of E13, the second portion of the fabric structure is removed by removing detachable attachment threads (e.g., fig. 7F also shows two threads 718A and 718B configured to be the only threads connecting the first fabric component 702 with the second fabric component 704). In some embodiments of E13, the second portion of the fabric structure is removed by unraveling the second portion of the fabric structure as the thread is pulled.
(E15) In some embodiments of E14, the detachable attachment thread is a single thread. For example, referring to the discussion of fig. 7G, alternative embodiments can include a single thread configured to separate the first fabric component 702 from the second fabric component 704.
(E16) In some embodiments of any of E1-E2, the first portion of the fabric structure comprises a third density different from the first density. For example, similar to the three-dimensional pocket described with reference to fig. 10A to 10B, the portion of the first fabric component having the pocket (e.g., a three-dimensional pocket) can be realized by changing the density of the fabric.
(E17) In some embodiments of any of E1-E2, the first portion of the fabric structure includes one or more stress relief holes (or cuts) for wrapping the fabric structure around the user's finger (e.g., the one or more stress relief holes 707 described with reference to fig. 7C).
(E18) In some embodiments of any of E1-E2, the first portion of the fabric is configured to wick moisture away from the polymer overmolded structure. In some embodiments, reducing moisture improves the performance of the haptic feedback generator.
(E19) According to some embodiments, a woven dual density fabric structure comprising an overmolded structure is configured according to any one of E1-E18.
Another embodiment of the conductive deformable fabric will be discussed below.
(F1) According to some embodiments, the wearable device includes an electrically conductive deformable fabric (e.g., fig. 8A-8B illustrate a fabric structure (e.g., glove 800) including one or more portions made of an electrically conductive deformable fabric (e.g., electrically conductive deformable fabric portion 802)), and the electrically conductive deformable fabric includes an electrically conductive trace having a non-malleable fixed length along a first axis. The conductive traces are woven into a fabric structure to produce a conductive deformable material. The fabric structure includes a stitch pattern that facilitates expanding and collapsing the conductive trace in an oscillating manner to allow the conductive trace to expand and contract along the first axis, respectively, without exceeding a fixed length of the conductive trace, and the conductive deformable material is positioned within the wearable device such that when the wearable device is worn, the stitch pattern is located over a joint of a user to allow the stitch pattern to expand or contract with movement of the joint. While joints are used as the primary example of a body part that is capable of bending and causing the stitch pattern to stretch, the skilled artisan will appreciate that the same principles can be applied to any body part that is bent, stretched, contracted, twisted, etc. For example, fig. 9A-9C illustrate a fabric structure 900 comprising one or more portions made of an electrically conductive deformable fabric 902, and the fabric structure 900 is configured to have bi-directional stretch.
(F2) In some embodiments of F1, the stitch pattern further facilitates expansion and contraction of the conductive trace along a second axis perpendicular to the first axis without exceeding a fixed length of the conductive trace. For example, FIG. 9C shows the fabric structure 900 in an extended state along both the x-axis and the y-axis.
(F3) In some embodiments of any of F1-F2, the stitch pattern of the textile structure allows the textile structure to collapse (collapse) via alternating folds, wherein the conductive traces collapse with the textile structure. For example, fig. 9A-9C illustrate how the conductive deformable fabric 902 collapses with the fabric structure.
(F4) In some embodiments of any of F1-F3, the fabric structure includes an elasticity that allows the conductive deformable fabric to return to a default state.
(F5) In some embodiments of any of F1-F4, the conductive trace is linear along the first axis over its non-malleable fixed length (e.g., fig. 9A-9C show that the conductive deformable fabric 902 is linear along the first axis).
(F6) In some embodiments of any of F1-F5, the stitch pattern of the fabric structure is a plain stitch pattern (e.g., plain pattern stitches, such as the stitches described with reference to fig. 6A-6H).
(F7) In some embodiments of any of F1-F6, the conductive trace is embroidered onto the fabric structure (e.g., fig. 8A-8B show a fabric structure (e.g., glove 800) comprising one or more portions made of a conductive deformable fabric).
(F8) In some embodiments of any of F1-F7, a portion of the conductive trace is configured to be attached to a neuromuscular signal sensor (e.g., an electrode (e.g., a soft electrode made of FKM)).
(F9) In some implementations of any of F1-F8, the conductive trace is an insulated copper magnet wire.
(F10) In some embodiments of any of F1-F9, the wearable device is machine washable.
(F11) In some embodiments of any of F1-F10, the conductive deformable fabric is configured to shrink to a dimension 300% less than the fixed length of the conductive trace (e.g., fig. 9A-9C illustrate a fabric structure (e.g., glove 800) comprising one or more portions made of conductive deformable fabric configured to shrink to a dimension 300% less than the length of the conductive deformable fabric when fully extended).
(F12) In some embodiments of any of F1-F11, the first portion of the conductive trace is configured to contact and not electrically short with the second portion of the conductive trace.
(F13) In some embodiments of any of F1-F12, the conductive deformable fabric is configured to unfold and fold in an oscillating manner for 8,000 to 20,000 cycles without performance degradation.
(F14) In some implementations of any of F1-F13, the resistivity of the conductive trace increases (or decreases) along the fixed length of the conductive trace according to the width of the conductive trace (e.g., thereby allowing pose determination based on values resulting from the changes in resistivity). For example, fig. 8A to 9C all show how strain values can be calculated based on a change in resistance as the length of the conductive deformable fabric changes (e.g., as it unfolds).
(F15) In some embodiments of any of F1-F14, the unfolding and folding in an oscillating manner follows a folding technique based on a paper folding process.
(F16) In some embodiments of any of F1-F15, the conductive trace provides a signal that can be used to determine the amount of strain at the fabric structure (e.g., and thus at the wearable device). For example, fig. 8A to 9C all show how strain values are calculated based on a change in resistivity according to a change in length (e.g., unfolding) of the conductive deformable fabric.
(F17) In some embodiments of F16, the amount of strain on the fabric structure is used to determine the motion of the joint for interacting with an artificial reality environment. Fig. 8A-8B illustrate a user 801 wearing an artificial reality headset 803 and being able to generate input to the artificial reality environment based on the change in resistance of their glove as it stretches.
The features described above with reference to A1 to F17 can be interchanged. For example, any of the woven fabrics/garments described with reference to A1 through F17 can be produced using any of the multi-dimensional knitting machine techniques contemplated herein.
Those of ordinary skill in the art will appreciate that the methods of use, methods of manufacture, and apparatus described above can be incorporated into a single wearable apparatus and the manufacturing process of the apparatus. For example, a knitting machine produced according to the method of manufacturing a knitting machine described with reference to D1 can be used to produce a wearable device (e.g., a glove) that includes two or more of: the force sensing devices described with reference to A1-B17, woven fabrics comprising non-woven structures resulting from the manufacturing methods described with reference to C1-C16, dual density fabrics described with reference to the methods described with reference to E1-E18, and/or wearable devices comprising conductive deformable fabrics described with reference to F1-F17.
In other exemplary embodiments described in appendix a, a wristband can be provided. The wristband can include a textile body; a textile electrode at the surface of the textile body; a flexible printed circuit; and a textile conductive trace electrically connecting the textile electrode with the flexible printed circuit. These textile conductive traces can be integrated with the braided structure using the techniques described above, and additional details regarding the wristband are also provided in appendix a. The textile electrode can be positioned along an inner surface of the textile body. The textile electrode can comprise conductive yarns (examples of which are described in appendix a). The textile body and the textile electrode can be formed using a method selected from the group consisting of braiding, weaving, and embroidering. The flexible printed circuit can be integrated into the textile body.
In another aspect, also described in appendix a, a fabric electrode comprising a braided, woven, or embroidered textile can be provided.
The braided structures described above can be implemented in various forms and can be used in conjunction with an artificial reality system (e.g., providing soft wearable gloves for use as input and sensing devices with an artificial reality system). Accordingly, examples of wrist-wearable devices, headset devices, systems, and haptic feedback devices are described below to provide further context for the systems in which the techniques described herein can be utilized. The specific operations described above may occur as a result of specific hardware, which is described in more detail below. The devices described below are not limiting, and features on these devices can be removed or additional features can be added to these devices.
Example wrist wearable device
Fig. 14A and 14B illustrate an exemplary wrist-wearable device 1450 according to some embodiments. The wrist-wearable device 1450 is an example of a wearable device described herein, such that the wearable device should be understood to have features of the wrist-wearable device 1450, and vice versa. Fig. 14A shows a perspective view of wrist-wearable device 1450 including a watch body 1454 coupled to a wristband 1462. The watch body 1454 and wristband 1462 can have a substantially rectangular or circular shape and can be configured to allow a user to wear the wrist-wearable device 1450 on a body part (e.g., a wrist). The wrist-wearable device 1450 can include a retaining mechanism 1467 (e.g., a clasp, a press-fit strap fastener, etc.) for securing the wristband 1462 to the user's wrist. The wrist-wearable device 1450 can also include a coupling mechanism 1460 (e.g., a cradle) for detachably coupling the capsule or watch body 1454 (via a coupling surface of the watch body 1454) to the wristband 1462.
The wrist-wearable device 1450 is capable of performing various functions associated with navigating through a user interface and selectively opening applications. As will be described in more detail below, the operations performed by the wrist-wearable device 1450 can include, but are not limited to: displaying visual content to a user (e.g., visual content displayed on the display 1456); sensing user input (e.g., sensing a touch on peripheral button 1468, sensing biometric data on sensor 1464, sensing neuromuscular signals on neuromuscular sensor 1465, etc.); messaging (e.g., text, voice, video, etc.); capturing images; wireless communication (e.g., cellular, near field, Wi-Fi, personal area network, etc.); position determination; financial transactions; providing haptic feedback; alarms; notifications; biometric authentication; health monitoring; sleep monitoring; and the like. These functions can be performed independently in the watch body 1454, independently in the wristband 1462, and/or communicatively between the watch body 1454 and the wristband 1462. In some embodiments, functions that can be performed on the wrist-wearable device 1450 in conjunction with an artificial reality environment include, but are not limited to: virtual reality (VR) environments (including non-immersive, semi-immersive, and fully immersive VR environments); augmented reality environments (including marker-based, markerless, location-based, and projection-based augmented reality environments); mixed reality; and other types of artificial reality environments. As will be appreciated by those skilled in the art upon reading the description provided herein, the novel wearable devices described herein can be used with any of these types of artificial reality environments.
The wristband 1462 can be configured to be worn by a user such that an inner surface of the wristband 1462 is in contact with the user's skin. When worn by a user, the sensor 1464 is in contact with the user's skin. The sensor 1464 is a biosensor capable of sensing a user's heart rate, saturated oxygen level, temperature, sweat level, muscle intent, or a combination thereof. The wristband 1462 can include a plurality of sensors 1464 that can be distributed on the inside and/or outside surface of the wristband 1462. Additionally or alternatively, the watch body 1454 can include the same or different sensors than those of the wristband 1462 (or, in some embodiments, the wristband 1462 can include no sensors at all). For example, multiple sensors can be distributed on the inside and/or outside surface of the watch body 1454. As described below with reference to fig. 14B and/or 14C, the watch body 1454 can include, but is not limited to, a front image sensor 1425A and/or a rear image sensor 1425B, a biometric sensor, an IMU, a heart rate sensor, a saturated oxygen sensor, one or more neuromuscular sensors, an altimeter sensor, a temperature sensor, a bioimpedance sensor, a pedometer sensor, an optical sensor (e.g., imaging sensor 14104), a touch sensor, a sweat sensor, and the like. The sensors 1464 can also include sensors that provide data regarding the user's environment, including the user's motion (e.g., an IMU), height, position, orientation, gait, or a combination thereof. The sensors 1464 can also include light sensors (e.g., infrared light sensors, visible light sensors) configured to track the position and/or movement of the watch body 1454 and/or wristband 1462. The wristband 1462 can use wired communication methods (e.g., a universal asynchronous receiver/transmitter (UART), a USB transceiver, etc.) and/or wireless communication methods (e.g., near field communication, Bluetooth, etc.) to transmit data obtained by the sensors 1464 to the watch body 1454. The wristband 1462 can be configured to operate (e.g., to collect data using sensor 1464) independently of whether the watch body 1454 is coupled to or decoupled from the wristband 1462.
In some examples, the wristband 1462 can include neuromuscular sensors 1465 (e.g., electromyography (EMG) sensors, mechanomyography (MMG) sensors, sonomyography (SMG) sensors, etc.). The neuromuscular sensors 1465 are capable of sensing a user's intention to perform certain motor actions. The sensed muscle intent can be used to control certain user interfaces displayed on the display 1456 of the wrist-wearable device 1450 and/or can be transmitted to a device responsible for presenting an artificial reality environment (e.g., a head-mounted display) to perform an action in the associated artificial reality environment, such as controlling movement of a virtual device displayed to the user.
The signals from the neuromuscular sensor 1465 can be used to provide the user with enhanced interaction with physical objects and/or virtual objects in an artificial reality application generated by an artificial reality system (e.g., user interface objects presented on the display 1456 or another computing device (e.g., a smartphone)). Signals from the neuromuscular sensor 1465 can be obtained (e.g., sensed and recorded) by one or more neuromuscular sensors 1465 of the wristband 1462. Although fig. 14A shows one neuromuscular sensor 1465, the wristband 1462 can include a plurality of neuromuscular sensors 1465 arranged circumferentially on an inside surface of the wristband 1462 such that the plurality of neuromuscular sensors 1465 contact the user's skin. When a user performs muscle activations (e.g., movements, gestures, etc.), the neuromuscular sensors 1465 are able to sense and record neuromuscular signals from the user. The muscle activations performed by the user can include: static gestures, such as placing the user's palm down on a table; dynamic gestures, such as grasping a physical or virtual object; and covert gestures that are imperceptible to another person, such as slightly tensing a joint by co-contracting opposing muscles or using sub-muscular activations. The muscle activations performed by the user can include symbolic gestures (e.g., gestures mapped to other gestures, interactions, or commands based on a specified gesture vocabulary that maps gestures to commands).
The wristband 1462 and/or the watch body 1454 can include a haptic device 1463 (e.g., a vibrotactile actuator) configured to provide haptic feedback (e.g., cutaneous and/or kinesthetic sensations, etc.) to the user's skin. The sensors 1464 and 1465 and/or the haptic device 1463 can be configured to operate in conjunction with a plurality of applications, including, but not limited to, health monitoring, social media, gaming, and artificial reality (e.g., applications associated with artificial reality).
The wrist-wearable device 1450 can include a coupling mechanism (also referred to as a cradle) for detachably coupling the watch body 1454 to the wristband 1462. A user can detach the watch body 1454 from the wristband 1462 in order to reduce the encumbrance of the wrist-wearable device 1450 on the user. The wrist-wearable device 1450 can include a coupling surface on the watch body 1454 and/or a coupling mechanism 1460 (e.g., a cradle, a tracking strap, a support base, a clasp). A user can perform any type of motion to couple the watch body 1454 to the wristband 1462 and to decouple the watch body 1454 from the wristband 1462. For example, a user can twist, slide, turn, push, pull, or rotate the watch body 1454 relative to the wristband 1462, or a combination thereof, to attach the watch body 1454 to the wristband 1462 and to detach the watch body 1454 from the wristband 1462.
As shown in the example of fig. 14A, the wristband coupling mechanism 1460 can include a frame or shell that allows the coupling surface of the watch body 1454 to be retained within the wristband coupling mechanism 1460. The watch body 1454 can be detachably coupled to the wristband 1462 through a friction fit, a magnetic coupling, a rotation-based connector, a shear-pin coupling, a retention spring, one or more magnets, a clip, a pin, hook and loop fasteners, or a combination thereof. In some examples, the watch body 1454 can be decoupled from the wristband 1462 by actuating a release mechanism 1470. The release mechanism 1470 can include, but is not limited to, a button, a knob, a plunger, a handle, a lever, a fastener, a clasp, a dial, a latch, or a combination thereof.
As shown in figs. 14A-14B, the coupling mechanism 1460 can be configured to receive a coupling surface proximate to a bottom side of the watch body 1454 (e.g., the side opposite the front side of the watch body 1454 where the display 1456 is located), such that a user can push the watch body 1454 downward into the coupling mechanism 1460 to attach the watch body 1454 to the coupling mechanism 1460. In some embodiments, the coupling mechanism 1460 can instead be configured to receive a top side of the watch body 1454 (e.g., the side proximate to the front side of the watch body 1454 where the display 1456 is located), which is pushed up into the cradle rather than being pushed down into the coupling mechanism 1460. In some embodiments, the coupling mechanism 1460 is an integrated component of the wristband 1462 such that the wristband 1462 and the coupling mechanism 1460 form a single, unitary structure.
The wrist wearable device 1450 can include a single release mechanism 1470 or multiple release mechanisms 1470 (e.g., two release mechanisms 1470, such as spring-loaded buttons, positioned on opposite sides of the wrist wearable device 1450). As shown in fig. 14A, a release mechanism 1470 can be positioned on the watch body 1454 and/or the wristband coupling mechanism 1460. Although fig. 14A shows the release mechanism 1470 positioned at a corner of the watch body 1454 and a corner of the wristband coupling mechanism 1460, the release mechanism 1470 can be positioned at any location on the watch body 1454 and/or the wristband coupling mechanism 1460 that is convenient for a user of the wrist wearable device 1450 to actuate. A user of the wrist wearable device 1450 can actuate the release mechanism 1470 by pushing, rotating, lifting, depressing, moving, or performing another action on the release mechanism 1470. Actuating the release mechanism 1470 can release (e.g., decouple) the watch body 1454 from the wristband coupling mechanism 1460 and the wristband 1462, allowing the user to use the watch body 1454 independently of the wristband 1462, and vice versa. For example, decoupling the watch body 1454 from the wristband 1462 can allow the user to capture images using the rear image sensor 1425B.
Fig. 14B includes a top view of an example of the wrist wearable device 1450. The example of the wrist wearable device 1450 shown in figs. 14A-14B can include a coupling mechanism 1460 (as shown in fig. 14B, the shape of the coupling mechanism can correspond to the shape of the watch body 1454 of the wrist wearable device 1450). The watch body 1454 can be detachably coupled to the coupling mechanism 1460 through a friction fit, a magnetic coupling, a rotation-based connector, a shear-pin coupling, a retention spring, one or more magnets, a clip, a pin, hook and loop fasteners, or a combination thereof.
In some examples, the watch body 1454 can be decoupled from the coupling mechanism 1460 by actuating the release mechanism 1470. The release mechanism 1470 can include, but is not limited to, a button, a knob, a plunger, a handle, a lever, a fastener, a clasp, a dial, a latch, or a combination thereof. In some examples, wristband system functions can be performed independently in the watch body 1454, independently in the coupling mechanism 1460, and/or in communication between the watch body 1454 and the coupling mechanism 1460. The coupling mechanism 1460 can be configured to operate independently of the watch body 1454 (e.g., to perform functions independently). Additionally or alternatively, the watch body 1454 can be configured to operate independently of the coupling mechanism 1460 (e.g., to perform functions independently). As described below with reference to the block diagram of fig. 14C, the coupling mechanism 1460 and/or the watch body 1454 can each include the independent resources required to perform functions independently. For example, the coupling mechanism 1460 and/or the watch body 1454 can each include a power source (e.g., a battery), memory, data storage, a processor (e.g., a central processing unit (CPU)), communication components, a light source, and/or input/output devices.
The wrist wearable device 1450 can have various peripheral buttons 1472, 1474, and 1476 for performing various operations at the wrist wearable device 1450. Further, various sensors, including one or both of the sensors 1464 and 1465, can be located on the bottom of the watch body 1454 and can optionally be used even when the watch body 1454 is detached from the wristband 1462.
Fig. 14C is a block diagram of a computing system 14000 in accordance with at least one embodiment of the present disclosure. The computing system 14000 includes an electronic device 14002, which can be, for example, a wrist wearable device. The wrist wearable device 1450 described in detail above with reference to fig. 14A-14B is an example of the electronic device 14002, and thus the electronic device 14002 will be understood to include the components for the computing system 14000 shown and described below. In some embodiments, all or most of the components of computing system 14000 are included in a single integrated circuit. In some embodiments, computing system 14000 can have a split architecture (e.g., a split mechanical architecture, a split electronic architecture) between a watch body (e.g., watch body 1454 in fig. 14A-14B) and a wristband (e.g., wristband 1462 in fig. 14A-14B). The electronic device 14002 can include a processor (e.g., central processing unit 14004), a controller 14010, a peripheral interface 14014 including one or more sensors 14100 and various peripherals, a power source (e.g., power system 14300), and memory (e.g., memory 14400) including an operating system (e.g., operating system 14402), data (e.g., data 14410), and one or more applications (e.g., application 14430).
In some embodiments, the computing system 14000 includes a power system 14300 that includes a charger input 14302, a power-management integrated circuit (PMIC) 14304, and a battery 14306.
In some embodiments, the watch body and wristband can each be an electronic device 14002 each having a corresponding battery (e.g., battery 14306) and can share power with each other. The watch body and wristband can receive electrical charge using a variety of techniques. In some embodiments, the watch body and wristband can receive electrical charge using a wired charging assembly (e.g., a power cord). Alternatively or additionally, the watch body and/or wristband can be configured for wireless charging. For example, the portable charging device can be designed to mate with a portion of the watch body and/or wristband and wirelessly deliver available power to the battery of the watch body and/or wristband.
The watch body and wristband can have independent power systems 14300 to enable independent operation of each. The watch body and wristband can also share power (e.g., one can charge the other) via respective PMICs 14304 that can share power through power and ground conductors, and/or through a wireless charging antenna.
In some embodiments, the peripheral interface 14014 can include one or more sensors 14100. The sensors 14100 can include a coupling sensor 14102 for detecting when the electronic device 14002 is coupled with another electronic device 14002 (e.g., the watch body can detect when it is coupled to the wristband, and vice versa). The sensors 14100 can include an imaging sensor 14104 for collecting imaging data, which can optionally be the same device as one or more of the cameras 14218. In some embodiments, the imaging sensor 14104 can be separate from the cameras 14218. In some embodiments, the sensors 14100 include an SpO2 sensor 14106. In some embodiments, the sensors 14100 include an EMG sensor 14108 for detecting muscle movement of a user of, e.g., the electronic device 14002. In some embodiments, the sensors 14100 include a capacitive sensor 14110 for detecting changes in electric potential at a portion of the user's body. In some embodiments, the sensors 14100 include a heart rate sensor 14112. In some embodiments, the sensors 14100 include an inertial measurement unit (IMU) sensor 14114 for detecting changes in acceleration of, for example, the user's hand.
In some embodiments, the peripheral interface 14014 includes a near-field communication (NFC) component 14202, a global-positioning system (GPS) component 14204, a long-term evolution (LTE) component 14206, and/or a Wi-Fi or bluetooth communication component 14208.
In some embodiments, the peripheral interface includes one or more buttons (e.g., peripheral buttons 1457, 1458, and 1459 of fig. 14B) that, when selected by a user, perform operations at the electronic device 14002.
The electronic device 14002 can include at least one display 14212 for displaying visual affordances to a user (including user interface elements and/or three-dimensional virtual objects). The display can also include a touch screen for receiving user input (such as touch gestures, swipe gestures, and the like).
The electronic device 14002 can include at least one speaker 14214 and at least one microphone 14216 for providing audio signals to the user and receiving audio input from the user. The user can provide user input through the microphone 14216 and can also receive audio output from the speaker 14214 as part of a haptic event provided by the haptic controller 14012.
The electronic device 14002 can include at least one camera 14218 including a front camera 14220 and a rear camera 14222. In some implementations, the electronic device 14002 can be a head wearable device, and one of the cameras 14218 can be integrated with a lens assembly of the head wearable device.
The electronic devices 14002 can include one or more haptic controllers 14012 and associated components for providing haptic events at one or more of the electronic devices 14002 (e.g., a vibratory sensation or audio output in response to an event at the electronic device 14002). The haptic controller 14012 can communicate with one or more electroacoustic devices, including one or more of the speakers 14214 and/or other audio components, and/or with electromechanical devices that convert energy into linear motion (such as motors, solenoids, electroactive polymers, piezoelectric actuators, and electrostatic actuators), or other haptic output generating components (e.g., components that convert electrical signals into haptic outputs on the device). The haptic controller 14012 can provide haptic events that can be sensed by a user of the electronic device 14002. In some embodiments, the one or more haptic controllers 14012 can receive input signals from an application of the one or more applications 14430.
Memory 14400 optionally includes high-speed random access memory, and optionally also includes non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid state storage devices. Access to memory 14400 by other components of electronic device 14002 (such as one or more processors of central processing unit 14004) and peripheral device interface 14014 is optionally controlled by a memory controller of controller 14010.
In some embodiments, the software components stored in the memory 14400 may include one or more operating systems 14402 (e.g., a Linux-based operating system, an Android operating system, etc.). The memory 14400 can also include data 14410, including structured data (e.g., SQL databases, MongoDB databases, GraphQL data, JSON data, etc.). The data 14410 can include profile data 14412, sensor data 14414, and media file data 14414.
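For illustration only, the following non-limiting sketch shows one hypothetical way that profile, sensor, and media records of the data 14410 could be organized as structured JSON; the field names and values are assumptions and are not part of this disclosure.

```python
import json

# Illustrative example of how structured data (profile, sensor, and media
# records) might be serialized as JSON; all field names are hypothetical.
data_record = {
    "profile": {"user_id": "u-0001", "wrist": "left"},
    "sensor": [
        {"type": "heart_rate", "t_ms": 1000, "value": 68},
        {"type": "spo2", "t_ms": 1000, "value": 0.98},
    ],
    "media": [{"file": "capture_0001.jpg", "camera": "rear"}],
}

serialized = json.dumps(data_record)
assert json.loads(serialized)["sensor"][0]["type"] == "heart_rate"
```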
In some embodiments, the software components stored in memory 14400 include one or more application programs 14430 configured to perform operations at electronic device 14002. In some embodiments, the one or more applications 14430 include one or more communication interface modules 14432, one or more graphics modules 14434, one or more camera application modules 14436. In some embodiments, multiple applications 14430 can work in conjunction with one another to perform various tasks at one or more electronic devices 14002.
It should be appreciated that the electronic device 14002 is merely one example of an electronic device within the computing system 14000, and that other electronic devices 14002 that are part of the computing system 14000 can have more or fewer components than shown, can optionally combine two or more of the components shown, or can have a different configuration or arrangement of the components. The various components shown in fig. 14C are implemented in hardware, software, firmware, or a combination thereof, including one or more signal processing and/or application-specific integrated circuits.
As shown in the lower portion of fig. 14C, various individual components of a wrist-wearable device can be examples of the electronic device 14002. For example, some or all of the components shown for the electronic device 14002 can be housed or otherwise disposed in a combined watch device 14002A, or distributed among individual components such as a capsule device housing 14002B, a cradle portion 14002C, and/or a wristband.
Fig. 14D illustrates a wearable device 14170 according to some embodiments. In some embodiments, the wearable device 14170 is used to generate control information (e.g., sensed data regarding neuromuscular signals, or instructions to execute certain commands after the data is sensed) for causing a computing device to execute one or more input commands. In some embodiments, the wearable device 14170 includes a plurality of neuromuscular sensors 14176. In some embodiments, the plurality of neuromuscular sensors 14176 includes a predetermined number (e.g., 16) of neuromuscular sensors (e.g., EMG sensors) arranged circumferentially around an elastic band 14174. The plurality of neuromuscular sensors 14176 can include any suitable number of neuromuscular sensors. In some embodiments, the number and arrangement of the neuromuscular sensors 14176 depend on the particular application for which the wearable device 14170 is used. For example, a wearable device 14170 configured as an armband, a wristband, or a chest band may include a plurality of neuromuscular sensors 14176 with different numbers and arrangements of sensors for each use case, such as a medical use case as compared to a gaming or general daily use case. For example, at least 16 neuromuscular sensors 14176 may be arranged circumferentially around the elastic band 14174.
In some embodiments, the elastic band 14174 is configured to be worn around a user's lower arm or wrist. The elastic band 14174 may include a flexible electronic connector 14172. In some embodiments, the flexible electronic connector 14172 interconnects separate sensors and electronic circuitry enclosed in one or more sensor housings. Alternatively, in some embodiments, the flexible electronic connector 14172 interconnects separate sensors and electronic circuitry external to the one or more sensor housings. Each of the plurality of neuromuscular sensors 14176 can include a skin-contacting surface that includes one or more electrodes. One or more of the plurality of neuromuscular sensors 14176 can be coupled together using flexible electronics incorporated into the wearable device 14170. In some embodiments, one or more of the plurality of neuromuscular sensors 14176 can be integrated into a woven fabric, wherein the one or more neuromuscular sensors 14176 are woven into the fabric and mimic the flexibility of the fabric (e.g., the one or more neuromuscular sensors 14176 can be constructed from a series of woven yarns). In some embodiments, the sensors are flush with the surface of the textile and are indistinguishable from the textile when worn by the user.
Fig. 14E illustrates a wearable device 14179 according to some embodiments. The wearable device 14179 includes paired sensor channels 14185a-14185f along an inner surface of a wearable structure 14175, configured to detect neuromuscular signals. A different number of paired sensor channels can be used (e.g., one pair of sensors, three pairs of sensors, four pairs of sensors, or six pairs of sensors). The wearable structure 14175 can include a band portion 14190, a capsule 14195, and a cradle portion (not shown) coupled to the band portion 14190 to allow the capsule 14195 to be detachably coupled to the band portion 14190. For embodiments in which the capsule 14195 is removable, the capsule 14195 can be referred to as a removable structure, such that in these embodiments the wearable device includes a wearable portion (e.g., the band portion 14190 and the cradle portion) and a removable structure (the removable capsule, which can be detached from the cradle). In some embodiments, the capsule 14195 includes one or more processors and/or other components of the wearable device 1688 described with reference to figs. 16A and 16B. The wearable structure 14175 is configured to be worn by a user. More specifically, the wearable structure 14175 is configured to couple the wearable device 14179 to a wrist, an arm, a forearm, or another portion of the user's body. Each pair of sensor channels 14185a-14185f includes two electrodes 14180 (e.g., electrodes 14180a-14180h) for sensing neuromuscular signals based on differential sensing within each respective sensor channel. In accordance with some embodiments, the wearable device 14179 further includes an electrical ground electrode and a shielding electrode.
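As a purely illustrative, non-limiting sketch, the following shows differential sensing of the kind described above, in which each channel is formed as the difference between the signals at its two electrodes; the toy signals and the simple rectified-average envelope are assumptions included only for illustration.

```python
# Illustrative sketch of differential sensing: each channel is the difference
# of its two electrode signals, which rejects interference common to both
# electrodes; a rectified moving average then gives a crude activity envelope.
def differential_channel(electrode_a, electrode_b):
    return [a - b for a, b in zip(electrode_a, electrode_b)]

def activity_envelope(channel, window=4):
    rectified = [abs(x) for x in channel]
    return [
        sum(rectified[max(0, i - window + 1): i + 1]) / min(window, i + 1)
        for i in range(len(rectified))
    ]

# Toy signals: a common-mode offset of 0.5 plus a differential burst.
a = [0.5, 0.5, 1.5, 2.0, 0.5, 0.5]
b = [0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
print(activity_envelope(differential_channel(a, b)))
```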
The techniques described above can be used with any device for sensing neuromuscular signals, including the arm wearable device of fig. 14A-14C, but can also be used with other types of wearable devices for sensing neuromuscular signals, such as body wearable devices or head wearable devices that may have neuromuscular sensors closer to the brain or spine.
In some embodiments, a wrist wearable device can be used in conjunction with the head wearable devices described below, and the wrist wearable device can also be configured to allow a user to control aspects of the artificial reality (e.g., by using EMG-based gestures to control user interface objects in the artificial reality and/or by allowing the user to interact with a touch screen on the wrist wearable device). Having thus described example wrist wearable devices, attention is now directed to example head wearable devices, such as AR glasses and VR headsets.
Example head wearable device
Fig. 15A illustrates an example AR system 1500 that can be controlled using a knitted structure (e.g., a wearable glove or other wearable structure formed according to the knitting techniques described herein) according to some embodiments. In fig. 15A, AR system 1500 includes an eyeglass device having a frame 1502 configured to hold left display device 1506-1 and right display device 1506-2 in front of a user's eyes. The display devices 1506-1 and 1506-2 may act together or independently to present one or a series of images to the user. Although the AR system 1500 includes two displays, embodiments of the present disclosure may be implemented in an AR system having a single near-eye display (NED) or more than two near-eye displays (NED).
In some implementations, the AR system 1500 includes one or more sensors, such as acoustic sensors 1504. For example, the acoustic sensor 1504 can generate a measurement signal in response to movement of the AR system 1500, and can be located on substantially any portion of the frame 1502. Any of the sensors may be a position sensor, IMU, depth camera assembly, or any combination thereof. In some embodiments, AR system 1500 includes more or fewer sensors than shown in fig. 15A. In embodiments where the sensor includes an IMU, the IMU may generate calibration data based on measurement signals from the sensor. Examples of sensors include, but are not limited to, accelerometers, gyroscopes, magnetometers, other suitable types of sensors that detect motion, sensors for IMU error correction, or some combination thereof.
In some embodiments, the AR system 1500 includes a microphone array having a plurality of acoustic sensors 1504-1 through 1504-8, collectively referred to as the acoustic sensors 1504. The acoustic sensors 1504 may be transducers that detect changes in air pressure caused by sound waves. In some embodiments, each acoustic sensor 1504 is configured to detect sound and convert the detected sound into an electronic format (e.g., an analog or digital format). In some embodiments, the microphone array includes ten acoustic sensors: acoustic sensors 1504-1 and 1504-2, designed to be placed within the respective ears of the user; acoustic sensors 1504-3, 1504-4, 1504-5, 1504-6, 1504-7, and 1504-8, located at different locations on the frame 1502; and acoustic sensors located on a corresponding neckband, where the neckband is an optional component of the system that is not present in certain embodiments of the artificial reality systems discussed herein.
The configuration of the acoustic sensors 1504 of the microphone array may vary. Although the AR system 1500 is shown in fig. 15A as having ten acoustic sensors 1504, the number of acoustic sensors 1504 may be more or fewer than ten. In some cases, using more acoustic sensors 1504 increases the amount of audio information collected and/or the sensitivity and accuracy of that information. Conversely, in some cases, using fewer acoustic sensors 1504 reduces the computing power required by a controller to process the collected audio information. Furthermore, the location of each acoustic sensor 1504 of the microphone array may vary. For example, the location of an acoustic sensor 1504 may include a defined location on the user, defined coordinates on the frame 1502, an orientation associated with each acoustic sensor, or some combination thereof.
Acoustic sensors 1504-1 and 1504-2 may be positioned on different portions of a user's ear. In some embodiments, there are additional acoustic sensors on or around the ear in addition to the acoustic sensor 1504 within the ear canal. In some cases, the acoustic sensor is positioned near the user's ear canal so that the microphone array can collect information about how sound reaches the ear canal. By positioning at least two of the acoustic sensors 1504 on either side of the user's head (e.g., as binaural microphones), the AR device 1500 is able to simulate binaural hearing and capture a 3D stereo sound field around the user's head. In some embodiments, acoustic sensors 1504-1 and 1504-2 are connected to AR system 1500 via a wired connection, and in other embodiments, acoustic sensors 1504-1 and 1504-2 are connected to AR system 1500 via a wireless connection (e.g., a bluetooth connection). In some implementations, the AR system 1500 does not include acoustic sensors 1504-1 and 1504-2.
The acoustic sensor 1504 on the frame 1502 may be positioned along the length of the temple, across the nose bridge, above or below the display device 1506, or some combination thereof. The acoustic sensor 1504 may be oriented such that the microphone array is capable of detecting sound in a wide range of directions around a user wearing the AR system 1500. In some implementations, a calibration procedure is performed during the manufacture of the AR system 1500 to determine the relative positioning of each acoustic sensor 1504 in the microphone array.
In some embodiments, the eyewear device further includes, or is communicatively coupled to, an external device (e.g., a paired device), such as the optional neck strap described above. In some embodiments, the optional neck strap is coupled to the eyewear device via one or more connectors. The connectors may be wired or wireless and may include electrical and/or non-electrical (e.g., structural) components. In some embodiments, the eyewear device and the neck strap operate independently without any wired or wireless connection between them. In some embodiments, components of the eyewear device and the neck strap can be located on one or more additional peripheral devices paired with the eyewear device, the neck strap, or some combination thereof. Furthermore, the neck strap is intended to represent any suitable type or form of paired device. Accordingly, the following discussion of the neck strap also applies to various other paired devices, such as smart watches, smart phones, wristbands, other wearable devices, hand-held controllers, tablet computers, or laptop computers.
In some cases, pairing an external device (such as the optional neck strap) with the AR eyewear device enables the eyewear device to retain the form factor of a pair of eyeglasses while still providing sufficient battery and computing power for expanded capabilities. Some or all of the battery power, computing resources, and/or additional features of the AR system 1500 may be provided by the paired device or shared between the paired device and the eyewear device, thereby reducing the weight, heat profile, and overall size of the eyewear device while still retaining the desired functionality. For example, the neck strap may allow components that would otherwise be included in the eyewear device to be included in the neck strap, thereby shifting the weight load from the user's head to the user's shoulders. In some embodiments, the neck strap has a larger surface area over which heat can be diffused and dispersed into the surrounding environment. Thus, the neck strap may allow greater battery and computing capacity than would otherwise be possible on a standalone eyewear device. Because weight carried in the neck strap may be less intrusive to the user than weight carried in the eyewear device, the user may tolerate wearing the lighter eyewear device and carrying or wearing the paired device for longer periods of time than the user would tolerate wearing a heavier standalone eyewear device, thereby enabling the user to more fully integrate the artificial reality environment into their daily activities.
In some embodiments, the optional neck strap is communicatively coupled with the eyewear device and/or other devices. Other devices may provide certain functionality (e.g., tracking, positioning, depth mapping, processing, storage, etc.) to the AR system 1500. In some embodiments, the neck strap includes a controller and a power source. In some embodiments, the acoustic sensor of the neck strap is configured to detect sound and convert the detected sound into an electronic format (analog or digital).
The controller of the neck strap processes information generated by sensors on the neck strap and/or the AR system 1500. For example, the controller may process information from the acoustic sensor 1504. For each detected sound, the controller may perform a direction-of-arrival (DOA) estimation to estimate the direction in which the detected sound arrives at the microphone array. The controller may populate the audio data set with this information when the microphone array detects sound. In embodiments where the AR system 1500 includes an IMU, the controller may calculate all inertial and spatial calculations from the IMU located on the eyeglass device. The connector may communicate information between the eyewear device and the neck strap and between the eyewear device and the controller. The information may be in the form of: optical data, electrical data, wireless data, or any other form of transmissible data. Moving the processing of information generated by the eyeglass device to the neck strap may reduce weight and heat in the eyeglass device, making it more comfortable and safer for the user.
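For purely illustrative purposes, the following non-limiting sketch shows one simplified way a direction-of-arrival estimate could be derived for a single pair of microphones from the time difference of arrival; the microphone spacing, delay value, and function names are hypothetical assumptions and do not describe the controller's actual algorithm.

```python
import math

SPEED_OF_SOUND_M_S = 343.0

# Simplified two-microphone direction-of-arrival estimate: the time difference
# of arrival (TDOA) between the microphones, together with their spacing,
# gives the angle of the incoming sound relative to the microphone axis.
def doa_from_tdoa(tdoa_s: float, mic_spacing_m: float) -> float:
    """Return the arrival angle in degrees (0 = broadside, +/-90 = end-fire)."""
    sin_theta = max(-1.0, min(1.0, tdoa_s * SPEED_OF_SOUND_M_S / mic_spacing_m))
    return math.degrees(math.asin(sin_theta))

# Example: a 0.1 ms delay across microphones spaced 8 cm apart.
print(round(doa_from_tdoa(1.0e-4, 0.08), 1))  # roughly 25.4 degrees
```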
In some embodiments, a power source in the neck strap provides power to the eyewear device and the neck strap. The power source may include, but is not limited to, a lithium ion battery, a lithium polymer battery, a disposable lithium battery, an alkaline battery, or any other form of power storage device. In some embodiments, the power source is a wired power source.
As described, some artificial reality systems may essentially replace one or more of the user's sensory perceptions of the real world with a virtual experience, rather than mixing artificial reality with actual reality. One example of this type of system is a head mounted display system, such as VR system 1550 in fig. 15B, that covers most or all of the user's field of view.
Fig. 15B illustrates a VR system 1550 (also referred to herein as a VR headset) in accordance with some embodiments. The VR system 1550 includes a head mounted display (HMD) 1552. The HMD 1552 includes a front body 1556 and a frame 1554 (e.g., a strap or band) shaped to fit around the user's head. In some embodiments, the HMD 1552 includes output audio transducers 1558-1 and 1558-2, as shown in fig. 15B. In some embodiments, the front body 1556 and/or the frame 1554 include one or more electronic elements, including one or more electronic displays, one or more IMUs, one or more tracking emitters or detectors, and/or any other suitable devices or sensors for creating an artificial reality experience.
Artificial reality systems may include various types of visual feedback mechanisms. For example, the display devices in the AR system 1500 and/or the VR system 1550 may include one or more liquid crystal displays (LCDs), light emitting diode (LED) displays, organic light emitting diode (OLED) displays, and/or any other suitable type of display screen. An artificial reality system may include a single display screen for both eyes, or may provide a display screen for each eye, which allows additional flexibility for zoom adjustment or for correcting refractive errors associated with the user's vision. Some artificial reality systems also include an optical subsystem with one or more lenses (e.g., conventional concave or convex lenses, Fresnel lenses, or adjustable liquid lenses) through which the user can view the display screen.
In addition to or instead of using display screens, some artificial reality systems include one or more projection systems. For example, the display devices in the AR system 1500 and/or the VR system 1550 may include micro-LED projectors that project light (e.g., using a waveguide) into a display device, such as a transparent combiner lens that allows ambient light to pass through. The display device may refract the projected light toward the user's pupil, enabling the user to view artificial reality content and the real world simultaneously. Artificial reality systems may also be configured with any other suitable type or form of image projection system.
Artificial reality systems may also include various types of computer vision components and subsystems. For example, the AR system 1500 and/or the VR system 1550 can include one or more optical sensors, such as two-dimensional (2D) or three-dimensional (3D) cameras, time-of-flight depth sensors, single-beam or scanning laser rangefinders, 3D LiDAR sensors, and/or any other suitable type or form of optical sensor. An artificial reality system may process data from one or more of these sensors to identify the user's location, map the real world, provide the user with context about the real-world environment, and/or perform various other functions. For example, fig. 15B shows the VR system 1550 having cameras 1560-1 and 1560-2, which can be used to provide depth information for creating a voxel field and a two-dimensional grid that provide object information to the user to avoid collisions. Fig. 15B also shows that the VR system includes one or more additional cameras 1562 configured to augment the cameras 1560-1 and 1560-2 by providing additional information. For example, the additional cameras 1562 can be used to provide color information that the cameras 1560-1 and 1560-2 do not capture. In some embodiments, the cameras 1560-1 and 1560-2 and the additional cameras 1562 can include optional IR-cut filters configured to remove IR light received at the respective camera sensors.
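As a purely illustrative, non-limiting sketch, the following shows one hypothetical way depth samples could be binned into a coarse two-dimensional occupancy grid of the kind mentioned above for collision avoidance; the cell size, grid dimensions, and sample points are arbitrary assumptions.

```python
# Rough sketch of turning depth samples into a coarse two-dimensional
# occupancy grid that could warn a user about nearby obstacles; the
# resolution and thresholds are arbitrary illustrative values.
def occupancy_grid(points_xz, cell_size_m=0.25, grid_dim=8):
    """points_xz: (x, z) coordinates in meters relative to the headset."""
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    for x, z in points_xz:
        col = int(x / cell_size_m) + grid_dim // 2   # center x around the user
        row = int(z / cell_size_m)                    # z increases away from user
        if 0 <= row < grid_dim and 0 <= col < grid_dim:
            grid[row][col] = 1
    return grid

obstacle_points = [(0.1, 0.6), (0.15, 0.65), (-0.9, 1.4)]
for row in occupancy_grid(obstacle_points):
    print(row)
```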
In some embodiments, AR system 1500 and/or VR system 1550 can include a haptic feedback system that can be incorporated into headwear, gloves, tights, hand-held controllers, environmental devices (e.g., chairs or foot pads), and/or any other type of device or system, such as the wearable devices discussed herein. The haptic feedback system may provide various types of skin feedback including vibration, force, traction, shear, texture, and/or temperature. Haptic feedback systems may also provide various types of kinesthetic feedback, such as motion and compliance. Haptic feedback may be implemented using motors, piezoelectric actuators, fluid systems, and/or various other types of feedback mechanisms. The haptic feedback system may be implemented independently of, within, and/or in conjunction with other artificial reality devices.
The techniques described above can be used with any device for interacting with an artificial reality environment, including the head wearable device of fig. 15A-15B, but can also be used with other types of wearable devices for sensing neuromuscular signals (such as body wearable devices or head wearable devices that may have neuromuscular sensors closer to the brain or spine). Having thus described example wrist wearable devices and head wearable devices, attention is now directed to example feedback systems that can be integrated into the devices described above or as separate devices.
Example feedback device
Fig. 17 is a schematic diagram illustrating additional components (e.g., additional components that allow for the use of aspects of the braided structures described herein to provide haptic feedback) that can be used with the artificial reality system 1600 of fig. 16A and 16B, according to some embodiments. For ease of illustration, the components in fig. 17 are illustrated in a particular arrangement, and those skilled in the art will appreciate that other arrangements are possible. Moreover, while some example features are shown, various other features are not shown for the sake of brevity and so as not to obscure aspects of the example embodiments disclosed herein.
The artificial reality system 1600 may also provide feedback to the user that an action was performed. The feedback provided may be visual feedback via an electronic display in the head wearable device 1611 (e.g., displaying a simulated hand as it picks up and lifts a virtual coffee cup) and/or haptic feedback via the haptic assemblies 1722 in the device 1720. For example, the haptic feedback may prevent (or at least impede) one or more of the user's fingers from curling past a certain point, to simulate the feel of touching a solid coffee cup. To this end, the device 1720 changes (directly or indirectly) the pressurized state of one or more of the haptic assemblies 1722. Each haptic assembly 1722 includes a mechanism that provides resistance at least when the respective haptic assembly 1722 transitions from a first pressurized state (e.g., at atmospheric pressure or deflated) to a second pressurized state (e.g., inflated to a threshold pressure). The structure of the haptic assemblies 1722 can be integrated into a variety of devices configured to contact or be in proximity to the user's skin, including, but not limited to, glove-worn devices, body-worn garment devices, and headset devices (e.g., the artificial reality headset 803 in figs. 8A-8B).
As described above, the haptic assemblies 1722 described herein are configured to transition between a first pressurized state and a second pressurized state to provide haptic feedback to a user. Due to the ever-changing nature of artificial reality, haptic assembly 1722 may need to transition between the two states hundreds or possibly thousands of times during a single use. Thus, the haptic assemblies 1722 described herein are durable and are designed to quickly transition from state to state. To provide some context, in the first pressurized state, haptic assembly 1722 does not interfere with free movement of a portion of the wearer's body. For example, one or more haptic assemblies 1722 incorporated into a glove are made of a flexible material (e.g., an electrostatic zipper actuator) that does not interfere with free movement of a wearer's hand and fingers. The haptic assembly 1722 is configured to conform to a shape of a body portion of a wearer when in a first pressurized state. However, once in the second pressurized state, haptic assembly 1722 is configured to resist free movement of the body part of the wearer. For example, when the haptic assemblies 1722 are in the second pressurized state, the respective haptic assembly 1722 (or the respective haptic assemblies) can limit movement of the wearer's finger (e.g., prevent finger curling or stretching). Further, once in the second pressurized state, haptic assemblies 1722 may take on different shapes, with some haptic assemblies 1722 configured to take on a planar rigid shape (e.g., planar and rigid) and some other haptic assemblies 1722 configured to at least partially bend or flex.
As a non-limiting example, the system 1700 includes a plurality of devices 1720-A, 1720-B, ..., 1720-N, each of which includes a garment 1702 and one or more haptic assemblies 1722 (e.g., haptic assemblies 1722-A, 1722-B, ..., 1722-N). As described above, the haptic assemblies 1722 are configured to provide haptic stimulation to a wearer of the device 1720. The garment 1702 of each device 1720 can be any of various garments (e.g., gloves, socks, shirts, or pants), and accordingly a user can wear multiple devices 1720 that provide haptic stimulation to different parts of the body. Each haptic assembly 1722 is coupled to (e.g., embedded in or attached to) the garment 1702. Furthermore, each haptic assembly 1722 includes a support structure 1704 and at least one bladder 1706. The bladder 1706 (e.g., a membrane) is a sealed, inflatable pocket made of a durable, puncture-resistant material such as thermoplastic polyurethane (TPU), a flexible polymer, or the like. The bladder 1706 contains a medium (e.g., a fluid such as air, an inert gas, or even a liquid) that can be added to or removed from the bladder 1706 to change the pressure (e.g., the fluid pressure) inside the bladder 1706. The support structure 1704 is made of a material that is stronger and stiffer than the material of the bladder 1706. A respective support structure 1704 coupled to a respective bladder 1706 is configured to reinforce the respective bladder 1706 as it changes shape and size due to changes in pressure (e.g., fluid pressure) inside the bladder.
The system 1700 also includes a controller 1714 and a pressure changing device 1710. In some embodiments, the controller 1714 is part of a computer system 1730 (e.g., a processor of the computer system 1730). The controller 1714 is configured to control the operation of the pressure changing device 1710 and, in turn, the devices 1720. For example, the controller 1714 sends one or more signals to the pressure changing device 1710 to activate the pressure changing device 1710 (e.g., to turn it on or off). The one or more signals can specify a desired pressure (e.g., in pounds per square inch) to be output by the pressure changing device 1710. The one or more signals, and the pressure output by the pressure changing device 1710, can be generated based on information collected by the sensors 1625 in figs. 16A and 16B. For example, based on information collected by the sensors 1625 in figs. 16A and 16B (e.g., the user contacting a virtual coffee cup), the one or more signals can cause the pressure changing device 1710 to increase the pressure (e.g., the fluid pressure) inside a first haptic assembly 1722 at a first time. Then, based on additional information collected by the sensors 1625 and/or 1724 (e.g., the user grasping and lifting the virtual coffee cup), the controller can send one or more additional signals that cause the pressure changing device 1710 to further increase the pressure inside the first haptic assembly 1722 at a second time after the first time. Further, the one or more signals can cause the pressure changing device 1710 to inflate one or more bladders 1706 in a first device 1720-A while one or more bladders 1706 in a second device 1720-B remain unchanged. In addition, the one or more signals can cause the pressure changing device 1710 to inflate one or more bladders 1706 in the first device 1720-A to a first pressure and to inflate one or more other bladders 1706 in the first device 1720-A to a second pressure different from the first pressure. Depending on the number of devices 1720 served by the pressure changing device 1710 and the number of bladders in each, many different inflation configurations can be achieved through the one or more signals, and the above examples are not meant to be limiting.
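For purely illustrative purposes, the following non-limiting sketch shows one hypothetical way a controller could select target pressures for individual bladders based on simulated contact forces and pass them toward a pressure changing device; the pressure limit, bladder names, and signal path are assumptions rather than the claimed implementation.

```python
# Illustrative control sketch: based on whether each finger is in contact with
# a virtual object and how hard it is pressing, the controller picks a target
# pressure (in PSI) for the corresponding bladder. Values are arbitrary.
MAX_PRESSURE_PSI = 5.0

def target_pressures(contacts):
    """contacts: dict mapping bladder name -> contact force in [0.0, 1.0]."""
    return {
        bladder: round(min(force, 1.0) * MAX_PRESSURE_PSI, 2)
        for bladder, force in contacts.items()
    }

def send_to_pressure_device(pressures):
    # Placeholder for the signal path to the pressure changing device.
    for bladder, psi in pressures.items():
        print(f"set {bladder} -> {psi} PSI")

# First the user touches the virtual cup, then grips it more firmly.
send_to_pressure_device(target_pressures({"index_tip": 0.2, "thumb_tip": 0.0}))
send_to_pressure_device(target_pressures({"index_tip": 0.8, "thumb_tip": 0.7}))
```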
The system 1700 may include an optional manifold 1712 between the pressure changing device 1710 and the device 1720. The manifold 1712 may include one or more valves (not shown) that pneumatically couple each haptic assembly 1722 with the pressure changing device 1710 via tubing 1708. In some embodiments, the manifold 1712 is in communication with the controller 1714, and the controller 1714 controls one or more valves of the manifold 1712 (e.g., the controller generates one or more control signals). The manifold 1712 is configured to switchably couple the pressure changing device 1710 with one or more haptic assemblies 1722 of the same or different devices 1720 based on one or more control signals from the controller 1714. In some embodiments, instead of pneumatically coupling the pressure changing device 1710 with the haptic assembly 1722 using the manifold 1712, the system 1700 may include multiple pressure changing devices 1710, wherein each pressure changing device 1710 is directly pneumatically coupled with a single (or multiple) haptic assembly 1722. In some embodiments, the pressure changing device 1710 and the optional manifold 1712 can be configured as part of one or more of the devices 1720 (not shown), while in other embodiments, the pressure changing device 1710 and the optional manifold 1712 can be configured external to the device 1720. A single pressure changing device 1710 may be shared by multiple devices 1720.
In some embodiments, the pressure changing device 1710 is a pneumatic device, a hydraulic device, a pneumatic hydraulic device, or some other device capable of adding and removing media (e.g., fluid, liquid, gas) from the one or more haptic assemblies 1722.
The devices shown in fig. 17 may be coupled via a wired connection (e.g., via bus 1709). Alternatively, one or more of the devices shown in fig. 17 may be connected wirelessly (e.g., via short-range communication signals). Having thus described an example wrist wearable device, an example head wearable device, and an example feedback device, attention is now directed to an example system that integrates one or more of the above devices.
Example System
Figs. 16A and 16B are block diagrams illustrating an example artificial reality system, according to some embodiments. According to some embodiments, the system 1600 includes one or more devices for facilitating interaction with an artificial reality environment. For example, the head wearable device 1611 can present a user interface to the user 16015 within an artificial reality environment. As a non-limiting example, the system 1600 includes one or more wearable devices that can be used in conjunction with one or more computing devices. In some embodiments, the system 1600 provides the functionality of a virtual reality device, an augmented reality device, a mixed-reality device, a hybrid-reality device, or a combination thereof. In some embodiments, the system 1600 provides the functionality of a user interface and/or one or more user applications (e.g., games, word processors, messaging applications, calendars, clocks, etc.).
The system 1600 can include one or more of a server 1670, an electronic device 1674 (e.g., a computer 1674a, a smartphone 1674b, a controller 1674c, and/or other devices), a head wearable device 1611 (e.g., the AR system 1500 or the VR system 1550), and/or a wrist wearable device 1688 (e.g., the wrist wearable device 16020). In some embodiments, one or more of the server 1670, the electronic device 1674, the head wearable device 1611, and/or the wrist wearable device 1688 are communicatively coupled via a network 1672. In some embodiments, the head wearable device 1611 is configured such that one or more operations are performed by the communicatively coupled wrist wearable device 1688, and/or both devices can also be connected to an intermediary device, such as the smartphone 1674b, the controller 1674c, or another device that provides instructions and data to, and between, the two devices. In some embodiments, the head wearable device 1611 is configured such that one or more operations are performed by a plurality of devices in conjunction with the wrist wearable device 1688. In some embodiments, instructions that cause the performance of one or more operations are controlled via an artificial reality processing module 1645. The artificial reality processing module 1645 can be implemented in one or more devices, such as one or more of the server 1670, the electronic device 1674, the head wearable device 1611, and/or the wrist wearable device 1688. In some embodiments, the one or more devices perform the operations of the artificial reality processing module 1645 using one or more respective processors, individually or in conjunction with at least one other device as described herein. In some embodiments, the system 1600 includes other wearable devices not shown in figs. 16A and 16B, such as rings, collars, socks, gloves, and the like.
In some embodiments, the system 1600 provides the functionality to control, or to provide commands to, one or more computing devices 1674 based on a wearable device (e.g., the head wearable device 1611 or the wrist wearable device 1688) determining a motor action or an intended motor action of the user. A motor action is an intended motor action when detected neuromuscular signals propagating along the neuromuscular pathways can be determined to constitute the motor action before the user performs the motor action or before the user completes the motor action. Motor actions can be detected based on the detected neuromuscular signals, but they can additionally (using fusion of various sensor inputs) or alternatively be detected using other types of sensors (such as a camera focused on observing hand movements and/or data from an inertial measurement unit used to detect a characteristic vibration sequence or other type of data corresponding to a particular in-air gesture). The one or more computing devices include one or more of a head mounted display, a smartphone, a tablet, a smart watch, a laptop, a computer system, an augmented reality system, a robot, a vehicle, a virtual avatar, a user interface, a wrist wearable device, and/or other electronic devices and/or control interfaces.
In some embodiments, the motion includes finger motion, hand motion, wrist motion, arm motion, pinch gesture, index finger motion, middle finger motion, ring finger motion, little finger motion, thumb motion, grasping (or making a fist) of the hand, waving the hand, and/or other motion of the user's hand or arm.
In some embodiments, a user can define one or more gestures using a learning module. In some embodiments, the user can enter a training phase in which a user-defined gesture is associated with one or more input commands that, when provided to a computing device, cause the computing device to perform an action. Similarly, the one or more input commands associated with the user-defined gesture can be used to cause the wearable device to perform one or more actions locally. Once trained, the user-defined gesture is stored in the memory 1660. Similar to the motor actions, the one or more processors 1650 can use the neuromuscular signals detected by the one or more sensors 1625 to determine that the user has performed a user-defined gesture.
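As a purely illustrative, non-limiting sketch, the following shows one hypothetical way user-defined gestures could be trained as averaged feature templates and later recognized by nearest-template matching; the feature vectors, gesture names, and distance threshold are assumptions, and feature extraction from the neuromuscular signals is outside the scope of the sketch.

```python
import math

# Minimal sketch of a user-defined gesture store: a training phase averages
# feature vectors into a per-gesture template, and recognition returns the
# nearest stored template within a distance threshold.
class GestureStore:
    def __init__(self):
        self.templates = {}  # gesture name -> template feature vector

    def train(self, name, feature_vectors):
        n = len(feature_vectors)
        dims = len(feature_vectors[0])
        self.templates[name] = [
            sum(v[d] for v in feature_vectors) / n for d in range(dims)
        ]

    def recognize(self, features, max_distance=1.0):
        best_name, best_dist = None, float("inf")
        for name, template in self.templates.items():
            dist = math.dist(features, template)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= max_distance else None

store = GestureStore()
store.train("pinch", [[0.9, 0.1], [1.0, 0.2]])
store.train("fist", [[0.1, 0.9], [0.2, 1.0]])
print(store.recognize([0.95, 0.12]))  # pinch
```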
The electronic device 1674 can also include a communication interface 1615, an interface 1620 (e.g., including one or more displays, lights, speakers, and tactile generators), one or more sensors 1625, one or more applications 1635, an artificial reality processing module 1645, one or more processors 1650, and memory 1660. The electronic device 1674 is configured to communicatively couple with the wrist-wearable device 1688 and/or the head-worn device 1611 (or other device) using the communication interface 1615. In some embodiments, electronic device 1674 is configured to be communicatively coupled to wrist-wearable device 1688 and/or head-wearable device 1611 (or other device) via an Application Programming Interface (API). In some implementations, the electronic device 1674 operates in conjunction with the wrist-wearable device 1688 and/or the head-wearable device 1611 to determine gestures and cause operations or actions to be performed at the communicatively coupled device.
The server 1670 includes a communication interface 1615, one or more application programs 1635, an artificial reality processing module 1645, one or more processors 1650, and memory 1660. In some embodiments, server 1670 is configured to receive sensor data from one or more devices, such as head wearable device 1611, wrist wearable device 1688, and/or electronic device 1674, and to use the received sensor data to identify gestures or user inputs. The server 1670 can generate instructions that cause operations and actions associated with the determined gesture or user input to be performed at a communicatively coupled device, such as the head wearable device 1611.
The head wearable device 1611 includes smart glasses (e.g., augmented reality glasses), an artificial reality headset (e.g., a VR/AR headset), or another head wearable device. In some embodiments, one or more components of the head wearable device 1611 are housed within a body of the HMD 1614 (e.g., the frame of the smart glasses, the body of the AR headset, etc.). In some embodiments, one or more components of the head wearable device 1611 are housed within, or coupled to, the lenses of the HMD 1614. Alternatively or additionally, in some embodiments, one or more components of the head wearable device 1611 are housed within a modular housing 1606. The head wearable device 1611 is configured to communicatively couple with the other electronic devices 1674 and/or the server 1670 using the communication interface 1615, as described above.
Fig. 16B depicts additional details of the HMD 1614 and modular housing 1606 described above with reference to fig. 16A, according to some embodiments.
The housing 1606 includes a communication interface 1615, circuitry 1646, a power source 1607 (e.g., a battery for powering one or more electronic components of the housing 1606 and/or for providing usable power to the HMD 1614), one or more processors 1650, and memory 1660. In some embodiments, the housing 1606 can include one or more supplemental components that add functionality to the HMD 1614. For example, in some embodiments, the housing 1606 can include one or more sensors 1625, an AR processing module 1645, one or more haptic generators 1621, one or more imaging devices 1655, one or more microphones 1613, one or more speakers 1617, and the like. The housing 1606 is configured to couple with the HMD 1614 via one or more telescoping side bands. More specifically, the housing 1606 is a modular portion of the head wearable device 1611 that can be detached from the head wearable device 1611 and replaced with another housing (with more or fewer functions). The modularity of the housing 1606 allows a user to adjust the functionality of the head wearable device 1611 based on their needs.
In some implementations, the communication interface 1615 is configured to communicatively couple the housing 1606 with the HMD 1614, server 1670, and/or other electronic devices 1674 (e.g., controller 1674c, tablet, computer, etc.). The communication interface 1615 is used to establish a wired or wireless connection between the housing 1606 and other devices. In some embodiments, the communication interface 1615 includes hardware capable of data communication using any of a variety of custom or standard wireless protocols (e.g., IEEE 802.15.4, wi-Fi, zigBee, 6LoWPAN, thread, Z-wave, smart bluetooth, isa100.11a, wireless HART, or MiWi), custom or standard wired protocols (e.g., ethernet or HomePlug), and/or any other suitable communication protocol. In some implementations, the housing 1606 is configured to be communicatively coupled with the HMD 1614 and/or other electronic device 1674 via an Application Programming Interface (API).
In some embodiments, the power source 1607 is a battery. The power source 1607 can be a primary or secondary battery source for the HMD 1614. In some embodiments, the power source 1607 provides usable power to one or more electronic components of the housing 1606 or the HMD 1614. For example, the power source 1607 can provide usable power to the haptic generators 1621, the speakers 1617, the HMD 1614, and the microphones 1613. In some embodiments, the power source 1607 is a rechargeable battery. In some embodiments, the power source 1607 is a modular battery that can be removed and replaced with a fully charged battery that has been charged separately.
The one or more sensors 1625 can include a heart rate sensor, a neuromuscular signal sensor (e.g., an Electromyography (EMG) sensor), an SpO2 sensor, an altimeter, a thermal sensor or thermocouple, an ambient light sensor, an ambient noise acoustic sensor, and/or an Inertial Measurement Unit (IMU). Additional non-limiting examples of the one or more sensors 1625 include, for example, infrared, pyroelectric, ultrasonic, microphone, laser, optical, doppler, gyroscope, accelerometer, resonant LC sensor, capacitive sensor, acoustic sensor, and/or inductive sensor. In some embodiments, one or more sensors 1625 are configured to collect additional data about the user (e.g., impedance of the user's body). Examples of sensor data output by these sensors include body temperature data, infrared rangefinder data, location information, motion data, activity recognition data, contour detection and recognition data, gesture data, heart rate data, and other wearable device data (e.g., biometric readings and output, accelerometer data). The one or more sensors 1625 can include a position sensing device (e.g., GPS) configured to provide position information. In some embodiments, data measured or sensed by one or more sensors 1625 is stored in memory 1660. In some implementations, the housing 1606 receives sensor data from a communicatively coupled device (such as the HMD 1614, the server 1670, and/or other electronic devices 1674). Alternatively, the housing 1606 can provide sensor data to the HMD 1614, server 1670, and/or other electronic devices 1674.
The one or more haptic generators 1621 can include one or more actuators (e.g., eccentric rotating mass (ERM) actuators, linear resonant actuators (LRAs), voice coil motors (VCMs), piezoelectric haptic actuators, thermoelectric devices, solenoid actuators, ultrasonic transducers or sensors, etc.). In some embodiments, the one or more haptic generators 1621 are hydraulic, pneumatic, electric, and/or mechanical actuators. In some embodiments, the one or more haptic generators 1621 are part of a surface of the housing 1606 that can be used to generate a haptic response (e.g., a thermal change at the surface, tightening or loosening of the strap, an increase or decrease in pressure, etc.). For example, the one or more haptic generators 1621 can apply vibration stimulation, pressure stimulation, squeeze stimulation, shear stimulation, temperature changes, or some combination thereof to a user. Further, in some embodiments, the one or more haptic generators 1621 include audio generation devices (e.g., speakers 1617 and other acoustic transducers) and lighting devices (e.g., light-emitting diodes (LEDs), screen displays, etc.). The one or more haptic generators 1621 can be used to generate different audible sounds and/or visible light that are provided to a user as haptic responses. The above list of haptic generators is not exhaustive; any device capable of producing such effects can be used to generate one or more haptic responses that are delivered to the user.
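Purely as an illustration of the dispatching idea described above (and not part of the original disclosure), the following sketch routes a requested haptic response to a handler for one of the generator types. The response names, handler signatures, and printed messages are assumptions for this example.

```python
from typing import Callable


def vibrate(intensity: float) -> None:
    print(f"LRA/ERM vibration at {intensity:.0%} intensity")


def heat(delta_c: float) -> None:
    print(f"thermoelectric surface change of {delta_c:+.1f} C")


def chirp(intensity: float) -> None:
    print(f"speaker cue at {intensity:.0%} volume")


# Hypothetical mapping from response kind to the generator that produces it.
HAPTIC_HANDLERS: dict[str, Callable[[float], None]] = {
    "vibration": vibrate,
    "temperature": heat,
    "audio": chirp,
}


def play_haptic(kind: str, magnitude: float) -> None:
    handler = HAPTIC_HANDLERS.get(kind)
    if handler is None:
        raise ValueError(f"no generator available for {kind!r} response")
    handler(magnitude)


play_haptic("vibration", 0.6)
```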
In some embodiments, the one or more applications 1635 include social media applications, banking applications, health applications, messaging applications, web browsers, gaming applications, streaming media applications, imaging applications, productivity applications, social applications, and the like. In some embodiments, one or more applications 1635 include an artificial reality application. One or more application programs 1635 are configured to provide data to head wearable device 1611 to perform one or more operations. In some implementations, one or more applications 1635 can be displayed via a display 1630 of the head wearable device 1611 (e.g., via the HMD 1614).
In some implementations, instructions for performing the one or more operations are controlled via an Artificial Reality (AR) processing module 1645. The AR processing module 1645 can be implemented in one or more devices, such as one or more of the server 1670, the electronic device 1674, the head wearable device 1611, and/or the wrist wearable device 1670. In some embodiments, the one or more devices perform the operations of the AR processing module 1645 using one or more corresponding processors, alone or in combination with at least one other device described herein. In some implementations, the AR processing module 1645 is configured to process signals based at least on sensor data. In some implementations, the AR processing module 1645 is configured to process signals based on received image data capturing at least a portion of a user's hand, mouth, facial expression, surrounding environment, and the like. For example, the housing 1606 can receive EMG data and/or IMU data from the one or more sensors 1625 and provide the sensor data to the AR processing module 1645 to perform a particular operation (e.g., gesture recognition, facial recognition, etc.). The AR processing module 1645 then causes a device communicatively coupled to the housing 1606 to perform the operation (or action). In some implementations, the AR processing module 1645 performs different operations based on the sensor data and/or performs one or more actions based on the sensor data.
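To make the flow concrete, here is a heavily simplified sketch (an illustration only, not the disclosed implementation) of the kind of processing attributed to the AR processing module 1645: a window of EMG samples is compared against a predetermined per-gesture threshold, and a matching gesture yields an operation for a coupled device. The threshold value, the mean-absolute-amplitude feature, and the action name are assumptions.

```python
import statistics

PINCH_THRESHOLD = 0.35   # hypothetical per-gesture threshold (cf. AR processing data 1664)


def detect_pinch(emg_window: list[float]) -> bool:
    """Crude gesture check: mean absolute EMG amplitude above a threshold."""
    energy = statistics.fmean(abs(v) for v in emg_window)
    return energy > PINCH_THRESHOLD


def process_window(emg_window: list[float]) -> str | None:
    if detect_pinch(emg_window):
        # In the description, the module would cause a communicatively coupled
        # device (e.g., the HMD 1614) to perform the resulting operation.
        return "select_ui_element"
    return None


print(process_window([0.5, -0.6, 0.4, -0.5]))   # -> "select_ui_element"
print(process_window([0.05, -0.02, 0.03]))      # -> None
```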
In some embodiments, the one or more imaging devices 1655 can include an ultra-wide camera, a telephoto camera, a depth-sensing camera, or another type of camera. In some embodiments, the one or more imaging devices 1655 are used to capture image data and/or video data. The imaging devices 1655 can be coupled to a portion of the housing 1606. The captured image data can be processed and stored in memory and then presented to a user for viewing. The one or more imaging devices 1655 can include one or more modes for capturing image data or video data. For example, these modes can include a high dynamic range (HDR) image capture mode, a low-light image capture mode, a burst image capture mode, and other modes. In some implementations, a particular mode is automatically selected based on the environment (e.g., lighting, movement of the device, etc.). For example, a wrist wearable device with an HDR image capture mode and a low-light image capture mode can automatically select the appropriate mode based on the environment (e.g., dark lighting can result in the use of the low-light image capture mode instead of the HDR image capture mode). In some embodiments, the user is able to select a mode. Image data and/or video data captured by the one or more imaging devices 1655 is stored in the memory 1660 (the memory 1660 can include volatile and nonvolatile memory such that the image data and/or video data can be stored temporarily or permanently as needed).
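The automatic mode selection described above can be illustrated with the following minimal sketch (an assumption-laden example, not the disclosed logic): dark scenes favor the low-light mode over HDR, and fast device motion favors a burst capture. The lux and angular-rate cutoffs are invented for the example.

```python
LOW_LIGHT_LUX = 10.0      # hypothetical ambient-light cutoff
FAST_MOTION_DPS = 90.0    # hypothetical angular-rate cutoff (deg/s)


def select_capture_mode(ambient_lux: float, angular_rate_dps: float) -> str:
    """Pick a capture mode from ambient lighting and device motion."""
    if ambient_lux < LOW_LIGHT_LUX:
        return "low_light"
    if angular_rate_dps > FAST_MOTION_DPS:
        return "burst"
    return "hdr"


print(select_capture_mode(ambient_lux=3.0, angular_rate_dps=5.0))     # -> low_light
print(select_capture_mode(ambient_lux=500.0, angular_rate_dps=10.0))  # -> hdr
```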
Circuitry 1646 is configured to facilitate interaction between the housing 1606 and the HMD 1614. In some implementations, the circuitry 1646 is configured to regulate power distribution between the power source 1607 and the HMD 1614. In some implementations, the circuitry 1646 is configured to transfer audio and/or video data between the HMD 1614 and one or more components of the housing 1606.
The one or more processors 1650 can be implemented as any type of computing device, such as an integrated system-on-a-chip, a microcontroller, a field-programmable gate array (FPGA), a microprocessor, and/or another application-specific integrated circuit (ASIC). The processors 1650 may operate in conjunction with the memory 1660. The memory 1660 may be or include random access memory (RAM), read-only memory (ROM), dynamic random access memory (DRAM), static random access memory (SRAM), and magnetoresistive random access memory (MRAM), and may include firmware such as static data or fixed instructions, a basic input/output system (BIOS), system functions, configuration data, and other routines used during the operation of the housing 1606 and the processors 1650. The memory 1660 also provides storage for data and instructions associated with applications and data processed by the processors 1650.
In some embodiments, the memory 1660 stores at least user data 1661 including sensor data 1662 and AR processing data 1664. The sensor data 1662 includes sensor data monitored by the one or more sensors 1625 of the housing 1606 and/or received from one or more devices communicatively coupled to the housing 1606 (such as the HMD 1614, the smart phone 1674b, the controller 1674c, etc.). The sensor data 1662 used by the AR processing module 1645 can include sensor data collected over a predetermined period of time. The AR processing data 1664 can include one or more predefined camera-control gestures, user-defined camera-control gestures, predefined non-camera-control gestures, and/or user-defined non-camera-control gestures. In some implementations, the AR processing data 1664 further includes one or more predetermined thresholds for different gestures.
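As one possible (purely illustrative) in-memory layout for the user data 1661 just described, the sketch below keeps the sensor data 1662 and the AR processing data 1664 side by side. Field names beyond those mentioned in the description, such as the threshold keys, are assumptions for the example.

```python
from dataclasses import dataclass, field


@dataclass
class ARProcessingData:          # corresponds to AR processing data 1664
    predefined_camera_gestures: list[str] = field(default_factory=list)
    user_defined_camera_gestures: list[str] = field(default_factory=list)
    predefined_other_gestures: list[str] = field(default_factory=list)
    user_defined_other_gestures: list[str] = field(default_factory=list)
    gesture_thresholds: dict[str, float] = field(default_factory=dict)


@dataclass
class UserData:                  # corresponds to user data 1661
    sensor_data: list[dict] = field(default_factory=list)                 # 1662
    ar_processing: ARProcessingData = field(default_factory=ARProcessingData)  # 1664


memory_1660 = UserData()
memory_1660.ar_processing.gesture_thresholds["pinch"] = 0.35   # hypothetical threshold
memory_1660.sensor_data.append({"sensor": "EMG", "value": 0.42})
print(memory_1660.ar_processing.gesture_thresholds)
```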
HMD 1614 includes a communication interface 1615, a display 1630, an AR processing module 1645, one or more processors, and memory. In some implementations, the HMD 1614 includes one or more sensors 1625, one or more haptic generators 1621, one or more imaging devices 1655 (e.g., cameras), microphones 1613, speakers 1617, and/or one or more applications 1635. The HMD 1614 operates in conjunction with the housing 1606 to perform one or more operations of the head wearable device 1611, such as capturing camera data, presenting a representation of image data on a coupled display, operating one or more applications 1635, and/or allowing a user to participate in an AR environment.
Any data collection performed by the devices described herein, and/or by any device configured to perform or cause performance of the different embodiments described above with reference to any of the figures (hereinafter the "device"), is done with the consent of the user and in a manner consistent with all applicable privacy laws. The user is provided with the option to allow the device to collect data, as well as the option to limit or decline such data collection. The user can opt in to or opt out of any data collection at any time. In addition, the user can request that any collected data be deleted.
It will be understood that, although the terms "first" and "second" are used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another element.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the claims. As used in the description of the embodiments and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term "and/or" as used herein refers to and includes any and all possible combinations of one or more of the associated list items. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term "if" can be interpreted to mean "when," "in response to determining," "in accordance with a determination," or "in response to detecting" that a stated precondition is true, depending on the context. Similarly, the phrase "if it is determined that the stated precondition is true," "if the stated precondition is true," or "when the stated precondition is true" can be interpreted to mean "upon determining," "in response to determining," "in accordance with a determination," "upon detecting," or "in response to detecting" that the stated precondition is true, depending on the context.
The foregoing description, for purposes of explanation, has been described with reference to specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the claims to the precise forms disclosed. Many modifications and variations are possible in light of the above teachings. The embodiments were chosen and described in order to best explain the principles of operation and their practical applications to others skilled in the art.
Claims (17)
1. A wearable device, the wearable device comprising:
an electrically conductive deformable fabric, the electrically conductive deformable fabric comprising:
a conductive trace having a non-malleable, fixed length along a first axis;
the conductive trace being stitched into a fabric structure to produce a conductive deformable material, wherein:
the fabric structure includes a stitch pattern that facilitates unfolding and folding of the conductive trace in an oscillating manner to allow the conductive trace to expand and contract, respectively, along the first axis without exceeding the fixed length of the conductive trace, and
the conductive deformable material is positioned within the wearable device such that, when the wearable device is worn, the stitch pattern is located over a joint of a user to allow the stitch pattern to stretch or contract with movement of the joint.
2. The wearable device of claim 1, wherein the stitch pattern further facilitates stretching and contraction of the conductive trace along a second axis, perpendicular to the first axis, without exceeding the fixed length of the conductive trace.
3. The wearable device of claim 1, wherein the conductive trace provides a signal that can be used to determine an amount of strain at the fabric structure.
4. The wearable device of claim 3, wherein the amount of strain on the fabric structure is used to determine articulation for interacting with an artificial reality environment.
5. The wearable device of claim 1, wherein the stitch pattern of the fabric structure allows the fabric structure to collapse via alternating folds, wherein the conductive trace collapses with the fabric structure.
6. The wearable device of claim 1, wherein the fabric structure comprises an elasticity that allows the conductive deformable fabric to return to a default state.
7. The wearable device of claim 1, wherein the conductive trace is linear along the first axis over the non-malleable fixed length.
8. The wearable device of claim 1, wherein the stitch pattern of the fabric structure is a plain stitch pattern.
9. The wearable device of claim 1, wherein the conductive trace is embroidered onto the fabric structure.
10. The wearable device of claim 1, wherein a portion of the conductive trace is configured to attach to a neuromuscular signal sensor.
11. The wearable device of claim 1, wherein the conductive trace is an insulated copper magnet wire.
12. The wearable device of claim 1, wherein the wearable device is machine washable.
13. The wearable device of claim 1, wherein the conductive deformable fabric is configured to shrink to a size 300% less than the fixed length of the conductive trace.
14. The wearable device of claim 1, wherein a first portion of the conductive trace is configured to contact a second portion of the conductive trace without creating an electrical short.
15. The wearable device of claim 1, wherein the conductive deformable fabric is configured to unfold and fold in an oscillating manner for 8,000 to 20,000 cycles without performance degradation.
16. The wearable device of claim 1, wherein a resistivity of the conductive trace increases along the fixed length of the conductive trace as a function of a width of the conductive trace.
17. The wearable device of claim 1, wherein the unfolding and folding in an oscillating manner follows a folding technique based on a paper folding process.
Applications Claiming Priority (9)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US63/314,199 | 2022-02-25 | ||
US63/485,880 | 2023-02-17 | ||
US63/485,875 | 2023-02-17 | ||
US63/485,878 | 2023-02-17 | ||
US63/485,882 | 2023-02-17 | ||
US18/174,593 US20230376112A1 (en) | 2022-02-25 | 2023-02-24 | Knitted textile structures formed by altering knit patterns to accommodate external mediums, and manufacturing processes associated therewith |
US18/174,592 | 2023-02-24 | ||
US18/174,593 | 2023-02-24 | ||
PCT/US2023/013905 WO2023164189A1 (en) | 2022-02-25 | 2023-02-26 | Techniques for incorporating stretchable conductive textile traces and textile-based sensors into knit structures |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118591344A (en) | 2024-09-03 |
Family
ID=92527198
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202380018664.8A (CN118591344A, Pending) | Method for integrating stretchable conductive textile traces and textile-type sensors into a woven structure | |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118591344A (en) |
- 2023-02-26: CN application CN202380018664.8A — CN118591344A (en), status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10082872B2 (en) | Deformable haptic wearables with variable physical properties | |
CN109690455A (en) | Finger-worn type device with sensor and haptics member | |
US20210081048A1 (en) | Artificial reality devices, including haptic devices and coupling sensors | |
US11983320B2 (en) | Techniques for incorporating stretchable conductive textile traces and textile-based sensors into knit structures | |
US11262797B1 (en) | Computer systems with wearable force sensing devices | |
Yu et al. | uKnit: A Position-Aware Reconfigurable Machine-Knitted Wearable for Gestural Interaction and Passive Sensing using Electrical Impedance Tomography | |
US20230376112A1 (en) | Knitted textile structures formed by altering knit patterns to accommodate external mediums, and manufacturing processes associated therewith | |
CN118591344A (en) | Method for integrating stretchable conductive textile traces and textile-type sensors into a woven structure | |
TW202400084A (en) | Techniques for incorporating stretchable conductive textile traces and textile-based sensors into knit structures | |
US20210378571A1 (en) | Composite bioelectrodes | |
US20240360601A1 (en) | Strain-locking knit band structures with embedded electronics for wearable devices | |
US20240077946A1 (en) | Systems and methods of generating high-density multi-modal haptic responses using an array of electrohydraulic-controlled haptic tactors, and methods of manufacturing electrohydraulic-controlled haptic tactors for use therewith | |
US11488361B1 (en) | Systems and methods for calibrating wearables based on impedance levels of users' skin surfaces | |
US20240338081A1 (en) | Wearable Device For Adjusting Haptic Responses Based On A Fit Characteristic | |
US20240148331A1 (en) | Systems for detecting fit of a wearable device on a user by measuring the current draw to amplify a biopotential signal sensor and method of use thereof | |
CN118866429A (en) | Strain-locking needle belt structure with embedded electronic device for wearable equipment | |
US20240192765A1 (en) | Activation force detected via neuromuscular-signal sensors of a wearable device, and systems and methods of use thereof | |
US20240329738A1 (en) | Techniques for determining that impedance changes detected at sensor-skin interfaces by biopotential-signal sensors correspond to user commands, and systems and methods using those techniques | |
US20240225520A1 (en) | Techniques for utilizing a multiplexed stage-two amplifier to improve power consumption of analog front-end circuits used to process biopotential signals, and wearable devices, systems, and methods of use thereof | |
EP4454558A1 (en) | Manufacturing processes for biopotential-based wrist-wearable devices and resulting manufactured biopotential -based wrist-wearable devices | |
EP4410190A1 (en) | Techniques for using inward-facing eye-tracking cameras of a head-worn device to measure heart rate, and systems and methods using those techniques | |
US20240248553A1 (en) | Coprocessor for biopotential signal pipeline, and systems and methods of use thereof | |
US20240118749A1 (en) | Systems for calibrating neuromuscular signals sensed by a plurality of neuromuscular-signal sensors, and methods of use thereof | |
US11714495B2 (en) | Finger devices with adjustable housing structures | |
US20240329749A1 (en) | Easy-to-remember interaction model using in-air hand gestures to control artificial-reality headsets, and methods of use thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||