WO2021141936A1 - Haptic waveform generation and rendering at interface device - Google Patents
- Publication number
- WO2021141936A1 (application PCT/US2021/012240)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- haptic
- tactile
- waveform
- rendering
- input characteristic
- Prior art date
- 2020-01-06
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
- G06F3/03545—Pens or stylus
Definitions
- the haptic core includes architecture that can be implemented in software or in a chip (or chipset) that allows for streamlined, low-latency haptic feedback.
- the haptic core is configured to:
- A) Obtain input directly from device hardware, such as via digitizer electronics in a stylus or hand-held device, and/or via the digitizer controller in the case of a touch screen device, rather than through operating system (OS) or application programming interface (API) layers.
- B) Add sensors and fuse their input with the input from the primary input device (e.g., a touch screen or other hand-held or wearable device).
- most conventional stylus-based interaction information is gathered from the digitizer, processed by the OS, and then provided to the haptic rendering component.
- the haptic rendering component produces a time-varying waveform that contains the haptic signal which is then streamed to the stylus.
- This streaming approach introduces latency. Aspects of the haptic core described herein reduce this latency by enabling time-varying waveform generation on the device (e.g., stylus), and motion detection via sensors onboard the device.
- the haptic core has one or more of the following features:
- the haptic core can include a system-in-package IC and various required and optional external components.
- V) Output to drive an actuator, e.g., an on-board piezoelectric actuator.
- On-board power management (e.g., fast charging).
- Extended runtime (e.g., 30 minutes to 2+ hours of runtime).
- the haptic core has various benefits relative to conventional haptic rendering components, for example: near-zero latency (e.g., instantaneous feedback as perceived by the human user); the ability to add haptic effects to any of a number of surfaces (e.g., touchscreens as well as non-touch screens); streamlined adjustment of controller modes and/or material (rendering) modes via actuator on device (e.g., on wearable device or hand-held device); and ability to connect directly to a haptic driver chip without going through OS API for a generated haptic signal, thereby reducing latency.
- FIG. 1 illustrates components in the haptic core according to various aspects.
- the haptic core captures human tactile interaction with a plurality of objects (e.g., a library of objects) and generates a time-varying waveform that matches the human user’s interaction with the object at a rendering device.
- the haptic core can include the following components:
- a first amplifier 20 for amplifying a first capacitive force sensor input (e.g., from a touch enabling device such as a touch screen, stylus, medical instrument, or other hand-held or wearable device).
- a second amplifier 30 for amplifying a second capacitive force sensor input (e.g., from another portion of the touch enabling device such as a touch screen, stylus, medical instrument, or other hand-held or wearable device).
- An inertial measurement unit (IMU) 40 configured to detect changes in position/orientation (via measured acceleration and angular rate) of the device on which the haptic core is positioned.
- the measured acceleration can also be used in calculations that support the scanning functionality.
- a power management component (PMC) 50 coupled with a battery gauge for managing battery usage, e.g., from an external power source such as a battery on-board the device where the haptic core is located.
- Communications (Comm.) devices such as an antenna 60 and transceiver 70 (e.g., BLE antenna/transceiver components) for wirelessly communicating with the host device.
- the host device can include a tablet, smartphone, wearable smart device or any other computing device on which a texture is to be simulated.
- the host device sends velocity information as well as the identifier (ID) of the texture to the device that performs the haptic rendering (e.g., the device that the actuator is hovering over or contacting).
- FIG. 2 shows a data flow diagram illustrating processes performed by the microcontroller, which can include the following: a) determine whether contact occurs based on the capacitive force input, and trigger a waveform output; b) compute an input speed for the core algorithm to generate the tactile waveform, using the detected speed from the host device as well as the acceleration sensed by the on-board inertial measurement unit (IMU); c) load a texture (data) model corresponding to the ID that is received from a device (e.g., wirelessly from the host device); and d) if required or beneficial, input detected force and computed speed into the loaded texture (data) model to generate corresponding outputs to a haptic driver such as a piezo driver (e.g., Boreas driver) that actuates a haptic response via a piezo actuator (FIG. 1).
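As a rough illustration of the FIG. 2 data flow, the sketch below mimics steps (a) through (d); the class name, the contact threshold, and the texture-model format are assumptions made for this example only, and the piezo driver is stood in by a simple callback rather than any real driver interface.

```python
# Illustrative sketch of the FIG. 2 microcontroller data flow; names, the contact
# threshold, and the model format are assumptions for this example only.
from dataclasses import dataclass, field


@dataclass
class HapticMicrocontroller:
    contact_threshold: float = 0.05                      # capacitive force treated as "contact" (assumed value)
    texture_models: dict = field(default_factory=dict)   # texture ID -> {"fn": callable producing a waveform sample}

    def on_sample(self, force, host_speed, imu_acceleration, dt, texture_id, driver):
        # a) determine whether contact occurs based on the capacitive force input
        if force < self.contact_threshold:
            return None                                   # no contact: nothing to render

        # b) compute the input speed by fusing the host-reported speed with IMU acceleration
        speed = host_speed + imu_acceleration * dt

        # c) load the texture (data) model corresponding to the received ID
        model = self.texture_models[texture_id]

        # d) feed force and computed speed into the model, then push the output to the haptic driver
        sample = model["fn"](force, speed)
        driver(sample)                                    # e.g., a piezo-driver write, stubbed as a callback here
        return sample
```

For instance, `HapticMicrocontroller(texture_models={7: {"fn": lambda f, s: f * s}}).on_sample(0.2, 0.01, 0.0, 0.001, 7, print)` would emit a single (trivial) waveform sample.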
- the haptic core 10 is sized to integrate in any number of small wearable and/or handheld devices (e.g., stylus and/or touch-screen devices), and can be approximately 2mm x 2mm in size in particular cases.
- the haptic core 10 can be particularly beneficial in stylus devices, medical instruments, ultrasound probes, dental instruments, etc.
- the haptic core 10 can be integrated in any haptic interface device, that is, any device capable of rendering haptic feedback.
- the haptic core 10 enables autonomous computing capability, that is, the haptic core does not rely on an external microcontroller or processor to generate haptic feedback instructions (outputs). This can reduce latency relative to conventional systems, e.g., achieving latency of approximately one (1) millisecond (ms) as compared with approximately 100ms of latency in conventional systems.
- the haptic core 10 enables a method of rendering haptic feedback at a device having a tactile interface and a haptic actuator.
- the microcontroller in the haptic core is configured to perform a method as illustrated in the process diagram in FIG. 3. In certain cases, this method can include the following processes:
- Process P1: Detecting an input characteristic of user contact with the tactile interface.
- process P1 is an optional pre-process, as indicated in phantom.
- the microcontroller 80 can obtain an input characteristic from the device with the haptic actuator.
- the microcontroller 80 is on board (i.e., physically) the device.
- the device comprises at least one of: a stylus, a tablet, a smart phone, a smart device, a wearable smart device (e.g., smart watch, wearable audio device, or wearable biometric monitor), a vehicle steering wheel, a utensil, a medical device, an exploration probe, or a hand controller.
- the tactile interface comprises at least one of: a touchpad, a trackpad, a touch screen, a smart surface, or a specific haptic-feedback interface element (e.g., a handheld device with a specific location for resting one or more fingers).
- the input characteristic includes at least one of: a) a detected speed of movement of a body part of the user across the tactile interface, b) a detected force applied by the user at the tactile interface, or c) a detected change in a contact state of the user’s body part relative to the tactile interface.
- the input characteristic can include detected speed of movement (e.g., speed of a finger moving across a touch screen, speed of a user’s hand moving across a steering wheel, or the speed of a user’s finger moving along a stylus or utensil), a detected force applied by the user (e.g., normal or pressing force on a touch screen, squeezing or compressive force on a steering wheel, stylus or utensil), or a detected change in the contact state of the user’s body part (e.g., in contact with the interface, not contacting the interface, or transitioning between contact and non-contact).
- the input characteristic comprises a single input characteristic, e.g., speed, force, or contact v. non-contact.
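For concreteness, one way to represent these input characteristics in software is sketched below; the field names, units, and the three-state contact enumeration are assumptions, not terms from the disclosure.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ContactState(Enum):
    NOT_IN_CONTACT = 0
    IN_CONTACT = 1
    TRANSITIONING = 2        # entering or leaving contact


@dataclass
class InputCharacteristic:
    speed: Optional[float] = None          # body-part speed across the interface (assumed m/s)
    force: Optional[float] = None          # normal or squeezing/compressive force (assumed N)
    contact: Optional[ContactState] = None

    def is_single(self) -> bool:
        """True when only one of the three characteristics is populated (the single-characteristic case)."""
        return sum(v is not None for v in (self.speed, self.force, self.contact)) == 1
```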
- Process P2: Processing the input characteristic with a data model having an input characteristic to waveform output correspondence.
- the input characteristic is used as an input to the data model to produce a waveform for rendering tactile feedback.
- the model is stored locally at memory in the device.
- the model is configured to be downloaded to the memory from an external source.
- the external source can include a cloud-based storage system and/or an application store.
- the model comprises at least one of: a relational database, a neural network, or linked lists of parameters (e.g., a parameter file or files including coefficients and corresponding specified parameters).
- Process P3: Generating a waveform based on the input characteristic using the data model.
- the waveform is dynamically generated in response to the input characteristic.
- the waveform is not predefined, and is instead generated based on the input characteristic, which can vary across a wide range of values. For example, input values for speed and force can span the complete range of possible input values that are detectable at the interface.
- the data model is adaptable for distinct types of input device and/or interface, e.g., to widen or narrow the range of possible input values based on the type of input device and/or interface.
- the waveform is randomly generated within a range of frequencies based on the input characteristic, e.g., random variation within bounding frequencies.
- the data model calculates the waveform based on a stored set of vibrational responses and extrapolations on the set of vibrational responses, where the extrapolations are best-fit adjustments to the vibrational responses based on the input characteristics of the user contact. For example, where the detected input characteristics of the contact (e.g., speed, force, contact/non-contact) deviate from a threshold for a set of vibrational responses, the data model calculates one or more extrapolated waveforms based on the amount of the deviation.
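A minimal sketch of such a lookup-plus-extrapolation follows; the stored-response format, the deviation threshold, and the amplitude-scaling rule are assumptions standing in for whatever best-fit adjustment the actual data model performs.

```python
import numpy as np

# Assumed format: each stored vibrational response is keyed by the (speed, force) at which it was recorded.
_t = np.linspace(0.0, 0.1, 100)
stored_responses = {
    (0.05, 0.5): np.sin(2 * np.pi * 150 * _t) * np.exp(-40 * _t),
    (0.20, 1.0): np.sin(2 * np.pi * 250 * _t) * np.exp(-60 * _t),
}


def waveform_for(speed, force, deviation_threshold=0.25):
    """Return the closest stored response, extrapolated when the detected input
    deviates from the stored condition by more than the threshold."""
    key = min(stored_responses, key=lambda k: np.hypot(k[0] - speed, k[1] - force))
    base = stored_responses[key]
    deviation = np.hypot(key[0] - speed, key[1] - force)
    if deviation <= deviation_threshold:
        return base
    # Extrapolation: scale amplitude with the amount of deviation (illustrative best-fit stand-in).
    return base * (1.0 + deviation)
```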
- Process P4: Rendering the waveform at the haptic actuator to provide tactile feedback to the user.
- the haptic actuator is one of a plurality of haptic actuators at the device.
- the waveform can be rendered using one, two, three or more haptic actuators at the device.
- the tactile interface includes a display, and the data model is specific to a tactile object shown on the display during detection of the input characteristic.
- the object shown on the display can include one of a plurality of objects in a displayed image or video.
- the microcontroller 80 is configured to generate distinct waveforms for providing a tactile response to user contact with one or more of the objects in the displayed image or video.
- the tactile feedback is provided as a series of vibrations via the haptic actuator, e.g., to provide feedback in response to contact with a given object over time, and/or to provide feedback in response to contact with multiple objects in a displayed image or video.
- the tactile waveforms are generated synthetically.
- the haptic core 10 is configured to generate a waveform specific to the curvature of the interface surface, e.g., by taking a perpendicular cross-section of the shape of the interface, and combining sinusoid waveforms iteratively to plot the shape of the interface at the cross-section (e.g., to find a best-fit set of waveforms for the surface).
- the haptic core 10 can then generate a vibration pattern to match the synthetic waveform.
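One concrete way to picture this iterative combination of sinusoids is a greedy, Fourier-style fit to the sampled cross-section; the procedure below is an illustrative assumption, not the method disclosed.

```python
import numpy as np


def fit_sinusoids(profile, n_terms=8):
    """Greedily add sinusoids to approximate a cross-section height profile.

    profile: 1-D samples of surface height along a perpendicular cross-section.
    Returns (fit, components), where components holds (frequency index, amplitude, phase) triples.
    """
    profile = np.asarray(profile, dtype=float)
    n = len(profile)
    fit = np.full(n, profile.mean())
    residual = profile - fit
    components = []
    t = np.arange(n)
    for _ in range(n_terms):
        spectrum = np.fft.rfft(residual)
        k = int(np.argmax(np.abs(spectrum[1:])) + 1)          # dominant remaining frequency bin
        amp = 2.0 * np.abs(spectrum[k]) / n
        phase = np.angle(spectrum[k])
        term = amp * np.cos(2 * np.pi * k * t / n + phase)    # next best-fit sinusoid
        fit += term
        residual -= term
        components.append((k, amp, phase))
    return fit, components
```

The resulting set of sinusoid components could then drive a vibration pattern matching the synthetic waveform, as described above.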
- FIG. 4 is a flow diagram illustrating processes in an additional method according to implementations.
- the process can include actuating haptic feedback at a device in response to detecting a haptic event from a user, e.g., a touch interface event or contact of a device by a user.
- the method includes:
- P101: Generating a waveform for triggering a haptic response at the device; and P102: Rendering the waveform as a set of vibrations at the device.
- the waveform comprises a series of successive vibrations that are rendered as vibrations once the haptic event is detected, and wherein each vibration is delivered as a decaying sinusoid with a frequency, amplitude and decay rate that vary based on a material of the device, wherein each vibration lasts approximately 0.10 seconds or less.
- rendering the waveform comprises rendering a series of waveforms at an actuator on the device over time to match an exploration profile of a series of haptic events, where the exploration profile varies over time and the series of waveforms comprises dynamically generated time-varying waveforms that are calculated at a rate of approximately 1 kilo-hertz (kHz), and where the series of waveforms are double integrated to create an acceleration graph representing the haptic response.
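As an illustration of both points, the sketch below builds a single decaying-sinusoid vibration (with per-material parameters that are assumed values, not taken from the disclosure) and then double-integrates a rendered series at an assumed 1 kHz sample rate, as the text describes.

```python
import numpy as np

# Assumed per-material parameters: (frequency in Hz, amplitude, decay rate in 1/s).
MATERIALS = {
    "wood":    (180.0, 1.0, 60.0),
    "ceramic": (320.0, 0.8, 90.0),
}


def vibration(material, duration=0.1, rate=1000):
    """One vibration: a decaying sinusoid lasting approximately 0.10 s or less."""
    freq, amp, decay = MATERIALS[material]
    t = np.arange(0.0, duration, 1.0 / rate)
    return amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)


def double_integrate(series, rate=1000):
    """Double integration of a rendered waveform series (per the text, to build the
    graph representing the haptic response), using cumulative trapezoids."""
    dt = 1.0 / rate
    first = np.concatenate(([0.0], np.cumsum((series[1:] + series[:-1]) * 0.5 * dt)))
    return np.concatenate(([0.0], np.cumsum((first[1:] + first[:-1]) * 0.5 * dt)))


# A short exploration profile: successive vibrations chained at the 1 kHz rate.
profile = np.concatenate([vibration("wood"), vibration("ceramic", duration=0.05)])
graph = double_integrate(profile)
```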
- the Index is also referred to as the Tactai Index™ (a trademark belonging to Tactai, Inc., headquartered in Waltham, MA).
- the index is a hand-held instrument that enables the capture of real-world interaction between a human user and an object surface or texture for subsequent recreation on a screen or other tactile surface.
- the index is referred to as a scanning device, or a hand-held scanning device.
- the scanning device includes: i) A tip for contacting an object surface with a texture to be detected (or, "acquired").
- the tip transmits topological characteristics of the surface to sensors located on the tool as the user moves/taps the tool against the object.
- the tip registers accelerations that are detected by on-board sensing elements; ii) A set of sensors for allowing the tool to collect interaction parameters as well as the accelerations that result from surface exploration by the tool.
- Interaction parameters include, e.g., the velocity of the tool tip relative to the surface being explored, as well as the contact pressure between the tip and the surface; iii) A microprocessor for analyzing acquired data, e.g., for completeness. The microprocessor can then package the data and send to a connected host for storage, transmission, subsequent implementation, etc.; and iv) An interface (e.g., wired and/or wireless) to the host that permits transmission of raw data acquired by the tip.
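A toy sketch of how such a tool might analyze and package acquired samples for the connected host follows; the field names, the completeness check, and the JSON packaging are assumptions for illustration only.

```python
import json
import time


def package_scan(samples, min_samples=500):
    """Analyze acquired data for completeness, then package it for transmission to the host.

    samples: list of dicts with assumed keys 'acceleration', 'tip_velocity', and
    'contact_pressure', collected while the tool explores the object surface.
    """
    required = {"acceleration", "tip_velocity", "contact_pressure"}
    complete = len(samples) >= min_samples and all(required <= s.keys() for s in samples)
    packet = {
        "timestamp": time.time(),
        "complete": complete,      # e.g., whether the scan sufficiently characterizes the surface
        "samples": samples,        # raw data forwarded over the wired/wireless interface
    }
    return json.dumps(packet)
```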
- the scanning device 500 includes a handheld device such as a stylus, depicted schematically in FIG. 5.
- Electronics 510 including sensors, microprocessor(s), interface(s), and other components in the haptic core are illustrated in only general terms in FIG. 5. It is understood that electronics 510 can also include one or more actuators (e.g., piezoelectric actuators/drivers), power sources (e.g., battery), power controller(s), communication devices (e.g., via WiFi, BLE or other communication protocols), and switches/buttons not shown in FIG. 5.
- a software development kit (SDK) can include one or more of: a) a device driver configured to manage communication with the index; and b) an application programming interface (API) that exposes functions for a user application to manage the data acquisition process (described herein) and communicate with an external modeler, such as a cloud-based modeler running on a server (e.g., a cloud-based server).
- the scanning device (index) 500 can function as an input/output device for the haptic core, and can map contacts with a surface type for use by the modeler in order to generate a model of a surface.
- the index 500 can also function to render the model at the device as a user interacts with a surface.
- the models are selectable for different devices (indices) such as a host device (e.g., tablet, smartphone, stylus, medical instrument, etc.), and can vary by frequency and resolution for each device.
- the models can be stored on-board a device in the haptic core 10.
- the scanning device (or, “index”) 500 enables a method as illustrated in the flow diagram in FIG. 6.
- the method illustrated in FIG. 6 can be performed prior to process P1 illustrated in FIG. 3.
- the scanning device method illustrated in FIG. 6 can be performed independently of the processes illustrated in FIG. 3.
- the method further includes, e.g., prior to detecting the input characteristic of the user contact:
- P201: Providing instructions for scanning an object with a scanning device.
- the instructions are provided via any conventional output device at the scanning device (e.g., a visual interface) or via a connected output device such as an audio output device or another visual display device (e.g., at a connected smart device).
- D202: Is the scan data sufficient to characterize the object?
- process P203 includes confirming successful scanning of the object with the scanning device. In certain cases, confirming the successful scan is performed using any interface described herein.
- the method further includes Process P204: updating the data model with the scan data to enable tactile feedback associated with the scanned object.
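In rough outline, the P201 -> D202 -> P203 -> P204 flow of FIG. 6 might look like the following; `scanner` and `data_model` are assumed interfaces introduced only for this sketch.

```python
def run_scan_workflow(scanner, data_model, max_attempts=3):
    """Illustrative P201 -> D202 -> P203 -> P204 loop over assumed scanner/model interfaces."""
    scanner.show_instructions("Sweep the tip slowly across the object surface")  # P201
    for _ in range(max_attempts):
        scan_data = scanner.acquire()
        if scanner.is_sufficient(scan_data):   # D202: is the scan data sufficient to characterize the object?
            scanner.confirm_success()          # P203: confirm successful scanning
            data_model.update(scan_data)       # P204: enable tactile feedback for the scanned object
            return True
    return False                               # insufficient data: leave the data model unchanged
```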
- the scanning device comprises an optical scanning device.
- the scanning device is hand-held.
- the scanning device is part of the device having the tactile interface.
- the modeler and library service can be configured to generate, store and distribute object surface models.
- the modeler and/or library service is cloud-based.
- the modeler and library store data captured by the index and generate (e.g., on-demand) an object surface model that is specific to each user interaction.
- sampling rates, resolution and/or force can be varied.
- the model could have an approximately 10 kHz sampling rate, and use a complete Delaunay triangulation, with a final model file size of a few hundred kilobytes.
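One way such a model could be realized, sketched here purely as an assumption, is a linear interpolator built over a Delaunay triangulation of recorded (speed, force) samples; SciPy's LinearNDInterpolator constructs that triangulation internally.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator  # builds a Delaunay triangulation internally

# Assumed recorded samples: columns are (tip speed, contact force, response value).
samples = np.array([
    [0.05, 0.5, 0.10],
    [0.20, 0.5, 0.25],
    [0.05, 1.0, 0.15],
    [0.20, 1.0, 0.40],
])

surface_model = LinearNDInterpolator(samples[:, :2], samples[:, 2])

# Query the model at an arbitrary interaction point inside the triangulated region.
value = surface_model(0.12, 0.8)
```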
- the modeler and library can include the following:
- the API can enable a user to: i) upload data captured by the index; ii) list available object surface models; iii) purchase and/or license the use of specific object surface models; iv) create and/or manage a personal library of object surface models; and v) download object surface models (a hypothetical client for such an API is sketched after this list).
- a modeler that is used to generate the required object surface model for a particular output device on-demand.
- output models can range in size from several kilobits (kb) to several thousand kb.
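Purely as an illustration of what a client of such an API might look like (the base URL, endpoint paths, and use of the `requests` library are hypothetical, not a published interface):

```python
import requests

BASE = "https://example.invalid/api/v1"   # hypothetical library-service endpoint


def upload_capture(session: requests.Session, capture_bytes: bytes):
    # i) upload data captured by the index
    return session.post(f"{BASE}/captures", data=capture_bytes).json()


def list_models(session: requests.Session):
    # ii) list available object surface models
    return session.get(f"{BASE}/models").json()


def license_model(session: requests.Session, model_id: str):
    # iii) purchase and/or license the use of a specific object surface model
    return session.post(f"{BASE}/models/{model_id}/license").json()


def add_to_library(session: requests.Session, model_id: str):
    # iv) create and/or manage a personal library of object surface models
    return session.put(f"{BASE}/library/{model_id}").json()


def download_model(session: requests.Session, model_id: str, device: str):
    # v) download an object surface model generated on-demand for a particular output device
    return session.get(f"{BASE}/models/{model_id}/download", params={"device": device}).content
```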
- Haptic accessories or devices described herein can be directly connected to a touch interface device or, in the case of a keyboard or other peripherals, connected to a CPU. In some cases, a haptic accessory can be integrated into the touch interface device. These haptic accessories can be powered either by the touch interface device or by another power source, e.g., an on-board battery.
- the invention provides a computer program fixed in at least one computer-readable medium, which, when executed, enables a computer system to provide a touch enabled system.
- the computer-readable medium includes program code, which implements some or all of the processes and/or embodiments described herein. It is understood that the term "computer-readable medium" comprises one or more of any type of tangible medium of expression, now known or later developed, from which a copy of the program code can be perceived, reproduced, or otherwise communicated by a computing device.
- the computer-readable medium can comprise: one or more portable storage articles of manufacture; one or more memory/storage components of a computing device; paper; etc.
- the disclosure provides a method of providing a copy of program code, which implements some or all of a process described herein.
- a computer system can process a copy of program code that implements some or all of a process described herein to generate and transmit, for reception at a second, distinct location, a set of data signals that has one or more of its characteristics set and/or changed in such a manner as to encode a copy of the program code in the set of data signals.
- an embodiment of the invention provides a method of acquiring a copy of program code that implements some or all of a process described herein, which includes a computer system receiving the set of data signals described herein, and translating the set of data signals into a copy of the computer program fixed in at least one computer-readable medium.
- the set of data signals can be transmitted/received using any type of communications link.
- the invention provides a method of providing a haptic touch interface system.
- a computer system can be obtained (e.g., created, maintained, made available) and one or more components for performing a process described herein can be obtained (e.g., created, purchased, used, modified) and deployed to the computer system.
- the deployment can comprise one or more of: (1) installing program code on a computing device; (2) adding one or more computing and/or I/O devices to the computer system; (3) incorporating and/or modifying the computer system to enable it to perform a process described herein; etc.
- the technical effect of the various embodiments is to allow users to experience a haptic interaction with an interface as described herein.
- the technical effect of various embodiments includes rendering haptic feedback at a device having a tactile interface and a haptic actuator.
- the technical effect of various embodiments includes generating, storing and distributing object surface-human interaction models such as those captured by an index device. Additional technical effects include capturing human tactile interaction with objects and generating a time varying waveform that represents (or, imitates) that human interaction with the object, for example, in rendering at a device such as a hand-held device and/or touch screen device.
- aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- the functionality described herein, or portions thereof, and its various modifications can be implemented, at least in part, via a computer program product, e.g., a computer program tangibly embodied in an information carrier, such as one or more non-transitory machine-readable media, for execution by, or to control the operation of, one or more data processing apparatus, e.g., a programmable processor, a computer, multiple computers, and/or programmable logic components.
- a computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a network.
- Actions associated with implementing all or part of the functions can be performed by one or more programmable processors executing one or more computer programs to perform the functions described herein. All or part of the functions can be implemented as special purpose logic circuitry, e.g., an FPGA and/or an ASIC (application-specific integrated circuit).
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.
- the numerical values as stated for a parameter can take on negative values. In this case, the example value of a range stated as "less than 10" can assume negative values, e.g., -1, -2, -3, -10, -20, -30, etc.
- a system or device configured to perform a function can include a computer system or computing device programmed or otherwise modified to perform that specific function, e.g., via program code stored on a computer-readable medium (e.g., a storage medium).
- a device configured to interact with and/or act upon other components can be specifically shaped and/or designed to effectively interact with and/or act upon those components.
- the device is configured to interact with another component because at least a portion of its shape complements at least a portion of the shape of that other component. In some circumstances, at least a portion of the device is sized to interact with at least a portion of that other component.
- the physical relationship (e.g., complementary, size-coincident, etc.) can aid in performing a function, for example, displacement of one or more of the device or other component, engagement of one or more of the device or other component, etc.
- components described as being “coupled” to one another can be joined along one or more interfaces.
- these interfaces can include junctions between distinct components, and in other cases, these interfaces can include a solidly and/or integrally formed interconnection. That is, in some cases, components that are “coupled” to one another can be simultaneously formed to define a single continuous member.
- these coupled components can be formed as separate members and be subsequently joined through known processes (e.g., soldering, fastening, ultrasonic welding, bonding).
- electronic components described as being “coupled” can be linked via conventional hard-wired and/or wireless means such that these electronic components can communicate data with one another.
- Spatially relative terms such as “inner,” “outer,” “beneath”, “below”, “lower”, “above”, “upper” and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. Spatially relative terms may be intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as “below” or “beneath” other elements or features would then be oriented “above” the other elements or features. Thus, the example term “below” can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
Abstract
Various aspects of the disclosure relate to virtual tactile experiences. Specific aspects include a haptic core that is configured to operate as a stand-alone hardware component for generating virtual tactile responses in a device. Other specific aspects include (e.g., hand-held) devices (or, indexes) incorporating the haptic core chip. Additional aspects include approaches for dynamically rendering haptic waveforms tailored to tactile input characteristics. Further specific aspects include a modeler and library service for generating, storing and distributing object surface models for use in rendering virtual tactile experiences at a device.
Description
Haptic Waveform Generation and Rendering at Interface Device
Priority Claim
[001] This application claims priority to US Provisional Application No. 62/957,429 (filed on January 6, 2020), the entire contents of which are incorporated herein by reference.
Technical Field
[002] Various aspects of the disclosure relate to virtual tactile experiences. Specific aspects include a haptic core (e.g., chip) that is configured to operate as a stand-alone hardware component for generating virtual tactile responses in a device. In additional implementations, functions of the haptic core are implemented in a microcontroller in a device. Other specific aspects include (e.g., hand-held) devices (or “Indexes”) incorporating the haptic core chip. Further specific aspects include a modeler and library service for generating, storing and distributing object surface models for use in rendering virtual tactile experiences at a device. Virtual tactile experiences can be used to provide a human user with tactile feedback from displayed objects in an interface (e.g., a touch screen interface) or in an environment (e.g., an immersive environment such as virtual reality, mixed reality, merged reality, and/or augmented reality, collectively referred to as “virtual reality” (VR) herein).
Background
[003] The evolution of interaction paradigms from buttons, mouse-clicks and finger swipes requires content to be accurately responsive. That is, a virtual object (either in virtual reality (VR) or on a touchscreen) needs to “feel” like its natural real-world self to enable certain user experiences. For example, a wooden table or a ceramic mug should feel distinct from their surroundings and also be distinguishable from each other when touched by a virtual finger in VR or a real finger on an interface. However, conventional interface platforms and VR platforms fail to enable adaptive tactile feedback and integration.
Summary
[004] Various systems, methods and computer program products are disclosed which provide a human user with tactile feedback from virtual objects, e.g., at a touch interface or on a hand-held device.
[005] A first aspect of the disclosure includes a haptic core for capturing human tactile interaction with objects and generating a time varying waveform that represents (or, imitates) that human interaction with the object, for example, in rendering at a device such as a hand-held device and/or touch screen device.
[006] A second aspect of the disclosure includes systems and methods for a hand-held instrument to capture object surface tactile characteristics and create an object surface-human interaction model (also
called the “index” or Tactai Index™). These aspects can also include displaying the object surface-human interaction model on a display with electromechanical components.
[007] A third aspect of the disclosure includes a modeler and library service, for example, as operated in a cloud-based system, to generate, store and distribute object surface-human interaction models such as those captured by the index.
[008] A fourth aspect of the disclosure includes a computer-implemented method of rendering haptic feedback at a device having a tactile interface and a haptic actuator, the method comprising: detecting an input characteristic of user contact with the tactile interface; processing the input characteristic with a data model having an input characteristic to waveform output correspondence; generating at least one waveform based on the input characteristic using the data model, where the at least one waveform is dynamically generated in response to the input characteristic; and rendering the at least one waveform at the haptic actuator to provide tactile feedback to the user.
[009] A fifth aspect of the disclosure includes a device having a tactile interface, a haptic actuator, a processor and a memory, wherein the processor is configured to execute instructions stored in the memory to perform actions comprising: detecting an input characteristic of user contact with the tactile interface; processing the input characteristic with a data model having an input characteristic to waveform output correspondence; generating at least one waveform based on the input characteristic using the data model, where the at least one waveform is dynamically generated in response to the input characteristic; and rendering the at least one waveform at the haptic actuator to provide tactile feedback to the user.
[0010] A sixth aspect of the disclosure includes a computer-implemented method of actuating haptic feedback at a device in response to detecting a haptic event from a user, the method comprising: generating a waveform for triggering a haptic response at the device; and rendering the waveform as a set of vibrations at the device, where the waveform comprises a series of successive vibrations that are rendered as vibrations once the haptic event is detected, and where each vibration is delivered as a decaying sinusoid with a frequency, amplitude and decay rate that vary based on a material of the device, wherein each vibration lasts approximately 0.10 seconds or less.
[0011] Additional particular aspects of the disclosed implementations can include the following.
[0012] In certain cases, the data model calculates the waveform based on a stored set of vibrational responses and extrapolations on the set of vibrational responses, where the extrapolations are best-fit adjustments to the vibrational responses based on the input characteristics of the user contact.
[0013] In particular aspects, the haptic actuator is one of a plurality of haptic actuators.
[0014] In some implementations, the input characteristic comprises at least one of: a) a detected speed of movement of a body part of the user across the tactile interface, b) a detected force applied by the user at the tactile interface, or c) a detected change in a contact state of the user’s body part relative to the tactile interface.
[0015] In particular cases, the input characteristic comprises a single input characteristic.
[0016] In some aspects, the tactile interface comprises a display, and the data model is specific to a tactile object shown on the display during detection of the input characteristic.
[0017] In certain implementations, the object shown on the display is one of a plurality of objects in a displayed image or video.
[0018] In particular cases, the tactile feedback is provided as a series of vibrations via the haptic actuator. [0019] In some aspects, the model is stored locally at memory in the device.
[0020] In certain implementations, the model is configured to be downloaded to the memory from an external source.
[0021] In particular aspects, the model comprises at least one of: a relational database, a neural network, or linked lists of parameters.
[0022] In certain aspects, the device comprises at least one of: a stylus, a tablet, a smart phone, a vehicle steering wheel, a utensil, a medical device, an exploration probe or a hand controller.
[0023] In some cases, the tactile interface comprises at least one of: a touchpad, a trackpad, a touch screen, a smart surface or a specific haptic-feedback interface element.
[0024] In particular implementations, the method further includes, prior to detecting the input characteristic of the user contact: providing instructions for scanning an object with a scanning device; and confirming successful scanning of the object with the scanning device in response to receiving scan data sufficient to characterize the object.
[0025] In certain cases, the scanning device comprises an optical scanning device.
[0026] In some implementations, the scanning device is hand-held.
[0027] In particular aspects, the method further includes updating the data model with the scan data to enable tactile feedback associated with the scanned object.
[0028] In some cases, the scanning device is part of the device having the tactile interface.
[0029] In certain implementations, rendering the waveform comprises rendering a series of waveforms at an actuator on the device over time to match an exploration profile of a series of haptic events, where the exploration profile varies over time and the series of waveforms comprises dynamically generated time-varying waveforms that are calculated at a rate of approximately 1 kilo-hertz (kHz), and where the series of waveforms are double integrated to create an acceleration graph representing the haptic response.
[0030] Two or more features described in this disclosure, including those described in this summary section, may be combined to form implementations not specifically described herein.
[0031] The details of one or more implementations are set forth in the accompanying drawings and the description below. Other features, objects and benefits will be apparent from the description and drawings, and from the claims.
Brief Description of the Drawings
[0032] These and other features of this disclosure will be more readily understood from the following detailed description of the various aspects of the disclosure taken in conjunction with the accompanying drawings that depict various embodiments of the disclosure, in which:
[0033] FIG. 1 is a schematic depiction of components in a haptic core according to various embodiments of the disclosure.
[0034] FIG. 2 is a data flow diagram illustrating processes performed by a microcontroller in the haptic core of FIG. 1 according to various implementations.
[0035] FIG. 3 is a flow diagram illustrating processes in a method according to various implementations. [0036] FIG. 4 is a flow diagram illustrating processes in a method according to various additional implementations.
[0037] FIG. 5 is a schematic depiction of a scanning (or, index) device in the form of a stylus according to various implementations.
[0038] FIG. 6 is a flow diagram illustrating processes in a method according to further implementations. [0039] It is noted that the drawings of the various aspects of the invention are not necessarily to scale. The drawings are intended to depict only typical aspects of the invention, and therefore should not be considered as limiting the scope of the invention. In the drawings, like numbering represents like elements between the drawings.
Detailed Description
[0040] In the following description, reference is made to the accompanying drawings that form a part thereof, and in which is shown by way of illustration specific illustrative embodiments in which the present teachings may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present teachings, and it is to be understood that other embodiments may be used and that changes may be made without departing from the scope of the present teachings. The following description is, therefore, merely illustrative.
[0041] To better illustrate the various embodiments of the present disclosure, particular terminology which may be known or unknown to those of ordinary skill in the art is defined to further clarify the embodiments set forth herein. The term “system” can refer to a computer system, server, etc., composed wholly or partially of hardware and/or software components, one or more instances of a system embodied in software and accessible to a local or remote user, all or part of one or more systems in a cloud computing environment, one or more physical and/or virtual machines accessed via the Internet, other types of physical or virtual computing devices, and/or components thereof.
[0042] As noted herein, various aspects of the disclosure relate to touch enabled platforms for virtual objects. In particular aspects, a platform enables adaptive identification and rendering of tactile feedback
from virtual objects, e.g., via one or more virtual reality (VR) environments or via an interface such as a touch interface (e.g., a touch screen or other touch-enabled interface).
[0043] Specific aspects include a haptic core (e.g., chip) that is configured to operate as a stand-alone, or approximately autonomous hardware component for generating virtual tactile responses in a device. Further aspects include software and/or hardware components for generating tactile responses in a device, e.g., by dynamically generating and/or rendering a waveform at the device. Other specific aspects include hand-held devices (or “indexes”) incorporating the haptic core chip. Further specific aspects include a modeler and library service for generating, storing and distributing object surface models for use in rendering virtual tactile experiences at a device. Virtual tactile experiences can be used to provide a human user with tactile feedback from displayed objects in an interface (e.g., a touch screen interface) or in an environment (e.g., an immersive environment such as virtual reality, mixed reality, merged reality, and/or augmented reality, collectively referred to as “virtual reality” (VR) herein).
[0044] This application incorporates each of the following by reference in its entirety: US Patent Application No. 15/416,005, filed on January 26, 2017; US Provisional Patent Application No. 62/287,506, filed on January 27, 2016; US Patent Application No. 16/209,183, filed on December 4, 2018 (Touch Enabling Process, Haptic Accessory, and Core Haptic Engine to Enable Creation and Delivery of Tactile-Enabled Experiences with Virtual Objects), as well as the provisional application to which that application claims priority (US Provisional Application No. 62/594,787); US Patent Application No. 16/953,813, filed on November 20, 2020; US Patent No. 7,808,488; US Patent No. 8,988,445; Roland S. Johansson & J. Randall Flanagan, Coding and use of tactile signals from the fingertips in object manipulation tasks, Nature Reviews Neuroscience 10, 345-359 (May 2009); and Seungmoon Choi & Katherine J. Kuchenbecker, Vibrotactile Display: Perception, Technology, and Applications, Proceedings of the IEEE, Vol. 101, No. 9 (September 2013).
[0045] Various embodiments of the disclosure include systems and methods that enable application of tactile features to virtual objects, e.g., to images or other displayed objects in an adaptive manner.
[0046] In the following description, reference is made to the accompanying drawings that form a part thereof, and in which is shown by way of illustration, specific example embodiments in which the present teachings may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present teachings, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present teachings. The following description is, therefore, merely illustrative.
[0047] Haptic Core
[0048] In some cases, the Haptic Core Chip (or simply, haptic core) is a physical embodiment of aspects performed by the touch enabling platform (or TEP) described in US Patent Application No. 16/209,183.
The haptic core renders embedded objects with corresponding haptic effects automatically, as appropriate for each device in which it is utilized. In certain cases, as noted herein, the haptic core is configured to
generate and coordinate rendering of waveforms specific to the input characteristics detected at the device (e.g., interface device). In various implementations, the waveform(s) is generated dynamically using a data model.
[0049] As human-machine interactions become more and more fluid, and the amount of information to transmit between components increases, original equipment manufacturers (OEMs) and software developers aim to make the user-device interaction richer. One approach has been to improve the image and sound associated with this interaction, and another approach is to add modalities. Haptics also enable enhanced human-machine interactions, but conventional haptic systems are deficient in a number of respects.
[0050] Haptic feedback typically comes from either a system event or user input. When it comes from user input, to be effective the haptic feedback should be generated in real time based on the content of that input, and should reach the user with near-zero latency. To deliver user-triggered haptic feedback, the system must pick up the user input through various software layers (or algorithms), decide on the proper response, compute this response, and send it to an actuator through another set of software layers. Each of these layers adds latency, which can aggregate into an overall latency large enough that the feedback becomes irrelevant. Other factors adding latency include processor load, congestion on communication buses, and any other tasks the system must perform with the resources also used for haptic feedback.
[0051] In order to address this issue of haptic latency, the haptic core includes architecture that can be implemented in software or in a chip (or chipset) that allows for streamlined, low-latency haptic feedback. The haptic core is configured to:
[0052] A) Gather (user) input from device hardware, such as via digitizer electronics in a stylus or hand-held device, and/or via the digitizer controller in the case of a touch screen device. This is in contrast to conventional haptic platforms that gather input from the device operating system (OS) via an application programming interface (API). The conventional approach that relies on the OS and API introduces latency and error via software layer-based delay and filtering of raw data.
[0053] B) Add sensors and fuse the input from the primary input device (e.g., touch screen or other hand-held or wearable device). As an example, most conventional stylus-based interaction information is gathered from the digitizer, processed by the OS, and then provided to the haptic rendering component. The haptic rendering component produces a time-varying waveform that contains the haptic signal, which is then streamed to the stylus. This streaming approach introduces latency. Aspects of the haptic core described herein reduce this latency by enabling time-varying waveform generation on the device (e.g., stylus), and motion detection via sensors onboard the device.
[0054] The haptic core has one or more of the following features:
[0055] I) A modular design. The haptic core can include a system-in-package IC and various required and optional external components.
[0056] II) On-board sensing of motion and contact forces, i.e., on-board the device that performs the haptic rendering.
[0057] III) Embedded wireless processing, e.g., via Bluetooth Low Energy (BLE) or other communications protocol.
[0058] IV) On-board material-model storage, which can enable fast retrieval and rendering.
[0059] V) Output to drive an actuator, e.g., an on-board piezoelectric actuator.
[0060] VI) On-board power management (e.g., fast charging), as well as extended runtime (e.g., 30 minutes to 2+ hours of runtime).
[0061] The haptic core has various benefits relative to conventional haptic rendering components, for example: near-zero latency (e.g., instantaneous feedback as perceived by the human user); the ability to add haptic effects to any of a number of surfaces (e.g., touchscreens as well as non-touch screens); streamlined adjustment of controller modes and/or material (rendering) modes via an actuator on the device (e.g., on a wearable device or hand-held device); and the ability to connect directly to a haptic driver chip without going through the OS API for a generated haptic signal, thereby reducing latency.
[0062] FIG. 1 illustrates components in the haptic core according to various aspects. In particular cases, the haptic core captures human tactile interaction with a plurality of objects (e.g., a library of objects) and generates a time-varying waveform that matches the human user’s interaction with the object at a rendering device. The haptic core can include the following components:
[0063] A first amplifier 20 for amplifying a first capacitive force sensor input, e.g., from a touch enabling device such as a touch screen, stylus, medical instrument, or other hand-held or wearable device.
[0064] A second amplifier 30 for amplifying a second capacitive force sensor input, e.g., from another portion of the touch enabling device such as a touch screen, stylus, medical instrument or other hand-held or wearable device.
[0065] An inertial measurement unit (IMU) 40 configured to detect changes in position/orientation (via measured acceleration and angular rate) of the device on which the haptic core is positioned. The measured acceleration can also be used in calculating scanning functionality.
[0066] A power management component (PMC) 50 coupled with a battery gauge for managing battery usage, e.g., from an external power source such as a battery on-board the device where the haptic core is located.
[0067] Communications (Comm.) devices such as an antenna 60 and transceiver 70 (e.g., such as BLE antenna/transceiver components) for wirelessly communicating with the host device. In various implementations, the host device can include a tablet, smartphone, wearable smart device or any other computing device on which a texture is to be simulated. In some cases, the host device sends velocity information as well as the identifier (ID) of the texture to the device that performs the haptic rendering (e.g., the device over which the actuator is hovering or contacting).
[0068] A microcontroller 80 for performing one or more processes. FIG. 2 shows a data flow diagram illustrating processes performed by the microcontroller, which can include the following: a) determine whether contact occurs based on the capacitive force input, and trigger a waveform output; b) compute an
input speed for the core algorithm to generate the tactile waveform, using the detected speed from the host device as well as the acceleration sensed by the on-board inertial measurement unit (IMU); c) load a texture (data) model corresponding to the ID that is received from a device (e.g., wirelessly from the host device); and d) if required or beneficial, input detected force and computed speed into the loaded texture (data) model to generate corresponding outputs to a haptic driver such as a piezo driver (e.g., Boreas driver) that actuates a haptic response via a piezo actuator (FIG. 1).
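For illustration only, the following C sketch arranges the processes of paragraph [0068] (contact decision, speed fusion, model evaluation, and driver output) into a single update loop. The function names, the contact threshold, the blend weight, and the speed-to-frequency mapping are assumptions made for this sketch and are not taken from the disclosure.

```c
#include <math.h>
#include <stdbool.h>

#define PI_F                    3.14159265f
#define CONTACT_FORCE_THRESHOLD 0.05f  /* assumed threshold, in newtons */

typedef struct {
    float force_n;        /* amplified capacitive force reading */
    float host_speed_mps; /* speed reported wirelessly by the host device */
    float imu_accel_mps2; /* tangential acceleration from the on-board IMU */
    float dt_s;           /* update period, e.g., 0.001 s for a 1 kHz loop */
} haptic_inputs_t;

/* a) contact decision from the capacitive force input */
static bool contact_detected(const haptic_inputs_t *in) {
    return in->force_n > CONTACT_FORCE_THRESHOLD;
}

/* b) fuse the host-reported speed with the speed integrated from IMU acceleration */
static float fuse_speed(const haptic_inputs_t *in, float prev_speed_mps) {
    float imu_speed = prev_speed_mps + in->imu_accel_mps2 * in->dt_s;
    const float alpha = 0.8f;  /* assumed blend weight toward the host estimate */
    return alpha * in->host_speed_mps + (1.0f - alpha) * imu_speed;
}

/* c)+d) evaluate the loaded texture model; it is reduced here to a
 * speed- and force-dependent sinusoid purely for illustration. */
static float texture_model_sample(float speed_mps, float force_n, float t_s) {
    float freq_hz = 50.0f + 400.0f * speed_mps;     /* assumed speed-to-frequency map */
    float amp = (force_n > 1.0f) ? 1.0f : force_n;  /* clip drive amplitude */
    return amp * sinf(2.0f * PI_F * freq_hz * t_s);
}

/* One loop iteration: returns the sample passed to the piezo driver. */
float haptic_core_step(const haptic_inputs_t *in, float *speed_state, float t_s) {
    if (!contact_detected(in))
        return 0.0f;  /* no contact: trigger no waveform output */
    *speed_state = fuse_speed(in, *speed_state);
    return texture_model_sample(*speed_state, in->force_n, t_s);
}
```

In this arrangement, the host-supplied speed and texture selection arrive over the wireless link, while force and acceleration come from the on-board sensors, so the drive sample can be computed without a round trip through a host operating system.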
[0069] In certain cases, the haptic core 10 is sized to integrate in any number of small wearable and/or handheld devices (e.g., stylus and/or touch-screen devices), and can be approximately 2mm x 2mm in size in particular cases. The haptic core 10 can be particularly beneficial in stylus devices, medical instruments, ultrasound probes, dental instruments, etc. In additional implementations, the haptic core 10 can be integrated in any haptic interface device, that is, any device capable of rendering haptic feedback.
[0070] The haptic core 10 enables autonomous computing capability, that is, the haptic core does not rely on an external microcontroller or processor to generate haptic feedback instructions (outputs). This can reduce latency relative to conventional systems, e.g., achieving latency of approximately one (1) millisecond (ms) as compared with approximately 100ms of latency in conventional systems.
[0071] In particular examples, the haptic core 10 enables a method of rendering haptic feedback at a device having a tactile interface and a haptic actuator. In certain cases, the microcontroller in the haptic core is configured to perform a method as illustrated in the process diagram in FIG. 3. In certain cases, this method can include the following processes:
[0072] Process P1: Detecting an input characteristic of user contact with the tactile interface. In some cases, process P1 is an optional pre-process, as indicated in phantom. For example, in particular implementations, e.g., where the microcontroller 80 is located in a distinct device, the microcontroller 80 can obtain an input characteristic from the device with the haptic actuator. However, in various implementations, the microcontroller 80 is on board the device (i.e., physically located at the device).
[0073] In certain aspects, the device comprises at least one of: a stylus, a tablet, a smart phone, a smart device, a wearable smart device (e.g., smart watch, wearable audio device, or wearable biometric monitor), a vehicle steering wheel, a utensil, a medical device, an exploration probe, or a hand controller. In some cases, the tactile interface comprises at least one of: a touchpad, a trackpad, a touch screen, a smart surface, or a specific haptic-feedback interface element (e.g., a handheld device with a specific location for resting one or more fingers).
[0074] In some cases, the input characteristic includes at least one of: a) a detected speed of movement of a body part of the user across the tactile interface, b) a detected force applied by the user at the tactile interface, or c) a detected change in a contact state of the user’s body part relative to the tactile interface.
For example, the input characteristic can include detected speed of movement (e.g., speed of a finger moving across a touch screen, speed of a user’s hand moving across a steering wheel, or the speed of a user’s finger moving along a stylus or utensil), a detected force applied by the user (e.g., normal or pressing
force on a touch screen, squeezing or compressive force on a steering wheel, stylus or utensil), or a detected change in the contact state of the user’s body part (e.g., in contact with the interface, not contacting the interface, or transitioning between contact and non-contact). In particular cases, the input characteristic comprises a single input characteristic, e.g., speed, force, or contact vs. non-contact.
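As a minimal sketch, the input characteristic described above could be represented by a small structure such as the following; the field names and units are assumptions for illustration only.

```c
/* Hypothetical representation of the detected input characteristic:
 * speed, force, and contact state, per items a)-c) above. */
typedef enum {
    CONTACT_STATE_NONE,        /* body part not touching the interface */
    CONTACT_STATE_TOUCHING,    /* body part in contact with the interface */
    CONTACT_STATE_TRANSITION   /* transitioning between contact and non-contact */
} contact_state_t;

typedef struct {
    float           speed_mps; /* a) detected speed across the tactile interface */
    float           force_n;   /* b) detected force applied at the interface */
    contact_state_t contact;   /* c) detected contact state */
} input_characteristic_t;
```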
[0075] Process P2: Processing the input characteristic with a data model having an input characteristic to waveform output correspondence. In various implementations, the input characteristic is used as an input to the data model to produce a waveform for rendering tactile feedback. In some cases, the model is stored locally at memory in the device. In certain implementations, the model is configured to be downloaded to the memory from an external source. For example, the external source can include a cloud-based storage system and/or an application store. In particular aspects, the model comprises at least one of: a relational database, a neural network, or linked lists of parameters (e.g., a parameter file or files including coefficients and corresponding specified parameters).
[0076] Process P3: generating a waveform based on the input characteristic using the data model. As noted herein, in particular cases, the waveform is dynamically generated in response to the input characteristic. In various implementations, the waveform is not predefined, and is instead generated based on the input characteristic, which can vary across a wide range of values. For example, input values for speed and force can span the complete range of possible input values that are detectable at the interface. In certain cases, the data model is adaptable for distinct types of input device and/or interface, e.g., to widen or narrow the range of possible input values based on the type of input device and/or interface. In certain cases, the waveform is randomly generated within a range of frequencies based on the input characteristic, e.g., random variation within bounding frequencies.
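The bounded random generation mentioned above can be sketched as follows; the speed-to-frequency bounds and the amplitude scaling are illustrative assumptions rather than values from the disclosure.

```c
#include <math.h>
#include <stdlib.h>

#define PI_F 3.14159265f

/* Map the detected speed to a bounding frequency range, then pick a random
 * frequency inside that range (random variation within bounding frequencies). */
static float bounded_random_frequency(float speed_mps) {
    float f_low  = 40.0f  + 100.0f * speed_mps;  /* assumed lower bound, Hz */
    float f_high = 120.0f + 300.0f * speed_mps;  /* assumed upper bound, Hz */
    float u = (float)rand() / (float)RAND_MAX;   /* uniform value in [0, 1] */
    return f_low + u * (f_high - f_low);
}

/* Fill one segment of the dynamically generated waveform; in this sketch
 * the amplitude scales with the detected force. */
void generate_segment(float *out, int n, float sample_rate_hz,
                      float speed_mps, float force_n) {
    float freq_hz = bounded_random_frequency(speed_mps);
    for (int i = 0; i < n; ++i) {
        float t = (float)i / sample_rate_hz;
        out[i] = force_n * sinf(2.0f * PI_F * freq_hz * t);
    }
}
```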
[0077] In certain additional cases, the data model calculates the waveform based on a stored set of vibrational responses and extrapolations on the set of vibrational responses, where the extrapolations are best-fit adjustments to the vibrational responses based on the input characteristics of the user contact. For example, where the detected input characteristics of the contact (e.g., speed, force, contact/non-contact) deviate from a threshold for a set of vibrational responses, the data model calculates one or more extrapolated waveforms based on the amount of the deviation.
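One way to read the extrapolation described above is as an interpolation that is allowed to run past the ends of the stored set. The sketch below reduces each stored vibrational response to an amplitude and frequency recorded at a reference speed; that reduction, and the linear adjustment, are assumptions for illustration.

```c
/* Hypothetical stored vibrational response, reduced to two parameters. */
typedef struct {
    float ref_speed_mps;  /* speed at which the response was recorded */
    float amplitude;      /* normalized drive amplitude */
    float frequency_hz;   /* dominant vibration frequency */
} stored_response_t;

/* Select the stored responses nearest the detected speed and linearly
 * inter/extrapolate between them by the amount of the deviation.
 * Assumes 'set' is sorted by ref_speed_mps and n >= 2. */
stored_response_t extrapolate_response(const stored_response_t *set, int n,
                                       float detected_speed_mps) {
    int i = 0;
    while (i < n - 2 && set[i + 1].ref_speed_mps < detected_speed_mps)
        ++i;
    const stored_response_t *a = &set[i];
    const stored_response_t *b = &set[i + 1];
    float span = b->ref_speed_mps - a->ref_speed_mps;
    float t = (span != 0.0f) ? (detected_speed_mps - a->ref_speed_mps) / span : 0.0f;
    stored_response_t out;
    out.ref_speed_mps = detected_speed_mps;
    out.amplitude     = a->amplitude    + t * (b->amplitude    - a->amplitude);
    out.frequency_hz  = a->frequency_hz + t * (b->frequency_hz - a->frequency_hz);
    return out;  /* t outside [0, 1] extrapolates beyond the stored set */
}
```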
[0078] Process P4: rendering the waveform at the haptic actuator to provide tactile feedback to the user. In particular aspects, the haptic actuator is one of a plurality of haptic actuators at the device. In these implementations, the waveform can be rendered using one, two, three or more haptic actuators at the device.
[0079] In some aspects, the tactile interface includes a display, and the data model is specific to a tactile object shown on the display during detection of the input characteristic. For example, the object shown on the display can include one of a plurality of objects in a displayed image or video. In certain cases, the microcontroller 80 is configured to generate distinct waveforms for providing a tactile response to user contact with one or more of the objects in the displayed image or video. In particular implementations, the tactile feedback is provided as a series of vibrations via the haptic actuator, e.g., to provide feedback in response to contact with a given object over time, and/or to provide feedback in response to contact with multiple objects in a displayed image or video.
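A minimal sketch of selecting an object-specific model is shown below; the lookup table and the identifier-based scheme are assumptions made for illustration, not a structure taken from the disclosure.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical association between a displayed object and its data model. */
typedef struct {
    uint16_t    object_id;      /* identifier of the object shown on the display */
    const void *texture_model;  /* opaque handle to that object's data model */
} object_model_entry_t;

/* Return the model registered for the object at the contact location,
 * or NULL when no tactile object is registered for that identifier. */
const void *model_for_object(const object_model_entry_t *table, size_t n,
                             uint16_t object_id) {
    for (size_t i = 0; i < n; ++i) {
        if (table[i].object_id == object_id)
            return table[i].texture_model;
    }
    return NULL;
}
```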
[0080] In certain implementations, the tactile waveforms are generated synthetically. In these examples, the haptic core 10 is configured to generate a waveform specific to the curvature of the interface surface, e.g., by taking a perpendicular cross-section of the shape of the interface, and combining sinusoid waveforms iteratively to plot the shape of the interface at the cross-section (e.g., to find a best-fit set of waveforms for the surface). The haptic core 10 can then generate a vibration pattern to match the synthetic waveform.
[0081] FIG. 4 is a flow diagram illustrating processes in an additional method according to implementations. In these cases, the process can include actuating haptic feedback at a device in response to detecting a haptic event from a user, e.g., a touch interface event or contact of a device by a user. In particular cases, the method includes:
[0082] P101: generating a waveform for triggering a haptic response at the device; and
[0083] P102: rendering the waveform as a set of vibrations at the device.
[0084] In particular cases, the waveform comprises a series of successive vibrations that are rendered as vibrations once the haptic event is detected, and wherein each vibration is delivered as a decaying sinusoid with a frequency, amplitude and decay rate that vary based on a material of the device, wherein each vibration lasts approximately 0.10 seconds or less.
[0085] In certain implementations, rendering the waveform comprises rendering a series of waveforms at an actuator on the device over time to match an exploration profile of a series of haptic events, wherein the exploration profile varies over time and the series of waveforms comprises dynamically generated time-varying waveforms that are calculated at a rate of approximately 1 kilo-hertz (kHz), and wherein the series of waveforms are double integrated to create an acceleration graph representing the haptic response.
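For illustration, one vibration of the kind described above could be synthesized as a decaying sinusoid computed at roughly the stated 1 kHz rate; the specific frequency, amplitude, and decay values would stand in for material-dependent parameters, and the double-integration step is not shown.

```c
#include <math.h>

#define PI_F           3.14159265f
#define UPDATE_RATE_HZ 1000.0f  /* waveform samples computed at ~1 kHz */
#define MAX_DURATION_S 0.10f    /* each vibration lasts ~0.10 seconds or less */

/* Fill 'out' with one decaying-sinusoid vibration; returns samples written. */
int render_vibration(float *out, int max_samples,
                     float freq_hz, float amplitude, float decay_per_s) {
    int n = (int)(MAX_DURATION_S * UPDATE_RATE_HZ);
    if (n > max_samples)
        n = max_samples;
    for (int i = 0; i < n; ++i) {
        float t = (float)i / UPDATE_RATE_HZ;
        out[i] = amplitude * expf(-decay_per_s * t) * sinf(2.0f * PI_F * freq_hz * t);
    }
    return n;
}
```

A series of haptic events would be rendered by calling such a routine once per event, with the parameters drawn from the material model in effect at that moment.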
[0086] Index
[0087] The Index is also referred to as the Tactai Index™ (a trademark belonging to Tactai, Inc., headquartered in Waltham, MA). The index is a hand-held instrument that enables the capture of real-world interaction between a human user and an object surface or texture for subsequent recreation on a screen or other tactile surface. In various implementations, the index is referred to as a scanning device, or a hand-held scanning device. In certain cases, the scanning device (index) includes:
[0088] I) A device that includes: i) a tip for contacting an object surface with a texture to be detected (or, “acquired”). The tip transmits topological characteristics of the surface to sensors located on the tool as the user moves/taps the tool against the object. By following the characteristics of the surface (e.g., bumps, crevices, etc.), the tip registers accelerations that are detected by on-board sensing elements; ii) A set of sensors for allowing the tool to collect interaction parameters as well as the accelerations that result from surface exploration by the tool. Interaction parameters include, e.g., the velocity of the tool tip relative to the surface being explored, as well as the contact pressure between the tip and the surface; iii) A microprocessor for analyzing acquired data, e.g., for completeness. The microprocessor can then package
the data and send it to a connected host for storage, transmission, subsequent implementation, etc.; and iv) An interface (e.g., wired and/or wireless) to the host that permits transmission of raw data acquired by the tip.
In one particular example, the scanning device 500 includes a handheld device such as a stylus, depicted schematically in FIG. 5. Electronics 510, including sensors, microprocessor(s), interface(s), and other components in the haptic core, are illustrated only in general terms in FIG. 5. It is understood that electronics 510 can also include one or more actuators (e.g., piezoelectric actuators/drivers), power sources (e.g., battery), power controller(s), communication devices (e.g., via WiFi, BLE or other communication protocols), and switches/buttons not shown in FIG. 5.
[0089] II) A software development kit (SDK) that includes components for the host device to interface with the index. The SDK can include one or more of: a) a device driver configured to manage communication with the index; and b) an application programming interface (API) that exposes functions for a user application to manage the data acquisition process (described herein), and communicate with an external modeler such as a cloud-based modeler.
[0090] III) A modeler (e.g., cloud-based modeler) that allows for processing data captured by the index and creation of corresponding models. These models can be stored in a server, e.g., a cloud-based server.
[0091] The scanning device (index) 500 can function as an input/output device for the haptic core, and can map contacts with a surface type for use by the modeler in order to generate a model of a surface. The index 500 can also function to render the model at the device as a user interacts with a surface. The models are selectable for different devices (indices) such as a host device (e.g., tablet, smartphone, stylus, medical instrument, etc.), and can vary by frequency and resolution for each device. As noted herein, the models can be stored on-board a device in the haptic core 10.
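As a sketch only, one sample reported by the index to the host might carry the interaction parameters named above (tip velocity and contact pressure) alongside the sensed accelerations; the layout and units below are assumptions.

```c
#include <stdint.h>

/* Hypothetical per-sample record streamed from the index to the host. */
typedef struct {
    uint32_t timestamp_us;   /* sample time, microseconds */
    float    accel_mps2[3];  /* tip accelerations from on-board sensing (x, y, z) */
    float    tip_speed_mps;  /* velocity of the tip relative to the surface */
    float    pressure_n;     /* contact pressure between the tip and the surface */
} index_scan_sample_t;
```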
[0092] In certain implementations, the scanning device (or, “index”) 500 enables a method as illustrated in the flow diagram in FIG. 6. In some cases, the method illustrated in FIG. 6 can be performed prior to process P1 illustrated in FIG. 3. However, in other cases, the scanning device method illustrated in FIG. 6 can be performed independently of the processes illustrated in FIG. 3. In particular implementations, the method further includes, e.g., prior to detecting the input characteristic of the user contact:
[0093] P201: providing instructions for scanning an object with a scanning device. In some cases, the instructions are provided via any conventional output device at the scanning device (e.g., a visual interface) or via a connected output device such as an audio output device or another visual display device (e.g., at a connected smart device).
[0094] D202: Is the scan data sufficient to characterize the object?
[0095] If No to D202 (insufficient scan data), the process reverts back to P201, providing instructions (e.g., repeat prior instructions or refined instructions) for scanning the object with the scanning device in order to meet sufficiency requirements.
[0096] If Yes to D202 (sufficient scan data), process P203 includes confirming successful scanning of the object with the scanning device. In certain cases, confirming the successful scan is performed using any interface described herein.
[0097] In particular aspects, the method further includes Process P204: updating the data model with the scan data to enable tactile feedback associated with the scanned object.
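The loop formed by P201, D202, and P203 can be sketched as follows; the sufficiency criterion (a minimum sample count), the prompt text, and the capture routine are placeholders assumed for illustration.

```c
#include <stdbool.h>
#include <stdio.h>

#define MIN_SAMPLES_REQUIRED 2000  /* assumed sufficiency criterion */

/* D202: decide whether the captured data characterizes the object. */
static bool scan_data_sufficient(int samples_captured) {
    return samples_captured >= MIN_SAMPLES_REQUIRED;
}

/* P201: provide (or repeat/refine) scanning instructions to the user. */
static void provide_scan_instructions(int attempt) {
    printf("Scan pass %d: move the tip slowly across the object surface.\n", attempt);
}

/* Placeholder capture routine; a real implementation would read the index
 * sensors here. It returns a fixed simulated sample count for the sketch. */
static int capture_scan_pass(void) {
    return 800;
}

void acquire_object_scan(void) {
    int samples = 0;
    int attempt = 1;
    while (!scan_data_sufficient(samples)) {   /* D202 */
        provide_scan_instructions(attempt++);  /* P201 */
        samples += capture_scan_pass();
    }
    printf("Scan complete: object characterized.\n");  /* P203 */
    /* P204: the captured data would then be used to update the data model. */
}
```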
[0098] In certain cases, the scanning device comprises an optical scanning device. In some implementations, the scanning device is hand-held. In some cases, the scanning device is part of the device having the tactile interface.
[0099] Modeler and Library Service
[00100] The modeler and library service can be configured to generate, store and distribute object surface models. In some cases, the modeler and/or library service is cloud-based. In particular cases, the modeler and library store data captured by the index and generate (e.g., on-demand) an object surface model that is specific to each user interaction. Depending on the target (haptic) device on which the model will be implemented, sampling rates, resolution and/or force can be varied. As an example, if the target device has a high-resolution actuator and enough power and memory, the model could have an approximately 10 kHz sampling rate, and use a complete Delaunay triangulation, with a final model file size of a few hundred kilobytes. At the other end of the spectrum, if a target device can only render a limited set of low fidelity textures, then the sampling rate could be approximately 1 kHz with a simplified model of only a few kilobytes, and additional scaling will be performed as post processing to adjust the signal for input force.
[00101] The modeler and library can include the following:
[00102] A) An API allowing communication with client applications. The API can enable a user to perform: i) upload of data captured by the index; ii) list available object surface models; iii) purchase and/or license the use of specific object surface models; iv) create and/or manage a personal library of object surface models; and v) download object surface models.
[00103] B) Data storage for storing raw, captured data.
[00104] C) A modeler that is used to generate the required object surface model for a particular output device on-demand. Depending on input parameters and output device capabilities, output models can range in size from several kilobytes (kB) to several thousand kB.
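By way of illustration, the choice between the high-fidelity and simplified models described for the modeler could be driven by the target device's capabilities; the capability fields, tiers, and cut-off values in the sketch below are assumptions.

```c
#include <stdbool.h>

/* Hypothetical description of the target (haptic) device. */
typedef struct {
    bool high_resolution_actuator;
    int  memory_budget_kb;        /* memory available for the model */
} target_device_caps_t;

/* Hypothetical specification of the generated object surface model. */
typedef struct {
    int  sampling_rate_hz;        /* ~10 kHz high fidelity vs. ~1 kHz simplified */
    bool full_triangulation;      /* complete (e.g., Delaunay) triangulation or not */
    bool scale_force_in_post;     /* adjust the signal for input force afterwards */
} model_spec_t;

model_spec_t choose_model_spec(const target_device_caps_t *caps) {
    model_spec_t spec;
    if (caps->high_resolution_actuator && caps->memory_budget_kb >= 256) {
        spec.sampling_rate_hz    = 10000;  /* larger model, hundreds of kB */
        spec.full_triangulation  = true;
        spec.scale_force_in_post = false;
    } else {
        spec.sampling_rate_hz    = 1000;   /* simplified model of only a few kB */
        spec.full_triangulation  = false;
        spec.scale_force_in_post = true;
    }
    return spec;
}
```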
[00105] Hardware and/or software components and functions
[00106] The communication between any components described herein can occur over any conventionally available means, such as but not limited to Bluetooth, WiFi, NFC, USB 1-3 (Type-A, Type-B, or Type-C), Lightning, audio port, etc.
[00107] Haptic accessories or devices described herein can be connected either directly to a touch interface device or, in the case of a keyboard or any other peripheral, to a CPU. In some cases, a haptic accessory can be integrated into the touch interface device. These haptic accessories can be powered either by the touch interface device or by another power source, e.g., an on-board battery.
[00108] While shown and described herein as approaches for providing a touch enabled system, it is understood that aspects of the invention further provide various alternative embodiments. For example, in one embodiment, the invention provides a computer program fixed in at least one computer-readable medium, which when executed, enables a computer system to provide a touch enabled system. To this extent, the computer-readable medium includes program code, which implements some or all of the processes and/or embodiments described herein. It is understood that the term "computer-readable medium" comprises one or more of any type of tangible medium of expression, now known or later developed, from which a copy of the program code can be perceived, reproduced, or otherwise communicated by a computing device. For example, the computer-readable medium can comprise: one or more portable storage articles of manufacture; one or more memory/storage components of a computing device; paper; etc.
[00109] In another embodiment, the disclosure provides a method of providing a copy of program code, which implements some or all of a process described herein. In this case, a computer system can process a copy of program code that implements some or all of a process described herein to generate and transmit, for reception at a second, distinct location, a set of data signals that has one or more of its characteristics set and/or changed in such a manner as to encode a copy of the program code in the set of data signals. Similarly, an embodiment of the invention provides a method of acquiring a copy of program code that implements some or all of a process described herein, which includes a computer system receiving the set of data signals described herein, and translating the set of data signals into a copy of the computer program fixed in at least one computer-readable medium. In either case, the set of data signals can be transmitted/received using any type of communications link.
[00110] In still another embodiment, the invention provides a method of providing a haptic touch interface system. In this case, a computer system can be obtained (e.g., created, maintained, made available) and one or more components for performing a process described herein can be obtained (e.g., created, purchased, used, modified) and deployed to the computer system. To this extent, the deployment can comprise one or more of: (1) installing program code on a computing device; (2) adding one or more computing and/or I/O devices to the computer system; (3) incorporating and/or modifying the computer system to enable it to perform a process described herein; etc.
[00111] In any case, the technical effect of the various embodiments is to allow users to experience a haptic interaction with an interface as described herein. In particular cases, the technical effect of various embodiments includes rendering haptic feedback at a device having a tactile interface and a haptic actuator. In further particular cases, the technical effect of various embodiments includes generating, storing and distributing object surface-human interaction models such as those captured by an index device. Additional technical effects include capturing human tactile interaction with objects and generating a time-varying waveform that represents (or, imitates) that human interaction with the object, for example, in rendering at a device such as a hand-held device and/or touch screen device.
[00112] As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
[00113] The flowcharts and block diagrams in the Figures illustrate the layout, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
[00114] The functionality described herein, or portions thereof, and its various modifications (hereinafter “the functions”) can be implemented, at least in part, via a computer program product, e.g., a computer program tangibly embodied in an information carrier, such as one or more non-transitory machine-readable media, for execution by, or to control the operation of, one or more data processing apparatus, e.g., a programmable processor, a computer, multiple computers, and/or programmable logic components.
[00115] A computer program can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a network.
[00116] Actions associated with implementing all or part of the functions can be performed by one or more programmable processors executing one or more computer programs to perform the functions of the calibration process. All or part of the functions can be implemented as special purpose logic circuitry, e.g., an FPGA and/or an ASIC (application-specific integrated circuit). Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and
data from a read-only memory or a random access memory or both. Components of a computer include a processor for executing instructions and one or more memory devices for storing instructions and data.
[00117] In the description, reference is made to the accompanying drawings that form a part thereof, and in which is shown by way of illustration specific exemplary embodiments in which the present teachings may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the present teachings, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present teachings. The following description is, therefore, merely exemplary.
[00118] While the present teachings have been illustrated with respect to one or more embodiments, alterations and/or modifications can be made to the illustrated examples without departing from the spirit and scope of the appended claims. In addition, while a particular feature may have been disclosed with respect to only one of several embodiments, such feature may be combined with one or more other features of the other embodiments as may be desired and advantageous for any given or particular function. Furthermore, to the extent that the terms “including”, “includes”, “having”, “has”, “with”, or variants thereof are used in either the detailed description or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.” The term “at least one of” is used to mean one or more of the listed items can be selected.
[00119] Notwithstanding that the numerical ranges and parameters setting forth the broad scope of embodiments are approximations, the numerical values set forth in the specific examples are reported as precisely as possible. Any numerical value, however, inherently contains certain errors necessarily resulting from the standard deviation found in their respective testing measurements. Moreover, all ranges disclosed herein are to be understood to encompass any and all sub-ranges subsumed therein. For example, a range of "less than 10" can include any and all sub-ranges between (and including) the minimum value of zero and the maximum value of 10, that is, any and all sub-ranges having a minimum value of equal to or greater than zero and a maximum value of equal to or less than 10, e.g., 1 to 5. In certain cases, the numerical values as stated for the parameter can take on negative values. In this case, the example value of range stated as “less than 10” can assume negative values, e.g., -1, -2, -3, -10, -20, -30, etc.
[00120] As used herein, the term “configured,” “configured to” and/or “configured for” can refer to specific-purpose features of the component so described. For example, a system or device configured to perform a function can include a computer system or computing device programmed or otherwise modified to perform that specific function. In other cases, program code stored on a computer-readable medium (e.g., storage medium), can be configured to cause at least one computing device to perform functions when that program code is executed on that computing device. In these cases, the arrangement of the program code triggers specific functions in the computing device upon execution. In other examples, a device configured to interact with and/or act upon other components can be specifically shaped and/or designed to effectively interact with and/or act upon those components. In some such circumstances, the device is configured to interact with another component because at least a portion of its shape complements at least a portion of the
shape of that other component. In some circumstances, at least a portion of the device is sized to interact with at least a portion of that other component. The physical relationship (e.g., complementary, size-coincident, etc.) between the device and the other component can aid in performing a function, for example, displacement of one or more of the device or other component, engagement of one or more of the device or other component, etc.
[00121] In various embodiments, components described as being “coupled” to one another can be joined along one or more interfaces. In some embodiments, these interfaces can include junctions between distinct components, and in other cases, these interfaces can include a solidly and/or integrally formed interconnection. That is, in some cases, components that are “coupled” to one another can be simultaneously formed to define a single continuous member. However, in other embodiments, these coupled components can be formed as separate members and be subsequently joined through known processes (e.g., soldering, fastening, ultrasonic welding, bonding). In various embodiments, electronic components described as being “coupled” can be linked via conventional hard-wired and/or wireless means such that these electronic components can communicate data with one another.
[00122] The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms "a", "an" and "the" may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "including," and "having," are inclusive and therefore specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. It is also to be understood that additional or alternative steps may be employed.
[00123] When an element or layer is referred to as being "on", "engaged to", "connected to" or "coupled to" another element or layer, it may be directly on, engaged, connected or coupled to the other element or layer, or intervening elements or layers may be present. In contrast, when an element is referred to as being "directly on," "directly engaged to", "directly connected to" or "directly coupled to" another element or layer, there may be no intervening elements or layers present. Other words used to describe the relationship between elements should be interpreted in a like fashion (e.g., "between" versus "directly between," "adjacent" versus "directly adjacent," etc.). As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
[00124] Spatially relative terms, such as "inner," "outer," "beneath", "below", "lower", "above", "upper" and the like, may be used herein for ease of description to describe one element or feature's relationship to another element(s) or feature(s) as illustrated in the figures. Spatially relative terms may be intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. For example, if the device in the figures is turned over, elements described as
"below" or "beneath" other elements or features would then be oriented "above" the other elements or features. Thus, the example term "below" can encompass both an orientation of above and below. The device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
[00125] The foregoing description of various aspects of the invention has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously, many modifications and variations are possible. Such modifications and variations that may be apparent to an individual skilled in the art are included within the scope of the invention as defined by the accompanying claims.
Claims
1. A computer-implemented method of rendering haptic feedback at a device having a tactile interface and a haptic actuator, the method comprising:
detecting an input characteristic of user contact with the tactile interface;
processing the input characteristic with a data model having an input characteristic to waveform output correspondence;
generating at least one waveform based on the input characteristic using the data model, wherein the at least one waveform is dynamically generated in response to the input characteristic; and
rendering the at least one waveform at the haptic actuator to provide tactile feedback to the user.
2. The method of claim 1, wherein the data model calculates the waveform based on a stored set of vibrational responses and extrapolations on the set of vibrational responses, wherein the extrapolations are best-fit adjustments to the vibrational responses based on the input characteristics of the user contact.
3. The method of claim 1, wherein the haptic actuator is one of a plurality of haptic actuators.
4. The method of claim 1, wherein the input characteristic comprises at least one of: a) a detected speed of movement of a body part of the user across the tactile interface, b) a detected force applied by the user at the tactile interface, or c) a detected change in a contact state of the user’s body part relative to the tactile interface.
5. The method of claim 1, wherein the input characteristic comprises a single input characteristic.
6. The method of claim 1, wherein the tactile interface comprises a display, and wherein the data model is specific to a tactile object shown on the display during detection of the input characteristic.
7. The method of claim 6, wherein the object shown on the display is one of a plurality of objects in a displayed image or video.
8. The method of claim 1, wherein the tactile feedback is provided as a series of vibrations via the haptic actuator.
9. The method of claim 1, wherein the model is stored locally at memory in the device.
10. The method of claim 9, wherein the model is configured to be downloaded to the memory from an external source.
11. The method of claim 1, wherein the model comprises at least one of: a relational database, a neural network, or linked lists of parameters.
12. The method of claim 1, wherein the device comprises at least one of: a stylus, a tablet, a smart phone, a vehicle steering wheel, a utensil, a medical device, an exploration probe, or a hand controller.
13. The method of claim 1, wherein the tactile interface comprises at least one of: a touchpad, a trackpad, a touch screen, a smart surface, or a specific haptic-feedback interface element.
14. The method of claim 1, further comprising, prior to detecting the input characteristic of the user contact:
providing instructions for scanning an object with a scanning device; and
confirming successful scanning of the object with the scanning device in response to receiving scan data sufficient to characterize the object.
15. The method of claim 14, wherein the scanning device comprises an optical scanning device.
16. The method of claim 14, wherein the scanning device is hand-held.
17. The method of claim 14, further comprising: updating the data model with the scan data to enable tactile feedback associated with the scanned object.
18. The method of claim 14, wherein the scanning device is part of the device having the tactile interface.
19. A computer-implemented method of actuating haptic feedback at a device in response to detecting a haptic event from a user, the method comprising:
generating a waveform for triggering a haptic response at the device; and
rendering the waveform as a set of vibrations at the device,
wherein the waveform comprises a series of successive vibrations that are rendered as vibrations once the haptic event is detected, and wherein each vibration is delivered as a decaying sinusoid with a frequency, amplitude and decay rate that vary based on a material of the device, wherein each vibration lasts approximately 0.10 seconds or less.
20. The computer-implemented method of claim 19, wherein rendering the waveform comprises rendering a series of waveforms at an actuator on the device over time to match an exploration profile of a series of haptic events, wherein the exploration profile varies over time and the series of waveforms comprises dynamically generated time-varying waveforms that are calculated at a rate of approximately 1 kilo-hertz (kHz), and wherein the series of waveforms are double integrated to create an acceleration graph representing the haptic response.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202062957429P | 2020-01-06 | 2020-01-06 | |
US62/957,429 | 2020-01-06 | | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021141936A1 (en) | 2021-07-15 |
Family
ID=76788324
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2021/012240 WO2021141936A1 (en) | Haptic waveform generation and rendering at interface device | 2020-01-06 | 2021-01-06 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2021141936A1 (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080226134A1 (en) * | 2007-03-12 | 2008-09-18 | Stetten George Dewitt | Fingertip visual haptic sensor controller |
US20120249462A1 (en) * | 2011-04-01 | 2012-10-04 | Analog Devices, Inc. | Method and apparatus for haptic vibration response profiling and feedback |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024036708A1 (en) * | 2022-08-19 | 2024-02-22 | 瑞声开泰声学科技(上海)有限公司 | Generation method and system for tactile feedback effect, and related device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21738133; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21738133; Country of ref document: EP; Kind code of ref document: A1 |