
WO2013160561A1 - Tactile output system - Google Patents

Tactile output system

Info

Publication number
WO2013160561A1
Authority
WO
WIPO (PCT)
Prior art keywords
finger
texel
tactile output
tactile
contact
Prior art date
Application number
PCT/FI2013/050468
Other languages
French (fr)
Inventor
Pekka Nikander
Jukka Linjama
Jonas Bengtsson
Harri KAPANEN
Original Assignee
Senseg Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Senseg Ltd filed Critical Senseg Ltd
Publication of WO2013160561A1 publication Critical patent/WO2013160561A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Definitions

  • the subject matter disclosed herein generally relates to electronic devices. Specifically, the present disclosure addresses a tactile output system.
  • a tactile output system may be used to generate tactile stimuli while a user is using a device surface, typically causing a tactile sensation to one or more of the user's fingers.
  • a user may experience a change in sliding friction or a vibrational feeling when touching a capacitive touchscreen to operate a computer program controlled by one or more touch inputs (e.g., touch input events), which may be generated or detected by the capacitive touchscreen.
  • the computer program may affect the tactile stimuli that generate the user's experiences.
  • Improvements to user experience may comprise improved spatial accuracy, realism, natural feeling, flexibility, diversity of possible user feedbacks, or the like. This object is achieved with methods and devices as stated in the attached independent claims.
  • the dependent claims and the present description and drawings disclose additional features which provide additional benefits and/or solve residual problems.
  • FIG. 1 is a conceptual flowchart illustrating sources of information used by a tactile output system, according to some example embodiments.
  • FIG. 2 is a conceptual diagram illustrating a hierarchy of texel maps, according to some example embodiments.
  • FIG. 3 is a set of diagrams illustrating location estimation, according to some example embodiments.
  • FIGS. 4-9 are block diagrams illustrating components of a tactile output system in the example form of a tablet computer, according to various example embodiments.
  • FIG. 10 is a diagram depicting a haptic device in the example form of a tactile stimulation apparatus, according to some example embodiments.
  • FIG. 11 is a schematic diagram illustrating various components of a haptic device in the example form of a tactile stimulation apparatus, according to some example embodiments.
  • FIG. 12 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
  • FIGS. 13-16 are flowcharts illustrating operations of a tactile output system, according to various example embodiments.
  • Example methods and systems are directed to tactile output (e.g., haptic output).
  • some of the example embodiments of these methods and systems may support control of tactile stimuli in such a manner that a large amount of spatial accuracy and flexibility is achieved. Examples merely typify possible variations.
  • components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example embodiments.
  • a system for generating tactile sensory stimulation may include a few pieces of special-purpose hardware and software, combined together into a system, in a few possible ways.
  • a system may be embodied in a device (e.g., electronic device).
  • the software pieces may be organized so that one or more parts of the software runs on a general-purpose processor, under the control of an operating system of the device, while one or more other parts of the software run on a separate microcontroller.
  • This micro-controller may control a piece of special-purpose hardware, for example, by controlling a number of output signals, such as general-purpose input/output (GPIO) signals, timer signals, or other signals corresponding to physical pins that alter some electrical aspect of the hardware.
  • the changing of these electrical aspects may alter some physical aspect of the hardware, such as tension in a piezo-electric crystal, a rotation speed of an electrical motor used in a vibrator, or a strength of a Coulomb's force between a surface and a user's finger (e.g., fingertip) in an electrostatic-vibrational tactile system.
  • a tactile output system may be able to generate some number or range of differing tactile stimuli.
  • One or more pieces of an application or other software may control what stimuli are generated.
  • the software, hardware, and combinatorial aspects of such a system may determine (e.g., describe, define, limit, or any suitable combination thereof) the number, range, and quality of the stimuli the system is able to generate.
  • a tactile output system provides an application programming interface (API).
  • Such an API may include one or more data formats and structures and associated classes, methods, or functions.
  • An application or other program may construct (e.g., directly or indirectly) one or more instances of these data structures and call or invoke the API functions or methods, in order to instruct the hardware to generate tactile stimuli.
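  • As an illustrative sketch only (in C++, with assumed names such as TactileEffect, TactileSurface, and playEffect rather than any actual vendor API), such an API might be organized along the following lines:

        // Hypothetical shape of such a tactile API; the types and the stub body
        // are assumptions made for illustration, not an actual implementation.
        #include <cstdint>
        #include <vector>

        struct TactileEffect {
            std::uint8_t intensity;              // relative stimulus strength, 0..255
            std::uint16_t durationMs;            // how long the stimulus should last
            std::vector<std::uint8_t> waveform;  // optional point-like effect waveform
        };

        class TactileSurface {
        public:
            // Ask the tactile output hardware to actualize the effect at a surface
            // position (in pixels). A stub body stands in for the real driver call.
            void playEffect(int /*x*/, int /*y*/, const TactileEffect& /*effect*/) {}
        };

        // Usage: an application constructs an effect instance and invokes the API
        // when, for example, a touch event report indicates a finger crossing an edge.
        //   TactileSurface surface;
        //   surface.playEffect(120, 340, TactileEffect{200, 30, {}});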
  • the input device may issue a series of touch event reports.
  • Each report may describe one or more measured locations of the touching finger. It may also contain other information, such as an estimate of the finger pressure.
  • an application or some other program (e.g., a computer program executing on the tactile output system) controls the generation of one or more tactile stimuli (e.g., which kind of tactile stimulus to generate and when to generate it).
  • the application may combine information from its internal state and the touch event reports, and may generate one or more application-specific tactile commands that appropriately cause the hardware to generate one or more tactile stimuli.
  • a game application may determine (e.g., compute ) that the immediately preceding user input has induced a virtual explosion in the game. Based on that determined (e.g., computed) result, the game application may produce (e.g., generate) one or more tactile commands that cause the hardware to shake the device intensively so as to simulate the effect of an explosion.
  • a vibrator motor start-up latency, caused by physical inertia and other physical factors, may limit how fast a tactile effect can start after the application program has made its decision and issued a tactile command to initiate the tactile effect.
  • tactile output systems may have limitations on how fast and accurately they can produce tactile stimuli as a response to the changes in the program state and user actions (e.g., in response to state changes in the program as a result of one or more touch input event reports that indicate user actions that were submitted via a touch-sensitive surface).
  • a touch-based input/output system, such as a tablet computer having a capacitive touch screen and an electrostatic-vibrational tactile touch effect system (e.g., subsystem), may experience unavoidable systemic delays. It may take, for example, on the order of a few tens of milliseconds from the time a finger slides over a specific position on the tactile surface until the command that is generated in response to a corresponding touch event report is manifested (e.g., actualized) as tactile output.
  • a touch input device may take some 10-30 ms to process its measurements and send the generated input event reports to its main central processing unit (CPU).
  • the operating system may take some time to process those reports (e.g., less than 1 ms), and the system-side touch input system (e.g., subsystem) may take some more time to process and filter them (e.g., less than 1 ms, unless the system is very busy with other tasks).
  • it may take some more time before the operating system scheduler assigns a processor core to the right application process. After all these events have taken place, the application program may process the touch input events.
  • the command may first be passed to another process, such as a process that serves many processes and decides which process is currently allowed to utilize a particular piece of hardware.
  • the latency in a tactile output system (e.g., haptic system or haptic device) may be characterized as a latency in a haptic system input-output control loop, and such a latency may include three parts: input processing latency, application processing latency, and output rendering latency.
  • latency refers to end-to-end delay in a haptic system input-output control loop.
  • latency jitter refers to variations in the latency between control events that deteriorate tactile output timing and spatial accuracy.
  • "tactile stimuli" is a generic term used to describe various physical phenomena that may directly or indirectly stimulate sensory nerves in a human being.
  • the stimuli may consist of an electric field that may primarily change the friction between a user's finger and the tactile surface.
  • a "tactile element” refers to a single “feature” that can be felt at a surface (e.g., a description of the desired tactile properties of a position at a surface).
  • tactile elements include a fraction of an edge, a protruding point, and an elementary feature that is part of a texture, in which case the total of a few individual elements may form the perceivable texture feeling.
  • tactile content refers to a set of "feelable” tactile elements.
  • tactile contents may include all of the tactile elements used by an application.
  • Tactile content may be described or defined in many substantially different ways (e.g., as different embodiments of the content), such as directly within a program source code in the form of conditional statements and other procedural or imperative programming language constructions, or in a more declarative way, in which case the content may be described through use of suitably expressive data structures and a corresponding declarative programming language.
  • a "tactile surface” refers to a surface equipped with an input device and tactile output hardware so that a user can slide one or more fingers over the surface and perceive various tactile elements as being there on the surface.
  • an "input event report” is a piece of data that represents a momentary state at an input device.
  • An input event report may describe one or more locations and one or more contact areas of a user's finger or fingers on a tactile surface.
  • an "input device” is a subsystem that may include hardware and software for producing input event reports.
  • an “output signal” is an electrical signal, digital or analog, that is used within the system to directly alter the electrical or physical state of the tactile output hardware.
  • physical actualization refers to a phenomenon when a piece of tactile output hardware changes the physical state of a tactile output system (e.g., as a result of an output signal change), so that the user's perceptions get changed.
  • a "tactile output system” is a system (e.g., an apparatus, a machine, or device) that generates output signals from some embodiment of tactile content and input event reports.
  • spatial accuracy refers to a quality describing how accurately a tactile element stays at its defined (e.g., desired) position, or moves along its defined (e.g., desired) trajectory (e.g., as a user is trying to track the element on a tactile surface).
  • spatial flexibility refers to a quality describing how easily, uniformly, and fast the system may change the tactile content on a tactile surface.
  • a "grain" refers to a shortest time segment of a tactile output signal.
  • an optimal duration may be 20 to 50 ms, which may correspond to the shortest perceptually distinguishable tactile event, or to a time granularity of tactile output system control.
  • "slide speed" refers to a velocity of a finger touch sliding on a touch interface surface.
  • a "texel” is a texture element that is defined for an area on a virtual tactile surface.
  • Slide speed may determine one or more dimensions of a texel.
  • One grain may correspond to (e.g., map to) one texel.
  • a texel may describe tactile output to be rendered by the tactile output system at a spatial location (e.g., a locationary piece of a description of tactile output).
  • a "texel map” refers to a visualization (e.g., a digital representation) of tactile content as texels on a surface (e.g., so that a haptic design can be presented and be aligned with computer graphic pixels) and touch input locations on the surface. Accordingly, a texel map may spatially describe tactile output to be rendered by the tactile output system (e.g., a spatial description of such tactile output).
  • a texel may describe a corresponding portion of tactile output for a spatial location (e.g., on a surface of a device contacted by a user's finger) as a function of spatial coordinates (e.g., with respect to the surface of the device).
  • the texel may describe the corresponding portion of tactile output for the spatial location as a point-like effect description (e.g., the degree to which the portion of the tactile output is perceivable as being pointy or otherwise similar to a pinpoint).
  • a point-like effect description may be expressed as a waveform (e.g., applicable to control tactile output hardware in providing an output signal).
  • Example embodiments of a tactile output system are configured to produce high quality tactile stimuli.
  • a tactile output system may be able to process the application state and user input in a way that allows the system to produce tactile stimuli that have very high spatial accuracy and flexibility.
  • Such a system may have no limitations on where on the screen the tactile stimuli can be generated, and such a system may produce them precisely at the location a user would expect them to be felt based on typical real-life sensory experiences.
  • an example tactile output system may be able to generate accurate contours and edges on a touch-input based tablet computer.
  • Example embodiments of the tactile output system may include a mechanical system (e.g., a subsystem) that is configured to cause the tactile surface to physically morph or vibrate.
  • the specific areas that are able to morph may be spatially constrained, which may result in a corresponding physical limitation in spatial accuracy.
  • Physical vibration, on the other hand, may limit both the range of effects and the places where the effect may be felt well.
  • the tactile output system may be able to produce tactile stimuli flexibly, or equally at all locations within the tactile surface, with or without limitations in its ability to provide high spatial accuracy.
  • a processing latency is inherent to the system, which may cause the effects to be felt post facto. This, in turn, may limit the spatial accuracy.
  • a texel map may be utilized as a design aid, with various attendant benefits arising therefrom.
  • the tactile output system may be characterized by an ability to provide tactile sensations with high spatial accuracy and flexibility. This may be achieved through combining a number of specific mechanisms in various configurations. Although the discussion herein focuses on specific example embodiments of such a tactile output system, it should be appreciated that one skilled in the art may define other configurations of the herein described system, mechanisms, components, and subsystems.
  • the tactile output system may include one or more mechanisms (e.g., components or subsystems) configured to perform any one or more of the following functions: finger movement speed estimation, finger location prediction, finger movement direction estimation, finger pressure estimation, effect adjustment based on (e.g., according to) finger speed, effect adjustment based on finger movement direction, effect adjustment based on finger pressure, a point-like effect description (e.g., with data description compensated based on speed, pressure, direction, or any suitable combination thereof), effect description based on a spatial map (e.g., texel map), and effect variation (e.g., based on prioritization, multiple alternating texel maps, texel map data, finger speed, finger direction, finger pressure, or any suitable combination thereof).
  • the tactile output system may estimate the speed of a finger through any approximation technique. It may be beneficial to adjust the algorithm used based on a previous estimate. For example, if the previous estimate indicates that a finger has been stationary, the system may use an algorithm that produces good overall tactile stimuli results for slow speeds. On the other hand, if the previous estimate indicates that the finger has been moving at a high speed (e.g. as in a "fling" or "flick” gesture), the system may instead use another algorithm, such as one that is experimentally found to produce good overall tactile stimuli at high speeds.
  • the system may use a relatively accurate speed estimate at slow speeds, but may use a more "conservative" algorithm at higher speeds, due to the ill effects overcompensation may have, as discussed below.
  • Such finger speed estimation may be beneficial in a tactile output system for at least the reason that the system may use such an estimate to adjust the tactile stimuli produced by the tactile output system. It may also be beneficial indirectly, such as by allowing the system to perform further estimation and adjustments.
  • the input device may provide some value that describes how hard or lightly the user's finger is pressing on the screen (e.g., a touchscreen). Such a value may be usable for affecting tactile output, as such, or it may be beneficial to compute a further estimate from one or more values provided by the input device. In some example embodiments, using a sliding average may give a finger pressure estimate that is more suitable for affecting tactile output than using the value directly.
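  • A minimal sketch of the sliding-average idea described above, assuming the input device reports a raw per-event pressure value; the class and method names are illustrative:

        // Sliding (moving) average over the raw pressure values from input event
        // reports, giving a smoother estimate for affecting tactile output.
        #include <cstddef>
        #include <deque>
        #include <numeric>

        class PressureEstimator {
        public:
            explicit PressureEstimator(std::size_t window = 8) : window_(window) {}

            // Feed the raw value from the latest input event report; returns the
            // averaged estimate over the last 'window' reports.
            double update(double rawPressure) {
                samples_.push_back(rawPressure);
                if (samples_.size() > window_) samples_.pop_front();
                return std::accumulate(samples_.begin(), samples_.end(), 0.0) /
                       static_cast<double>(samples_.size());
            }

        private:
            std::size_t window_;
            std::deque<double> samples_;
        };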
  • the tactile output system may be configured to compensate for one or more inherent system latencies. For example, by the time an application or other program component processes any touch event reports, a moving finger may have already moved further from the reported location. Furthermore, by the time any tactile stimulus gets physically actualized, the finger may have moved some distance further. To compensate for this, the tactile output system may estimate a predicted position of the finger at a future time when any set of tactile stimuli will get physically actualized.
  • Such an estimation may be based, for example, on one or more of the following: a set of previous touch event reports, a "current" (or latest) touch event report, one or more empirical results describing the overall input latencies from a finger crossing a point to the time the touch event report reaches the processing piece of program code, one or more empirical results describing the output latencies from the processing piece of program code generating tactile output expressions to the hardware to the time the effect is physically actualized, a degree of variation or jitter on such measurements, and a degree of dependency that the variation or jitter may have on finger location, finger speed, finger pressure, tactile content, or any suitable combination thereof.
  • the finger location may be estimated through one or more methodologies. For example, a speed estimate, a movement direction estimate, system latency, and the latest measured location may be used to compute a location estimate.
  • some example embodiments of the system may use a more complex procedure, and the algorithm used may vary, for example, based on empirical testing results. For example, the tactile output system may slightly underestimate (e.g., "deliberately") how much the finger has advanced at high speeds, in order to damp down errors stemming from measurement jitter.
  • Having such a location estimate available may be beneficial for a tactile output system. For example, if a finger is moving at high speed over an edge or other feelable surface feature, without location estimation the user may perceive the feature to move on the screen depending on the movement speed and direction of the finger.
  • Finger movement direction estimation may be performed, in various example embodiments, in a manner similar to estimation of finger speed.
  • the tactile output system may estimate the direction of the movement a finger is taking (e.g., a movement direction).
  • the finger speed estimation algorithm represents the finger speed as a vector, thus indicating not only the scalar velocity but also the direction of the finger.
  • the algorithm may refuse to change the direction abruptly (e.g., even if the measurements would indicate so), since such an abrupt change may be deemed more likely to have been caused by finger location measurement errors rather than real direction changes.
  • an actual physical effect that a user can feel is produced by the surface itself morphing physically, thereby forming protrusions, inclusions, indentations, or any suitable combination thereof.
  • the sensory effect caused by the tactile formation feels natural per se, as the sensory nerves of the user's finger or hand get excited by a physical formation.
  • one or more sensory effects are caused by some other mechanism yet still involve real physical formations.
  • a piezo-electrical system may create a standing vibration at the point of a finger, and hence create an illusion of the surface reclining or ascending by quickly shifting the standing waveform at the finger position.
  • in other example embodiments, such as an electrostatic-vibration system (e.g., subsystem or component), the vibration, field, or other means of generating sensory effects may merely be contributing to the desired sensory feeling.
  • the actual feeling may be formed in the brain of the user, for example, based on the nerve signals coming from the finger and the expectations in the user's brain.
  • the expectations may depend on the user's previous experiences, as well as other sensory input, such as any picture visible on the touched surface (e.g., a generated image), one or more sounds, and signals from temperature, pain, and other sensory cells that may or may not get excited by the tactile output method used.
  • simply actualizing the same tactile output whenever the finger passes a given location on the surface may produce an unnatural feeling.
  • the user may perceive that an edge gets sharper or duller at differing speeds of the finger. Consequently, creating an illusion of a naturally feeling surface may involve differing physical tactile stimuli to be generated depending on the finger speed. For example, it may be beneficial to reduce the amplitude of tactile stimuli when the finger is moving at higher speeds. As another example, in an electrostatic-vibration system, such reduction of the amplitude may decrease the friction between the finger and the surface, thereby producing a feeling more similar to natural friction, which generally depends on the sliding speed. Such a variation may also be beneficial in making a single physical surface appear as different materials, since the way natural friction behaves at differing sliding speeds may depend on the material.
  • Another example of effect adjustment based on finger speed is compensating for friction phenomena at very low sliding speeds. In this case, it may be beneficial to reduce the tactile effect intensity, or otherwise modify the tactile output waveform to better imitate the behavior of a finger touching real surfaces.
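  • An illustrative sketch of such speed-based adjustment; the thresholds and scaling factors below are assumptions chosen only to show the shape of the computation, not values from any particular implementation:

        // Scale a grain's nominal amplitude by the estimated sliding speed:
        // attenuate at very low speeds (to better imitate real friction) and taper
        // at high speeds (so features do not feel exaggerated).
        #include <algorithm>

        double adjustAmplitude(double baseAmplitude, double speedMmPerS) {
            if (speedMmPerS < 5.0) {
                return baseAmplitude * (0.5 + 0.1 * speedMmPerS);   // slow-slide regime
            }
            if (speedMmPerS > 100.0) {
                return baseAmplitude *
                       std::max(0.3, 1.0 - (speedMmPerS - 100.0) / 400.0); // fast-slide taper
            }
            return baseAmplitude;  // nominal range: play the effect as designed
        }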
  • some surface formations may cause different sensory effects, depending on the angle and direction between the formation and a sliding finger. For example, when a finger passes over an edge at an "ascending" direction, the feeling is different from that experienced when the finger is passing over the same edge at the "descending" direction. Furthermore, if the finger slides along the edge, the feeling may be different still. On more complex directional surfaces, such as on striped velvet, the feeling may depend on the direction in more complex ways, or may depend on some combination of direction, pressure, and speed of the movement. Hence, it may be beneficial for certain example embodiments of the tactile output system to be configured to cause differing physical stimuli to be generated depending on the finger direction, the finger pressure, or both. Such an ability may help the system to more accurately simulate differing textures.
  • illusions of feeling moving objects may be created by using information of finger movement with respect to a virtual object movement direction.
  • the user may recognize a moving object by successive finger explorations.
  • the moving object may be recognized by the user based on a different sensation when the finger is moving against the object's motion (e.g., direction of movement) compared to when the finger is following the object's motion.
  • the system may estimate the finger speed, pressure, direction, and latency-adjusted location, as described above. Based on such estimates, the system may adjust the generated tactile stimulus so that it would cause a sensory feeling that closely matches with the desired tactile experience.
  • for a sharp protrusion, the system may produce different tactile stimuli compared to a situation where a similar-sized protrusion is a dull one.
  • the system may describe each spatial feature (e.g., texel) as a data structure that describes differing effects at different directions with respect to the texel.
  • the system may differentiate between 9 different directions (e.g., 4 main directions, 4 intermediate directions, and 1 for a stationary finger), or it may differentiate between 17 differing directions.
  • Each directional effect may be further described in a way that allows the system to react in different ways to different sliding speeds, where the various different sliding speeds include no sliding at all (e.g., the stationary finger case).
  • the effect may be described as a "parameterized" function (e.g., a function dependent on one or more adjustable parameters) that reduces or increases the strength of an electrostatic field as the sliding finger achieves higher velocities, as the finger pressure changes (e.g., increases or decreases), or both.
  • the frequency of the waveform used in such electrostatic fields may be changed according to the speed, pressure, or both.
  • a surface (e.g., of the tactile output system) may be described or defined as a two-dimensional array of tactile elements (e.g., when describing or defining tactile surfaces).
  • Such a two-dimensional array may be considered as a texel map (e.g., a map of texture elements).
  • the texel map may be defined by one or more map functions that describe the desired feeling at each surface location.
  • a texel map may be used as an implementation description, as a visualization of tactile content (e.g., for design purposes), or both.
  • the texel map combines the spatial and temporal aspects of haptics interaction design.
  • a tactile (e.g., haptic) output system may determine what physical stimuli to actuate based on information of a finger sliding over one or more of the texels in such a texel map. Since any one or more texels may be spatially smaller than a fingertip, the system may need to determine (e.g., choose or identify) a texel from a number of texels, or combine a number of texels, in order to simulate the desired tactile feeling. For example, a single texel may cover a 1 square millimeter area of the tactile surface, while the area of the finger touching the screen may cover a 25 square millimeter oval area. Hence, the tactile output system may be configured to work with (e.g., analyze or monitor) a 25 square millimeter cross sectional area as the finger slides over the surface, and may consider up to a few tens of texels at each point in time.
  • each texel data structure may contain information for describing the depth of the texel, the normal of the texel, or both.
  • the depth of the texel may represent whether the texel is protruding or reclining in relation to one or more neighboring texels.
  • the normal of the texel may represent the direction of the rectangular surface element that the texel represents in relation to the enclosing surface.
  • the depth information and normal information may be represented in a format that is commonly used for describing such information in computer programs (e.g., in a depth map or normal map, such as those used in computer graphics).
  • the actual hardware used for generating tactile stimuli may take a relatively long amount of time to produce a meaningful tactile effect (e.g., 10 or 50 ms).
  • if a finger slides over the surface at a relatively fast speed, such as 10 cm/s, it may slide a distance of 5 mm during a 50 ms effect duration. If the tactile surface is described by a texel map (e.g., an effect map), the finger may proceed over, for example, 5 texels before the system has a chance to decide which effect to output next.
  • the system may determine the texels that a finger is likely to slide over during the next estimated time period, assign priorities to them, and select the most prominent texel based on (e.g., with the help of) such priorities.
  • the algorithm used by the system may also determine whether a texel is likely to be in the middle of the sliding finger or at an edge of it, and adjust the priorities accordingly based on this determination. For example, it may adjust up the priority of the texels that are likely to be in the middle of the sliding finger while adjusting down the priority of the texels at the edge of the finger.
  • An alternative method may be to store several texel maps at differing resolutions or granularities (e.g., levels of granularity).
  • the system may be configured to always select a texel map that best matches with the current finger speed, so that the finger always and only slides from a texel to a neighboring texel.
  • the system may use a more finely described texel map where each texel is described, for example, as a 1 mm x 1 mm area.
  • the system may use a texel map where each texel describes the most desired effect for a 5 mm x 5 mm area when the finger is sliding with the speed corresponding to this resolution or granularity.
  • a system may store several texel maps at a given granularity at differing offsets, for example, so that a second texel map has a texel corner located at the middle point of the other texel map. This may help the system to quickly adjust to higher or slower speeds as the finger speed changes.
  • the tactile output system may have an overall system organization (e.g., system architecture). Example embodiments of such a system organization are described herein. Alternative system organizations may be apparent to one skilled in the art. For example, additional example embodiments may be obtained by leaving out some of the described components or adding more components.
  • FIG. 1 illustrates an example embodiment of the overall organization, showing various information flows.
  • the first source is some embodiment of the information that describes the desired effect "landscape".
  • Such effect landscape information may be defined in various different ways. For example, such information may be dispersed over an application program in the form of a condition statement or other program code. In particular, some or all of the effect landscape information may be stored declaratively (e.g., in texel maps), as described above.
  • This first source of information may be relatively "static”, in the sense that it may be created by developers of the system or application (e.g., program) to be executed thereon and it does not change over the lifetime of a program version. However, it may also be “dynamic” in the sense that it may be computed (e.g., in real-time or near real-time).
  • an application program may create a texel map from an image downloaded from the network by, for example, using an edge detection algorithm that allows the application to make the edges within the image feelable.
  • an application that creates 3-D graphics may also create a corresponding dynamic texel map.
  • implementation of dynamic texel maps involves using shared memory between the application process and the other components of the system.
  • the second source of information may be the input event reports. These may be created dynamically (e.g., during the system runtime), and they may describe the user interactions with the tactile surface. From a system design perspective, these input event reports may be expected to be unpredictable, but have some predictable features when there is a sliding touch contact detected.
  • One function of the system may be to access the two above-described sources of information and compute a stream of output signals that cause the hardware to generate the desired physical phenomena. This allows the user to perceive the surface as having highly accurate tactile elements that can be quickly changed under program control. For example, if a user action causes the system to change its internal state, such as showing a new picture, the internal state change may be able to cause an instant change in the tactile content, where "instant" means faster than the human perception limit.
  • the system is shown as being configured to perform a few processing steps.
  • the tactile event reports are processed first, producing a speed estimate, a direction estimate, a pressure estimate, a location estimate, or any suitable combination thereof.
  • the speed and direction estimates may be represented as a momentary velocity vector, describing the estimated speed and direction of the finger.
  • the location estimate may be represented as an <x, y> coordinate position, describing the estimated position of the finger on the tactile surface at the time the to-be-issued tactile stimulus will get physically actualized.
  • the pressure estimate may be represented as an integer that describes a relative pressure.
  • the speed estimate, direction estimate, pressure estimate, location estimate, or any combination thereof may then be used by the system to select a tactile element that the system will actualize. This selection may be based on one or more tactile maps (e.g. texel maps) representing the desired tactile elements at each point on the tactile surface.
  • the selection process may store and utilize one or more previous finger positions in performing the selection.
  • the input event processing steps may correct such stored previous estimates based on actual measured values.
  • the system may further refine and adjust the desired effect, for example, by selecting the desired piece of tactile content based on the estimated direction of the finger.
  • the content may be further refined, for example, by adjusting it based on (e.g., according to) the estimated finger speed, estimated finger pressure, or both. This may involve changing the duration, the frequency, or one or more other aspects of the output signals.
  • the adjusted piece of tactile content may then be converted into a set of temporally changing output signals.
  • one signal may control when a high-voltage electric charge is passed to an effect electrode (e.g., a tixel) and another signal may control when a high-voltage charge is released from the effect electrode.
  • the input event reports may be represented in at least three different formats, according to some example embodiments.
  • the underlying Linux kernel may represent a touch input event as a sequence of primary Linux input subsystem events.
  • Such events are available, for example, through the Linux /dev/input/eventN device-driver pseudo-file.
  • the Android Window Manager may read the Linux input events and store them in their internal representations.
  • the events may be converted to a Java MotionEvent format.
  • an underlying Linux kernel may represent input event reports in an internal data representation (e.g., internal data structure).
  • an internal data representation e.g., internal data structure.
  • the input event reports may be acquired from the Window Manager in different ways.
  • the system may implement a new Android system service that runs in the same process space as the Window Manager in the Android system_server process.
  • Such a new system service may acquire the actual events by modifying the virtual table of the Android InputDispatcher C++ class so that when any Android C++ component invokes the Android InputDispatcher::notifyMotion(...) method, a method belonging to the tactile output subsystem is called instead.
  • the called method may, in turn, pass a copy of any input event to the rest of the tactile output system and then pass program control to the original InputDispatcher::notifyMotion(...) method.
  • the system server process may be started in the following way. First, when the underlying Linux operating system is booted, it may start the Android init process at the end of its internal boot process.
  • the Android init process may read a number of configuration files, including the init.rc file.
  • the init.rc file defines a number of Linux shell environment variables, including for example the PATH and
  • the init.rc file also defines how the init starts the Android zygote process, which in turn may start the system_server process.
  • the new Android system service may be dynamically loaded to the Android system server process without any modifications to the Android source code. This may be performed, for example, using the following method.
  • a new environment variable LD_PRELOAD may be added to the Linux shell environment as set up by the init process.
  • the LD_PRELOAD value may define a new shared library.
  • the Android Bionic dynamic loader may load the new shared library to any process that init launches, including the zygote and the system_server processes.
  • When the new shared library is loaded using the LD_PRELOAD environment variable, thereby becoming a preloaded library, it may intercept any C function or C++ method defined in any of the libraries that get loaded after the preloaded library.
  • the function defined in the preloaded library may be called instead of the function defined in any of the later loaded libraries.
  • if the function or method is called from within a library or application that defines the function itself, or from a library that has been prelinked with the function or method defined, the original function may get called.
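  • As a generic sketch of this interception mechanism (shown here for a plain C function; a C++ method or constructor is intercepted the same way via its mangled symbol name, or via the virtual-table modification described above), a preloaded library might look like this:

        // Build as a shared library and load it via LD_PRELOAD, e.g.:
        //   g++ -shared -fPIC -o libpreload.so preload.cpp -ldl
        // The interposed function does its side work and then calls the original.
        #ifndef _GNU_SOURCE
        #define _GNU_SOURCE
        #endif
        #include <dlfcn.h>

        extern "C" int puts(const char* s) {
            using puts_fn = int (*)(const char*);
            // Look up the "real" puts in the next library in the search order.
            static puts_fn real_puts =
                reinterpret_cast<puts_fn>(dlsym(RTLD_NEXT, "puts"));
            // Side work would go here (e.g., forwarding a copy of the call to
            // another component of the system), then control passes to the original.
            return real_puts(s);
        }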
  • the Android SensorService C++ class constructor may get called by the Android libsystem_service.so shared library. While the libsystem_service.so library may be prelinked to Android, it might not be prelinked against the libsensorservice.so shared library that defines the SensorService constructor.
  • the preloaded library may intercept the Android SensorService constructor.
  • when the libsystem_service.so shared library calls the SensorService constructor, a new pseudo-constructor defined in the preloaded library may be called.
  • the pseudo-constructor may then initialize some other parts of the preloaded library. For example, the pseudo-constructor may instantiate a new system service. The pseudo-constructor may then pass the program control to the original SensorService constructor.
  • the preloaded library may intercept some other Android method, function, or constructor, or define a C++ static constructor. This may cause a program entry point within the shared library to be called. Such a method may be used to initialize the preloaded library.
  • the preloaded library when it is initialized, it may modify the Android InputDispatcher class virtual table, as described above, or any other Android C++ class virtual table.
  • the above-described dynamic loading of a new shared library may also be implemented by some other means, for example, through modifying the LD_LIBRARY_PATH environment variable.
  • a momentary speed estimate may be computed by calculating the Cartesian distance between a previously acquired input event and a currently acquired input event, dividing the distance by the time that has passed between the events, as computed from the event timestamps.
  • the reported speed estimate may be a cumulative or exponential moving average over a number of momentary speed estimates.
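  • A minimal sketch of such a speed estimate, combining the momentary estimate with an exponential moving average; the structure and field names are illustrative assumptions:

        // Momentary speed = Cartesian distance between consecutive input events
        // divided by the time between their timestamps, then smoothed.
        #include <cmath>
        #include <cstdint>

        struct InputEvent {
            double x, y;               // reported position, e.g. in pixels
            std::int64_t timestampMs;  // event timestamp in milliseconds
        };

        class SpeedEstimator {
        public:
            // Returns the smoothed speed estimate in pixels per millisecond.
            double update(const InputEvent& ev) {
                if (hasPrev_) {
                    const double dx = ev.x - prev_.x;
                    const double dy = ev.y - prev_.y;
                    const double dtMs =
                        static_cast<double>(ev.timestampMs - prev_.timestampMs);
                    if (dtMs > 0.0) {
                        const double momentary = std::sqrt(dx * dx + dy * dy) / dtMs;
                        smoothed_ = alpha_ * momentary + (1.0 - alpha_) * smoothed_;
                    }
                }
                prev_ = ev;
                hasPrev_ = true;
                return smoothed_;
            }

        private:
            InputEvent prev_{};
            bool hasPrev_ = false;
            double smoothed_ = 0.0;
            double alpha_ = 0.3;  // smoothing factor; an assumed value
        };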
  • a momentary direction estimate may be computed by calculating the difference between the x and y coordinates of the previously acquired input event and the currently acquired input event, and calculating the "atan2" function over the differences.
  • the reported radian value may be smoothed with a cumulative or exponential moving average.
  • the resulting radian value may be mapped to a set of discrete directions through comparing the computed radian value with a set of fixed radian values.
  • the momentary direction may be represented as a pair of <x, y> coordinates. These coordinates may be scaled to represent the x and y distance taken if the finger is moving on the represented direction over some unit of time (e.g., 10 ms). According to various example embodiments, the actual x and y values might play no role in this representation, with only their ratio contributing to the momentary direction. For example, if the finger is moving to the right, the direction may be represented as <10, 0>, and if the finger is moving at the angle of π/4 or 45° towards the upper right corner, the direction may be represented as <10, 10>.
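  • A sketch of such direction estimation, quantizing the atan2 angle into 8 discrete sliding directions plus a stationary case and returning a scaled <x, y> pair; the constants are illustrative assumptions:

        #include <cmath>
        #include <utility>

        // Returns a scaled <x, y> direction, e.g. <10, 0> for "right",
        // <10, 10> for a pi/4 diagonal, or <0, 0> for a stationary finger.
        std::pair<int, int> estimateDirection(double prevX, double prevY,
                                              double curX, double curY) {
            const double dx = curX - prevX;
            const double dy = curY - prevY;
            if (std::abs(dx) < 0.5 && std::abs(dy) < 0.5) {
                return {0, 0};                        // effectively stationary
            }
            const double angle = std::atan2(dy, dx);  // radians in (-pi, pi]
            // Quantize into 8 sectors of pi/4 radians each, centered on the axes
            // and diagonals, so e.g. angles in (-pi/8, pi/8] map to sector 0.
            const double pi = std::acos(-1.0);
            int sector = static_cast<int>(std::lround(angle / (pi / 4.0)));
            sector = ((sector % 8) + 8) % 8;          // fold into 0..7
            static const int dirs[8][2] = {
                {10, 0}, {10, 10}, {0, 10}, {-10, 10},
                {-10, 0}, {-10, -10}, {0, -10}, {10, -10}};
            return {dirs[sector][0], dirs[sector][1]};
        }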
  • an input event report may contain data that represents how hard the user's finger is pressing on the tactile surface, how large an area is touched by the finger on the surface, or other information that may be used to estimate how hard the user is currently pressing the finger against the tactile surface.
  • the actual finger pressure may be estimated from such information, for example, using a heuristic algorithm.
  • Such a heuristic algorithm may be calibrated, for example, using an application program that guides the user to touch the tactile surface at different pressures.
  • a forthcoming location of the finger at the estimated time of effect actualization may be based on computing the estimated location from a fixed, experimentally derived system latency estimate, the location reported in the input event, the speed and direction estimates, or any suitable combination thereof.
  • An example procedure may be described as follows.
  • the speed estimate may be multiplied by the latency estimate, giving an estimate of the distance that the finger will have taken between the time when the finger was on the position reported in the input event and the time the effect gets actualized.
  • the x and y direction values may now be scaled by the distance estimate. For example, if the latency is estimated to be 30 ms, the finger speed is 0.4 pixels/ms, and the direction is <10, -10>, the finger may be estimated to have moved 12 pixels within the 30 millisecond period, and the resulting <x, y> delta estimate may be approximately <8.5, -8.5>.
  • the delta estimate may be directly added to the <x, y> position from the last input event report, giving the location estimate. Since the effect takes a while to execute, the location prediction may be enhanced by outputting two points, from x1, y1 to x2, y2, thus defining the area of the texel map under consideration.
  • the predicted locations can be expressed as: location2 = location1 + (speed × effect period).
  • Some example embodiments of the tactile output system ensure that significant details are always played. Simply using estimated values for the locations could result in one or more details being missed. Accordingly, some example embodiments of the system may use the "previous location2" as the starting point for location prediction.
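  • The location prediction arithmetic described above might be sketched as follows; names and units are illustrative assumptions:

        // Scale the estimated direction to the distance implied by speed x latency,
        // add it to the last reported position to get location1, and extend by the
        // effect duration to get location2 (the end of the predicted segment).
        #include <cmath>

        struct Point { double x, y; };
        struct PredictedSegment { Point location1, location2; };

        PredictedSegment predictLocations(Point lastReported,
                                          double speedPxPerMs,
                                          double dirX, double dirY,  // e.g. <10, -10>
                                          double latencyMs,
                                          double effectPeriodMs) {
            const double len = std::hypot(dirX, dirY);
            const double ux = (len > 0.0) ? dirX / len : 0.0;  // unit direction
            const double uy = (len > 0.0) ? dirY / len : 0.0;

            // Distance covered while the input and output latencies elapse,
            // e.g. 0.4 px/ms * 30 ms = 12 px.
            const double latencyDist = speedPxPerMs * latencyMs;
            Point location1{lastReported.x + ux * latencyDist,
                            lastReported.y + uy * latencyDist};

            // The effect itself takes a while to play, so also predict where the
            // finger will be at the end of the effect period.
            const double effectDist = speedPxPerMs * effectPeriodMs;
            Point location2{location1.x + ux * effectDist,
                            location1.y + uy * effectDist};
            return {location1, location2};
        }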
  • Haptic content may be represented in a number of ways by the tactile output system (e.g., haptic device or haptic apparatus).
  • the tactile content is represented in the form of texel maps, or data structures consisting of or describing positional cells, with each cell defining a tactile element (e.g., a texel) that describes the desired tactile properties of a position at the tactile surface.
  • a texel map may be or include a two-dimensional array wherein each cell, at an <x, y> coordinate, contains a small integer number, such as a value between zero and 255. The integer number may be used as an index to another array (e.g., a texel palette) in which each array element describes the texel properties.
  • each texel in the texel map may contain further values, such as the relative "depth" of the texel in relation to the one or more nearby texels (e.g., adjacent texels or texels less than five texels away), the relative "slope" of the texel (e.g., as used in so-called "normal maps” used when rendering computer graphics), or both.
  • the depth may be represented as a small integer, and the normal may be represented by the <x, y> coordinates of the slope normal (e.g., in the form of a pair of small integers).
  • a texel palette element may be or include a short, fixed length array of small integers, such as an array of 9 or 17 numbers between zero and 255.
  • Each element of the array may be associated with a different finger direction, for example with element zero being associated with a positional, non-moving finger, element one being associated with a finger moving on an angle between π/8 and -π/8, element two with angles between 3π/8 and π/8, and so on.
  • Each integer stored in an array element within a texel palette element may be used as an index to another array (e.g., a grain array) in which each element may describe the desired tactile properties of the texel whenever the finger is moving along the associated direction, for example, at some nominal speed, pressure, or both.
  • the elements of the grain array (e.g., grains) may be templates. Each template may be capable of being adjusted to produce differing output signal sequences, with each sequence representing the desired tactile output at a given finger velocity, finger pressure, or both.
  • each template may be a template of a computer program
  • each computer program template may be a binary representation of a sequence of instructions that may be executed by a virtual machine or by a general purpose or special purpose central processor unit (CPU) or microcontroller unit (MCU).
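  • The texel map, texel palette, and grain array described above might be sketched as data structures along the following lines; the sizes and field names are assumptions made only for illustration:

        #include <array>
        #include <cstdint>
        #include <vector>

        // One grain: a template describing the desired tactile output for one
        // direction at some nominal speed and pressure.
        struct Grain {
            std::uint8_t staticWeight;           // used later for texel prioritization
            std::vector<std::uint8_t> waveform;  // template of the output signal
        };

        // One palette element: 9 grain indices, one per discrete finger direction
        // (index 0 = stationary finger, indices 1..8 = the 8 sliding directions).
        using TexelPaletteElement = std::array<std::uint8_t, 9>;

        struct Texel {
            std::uint8_t paletteIndex;     // 0..255, selects a TexelPaletteElement
            std::int8_t depth;             // relative depth vs. neighboring texels
            std::int8_t normalX, normalY;  // slope normal, as in a graphics normal map
        };

        struct TexelMap {
            int width = 0, height = 0;  // map size in texels
            std::vector<Texel> cells;   // row-major, width * height entries
            Texel& at(int x, int y) { return cells[y * width + x]; }
        };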
  • FIG. 2 illustrates texel map data structures and their relationships.
  • a number of texel maps may be associated with each other, forming a hierarchy of texel maps.
  • the texel maps in a texel map hierarchy may all be of differing scales (e.g., differing sizes).
  • each texel in the lowest level texel map describes the desired tactile content for a relatively small area of the tactile surface (e.g., a 1 mm x 1 mm area), and each next level texel map covers an area that is an integer multiple of this area with respect to width and height.
  • a second lowest level texel map may cover a 2 mm x 2 mm area
  • a third lowest level texel map may cover a 4 mm x 4 mm area
  • a fourth lowest level texel map covers an 8 mm x 8 mm area, all the way up to a texel map that contains only 2 x 2 texels at the highest level.
  • a number of texel map hierarchies may be associated together, forming a description of the tactile content associated with a visible programming construct, such as an Android View.
  • only one of the texel map hierarchies is active at a time.
  • An application program may have an API that supports creation of new texel map hierarchies, and such an API may enable the application program to indicate which texel map hierarchy is to be active (e.g., in response to an API call).
  • An underlying windowing system may automatically control which of a multitude of tactile content descriptions is active at a given point in time. For example, whenever an Android View is asserted as the so-called "touch focus," the tactile output system may grant priority to the texel map hierarchies associated with that Android View.
  • a texel selection subsystem within the tactile output system may select a texel whose tactile content will be actualized next.
  • the texel selection subsystem may take as its input a texel map hierarchy.
  • the texel selection subsystem may implement a texel selection routine that considers only one texel map hierarchy (e.g., at a time).
  • one of the multiple texel map hierarchies may be designated as a system-level active texel map hierarchy.
  • One or more other components of the system, such as the Android Window Manager, may change the system-level active texel map hierarchy (e.g., in response to one or more application program API calls and based on the Android touch focus).
  • the texel selection subsystem considers only one texel map hierarchy (e.g., at a time), and the texel selection subsystem may select a texel periodically. For example, the period may be 10 or 50 milliseconds. The tactile content described by the selected next texel may then define the output signals until a following texel is selected.
  • the texel selection subsystem may utilize the information generated by speed estimation, direction estimation, pressure estimation, location estimation, or any suitable combination thereof. With this information, the texel selection subsystem may determine one or more texels that the finger is likely to slide over during the considered time period.
  • the texel selection subsystem may also consider the finger width, which may be wider than each texel and, accordingly, consider multiple texels perpendicular to the estimated direction of the finger.
  • the texel selection subsystem may also consider the finger movement length and, accordingly, consider a number of texels along the estimated direction of the finger.
  • the texel selection subsystem may also consider the finger speed and, accordingly, select a texel map from a hierarchy of texel maps so that the finger is likely to move only from one texel to the next one along its estimated direction. This may have the effect of reducing the number of texels that need to be considered by the texel selection subsystem.
  • the texel selection subsystem may associate a priority or weight with each of the considered texels. This priority or weight may be computed in various ways. For example, each grain in the grain palette may be associated with a static weight, and the computational priority or weight may be formed by dividing the static weight by the relative distance of the texel from the mid-finger path. Based on this priority or weight, the texel selection subsystem may select a single texel that has the highest priority among those considered.
  • the priorities or weights may be computed in such a way that ties become improbable. In the case of a tie, the system may select one of the tied highest priority texels at random.
  • the selecting of a texel may include determining a predicted segment of a predicted path of contact for the finger on the surface of the device, and the texel may be selected based on the predicted segment of that predicted path of contact.
  • the determining of the predicted segment of the predicted path includes estimating a start location of the finger at a start of a time period, estimating an end location of the finger at an end of the time period, estimating a width of an area of the contact on the surface of the device (e.g., as measured perpendicular to a path defined by the start location and the end location), or any suitable combination thereof.
  • the determining of the predicted segment may be based on the estimated start location, the estimated end location, the estimated width of the area of the contact, or any suitable combination thereof.
  • the texel selection subsystem may select a number of texels so that their output signals will be combined using the weights. In such a case, the texels with higher weights may affect more of the output signal than those with lower weights.
  • the texel selection subsystem first selects a texel map from a hierarchy of texel maps so that the finger will move to the next adjacent texel, if at all. For example, if the finger is determined to move 3 mm during the considered period, the texel selection subsystem may first select the texel map with 2 mm x 2 mm areas or the texel map with the 4 mm x 4 mm areas, depending on the initial and destination locations. In certain example embodiments, the coarser area texel map is only selected if the finger is likely to appear as jumping over a texel in the finer grained texel map.
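  • A sketch of selecting the texel map level from the hierarchy so that the finger moves at most one texel per output period, per the 3 mm example above; a real implementation might prefer the finer map unless the finger would actually skip over a texel:

        // Pick the texel map level whose texel size is at least the predicted
        // finger movement for the period. Level sizes (e.g., 1, 2, 4, 8 mm) and
        // the function name are assumptions made for illustration.
        #include <cstddef>
        #include <vector>

        int selectHierarchyLevel(const std::vector<double>& texelSizesMm,
                                 double movementMm) {
            for (std::size_t i = 0; i < texelSizesMm.size(); ++i) {
                if (movementMm <= texelSizesMm[i]) {
                    return static_cast<int>(i);  // finger stays within one texel step
                }
            }
            return static_cast<int>(texelSizesMm.size()) - 1;  // fall back to coarsest
        }

        // Example: with sizes {1, 2, 4, 8} mm and a predicted 3 mm movement this
        // returns the 4 mm level, so the finger cannot appear to jump over a texel.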
  • the texel selection subsystem may then consider the texel at which the finger will be arriving and another texel that is one to five texels adjacent to it, in a direction perpendicular to the finger movement.
  • FIG. 3 shows some illustrative examples.
  • the texel selection subsystem may perform this function by inspecting the eight texels next to the one at which the finger is estimated to land, excluding the ones that were considered during the previous time period, computing the priorities or weights by calculating the distance between the estimated finger path and a parallel line going through the texel middle point and how much further the texel midpoint would be along the finger path, and multiplying these values.
  • a texel map may be represented as a two-dimensional array of data elements indexed by spatial locations, among which is the estimated spatial location of the contact made by the finger.
  • a texel map may be represented as a multidimensional object-based data structure (e.g., a vector haptic model, which may be analogous to a vector graphics model), in which each of multiple objects assigns a corresponding portion of the tactile output (e.g., as described by a tactile output description) to a spatial location (e.g., an estimated spatial location on the surface that is contacted by the finger).
  • the texel selection subsystem may then divide the dynamic weight of 1 among the selected texels, giving each selected texel a weight based on how close the texel is to the estimated finger path. Thereafter, the texel selection subsystem may compute the dynamic priorities of the texels by multiplying the dynamic weight by the static weight associated with the grain indicated by the texel. It may then select the texel with the highest dynamic priority, using a simple pseudo-random number generator to pick one in the case of a tie (see the sketch after this bullet).
  • if the same texel would otherwise be selected a second time in a row, the texel selection subsystem may prevent it from being selected the second time and select the next highest priority texel instead.
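  • The prioritization described above might be sketched as follows (C, illustrative names; the closeness function 1/(1 + distance) and the candidate limit of 16 are assumptions, not taken from this disclosure):

      /* Hypothetical sketch of texel prioritization: distribute a dynamic weight
       * of 1 among candidate texels according to how close each texel midpoint
       * lies to the estimated finger path, multiply by the static weight of the
       * grain indicated by the texel, and pick the highest dynamic priority,
       * breaking ties pseudo-randomly. */
      #include <math.h>
      #include <stdlib.h>

      struct texel_candidate {
          double mid_x, mid_y;    /* texel midpoint */
          double static_weight;   /* weight of the grain indicated by the texel */
      };

      /* Perpendicular distance from the texel midpoint to the estimated finger
       * path through (px, py) with unit direction vector (dx, dy). */
      static double path_distance(const struct texel_candidate *t,
                                  double px, double py, double dx, double dy)
      {
          double rx = t->mid_x - px, ry = t->mid_y - py;
          return fabs(rx * dy - ry * dx);
      }

      int select_texel(const struct texel_candidate *cands, int n,
                       double px, double py, double dx, double dy)
      {
          double closeness[16], sum = 0.0;
          for (int i = 0; i < n && i < 16; i++) {
              /* Closer texels receive a larger share of the dynamic weight. */
              closeness[i] = 1.0 / (1.0 + path_distance(&cands[i], px, py, dx, dy));
              sum += closeness[i];
          }

          int best = -1;
          double best_priority = -1.0;
          for (int i = 0; i < n && i < 16; i++) {
              double dynamic_weight = closeness[i] / sum;   /* shares sum to 1 */
              double priority = dynamic_weight * cands[i].static_weight;
              if (priority > best_priority ||
                  (priority == best_priority && (rand() & 1)))  /* tie-break */
              {
                  best_priority = priority;
                  best = i;
              }
          }
          return best;  /* index of the texel to actualize */
      }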
  • the texel selection subsystem may select the grain (e.g., from a plurality of grains associated with the texel) based on the estimated finger direction.
  • the texel selection subsystem may select a grain associated with (e.g., corresponding to) a direction that can be represented as a multiple of a larger fraction of π.
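  • Assuming, for illustration only, that grains are defined for eight discrete directions (multiples of π/4), the direction-based grain choice could be sketched as:

      /* Hypothetical sketch: snap the estimated finger direction to the nearest
       * of N discrete grain directions (here N = 8, i.e., multiples of pi/4),
       * and use the resulting index to pick a grain from the texel. */
      #include <math.h>

      #ifndef M_PI
      #define M_PI 3.14159265358979323846
      #endif

      #define NUM_GRAIN_DIRECTIONS 8

      int quantize_direction(double dx, double dy)
      {
          double angle = atan2(dy, dx);                      /* -pi .. pi */
          double step  = 2.0 * M_PI / NUM_GRAIN_DIRECTIONS;  /* pi/4 */
          int idx = (int)lround(angle / step);
          return (idx + NUM_GRAIN_DIRECTIONS) % NUM_GRAIN_DIRECTIONS;
      }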
  • the tactile output system may adjust the tactile content defined in the selected texel or selected texels based on (e.g., according to) the finger's estimated speed, pressure, or both.
  • the selected texel may contain a number of different tactile content templates (e.g., for different speed ranges, pressure ranges, or both). This may have an effect similar to that of having a hierarchy of texel maps, selecting the highest priority texel for the next hierarchy level, and having different content templates at each level.
  • the grain may be assumed to contain a corresponding tactile content template (e.g., a tactile output template).
  • This template may include or define some nominal finger speed, v_nom, for which it has been designed, some nominal finger pressure, p_nom, or both.
  • the grains may always be played for a given fixed period of time. Hence, one solution is to not adjust the grains at all and simply accept the ±33% variance.
  • Certain example embodiments may provide even better control, and there are multiple options for doing so.
  • One example option is to perform interpolation of an effect by taking a weighted average between the virtual machine parameters from the selected speed and the next slower or faster speed.
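  • A hedged sketch of such interpolation (C; the parameter names charger_freq_hz, discharger_freq_hz, and target_voltage are illustrative, loosely echoing the charger/discharger frequencies and target analog values mentioned elsewhere in this description):

      /* Hypothetical sketch of effect interpolation: given two templates
       * designed for nominal speeds v_lo and v_hi, take a weighted average of
       * their virtual machine parameters for the actual estimated speed v. */
      struct vm_params {
          double charger_freq_hz;
          double discharger_freq_hz;
          double target_voltage;
      };

      struct vm_params interpolate_params(struct vm_params lo, struct vm_params hi,
                                          double v_lo, double v_hi, double v)
      {
          double w = (v - v_lo) / (v_hi - v_lo);   /* 0 at v_lo, 1 at v_hi */
          if (w < 0.0) w = 0.0;
          if (w > 1.0) w = 1.0;

          struct vm_params out;
          out.charger_freq_hz    = (1.0 - w) * lo.charger_freq_hz    + w * hi.charger_freq_hz;
          out.discharger_freq_hz = (1.0 - w) * lo.discharger_freq_hz + w * hi.discharger_freq_hz;
          out.target_voltage     = (1.0 - w) * lo.target_voltage     + w * hi.target_voltage;
          return out;
      }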
  • Another option is to have or include information in the grain on how to change the virtual machine parameters.
  • the charger and discharger frequencies are scaled down at the slowest speeds, thereby attenuating the strength of the effect.
  • ridges may feel stronger as the speed is increased, up to a point where adjacent ridges may be perceived as starting to blur together. In some example embodiments, middle speeds result in ridges feeling softer as the speed is increased.
  • the resulting grain (e.g., speed-adjusted, pressure-adjusted, or both) may be denoted as an "expressed grain."
  • the expressed grain may be represented in a form of a binary computer program.
  • a program may be executed on an execution unit which may, for example, be a virtual machine running on a physical central processing unit (CPU) or microcontroller unit (MCU), or a general-purpose or special-purpose CPU or MCU (e.g., configured by instructions or software).
  • a tactile output system may implement a virtual machine architecture (e.g., software architecture) that configures the tactile output system to perform one or more of the functions described above.
  • the virtual machine architecture may generally be described as a software architecture that implements one or more mechanisms, components, or subsystems discussed above.
  • the binary computer program (e.g., an executable software application or its source code) may include one or more instructions, and one or more (e.g., each) of the instructions may contain a field specifying the length of time that should pass before the next instruction is executed.
  • the virtual machine or special purpose hardware executing unit may have a facility for executing sections of the program multiple times.
  • the binary computer program may contain instructions that specify (e.g., singly or in combination) one or more of the following parameters: a duration, a repeat count, a repeat offset, and one or more physical parameters (e.g., one or more frequency values, defining for example output signal frequencies during the execution of the instruction, or one or more target analog values, defining for example the target value of a voltage level at the end of the execution of the instruction).
  • one or more instructions (e.g., each instruction) may be executed according to the following procedure:
  • Step 1 Take (e.g., access) the current instruction and adjust the output signals according to one or more physical parameters in the instruction.
  • for example, an output (e.g., a GPIO or timer output) of the microcontroller may be initialized to produce an alternating binary output value at the specified frequency.
  • Step 2 Make the execution unit inactive (e.g., sleep, wait, or block) or perform other functions for a duration specified in the instruction.
  • Step 3 If the repeat count associated with the instruction is zero, which denotes an infinite loop, make the instruction that is at an offset specified in this instruction the next current instruction, and loop back to Step 1.
  • Step 4 Increment a repeat counter associated with the instruction.
  • Step 5 If the repeat counter value is equal to the repeat count value, exit the loop: clear the repeat counter value (e.g., set it to zero), make the next instruction the next current instruction, and loop back to Step 1.
  • Step 6 Make the instruction at an offset specified in this instruction the next current instruction and loop back to Step 1.
  • array counter[size] of integer;
  • curinst = program[pc];
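  • Putting the instruction fields and Steps 1-6 together, a simplified execution unit might look like the following C sketch (the structure fields, the apply_output() and wait_us() hooks, and the 64-instruction bound are illustrative assumptions, not part of this disclosure):

      /* Hypothetical sketch of the grain execution unit described above. Each
       * instruction carries a duration, a repeat count (0 = infinite), a repeat
       * offset, and physical output parameters; the loop follows Steps 1-6 and
       * keeps one repeat counter per instruction, echoing the
       * "array counter[size] of integer" / "curinst = program[pc]" fragments. */
      #include <stdint.h>

      struct grain_instruction {
          uint32_t duration_us;     /* time to wait before the next instruction */
          uint16_t repeat_count;    /* 0 denotes an infinite loop */
          uint16_t repeat_offset;   /* index of the instruction to loop back to */
          uint32_t output_freq_hz;  /* example physical parameter */
          uint16_t target_level;    /* example target analog value */
      };

      /* Platform-specific hooks (assumed to exist elsewhere). */
      void apply_output(uint32_t freq_hz, uint16_t target_level);
      void wait_us(uint32_t duration_us);

      void run_grain(const struct grain_instruction *program, int length)
      {
          uint16_t counter[64] = { 0 };   /* one repeat counter per instruction */
          int pc = 0;

          while (pc < length && pc < 64) {
              const struct grain_instruction *curinst = &program[pc];

              /* Step 1: adjust the output signals per the physical parameters. */
              apply_output(curinst->output_freq_hz, curinst->target_level);

              /* Step 2: stay inactive for the specified duration. */
              wait_us(curinst->duration_us);

              /* Step 3: a repeat count of zero denotes an infinite loop. */
              if (curinst->repeat_count == 0) {
                  pc = curinst->repeat_offset;
                  continue;
              }

              /* Step 4: increment the repeat counter for this instruction. */
              counter[pc]++;

              /* Step 5: when the counter reaches the repeat count, clear it and
               * fall through to the next instruction. */
              if (counter[pc] == curinst->repeat_count) {
                  counter[pc] = 0;
                  pc++;
                  continue;
              }

              /* Step 6: otherwise loop back to the instruction at the offset. */
              pc = curinst->repeat_offset;
          }
      }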
  • various subsystems may be implemented (e.g., included) within a tablet computer.
  • most functions are embodied as software implementations running (e.g., executing) on the tablet computer's host CPU.
  • the results of effect adjustment, such as a resultant binary program, may be transferred to a separate microcontroller.
  • the separate microcontroller may then control the hardware in the forming of output signals (e.g., output signals that convey tactile output).
  • the functions may be divided between different programs or processes.
  • the input event reports may be generated by a window manager component, which may be modified to provide the input event reports to the rest of the tactile output system.
  • One or more of the functional subcomponents of a tactile output system may be combined into a single larger subcomponent, such as a "haptic engine.”
  • the haptic engine may use efficient data structures within a single memory space to streamline processing.
  • the functions may be implemented in program code in an entangled fashion, for example, in order to minimize processing time.
  • One or more application programs may construct one or more texel maps, for example, by reading one or more texel map definitions (e.g., from files stored on disk) and creating respective representations in memory.
  • the application programs may provide the one or more texel maps to the haptic engine on their own initiative (e.g., programming) or in response to a request from the haptic engine.
  • an application program may be implemented as one or more Android activities, Android content providers, or any suitable combination thereof, or in the form of some other Android subsystem.
  • One or more texel maps may be transferred from the application programs to the haptic engine in the form of shared memory regions, for example, using Android ashmem, Android MemoryHeap objects, or both.
  • the tablet computer input controller reports one or more finger positions every 10ms.
  • the input controller delay may be in the order of 20 ms.
  • the output latency may be, for example, 15 ms.
  • the host CPU processing latency may be, for example, less than 2 ms.
  • the Window Manager may hand over a new input event report every 10 ms. These event reports may be stored in a short queue, which may have a capacity of, for example, 6 or 10 event reports.
  • the haptic engine may run on a separate thread of execution, which may be woken up periodically. When the thread is woken up, it may first use the event reports stored in the queue to estimate the current speed and direction of the finger being tracked. The resulting speed vector may be multiplied by the known system latency (estimated in this example case to be 35 ms) to give an estimate (e.g., a predicted estimate) of the distance vector the finger will likely have traveled from the position reported in the latest received input event report by the time the effect is physically actualized. It may be beneficial to use a slightly lower value, such as 30 ms, in order to err on the conservative side in case the speed estimate is not accurate enough, and to further adjust the value to compensate for the effect duration.
  • the distance vector may be added to the finger position available in the latest received input event report, giving an estimated position of the finger.
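  • A minimal sketch of this prediction step (C, illustrative names; it assumes velocity is estimated from just the two most recent event reports, whereas the text above allows using the whole queue):

      /* Hypothetical sketch of latency compensation: estimate finger velocity
       * from the two most recent event reports, multiply by the assumed system
       * latency (e.g., 30-35 ms), and add the resulting distance vector to the
       * latest reported position. */
      struct touch_report { double x_mm, y_mm; double t_ms; };
      struct point { double x_mm, y_mm; };

      struct point predict_position(struct touch_report prev, struct touch_report last,
                                    double latency_ms)
      {
          double dt = last.t_ms - prev.t_ms;
          double vx = 0.0, vy = 0.0;
          if (dt > 0.0) {
              vx = (last.x_mm - prev.x_mm) / dt;   /* mm per ms */
              vy = (last.y_mm - prev.y_mm) / dt;
          }
          struct point p;
          p.x_mm = last.x_mm + vx * latency_ms;    /* distance vector = velocity * latency */
          p.y_mm = last.y_mm + vy * latency_ms;
          return p;
      }

      /* Example: with reports 10 ms apart and latency_ms = 30, the finger is
       * assumed to travel three further report intervals before actualization. */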
  • the estimated position may be used to select the most appropriate texel to be actualized. Accordingly, an effect template may be chosen from the texel using the direction estimate.
  • the effect template may be "expressed" using the speed estimate, the pressure estimate, or both.
  • the resulting "expressed" effect template which is denoted as “expressed grain” in FIG. 4, may be sent from the haptic engine to the separate microcontroller.
  • the expressed grain may be in the form of a binary computer program.
  • the expressed grain may be sent over a digital bus, such as a serial bus (e.g., Universal Serial Bus (USB), Serial Peripheral Interface (SPI), or I2C bus).
  • the tactile output system may further estimate the time it will likely take to run the expressed grain on the microcontroller. This estimate may be used to decide when to schedule the haptic engine thread next.
  • the microcontroller may continue to execute the previous expressed grain while it is receiving the next grain. Once it has received all or a large enough part of the next grain (e.g., so that it will not overrun the so-far received part), the microcontroller may start to execute the next expressed grain. It may do so on its own initiative (e.g., as programmed), for example, based on its estimate of execution time and bus speed, based on an explicit command or instruction received from the host CPU, or based on any suitable combination thereof.
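  • For the scheduling decision mentioned above, a rough duration estimate for an expressed grain might be sketched as follows (C, illustrative; it multiplies each instruction's duration by its own repeat count only, so it undercounts instructions that are replayed inside another instruction's repeat section and counts an "infinite" instruction once):

      #include <stdint.h>

      struct grain_instruction {          /* same illustrative format as above */
          uint32_t duration_us;
          uint16_t repeat_count;          /* 0 denotes an infinite loop */
          uint16_t repeat_offset;
          uint32_t output_freq_hz;
          uint16_t target_level;
      };

      /* Rough estimate of execution time, usable for deciding when to schedule
       * the next wake-up of the haptic engine thread. */
      uint64_t estimate_grain_duration_us(const struct grain_instruction *program, int length)
      {
          uint64_t total = 0;
          for (int i = 0; i < length; i++) {
              uint32_t repeats = program[i].repeat_count ? program[i].repeat_count : 1;
              total += (uint64_t)program[i].duration_us * repeats;
          }
          return total;
      }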
  • effect adjustment may be placed at the microcontroller.
  • the host software may provide the microcontroller with an effect template, which may be in the form of a binary "unlinked" computer program, and a speed estimate, a pressure estimate, or both.
  • the microcontroller may perform the effect adjustment, producing the final effect (e.g. binary program) to be executed.
  • the one or more texel maps or parts thereof may be passed to the microcontroller.
  • the texel map chooser may provide only one texel map, or one or more parts thereof, to the microcontroller, instead of providing multiple texel maps. In this way, the microcontroller may use less memory to store the one or more texel maps.
  • the tactile output system may be embodied on a separate microcontroller.
  • the input event reports may be received directly from, for example, an input microcontroller, which may be beneficial as it may reduce the overall system latency.
  • one or more of the functional blocks may be implemented directly in the form of hardware functions.
  • one or more of the functional blocks may be implemented at an input controller.
  • the functional blocks may be divided between an input controller, the host CPU, and an output controller.
  • FIG. 10 is a diagram depicting a tactile output system (e.g., a haptic device) in the example form of a tactile stimulation apparatus 150, according to some example embodiments.
  • tactile stimulation apparatus 150 may be capable of creating a sensation of touch or pressure to a body member 120 (e.g., a finger) based on the creation of a pulsating Coulomb force, as discussed by way of examples herein.
  • the tactile stimulation apparatus 150 may be in the form of a tactile display device that is capable of displaying graphics as well as creating a sensation of touch to the body member 120.
  • FIG. 10 depicts an example of such a tactile display device in the form of a smart phone having a touch screen panel 160 (e.g., a touch-sensitive screen) that is responsive to touches by the body member 120. That is, touching different portions of the touch screen panel 160 with the body member 120 may cause the smart phone to take various actions.
  • the touch screen panel 160 may create a sensation of touch or pressure to the body member 120.
  • the creation of the touch sensation to the body member 120 may involve the generation of one or more high voltages.
  • a region of the touch screen panel 160 may comprise a semiconducting material that may limit a flow of current to the body member 120. Additionally, the semiconducting material may also be used to reduce the thickness of the touch screen panel 160, as described by way of examples herein.
  • the tactile stimulation apparatus 150 may be in the form of a variety of other apparatus, such as a computer monitor, a television, a handle (e.g., a door handle), a touchpad, a mouse, a keyboard, a switch, a trackball, a joystick, or any suitable combination thereof.
  • FIG. 11 is a schematic diagram illustrating various components of a tactile output system (e.g., a haptic device) in the example form of a tactile stimulation apparatus 1200, according to some example embodiments.
  • a display region 1222 shows information 1226, which is seen by a user through a touch-sensitive region 1262 and a tactile output region 1242.
  • the touch-sensitive region 1262 is scanned by a touch input controller 1240, such that a microprocessor 1204 (e.g., a host CPU), under the control of instructions (e.g., software) stored in and executed from a memory 1206, is aware of the presence or absence of the body member 120 on top of a predefined area 1246.
  • the composite section of the touch-sensitive region 1262 may be completely homogeneous.
  • the predefined areas, such as area 1246, are created dynamically by the microprocessor 1204 under control of the instructions, such that the X and Y coordinates of the body member 120, as it touches the touch-sensitive region 1262, are compared with predefined borders of the predefined area 1246.
  • Reference numeral 1248 denotes a presence-detection logic stored within the memory 1206. Execution of the presence-detection logic 1248 by the microprocessor 1204 may cause the detection of the presence or absence of the body member 120 at the predefined area 1246. It may also cause detection of a location of the body member 120 within the predefined area 1246. A visual cue, such as a name of the function or activity associated with the predefined area 1246, may be displayed by the display region 1222, as part of the displayed information 1226, so as to help the user find the predefined area 1246.
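  • A minimal sketch of such presence detection (C, illustrative names; the border representation is an assumption):

      /* Hypothetical sketch of the presence-detection logic 1248: compare the
       * reported X and Y coordinates of the body member against the dynamically
       * defined borders of the predefined area 1246. */
      struct area {
          int left, top, right, bottom;   /* borders in touch-panel coordinates */
      };

      int body_member_in_area(int x, int y, const struct area *a)
      {
          return x >= a->left && x < a->right &&
                 y >= a->top  && y < a->bottom;
      }

      /* The microprocessor may create such areas dynamically and, on each scan of
       * the touch-sensitive region, call body_member_in_area() to decide whether
       * to vary the tactile output for the predefined area. */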
  • Additionally stored within the memory 1206 may be stimulus-variation logic 1268.
  • Input information to the stimulus-variation logic 1268 may include information on the presence, absence, location, or any suitable combination thereof, of the body member 120 at the predefined area 1246. Based on this information, the stimulus-variation logic 1268 may have the effect that the microprocessor 1204 instructs the tactile output controller 1260 (e.g., a subsystem including a microcontroller or a special-purpose processor) to vary the electrical input to the tactile output region 1242, thus varying the electrosensory sensations caused to the body member 120.
  • Accordingly, the user may detect the presence or absence of the displayed information 1226 at the predefined area 1246 merely by way of tactile information (or electrosensory sensation) and without requiring visual cues.
  • Any of the machines, systems, apparatus, or devices shown or discussed herein may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine.
  • a computer system able to implement any one or more of the methodologies described herein is discussed with respect to FIG. 12.
  • any two or more of the example systems or devices discussed herein may be combined into a single machine, system or device, and the functions described herein for any single machine, system, or device may be subdivided among multiple machines, systems, apparatus, or devices.
  • any one or more of the modules or components described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software.
  • any module or component described herein may configure a processor to perform the operations described herein for that module.
  • any two or more of these modules or components may be combined into a single module or component, and the functions described herein for a single module or component may be subdivided among multiple modules or components.
  • FIG. 12 is a block diagram illustrating components of a machine 900, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein.
  • FIG. 12 shows a diagrammatic representation of the machine 900 in the example form of a computer system and within which instructions 924 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 900 to perform any one or more of the methodologies discussed herein may be executed.
  • the machine 900 operates as a standalone device or may be connected (e.g., networked) to other machines.
  • the machine 900 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.
  • the machine 900 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 924, sequentially or otherwise, that specify actions to be taken by that machine.
  • the machine 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio- frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 904, and a static memory 906, which are configured to communicate with each other via a bus 908.
  • the machine 900 may further include a graphics display 910 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)).
  • the machine 900 may also include an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920.
  • the storage unit 916 includes a machine-readable medium 922 on which is stored the instructions 924 embodying any one or more of the methodologies or functions described herein.
  • the instructions 924 may also reside, completely or at least partially, within the main memory 904, within the processor 902 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 900. Accordingly, the main memory 904 and the processor 902 may be considered as machine-readable media.
  • the instructions 924 may be transmitted or received over a network 926 via the network interface device 920.
  • the term "memory” refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term “machine-readable medium” should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions.
  • the term "machine-readable medium" refers to a single storage apparatus or device, as well as "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices.
  • the term “machine-readable medium” shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
  • FIG. 13-16 are flowcharts illustrating operations of a tactile output system (e.g., a machine, a haptic device, a tactile output device, or any suitable combination thereof), according to various example embodiments.
  • FIG. 13 illustrates operations in a method 1300, according to some example embodiments. Operations of the method 1300 may be performed by one or more of the hardware components shown in FIG. 4-12. As shown in FIG. 13, the method 1300 includes operations 1310, 1320, 1330, and 1340.
  • a processor accesses an input event report that is generated as a result of contact (e.g., a touch) made by a user's finger on a surface (e.g., touch-sensitive surface) of a tactile output system.
  • the processor determines a finger speed, finger direction, finger pressure, or any suitable combination thereof.
  • One or more of such values determined by the processor in operation 1320 may be an actual value, an estimated value (e.g., an estimate of the value), a predicted value (e.g., an estimate of a likely future value), or a weighted value (e.g., weighted as discussed above with respect to FIG. 1-3). That is, performance of operation 1320 by the processor may determine (e.g., calculate) a contact parameter that describes, in whole or in part, the contact made by the finger on the surface of the device.
  • the processor obtains a tactile output template based on the results of operation 1320.
  • obtaining the tactile output template may include generating the tactile output template, adjusting (e.g., modifying) the tactile output template, or both.
  • in operation 1340, a processor (e.g., processor 902), a separate microcontroller, or other tactile output hardware may cause the device to render the tactile output on the surface of the device, based on the tactile output template obtained in operation 1330, to the finger of the user.
  • the method 1300 may include one or more of operations 1422, 1424, 1426, and 1432.
  • One or more of operations 1422, 1424, and 1426 may be performed as part (e.g., a precursor task, a subroutine, or portion) of operation 1320.
  • the processor determines the location (e.g., spatial location on the touch-sensitive surface) of the contact made by the finger on the surface of the device. Accordingly, operation 1320 may be performed based on the location determined in operation 1422.
  • the processor determines the pressure exerted by the finger on the surface of the device based on the area (e.g., a height of the area, width of the area, or circumference of the area) encompassed by the contact made by the finger on the surface of the device. Accordingly, operation 1320 may be performed based on the pressure exerted by the finger, the area of the contact made by the finger, or both.
  • the processor determines the pressure based on a sliding average pressure detected (e.g., measured) over a period of time.
  • operation 1320 may be performed based on this time-averaged pressure of the finger on the surface of the device.
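  • One possible form of such a sliding average is an exponential moving average, sketched below (C, illustrative names; the smoothing factor is an assumption):

      /* Hypothetical sketch of the sliding (time-averaged) pressure estimate: an
       * exponential moving average over successive pressure samples, which here
       * stand in for the contact-area-derived pressure readings. */
      struct pressure_filter {
          double average;
          double alpha;        /* smoothing factor, e.g., 0.2 for roughly 5 samples */
          int initialized;
      };

      double update_pressure(struct pressure_filter *f, double sample)
      {
          if (!f->initialized) {
              f->average = sample;
              f->initialized = 1;
          } else {
              f->average = f->alpha * sample + (1.0 - f->alpha) * f->average;
          }
          return f->average;
      }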
  • Operation 1432 may be performed as part of operation 1330.
  • the processor selects a tactile element (e.g., a texel), and this selection may be made based on the results of operation 1320 (e.g., based on the contact parameter determined in operation 1320).
  • FIG. 15 illustrates operations in a method 1500, according to some example embodiments. Operations of the method 1500 may be performed by one or more of the hardware components shown in FIG. 4-12. As shown in FIG. 15, the method 1500 includes operations 1510, 1520, 1530, 1540, and 1550.
  • a processor accesses an input event report that is generated from contact (e.g., a touch) made by a user's finger on the surface (e.g., touch-sensitive surface) of the tactile output system.
  • the processor calculates a finger speed, finger direction, finger pressure, or any suitable combination thereof.
  • One or more of such values determined by the processor in operation 1520 may be an actual value, an estimated value (e.g., an estimate of the value), a predicted value (e.g., an estimate of a likely future value), or a weighted value (e.g., weighted as discussed above with respect to FIG. 1-3).
  • performance of operation 1520 by the processor may determine (e.g., calculate) a contact parameter that describes, in whole or in part, the contact made by the finger on the surface of the device.
  • the processor detects a state change in a computer program (e.g., executing on the device).
  • the computer program may be controlled by one or more input event reports (e.g., the input event report accessed in operation 1510).
  • the processor may receive an indication that the state of the program has changed as a result of the input event report accessed in operation 1510.
  • the processor obtains an effect description (e.g., by obtaining a tactile output template) based on the results of operation 1530.
  • obtaining the effect description may include generating the effect description, adjusting (e.g., modifying) the effect description, or both.
  • in operation 1550, a processor (e.g., processor 902), a separate microcontroller, or other tactile output hardware may cause the device to render the tactile output on the surface of the device, based on the effect description obtained in operation 1540, to the finger of the user.
  • the method 1500 may include one or more of operations 1622, 1642, and 1644.
  • Operation 1622 may be performed as part (e.g., a precursor task, a subroutine, or portion) of operation 1520.
  • the processor estimates a location (e.g., spatial location) of the contact made by the finger on the surface of the device. Accordingly, operation 1520 may be performed based on the location estimated in operation 1622.
  • One or more of operations 1642 and 1644 may be performed as part of operation 1540.
  • the processor selects a spatial description of tactile output (e.g., a texel map, which may be selected from among multiple texel maps within a hierarchy of texel maps), and this selection may be made based on the results of operation 1520 (e.g., based on the contact parameter determined in operation 1520).
  • operation 1642 may include selection of a texel map based on the results of operation 1520.
  • Various example embodiments of texel map selection are discussed above with respect to FIG. 1-3.
  • the processor selects a locationary piece of a tactile output description (e.g., a tactile element, such as a texel), and this selection may be made based on the results of operation 1520 (e.g., based on the contact parameter determined in operation 1520).
  • operation 1644 may include selection of a texel based on the results of operation 1520.
  • Various example embodiments of texel selection are discussed above with respect to FIG. 1-3
  • Operation 1645 may be performed as part of operation 1644.
  • the processor determines a predicted segment of a predicted path of the contact made by the finger on the surface of the device. For example, the processor may perform location estimation, location prediction, or both, as discussed above with respect to FIG. 1-3. In some example embodiments, operation 1645 is performed based on the estimate of the spatial location of the contact, as estimated in operation 1622.
  • One or more of operations 1646, 1647, and 1648 may be performed as part of operation 1645.
  • the processor estimates a start location of the user's finger (e.g., where it contacts the surface of the device) at the start of a period of time (e.g., time period, such as a sliding window of time).
  • the processor estimates an end location of the user's finger at the end of the period of time.
  • the processor estimates a width of the contact area (e.g., the width of the area of the contact made by the finger, which width may be measured perpendicularly to a line that connects the start location and the end location).
  • the start location, the end location, the width, or any suitable combination thereof may fully or partially define the predicted segment that is determined in operation 1645.
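  • The predicted segment might be represented as in the following sketch (C, illustrative names and units):

      /* Hypothetical sketch of the predicted segment of operations 1646-1648: a
       * start location, an end location one prediction period ahead, and a width
       * taken from the estimated contact size, measured perpendicular to the
       * line connecting the start and end locations. */
      struct predicted_segment {
          double start_x, start_y;   /* estimated finger location at period start */
          double end_x, end_y;       /* estimated finger location at period end */
          double width;              /* contact width perpendicular to the path */
      };

      struct predicted_segment make_segment(double x, double y,
                                            double vx, double vy,   /* estimated velocity */
                                            double period,          /* prediction period */
                                            double contact_width)
      {
          struct predicted_segment s;
          s.start_x = x;
          s.start_y = y;
          s.end_x   = x + vx * period;
          s.end_y   = y + vy * period;
          s.width   = contact_width;
          return s;
      }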
  • Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
  • a "hardware module” is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner.
  • one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
  • a hardware module may be implemented mechanically, electronically, or by any suitable combination thereof.
  • a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations.
  • a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC.
  • a hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations.
  • a hardware module may include software
  • the phrase "hardware module" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • as used herein, "hardware-implemented module" refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time.
  • where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times.
  • Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein.
  • processor-implemented module refers to a hardware module implemented using one or more processors.
  • the methods described herein may be at least partially processor-implemented, a processor being an example of hardware.
  • the operations of a method may be performed by one or more processors or processor-implemented modules.
  • the one or more processors may also operate to support performance of the relevant operations in a "cloud computing" environment or as a "software as a service” (SaaS).
  • at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
  • the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
  • any one or more algorithms discussed herein may be implemented by means of special-purpose electronic hardware in which the one or more algorithms are a symbolic representation of the specific electrical functions that may take place among electrical circuits within, for example, a special-purpose silicon chip or other semiconductor.
  • each symbol in an algorithmic description may be directly mapped to the physical electrical circuits, and possibly vice versa.
  • such a special-purpose chip may implement a special-purpose processor, which may be able to execute binary code (e.g., one or more computer programs) that may be tailored to provide a means for an efficient representation of tactile content.
  • a method comprising:
  • determining (e.g., using a processor or other suitable hardware) a contact parameter based on at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the determining of the contact parameter being based on the received input event report generated as a result of the contact;
  • obtaining a tactile output template based on the determined contact parameter, the tactile output template defining a tactile output that is renderable on the surface of the device to the finger of the user; and
  • causing the device to render the tactile output on the surface of the device, based on the tactile output template, to the finger of the user.
  • the contact parameter is determined based further on a location of the contact by the finger on the surface of the device.
  • the determining of the contact parameter is based on the pressure exerted by the finger on the surface of the device; and the method further comprises determining the pressure as a sliding average pressure exerted by the finger over a period of time.
  • the obtaining of the tactile output template comprises generating the tactile output template based on the determined contact parameter.
  • the obtaining of the tactile output template comprises adjusting the tactile output template based on the determined contact parameter.
  • the obtaining of the tactile output template comprises selecting a tactile element based on the determined contact parameter, the tactile element describing a tactilely perceivable surface feature and corresponding to the tactile output template;
  • the obtaining of the tactile output template is based on its correspondence to the tactile element.
  • the determining of the contact parameter is based on an estimate of at least one of the speed of the finger, the direction of the finger, or the pressure of the finger; and the method further comprises
  • a device comprising:
  • a processor configured to:
  • obtain an effect description based on a state change in a computer program (e.g., a computer program controlled by the input event report) and based on the estimate of the contact parameter (e.g., obtain the effect description in response to detecting the state change in the computer program); and
  • a controller (e.g., a microcontroller) configured to cause the device to render a tactile output on the surface of the device, based on the effect description, to the finger of the user.
  • the processor is configured to obtain the effect description by generating the effect description based on the state change.
  • the processor is configured to obtain the effect description by modifying the effect description based on the estimate of the contact parameter.
  • the processor is configured to estimate a spatial location of the contact by the finger on the surface of the device.
  • the texel map is represented as a two-dimensional array of data elements indexed by spatial locations among which is the estimated spatial location of the contact by the finger.
  • the texel map is represented as a multidimensional object-based data structure in which each of multiple objects assigns a corresponding portion of the tactile output to a spatial location.
  • the texel describes the corresponding portion of the tactile output for the location as a point-like effect description.
  • the point-like effect description is expressed as a waveform.
  • the processor is configured to select the texel by:
  • the processor is configured to determine the predicted segment of the predicted path by:
  • the processor is configured to obtain the effect description by obtaining a tactile output template based on the contact parameter, the tactile output template defining the tactile output that is renderable by the surface of the device to the finger of the user;
  • the microcontroller is configured to cause the surface of the device to render the tactile output based on the tactile output template to the finger of the user.
  • a tangible (e.g., non-transitory) machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
  • accessing an input event report generated as a result of contact made by a finger of a user upon a surface of a device; determining a contact parameter based on at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the determining of the contact parameter being based on the received input event report generated as a result of the contact;
  • obtaining a tactile output template based on the determined contact parameter, the tactile output template defining a tactile output that is renderable on the surface of the device to the finger of the user; and
  • causing the device to render the tactile output on the surface of the device, based on the tactile output template, to the finger of the user.
  • a tangible (e.g., non-transitory) machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

ABSTRACT OF THE DISCLOSURE A tactile output system may be characterized by an ability to provide tactile sensations with high spatial accuracy and flexibility. This may be achieved through combining a number of specific mechanisms in various configurations. The tactile output system may thus include one or more mechanisms, components, or subsystems that are configured to perform any one or more of the following functions: finger movement speed estimation, finger location prediction, finger movement direction estimation, finger pressure estimation, effect adjustment based on finger speed, effect adjustment based on finger movement direction, effect adjustment based on finger pressure, a point-like effect description, effect description based on a texel map, and effect variation based on prioritization, multiple alternating texel maps, texel map data, finger speed, finger direction, finger pressure, or any suitable combination thereof.

Description

TACTILE OUTPUT SYSTEM
TECHNICAL FIELD
[0001] The subject matter disclosed herein generally relates to electronic devices. Specifically, the present disclosure addresses a tactile output system.
BACKGROUND
[0002] A tactile output system may be used to generate tactile stimuli while a user is using a device surface, typically causing a tactile sensation to one or more of the user's fingers. For example, a user may experience a change in sliding friction or a vibrational feeling when touching a capacitive touchscreen to operate a computer program controlled by one or more touch inputs (e.g., touch input events), which may be generated or detected by the capacitive touchscreen. The computer program may affect the tactile stimuli that generate the user's experiences.
[0003] It is an object of the present invention to improve prior art tactile output systems with respect to one or more aspects, such as improved user experience, reduced system latency or jitter of latency, or improved integration with existing computer and operating system platforms. Improvements to user experience may comprise improved spatial accuracy, realism, natural feeling, flexibility, diversity of possible user feedbacks, or the like. This object is achieved with methods and devices as stated in the attached independent claims. The dependent claims and the present description and drawings disclose additional features which provide additional benefits and/or solve residual problems.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Some embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
[0005] FIG. 1 is a conceptual flowchart illustrating sources of information used by a tactile output system, according to some example embodiments.
[0006] FIG. 2 is a conceptual diagram illustrating a hierarchy of texel maps, according to some example embodiments. [0007] FIG. 3 is a set of diagrams illustrating location estimation, according to some example embodiments.
[0008] FIG. 4-9 are block diagrams illustrating components of a tactile output system in the example form of a tablet computer, according to various example embodiments.
[0009] FIG. 10 is a diagram depicting a haptic device in the example form of a tactile stimulation apparatus, according to some example embodiments.
[0010] FIG. 11 is a schematic diagram illustrating various components of a haptic device in the example form of a tactile stimulation apparatus, according to some example embodiments.
[0011] FIG. 12 is a block diagram illustrating components of a machine, according to some example embodiments, able to read instructions from a machine-readable medium and perform any one or more of the methodologies discussed herein.
[0012] FIG. 13-16 are flowcharts illustrating operations of a tactile output system, according to various example embodiments.
DETAILED DESCRIPTION
[0013] Example methods and systems (e.g., apparatus or devices) are directed to tactile output (e.g., haptic output). In particular, some of the example embodiments of these methods and systems may support control of tactile stimuli in such a manner that a large amount of spatial accuracy and flexibility is achieved. Examples merely typify possible variations. Unless explicitly stated otherwise, components and functions are optional and may be combined or subdivided, and operations may vary in sequence or be combined or subdivided. In the following description, for purposes of explanation, numerous specific details are set forth to provide a thorough understanding of example
embodiments. It will be evident to one skilled in the art, however, that the present subject matter may be practiced without these specific details.
[0014] A system (e.g., an apparatus or a device) for generating tactile sensory stimulation may include a few pieces of special-purpose hardware and software, combined together into a system, in a few possible ways. Such a system may be embodied in a device (e.g., electronic device). The software pieces may be organized so that one or more parts of the software runs on a general-purpose processor, under the control of an operating system of the device, while one or more other parts of the software run on a separate microcontroller. This micro-controller, in turn, may control a piece of special-purpose hardware, for example, by controlling a number of output signals, such as general-purpose input/output (GPIO) signals, timer signals, or other signals corresponding to physical pins that alter some electrical aspect of the hardware. The changing of these electrical aspects may alter some physical aspect of the hardware, such as tension in a piezo-electric crystal, a rotation speed of an electrical motor used in a vibrator, or a strength of a Coulomb's force between a surface and a user's finger (e.g., fingertip) in an electrostatic- vibrational tactile system.
[0015] A tactile output system (e.g., apparatus or device) may be able to generate some number or range of differing tactile stimuli. One or more pieces of an application or other software may control what stimuli are generated. The software, hardware, and combinatorial aspects of such a system may determine (e.g., describe, define, limit, or any suitable combination thereof) the number, range, and quality of the stimuli the system is able to generate.
[0016] In an example case, a tactile output system provides an application programming interface (API). Such an API may include one or more data formats and structures and associated classes, methods, or functions. An application or other program may construct (e.g., directly or indirectly) one or more instances of these data structures and call or invoke the API functions or methods, in order to instruct the hardware to generate tactile stimuli.
[0017] When a user slides all or part of one or more fingers over a tactile surface of an input device, the input device may issue a series of touch event reports. Each report may describe one or more measured locations of the touching finger. It may also contain other information, such as an estimate of the finger pressure.
[0018] In some example cases, an application or some other program controls the generation of one or more tactile stimuli (e.g., which kind of tactile stimulus to generate and when to generate it). In other words, an application program (e.g., a computer program executing on the tactile output system) may implement an interaction model that determines what tactile stimuli to generate and when. In such an interaction model, the application may combine information from its internal state and the touch event reports, and may generate one or more application-specific tactile commands that appropriately cause the hardware to generate one or more tactile stimuli.
[0019] For example, a game application may determine (e.g., compute ) that the immediately preceding user input has induced a virtual explosion in the game. Based on that determined (e.g., computed) result, the game application may produce (e.g., generate) one or more tactile commands that cause the hardware to shake the device intensively so as to simulate the effect of an explosion.
[0020] As a result of various software, hardware, and combinatorial aspects of a tactile output system, the number, range, and quality of the produced tactile stimuli may be limited in a number of ways. For example, a vibrator motor start-up latency, caused by the physical inertia and other physical factors, may limit how fast a tactile effect can start after the application program has made its decision and issued a tactile command to initiate the tactile effect. In general, tactile output systems may have limitations on how fast and accurately they can produce tactile stimuli as a response to the changes in the program state and user actions (e.g., in response to state changes in the program as a result of one or more touch input event reports that indicate user actions that were submitted via a touch-sensitive surface).
[0021] One potential problem that may be experienced by tactile output systems is inherent system latency. A touch-based input/output system, such as a tablet computer having a capacitive touch screen and an electrostatic- vibrational tactile touch effect system (e.g., subsystem), may experience unavoidable systemic delays. It may take, for example, on the order of a few tens of milliseconds from the time a finger slides over a specific position on the tactile surface until the command that is generated in response to a
corresponding touch event report is manifested (e.g., actualized) as tactile output.
[0022] For example, a touch input device may take some 10-30 ms to process its measurements and send the generated input event reports to its main central processing unit (CPU). The operating system may take some time to process those reports (e.g., less than 1 ms), and the system-side touch input system (e.g., subsystem) may take some more time to process and filter them (e.g., less than 1 ms, unless the system is very busy with other tasks). In a multitasking system, it may take some more time before the operating system scheduler assigns a processor core to the right application process. After all these events have taken place, the application program may process the touch input events. On the output side, when an application process issues a command to a tactile output system, the command may first be passed to another process, such as a process that serves many processes and decides which process is currently allowed to utilize a particular piece of hardware. As illustrated by this example, the latency in a tactile output system (e.g., haptic system or haptic device) may be characterized as a latency in a haptic system input-output control loop, and such a latency may include three parts: input processing latency, application processing latency, and output rendering latency.
[0023] As used herein, "latency" refers to end-to-end delay in a haptic system input-output control loop. For example, latency may include
contributions from one or more of three factors: touch input processing, application processing, and tactile output rendering.
[0024] As used herein, "latency jitter" refers to variations in the latency between control events that deteriorate tactile output timing and spatial accuracy.
[0025] As used herein, "tactile stimuli" is a generic term used to describe various physical phenomena that may directly or indirectly stimulate sensory nerves in a human being. For example, in an electrostatic system, the stimuli may consist of an electric field that may primarily change the friction between a user's finger and the tactile surface.
[0026] As used herein, a "tactile element" refers to a single "feature" that can be felt at a surface (e.g., a description of the desired tactile properties of a position at a surface). Examples of tactile elements include a fraction of an edge, a protruding point, and an elementary feature that is part of a texture, in which case the total of a few individual elements may form the perceivable texture feeling.
[0027] As used herein, "tactile content" refers to a set of "feelable" tactile elements. For example, tactile contents may include all of the tactile elements used by an application. Tactile content may be described or defined in many substantially different ways (e.g., as different embodiments of the content), such as directly within a program source code in the form of conditional statements and other procedural or imperative programming language constructions, or in a more declarative way, in which case the content may be described through use of suitably expressive data structures and a corresponding declarative programming language.
[0028] As used herein, a "tactile surface" refers to a surface equipped with an input device and tactile output hardware so that a user can slide one or more fingers over the surface and perceive various tactile elements as being there on the surface.
[0029] As used herein, an "input event report" is a piece of data that represents a momentary state at an input device. An input event report may describe one or more locations and one or more contact areas of a user's finger or fingers on a tactile surface.
[0030] As used herein, an "input device" is a subsystem that may include hardware and software for producing input event reports.
[0031] As used herein, an "output signal" is an electrical signal, digital or analog, that is used within the system to directly alter the electrical or physical state of the tactile output hardware.
[0032] As used herein, "physical actualization" refers to a phenomenon when a piece of tactile output hardware changes the physical state of a tactile output system (e.g., as a result of an output signal change), so that the user's perceptions get changed.
[0033] As used herein, a "tactile output system" is a system (e.g., an apparatus, a machine, or device) that generates output signals from some embodiment of tactile content and input event reports.
[0034] As used herein, "spatial accuracy" refers to a quality describing how accurately a tactile element stays at its defined (e.g., desired) position, or moves along its defined (e.g., desired) trajectory (e.g., as a user is trying to track the element on a tactile surface).
[0035] As used herein, "spatial flexibility" refers to a quality describing how easily, uniformly, and fast the system may change the tactile content on a tactile surface.
[0036] As used herein, "grain" refers to a shortest time segment of a tactile output signal. In some example embodiments, an optimal duration may be 20 .. 50 ms, which may correspond to the shortest perceptually distinguishable tactile event, or to a time granularity of tactile output system control.
[0037] As used herein, "slide speed" refers to a velocity of a finger touch sliding on a touch interface surface.
[0038] As used herein, a "texel" is a texture element that is defined for an area on a virtual tactile surface. Slide speed may determine one or more dimensions of a texel. One grain may correspond to (e.g., map to) one texel. For example, at a slide speed of 10 cm/s, a 30 ms grain may map to a 3 mm texel. Accordingly, a texel may describe tactile output to be rendered by the tactile output system at a spatial location (e.g., a locationary piece of a description of tactile output).
[0039] As used herein, a "texel map" refers to a visualization (e.g., a digital representation) of tactile content as texels on a surface (e.g., so that a haptic design can be presented and be aligned with computer graphic pixels) and touch input locations on the surface. Accordingly, a texel map may spatially describe tactile output to be rendered by the tactile output system (e.g., a spatial description of such tactile output). Within a texel map, a texel may describe a corresponding portion of tactile output for a spatial location (e.g., on a surface of a device contacted by a user's finger) as a function of spatial coordinates (e.g., with respect to the surface of the device). In some example embodiments, the texel may describe the corresponding portion of tactile output for the spatial location as a point-like effect description (e.g., the degree to which the portion of the tactile output is perceivable as being pointy or otherwise similar to a pinpoint). Such a point-like effect description may be expressed as a waveform (e.g., applicable to control tactile output hardware in providing an output signal).
[0040] Example embodiments of a tactile output system (e.g., tactile output device or other haptic device) are configured to produce high quality tactile stimuli. In particular, such a tactile output system may be able to process the application state and user input in a way that allows the system to produce tactile stimuli that have very high spatial accuracy and flexibility. Such a system may have no limitations on where on the screen the tactile stimuli can be generated, and such a system may produce them precisely at the location a user would expect them to be felt based on typical real-life sensory experiences. For example, an example tactile output system may be able to generate accurate contours and edges on a touch-input based tablet computer.
[0041] Example embodiments of the tactile output system may include a mechanical system (e.g., a subsystem) that is configured to cause the tactile surface to physically morph or vibrate. In the case of morphing, the specific areas that are able to morph may be spatially constrained, which may result in a corresponding physical limitation in spatial accuracy. Physical vibration, on the other hand, may limit both the range of effects and the places where the effect may be felt well. Moreover, the tactile output system may be able to produce tactile stimuli flexibly, or equally at all locations within the tactile surface, with or without limitations in its ability to provide high spatial accuracy. In some example embodiments, a processing latency is inherent to the system, which may cause the effects to be felt post facto. This, in turn, may limit the spatial accuracy. In certain example embodiments, a texel map may be utilized as a design aid, with various attendant benefits arising therefrom.
[0042] The tactile output system may be characterized by an ability to provide tactile sensations with high spatial accuracy and flexibility. This may be achieved through combining a number of specific mechanisms in various configurations. Although the discussion herein focuses on specific example embodiments of such a tactile output system, it should be appreciated that one skilled in the art may define other configurations of the herein described system, mechanisms, components, and subsystems.
[0043] According to various example embodiments, the tactile output system may include one or more mechanisms (e.g., components or subsystems) configured to perform any one or more of the following functions: finger movement speed estimation, finger location prediction, finger movement direction estimation, finger pressure estimation, effect adjustment based on (e.g., according to) finger speed, effect adjustment based on finger movement direction, effect adjustment based on finger pressure, a point-like effect description (e.g., with data description compensated based on speed, pressure, direction, or any suitable combination thereof), effect description based on a spatial map (e.g., texel map), and effect variation (e.g., based on prioritization, multiple alternating texel maps, texel map data, finger speed, finger direction, finger pressure, or any suitable combination thereof).

[0044] Regarding finger speed estimation, the tactile output system (e.g., via a suitably configured component thereof) may estimate the speed of a finger through any approximation technique. It may be beneficial to adjust the algorithm used based on a previous estimate. For example, if the previous estimate indicates that a finger has been stationary, the system may use an algorithm that produces good overall tactile stimuli results for slow speeds. On the other hand, if the previous estimate indicates that the finger has been moving at a high speed (e.g., as in a "fling" or "flick" gesture), the system may instead use another algorithm, such as one that is experimentally found to produce good overall tactile stimuli at high speeds.
[0045] As another example, the system may use a relatively accurate speed estimate at slow speeds, but may use a more "conservative" algorithm at higher speeds, due to the ill effects overcompensation may have, as discussed below.
[0046] Such finger speed estimation may be beneficial in a tactile output system for at least the reason that the system may use such an estimate to adjust the tactile stimuli produced by the tactile output system. It may also be beneficial indirectly, such as by allowing the system to perform further estimation and adjustments.
[0047] Regarding finger pressure estimation, the input device may provide some value that describes how hard or lightly the user's finger is pressing on the screen (e.g., a touchscreen). Such a value may be usable for affecting tactile output, as such, or it may be beneficial to compute a further estimate from one or more values provided by the input device. In some example embodiments, using a sliding average may give a finger pressure estimate that is more suitable for affecting tactile output than using the value directly.
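For illustration only, the following C++ sketch shows one way such a sliding average could be computed over raw pressure values from successive input event reports; the class name, window length, and raw pressure scale are illustrative assumptions rather than part of the described system.

#include <cstddef>
#include <deque>
#include <numeric>

// Illustrative sliding-average pressure estimator. The window length and the
// meaning of the raw pressure value are assumptions for this sketch.
class PressureEstimator {
public:
    explicit PressureEstimator(std::size_t window = 8) : window_(window) {}

    // Feed the raw pressure value from the latest input event report and
    // return the smoothed estimate used for affecting tactile output.
    double update(double rawPressure) {
        samples_.push_back(rawPressure);
        if (samples_.size() > window_) samples_.pop_front();
        return std::accumulate(samples_.begin(), samples_.end(), 0.0) /
               static_cast<double>(samples_.size());
    }

private:
    std::size_t window_;
    std::deque<double> samples_;
};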
[0048] Regarding finger location estimation, the tactile output system may be configured to compensate for one or more inherent system latencies. For example, by the time an application or other program component processes any touch event reports, a moving finger may have already moved further from the reported location. Furthermore, by the time any tactile stimulus gets physically actualized, the finger may have moved some distance further. To compensate for this, the tactile output system may estimate a predicted position of the finger at a future time when any set of tactile stimuli will get physically actualized. Such an estimation may be based, for example, on one or more of the following: a set of previous touch event reports, a "current" (or latest) touch event report, one or more empirical results describing the overall input latencies from a finger crossing a point to the time the touch event report reaches the processing piece of program code, one or more empirical results describing the output latencies from the processing piece of program code generating tactile output expressions to the hardware to the time the effect is physically actualized, a degree of variation or jitter on such measurements, and a degree of dependency that the variation or jitter may have on finger location, finger speed, finger pressure, tactile content, or any suitable combination thereof.
[0049] For example, using such information, the system may determine that the finger was most probably sliding at the speed of 2 cm/s over the surface and that the overall system latency is in the order of 30 ms. In such a case, the finger position is likely to have moved 30 ms * 2 cm/s = 0.6 mm from the latest reported location by the time the tactile output gets physically actualized.
[0050] The finger location may be estimated through one or more methodologies. For example, a speed estimate, a movement direction estimate, system latency, and the latest measured location may be used to compute a location estimate. However, some example embodiments of the system may use a more complex procedure, and the algorithm used may vary, for example, based on empirical testing results. For example, the tactile output system may slightly underestimate (e.g., "deliberately") how much the finger has advanced at high speeds, in order to damp down errors stemming from measurement jitter.
[0051] Having such a location estimate available may be beneficial for a tactile output system. For example, if a finger is moving at high speed over an edge or other feelable surface feature, without location estimation the user may perceive the feature to move on the screen depending on the movement speed and direction of the finger.
[0052] Finger movement direction estimation may be performed, in various example embodiments, in a manner similar to estimation of finger speed. Specifically, the tactile output system may estimate the direction of the movement a finger is taking (e.g., a movement direction). In some example embodiments, the finger speed estimation algorithm represents the finger speed as a vector, thus indicating not only the scalar velocity but also the direction of the finger. As noted above, it may be beneficial to use differing algorithms at different situations. For example, at very slow speeds it may be beneficial to consider the direction as fuzzy (e.g., to compensate for errors in the algorithm). At high speeds, the algorithm may refuse to change the direction abruptly (e.g., even if the measurements would indicate so), since such an abrupt change may be deemed more likely to have been caused by finger location measurement errors rather than real direction changes.
[0053] Regarding effect adjustment based on finger speed, in some example embodiments of the tactile output system, an actual physical effect that a user can feel is produced by the surface itself morphing physically, thereby forming protrusions, inclusions, indentations, or any suitable combination thereof. In such cases, the sensory effect caused by the tactile formation feels natural per se, as the sensory nerves of the user's finger or hand get excited by a physical formation. In other example embodiments of the tactile output system, one or more sensory effects are caused by some other mechanism yet still involve real physical formations. For example, a piezo-electrical system (e.g., subsystem or component) may create a standing vibration at the point of a finger, and hence create an illusion of the surface reclining or ascending by quickly shifting the standing waveform at the finger position. As another example, an electrostatic-vibration system (e.g., subsystem or component) may rapidly change the electrical field between the finger and the surface. In this latter case, if the finger moves while the tactile stimuli are applied, the resulting sensory effect may not feel natural in that the surface may appear to have a different feeling at different speeds. This may be due to the used physical system not generating actual physical formations on the surface, but causing an illusion of such formations through some other means. In such a situation, the vibration, field, or other means of generating sensory effects may merely be contributing to the desired sensory feeling. The actual feeling may be formed in the brain of the user, for example, based on the nerve signals coming from the finger and the expectations in the user's brain. The expectations, in turn, may depend on the user's previous experiences, as well as other sensory input, such as any picture visible on the touched surface (e.g., a generated image), one or more sounds, and feelings of temperature, pain, and other sensory cells that may or may not get excited by the used tactile output method.

[0054] In such a situation, simply actualizing the same tactile output whenever the finger passes a given location on the surface may produce an unnatural feeling. For example, the user may perceive that an edge gets sharper or duller at differing speeds of the finger. Consequently, creating an illusion of a natural-feeling surface may involve generating differing physical tactile stimuli depending on the finger speed. For example, it may be beneficial to reduce the amplitude of tactile stimuli when the finger is moving at higher speeds. As another example, in an electrostatic-vibration system, such reduction of the amplitude may decrease the friction between the finger and the surface, thereby producing a feeling more similar to natural friction, which generally depends on the sliding speed. Such a variation may also be beneficial in making a single physical surface appear as different materials, since the way natural friction behaves at differing sliding speeds may depend on the material.
[0055] Another example of effect adjustment based on finger speed is compensating for friction phenomena at very low sliding speeds. In this case, it may be beneficial to reduce the tactile effect intensity, or otherwise modify the tactile output waveform to better imitate the behavior of a finger touching real surfaces.
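By way of a non-limiting sketch, the following C++ function maps an estimated slide speed to an amplitude scale factor, attenuating the effect both at very low and at very high speeds; all breakpoints and scale values are illustrative assumptions, not values taken from the described system.

#include <algorithm>

// Illustrative speed-to-amplitude mapping: attenuate at very low speeds
// (friction imitation) and at high speeds, with full amplitude in a nominal
// mid range. All numeric breakpoints are example values only.
double amplitudeScaleForSpeed(double speedMmPerS) {
    if (speedMmPerS < 5.0)                     // very slow sliding
        return 0.5 + 0.1 * speedMmPerS;        // ramps from 0.5 up to 1.0
    if (speedMmPerS <= 100.0)                  // nominal sliding speeds
        return 1.0;
    return std::max(0.4, 1.0 - (speedMmPerS - 100.0) / 400.0);  // taper off
}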
[0056] Regarding effect adjustment based on finger movement direction, some surface formations may cause different sensory effects, depending on the angle and direction between the formation and a sliding finger. For example, when a finger passes over an edge at an "ascending" direction, the feeling is different from that experienced when the finger is passing over the same edge at the "descending" direction. Furthermore, if the finger slides along the edge, the feeling may be different still. On more complex directional surfaces, such as on striped velvet, the feeling may depend on the direction in more complex ways, or may depend on some combination of direction, pressure, and speed of the movement. Hence, it may be beneficial for certain example embodiments of the tactile output system to be configured to cause differing physical stimuli to be generated depending on the finger direction, the finger pressure, or both. Such an ability may help the system to more accurately simulate differing textures.
[0057] In addition to imitating static surfaces, illusions of feeling moving objects may be created by using information of finger movement with respect to a virtual object movement direction. In a system that adjusts an effect with this strategy, the user may recognize a moving object by successive finger explorations. For example, the moving object may be recognized by the user based on a different sensation when the finger is moving against the object's motion (e.g., direction of movement) compared to when the finger is following the object's motion.
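As a sketch of this strategy, the following C++ fragment compares the estimated finger velocity with the virtual object's velocity and selects a different effect gain when the finger moves against the object's motion than when it follows it; the gain values and names are illustrative assumptions.

#include <cmath>

struct Vec2 { double x, y; };

// Sketch: stronger effect when the finger moves against the virtual object's
// motion, weaker when it follows it (gain values are illustrative).
double effectGainForMovingObject(const Vec2& fingerVel, const Vec2& objectVel) {
    double dot  = fingerVel.x * objectVel.x + fingerVel.y * objectVel.y;
    double fmag = std::hypot(fingerVel.x, fingerVel.y);
    double omag = std::hypot(objectVel.x, objectVel.y);
    if (fmag < 1e-6 || omag < 1e-6) return 1.0;     // stationary: no adjustment
    double cosAngle = dot / (fmag * omag);
    return cosAngle < 0.0 ? 1.5 : 0.7;              // against vs. along the motion
}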
[0058] Regarding a point-like effect description (e.g., with data description compensated based on speed, pressure, direction, or any suitable combination thereof), it may be beneficial to describe or define a single spatially-bound tactile element, a texel, in a way that allows the tactile output system to adjust the physical tactile stimulus generated (e.g., while the finger slides over the tactile element, or stays over the tactile element) in a way that depends on the finger speed and direction. According to various example embodiments, the system may estimate the finger speed, pressure, direction, and latency-adjusted location, as described above. Based on such estimates, the system may adjust the generated tactile stimulus so that it would cause a sensory feeling that closely matches with the desired tactile experience.
[0059] For example, to simulate a sharp protrusion with a texel, the system may produce different tactile stimuli compared to a situation where a similar-sized protrusion is a dull one. To implement such an ability, the system may describe each spatial feature (e.g., texel) as a data structure that describes differing effects at different directions with respect to the texel. For example, the system may differentiate between 9 different directions (e.g., 4 main directions, 4 intermediate directions, and 1 for a stationary finger), or it may differentiate between 17 differing directions.
[0060] Each directional effect may be further described in a way that allows the system to react in different ways to different sliding speeds, where the various different sliding speeds include no sliding at all (e.g., the stationary finger case). For example, the effect may be described as a "parameterized" function (e.g., a function dependent on one or more adjustable parameters) that reduces or increases the strength of an electrostatic field as the sliding finger achieves higher velocities, as the finger pressure changes (e.g., increases or decreases), or both. As another example, the frequency of the waveform used in such electrostatic fields may be changed according to the speed, pressure, or both.

[0061] Regarding effect description based on a spatial map (e.g., a texel map), it may be beneficial to describe or define a surface (e.g., of the tactile output system) as a two-dimensional array of tactile elements (e.g., when describing or defining tactile surfaces). Such a two-dimensional array may be considered as a texel map (e.g., a map of texture elements). Moreover, the texel map may be defined by one or more map functions that describe the desired feeling at each surface location. A texel map may be used as an implementation description, as a visualization of tactile content (e.g., for design purposes), or both. In some example embodiments, the texel map combines the spatial and temporal aspects of haptics interaction design.
[0062] A tactile (e.g., haptic) output system may determine what physical stimuli to actuate based on information of a finger sliding over one or more of the texels in such a texel map. Since any one or more texels may be spatially smaller than a fingertip, the system may need to determine (e.g., choose or identify) a texel from a number of texels, or combine a number of texels, in order to simulate the desired tactile feeling. For example, a single texel may cover a 1 square millimeter area of the tactile surface, while the area of the finger touching the screen may cover a 25 square millimeter oval area. Hence, the tactile output system may be configured to work with (e.g., analyze or monitor) a 25 square millimeter cross sectional area as the finger slides over the surface, and may consider up to a few tens of texels at each point in time.
[0063] Regarding texels, each texel data structure may contain information for describing the depth of the texel, the normal of the texel, or both. The depth of the texel may represent whether the texel is protruding or reclining in relation to one or more neighboring texels. The normal of the texel may represent the direction of the rectangular surface element that the texel represents in relation to the enclosing surface. The depth information and normal information may be represented in a format that is commonly used for describing such information in computer programs (e.g., in a depth map or normal map, such as those used in computer graphics).
[0064] Regarding effect variation (e.g., with prioritization or multiple alternating texel maps), the actual hardware used for generating tactile stimuli may take a relatively long amount of time to produce a meaningful tactile effect (e.g., 10 or 50 ms). In addition, when a finger slides over the surface at a relatively fast speed, such as 10 cm/s, it may slide a distance of 5 mm during a 50 ms effect duration. If the tactile surface is described by a texel map (e.g., an effect map), the finger may proceed over, for example, 5 texels before the system has a chance to decide which effect to output next.
[0065] In such a situation, it may be beneficial to associate a priority with each texel or a different priority for each direction described by the texel. In such a case, the system may determine the texels that a finger is likely to slide over during the next estimated time period and select the most prominent texel based on (e.g., with the help of) such priorities. The algorithm used by the system may also determine whether a texel is likely to be in the middle of the sliding finger or at an edge of it, and adjust the priorities accordingly based on this determination. For example, it may adjust up the priority of the texels that are likely to be in the middle of the sliding finger while adjusting down the priority of the texels at the edge of the finger.
[0066] An alternative method may be to store several texel maps at differing resolutions or granularities (e.g., levels of granularity). In such a case, the system may be configured to always select a texel map that best matches with the current finger speed, so that the finger always and only slides from a texel to a neighboring texel. Hence, at slow finger speeds, the system may use a more finely described texel map where each texel is described, for example, as a 1 mm x 1 mm area. At some higher speed, the system may use a texel map where each texel describes the most desired effect for a 5 mm x 5 mm area when the finger is sliding with the speed corresponding to this resolution or granularity.
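A minimal C++ sketch of such a resolution selection is given below, picking the finest texel map whose texel size still covers the distance the finger travels during one grain; the structure and field names are illustrative assumptions.

#include <vector>

// One level of a hypothetical texel map hierarchy; texelSizeMm is the edge
// length of a texel at that level (e.g., 1 mm, 2 mm, 4 mm, ...).
struct TexelMapLevel { double texelSizeMm; /* ... texel data ... */ };

// Pick the finest level whose texel size still covers the distance the finger
// travels during one grain, so that the finger moves at most from one texel to
// an adjacent one. Assumes a non-empty hierarchy sorted from fine to coarse.
const TexelMapLevel& selectLevel(const std::vector<TexelMapLevel>& levels,
                                 double speedMmPerS, double grainMs) {
    double stepMm = speedMmPerS * grainMs / 1000.0;   // distance covered per grain
    for (const TexelMapLevel& level : levels) {
        if (level.texelSizeMm >= stepMm) return level;
    }
    return levels.back();                             // fall back to the coarsest map
}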
[0067] A system may store several texel maps at a given granularity at differing offsets, for example, so that a second texel map has a texel corner located at the middle point of the other texel map. This may help the system to quickly adjust to higher or slower speeds as the finger speed changes.
[0068] The tactile output system may have an overall system organization (e.g., system architecture). Example embodiments of such a system organization are described herein. Alternative system organizations may be apparent to one skilled in the art. For example, additional example embodiments may be obtained by leaving out some of the described components or adding more components. FIG. 1 illustrates an example embodiment of the overall organization, showing various information flows.
[0069] As shown in FIG. 1, there may be two basic sources of information used in determining (e.g., computing) what tactile stimuli need to be generated at any given point in time. The first source is some embodiment of the information that describes the desired effect "landscape". Such effect landscape information may be defined in various different ways. For example, such information may be dispersed over an application program in the form of a conditional statement or other program code. In particular, some or all of the effect landscape
information may be stored declaratively (e.g., in texel maps), as described above.
[0070] This first source of information may be relatively "static", in the sense that it may be created by developers of the system or application (e.g., program) to be executed thereon and it does not change over the lifetime of a program version. However, it may also be "dynamic" in the sense that it may be computed (e.g., in real-time or near real-time). For example, an application program may create a texel map from an image downloaded from the network by, for example, using an edge detection algorithm that allows the application to make the edges within the image feelable. As another example, an application that creates 3-D graphics may also create a corresponding dynamic texel map. In some example embodiments, implementation of dynamic texel maps involves using shared memory between the application process and the other components of the system.
[0071] The second source of information may be the input event reports. These may be created dynamically (e.g., during the system runtime), and they may describe the user interactions with the tactile surface. From a system design perspective, these input event reports may be expected to be unpredictable, but have some predictable features when there is a sliding touch contact detected.
[0072] One function of the system may be to access the two above- described sources of information and compute a stream of output signals that cause the hardware to generate the desired physical phenomena. This allows the user to perceive the surface as having highly accurate tactile elements that can be quickly changed under program control. For example, if a user action causes the system to change its internal state, such as showing a new picture, the internal state change may be able to cause an instant change in the tactile content, where "instant" means faster than the human perception limit.
[0073] In FIG. 1, the system is shown as being configured to perform a few processing steps. The tactile event reports are processed first, producing a speed estimate, a direction estimate, a pressure estimate, a location estimate, or any suitable combination thereof. The speed and direction estimates may be represented as a momentary velocity vector, describing the estimated speed and direction of the finger. The location estimate may be represented as an <x, y> coordinate position, describing the estimated position of the finger on the tactile surface at the time the to-be-issued tactile stimulus will get physically actualized. The pressure estimate may be represented as an integer that describes a relative pressure.
[0074] The speed estimate, direction estimate, pressure estimate, location estimate, or any combination thereof, may then be used by the system to select a tactile element that the system will actualize. This selection may be based on one or more tactile maps (e.g. texel maps) representing the desired tactile elements at each point on the tactile surface. The selection process may store and utilize one or more previous finger positions in performing the selection. The input event processing steps may correct such stored previous estimates based on actual measured values.
[0075] Once the desired tactile element has been selected, the system may further refine and adjust the desired effect, for example, by selecting the desired piece of tactile content based on the estimated direction of the finger. The content may be further refined, for example, by adjusting it based on (e.g., according to) the estimated finger speed, estimated finger pressure, or both. This may involve changing the duration, the frequency, or one or more other aspects of the output signals.
[0076] The adjusted piece of tactile content may then be converted into a set of temporally changing output signals. There may be one or more parallel signals, separately or jointly controlling different hardware aspects. For example, in an electrostatic system, one signal may control when a high-voltage electric charge is passed to an effect electrode (e.g., a tixel) and another signal may control when a high-voltage charge is released from the effect electrode.

[0077] Regarding input event report acquisition, as an example, consider a tablet computer that is running the Android operating system. For such a system, the input event reports may be represented in at least three different formats, according to some example embodiments. Firstly, the underlying Linux kernel may represent a touch input event as a sequence of primary Linux input subsystem events. Such events are available, for example, through the Linux /dev/input/eventN device-driver pseudo-file. Secondly, the Android Window Manager may read the Linux input events and store them in their internal representations. Thirdly, when the Window Manager further passes the events to the Android Java runtime environment, the events may be converted to a Java MotionEvent format.
[0078] It may be beneficial to acquire the input event reports through the Android Window Manager in their internal representations. This approach may avoid the step of parsing the sequence of raw Linux input events into a touch event, as the Window Manager does that already. Rather, the approach would acquire the events before they get queued or filtered. This may have the effect of reducing the overall latency of the tactile output system.
[0079] Alternatively, it may be beneficial to acquire the input event reports from the Linux input subsystem, or even from the input device driver. As an example, in an Android operating system, an underlying Linux kernel may represent input event reports in an internal data representation (e.g., internal data structure). In some example embodiments, it may be beneficial to modify or extend the Linux kernel source and object code, so that the modified or extended code is able to produce copies of the internally represented input event reports, or parts of such reports, and pass them onward for further processing. In other example embodiments, it may be beneficial to modify or extend an input device driver to copy information from the driver- internal input event reports and pass the information onward for further processing.
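As a minimal illustration of this lower-level acquisition path, the C++ fragment below reads raw multitouch events directly from a Linux input device node; the device path is an example only, and the integration with the rest of the tactile output system is not shown.

#include <fcntl.h>
#include <linux/input.h>
#include <unistd.h>
#include <cstdio>

// Minimal reader for raw Linux input subsystem events (device path is an
// example; error handling and system integration are omitted).
int main() {
    int fd = open("/dev/input/event2", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }
    struct input_event ev;
    while (read(fd, &ev, sizeof ev) == sizeof ev) {
        if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_X)
            std::printf("x = %d\n", ev.value);
        else if (ev.type == EV_ABS && ev.code == ABS_MT_POSITION_Y)
            std::printf("y = %d\n", ev.value);
        else if (ev.type == EV_SYN && ev.code == SYN_REPORT)
            std::printf("-- touch event report complete --\n");
    }
    close(fd);
    return 0;
}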
[0080] In the Android Window Manager, the input event reports may be acquired from the Window Manager in different ways. As an example, the system may implement a new Android system service that runs in the same process space as the Window Manager in the Android system_server process. Such a new system service may acquire the actual events by modifying the virtual table of the Android InputDispatcher C++ class so that when any Android C++ component invokes the Android InputDispatcher::notifyMotion(...) method, a method belonging to the tactile output subsystem is called instead. The called method may, in turn, pass a copy of any input event to the rest of the tactile output system and then pass program control to the original
InputDispatcher::notifyMotion(...) method.
[0081] In an Android system, the system server process may be started in the following way. First, when the underlying Linux operating system is booted, it may start the Android init process at the end of its internal boot process. The Android init process may read a number of configuration files, including the init.rc file. Among other things, the init.rc file defines a number of Linux shell environment variables, including for example the PATH and
LD_LIBRARY_PATH environment variables. In some example embodiments, the init.rc file also defines how the init starts the Android zygote process, which in turn may start the system_server process.
[0082] In some example embodiments, the new Android system service may be dynamically loaded to the Android system_server process without any modifications to the Android source code. This may be performed, for example, using the following method. First, a new environment variable LD_PRELOAD may be added to the Linux shell environment as set up by the init process. The LD_PRELOAD value may define a new shared library. As a consequence of this, the Android Bionic dynamic loader may load the new shared library to any process that init launches, including the zygote and the system_server processes.
[0083] When the new shared library is loaded using the LD_PRELOAD environment variable, thereby becoming a preloaded library, it may intercept any C function or C++ method defined in any of the libraries that get loaded after the preloaded library. When such an intercepted function or method is called or invoked from within an application or library that does not define the function or method itself and that has not been prelinked with the symbol defined, the function defined in the preloaded library may be called instead of the function defined in any of the later loaded libraries. However, if the function or method is called from within a library or application that defines the function itself, or from a library that has been prelinked with the function or method defined, the original function may get called.

[0084] During the initialization of the system service process in an Android system, the Android SensorService C++ class constructor may get called by the Android libsystem_service.so shared library. While the libsystem_service.so library may be prelinked to Android, it might not be prelinked against the libsensorservice.so shared library that defines the SensorService constructor.
[0085] In some example embodiments, the preloaded library may intercept the Android SensorService constructor. As a consequence, when the
libsystem_service.so shared library calls the SensorService constructor, a new pseudo-constructor defined in the preloaded library may be called. The pseudo-constructor may then initialize some other parts of the preloaded library. For example, the pseudo-constructor may instantiate a new system service. The pseudo-constructor may then pass the program control to the original
SensorService constructor as defined in the libsensorservice.so shared library.
[0086] Instead of or in addition to intercepting the SensorService constructor, the preloaded library may intercept some other Android method, function, or constructor, or define a C++ static constructor. This may cause a program entry point within the shared library to be called. Such a method may be used to initialize the preloaded library. In some example embodiments, when the preloaded library is initialized, it may modify the Android InputDispatcher class virtual table, as described above, or any other Android C++ class virtual table. According to various example embodiments, the above-described dynamic loading of a new shared library may also be implemented by some other means, for example, through modifying the LD_LIBRARY_PATH environment variable.
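The interposition mechanism itself can be sketched, for a plain C symbol, as follows; the intercepted function name is hypothetical, and the virtual table modification described above for C++ classes is not reproduced here. The library would be built as a shared object (linked with -ldl) and listed in LD_PRELOAD.

#ifndef _GNU_SOURCE
#define _GNU_SOURCE            // for RTLD_NEXT
#endif
#include <dlfcn.h>
#include <cstdio>

// Calls to the (hypothetical) symbol below from later-loaded, non-prelinked
// libraries are routed through this definition first.
extern "C" int target_function(int value) {
    using Fn = int (*)(int);
    // Chain to the next definition of the same symbol (the "original" one).
    static Fn original = reinterpret_cast<Fn>(dlsym(RTLD_NEXT, "target_function"));
    std::fprintf(stderr, "intercepted target_function(%d)\n", value);
    return original ? original(value) : value;
}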
[0087] Regarding speed and direction estimation, a momentary speed estimate may be computed by calculating the Cartesian distance between a previously acquired input event and a currently acquired input event, and dividing the distance by the time that has passed between the events, as computed from the event timestamps. To smooth measurement errors, the reported speed estimate may be a cumulative or exponential moving average over a number of momentary speed estimates.
[0088] A momentary direction estimate may be computed by calculating the difference between the x and y coordinates of the previously acquired input event and the currently acquired input event, and calculating the "atan2" function over the differences. To smooth measurement errors, the reported radian value may be smoothed with a cumulative or exponential moving average. The resulting radian value may be mapped to a set of discrete directions through comparing the computed radian value with a set of fixed radian values.
[0089] In some example embodiments, the momentary direction may be represented as a pair of <x, y> coordinates. These coordinates may be scaled to represent the x and y distance taken if the finger is moving on the represented direction over some unit of time (e.g., 10 ms). According to various example embodiments, the actual x and y values might play no role in this representation, with only their ratio contributing to the momentary direction. For example, if the finger is moving to the right, the direction may be represented as <10, 0>, and if the finger is moving at the angle of π/4 or 45° towards the upper right corner, the direction may be represented as <10, 10>.
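A compact C++ sketch of the estimation described in paragraphs [0087] to [0089] is shown below; the smoothing factor, units, and eight-way quantization are illustrative choices, and wraparound of the smoothed angle is ignored for brevity.

#include <cmath>

struct TouchSample { double x, y, timestampMs; };

// Illustrative momentary speed/direction estimator with exponential smoothing;
// angle wraparound at +/- pi is not handled in this sketch.
class MotionEstimator {
public:
    void update(const TouchSample& prev, const TouchSample& cur) {
        double dx = cur.x - prev.x;
        double dy = cur.y - prev.y;
        double dt = cur.timestampMs - prev.timestampMs;
        if (dt <= 0.0) return;
        double momentarySpeed = std::hypot(dx, dy) / dt;   // e.g., pixels/ms
        double momentaryAngle = std::atan2(dy, dx);        // radians
        speed_ = alpha_ * momentarySpeed + (1.0 - alpha_) * speed_;
        angle_ = alpha_ * momentaryAngle + (1.0 - alpha_) * angle_;
    }
    double speed() const { return speed_; }
    // Map the smoothed radian value onto one of eight discrete directions.
    int discreteDirection() const {
        double sector = std::fmod(angle_ + 2.0 * kPi, 2.0 * kPi) / (kPi / 4.0);
        return static_cast<int>(sector + 0.5) % 8;
    }
private:
    static constexpr double kPi = 3.141592653589793;
    double alpha_ = 0.3;   // smoothing factor (illustrative)
    double speed_ = 0.0;
    double angle_ = 0.0;   // smoothed direction in radians
};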
[0090] Regarding pressure estimation, an input event report may contain data that represents how hard the user's finger is pressing on the tactile surface, how large an area is touched by the finger on the surface, or other information that may be used to estimate how hard the user is currently pressing the finger against the tactile surface. The actual finger pressure may be estimated from such information, for example, using a heuristic algorithm. Such a heuristic algorithm may be calibrated, for example, using an application program that guides the user to touch the tactile surface at different pressures.
[0091] Regarding location estimation, a forthcoming location of the finger at the estimated time of effect actualization may be based on computing the estimated location from a fixed, experimentally derived system latency estimate, the location reported in the input event, the speed and direction estimates, or any suitable combination thereof. An example procedure may be described as follows.
[0092] First, the speed estimate may be multiplied by the latency estimate, giving an estimate of the distance that the finger will have taken between the time when the finger was on the position reported in the input event and the time the effect gets actualized. If an <x, y> representation is used for the direction, the x and y direction values may now be scaled by the distance estimate. For example, if the latency is estimated to be 30 ms, the finger speed is 0.4 pixels/ms, and the direction is <10, -10>, the finger may be estimated to have moved 12 pixels within the 30 millisecond period, and the resulting <x, y> delta estimate may be approximately <8.5, -8.5>.
[0093] Secondly, the delta estimate may be directly added to the <x, y> position from the last input event report, giving the location estimate. Since the effect takes a while to execute, the location prediction may be enhanced by outputting two points, from x1, y1 to x2, y2, thus defining the area of the texel map under consideration.
[0094] Theoretically, the predicted locations can be expressed as:
location1 = reported touch location + input latency + output latency, and
location2 = location1 + effect period.
[0095] Some example embodiments of the system may use the following definitions:
location1 = previous location2, and
location2 = reported touch location + 30 ms.
[0096] Some example embodiments of the tactile output system ensure that significant details are always played. Simply using estimated values for the locations could result in one or more details being missed. Accordingly, some example embodiments of the system may use the "previous location2" as the starting point for location prediction.
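The following C++ sketch puts the above definitions together: the direction vector is scaled by speed multiplied by latency, added to the reported position, and two points are emitted to bound the texel map segment under consideration; the names and the use of pixels per millisecond are illustrative.

#include <cmath>

struct Point { double x, y; };
struct PredictedSegment { Point location1, location2; };

// Sketch of latency-compensated location prediction (cf. [0091]-[0096]).
// Only the ratio of the direction components matters, as described above.
PredictedSegment predictSegment(Point reported, Point direction,
                                double speedPxPerMs, double latencyMs,
                                Point previousLocation2) {
    double distance = speedPxPerMs * latencyMs;        // e.g., 0.4 px/ms * 30 ms
    double mag = std::hypot(direction.x, direction.y);
    Point delta{0.0, 0.0};
    if (mag > 1e-9) {
        delta.x = direction.x / mag * distance;
        delta.y = direction.y / mag * distance;
    }
    PredictedSegment seg;
    seg.location1 = previousLocation2;                 // start where the last segment ended
    seg.location2 = Point{reported.x + delta.x, reported.y + delta.y};
    return seg;
}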
[0097] Haptic content (e.g., tactile content) may be represented in a number of ways by the tactile output system (e.g., haptic device or haptic apparatus). In some example embodiments, the tactile content is represented in the form of texel maps, or data structures consisting of or describing positional cells, with each cell defining a tactile element (e.g., a texel) that describes the desired tactile properties of a position at the tactile surface. For example, a texel map may be or include a two-dimensional array wherein each cell, at an <x, y> coordinate, contains a small integer number, such as a value between zero and 255. The integer number may be used as an index to another array (e.g., a texel palette) in which each array element describes the texel properties.
[0098] In some example embodiments, each texel in the texel map may contain further values, such as the relative "depth" of the texel in relation to the one or more nearby texels (e.g., adjacent texels or texels less than five texels away), the relative "slope" of the texel (e.g., as used in so-called "normal maps" used when rendering computer graphics), or both. For example, the depth may be represented as a small integer, and the normal may be represented by the <x, y> coordinates of the slope normal (e.g., in the form of a pair of small integers).
[0099] A texel palette element may be or include a short, fixed length array of small integers, such as an array of 9 or 17 numbers between zero and 255. Each element of the array may be associated with a different finger direction, for example with element zero being associated with a positional, non-moving finger, element one being associated with a finger moving on an angle between π/8 and -π/8, element two with angles between 2π/8 and π/8, and so on.
[00100] Each integer stored in an array element within a texel palette element may be used as an index to another array (e.g., a grain array) in which each element may describe the desired tactile properties of the texel whenever the finger is moving along the associated direction, for example, at some nominal speed, pressure, or both. The elements of the grain array (e.g., grains) may each contain a template. Each template may be capable of being adjusted to produce differing output signal sequences, with each sequence representing the desired tactile output at a given finger velocity, finger pressure, or both.
Moreover, each template may be a template of a computer program, and each computer program template may be a binary representation of a sequence of instructions that may be executed by a virtual machine or by a general purpose or special purpose central processor unit (CPU) or microcontroller unit (MCU).
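One possible in-memory layout of the texel map, texel palette, and grain array described above is sketched below in C++; the field names, integer widths, and array sizes are illustrative assumptions.

#include <array>
#include <cstdint>
#include <vector>

// Illustrative layout for the structures described in [0097]-[0099].
struct Grain {
    uint8_t staticWeight;            // used later for texel prioritization
    std::vector<uint8_t> program;    // binary template executed by the VM or MCU
};

// One palette entry: a grain index per discrete finger direction
// (index 0 = stationary finger, 1..8 = the eight movement sectors).
using TexelPaletteEntry = std::array<uint8_t, 9>;

struct Texel {
    uint8_t paletteIndex;            // 0..255, index into the texel palette
    int8_t  depth;                   // relative depth vs. neighboring texels
    int8_t  normalX, normalY;        // slope normal, as in a normal map
};

struct TexelMap {
    int width, height;               // texels per row and column
    double texelSizeMm;              // spatial extent of one texel
    std::vector<Texel> cells;        // row-major, width * height entries
    const Texel& at(int x, int y) const { return cells[y * width + x]; }
};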
[0100] FIG. 2 illustrates texel map data structures and their relationships. In some example embodiments, a number of texel maps may be associated with each other, forming a hierarchy of texel maps. The texel maps in a texel map hierarchy may all be of differing scales (e.g., differing sizes). For example, each texel in the lowest level texel map describes the desired tactile content for a relatively small area of the tactile surface (e.g., a 1 mm x 1 mm area), and each next level texel map covers an area that is an integer multiple of this area with respect to width and height. For example, a second lowest level texel map may cover a 2 mm x 2 mm area, and a third lowest level texel map may cover a 4 mm x 4 mm area, while a fourth lowest level texel map covers an 8 mm x 8 mm area, all the way up to a texel map that contains only 2 x 2 texels at the highest level.

[0101] In some example embodiments, a number of texel map hierarchies may be associated together, forming a description of the tactile content associated with a visible programming construct, such as an Android View. In some situations, only one of the texel map hierarchies is active at a time. An application program may have an API that supports creation of new texel map hierarchies, and such an API may enable the application program to indicate which texel map hierarchy is to be active (e.g., in response to an API call).
[0102] An underlying windowing system may automatically control which of a multitude of tactile content descriptions is active at a given point in time. For example, whenever an Android View is asserted as the so-called "touch focus," the tactile output system may grant priority to the texel map hierarchies associated with that Android View.
[0103] Regarding texel selection, according to various example embodiments, a texel selection subsystem within the tactile output system may select a texel whose tactile content will be actualized next. The texel selection subsystem may take as its input a texel map hierarchy. In a system that uses multiple texel map hierarchies, the texel selection subsystem may implement a texel selection routine that considers only one texel map hierarchy (e.g., at a time). For example, one of the multiple texel map hierarchies may be designated as a system-level active texel map hierarchy. One or more other components of the system, such as the Android Window Manager, may change the system- level active texel map hierarchy (e.g., in response to one or more application program API calls and based on the Android touch focus).
[0104] In some example embodiments, the texel selection subsystem considers only one texel map hierarchy (e.g., at a time), and the texel selection subsystem may select a texel periodically. For example, the period may be 10 or 50 milliseconds. The tactile content described by the selected next texel may then define the output signals until a following texel is selected. To select a next texel, the texel selection subsystem may utilize the information generated by speed estimation, direction estimation, pressure estimation, location estimation, or any suitable combination thereof. With this information, the texel selection subsystem may determine one or more texels that the finger is likely to slide over during the considered time period. The texel selection subsystem may also consider the finger width, which may be wider than each texel and, accordingly, consider multiple texels perpendicular to the estimated direction of the finger. The texel selection subsystem may also consider the finger movement length and, accordingly, consider a number of texels along the estimated direction of the finger. The texel selection subsystem may also consider the finger speed and, accordingly, select a texel map from a hierarchy of texel maps so that the finger is likely to move only from one texel to the next one along its estimated direction. This may have the effect of reducing the number of texels that need to be considered by the texel selection subsystem.
[0105] The texel selection subsystem may associate a priority or weight with each of the considered texels. This priority or weight may be
computationally formed, for example, by considering how far the texel is, in perpendicular direction, from an estimated or predicted mid-finger path (e.g., the path that the middle point of the finger is likely to take). For example, each grain in the grain palette may be associated with a static weight, and the computational priority or weight may be formed by dividing the static weight by the relative distance of the texel from the mid-finger path. Based on this priority or weight, the texel selection subsystem may select a single texel that has the highest priority among those considered. The priorities or weights may be computed in such a way that ties become improbable. In the case of a tie, the system may select one of the tied highest priority texels at random.
Accordingly, the selecting of a texel may include determining a predicted segment of a predicted path of contact for the finger on the surface of the device, and the texel may be selected based on the predicted segment of that predicted path of contact. In some example embodiments, the determining of the predicted segment of the predicted path includes estimating a start location of the finger at a start of a time period, estimating an end location of the finger at an end of the time period, estimating a width of an area of the contact on the surface of the device (e.g., as measured perpendicular to a path defined by the start location and the end location), or any suitable combination thereof. In such example embodiments, the determining of the predicted segment may be based on the estimated start location, the estimated end location, the estimated width of the area of the contact, or any suitable combination thereof.
[0106] The texel selection subsystem may select a number of texels so that their output signals will be combined using the weights. In such a case, the texels with higher weights may affect more of the output signal than those with lower weights.
[0107] In some example embodiments, the texel selection subsystem first selects a texel map from a hierarchy of texel maps so that the finger will move to the next adjacent texel, if at all. For example, if the finger is determined to move 3 mm during the considered period, the texel selection subsystem may first select the texel map with 2 mm x 2 mm areas or the texel map with the 4 mm x 4 mm areas, depending on the initial and destination locations. In certain example embodiments, the coarser area texel map is only selected if the finger is likely to appear as jumping over a texel in the finer grained texel map.
[0108] The texel selection subsystem may then consider the texel at which the finger will be arriving and another texel that is one to five texels adjacent to it, in a direction perpendicular to the finger movement. FIG. 3 shows some illustrative examples. The texel selection subsystem may perform this function by inspecting the eight texels next to the one at which the finger is estimated to land, excluding the ones that were considered during the previous time period, computing the priorities or weights by calculating the distance between the estimated finger path and a parallel line going through the texel middle point and how much further the texel midpoint would be along the finger path, and multiplying these values. As shown in FIG. 2, a texel map may be represented as a two-dimensional array of data elements indexed by spatial locations, among which is the estimated spatial location of the contact made by the finger. In some example embodiments, a texel map may be represented as a
multidimensional object-based data structure (e.g., a vector haptic model, which may be analogous to a vector graphics model) in which each of multiple objects assigns a corresponding portion of the tactile output (e.g., as described by a tactile output description) to a spatial location (e.g., estimated spatial location on the surface that is contacted by the finger).
[0109] The texel selection subsystem may then divide the dynamic weight of 1 among the selected texels, giving each selected texel a weight based on how close the texel is to the estimated finger path. Thereafter, the texel selection subsystem may compute the dynamic priorities of the texels by multiplying the dynamic weight with the static weight associated with the grain indicated by the texel. It then may select the texel with the highest dynamic priority. In the case of a tie, it may use a simple pseudo-random number generator to pick one.
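The weighting described in paragraphs [0105] and [0109] can be sketched as follows; the closeness function, names, and tie-breaking are illustrative assumptions.

#include <cstddef>
#include <cstdlib>
#include <vector>

// A texel under consideration for the next grain (names illustrative).
struct Candidate {
    double distanceFromPathMm;   // perpendicular distance to the estimated finger path
    double staticWeight;         // static weight of the grain indicated by the texel
};

// Share a dynamic weight of 1 among the candidates, favoring texels close to
// the estimated mid-finger path, multiply by the static grain weight, and
// return the index of the highest-priority candidate (-1 if none).
int selectTexel(const std::vector<Candidate>& candidates) {
    std::vector<double> closeness(candidates.size());
    double sum = 0.0;
    for (std::size_t i = 0; i < candidates.size(); ++i) {
        closeness[i] = 1.0 / (1.0 + candidates[i].distanceFromPathMm);
        sum += closeness[i];
    }
    int best = -1;
    double bestPriority = -1.0;
    for (std::size_t i = 0; i < candidates.size(); ++i) {
        double priority = (closeness[i] / sum) * candidates[i].staticWeight;
        if (priority > bestPriority ||
            (priority == bestPriority && (std::rand() & 1))) {  // rare ties broken at random
            bestPriority = priority;
            best = static_cast<int>(i);
        }
    }
    return best;
}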
[0110] In the case where the same texel would be selected twice in two subsequent periods (e.g., because its static weight is so high compared to the others), the texel selection subsystem may prevent it from being selected the second time and select the next highest priority texel instead.
[0111] In some example embodiments, once the target texel has been selected, the texel selection subsystem may select the grain (e.g., from a plurality of grains associated with the texel) based on the estimated finger direction. In the case of a tie (e.g., the angle landing exactly between two represented angles), the texel selection subsystem may select a grain associated with (e.g., corresponding to) a direction that can be presented as a multiple of a larger fraction of π.
[0112] Regarding effect adjustment, the tactile output system may adjust the tactile content defined in the selected texel or selected texels based on (e.g., according to) the finger's estimated speed, pressure, or both. For example, the selected texel may contain a number of different tactile content templates (e.g., for different speed ranges, pressure ranges, or both). This may have an effect similar to that of having a hierarchy of texel maps, selecting the highest priority texel for the next hierarchy level, and having different content templates at each level.
[0113] In some example embodiments, there is one selected texel, with one selected grain. The grain may be assumed to contain a corresponding tactile content template (e.g., a tactile output template). This template may include or define some nominal finger speed, vnom, for which it has been designed, some nominal finger pressure, pnom, or both. In some example embodiments, there may be templates for different speed ranges, so that only speeds from about 2/3 vnom to about 4/3 vnom are considered, with the slowest speed template being a possible exception. In a likewise manner, there may be similarly divided templates for different pressure ranges.
[0114] In such an example tactile output system, the grains may always be played for a given fixed period of time. Hence, one solution is to not adjust the grains at all and simply accept the ±33% variance.

[0115] Certain example embodiments, however, may provide even better control, and there are multiple options for doing so. One example option is to perform interpolation of an effect by taking a weighted average between the virtual machine parameters from the selected speed and the next slower or faster speed. Another option is to have or include information in the grain on how to change the virtual machine parameters. In some example embodiments, the charger and discharger frequencies are scaled down at the slowest speeds, thereby attenuating the strength of the effect. At middle speeds, some ridges may feel stronger when the speed is increased, which may be done up to a point where adjacent ridges may be perceived as starting to blur together. In some example embodiments, middle speeds result in ridges feeling softer as the speed is increased. At any rate, the resulting grain (e.g., speed-adjusted, pressure-adjusted, or both) may be denoted as an "expressed grain."
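The interpolation option mentioned above can be sketched as a weighted average of a virtual machine parameter between the templates for two adjacent nominal speeds; the structure and the parameter chosen (a drive frequency) are illustrative assumptions.

// Blend one virtual machine parameter between the grain templates designed
// for two adjacent nominal speeds (structure and parameter are illustrative).
struct GrainTemplate {
    double nominalSpeedMmPerS;   // the speed this template was designed for
    double driveFrequencyHz;     // example parameter embedded in the template
};

double interpolatedFrequency(const GrainTemplate& slower,
                             const GrainTemplate& faster,
                             double estimatedSpeedMmPerS) {
    double span = faster.nominalSpeedMmPerS - slower.nominalSpeedMmPerS;
    if (span <= 0.0) return slower.driveFrequencyHz;
    double t = (estimatedSpeedMmPerS - slower.nominalSpeedMmPerS) / span;
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return slower.driveFrequencyHz
         + t * (faster.driveFrequencyHz - slower.driveFrequencyHz);
}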
[0116] Regarding hardware actualization, the expressed grain may be represented in a form of a binary computer program. Such a program may be executed on an execution unit which may, for example, be a virtual machine running on a physical central processing unit (CPU) or microcontroller unit (MCU), or a general-purpose or special-purpose CPU or MCU (e.g., configured by instructions or software). In particular, an instruction (e.g., software) format may be run on a virtual machine or special-purpose hardware execution unit. According to various example embodiments, a tactile output system may implement a virtual machine architecture (e.g., software architecture) that configures the tactile output system to perform one or more of the functions described above.
[0117] The virtual machine architecture may generally be described as a software architecture that implements one or more mechanisms, components, or subsystems discussed above. For example, a binary computer program (e.g., an executable software application or its source code) may contain instructions that specify frequencies, pulse width modulation duty cycles, voltages, or other properties of signals that control the hardware generating one or more haptic effects.
[0118] To control the timing of these signals, one or more (e.g., each) of the instructions within the binary computer program may contain a field specifying the length of time that should pass before the next instruction is executed. To reduce the size of programs that generate repeating signal patterns, the virtual machine or special purpose hardware executing unit may have a facility for executing sections of the program multiple times.
[0119] The binary computer program may contain instructions that specify (e.g., singly or in combination) one or more of the following parameters: a duration, a repeat count, a repeat offset, and one or more physical parameters (e.g., one or more frequency values, defining for example output signal frequencies during the execution of the instruction, or one or more target analog values, defining for example the target value of a voltage level at the end of the execution of the instruction). One or more instructions (e.g., each instruction) may be associated with a repeat counter, any one or more (e.g., all) of which may be initialized to zero.
[0120] In some example embodiments, such instructions may be executed according to the following procedure:
[0121] Step 1: Take (e.g., access) the current instruction and adjust the output signals according to one or more physical parameters in the instruction.
For example, if the output parameters are frequencies, a timer in a
microcontroller may be initialized to produce an alternating binary output value at the specified frequency.
[0122] Step 2: Make the execution unit inactive (e.g., sleep, wait, or block) or perform other functions for a duration specified in the instruction.
[0123] Step 3: If the repeat count associated with the instruction is zero, which denotes an infinite loop, make the instruction that is at an offset specified in this instruction the next current instruction, and loop back to Step 1.
[0124] Step 4: Increment a repeat counter associated with the instruction.
[0125] Step 5: If the repeat counter value is equal to the repeat count value, exit the loop. Clear the repeat counter value (e.g., make it zero), make the next instruction the next current instruction, and loop back to Step 1.
[0126] Step 6: Make the instruction at an offset specified in this instruction the next current instruction and loop back to Step 1.
[0127] In pseudo-program code, the procedure may be described as follows:
type instruction : record
    signals : opaque;
    duration : integer;
    maxcount : integer;
    offset : integer;
end record;

array program[size] : instruction;
array counter[size] : integer;

var pc : integer;
var curinst : instruction;
var output : opaque;

forever do
    curinst := program[pc];
    output := curinst.signals;
    sleep(curinst.duration);
    if (curinst.maxcount == 0)
        pc := pc + curinst.offset;
    else
        counter[pc] := counter[pc] + 1;
        if (curinst.maxcount == counter[pc])
            counter[pc] := 0;
            pc := pc + 1;
        else
            pc := pc + curinst.offset;
        end if;
    end if;
    pc := pc mod size;
end do;
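As a minimal, non-authoritative C sketch of the same interpreter loop, assuming the illustrative grain_instruction_t layout given earlier and platform-specific apply_signals() and sleep_us() routines (both assumptions, not defined by this description), the procedure could look as follows. The counter array is assumed to hold one zero-initialized entry per instruction.

void apply_signals(const grain_instruction_t *ins);  /* platform-specific, assumed */
void sleep_us(uint32_t microseconds);                /* platform-specific, assumed */

/* Sketch of the interpreter loop described in Steps 1-6 above. */
void run_grain(const grain_instruction_t *program, int size, int *counter)
{
    int pc = 0;
    for (;;) {
        const grain_instruction_t *cur = &program[pc];
        apply_signals(cur);               /* Step 1: adjust the output signals */
        sleep_us(cur->duration_us);       /* Step 2: wait for the instruction's duration */
        if (cur->max_count == 0) {
            pc += cur->offset;            /* Step 3: repeat count 0 denotes an infinite loop */
        } else {
            counter[pc] += 1;             /* Step 4: increment the repeat counter */
            if (counter[pc] == cur->max_count) {
                counter[pc] = 0;          /* Step 5: loop finished; clear and advance */
                pc += 1;
            } else {
                pc += cur->offset;        /* Step 6: loop again */
            }
        }
        pc = ((pc % size) + size) % size; /* wrap around, as in the pseudocode */
    }
}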
[0128] As shown in FIG. 4, according to certain example embodiments, various subsystems may be implemented (e.g., included) within a tablet computer. In the example embodiment shown, most functions are embodied as software implementations running (e.g., executing) on the tablet computer's host CPU. The results of effect adjustment, such as a resultant binary program, may be transferred to a separate microcontroller. The separate microcontroller may then control the hardware in the forming of output signals (e.g., output signals that convey tactile output).
[0129] Within the host CPU, the functions may be divided between different programs or processes. For example, the input event reports may be generated by a window manager component, which may be modified to provide the input event reports to the rest of the tactile output system.
[0130] One or more of the functional subcomponents of a tactile output system (e.g., speed estimation, pressure estimation, direction estimation, location estimation, texel selection, effect adjustment, or any suitable combination thereof) may be combined into a single larger subcomponent, such as a "haptic engine." The haptic engine may use efficient data structures within a single memory space to streamline processing. The functions may be implemented in program code in an entangled fashion, for example, in order to minimize processing time.
[0131] One or more application programs may construct one or more texel maps, for example, by reading one or more texel map definitions (e.g., from files stored on the disk and creating respective representations in memory). The application programs may provide the one or more texel maps to the haptic engine on their own initiative (e.g., programming) or in response to a request from the haptic engine. In Android, such an application program may be implemented as one or more Android activities, Android content providers, or any suitable combination thereof, or in the form of some other Android subsystem. One or more texel maps may be transferred from the application programs to the haptic engine in the form of shared memory regions, for example, using Android ashmem, Android MemoryHeap objects, or both.
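As an illustration only, a texel map shared with the haptic engine could be represented as a grid of indices into a table of texel definitions. The structure below is a sketch under that assumption; it does not reflect an actual file or shared-memory format used by Android or by this system, and all names are chosen for the example.

#include <stdint.h>
#include <stddef.h>

/* Illustrative texel map: a width-by-height grid of indices into texels[].
 * All fields and names here are assumptions made for the sketch. */
typedef struct {
    const void *effect_template;  /* e.g., an "unlinked" binary program */
} texel_t;

typedef struct {
    int            width;    /* grid columns covering the mapped surface area */
    int            height;   /* grid rows covering the mapped surface area */
    const uint8_t *indices;  /* width * height indices into texels[] */
    const texel_t *texels;   /* texel definitions referenced by the grid */
} texel_map_t;

/* Look up the texel under a position expressed in grid coordinates. */
static const texel_t *texel_at(const texel_map_t *m, int x, int y)
{
    if (x < 0 || y < 0 || x >= m->width || y >= m->height)
        return NULL;  /* outside the mapped region */
    return &m->texels[m->indices[(size_t)y * (size_t)m->width + (size_t)x]];
}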
[0132] To illustrate a possible piece of the overall functional operation of example embodiments of the tactile output system, consider an enclosing embodiment where the tablet computer input controller reports one or more finger positions every 10 ms. The input controller delay may be on the order of 20 ms. The output latency may be, for example, 15 ms. The host CPU processing latency may be, for example, less than 2 ms.
[0133] In such an example embodiment, the Window Manager may hand over a new input event report every 10 ms. These event reports may be stored in a short queue, which may have a capacity of, for example, 6 or 10 event reports.
[0134] The haptic engine may run on a separate thread of execution, which may be woken up periodically. When the thread is woken up, it may first use the event reports stored in the queue to estimate the current speed and direction of the finger being tracked. The resulting speed vector may be multiplied by the known system latency, which in this example case may be estimated to be 35 ms, to give an estimate (e.g., a predicted estimate) of the distance vector the finger will likely have traveled from the position reported in the latest received input event report by the time the effect is physically actualized. It may be beneficial to use a slightly lower value, such as 30 ms, in order to err on the conservative side in case the speed estimate is not accurate enough. It may be beneficial to further adjust the value to compensate for the effect duration.
[0135] The distance vector may be added to the finger position available in the latest received input event report, giving an estimated position of the finger. The estimated position may be used to select the most appropriate texel to be actualized. Accordingly, an effect template may be chosen from the texel using the direction estimate. The effect template may be "expressed" using the speed estimate, the pressure estimate, or both.
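The prediction described above can be sketched in a few lines of C. The point_t type, the velocity inputs (in surface units per millisecond), and the latency value are assumptions made for the example, with 30 ms standing in for the conservatively reduced latency estimate.

typedef struct { float x; float y; } point_t;

/* Latency-compensated position prediction, as in the example above:
 * the estimated velocity multiplied by the assumed system latency is
 * added to the last reported finger position. */
static point_t predict_position(point_t last_reported,
                                float vx, float vy,   /* estimated velocity */
                                float latency_ms)     /* e.g., 30.0f */
{
    point_t predicted;
    predicted.x = last_reported.x + vx * latency_ms;
    predicted.y = last_reported.y + vy * latency_ms;
    return predicted;  /* used to select the texel to be actualized */
}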
[0136] The resulting "expressed" effect template, which is denoted as "expressed grain" in FIG. 4, may be sent from the haptic engine to the separate microcontroller. The expressed grain may be in the form of a binary computer program. The expressed grain may be sent over a digital bus, such as a serial bus (e.g., Universal Serial Bus (USB), Serial Peripheral Interface (SPI), or I2C bus).
[0137] The tactile output system may further estimate the time it will likely take to run the expressed grain on the microcontroller. This estimate may be used to decide when to schedule the haptic engine thread next.
[0138] The microcontroller may continue to execute the previous expressed grain while it is receiving the next grain. Once it has received all or a large enough part of the next grain (e.g., so that it will not overrun the so-far received part), the microcontroller may start to execute the next expressed grain. It may do so on its own initiative (e.g., as programmed), for example, based on its estimate of execution time and bus speed, based on an explicit command or instruction format received from the host CPU, or based on any suitable combination thereof.
[0139] As shown in FIG. 5, effect adjustment may be placed at the microcontroller. The host software may provide the microcontroller with an effect template, which may be in the form of a binary "unlinked" computer program, and a speed estimate, a pressure estimate, or both. The microcontroller may perform the effect adjustment, producing the final effect (e.g., a binary program) to be executed.
[0140] As shown in FIG. 6, the one or more texel maps or parts thereof may be passed to the microcontroller. There may be a new functional mechanism or subsystem in the form of a "texel map chooser," which may be provided with a location estimate. The texel map chooser may provide only one texel map, or one or more parts thereof, to the microcontroller. The texel map chooser may provide multiple texel maps. In this way, the microcontroller may use less memory to store the one or more texel maps.
[0141] As shown in FIG. 7, almost all of the tactile output system may be embodied on a separate microcontroller. The input event reports may be received directly from, for example, an input microcontroller, which may be beneficial as it may reduce the overall system latency.
[0142] In certain example embodiments, one or more of the functional blocks may be implemented directly in the form of hardware functions.
According to various example embodiments, one or more of the functional blocks may be implemented at an input controller.
[0143] As shown in FIG. 8, most functional blocks may be implemented at the input microcontroller.
[0144] As shown in FIG. 9, the functional blocks may be divided between an input controller, the host CPU, and an output controller.
[0145] FIG. 10 is a diagram depicting a tactile output system (e.g., a haptic device) in the example form of a tactile stimulation apparatus 150, according to some example embodiments. As used herein, "tactile" means relating to a sensation of touch or pressure, and the tactile stimulation apparatus 150 may be capable of creating a sensation of touch or pressure to a body member 120 (e.g., a finger) based on the creation of a pulsating Coulomb force, as discussed by way of examples herein.
[0146] The tactile stimulation apparatus 150 may be in the form of a tactile display device that is capable of displaying graphics as well as creating a sensation of touch to the body member 120. FIG. 10 depicts an example of such a tactile display device in the form of a smart phone having a touch screen panel 160 (e.g., a touch-sensitive screen) that is responsive to touches by the body member 120. That is, touching different portions of the touch screen panel 160 with the body member 120 may cause the smart phone to take various actions.
[0147] In addition to displaying graphics, the touch screen panel 160 may create a sensation of touch or pressure to the body member 120. The creation of the touch sensation to the body member 120 may involve the generation of one or more high voltages. A region of the touch screen panel 160 may comprise a semiconducting material that may limit a flow of current to the body member 120. Additionally, the semiconducting material may also be used to reduce the thickness of the touch screen panel 160, as described by way of examples herein. In addition to the smart phone depicted in FIG. 10, the tactile stimulation apparatus 150 may include a variety of other apparatus, such as a computer monitor, a television, a handle (e.g., a door handle), a touchpad, a mouse, a keyboard, a switch, a trackball, a joystick, or any suitable combination thereof.
[0148] FIG. 11 is a schematic diagram illustrating various components of a tactile output system (e.g., a haptic device) in the example form of a tactile stimulation apparatus 1200, according to some example embodiments. As depicted in FIG. 11, a display region 1222 shows information 1226, which is seen by a user through a touch-sensitive region 1262 and a tactile output region 1242. The touch-sensitive region 1262 is scanned by a touch input controller 1240, such that a microprocessor 1204 (e.g., a host CPU), under the control of instructions (e.g., software) stored in and executed from a memory 1206, is aware of the presence or absence of the body member 120 on top of a predefined area 1246. The composite section of the touch-sensitive region 1262 may be completely homogeneous. The predefined areas, such as area 1246, are created dynamically by the microprocessor 1204 under control of the instructions, such that the X and Y coordinates of the body member 120, as it touches the touch-sensitive region 1262, are compared with predefined borders of the predefined area 1246.
[0149] Reference numeral 1248 denotes a presence-detection logic stored within the memory 1206. Execution of the presence-detection logic 1248 by the microprocessor 1204 may cause the detection of the presence or absence of the body member 120 at the predefined area 1246. It may also cause detection of a location of the body member 120 within the predefined area 1246. A visual cue, such as a name of the function or activity associated with the predefined area 1246, may be displayed by the display region 1222, as part of the displayed information 1226, so as to help the user find the predefined area 1246.
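A minimal sketch of such a coordinate comparison, assuming the predefined area is rectangular (an assumption made only for this example), might look as follows.

/* Illustrative presence check of touch coordinates against a rectangular
 * predefined area; the rectangle representation is an assumption. */
typedef struct { int left; int top; int right; int bottom; } area_t;

static int body_member_in_area(const area_t *a, int touch_x, int touch_y)
{
    return touch_x >= a->left && touch_x < a->right &&
           touch_y >= a->top  && touch_y < a->bottom;
}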
[0150] Additionally stored within the memory 1206 may be stimulus-variation logic 1268. Input information to the stimulus-variation logic 1268 may include information on the presence, absence, location, or any suitable combination thereof, of the body member 120 at the predefined area 1246. Based on this information, the stimulus-variation logic 1268 may have the effect that the microprocessor 1204 instructs the tactile output controller 1260 (e.g., a subsystem including a microcontroller or a special-purpose processor) to vary the electrical input to the tactile output region 1242, thus varying the electrosensory sensations caused to the body member 120. Thus, a user may detect the presence or absence of the displayed information 1226 at the predefined area 1246 merely by way of tactile information (or electrosensory sensation) and without requiring visual cues.
[0151] Any of the machines, systems, apparatus, or devices shown or discussed herein may be implemented in a general-purpose computer modified (e.g., configured or programmed) by software to be a special-purpose computer to perform the functions described herein for that machine. For example, a computer system able to implement any one or more of the methodologies described herein is discussed with respect to FIG. 12. Moreover, any two or more of the example systems or devices discussed herein may be combined into a single machine, system or device, and the functions described herein for any single machine, system, or device may be subdivided among multiple machines, systems, apparatus, or devices.
[0152] Any one or more of the modules or components described herein may be implemented using hardware (e.g., a processor of a machine) or a combination of hardware and software. For example, any module or component described herein may configure a processor to perform the operations described herein for that module. Moreover, any two or more of these modules or components may be combined into a single module or component, and the functions described herein for a single module or component may be subdivided among multiple modules or components.
[0153] FIG. 12 is a block diagram illustrating components of a machine 900, according to some example embodiments, able to read instructions from a machine-readable medium (e.g., a machine-readable storage medium) and perform any one or more of the methodologies discussed herein. Specifically, FIG. 12 shows a diagrammatic representation of the machine 900 in the example form of a computer system and within which instructions 924 (e.g., software, a program, an application, an applet, an app, or other executable code) for causing the machine 900 to perform any one or more of the methodologies discussed herein may be executed. In alternative embodiments, the machine 900 operates as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 900 may operate in the capacity of a server machine or a client machine in a server-client network environment, or as a peer machine in a peer-to-peer (or distributed) network environment. The machine 900 may be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cellular telephone, a smartphone, a web appliance, a network router, a network switch, a network bridge, or any machine capable of executing the instructions 924, sequentially or otherwise, that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include a collection of machines that individually or jointly execute the instructions 924 to perform any one or more of the methodologies discussed herein.
[0154] The machine 900 includes a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a radio-frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 904, and a static memory 906, which are configured to communicate with each other via a bus 908. The machine 900 may further include a graphics display 910 (e.g., a plasma display panel (PDP), a light emitting diode (LED) display, a liquid crystal display (LCD), a projector, or a cathode ray tube (CRT)). The machine 900 may also include an alphanumeric input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse, a touchpad, a trackball, a joystick, a motion sensor, or other pointing instrument), a storage unit 916, a signal generation device 918 (e.g., a speaker), and a network interface device 920.
[0155] The storage unit 916 includes a machine-readable medium 922 on which is stored the instructions 924 embodying any one or more of the methodologies or functions described herein. The instructions 924 may also reside, completely or at least partially, within the main memory 904, within the processor 902 (e.g., within the processor's cache memory), or both, during execution thereof by the machine 900. Accordingly, the main memory 904 and the processor 902 may be considered as machine-readable media. The instructions 924 may be transmitted or received over a network 926 via the network interface device 920.
[0156] As used herein, the term "memory" refers to a machine-readable medium able to store data temporarily or permanently and may be taken to include, but not be limited to, random-access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) able to store instructions. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, that is capable of storing instructions for execution by a machine (e.g., machine 900), such that the instructions, when executed by one or more processors of the machine (e.g., processor 902), cause the machine to perform any one or more of the methodologies described herein. Accordingly, a
"machine-readable medium" refers to a single storage apparatus or device, as well as "cloud-based" storage systems or storage networks that include multiple storage apparatus or devices. The term "machine-readable medium" shall accordingly be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.
[0157] FIG. 13-16 are flowcharts illustrating operations of a tactile output system (e.g., a machine, a haptic device, a tactile output device, or any suitable combination thereof), according to various example embodiments. FIG. 13 illustrates operations in a method 1300, according to some example
embodiments. Operations of the method 1300 may be performed by one or more of the hardware components shown in FIG. 4-12. As shown in FIG. 13, the method 1300 includes operations 1310, 1320, 1330, and 1340.
[0158] In operation 1310, a processor (e.g., processor 902) accesses an input event report that is generated as a result of contact (e.g., a touch) made by a user's finger on a surface (e.g., touch-sensitive surface) of a tactile output system. In operation 1320, the processor determines a finger speed, finger direction, finger pressure, or any suitable combination thereof. One or more of such values determined by the processor in operation 1320 may be an actual value, an estimated value (e.g., an estimate of the value), a predicted value (e.g., an estimate of a likely future value), or a weighted value (e.g., weighted as discussed above with respect to FIG. 1-3). That is, performance of operation 1320 by the processor may determine (e.g., calculate) a contact parameter that describes, in whole or in part, the contact made by the finger on the surface of the device.
[0159] In operation 1330, the processor obtains a tactile output template based on the results of operation 1320. As noted above with respect to FIG. 1-3, obtaining the tactile output template may include generating the tactile output template, adjusting (e.g., modifying) the tactile output template, or both. In operation 1340, a processor (e.g., processor 902, a separate microcontroller, or other tactile output hardware) causes the surface of the device to render a tactile output (e.g., haptic effect) based on all or part of the tactile output template obtained in operation 1330.
[0160] As shown in FIG. 14, the method 1300 may include one or more of operations 1422, 1424, 1426, and 1432. One or more of operations 1422, 1424, and 1426 may be performed as part (e.g., a precursor task, a subroutine, or portion) of operation 1320. In operation 1422, the processor determines the location (e.g., spatial location on the touch-sensitive surface) of the contact made by the finger on the surface of the device. Accordingly, operation 1320 may be performed based on the location determined in operation 1422.
[0161] In operation 1424, the processor determines the pressure exerted by the finger on the surface of the device based on the area (e.g., a height of the area, width of the area, or circumference of the area) encompassed by the contact made by the finger on the surface of the device. Accordingly, operation 1320 may be performed based on the pressure exerted by the finger, the area of the contact made by the finger, or both.
[0162] In operation 1426, the processor determines the pressure based on a sliding average pressure detected (e.g., measured) over a period of time.
Accordingly, operation 1320 may be performed based on this time-averaged pressure of the finger on the surface of the device.
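One straightforward way to form such a sliding average, assuming the reported contact area is used as a pressure proxy and an eight-sample window (both assumptions made for this sketch), is a small circular buffer.

#define PRESSURE_WINDOW 8  /* window length; an arbitrary choice for the sketch */

/* Sliding-average pressure estimate; the struct is assumed zero-initialized. */
typedef struct {
    float samples[PRESSURE_WINDOW];
    int   next;
    int   count;
} pressure_filter_t;

static float pressure_update(pressure_filter_t *f, float contact_area)
{
    float sum = 0.0f;
    int i;
    f->samples[f->next] = contact_area;
    f->next = (f->next + 1) % PRESSURE_WINDOW;
    if (f->count < PRESSURE_WINDOW)
        f->count++;
    for (i = 0; i < f->count; i++)
        sum += f->samples[i];
    return sum / (float)f->count;  /* time-averaged pressure estimate */
}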
[0163] Operation 1432 may be performed as part of operation 1330. In operation 1432, the processor selects a tactile element (e.g., a texel), and this selection may be made based on the results of operation 1320 (e.g., based on the contact parameter determined in operation 1320). Various example
embodiments of texel selection are discussed above with respect to FIG. 1-3.
[0164] FIG. 15 illustrates operations in a method 1500, according to some example embodiments. Operations of the method 1500 may be performed by one or more of the hardware components shown in FIG. 4-12. As shown in FIG. 15, the method 1500 includes operations 1510, 1520, 1530, 1540, and 1550.
[0165] In operation 1510, a processor (e.g., processor 902) accesses an input event report that is generated from contact (e.g., a touch) made by a user's finger on the surface (e.g., touch-sensitive surface) of the tactile output system. In operation 1520, the processor calculates a finger speed, finger direction, finger pressure, or any suitable combination thereof. One or more of such values determined by the processor in operation 1520 may be an actual value, an estimated value (e.g., an estimate of the value), a predicted value (e.g., an estimate of a likely future value), or a weighted value (e.g., weighted as discussed above with respect to FIG. 1-3). That is, performance of operation 1520 by the processor may determine (e.g., calculate) a contact parameter that describes, in whole or in part, the contact made by the finger on the surface of the device. [0166] In operation 1530, the processor detects a state change in a computer program (e.g., executing on the device). The computer program may be controlled by one or more input event reports (e.g., the input event report accessed in operation 1510). For example, the processor may receive an indication that the state of the program has changed as a result of the input event report accessed in operation 1510.
[0167] In operation 1540, the processor obtains an effect description (e.g., by obtaining a tactile output template) based on the results of operation 1530. As noted above with respect to FIG. 1-3, obtaining the effect description may include generating the effect description, adjusting (e.g., modifying) the effect description, or both. In operation 1550, a processor (e.g., processor 902, a separate microcontroller, or other tactile output hardware) causes the surface of the device to render a tactile output (e.g., haptic effect) based on all or part of the effect description obtained in operation 1540.
[0168] As shown in FIG. 16, the method 1500 may include one or more of operations 1622, 1642, and 1644. Operation 1622 may be performed as part (e.g., a precursor task, a subroutine, or portion) of operation 1520. In operation 1622, the processor estimates a location (e.g., spatial location) of the contact made by the finger on the surface of the device. Accordingly, operation 1520 may be performed based on the location estimated in operation 1622.
[0169] One or more of operations 1642 and 1644 may be performed as part of operation 1540. In operation 1642, the processor selects a spatial description of tactile output (e.g., a texel map, which may be selected from among multiple texel maps within a hierarchy of texel maps), and this selection may be made based on the results of operation 1530 (e.g., based on the contact parameter calculated in operation 1520). For example, operation 1642 may include selection of a texel map based on the results of operation 1530. Various example embodiments of texel map selection are discussed above with respect to FIG. 1-3.
[0170] In operation 1644, the processor selects a locationary piece of a tactile output description (e.g., a tactile element, such as a texel), and this selection may be made based on the results of operation 1530 (e.g., based on the contact parameter calculated in operation 1520). For example, operation 1644 may include selection of a texel based on the results of operation 1530. Various example embodiments of texel selection are discussed above with respect to FIG. 1-3.
[0171] Operation 1645 may be performed as part of operation 1644. In operation 1645, the processor determines a predicted segment of a predicted path of the contact made by the finger on the surface of the device. For example, the processor may perform location estimation, location prediction, or both, as discussed above with respect to FIG. 1-3. In some example embodiments, operation 1645 is performed based on the estimate of the spatial location of the contact, as estimated in operation 1622.
[0172] One or more of operations 1646, 1647, and 1648 may be performed as part of operation 1645. In operation 1646, the processor estimates a start location of the user's finger (e.g., where it contacts the surface of the device) at the start of a period of time (e.g., time period, such as a sliding window of time). In operation 1647, the processor estimates an end location of the user's finger at the end of the period of time. In operation 1648, the processor estimates a width of the contact area (e.g., the width of the area of the contact made by the finger, which width may be measured perpendicularly to a line that connects the start location and the end location). The start location, the end location, the width, or any suitable combination thereof may fully or partially define the predicted segment that is determined in operation 1645.
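The predicted segment built in operations 1646-1648 could be represented, for example, by the small structure below; it reuses the point_t type from the earlier prediction sketch, and the field names are assumptions made for the example.

/* Illustrative representation of the predicted contact segment: start and
 * end estimates for the time period, plus the contact width measured
 * perpendicular to the line connecting them. */
typedef struct {
    point_t start;  /* estimated finger location at the start of the period */
    point_t end;    /* estimated finger location at the end of the period */
    float   width;  /* estimated width of the contact area */
} predicted_segment_t;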
[0173] Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
[0174] Certain embodiments may be described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A "hardware module" is a tangible unit capable of performing certain operations and may be configured or arranged in a certain physical manner. In various example embodiments, one or more computer systems (e.g., a standalone computer system, a client computer system, or a server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[0175] In some embodiments, a hardware module may be implemented mechanically, electronically, or by any suitable combination thereof. For example, a hardware module may include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module may be a special-purpose processor, such as a field programmable gate array (FPGA) or an ASIC. A hardware module may also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module may include software
encompassed within a general-purpose processor or other programmable processor. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0176] Accordingly, the phrase "hardware module" should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, "hardware-implemented module" refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor may be configured as respectively different special-purpose processors (e.g., comprising different hardware modules) at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
[0177] Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) between or among two or more of the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate
communications with input or output devices, and can operate on a resource (e.g., a collection of information).
[0178] The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions described herein. As used herein, "processor- implemented module" refers to a hardware module implemented using one or more processors.
[0179] Similarly, the methods described herein may be at least partially processor-implemented, a processor being an example of hardware. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented modules. Moreover, the one or more processors may also operate to support performance of the relevant operations in a "cloud computing" environment or as a "software as a service" (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), with these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., an application program interface (API)).
[0180] The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0181] Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an "algorithm" is a self- consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine.
[0182] Any one or more algorithms discussed herein may be implemented by means of special-purpose electronic hardware in which the one or more algorithms are a symbolic representation of the specific electrical functions that may take place among electrical circuits within, for example, a special-purpose silicon chip or other semiconductor. In such example embodiments, each symbol in an algorithmic description may be directly mapped to the physical electrical circuits, and possibly vice versa. In some example embodiments, such a special-purpose chip may implement a special-purpose processor, which may be able to execute binary code (e.g., one or more computer programs) that may be tailored to provide a means for an efficient representation of tactile content.
[0183] Unless specifically stated otherwise, discussions herein using words such as "processing," "computing," "calculating," "determining," "presenting," "displaying," or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. Furthermore, unless specifically stated otherwise, the terms "a" or "an" are herein used, as is common in patent documents, to include one or more than one instance. Finally, as used herein, the conjunction "or" refers to a nonexclusive "or," unless specifically stated otherwise.
[0184] The detailed embodiments described in this document may be combined in various ways in addition to those explicitly described earlier in this document. Such combinations of embodiments generally provide the benefits of the individual embodiments they are based on. The following enumerated descriptions define various example embodiments of methods, machine-readable media, and systems (e.g., devices, apparatus, or any suitable combination thereof) discussed herein:
[0185] 1. A method comprising:
accessing an input event report generated as a result of contact made by a finger of a user upon a surface of a device;
determining (e.g., using a processor or other suitable hardware) a contact parameter based on at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the determining of the contact parameter being based on the received input event report generated as a result of the contact;
obtaining a tactile output template based on the contact parameter, the tactile output template defining a tactile output that is renderable on the surface of the device to the finger of the user; and
causing the device to render the tactile output on the surface of the device based on the tactile output template to the finger of the user.
[0186] 2. The method of description 1, wherein:
the contact parameter is determined based further on a location of the contact by the finger on the surface of the device.
[0187] 3. The method of description 1 or description 2, wherein: the determining of the contact parameter is based on the pressure exerted by the finger on the surface of the device; and the method further comprises
determining the pressure exerted by the finger based on an area of the contact on the surface of the device.
[0188] 4. The method of any of descriptions 1-3, wherein:
the determining of the contact parameter is based on the pressure exerted by the finger on the surface of the device; and the method further comprises
determining the pressure as a sliding average pressure exerted by the finger over a period of time.
[0189] 5. The method of any of descriptions 1-4, wherein:
the obtaining of the tactile output template comprises generating the tactile output template based on the determined contact parameter.
[0190] 6. The method of any of descriptions 1-5, wherein:
the obtaining of the tactile output template comprises adjusting the tactile output template based on the determined contact parameter.
[0191] 7. The method of any of descriptions 1-6, wherein:
the obtaining of the tactile output template comprises selecting a tactile element based on the determined contact parameter, the tactile element describing a tactilely perceivable surface feature and corresponding to the tactile output template; and
the obtaining of the tactile output template is based on its correspondence to the tactile element.
[0192] 8. The method of any of descriptions 1-7, wherein:
the determining of the contact parameter is based on an estimate of at least one of the speed of the finger, the direction of the finger, or the pressure of the finger; and the method further comprises
calculating the estimate based on the input event report.
[0193] 9. A device comprising:
a processor configured to:
access an input event report generated as a result of contact made by a finger of a user upon a surface of the device;
calculate an estimate of a contact parameter that represents at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the calculating of the estimate being based on the input event report generated as a result of the contact;
obtain an effect description based on a state change in a computer program (e.g., a computer program controlled by the input event report) and based on the estimate of the contact parameter (e.g., obtain the effect description in response to detecting the state change in the computer program); and
a controller (e.g., a microcontroller) configured to cause the device to render a tactile output on the surface of the device based on the effect description to the finger of the user.
[0194] 10. The device of description 9, wherein:
the processor is configured to obtain the effect description by generating the effect description based on the state change.
[0195] 11. The device of description 9 or description 10, wherein:
the processor is configured to obtain the effect description by modifying the effect description based on the estimate of the contact parameter.
[0196] 12. The device of any of descriptions 9-11, wherein:
the processor is configured to obtain the effect description by:
selecting a texel map based on the estimate of the contact parameter, the texel map spatially describing the tactile output and including multiple texels;
selecting a texel among the multiple texels in the texel map based on the estimate of the contact parameter, the texel describing a corresponding portion of the tactile output for a location on the surface of the device; and
obtaining the effect description based on the texel.
[0197] 13. The device of description 12, wherein:
the processor is configured to estimate a spatial location of the contact by the finger on the surface of the device; and
the texel map is represented as a two-dimensional array of data elements indexed by spatial locations among which is the estimated spatial location of the contact by the finger.
[0198] 14. The device of description 12 or 13, wherein:
the texel map is represented as a multidimensional object-based data structure in which each of multiple objects assigns a corresponding portion of the tactile output to a spatial location.
[0199] 15. The device of any of descriptions 12-14, wherein: the texel describes the corresponding portion of the tactile output for the location as a function of spatial coordinates with respect to the surface of the device.
[0200] 16. The device of any of descriptions 12-15, wherein:
the texel describes the corresponding portion of the tactile output for the location as a point-like effect description.
[0201] 17. The device of description 16, wherein:
the point-like effect description is expressed as a waveform.
[0202] 18. The device of any of descriptions 12-17, wherein:
the processor is configured to select the texel by:
determining a predicted segment of a path of contact for the finger on the surface of the device; and
selecting the texel based on the predicted segment of the path.
[0203] 19. The device of description 18, wherein:
the processor is configured to determine the predicted segment of the predicted path by:
estimating a start location of the finger at a start of a time period;
estimating an end location of the finger at an end of the time period;
estimating a width of an area of the contact on the surface of the device; and determining the predicted segment based on at least one of the estimated start location, the estimated end location, or the estimated width of the area of the contact.
[0204] 20. The device of any of descriptions 9-19, wherein:
the processor is configured to obtain the effect description by obtaining a tactile output template based on the contact parameter, the tactile output template defining the tactile output that is renderable by the surface of the device to the finger of the user; and
the controller is configured to cause the surface of the device to render the tactile output based on the tactile output template to the finger of the user.
[0205] 21. A tangible (e.g., non-transitory) machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing an input event report generated as a result of contact made by a finger of a user upon a surface of a device; determining a contact parameter based on at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the determining of the contact parameter being based on the received input event report generated as a result of the contact;
obtaining a tactile output template based on the contact parameter, the tactile output template defining a tactile output that is renderable on the surface of the device to the finger of the user; and
causing the device to render the tactile output on the surface of the device based on the tactile output template to the finger of the user.
[0206] 22. A tangible (e.g., non-transitory) machine-readable storage medium comprising instructions that, when executed by one or more processors of a machine, cause the machine to perform operations comprising:
accessing an input event report generated as a result of contact made by a finger of a user upon a surface of a device;
calculating an estimate of a contact parameter that represents at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device, the calculating of the estimate being based on the input event report generated as a result of the contact;
detecting a state change in a computer program controlled by the input event report; obtaining an effect description based on the state change and based on the estimate of the contact parameter; and
causing the surface of the device to render a tactile output based on the effect description to the finger of the user.

Claims

1. A method comprising:
accessing an input event report generated as a result of contact made by a finger of a user upon a surface of a device;
determining a contact parameter based on at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device,
the determining of the contact parameter being based on the
received input event report generated as a result of the contact;
obtaining a tactile output template based on the contact parameter,
the tactile output template defining a tactile output that is
renderable on the surface of the device to the finger of the user; and
causing the device to render the tactile output on the surface of the device based on the tactile output template to the finger of the user.
2. The method of claim 1, wherein:
the contact parameter is determined based further on a location of the contact by the finger on the surface of the device.
3. The method of claim 1 or 2, wherein:
the determining of the contact parameter is based on the pressure exerted by the finger on the surface of the device; and the method further comprises
determining the pressure exerted by the finger based on an area of the contact on the surface of the device.
4. The method of any one of the preceding claims, wherein:
the determining of the contact parameter is based on the pressure exerted by the finger on the surface of the device; and the method further comprises
determining the pressure as a sliding average pressure exerted by the finger over a period of time.
5. The method of any one of the preceding claims, wherein:
the obtaining of the tactile output template comprises generating the tactile output template based on the determined contact parameter.
6. The method of any one of the preceding claims, wherein:
the obtaining of the tactile output template comprises adjusting the tactile output template based on the determined contact parameter.
7. The method of any one of the preceding claims, wherein:
the obtaining of the tactile output template comprises selecting a tactile element based on the determined contact parameter, the tactile element describing a tactilely perceivable surface feature and corresponding to the tactile output template; and
the obtaining of the tactile output template is based on its correspondence to the tactile element.
8. The method of any one of the preceding claims, wherein:
the determining of the contact parameter is based on an estimate of at least one of the speed of the finger, the direction of the finger, or the pressure of the finger; and the method further comprises calculating the estimate based on the input event report.
9. A device comprising:
a processor configured to:
access an input event report generated as a result of contact made by a finger of a user upon a surface of the device;
calculate an estimate of a contact parameter that represents at least one of a speed of the finger relative to the surface of the device, a direction of the finger relative to the surface of the device, or a pressure exerted by the finger on the surface of the device,
the calculating of the estimate being based on the input event report generated as a result of the contact; obtain an effect description based on a state change in a computer program and based on the estimate of the contact parameter; and
a controller configured to cause the surface of the device to render a tactile output based on the effect description to the finger of the user.
10. The device of claim 9, wherein:
the processor is configured to obtain the effect description by generating the effect description based on the state change.
11. The device of claim 9 or 10, wherein:
the processor is configured to obtain the effect description by modifying the effect description based on the estimate of the contact parameter.
12. The device of any one of claims 9 - 11, wherein:
the processor is configured to obtain the effect description by:
selecting a texture element [="texel"] map based on the estimate of the contact parameter,
the texel map spatially describing the tactile output and including multiple texels;
selecting a texel among the multiple texels in the texel map based on the estimate of the contact parameter,
the texel describing a corresponding portion of the tactile output for a location on the surface of the device; and
obtaining the effect description based on the texel.
13. The device of claim 12, wherein:
the processor is configured to estimate a spatial location of the contact by the finger on the surface of the device; and
the texel map is represented as a two-dimensional array of data elements indexed by spatial locations among which is the estimated spatial location of the contact by the finger.
14. The device of claim 12 or 13, wherein:
the texel map is represented as a multidimensional object-based data structure in which each of multiple objects assigns a corresponding portion of the tactile output to a spatial location.
15. The device of claim 12, 13 or 14, wherein:
the texel describes the corresponding portion of the tactile output for the location as a function of spatial coordinates with respect to the surface of the device.
16. The device of any one of claims 12 - 15, wherein:
the texel describes the corresponding portion of the tactile output for the location as a point-like effect description.
17. The device of claim 16, wherein:
the point-like effect description is expressed as a waveform.
18. The device of any one of claims 12 - 17, wherein:
the processor is configured to select the texel by:
determining a predicted segment of a path of contact for the
finger on the surface of the device; and
selecting the texel based on the predicted segment of the path.
19. The device of claim 18, wherein:
the processor is configured to determine the predicted segment of the predicted path by:
estimating a start location of the finger at a start of a time period; estimating an end location of the finger at an end of the time period;
estimating a width of an area of the contact on the surface of the device; and
determining the predicted segment based on at least one of the estimated start location, the estimated end location, or the estimated width of the area of the contact.
20. The device of any one of claims 9 - 19, wherein:
the processor is configured to obtain the effect description by obtaining a tactile output template based on the contact parameter, the tactile output template defining the tactile output that is
renderable by the surface of the device; and
the controller is configured to cause the surface of the device to render the tactile output based on the tactile output template to the finger of the user.