WO2024204546A1 - Haptic sensation presenting system - Google Patents
Haptic sensation presenting system
- Publication number
- WO2024204546A1 (PCT/JP2024/012671)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- tactile
- unit
- data
- presentation device
- sensation
- Prior art date
Links
- 230000035807 sensation Effects 0.000 title claims abstract description 84
- 238000006073 displacement reaction Methods 0.000 claims description 63
- 230000004044 response Effects 0.000 claims description 22
- 238000000034 method Methods 0.000 claims description 20
- 230000015541 sensory perception of touch Effects 0.000 claims 1
- 230000010365 information processing Effects 0.000 description 31
- 238000004891 communication Methods 0.000 description 28
- 238000012545 processing Methods 0.000 description 21
- 238000010586 diagram Methods 0.000 description 20
- 210000003811 finger Anatomy 0.000 description 15
- 230000008569 process Effects 0.000 description 13
- 230000001953 sensory effect Effects 0.000 description 13
- 230000015654 memory Effects 0.000 description 10
- 238000012790 confirmation Methods 0.000 description 7
- 235000013527 bean curd Nutrition 0.000 description 6
- 230000007423 decrease Effects 0.000 description 4
- 239000012530 fluid Substances 0.000 description 4
- 239000000463 material Substances 0.000 description 4
- 230000000007 visual effect Effects 0.000 description 4
- 229920001971 elastomer Polymers 0.000 description 3
- 238000010079 rubber tapping Methods 0.000 description 3
- 239000004576 sand Substances 0.000 description 3
- 230000008859 change Effects 0.000 description 2
- 230000000694 effects Effects 0.000 description 2
- 238000005401 electroluminescence Methods 0.000 description 2
- 230000005057 finger movement Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 229920001592 potato starch Polymers 0.000 description 2
- 238000005070 sampling Methods 0.000 description 2
- 229920001817 Agar Polymers 0.000 description 1
- 241000203475 Neopanax arboreus Species 0.000 description 1
- 230000001133 acceleration Effects 0.000 description 1
- 239000008272 agar Substances 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 230000004438 eyesight Effects 0.000 description 1
- 239000004744 fabric Substances 0.000 description 1
- 230000014509 gene expression Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000011435 rock Substances 0.000 description 1
- 229920002379 silicone rubber Polymers 0.000 description 1
- 239000004945 silicone rubber Substances 0.000 description 1
- 230000000638 stimulation Effects 0.000 description 1
- 210000003813 thumb Anatomy 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
Definitions
- This disclosure relates to a tactile presentation system.
- Patent Documents 1 to 3 disclose tactile presentation devices that use magnetorheological fluids (MRFs).
- The objective of this disclosure is to provide a tactile presentation system that allows customization of tactile sensations.
- The tactile presentation system according to the present disclosure includes a tactile reception unit that receives tactile specific information that identifies a desired tactile sensation in response to a user's operation, a tactile setting unit that sets the tactile specific information received by the tactile reception unit, and a tactile presentation device that presents the desired tactile sensation based on the tactile specific information set by the tactile setting unit.
- FIG. 1 is a schematic diagram showing the overall configuration of a tactile presentation system according to an embodiment of the present disclosure.
- FIG. 2 is a functional block diagram showing a configuration of the information processing device in FIG. 1.
- FIG. 3 is a table showing tactile data stored in the sensory DB in FIG. 2.
- FIG. 4 is a functional block diagram showing the configuration of the tactile presentation device in FIG. 1.
- FIG. 5 is a flow diagram showing the procedure of information processing by the information processing device shown in FIG. 2 and the tactile presentation device shown in FIG. 4.
- FIG. 6 is a flow diagram showing the procedure of the tactile specific information receiving and setting process in FIG. 5.
- FIG. 7 is a diagram showing a standard tactile confirmation screen displayed in step S31 in FIG. 6.
- FIG. 8 is a diagram showing a screen displayed after the confirmation screen shown in FIG. 7.
- FIG. 9 is a diagram showing the hardness selection screen displayed in step S32 in FIG. 6.
- FIG. 10 is a diagram showing the texture selection screen displayed in step S36 in FIG. 6.
- FIG. 11 is a timing diagram showing typical currents supplied to the MRF device in FIG. 4.
- FIG. 12 is a table showing tactile data including the frequencies adjusted in step S38 in FIG. 6.
- FIG. 13 is a timing diagram illustrating the current with the adjusted frequency shown in FIG. 12.
- FIG. 14 is a timing diagram showing the current having an adjusted duty cycle in addition to the adjusted frequency shown in FIG. 13.
- FIG. 15 is a diagram showing the pattern selection screen displayed in step S40 in FIG. 6.
- FIG. 16 is a diagram showing a confirmation screen for the desired tactile sensation displayed in step S44 in FIG. 6.
- FIG. 17 is a diagram showing a hardness and texture selection screen different from those in FIGS. 9 and 10.
- the tactile presentation system includes a tactile reception unit that receives tactile identification information that identifies a desired tactile sensation in response to a user's operation, a tactile setting unit that sets the tactile identification information received by the tactile reception unit, and a tactile presentation device that presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit.
- the tactile presentation system further includes a display unit that displays a coordinate system including a first axis indicating the first tactile specific information and a second axis indicating the second tactile specific information, and an operator that is arranged on the coordinate system and can be displaced in response to a user's operation.
- the tactile reception unit receives the displacement of the operator displayed by the display unit.
- the tactile setting unit sets the first and second tactile specific information corresponding to the displacement of the operator received by the tactile reception unit.
- the tactile presentation system further includes a display unit that displays a plurality of options corresponding to a plurality of types of tactile sensations.
- the tactile reception unit selects one option from the plurality of options displayed by the display unit in response to a user operation.
- the tactile setting unit sets tactile specific information corresponding to the option selected by the tactile reception unit.
- the tactile setting unit sets the current value to be supplied to the tactile presentation device as tactile specific information.
- the tactile setting unit sets the frequency of the current to be supplied to the tactile presentation device as tactile specific information.
- a tactile presentation method by a computer includes the steps of receiving tactile identification information that identifies a desired tactile sensation in response to a user's operation, setting the received tactile identification information, and controlling a tactile presentation device to present the desired tactile sensation based on the set tactile identification information.
- the tactile presentation program causes a computer to execute the steps of accepting tactile identification information that identifies a desired tactile sensation in response to a user's operation, setting the accepted tactile identification information, and controlling the tactile presentation device to present the desired tactile sensation based on the set tactile identification information.
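To make the roles of the reception unit, the setting unit, and the presentation device concrete, the following is a minimal sketch of the receive / set / present flow summarized above. All class, method, and field names (TactileSpec, receive, set, present, send_tactile_data) are illustrative assumptions and are not terms from the disclosure.

```python
# Minimal sketch of the receive -> set -> present flow; names are assumptions.
from dataclasses import dataclass

@dataclass
class TactileSpec:
    """Tactile specific information identifying a desired sensation."""
    current_scale: float = 1.0    # scales the current supplied to the device (hardness)
    frequency_scale: float = 1.0  # scales the frequency of that current (texture)

class TactileReceptionUnit:
    def receive(self, hardness_factor: float, texture_factor: float) -> TactileSpec:
        # Accepts the user's selections (e.g. option buttons or a 2-D pointer).
        return TactileSpec(current_scale=hardness_factor, frequency_scale=texture_factor)

class TactileSettingUnit:
    def __init__(self) -> None:
        self.spec = TactileSpec()  # start from the standard tactile sensation
    def set(self, spec: TactileSpec) -> TactileSpec:
        self.spec = spec
        return self.spec

def present(device, spec: TactileSpec) -> None:
    # "device" stands in for the tactile presentation device; it would be sent
    # tactile data whose current and frequency are scaled by the spec.
    device.send_tactile_data(spec)
```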
- FIG. 1 is a schematic diagram showing a tactile presentation system 100 in this embodiment.
- the tactile presentation system 100 is a game system, and includes an information processing device 1 and a tactile presentation device (hereinafter, sometimes referred to as a tap unit) 2.
- the information processing device 1 and the tactile presentation device 2 are connected by short-range wireless communication to transmit and receive data between them.
- the information processing device 1 is a smartphone.
- the information processing device 1 is not limited to a smartphone, and may be a tablet terminal.
- the information processing device 1 may be a laptop PC (Personal Computer).
- the information processing device 1 may be a dedicated gaming device.
- the information processing device 1 may be a dedicated gaming device integrated with the tactile presentation device 2, that is, a gaming machine that includes the tactile presentation device 2 in the controller portion.
- the tactile presentation device 2 is a device that allows an operator to operate the device by moving his/her finger while holding the finger along the displacement unit 202.
- the tactile presentation device 2 reads the position of the displacement unit 202, which is displaced by the operator's finger movement, and controls the built-in MRF (Magneto-Rheological Fluid) device 24 according to the position to generate a force sensation due to a reaction force (rotational resistance) against the operator's operation of the displacement unit 202, thereby presenting a tactile sensation.
- the form of the displacement unit 202 of the tactile presentation device 2 is not limited to that shown in FIG. 1, and may be a button, a stick, or a cushion-like object covered with a cover.
- the tactile presentation device 2 may employ a motor or a piezoelectric element instead of the MRF device 24, and may generate a force sensation due to a rotational force or vibration in response to the operator's operation, and may be combined with a device that presents vibration, a warm sensation, a cold sensation, or an electrical stimulation in addition to the displacement unit 202. It can also be installed on the ground or a wall and operated by the operator's palm, foot, etc.
- an information processing device 1 with a game application program (hereafter referred to as a game app) installed presents images (visual) from a built-in display unit 14, sounds (auditory) from an audio output unit 15, and tactile sensations from a connected tactile presentation device 2 based on acquired game content.
- the information processing device 1 displays a character C on the display unit 14, and when an operator presses the displacement unit 202 of the tactile presentation device 2 with a finger, it changes the image of the displayed character while outputting a tactile sensation according to the type and level of character C, and outputs audio or sound effects according to the change from the audio output unit 15.
- the tactile presentation system 100 allows the user to select the character C to be operated.
- the information processing device 1 changes the tactile sensation in the tactile presentation device 2 according to the character C, and changes and outputs images and sounds according to the displacement in the displacement unit 202.
- FIG. 2 is a block diagram showing the internal configuration of the information processing device 1.
- the information processing device 1 includes a processing unit 10, a storage unit 11, a first communication unit 12, a second communication unit 13, a display unit 14, an audio output unit 15, and an operation unit 16.
- the information processing device 1 further includes a haptic reception unit 17 and a haptic setting unit 18.
- the processing unit 10 is a processor that uses a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit).
- The processing unit 10 executes game-related processing based on the game application P1 stored in the storage unit 11.
- the storage unit 11 uses a non-volatile memory such as a flash memory or SSD.
- the storage unit 11 stores the game application P1, a sensory DB 110 described below, and other data referenced by the processing unit 10.
- the game application P1 may be a game application P8 stored in the storage medium 8 that has been read by the processing unit 10 and copied to the storage unit 11.
- the game application P1 is downloaded from the server device via the first communication unit 12 and stored in an executable manner.
- the storage unit 11 stores a sensory DB 110 that contains tactile data, visual (images, videos, text) data, and auditory (audio, sound effects) data corresponding to characters.
- the sensory DB 110 stores the tactile data, visual data, and auditory data to be output for each displacement data of the displacement unit 202 in the tactile presentation device 2, in association with a content ID that identifies a character or item.
- the first communication unit 12 realizes communication with the server device via a network including the Internet or a carrier network.
- the first communication unit 12 may be a wireless communication device that connects to a carrier network, or may be a wireless communication device for Wi-Fi.
- The processing unit 10 can send and receive data to and from the server device via the first communication unit 12.
- the second communication unit 13 is a communication module for short-range wireless communication, for example Bluetooth (registered trademark).
- the processing unit 10 can transmit and receive data to and from the tactile presentation device 2 via the second communication unit 13.
- the display unit 14 is a display such as a liquid crystal display or an organic EL (Electro Luminescence) display.
- the display unit 14 is, for example, a display with a built-in touch panel.
- The processing unit 10 displays on the display unit 14 game content stored in the storage unit 11 or game content such as images and text provided from the server device based on the game application P1.
- the audio output unit 15 includes a speaker, etc.
- the processing unit 10 outputs game content stored in the storage unit 11 or game content such as sound and music provided by the server device from the audio output unit 15 based on the game application P1.
- the operation unit 16 is a user interface capable of inputting and outputting data to and from the processing unit 10, and is a touch panel built into the display unit 14.
- the operation unit 16 may be a physical button.
- the operation unit 16 may also serve as a voice input unit.
- the tactile reception unit 17 receives tactile identification information that identifies a desired tactile sensation in response to a user's operation.
- the tactile setting unit 18 sets the tactile identification information received by the tactile reception unit 17.
- the tactile presentation device 2 presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit 18.
- the processing unit 10 executes the game application P1, thereby causing the computer to function as the tactile reception unit 17 and the tactile setting unit 18.
- FIG. 3 is an explanatory diagram showing an example of the contents of the sensory DB 110.
- the sensory DB 110 stores, as tactile data, the value and frequency of the current supplied to the MRF device 24 of the tactile presentation device 2 for each displacement amount (angle) of the displacement unit 202.
- the sensory DB 110 may also store visual data, auditory data, etc.
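As a rough illustration of the tactile data in FIG. 3, the sketch below models the sensory DB 110 as a per-content table holding a current value and frequency for each displacement angle (0 to 90 degrees in 1-degree steps, as described later). The data-structure layout, field names, and the 0.5 A / 100 Hz example values are assumptions for illustration only.

```python
# Illustrative sketch of the tactile data stored in the sensory DB 110
# (structure inferred from FIG. 3; names and example values are assumptions).

standard_tactile_data = {
    # content_id -> {displacement_angle_deg: (current_A, frequency_Hz)}
    "character_C": {
        angle: (0.5, 100)          # e.g. 0.5 A at 100 Hz for every angle
        for angle in range(0, 91)  # movable range: 0 to 90 degrees in 1-degree steps
    },
}

def lookup(content_id: str, angle: int) -> tuple[float, float]:
    """Return (current in A, frequency in Hz) for a measured displacement angle."""
    return standard_tactile_data[content_id][angle]
```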
- FIG. 4 is a block diagram showing the configuration of the tactile presentation device 2.
- The tactile presentation device 2 is configured by providing a flat, bottomed, cylindrical gripping body 200 with a belt-like flat-plate displacement unit 202 having a curved section that partially follows the circumferential direction.
- The displacement unit 202 is made of a material that is itself flexible, but it may instead be made of a highly rigid material and rotatably supported by the gripping body 200 via a support shaft.
- A cloth-tape-like fastener 203 is provided on the outer surface of the tip of the displacement unit 202.
- A link mechanism 204 that connects to the rotating shaft of the rotor of the MRF device 24 housed inside the gripping body 200 is provided on the inner surface of the tip of the displacement unit 202.
- The displacement unit 202 may be provided with a material that adds a variety of textures, such as silicone rubber or a material with fur on its surface.
- The operator holds the gripping body 200 with, for example, the thumb and middle finger, and inserts the index finger and other fingers into the fastener 203 along the displacement unit 202.
- The operator can move the displacement unit 202 by pushing the index finger in, and can also move the displacement unit 202 away from the gripping body 200 by extending the index finger.
- In the tactile data shown in FIG. 3, the displacement amount (angle) of the displacement unit 202 is 0 degrees when the displacement unit 202 is at its upper limit.
- The displacement amount (angle) is 90 degrees when the displacement unit 202 is at its lower limit.
- The movable range of the displacement unit 202 is thus 90 degrees from the upper limit to the lower limit, but is not limited to this.
- The tactile data is set in increments of 1 degree, but it is possible to set it more finely, for example, in increments of 0.1 degrees.
- the tactile presentation device 2 includes a gripping body 200 as shown in FIG. 1, a control unit 20, a memory unit 21, a communication unit 22, a power supply unit 23, an MRF device 24, and a sensor 25.
- the gripping body 200 has an internal MRF device 24.
- the control unit 20, the memory unit 21, the communication unit 22, and the power supply unit 23 may be provided integrally with the gripping body 200, or may be provided separately and connected to the gripping body 200 wirelessly or via a wire.
- the control unit 20 includes a processor such as a CPU and an MPU (Micro-Processing Unit), and memories such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the control unit 20 is, for example, a microcontroller.
- the control unit 20 controls each component based on a control program P2 stored in the built-in ROM, and realizes tactile presentation.
- the storage unit 21 is an auxiliary memory for the control unit 20, and stores the control data (haptic data) of the MRF device 24 in a rewritable manner.
- the communication unit 22 is a communication module for short-range wireless communication, such as Bluetooth (registered trademark).
- the control unit 20 can transmit and receive data to and from the information processing device 1 via the communication unit 22.
- the control unit 20 is connected to the power supply unit 23, the MRF device 24, and the sensor 25 via I/O, and exchanges signals with each other.
- the power supply unit 23 includes a rechargeable battery. When the power supply unit 23 is turned on, it supplies power to each component and the MRF device 24.
- the MRF device 24 has a yoke that is arranged to sandwich a disk-shaped rotor with a gap between them, and generates a magnetic field by passing a control current through a coil attached to the yoke, and controls the viscosity (shear stress) of the magnetorheological fluid sealed in the gap to provide rotational resistance to the rotor.
- When the control unit 20 adjusts the magnitude of the control current to the MRF device 24, the rotational resistance changes immediately.
- the sensor 25 measures the position (angle) of the displacement unit 202 and outputs it to the control unit 20.
- the sensor 25 measures the displacement of the displacement unit 202 as an angle and outputs it.
- the sensor 25 may be composed of multiple sensors such as a gyro sensor and an acceleration sensor.
- When the displacement unit 202 is operated by the operator, the displacement of the displacement unit 202 is transmitted in the rotational direction to the rotation shaft of the rotor of the MRF device 24 via the link mechanism 204.
- When the MRF device 24 is not operating, that is, while the control current is zero, the rotation shaft rotates freely, so the displacement unit 202 moves without resistance.
- The viscosity (shear stress) of the magnetorheological fluid inside the MRF device 24 changes according to the magnitude of the current flowing to the MRF device 24.
- The control unit 20 can change the resistance felt at the displacement unit 202, and the way that resistance appears, by continuously changing the magnitude of the current to the MRF device 24 or by oscillating the current value at a predetermined frequency.
- The tactile presentation device 2 can present a slippery sensation by varying the resistance (current value) according to the amount of pressure applied to the displacement unit 202, a firm sensation by increasing the resistance as the pressure increases, or a crunchy sensation by repeatedly varying the resistance or by repeatedly switching the current on and off in a square-wave pattern.
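As a hedged illustration of the preceding paragraph, the sketch below shows how a control current could be shaped to produce two of the feels described ("firm" and "crunchy"). The function names, curve shapes, and constants are assumptions made for illustration, not values from the disclosure.

```python
# Hedged sketch: shaping the MRF control current to produce different feels.
# Curve shapes and constants are illustrative assumptions.

MAX_CURRENT_A = 0.5  # example peak current, matching the 0.5 A example used later

def firm_current(displacement_deg: float) -> float:
    """'Firm' feel: resistance (current) grows as the displacement increases."""
    return MAX_CURRENT_A * min(1.0, max(0.0, displacement_deg / 90.0))

def crunchy_current(displacement_deg: float, step_deg: float = 5.0) -> float:
    """'Crunchy' feel: the current is repeatedly switched on and off
    (square-wave style) as the displacement unit moves."""
    on = int(displacement_deg // step_deg) % 2 == 0
    return MAX_CURRENT_A if on else 0.0
```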
- FIG. 5 is a flowchart showing an example of the processing procedure based on the game application P1.
- the information processing procedure by the information processing device 1 shown in FIG. 5 is executed when the game application P1 is launched, or when the communication connection with the tactile presentation device 2 is cut off.
- the processing unit 10 establishes a communication connection with the tactile presentation device 2 via the second communication unit 13 (step S11).
- the processing unit 10 reads out standard tactile data from the sensory DB 110 (step S12) and transmits the read out standard tactile data to the tactile presentation device 2 (step S13).
- The standard tactile data read from the sensory DB 110 in step S12 is a table that stores the corresponding current value and frequency for each displacement amount (angle), as shown in FIG. 3.
- the tactile presentation device 2 executes the process shown on the right side of FIG. 5 based on the transmitted tactile data.
- the control unit 20 of the tactile presentation device 2 executes the following process.
- When the tactile presentation device 2 receives tactile data from the information processing device 1 during the communication connection (step S21), it stores the data in the storage unit 21 (step S22).
- the tactile data received in step S21 is a list table of the current values and frequencies for each amount of displacement described above.
- the control unit 20 of the tactile presentation device 2 samples a signal corresponding to the displacement amount (angle) of the displacement unit 202 output from the sensor 25 (step S23).
- the control unit 20 transmits the displacement amount obtained by sampling to the information processing device 1 (step S24).
- the displacement amount (angle) may be a relative displacement from the upper end, or may be an absolute position detected by the sensor 25.
- The control unit 20 looks up, from the tactile data stored in the storage unit 21, the current value corresponding to the displacement amount obtained by sampling (step S25), and outputs the referenced current to the MRF device 24 (step S26).
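The loop in steps S21 to S26 can be summarized by the following sketch. The comm, sensor, mrf, and storage objects and all of their method names are placeholders assumed for illustration; they do not name real firmware interfaces of the control unit 20.

```python
# Hedged sketch of the tap-unit loop in steps S21 to S26 (placeholder interfaces).

def tap_unit_loop(comm, sensor, mrf, storage):
    while True:
        # S21-S22: store any newly received tactile data
        data = comm.poll_tactile_data()          # dict: angle -> (current_A, freq_Hz)
        if data is not None:
            storage.update(data)

        # S23-S24: sample the displacement angle and report it back
        angle = round(sensor.read_angle())
        comm.send_displacement(angle)

        # S25-S26: look up and output the current for that displacement
        current_a, freq_hz = storage[angle]
        mrf.drive(current_a, freq_hz)
```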
- the tactile reception unit 17 receives tactile specific information that identifies the desired tactile sensation in response to the user's operation (step S14).
- the tactile setting unit 18 sets the tactile specific information received by the tactile reception unit 17 (step S15).
- the tactile presentation device 2 presents the desired tactile sensation based on the tactile specific information set by the tactile setting unit 18 (steps S21 to S26).
- the display unit 14 displays a confirmation screen for the standard tactile sensation, as shown in FIG. 7 (step S31). Specifically, the display unit 14 displays an animation of the tactile presentation device 2 and guides the user on how to operate the tactile presentation device 2. Additionally, the audio output unit 15 may also guide the user on the operation method by audio. Initially, the tactile presentation device 2 presents a standard tactile sensation. The user can confirm the standard tactile sensation presented by the tactile presentation device 2 by operating the tactile presentation device 2.
- the display unit 14 displays a screen for selecting whether or not to create the desired tactile sensation, as shown in FIG. 8.
- the animation of the tactile presentation device 2 continues as is, and the tactile sensation presented by the tactile presentation device 2 also continues as is.
- When the user taps the displayed Create button, the process of creating the desired tactile sensation begins.
- When the user taps the displayed OK button, the standard tactile sensation is maintained.
- The display unit 14 displays a hardness selection screen as shown in FIG. 9 (step S32). Specifically, the display unit 14 displays five options. In this example, to express hardness, "raw egg," "tofu," "initial," "gummy," and "rock okoshi" are displayed in order from softest to hardest. When this screen is first displayed, the standard hardness "initial" is selected by default.
- the haptic reception unit 17 selects the desired hardness in response to the user's tap operation (step S33). That is, the haptic reception unit 17 receives haptic specific information that specifies the desired hardness.
- The haptic setting unit 18 adjusts the current-value parameters of the standard haptic data in response to the selected hardness (step S34). That is, the haptic setting unit 18 sets the haptic specific information that specifies the desired hardness received by the haptic reception unit 17. Specifically, when "raw egg" is selected, the current value is set to, for example, 0.6 times that of the standard haptic data. When "tofu" is selected, the current value is set to, for example, 0.8 times that of the standard haptic data.
- When "initial" is selected, the current value is maintained as in the standard haptic data.
- When "gummy" is selected, the current value is set to, for example, 1.2 times that of the standard haptic data.
- When "rock okoshi" is selected, the current value is set to, for example, 1.4 times that of the standard haptic data. Note that no adjustment is made to parameters other than the current value of the standard haptic data (for example, "frequency").
- the "raw egg” used in the explanation here is merely an example, and other expressions of hardness may be used.
- the current value multiplier is also merely an example, and other multipliers may be used.
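Under those example multipliers, step S34 amounts to scaling only the current column of the tactile data table. The sketch below assumes the soft-to-hard ordering of the options when assigning the 1.2x and 1.4x factors; the names and data structure are illustrative, not taken from the disclosure.

```python
# Sketch of step S34: scale only the current values of the standard tactile data
# according to the selected hardness. Multipliers follow the example values in the
# text; the "gummy"/"rock okoshi" assignments assume the soft-to-hard ordering.

HARDNESS_FACTOR = {
    "raw egg": 0.6,
    "tofu": 0.8,
    "initial": 1.0,
    "gummy": 1.2,
    "rock okoshi": 1.4,
}

def adjust_hardness(standard: dict, hardness: str) -> dict:
    """standard: dict angle -> (current_A, freq_Hz). Only the current is scaled."""
    k = HARDNESS_FACTOR[hardness]
    return {angle: (current * k, freq) for angle, (current, freq) in standard.items()}
```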
- the second communication unit 13 transmits tactile data including the adjusted current value to the tactile presentation device 2 (step S35).
- the tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile sensation of hardness according to the adjusted current value. This allows the user to experience the selected hardness through the tactile presentation device 2.
- the user can reselect the hardness any number of times until tapping the OK button, and each time a hardness different from the currently selected hardness is selected, the information processing device 1 adjusts the current value according to the selected hardness and transmits tactile data including the adjusted current value to the tactile presentation device 2 (steps S32 to S35).
- the tactile presentation device 2 performs the processes of steps S21 to S26 to present a tactile sensation based on the latest tactile data received.
- When the user taps the OK button, the tactile setting unit 18 determines the currently selected hardness as the desired hardness.
- the processing unit 10 stores the latest tactile data with the adjusted current value (hereinafter referred to as "first adjusted tactile data") in the sensory DB 110.
- The display unit 14 displays a texture selection screen as shown in FIG. 10 (step S36). Specifically, the display unit 14 displays five options. In this example, to express the textures, "rubber," "potato starch," "initial," "electric shock," and "sand" are displayed in order from smoothest to roughest. When this screen is first displayed, the standard texture "initial" is selected by default. In this state, the tactile presentation device 2 stores the same first adjusted tactile data as stored in the sensory DB 110, and presents a tactile sensation corresponding to the first adjusted tactile data.
- the haptic reception unit 17 selects the desired texture in response to the user's tap operation (step S37). That is, the haptic reception unit 17 receives haptic specific information that identifies the desired texture.
- the haptic setting unit 18 adjusts the frequency parameters of the first adjusted haptic data in response to the selected texture (step S38). That is, the haptic setting unit 18 sets the haptic specific information that identifies the desired texture received by the haptic reception unit 17. Specifically, when “rubber” is selected, the frequency is set to, for example, 0.5 times the standard haptic data. When “potato starch” is selected, the frequency is set to, for example, 0.8 times the standard haptic data.
- When "initial" is selected, the frequency is maintained as in the standard haptic data. When "agar" is selected, the frequency is set to, for example, 1.6 times that of the standard haptic data. When "sand" is selected, the frequency is set to, for example, 2.0 times that of the standard haptic data.
- the "rubber” and the like used in the explanation here are merely examples, and other textures may be used. Additionally, the frequency multiplication factor is merely an example, and other multiplication factors may be used.
- For example, suppose that in the standard tactile data the current is 0.5 A and the frequency is 100 Hz (the period is 0.01 seconds).
- In this case, the MRF device 24 is supplied with the current shown in FIG. 11. The duty ratio of this current is 50%.
- When the frequency of this tactile data (current 0.5 A, frequency 100 Hz) is adjusted in response to the selected texture, the current remains at 0.5 A, but the frequency is increased to, for example, 200 Hz (the period is 0.005 seconds), as in the tactile data shown in FIG. 12.
- In that case, the current shown in FIG. 13 is supplied to the MRF device 24.
- The duty ratio of this current is also 50%, but the feel may also be changed by changing the duty ratio, as shown in FIG. 14.
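The currents in FIGS. 11, 13, and 14 can be understood as square waves defined by a peak current, a frequency, and a duty ratio, as in this hedged sketch. The FIG. 14 duty value used in the comments is an assumption, since the text only says that the duty ratio is changed.

```python
# Sketch of a square-wave drive current: toggles between 0 A and the set value
# at the set frequency and duty ratio.

def square_wave_current(t_s: float, current_a: float = 0.5,
                        freq_hz: float = 100.0, duty: float = 0.5) -> float:
    period = 1.0 / freq_hz           # 0.01 s at 100 Hz, 0.005 s at 200 Hz
    phase = (t_s % period) / period  # position within the current period
    return current_a if phase < duty else 0.0

# FIG. 11: square_wave_current(t, 0.5, 100, 0.5)
# FIG. 13: square_wave_current(t, 0.5, 200, 0.5)   # frequency doubled
# FIG. 14: square_wave_current(t, 0.5, 200, 0.25)  # duty ratio changed (value assumed)
```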
- the second communication unit 13 transmits the first adjusted tactile data (hereinafter referred to as "second adjusted tactile data") with the frequency parameters adjusted to the tactile presentation device 2 (step S39).
- the tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile sensation according to the received second adjusted tactile data. This allows the user to experience the selected texture through the tactile presentation device 2.
- the user can select the texture any number of times until tapping the OK button, and each time a texture different from the currently selected texture is selected, the corresponding texture (touch) can be presented to the user.
- the information processing device 1 adjusts the frequency parameters of the first adjusted tactile data according to the selected texture, and transmits second adjusted tactile data including the adjusted frequency to the tactile presentation device 2 (steps S36 to S39).
- the tactile presentation device 2 presents a tactile sensation based on the latest received second adjusted tactile data by executing the processes of steps S21 to S26.
- When the user taps the OK button, the tactile setting unit 18 determines the currently selected texture as the desired texture.
- The processing unit 10 stores the latest second adjusted tactile data in the sensory DB 110 separately from the first adjusted tactile data.
- the display unit 14 displays a pattern selection screen as shown in FIG. 15 (step S40). Specifically, the display unit 14 displays five options. When this screen is first displayed, the standard pattern "initial" is selected by default. In the standard pattern, the current value is constant even if the displacement (angle) changes. The hardness and texture have already been selected above. In this state, the tactile presentation device 2 stores the same second adjusted tactile data stored in the sensory DB 110, and presents a tactile sensation corresponding to the second adjusted tactile data. In the other four patterns, the displacement (angle) is shown on the horizontal axis, and the current value is shown on the vertical axis. In the top pattern, the current value gradually increases while the displacement is small, and the current value suddenly increases as the displacement increases.
- In another pattern, the current value increases suddenly while the displacement is small, and increases gradually as the displacement increases.
- In another pattern, the current value decreases gradually while the displacement is small, and decreases suddenly as the displacement increases.
- In the remaining pattern, the current value decreases suddenly while the displacement is small, and then decreases gradually as the displacement increases.
- the haptic reception unit 17 selects a desired pattern in response to the user's tap operation (step S41). That is, the haptic reception unit 17 receives haptic specific information that specifies the desired pattern.
- the haptic setting unit 18 adjusts the parameter of the current value of the second adjusted haptic data in response to the selected pattern (step S42). That is, the haptic setting unit 18 sets haptic specific information that specifies the desired pattern received by the haptic reception unit 17.
- The current value is adjusted by multiplying the current-value parameter of the second adjusted haptic data by a predetermined coefficient for each displacement. For example, when a pattern that increases quadratically is selected, the current value corresponding to each displacement is multiplied by a coefficient of nX² (where X is the displacement and n is a predetermined constant).
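As a sketch of that coefficient-based adjustment in step S42, the following applies a quadratic coefficient nX² to the current value at each displacement. The choice of n is an assumption made only so that the coefficient equals 1 at the 90-degree end of the range; the function name and data layout are likewise illustrative.

```python
# Sketch of step S42: reshape the current-versus-displacement curve by multiplying
# each current value by a displacement-dependent coefficient (here n * X**2).
# n is an assumption chosen so the coefficient is 1 at 90 degrees.

def apply_pattern(tactile_data: dict, n: float = 1.0 / 90.0 ** 2) -> dict:
    """tactile_data: dict angle -> (current_A, freq_Hz)."""
    return {
        angle: (current * n * angle ** 2, freq)
        for angle, (current, freq) in tactile_data.items()
    }
```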
- the second communication unit 13 transmits the second adjusted tactile data (hereinafter referred to as "third adjusted tactile data") in which the current value parameters have been adjusted to the tactile presentation device 2 (step S43).
- the tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile pattern according to the received third adjusted tactile data. Therefore, the user can experience the tactile pattern selected through the tactile presentation device 2.
- the user can reselect a pattern any number of times until tapping the OK button, and each time a pattern different from the currently selected pattern is selected, a corresponding pattern (tactile sensation) can be presented to the user.
- That is, the information processing device 1 adjusts the current-value parameters of the second adjusted tactile data according to the selected pattern, and transmits third adjusted tactile data including the adjusted current values to the tactile presentation device 2 (steps S40 to S43).
- the tactile presentation device 2 presents a tactile sensation based on the latest received third adjusted tactile data by executing the processes of steps S21 to S26.
- When the user taps the OK button, the tactile setting unit 18 determines the currently selected pattern as the desired pattern.
- the processing unit 10 stores the latest third adjusted tactile data in the sensory DB 110 separately from the first adjusted tactile data and the second adjusted tactile data.
- the pattern is not limited to these, but may be wavy, zigzag, etc. Also, instead of displaying the pattern as a shape, it may be displayed as words such as "wavy,” “zigzag,” “squiggly,” or “flat.”
- the display unit 14 displays a confirmation screen for the created tactile sensation, as shown in FIG. 16 (step S44).
- the tactile presentation device 2 stores the same third adjusted tactile data as that stored in the sensory DB 110, and presents a tactile sensation corresponding to the third adjusted tactile data. Therefore, the user can experience the created tactile sensation through the tactile presentation device 2.
- When the user taps the Remake button, the process returns to step S32. It may also return to step S36 or S40.
- In that case, the third adjusted tactile data can be recreated partway through by using the first adjusted tactile data or the second adjusted tactile data stored in the sensory DB 110.
- at least one of the set current value, frequency, and pattern may be fine-tuned. For fine adjustment, a displaceable operator such as a slide bar may be displayed, and this operator may be displaced according to the user's operation.
- a displaceable operator such as a pointer may be displayed, and the operator may be displaced in two dimensions in response to a user's operation.
- two-dimensional coordinates including an X axis and a Y axis and an operator arranged on the two-dimensional coordinates are displayed.
- the X axis indicates the touch.
- the Y axis indicates the hardness. The user can set the touch and the hardness at the same time by moving the operator.
- When the user taps the OK button, the process proceeds to step S40.
- In this case, the screen shown in FIG. 17 is used instead of the screens shown in FIG. 9 and FIG. 10.
- When the OK button is touched on the screen shown in FIG. 17, the screens shown in FIG. 15 and FIG. 16 are displayed in order.
- When the Remake button is touched on the screen shown in FIG. 16, the tactile sensation set immediately before is selected, so the pointer is placed at a position corresponding to that tactile sensation.
- the origin of the coordinates corresponds to the standard tactile sensation data.
- When the screen shown in FIG. 17 is first displayed, the pointer is placed at the origin of the coordinates.
- the user touches the pointer and swipes it to move the pointer on the coordinates.
- the information processing device 1 adjusts the tactile data in real time in response to the movement of the pointer (Y-axis: current value, X-axis: frequency), and transmits the adjusted tactile data (corresponding to the second adjusted tactile data in the above embodiment) to the tactile presentation device 2.
- the tactile data is adjusted based on the standard tactile data.
- Specifically, the current parameter of the standard tactile data is multiplied by a coefficient corresponding to the position on the Y-axis, and the frequency parameter of the standard tactile data is multiplied by a coefficient corresponding to the position on the X-axis, to generate new tactile data.
- At the origin of the coordinates, the current and frequency coefficients are both "1".
- the current coefficient becomes larger than “1” in the "hard” direction, and becomes smaller than “1” in the “soft” direction. However, it never becomes 0 or less.
- The frequency coefficient becomes larger than "1" in the "rough" direction, and becomes smaller than "1" in the "smooth" direction. However, its minimum value is "0".
- In this example, tactile data is transmitted to the tactile presentation device 2 in real time according to the position of the pointer, but tactile data may instead be transmitted only for a position where the finger stops moving for a predetermined period of time or where the finger is released from the pointer. The coordinates have limits at the top, bottom, left, and right. For example, if the user swipes diagonally upward to the right after the limit of the X-axis has been reached, the pointer moves upward (in the Y-axis direction) without moving to the right (in the X-axis direction). In this example as well, the user can experience the tactile sensation through the tactile presentation device 2 at the time the tactile data is received by the tactile presentation device 2.
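A hedged sketch of this pointer-to-coefficient mapping is shown below. The normalized coordinate range and the exact clamping constants are assumptions; only the constraints stated in the text (both coefficients equal 1 at the origin, the current coefficient always above 0, the frequency coefficient no smaller than 0) are taken from the disclosure.

```python
# Sketch of the FIG. 17 mapping: Y position scales the current (hardness),
# X position scales the frequency (texture). Constants are assumptions.

X_LIMIT = Y_LIMIT = 1.0  # coordinate limits (assumed normalized to +/-1)

def pointer_to_coefficients(x: float, y: float) -> tuple[float, float]:
    # clamp the pointer to the displayed coordinate range
    x = max(-X_LIMIT, min(X_LIMIT, x))
    y = max(-Y_LIMIT, min(Y_LIMIT, y))
    current_coeff = max(0.01, 1.0 + y)    # >1 toward "hard", <1 toward "soft", never <= 0
    frequency_coeff = max(0.0, 1.0 + x)   # >1 toward "rough", <1 toward "smooth", min 0
    return current_coeff, frequency_coeff

# At the origin (0, 0) both coefficients are 1, i.e. the standard tactile data.
```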
- This screen can also include the pattern selection screen in FIG. 15. This allows the third adjusted tactile data to be created on a single screen.
- the above system includes a tactile presentation device and a smartphone that can be connected to the device, but it may also be a tactile presentation system that includes a touch panel monitor that integrates the tactile presentation device and the smartphone.
- the tactile presentation device is not limited to one using an MRF, but may be another passive type device, or an active type device using a motor or the like. It may also be a device that combines passive and active types.
- non-transitory storage media that store programs for causing a computer to function as an information processing device, a tactile presentation device, or a tactile presentation system are also included in the embodiments of the present invention.
- Touch panel devices are available that use vibrations to guide users with weak eyesight on how to use them, and the present invention can also be applied to such devices.
- multiple tactile presentation devices may be paired so that tactile sensations can be set simultaneously. Furthermore, by setting a tactile sensation on one device, the other device may automatically be set to the same tactile sensation. Furthermore, in a five-finger glove-type device, by setting a tactile sensation on one finger, the same tactile sensation may be automatically set on all five fingers. Furthermore, by setting a tactile sensation on a basic finger, the tactile sensation may be automatically set for each finger by referring to the set tactile sensation.
- confirmation coordinates may be displayed separately from the coordinates operated by the user, so that the position of the pointer can be confirmed using these coordinates.
- the confirmation coordinates may be an enlarged display of only the area around the position of the user's finger.
- The resolution and scale of the coordinates may also be changed by a pinch operation.
- For example, when the pinch operation is performed along the X-axis direction (or the Y-axis direction), only the scale of the parameter in that direction is changed.
- The scales of the X and Y parameters can also be changed simultaneously, or changed at a ratio according to the pinch direction.
- the animation and sound of the tactile presentation device 2 described in FIG. 7 and FIG. 8 may be displayed.
- the animation and sound may be changed in conjunction with the operation of the tactile presentation device 2.
- Furthermore, an animation of an object that evokes the tactile sensation may be further displayed, and sound associated with the animation may be output.
- For example, when "tofu" is selected on the screen in FIG. 9, an animation corresponding to "tofu" is displayed.
- When the user then operates the tactile presentation device 2, an animation of the tofu being crushed in conjunction with the operation is displayed, and the sound generated when tofu is crushed is output.
- Alternatively, an object whose tactile data is close to the third adjusted tactile data generated by the combination of the three selections made in FIG. 9, FIG. 10, and FIG. 15 may be detected by referring to a tactile database (not shown) in which tactile data of various objects is stored, and an animation of that object may be displayed and its sound may be output.
- 2: Tactile presentation device, 17: Tactile reception unit, 18: Tactile setting unit, 24: MRF device, 100: Tactile presentation system
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Provided is a haptic sensation presenting system. A haptic sensation presenting system 100 comprises: a haptic sensation accepting unit 17 that accepts haptic sensation specification information specifying a desired haptic sensation in accordance with an operation performed by a user; a haptic sensation setting unit 18 that sets the haptic sensation specification information accepted by the haptic sensation accepting unit 17; and a haptic sensation presenting device 2 that presents the desired haptic sensation on the basis of the haptic sensation specification information set by the haptic sensation setting unit 18.
Description
This disclosure relates to a tactile presentation system.
Various technologies for presenting tactile sensations (haptics) have been proposed. For example, Patent Documents 1 to 3 disclose tactile presentation devices that use magnetorheological fluids (MRFs).
However, there are currently no tactile presentation devices that allow customization of the sense of touch.
The objective of this disclosure is to provide a tactile presentation system that allows customization of tactile sensations.
The tactile presentation system according to the present disclosure includes a tactile reception unit that receives tactile specific information that identifies a desired tactile sensation in response to a user's operation, a tactile setting unit that sets the tactile specific information received by the tactile reception unit, and a tactile presentation device that presents the desired tactile sensation based on the tactile specific information set by the tactile setting unit.
<Overview of the embodiment>
According to this embodiment, the tactile presentation system includes a tactile reception unit that receives tactile identification information that identifies a desired tactile sensation in response to a user's operation, a tactile setting unit that sets the tactile identification information received by the tactile reception unit, and a tactile presentation device that presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit.
The tactile presentation system further includes a display unit that displays a coordinate system including a first axis indicating the first tactile specific information and a second axis indicating the second tactile specific information, and an operator that is arranged on the coordinate system and can be displaced in response to a user's operation. The tactile reception unit receives the displacement of the operator displayed by the display unit. The tactile setting unit sets the first and second tactile specific information corresponding to the displacement of the operator received by the tactile reception unit.
The tactile presentation system further includes a display unit that displays a plurality of options corresponding to a plurality of types of tactile sensations. The tactile reception unit selects one option from the plurality of options displayed by the display unit in response to a user operation. The tactile setting unit sets tactile specific information corresponding to the option selected by the tactile reception unit.
The tactile setting unit sets the current value to be supplied to the tactile presentation device as tactile specific information.
The tactile setting unit sets the frequency of the current to be supplied to the tactile presentation device as tactile specific information.
According to another embodiment, a tactile presentation method by a computer includes the steps of receiving tactile identification information that identifies a desired tactile sensation in response to a user's operation, setting the received tactile identification information, and controlling a tactile presentation device to present the desired tactile sensation based on the set tactile identification information.
According to yet another embodiment, the tactile presentation program causes a computer to execute the steps of accepting tactile identification information that identifies a desired tactile sensation in response to a user's operation, setting the accepted tactile identification information, and controlling the tactile presentation device to present the desired tactile sensation based on the set tactile identification information.
<Details of the embodiment>
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In the drawings, the same or corresponding parts are designated by the same reference characters, and the description thereof will not be repeated.
FIG. 1 is a schematic diagram showing a tactile presentation system 100 in this embodiment. The tactile presentation system 100 is a game system, and includes an information processing device 1 and a tactile presentation device (hereinafter, sometimes referred to as a tap unit) 2. The information processing device 1 and the tactile presentation device 2 are connected by short-range wireless communication to transmit and receive data between them.
As shown in FIG. 1, the information processing device 1 is a smartphone. The information processing device 1 is not limited to a smartphone, and may be a tablet terminal. The information processing device 1 may be a laptop PC (Personal Computer). The information processing device 1 may be a dedicated gaming device. The information processing device 1 may be a dedicated gaming device integrated with the tactile presentation device 2, that is, a gaming machine that includes the tactile presentation device 2 in the controller portion.
The tactile presentation device 2 is a device that allows an operator to operate the device by moving his/her finger while holding the finger along the displacement unit 202. The tactile presentation device 2 reads the position of the displacement unit 202, which is displaced by the operator's finger movement, and controls the built-in MRF (Magneto-Rheological Fluid) device 24 according to the position to generate a force sensation due to a reaction force (rotational resistance) against the operator's operation of the displacement unit 202, thereby presenting a tactile sensation. The form of the displacement unit 202 of the tactile presentation device 2 is not limited to that shown in FIG. 1, and may be a button, a stick, or a cushion-like object covered with a cover. The tactile presentation device 2 may employ a motor or a piezoelectric element instead of the MRF device 24, and may generate a force sensation due to a rotational force or vibration in response to the operator's operation, and may be combined with a device that presents vibration, a warm sensation, a cold sensation, or an electrical stimulation in addition to the displacement unit 202. It can also be installed on the ground or a wall and operated by the operator's palm, foot, etc.
In the tactile presentation system 100, an information processing device 1 with a game application program (hereafter referred to as a game app) installed presents images (visual) from a built-in display unit 14, sounds (auditory) from an audio output unit 15, and tactile sensations from a connected tactile presentation device 2 based on acquired game content. For example, as shown in FIG. 1, the information processing device 1 displays a character C on the display unit 14, and when an operator presses the displacement unit 202 of the tactile presentation device 2 with a finger, it changes the image of the displayed character while outputting a tactile sensation according to the type and level of character C, and outputs audio or sound effects according to the change from the audio output unit 15.
The tactile presentation system 100 allows the user to select the character C to be operated. In the tactile presentation system 100, the information processing device 1 changes the tactile sensation in the tactile presentation device 2 according to the character C, and changes and outputs images and sounds according to the displacement in the displacement unit 202.
FIG. 2 is a block diagram showing the internal configuration of the information processing device 1. The information processing device 1 includes a processing unit 10, a storage unit 11, a first communication unit 12, a second communication unit 13, a display unit 14, an audio output unit 15, and an operation unit 16. The information processing device 1 further includes a tactile reception unit 17 and a tactile setting unit 18.
The processing unit 10 is a processor that uses a CPU (Central Processing Unit) and/or a GPU (Graphics Processing Unit). The processing unit 10 executes game-related processing based on the game application P1 stored in the storage unit 11.
The storage unit 11 uses a non-volatile memory such as a flash memory or SSD. The storage unit 11 stores the game application P1, a sensory DB 110 described below, and other data referenced by the processing unit 10. The game application P1 may be a game application P8 stored in the storage medium 8 that has been read by the processing unit 10 and copied to the storage unit 11. The game application P1 is downloaded from the server device via the first communication unit 12 and stored in an executable manner.
The storage unit 11 stores a sensory DB 110 that contains tactile data, visual (images, videos, text) data, and auditory (audio, sound effects) data corresponding to characters. The sensory DB 110 stores the tactile data, visual data, and auditory data to be output for each displacement data of the displacement unit 202 in the tactile presentation device 2, in association with a content ID that identifies a character or item.
The first communication unit 12 realizes communication with the server device via a network including the Internet or a carrier network. Specifically, the first communication unit 12 may be a wireless communication device that connects to a carrier network, or a wireless communication device for Wi-Fi. The processing unit 10 can transmit and receive data to and from the server device via the first communication unit 12.
The second communication unit 13 is a communication module for short-range wireless communication, for example Bluetooth (registered trademark). The processing unit 10 can transmit and receive data to and from the tactile presentation device 2 via the second communication unit 13.
The display unit 14 is a display such as a liquid crystal display or an organic EL (Electro Luminescence) display. The display unit 14 is, for example, a display with a built-in touch panel. Based on the game application P1, the processing unit 10 displays on the display unit 14 game content stored in the storage unit 11, or game content such as images and text provided from the server device.
The audio output unit 15 includes a speaker, etc. The processing unit 10 outputs game content stored in the storage unit 11 or game content such as sound and music provided by the server device from the audio output unit 15 based on the game application P1.
The operation unit 16 is a user interface capable of inputting and outputting data to and from the processing unit 10, and is a touch panel built into the display unit 14. The operation unit 16 may be a physical button. The operation unit 16 may also serve as a voice input unit.
The tactile reception unit 17 receives tactile identification information that identifies a desired tactile sensation in response to a user's operation. The tactile setting unit 18 sets the tactile identification information received by the tactile reception unit 17. The tactile presentation device 2 presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit 18. The processing unit 10 executes the game application P1, thereby causing the computer to function as the tactile reception unit 17 and the tactile setting unit 18.
FIG. 3 is an explanatory diagram showing an example of the contents of the sensory DB 110. The sensory DB 110 stores, as tactile data, the value and frequency of the current supplied to the MRF device 24 of the tactile presentation device 2 for each displacement amount (angle) of the displacement unit 202. In addition to tactile data, the sensory DB 110 may also store visual data, auditory data, etc.
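Purely as an illustrative sketch, the per-angle tactile data described above can be pictured as a lookup table keyed by the 1-degree displacement band. The angle bands, current values, and frequencies below are placeholder numbers chosen for this sketch, not values taken from FIG. 3:

```python
# Illustrative sketch of per-angle tactile data like that held in the sensory DB 110.
# Every number below is a placeholder; the actual values of FIG. 3 may differ.

# Each entry: 1-degree displacement band -> (current in A, frequency in Hz)
STANDARD_TACTILE_DATA = {deg: (0.5, 100) for deg in range(0, 90)}

def lookup_tactile_data(angle_deg: float, table: dict = STANDARD_TACTILE_DATA) -> tuple:
    """Return the (current, frequency) pair for the 1-degree band containing angle_deg."""
    band = min(max(int(angle_deg), 0), 89)  # clamp to the 0-89 degree bands
    return table[band]

print(lookup_tactile_data(2.4))  # -> (0.5, 100)
```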
FIG. 4 is a block diagram showing the configuration of the tactile presentation device 2. As shown in FIG. 1, the tactile presentation device 2 is configured by providing a flat, bottomed, cylindrical gripping body 200 with a belt-like flat-plate displacement unit 202 having a curved portion that partially follows the circumferential direction of the gripping body 200. The displacement unit 202 is made of a material that can itself flex, but it may instead be made of a highly rigid material and supported rotatably on the gripping body 200 via a support shaft. A cloth-tape-like fastener 203 is provided on the outer surface of the tip of the displacement unit 202. The inner surface of the tip of the displacement unit 202 is provided with a link mechanism 204 that connects to the rotation shaft of the rotor of the MRF device 24 housed inside the gripping body 200. The surface of the displacement unit 202 may be fitted with materials that add a variety of textures, such as silicone rubber or a material with fur.
As shown in FIG. 1, the operator holds the gripping body 200 with, for example, the thumb and middle finger, places a finger such as the index finger along the displacement unit 202, and inserts the index finger into the fastener 203. The operator can move the displacement unit 202 by pushing the index finger in, and can also move the displacement unit 202 away from the gripping body 200 by extending the index finger.
When the displacement unit 202 is at the position farthest from the gripping body 200 (upper limit), the displacement amount (angle) of the displacement unit 202 shown in FIG. 3 is 0 degrees. When the displacement unit 202 is at the position closest to the gripping body 200 (lower limit), the displacement amount (angle) of the displacement unit 202 is 90 degrees. In this example, the movable range of the displacement unit 202 is 90 degrees from the upper limit to the lower limit, but is not limited to this. Also, in FIG. 3, the tactile data is set in increments of 1 degree, but it is possible to set it more finely, for example, in increments of 0.1 degrees.
The tactile presentation device 2 includes the gripping body 200 shown in FIG. 1, a control unit 20, a storage unit 21, a communication unit 22, a power supply unit 23, the MRF device 24, and a sensor 25. The gripping body 200 houses the MRF device 24. The control unit 20, the storage unit 21, the communication unit 22, and the power supply unit 23 may be provided integrally with the gripping body 200, or may be provided as a separate body connected to the gripping body 200 wirelessly or by wire.
The control unit 20 includes a processor such as a CPU and an MPU (Micro-Processing Unit), and memories such as a ROM (Read Only Memory) and a RAM (Random Access Memory). The control unit 20 is, for example, a microcontroller. The control unit 20 controls each component based on a control program P2 stored in the built-in ROM, and realizes tactile presentation.
The storage unit 21 is an auxiliary memory for the control unit 20, and stores the control data (tactile data) for the MRF device 24 in a rewritable manner.
The communication unit 22 is a communication module for short-range wireless communication, such as Bluetooth (registered trademark). The control unit 20 can transmit and receive data to and from the information processing device 1 via the communication unit 22.
The control unit 20 is connected to the power supply unit 23, the MRF device 24, and the sensor 25 via I/O, and exchanges signals with them.
The power supply unit 23 includes a rechargeable battery. When the power supply unit 23 is turned on, it supplies power to each component and the MRF device 24.
The MRF device 24 has a yoke arranged to sandwich a disk-shaped rotor with a gap between them. Passing a control current through a coil provided on the yoke generates a magnetic field, which controls the viscosity (shear stress) of the magnetorheological fluid sealed in the gap and thereby applies rotational resistance to the rotor. When the control unit 20 changes the magnitude of the control current to the MRF device 24, the rotational resistance changes immediately.
The sensor 25 measures the position (angle) of the displacement unit 202 and outputs it to the control unit 20. The sensor 25 measures the displacement of the displacement unit 202 as an angle and outputs it. The sensor 25 may be composed of multiple sensors such as a gyro sensor and an acceleration sensor.
In the tactile presentation device 2 configured as shown in FIG. 4, when the operator operates the displacement unit 202, the displacement of the displacement unit 202 is transmitted via the link mechanism 204 as rotation of the rotor shaft of the MRF device 24. When the MRF device 24 is not operating, that is, while the control current is zero, the rotation shaft rotates freely, so the displacement unit 202 moves without resistance. On the other hand, when the MRF device 24 is operating and the control current is not zero, the viscosity (shear stress) of the magnetorheological fluid inside the MRF device 24 changes according to the magnitude of the current flowing to the MRF device 24. By continuously varying the magnitude of the current to the MRF device 24, or by oscillating the current value at a predetermined frequency, the control unit 20 can change the resistance force applied to the displacement unit 202 and the way that resistance appears.
In this way, the tactile presentation device 2 can present a slippery sensation by varying the resistance (current value) according to how far the displacement unit 202 is pushed in, a firm sensation by increasing the resistance as the push-in amount increases, or a crunchy sensation by alternating between large and small resistance or by square-wave-like ON/OFF switching of the current.
The progress of the game in the tactile presentation system 100 configured in this manner will be described below. FIG. 5 is a flowchart showing an example of the processing procedure based on the game application P1. The information processing procedure by the information processing device 1 shown in FIG. 5 is executed when the game application P1 is launched, or when the communication connection with the tactile presentation device 2 is cut off.
In the information processing device 1, the processing unit 10 establishes a communication connection with the tactile presentation device 2 via the second communication unit 13 (step S11). The processing unit 10 reads out standard tactile data from the sensory DB 110 (step S12) and transmits the read out standard tactile data to the tactile presentation device 2 (step S13).
The standard tactile data read from the sensory DB 110 in step S13 is a table that stores the corresponding current value and frequency for each displacement amount (angle), as shown in FIG. 3.
The tactile presentation device 2 executes the process shown on the right side of FIG. 5 based on the transmitted tactile data. When the control unit 20 of the tactile presentation device 2 is started and a communication connection with the information processing device 1 is established, the control unit 20 executes the following process.
When the tactile presentation device 2 receives tactile data from the information processing device 1 during communication connection (step S21), the tactile presentation device 2 stores the data in the storage unit 21 (step S22). The tactile data received in step S21 is a list table of the current values and frequencies for each amount of displacement described above.
The control unit 20 of the tactile presentation device 2 samples a signal corresponding to the displacement amount (angle) of the displacement unit 202 output from the sensor 25 (step S23). The control unit 20 transmits the displacement amount obtained by sampling to the information processing device 1 (step S24). The displacement amount (angle) may be a relative displacement from the upper end, or may be an absolute position detected by the sensor 25.
The control unit 20 references the current value corresponding to the sampled displacement amount in the tactile data stored in the storage unit 21 (step S25), and outputs the referenced current to the MRF device 24 (step S26).
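A minimal Python sketch of the device-side loop in steps S23 to S26 is given below. The functions read_sensor_angle, send_displacement, and drive_mrf are hypothetical placeholders standing in for the sensor 25, the transmission to the information processing device 1, and the current output to the MRF device 24; all numbers are arbitrary:

```python
import time

def read_sensor_angle() -> float:
    """Placeholder for the sensor 25: return the current displacement angle in degrees."""
    return 2.4  # fixed value, for illustration only

def send_displacement(angle_deg: float) -> None:
    """Placeholder for step S24: report the sampled displacement to the information processing device 1."""
    print(f"displacement = {angle_deg:.1f} deg")

def drive_mrf(current_a: float, frequency_hz: float) -> None:
    """Placeholder for step S26: supply the referenced current to the MRF device 24."""
    print(f"drive MRF: {current_a:.2f} A at {frequency_hz:.0f} Hz")

def control_cycle(tactile_table: dict) -> None:
    """One pass through steps S23 to S26."""
    angle = read_sensor_angle()                # S23: sample the displacement (angle)
    send_displacement(angle)                   # S24: transmit the displacement
    current, freq = tactile_table[int(angle)]  # S25: reference the stored tactile data
    drive_mrf(current, freq)                   # S26: output the current

tactile_table = {deg: (0.5, 100) for deg in range(90)}  # placeholder table
for _ in range(3):
    control_cycle(tactile_table)
    time.sleep(0.01)  # arbitrary sampling interval for the sketch
```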
In the information processing device 1, after the standard tactile data has been transmitted to the tactile presentation device 2, the tactile reception unit 17 receives tactile identification information that identifies a desired tactile sensation in response to the user's operation (step S14). The tactile setting unit 18 sets the tactile identification information received by the tactile reception unit 17 (step S15). The tactile presentation device 2 presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit 18 (steps S21 to S26).
The process of receiving and setting the tactile identification information is described in detail below with reference to the flow diagram of FIG. 6.
In the information processing device 1, the display unit 14 displays a confirmation screen for the standard tactile sensation, as shown in FIG. 7 (step S31). Specifically, the display unit 14 displays an animation of the tactile presentation device 2 and guides the user on how to operate the tactile presentation device 2. Additionally, the audio output unit 15 may also guide the user on the operation method by audio. Initially, the tactile presentation device 2 presents a standard tactile sensation. The user can confirm the standard tactile sensation presented by the tactile presentation device 2 by operating the tactile presentation device 2.
When the user operates the tactile presentation device 2, the display unit 14 displays a screen for selecting whether or not to create the desired tactile sensation, as shown in FIG. 8. The animation of the tactile presentation device 2 continues as is, and the tactile sensation presented by the tactile presentation device 2 also continues as is. When the user taps the displayed Create button, the process of creating the desired sensation begins. On the other hand, when the user taps the displayed OK button, the standard tactile sensation is maintained.
When the user taps the Create button, the display unit 14 displays a hardness selection screen as shown in FIG. 9 (step S32). Specifically, the display unit 14 displays five options. In this example, to express hardness, "raw egg," "tofu," "initial," "gummy," and "iwa-okoshi" are displayed in order from softest to hardest. When this screen is first displayed, the standard hardness "initial" is selected by default.
Next, the tactile reception unit 17 selects the desired hardness in response to the user's tap operation (step S33). That is, the tactile reception unit 17 receives tactile identification information that specifies the desired hardness. The tactile setting unit 18 adjusts the current-value parameter of the standard tactile data according to the selected hardness (step S34). That is, the tactile setting unit 18 sets the tactile identification information, received by the tactile reception unit 17, that specifies the desired hardness. Specifically, when "raw egg" is selected, the current value is set to, for example, 0.6 times that of the standard tactile data. When "tofu" is selected, the current value is set to, for example, 0.8 times that of the standard tactile data. When "initial" is selected, the current value of the standard tactile data is maintained as is. When "gummy" is selected, the current value is set to, for example, 1.2 times that of the standard tactile data. When "iwa-okoshi" is selected, the current value is set to, for example, 1.4 times that of the standard tactile data. Note that parameters of the standard tactile data other than the current value (for example, the frequency) are not adjusted here. The terms such as "raw egg" used in this explanation are merely examples, and other terms expressing hardness may be used. The current-value multipliers are also merely examples, and other multipliers may be used.
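The hardness adjustment described above amounts to scaling every current value in the standard tactile data by a single factor while leaving the frequency untouched. The sketch below assumes the example multipliers given in the text; the table contents are placeholders:

```python
# Hypothetical hardness multipliers, mirroring the 0.6x to 1.4x examples in the text.
HARDNESS_FACTORS = {
    "raw egg": 0.6,
    "tofu": 0.8,
    "initial": 1.0,
    "gummy": 1.2,
    "iwa-okoshi": 1.4,
}

def apply_hardness(standard_table: dict, hardness: str) -> dict:
    """Build first adjusted tactile data: scale every current value, leave every frequency untouched."""
    k = HARDNESS_FACTORS[hardness]
    return {deg: (current * k, freq) for deg, (current, freq) in standard_table.items()}

standard = {deg: (0.5, 100) for deg in range(90)}   # placeholder standard tactile data
first_adjusted = apply_hardness(standard, "tofu")
print(first_adjusted[2])  # -> (0.4, 100): current scaled by 0.8, frequency unchanged
```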
The second communication unit 13 transmits tactile data including the adjusted current value to the tactile presentation device 2 (step S35). The tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile sensation of hardness according to the adjusted current value. This allows the user to experience the selected hardness through the tactile presentation device 2.
The user can reselect the hardness any number of times until tapping the OK button, and each time a hardness different from the currently selected hardness is selected, the information processing device 1 adjusts the current value according to the selected hardness and transmits tactile data including the adjusted current value to the tactile presentation device 2 (steps S32 to S35). The tactile presentation device 2 performs the processes of steps S21 to S26 to present a tactile sensation based on the latest tactile data received. When the user taps the OK button, the tactile setting unit 18 determines the currently selected hardness as the desired hardness. At this time, the processing unit 10 stores the latest tactile data with the adjusted current value (hereinafter referred to as "first adjusted tactile data") in the sensory DB 110.
When the user taps the OK button, the display unit 14 displays a texture selection screen as shown in FIG. 10 (step S36). Specifically, the display unit 14 displays five options. In this example, to express texture, "rubber," "potato starch," "initial," "electric shock," and "sand" are displayed in order starting from the smoothest. When this screen is first displayed, the standard texture "initial" is selected by default. In this state, the tactile presentation device 2 holds the same data as the first adjusted tactile data stored in the sensory DB 110, and presents the tactile sensation corresponding to that first adjusted tactile data.
Next, the tactile reception unit 17 selects the desired texture in response to the user's tap operation (step S37). That is, the tactile reception unit 17 receives tactile identification information that specifies the desired texture. The tactile setting unit 18 adjusts the frequency parameter of the first adjusted tactile data according to the selected texture (step S38). That is, the tactile setting unit 18 sets the tactile identification information, received by the tactile reception unit 17, that specifies the desired texture. Specifically, when "rubber" is selected, the frequency is set to, for example, 0.5 times that of the standard tactile data. When "potato starch" is selected, the frequency is set to, for example, 0.8 times that of the standard tactile data. When "initial" is selected, the frequency of the standard tactile data is maintained as is. When "agar" is selected, the frequency is set to, for example, 1.6 times that of the standard tactile data. When "sand" is selected, the frequency is set to, for example, 2.0 times that of the standard tactile data. The terms such as "rubber" used in this explanation are merely examples, and other terms expressing texture may be used. The frequency multipliers are also merely examples, and other multipliers may be used.
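Likewise, the texture adjustment can be sketched as scaling only the frequency. Since the hardness step does not alter the frequency, scaling the frequency of the first adjusted tactile data is equivalent to scaling that of the standard tactile data. The multipliers below follow the examples in the text; the remaining values are placeholders:

```python
# Hypothetical texture multipliers, mirroring the 0.5x to 2.0x examples in the text.
TEXTURE_FACTORS = {
    "rubber": 0.5,
    "potato starch": 0.8,
    "initial": 1.0,
    "agar": 1.6,
    "sand": 2.0,
}

def apply_texture(first_adjusted: dict, texture: str) -> dict:
    """Build second adjusted tactile data: scale every frequency, leave every current value untouched."""
    k = TEXTURE_FACTORS[texture]
    return {deg: (current, freq * k) for deg, (current, freq) in first_adjusted.items()}

first_adjusted = {deg: (0.4, 100) for deg in range(90)}  # placeholder first adjusted tactile data
second_adjusted = apply_texture(first_adjusted, "sand")
print(second_adjusted[3])  # -> (0.4, 200.0): frequency doubled, current unchanged
```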
For example, when the displacement is 2 degrees or more but less than 3 degrees, referring to the standard tactile data shown in FIG. 3, the current is 0.5 A and the frequency is 100 Hz (period is 0.01 seconds). When "initial" is selected in steps S33 and S37, the MRF device 24 is supplied with the current shown in FIG. 11. The duty ratio of this current is 50%.
Furthermore, when the displacement is between 3 degrees and 4 degrees, referring to the standard tactile data shown in FIG. 3, the current is 0.5 A and the frequency is 100 Hz (period is 0.01 seconds). In addition, when "initial" is selected in step S33 and "sand" is selected in step S37, the current remains at 0.50 A, but the frequency is increased to 200 Hz (period is 0.005 seconds), as in the tactile data shown in FIG. 12. In this case, the current shown in FIG. 13 is supplied to the MRF device 24. The duty ratio of this current is also 50%, but the feel may be changed by changing the duty ratio, as shown in FIG. 14.
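The waveforms of FIGS. 11 to 14 can be modelled as a square-wave current defined by an amplitude, a frequency, and a duty ratio. The helper below is an illustrative sketch only; the sampling instants and amplitude are arbitrary:

```python
def current_at(t: float, amplitude_a: float, frequency_hz: float, duty: float = 0.5) -> float:
    """Square-wave drive current at time t (seconds): on for the first `duty` fraction of each period."""
    period = 1.0 / frequency_hz
    phase = (t % period) / period
    return amplitude_a if phase < duty else 0.0

# 100 Hz at 50% duty (the "initial" case): on for 5 ms, then off for 5 ms, in every 10 ms period.
print([current_at(t / 1000, 0.5, 100) for t in range(10)])
# 200 Hz at 50% duty (the "sand" case): the same on/off pattern repeats twice as fast.
print([current_at(t / 1000, 0.5, 200) for t in range(10)])
```

Changing the duty ratio, as in FIG. 14, corresponds to passing a value other than 0.5 for `duty`.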
The second communication unit 13 transmits the first adjusted tactile data (hereinafter referred to as "second adjusted tactile data") with the frequency parameters adjusted to the tactile presentation device 2 (step S39). The tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile sensation according to the received second adjusted tactile data. This allows the user to experience the selected texture through the tactile presentation device 2.
The user can reselect the texture any number of times until tapping the OK button, and each time the selection changes, the corresponding texture (tactile sensation) can be presented to the user. Specifically, each time a texture different from the currently selected texture is selected, the information processing device 1 adjusts the frequency parameter of the first adjusted tactile data according to the selected texture, and transmits second adjusted tactile data including the adjusted frequency to the tactile presentation device 2 (steps S36 to S39). The tactile presentation device 2 presents a tactile sensation based on the latest received second adjusted tactile data by executing the processes of steps S21 to S26. When the user taps the OK button, the tactile setting unit 18 determines the currently selected texture as the desired texture. At this time, the processing unit 10 stores the latest second adjusted tactile data in the sensory DB 110 separately from the first adjusted tactile data.
When the user taps the OK button, the display unit 14 displays a pattern selection screen as shown in FIG. 15 (step S40). Specifically, the display unit 14 displays five options. When this screen is first displayed, the standard pattern "initial" is selected by default. In the standard pattern, the current value is constant even if the displacement (angle) changes. The hardness and texture have already been selected as described above. In this state, the tactile presentation device 2 holds the same data as the second adjusted tactile data stored in the sensory DB 110, and presents the tactile sensation corresponding to that second adjusted tactile data. For the other four patterns, the displacement (angle) is shown on the horizontal axis and the current value on the vertical axis. In the top pattern, the current value increases gradually while the displacement is small, and increases sharply as the displacement becomes large. In the second pattern from the top, the current value increases sharply while the displacement is small, and increases gradually as the displacement becomes large. In the second pattern from the bottom, the current value decreases gradually while the displacement is small, and decreases sharply as the displacement becomes large. In the bottom pattern, the current value decreases sharply while the displacement is small, and decreases gradually as the displacement becomes large.
Next, the tactile reception unit 17 selects the desired pattern in response to the user's tap operation (step S41). That is, the tactile reception unit 17 receives tactile identification information that specifies the desired pattern. The tactile setting unit 18 adjusts the current-value parameter of the second adjusted tactile data according to the selected pattern (step S42). That is, the tactile setting unit 18 sets the tactile identification information, received by the tactile reception unit 17, that specifies the desired pattern. Specifically, the current values are adjusted by multiplying the current-value parameter of the second adjusted tactile data by a predetermined coefficient for each displacement. For example, when a quadratically increasing pattern is selected, the current value corresponding to each displacement is multiplied by the coefficient nX² (where X is the displacement).
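The pattern adjustment can be sketched as multiplying each per-displacement current value by a displacement-dependent coefficient. In the example below the coefficient follows the quadratic form nX² mentioned above; the choice of n (normalizing the coefficient to 1 at the 89-degree band) and the table contents are assumptions made only for illustration:

```python
def apply_pattern(second_adjusted: dict, coeff) -> dict:
    """Build third adjusted tactile data: scale each current value by a displacement-dependent coefficient."""
    return {deg: (current * coeff(deg), freq) for deg, (current, freq) in second_adjusted.items()}

# Quadratically increasing pattern: coefficient n * X^2, with X the displacement in degrees.
# n is chosen here so that the coefficient reaches 1.0 at the 89-degree band (an assumption).
n = 1.0 / (89 ** 2)
quadratic = lambda deg: n * deg ** 2

second_adjusted = {deg: (0.4, 200.0) for deg in range(90)}  # placeholder second adjusted tactile data
third_adjusted = apply_pattern(second_adjusted, quadratic)
print(round(third_adjusted[45][0], 3))  # roughly 0.102 A at mid-range, rising toward the lower limit
```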
The second communication unit 13 transmits the second adjusted tactile data (hereinafter referred to as "third adjusted tactile data") in which the current value parameters have been adjusted to the tactile presentation device 2 (step S43). The tactile presentation device 2 executes the processes of steps S21 to S26 to present a tactile pattern according to the received third adjusted tactile data. Therefore, the user can experience the tactile pattern selected through the tactile presentation device 2.
The user can reselect a pattern any number of times until tapping the OK button, and each time a pattern different from the currently selected pattern is selected, a corresponding pattern (tactile sensation) can be presented to the user. Specifically, each time a pattern different from the currently selected pattern is selected, the information processing device 1 adjusts the parameters of the current value of the second adjusted tactile data according to the selected pattern, and transmits third adjusted tactile data including the adjusted current value to the tactile presentation device 2 (steps S36 to S39). The tactile presentation device 2 presents a tactile sensation based on the latest received third adjusted tactile data by executing the processes of steps S21 to S26. When the user taps the OK button, the tactile setting unit 18 determines the currently selected pattern as the desired pattern. At this time, the processing unit 10 stores the latest third adjusted tactile data in the sensory DB 110 separately from the first adjusted tactile data and the second adjusted tactile data.
The pattern is not limited to these, but may be wavy, zigzag, etc. Also, instead of displaying the pattern as a shape, it may be displayed as words such as "wavy," "zigzag," "squiggly," or "flat."
When the user taps the OK button, the display unit 14 displays a confirmation screen for the created tactile sensation, as shown in FIG. 16 (step S44). In this state, the tactile presentation device 2 stores the same third adjusted tactile data as that stored in the sensory DB 110, and presents a tactile sensation corresponding to the third adjusted tactile data. Therefore, the user can experience the created tactile sensation through the tactile presentation device 2.
When the user taps the OK button ("OK" in step S45), the hardness, texture, and pattern selected above are confirmed, and the setting of the desired tactile sensation is complete. On the other hand, when the user taps the Remake button ("Remake" in step S45), the process returns to step S32. The process may instead return to step S36 or S40. In these cases, the third adjusted tactile data can be remade from a midpoint by using the first adjusted tactile data or the second adjusted tactile data stored in the DB 110. In addition, after the setting of the desired tactile sensation is completed, at least one of the set current value, frequency, and pattern may be fine-tuned. For the fine adjustment, a displaceable operator such as a slide bar may be displayed and displaced in response to the user's operation.
<Other embodiments>
As shown in FIG. 17, a displaceable operator such as a pointer may be displayed and displaced in two dimensions in response to the user's operation. The screen shown in FIG. 17 displays two-dimensional coordinates including an X axis and a Y axis, and an operator arranged on those coordinates. The X axis indicates texture. The Y axis indicates hardness. By moving the operator, the user can set the texture and the hardness at the same time. When the user taps the OK button, the process proceeds to step S40. The screen shown in FIG. 17 is used instead of the screens shown in FIG. 9 and FIG. 10. When the OK button is touched on the screen shown in FIG. 17, the screens shown in FIG. 15 and FIG. 16 are displayed in order. When the Remake button is touched on the screen shown in FIG. 16, the tactile sensation that was set immediately before is selected, so the pointer is placed at the position corresponding to that tactile sensation. The origin of the coordinates corresponds to the standard tactile data. In the initial state, the pointer is placed at the origin of the coordinates. The user moves the pointer on the coordinates by touching and swiping it. The information processing device 1 adjusts the tactile data in real time in response to the movement of the pointer (Y axis: current value, X axis: frequency), and transmits the adjusted tactile data (corresponding to the second adjusted tactile data in the above embodiment) to the tactile presentation device 2. The tactile data is adjusted with the standard tactile data as the reference: the current parameter of the standard tactile data is multiplied by a coefficient corresponding to the position on the Y axis, and the frequency parameter of the standard tactile data is multiplied by a coefficient corresponding to the position on the X axis, to generate new tactile data. At the origin, the current and frequency coefficients are both "1". The current coefficient becomes larger than "1" toward the "hard" direction and smaller than "1" toward the "soft" direction; however, it never becomes 0 or less. The frequency coefficient becomes larger than "1" toward the "rough" direction and smaller than "1" toward the "soft" direction; its minimum value is "0".
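The mapping from the pointer position to the two coefficients can be pictured as follows. The normalization of the pointer position to the range -1 to +1, the linear form of the mapping, and the 0.01 floor used to keep the current coefficient above zero are all assumptions made for this sketch:

```python
def coefficients_from_pointer(x: float, y: float) -> tuple:
    """
    Map a pointer position to (current coefficient, frequency coefficient).
    x and y are assumed normalized to [-1.0, +1.0], with the origin corresponding to the standard data.
    The linear mapping and the 0.01 floor on the current coefficient are assumptions for this sketch.
    """
    current_coeff = max(1.0 + y, 0.01)   # "hard" (+y) -> above 1, "soft" (-y) -> below 1, never 0 or less
    frequency_coeff = max(1.0 + x, 0.0)  # "rough" (+x) -> above 1, "soft" (-x) -> below 1, minimum 0
    return current_coeff, frequency_coeff

def adjust_from_pointer(standard: dict, x: float, y: float) -> dict:
    """Generate new tactile data by scaling current (Y axis) and frequency (X axis) of the standard data."""
    kc, kf = coefficients_from_pointer(x, y)
    return {deg: (current * kc, freq * kf) for deg, (current, freq) in standard.items()}

standard = {deg: (0.5, 100) for deg in range(90)}    # placeholder standard tactile data
print(adjust_from_pointer(standard, 0.0, 0.0)[10])   # origin: both coefficients are 1, data unchanged
print(adjust_from_pointer(standard, 1.0, -0.5)[10])  # rougher and softer -> (0.25, 200.0)
```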
In the above description, tactile data is transmitted to the tactile presentation device 2 in real time according to the position of the pointer. However, the tactile data may instead be transmitted for the position where the finger has stopped moving for a predetermined time or longer, or at the moment the finger is released from the pointer. The coordinates have upper, lower, left, and right limits. For example, if the user swipes diagonally up and to the right when the limit of the X axis has been reached, the pointer moves up (in the Y-axis direction) without moving to the right (in the X-axis direction). In this example as well, the user can experience the tactile sensation through the tactile presentation device 2 at the time the tactile presentation device 2 receives the tactile data.
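One simple way to realize the deferred transmission described above is to send the tactile data only once the pointer has rested at the same position for a set time. The class below is an illustrative sketch; the 0.3-second hold time, the class name, and the method names are assumptions:

```python
class DeferredSender:
    """Send tactile data only after the pointer has rested for hold_s seconds (an assumed policy)."""

    def __init__(self, hold_s: float = 0.3):
        self.hold_s = hold_s          # assumed rest time before sending
        self.last_pos = None          # most recent pointer position
        self.last_move_time = 0.0     # time of the most recent movement
        self.sent_for = None          # position for which data has already been sent

    def on_pointer(self, pos: tuple, now: float) -> None:
        """Record a pointer sample (normalized X/Y position and a timestamp in seconds)."""
        if pos != self.last_pos:
            self.last_pos = pos
            self.last_move_time = now

    def poll(self, now: float) -> None:
        """Call periodically; transmit once per resting position."""
        if (self.last_pos is not None
                and self.last_pos != self.sent_for
                and now - self.last_move_time >= self.hold_s):
            print(f"send tactile data for pointer position {self.last_pos}")  # stand-in for the transmission
            self.sent_for = self.last_pos

sender = DeferredSender()
sender.on_pointer((0.2, -0.1), now=0.0)
sender.poll(now=0.1)  # too soon: nothing is sent yet
sender.poll(now=0.5)  # the pointer has rested for 0.5 s >= 0.3 s: the data is sent once
sender.poll(now=0.6)  # already sent for this position: nothing more is sent
```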
The pattern selection screen of FIG. 15 can also be included in this screen. This allows the third adjusted tactile data to be created on a single screen.
The above system includes a tactile presentation device and a smartphone that can be connected to the device, but it may also be a tactile presentation system that includes a touch panel monitor that integrates the tactile presentation device and the smartphone.
Furthermore, the tactile presentation device is not limited to one using an MRF, but may be another passive type device, or an active type device using a motor or the like. It may also be a device that combines passive and active types. Furthermore, non-transitory storage media that store programs for causing a computer to function as an information processing device, a tactile presentation device, or a tactile presentation system are also included in the embodiments of the present invention.
Touch panel devices are available that use vibrations to guide users with weak eyesight on how to use them, and the present invention can also be applied to such devices.
Furthermore, in step S11, multiple tactile presentation devices may be paired so that tactile sensations can be set simultaneously. Furthermore, by setting a tactile sensation on one device, the other device may automatically be set to the same tactile sensation. Furthermore, in a five-finger glove-type device, by setting a tactile sensation on one finger, the same tactile sensation may be automatically set on all five fingers. Furthermore, by setting a tactile sensation on a basic finger, the tactile sensation may be automatically set for each finger by referring to the set tactile sensation.
In the embodiment shown in FIG. 17, the position of the pointer is hidden by the finger, so confirmation coordinates may be displayed separately from the coordinates operated by the user, allowing the position of the pointer to be checked on those coordinates. The confirmation coordinates may be an enlarged display of only the vicinity of the user's finger position. The resolution and scale of the operated coordinates may also be changeable by a pinch operation. When the user pinches in the X-axis direction (Y-axis direction), only the scale of the X-axis (Y-axis) parameter is changed. The scales of the X and Y parameters may also be changed simultaneously, or changed at a ratio corresponding to the direction of the pinch.
The animation and sound of the tactile presentation device 2 described with reference to FIGS. 7 and 8 may also be displayed on the screens of FIGS. 9, 10, 15, 16, and 17. This animation and sound can be changed in conjunction with the operation of the tactile presentation device 2. On each of these screens, an animation of an object that evokes the tactile sensation may additionally be displayed, and sound linked to it may be output. For example, when "tofu" is selected in FIG. 9, an animation corresponding to "tofu" is displayed. When the tactile presentation device 2 is operated, an animation of the tofu being crushed is displayed in conjunction with that movement, and the sound made when the tofu is crushed is output. The same applies to FIG. 10 and FIG. 15. In the case of FIG. 16, an object close to the third adjusted tactile data generated by the combination of the three selections made in FIG. 9, FIG. 10, and FIG. 16 can be detected by referring to a tactile database (not shown) that stores tactile data of various objects, and an animation of that object can be displayed together with output of its sound.
The above describes an embodiment of the present invention, but the present invention is not limited to the above embodiment, and various improvements and modifications are possible without departing from the spirit of the invention.
1: Information processing device
2: Tactile presentation device
17: Tactile reception unit
18: Tactile setting unit
24: MRF device
100: Tactile presentation system
Claims (7)
- A tactile presentation system comprising:
a tactile reception unit that receives tactile identification information that identifies a desired tactile sensation in response to a user's operation;
a tactile setting unit that sets the tactile identification information received by the tactile reception unit; and
a tactile presentation device that presents the desired tactile sensation based on the tactile identification information set by the tactile setting unit.
- The tactile presentation system according to claim 1, further comprising
a display unit that displays coordinates including a first axis indicating first tactile identification information and a second axis indicating second tactile identification information, and an operator that is arranged on the coordinates and is displaceable in response to the user's operation,
wherein the tactile reception unit receives a displacement of the operator displayed by the display unit, and
the tactile setting unit sets the first and second tactile identification information corresponding to the displacement of the operator received by the tactile reception unit.
- The tactile presentation system according to claim 1, further comprising
a display unit that displays a plurality of options corresponding to a plurality of types of tactile sensations,
wherein the tactile reception unit selects one option from among the plurality of options displayed by the display unit in response to the user's operation, and
the tactile setting unit sets tactile identification information corresponding to the option selected by the tactile reception unit.
- The tactile presentation system according to claim 2 or 3, wherein the tactile setting unit sets, as the tactile identification information, a current value to be supplied to the tactile presentation device.
- The tactile presentation system according to claim 2 or 3, wherein the tactile setting unit sets, as the tactile identification information, a frequency of a current to be supplied to the tactile presentation device.
- A tactile presentation method performed by a computer, comprising:
receiving tactile identification information that identifies a desired tactile sensation in response to a user's operation;
setting the received tactile identification information; and
controlling a tactile presentation device so as to present the desired tactile sensation based on the set tactile identification information.
- A tactile presentation program for causing a computer to execute:
receiving tactile identification information that identifies a desired tactile sensation in response to a user's operation;
setting the received tactile identification information; and
controlling a tactile presentation device so as to present the desired tactile sensation based on the set tactile identification information.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023057485 | 2023-03-31 | ||
JP2023-057485 | 2023-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024204546A1 (en) | 2024-10-03 |
Family
ID=92906774
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2024/012671 | Haptic sensation presenting system | 2023-03-31 | 2024-03-28 |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024204546A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011048651A (en) * | 2009-08-27 | 2011-03-10 | National Institute Of Information & Communication Technology | Grip feeling feed-back device |
JP2017138651A (en) * | 2016-02-01 | 2017-08-10 | 株式会社栗本鐵工所 | Force sense presentation apparatus |
JP2019219948A (en) * | 2018-06-20 | 2019-12-26 | アルプスアルパイン株式会社 | Operation system, operation device, control device, control method, and program |
WO2020111155A1 (en) * | 2018-11-30 | 2020-06-04 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing device, method for controlling same, and program |
JP2023025707A (en) * | 2021-05-19 | 2023-02-22 | アルプスアルパイン株式会社 | Tactile control unit, program, tactile control method, tactile control system, and server |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6670884B2 (en) | System and method for tactile-use adaptive and multi-faceted displays | |
JP6588951B2 (en) | System and method using multiple actuators to achieve texture | |
JP6820652B2 (en) | Systems and methods for producing friction and vibration tactile effects | |
JP3085481U (en) | Tactile feedback for touchpads and other touch controls | |
EP2849034B1 (en) | Apparatus for haptic display of data features | |
JP6323182B2 (en) | Electronic book apparatus and electronic book program | |
EP2796964B1 (en) | System and Method for a Haptically-Enabled Deformable Surface | |
JP2019192242A (en) | Systems, devices and methods for providing immersive reality interface modes | |
EP3508952A1 (en) | Systems and methods for providing mode or state awareness with programmable surface texture | |
EP3614236A1 (en) | User interface device | |
EP3441866A1 (en) | Systems and methods for multi-pressure interaction on touch-sensitive surfaces | |
US20090295739A1 (en) | Haptic tactile precision selection | |
CN110597380A (en) | Apparatus and method for providing local haptic effects to a display screen | |
JP2015167023A (en) | Systems and methods for texture engine | |
JP2012526331A (en) | Method and apparatus for forming shape change display by tactile feedback | |
WO2011135171A1 (en) | Apparatus and method for providing tactile feedback for user | |
CN105094312B (en) | The modification of dynamic haptic effect | |
WO2015088492A1 (en) | Input friction mechanism for rotary inputs of electronic devices | |
WO2024204546A1 (en) | Haptic sensation presenting system | |
WO2005040954A1 (en) | Haptic input device for generating control information | |
JP2023148851A (en) | Control device, control method, haptic feedback system, and computer program | |
JP2014142869A (en) | Information processor, information processing method, program and recording medium | |
WO2024204530A1 (en) | Information processing system, information processing device, information processing method, and computer program | |
WO2024166646A1 (en) | Information processing system, information processing device, information processing method, and computer program | |
JP2024137188A (en) | Virtual object tactile presentation system, visual presentation device, virtual object tactile presentation method, and computer program |