Mihoko Niitsuma
2020 – today
- 2024
- [c76] Dávid Vincze, Mihoko Niitsuma: Towards an Interaction Stimulator Social Robot in the Parent-Child Interaction Therapy based on Real-time Speech Processing. RO-MAN 2024: 886-892
- [c75] Dávid Vincze, Mihoko Niitsuma: A Human Speech-based Behavior Model for Social Robots based on Parent-Child Interaction Therapy. SII 2024: 579-584
- [c74] Otono Uchikawa, Mihoko Niitsuma: Different Age Groups Comparison on Impression Evaluation of Rewarding/Punitive Behavior with Gestures and Gaze of Robots. SII 2024: 610-615
- [c73] Ryogo Kai, Kenta Ohashi, Hikaru Fujita, Takuya Kojima, Yuma Sasaki, Mihoko Niitsuma, Kazunori Umeda: Development of an Environmentally Independent Mobile Manipulation System for Product Disposal in Retail Stores. SII 2024: 1282-1287
- [c72] Atsushi Sugimoto, Yuma Sasaki, Takuya Kojima, Otono Uchikawa, Taisei Inaba, Taito Nishikawa, Mihoko Niitsuma: Situation-based Proactive Human-Robotic Systems Interaction and Collaboration in Future Convenience Stores. SII 2024: 1417-1424
- 2023
- [c71] Takumi Sato, Mihoko Niitsuma: Evaluation of Cognitive Assistance for Visual Impairment to Recognize Dynamic Walking Environments Using Vibrotactile Stimulation. IECON 2023: 1-6
- [c70] Takumi Sato, Mihoko Niitsuma: Cognitive assistance for the visually impaired using haptics presentation of environmental information. ISIE 2023: 1-4
- [c69] Dávid Vincze, Mihoko Niitsuma: Human-Object-Robot Interaction Through Playing Behaviour for Social Robots. SMC 2023: 3174-3179
- 2022
- [j7] Takumi Nishio, Mihoko Niitsuma: Moving objects extraction for building environmental maps describing human walking activity using a 3D LiDAR. Adv. Robotics 36(23): 1291-1304 (2022)
- [c68] Otono Uchikawa, Mihoko Niitsuma: Two Age Groups Comparison on Impression Evaluation of Distance and Communication with Two Different Appearance Mobile Robots. AIM 2022: 902-908
- [c67] Dávid Vincze, Mihoko Niitsuma: What-You-See-Is-What-You-Get Indoor Localization for Physical Human-Robot Interaction Experiments. AIM 2022: 909-914
- [c66] Jumpei Shiga, Gakushi Maruyama, Shinnosuke Kato, Takahiro Hashimoto, Kaori Shibagaki, Ken Kitamura, Yuki Ikeguchi, Atsushi Sugimoto, Mihoko Niitsuma: Intelligent Space-Based Future Convenience Store for Customer Interaction Service with New Experience, Safety, and Flexibility. SII 2022: 1064-1071
- 2021
- [j6] Kanon Fujino, Mihoko Niitsuma: Effects of Presenting People Flow Information by Vibrotactile Stimulation for Visually Impaired People on Behavior Decision. J. Robotics Mechatronics 33(1): 97-107 (2021)
- [c65] Dávid Vincze, Márta Gácsi, Szilveszter Kovács, Péter Korondi, Ádám Miklósi, Mihoko Niitsuma: Towards the automatic observation and evaluation of ethologically inspired Human-Robot Interaction. AIM 2021: 586-591
- [c64] Moe Urasaki, Mihoko Niitsuma: Mental Fatigue Evaluation using Blink Frequency and Pupil Diameter on Human-Robot Collaboration with Different Motion Timing. ISIE 2021: 1-5
- [c63] Tappei Katsunaga, Takayuki Tanaka, Mihoko Niitsuma, Saburo Takahashi, Toshihisa Abe: Hierarchical probabilistic task recognition based on spatial memory for care support. SII 2021: 310-314
- [c62] Shinnosuke Kato, Mihoko Niitsuma, Takayuki Tanaka: Pose Identification for Task Recognition in Care Work. SII 2021: 315-319
- [c61] Dávid Vincze, Márta Gácsi, Szilveszter Kovács, Mihoko Niitsuma, Péter Korondi, Ádám Miklósi: Towards the automatic observation and coding of simple behaviours in ethological experiments. SII 2021: 841-842
- [c60] Tamás Tompa, Szilveszter Kovács, Dávid Vincze, Mihoko Niitsuma: Demonstration of expert knowledge injection in Fuzzy Rule Interpolation based Q-learning. SII 2021: 843-844
- 2020
- [c59] Dávid Vincze, Alex Tóth, Mihoko Niitsuma: Antecedent Redundancy Exploitation in Fuzzy Rule Interpolation-based Reinforcement Learning. AIM 2020: 1316-1321
- [c58] Kenta Nakajima, Mihoko Niitsuma: Effects of Space and Scenery on Virtual Pet-Assisted Activity. HAI 2020: 105-111
- [c57] Jumpei Okimoto, Mihoko Niitsuma: Effects of Auditory Cues on Human-Robot Collaboration. ISIE 2020: 1572-1577
- [c56] Kouji Sakotani, Shinnosuke Kato, Mihoko Niitsuma, Takayuki Tanaka: Task Activity Recognition and Workspace Extraction for Nursing Care Assistance in Intelligent Space. SII 2020: 1259-1264
- [c55] Tappei Katsunaga, Takayuki Tanaka, Mihoko Niitsuma, Saburo Takahashi, Toshihisa Abe: Task recognition based on task history considering estimation error of localization in care support with autonomous mobile wheelchairs. SII 2020: 1265-1269
- [c54] Dávid Vincze, Alex Tóth, Mihoko Niitsuma: Football Simulation Modeling with Fuzzy Rule Interpolation-based Fuzzy Automaton. UR 2020: 87-92
2010 – 2019
- 2019
- [c53] Kanon Fujino, Mihoko Niitsuma: Extraction and Presentation of People Flow Information to Assist Dynamic Environment Recognition for Visually Impaired People. IECON 2019: 6883-6889
- [c52] Toshiki Hashizume, Mihoko Niitsuma: Pose Presentation of End Effector Using Vibrotactile Interface for Assistance in Motion Sharing of Industrial Robot Remote Operation. ISIE 2019: 1186-1191
- [c51] Takumi Nishio, Mihoko Niitsuma: Environmental Map Building to Describe Walking Dynamics for Determination of Spatial Feature of Walking Activity. ISIE 2019: 2315-2320
- [c50] Motoaki Shiratsu, Yuta Sampei, Shuntaro Noguchi, Ryo Midorikawa, Kouji Sakotani, Takumi Nishio, Mihoko Niitsuma: Interactive Presentation of Information for Shopping Assistance in Intelligent Space. SII 2019: 379-384
- 2018
- [c49] Natsuki Ichikawa, Mihoko Niitsuma: Nonverbal Human-Robot Communication for Ambient Assisted Living Applications Based on Ethologically Inspired Social Behavior Model. IECON 2018: 6045-6050
- [c48] Ryo Midorikawa, Mihoko Niitsuma: Effects of Touch Experience on Active Human Touch in Human-Robot Interaction. SyRoCo 2018: 154-159
- [e1] Mihoko Niitsuma: 12th IFAC Symposium on Robot Control, SyRoCo 2018, Budapest, Hungary, August 27-30, 2018. IFAC-PapersOnline 51(22), International Federation of Automatic Control 2018
- 2017
- [c47] Daisuke Yamazaki, Mihoko Niitsuma: Presentation of contact information of industrial robot's end-effector using vibrotactile sensation. HSI 2017: 172-177
- [c46] Hiroshi Fukushima, Mihoko Niitsuma: Investigation of cognitive characteristics of human vision to moving objects for visual field estimation. SII 2017: 1080-1085
- [c45] Yuta Sampei, Mihoko Niitsuma: Approach based on geometric shape of pedestrian's head to shoulder region for human tracking in high density crowd using a 3D laser range finder. URAI 2017: 846-847
- 2016
- [c44] Kaito Tsukada, Mihoko Niitsuma: Impression on Human-Robot Communication Affected by Inconsistency in Expected Robot Perception. HAI 2016: 261-262
- [c43] Yasuyuki Sawada, Mihoko Niitsuma: Dynamic obstacle avoidance based on obstacle type for interactive smart electric wheelchair. IECON 2016: 5891-5896
- [c42] Honoka Kanai, Mihoko Niitsuma: Update of human-robot relationship based on ethologically inspired human-robot communication history. RO-MAN 2016: 324-330
- [c41] Takashi Hatano, Csongor Márk Horváth, Trygve Thomessen, Mihoko Niitsuma: A vibrotactile navigation aid for remote operation of an industrial robot. SII 2016: 700-705
- 2015
- [c40] Audun Rønning Sanderud, Mihoko Niitsuma, Trygve Thomessen: A likelihood analysis for a risk analysis for safe human robot collaboration. ETFA 2015: 1-6
- [c39] Soh Takahashi, Márta Gácsi, Péter Korondi, Hideki Hashimoto, Mihoko Niitsuma: Leading a Person Using Ethologically Inspired Autonomous Robot Behavior. HRI (Extended Abstracts) 2015: 87-88
- [c38] Antal Martinecz, Mihoko Niitsuma: Modeling the G-Protein Signaling of the Retina with Fractional Calculus. ICINCO (1) 2015: 481-488
- [c37] Yohei Takahashi, Mihoko Niitsuma: Enhancement of attachment behavior model for social robot to adapt in daily living environments. IECON 2015: 5142-5147
- [c36] Soh Takahashi, Márta Gácsi, Péter Korondi, Mihoko Niitsuma: Design of legible autonomous leading behavior based on dogs' approach. RO-MAN 2015: 211-216
- [c35] Naoki Uenoyama, Mihoko Niitsuma: Temporal segmentation of environmental map based on changing rate of activity areas and improvement of tracking method using LRF. SII 2015: 57-62
- [c34] Keita Suzuki, Trygve Thomessen, Mihoko Niitsuma: Vibrotactile information for supporting pick and place task using industrial robot remote operation. SII 2015: 69-74
- [c33] Trygve Thomessen, Bjorn Harald Ratvik Lian, Mihoko Niitsuma: Towards virtual presence based on multimodal man-machine communication: How to control and communicate grasping conditions. SII 2015: 75-80
- [c32] Hiroko Kamide, Mihoko Niitsuma, Tatsuo Arai: Implicit Nonverbal Behaviors Expressing Closeness by 3D Agents. ICSR 2015: 306-316
- [c31] Trygve Thomessen, Mihoko Niitsuma, Keita Suzuki, Takashi Hatano, Hideki Hashimoto: Towards Virtual Presence Based on Multimodal Man-Machine Communication: A Remote Operation Support System for Industrial Robots. SyRoCo 2015: 172-177
- [c30] Péter Korondi, Beáta Korcsok, Szilveszter Kovács, Mihoko Niitsuma: Etho-robotics: What kind of behaviour can we learn from the animals? SyRoCo 2015: 244-255
- 2014
- [c29] Audun Rønning Sanderud, Trygve Thomessen, Hideki Hashimoto, Hisashi Osumi, Mihoko Niitsuma: An approach to path planning and real-time redundancy control for human-robot collaboration. AIM 2014: 1018-1023
- [c28] Mihoko Niitsuma, Yohei Takahashi, Soh Takahashi: Improvement in mutual interaction between robot and person for attachment behavior of robot. IECON 2014: 4022-4027
- [c27] Atsuro Takimoto, Hiroshi Hashimoto, Mihoko Niitsuma: Effective destination determination for semi-autonomous smart electric wheelchair based on history of human activity. INDIN 2014: 763-769
- 2013
- [c26] Sota Sakamaki, Mihoko Niitsuma: Evaluation of smart electric wheelchair operation based on directional input from user and mobile robot navigation. AIM 2013: 471-476
- [c25] Péter Baranyi, Anna Esposito, Mihoko Niitsuma, Bjørn Solvang: Welcome. CogInfoCom 2013: 1
- [c24] Yuichi Muramatsu, Mihoko Niitsuma, Trygve Thomessen: Building a cognitive model of tactile sensations based on vibrotactile stimuli. CogInfoCom 2013: 149-154
- [c23] Takayuki Watabe, Mihoko Niitsuma: Mental map generation assistance tool using relative pitch difference and angular information for visually impaired people. CogInfoCom 2013: 255-260
- [c22] Toshiya Furuyama, Mihoko Niitsuma: Building a multi-resolution map for autonomous mobile robot navigation in living environments. CogInfoCom 2013: 261-266
- [c21] Mihoko Niitsuma, Ryuichi Numakunai, Akira Onodera: Tuning of behavioral characteristics in an ethologically inspired robot behavior model based on verbal communication. IECON 2013: 7855-7861
- [c20] Yuichi Muramatsu, Mihoko Niitsuma: Correspondence relationships between vibrotactile stimuli and tactile sensations determined by semantic differential. RO-MAN 2013: 668-673
- 2012
- [j5] Mihoko Niitsuma, Terumichi Ochi, Masahiro Yamaguchi, Koki Iwamoto: Design of Mutual Interaction Between a User and Smart Electric Wheelchair. J. Adv. Comput. Intell. Intell. Informatics 16(2): 305-312 (2012)
- [j4] Young Eun Song, Peter Kovacs, Mihoko Niitsuma, Hideki Hashimoto: Spatial Memory for Augmented Personal Working Environments. J. Adv. Comput. Intell. Intell. Informatics 16(2): 349-357 (2012)
- [c19] Yuichi Muramatsu, Mihoko Niitsuma, Trygve Thomessen: Perception of tactile sensation using vibrotactile glove interface. CogInfoCom 2012: 621-626
- [c18] Young Eun Song, Mihoko Niitsuma, Takashi Kubota, Hideki Hashimoto, Hyoung Il Son: Mobile multimodal human-robot interface for virtual collaboration. CogInfoCom 2012: 627-631
- [c17] Dávid Vincze, Szilveszter Kovács, Mihoko Niitsuma, Hideki Hashimoto, Péter Korondi, Márta Gácsi, Ádám Miklósi: Ethologically inspired human-robot interaction interfaces. HCCE 2012: 51-57
- [c16] Mihoko Niitsuma, Takuya Ichikawa, Ryuichi Numakunai, Akira Onodera, Péter Korondi, Hideki Hashimoto: Design of social behavior of physical agent in intelligent space. IECON 2012: 5523-5528
- [c15] Syo Hiroi, Mihoko Niitsuma: Building a map including moving objects for mobile robot navigation in living environments. INSS 2012: 1-2
- [c14] Young Eun Song, Mihoko Niitsuma, Takashi Kubota, Hideki Hashimoto, Hyoung Il Son: Multimodal Human-Robot Interface with Gesture-Based Virtual Collaboration. RiTA 2012: 91-104
- [c13] Takuya Ichikawa, Mina Yuki, Péter Korondi, Hideki Hashimoto, Márta Gácsi, Mihoko Niitsuma: Impression evaluation for different behavioral characteristics in ethologically inspired human-robot communication. RO-MAN 2012: 55-60
- [c12] Ryuichi Numakunai, Takuya Ichikawa, Márta Gácsi, Péter Korondi, Hideki Hashimoto, Mihoko Niitsuma: Exploratory behavior in ethologically inspired robot behavioral model. RO-MAN 2012: 577-582
- [c11] Takuya Ichikawa, Wataru Beppu, Szilveszter Kovács, Péter Korondi, Hideki Hashimoto, Mihoko Niitsuma: Ethologically Inspired Human-Robot Communication for Monitoring Support System in Intelligent Space. SyRoCo 2012: 58-63
- 2011
- [j3] Mihoko Niitsuma, Tamio Tanikawa: Editorial: Ambient Intelligence. J. Robotics Mechatronics 23(4): 465 (2011)
- [c10] Mihoko Niitsuma, Terumichi Ochi, Masahiro Yamaguchi, Hideki Hashimoto: An approach of human - smart electric wheelchair interaction in intelligent space. RiiSS 2011: 119-124
2000 – 2009
- 2009
- [j2] Mihoko Niitsuma, Hideki Hashimoto: Observation of Human Activities Based on Spatial Memory in Intelligent Space. J. Robotics Mechatronics 21(4): 515-523 (2009)
- [c9] Mihoko Niitsuma, Hideki Hashimoto: Comparison of visual and vibration displays for finding spatial memory in Intelligent Space. RO-MAN 2009: 587-592
- [c8] Mihoko Niitsuma, Kazuki Yokoi, Hideki Hashimoto: Observation of Human-Object Interaction Using Distributed Sensors. SyRoCo 2009: 257-262
- 2008
- [c7] Mihoko Niitsuma, Kouhei Kawaji, Kazuki Yokoi, Hideki Hashimoto: Extraction of human - object relations in Intelligent Space. RO-MAN 2008: 520-525
- 2007
- [j1] Mihoko Niitsuma, Hiroshi Hashimoto: Spatial Memory as an Aid System for Human Activity in Intelligent Space. IEEE Trans. Ind. Electron. 54(2): 1122-1131 (2007)
- [c6] Mihoko Niitsuma, Hideki Hashimoto: Extraction of Space-Human Activity Association for Design of Intelligent Environment. ICRA 2007: 1814-1819
- 2006
- [c5] Mihoko Niitsuma, Hiroshi Hashimoto, Hideki Hashimoto: Spatial Memory: an Aid System for Human Activity in Intelligent Space. ICRA 2006: 4258-4263
- 2005
- [c4] Hiroshi Hashimoto, Daisuke Takeda, Toshio Matsunaga, Masato Saito, Chiharu Ishii, Mihoko Niitsuma: Psychological evaluation for appearance of swinging robot - via SD and biosignal method approach. SMC 2005: 740-745
- [c3] Hiroshi Hashimoto, Masato Saito, Akinori Sasaki, Toshio Matsunaga, Chiharu Ishii, Mihoko Niitsuma: Recognition of Surrounding Environmental Based on Hand-Haptic Force. SMC 2005: 1223-1228
- 2003
- [c2] Adela Béres, Károly Béres, Mihoko Niitsuma, Hideki Hashimoto: Advanced Movement Model of Crowd Robots. WISES 2003: 131-138
- 2002
- [c1] Mihoko Niitsuma, Hiroshi Hashimoto, Yukio Kimura, Shintaro Ishijima: Movement Model of Multiple Mobile Robots Based on Servo System. DARS 2002: 247-256
last updated on 2024-11-11 21:23 CET by the dblp team
all metadata released as open data under CC0 1.0 license