CN113414753A - Indoor robot autonomous following system and method - Google Patents
Indoor robot autonomous following system and method
- Publication number
- CN113414753A (application CN202110618550.XA)
- Authority
- CN
- China
- Prior art keywords
- robot
- module
- human body
- controller
- head
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
- B25J5/007—Manipulators mounted on wheels or on carriages mounted on wheels
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0054—Cooling means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/0075—Means for protecting the manipulator from its environment or vice versa
- B25J19/0083—Means for protecting the manipulator from its environment or vice versa using gaiters
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/026—Acoustical sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Manipulator (AREA)
Abstract
The invention belongs to the technical field of robots and particularly discloses an indoor robot autonomous following system and method. The system comprises a mobile robot, which in turn comprises a controller, a human body characteristic acquisition module, a sound acquisition module, an identification module, a moving mechanism, a robot head, a robot trunk and a driving mechanism. The controller, the human body characteristic acquisition module, the sound acquisition module and the identification module are all arranged on the robot head; the moving mechanism is arranged at the bottom of the robot trunk and drives the mobile robot to move along with the human body; the two ends of the driving mechanism are connected with the robot head and the robot trunk respectively and drive the robot head to rotate. The invention can quickly identify the following target, can effectively identify the following target in a multi-person environment, and achieves effective following by the robot in a complex environment.
Description
Technical Field
The invention relates to the technical field of robots, in particular to an indoor robot autonomous following system and method.
Background
With the continuous application of robot-related technology in production and daily life, people's requirements for robot interaction functions keep increasing. Robot following technology is an important component of the human-computer interaction field and plays an important role in household, commercial and military applications of robots.
The existing indoor robot following technology cannot achieve target following in a multi-person environment, where the target is easily lost or the wrong person is followed. Therefore, an indoor robot autonomous following system and method are provided.
Disclosure of Invention
The invention aims to provide an indoor robot autonomous following system and method to solve the problems raised in the above background art.
In order to achieve the above purpose, the invention provides the following technical scheme: an indoor robot autonomous following system comprises a mobile robot, and is characterized in that the mobile robot comprises a controller, a human body characteristic acquisition module, a sound acquisition module, an identification module, a moving mechanism, a robot head, a robot trunk and a driving mechanism; the controller, the human body characteristic acquisition module, the sound acquisition module and the identification module are all arranged on the robot head; the moving mechanism is arranged at the bottom of the robot trunk and is used for driving the mobile robot to move along with the human body; the two ends of the driving mechanism are connected with the robot head and the robot trunk respectively and are used for driving the robot head to rotate;
The human body characteristic acquisition module is used for acquiring the performance characteristics of a human body, and the sound acquisition module is used for acquiring the sound information of the human body. The recognition module is connected with the human body characteristic acquisition module and the sound acquisition module and is used for recognizing the performance characteristics and the sound information of the human body. When the recognition module recognizes that the information is specific information, it outputs a recognition signal to the controller, and the controller controls the driving mechanism to drive the robot head to rotate so that the robot head faces the direction of the human body.
Preferably, the mobile robot further comprises an ultrasonic sensor, and the ultrasonic sensor is arranged on the moving mechanism and used for detecting the obstacle.
Preferably, the performance characteristics include color characteristics, gait characteristics, facial characteristics, and bone characteristics.
Preferably, the driving mechanism comprises a driving motor; the driving motor is mounted in an accommodating groove inside the robot trunk, the top of the driving motor is connected with a rotating shaft, and the top end of the rotating shaft passes out of the accommodating groove and is connected with the robot head.
Preferably, a bearing is sleeved outside the rotating shaft, and the bearing is embedded at the top of the robot trunk.
Preferably, the back side of the robot trunk is provided with a heat dissipation hole, a heat dissipation pipe is screwed into the heat dissipation hole, and a dustproof mesh plate is arranged in the heat dissipation pipe.
Preferably, the sound collection module is a sound collector; the human body characteristic acquisition module is an image sensor.
An indoor robot autonomous following method comprises the following steps:
the sound information of a human body is collected through the sound acquisition module and the collected sound is transmitted to the recognition module; the recognition module recognizes the sound information of the human body, and when the recognition module recognizes that the information is specific information, it outputs a recognition signal to the controller, and the controller controls the driving mechanism to drive the robot head to rotate so that the robot head faces the direction of the human body;
the performance characteristics of the human body are collected through the human body characteristic acquisition module and the collected performance characteristics are transmitted to the recognition module; when the recognition module recognizes that the performance characteristics are specific characteristics, it outputs a recognition signal to the controller, and the controller controls the moving mechanism to drive the mobile robot to move toward the human body.
Compared with the prior art, the invention has the beneficial effects that:
1. according to the invention, the voice information of the human body is collected through the voice collection module, the collected voice is transmitted to the identification module, the identification module identifies the voice information of the human body, under the condition that the identification module identifies that the information is specific information, the identification module outputs an identification signal to the controller, and the controller controls the driving mechanism to drive the head of the robot to rotate so that the head of the robot faces the direction of the human body; the human body performance characteristics are collected through a human body characteristic collecting module, the collected human body performance characteristics are transmitted to an identification module, under the condition that the identification module identifies that the performance characteristics are specific characteristics, the identification module outputs identification signals to the controller, and the controller controls a moving mechanism to drive the mobile robot to move towards the human body; the following target can be quickly identified, effective identification of the following target in a multi-person environment can be completed, and effective following of the robot in a complex environment is achieved.
2. According to the invention, the heat dissipation pipe is arranged on the back side of the robot trunk, so that heat from the driving mechanism can be dissipated and its normal operation ensured; the dustproof mesh plate arranged in the heat dissipation pipe prevents external dust from entering the robot trunk, and since the heat dissipation pipe is screwed to the robot trunk, it is easy to disassemble and assemble, which makes cleaning the dustproof mesh plate convenient; the bearing sleeved on the outside of the rotating shaft allows the robot head to rotate smoothly.
Drawings
FIG. 1 is a schematic view of the overall structure of the present invention;
FIG. 2 is a schematic view of the internal structure of the present invention;
FIG. 3 is a flow chart of the robot autonomous following method of the present invention.
In the figure: 1. a controller; 2. a human body characteristic acquisition module; 3. a sound collection module; 4. an identification module; 5. a moving mechanism; 6. a robot head; 7. a robot trunk; 8. a rotating shaft; 9. a heat dissipation pipe; 10. a dustproof mesh plate; 11. a bearing; 12. an ultrasonic sensor; 13. a driving motor.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "vertical", "upper", "lower", "horizontal", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly and may, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to specific situations.
Example 1: referring to fig. 1-2, the present invention provides a technical solution: an indoor robot autonomous following system comprises a mobile robot, and is characterized in that the mobile robot comprises a controller 1, a human body characteristic acquisition module 2, a sound acquisition module 3, an identification module 4, a moving mechanism 5, a robot head 6, a robot trunk 7 and a driving mechanism; the controller 1, the human body characteristic acquisition module 2, the sound acquisition module 3 and the identification module 4 are all arranged on the robot head 6; the moving mechanism 5 is arranged at the bottom of the robot trunk 7 and is used for driving the mobile robot to move along with the human body; the two ends of the driving mechanism are connected with the robot head 6 and the robot trunk 7 respectively and are used for driving the robot head 6 to rotate;
The human body characteristic acquisition module 2 is used for acquiring the performance characteristics of a human body, and the sound acquisition module 3 is used for acquiring the sound information of the human body. The identification module 4 is connected with the human body characteristic acquisition module 2 and the sound acquisition module 3 and is used for identifying the performance characteristics and the sound information of the human body. When the identification module 4 identifies that the information is specific information, it outputs an identification signal to the controller 1, and the controller 1 controls the driving mechanism to drive the robot head 6 to rotate so that the robot head 6 faces the direction of the human body; when the identification module 4 identifies that the performance characteristics are specific characteristics, it outputs an identification signal to the controller 1, and the controller 1 controls the moving mechanism 5 to drive the mobile robot to move toward the human body.
Further, the mobile robot further includes an ultrasonic sensor 12, and the ultrasonic sensor 12 is disposed on the moving mechanism 5 and is used for detecting an obstacle.
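As a non-limiting illustration of how the ultrasonic sensor 12 could gate the following motion, the sketch below pauses the moving mechanism 5 when an obstacle is detected within a safety distance; the 0.4 m threshold and the sensor and drive interfaces are assumptions made for this illustration, not features of the disclosure.

```python
# Non-limiting sketch: pause the following motion when the ultrasonic
# sensor (12) reports an obstacle inside a safety distance. The 0.4 m
# threshold and the read_distance_m()/stop()/move_toward() interfaces
# are assumptions of this illustration, not features of the disclosure.

SAFETY_DISTANCE_M = 0.4

def follow_with_obstacle_check(ultrasonic_sensor, moving_mechanism, target_position):
    distance = ultrasonic_sensor.read_distance_m()
    if distance is not None and distance < SAFETY_DISTANCE_M:
        moving_mechanism.stop()                        # obstacle too close: hold position
    else:
        moving_mechanism.move_toward(target_position)  # continue following the target
```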
Further, the performance characteristics include color characteristics, gait characteristics, facial characteristics, and bone characteristics.
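As a non-limiting illustration of how the identification module 4 might combine these performance characteristics to single out the followed target among several detected people, a minimal sketch follows; the weights, the threshold and the equality-based similarity are assumptions of this illustration and are not specified by the disclosure.

```python
# Non-limiting sketch: combine the four performance characteristics
# (color, gait, face, skeleton) to pick the followed target among
# several detected people. Weights, the 0.8 threshold and the
# placeholder similarity are illustrative assumptions only.

def similarity(a, b):
    # Placeholder similarity in [0, 1]; a real system would compare
    # feature vectors (e.g. colour histograms or face embeddings).
    return 1.0 if a == b else 0.0

def match_target(candidates, target_profile, threshold=0.8):
    """Return the detected person best matching the stored target, or None."""
    weights = {"color": 0.2, "gait": 0.2, "face": 0.4, "skeleton": 0.2}
    best_person, best_score = None, 0.0
    for person in candidates:  # each person: dict of the four features
        score = sum(weights[k] * similarity(person[k], target_profile[k])
                    for k in weights)
        if score > best_score:
            best_person, best_score = person, score
    return best_person if best_score >= threshold else None
```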
Further, the driving mechanism comprises a driving motor 13; the driving motor 13 is mounted in an accommodating groove inside the robot trunk 7, the top of the driving motor 13 is connected with a rotating shaft 8, and the top end of the rotating shaft 8 passes out of the accommodating groove and is connected with the robot head 6.
Further, a bearing 11 is sleeved on the outside of the rotating shaft 8 and is embedded in the top of the robot trunk 7; the bearing 11 allows the rotating shaft 8, and in turn the robot head 6, to rotate smoothly.
Further, the back side of the robot trunk 7 is provided with a heat dissipation hole, and a heat dissipation pipe 9 is screwed into the heat dissipation hole; this makes disassembly and assembly easy and allows heat from the driving mechanism to be dissipated. A dustproof mesh plate 10 is arranged in the heat dissipation pipe 9 to prevent external dust from entering the robot trunk 7.
Further, the sound collection module 3 is a sound collector; the human body characteristic acquisition module 2 is an image sensor.
Furthermore, the sound information of the human body is collected through the sound acquisition module 3 and the collected sound is transmitted to the identification module 4; the identification module 4 identifies the sound information of the human body, and when it identifies that the information is specific information, it outputs an identification signal to the controller 1, and the controller 1 controls the driving mechanism to drive the robot head 6 to rotate so that the robot head 6 faces the direction of the human body.
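A minimal, non-limiting sketch of this voice check is given below, assuming for illustration that the "specific information" is a stored wake phrase spoken by the followed person; the recognizer interface, the wake phrases and the confidence threshold are assumptions of the illustration and are not part of the disclosure.

```python
# Hedged sketch: decide whether the collected sound is the "specific
# information" that should make the robot head (6) turn toward the
# speaker. The recognizer interface, wake phrases and 0.7 confidence
# threshold are assumed values, not taken from the patent.

WAKE_PHRASES = {"follow me", "come here"}   # illustrative placeholders
CONFIDENCE_THRESHOLD = 0.7

def is_specific_information(recognizer, audio):
    text, confidence = recognizer.transcribe(audio)  # speech-to-text (assumed API)
    if confidence < CONFIDENCE_THRESHOLD:
        return False
    return text.strip().lower() in WAKE_PHRASES

def on_sound(recognizer, audio, controller, sound_direction_deg):
    # On a match, the controller (1) turns the robot head (6) toward
    # the estimated direction of the speaker.
    if is_specific_information(recognizer, audio):
        controller.rotate_head_to(sound_direction_deg)
```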
Example 2: referring to fig. 3, an indoor robot autonomous following method comprises the following steps:
the sound information of a human body is collected through the sound acquisition module 3 and the collected sound is transmitted to the recognition module 4; the recognition module 4 recognizes the sound information of the human body, and when the recognition module 4 recognizes that the information is specific information, it outputs a recognition signal to the controller 1; the controller 1 controls the driving motor 13 in the driving mechanism to operate, the driving motor 13 drives the rotating shaft 8 to rotate, and the rotating shaft 8 drives the robot head 6 to rotate, so that the robot head 6 faces the direction of the human body;
the performance characteristics of the human body are collected through the human body characteristic acquisition module 2 and the collected performance characteristics are transmitted to the recognition module 4; when the recognition module 4 recognizes that the performance characteristics are specific characteristics, it outputs a recognition signal to the controller 1, and the controller 1 controls the moving mechanism 5 to drive the mobile robot to move toward the human body.
It is worth noting that the whole device is controlled through a master control button; the equipment matched with the control button is common equipment and belongs to existing mature technology, so the electrical connection relationship and the specific circuit structure are not described again.
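For illustration only, the two-stage control flow described in Examples 1 and 2 can be sketched as follows; this is a non-limiting sketch, and all class names, method names and interfaces are assumptions of the illustration rather than part of the disclosed embodiments.

```python
# Non-limiting sketch of the overall two-stage flow of Examples 1 and 2.
# All module objects are duck-typed placeholders: sound_module (3),
# feature_module (2), recognizer (4), head_drive (driving motor 13 and
# rotating shaft 8), moving_mechanism (5). None of these names come
# from the patent itself.

class FollowingController:
    def __init__(self, sound_module, feature_module, recognizer,
                 head_drive, moving_mechanism):
        self.sound_module = sound_module
        self.feature_module = feature_module
        self.recognizer = recognizer
        self.head_drive = head_drive
        self.moving_mechanism = moving_mechanism

    def step(self):
        # Stage 1: on a recognized ("specific") voice, turn the head
        # toward the speaker by driving the motor and rotating shaft.
        audio = self.sound_module.capture()
        voice = self.recognizer.identify_voice(audio)
        if voice.is_specific:
            self.head_drive.rotate_to(voice.direction)

        # Stage 2: when the person's performance characteristics match
        # the stored target, drive the moving mechanism toward them.
        image = self.feature_module.capture()
        person = self.recognizer.identify_features(image)
        if person.is_specific:
            self.moving_mechanism.move_toward(person.position)
```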
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.
Claims (8)
1. An indoor robot autonomous following system comprises a mobile robot, and is characterized in that the mobile robot comprises a controller (1), a human body characteristic acquisition module (2), a sound acquisition module (3), an identification module (4), a moving mechanism (5), a robot head (6), a robot trunk (7) and a driving mechanism; the controller (1), the human body characteristic acquisition module (2), the sound acquisition module (3) and the identification module (4) are all arranged on the robot head (6); the moving mechanism (5) is arranged at the bottom of the robot trunk (7) and is used for driving the mobile robot to move along with a human body; the two ends of the driving mechanism are respectively connected with the robot head (6) and the robot trunk (7) and are used for driving the robot head (6) to rotate;
the human body characteristic acquisition module (2) is used for acquiring the performance characteristics of a human body, and the sound acquisition module (3) is used for acquiring the sound information of the human body; the identification module (4) is connected with the human body characteristic acquisition module (2) and the sound acquisition module (3) and is used for identifying the performance characteristics and the sound information of the human body; when the identification module (4) identifies that the information is specific information, the identification module (4) outputs an identification signal to the controller (1), and the controller (1) controls the driving mechanism to drive the robot head (6) to rotate so that the robot head (6) faces the direction of the human body; when the identification module (4) identifies that the performance characteristics are specific characteristics, the identification module (4) outputs an identification signal to the controller (1), and the controller (1) controls the moving mechanism (5) to drive the mobile robot to move toward the human body.
2. An indoor robot autonomous following system according to claim 1, characterized in that: the mobile robot further comprises an ultrasonic sensor (12), and the ultrasonic sensor (12) is arranged on the moving mechanism (5) and used for detecting obstacles.
3. An indoor robot autonomous following system according to claim 1, characterized in that: the performance characteristics include color characteristics, gait characteristics, facial characteristics, and skeletal characteristics.
4. An indoor robot autonomous following system according to claim 1, characterized in that: the driving mechanism comprises a driving motor (13); the driving motor (13) is mounted in an accommodating groove inside the robot trunk (7), the top of the driving motor (13) is connected with a rotating shaft (8), and the top end of the rotating shaft (8) passes out of the accommodating groove and is connected with the robot head (6).
5. An indoor robot autonomous following system according to claim 1, characterized in that: a bearing (11) is sleeved on the outside of the rotating shaft (8), and the bearing (11) is embedded in the top of the robot trunk (7).
6. An indoor robot autonomous following system according to claim 1, characterized in that: the back side of the robot trunk (7) is provided with a heat dissipation hole, a heat dissipation pipe (9) is screwed into the heat dissipation hole, and a dustproof mesh plate (10) is arranged in the heat dissipation pipe (9).
7. An indoor robot autonomous following system according to claim 1, characterized in that: the sound acquisition module (3) is a sound collector; the human body characteristic acquisition module (2) is an image sensor.
8. An indoor robot autonomous following method according to any of claims 1-7, characterized by comprising the steps of:
the sound information of a human body is collected through the sound acquisition module (3) and the collected sound is transmitted to the recognition module (4); the recognition module (4) recognizes the sound information of the human body, and when the recognition module (4) recognizes that the information is specific information, the recognition module (4) outputs a recognition signal to the controller (1), and the controller (1) controls the driving mechanism to drive the robot head (6) to rotate so that the robot head (6) faces the direction of the human body;
the performance characteristics of the human body are collected through the human body characteristic acquisition module (2) and the collected performance characteristics are transmitted to the recognition module (4); when the recognition module (4) recognizes that the performance characteristics are specific characteristics, the recognition module (4) outputs a recognition signal to the controller (1), and the controller (1) controls the moving mechanism (5) to drive the mobile robot to move toward the human body.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110618550.XA CN113414753A (en) | 2021-06-03 | 2021-06-03 | Indoor robot autonomous following system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110618550.XA CN113414753A (en) | 2021-06-03 | 2021-06-03 | Indoor robot autonomous following system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113414753A (en) | 2021-09-21 |
Family
ID=77713714
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110618550.XA Pending CN113414753A (en) | 2021-06-03 | 2021-06-03 | Indoor robot autonomous following system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113414753A (en) |
- 2021-06-03: CN application CN202110618550.XA filed (publication CN113414753A, en), status: active, Pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101193610B1 (en) * | 2012-04-24 | 2012-10-26 | 경남대학교 산학협력단 | Intelligent robot system for traffic guidance of crosswalk |
CN105034002A (en) * | 2015-08-04 | 2015-11-11 | 北京进化者机器人科技有限公司 | Multifunctional home service robot |
CN106774325A (en) * | 2016-12-23 | 2017-05-31 | 湖南晖龙股份有限公司 | Robot is followed based on ultrasonic wave, bluetooth and vision |
CN107336245A (en) * | 2017-05-27 | 2017-11-10 | 芜湖星途机器人科技有限公司 | Head actively follows robot |
CN107398900A (en) * | 2017-05-27 | 2017-11-28 | 芜湖星途机器人科技有限公司 | Active system for tracking after robot identification human body |
CN108638080A (en) * | 2018-04-24 | 2018-10-12 | 芜湖信河信息技术有限公司 | Lift auxiliary robot |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113858216A (en) * | 2021-12-01 | 2021-12-31 | 南开大学 | Robot following method, device and system |
CN113858216B (en) * | 2021-12-01 | 2022-02-22 | 南开大学 | Robot following method, device and system |
CN114630232A (en) * | 2022-03-19 | 2022-06-14 | 南京华脉科技股份有限公司 | Voice conversation device based on artificial intelligence and conversation system thereof |
CN114630232B (en) * | 2022-03-19 | 2023-10-17 | 国网上海市电力公司超高压分公司 | Speech dialogue device based on artificial intelligence and dialogue system thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102176222B (en) | Multi-sensor information collection analyzing system and autism children monitoring auxiliary system | |
CN113414753A (en) | Indoor robot autonomous following system and method | |
CN102499638B (en) | Living body detection system based on vision, hearing, smell and touch | |
CN107984481A (en) | A kind of split type house Kang Hu robots | |
CN107038844A (en) | One kind nurse robot | |
CN108910355A (en) | A kind of intelligent garbage recyclable device and control method | |
CN111182234A (en) | Face recognition method and device | |
CN205299832U (en) | Intelligence air conditioner based on infrared human response | |
CN212859492U (en) | Intelligent household robot | |
CN109240296A (en) | A kind of omnidirectional's intelligent carriage | |
CN113499055A (en) | Intelligent respiratory anomaly detection system | |
CN211209769U (en) | AI action recognition analysis camera | |
CN210819528U (en) | IOT (Internet of things) intelligent household robot based on ROS (reactive oxygen species) system | |
CN208910246U (en) | A kind of cardioelectric monitor system of voice control | |
CN106384471A (en) | Electronic dog | |
CN215031481U (en) | Cleaning robot for nuclear power plant | |
CN208005686U (en) | A kind of split type house Kang Hu robots | |
CN207390142U (en) | A kind of intelligent garbage bin | |
CN213758071U (en) | Intelligent cleaning robot | |
CN210822754U (en) | Aerial photography unmanned aerial vehicle for homeland survey | |
CN207851602U (en) | Smart home detects host | |
WO2021184561A1 (en) | Ultraviolet inspection robot | |
CN217530872U (en) | Robot assembly and multi-robot system | |
CN209149155U (en) | Electric control system of electric power inspection robot | |
CN108202804A (en) | A kind of intelligence slide plate perambulator |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210921 |