CN111643011A - Cleaning robot control method and device, cleaning robot and storage medium
- Publication number
- CN111643011A (application CN202010457292.7A)
- Authority
- CN
- China
- Prior art keywords
- behavior
- cleaning robot
- target
- environment
- target object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A47L11/24 — Floor-sweeping machines, motor-driven
- A47L11/40 — Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
- A47L11/4002 — Installations of electric equipment
- A47L11/4011 — Regulation of the cleaning machine by electric means; control systems and remote control systems therefor
- G05D1/0219 — Control of position or course in two dimensions specially adapted to land vehicles, with means for defining a desired trajectory ensuring the processing of the whole working surface
- A47L2201/04 — Automatic control of the travelling movement; automatic obstacle detection
- A47L2201/06 — Control of the cleaning action for autonomous devices; automatic detection of the surface condition before, during or after cleaning
Abstract
The application relates to a cleaning robot control method, a cleaning robot control device, a cleaning robot and a storage medium. The method comprises the following steps: acquiring a plurality of environment images corresponding to the cleaning robot; calling an image detection model to detect a target object in the environment image, and an object type and an object position corresponding to the target object; generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images respectively; acquiring object behavior characteristics corresponding to the object type obtained according to big data analysis; and determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics, and controlling the cleaning robot to clean the target activity area. By adopting the method, the cleaning efficiency of the cleaning robot can be effectively improved.
Description
Technical Field
The present application relates to the technical field of artificial intelligence, and in particular to a cleaning robot control method and device, a cleaning robot, and a storage medium.
Background
A cleaning robot, also called a sweeping robot, can automatically clean the floor of a cleaning area in a user's home under intelligent control, without user intervention. In the conventional approach, the cleaning robot sweeps the entire cleaning area to the same extent.

However, when the user keeps a pet such as a cat or a dog, different areas may be soiled to different degrees. Controlling the cleaning robot in the conventional way applies the same cleaning strategy across the entire area, so the areas soiled to different degrees by the pet cannot be cleaned accurately and effectively, and the cleaning efficiency of the cleaning robot is low.
Disclosure of Invention
In view of the above, it is necessary to provide a cleaning robot control method, a cleaning robot control device, a cleaning robot, and a storage medium, which can improve the cleaning efficiency of the cleaning robot.
A cleaning robot control method, the method comprising:
acquiring a plurality of environment images corresponding to the cleaning robot;
calling an image detection model to detect a target object in the environment image, and an object type and an object position corresponding to the target object;
generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images respectively;
acquiring object behavior characteristics corresponding to the object type obtained according to big data analysis;
and determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics, and controlling the cleaning robot to clean the target activity area.
In one embodiment, invoking the image detection model to detect the target object in the environment image, and the object type and the object position corresponding to the target object, includes:
calling an image detection model, inputting the environment image into the image detection model, and carrying out target detection on the environment image according to the image detection model;
acquiring a target object output by the image detection model and an environment object corresponding to the target object;
acquiring environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to the environment objects;
and determining the object position corresponding to the target object according to the environment object coordinate.
In one embodiment, the obtaining of the object behavior characteristics corresponding to the object type obtained according to big data analysis includes:
acquiring reference behavior information of reference objects respectively corresponding to a plurality of cleaning robots;
screening reference behavior information of a reference object corresponding to the object type to obtain behavior information to be analyzed;
and performing feature extraction on the behavior information to be analyzed to obtain behavior features to be analyzed, and performing big data analysis on the behavior features to be analyzed to obtain object behavior features corresponding to the object types.
In one embodiment, controlling the cleaning robot to clean the target activity area includes:
adjusting a cleaning strategy corresponding to the cleaning robot according to the target activity area;
generating a cleaning control instruction according to the adjusted cleaning strategy;
and controlling the cleaning robot to perform a cleaning operation on the target activity area according to the cleaning control instruction.
In one embodiment, the method further comprises:
detecting a current distance between the cleaning robot and the target object;
when the current distance is smaller than a preset threshold value, acquiring a plurality of current positions corresponding to the target object within a preset time period;
calling a behavior prediction model to process the current positions to obtain predicted behavior positions corresponding to the target object;
and controlling the cleaning robot to execute an avoiding operation according to the predicted behavior position.
In one embodiment, the method further comprises:
generating target behavior characteristics corresponding to the target object according to the behavior track information and the object behavior characteristics;
acquiring current behavior data corresponding to the target object, and extracting current behavior characteristics corresponding to the current behavior data;
performing abnormal behavior detection on the current behavior characteristics according to the target behavior characteristics;
and when the detection result is abnormal behavior, generating abnormal behavior prompt information according to the detection result, and sending the abnormal behavior prompt information to a terminal corresponding to the cleaning robot.
A cleaning robot control apparatus, the apparatus comprising:
the image detection module is used for acquiring a plurality of environment images corresponding to the cleaning robot; calling an image detection model to detect a target object in the environment image, and an object type and an object position corresponding to the target object;
the track generation module is used for generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images;
the characteristic acquisition module is used for acquiring object behavior characteristics corresponding to the object types obtained according to big data analysis;
and the cleaning control module is used for determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics and controlling the cleaning robot to clean the target activity area.
In one embodiment, the image detection module is further configured to invoke an image detection model, input the environment image to the image detection model, and perform target detection on the environment image according to the image detection model; acquiring a target object output by the image detection model and an environment object corresponding to the target object; acquiring environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to the environment objects; and determining the object position corresponding to the target object according to the environment object coordinate.
A cleaning robot, comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the above cleaning robot control method when executing the computer program.
A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned cleaning robot control method.
According to the cleaning robot control method and device, the cleaning robot and the storage medium, the target object in the environment image, the object type and the object position corresponding to the target object are obtained by acquiring a plurality of environment images corresponding to the cleaning robot and calling the image detection model to perform target detection on the environment images, so that the target object in the environment image is detected. And generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images, wherein the behavior track information can represent the behavior track of the target object. The target activity area corresponding to the target object is determined according to the behavior track information and the object behavior characteristics by acquiring the object behavior characteristics corresponding to the object type obtained through big data analysis, and the cleaning robot is controlled to clean the target activity area, so that the target activity area of the target object is accurately and effectively cleaned in a targeted manner, and the cleaning efficiency of the cleaning robot is effectively improved.
Drawings
FIG. 1 is a schematic diagram of a cleaning robot in one embodiment;
FIG. 2 is a schematic flow chart of a cleaning robot control method according to one embodiment;
FIG. 3 is a flowchart illustrating steps for invoking an image detection model to detect a target object in an environmental image, and an object type and an object position corresponding to the target object in one embodiment;
FIG. 4 is a flowchart illustrating a control method of the cleaning robot in accordance with another embodiment;
FIG. 5 is a flowchart illustrating a control method of a cleaning robot in accordance with still another embodiment;
fig. 6 is a block diagram showing the structure of a cleaning robot control device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
The cleaning robot control method can be applied to a cleaning robot. As shown in fig. 1, the cleaning robot 100 may include, but is not limited to, a sensor 102, a controller 104, an actuator assembly 106, and the like. The controller 104 may execute the cleaning robot control method to control the execution component 106 of the cleaning robot to perform a corresponding operation. Specifically, the cleaning robot may capture an environmental image of the environment around the cleaning robot via the sensor 102. The controller 104 acquires a plurality of environmental images acquired by the cleaning robot through the sensor 102. The controller 104 invokes the image detection model to detect the target object in the environmental image, and the object type and the object position corresponding to the target object, and generates behavior trajectory information corresponding to the target object according to the object positions corresponding to the plurality of environmental images. The controller 104 obtains the object behavior characteristics corresponding to the object type obtained by the big data analysis, and determines the target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics. The controller 104 controls the cleaning robot to clean the target active area. The controller 104 may specifically control the execution component 106 to perform a cleaning operation on the target active area. The sensors 102 may specifically include, but are not limited to, visual sensors, infrared sensors, acoustic sensors, video cameras, depth cameras, and the like. The controller 104 may specifically include, but is not limited to, a Central Processing Unit (CPU), a control circuit, and the like, and is configured to process data such as the acquired environment image, and control the execution component 106 to execute a corresponding operation through the control circuit. The execution component 106 may specifically include, but is not limited to, a movement component, a cleaning component, and the like.
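The division of labor among these components can be illustrated with a minimal control-cycle sketch. The class and method names below are editorial assumptions for illustration only; the patent does not prescribe any particular API.

```python
# Minimal control-cycle sketch of the sensor/controller/executor split
# described above. All class and method names are illustrative assumptions.
class CleaningRobot:
    def __init__(self, sensor, controller, executor):
        self.sensor = sensor          # e.g. camera, infrared or laser sensor (102)
        self.controller = controller  # CPU + control circuit (104)
        self.executor = executor      # movement and cleaning components (106)

    def control_cycle(self):
        # 1. Sensor collects environment images of the surroundings.
        images = self.sensor.collect_images()
        # 2. Controller detects the target object, builds its trajectory and
        #    derives the target activity area from big-data behavior features.
        area = self.controller.plan_target_activity_area(images)
        # 3. Executor performs the cleaning operation on that area.
        self.executor.clean(area)
```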
In one embodiment, as shown in fig. 2, there is provided a cleaning robot control method, which is described by way of example as applied to the cleaning robot 100 shown in fig. 1, including the steps of:
Step 202, acquiring a plurality of environment images corresponding to the cleaning robot.

The environment image is image data of the surrounding environment in which the cleaning robot is located. Environment images may be collected during a cleaning operation, or while the cleaning robot moves specifically to capture them. In one embodiment, the cleaning robot may receive a cleaning strategy setting from the user, such as an initialization setting or a function selection, and perform the corresponding cleaning operation according to the set strategy. During the cleaning process, a plurality of environment images corresponding to the cleaning robot are acquired in parallel.
The cleaning robot can acquire a plurality of environment images through its sensors. Different cleaning robots may be equipped with different sensor types, and the data format of the environment images collected by different sensors may differ accordingly.
For example, when the sensor of the cleaning robot includes a camera, the cleaning robot may collect video data within a camera range through the camera during a moving process, and extract multi-frame image data from the video data as an environment image corresponding to the cleaning robot. The cleaning robot can also directly collect a plurality of environment images through the camera.
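As a concrete illustration of this frame-extraction step, the following sketch samples frames from recorded camera video with OpenCV. The file path and sampling stride are assumptions, not values from the description.

```python
# Hedged sketch: extract multi-frame image data from camera video as
# environment images. The path and sampling stride are assumptions.
import cv2

def sample_environment_images(video_path="robot_cam.mp4", stride=30):
    """Yield every `stride`-th frame of the video as one environment image."""
    capture = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % stride == 0:
            yield frame  # HxWx3 BGR numpy array
        index += 1
    capture.release()
```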
When the cleaning robot is provided with a sensor including a laser sensor, the cleaning robot can also acquire an image of the surrounding environment through the laser sensor. The environment image collected by the laser sensor can be specifically a point cloud image. Specifically, the laser sensor may emit a detection signal, such as a laser beam or the like. The laser sensor compares a signal reflected by an object in the environment where the cleaning robot is located with the detection signal to obtain point cloud data of the surrounding environment, and the point cloud data is subjected to data cleaning, point cloud segmentation, point cloud projection and the like to obtain a point cloud image of the surrounding environment of the cleaning robot.
The cleaning robot can acquire environment images through the sensor; the environment images may be image data within a preset time period, which can be set according to actual application requirements and represents the length of time over which the target's activity range is detected. For example, the preset time period may be set to two weeks, one month, three months, or half a year. The cleaning robot can cache the environment images collected by the sensor and control itself according to the plurality of environment images within the preset time period.
Step 204, invoking an image detection model to detect a target object in the environment image, and an object type and an object position corresponding to the target object.

The image detection model can be established and trained in advance; it is a target detection model built on a target detection algorithm and is used to perform target detection on each of the plurality of environment images. The target detection algorithm used to establish the image detection model may be one of several target detection algorithms such as YOLO (You Only Look Once), Faster R-CNN (a region-based convolutional neural network), CornerNet, MobileNet, SSD (Single Shot MultiBox Detector), and the like.
The target object refers to a dynamic object in the environment where the cleaning robot is located, and specifically may be an animal in the environment where the cleaning robot is located. For example, the target object may be a pet that the user maintains at home. The cleaning robot can call a pre-trained image detection model, input the collected environment image into the image detection model, and perform target detection on the environment image through the image detection model to obtain a detection result output by the image detection model. The detection result of the environment image may include the target object in the environment image, and the object type and the object position corresponding to the target object.
The object type is the one among a plurality of preset types that corresponds to the target object in the environment image. For example, the target object may be a cat or a dog. In one embodiment, the preset types may be divided into multiple levels; for example, into major classes and subclasses. The major classes include cats and dogs, and the subclasses may include specific breeds of cats or dogs. For example, the target object may be one of an Alaskan Malamute, a Golden Retriever, a Samoyed, a German Shepherd, an Akita, a Corgi, and a French Bulldog. Pets of different breeds may exhibit different behavior characteristics, and the target activity area of the target object can be determined more accurately from its specific breed, thereby improving the cleaning efficiency of the cleaning robot.
The image detection model can mark the detected target object, and the object type corresponding to it, with a bounding box in the environment image. The cleaning robot can determine the object position corresponding to the target object according to the position of the target object in the environment image; the object position refers to the position of the target object in the environment. In one embodiment, some of the environment images may not contain a target object.
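Since the description later identifies the on-robot model as a TFLite detection model, the invocation of the image detection model can be sketched as follows. The model path and the (boxes, classes, scores) output ordering follow the common SSD-style TFLite convention and are assumptions to be verified against the actual model.

```python
# Sketch of invoking a TFLite detection model on one environment image.
# The model path and the output tensor ordering are assumptions here.
import cv2
import numpy as np
import tflite_runtime.interpreter as tflite

interpreter = tflite.Interpreter(model_path="detect.tflite")
interpreter.allocate_tensors()
input_detail = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()

def detect_targets(image, score_threshold=0.5):
    """Return (box, class_id, score) for detections above the threshold."""
    _, height, width, _ = input_detail["shape"]
    blob = cv2.resize(image, (width, height))[np.newaxis, ...].astype(np.uint8)
    interpreter.set_tensor(input_detail["index"], blob)
    interpreter.invoke()
    boxes = interpreter.get_tensor(output_details[0]["index"])[0]    # ymin,xmin,ymax,xmax
    classes = interpreter.get_tensor(output_details[1]["index"])[0]  # object types
    scores = interpreter.get_tensor(output_details[2]["index"])[0]
    return [(b, int(c), float(s))
            for b, c, s in zip(boxes, classes, scores) if s >= score_threshold]
```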
Step 206, generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images.
The cleaning robot can detect the current surrounding environment when cleaning is performed for the first time or according to map generation operation of a user, and generate a clean environment map according to collected environment information, wherein the clean environment map is an area map which is required to be cleaned by the cleaning robot. For example, the cleaning environment map may be a plan view to be cleaned corresponding to the user's home.
The cleaning robot can determine the object position of the target object when the environment image is acquired according to the environment image, and map the object positions corresponding to the plurality of environment images into the clean environment map to obtain the environment map coordinates of the target object at the acquisition time of the plurality of images. The cleaning robot can connect a plurality of environment map coordinates corresponding to the target object according to the sequence of the acquisition time corresponding to the plurality of environment images, and fit the behavior track according to the environment map coordinates to generate behavior track information corresponding to the target object. The behavior trace information may specifically include a behavior trace corresponding to the target object and behavior time when the target object moves to different positions.
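A minimal sketch of this trajectory-generation step, assuming each detection has already been mapped to clean-environment-map coordinates:

```python
# Illustrative sketch of step 206: order per-image object positions by
# acquisition time and join them into a behavior track. The record layout
# (timestamp, map_x, map_y) is an assumption.
def build_behavior_track(observations):
    """observations: [(acquisition_time, map_x, map_y), ...] for one object.
    Returns the time-ordered polyline approximating its behavior track,
    together with the behavior time at each position."""
    ordered = sorted(observations, key=lambda o: o[0])
    track = [(x, y) for _, x, y in ordered]
    times = [t for t, _, _ in ordered]
    return track, times
```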
Step 208, acquiring the object behavior characteristics corresponding to the object type obtained through big data analysis.
The cleaning robot may acquire the object behavior characteristics corresponding to the object type, which may be obtained through big data analysis. Specifically, the cleaning robot may directly obtain the object behavior characteristics already derived from big data analysis for the object type, or it may obtain a large amount of behavior information to be analyzed for the object type and perform the big data analysis on that information itself. The object behavior characteristics can be used to represent the typical behavior of dynamic objects of that object type.
In one embodiment, the big data analysis of the object behavior characteristics may be specifically performed by a server corresponding to the cleaning robot due to a large data processing amount of the big data analysis. The server may be implemented as a stand-alone server or as a server cluster of multiple servers. Specifically, the server may obtain object behavior information collected by the plurality of cleaning robots, where the object behavior information includes behavior information corresponding to dynamic objects of a plurality of object types. The server can classify the object behavior information according to the object type, extract the characteristics of the object behavior information corresponding to the same object type to obtain the behavior characteristics corresponding to the object type, and analyze the big data of the behavior characteristics to obtain the object behavior characteristics corresponding to the object type. The server can perform big data analysis on the acquired object behavior information to obtain object behavior characteristics corresponding to various object types. The server can establish a mapping relation between the object types and the object behavior characteristics so that the cleaning robot can obtain the corresponding object behavior characteristics according to the object types. The server corresponding to the cleaning robot is used for big data analysis, the cleaning robot can directly obtain the object behavior characteristics corresponding to the object type obtained according to the big data analysis, and the data processing pressure of the cleaning robot is effectively reduced.
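The server-side aggregation can be sketched as follows, assuming each reference behavior record has already been reduced to a numeric feature vector; the mean aggregation is an illustrative stand-in for the big data analysis.

```python
# Server-side sketch of the big-data step: classify reference behavior
# records by object type and aggregate per-type behavior features, producing
# the type -> feature mapping that the robots query. Mean aggregation and
# the record layout are assumptions.
from collections import defaultdict
import numpy as np

def aggregate_behavior_features(records):
    """records: iterable of (object_type, feature_vector) gathered from many
    cleaning robots. Returns {object_type: aggregated feature vector}."""
    by_type = defaultdict(list)
    for object_type, features in records:
        by_type[object_type].append(np.asarray(features, dtype=float))
    return {t: np.mean(vectors, axis=0) for t, vectors in by_type.items()}
```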
Step 210, determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics, and controlling the cleaning robot to clean the target activity area.
The cleaning robot can determine a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics corresponding to the target object, and control the cleaning robot to clean the target activity area. Specifically, the cleaning robot may determine an activity area corresponding to the target object according to the behavior trace information corresponding to the target object, and the activity area determined according to the behavior trace information is an actual activity area of the target object. The cleaning robot can adjust the activity area according to the object behavior characteristics corresponding to the object type, and determine the adjusted activity area as a target activity area corresponding to the target object. Wherein the adjustment process specifically includes an enlargement adjustment. The cleaning robot can enlarge the actual activity area of the target object according to the behavior characteristics of the object to obtain the target activity area. In one embodiment, when the actual activity area does not need to be enlarged according to the object behavior characteristics, the actual activity area corresponding to the target object may be directly used as the target activity area of the target object. The target activity area obtained after adjustment according to the object behavior characteristics can more accurately represent the actual activity of the target object and an area with larger possibility of activity.
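One way to realize the enlargement adjustment is sketched below; representing the activity area as an axis-aligned rectangle and the behavior-feature contribution as a single margin are simplifying assumptions.

```python
# Sketch of the enlargement adjustment: grow the bounding region of the
# observed track by a margin derived from the object behavior features.
def target_activity_area(track, activity_margin):
    """track: [(x, y), ...] map coordinates of the actual activity area;
    activity_margin: enlargement (in map units) from the behavior features,
    0 when no enlargement is needed."""
    xs, ys = [p[0] for p in track], [p[1] for p in track]
    return (min(xs) - activity_margin, min(ys) - activity_margin,
            max(xs) + activity_margin, max(ys) + activity_margin)
```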
The cleaning robot can clean the target activity area according to the determined target activity area corresponding to the target object. Specifically, the cleaning robot may acquire preset target cleaning information, and the target cleaning information may be cleaning information of a target active area, which is preset according to actual application requirements. The target cleaning information may specifically include, but is not limited to, information on a cleaning manner, a cleaning frequency, a cleaning power, a cleaning time, and the like of the target active region. And controlling an execution component of the cleaning robot to clean the target movable area according to the target cleaning information.
In one embodiment, the cleaning robot may also adjust the cleaning strategy based on the target activity area. The cleaning strategy refers to the cleaning rules that the user previously set for the cleaning robot, and the cleaning robot cleans according to the set strategy. After the cleaning robot determines the target activity area corresponding to the target object, the strategy can be adjusted according to that area, yielding an adjusted cleaning strategy that may specifically include the cleaning time, cleaning pattern, number of cleaning passes, cleaning frequency, and the like corresponding to the target object. According to the adjusted strategy, the cleaning robot can generate a cleaning control instruction when cleaning is required, and the instruction is used to control the cleaning robot to perform the cleaning operation on the target activity area.
For example, in an actual application process, when the cleaning robot is not used yet or the target object is not included in the region to be cleaned, the target moving region does not exist, and the cleaning robot performs uniform cleaning on the region to be cleaned. After the cleaning robot determines the target activity area corresponding to the target object, the cleaning strategy can be adjusted according to the target activity area, and different cleaning modes are determined according to information such as object types to clean the target activity area. Specifically, the cleaning robot may divide an area to be cleaned into a target active area and a non-target active area, and perform cleaning using different cleaning strategies for the two areas. For example, the cleaning robot can perform cleaning operation with a higher cleaning frequency on the target activity area, or when the target object is a dog, a cleaning mode with a higher suction force is adopted for the target activity area, and the like, so that the target activity area is cleaned in a targeted manner, a better cleaning effect can be generated for the target activity area, and the cleaning efficiency of the cleaning robot is effectively improved.
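The strategy split between target and non-target areas might look like the following sketch; the field names and concrete values (two passes, high suction for dogs) merely mirror the example above and are not prescribed by the description.

```python
# Hedged sketch of the strategy split between the target activity area and
# the rest of the area to be cleaned. Values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class CleaningStrategy:
    passes: int   # how many cleaning passes over the region
    suction: str  # "normal" or "high"

def strategy_for(in_target_area: bool, object_type: str) -> CleaningStrategy:
    if in_target_area:
        # e.g. stronger suction when the target object is a dog
        return CleaningStrategy(passes=2,
                                suction="high" if object_type == "dog" else "normal")
    return CleaningStrategy(passes=1, suction="normal")
```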
In this embodiment, a plurality of environment images corresponding to the cleaning robot are acquired, and an image detection model is called to perform target detection on the environment images, so as to obtain target objects in the environment images, and object types and object positions corresponding to the target objects, thereby detecting the target objects in the environment where the cleaning robot is located. And generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images, wherein the behavior track information can represent the behavior track of the target object. The target activity area corresponding to the target object is determined according to the behavior track information and the object behavior characteristics by acquiring the object behavior characteristics corresponding to the object type obtained through big data analysis, and the cleaning robot is controlled to clean the target activity area, so that the target activity area of the target object is accurately and effectively cleaned in a targeted manner, and the cleaning efficiency of the cleaning robot is effectively improved.
In one embodiment, as shown in fig. 3, the step of invoking the image detection model to detect the target object in the environment image, and the object type and the object position corresponding to the target object, includes:

Step 302, invoking an image detection model, inputting the environment image into the image detection model, and performing target detection on the environment image according to the image detection model.

Step 304, acquiring a target object output by the image detection model and an environment object corresponding to the target object.

Step 306, acquiring environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to the environment objects.

Step 308, determining the object position corresponding to the target object according to the environment object coordinates.
The image detection model can be established in advance and, after training, configured in the cleaning robot. After the environment image of the surrounding environment is obtained, the image detection model can be invoked, the environment image is input into it, and target detection is performed on the environment image to obtain the target detection result output by the model. The image detection model detects the objects it was trained on, and outputs the environment image with an object frame drawn around each detected object together with the object type of the framed object. In practice the object frame is usually rectangular, although other shapes are possible.
The cleaning robot may determine a target object included in the environment image and an environment object corresponding to the target object according to the detection result output by the image detection model. The target object is a dynamic object such as a pet, and the environment object is a static object in the environment where the cleaning robot is located. For example, the environment object may specifically include, but is not limited to, a coffee table, a couch, a television cabinet, a bed, and the like.
The cleaning robot may acquire environmental information corresponding to the cleaning robot. The environment information may be configured in the cleaning robot by the user in advance, or may be obtained by detecting the surrounding environment in which the cleaning robot is located during the operation. The environment information may specifically include, but is not limited to, a clean environment map of an environment in which the cleaning robot is located, environment objects included in the surrounding environment, environment object coordinates corresponding to the environment objects, and the like, and the environment object coordinates may be coordinate information of the environment objects in the clean environment map. The cleaning robot can determine the object position corresponding to the target object according to the relative position relationship between the target object and one or more environment objects and the environment object coordinates corresponding to the environment objects.
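A minimal sketch of this position-determination step, assuming the relative offset between the target object and a reference environment object has already been estimated from the detection result:

```python
# Sketch of step 308: anchor the dynamic target object to the clean
# environment map via a static environment object's known map coordinates
# plus the observed relative offset. Offset estimation itself is assumed.
def object_map_position(env_object_coord, relative_offset):
    """env_object_coord: (x, y) of the static object in the clean map;
    relative_offset: (dx, dy) of the target object w.r.t. that object."""
    return (env_object_coord[0] + relative_offset[0],
            env_object_coord[1] + relative_offset[1])
```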
In the embodiment, the target detection is performed on the environment image through the image detection model, so that the target object and the environment object in the environment where the cleaning robot is located can be accurately identified through the environment image. The method comprises the steps of obtaining environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to environment objects, and determining object positions corresponding to target objects through the environment object coordinates. The environment object in the environment where the cleaning robot is located is usually a static object, the position of the cleaning robot cannot be frequently moved, the environment object coordinate corresponding to the environment object is used as a reference, the object position corresponding to the dynamic target object can be accurately and effectively determined, the accuracy of detecting the object position is improved, and the target activity area of the target object can be determined according to the accurate object position.
In one embodiment, an established standard detection model may be trained on training images to obtain the image detection model. Specifically, the standard detection model may be a TFLite (TensorFlow Lite, an open-source deep learning framework for on-device inference) model built on the deep learning network MobileNet V1 (an efficient convolutional neural network for mobile vision applications); the standard detection model may also be built on other deep learning networks, for example VGG, ResNet (Residual Neural Network), RetinaNet, CornerNet-Lite, YOLO, or SSD. In one embodiment, the standard detection model may be a model built on the MobileNet-SSD algorithm. MobileNet-SSD is a target detection algorithm that extracts image features with MobileNet and detects object frames with the SSD framework; it extracts image features through depthwise separable convolutions, which effectively improves the computational efficiency of the convolutional network.
The training images can be image data collected by model trainers according to actual requirements, or image data from a training database. For example, image data of the objects to be recognized may be obtained from TensorFlow (an open-source library) according to the actual application requirements, where the objects to be recognized include dynamic objects such as pets, as well as static objects that may exist in the environment where the cleaning robot is located. For example, the training images may be images of various home environments containing a pet cat or a pet dog. The acquired image data is converted into TFRecord (a binary data format) files, and the TFRecord data is fed into the standard detection model as training data for model training. In one embodiment, the data used for training the standard detection model may specifically include training data, validation data, and test data.
The standard detection model is trained iteratively on the training images until training converges, yielding the trained image detection model. For example, transfer learning may be performed by fine-tuning, training a preset number of epochs (each epoch is one pass over all of the training data), for example 30,000 epochs, and training is judged to have converged when the loss value of the detection model drops to a preset value. The preset value may be a parameter value set in advance according to actual requirements; for example, it may be set to 0.2. After training is finished, the converged standard detection model can be converted, through a model conversion step, into an image detection model in a preset format that can run in the cleaning robot to detect the environment images collected by the sensor. The image detection model may specifically be a TFLite detection model, for example.
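The final conversion step might look like the following sketch, assuming the converged model was exported as a TensorFlow SavedModel; the directory name is an assumption.

```python
# Sketch of the model conversion: the converged TensorFlow model is
# converted to the TFLite format that runs on the robot.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("mobilenet_ssd_saved_model")
tflite_model = converter.convert()
with open("detect.tflite", "wb") as f:
    f.write(tflite_model)
```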
In this embodiment, a standard detection model is established according to a target detection algorithm, the standard detection model is trained through a training image corresponding to the detection requirement of the cleaning robot, the trained detection model is converted and configured in the cleaning robot, and the cleaning robot can detect an environmental image around the cleaning robot according to the image detection model, so that the target object and the environmental object around the cleaning robot can be accurately detected.
In one embodiment, the step of obtaining the object behavior characteristics corresponding to the object type obtained through big data analysis includes: acquiring reference behavior information of reference objects respectively corresponding to a plurality of cleaning robots; screening the reference behavior information of the reference objects corresponding to the object type to obtain behavior information to be analyzed; and performing feature extraction on the behavior information to be analyzed to obtain behavior features to be analyzed, and performing big data analysis on the behavior features to be analyzed to obtain the object behavior features corresponding to the object type.
After the cleaning robot detects the target object by invoking the image detection model, it can obtain the object behavior characteristics derived from big data analysis according to the object type of the target object. Specifically, the cleaning robot may acquire reference behavior information of the reference objects corresponding to a plurality of other cleaning robots; the reference behavior information may be obtained from the server corresponding to the cleaning robot, or the cleaning robot may establish communication with the other cleaning robots and obtain it from them directly. A reference object is a target object detected by one of the plurality of cleaning robots, and the reference behavior information is the behavior information corresponding to each of these target objects. It can be understood that the plurality of cleaning robots may be owned and used by different users, and the reference objects are the dynamic objects in the cleaning areas of those users. For example, a reference object may be a pet in the home of each of a plurality of users.
A one-to-one association exists among the reference object, the reference object type, and the reference behavior information. According to the object type corresponding to the target object, the cleaning robot can screen, from a large amount of reference behavior information, the entries whose reference object type matches the object type, obtaining the behavior information to be analyzed for that object type. The cleaning robot can perform feature extraction on the behavior information to be analyzed, for example with a feature extraction model, to obtain the behavior features to be analyzed corresponding to the object type. Big data analysis is then performed on a large number of behavior features to be analyzed of the same object type, yielding the object behavior features corresponding to that type. The object behavior features may be used to represent the behavior characteristics shared by target objects of the corresponding object type. For example, when the target object is an Alaskan Malamute, the cleaning robot may acquire the behavior features of Alaskan Malamutes kept by many users for big data analysis, obtaining the object behavior features of that breed. The behavior information of each dog may be collected by the cleaning robot owned and used by the corresponding user.
In this embodiment, by obtaining reference behavior information of reference objects corresponding to the plurality of cleaning robots, behavior information to be analyzed corresponding to the type of the object is screened from a large amount of reference behavior information, so that behavior characteristics corresponding to the type are analyzed according to the same type of behavior information. By extracting the characteristics of the behavior information to be analyzed and carrying out big data analysis on the obtained behavior characteristics to be analyzed, the object behavior characteristics corresponding to the object type are obtained, the accuracy of determining the behavior characteristics of the object is effectively improved, the target activity area can be accurately determined according to the behavior characteristics of the object, and therefore the cleaning efficiency of the cleaning robot for the target activity area is improved.
In one embodiment, as shown in fig. 4, the above cleaning robot control method further includes:
Step 402, detecting a current distance between the cleaning robot and the target object.

Step 404, when the current distance is smaller than a preset threshold, acquiring a plurality of current positions corresponding to the target object within a preset time period.

Step 406, invoking a behavior prediction model to process the current positions to obtain a predicted behavior position corresponding to the target object.

Step 408, controlling the cleaning robot to execute an avoidance operation according to the predicted behavior position.
When the cleaning robot is located in an environment including a target object, a current distance between the cleaning robot and the target object may be detected after the cleaning robot is started. The current distance refers to a distance between the cleaning robot and the target object at the time of detection. The cleaning robot may detect the current distance by data transmitted from the sensor. For example, when the sensor of the cleaning robot includes an infrared sensor, the infrared signal reflected by the target object may be received by the infrared sensor, and the current distance between the cleaning robot and the target object may be determined. The current distance between the cleaning robot and the target object may also be determined from an image of the environment including the target object.
The cleaning robot may compare the detected current distance with a preset threshold, which is a length set in advance according to the actual application requirements. The preset threshold may be set uniformly, so that the thresholds of multiple cleaning robots are the same; for example, it may be set to 0.3 m. The preset threshold may also be specific to the target object, with different thresholds set for different target objects. When the current distance between the cleaning robot and the target object is less than the preset threshold, the cleaning robot may acquire a plurality of current positions corresponding to the target object within a preset time period. The preset time period is a preset length of time over which current positions are collected for behavior prediction; specifically, it is a historical period preceding the moment the current distance was detected. The current position refers to the position where the target object was detected within that historical period. In one embodiment, when the current distance is greater than or equal to the preset threshold, the cleaning robot is controlled to continue its current movement.
The cleaning robot can call the behavior prediction model to conduct behavior prediction on the current position of the target object in a preset time period, and the predicted behavior position corresponding to the target object is obtained. The behavior prediction model may be established in advance according to a behavior prediction algorithm and obtained after training, and the behavior prediction model may specifically be a neural network model. The cleaning robot can predict a plurality of current positions of the target object through the behavior prediction model to obtain a predicted behavior position where the target object may possibly enter. In one embodiment, the predicted behavior location may specifically be a location coordinate in a clean environment map predicted by a behavior prediction model. The cleaning robot can execute the avoiding operation according to the predicted behavior position, and the cleaning robot is controlled to move towards the direction far away from the predicted behavior position.
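Steps 402 to 408 can be summarized in a short sketch; the behavior prediction model is treated as a black-box callable, and 0.3 m is the example threshold mentioned above.

```python
# Illustrative avoidance sketch: below the distance threshold, the recent
# positions are fed to a pre-trained prediction model and the robot steers
# away from the predicted behavior position. The model interface is assumed.
import math

def avoidance_heading(robot_pos, current_distance, recent_positions,
                      predict_model, threshold=0.3):
    if current_distance >= threshold:
        return None  # continue the current movement
    px, py = predict_model(recent_positions)  # predicted (x, y) in the map
    dx, dy = robot_pos[0] - px, robot_pos[1] - py
    norm = math.hypot(dx, dy) or 1.0
    return (dx / norm, dy / norm)  # unit vector pointing away from the pet
```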
In the embodiment, by detecting the current distance between the cleaning robot and the target object, when the current distance is smaller than a preset threshold, the position of the target object within a preset time period is obtained, the plurality of current positions are processed through the behavior prediction model to obtain the predicted behavior position corresponding to the target object, and the cleaning robot is controlled to execute an avoiding operation according to the predicted behavior position, so that the cleaning robot is prevented from contacting or colliding with the target object, and the safety of the cleaning robot is effectively improved.
In one embodiment, as shown in fig. 5, the above cleaning robot control method further includes:
and 502, generating target behavior characteristics corresponding to the target object according to the behavior track information and the object behavior characteristics.
And step 506, performing abnormal behavior detection on the current behavior characteristics according to the target behavior characteristics.
And step 508, when the detection result is abnormal behavior, generating abnormal behavior prompt information according to the detection result, and sending the abnormal behavior prompt information to a terminal corresponding to the cleaning robot.
The cleaning robot can perform feature extraction on the generated behavior track information to obtain the actual behavior features of the target object. It can then adjust the extracted actual behavior features according to the object behavior features of the target object's type, and take the adjusted features as the target behavior features of the target object. Specifically, the cleaning robot may obtain pre-configured feature weights, where the weight of the actual behavior features may differ from the weight of the object behavior features. The cleaning robot weights the actual behavior features and the object behavior features accordingly to generate the target behavior features of the target object. Target behavior features generated from both the object behavior features and the actual behavior features reflect the behavior of the target object more accurately.
The cleaning robot may acquire current behavior data corresponding to the target object through the sensor, where the current behavior data may be acquired by the cleaning robot in a moving state or a stationary state. For example, the cleaning robot may also acquire current behavior data corresponding to the target object in a stationary state where the cleaning operation is not performed. The cleaning robot can extract the characteristics of the current behavior data to obtain the current behavior characteristics corresponding to the target object.
The cleaning robot can detect abnormal behaviors of the current behavior characteristics according to the target behavior characteristics. Specifically, the cleaning robot may compare the current behavior characteristic with the target behavior characteristic to obtain a characteristic similarity between the current behavior characteristic and the target behavior characteristic. And when the feature similarity is greater than or equal to the feature threshold, determining that the detection result of the current behavior is the non-abnormal behavior. And when the feature similarity is smaller than the feature threshold, determining that the detection result of the current behavior is the abnormal behavior. Wherein, the characteristic threshold is a similarity preset according to the actual application requirement. For example, the feature threshold may be specifically set to 80%.
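A sketch of the feature fusion and similarity test described in this and the preceding paragraphs; cosine similarity and the 0.6/0.4 weights are assumptions, while the 80% threshold comes from the example above.

```python
# Sketch of steps 502-506: weighted fusion of actual and type-level behavior
# features, then similarity-based abnormality detection. Only the 80%
# feature threshold is fixed by the text; the rest is illustrative.
import numpy as np

def fuse_target_features(actual, type_level, w_actual=0.6, w_type=0.4):
    """Weighted combination of actual and object (type-level) features."""
    return w_actual * np.asarray(actual) + w_type * np.asarray(type_level)

def is_abnormal(current, target, feature_threshold=0.8):
    """Abnormal when feature similarity falls below the feature threshold."""
    current, target = np.asarray(current, float), np.asarray(target, float)
    similarity = current @ target / (np.linalg.norm(current) * np.linalg.norm(target))
    return similarity < feature_threshold
```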
When the detection result is abnormal behavior, it is determined that the target object is currently behaving abnormally. The cleaning robot can generate abnormal behavior prompt information according to the detection result and send it to the terminal corresponding to the cleaning robot, so that the terminal displays the received prompt information according to its information type, alerting the user that the target object may currently be behaving abnormally.
In one embodiment, a user can establish communication connection with the cleaning robot through a corresponding terminal, and video data of a target object is acquired through a camera of the cleaning robot, so that the video data is provided for the user to check abnormal behaviors of the target object.
In one embodiment, when the detection result of the current behavior feature is a non-abnormal behavior, the cleaning robot may further correct the target behavior feature of the target object according to the current behavior feature, so as to obtain a more accurate target behavior feature.
In this embodiment, a target behavior feature corresponding to the target object is generated according to the behavior trajectory information and the object behavior feature; the current behavior feature is extracted from the current behavior data corresponding to the target object, and abnormal behavior detection is performed on the current behavior feature according to the target behavior feature. When the detection result is abnormal behavior, abnormal behavior prompt information is generated according to the detection result and sent to the terminal corresponding to the cleaning robot, so that the user corresponding to the terminal is accurately prompted that the target object may be behaving abnormally.
It should be understood that although the various steps in the flowcharts of FIGS. 2-5 are shown in sequence as indicated by the arrows, these steps are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, the order of execution of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in FIGS. 2-5 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and these sub-steps or stages need not be performed sequentially, but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 6, there is provided a cleaning robot control device including: an image detection module 602, a trajectory generation module 604, a feature acquisition module 606, and a cleaning control module 608, wherein:
the image detection module 602 is configured to obtain a plurality of environment images corresponding to the cleaning robot, and to call an image detection model to detect a target object in the environment images as well as the object type and object position corresponding to the target object.
The track generating module 604 is configured to generate behavior track information corresponding to the target object according to object positions corresponding to the multiple environment images.
The characteristic obtaining module 606 is configured to obtain an object behavior characteristic corresponding to an object type obtained according to big data analysis.
And the cleaning control module 608 is configured to determine a target activity area corresponding to the target object according to the behavior trajectory information and the object behavior characteristics, and control the cleaning robot to clean the target activity area.
In an embodiment, the image detection module 602 is further configured to: invoke an image detection model, input the environment image into the image detection model, and perform target detection on the environment image according to the image detection model; acquire the target object output by the image detection model and an environment object corresponding to the target object; acquire environment information corresponding to the cleaning robot, where the environment information includes environment object coordinates corresponding to the environment object; and determine the object position corresponding to the target object according to the environment object coordinates.
In one embodiment, the characteristic obtaining module 606 is further configured to: obtain reference behavior information of reference objects corresponding to each of a plurality of cleaning robots; screen the reference behavior information of the reference objects corresponding to the object type to obtain behavior information to be analyzed; and perform feature extraction on the behavior information to be analyzed to obtain behavior features to be analyzed, and perform big data analysis on the behavior features to be analyzed to obtain the object behavior features corresponding to the object type.
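A hedged sketch of this fleet-level aggregation is shown below: records reported by many cleaning robots are screened by object type and their features averaged. The record format and the use of a plain mean are assumptions for illustration; the disclosure only calls for screening followed by big data analysis.

```python
import numpy as np

def object_behavior_features(records, object_type):
    """records: iterable of (object_type, feature_vector) pairs collected
    from a fleet of cleaning robots; returns the aggregated object behavior
    features for the requested type, or None if no records match."""
    to_analyze = [feat for typ, feat in records if typ == object_type]
    if not to_analyze:
        return None
    return np.mean(np.stack(to_analyze), axis=0)

fleet_records = [
    ("dog", np.array([0.6, 0.3, 0.4])),
    ("dog", np.array([0.7, 0.2, 0.5])),
    ("cat", np.array([0.2, 0.8, 0.1])),
]
dog_features = object_behavior_features(fleet_records, "dog")
```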
In one embodiment, the cleaning control module 608 is further configured to: adjust a cleaning strategy corresponding to the cleaning robot according to the target activity area; generate a cleaning control instruction according to the adjusted cleaning strategy; and control the cleaning robot to perform a cleaning operation on the target activity area according to the cleaning control instruction.
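The strategy adjustment might look like the following sketch; the concrete strategy fields (number of passes, suction level) are assumptions, since the disclosure does not enumerate the parameters a cleaning strategy contains.

```python
def adjust_cleaning_strategy(base_strategy: dict, target_area) -> dict:
    """Derive an adjusted cleaning strategy focused on the target activity
    area, from which a cleaning control instruction can be generated."""
    strategy = dict(base_strategy)
    strategy["area"] = target_area                            # restrict to the activity area
    strategy["passes"] = base_strategy.get("passes", 1) + 1   # clean it more thoroughly
    strategy["suction"] = "high"
    return strategy

# Example: bounding box of a pet's activity area, in map coordinates.
instruction = adjust_cleaning_strategy({"passes": 1, "suction": "normal"},
                                       target_area=((0.0, 0.0), (2.0, 3.0)))
```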
In one embodiment, the cleaning control module 608 is further configured to: detect a current distance between the cleaning robot and the target object; when the current distance is smaller than a preset threshold, acquire a plurality of current positions of the target object within a preset time period; call a behavior prediction model to process the current positions to obtain a predicted behavior position of the target object; and control the cleaning robot to perform an avoidance operation according to the predicted behavior position.
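A sketch of this avoidance step follows. Linear extrapolation stands in for the behavior prediction model, whose form the disclosure leaves open, and the 0.5 m distance threshold is an illustrative assumption.

```python
import numpy as np

def predict_next_position(recent_positions: np.ndarray) -> np.ndarray:
    """recent_positions: (N, 2) array of the target object's positions
    sampled over the preset time period, oldest first."""
    velocity = recent_positions[-1] - recent_positions[-2]
    return recent_positions[-1] + velocity

def should_avoid(robot_pos: np.ndarray, target_pos: np.ndarray,
                 threshold: float = 0.5) -> bool:
    return float(np.linalg.norm(robot_pos - target_pos)) < threshold

robot = np.array([1.0, 1.0])
positions = np.array([[1.2, 1.4], [1.15, 1.3], [1.1, 1.2]])
if should_avoid(robot, positions[-1]):
    predicted = predict_next_position(positions)
    print("plan a detour clear of the predicted position:", predicted)
```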
In one embodiment, the cleaning robot control device further includes an abnormality prompt module, configured to: generate a target behavior feature corresponding to the target object according to the behavior trajectory information and the object behavior feature; acquire current behavior data corresponding to the target object and extract the current behavior feature corresponding to the current behavior data; perform abnormal behavior detection on the current behavior feature according to the target behavior feature; and, when the detection result is abnormal behavior, generate abnormal behavior prompt information according to the detection result and send the abnormal behavior prompt information to a terminal corresponding to the cleaning robot.
For specific limitations of the cleaning robot control device, reference may be made to the limitations of the cleaning robot control method above, which are not repeated here. Each module in the cleaning robot control device may be implemented wholly or partially by software, hardware, or a combination thereof. The modules may be embedded in, or independent of, a processor in the cleaning robot in hardware form, or may be stored in a memory of the cleaning robot in software form, so that the processor can invoke and execute the operations corresponding to each module.
In one embodiment, a computer device is provided, which may be a cleaning robot. The cleaning robot includes a controller, a communication interface, a sensor, an actuator assembly, and an input device connected by a system bus. The controller includes a processor and a memory, and the processor provides computing and control capabilities. The memory of the cleaning robot includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program; the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The communication interface of the cleaning robot is used for wired or wireless communication with an external terminal; the wireless communication can be realized through WIFI, an operator network, NFC (Near Field Communication), or other technologies. The computer program, when executed by the processor, implements a cleaning robot control method. The input device of the cleaning robot may be a touch layer covering a display screen, a key, a trackball, or a touch pad arranged on the housing of the cleaning robot, an external touch pad, or the like.
In one embodiment, there is provided a cleaning robot comprising a memory in which a computer program is stored and a processor which, when executing the computer program, performs the steps in the above-described cleaning robot control method embodiments.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned cleaning robot control method embodiment.
Those skilled in the art will understand that all or part of the processes of the methods of the above embodiments may be implemented by a computer program instructing the relevant hardware; the computer program may be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or other media used in the embodiments provided herein may include at least one of non-volatile and volatile memory. Non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical storage, and the like. Volatile memory may include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as such combinations are not contradictory, they should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A cleaning robot control method, the method comprising:
acquiring a plurality of environment images corresponding to the cleaning robot;
calling an image detection model to detect a target object in the environment image as well as an object type and an object position corresponding to the target object;
generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images respectively;
acquiring object behavior characteristics corresponding to the object type obtained according to big data analysis;
and determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics, and controlling the cleaning robot to clean the target activity area.
2. The method of claim 1, wherein the calling an image detection model to detect a target object in the environment image as well as the object type and the object position corresponding to the target object comprises:
calling an image detection model, inputting the environment image into the image detection model, and carrying out target detection on the environment image according to the image detection model;
acquiring a target object output by the image detection model and an environment object corresponding to the target object;
acquiring environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to the environment objects;
and determining the object position corresponding to the target object according to the environment object coordinates.
3. The method according to claim 1, wherein the obtaining of the object behavior feature corresponding to the object type obtained according to big data analysis comprises:
acquiring reference behavior information of a reference object corresponding to each of the plurality of cleaning robots;
screening reference behavior information of a reference object corresponding to the object type to obtain behavior information to be analyzed;
and performing feature extraction on the behavior information to be analyzed to obtain behavior features to be analyzed, and performing big data analysis on the behavior features to be analyzed to obtain object behavior features corresponding to the object types.
4. The method of claim 1, wherein the controlling the cleaning robot to clean the target activity area comprises:
adjusting a cleaning strategy corresponding to the cleaning robot according to the target activity area;
generating a cleaning control instruction according to the adjusted cleaning strategy;
and controlling the cleaning robot to perform a cleaning operation on the target activity area according to the cleaning control instruction.
5. The method of any one of claims 1 to 4, further comprising:
detecting a current distance between the cleaning robot and the target object;
when the current distance is smaller than a preset threshold value, acquiring a plurality of current positions corresponding to the target object within a preset time period;
calling a behavior prediction model to process the current positions to obtain predicted behavior positions corresponding to the target object;
and controlling the cleaning robot to perform an avoidance operation according to the predicted behavior position.
6. The method of any one of claims 1 to 4, further comprising:
generating target behavior characteristics corresponding to the target object according to the behavior track information and the object behavior characteristics;
acquiring current behavior data corresponding to the target object, and extracting current behavior characteristics corresponding to the current behavior data;
performing abnormal behavior detection on the current behavior characteristics according to the target behavior characteristics;
and when the detection result is abnormal behavior, generating abnormal behavior prompt information according to the detection result, and sending the abnormal behavior prompt information to a terminal corresponding to the cleaning robot.
7. A cleaning robot control apparatus, characterized in that the apparatus comprises:
the image detection module is used for acquiring a plurality of environment images corresponding to the cleaning robot, and calling an image detection model to detect a target object in the environment image as well as an object type and an object position corresponding to the target object;
the track generation module is used for generating behavior track information corresponding to the target object according to the object positions corresponding to the plurality of environment images;
the characteristic acquisition module is used for acquiring object behavior characteristics corresponding to the object types obtained according to big data analysis;
and the cleaning control module is used for determining a target activity area corresponding to the target object according to the behavior track information and the object behavior characteristics and controlling the cleaning robot to clean the target activity area.
8. The apparatus of claim 7, wherein the image detection module is further configured to: invoke an image detection model, input the environment image into the image detection model, and perform target detection on the environment image according to the image detection model; acquire a target object output by the image detection model and an environment object corresponding to the target object; acquire environment information corresponding to the cleaning robot, wherein the environment information comprises environment object coordinates corresponding to the environment objects; and determine the object position corresponding to the target object according to the environment object coordinates.
9. A cleaning robot comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the method according to any of claims 1 to 6 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010457292.7A CN111643011B (en) | 2020-05-26 | 2020-05-26 | Cleaning robot control method and device, cleaning robot and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111643011A true CN111643011A (en) | 2020-09-11 |
CN111643011B CN111643011B (en) | 2022-06-03 |
Family
ID=72343286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010457292.7A Active CN111643011B (en) | 2020-05-26 | 2020-05-26 | Cleaning robot control method and device, cleaning robot and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111643011B (en) |
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014191789A (en) * | 2013-03-28 | 2014-10-06 | Sharp Corp | Self-traveling type electronic apparatus and travel area specification system for self-traveling type electronic apparatus |
CN108107886A (en) * | 2017-11-29 | 2018-06-01 | 珠海格力电器股份有限公司 | Driving control method and device of sweeping robot and sweeping robot |
CN108968811A (en) * | 2018-06-20 | 2018-12-11 | 四川斐讯信息技术有限公司 | A kind of object identification method and system of sweeping robot |
CN108958253A (en) * | 2018-07-19 | 2018-12-07 | 北京小米移动软件有限公司 | The control method and device of sweeping robot |
CN110936370A (en) * | 2018-09-25 | 2020-03-31 | 格力电器(武汉)有限公司 | Cleaning robot control method and device |
CN109571482A (en) * | 2019-01-02 | 2019-04-05 | 京东方科技集团股份有限公司 | Sweeping robot paths planning method and related system, readable storage medium storing program for executing |
US10463217B1 (en) * | 2019-01-31 | 2019-11-05 | Irobot Corporation | Cleaning of pet areas by autonomous cleaning robots |
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112168066A (en) * | 2020-09-30 | 2021-01-05 | 深圳市银星智能科技股份有限公司 | Control method and device for cleaning robot, cleaning robot and storage medium |
CN113509104A (en) * | 2021-04-25 | 2021-10-19 | 珠海格力电器股份有限公司 | Cleaning method, storage medium and cleaning robot |
CN113180549A (en) * | 2021-04-30 | 2021-07-30 | 青岛海尔空调器有限总公司 | Cleaning control method, device and air conditioner |
WO2022227533A1 (en) * | 2021-04-30 | 2022-11-03 | 青岛海尔空调器有限总公司 | Cleaning control method and apparatus, and air conditioner |
WO2023124859A1 (en) * | 2021-12-28 | 2023-07-06 | 速感科技(北京)有限公司 | Cleaning robot, cleaning methods thereof and computer readable storage medium |
CN114468855A (en) * | 2022-01-07 | 2022-05-13 | 珠海格力电器股份有限公司 | Equipment control method, device, equipment and storage medium |
CN114766977A (en) * | 2022-05-07 | 2022-07-22 | 美智纵横科技有限责任公司 | Cleaning method, system, equipment and storage medium based on movement track |
CN115040038A (en) * | 2022-06-22 | 2022-09-13 | 杭州萤石软件有限公司 | Robot control method and device and robot |
CN115200144A (en) * | 2022-07-20 | 2022-10-18 | 珠海格力电器股份有限公司 | Air purifier and method for cleaning foreign matters |
CN115200144B (en) * | 2022-07-20 | 2023-10-27 | 珠海格力电器股份有限公司 | Air purifier and method for cleaning foreign matters |
CN116524135A (en) * | 2023-07-05 | 2023-08-01 | 方心科技股份有限公司 | Three-dimensional model generation method and system based on image |
CN116524135B (en) * | 2023-07-05 | 2023-09-15 | 方心科技股份有限公司 | Three-dimensional model generation method and system based on image |
Also Published As
Publication number | Publication date |
---|---|
CN111643011B (en) | 2022-06-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||