CN115584613A - Control method of clothes treatment equipment - Google Patents
Control method of clothes treatment equipment
- Publication number
- CN115584613A
- Authority
- CN
- China
- Prior art keywords
- clothes
- module
- voice module
- control method
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F33/00—Control of operations performed in washing machines or washer-dryers
- D06F33/30—Control of washing machines characterised by the purpose or target of the control
- D06F33/32—Control of operational steps, e.g. optimisation or improvement of operational steps depending on the condition of the laundry
- D06F33/36—Control of operational steps, e.g. optimisation or improvement of operational steps depending on the condition of the laundry of washing
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F33/00—Control of operations performed in washing machines or washer-dryers
- D06F33/30—Control of washing machines characterised by the purpose or target of the control
- D06F33/32—Control of operational steps, e.g. optimisation or improvement of operational steps depending on the condition of the laundry
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F34/00—Details of control systems for washing machines, washer-dryers or laundry dryers
- D06F34/14—Arrangements for detecting or measuring specific parameters
- D06F34/18—Condition of the laundry, e.g. nature or weight
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F34/00—Details of control systems for washing machines, washer-dryers or laundry dryers
- D06F34/28—Arrangements for program selection, e.g. control panels therefor; Arrangements for indicating program parameters, e.g. the selected program or its progress
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F2103/00—Parameters monitored or detected for the control of domestic laundry washing machines, washer-dryers or laundry dryers
- D06F2103/02—Characteristics of laundry or load
- D06F2103/04—Quantity, e.g. weight or variation of weight
-
- D—TEXTILES; PAPER
- D06—TREATMENT OF TEXTILES OR THE LIKE; LAUNDERING; FLEXIBLE MATERIALS NOT OTHERWISE PROVIDED FOR
- D06F—LAUNDERING, DRYING, IRONING, PRESSING OR FOLDING TEXTILE ARTICLES
- D06F2105/00—Systems or parameters controlled or affected by the control systems of washing machines, washer-dryers or laundry dryers
- D06F2105/58—Indications or alarms to the control system or to the user
- D06F2105/60—Audible signals
Abstract
The invention relates to the technical field of clothes treatment, in particular to a control method of a clothes treatment device, and aims to solve the problem that the prior art cannot start the voice module imperceptibly. To this end, the control method of the invention includes: acquiring an image of the user through the image acquisition module; judging, based on the image, whether the user is holding clothes; if so, further judging whether the weight of the clothes to be treated currently in the clothes treatment drum exceeds a threshold; and selectively starting the voice module to give a washing reminder or starting the weighing module to weigh the clothes, based on the judgment result. According to the invention, when the user is holding clothes, the voice module or the weighing module is started depending on whether the weight of the clothes to be treated in the clothes treatment drum exceeds the threshold, so the voice module can be started without any operation by the user, imperceptible activation of the voice module is achieved, and the user experience is improved.
Description
Technical Field
The invention relates to the technical field of clothes treatment, and particularly provides a control method of clothes treatment equipment.
Background
With the continuous development of science and technology, the functions of intelligent washing machines keep increasing. For example, an intelligent washing machine may be provided with a voice module through which the user can interact with the machine, or through which the machine can broadcast its current working state, so that the user can use the machine more conveniently and the user experience is improved.
At present, however, the voice module must first be woken up before it starts: for example, the user issues an instruction to start the voice module and the washing device starts it after receiving the instruction, or a button for starting the voice module is provided on the washing device or on a mobile terminal and the user selects it when needed. The starting of the voice module therefore cannot yet be truly imperceptible.
Accordingly, there is a need in the art for a new solution to the above-mentioned problems.
Disclosure of Invention
The present invention aims to solve the above technical problem, namely the problem that the prior art cannot start the voice module imperceptibly.
In a first aspect, the present invention provides a control method of a laundry treatment apparatus comprising a laundry treatment drum for containing laundry to be treated, the laundry treatment apparatus being configured with an image acquisition module, a voice module and a weighing module, the control method comprising: acquiring an image of a user through the image acquisition module; determining whether there is clothing in the user's hand based on the image; if the clothes exist, further judging whether the weight of the clothes to be processed in the clothes processing drum currently exceeds a threshold value; and selectively starting the voice module to carry out washing reminding or starting the weighing module to carry out clothes weighing based on the judgment result.
In a preferred technical solution of the above control method, the step of selectively starting the voice module to remind the user of washing or starting the weighing module to weigh the laundry based on the determination result further includes: if the weight of the clothes to be processed exceeds the threshold value, starting the voice module to carry out washing reminding; and if the weight of the clothes to be processed does not exceed the threshold value, starting the weighing module to weigh the clothes.
In a preferred embodiment of the above control method, the voice module has a sound emitting mode and a sound pickup mode, and the step of "activating the voice module" further includes: controlling the voice module to enter the sound emitting mode.
In a preferred embodiment of the above control method, after the step of "controlling the voice module to enter the pronunciation mode", the control method further includes: and controlling the voice module to send out a prompt of 'washing clothes'.
In a preferred embodiment of the above control method, after the step of "starting the weighing module to weigh the laundry", the control method further includes: obtaining a weight of the laundry to be treated in the laundry treatment drum and saving the weight.
In a preferred embodiment of the above control method, after the step of "controlling the voice module to enter the sounding mode" or the step of "acquiring the weight of the laundry to be treated in the laundry treatment drum", the control method further comprises: and controlling the voice module to enter the pickup mode.
In a preferred embodiment of the above control method, the control method further includes: if the voice module does not receive a valid instruction within a preset duration after entering the sound pickup mode, controlling the voice module to exit the sound pickup mode; the valid instruction comprises a start-washing instruction, a stop-washing instruction, a power-on instruction and a power-off instruction.
In a preferred embodiment of the above control method, after the step of "controlling the voice module to exit the sound pickup mode", the control method further includes: controlling the voice module to operate at a first power.
In a preferred embodiment of the above control method, the control method further includes: if the user is not holding laundry, turning off the image acquisition module.
In a preferred embodiment of the above control method, the laundry processing apparatus is further configured with a human body detection module, and before the step of "acquiring the image of the user by the image acquisition module", the control method further includes: detecting whether a user enters a preset area or not through the human body detection module; and if a user enters the preset area, starting the image acquisition module.
In one aspect of the invention, the laundry treatment apparatus includes a laundry treatment drum for containing the laundry to be treated; when laundry is washed, it is placed in this drum. The laundry treatment apparatus is provided with an image acquisition module, a voice module and a weighing module, and the control method of the invention includes: acquiring an image of the user through the image acquisition module; judging, based on the image, whether the user is holding laundry; if so, further judging whether the weight of the laundry to be treated currently in the laundry treatment drum exceeds a threshold; and selectively starting the voice module to give a washing reminder or starting the weighing module to weigh the laundry, based on the judgment result. With this control method, laundry in the user's hand indicates an intention to wash; the method then judges whether the weight of the laundry already in the drum exceeds the threshold and, according to the result, starts the voice module to give a washing reminder or starts the weighing module to weigh the laundry. The voice module can thus be started without any operation by the user, achieving imperceptible activation of the voice module and improving the user experience. Likewise, the weighing module can be started without any user operation to obtain the weight of the laundry to be washed in the drum; this weight is used the next time to compare the current laundry weight against the threshold, and the comparison result then determines whether the voice module is started, so that imperceptible activation of the voice module is better achieved.
If the user is not holding laundry, the user is not preparing to wash and may merely be passing by the laundry treatment apparatus; in this case the image acquisition module is turned off to save electric energy.
Further, if the weight of the laundry to be treated exceeds the threshold, there is enough laundry in the laundry treatment drum to start washing, and the voice module is started to give a washing reminder. If the weight of the laundry to be treated does not exceed the threshold, the drum still holds relatively little laundry and more can be added, so the weighing module is started to weigh the laundry. In this way, the voice module or the weighing module is operated according to the weight of the laundry currently in the laundry treatment drum, so that the apparatus serves the user better.
Further, the voice module has a sound emitting mode and a sound pickup mode, and the step of activating the voice module further includes: controlling the voice module to enter the sound emitting mode, so that the washing reminder can be given by voice. After the voice module enters the sound emitting mode, it is controlled to send out the "washing clothes" prompt; on hearing the prompt, the user knows that the laundry can be washed, so the reminding purpose is better achieved.
Further, after the step of "starting the weighing module to weigh the laundry", the control method of the present invention further comprises: the weight of laundry to be treated in the laundry treating drum is acquired and stored. Based on the weight, it may be further determined whether the weight exceeds a threshold value, and then based on the determination result, it is further determined whether to activate the voice module, thereby better achieving an imperceptible activation of the voice module.
Further, after the step of controlling the voice module to enter the sound emitting mode or the step of acquiring the weight of the laundry to be treated in the laundry treatment drum, the control method further includes: controlling the voice module to enter the sound pickup mode. The user can then interact with the laundry treatment apparatus through the voice module, which better meets the user's needs.
Further, if the voice module does not receive a valid instruction (a start-washing, stop-washing, power-on or power-off instruction) within a preset duration after entering the sound pickup mode, the user evidently does not currently intend to control the laundry treatment apparatus by voice; the voice module is therefore controlled to exit the sound pickup mode, and the human-machine interaction channel is closed.
Further, after the step of controlling the voice module to exit the sound pickup mode, the voice module is controlled to operate at a first power. It should be noted that the first power is lower than the voice module's normal operating power; in other words, the voice module runs at low power. This control method effectively reduces the consumption of electric energy.
Further, the laundry treatment apparatus is also provided with a human body detection module, and before the step of "acquiring the image of the user by the image acquisition module" the control method of the invention further comprises: detecting, through the human body detection module, whether a user enters a preset area. If a user enters the preset area, the user may intend to wash laundry, and the image acquisition module is started. The image acquisition module acquires an image of the user, the image is analyzed to judge whether the user is holding laundry, and the intention to wash is thereby further judged, so that intelligent control of the laundry treatment apparatus is better achieved and the user experience is improved.
Drawings
The control method of the laundry treating apparatus of the present invention will be described with reference to the accompanying drawings, in which:
fig. 1 is a main flowchart of a control method of a laundry treating apparatus according to an embodiment of the present invention;
FIG. 2 is a flow diagram of a voice module entering a pickup mode according to an embodiment of the present invention;
FIG. 3 is a flow diagram of the activation of an image acquisition module according to one embodiment of the invention;
FIG. 4 is a flowchart of invoking a preset model to analyze the image and determining whether the user is holding laundry according to an embodiment of the present invention;
FIG. 5 is a flow diagram of an image analysis using the ResNet18 model according to one embodiment of the present invention;
fig. 6 is a flowchart of determining whether there is clothes on the user's hand based on the analysis result of the ResNet18 model according to an embodiment of the present invention.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and are not intended to limit the scope of the present invention. Although the present embodiment is described by taking a washing machine as an example, the present invention is also applicable to various types of clothes treatment apparatuses such as shoe washers, dryers, washing and drying machines, and clothes care machines. For example, when the laundry treating apparatus is a shoe washing machine, the image analysis judges whether the user is holding shoes.
It should be noted that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
To better serve users, current intelligent washing machines are generally equipped with a voice module, through which the user can interact with the machine or through which the machine can broadcast its current working state. However, to start the voice module the user currently has to issue an instruction or make a manual selection, so the module cannot be started in a truly imperceptible way. The control method of the clothes treatment equipment of the invention therefore judges, from an image of the user, whether the user is holding clothes; if so, it further judges whether the weight of the clothes to be treated currently in the clothes treatment drum exceeds a threshold, and based on the judgment result it starts the voice module to give a washing reminder or starts the weighing module to weigh the clothes. The voice module can thus be started without any operation by the user, achieving imperceptible activation of the voice module and improving the user experience.
In this embodiment, the washing machine includes a washing drum for containing the laundry; the laundry is placed in the washing drum when it is washed. The washing machine is provided with an image acquisition module, a voice module, a weighing module and a control module, and the image acquisition module, the voice module and the weighing module are all connected with the control module. The image acquisition module is used for acquiring an image of the user. The voice module is used for interaction between the user and the washing machine and for broadcasting voice prompts. The weighing module is used for acquiring the weight of the laundry to be washed in the washing drum. The control module invokes a preset model to analyze the user image acquired by the image acquisition module, judges from the analysis result whether the user is holding laundry, then judges whether the weight of the laundry to be washed in the washing drum exceeds a threshold, and, according to the judgment result, starts the voice module to give a voice reminder or starts the weighing module to weigh the laundry.
In this embodiment, the image acquisition module may be, but is not limited to, a camera, a video camera, and the like, and those skilled in the art may select it flexibly. The image acquisition module can be arranged on the front panel of the washing machine; obviously, it can also be arranged at any other position convenient for acquiring images of the user.
The voice module may be, but is not limited to, a device capable of broadcasting a voice reminder, such as a speaker or a loudspeaker, or may be, but is not limited to, a device capable of collecting ambient sound, such as a sound pickup or a microphone. The voice module can be arranged on a front panel or a side plate of the washing machine, and obviously can also be arranged at other positions where voice broadcasting and sound collection are convenient.
The weighing module may be, but is not limited to, a load cell, a weigh scale, etc. which is disposed within the washing drum for acquiring the weight of laundry to be washed within the washing drum.
First, a control method of a laundry treating apparatus of the present invention will be explained with reference to fig. 1. Fig. 1 is a main flow chart of a control method of a laundry treating apparatus according to an embodiment of the present invention.
In one possible embodiment, as shown in fig. 1, the control method of the present invention comprises:
step S10: acquiring an image of a user through an image acquisition module;
in step S10, an image of the user is acquired by the image acquisition module.
Step S20: judging whether the user is holding clothes based on the image; if not, execute step S30; if so, execute step S40;
in step S20, image analysis is performed based on the user image acquired in step S10, and it is determined whether or not the user has clothes on his hand based on the analysis result. The specific steps of image analysis are set forth in detail below. If the user has no laundry on his hand, step S30 is performed. If the user has laundry on his/her hand, step S40 is performed.
Step S30: and closing the image acquisition module.
In step S30, based on the determination result of step S20, if the user is not holding clothes, the user is not preparing to wash and may merely be passing by the washing machine; the image acquisition module is therefore turned off, and only the human body detection module continues to detect whether a user enters the preset area, so as to save electric energy.
Step S40: if the clothes exist, further judging whether the weight of the clothes to be washed in the current washing drum exceeds a threshold value;
step S50: and selectively starting the voice module to carry out washing reminding or starting the weighing module to carry out clothes weighing based on the judgment result.
In step S40, based on the determination result of step S20, if the user is holding clothes, which indicates that the user intends to wash, it is further determined whether the weight of the clothes to be washed currently in the washing drum exceeds the threshold.
In step S50, based on the determination result of step S40, the voice module is started to give a washing reminder or the weighing module is started to weigh the clothes. The voice module can thus be started without any operation by the user, achieving imperceptible activation of the voice module and improving the user experience. Likewise, the weighing module can be started without any user operation to obtain the weight of the clothes to be washed in the washing drum; this weight is used in the next comparison against the threshold, and the comparison result then determines whether the voice module is started, so that imperceptible activation of the voice module is better achieved.
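The overall flow of steps S10 to S50 can be summarized with a short sketch. This is an illustrative Python sketch only, not the patented implementation; the camera, voice and scale objects, their method names, the holding_laundry() helper and the example threshold of 4.5 kg are assumptions used for illustration.

```python
# Illustrative sketch of the main control flow (steps S10-S50); all module
# interfaces are hypothetical placeholders for the hardware described above.

WEIGHT_THRESHOLD_KG = 4.5  # example threshold taken from the description


def holding_laundry(image) -> bool:
    # Placeholder for the image analysis of step S20 (see the ResNet18 sketch
    # later in this description); returns True if the user is holding laundry.
    raise NotImplementedError


def control_cycle(camera, voice, scale, saved_weight_kg: float) -> float:
    """Run one control cycle and return the (possibly updated) saved weight."""
    image = camera.capture()                   # step S10: acquire an image of the user
    if not holding_laundry(image):             # step S20: is the user holding laundry?
        camera.power_off()                     # step S30: turn off the camera to save energy
        return saved_weight_kg

    if saved_weight_kg > WEIGHT_THRESHOLD_KG:  # step S40: compare against the threshold
        voice.enter_sound_emitting_mode()      # step S50: enough laundry, remind the user
        voice.say("washing clothes")
    else:
        saved_weight_kg = scale.weigh()        # step S50: too little laundry, weigh and save
    voice.enter_sound_pickup_mode()            # step S61: then listen for voice commands
    return saved_weight_kg
```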
Specifically, if the weight of the laundry in the washing drum exceeds the threshold (for example, the laundry weighs 4.8 kg and the threshold is 4.5 kg), there is enough laundry in the laundry treatment drum to wash, and the voice module is started to give a washing reminder.
It should be noted that the voice module has a sound emitting mode and a sound pickup mode. When the voice module enters the sound emitting mode, it can send out a "washing clothes" reminder; obviously, it can also send out other voice prompts, such as "power on", and those skilled in the art can choose the prompt flexibly according to the specific application scene, as long as it serves as a washing reminder. When the voice module enters the sound pickup mode, it can collect ambient sound, such as the various instructions issued by the user, for example a washing instruction or a power-on instruction.
Preferably, the step of "starting the voice module to perform the washing reminder" specifically includes: and controlling the voice module to enter a pronunciation mode. After the step of controlling the voice module to enter the pronunciation mode, the voice module is controlled to send out the prompt of washing clothes, and the user knows that the clothes can be washed after hearing the prompt, so that the user can be better reminded.
Obviously, the voice module can also give the washing reminder in other manners, for example with a short "beep" alert tone, and those skilled in the art can flexibly select the specific reminding means.
If the weight of the laundry in the washing drum does not exceed the threshold (for example, the laundry weighs 3 kg and the threshold is 4.5 kg), the laundry treatment drum still holds relatively little laundry and more can be added; the weighing module is then started to weigh the laundry.
In a possible embodiment, after the step of "starting the weighing module for weighing the laundry", the control method of the present invention further comprises: and acquiring the weight of the clothes to be washed in the washing drum through the weighing module, and storing the weight. After the weight of the laundry is acquired, the weight is preserved. When the next time that the user has clothes in the hands, the weight is used as the weight of the clothes to be washed in the current washing drum to be compared with the threshold value, and then whether the voice module is started or not is further determined according to the judgment result, so that the non-inductive starting of the voice module is better realized.
In this way, the voice module or the weighing module is operated according to the weight of the laundry currently in the washing drum, so that the user is better served.
After the voice module is controlled to enter the sound emitting mode, or after the weight of the laundry to be washed in the washing drum is acquired, the voice module is controlled to enter the sound pickup mode so as to collect ambient sound and enable human-machine interaction. The control method of the laundry treating apparatus of the present invention is further explained below with reference to fig. 2. Fig. 2 is a flowchart illustrating the voice module entering the sound pickup mode according to an embodiment of the present invention.
As shown in fig. 2, in a possible embodiment, after controlling the voice module to enter the sound emitting mode or acquiring the weight of the laundry in the washing drum, the control method of the present invention further comprises:
step S61: after the voice module is controlled to enter the voice mode or the weight of clothes to be washed in the washing drum is acquired, the voice module is controlled to enter the sound pickup mode;
in step S61, after the voice module is controlled to enter the sounding mode, a washing reminder may be given to remind the user that the clothes can be washed, and then the voice module is controlled to enter the sound pickup mode to collect environmental sound, such as an instruction of the user.
Alternatively, after the weight of the laundry to be washed in the washing drum is acquired and saved, the voice module is controlled to enter the sound pickup mode to collect ambient sound. Thus, when the weight of the laundry exceeds the threshold and the washing reminder has been given, the user can issue a voice instruction to the washing machine, for example an instruction to start washing; and when the weight does not exceed the threshold but the user still wants to wash, the user can likewise issue such an instruction by voice.
With this control method, after the voice module enters the sound emitting mode or after the weight of the laundry is acquired, the voice module is in the sound pickup mode, so that human-machine interaction between the user and the washing machine is possible, the washing machine can be controlled more conveniently, and the user experience is improved.
Step S62: if the voice module does not receive a valid instruction within a preset duration after entering the sound pickup mode, controlling the voice module to exit the sound pickup mode;
the effective instruction comprises a washing starting instruction, a washing stopping instruction, a starting instruction and a shutdown instruction. It should be noted that the types of the valid instructions listed above are only specific types of instructions, and do not limit the specific content of the instructions that can be received by the voice module, as long as the valid instructions received by the voice module can control the washing machine to start washing, stop washing, start up, and shut down.
In step S62, after the voice module is controlled to enter the sound pickup mode in step S61, the time for which it has been in the sound pickup mode is tracked. If no valid instruction has been received when this time reaches the preset duration (for example, 5 min), the user evidently does not intend to wash laundry at present, and the voice module is controlled to exit the sound pickup mode.
Step S63: the voice module is controlled to operate at a first power.
In step S63, after the voice module is controlled to exit the sound pickup mode in step S62, the voice module is controlled to operate at the first power. It should be noted that the first power is lower than the voice module's normal operating power. Specifically, the voice module is put into a sleep state or runs at a very low first power, which effectively saves electric energy.
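Steps S61 to S63 can be illustrated with the following sketch. It is a minimal example under assumed interfaces: the voice-module methods, the 5-minute timeout, the command set and the low-power value are placeholders, not part of the disclosure.

```python
import time

# Illustrative sketch of steps S61-S63, assuming a hypothetical voice-module API.
VALID_COMMANDS = {"start washing", "stop washing", "power on", "power off"}
PICKUP_TIMEOUT_S = 5 * 60   # example preset duration of 5 minutes
FIRST_POWER_W = 0.1         # hypothetical low "first power" (sleep-level)


def pickup_phase(voice):
    """Listen for a valid command; on timeout, exit pickup mode and drop to low power."""
    voice.enter_sound_pickup_mode()                # step S61
    deadline = time.monotonic() + PICKUP_TIMEOUT_S
    while time.monotonic() < deadline:             # step S62: wait for a valid instruction
        command = voice.listen(timeout_s=1.0)      # returns None if nothing was heard
        if command in VALID_COMMANDS:
            return command                         # hand the command to the main controller
    voice.exit_sound_pickup_mode()                 # timeout: close the interaction channel
    voice.set_power(FIRST_POWER_W)                 # step S63: run at the lower first power
    return None
```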
In one possible embodiment, a relay is provided on the power supply circuit of the voice module. After the voice module exits the sound pickup mode, or before it is determined in step S20 whether the user is holding clothes, the relay is opened, so that the voice module operates at the first power, for example in a sleep state or at an extremely low power. Once it is determined that the user is holding clothes, the relay is closed, so that the voice module operates at a second power that is higher than the first power, specifically its normal operating power. With this control method, before it is known whether the user is holding clothes, or when the user is not holding clothes, the voice module runs at the lower first power (or even sleeps), saving electric energy; once the user is determined to be holding clothes, the voice module runs at the higher second power, that is, it starts normally, so the user can interact with the washing machine through the voice module and imperceptible activation of the voice module is achieved.
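A minimal sketch of this relay-based power gating follows; the Relay class and its pin argument are hypothetical illustrations of a driver for the relay on the voice module's supply circuit, not an API from the disclosure.

```python
# Illustrative sketch of the relay-based power control described above.

class Relay:
    """Hypothetical driver for a relay on the voice module's power supply circuit."""

    def __init__(self, gpio_pin: int):
        self.gpio_pin = gpio_pin
        self.closed = False

    def open(self) -> None:    # disconnect: voice module drops to the first (low) power
        self.closed = False

    def close(self) -> None:   # connect: voice module runs at the second (normal) power
        self.closed = True


def update_voice_power(relay: Relay, user_holding_laundry: bool) -> None:
    # Close the relay only once the user is determined to be holding laundry;
    # otherwise keep it open so the voice module stays in its low-power state.
    if user_holding_laundry:
        relay.close()
    else:
        relay.open()
```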
Of course, the power supply circuit of the voice module may also omit the relay, with the control module directly switching the voice module between the first power and the second power. Those skilled in the art can choose freely, as long as the voice module can be made to run at the lower first power to save electric energy at the appropriate time, or at the higher second power to ensure its normal operation.
In order to provide services to the user more accurately, before the image of the user is acquired in step S10, it is detected whether the user enters the preset area, and whether to start the image acquisition module is determined according to whether the user enters the preset area. Specifically, the control method of the laundry treating apparatus of the present invention is further explained below with reference to fig. 3. FIG. 3 is a flowchart of activating an image capture module according to an embodiment of the present invention.
As shown in fig. 3, in a possible embodiment, before step S10 "acquiring an image of a user by an image acquisition module", the control method of the present invention further includes:
step S100: detecting whether a user enters a preset area or not through a human body detection module;
step S200: and if the user enters the preset area, starting the image acquisition module.
In this embodiment, the washing machine is further provided with a human body detection module for detecting whether a user enters a preset area. The human body detection module can be arranged on a front panel of the washing machine, and obviously can also be arranged at other positions which are convenient for detecting whether a user enters a preset area or not.
It should be noted that the human body detection module may be any module capable of detecting whether a user enters the preset area, such as an infrared detection module or a radar detection module; whatever detection method is adopted, the specific method does not limit the present invention.
In step S100, a human body detection module such as an infrared detection module or a radar detection module is used to detect whether a user enters a preset area.
It should be noted that the human body detection module may detect continuously in real time, or detect at a preset time interval, where the preset time interval may be 15 seconds, 30 seconds, 1 min, 3 min, or the like. These intervals are only examples and are not limiting; in practice, those skilled in the art can adjust the preset time interval flexibly, for example according to how often users approach the washing machine's location, as long as entry of a user into the preset area can be detected accurately.
It should be noted that the preset area may be the normal detection range of the human body detection module, or an area set in advance by those skilled in the art based on experiment or experience, for example the area within a straight-line distance of 0.8 m, 1.2 m or 1.5 m from the washing machine; the preset area can be adjusted and set flexibly.
In step S200, if the detection result of step S100 shows that a user has entered the preset area, the user may want to wash laundry, or may merely be passing by the washing machine. To further judge the user's intention, the image acquisition module is started, and an image of the user is acquired through it.
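Steps S100 and S200 amount to a simple presence-gated activation of the camera, sketched below. The sensor and camera interfaces and the 30-second polling interval are assumptions for illustration (the description allows real-time detection or intervals such as 15 s, 30 s, 1 min or 3 min).

```python
import time

# Illustrative sketch of steps S100-S200: poll a presence sensor at a preset
# interval and start the image acquisition module only when a user enters
# the preset area. All interfaces are hypothetical.

DETECT_INTERVAL_S = 30  # example preset interval


def wait_for_user(presence_sensor, camera) -> None:
    while True:
        if presence_sensor.user_in_preset_area():  # step S100: infrared/radar detection
            camera.power_on()                      # step S200: activate image acquisition
            return
        time.sleep(DETECT_INTERVAL_S)              # detect periodically to save energy
```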
After determining that the user enters the preset area, step S10 is performed to acquire an image of the user and further determine whether the user is holding laundry based on the image. Specifically, the control method of the laundry treating apparatus of the present invention is further explained below with reference to fig. 4 to 6. Fig. 4 is a flowchart of invoking a preset model to analyze the image and determining whether the user is holding laundry according to an embodiment of the present invention, fig. 5 is a flowchart of analyzing the image using the ResNet18 model according to an embodiment of the present invention, and fig. 6 is a flowchart of determining whether the user is holding laundry based on the analysis result of the ResNet18 model according to an embodiment of the present invention.
As shown in fig. 4, in a possible embodiment, the step of "judging whether the user is holding laundry based on the image" in step S20 further includes:
step S201: calling a preset model to analyze the image;
step S202: and judging whether the user has clothes on the hand or not according to the analysis result.
In step S201, preset models are pre-stored on the washing machine, and the preset model may be, but is not limited to, a CNN model, a ResNet18 model, a ResNet101 model, a DeepLabv3+ model, a ResNeXt model, or an HRNet model.
Preferably, the preset model is a ResNet18 model, and the network structure of the ResNet18 model is shown in Table 1 below.
Table 1 Network structure of the ResNet18 model
The layers of the ResNet18 model are connected in a non-linear manner, so the model as a whole is non-linear. It should be noted that the calculation method and operating principle of the ResNet18 model are common knowledge in the art and are not described here. Of course, other models such as the ResNet50 model or the ResNet101 model may also be used to analyze the user's image and determine whether the user is holding laundry.
The specific manner of "calling the preset model analysis image" in step S201 is described below with reference to fig. 5 and taking the preset model as the ResNet18 model.
Step S2011: intercepting a plurality of sub-images from an image according to a preset method;
step S2012: inputting all sub-images into ResNet18 model;
step S2013: the ResNet18 model calculates and obtains the characteristic value of the image according to all the sub-images.
The preset method may be to set different sliding frames, capture the image within each frame, and use each captured region as a sub-image. Alternatively, the preset method may be to divide the user image into N parts according to its size (for example 5, 10, 15 or 20 parts, N being a positive integer), capture each part, and use the captured parts as the sub-images. Of course, the preset method is not limited to the methods listed above; any method may be adopted as long as a plurality of sub-images can be cut out of the image.
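The second preset method, splitting the image into N parts, can be sketched as follows; the choice of N and the strip orientation are assumptions for illustration.

```python
import numpy as np

# Illustrative sketch of step S2011: split the user image into N equal
# horizontal strips and use each strip as a sub-image.


def crop_sub_images(image: np.ndarray, n_parts: int = 10) -> list[np.ndarray]:
    """image is an H x W x 3 array; returns n_parts sub-images."""
    height = image.shape[0]
    step = height // n_parts
    return [image[i * step:(i + 1) * step] for i in range(n_parts)]
```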
As shown in fig. 6, the step of "judging whether the user is holding laundry according to the analysis result" in step S202 specifically includes:
s2021: judging whether the characteristic value is larger than a preset value or not;
s2022: if the characteristic value is larger than the preset value, judging that clothes exist on the hand of the user;
s2023: and if the characteristic value is less than or equal to the preset value, judging that the user does not have clothes on the hand.
In step S2021, the feature value calculated in step S2013 is compared with the preset value, and whether the user is holding laundry is finally determined according to the comparison result.
In step S2022, if the feature value is greater than the preset value (for example, the preset value is 0.5 and the feature value calculated in step S2013 is 0.95), it is determined that the user is holding laundry.
In step S2023, if the feature value is less than or equal to the preset value (for example, the preset value is 0.5 and the feature value calculated in step S2013 is 0.05), it is determined that the user is not holding laundry.
It should be noted that the preset values listed above are only examples and are not limiting; those skilled in the art can adjust the preset value according to the required accuracy of the judgment, for example to 0.7, 0.8, 0.9 or 1, as long as whether the user is holding laundry can be determined accurately.
It should further be noted that steps S2022 and S2023 are not sequential but parallel alternatives: which of them is executed depends only on whether the feature value is greater than the preset value.
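Steps S2012 to S2023 can be illustrated with the sketch below. The single-output classification head, the sigmoid, and the averaging of scores over sub-images are assumptions made for illustration; the description only states that the ResNet18 model yields a feature value that is compared with a preset value such as 0.5, and a trained model would be loaded in practice.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T

# Illustrative sketch of steps S2012-S2023: score the sub-images with a
# ResNet18-based classifier and threshold the aggregated feature value.

PRESET_VALUE = 0.5

model = models.resnet18(weights=None)                  # untrained here; load weights in practice
model.fc = torch.nn.Linear(model.fc.in_features, 1)    # single "laundry in hand" score
model.eval()

preprocess = T.Compose([T.ToTensor(), T.Resize((224, 224))])


def judge_laundry_in_hand(sub_images) -> bool:
    """sub_images: list of H x W x 3 uint8 arrays cropped from the user image."""
    batch = torch.stack([preprocess(img) for img in sub_images])   # step S2012
    with torch.no_grad():
        scores = torch.sigmoid(model(batch)).squeeze(1)            # one score per sub-image
    feature_value = scores.mean().item()                           # step S2013: aggregate value
    return feature_value > PRESET_VALUE                            # steps S2021-S2023
```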
In summary, in the preferred technical solution of the present invention, an image of the user is acquired through the image acquisition module and analyzed, and whether the user is holding clothes is judged from the analysis result. If so, it is further judged whether the weight of the clothes to be treated currently in the clothes treatment drum exceeds the threshold, and based on the judgment result the voice module is started to give a washing reminder or the weighing module is started to weigh the clothes; the voice module can thus be started without any operation by the user, improving the user experience. If the user is not holding clothes, the image acquisition module is turned off to save electric energy. If the weight of the clothes to be treated exceeds the threshold, the voice module is controlled to send out a "washing clothes" prompt; if it does not, the weighing module acquires and saves the weight of the clothes to be treated in the drum. After the voice module enters the sound emitting mode, or after the weight of the clothes to be treated in the drum is acquired, the voice module is controlled to enter the sound pickup mode, enabling human-machine interaction between the user and the washing machine. If no valid instruction is received within the preset duration after entering the sound pickup mode, the voice module exits the sound pickup mode and is controlled to operate at the first power, saving electric energy.
Although the foregoing embodiments describe the steps in the above sequential order, those skilled in the art can understand that, in order to achieve the effect of the present embodiments, the different steps need not be executed in such an order, and may be executed simultaneously (in parallel) or in an inverted order, and these simple changes are all within the scope of protection of the present application.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
Claims (10)
1. A control method of a laundry treatment apparatus, characterized in that the laundry treatment apparatus comprises a laundry treatment drum for containing laundry to be treated, the laundry treatment apparatus is provided with an image acquisition module, a voice module and a weighing module,
the control method comprises the following steps:
acquiring an image of a user through the image acquisition module;
judging whether the user has clothes in the hand or not based on the image;
if the clothes exist, further judging whether the weight of the clothes to be processed in the clothes processing drum currently exceeds a threshold value;
and selectively starting the voice module to carry out washing reminding or starting the weighing module to carry out clothes weighing based on the judgment result.
2. The control method according to claim 1, wherein the step of selectively activating the voice module for a wash reminder or the weighing module for weighing laundry based on the determination result further comprises:
if the weight of the clothes to be processed exceeds the threshold value, starting the voice module to carry out washing reminding;
and if the weight of the clothes to be processed does not exceed the threshold value, starting the weighing module to weigh the clothes.
3. The control method of claim 2, wherein the voice module has a sound emitting mode and a sound pickup mode, and the step of activating the voice module further comprises:
and controlling the voice module to enter the pronunciation mode.
4. The control method according to claim 3, wherein after the step of controlling the voice module to enter the sound emitting mode, the control method further comprises:
and controlling the voice module to send out a prompt of 'washing clothes'.
5. The control method of claim 3, wherein after the step of "activating the weighing module", the control method further comprises:
obtaining the weight of the laundry to be treated in the laundry treatment drum and saving the weight.
6. The control method according to claim 5, wherein after the step of controlling the voice module to enter the sound emitting mode or the step of acquiring the weight of the laundry to be treated in the laundry treating drum, the control method further comprises:
and controlling the voice module to enter the pickup mode.
7. The control method according to claim 6, characterized by further comprising:
if the voice module does not receive a valid instruction within a preset duration after entering the sound pickup mode, controlling the voice module to exit the sound pickup mode;
the valid instruction comprises a start-washing instruction, a stop-washing instruction, a power-on instruction and a power-off instruction.
8. The control method according to claim 7, wherein after the step of controlling the voice module to exit the sound pickup mode, the control method further comprises:
controlling the voice module to operate at a first power.
9. The control method according to claim 1, characterized in that the control method further comprises:
if the user is not holding laundry, turning off the image acquisition module.
10. The control method according to claim 1, wherein the laundry treating apparatus is further provided with a human body detecting module,
before the step of "acquiring an image of a user by the image acquisition module", the control method further includes:
detecting whether a user enters a preset area or not through the human body detection module;
and if a user enters the preset area, starting the image acquisition module.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110757378.6A CN115584613A (en) | 2021-07-05 | 2021-07-05 | Control method of clothes treatment equipment |
EP22836686.0A EP4368761A1 (en) | 2021-07-05 | 2022-06-16 | Control method for clothes treatment device |
PCT/CN2022/099067 WO2023279932A1 (en) | 2021-07-05 | 2022-06-16 | Control method for clothes treatment device |
US18/576,497 US20240301604A1 (en) | 2021-07-05 | 2022-06-16 | Control method for clothes treatment device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110757378.6A CN115584613A (en) | 2021-07-05 | 2021-07-05 | Control method of clothes treatment equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115584613A true CN115584613A (en) | 2023-01-10 |
Family
ID=84771794
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110757378.6A Pending CN115584613A (en) | 2021-07-05 | 2021-07-05 | Control method of clothes treatment equipment |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240301604A1 (en) |
EP (1) | EP4368761A1 (en) |
CN (1) | CN115584613A (en) |
WO (1) | WO2023279932A1 (en) |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11098429B2 (en) * | 2015-09-17 | 2021-08-24 | Washlava, Inc. | Communication and control system for laundry machines |
CN209456744U (en) * | 2018-09-21 | 2019-10-01 | 北京小米移动软件有限公司 | A kind of clothes cleaning equipment |
EP3957791A4 (en) * | 2019-04-16 | 2022-12-07 | LG Electronics Inc. | Artificial intelligence laundry treating device and operation method therefor |
CN112229753A (en) * | 2019-07-15 | 2021-01-15 | 青岛海尔洗衣机有限公司 | Clothes accommodating device and foreign matter reminding method |
CN112481975A (en) * | 2019-09-11 | 2021-03-12 | 青岛海尔洗衣机有限公司 | Clothes care method and device and clothes care equipment |
CN112941804B (en) * | 2019-12-11 | 2024-09-13 | 合肥海尔洗衣机有限公司 | Control method of washing machine and washing machine |
CN110965302A (en) * | 2019-12-24 | 2020-04-07 | 青岛海尔洗衣机有限公司 | To-be-washed clothes storage device and washing machine |
CN113046993A (en) * | 2021-03-19 | 2021-06-29 | 合肥美菱物联科技有限公司 | Intelligent washing machine and control method thereof |
-
2021
- 2021-07-05 CN CN202110757378.6A patent/CN115584613A/en active Pending
-
2022
- 2022-06-16 US US18/576,497 patent/US20240301604A1/en active Pending
- 2022-06-16 WO PCT/CN2022/099067 patent/WO2023279932A1/en active Application Filing
- 2022-06-16 EP EP22836686.0A patent/EP4368761A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4368761A1 (en) | 2024-05-15 |
US20240301604A1 (en) | 2024-09-12 |
WO2023279932A1 (en) | 2023-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3517849B1 (en) | Household appliance control method, device and system, and intelligent air conditioner | |
CN105912092B (en) | Voice awakening method and speech recognition equipment in human-computer interaction | |
US20190080541A1 (en) | Unlocking control methods and related products | |
CN104571925B (en) | The one-handed performance method and device of mobile terminal | |
CN103856605B (en) | A kind of information processing method and electronic equipment | |
CN109067628B (en) | Voice control method and control device of intelligent household appliance and intelligent household appliance | |
CN111210021A (en) | Audio signal processing method, model training method and related device | |
CN104883505B (en) | Electronic equipment and its camera control method | |
CN107633657A (en) | The based reminding method and terminal of a kind of fatigue driving | |
CN110409156A (en) | Ironing equipment, ironing clothes method and apparatus | |
CN105093980B (en) | Control the method and device of smart machine start and stop | |
CN101483683A (en) | Handhold apparatus and voice recognition method thereof | |
CN107918726A (en) | Apart from inducing method, equipment and storage medium | |
CN111692418A (en) | Water outlet device and control method thereof | |
CN109192214A (en) | A kind of voice number obtaining method, storage medium and robot | |
CN108076223A (en) | Target switching method, device, terminal device and storage medium | |
CN115584613A (en) | Control method of clothes treatment equipment | |
CN107395873A (en) | volume adjusting method, device, storage medium and terminal | |
CN109487492A (en) | Washing machine, washing machine voice control model of mind designing system and its operation method | |
CN111839305A (en) | Shower control method, shower device, and storage medium | |
CN110004659B (en) | Laundry treating apparatus and control method thereof | |
CN109841221A (en) | Parameter adjusting method, device and body-building equipment based on speech recognition | |
CN103577302A (en) | Electronic equipment and monitoring method thereof | |
CN105373004A (en) | Method and device for controlling smart device | |
CN115404641A (en) | Control method of clothes treatment equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||