CN113723498A - Food maturity identification method, device, system, electric appliance, server and medium - Google Patents
- Publication number: CN113723498A
- Application number: CN202110993333.9A
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/2415 — Pattern recognition; classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio
- G06F18/22 — Pattern recognition; matching criteria, e.g. proximity measures
- G06N3/045 — Neural networks; combinations of networks
- G06N3/08 — Neural networks; learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computational Linguistics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Evolutionary Biology (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Probability & Statistics with Applications (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a food doneness identification method, device, and system, together with an electric appliance, a server, and a medium. The food doneness identification method comprises the following steps: acquiring an initial state image of the food; obtaining a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness; acquiring a current state image of the food; and comparing the similarity of the target image and the current state image to obtain the food doneness. Because the doneness is obtained by comparing the similarity of the target image and the current state image, the method can satisfy the doneness-judgment needs of many types of food and generalizes well; in addition, it occupies few computing resources, which reduces the cost of doneness judgment.
Description
Technical Field
The invention relates to the technical field of household appliances, and in particular to a method, device, system, electric appliance, server, and medium for identifying the doneness of food.
Background
With the development of science and technology, intelligent cooking equipment such as ovens has become increasingly common in ordinary households and greatly facilitates daily life.
However, among existing methods for judging the doneness of food, some cooking devices detect the temperature of the food with an added probe. Inserting and cleaning the probe is troublesome, and a probe can generally only be used on thicker foods, so the approach generalizes poorly.
Disclosure of Invention
Embodiments of the present invention provide a food doneness identification method, device, and system, as well as an electric appliance, a server, and a medium.
The food doneness identification method comprises the following steps: acquiring an initial state image of the food;
obtaining a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness;
acquiring a current state image of the food;
and comparing the similarity of the target image and the current state image to obtain the food doneness.
In some embodiments, obtaining the target image from the initial state image and the preset adversarial neural network model includes:
acquiring feature information of the initial state image from the initial state image and the preset adversarial neural network model;
and obtaining the target image from the feature information and the preset adversarial neural network model.
In some embodiments, the preset adversarial neural network model comprises a number of sub-neural network models corresponding to different types of food, and obtaining the target image from the initial state image and the preset adversarial neural network model comprises:
searching the preset adversarial neural network model according to the initial state image to obtain the corresponding sub-neural network model;
inputting the initial state image into the sub-neural network model to obtain feature information of the initial state image;
and obtaining the target image from the feature information and the sub-neural network model.
In some embodiments, comparing the similarity between the target image and the current state image to obtain the food doneness comprises:
acquiring the pixel value of each pixel point of the target image and the pixel value of each pixel point of the current state image;
calculating the difference between the pixel value of each pixel point of the target image and the pixel value of the corresponding pixel point of the current state image to obtain a pixel difference;
determining that the two corresponding pixel points are similar when the pixel difference is smaller than a preset threshold;
determining that the two corresponding pixel points are not similar when the pixel difference is larger than the preset threshold;
and calculating the proportion of similar pixel points among all pixel points to obtain the similarity, where the similarity represents the food doneness.
In some embodiments, comparing the similarity between the target image and the current state image to obtain the food doneness comprises:
obtaining that the current food doneness is uncooked when the similarity is smaller than a preset doneness threshold;
and obtaining that the current food doneness is cooked when the similarity is larger than the preset doneness threshold.
The food doneness identification device of the embodiment of the present invention includes: an image acquisition module for acquiring an initial state image of the food and a current state image of the food;
an image processing module for obtaining a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness;
and an image comparison module for comparing the similarity between the target image and the current state image to obtain the food doneness.
The cooking appliance of the embodiment of the present invention comprises the food doneness identification device.
The server of the embodiment of the present invention comprises a communication module, an image processing module, and an image comparison module. The communication module receives the initial state image of the food and the current state image of the food uploaded by a cooking appliance;
the image processing module obtains a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness;
and the image comparison module compares the similarity between the target image and the current state image to obtain the food doneness, which is then transmitted to the cooking appliance.
The food doneness identification system of the embodiment of the present invention comprises a cooking appliance and a server.
The cooking appliance comprises an image acquisition module and an image comparison module: the image acquisition module acquires an initial state image and a current state image of the food and transmits them to the server, and the image comparison module compares the similarity between the target image and the current state image to obtain the food doneness.
The server comprises an image processing module, which obtains the target image from the initial state image and a preset adversarial neural network model and transmits the target image to the cooking appliance.
Embodiments of the present invention further provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the program implements the steps of the food doneness identification method of any of the above embodiments.
With the food doneness identification method, device, system, appliance, server, and medium described above, the doneness of the food is obtained by comparing the similarity of the target image and the current state image. The method can satisfy the doneness-judgment needs of many types of food and generalizes well; in addition, it occupies few computing resources, which reduces the cost of doneness judgment.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The foregoing and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flowchart of a food doneness identification method according to an embodiment of the present invention;
fig. 2 is another schematic flowchart of the food doneness identification method according to an embodiment of the present invention;
fig. 3 is yet another schematic flowchart of the food doneness identification method according to an embodiment of the present invention;
fig. 4 is a further schematic flowchart of the food doneness identification method according to an embodiment of the present invention;
fig. 5 is a further schematic flowchart of the food doneness identification method according to an embodiment of the present invention;
fig. 6 is a block diagram of a food doneness recognition apparatus according to an embodiment of the present invention;
fig. 7 is a block diagram of a cooking appliance according to an embodiment of the present invention;
FIG. 8 is a block diagram of a server in accordance with an embodiment of the present invention;
fig. 9 is a block diagram of a food doneness identification system according to an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the description of the embodiments of the present invention, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present invention, "a plurality" means two or more.
Referring to fig. 1, a food doneness identification method according to an embodiment of the present invention includes:
S10, acquiring an initial state image of the food;
S20, obtaining a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness;
S30, acquiring a current state image of the food;
and S40, comparing the similarity of the target image and the current state image to obtain the food doneness.
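The four steps above can be sketched as a single routine. The helper callables (`capture_image`, `generate_target_image`, `compare_similarity`) and the 0.9 threshold are hypothetical stand-ins for the appliance's real components, not values given in the patent:

```python
def identify_doneness(capture_image, generate_target_image, compare_similarity,
                      doneness_threshold=0.9):
    """Sketch of steps S10-S40: capture the initial-state image, generate the
    target (required-doneness) image once, then compare a fresh capture
    against it to decide whether the food is cooked."""
    initial = capture_image()                # S10: initial state image
    target = generate_target_image(initial)  # S20: adversarial-model output
    current = capture_image()                # S30: current state image
    similarity = compare_similarity(target, current)  # S40: compare
    return similarity >= doneness_threshold
```

In practice steps S30 and S40 repeat during cooking, while S10 and S20 run only once at the start.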
With this food doneness identification method, the doneness of the food is obtained by comparing the similarity of the target image and the current state image. The method can therefore satisfy the doneness-judgment needs of many types of food and generalizes well; in addition, it occupies few computing resources, which reduces the cost of doneness judgment.
Specifically, among existing methods for judging the doneness of food, some cooking devices detect the temperature of the food with an added probe, but the probe is troublesome to insert and clean and can generally only be used on thicker foods, so the approach generalizes poorly. Other cooking devices judge doneness with an oxygen sensor: they measure the change in oxygen content in the oven cavity and consider the food cooked to an edible doneness when the oxygen content falls below a set threshold. However, this supports only a few types of food, limited to certain baked goods or meat, and during use the user must input the food to be cooked so that the oxygen-sensor threshold of the corresponding recipe can be initialized. The intelligent generalization is therefore low and the user experience poor.
In the food doneness identification method of the embodiment of the invention, the preset adversarial neural network model generates a target image corresponding to the initial state image, so that once the current state image of the food is acquired, the doneness can be obtained from the similarity between the two. The doneness of many types of food can thus be judged; moreover, the similarity comparison occupies few computing resources, which reduces the cost of doneness judgment.
Specifically, the initial state image of the food is an image containing the food in its initial state. The image may contain elements such as the food, the baking tray, and the surrounding environment; it may contain all of the food in the cooking appliance 200, or only some of it, which is not limited herein.
It should be noted that the original image acquired by the image acquisition device may be preprocessed: elements such as the baking tray and the cavity are removed, i.e. the pixels representing them are set to black so that only the pixels representing the food remain, and the processed image is used as the initial state image of the food. The current state image of the food should take the same form as the initial state image; that is, if the initial state image contains the food, the baking tray, and the environment, so should the current state image, and both images should be collected by the image acquisition device from the same position.
It is understood that the higher the similarity between the target image and the current state image, the closer the current state of the food is to the required doneness. It should be noted that the interval between acquiring the initial state image and acquiring subsequent current state images should not be too long, to avoid the food being overcooked because a current state image is captured only after the food has already passed the required doneness.
It should be noted that the initial state image includes, but is not limited to, an image of raw food; that is, it may be an image acquired when the food is just put into the cooking appliance 200, or an image acquired after the food has been in the cooking appliance 200 for a certain period of time, which is not limited herein.
The preset adversarial neural network model may be based on ACGAN (Auxiliary Classifier Generative Adversarial Network), CGAN (Conditional Generative Adversarial Network), or a similar generative adversarial network architecture.
In some embodiments, referring to fig. 2, step S20 includes:
step S21, acquiring feature information of the initial state image from the initial state image and the preset adversarial neural network model;
and step S22, obtaining the target image from the feature information and the preset adversarial neural network model.
In this way, the target image can be obtained, providing a basis for judging the food doneness.
Specifically, the feature information includes information characterizing the color, shape, size, texture, and other attributes of the food. In this embodiment, the preset adversarial neural network model can generate target images for many types of food, such as steak, cake, and barbecue. The initial state image is input into the preset adversarial neural network model, which extracts feature information from it through convolutional encoding, yielding an intermediate image carrying the feature information; a deconvolution generator then maps this intermediate image to the target image. Because computing power is limited, the intermediate image is typically smaller than the target image and the initial state image; for example, when the target image and the initial state image are both 1080p, the intermediate image may be 480p.
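The resolution flow described above (convolutional encoding down to a smaller intermediate image, then deconvolution back up to full resolution) can be mimicked shape-wise with a NumPy stand-in. This is not the trained adversarial model itself, only an illustration of the downsample/upsample pipeline; block averaging and nearest-neighbour repetition are assumptions standing in for the learned convolutions:

```python
import numpy as np

def encode(image, factor=2):
    """Stand-in for the convolutional encoder: downsample the initial-state
    image into a smaller intermediate "feature image" by block averaging."""
    h = image.shape[0] // factor * factor
    w = image.shape[1] // factor * factor
    img = image[:h, :w]
    # Average each factor x factor block per channel.
    return img.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))

def decode(features, factor=2):
    """Stand-in for the deconvolution generator: upsample the intermediate
    image back to the target resolution by nearest-neighbour repetition."""
    return np.repeat(np.repeat(features, factor, axis=0), factor, axis=1)
```

With `factor=2` a 4x4 RGB image encodes to a 2x2 intermediate and decodes back to 4x4, mirroring the 1080p-to-480p-to-1080p example in the text.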
In some embodiments, referring to fig. 3, the preset adversarial neural network model includes a plurality of sub-neural network models corresponding to different types of food, and step S20 includes:
step S23, searching the preset adversarial neural network model according to the initial state image to obtain the corresponding sub-neural network model;
step S24, inputting the initial state image into the sub-neural network model to obtain the feature information of the initial state image;
and step S25, obtaining the target image from the feature information and the sub-neural network model.
In this way, different sub-neural network models are selected for different food types, which improves the accuracy of the judgment.
Specifically, the preset adversarial neural network model may include a plurality of sub-neural network models, each corresponding to a different type of food such as steak, cake, or barbecue. An image recognition module can identify the food type in the initial state image, so that the corresponding sub-neural network model is looked up in the preset adversarial neural network model according to the recognition result; the target image is then generated by the sub-neural network model matching the food type, which makes it more accurate. Alternatively, the preset adversarial neural network model can first extract feature information from the initial state image through convolutional encoding to form an intermediate image, look up the corresponding sub-neural network model according to that intermediate image, and let the sub-neural network model map the intermediate image to the target image through a deconvolution generator.
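The per-type lookup can be sketched as a plain registry. The registry contents and the fallback to a general model are assumptions for illustration; the patent only describes looking up the sub-model that matches the recognised food type:

```python
def select_sub_model(food_type, sub_models, general_model=None):
    """Step S23 sketch: pick the sub-neural-network model for the recognised
    food type from a hypothetical registry, e.g. {"steak": steak_model}.
    Falling back to a general model when no type-specific entry exists is an
    assumption, not something the patent specifies."""
    return sub_models.get(food_type, general_model)
```

The recognised `food_type` would come from the image recognition module described above.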
In some embodiments, referring to fig. 4, step S40 includes:
step S41, acquiring the pixel value of each pixel point of the target image and the pixel value of each pixel point of the current state image;
step S42, calculating the difference between the pixel value of each pixel point of the target image and the pixel value of the corresponding pixel point of the current state image to obtain a pixel difference;
step S43, determining that the two corresponding pixel points are similar when the pixel difference is smaller than a preset threshold;
step S44, determining that the two corresponding pixel points are not similar when the pixel difference is larger than the preset threshold;
and step S45, calculating the proportion of similar pixel points among all pixel points to obtain the similarity, where the similarity represents the food doneness.
In this way, the similarity between the target image and the current state image can be obtained, providing a basis for judging the food doneness.
Specifically, the pixel value may be the RGB (Red, Green, Blue) color channels of the pixel point. Pixel values may be represented in binary code, and so may pixel differences.
The pixel difference may be the difference between the pixel values of two corresponding pixel points. Specifically, for a pixel point at a given position, the difference between the R channel of its pixel value in the target image and the R channel of the pixel value at the corresponding position of the current state image, together with the corresponding G-channel and B-channel differences, forms the pixel difference for that position. Repeating this for every position yields the pixel difference between each pixel point of the target image and the corresponding pixel point of the current state image.
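Steps S41 to S45 can be expressed compactly with NumPy. The per-channel threshold of 30 and the rule that every channel must fall below it are illustrative assumptions; the patent only speaks of a preset threshold:

```python
import numpy as np

def pixel_similarity(target, current, threshold=30):
    """Steps S41-S45: compute per-pixel RGB differences against a preset
    threshold, then return the ratio of "similar" pixel points over all
    pixel points (the similarity, representing the food doneness)."""
    # Cast to a signed type so the subtraction of uint8 images cannot wrap.
    diff = np.abs(target.astype(np.int16) - current.astype(np.int16))
    # A pixel point counts as similar when every colour channel differs by
    # less than the preset threshold (an illustrative criterion).
    similar = np.all(diff < threshold, axis=-1)
    return float(similar.mean())
```

Both images must be captured from the same position so that pixel points correspond, as the next paragraph notes.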
It should be noted that the target image and the current state image are captured by the image acquisition device from the same position, so that a pixel point at a given position of the target image corresponds to the pixel point at the same position of the current state image.
It should be added that there are many ways to obtain the similarity. For example, after step S41, histogram statistics can be used; alternatively, the same convolutional neural network can extract features from both the target image and the current state image, and the two feature sets can be compared to obtain the similarity; or the target image and the current state image can be input into a convolutional neural network that regresses the similarity between them directly.
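As a sketch of the histogram alternative mentioned above, the two images can be compared by normalised histogram intersection. The bin count and the intersection measure are assumptions for illustration; the text only names histogram statistics as one option:

```python
import numpy as np

def histogram_similarity(target, current, bins=32):
    """Histogram-based similarity: build normalised pixel-value histograms
    for both images and return their intersection (1.0 = identical
    distributions, 0.0 = disjoint)."""
    ht, _ = np.histogram(target, bins=bins, range=(0, 256))
    hc, _ = np.histogram(current, bins=bins, range=(0, 256))
    ht = ht / ht.sum()
    hc = hc / hc.sum()
    return float(np.minimum(ht, hc).sum())
```

Unlike the per-pixel comparison, this measure ignores pixel positions, so it tolerates small misalignments between captures.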
Further, referring to fig. 5, step S40 includes:
step S46, obtaining that the current food doneness is uncooked when the similarity is smaller than a preset doneness threshold;
and step S47, obtaining that the current food doneness is cooked when the similarity is larger than the preset doneness threshold.
In this way, the doneness of the food can be judged from the similarity.
Specifically, beyond judging the cooked or uncooked state from the similarity, further threshold ranges can be set to grade how far an uncooked food has progressed. For example, if the range 5%-10% is mapped to the initial cooking stage, a similarity of 7% means the food is currently at the initial cooking stage; if the range 50%-60% is mapped to the middle cooking stage, a similarity of 55% means the food is at the middle cooking stage, and so on.
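Steps S46/S47 plus the optional stage bands can be sketched as a small mapping function. The 0.9 doneness threshold is a hypothetical value; the 5%-10% and 50%-60% bands are the illustrative ranges from the text:

```python
def doneness_stage(similarity, done_threshold=0.9):
    """S46/S47 with the example sub-bands: at or above the doneness
    threshold the food counts as cooked; below it, the 5%-10% and 50%-60%
    similarity bands mark the initial and middle cooking stages."""
    if similarity >= done_threshold:
        return "cooked"
    if 0.05 <= similarity <= 0.10:
        return "initial cooking stage"
    if 0.50 <= similarity <= 0.60:
        return "middle cooking stage"
    return "not cooked"
```

So a similarity of 7% maps to the initial cooking stage and 55% to the middle cooking stage, matching the worked examples above.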
In general, it is sufficient for the cooking appliance 200 to determine whether the food is cooked: if it is, heating stops in time; if not, heating continues. It is therefore usually only necessary to determine the cooked or uncooked state from the similarity.
Referring to fig. 6, an embodiment of the invention provides a food doneness identification device 100, which includes an image acquisition module 10, an image processing module 20, and an image comparison module 30. The image acquisition module 10 acquires the initial state image and the current state image of the food. The image processing module 20 obtains a target image from the initial state image and a preset adversarial neural network model, where the target image represents the food at the required doneness. The image comparison module 30 compares the similarity between the target image and the current state image to obtain the food doneness.
With the food doneness identification device 100 of the embodiment of the invention, the preset adversarial neural network model generates a target image corresponding to the initial state image, so that once the current state image of the food is acquired, the doneness can be obtained from the similarity between the two. The doneness of many types of food can thus be judged; moreover, the similarity comparison occupies few computing resources, which reduces the cost of doneness judgment.
Referring to fig. 7, an embodiment of the invention provides an electric cooking appliance 200 including the food doneness recognition device 100.
With the cooking appliance 200 of the embodiment of the invention, the preset adversarial neural network model generates a target image corresponding to the initial state image, so that once the current state image of the food is acquired, the doneness can be obtained from the similarity between the two. The doneness of many types of food can thus be judged; moreover, the similarity comparison occupies few computing resources, which reduces the cost of doneness judgment.
Specifically, the cooking appliance 200 may be an oven, a stove, or the like. Taking an oven as an example, the cooking appliance 200 may further include a heating device, a door-switch detection device, a controller, and the like. The heating device heats the food in the oven, the door-switch detection device detects whether the oven door is open, and the controller controls the operation of the heating device, the door-switch detection device, the food doneness identification device 100, and so on.
The image acquisition module 10 may be electrically connected with the door detection device: when the door detection device detects that the oven door has been opened, i.e. when a user puts food into the oven through the door, the image acquisition module 10 is controlled to start working, acquiring images of the food in the cooking appliance 200 to obtain the initial state image. The heating device may be electrically connected with the image comparison module 30: when the image comparison module 30 obtains the food doneness and the controller judges that the food is cooked, the heating device can be controlled to stop heating, preventing the food from being overcooked.
The cooking appliance 200 may further include a switch detection device for detecting whether the cooking appliance 200 is in an operating state, i.e., detecting whether the cooking appliance 200 starts to cook. The image capturing module 10 may be electrically connected to the switch detection device, and when the switch detection device detects that the cooking appliance 200 starts to work, the image capturing module 10 may start to capture an image as an initial state image of the food.
The image capturing module 10 may be a camera mounted on the cooking appliance 200, or may be a camera mounted on another electrical appliance and electrically connected to the cooking appliance 200, which is not limited in this respect.
The controller may control the image capturing module 10 to capture an image every 5 seconds to update the current state image of the food, or every 10 seconds. Many time intervals are possible for the image capturing operation, for example 3 seconds, 20 seconds, or 1 minute, and the interval may be adjusted according to the user's habits, the power of the cooking appliance 200, the type of food, and the like; it is not specifically limited herein. It should be noted that the interval between image capturing operations should not be too long, to avoid the food being overcooked because the current state image was not updated in time.
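The periodic capture-and-compare behaviour described above can be sketched as a simple control loop. This is an illustrative sketch only, not part of the claimed embodiment: `capture`, `compare`, and `stop_heating` are hypothetical callables standing in for the image capturing module 10, the image comparison module 30, and the heating device.

```python
import time

def cook_until_done(capture, compare, stop_heating,
                    interval_s=5.0, doneness_threshold=0.9,
                    sleep=time.sleep):
    """Capture a frame every interval_s seconds, compare it against the
    target image, and stop heating once the doneness threshold is met."""
    while True:
        current_image = capture()            # image capturing module
        similarity = compare(current_image)  # image comparison module
        if similarity > doneness_threshold:
            stop_heating()                   # controller stops the heater
            return similarity
        sleep(interval_s)                    # configurable capture interval
```

The `interval_s` and `doneness_threshold` defaults are arbitrary illustrative values; as the description notes, the interval would in practice be tuned to the user's habits, the appliance power, and the food type.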
Referring to fig. 8, a server 300 is provided according to an embodiment of the present invention. The server 300 includes a communication module 40, an image processing module 20, and an image comparison module 30. The communication module 40 is configured to receive the initial state image of the food and the current state image of the food uploaded by the cooking appliance 200. The image processing module 20 is configured to obtain a target image from the initial state image and a preset adversarial neural network model, where the target image represents the appearance of the food at the required doneness. The image comparison module 30 is configured to compare the similarity between the target image and the current state image to obtain the food doneness, and to transmit the food doneness to the cooking appliance 200.
According to the server 300 of the embodiment of the invention, the preset adversarial neural network model generates a target image corresponding to the initial state image, so that once the current state image of the food is obtained, the current state of the food can be known from the similarity between the two images. In this way, the doneness of many types of food can be judged; in addition, the computing resources occupied by the similarity judgment can be reduced, lowering the cost of doneness judgment.
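The pixel-wise similarity comparison performed by the image comparison module 30 can be sketched as follows. This is a minimal illustration under assumed conventions: images are flat lists of grayscale pixel values, and the function names and threshold values are hypothetical rather than taken from the embodiment.

```python
def pixel_similarity(target_image, current_image, pixel_threshold=10):
    """Fraction of pixel positions whose values differ by less than
    pixel_threshold; this fraction serves as the similarity/doneness."""
    if len(target_image) != len(current_image):
        raise ValueError("images must have the same number of pixels")
    similar = sum(1 for t, c in zip(target_image, current_image)
                  if abs(t - c) < pixel_threshold)
    return similar / len(target_image)

def is_cooked(similarity, doneness_threshold=0.9):
    """Food counts as cooked once the similarity exceeds the threshold."""
    return similarity > doneness_threshold
```

For example, if 95 of 100 pixel positions agree within the tolerance, the similarity is 0.95, which a 0.9 doneness threshold would judge as cooked.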
Specifically, the communication module 40 may communicate with the cooking appliance 200 via Bluetooth, WiFi, or the like, or may communicate with the cooking appliance 200 through a wired electrical connection; this is not limited herein.
Referring to fig. 9, an embodiment of the invention provides a food doneness recognition system 500, which includes a cooking appliance 200 and a server 300. The cooking appliance 200 includes an image capturing module 10 and an image comparison module 30. The image capturing module 10 is configured to obtain an initial state image of the food and a current state image of the food, and to transmit both to the server 300. The image comparison module 30 is configured to compare the similarity between the target image and the current state image to obtain the food doneness. The server 300 includes an image processing module 20, which is configured to obtain the target image from the initial state image and a preset adversarial neural network model and to transmit the target image to the cooking appliance 200.
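The division of labour in the system 500 can be sketched as two cooperating roles: the server runs the preset model to produce the target image, and the appliance compares each newly captured frame against it. Everything below is a hypothetical stand-in; in particular, `server_generate_target` substitutes a trivial identity transform for the adversarial model.

```python
def server_generate_target(initial_image):
    """Server role: run the preset model on the initial state image to
    predict the food's appearance at the required doneness.
    Stubbed here as an identity transform."""
    return list(initial_image)

def appliance_judge_doneness(target_image, current_image, pixel_threshold=10):
    """Appliance role: pixel-wise similarity between the server-provided
    target image and the latest captured frame."""
    similar = sum(1 for t, c in zip(target_image, current_image)
                  if abs(t - c) < pixel_threshold)
    return similar / len(target_image)

# One round trip: the appliance uploads the initial image, the server
# returns the target image, and the appliance then compares each newly
# captured frame against that target locally.
initial = [100, 120, 140]
target = server_generate_target(initial)
doneness = appliance_judge_doneness(target, [100, 119, 200])
```

Keeping the model on the server and the comparison on the appliance matches the stated aim of reducing the computing resources the doneness judgment occupies on the appliance itself.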
According to the food doneness recognition system 500 of the embodiment of the invention, a target image corresponding to the initial state image is generated by the preset adversarial neural network model, so that once the current state image of the food is obtained, the current state of the food can be determined from the similarity between the two images. In this way, the doneness of many types of food can be judged; in addition, the computing resources occupied by the similarity judgment can be reduced, lowering the cost of doneness judgment.
Specifically, the cooking appliance 200 may be an oven, a stove, or the like. Taking the cooking appliance 200 as an oven as an example, the cooking appliance 200 may further include a heating device, a door switch detection device, a controller, and the like. The heating device is used for heating food in the oven, the oven door switch detection device is used for detecting whether the oven door of the oven is opened, and the controller is used for controlling the heating device, the oven door switch detection device, the food doneness recognition device 100 and the like to work.
The image capturing module 10 may be electrically connected to the door switch detection device. When the door switch detection device detects that the oven door is opened, that is, when a user puts food into the oven through the door, the image capturing module 10 is controlled to start working and captures an image of the food in the cooking appliance 200 to obtain an initial state image of the food. The heating device may be electrically connected to the image comparison module 30; when the image comparison module 30 obtains the food doneness and the controller judges that the food is cooked, the heating device can be controlled to stop heating, so that the food is prevented from being overcooked.
The image capturing module 10 may be a camera mounted on the cooking appliance 200, or may be a camera mounted on another electrical appliance and electrically connected to the cooking appliance 200, which is not limited in this respect.
The controller may control the image capturing module 10 to capture an image every 5 seconds to update the current state image of the food, or every 10 seconds. Many time intervals are possible for the image capturing operation, for example 3 seconds, 20 seconds, or 1 minute, and the interval may be adjusted according to the user's habits, the power of the cooking appliance 200, the type of food, and the like; it is not specifically limited herein. It should be noted that the interval between image capturing operations should not be too long, to avoid the food being overcooked because the current state image was not updated in time.
The communication module 40 may communicate with the cooking appliance 200 via Bluetooth, WiFi, or the like, or may communicate with the cooking appliance 200 through a wired electrical connection; this is not limited herein.
Embodiments of the present invention provide a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the food doneness identification method of any of the above embodiments.
The computer-readable storage medium of the embodiment of the invention generates a target image corresponding to the initial state image through the preset adversarial neural network model, so that once the current state image of the food is obtained, the state of the food can be known from the similarity between the two images. In this way, the doneness of many types of food can be judged; in addition, the computing resources occupied by the similarity judgment can be reduced, lowering the cost of doneness judgment.
The computer-readable medium may be provided in the cooking appliance 200 or in the server. The cooking appliance 200 can communicate with the server to obtain the corresponding program. It will be appreciated that the computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable storage medium may include: any entity or device capable of carrying computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), a software distribution medium, and the like.
A computer-readable storage medium may be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with an instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection having one or more wires (an electronic device), a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). The computer-readable storage medium may even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance by optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be noted that the above description of the embodiments and the advantageous effects of the food doneness identification method is also applicable to the food doneness identification device 100, the cooking appliance 200, the server 300, the food doneness identification system 500 and the computer readable medium of the embodiments of the present invention, and is not detailed herein to avoid redundancy.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present invention.
Claims (10)
1. A method for recognizing a degree of doneness of a food, comprising:
acquiring an initial state image of food;
obtaining a target image according to the initial state image and a preset adversarial neural network model, wherein the target image represents an image of the food when the food reaches the required doneness;
acquiring a current state image of the food;
and comparing the similarity of the target image and the current state image to obtain the food doneness.
2. The food doneness recognition method according to claim 1, wherein the obtaining a target image according to the initial state image and a preset adversarial neural network model comprises:
acquiring characteristic information of the initial state image according to the initial state image and the preset adversarial neural network model;
and obtaining the target image according to the characteristic information and the preset adversarial neural network model.
3. The food doneness recognition method according to claim 1, characterized in that the preset adversarial neural network model includes several sub-neural network models corresponding to different types of food,
the obtaining of the target image according to the initial state image and the preset adversarial neural network model comprises:
searching the preset adversarial neural network model according to the initial state image to obtain a corresponding sub-neural network model;
inputting the initial state image into the sub-neural network model to obtain the characteristic information of the initial state image;
and obtaining the target image according to the characteristic information and the sub-neural network model.
4. The method for recognizing the degree of doneness of food according to claim 1, wherein the comparing the degree of similarity between the target image and the current state image to obtain the degree of doneness of food comprises:
acquiring pixel values corresponding to all pixel points of the target image and pixel values corresponding to all pixel points of the current state image;
calculating the difference between the pixel value of the pixel point of the target image and the pixel value of the pixel point of the current state image to obtain a pixel difference value;
determining that the two corresponding pixel points are similar under the condition that the pixel difference value is smaller than a preset threshold value;
determining that the two corresponding pixel points are not similar under the condition that the pixel difference value is larger than the preset threshold value;
and calculating the proportion of similar pixel points to all pixel points to obtain the similarity, wherein the similarity represents the food doneness.
5. The food doneness recognition method according to claim 4, wherein the comparing the similarity of the target image and the current state image to obtain the food doneness further comprises:
determining that the food is not yet cooked when the similarity is smaller than a preset doneness threshold;
and determining that the food is cooked when the similarity is greater than the preset doneness threshold.
6. A food doneness recognition apparatus, characterized by comprising:
the image acquisition module is used for acquiring an initial state image of food and acquiring a current state image of the food;
the image processing module is used for obtaining a target image according to the initial state image and a preset adversarial neural network model, wherein the target image represents an image of the food when the food reaches the required doneness;
and the image comparison module is used for comparing the similarity between the target image and the current state image to obtain the food doneness.
7. A cooking appliance, characterized in that it comprises the food doneness recognition device of claim 6.
8. A server, characterized in that the server comprises:
the communication module is used for receiving an initial state image of food uploaded by the cooking appliance and a current state image of the food;
the image processing module is used for obtaining a target image according to the initial state image and a preset adversarial neural network model, wherein the target image represents an image of the food when the food reaches the required doneness;
and the image comparison module is used for comparing the similarity between the target image and the current state image to obtain the food doneness, so as to transmit the food doneness to the cooking appliance.
9. A food doneness identification system is characterized in that the food doneness identification system comprises a cooking appliance and a server,
the cooking appliance comprises an image acquisition module and an image comparison module, wherein the image acquisition module is used for acquiring an initial state image of food and a current state image of the food so as to transmit the initial state image and the current state image to the server, and the image comparison module is used for comparing the similarity between the target image and the current state image to obtain the food doneness;
the server comprises an image processing module, and the image processing module is used for obtaining the target image according to the initial state image and a preset adversarial neural network model, so as to transmit the target image to the cooking appliance.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the food doneness identification method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110993333.9A CN113723498A (en) | 2021-08-26 | 2021-08-26 | Food maturity identification method, device, system, electric appliance, server and medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113723498A true CN113723498A (en) | 2021-11-30 |
Family
ID=78678357
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110993333.9A Pending CN113723498A (en) | 2021-08-26 | 2021-08-26 | Food maturity identification method, device, system, electric appliance, server and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113723498A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114224189A (en) * | 2021-12-21 | 2022-03-25 | 珠海格力电器股份有限公司 | Cooking equipment control method and device and cooking equipment |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106123479A (en) * | 2016-07-06 | 2016-11-16 | 辽宁科技学院 | The food materials method for automatically inputting of refrigerator |
CN108280474A (en) * | 2018-01-19 | 2018-07-13 | 广州市派客朴食信息科技有限责任公司 | A kind of food recognition methods based on neural network |
CN109886926A (en) * | 2019-01-22 | 2019-06-14 | 东喜和仪(珠海市)数据科技有限公司 | Fresh food quality determining method and device based on image recognition |
CN109998360A (en) * | 2019-04-11 | 2019-07-12 | 上海长膳智能科技有限公司 | A kind of method and apparatus for automatic cooking food |
CN110287851A (en) * | 2019-06-20 | 2019-09-27 | 厦门市美亚柏科信息股份有限公司 | A kind of target image localization method, device, system and storage medium |
US20200019861A1 (en) * | 2019-08-26 | 2020-01-16 | Lg Electronics Inc. | Method for controlling cook based on artificial intelligent and intelligent device |
CN110909833A (en) * | 2019-11-28 | 2020-03-24 | 广东美的厨房电器制造有限公司 | Food cooking method, equipment and computer readable storage medium |
CN110956217A (en) * | 2019-12-06 | 2020-04-03 | 广东美的白色家电技术创新中心有限公司 | Food maturity recognition method and device and computer storage medium |
US20200334628A1 (en) * | 2019-04-19 | 2020-10-22 | Zume Inc. | Food fulfillment with user selection of instances of food items and related systems, articles and methods |
CN111950414A (en) * | 2020-07-31 | 2020-11-17 | 广州微林软件有限公司 | Cabinet food identification system and identification method |
US20210044871A1 (en) * | 2018-04-17 | 2021-02-11 | Samsung Electronics Co., Ltd. | Display device and display device control method |
CN113194792A (en) * | 2018-10-15 | 2021-07-30 | 广东美的厨房电器制造有限公司 | System and method for capturing and annotating cooking images to train an intelligent cooking appliance, locate food in an intelligent cooking appliance, and determine cooking progress of food in an intelligent cooking appliance |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107468048B (en) | Cooking appliance and control method thereof | |
CN105380575B (en) | Control method, system, Cloud Server and the sweeping robot of sweeping robot | |
CN110123149B (en) | Cooking control method of cooking equipment and cooking equipment | |
CN106570028B (en) | Mobile terminal and method and device for deleting blurred image | |
CN107752794B (en) | Baking method and device | |
US12094228B2 (en) | Method of identifying level of doneness of food, device, and computer storage medium | |
CN107991939A (en) | Cooking control method and culinary art control device, storage medium and cooking equipment | |
CN110234040B (en) | Food material image acquisition method of cooking equipment and cooking equipment | |
CN107743224A (en) | The dirty based reminding method of camera lens, system, readable storage medium storing program for executing and mobile terminal | |
CN111752170B (en) | Intelligent cooking method and device | |
CN108154086A (en) | A kind of image extraction method, device and electronic equipment | |
CN110222720A (en) | A kind of cooking equipment with short video acquisition function | |
CN110857831A (en) | Method and device for controlling temperature of refrigerator | |
CN110657617A (en) | Defrosting control method and device for refrigerator and refrigerator | |
CN113723498A (en) | Food maturity identification method, device, system, electric appliance, server and medium | |
CN112741508A (en) | Control method of cooking equipment and cooking equipment | |
CN110989464B (en) | Cooking method and device based on cooking curve, storage medium and cooking equipment | |
CN113676706A (en) | Cooking video generation method and device, server and control system | |
CN116708907A (en) | Menu generation method, apparatus, device, storage medium, and program product | |
CN110664233B (en) | Control method and device for steaming and baking oven | |
CN110181503B (en) | Anomaly detection method and device, intelligent equipment and storage medium | |
CN113838004A (en) | Food baking control method and device, storage medium and electronic device | |
CN111666961B (en) | Intelligent household appliance, method and device for identifying food material type of intelligent household appliance and electronic equipment | |
CN111723278A (en) | Menu recommendation method, device, recommendation system and related equipment | |
CN116406958A (en) | Prompt method and device for cooking equipment and cooking equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||