CN112464013B - Information pushing method and device, electronic equipment and storage medium - Google Patents
Information pushing method and device, electronic equipment and storage medium
- Publication number
- CN112464013B (application CN202011408345.2A)
- Authority
- CN
- China
- Prior art keywords
- target
- food material
- information
- food
- menus
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/53—Querying
- G06F16/538—Presentation of query results
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The application provides an information pushing method and device, an electronic device and a storage medium, where the method includes: matching food material image information against three-dimensional food material models in a three-dimensional food material model library, where the food material image information is obtained by scanning a target food material and each three-dimensional food material model corresponds to one food material; when a target food material model matching the food material image information is acquired, acquiring a plurality of target menus corresponding to the target food material from a menu library according to target object information corresponding to a target object; and pushing target related information of the target food material to a target client for display, where the target related information includes the target food material model and the plurality of target menus. The method and device solve the problems, in related-art menu pushing, of low efficiency in acquiring effective information, long browsing time and high energy consumption.
Description
Technical Field
The present application relates to the field of computers, and in particular, to a method and apparatus for pushing information, an electronic device, and a storage medium.
Background
Currently, a user may obtain recipe information through an intelligent App (Application) and then cook according to that information. The smart menu function in such an App is presented to the user mainly as text, pictures or video, and the user obtains the effective menu information mainly by reading, so the efficiency of acquiring effective information is low while browsing time and energy consumption are high.
Therefore, the menu pushing mode in the related art suffers from low efficiency in acquiring effective information, long browsing time and high energy consumption.
Disclosure of Invention
The application provides an information pushing method and device, an electronic device and a storage medium, which at least solve the problems of low efficiency in acquiring effective information, long browsing time and high energy consumption of the menu pushing mode in the related art.
According to an aspect of an embodiment of the present application, there is provided an information pushing method, including: matching food material image information with three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is obtained by scanning a target food material and each three-dimensional food material model corresponds to one food material; when a target food material model matching the food material image information is acquired, acquiring a plurality of target menus corresponding to the target food material from a menu library according to target object information corresponding to a target object; and pushing the target related information of the target food material to a target client for display, wherein the target related information includes the target food material model and the plurality of target menus.
Optionally, acquiring, according to the target object information corresponding to the target object, the plurality of target menus corresponding to the target food material from the menu library includes: determining the target object information corresponding to the target object, wherein the target object information is used for representing at least one of the following of the target object: dish preferences, dishes that have been queried, physiological characteristics; and acquiring the plurality of target menus corresponding to the target food material from the menu library according to the target object information, wherein the plurality of target menus are ordered according to their degree of match with the target object information.
Optionally, acquiring, according to the target object information corresponding to the target object, the plurality of target menus corresponding to the target food materials from the menu library includes: when there are a plurality of target objects and a plurality of target food materials, determining target object information corresponding to the plurality of target objects, wherein the target object information is used for representing at least one of the following of the plurality of target objects: dish preferences, dishes that have been queried, the number of diners, physiological characteristics; and acquiring a plurality of target menus corresponding to the plurality of target food materials from the menu library according to the target object information, wherein each of the plurality of target menus comprises a combination of a plurality of menus, and each target food material is used by at least one menu in the combination.
Optionally, before pushing the target related information of the target food material to the target client for display, the method further includes at least one of the following: acquiring menu cooking information corresponding to each of the plurality of target menus, wherein the menu cooking information is used for representing the cooking mode of each target menu, and the target related information comprises the menu cooking information; and acquiring food material reference information corresponding to the target food material, wherein the food material reference information is used for representing at least one of the following: the freshness of the target food material and the nutrition parameters of the target food material, and the target related information comprises the food material reference information.
Optionally, before matching the food material image information with the three-dimensional food material models in the three-dimensional food material model library, the method further includes: performing Augmented Reality (AR) scanning on the target food material through the target client to obtain the food material image information; and sending the food material image information to a target server, wherein the target server is a server for performing three-dimensional food material model matching.
Optionally, after pushing the target related information of the target food material to the target client for display, the method further includes: acquiring a real-time image of the target food material through the image acquisition device of the target client, wherein the target client displays the real-time image of the target food material; displaying the target food material model on the real-time image of the target food material, wherein the position and size of the target food material model are adjusted according to the position and size of the real-time image of the target food material; and displaying the plurality of target menus on the target client, wherein the plurality of target menus do not overlap the target food material model.
Optionally, after displaying the target food material model on the real-time image of the target food material, the method further comprises: detecting an adjustment operation performed on the target food material model, wherein the adjustment operation is used for adjusting the display angle of the target food material model; responding to the adjustment operation, and adjusting the display angle of the target food material model; and displaying the target food material model with the adjusted angle on the target client.
According to another aspect of the embodiment of the present application, there is also provided an information pushing apparatus, including: the matching unit is used for matching the food material image information with three-dimensional food material models in the three-dimensional food material model library, wherein the food material image information is obtained by scanning a target food material, and each three-dimensional food material model corresponds to one food material; a first acquisition unit configured to acquire, when a target food material model matching the food material image information is acquired, a plurality of target recipes corresponding to the target food materials from a recipe library according to target object information corresponding to a target object; the pushing unit is used for pushing the target associated information of the target food materials to a target client for display, wherein the target associated information comprises the target food material model and the plurality of target menus.
Optionally, the first acquisition unit includes: a first determining module, configured to determine the target object information corresponding to the target object, where the target object information is used to represent at least one of the following target objects: dish preference, dish which has been queried, physiological characteristics; the first acquisition module is used for acquiring the plurality of target recipes corresponding to the target food materials from the recipe library according to the target object information.
Optionally, the first acquisition unit includes: a second determining module, configured to determine, when the target objects are plural and the target food materials are plural, target object information corresponding to the plural target objects, where the target object information is used to represent at least one of the plural target objects: dish preference, inquired dishes, the number of people eating, physiological characteristics; the second obtaining module is used for obtaining a plurality of target recipes corresponding to a plurality of target food materials from the recipe library according to the target object information, wherein each recipe in the plurality of target recipes comprises a combination of a plurality of recipes, and each target food material is used by at least one recipe in the plurality of recipes.
Optionally, the apparatus further comprises at least one of: the second obtaining unit is used for obtaining recipe cooking information corresponding to each target recipe in the plurality of target recipes before pushing the target related information of the target food materials to a target client for display, wherein the recipe cooking information is used for representing the cooking mode of each target recipe, and the target related information comprises the recipe cooking information; a third acquisition unit, configured to acquire food material reference information corresponding to the target food material, where the food material reference information is used to represent at least one of: the freshness of the target food material, the nutrition parameter of the target food material, and the target associated information comprises the food material reference information.
Optionally, the apparatus further comprises: the scanning unit is used for carrying out Augmented Reality (AR) scanning on the target food materials through the target client before the food material image information is matched with the three-dimensional food material models in the three-dimensional food material model library, so as to obtain the food material image information; and the sending unit is used for sending the food material image information to a target server, wherein the target server is a server for matching the three-dimensional food material model.
Optionally, the apparatus further comprises: the acquisition unit is used for acquiring real-time images of the target food materials through image acquisition equipment of the target client after the target related information of the target food materials is pushed to the target client for display, wherein the target client displays real-time images of the target food materials; the first display unit is used for displaying the target food material model on the real-time image of the target food material, wherein the position and the size of the target food material model are adjusted according to the position and the size of the real-time image of the target food material; and the second display unit is used for displaying the target menus on the target client, wherein the target menus are not overlapped with the target food material model.
Optionally, the apparatus further comprises: the detecting unit is used for detecting the adjustment operation executed by the target food material model after the target food material model is displayed on the real-time image of the target food material, wherein the adjustment operation is used for adjusting the display angle of the target food material model; the adjusting unit is used for responding to the adjusting operation and adjusting the display angle of the target food material model; and the third display unit is used for displaying the target food material model with the adjusted angle on the target client.
According to still another aspect of the embodiments of the present application, there is provided an electronic device including a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory complete communication with each other through the communication bus; wherein the memory is used for storing a computer program; a processor for performing the method steps of any of the embodiments described above by running the computer program stored on the memory.
According to a further aspect of the embodiments of the present application there is also provided a computer readable storage medium having stored therein a computer program, wherein the computer program is arranged to perform the method steps of any of the embodiments described above when run.
In the embodiments of the application, a menu is matched using the user's information and the food material model and menus are pushed to the user. Food material image information is matched with the three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is image information obtained by scanning a target food material and each three-dimensional food material model corresponds to one food material; when a target food material model matching the food material image information is acquired, a plurality of target menus corresponding to the target food material are acquired from a menu library according to target object information corresponding to a target object; and target related information of the target food material, including the target food material model and the plurality of target menus, is pushed to the target client for display. Because the food material is recognized and the corresponding three-dimensional food material model is obtained, the effective information of the food material can be presented visually, and because the user's information is used to match the menus, the degree of matching between the pushed menus and the user can be improved. This achieves the technical effects of improving the efficiency of acquiring effective information, shortening the user's browsing time and reducing the user's energy consumption, and thereby solves the problems of low efficiency in acquiring effective information, long browsing time and high energy consumption in the menu pushing mode of the related art.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a hardware environment of an alternative information push method according to an embodiment of the present invention;
FIG. 2 is a flow chart of an alternative method for pushing information according to an embodiment of the present application;
FIG. 3 is a flow chart of another alternative information pushing method according to an embodiment of the present application;
FIG. 4 is a block diagram of an alternative information pushing device according to an embodiment of the present application;
Fig. 5 is a block diagram of an alternative electronic device in accordance with an embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, a technical solution in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, shall fall within the scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to one aspect of the embodiment of the application, a method for pushing information is provided. Alternatively, in the present embodiment, the above-described information pushing method may be applied to a hardware environment configured by the terminal 102 and the server 104 as shown in fig. 1. As shown in fig. 1, the server 104 is connected to the terminal 102 through a network, and may be used to provide services (such as game services, application services, etc.) to the terminal or clients installed on the terminal, and a database may be provided on the server or independent of the server, for providing data storage services to the server 104.
The network includes, but is not limited to, at least one of: a wired network, a wireless network, which may include, but is not limited to, at least one of: a wide area network, metropolitan area network, or local area network, which may include, but is not limited to, at least one of the following: bluetooth, WIFI, and other networks that enable wireless communications. The terminal 102 may be a terminal for calculating data, such as a mobile terminal (e.g., a mobile phone, a tablet computer), a notebook computer, a PC, etc. The server may include, but is not limited to, any hardware device that can perform the calculations.
The pushing method of the information in the embodiment of the present application may be performed by the server 104, may be performed by the terminal 102, or may be performed by both the server 104 and the terminal 102. The pushing method of the information performed by the terminal 102 according to the embodiment of the present application may also be performed by a client installed thereon.
Taking the server 104 as an example to execute the method for pushing information in this embodiment, fig. 2 is a schematic flow chart of an alternative method for pushing information according to an embodiment of the present application, as shown in fig. 2, the flow of the method may include the following steps:
step S202, matching is performed on food material image information and three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is image information obtained by scanning target food materials, and each three-dimensional food material model corresponds to one food material.
The pushing method of the information in the embodiment may be applied to a scenario of pushing food related information to a user, where the food related information may include, but is not limited to, at least one of the following: a three-dimensional model of food (three-dimensional food model), food recipe information, information of cooking modes of food recipes, food nutrition parameters and the like.
A client of a target application can be run on a terminal device of a user, and the client can be in communication connection with a server, and the server can be a server for providing a smart menu. The user can log in to the client running on the terminal equipment by using account numbers, passwords, dynamic passwords, associated application login and other modes. Alternatively, the user may also directly use the functions provided by the client without logging in.
The target user may use a target client of the target account number to log into a target application running on the target terminal device. The target client can call the image acquisition equipment of the target terminal equipment to scan so as to obtain food material image information. For example, the smart App (i.e., the target application) may call a camera on the smart terminal to capture or scan the food material, resulting in image information of the food material. Optionally, the target client may also directly read the food image information stored on the target terminal device.
The target client can send the food material image information to the cloud server through the network so as to process and analyze big data in the background cloud. The cloud server or a database connected with the cloud server can be provided with a three-dimensional food material model library. The three-dimensional food material model library may contain a plurality of three-dimensional food material models, each of which may correspond to one food material.
After receiving the food material image information sent by the target client, the cloud server can use the food material image information to match with the three-dimensional food material models in the three-dimensional food material model library to determine whether a three-dimensional food material model matched with the food material image information (the represented food material) exists.
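The application does not fix a particular matching algorithm. As a minimal sketch, one could index each three-dimensional food material model by a feature vector and match the scanned image against that index by cosine similarity; the feature extraction step, the function names and the threshold below are illustrative assumptions, not part of the patent.

```python
from typing import Dict, Optional
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed cut-off, not specified in the application

def match_food_material_model(image_feature: np.ndarray,
                              model_index: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the id of the best-matching 3D food material model, or None if no match."""
    best_id, best_score = None, -1.0
    for model_id, model_feature in model_index.items():
        # cosine similarity between the scanned-image feature and the model feature
        score = float(np.dot(image_feature, model_feature)
                      / (np.linalg.norm(image_feature) * np.linalg.norm(model_feature)))
        if score > best_score:
            best_id, best_score = model_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None
```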
In step S204, when a target food material model matching the food material image information is acquired, a plurality of target recipes corresponding to the target food material are acquired from the recipe library according to the target object information corresponding to the target object.
A recipe library may be configured on the cloud server or in a database connected to the cloud server. Each recipe in the recipe library may correspond to one or more food materials and to corresponding tag information, e.g., the cuisine it belongs to, suitable groups of people, suitable scenarios, etc.
If a three-dimensional food material model matching the food material image information is acquired, i.e., a target food material model, the cloud server may acquire a plurality of target menus corresponding to the target food material from a menu library according to target object information corresponding to a target object. The target object information is used to characterize a target object, which may be an object corresponding to the user of the target client, for example, the object corresponding to the target account (the target user) or the object corresponding to the terminal identifier of the target terminal device, or another object, for example, an object associated with the target user.
The target object information may be acquired in various ways, for example, with the authorization of the target user, from object information entered in the target application, from search records, or from object-related information obtained from applications associated with the target application. For another example, when performing a menu search, the user may enter or select the object information (which may be the user information of the user themselves or of another user).
As an example, a user may enter or select a user profile in the smart App, may enter or select the user profiles of other associated users (e.g., family members), may enter dish names, food material names and the like as search terms when searching for menus, and may also enter or select the number of diners and the user profiles of different diners at the time of searching. The smart App may collect the operations performed by the user on the App, thereby obtaining user data, search records and the like, and send them to the cloud server for storage.
The cloud server may acquire, from the recipe library, a plurality of target recipes corresponding to the target food material according to the target object information corresponding to the target object, where the food materials used by each of the plurality of target recipes include the target food material. For example, the cloud server may match the target object information and the target food material against the tags of each recipe in the recipe library, thereby obtaining a plurality of target recipes, with the tags of each target recipe matching at least one piece of the target object information.
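The retrieval rule just described can be illustrated with a minimal sketch; the dictionary field names (food_materials, tags) are assumptions for illustration and are not specified in the application.

```python
def select_target_recipes(recipe_library, target_food_material, target_object_info):
    """Keep recipes that use the target food material and whose tags match
    at least one piece of target object information."""
    selected = []
    for recipe in recipe_library:
        uses_target_food = target_food_material in recipe["food_materials"]
        has_matching_tag = bool(set(recipe["tags"]) & set(target_object_info))
        if uses_target_food and has_matching_tag:
            selected.append(recipe)
    return selected
```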
Step S206, pushing target related information of the target food materials to the target client for display, wherein the target related information comprises a target food material model and a plurality of target menus.
After obtaining the plurality of target menus, the cloud server may push target associated information of the target food material to the target client. The target associated information may include the target food material model, so that the three-dimensional model of the target food material can be displayed through the target client, the user can experience interaction between the food material model and the real scene, and the user can get to know the food material more fully, intuitively and conveniently on the smart App side.
The target associated information may further include the plurality of target menus. Sequence identifiers carried by the different menus may indicate the ordering of the plurality of target menus, that is, their display order, so that the information most likely to be useful can be presented to the user in a form that is easier to obtain, thereby improving the efficiency of acquiring effective information and reducing the user's browsing time and energy consumption.
After receiving the target associated information, the target client may display the different pieces of target associated information simultaneously or in a certain order; for example, the target food material model may be displayed first and the plurality of target menus displayed afterwards (in which case the target food material model and the plurality of target menus may be displayed simultaneously, or only the plurality of target menus may be displayed).
Optionally, the target client may display the target food material model and multiple target recipes simultaneously, the target food material model may be displayed in a first area of the target client, and the multiple target recipes are displayed in a second area of the target client according to a sequence, or may display one recipe on the target food material model, and the displayed recipes may be switched by operating the target food material model, or may be displayed in other display modes, which is not limited in this embodiment.
Through the above steps S202 to S206, food material image information is matched with the three-dimensional food material models in a three-dimensional food material model library, where the food material image information is image information obtained by scanning a target food material and each three-dimensional food material model corresponds to one food material; when a target food material model matching the food material image information is acquired, a plurality of target menus corresponding to the target food material are acquired from a menu library according to target object information corresponding to a target object; and target related information of the target food material, including the target food material model and the plurality of target menus, is pushed to the target client for display. This solves the problems of low efficiency in acquiring effective information, long browsing time and high energy consumption in the menu pushing mode of the related art, improves the efficiency of acquiring effective information, shortens the user's browsing time and reduces the user's energy consumption.
As an alternative embodiment, acquiring a plurality of target recipes corresponding to target food materials from a recipe library according to target object information corresponding to a target object includes:
S11, determining the target object information corresponding to the target object, wherein the target object information is used for representing at least one of the following of the target object: dish preferences, dishes that have been queried, physiological characteristics;
S12, acquiring a plurality of target menus corresponding to the target food material from a menu library according to the target object information, wherein the plurality of target menus are ordered according to their degree of match with the target object information.
The cloud server may obtain target object information of the target object, where the target object information may be used to characterize at least one of the following of the target object: physiological characteristics of the target subject (e.g., gender, height, age, weight, etc.), usage habits, life preferences, etc. For example, if there is one target object, the target object information may be used to characterize at least one of a dish preference, a queried dish, a physiological characteristic of the target object.
According to the target object information, the cloud server can acquire the target menus corresponding to the target food materials, and the target menus can be ordered according to the matching degree with the target object information. The degree of matching of each of the plurality of target recipes with the target object may be represented by the number of tags matching the target object information among all the tags of each recipe.
Alternatively, different pieces of object information may have different degrees of importance; for example, the user's dish preferences and queried dishes may carry higher importance while physiological characteristics carry lower importance, and different queried dishes may themselves carry different importance depending on how they were queried. The degree of matching between a recipe and the target object information may then be computed as a weighted sum over the matched tags (pieces of object information).
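A minimal sketch of this weighted ranking, assuming the target object information is a mapping from information type to value and each recipe carries a flat tag list; the weight values and field names are illustrative assumptions only.

```python
# Illustrative weights: dish preference and queried dishes weigh more than
# physiological characteristics, as suggested above.
TAG_WEIGHTS = {"dish_preference": 3.0, "queried_dish": 2.0, "physiological": 1.0}

def rank_by_matching_degree(recipes, target_object_info):
    """target_object_info maps an information type to its value,
    e.g. {"dish_preference": "spicy", "physiological": "low_sugar"}."""
    def matching_degree(recipe):
        # weighted sum over the pieces of object information matched by the recipe's tags
        return sum(TAG_WEIGHTS.get(info_type, 1.0)
                   for info_type, value in target_object_info.items()
                   if value in recipe["tags"])
    return sorted(recipes, key=matching_degree, reverse=True)
```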
According to the embodiment, different object information is subjected to menu matching, and the menus are ordered according to the matching degree of the menus and the object information, so that the efficiency of acquiring effective information can be improved, the browsing time of a user is shortened, and the energy consumption of the user is further reduced.
As an alternative embodiment, acquiring a plurality of target recipes corresponding to target food materials from a recipe library according to target object information corresponding to a target object includes:
S21, when there are a plurality of target objects and a plurality of target food materials, determining target object information corresponding to the plurality of target objects, wherein the target object information is used to represent at least one of the following of the plurality of target objects: dish preferences, dishes that have been queried, the number of diners, physiological characteristics;
S22, acquiring a plurality of target recipes corresponding to a plurality of target food materials from a recipe library according to the target object information, wherein each recipe in the plurality of target recipes comprises a combination of the plurality of recipes, and each target food material is used by at least one recipe in the plurality of recipes.
When there are multiple diners, the eating-habit information of each diner in the dining group can be obtained, and the information of all diners can be combined to match a recommended menu that best integrates the dining preferences of the whole group.
If there are multiple diners (target objects), the cloud server can determine the target object information of the multiple target objects. In this case, the target object information includes not only the object information of each target object (e.g., dish preferences, queried dishes, physiological characteristics) but also group information of the multiple target objects as a whole (e.g., the number of diners). Optionally, the cloud server may obtain the number of dishes to be cooked when it obtains the food material image information from the target client.
To meet the dining demands of multiple people, a plurality of dishes need to be cooked. If there is one target food material, it is used as the main food material and one or more menus conforming to the eating habits of the multiple target objects are determined; for example, the dish preferences and physiological characteristics of the multiple target objects may be combined, menus conforming to them may be matched from the menu library, and the matched menus may then be ordered according to how many times they have been queried.
After obtaining one or more recipes conforming to the eating habits of the multiple target objects, the cloud server may obtain, from the recipe library and according to the target object information, recipes that match each of those recipes, thereby obtaining a plurality of recipe combinations, that is, the plurality of target recipes. Recipes may be matched with one another according to the nutrition-related parameters of the food materials they use, or according to preset recipe combinations. If the number of dishes to be cooked is specified, the number of dishes contained in a target recipe is the same as that number; if it is not specified, the number of dishes contained in a target recipe matches (e.g., equals or is comparable to) the number of diners.
If the target food materials are multiple, the multiple food materials can be scanned sequentially or simultaneously, and each scanned food material image can contain at least one food material. The cloud server can match recipes according to single food materials or combinations of at least two food materials, obtain one or more recipes, and obtain a plurality of recipe combinations in a similar manner.
In addition, for one target menu, in order to ensure that all food materials can be used, the cloud server can control each target food material to be used by at least one menu in a plurality of menus contained in the target menu. In order to ensure balanced nutrition, the target recipe may also include a recipe that does not use any of the target food materials. The specific recipe configuration scheme may be set as required, which is not limited in this embodiment.
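One possible combination rule under these constraints is sketched below: pick as many recipes as there are dishes to cook and check that every scanned target food material is used by at least one recipe in the combination. The exhaustive search and the field names are illustrative assumptions, since the application leaves the concrete combination strategy open.

```python
from itertools import combinations
from typing import List, Optional

def build_recipe_combination(candidates: List[dict],
                             target_food_materials: List[str],
                             dish_count: int) -> Optional[List[dict]]:
    """Return the first combination of dish_count recipes in which every
    target food material is used by at least one recipe."""
    for combo in combinations(candidates, dish_count):
        used = set()
        for recipe in combo:
            used.update(recipe["food_materials"])
        if set(target_food_materials) <= used:
            return list(combo)
    return None  # no combination covers every target food material
```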
According to this embodiment, a recommended menu that integrates the dining preferences of the whole user group is matched based on the eating habits of the dining group and the dining information (such as dish preferences, physiological characteristics, etc.) of the different users, which improves how well the recommended menu matches the users and improves the user experience.
As an optional embodiment, before pushing the target associated information of the target food material to the target client for display, the method further includes at least one of the following:
S31, acquiring menu cooking information corresponding to each target menu in a plurality of target menus, wherein the menu cooking information is used for representing the cooking modes of each target menu, and the target associated information comprises menu cooking information;
S32, acquiring food material reference information corresponding to a target food material, wherein the food material reference information is used for representing at least one of the following: the freshness of the target food material, the nutrition parameter of the target food material, and the target related information comprise food material reference information.
In addition to the target food material model and the plurality of target recipes, the target-related information may include recipe cooking information of each target recipe and food material reference information of the target food material.
The cloud server may acquire the cooking information of each target menu, which is used to represent the cooking mode of that menu. The cooking mode of each target menu may be given as cooking images or cooking videos segmented by cooking step, so that the user can conveniently jump between different cooking steps by clicking or by other trigger operations and easily obtain the cooking image or video corresponding to the current step.
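A minimal sketch of how such step-segmented cooking information could be organized so the client can jump directly to the clip for the current step; the class and field names are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CookingStep:
    index: int
    instruction: str
    clip_url: str          # image or video clip covering only this step

@dataclass
class RecipeCookingInfo:
    recipe_id: str
    steps: List[CookingStep]

    def clip_for_step(self, index: int) -> str:
        """Let the client jump straight to the clip for the requested step."""
        return self.steps[index].clip_url
```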
The cloud server may analyze the food material image information to determine food material reference information for the target food material, which may include, but is not limited to, information representing at least one of: freshness of the target food material, and nutritional parameters of the target food material.
For example, the cloud server may search for the nutritional parameters of the target food material through the search engine, or may match the nutritional parameter information of the target food material from a database storing the nutritional parameters of different food materials.
For another example, the cloud server may extract image features from the food material image information and determine the freshness of the target food material from those features. This can be implemented with a convolutional neural network model: a feature map of the food material image information is extracted by convolutional layers, the probability of each freshness level of the target food material is determined by a fully connected layer, and the freshness of the target food material is then decided. The freshness of the target food material may also be determined in other ways (for example, by matching against reference image information); this embodiment does not limit the way freshness is determined.
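A minimal sketch of such a convolutional freshness classifier, written here with PyTorch as an assumption; the layer sizes and the three freshness grades are illustrative and not taken from the application.

```python
import torch
import torch.nn as nn

class FreshnessNet(nn.Module):
    def __init__(self, num_grades: int = 3):  # e.g. fresh / acceptable / stale (assumed grades)
        super().__init__()
        # convolutional layers extract a feature map from the scanned image
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # fully connected layer maps the pooled features to freshness grades
        self.classifier = nn.Linear(32, num_grades)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        x = self.features(image).flatten(1)
        return torch.softmax(self.classifier(x), dim=1)  # probability per freshness grade
```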
According to the embodiment, the cooking mode of the menu and/or the reference information of the food materials are obtained and pushed to the client, so that the comprehensiveness of information obtaining can be improved, and the use experience of a user can be improved.
As an alternative embodiment, before matching the three-dimensional food material model in the three-dimensional food material model library using the food material image information, the method further comprises:
S41, performing Augmented Reality (AR) scanning on a target food material through a target client to obtain food material image information;
S42, sending the food material image information to a target server, wherein the target server is a server for performing three-dimensional food material model matching.
The food material image information may be obtained by performing AR (Augmented Reality) scanning and recognition on the target food material. The user may tap a specific control on the target client, for example an "AR scan" control, to start the AR scanning function of the target client, which performs AR scanning by calling an image acquisition device (for example, a camera) to obtain the food material image information. Optionally, if there are multiple target food materials, they may be scanned together or separately, yielding multiple pieces of food material image information.
For food material image information obtained by AR scanning, the target client can send the food material image information to a target server (for example, the cloud server) through a network, and the target server is a server for controlling three-dimensional food material model matching. Alternatively, if the food material image information is plural, the plural food material image information may be transmitted to the target server together, or may be transmitted to the target server in the scanning order or in the order specified by the user.
Through this embodiment, performing AR scanning and recognition on the food material improves how well the food material image information represents the food material, and thus improves the accuracy of three-dimensional model matching.
As an optional embodiment, after pushing the target related information of the target food material to the target client for displaying, the method further includes:
S51, performing real-time image acquisition on the target food material through image acquisition equipment of a target client, wherein a real-time image of the target food material is displayed on the target client;
S52, displaying a target food material model on the real-time image of the target food material, wherein the position and the size of the target food material model are adjusted according to the position and the size of the real-time image of the target food material;
s53, displaying a plurality of target menus on the target client, wherein the target menus are not overlapped with the target food material model.
In order to improve the efficiency of effective information acquisition, after the target food material model is acquired, the target client may fuse the 3D model with the real scene.
The image acquisition equipment of the target client can acquire real-time images of the target food materials so as to display real-time images of the target food materials on the target client. The target food material model can be displayed in a superimposed manner on the real-time image of the target food material while the real-time image of the target food material is displayed. The position and the size of the displayed target food material model can be adjusted according to the position and the size of the real-time image of the target food material.
As an alternative embodiment, the target client may also display food material image information, and display a target food material model on the food material image information, where the position and size of the target food material model are adjusted according to the position and size of the target food material in the food material image information.
Optionally, in this embodiment, in addition to the target food material model, the plurality of target recipes may be displayed on the target client. To ensure the completeness of the displayed information, the plurality of target recipes do not overlap the target food material model, that is, the region displaying the plurality of target recipes and the region displaying the target food material model do not overlap. The plurality of target recipes may be displayed in a manner similar to that described above, which is not repeated here.
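The position and size adjustment described above can be sketched as a simple mapping from the detected food material region in the live frame to an overlay transform; the detection step and the renderer interface are assumptions, not specified in the application.

```python
def overlay_transform(food_bbox, frame_size):
    """food_bbox = (x, y, w, h) in pixels; returns the screen position and scale
    at which the 3D model should be rendered over the live image."""
    x, y, w, h = food_bbox
    frame_w, frame_h = frame_size
    center = (x + w / 2, y + h / 2)            # anchor the model on the food material
    scale = max(w, h) / max(frame_w, frame_h)  # model size follows the food material's size
    return center, scale
```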
According to the embodiment, the 3D model and the real scene can be fused by displaying the target food material model on the real-time image of the target food material, so that the displayed visual information is enriched; the target food material model and the multiple menus are displayed in the non-overlapping area, so that the comprehensiveness and the completeness of information display can be improved.
As an alternative embodiment, after displaying the target food material model on the real-time image of the target food material, the method further includes:
S61, detecting an adjustment operation executed on the target food material model, wherein the adjustment operation is used for adjusting the display angle of the target food material model;
S62, adjusting the display angle of the target food material model in response to the adjustment operation;
S63, displaying the angle-adjusted target food material model on the target client.
The user may perform an adjustment operation on the target food material model through the target client to adjust its display parameters. The display parameters may vary and may include, but are not limited to, at least one of: display angle, display size and display position. The target client may detect the adjustment operation performed on the target food material model and, in response to it, adjust the corresponding display parameter and display the adjusted target food material model.
For example, the target client may detect an adjustment operation performed on the target food material model for adjusting the display angle of the target food material model; in response to the adjustment operation, the target client may adjust a display angle of the target food material model, and display the angle-adjusted target food material model on the target client. The manner of adjusting the display size and the display position is similar to the above manner, and will not be described in detail herein.
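As a sketch of the angle adjustment in particular, a drag gesture can be mapped to yaw and pitch of the displayed model; the sensitivity constant and axis convention below are illustrative assumptions.

```python
DEGREES_PER_PIXEL = 0.3  # assumed drag sensitivity

def apply_adjustment(current_angles, drag_dx, drag_dy):
    """Map a drag delta (pixels) to the model's new display angles."""
    yaw, pitch = current_angles
    yaw = (yaw + drag_dx * DEGREES_PER_PIXEL) % 360
    pitch = max(-90.0, min(90.0, pitch + drag_dy * DEGREES_PER_PIXEL))
    return yaw, pitch
```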
Through this embodiment of adjusting the display parameters of the target food material model, the interactivity of the food material model display can be enhanced and the user experience improved.
The method for pushing information in the embodiment of the present application is explained below with reference to an alternative example. In this example, the target application is a smart App.
Smart menus in the related art are generally presented as text, pictures or video. They are not intelligent enough, have a single presentation form, cannot be analyzed dynamically according to the current situation, respond poorly and present effective information with little visual intuitiveness (a poor visual experience), lack interaction with the user, and therefore cannot meet users' needs.
This example provides an intelligent menu pushing mode based on the combination of AR and big data. AR scanning is performed on the food material and its 3D model is recognized; the 3D model is fused with the real world to form a visual interaction, enhancing the realism of the user's operation. Based on the recognition result, the cloud performs big data analysis on the model and dynamically and accurately visualizes the effective information recognized for the food material. This improves the intelligence of the menu, enables dynamic updating and pushing of the effective menu information, lets the user learn the menu information intuitively and dynamically, presents realistic sensory effects and rich scenes, improves interactivity, and makes effective information easy and intuitive to obtain.
As shown in fig. 3, the flow of the information pushing method in this alternative example may include the following steps:
step S302, AR scanning is performed on the food material.
The user may trigger the smart App to AR scan a food material (e.g., a food material in a store or kitchen) through which a food material image of the food material is obtained.
Step S304, the food material image is sent to the background cloud.
The intelligent App can send the scanned food material image to the background cloud (cloud server).
Step S306, matching the food material image with a corresponding food material 3D model (three-dimensional food material model), if matching is successful, executing step S308, otherwise, executing step S310.
Step S308, the background cloud controls the smart App to display the corresponding 3D model of the food material.
Step S310, the background cloud controls the smart App to prompt the user, through the interactive interface, that scanning and recognition failed.
In step S312, the background cloud performs data processing and analysis based on the model and determines whether to execute at least one sub-step of step S314; if yes, it executes that sub-step (or sub-steps), otherwise no further processing is performed.
For the identified model data, the background cloud may perform big data processing analysis to determine whether other relevant information of the food material needs to be pushed, if so, at least one sub-step in step S314 is performed, otherwise, no processing is performed.
In step S314, the background cloud executes at least one of food material recipe information introduction pushing, food material recipe cooking method pushing, and food material nutrition related parameter pushing.
The background cloud can accurately obtain menu information and lists for the food material and present them to the user through smart App pushes or a visual interface, informing the user of effective information such as the product introduction, price, freshness and nutritional value of the food material so that the user can easily learn the relevant menu information. The smart App may display the freshness, the nutritional value, some of the vitamins contained, dishes that can be cooked with the food material, and the like.
Meanwhile, the background cloud can also perform data analysis based on the food materials the user usually scans with AR, as well as the user's reading habits, personal preferences and other data, and intelligently push related menus to meet the user's cooking needs.
Optionally, when there are multiple diners, the background cloud can obtain the eating-habit information of each diner in the dining group and, combining all of it, finally match and recommend the menu that best integrates the dining preferences of the whole user group.
In this example, by combining AR scanning in the intelligent App with big-data analysis in the cloud, a user who is faced with a wide variety of foods and needs to cook only has to AR-scan the food to obtain the method and process for cooking it. This reduces the need to look up and read related recipe material and makes the cooking process quick and easy.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present application is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present application. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present application.
From the description of the above embodiments, it will be clear to a person skilled in the art that the method according to the above embodiments may be implemented by means of software plus the necessary general hardware platform, or of course by hardware alone, but in many cases the former is the preferred embodiment. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM (Read-Only Memory)/RAM (Random Access Memory), a magnetic disk or an optical disk) and including instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the method according to the embodiments of the present application.
According to another aspect of the embodiment of the application, an information pushing device for implementing the information pushing method is also provided. Fig. 4 is a block diagram of an alternative information pushing device according to an embodiment of the present application, as shown in fig. 4, the device may include:
(1) A matching unit 402, configured to match three-dimensional food material models in a three-dimensional food material model library using food material image information, where the food material image information is image information obtained by scanning a target food material, and each three-dimensional food material model corresponds to one food material;
(2) A first obtaining unit 404, connected to the matching unit 402, for obtaining, when a target food material model matched with the food material image information is obtained, a plurality of target recipes corresponding to the target food materials from a recipe library according to target object information corresponding to the target object;
(3) The pushing unit 406 is connected to the first obtaining unit 404, and is configured to push target related information of a target food material to a target client for display, where the target related information includes a target food material model and a plurality of target recipes.
It should be noted that the matching unit 402 in this embodiment may be used to perform step S202, the first obtaining unit 404 in this embodiment may be used to perform step S204, and the pushing unit 406 in this embodiment may be used to perform step S206.
Through the above modules, the food material image information is matched with the three-dimensional food material models in the three-dimensional food material model library, where the food material image information is obtained by scanning the target food material and each three-dimensional food material model corresponds to one food material; when a target food material model matching the food material image information is obtained, a plurality of target recipes corresponding to the target food material are obtained from the recipe library according to the target object information corresponding to the target object; and the target related information of the target food material, which includes the target food material model and the plurality of target recipes, is pushed to the target client for display. This solves the problems of the recipe pushing mode in the related art, namely low efficiency in acquiring effective information, long browsing time and high energy consumption, thereby improving the efficiency of acquiring effective information, shortening the user's browsing time and reducing the user's energy consumption.
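To make the division of units above concrete, here is a minimal sketch of how the three units might be wired together in code; the class name, the model-library lookup and the client and user-profile interfaces are all assumptions for illustration, not the patented implementation.

```python
class InformationPushingDevice:
    """Sketch of the matching / acquisition / pushing units working together."""

    def __init__(self, model_library, recipe_library, client):
        self.model_library = model_library    # iterable of (model_id, 3D model descriptor)
        self.recipe_library = recipe_library  # maps food material id -> candidate recipes
        self.client = client                  # object exposing a display(payload) method

    def match(self, food_image_info):
        """Matching unit 402: find the 3D model that matches the scanned image."""
        for model_id, descriptor in self.model_library:
            if descriptor.matches(food_image_info):   # assumed descriptor interface
                return model_id
        return None

    def acquire_recipes(self, food_material, user_profile):
        """First obtaining unit 404: pick recipes for this user and food material."""
        candidates = self.recipe_library.get(food_material, [])
        return [r for r in candidates if user_profile.likes(r)]  # assumed profile interface

    def push(self, model_id, recipes):
        """Pushing unit 406: send the model and recipes to the client for display."""
        self.client.display({"model": model_id, "recipes": recipes})
```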
As an alternative embodiment, the first acquisition unit 404 includes:
a first determining module, configured to determine target object information corresponding to the target object, where the target object information is used to represent at least one of the following of the target object: dish preference, dishes that have been queried, physiological characteristics;
The first acquisition module is used for acquiring a plurality of target recipes corresponding to the target food materials from the recipe library according to the target object information.
As an alternative embodiment, the first acquisition unit 404 includes:
A second determining module, configured to determine, when the target objects are plural and the target food materials are plural, target object information corresponding to the plural target objects, where the target object information is used to represent at least one of the following of the plural target objects: dish preference, dishes that have been queried, the number of people eating, physiological characteristics;
The second acquisition module is used for acquiring a plurality of target recipes corresponding to a plurality of target food materials from the recipe library according to the target object information, wherein each recipe in the plurality of target recipes comprises a combination of a plurality of recipes, and each target food material is used by at least one recipe in the plurality of recipes.
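The requirement that each of several target food materials be used by at least one recipe in the recommended combination can be met with a simple greedy selection; the sketch below is an illustrative assumption, not the claimed method.

```python
from typing import Dict, List, Set

def select_recipe_combination(food_materials: Set[str],
                              recipes: Dict[str, Set[str]],
                              diner_count: int) -> List[str]:
    """Greedily pick recipes until every target food material is covered,
    then pad with remaining recipes until the count matches the diners."""
    uncovered = set(food_materials)
    chosen: List[str] = []
    while uncovered:
        # Pick the recipe covering the most still-uncovered food materials.
        best = max(recipes, key=lambda r: len(recipes[r] & uncovered), default=None)
        if best is None or not recipes[best] & uncovered:
            break  # some food material cannot be covered by any recipe
        chosen.append(best)
        uncovered -= recipes[best]
    for recipe in recipes:
        if len(chosen) >= diner_count:
            break
        if recipe not in chosen:
            chosen.append(recipe)
    return chosen

if __name__ == "__main__":
    foods = {"tomato", "egg", "beef"}
    library = {"tomato egg soup": {"tomato", "egg"},
               "beef stew": {"beef"},
               "fried rice": {"egg"}}
    print(select_recipe_combination(foods, library, diner_count=3))
```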
As an alternative embodiment, the apparatus further comprises at least one of:
The second acquisition unit is used for acquiring menu cooking information corresponding to each target menu in the plurality of target menus before pushing the target associated information of the target food materials to the target client for display, wherein the menu cooking information is used for indicating the cooking mode of each target menu, and the target associated information comprises menu cooking information;
A third acquisition unit, configured to acquire food material reference information corresponding to a target food material, where the food material reference information is used to represent at least one of: the freshness of the target food material, the nutrition parameter of the target food material, and the target related information comprise food material reference information.
As an alternative embodiment, the above device further comprises:
the scanning unit is used for carrying out Augmented Reality (AR) scanning on the target food materials through the target client before the food material image information is matched with the three-dimensional food material models in the three-dimensional food material model library, so as to obtain the food material image information;
and the sending unit is used for sending the food material image information to a target server, wherein the target server is a server for performing three-dimensional food material model matching.
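On the client side, the scanned image could be posted to the target server roughly as follows; the endpoint URL and JSON shape are purely illustrative assumptions, since the embodiment does not specify a transport format.

```python
import base64
import requests  # third-party HTTP client, used here only as an example transport

# Hypothetical endpoint of the target server that performs 3D model matching.
MATCH_ENDPOINT = "https://example.com/api/food-material/match"

def send_scan(image_bytes: bytes, user_id: str) -> dict:
    """Upload the AR-scanned food material image and return the server's response."""
    payload = {
        "user_id": user_id,
        "image_base64": base64.b64encode(image_bytes).decode("ascii"),
    }
    response = requests.post(MATCH_ENDPOINT, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()  # expected to contain the matched model and pushed recipes
```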
As an alternative embodiment, the above device further comprises:
The acquisition unit is used for carrying out real-time image acquisition on the target food material through the image acquisition equipment of the target client after pushing the target related information of the target food material to the target client for display, wherein the target client displays a real-time image of the target food material;
the first display unit is used for displaying the target food material model on the real-time image of the target food material, wherein the position and the size of the target food material model are adjusted according to the position and the size of the real-time image of the target food material;
And the second display unit is used for displaying a plurality of target menus on the target client, wherein the target menus are not overlapped with the target food material model.
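A minimal sketch of the layout logic implied by these display units, in which the model is anchored to the live image of the food material and the recipe cards are placed so that they never cover it, might look like this; the rectangle representation and function names are assumptions.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x, y, width, height) in screen pixels

def place_model(food_rect: Rect) -> Rect:
    """Anchor the 3D model over the live image of the food material,
    matching its position and size."""
    return food_rect

def place_recipe_cards(screen: Rect, model_rect: Rect, n_cards: int,
                       card_h: float = 60.0, gap: float = 8.0) -> List[Rect]:
    """Stack recipe cards in the region below the model; if the space below is
    too small, use the region above it, so the cards never overlap the model."""
    sx, sy, sw, sh = screen
    mx, my, mw, mh = model_rect
    needed = n_cards * (card_h + gap)
    space_below = (sy + sh) - (my + mh)
    start_y = my + mh + gap if space_below >= needed else my - needed
    return [(sx + gap, start_y + i * (card_h + gap), sw - 2 * gap, card_h)
            for i in range(n_cards)]

if __name__ == "__main__":
    screen = (0, 0, 1080, 1920)
    food = (300, 800, 400, 400)
    print(place_model(food))
    print(place_recipe_cards(screen, food, n_cards=3))
```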
As an alternative embodiment, the above device further comprises:
The detecting unit is used for detecting the adjustment operation executed on the target food material model after the target food material model is displayed on the real-time image of the target food material, wherein the adjustment operation is used for adjusting the display angle of the target food material model;
the adjusting unit is used for responding to the adjusting operation and adjusting the display angle of the target food material model;
and the third display unit is used for displaying the target food material model with the adjusted angle on the target client.
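The display-angle adjustment handled by these units could be driven by a drag gesture roughly as sketched below; the degrees-per-pixel factor and the renderer interface are assumptions for illustration.

```python
class ModelViewController:
    """Sketch: map a horizontal drag on the screen to the model's display angle."""

    DEGREES_PER_PIXEL = 0.25  # assumed sensitivity of the drag gesture

    def __init__(self, renderer):
        self.renderer = renderer  # assumed to expose set_yaw(angle_degrees)
        self.angle = 0.0

    def on_drag(self, dx_pixels: float) -> float:
        """Called while the user drags across the target food material model."""
        self.angle = (self.angle + dx_pixels * self.DEGREES_PER_PIXEL) % 360.0
        self.renderer.set_yaw(self.angle)  # redraw the model at the new angle
        return self.angle
```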
It should be noted that the examples and application scenarios implemented by the above modules are the same as those of the corresponding steps, but are not limited to the content disclosed in the above embodiments. It should also be noted that the above modules may be implemented in software or in hardware as part of the apparatus shown in fig. 1, where the hardware environment includes a network environment.
According to still another aspect of the embodiment of the present application, there is further provided an electronic device for implementing the pushing method of information, where the electronic device may be a server, a terminal, or a combination thereof.
Fig. 5 is a block diagram of an alternative electronic device according to an embodiment of the application. As shown in fig. 5, the electronic device comprises a processor 502, a communication interface 504, a memory 506 and a communication bus 508, where the processor 502, the communication interface 504 and the memory 506 communicate with each other via the communication bus 508, wherein,
A memory 506 for storing a computer program;
the processor 502 is configured to execute the computer program stored in the memory 506, and implement the following steps:
S1, matching food material image information with three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is obtained by scanning target food materials, and each three-dimensional food material model corresponds to one food material;
s2, when a target food material model matched with the food material image information is obtained, obtaining a plurality of target menus corresponding to target food materials from a menu library according to target object information corresponding to a target object;
And S3, pushing target associated information of the target food materials to a target client for display, wherein the target associated information comprises a target food material model and a plurality of target menus.
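Steps S1 to S3 executed by the processor can be pictured, on the server side, roughly as the following sketch; the similarity threshold, data shapes and helper names are illustrative assumptions rather than the claimed implementation.

```python
from typing import Callable, Dict, List, Optional

def handle_scan(food_image_info,
                model_library: Dict[str, object],
                similarity: Callable[[object, object], float],
                recipe_library: Dict[str, List[str]],
                user_profile: Dict[str, object],
                push: Callable[[dict], None],
                threshold: float = 0.8) -> Optional[str]:
    """S1: match the image against the 3D model library.
    S2: if a model matches, pick target recipes for this user.
    S3: push the model and recipes to the target client."""
    # S1 - find the best-matching three-dimensional food material model
    best_id, best_score = None, 0.0
    for model_id, model in model_library.items():
        score = similarity(food_image_info, model)
        if score > best_score:
            best_id, best_score = model_id, score
    if best_id is None or best_score < threshold:
        return None  # no match: the client can instead be told that recognition failed

    # S2 - select recipes for the matched food material according to the user profile
    candidates = recipe_library.get(best_id, [])
    liked = set(user_profile.get("dish_preference", []))
    recipes = sorted(candidates, key=lambda r: r in liked, reverse=True)

    # S3 - push the target related information for display
    push({"model_id": best_id, "recipes": recipes})
    return best_id
```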
Alternatively, in the present embodiment, the above communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus and so on. For ease of illustration, only one thick line is shown in fig. 5, but this does not mean that there is only one bus or only one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include RAM, or may include non-volatile memory, such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
As an example, the memory 506 may include, but is not limited to, the matching unit 402, the first obtaining unit 404 and the pushing unit 406 of the above information pushing device. In addition, other module units of the information pushing device may also be included, which are not described in detail in this example.
The processor may be a general purpose processor and may include, but is not limited to: a CPU (Central Processing Unit), an NP (Network Processor), etc.; it may also be a DSP (Digital Signal Processing) device, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In addition, the electronic device further includes a display for displaying at least one of the target food material model and the plurality of target recipes.
Alternatively, specific examples in this embodiment may refer to examples described in the foregoing embodiments, and this embodiment is not described herein.
It will be understood by those skilled in the art that the structure shown in fig. 5 is only schematic, and the device implementing the above information pushing method may be a terminal device, where the terminal device may be a smart phone (such as an Android phone or an iOS phone), a tablet computer, a palm computer, a Mobile Internet Device (MID), a PAD, etc. Fig. 5 does not limit the structure of the above electronic device. For example, the electronic device may also include more or fewer components (e.g., a network interface, a display device, etc.) than shown in fig. 5, or have a different configuration from that shown in fig. 5.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program for instructing a terminal device to execute in association with hardware, the program may be stored in a computer readable storage medium, and the storage medium may include: flash disk, ROM, RAM, magnetic or optical disk, etc.
According to yet another aspect of an embodiment of the present application, there is also provided a storage medium. Alternatively, in this embodiment, the storage medium may be used to execute the program code of the pushing method of any of the above information in the embodiment of the present application.
Alternatively, in this embodiment, the storage medium may be located on at least one network device of the plurality of network devices in the network shown in the above embodiment.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of:
S1, matching food material image information with three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is obtained by scanning target food materials, and each three-dimensional food material model corresponds to one food material;
s2, when a target food material model matched with the food material image information is obtained, obtaining a plurality of target menus corresponding to target food materials from a menu library according to target object information corresponding to a target object;
And S3, pushing target associated information of the target food materials to a target client for display, wherein the target associated information comprises a target food material model and a plurality of target menus.
Alternatively, specific examples in the present embodiment may refer to examples described in the above embodiments, which are not described in detail in the present embodiment.
Alternatively, in the present embodiment, the storage medium may include, but is not limited to: various media capable of storing program codes, such as a U disk, ROM, RAM, a mobile hard disk, a magnetic disk or an optical disk.
The foregoing embodiment numbers of the present application are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above-described computer-readable storage medium if implemented in the form of software functional units and sold or used as separate products. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the method described in the embodiments of the present application.
In the foregoing embodiments of the present application, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided by the present application, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units is merely a logical function division, and there may be another division in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, units or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present application and it should be noted that modifications and adaptations to those skilled in the art may be made without departing from the principles of the present application, which are intended to be comprehended within the scope of the present application.
Claims (8)
1. An information pushing method, characterized by comprising the following steps:
matching food material image information with three-dimensional food material models in a three-dimensional food material model library, wherein the food material image information is obtained by scanning target food materials, and each three-dimensional food material model corresponds to one food material;
under the condition that a target food material model matched with the food material image information is acquired, acquiring a plurality of target menus corresponding to the target food material from a menu library according to target object information corresponding to a target object;
Pushing target associated information of the target food materials to a target client for display, wherein the target associated information comprises the target food material model and the plurality of target menus;
The obtaining, from a recipe library, a plurality of target recipes corresponding to the target food materials according to target object information corresponding to the target objects includes:
When the target objects are a plurality of and the target food is a plurality of, determining target object information corresponding to a plurality of the target objects, wherein the target object information is used for representing at least one of the plurality of the target objects: dish preference, inquired dishes, the number of people eating, physiological characteristics;
according to the target object information, acquiring a plurality of target menus corresponding to a plurality of target food materials from the menu library, wherein each menu in the plurality of target menus comprises a combination of a plurality of menus, each target food material is used by at least one menu in the plurality of menus, and the number of the menus contained in the target menu is matched with the number of people eating;
And under the condition that the target objects are a plurality of and the target food material is one, taking the target food material as the main food material, integrating the dish preferences and physiological characteristics of the plurality of target objects, matching dishes from the menu library accordingly, and sorting the matched dishes according to the number of times they have been queried, so as to obtain a plurality of target menus corresponding to the target food material.
2. The method of claim 1, wherein prior to pushing the target related information for the target food material to a target client for display, the method further comprises at least one of:
Acquiring menu cooking information corresponding to each target menu in the plurality of target menus, wherein the menu cooking information is used for representing cooking modes of each target menu, and the target associated information comprises the menu cooking information;
Acquiring food material reference information corresponding to the target food material, wherein the food material reference information is used for representing at least one of the following: the freshness of the target food material, the nutrition parameter of the target food material, and the target associated information comprises the food material reference information.
3. The method of claim 1, wherein prior to the matching with the three-dimensional food material model in the three-dimensional food material model library using food material image information, the method further comprises:
performing Augmented Reality (AR) scanning on the target food material through the target client to obtain the food material image information;
And sending the food material image information to a target server, wherein the target server is a server for matching the three-dimensional food material model.
4. A method according to any one of claims 1 to 3, wherein after said pushing the target associated information of the target food material to a target client for display, the method further comprises:
the image acquisition device of the target client acquires real-time images of the target food material, wherein the target client displays a real-time image of the target food material;
Displaying the target food material model on the real-time image of the target food material, wherein the position and the size of the target food material model are adjusted according to the position and the size of the real-time image of the target food material;
And displaying the target menus on the target client, wherein the target menus are not overlapped with the target food material model.
5. The method of claim 4, wherein after displaying the target food material model on the real-time image of the target food material, the method further comprises:
Detecting an adjustment operation performed on the target food material model, wherein the adjustment operation is used for adjusting the display angle of the target food material model;
Responding to the adjustment operation, and adjusting the display angle of the target food material model;
And displaying the target food material model with the adjusted angle on the target client.
6. An information pushing device, characterized by comprising:
The matching unit is used for matching the food material image information with three-dimensional food material models in the three-dimensional food material model library, wherein the food material image information is obtained by scanning a target food material, and each three-dimensional food material model corresponds to one food material;
A first acquisition unit configured to acquire, when a target food material model matching the food material image information is acquired, a plurality of target recipes corresponding to the target food materials from a recipe library according to target object information corresponding to a target object;
The pushing unit is used for pushing the target associated information of the target food materials to a target client for display, wherein the target associated information comprises the target food material model and the plurality of target menus;
the first acquisition unit includes: a second determining module, configured to determine, when the target objects are plural and the target food materials are plural, target object information corresponding to the plural target objects, where the target object information is used to represent at least one of the plural target objects: dish preference, inquired dishes, the number of people eating, physiological characteristics; a second obtaining module, configured to obtain, according to the target object information, a plurality of target recipes corresponding to a plurality of target food materials from the recipe library, where each recipe in the plurality of target recipes includes a combination of a plurality of recipes, each target food material is used by at least one recipe in the plurality of recipes, and a number of recipes included in the target recipe matches a number of people having a meal;
And when the target objects are a plurality of and the target food material is one, taking the target food material as the main food material, integrating the dish preferences and physiological characteristics of the plurality of target objects, matching dishes from the menu library accordingly, and sorting the matched dishes according to the number of times they have been queried, so as to obtain a plurality of target menus corresponding to the target food material.
7. An electronic device comprising a processor, a communication interface, a memory and a communication bus, wherein the processor, the communication interface and the memory communicate with each other via the communication bus, characterized in that,
The memory is used for storing a computer program;
the processor is configured to perform the method steps of any of claims 1 to 5 by running the computer program stored on the memory.
8. A computer-readable storage medium, characterized in that the storage medium has stored therein a computer program, wherein the computer program is arranged to perform the method steps of any of claims 1 to 5 when run.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011408345.2A CN112464013B (en) | 2020-12-04 | 2020-12-04 | Information pushing method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112464013A (en) | 2021-03-09 |
CN112464013B (en) | 2024-09-06 |
Family
ID=74805774
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011408345.2A Active CN112464013B (en) | 2020-12-04 | 2020-12-04 | Information pushing method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112464013B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113283364A (en) * | 2021-06-04 | 2021-08-20 | 青岛海尔科技有限公司 | Recipe determination method and apparatus, storage medium, and electronic apparatus |
CN113591576A (en) * | 2021-06-29 | 2021-11-02 | 青岛海尔科技有限公司 | Food material information detection method and device, storage medium and electronic device |
CN115390470A (en) * | 2022-08-24 | 2022-11-25 | 青岛海尔科技有限公司 | Menu determination method and device, storage medium and electronic device |
CN116468526A (en) * | 2023-06-19 | 2023-07-21 | 中国第一汽车股份有限公司 | Recipe generation method and device based on vehicle-mounted OMS camera and vehicle |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104361041A (en) * | 2014-10-28 | 2015-02-18 | 华南理工大学 | Auxiliary method and auxiliary system of intelligent refrigerator |
CN107577176A (en) * | 2017-08-16 | 2018-01-12 | 珠海格力电器股份有限公司 | Control method, device and system of cooking appliance |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3107429B1 (en) * | 2014-02-20 | 2023-11-15 | MBL Limited | Methods and systems for food preparation in a robotic cooking kitchen |
CN110797105A (en) * | 2019-10-08 | 2020-02-14 | 珠海格力电器股份有限公司 | Menu recommendation method and device, storage medium and cooking equipment |
- 2020-12-04: CN application CN202011408345.2A filed; granted as CN112464013B (Active)
Legal Events
Date | Code | Title | Description
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |