
CN112528156A - Method for establishing a ranking model, method for query auto-completion, and corresponding apparatus - Google Patents

Method for establishing a ranking model, method for query auto-completion, and corresponding apparatus

Info

Publication number
CN112528156A
CN112528156A
Authority
CN
China
Prior art keywords
query
model
training
unit
training data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011551563.1A
Other languages
Chinese (zh)
Other versions
CN112528156B (en)
Inventor
范淼
黄际洲
孙一博
王海峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd filed Critical Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202011551563.1A priority Critical patent/CN112528156B/en
Publication of CN112528156A publication Critical patent/CN112528156A/en
Application granted granted Critical
Publication of CN112528156B publication Critical patent/CN112528156B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The disclosure provides a method for establishing a ranking model, a method for query auto-completion, and corresponding apparatuses, and relates to the field of intelligent search technology. The specific implementation scheme is as follows: training data is acquired, the training data comprising first training data and second training data; the first training data comprises historical query data occurring in a spatio-temporal set, and the second training data comprises historical query data occurring in a preset spatio-temporal unit; a ranking model is trained with the training data to obtain a ranking model corresponding to the spatio-temporal unit; during training, the first training data is used to train the ranking model globally, and the second training data is used to train it locally; the ranking model corresponding to the spatio-temporal unit is used to predict query completion suggestions for query prefixes entered by users within that spatio-temporal unit. Query completion suggestions recommended by the ranking model provided by the disclosure better meet users' actual needs.

Description

Method for establishing a ranking model, method for query auto-completion, and corresponding apparatus
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, in the field of intelligent search technology, to a method for establishing a ranking model, a method for query auto-completion, and corresponding apparatuses.
Background
Query auto-completion (QAC) is currently widely used by mainstream general-purpose search engines and vertical search engines. For example, in a map application, when a user types a query to search for a POI (Point of Interest), the search engine starts from the incomplete query entered so far (referred to in this disclosure as the query prefix) and recommends, in real time, a list of candidate POIs from which the user can choose a completion of the query (the queries recommended in the candidate list are referred to in this disclosure as query completion suggestions). Once the user finds the desired POI in the candidate list, selecting it completes the query and initiates the search for that POI.
In existing query auto-completion schemes, however, the suggestions provided for the same query prefix are identical for all users, for example ranked in the candidate list by the search popularity of each POI. Such suggestions do not serve users' actual needs well.
Disclosure of Invention
The disclosure provides a method for establishing a ranking model, a method for query auto-completion, and corresponding apparatuses, so that the recommended query completion suggestions better meet users' actual needs.
According to a first aspect of the present disclosure, there is provided a method of building a ranking model, comprising:
acquiring training data, wherein the training data comprises first training data and second training data; the first training data comprises historical query data occurring in a spatio-temporal set, and the second training data comprises historical query data occurring in a preset spatio-temporal unit;
training a ranking model with the training data to obtain a ranking model corresponding to the spatio-temporal unit; during training, the first training data is used to train the ranking model globally, and the second training data is used to train it locally;
wherein the ranking model corresponding to the spatio-temporal unit is used to predict query completion suggestions for query prefixes entered by users within the spatio-temporal unit.
According to a second aspect of the present disclosure, there is provided a method for query autocompletion, comprising:
acquiring a query prefix entered by a user and candidate query terms corresponding to the query prefix, and determining the spatio-temporal unit in which the user enters the query prefix;
inputting the query prefix and the candidate query items into a ranking model corresponding to the spatio-temporal unit to obtain the scores of the ranking model for the candidate query items;
determining a query completion suggestion recommended to the user according to the scores of the candidate query terms;
the sequencing model corresponding to the space-time unit is obtained by training by adopting the method.
According to a third aspect of the present disclosure, there is provided an apparatus for building a ranking model, comprising:
a data acquisition unit configured to acquire training data, the training data comprising first training data and second training data; the first training data comprises historical query data occurring in a spatio-temporal set, and the second training data comprises historical query data occurring in a preset spatio-temporal unit;
a model training unit configured to train a ranking model with the training data to obtain a ranking model corresponding to the spatio-temporal unit; during training, the first training data is used to train the ranking model globally, and the second training data is used to train it locally;
wherein the ranking model corresponding to the spatio-temporal unit is used to predict query completion suggestions for query prefixes entered by users within the spatio-temporal unit.
According to a fourth aspect of the present disclosure, there is provided an apparatus for query autocompletion, comprising:
a query unit configured to acquire a query prefix entered by a user and candidate query terms corresponding to the query prefix, and to determine the spatio-temporal unit in which the user enters the query prefix;
the scoring unit is used for inputting the query prefix and the candidate query items into the ranking model corresponding to the spatio-temporal unit to obtain the score of each candidate query item by the ranking model;
the query completion unit is used for determining a query completion suggestion recommended to the user according to the scores of the candidate query terms;
wherein the ranking model corresponding to the spatio-temporal unit is trained by the apparatus described above.
According to a fifth aspect of the present disclosure, there is provided an electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
According to a sixth aspect of the present disclosure, there is provided a non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method as described above.
According to a seventh aspect of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements a method according to the above.
According to the technical scheme of the disclosure, a personalized ranking model is trained separately for each spatio-temporal unit; the ranking model can learn both the global query requirements over the general region and time range and the personalized query requirements of the preset spatio-temporal unit. Query auto-completion based on such a ranking model better meets the query requirements within the spatio-temporal unit.
It should be understood that what is described in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is an exemplary system architecture diagram to which embodiments of the present disclosure may be applied;
FIG. 2 is a flow diagram of a method for building a ranking model provided by the disclosed embodiments;
FIGS. 3 and 4 are schematic structural diagrams of two ranking models provided by embodiments of the present disclosure;
FIG. 5 is a schematic diagram of model parameters for training a ranking model provided by an embodiment of the present disclosure;
FIG. 6 is a flowchart of a method for query autocompletion provided by an embodiment of the present disclosure;
FIG. 7 is a diagram of an apparatus for creating a ranking model according to an embodiment of the present disclosure;
FIG. 8 is a diagram of an apparatus for query autocompletion according to an embodiment of the present disclosure;
FIG. 9 is a block diagram of an electronic device used to implement embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
In the prior art, the ranking of candidate query terms usually considers each term's popularity and, in some cases, certain user attributes, but such rankings do not serve users' actual needs well. Statistics over the real queries in large-scale historical query logs show that users at different times and in different places, even when entering the same query prefix, actually want very different query terms. This phenomenon is especially apparent in the typical application scenario of POI queries.
For example, users located in Fengxian District, Shanghai, who enter the query prefix "Shanghai" most often end up choosing the query "Shanghai Fish". Users located in Songjiang District, Shanghai, entering the same prefix most often choose "Shanghai Hongqiao Station", while users in Huangpu District, Shanghai, most often choose "Shanghai Station".
As another example, users located in Jinshan District, Shanghai, who enter the query prefix "Shanghai" mostly look for shopping malls on non-workdays, but mostly look for companies on workdays.
The present disclosure does not follow the traditional approach of feeding spatio-temporal attributes into a ranking model as features. Instead, it adopts a new idea: a separate ranking model is trained for each spatio-temporal unit, and when a user enters a query prefix, the ranking model of the spatio-temporal unit in which the user is located is used to predict query completion suggestions. The technical solution provided by the present disclosure is described in detail below with reference to embodiments.
FIG. 1 illustrates an exemplary system architecture to which embodiments of the disclosure may be applied. As shown in fig. 1, the system architecture may include terminal devices 101 and 102, a network 103, and a server 104. The network 103 serves as a medium for providing communication links between the terminal devices 101, 102 and the server 104. Network 103 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
A user may interact with server 104 through network 103 using terminal devices 101 and 102. Various applications, such as a voice interaction application, a web browser application, a communication-type application, etc., may be installed on the terminal devices 101 and 102.
The terminal devices 101 and 102 may be various electronic devices, including but not limited to smart phones, tablets, PCs, smart TVs, and the like. The query auto-completion apparatus provided by the present disclosure may be deployed and run on the server 104. It may be implemented as a plurality of software modules (for example, to provide distributed services) or as a single software module, which is not specifically limited herein.
For example, when the user enters a query prefix on the search interface provided by a browser or client on the terminal device 101, the browser or client sends the query prefix to the server 104 in real time, and the server uses the method provided by the present disclosure to return query completion suggestions corresponding to the prefix currently entered by the user. If the user finds the desired query (query term) among the suggestions, selecting it completes the query and initiates a search for it. If not, the user continues typing; the browser or client keeps sending the updated prefix to the server 104, which returns new query completion suggestions. The effect is that query completion suggestions are recommended to the user in real time as the query prefix grows.
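To make the real-time flow above concrete, the sketch below stands in for each spatio-temporal unit's trained ranking model with a fixed score table. This is an illustration only, not part of the patent; all names, candidates, and scores are hypothetical.

```python
# Toy candidate index: query prefix -> candidate POIs.
CANDIDATES = {
    "sh": ["Shanghai Station", "Shanghai Hongqiao Station", "Shanghai Fish"],
}

# One "model" per spatio-temporal unit. Real models would be the trained
# ranking networks; fixed score tables are used here for illustration.
UNIT_MODELS = {
    ("huangpu", "workday"): {"Shanghai Station": 0.9,
                             "Shanghai Hongqiao Station": 0.4,
                             "Shanghai Fish": 0.1},
    ("songjiang", "workday"): {"Shanghai Station": 0.3,
                               "Shanghai Hongqiao Station": 0.8,
                               "Shanghai Fish": 0.1},
}

def suggest(prefix, region, time_unit, top_k=2):
    """Look up the unit's model, score the prefix's candidates, return top-k."""
    model = UNIT_MODELS[(region, time_unit)]
    candidates = CANDIDATES.get(prefix, [])
    return sorted(candidates, key=lambda c: model.get(c, 0.0), reverse=True)[:top_k]

# Same prefix, different spatio-temporal unit -> different ranking.
print(suggest("sh", "huangpu", "workday"))
print(suggest("sh", "songjiang", "workday"))
```

The key point the sketch illustrates is the dispatch step: the server first resolves the user's (region, time) pair to a spatio-temporal unit and only then scores candidates with that unit's model.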
The server 104 may be a single server or a server group including a plurality of servers. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Fig. 2 is a flowchart of a method for establishing a ranking model according to an embodiment of the disclosure, and as shown in fig. 2, the method may include the following steps:
in 201, training data is acquired, wherein the training data comprises first training data and second training data; wherein the first training data comprises historical query data occurring in a spatiotemporal set and the second training data comprises historical query data occurring in a preset spatiotemporal unit.
In 202, a ranking model is trained with the training data to obtain a ranking model corresponding to the preset spatio-temporal unit; during training, the first training data is used to train the ranking model globally, and the second training data is used to train it locally; the ranking model corresponding to the spatio-temporal unit is used to predict query completion suggestions for query prefixes entered by users within the spatio-temporal unit.
The spatio-temporal set comprises more than one spatio-temporal unit. It may consist of, for example, historical query data from all time ranges within a certain geographic area, or historical query data from the main time ranges within several regions, and so on. This historical query data reflects the global query requirements over a general region and time range, so global training of the ranking model on it likewise captures those global query requirements.
In the present disclosure, the spatio-temporal set is divided in advance into more than one spatio-temporal unit, each of which is a combination of a region unit and a time unit. In effect, this step builds a separate ranking model for each spatio-temporal unit from the historical query data occurring in that unit, so each model is personalized for its unit. Specific implementations are described in detail in the following embodiments.
In the manner provided by this embodiment, a personalized ranking model is trained separately for each spatio-temporal unit; the model learns both the global query requirements over the general region and time range and the personalized query requirements of the preset spatio-temporal unit. Such a ranking model more accurately reflects the query requirements occurring in that spatio-temporal unit.
Step 201 in the above embodiment, "acquiring training data, where the training data includes first training data and second training data; the first training data comprising historical query data occurring in a spatio-temporal set, and the second training data comprising historical query data occurring in a preset spatio-temporal unit", is described in detail first.
The historical query data involved in the embodiments of the present disclosure may include a query prefix entered when a user selects a query term from the query completion suggestions, the query term selected from the suggestions corresponding to that prefix, and the query terms not selected. The selected query term reflects the user's query requirement and is therefore taken as a positive example, while the unselected query terms are taken as negative examples.
Taking POI queries in a map application as an example: suppose that, while typing character by character, user_A clicks the POI "Baidu Building Tower A" in the query completion suggestions shown for the query prefix "ba". Then the user identifier user_A, the query prefix "ba", the POI "Baidu Building Tower A" selected by the user from the corresponding suggestions, and the unselected POI "Badaling Great Wall" are recorded as one piece of training data. Here "ba" and "Baidu Building Tower A" form a positive example pair, while "ba" and "Badaling Great Wall" form a negative example pair. Since the suggestions also contain other POIs not selected by the user, such as "Baidu Building" and "Baidu Technology Park", "ba" may likewise form a negative example pair with each of them.
The positive/negative example pairs described above are one preferred way to construct training data; alternatively, training data may be constructed from positive example pairs only, or in other ways.
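The pair construction described above can be sketched as follows. This is a hedged illustration under assumed log-record field names (`prefix`, `clicked`, `suggestions`), which are not specified in the patent.

```python
def build_pairs(record):
    """Expand one click-log record into (prefix, positive, negative) triples:
    the clicked suggestion is the positive example, and every suggestion
    that was shown but not clicked supplies one negative example."""
    prefix, clicked = record["prefix"], record["clicked"]
    return [(prefix, clicked, shown)
            for shown in record["suggestions"] if shown != clicked]

record = {
    "user": "user_A",
    "prefix": "ba",
    "clicked": "Baidu Building Tower A",
    "suggestions": ["Baidu Building Tower A", "Badaling Great Wall",
                    "Baidu Technology Park"],
}
for pair in build_pairs(record):
    print(pair)
```

One clicked suggestion shown alongside N-1 unclicked ones thus yields N-1 training triples sharing the same positive example.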
In the present disclosure, training data may be divided along two spatio-temporal dimensions: by region on the one hand and by time on the other. The spatio-temporal set is actually composed of more than one spatio-temporal unit, and each spatio-temporal unit is the combination of a time unit and a region unit. For example, a territory is divided in advance into more than one region unit. The division may follow administrative regions, such as districts, counties, or blocks, or use a preset size, for example dividing the geographic area into 1 km x 1 km cells, each cell being one region unit.
Time units can likewise be set in various ways: for example, two time units corresponding to workdays and non-workdays; or one time unit per day of the week, such as Monday, Tuesday, and so on; or one time unit per two-hour slot of the day; and so on.
The configured region units and time units are then combined pairwise, each combination yielding one spatio-temporal unit; there are as many spatio-temporal units as there are combinations.
It should be noted that the spatio-temporal units may be fixed, with the region unit and time unit of every spatio-temporal unit having the same size. However, since some spatio-temporal units may have sparse historical query data, a spatio-temporal unit may be expanded by enlarging its region unit and/or time unit so that it has enough historical query data to train a sufficiently accurate ranking model. In other words, the size of the spatio-temporal units in the present disclosure can be adjusted flexibly according to the actual situation.
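The pairwise combination of region and time units, together with the expansion of sparse units, can be sketched as below. This is an illustration, not the patent's implementation; the fallback-to-coarser-time-unit policy and the `min_samples` threshold are assumptions.

```python
from itertools import product

def build_units(region_units, time_units, sample_counts, min_samples=1000):
    """Pair every region unit with every time unit. A unit whose historical
    query data is too sparse is expanded to the coarser (region, "all")
    unit so its ranking model still has enough training data."""
    units = set()
    for region, time_unit in product(region_units, time_units):
        if sample_counts.get((region, time_unit), 0) >= min_samples:
            units.add((region, time_unit))
        else:
            units.add((region, "all"))  # expanded (coarser) time unit
    return units

counts = {("huangpu", "workday"): 50000, ("huangpu", "non-workday"): 200}
print(build_units(["huangpu"], ["workday", "non-workday"], counts))
```

Other expansion policies (enlarging the region unit instead, or merging adjacent cells) would follow the same pattern.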
For example, historical query data occurring at all times in Shanghai may serve as the first training data, and historical query data occurring from 8:00 to 10:00 a.m. in Huangpu District may serve as the second training data.
Step 202 in the above embodiment, "training the ranking model with the training data to obtain the ranking model corresponding to the preset spatio-temporal unit; during training, the first training data being used to train the ranking model globally and the second training data to train it locally", is described next.
In this step, the first training data is used for global training and the second training data for local training, finally yielding a personalized ranking model that reflects both the global query requirements and the local spatio-temporal query requirements. The training objective used when training the ranking model is to maximize the difference between the ranking model's scores for the selected query terms and its scores for the unselected query terms. This step can be implemented in, but is not limited to, the following two ways:
the first implementation mode comprises the following steps: firstly, pre-training a ranking model by utilizing first training data to update global model parameters and personalized model parameters to obtain a global ranking model; and then, further training the global ordering model by using second training data to adjust personalized model parameters to obtain an ordering model corresponding to the preset spatio-temporal unit.
In this way, the ranking model is first trained on the historical query data occurring in the spatio-temporal set, updating all model parameters to obtain a global ranking model. This is equivalent to training the whole ranking model on a wide range of historical query data.
As a typical application scenario, the method and apparatus provided by the present disclosure may be applied to a scenario regarding a Point of Interest (POI) query in a map-like application. Namely, when the user uses the map application to perform POI query, the query completion suggestion is recommended to the user in real time along with the query prefix input by the user. After determining candidate POIs corresponding to the query prefix input by the user, the query completion suggestion is obtained by ranking each candidate POI by using a ranking model. The above method provided by the present disclosure is described in detail below by taking this application scenario as an example.
The ranking model involved can be implemented by any neural network model. Structurally it mainly comprises an embedding network and a ranking network. The embedding network encodes the input data into vector representations, for example a vector representation of the query prefix and a vector representation of each query term. The ranking network scores each query term based on the vector representations output by the embedding network.
For the ranking model, a similarity-based approach or a regression-based approach may be used.
In one implementation, based on similarity, the structure of the ranking model may include three parts, as shown in FIG. 3: a prefix embedding network, a POI embedding network, and a similarity computation network.
The prefix embedding network produces the vector representation u of the query prefix, and the POI embedding network produces the vector representation v of a candidate POI. The similarity computation network determines the similarity S between the vector representation of the query prefix and that of the candidate POI, and each candidate POI is scored based on S.
During training, for one piece of training data consisting of a query prefix, a positive example POI, and a negative example POI: the query prefix is fed into the prefix embedding network, which outputs its vector representation u. The positive and negative example POIs are fed into the POI embedding network, which outputs the vector representation v+ of the positive example POI and the vector representation v- of the negative example POI. The ranking network then computes the similarity S+ between u and v+, and the similarity S- between u and v-. The training objective of the whole ranking model is to maximize the difference between S+ and S-.
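The patent states the objective (maximize the gap between S+ and S-) without fixing a loss function. A margin/hinge loss is one common way to realize such a pairwise objective; the sketch below is that assumed choice, with cosine similarity standing in for the similarity computation network.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def pairwise_loss(u, v_pos, v_neg, margin=1.0):
    """Hinge-style pairwise loss: zero once S+ exceeds S- by at least
    `margin`, so minimizing it maximizes the S+ / S- gap."""
    s_pos = cosine(u, v_pos)   # S+ : prefix vs. positive-example POI
    s_neg = cosine(u, v_neg)   # S- : prefix vs. negative-example POI
    return max(0.0, margin - (s_pos - s_neg))

u = [1.0, 0.0]
print(pairwise_loss(u, v_pos=[1.0, 0.0], v_neg=[0.0, 1.0]))  # well separated -> 0.0
print(pairwise_loss(u, v_pos=[0.0, 1.0], v_neg=[1.0, 0.0]))  # inverted -> 2.0
```

In an actual implementation, gradients of this loss would flow back through both embedding networks and the similarity computation network.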
In another implementation, based on regression, the structure of the ranking model may include a prefix embedding network, a POI embedding network, and a regression network, as shown in FIG. 4; it may additionally include, for example, a user embedding network.
The prefix embedding network produces the vector representation u of the query prefix, the POI embedding network produces the vector representation v of a candidate POI, and the user embedding network produces the vector representation d of the user attribute information. These vector representations are concatenated, and the regression network maps the concatenated vector to the score of the candidate POI.
During training, for one piece of training data consisting of a query prefix, a positive example POI, and a negative example POI: the query prefix is fed into the prefix embedding network, which outputs its vector representation u. The positive and negative example POIs are fed into the POI embedding network, which outputs the vector representation v+ of the positive example POI and the vector representation v- of the negative example POI. The concatenation of u and v+ (optionally further concatenated with the vector representation of the user attribute information) is mapped by the regression network to the score P+ of the positive example POI; likewise, the concatenation of u and v- is mapped to the score P- of the negative example POI. The training objective of the whole ranking model is to maximize the difference between P+ and P-.
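The concatenate-then-regress scoring step can be sketched with a single linear layer. This is an assumption for illustration: a real regression network would typically stack several learned layers, and all vectors and weights here are made-up toy values.

```python
def regression_score(u, v, d, weights, bias=0.0):
    """Concatenate [u; v; d] and map the result to a scalar score
    with one linear layer (stand-in for the regression network)."""
    x = u + v + d  # list concatenation = vector concatenation
    return sum(w * xi for w, xi in zip(weights, x)) + bias

u = [0.2, 0.8]        # query-prefix embedding
v_pos = [0.3, 0.7]    # positive-example POI embedding
v_neg = [0.9, 0.1]    # negative-example POI embedding
d = [0.5]             # user-attribute embedding
w = [0.0, 1.0, 0.0, 1.0, 0.2]

p_pos = regression_score(u, v_pos, d, w)  # = 0.8 + 0.7 + 0.1
p_neg = regression_score(u, v_neg, d, w)  # = 0.8 + 0.1 + 0.1
print(p_pos, p_neg)
```

With these toy weights the positive example scores higher than the negative one, which is exactly the gap the training objective maximizes.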
After the global ranking model is obtained by training, only the personalized model parameters in it are adjusted using the historical query data occurring in the preset spatio-temporal unit, while the shared parameters are kept fixed. This finally yields the ranking model corresponding to the spatio-temporal unit.
As shown in fig. 5, the networks in the ranking model are divided into a sharing layer and a personalization layer. Some network layers are global to all spatio-temporal units and are little affected by the individual differences among spatio-temporal units; these are called sharing layers, and their parameters are the global model parameters. Other network layers are strongly affected by the individual differences among spatio-temporal units; these are called personalization layers, and their parameters are the personalized model parameters.
When the global ranking model is trained, the shared parameters and the personalized model parameters are trained uniformly. However, when further training is performed with the historical query data of the preset spatio-temporal unit, the shared parameters are fixed and only the personalized model parameters are adjusted.
The global model parameters may include the parameters of the embedding networks in the ranking model. The personalized model parameters may include parameters of the ranking network, such as the parameters of the similarity calculation network or the regression network.
In addition, since the ranking network generally includes a multi-layer network structure, the global model parameters may further include, in addition to the parameters of the embedding networks, the parameters involved in some of the lower layers of the ranking network. The personalized model parameters may then include the parameters involved in the upper layers of the ranking network, i.e., the parameters other than the global model parameters.
As a preferred embodiment, the personalized model parameters may include model parameters of a fully connected layer in the ranking model. The global model parameters include model parameters of other network layers in the ranking model except for the fully connected layer.
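The fine-tuning step — freeze the shared parameters, adjust only the personalized (fully connected) parameters — can be sketched as follows. The parameter shapes, the squared loss, and the single-layer structure are illustrative assumptions; only the freeze/update split mirrors the description above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Parameters of a pre-trained global ranking model (illustrative shapes).
params = {
    "embedding": rng.normal(size=(16, 8)),  # sharing layer: global model parameters
    "fc": rng.normal(size=8),               # personalization layer: fully connected
}

def finetune_step(params, x, target, lr=0.1):
    """One gradient step on a squared loss, updating only personalized parameters."""
    h = x @ params["embedding"]             # shared embedding, kept fixed
    pred = h @ params["fc"]                 # personalized fully connected layer
    grad_fc = 2.0 * (pred - target) * h     # d(loss)/d(fc); no gradient to embedding
    new = dict(params)
    new["fc"] = params["fc"] - lr * grad_fc # adjust only the personalization layer
    return new

x, target = rng.normal(size=16), 1.0
tuned = finetune_step(params, x, target)
```

After the step, `tuned["embedding"]` is unchanged while `tuned["fc"]` has moved, which is exactly the division between fixed shared parameters and adjusted personalized parameters.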
In addition, for the technical solution provided by the above embodiment, the pre-training of the global ranking model is a common basis for further training the ranking models of the spatio-temporal units, and the trainings of the ranking models of the different spatio-temporal units are independent of one another and do not interfere with each other. Thus, the present disclosure may be implemented with a large-scale distributed processing architecture, thereby improving the efficiency of model training. That is, after the global ranking model is trained in step 201, a distributed structure may be adopted to allocate the ranking model training task of each spatio-temporal unit to different computing nodes for parallel execution, and each computing node performs the training of step 202 to obtain the ranking model of its spatio-temporal unit.
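Because the per-unit fine-tuning tasks are independent, they can be dispatched in parallel. The sketch below stands in for a distributed architecture with a thread pool; the task function, the unit naming, and the pool itself are illustrative assumptions, not the patent's infrastructure.

```python
from concurrent.futures import ThreadPoolExecutor

# The shared global ranking model is the common basis for every unit's task.
GLOBAL_MODEL = {"shared": "pretrained"}

def finetune_unit(unit):
    # In a real system this would load the unit's historical query data and
    # adjust only the personalized parameters; here we just tag the result.
    return unit, {**GLOBAL_MODEL, "personalized": f"tuned-for-{unit}"}

# Each spatio-temporal unit's task is independent, so they run in parallel.
units = [f"region{r}+slot{t}" for r in range(4) for t in range(3)]
with ThreadPoolExecutor(max_workers=4) as pool:
    unit_models = dict(pool.map(finetune_unit, units))
```

Each worker plays the role of a computing node: it starts from the shared global model and produces the ranking model of one spatio-temporal unit.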
The second implementation mode comprises the following steps: performing at least one round of alternating training on the ranking model by using a first training task and a second training task to obtain the ranking model corresponding to the preset spatio-temporal unit. The first training task is to train the ranking model with the first training data so as to update the global model parameters and the personalized model parameters; the second training task is to train the ranking model with the second training data so as to update the personalized model parameters.
This implementation manner differs from the first in that, instead of performing global training of the ranking model with the first training data and then training with the second training data, the global training and the personalized training of the ranking model are performed alternately.
Several rounds of training of the first training task are performed, then several rounds of training of the second training task, then several rounds of the first training task again, and so on until the model converges. The training data, training targets, structure of the ranking model, setting of the model parameters, and so on used in the model training process may all follow the description in the first implementation manner, and are not repeated here.
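The alternating schedule can be sketched as a simple loop. The round counts, the convergence tolerance, and the toy training steps (whose loss halves per call) are illustrative assumptions standing in for real gradient updates on the two training tasks.

```python
def train_alternating(first_step, second_step, rounds_first=2, rounds_second=2,
                      max_alternations=10, tol=1e-3):
    """Alternate the two training tasks until the loss stops improving (sketch)."""
    history = []
    for _ in range(max_alternations):
        for _ in range(rounds_first):
            loss = first_step()          # updates global + personalized parameters
            history.append(("first", loss))
        for _ in range(rounds_second):
            loss = second_step()         # updates personalized parameters only
            history.append(("second", loss))
        if history[-1][1] < tol:         # crude convergence check
            break
    return history

# Toy steps whose loss halves each call, standing in for real training rounds.
state = {"loss": 1.0}
def step():
    state["loss"] *= 0.5
    return state["loss"]

history = train_alternating(step, step)
```

The resulting schedule interleaves blocks of first-task rounds with blocks of second-task rounds, exactly the "several rounds each, repeated until convergence" pattern described above.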
As for the model parameters, although a ranking model needs to be trained separately for each spatio-temporal unit, the ranking models of the spatio-temporal units differ only in their personalized model parameters. Therefore, for the query autocompletion system, on the basis of storing the global ranking model, only the corresponding personalized model parameters need to be stored for each spatio-temporal unit, which brings little storage pressure.
After the ranking model of the preset spatio-temporal unit is obtained, it can be used to predict query completion suggestions for a query prefix input by a user in that spatio-temporal unit.
Fig. 6 is a flowchart of a method for query autocompletion according to an embodiment of the present disclosure, in which the ranking model of the spatio-temporal unit used in the method is pre-established by the process shown in fig. 2. As shown in fig. 6, the method may include the following steps:
At 601, the query prefix input by a user, the spatio-temporal unit in which the query prefix is input, and the candidate query terms corresponding to the query prefix are determined.
The present disclosure is applicable to various types of input content, such as Chinese characters, pinyin, initials, etc.; in each case the input query prefix can be regarded as a character string. The query prefix currently input by the user is acquired in real time as the user types. For example, when a user wants to input "Baidu Building", the user may successively input query prefixes such as "Bai", "Baidu" and "Baidu Buil", and the method provided by the present disclosure is executed for each query prefix. That is, when the user has input "Bai", the currently input query prefix is "Bai", and the method of the present disclosure is executed for this query prefix to recommend query completion suggestions to the user. When the user has input "Baidu", the currently input query prefix is "Baidu", and the method of the present disclosure is again executed for this query prefix. Likewise, when the user has input "Baidu Buil", the currently input query prefix is "Baidu Buil", and the method of the present disclosure is executed for this query prefix to recommend query completion suggestions to the user.
The candidate query terms corresponding to the currently input query prefix may be determined in an existing manner, the aim being to find query terms whose text begins with the query prefix, or query terms strongly related to the query prefix.
When the method provided by the present disclosure is applied to POI queries, the query terms are POI information. For example, an inverted index from various query prefixes to the corresponding POI information may be established in advance in the POI library. When a user inputs a query prefix, the POI library is queried with the currently input query prefix, and all the POIs hit are used as candidate POIs.
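Such an inverted index can be sketched as follows, assuming each POI is indexed under every prefix of its name; the POI names and the lowercase normalization are illustrative choices, not the patent's indexing scheme.

```python
from collections import defaultdict

# A toy POI library; real entries would carry full POI information.
poi_library = ["Zhejiang University", "Zhejiang Museum", "Zhongshan Park"]

# Inverted index: every prefix of a POI name points to that POI.
inverted_index = defaultdict(set)
for poi in poi_library:
    for end in range(1, len(poi) + 1):
        inverted_index[poi[:end].lower()].add(poi)

def candidate_pois(query_prefix):
    """All POIs hit by the currently input query prefix become candidates."""
    return sorted(inverted_index.get(query_prefix.lower(), set()))
```

For example, `candidate_pois("Zhejiang")` hits both POIs beginning with that prefix, while an unseen prefix returns an empty candidate list.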
For another example, associations between POIs and various query prefixes may be pre-established in the POI library, and these associations may be obtained from users' retrieval histories. For example, after a user input "Zhejiang", the POI "Zhejiang University" was selected from the search results. The associations may also be obtained using a word list of synonymous POI names; for example, since "the Forbidden City" is also called "Zijincheng", associations between query prefixes such as "Forbidden", "zijin", etc. and the "Forbidden City" POI are pre-established. Associations may also be added manually.
The spatio-temporal unit in which the user inputs the query prefix may be determined from the location and time at which the user inputs the query prefix, that is, it is the spatio-temporal unit to which that location and time belong.
At 602, the query prefix input by the user and the candidate query terms are input into the ranking model corresponding to the spatio-temporal unit, obtaining the score of the ranking model for each candidate query term.
The working principle, structure, etc. of the ranking model may adopt the relevant description in the previous embodiment, and are not described herein again.
At 603, query completion suggestions recommended to the user are determined according to the scores of the candidate query terms.
In this step, candidate query terms whose scores are greater than or equal to a preset score threshold may be used as the query completion suggestions, or the P top-scoring candidate query terms may be used as the query completion suggestions, and so on, where P is a preset positive integer. When the query completion suggestions are recommended to the user, they are sorted in the candidate list according to the scores of the query terms. The recommendation may follow the existing form of a drop-down box near the search box, or take other forms.
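Both selection strategies described in this step can be sketched in one helper; the candidate names and scores below are invented for illustration.

```python
def completion_suggestions(scored, threshold=None, top_p=None):
    """Select and order query completion suggestions from scored candidates.

    Either keep candidates whose score is >= a preset threshold, or keep the
    top-P candidates; in both cases suggestions are sorted by descending score.
    """
    ranked = sorted(scored.items(), key=lambda kv: kv[1], reverse=True)
    if threshold is not None:
        ranked = [(q, s) for q, s in ranked if s >= threshold]
    if top_p is not None:
        ranked = ranked[:top_p]
    return [q for q, _ in ranked]

# Scores as produced by the ranking model for each candidate query term.
scores = {"Shanghai Tower": 0.9, "Shanghai Zoo": 0.4, "Shanghai Station": 0.7}
suggestions = completion_suggestions(scores, top_p=2)
```

With these toy scores, both `top_p=2` and `threshold=0.5` keep the same two highest-scoring candidates, already ordered for display in the drop-down list.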
One specific example is as follows:
A ranking model is first trained with the POI query data of all users across the whole of Shanghai over the previous month, updating the global model parameters and the personalized model parameters.
Then, Shanghai is divided into its districts, namely Huangpu, Xuhui, Changning, Jing'an, Putuo, Hongkou, Yangpu, Minhang, Baoshan, Jiading, Pudong, Jinshan, Songjiang, Qingpu, Fengxian and Chongming, yielding 16 region units.
Every 2 hours of the 24-hour day are taken as one time unit, yielding 12 time units.
The 16 region units and the 12 time units are combined pairwise, obtaining 192 spatio-temporal units.
The personalized model parameters in the ranking model are then further adjusted using the POI query data occurring in each spatio-temporal unit, obtaining the ranking model corresponding to each spatio-temporal unit.
For example, taking the spatio-temporal unit "Huangpu District + 8:00-10:00 a.m." as an example, the POI query data occurring in this spatio-temporal unit is obtained to further train the global ranking model, and the ranking model corresponding to this spatio-temporal unit is thus obtained.
If a user currently inputs the query prefix "Shanghai", and the user is currently located in Huangpu District with the current time between 8:00 and 10:00 a.m., the scores of the candidate POIs corresponding to the query prefix are obtained using the ranking model corresponding to the spatio-temporal unit "Huangpu District + 8:00-10:00 a.m.", and the query completion suggestions recommended to the user are determined according to the scores.
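The 16 × 12 division and the lookup from a user's location and time to a spatio-temporal unit can be sketched as follows; the district list and two-hour slot scheme follow the Shanghai example above, while the lookup function and datetime handling are illustrative assumptions.

```python
from datetime import datetime

# 16 region units x 12 two-hour time units = 192 spatio-temporal units.
DISTRICTS = ["Huangpu", "Xuhui", "Changning", "Jing'an", "Putuo", "Hongkou",
             "Yangpu", "Minhang", "Baoshan", "Jiading", "Pudong", "Jinshan",
             "Songjiang", "Qingpu", "Fengxian", "Chongming"]
TIME_SLOTS = [(h, h + 2) for h in range(0, 24, 2)]   # (0,2), (2,4), ..., (22,24)
UNITS = [(d, slot) for d in DISTRICTS for slot in TIME_SLOTS]

def spatiotemporal_unit(district, when):
    """Map the user's current district and time to the unit they belong to."""
    slot = TIME_SLOTS[when.hour // 2]
    return district, slot

# A user typing at 8:30 a.m. in Huangpu falls into "Huangpu + 8:00-10:00".
unit = spatiotemporal_unit("Huangpu", datetime(2020, 12, 24, 8, 30))
```

The query prefix would then be scored by the ranking model stored under `unit`, i.e., the model fine-tuned on that unit's historical query data.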
The above is a detailed description of the method provided in the present application, and the following is a detailed description of the apparatus provided in the present application with reference to the embodiments.
Fig. 7 is a structural diagram of an apparatus for establishing a ranking model according to an embodiment of the present disclosure. The apparatus may be an application located at a server end, or a functional unit such as a plug-in or Software Development Kit (SDK) in such an application, or may be located in a computer terminal with strong computing power, which is not particularly limited in this embodiment of the present disclosure. As shown in fig. 7, the apparatus 700 may include: a data acquisition unit 701 and a model training unit 702. The main functions of each component unit are as follows:
The data acquisition unit 701 is configured to acquire training data, where the training data includes first training data and second training data; the first training data includes historical query data occurring in a spatio-temporal set, and the second training data includes historical query data occurring in a preset spatio-temporal unit.
The model training unit 702 is configured to train a ranking model using the training data to obtain the ranking model corresponding to the spatio-temporal unit; in the training process, the first training data is used for global training of the ranking model, and the second training data is used for local training of the ranking model.
The ranking model corresponding to the spatio-temporal unit is used to predict query completion suggestions for a query prefix input by a user in the spatio-temporal unit.
The model training unit 702 may adopt, but is not limited to, the following two modes:
The first mode is as follows: the model training unit 702 is specifically configured to pre-train the ranking model using the first training data to update the global model parameters and the personalized model parameters, obtaining a global ranking model; and to further train the global ranking model using the second training data to adjust the personalized model parameters, obtaining the ranking model corresponding to the spatio-temporal unit.
The second mode is as follows: the model training unit 702 is specifically configured to perform at least one round of alternating training on the ranking model using the first training task and the second training task to obtain the ranking model corresponding to the spatio-temporal unit; the first training task is to train the ranking model with the first training data so as to update the global model parameters and the personalized model parameters; the second training task is to train the ranking model with the second training data so as to update the personalized model parameters.
As a preferred embodiment, the historical query data may include: the query prefix input when the user selects the query item from the query completion suggestions, the selected query item and the unselected query item in the query completion suggestions corresponding to the query prefix.
The training targets used by the model training unit 702 in training the ranking model are: the difference between the scores of the ranking model for the selected query terms and the scores for the unselected query terms is maximized.
As one implementation, the global model parameters may include: the parameters of the network are embedded in the ranking model. The personalized model parameters include at least some of the parameters in the ranking network in the ranking model.
As a typical implementation, the personalized model parameters may include model parameters of a fully connected layer. The global model parameters may include model parameters of other network layers than the fully connected layer.
Wherein the space-time unit is obtained by combining a time unit and a region unit. The size of the region unit may be set according to the sparseness of the historical query data occurring at the region unit.
As a preferred embodiment, the model training unit 702 may be implemented using a large-scale distributed processing architecture.
Fig. 8 is a structural diagram of an apparatus for query autocompletion according to an embodiment of the present disclosure, where the ranking model used in the apparatus may be obtained by pre-training with the apparatus shown in fig. 7. As shown in fig. 8, the apparatus 800 includes: an obtaining unit 11, a scoring unit 12, and a query completion unit 13. The main functions of each component unit are as follows:
The obtaining unit 11 is configured to obtain a query prefix input by a user and candidate query terms corresponding to the query prefix, and to determine the spatio-temporal unit in which the user inputs the query prefix.
The scoring unit 12 is configured to input the query prefix and the candidate query terms into the ranking model corresponding to the spatio-temporal unit, obtaining the score of the ranking model for each candidate query term.
The query completion unit 13 is configured to determine the query completion suggestions recommended to the user according to the scores of the candidate query terms.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, as for the apparatus embodiment, since it is substantially similar to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
FIG. 9 is a block diagram of an electronic device for a method of building a ranking model or query autocompletion according to an embodiment of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 9, the device 900 includes a computing unit 901, which can perform various appropriate actions and processes in accordance with a computer program stored in a read-only memory (ROM) 902 or a computer program loaded from a storage unit 908 into a random access memory (RAM) 903. In the RAM 903, various programs and data required for the operation of the device 900 can also be stored. The computing unit 901, the ROM 902, and the RAM 903 are connected to each other via a bus 904. An input/output (I/O) interface 905 is also connected to the bus 904.
A number of components in the device 900 are connected to the I/O interface 905, including: an input unit 906 such as a keyboard, a mouse, and the like; an output unit 907 such as various types of displays, speakers, and the like; a storage unit 908 such as a magnetic disk, optical disk, or the like; and a communication unit 909 such as a network card, a modem, a wireless communication transceiver, and the like. The communication unit 909 allows the device 900 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 901 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 901 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 901 performs the various methods and processes described above, such as methods of building a ranking model or query autocompletion. For example, in some embodiments, the method of building a ranking model or query autocompletion may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as storage unit 908.
In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 900 via the ROM 902 and/or the communication unit 909. When the computer program is loaded into the RAM 903 and executed by the computing unit 901, one or more steps of the above-described method of building a ranking model or query autocompletion may be performed. Alternatively, in other embodiments, the computing unit 901 may be configured to perform the method of building a ranking model or query autocompletion by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chips (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (21)

1. A method of building a ranking model, comprising:
acquiring training data, wherein the training data comprises first training data and second training data; the first training data comprises historical query data occurring in a space-time set, and the second training data comprises historical query data occurring in a preset space-time unit;
training a ranking model by using the training data to obtain a ranking model corresponding to the spatio-temporal unit; in the training process, the first training data is used for carrying out global training on the ranking model, and the second training data is used for carrying out local training on the ranking model;
and the ranking model corresponding to the spatio-temporal unit is used for predicting the query completion suggestion of the query prefix input by the user in the spatio-temporal unit.
2. The method of claim 1, wherein training a ranking model using the training data to obtain a ranking model corresponding to the spatio-temporal unit comprises:
pre-training a ranking model by using the first training data to update global model parameters and personalized model parameters to obtain a global ranking model;
and further training the global ranking model by using the second training data to adjust the personalized model parameters to obtain a ranking model corresponding to the spatio-temporal unit.
3. The method of claim 1, wherein training a ranking model using the training data to obtain a ranking model corresponding to the spatio-temporal unit comprises:
performing at least one round of alternating training on the ranking model by utilizing a first training task and a second training task to obtain a ranking model corresponding to the spatio-temporal unit;
wherein the first training task is to train the ranking model using the first training data to update global model parameters and personalized model parameters; the second training task is to train the ranking model using the second training data to update the personalized model parameters.
4. The method of claim 1, wherein the historical query data comprises: the query prefix input when the user selects the query item from the query completion suggestions, the selected query item and the unselected query item in the query completion suggestions corresponding to the query prefix;
the training objectives employed in training the ranking model are: the difference between the ranking model's scores for selected query terms and the scores for unselected query terms is maximized.
5. The method of claim 2 or 3, wherein the global model parameters comprise: embedding network parameters into the sequencing model;
the personalized model parameters comprise at least part of parameters in a ranking network in the ranking model.
6. The method of claim 5, wherein the personalized model parameters comprise model parameters of a fully connected layer;
the global model parameters include model parameters of other network layers than the fully connected layer.
7. The method of claim 1, wherein the spatio-temporal unit is a combination of a time unit and a region unit;
wherein the size of the region unit is set according to the sparseness of the historical query data occurring at the region unit.
8. The method of claim 2, wherein the pre-training and the further training of the global ranking model to adjust the model parameters are implemented using a large-scale distributed processing architecture.
9. A method of query autocompletion, comprising:
acquiring a query prefix input by a user and candidate query terms corresponding to the query prefix, and determining the spatio-temporal unit in which the user inputs the query prefix;

inputting the query prefix and the candidate query terms into the ranking model corresponding to the spatio-temporal unit to obtain the scores of the ranking model for the candidate query terms;

determining a query completion suggestion recommended to the user according to the scores of the candidate query terms;
the order model corresponding to the space-time unit is obtained by training by the method according to any one of claims 1 to 8.
10. An apparatus for building a ranking model, comprising:
the data acquisition unit is used for acquiring training data, and the training data comprises first training data and second training data; the first training data comprises historical query data occurring in a space-time set, and the second training data comprises historical query data occurring in a preset space-time unit;
the model training unit is used for training a ranking model by utilizing the training data to obtain a ranking model corresponding to the spatio-temporal unit; in the training process, the first training data is used for carrying out global training on the ranking model, and the second training data is used for carrying out local training on the ranking model;
and the ranking model corresponding to the spatio-temporal unit is used for predicting the query completion suggestion of the query prefix input by the user in the spatio-temporal unit.
11. The apparatus according to claim 10, wherein the model training unit is specifically configured to pre-train a ranking model with the first training data to update global model parameters and personalized model parameters, resulting in a global ranking model; and further train the global ranking model with the second training data to adjust the personalized model parameters, obtaining a ranking model corresponding to the spatio-temporal unit.
12. The apparatus according to claim 10, wherein the model training unit is specifically configured to perform at least one round of alternating training on the ranking model by using a first training task and a second training task to obtain the ranking model corresponding to the spatio-temporal unit; wherein the first training task is to train the ranking model using the first training data to update global model parameters and personalized model parameters; the second training task is to train the ranking model using the second training data to update the personalized model parameters.
13. The apparatus of claim 10, wherein the historical query data comprises: the query prefix input when the user selects the query item from the query completion suggestions, the selected query item and the unselected query item in the query completion suggestions corresponding to the query prefix;
the training targets adopted by the model training unit in training the ranking model are as follows: the difference between the ranking model's scores for selected query terms and the scores for unselected query terms is maximized.
14. The apparatus of claim 11 or 12, wherein the global model parameters comprise parameters of an embedding network in the ranking model;
and the personalized model parameters comprise at least part of the parameters of a ranking network in the ranking model.
15. The apparatus of claim 14, wherein the personalized model parameters comprise model parameters of a fully connected layer;
and the global model parameters comprise model parameters of network layers other than the fully connected layer.
16. The apparatus of claim 10, wherein the spatio-temporal unit is a combination of a time unit and a region unit;
wherein the size of the region unit is set according to the sparseness of the historical query data occurring in the region unit.
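Claim 16 ties the region-unit size to data sparseness. A simple way to read this is an adaptive grid: coarsen the cell until it is expected to hold enough historical queries to train on. The sketch below is illustrative only — the doubling rule, the base size, and the query threshold are assumptions, not values from the patent:

```python
def region_unit_size(query_count, base_km=1.0, min_queries=1000):
    """Coarsen sparse regions: double the cell edge until the cell is
    expected to contain at least `min_queries` historical queries,
    capped at 64x the base edge length.  Doubling the edge quadruples
    the area, so the expected query count quadruples as well."""
    size = base_km
    while query_count < min_queries and size < 64 * base_km:
        size *= 2.0
        query_count *= 4.0
    return size
```

Dense downtown areas thus keep fine-grained region units, while sparsely queried rural areas get coarse ones, so every spatio-temporal unit can gather enough second training data for local fine-tuning.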
17. The apparatus of claim 11, wherein the model training unit is implemented using a large-scale distributed processing architecture.
18. An apparatus for query autocomplete, comprising:
the query unit is used for acquiring a query prefix input by a user and candidate query items corresponding to the query prefix, and determining the spatio-temporal unit in which the user inputs the query prefix;
the scoring unit is used for inputting the query prefix and the candidate query items into the ranking model corresponding to the spatio-temporal unit to obtain the ranking model's score for each candidate query item;
the query completion unit is used for determining the query completion suggestions recommended to the user according to the scores of the candidate query items;
wherein the ranking model corresponding to the spatio-temporal unit is trained by the apparatus of any one of claims 10 to 17.
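The query-time flow of claim 18 can be sketched end to end: look up the ranking model for the unit the user is querying from, score every candidate, and return the top suggestions. All names here (`models` as a dict keyed by unit, a callable model, `top_k`) are assumptions for illustration; the claim does not prescribe a data structure:

```python
def autocomplete(prefix, candidates, models, spatiotemporal_unit, top_k=5):
    """Query unit + scoring unit + query completion unit in one pass:
    select the per-unit ranking model, score each candidate query item
    for the prefix, and recommend the highest-scoring items."""
    model = models[spatiotemporal_unit]            # ranking model for this unit
    scored = [(model(prefix, cand), cand) for cand in candidates]
    scored.sort(reverse=True)                      # highest score first
    return [cand for _, cand in scored[:top_k]]
```

Because each spatio-temporal unit carries its own fine-tuned personalized parameters, the same prefix can yield different suggestions in, say, a business district at noon and a residential area at night.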
19. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
20. A non-transitory computer readable storage medium having stored thereon computer instructions for causing the computer to perform the method of any one of claims 1-9.
21. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
CN202011551563.1A 2020-12-24 2020-12-24 Method for establishing sorting model, method for inquiring automatic completion and corresponding device Active CN112528156B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011551563.1A CN112528156B (en) 2020-12-24 2020-12-24 Method for establishing sorting model, method for inquiring automatic completion and corresponding device

Publications (2)

Publication Number Publication Date
CN112528156A true CN112528156A (en) 2021-03-19
CN112528156B CN112528156B (en) 2024-03-26

Family

ID=74976271

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011551563.1A Active CN112528156B (en) 2020-12-24 2020-12-24 Method for establishing sorting model, method for inquiring automatic completion and corresponding device

Country Status (1)

Country Link
CN (1) CN112528156B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103914536A (en) * 2014-03-31 2014-07-09 北京百度网讯科技有限公司 Interest point recommending method and system for electronic maps
CN104462369A (en) * 2014-12-08 2015-03-25 沈阳美行科技有限公司 Automatic search completion method for navigation equipment
US20180081989A1 (en) * 2016-09-22 2018-03-22 Yahoo Holdings, Inc. Method and system for providing query suggestions based on personalized spelling correction
CN107862004A (en) * 2017-10-24 2018-03-30 科大讯飞股份有限公司 Intelligent sorting method and device, storage medium and electronic equipment
CN109325635A (en) * 2018-10-25 2019-02-12 电子科技大学中山学院 Position prediction method based on automatic completion
WO2019177620A1 (en) * 2018-03-16 2019-09-19 Ford Motor Company Optimizing and predicting availability of resources in a shared vehicle environment
WO2019219846A1 (en) * 2018-05-17 2019-11-21 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Concepts for distributed learning of neural networks and/or transmission of parameterization updates therefor
CN111222058A (en) * 2020-01-06 2020-06-02 百度在线网络技术(北京)有限公司 Method, device, equipment and computer storage medium for query automatic completion
CN111241427A (en) * 2020-01-06 2020-06-05 百度在线网络技术(北京)有限公司 Method, device, equipment and computer storage medium for query automatic completion


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG Mo; ZHENG Xiaohuan; WANG Juanle; BAI Yongqing: "Design and Implementation of a Personalized Recommendation Method for Geoscience Data Based on Hybrid Filtering", Geographical Research, no. 04 *
ZHAO Zhenyu: "Research on Recommendation Methods Based on Deep Learning and Sea-Cloud Collaboration", China Doctoral Dissertations Full-text Database, Information Science and Technology, no. 8, pages 5 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115016873A (en) * 2022-05-05 2022-09-06 上海乾臻信息科技有限公司 Front-end data interaction method and system, electronic equipment and readable storage medium
CN115016873B (en) * 2022-05-05 2024-07-12 上海乾臻信息科技有限公司 Front-end data interaction method, system, electronic equipment and readable storage medium

Also Published As

Publication number Publication date
CN112528156B (en) 2024-03-26

Similar Documents

Publication Publication Date Title
CN112000700B (en) Map information display method and device, electronic equipment and storage medium
EP3822876A2 (en) Method and apparatus for predicting destination, electronic device and storage medium
CN111666461B (en) Method, apparatus, device and computer storage medium for retrieving geographic location
CN114357105B (en) Pre-training method and model fine-tuning method of geographic pre-training model
CN112612957A (en) Interest point recommendation method, interest point recommendation model training method and device
CN111666292B (en) Similarity model establishment method and device for retrieving geographic position
CN111241427B (en) Method, device, equipment and computer storage medium for query automatic completion
CN114329244A (en) Map interest point query method, map interest point query device, map interest point query equipment, storage medium and program product
CN112765452B (en) Search recommendation method and device and electronic equipment
CN112905903A (en) House renting recommendation method and device, electronic equipment and storage medium
CN112528146B (en) Content resource recommendation method and device, electronic equipment and storage medium
CN112989170A (en) Keyword matching method applied to information search, information search method and device
CN113033194A (en) Training method, device, equipment and storage medium of semantic representation graph model
CN116521816A (en) Data processing method, retrieval method, device, equipment and storage medium
CN112528156B (en) Method for establishing sorting model, method for inquiring automatic completion and corresponding device
CN113326450B (en) Point-of-interest recall method and device, electronic equipment and storage medium
CN113761381B (en) Method, device, equipment and storage medium for recommending interest points
EP4047447A1 (en) Route recommendation method and apparatus, electronic device, and storage medium
CN111782748B (en) Map retrieval method, information point POI semantic vector calculation method and device
CN114036414A (en) Method and device for processing interest points, electronic equipment, medium and program product
CN113407579A (en) Group query method and device, electronic equipment and readable storage medium
CN112861023A (en) Map information processing method, map information processing apparatus, map information processing device, storage medium, and program product
CN113868532B (en) Location recommendation method and device, electronic equipment and storage medium
CN112528157A (en) Method for establishing sequencing model, method for automatically completing query and corresponding device
CN116383491B (en) Information recommendation method, apparatus, device, storage medium, and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant