CN110750734A - Weather display method and device, computer equipment and computer-readable storage medium - Google Patents
- Publication number
- CN110750734A CN110750734A CN201910995532.6A CN201910995532A CN110750734A CN 110750734 A CN110750734 A CN 110750734A CN 201910995532 A CN201910995532 A CN 201910995532A CN 110750734 A CN110750734 A CN 110750734A
- Authority
- CN
- China
- Prior art keywords
- weather
- log data
- target log
- data
- video material
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F16/9536 — Querying by use of web search engines; search customisation based on social or collaborative filtering
- G06F16/9537 — Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
- G06F16/9538 — Presentation of query results
- G06Q50/01 — ICT specially adapted for specific business sectors; social networking
- Y02A90/10 — Information and communication technologies supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The application discloses a weather display method and apparatus, a computer device, and a computer-readable storage medium, belonging to the field of network technologies. The method obtains the location information in target log data and the release time of the target log data, determines the weather type indicated by the target log data, and acquires a video material corresponding to that weather type; the video material describes both the visual effect and the auditory effect of the weather type and is played in the display area of the target log data. In this weather display mode, weather information is presented in two dimensions, image and sound, which improves the presentation of the weather information, gives the user a stronger sense of immersion when browsing weather data, and improves the user's audiovisual experience.
Description
Technical Field
The present application relates to the field of network technologies, and in particular, to a weather display method and apparatus, a computer device, and a computer-readable storage medium.
Background
With the development of internet technology, more and more social applications face users, who can share content through them, for example by publishing updates to a friend circle, QQ Zone, or a microblog. At present, most social applications provide a weather sign-in function: a user can publish a weather card to share weather information for his or her location. When other users browse the weather card, a server obtains the location information carried in the card, finds the corresponding weather materials based on that location, and displays them on the user's client. In this weather display method, however, the obtained weather materials are usually static pictures or animated pictures in GIF format; the display effect is monotonous and cannot present the corresponding weather well, so users lack a sense of immersion when browsing and the audiovisual experience is poor.
Disclosure of Invention
The embodiment of the application provides a weather display method and apparatus, a computer device, and a computer-readable storage medium, which can address the problem in the related art that the weather display effect is monotonous. The technical scheme is as follows:
In one aspect, a weather display method is provided, which includes:
acquiring position information in target log data and the release time of the target log data;
determining a weather type indicated by the target log data based on the location information and the release time;
acquiring a video material corresponding to the weather type, wherein the video material is used for describing the visual effect and the auditory effect of the weather type;
and playing the video material on a display page of the target log data.
In one aspect, there is provided a weather display apparatus, the apparatus including:
the information acquisition module is used for acquiring the position information in the target log data and the release time of the target log data;
a determining module, configured to determine a weather type indicated by the target log data based on the location information and the release time;
the video acquisition module is used for acquiring a video material corresponding to the weather type, and the video material is used for describing the visual effect and the auditory effect of the weather type;
and the playing module is used for playing the video material on the display page of the target log data.
In one possible implementation, the apparatus further includes:
the traffic data acquisition module is used for acquiring live traffic data corresponding to the location information when the target log data includes a traffic condition keyword;
and the display module is used for displaying the live traffic data in the display area of the target log data.
In one aspect, a computer device is provided that includes one or more processors and one or more memories having at least one program code stored therein, the at least one program code being loaded and executed by the one or more processors to implement the operations performed by the weather display method.
In one aspect, a computer-readable storage medium is provided, having at least one program code stored therein, the at least one program code being loaded and executed by a processor to implement the operations performed by the weather display method.
According to the technical scheme, the location information in the target log data and the release time of the target log data are obtained; the weather type indicated by the target log data is determined based on the location information and the release time; a video material corresponding to the weather type, which describes the visual effect and the auditory effect of the weather type, is obtained; and the video material is played on the display page of the target log data.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic diagram of an implementation environment of a weather display method according to an embodiment of the present application;
Fig. 2 is a flowchart of a publishing process of target log data according to an embodiment of the present application;
Fig. 3 is a flowchart of a weather display method according to an embodiment of the present application;
Fig. 4 is a schematic diagram of a dynamic effect trigger area according to an embodiment of the present application;
Fig. 5 is a schematic diagram of a video material display effect according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a method for publishing and displaying weather information according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a weather display apparatus according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of an implementation environment of a weather display method according to an embodiment of the present application, and referring to fig. 1, the implementation environment includes at least one first terminal 101 and at least one second terminal 102. The at least one first terminal 101 and the at least one second terminal 102 may each have a social application installed and run thereon, and the social application may provide a data sharing function. The at least one first terminal 101 and the at least one second terminal 102 may be a tablet computer, a smart phone, and the like, which is not limited in this embodiment of the application. For example, the first terminal 101 may be a terminal used by a journal publishing user, the second terminal 102 may be a terminal used by a journal browsing user, and a user account may be logged in a social application running in the first terminal 101 and the second terminal 102.
Of course, the implementation environment may further include at least one server 103, and the at least one server 103 may be used to provide services such as data publishing, data querying, and the like. The at least one first terminal 101, the at least one second terminal 102 and the at least one server 103 may be connected via a wired network or a wireless network to ensure that data transmission between the respective terminals and the server 103 is possible. The first terminal 101 and the second terminal 102 may both issue log data to the server, or may send a log data query request to the server to obtain log data issued by another user from the server. The at least one server 103 may be a device such as a cloud computing platform, which is not limited in this embodiment of the present application.
In order to facilitate understanding of the technical processes of the embodiments of the present application, some terms referred to in the embodiments of the present application are explained below:
LBS (Location-Based Service): the position information (geographic coordinates or geodetic coordinates) of a mobile terminal user is obtained through the radio communication network of a telecom mobile operator (such as a GSM or CDMA network) or through an external positioning method (such as GPS), and corresponding value-added services are provided for the user with the support of a Geographic Information System (GIS) platform. LBS combines a mobile communication network with a computer network, and the two networks interact through a gateway: a mobile terminal sends a request through the mobile communication network, and the request is passed to the LBS service platform through the gateway; the service platform processes the request according to the user's current position and returns the result to the user through the gateway.
Fig. 2 is a flowchart of a publishing process of target log data according to an embodiment of the present application, and referring to fig. 2, the method may specifically include the following steps:
201. The first terminal acquires the user's location information based on the user's target log data editing request.
The first terminal can be a client device used by any user, a social application program can be installed and run on the first terminal, and the social application program can provide a log data publishing function. The target log data belong to a target data category, and in the embodiment of the application, the log data belonging to the target data category can trigger a dynamic display effect of the weather information. The target log data may be log data published by any user.
In a possible implementation manner, the target log data editing request may be triggered by the user's selection of the target data type. After receiving the target log data editing request triggered by the user, the first terminal may detect whether the user has authorized location access for the social application. If the user has not performed the location information authorization operation, the first terminal may display an authorization prompt message asking the user to grant location access; if the user has, it obtains the user's Location-Based Service (LBS) information, that is, the user's location information.
202. The first terminal acquires the information provided by the user on a log publishing page and sends a target log data publishing request to the server based on the provided information and the location information.
In a possible implementation manner, after receiving the target log data editing request, the first terminal may display a log editing page, which may include, for example, a text information input region, a picture selection region, and a publishing control; the first terminal may obtain the information provided by the user when the user triggers the publishing control. The picture selection region may display a plurality of background pictures determined based on the location information; for example, the background pictures may show landmark buildings in the area indicated by the location information. Specifically, after the first terminal acquires the location information, it may send a picture acquisition request carrying the location information to a target server storing background pictures, and the target server may return at least one background picture corresponding to the location information, for example pictures whose shooting location matches the location indicated by the location information. Each background picture may carry a location tag indicating its shooting location, so that the server can match the location information against the background pictures.
In this embodiment of the application, each background picture may also carry a weather tag indicating the weather type it depicts. After the first terminal acquires the location information, it may also acquire instant weather information for that location; the target server then screens out, from a background picture material library, at least one background picture matching both the location information and the instant weather information, and sends it to the first terminal for display. It should be noted that the above description of obtaining background pictures is only exemplary, and the embodiment of the present application does not limit which specific method is adopted.
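The tag-based screening described above amounts to filtering a picture library on two attributes. A minimal sketch, assuming hypothetical `location_tag` and `weather_tag` fields on each picture record (the patent does not specify a data layout):

```python
def select_backgrounds(pictures, location, weather):
    """Filter a background-picture library by location and weather tags.

    `pictures` is a list of dicts; the `location_tag` and `weather_tag`
    field names are illustrative assumptions, not from the patent.
    """
    return [
        p for p in pictures
        if p["location_tag"] == location and p["weather_tag"] == weather
    ]
```

A candidate list can then be sent back to the terminal for the user to pick from, mirroring the picture selection region described above.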
203. The server generates target log data based on the target log data issue request.
In a possible implementation manner, after receiving the target log data publishing request, the server may generate the target log data based on the user's location information and the text, pictures, and other information provided by the user. The target log data may also carry a target data category identifier, used to indicate the data category of the target log data, and a timestamp, used to indicate its publishing time.
It should be noted that the above description of the target log data publishing process is only an exemplary description, and the embodiment of the present application does not specifically limit which manner is specifically used to publish the target log data.
Fig. 3 is a flowchart of a weather display method provided in an embodiment of the present application, and referring to fig. 3, the method may specifically include the following steps:
301. The second terminal displays the target log data on the target page.
In this embodiment of the application, the second terminal may be a client device used by any user, a social application may be installed and run on the second terminal, and the social application may provide log data publishing and log data displaying functions.
The target page may display a plurality of pieces of log data, and in one possible implementation, the second terminal may obtain, based on a log data browsing request of a user, a plurality of pieces of log data from a server and display the log data on the target page, where the plurality of pieces of log data may include log data belonging to the category of the target data.
302. The second terminal acquires the position information in the target log data and the release time of the target log data.
In this embodiment of the application, the second terminal may detect whether a display area of the target log data coincides with a dynamic effect trigger area, and when the display area coincides with the dynamic effect trigger area, perform the step of acquiring the location information and the release time. Referring to fig. 4, fig. 4 is a schematic diagram of a dynamic effect trigger area provided in an embodiment of the present application, and when a display area 401 of the target log data overlaps with a dynamic effect trigger area 402, the second terminal may be triggered to acquire the location information in the log data and the publishing time of the target log data.
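The trigger condition above is a rectangle-overlap test between the log's display area and the dynamic-effect trigger area. A minimal sketch, assuming axis-aligned `(x, y, width, height)` rectangles and a hypothetical log-entry dict; the patent does not prescribe these representations:

```python
def rects_overlap(a, b):
    """Return True if axis-aligned rectangles a and b overlap.

    Each rectangle is (x, y, width, height) in screen coordinates.
    """
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah


def maybe_trigger_weather_effect(display_area, trigger_area, log_entry):
    """Read location and publish time only once the log's display area
    enters the trigger area; field names are illustrative assumptions."""
    if rects_overlap(display_area, trigger_area):
        return log_entry["location"], log_entry["publish_time"]
    return None
```

Gating the lookup on overlap means off-screen logs never cost a weather query, matching the lazy-trigger behaviour described for Fig. 4.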
303. The second terminal determines the weather type indicated by the target log data based on the location information and the release time.
In a possible implementation manner, the second terminal may obtain weather data from a weather server based on the location information and the release time, determine the weather type corresponding to the weather data, and use it as the weather type indicated by the target log data. In this embodiment of the application, each weather type may correspond to a set of weather data restriction conditions; when the weather data satisfies a given set of restriction conditions, the weather type corresponding to that set is the weather type of the weather data. For example, if the restriction conditions of the weather type "thunderstorm" are "rainfall greater than 0 mm" and "thunder", and the acquired weather data are "rainfall of 20 mm" and "thunder", then the weather type corresponding to the weather data is "thunderstorm".
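The restriction-condition matching above can be sketched as a rule table checked in priority order (more specific types first). The rule set and field names below are illustrative assumptions built from the "thunderstorm" example, not the patent's actual condition library:

```python
# Each weather type maps to its restriction conditions; the first type
# whose conditions are all satisfied by the observed data is chosen,
# so more specific rules must come before more general ones.
WEATHER_RULES = {
    "thunderstorm": lambda d: d["rainfall_mm"] > 0 and d["thunder"],
    "rain":         lambda d: d["rainfall_mm"] > 0,
    "sunny":        lambda d: d["rainfall_mm"] == 0,
}


def classify_weather(data):
    for weather_type, condition in WEATHER_RULES.items():
        if condition(data):
            return weather_type
    return "unknown"
```

With the example from the text, `{"rainfall_mm": 20, "thunder": True}` satisfies the "thunderstorm" conditions before the more general "rain" rule is ever checked.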
304. The second terminal acquires the video material corresponding to the weather type, the video material being used to describe the visual effect and the auditory effect of the weather type.
In this embodiment of the present application, each video material may carry a weather tag. In a possible implementation manner, the second terminal may send a video material acquisition request carrying the weather type to a CDN (Content Delivery Network); a server in the CDN may compare the weather type with each weather tag and, when the weather type matches the weather type indicated by any tag, acquire the video material corresponding to that tag and send it to the second terminal.
305. The second terminal plays the video material on the display page of the target log data.
Referring to fig. 5, fig. 5 is a schematic diagram of a video material display effect provided by an embodiment of the present application. The second terminal may display the image content of the video material full-screen on the display page of the target log data, that is, the target page 501. In one possible implementation, to keep the video material from blocking the content of the target log data and to improve the playing effect, a transparent data channel may be added to the video material. Specifically, a target weather video corresponding to the weather type may be encoded based on a target video coding standard to generate an initial video material; the color data in the initial video material is obtained and mixed with transparent channel data to generate an initial video material with transparency; and audio channel data is added to that material to produce the video material for the weather type. The color data and the transparent channel data may be mixed using OpenGL (Open Graphics Library), and the target coding standard may be the H.264 standard, which yields videos with a small file size. In this embodiment of the present application, when decoding the video material, the positions of the transparent channel data and the audio channel data may be calculated from the frame rate of the video material, each entry in the transparent channel data corresponding one-to-one to a frame of the video. With this generation method, by modifying the video data in H.264 format, the transparent channel data and the audio channel data can be stored in a single video file, effectively integrating all the data.
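Two of the operations above have simple numeric cores: the OpenGL colour/alpha mix is the standard "source-over" blend, and locating a transparency entry is a frame-rate multiplication. A minimal sketch in plain Python (a stand-in for the GPU blend, not the patent's actual renderer):

```python
def blend_pixel(fg_rgb, alpha, bg_rgb):
    """Source-over blend, the operation OpenGL performs with
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA).

    alpha is in [0, 1]; colour channels are in [0, 255].
    """
    return tuple(
        round(alpha * f + (1 - alpha) * b) for f, b in zip(fg_rgb, bg_rgb)
    )


def alpha_frame_index(time_s, frame_rate):
    """Locate the transparency-channel entry for a playback time,
    assuming one alpha record per video frame as described above."""
    return int(time_s * frame_rate)
```

With `alpha = 0`, the log content underneath shows through untouched; with `alpha = 1`, the weather video fully covers it, which is why the transparent channel keeps the video from blocking the log.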
It should be noted that the above description of the video material generation method is only an exemplary description, and the embodiment of the present application does not specifically limit which video material generation method is specifically adopted.
In a possible implementation manner, playback of the video material may be accompanied by a vibration effect. Specifically, after acquiring the video material, the second terminal may detect whether the user has enabled the vibration function; if so, the second terminal triggers the vibration effect while playing the video material, and if not, the vibration effect is not triggered.
According to the technical scheme, the location information in the target log data and the release time of the target log data are obtained; the weather type indicated by the target log data is determined based on the location information and the release time; a video material corresponding to the weather type, which describes the visual effect and the auditory effect of the weather type, is obtained; and the video material is played in the display area of the target log data.
Fig. 6 is a schematic diagram of a method for publishing and displaying weather information according to an embodiment of the present application. Referring to fig. 6, a first user, i.e. a log publishing user, may add a weather animation effect to a log by selecting a weather card when publishing the log, and publish the edited log to the server's database. A second user, i.e. a log browsing user, may send a log acquisition request to the server; the server retrieves the data and sends the second user's friend logs to the terminal used by the second user. That terminal may determine whether each log contains a weather card; if so, it obtains the weather type based on the location information in the log data and requests the weather material from a CDN (Content Delivery Network). A server in the CDN obtains the weather material corresponding to the weather type from a material library and sends it to the second user's terminal for display. In this process, the log publishing user can present the weather at his or her location by means of the weather animation effect carried by the weather card, while the log browsing user enjoys a good audiovisual experience when browsing the log, so the social experience of both users is improved.
In this embodiment, the second terminal may further determine whether to acquire the video material for playing based on the content of the target log data, for example, when the content of the target log data includes content such as location information and weather information, the video material is played. In one possible implementation, the above process may specifically include any one of the following implementations:
In the first implementation manner, the second terminal may determine whether the content of the target log data relates to the location information and, when it does, acquire and play the video material. In a possible implementation manner, the second terminal may perform text recognition on the target log data; when the target log data includes at least one place keyword, it compares each place keyword with the location information and, when the area indicated by any place keyword is the same as the area indicated by the location information, performs the step of determining the weather type indicated by the target log data based on the location information and the release time.
In one possible implementation, a text recognition model may be applied to obtain the at least one place keyword in the target log data. The text recognition model may be built from multiple operation layers; for example, it may be a BERT model comprising 12 operation layers, that is, 12 Transformers, each of which extracts features from the text based on an attention mechanism and encodes and decodes the text. The second terminal may input the text data of the target log data into the BERT model, which first preprocesses it: each piece of text is split into a character sequence, and each character is replaced by a vector based on the character-to-vector parameters generated during the BERT model's pretraining, yielding the vector sequence corresponding to that text. The vector sequence is then passed through the 12 operation layers, which perform encoding and decoding operations to extract the text features, and at least one place keyword in the text is labelled based on the extracted features. It should be noted that the above description of obtaining place keywords is only exemplary, and the embodiment of the present application does not limit which specific method is adopted.
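The labelling pipeline above (tokenize, embed, tag place mentions) requires a trained BERT model, so a faithful runnable example is out of scope here. As a toy stand-in only, the sketch below replaces the learned sequence labeller with a gazetteer lookup over whitespace tokens; the gazetteer entries, word-level splitting (instead of the character-level splitting described for Chinese text), and all names are assumptions for illustration:

```python
# Toy substitute for the BERT-based place-keyword labeller: a gazetteer
# lookup stands in for the learned tagger, and word tokens stand in for
# the character sequence described in the text.
PLACE_GAZETTEER = {"Beijing", "Shanghai", "Shenzhen"}  # hypothetical entries


def extract_place_keywords(text):
    tokens = text.split()
    return [t.strip(".,") for t in tokens if t.strip(".,") in PLACE_GAZETTEER]
```

The extracted keywords would then be compared against the log's location information as described in the first implementation manner.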
In the second implementation manner, the second terminal may determine whether the content of the target log data is related to the weather information of the location information, and acquire and play the video material when it is. In a possible implementation manner, the second terminal may perform text recognition on the target log data; when the target log data includes at least one weather keyword, the second terminal compares the at least one weather keyword with the weather type, and when the weather state indicated by any weather keyword is the same as the weather state indicated by the weather type, performs the step of acquiring the video material corresponding to the weather type. The process of obtaining the weather keywords is similar to the process of obtaining the place keywords in the first implementation manner, and details are not repeated herein.
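The weather-state comparison can be sketched as below. The synonym map normalizing keywords and types to a common weather state is an assumption introduced for the example; the patent does not specify how states are canonicalized.

```python
# Hedged sketch: a weather keyword found in the post is compared with the
# weather type resolved for the post's location and release time; the video
# material is fetched only when both indicate the same weather state.

WEATHER_STATES = {
    "rain": "rainy", "drizzle": "rainy", "downpour": "rainy",
    "snow": "snowy", "sunny": "clear", "clear": "clear",
}

def should_fetch_material(weather_keywords, weather_type):
    # Normalize both sides to a canonical state before comparing.
    target = WEATHER_STATES.get(weather_type, weather_type)
    return any(WEATHER_STATES.get(k) == target for k in weather_keywords)

print(should_fetch_material(["drizzle"], "rain"))  # True
print(should_fetch_material(["sunny"], "rain"))    # False
```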
In the third implementation manner, the second terminal may determine whether the target log data contains a target picture, and acquire and play the video material when it does. In a possible implementation manner, the second terminal may obtain a picture included in the target log data and compare it with the background pictures in a background picture material library; when the picture is the same as any background picture, the second terminal determines that the target log data contains the target picture and performs the step of determining the weather type indicated by the target log data based on the location information and the release time.
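A minimal sketch of the picture comparison follows. Real systems would likely use perceptual hashing to tolerate re-encoding; exact byte hashing is used here only to keep the example self-contained, and the library contents are invented.

```python
import hashlib

# Sketch of the third implementation manner: a picture attached to the post
# is compared against every background picture in a material library.
# Exact-byte SHA-256 comparison is a simplifying assumption.

def digest(data):
    return hashlib.sha256(data).hexdigest()

def contains_target_picture(post_pictures, background_library):
    library_digests = {digest(b) for b in background_library}
    return any(digest(p) in library_digests for p in post_pictures)

library = [b"rainy-card-v1", b"snowy-card-v1"]
print(contains_target_picture([b"rainy-card-v1"], library))  # True
```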
In a possible implementation manner, when the second terminal acquires a plurality of pieces of target log data carrying the same location information, the video materials corresponding to the target log data of a plurality of days may be displayed continuously, and the display effect of each video material may be adjusted based on the correlation between the content of the corresponding target log data and the weather information. For example, when the correlation is low, the video material is played with a low volume and a weak vibration effect; when the correlation is high, it is played with a high volume and a strong vibration effect.
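A sketch of that correlation-to-effect mapping is shown below. The numeric thresholds and the linear volume curve are assumptions, not values from the patent, which only states that low correlation yields quiet/weak playback and high correlation yields loud/strong playback.

```python
# Illustrative mapping from the content/weather correlation of a post to
# the playback intensity of its video material. Thresholds are assumptions.

def playback_effect(correlation):
    correlation = max(0.0, min(1.0, correlation))  # clamp to [0, 1]
    return {
        "volume": round(0.2 + 0.8 * correlation, 2),            # 0.2 .. 1.0
        "vibration": "strong" if correlation >= 0.5 else "weak",
    }

print(playback_effect(0.9))  # {'volume': 0.92, 'vibration': 'strong'}
print(playback_effect(0.1))  # {'volume': 0.28, 'vibration': 'weak'}
```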
When one user issues the weather card many times within a short period, that is, issues a plurality of pieces of target log data, the second terminal may, while the user browses, intelligently identify whether the content of each piece of target log data is strongly related to the weather card, and then decide whether to render the corresponding animation effect. When a plurality of users issue the weather card at the same time, that is, a plurality of pieces of target log data are issued simultaneously, the second terminal may, while the user browses, intelligently identify the strength of each area, that is, the strength of the correlation between the content of the target log data and the weather card, and continuously render the animation effects corresponding to the plurality of pieces of target log data carrying the same weather information. In this weather display mode, whether to display the animation effect is decided by intelligently identifying the content of the target log data, which avoids redundant display information and improves the browsing experience of the user. In a possible implementation manner, the issued weather cards may be collected to generate a card set, or a weather card may be converted into an emoticon and applied to a chat scene, and the like.
In an embodiment of the application, the second terminal may further determine whether the content of the target log data is related to traffic information. In a possible implementation manner, when the target log data includes a traffic condition keyword, the second terminal may obtain traffic live data corresponding to the location information and display the traffic live data in the display area of the target log data. The process of obtaining the traffic condition keyword is similar to the process of obtaining the place keyword and is not described herein again. Of course, the second terminal may also display air quality information, wind strength, and the like based on the content of the target log data, which is not limited in this embodiment of the application.
All the above optional technical solutions may be combined arbitrarily to form optional embodiments of the present application, and are not described herein again.
Fig. 7 is a schematic structural diagram of a weather display apparatus according to an embodiment of the present application. Referring to Fig. 7, the apparatus includes:
an information obtaining module 701, configured to obtain location information in target log data and release time of the target log data;
a determining module 702, configured to determine a weather type indicated by the target log data based on the location information and the release time;
a video obtaining module 703, configured to obtain a video material corresponding to the weather type, where the video material is used to describe a visual effect and an auditory effect of the weather type;
a playing module 704, configured to play the video material in the display area of the target log data.
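The four modules above can be sketched as a plain Python object. This is a structural illustration only: the weather-server call and the material library are stubbed with a lambda and a dict, and the method names, chosen to mirror reference numerals 701–704, are inventions of this example.

```python
# Structural sketch of the apparatus of Fig. 7. Dependencies are stubbed.

class WeatherDisplayApparatus:
    def __init__(self, weather_lookup, material_library):
        self.weather_lookup = weather_lookup      # stands in for the weather server
        self.material_library = material_library  # weather type -> video material

    def obtain_info(self, log_data):              # information obtaining module 701
        return log_data["location"], log_data["release_time"]

    def determine_weather(self, location, release_time):  # determining module 702
        return self.weather_lookup(location, release_time)

    def obtain_video(self, weather_type):         # video obtaining module 703
        return self.material_library[weather_type]

    def play(self, log_data):                     # playing module 704
        loc, t = self.obtain_info(log_data)
        return self.obtain_video(self.determine_weather(loc, t))

app = WeatherDisplayApparatus(lambda loc, t: "rainy", {"rainy": "rain.mp4"})
print(app.play({"location": "Hangzhou", "release_time": "2019-10-18"}))  # rain.mp4
```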
In one possible implementation, the apparatus further includes:
and the detection module is used for detecting whether the display area of the target log data is overlapped with the dynamic effect trigger area or not, and executing the step of acquiring the position information and the release time when the display area is overlapped with the dynamic effect trigger area.
In one possible implementation, the target log data belongs to a target data category.
In one possible implementation, the apparatus further includes:
the place comparison module is used for comparing at least one place keyword with the position information when the target log data comprises the at least one place keyword; when the area indicated by any one of the place keywords is the same as the area indicated by the location information, a step of determining the weather type indicated by the target log data based on the location information and the release time is performed.
In one possible implementation, the apparatus further includes:
the weather comparison module is used for comparing at least one weather keyword with the weather type when the target log data comprises the at least one weather keyword; and when the weather state indicated by any weather keyword is the same as the weather state indicated by the weather type, executing the step of acquiring the video material corresponding to the weather type.
In one possible implementation, the determining module 702 is configured to:
acquiring weather data from a weather server based on the position information and the release time;
and determining the weather type corresponding to the weather data, and taking the weather type as the weather type indicated by the target log data.
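The reduction of raw weather data to a weather type can be sketched as a code-table lookup. The numeric condition codes below are an assumption modeled loosely on common weather-API conventions; the patent does not specify the weather server's data format.

```python
# Sketch of the determining module: weather data fetched for (location,
# release time) is reduced to a weather type via a code table (assumed).

CODE_TO_TYPE = {
    range(200, 300): "thunderstorm",
    range(300, 600): "rainy",
    range(600, 700): "snowy",
    range(800, 801): "clear",
}

def weather_type_from_code(code):
    for code_range, weather_type in CODE_TO_TYPE.items():
        if code in code_range:
            return weather_type
    return "cloudy"  # fallback for unmapped codes

print(weather_type_from_code(501))  # rainy
print(weather_type_from_code(800))  # clear
```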
In one possible implementation, the apparatus further includes:
the traffic data acquisition module is used for acquiring traffic live data corresponding to the position information when the target log data comprises the traffic condition key words;
and the display module is used for displaying the traffic live data in the display area of the target log data.
According to the technical solution provided by the embodiment of the present application, the location information in the target log data and the release time of the target log data are obtained; the weather type indicated by the target log data is determined based on the location information and the release time; the video material corresponding to the weather type is obtained, the video material describing the visual effect and the auditory effect of the weather type; and the video material is played in the display area of the target log data.
It should be noted that: in the weather display device provided in the above embodiment, only the division of the functional modules is illustrated when displaying weather, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above. In addition, the weather display device provided by the above embodiment and the weather display method embodiment belong to the same concept, and the specific implementation process thereof is described in the method embodiment in detail, and is not described herein again.
Fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application. The terminal 800 may be a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 800 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 800 includes: one or more processors 801 and one or more memories 802.
The processor 801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so forth. The processor 801 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 801 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 801 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, the processor 801 may further include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
In some embodiments, the terminal 800 may further include: a peripheral interface 803 and at least one peripheral. The processor 801, memory 802 and peripheral interface 803 may be connected by bus or signal lines. Various peripheral devices may be connected to peripheral interface 803 by a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of a radio frequency circuit 804, a display screen 805, a camera assembly 806, an audio circuit 807, a positioning assembly 808, and a power supply 809.
The peripheral interface 803 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 801 and the memory 802. In some embodiments, the processor 801, memory 802, and peripheral interface 803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 801, the memory 802, and the peripheral interface 803 may be implemented on separate chips or circuit boards, which are not limited by this embodiment.
The radio frequency circuit 804 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 804 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 804 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and the like. The radio frequency circuit 804 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 804 may further include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 805 is a touch display screen, it also has the ability to capture touch signals on or above its surface; such a touch signal may be input to the processor 801 as a control signal for processing. In that case, the display screen 805 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 805, provided on the front panel of the terminal 800; in other embodiments, there may be at least two display screens 805, respectively disposed on different surfaces of the terminal 800 or adopting a folded design; in still other embodiments, the display screen 805 may be a flexible display screen disposed on a curved surface or a folded surface of the terminal 800. The display screen 805 may even be arranged in a non-rectangular irregular pattern, that is, a shaped screen. The display screen 805 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 806 is used to capture images or video. Optionally, camera assembly 806 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each rear camera is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and VR (Virtual Reality) shooting functions or other fusion shooting functions. In some embodiments, camera assembly 806 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 807 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 801 for processing or inputting the electric signals to the radio frequency circuit 804 to realize voice communication. For the purpose of stereo sound collection or noise reduction, a plurality of microphones may be provided at different portions of the terminal 800. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, the audio circuitry 807 may also include a headphone jack.
The positioning component 808 is used to locate the current geographic position of the terminal 800 for navigation or LBS (Location Based Service). The positioning component 808 may be a positioning component based on the GPS (Global Positioning System) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
In some embodiments, terminal 800 also includes one or more sensors 810. The one or more sensors 810 include, but are not limited to: acceleration sensor 811, gyro sensor 812, pressure sensor 813, fingerprint sensor 814, optical sensor 815 and proximity sensor 816.
The acceleration sensor 811 may detect the magnitude of acceleration in three coordinate axes of the coordinate system established with the terminal 800. For example, the acceleration sensor 811 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 801 may control the display 805 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 811. The acceleration sensor 811 may also be used for acquisition of motion data of a game or a user.
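The landscape/portrait decision described above can be sketched as a comparison of gravity components. The axis convention (y along the device's long edge) and the tie-breaking rule are assumptions introduced for illustration.

```python
# Hedged sketch: decide the UI orientation from the gravity components
# collected on the device's x and y axes. Axis convention is an assumption.

def view_orientation(gx, gy):
    # When gravity acts mostly along the long (y) axis the device is held
    # upright, so the UI is drawn in portrait; otherwise landscape.
    return "portrait" if abs(gy) >= abs(gx) else "landscape"

print(view_orientation(0.3, 9.7))  # portrait
print(view_orientation(9.7, 0.3))  # landscape
```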
The gyro sensor 812 may detect a body direction and a rotation angle of the terminal 800, and the gyro sensor 812 may cooperate with the acceleration sensor 811 to acquire a 3D motion of the user with respect to the terminal 800. From the data collected by the gyro sensor 812, the processor 801 may implement the following functions: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 813 may be disposed on the side frames of terminal 800 and/or underneath display 805. When the pressure sensor 813 is disposed on the side frame of the terminal 800, the holding signal of the user to the terminal 800 can be detected, and the processor 801 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 813. When the pressure sensor 813 is disposed at a lower layer of the display screen 805, the processor 801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 805. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The fingerprint sensor 814 is used for collecting a fingerprint of the user, and the processor 801 identifies the identity of the user according to the fingerprint collected by the fingerprint sensor 814, or the fingerprint sensor 814 itself identifies the identity of the user according to the collected fingerprint. Upon identifying the user's identity as a trusted identity, the processor 801 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and the like. The fingerprint sensor 814 may be disposed on the front, back, or side of the terminal 800. When a physical button or a vendor logo is provided on the terminal 800, the fingerprint sensor 814 may be integrated with the physical button or the vendor logo.
The optical sensor 815 is used to collect the ambient light intensity. In one embodiment, processor 801 may control the display brightness of display 805 based on the ambient light intensity collected by optical sensor 815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 805 is increased; when the ambient light intensity is low, the display brightness of the display 805 is reduced. In another embodiment, the processor 801 may also dynamically adjust the shooting parameters of the camera assembly 806 based on the ambient light intensity collected by the optical sensor 815.
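The brightness control described above amounts to a monotone mapping from ambient light to display brightness. The lux range and the linear curve below are illustrative assumptions; real devices typically use tuned, nonlinear curves.

```python
# Minimal sketch: display brightness rises with ambient light intensity.
# The lux ceiling and linear mapping are assumptions for illustration.

def display_brightness(lux, lux_max=1000.0):
    """Return a brightness level in [0.1, 1.0] from ambient lux."""
    level = 0.1 + 0.9 * min(lux, lux_max) / lux_max
    return round(level, 2)

print(display_brightness(1000))  # 1.0
print(display_brightness(0))     # 0.1
```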
The proximity sensor 816, also known as a distance sensor, is typically provided on the front panel of the terminal 800 and is used to collect the distance between the user and the front surface of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually decreases, the processor 801 controls the display screen 805 to switch from the screen-on state to the screen-off state; when the proximity sensor 816 detects that the distance gradually increases, the processor 801 controls the display screen 805 to switch from the screen-off state to the screen-on state.
Those skilled in the art will appreciate that the configuration shown in fig. 8 is not intended to be limiting of terminal 800 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be used.
Fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application. The server 900 may vary considerably in configuration or performance, and may include one or more processors (CPUs) 901 and one or more memories 902, where the one or more memories 902 store at least one program code, and the at least one program code is loaded and executed by the one or more processors 901 to implement the methods provided by the foregoing method embodiments. Of course, the server 900 may also have components such as a wired or wireless network interface, a keyboard, and an input/output interface for input and output, and may further include other components for implementing device functions, which are not described herein again.
In an exemplary embodiment, a computer readable storage medium, such as a memory, including at least one program code executable by a processor to perform the weather display method of the above embodiments is also provided. For example, the computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a Compact Disc Read-Only Memory (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
It will be understood by those skilled in the art that all or part of the steps of implementing the above embodiments may be implemented by hardware, or implemented by at least one program code associated with hardware, where the program code is stored in a computer readable storage medium, such as a read only memory, a magnetic or optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (15)
1. A weather display method, comprising:
acquiring position information in target log data and the release time of the target log data;
determining a weather type indicated by the target log data based on the location information and the release time;
acquiring a video material corresponding to the weather type, wherein the video material is used for describing the visual effect and the auditory effect of the weather type;
and playing the video material on a display page of the target log data.
2. The method of claim 1, wherein before the obtaining the location information in the target log data and the publishing time of the target log data, the method further comprises:
and detecting whether a display area of the target log data is overlapped with a dynamic effect trigger area, and executing the step of acquiring the position information and the release time when the display area is overlapped with the dynamic effect trigger area.
3. The method of claim 1, wherein the target log data belongs to a target data category.
4. The method of claim 1, wherein prior to determining the type of weather indicated by the target log data based on the location information and the release time, the method further comprises:
comparing the at least one place keyword with the position information when the target log data comprises the at least one place keyword;
and when the area indicated by any place keyword is the same as the area indicated by the position information, executing the step of determining the weather type indicated by the target log data based on the position information and the release time.
5. The method of claim 1, wherein before the obtaining the video material corresponding to the weather type, the method further comprises:
when the target log data comprises at least one weather keyword, comparing the at least one weather keyword with the weather type;
and when the weather state indicated by any weather keyword is the same as the weather state indicated by the weather type, executing the step of acquiring the video material corresponding to the weather type.
6. The method of claim 1, wherein the determining the type of weather indicated by the target log data based on the location information and the release time comprises:
acquiring weather data from a weather server based on the position information and the release time;
and determining a weather type corresponding to the weather data, and taking the weather type as the weather type indicated by the target log data.
7. The method of claim 1, wherein after obtaining the location information in the target log data and the publishing time of the target log data, the method further comprises:
when the target log data comprise the traffic condition keywords, acquiring traffic live data corresponding to the position information;
displaying the traffic live data in a display area of the target log data.
8. A weather display apparatus, characterized in that the apparatus comprises:
the information acquisition module is used for acquiring position information in target log data and the release time of the target log data;
a determining module, configured to determine a weather type indicated by the target log data based on the location information and the release time;
the video acquisition module is used for acquiring a video material corresponding to the weather type, and the video material is used for describing the visual effect and the auditory effect of the weather type;
and the playing module is used for playing the video material on a display page of the target log data.
9. The apparatus of claim 8, further comprising:
and the detection module is used for detecting whether a display area of the target log data is overlapped with a dynamic effect trigger area or not, and executing the step of acquiring the position information and the release time when the display area is overlapped with the dynamic effect trigger area.
10. The apparatus of claim 8, wherein the target log data belongs to a target data category.
11. The apparatus of claim 8, further comprising:
the place comparison module is used for comparing at least one place keyword with the position information when the target log data comprises the at least one place keyword; and when the area indicated by any place keyword is the same as the area indicated by the position information, executing the step of determining the weather type indicated by the target log data based on the position information and the release time.
12. The apparatus of claim 8, further comprising:
the weather comparison module is used for comparing at least one weather keyword with the weather type when the target log data comprises the at least one weather keyword; and when the weather state indicated by any weather keyword is the same as the weather state indicated by the weather type, executing the step of acquiring the video material corresponding to the weather type.
13. The apparatus of claim 8, wherein the determining module is configured to:
acquiring weather data from a weather server based on the position information and the release time;
and determining a weather type corresponding to the weather data, and taking the weather type as the weather type indicated by the target log data.
14. A computer device comprising one or more processors and one or more memories having at least one program code stored therein, the at least one program code being loaded and executed by the one or more processors to perform operations performed by the weather display method of any one of claims 1 to 7.
15. A computer-readable storage medium having at least one program code stored therein, the at least one program code being loaded into and executed by a processor to perform operations performed by the weather display method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910995532.6A CN110750734B (en) | 2019-10-18 | 2019-10-18 | Weather display method, weather display device, computer equipment and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110750734A true CN110750734A (en) | 2020-02-04 |
CN110750734B CN110750734B (en) | 2024-07-12 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111652180A (en) * | 2020-06-16 | 2020-09-11 | 北京梧桐车联科技有限责任公司 | Defect positioning method, device, equipment and computer readable storage medium |
CN112667346A (en) * | 2021-03-16 | 2021-04-16 | 深圳市火乐科技发展有限公司 | Weather data display method and device, electronic equipment and storage medium |
US11334166B2 (en) * | 2020-09-30 | 2022-05-17 | International Business Machines Corporation | Multi-sensory notifications |
CN114599002A (en) * | 2022-05-09 | 2022-06-07 | 广东省气象公共服务中心(广东气象影视宣传中心) | Daily weather personalized service automatic pushing method based on 5G message |
CN114663883A (en) * | 2022-05-25 | 2022-06-24 | 中山职业技术学院 | Point cloud data correction method and device, electronic equipment and storage medium |
WO2024152725A1 (en) * | 2023-01-19 | 2024-07-25 | 华为技术有限公司 | Display method based on weather information, and electronic device and readable storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102422286A (en) * | 2009-03-11 | 2012-04-18 | 香港浸会大学 | Automatic and semi-automatic image classification, annotation and tagging using image acquisition parameters and metadata |
CN103428076A (en) * | 2013-08-22 | 2013-12-04 | 北京奇虎科技有限公司 | Method and device for transmitting information to multi-type terminals or applications |
CN107943896A (en) * | 2017-11-16 | 2018-04-20 | 百度在线网络技术(北京)有限公司 | Information processing method and device |
WO2019105393A1 (en) * | 2017-11-30 | 2019-06-06 | 腾讯科技(深圳)有限公司 | Web page content processing method, apparatus, browser, device and storage medium |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN111652180A (en) * | 2020-06-16 | 2020-09-11 | Beijing Wutong Chelian Technology Co., Ltd. | Defect positioning method, device, equipment and computer readable storage medium
CN111652180B (en) * | 2020-06-16 | 2023-11-17 | Beijing Wutong Chelian Technology Co., Ltd. | Defect positioning method, device, equipment and computer readable storage medium
US11334166B2 (en) * | 2020-09-30 | 2022-05-17 | International Business Machines Corporation | Multi-sensory notifications
CN112667346A (en) * | 2021-03-16 | 2021-04-16 | Shenzhen Huole Technology Development Co., Ltd. | Weather data display method and device, electronic equipment and storage medium
CN114047992A (en) * | 2021-03-16 | 2022-02-15 | Shenzhen Huole Technology Development Co., Ltd. | Weather data display method and device, electronic equipment and storage medium
CN114047992B (en) * | 2021-03-16 | 2024-11-01 | Shenzhen Huole Technology Development Co., Ltd. | Weather data display method and device, electronic equipment and storage medium
CN114599002A (en) * | 2022-05-09 | 2022-06-07 | Guangdong Meteorological Public Service Center (Guangdong Meteorological Film and Television Publicity Center) | Daily weather personalized service automatic pushing method based on 5G message
CN114599002B (en) * | 2022-05-09 | 2022-07-26 | Guangdong Meteorological Public Service Center (Guangdong Meteorological Film and Television Publicity Center) | Daily weather personalized service automatic pushing method based on 5G message
CN114663883A (en) * | 2022-05-25 | 2022-06-24 | Zhongshan Polytechnic | Point cloud data correction method and device, electronic equipment and storage medium
WO2024152725A1 (en) * | 2023-01-19 | 2024-07-25 | Huawei Technologies Co., Ltd. | Display method based on weather information, and electronic device and readable storage medium
Also Published As
Publication number | Publication date |
---|---|
CN110750734B (en) | 2024-07-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110750734B (en) | Weather display method, weather display device, computer equipment and computer readable storage medium | |
CN110278464B (en) | Method and device for displaying list | |
CN109451343A (en) | Video sharing method, apparatus, terminal and storage medium | |
CN109167937B (en) | Video distribution method, device, terminal and storage medium | |
CN110061900B (en) | Message display method, device, terminal and computer readable storage medium | |
CN109327608B (en) | Song sharing method, terminal, server and system | |
CN112118477B (en) | Virtual gift display method, device, equipment and storage medium | |
CN113411680B (en) | Multimedia resource playing method, device, terminal and storage medium | |
CN109922356B (en) | Video recommendation method and device and computer-readable storage medium | |
CN113490010B (en) | Interaction method, device and equipment based on live video and storage medium | |
CN110139143B (en) | Virtual article display method, device, computer equipment and storage medium | |
CN112104648A (en) | Data processing method, device, terminal, server and storage medium | |
CN111083526B (en) | Video transition method and device, computer equipment and storage medium | |
CN114245218B (en) | Audio and video playing method and device, computer equipment and storage medium | |
CN112052354A (en) | Video recommendation method, video display method and device and computer equipment | |
CN110662105A (en) | Animation file generation method and device and storage medium | |
CN111628925A (en) | Song interaction method and device, terminal and storage medium | |
CN111031391A (en) | Video dubbing method, device, server, terminal and storage medium | |
CN112004134B (en) | Multimedia data display method, device, equipment and storage medium | |
CN112052355A (en) | Video display method, device, terminal, server, system and storage medium | |
CN111083554A (en) | Method and device for displaying live gift | |
CN113485596B (en) | Virtual model processing method and device, electronic equipment and storage medium | |
CN113556481B (en) | Video special effect generation method and device, electronic equipment and storage medium | |
CN112133319B (en) | Audio generation method, device, equipment and storage medium | |
CN111064657B (en) | Method, device and system for grouping concerned accounts |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 40020190; Country of ref document: HK |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |
TG01 | Patent term adjustment | |