

Big data point burying method, device, equipment and medium for application program

Info

Publication number
CN114185776A
Authority
CN
China
Prior art keywords
point
buried
file
buried point
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111440057.XA
Other languages
Chinese (zh)
Inventor
何辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pingan Payment Technology Service Co Ltd
Original Assignee
Pingan Payment Technology Service Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pingan Payment Technology Service Co Ltd filed Critical Pingan Payment Technology Service Co Ltd
Priority to CN202111440057.XA priority Critical patent/CN114185776A/en
Publication of CN114185776A publication Critical patent/CN114185776A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3688 Test management for test execution, e.g. scheduling of test suites
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00 Error detection; Error correction; Monitoring
    • G06F11/36 Preventing errors by testing or debugging software
    • G06F11/3668 Software testing
    • G06F11/3672 Test management
    • G06F11/3684 Test management for test design, e.g. generating new test cases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44505 Configuring for program initiating, e.g. using registry, configuration files
    • G06F9/4451 User profiles; Roaming
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/445 Program loading or initiating
    • G06F9/44521 Dynamic linking or loading; Link editing at or after load time, e.g. Java class loading

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Stored Programmes (AREA)

Abstract

The invention relates to the field of big data and provides a big data point burying (event tracking) method, device, equipment and medium for an application program. A buried point plug-in is added to the initial start command of the to-be-buried point application, so that the plug-in runs in the back-end system without being perceived. After the to-be-buried point application is started and a class loading operation is executed, the class object files are screened on the basis of a buried point configuration file, which facilitates subsequent targeted point burying. Class definition is then performed according to a buried point record list to obtain the target class object files after point burying, and the buried point data of a target class object file is acquired when that file is detected to be executed, thereby realizing non-invasive point burying of the application program. In addition, the invention also relates to blockchain technology, and the buried point data can be stored in blockchain nodes.

Description

Big data point burying method, device, equipment and medium for application program
Technical Field
The invention relates to the technical field of big data, in particular to a big data point burying method, a device, equipment and a medium for an application program.
Background
As companies continue to expand their business, the application programs they develop gain more and more functions and larger user bases. In order to analyze the various behaviors of users and provide them with better services, data point burying therefore needs to be performed on these application programs.
The point burying scheme currently in wide use is to modify the corresponding back-end service code and then carry out a series of operations such as packaging, testing, publishing and going online. This scheme mainly has the following problems:
(1) for each point burying requirement, the original code has to be modified and re-released, which consumes time and labor;
(2) because the original code is modified, its logic is changed; once abnormal code appears, the operation of the whole code is affected, so the risk is high;
(3) because the original code is modified and point burying scenarios are varied, the modified code also takes many forms, making it inconvenient to read and costly to maintain.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a big data point burying method, device, equipment and medium for an application program, which are intended to solve the problems of high risk and high cost when burying points in an application program.
A big data point burying method of an application program comprises the following steps:
establishing a buried point plug-in, and adding the buried point plug-in into an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application;
executing a class loading operation when the target start command is used for starting the to-be-buried point application;
after the class loading operation is finished, obtaining class object files and obtaining a buried point configuration file stored in a specified database;
establishing a buried point record list according to the class object files and the buried point configuration file;
carrying out class definition according to the buried point record list to obtain target class object files after point burying;
and when a target class object file is detected to be executed, acquiring buried point data of the target class object file.
According to a preferred embodiment of the present invention, the adding the buried point plug-in to an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application includes:
acquiring a storage path of the buried point plug-in;
and adding a storage path of the embedded point plug-in unit in the initial starting command to obtain the target starting command.
According to the preferred embodiment of the present invention, the executing the class loading operation includes:
starting a virtual machine container corresponding to the to-be-buried point application;
acquiring a storage path of the buried point plug-in from the target starting command;
calling the buried point plug-in from the storage path, and acquiring a file stored in the buried point plug-in;
and loading the file stored in the embedded point plug-in to the memory of the virtual machine container.
According to a preferred embodiment of the present invention, before the obtaining of the buried point configuration file stored in the specified database, the method further includes:
acquiring a historical buried point record, determining a buried point requirement according to the historical buried point record, and generating a buried point configuration file according to the buried point requirement; and/or
Collecting the embedded point requirements issued by the configuration page, and generating the embedded point configuration file according to the embedded point requirements issued by the configuration page.
According to a preferred embodiment of the present invention, the establishing a buried point record list according to the class object file and the buried point configuration file includes:
identifying a buried point file name from the buried point configuration file;
matching in the class object file according to the buried point file name;
determining the file with the file name of the embedded point in the class object file as a to-be-embedded point object;
and generating the buried point record list based on the object to be buried.
According to a preferred embodiment of the present invention, the performing class definition according to the embedded point record list to obtain the object file of the target class after embedding the point includes:
acquiring a function in the object to be buried;
acquiring a buried point function of the object to be buried from the buried point configuration file;
adding the embedding point function to the front of the function in the object to be embedded to obtain a code block of the object to be embedded;
and determining the code block of the object to be buried as the target class object file.
According to a preferred embodiment of the present invention, after the obtaining of the buried point data of the target class object file, the method further includes:
acquiring the execution times of the target function in a preset time range from the buried point data;
calculating the daily activity of the target function according to the execution times and the preset time range;
when the daily activity is greater than or equal to a configuration threshold, determining target content corresponding to the target function;
and acquiring the associated content of the target content in each pre-configured period, and pushing the associated content.
A big data point burying device of an application program, the big data point burying device of the application program comprising:
an establishing unit, which is used for establishing a buried point plug-in and adding the buried point plug-in into an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application;
an execution unit, which is used for executing a class loading operation when the target start command is used for starting the to-be-buried point application;
an acquisition unit, which is used for acquiring class object files and acquiring a buried point configuration file stored in a specified database after the class loading operation is finished;
the establishing unit being further used for establishing a buried point record list according to the class object files and the buried point configuration file;
a class definition unit, which is used for performing class definition according to the buried point record list to obtain target class object files after point burying;
the acquisition unit being further used for acquiring buried point data of a target class object file when it is detected that the target class object file is executed.
A computer device, the computer device comprising:
a memory storing at least one instruction; and
a processor executing instructions stored in the memory to implement the big data point burying method of the application program.
A computer-readable storage medium having stored therein at least one instruction for execution by a processor in a computer device to implement the big data point burying method of the application program.
According to the above technical scheme, the invention can establish a buried point plug-in and add it to the initial start command of the to-be-buried point application to obtain the target start command of that application, so that the buried point plug-in runs in the back-end system without being perceived when the application is started. A class loading operation is executed when the target start command is used to start the to-be-buried point application, so that the class object files can be processed subsequently. After the class loading operation is finished, the class object files are obtained and the buried point configuration file stored in the specified database is acquired, and a buried point record list is established according to the class object files and the buried point configuration file; the class object files are thereby screened on the basis of the buried point configuration file, which facilitates subsequent targeted point burying. Class definition is then performed according to the buried point record list to obtain the target class object files after point burying, without changing the original code, so the original code is not affected while big data point burying is carried out. When a target class object file is detected to be executed, its buried point data is acquired, thereby realizing non-invasive point burying of the application program.
Drawings
FIG. 1 is a flow chart of a preferred embodiment of a big data embedding method of the present invention.
FIG. 2 is a functional block diagram of a preferred embodiment of a big data point burying device of an application program of the present invention.
FIG. 3 is a schematic structural diagram of a computer device according to a preferred embodiment of the present invention for implementing a big data embedding method for an application program.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
FIG. 1 is a flow chart of a preferred embodiment of a big data embedding method of the application of the present invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
The big data point burying method of the application program is applied to one or more computer devices, where a computer device is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions; its hardware includes, but is not limited to, a microprocessor, an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The computer device may be any electronic product capable of human-computer interaction with a user, for example, a personal computer, a tablet computer, a smart phone, a Personal Digital Assistant (PDA), a game machine, an Internet Protocol Television (IPTV), an intelligent wearable device, and the like.
The computer device may also include a network device and/or a user device. The network device includes, but is not limited to, a single network server, a server group consisting of a plurality of network servers, or a Cloud Computing (Cloud Computing) based Cloud consisting of a large number of hosts or network servers.
The server may be an independent server, or may be a cloud server that provides basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like.
Among them, Artificial Intelligence (AI) is a theory, method, technique and application system that simulates, extends and expands human Intelligence using a digital computer or a machine controlled by a digital computer, senses the environment, acquires knowledge and uses the knowledge to obtain the best result.
The artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technologies, operation/interaction systems, mechatronics, and the like. The artificial intelligence software technology mainly comprises a computer vision technology, a robot technology, a biological recognition technology, a voice processing technology, a natural language processing technology, machine learning/deep learning and the like.
The Network in which the computer device is located includes, but is not limited to, the internet, a wide area Network, a metropolitan area Network, a local area Network, a Virtual Private Network (VPN), and the like.
S10, establishing a buried point plug-in, and adding the buried point plug-in to an initial start command of the to-be-buried point application to obtain a target start command of the to-be-buried point application.
In this embodiment, the buried point plug-in may be a jar file, such as bigdata-buried-point.jar.
In at least one embodiment of the present invention, the adding the buried point plug-in to an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application includes:
acquiring a storage path of the buried point plug-in;
and adding a storage path of the embedded point plug-in unit in the initial starting command to obtain the target starting command.
For example: when the initial start command is of the form java -jar <application>.jar and the established buried point plug-in is bigdata-buried-point.jar, the corresponding target start command is of the form java -javaagent:/xxx/bigdata-buried-point.jar -jar <application>.jar.
Through this implementation, when the to-be-buried point application is started, the buried point plug-in runs in the back-end system without being perceived.
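The following sketch illustrates one way such a plug-in could be packaged, assuming it is implemented as a standard Java agent whose jar manifest declares a Premain-Class entry; the class name and the no-op transformer body are hypothetical, since the patent does not disclose a concrete implementation.

```java
// BuriedPointAgent.java - hypothetical entry point of bigdata-buried-point.jar.
// A standard Java agent exposes a premain method that the JVM invokes before
// the application's own main method when the target start command contains
// the -javaagent option.
import java.lang.instrument.ClassFileTransformer;
import java.lang.instrument.Instrumentation;
import java.security.ProtectionDomain;

public final class BuriedPointAgent {

    public static void premain(String agentArgs, Instrumentation inst) {
        // Register a transformer so every class loaded afterwards can be
        // checked against the buried point configuration (steps S12 to S14).
        inst.addTransformer(new ClassFileTransformer() {
            @Override
            public byte[] transform(ClassLoader loader, String className,
                                    Class<?> classBeingRedefined,
                                    ProtectionDomain protectionDomain,
                                    byte[] classfileBuffer) {
                // Returning null leaves the class bytes unchanged; a real
                // plug-in would rewrite matching classes here.
                return null;
            }
        });
    }

    private BuriedPointAgent() { }
}
```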
And S11, executing class loading operation when the target starting command is used for starting the application of the point to be buried.
Specifically, the executing the class loading operation includes:
starting a virtual machine container corresponding to the to-be-buried point application;
acquiring a storage path of the buried point plug-in from the target starting command;
calling the buried point plug-in from the storage path, and acquiring a file stored in the buried point plug-in;
and loading the file stored in the embedded point plug-in to the memory of the virtual machine container.
For example: the Virtual Machine container may be a JVM (Java Virtual Machine).
By the implementation mode, the class loading can be realized, so that the subsequent processing can be performed on the class object file.
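As a minimal sketch of this step, one could assume the plug-in path is recovered from the -javaagent entry in the JVM's input arguments and the jar's contents are then made loadable by the virtual machine container; the class and method names below are illustrative, and handling of any "=options" suffix on the agent argument is omitted.

```java
import java.lang.instrument.Instrumentation;
import java.lang.management.ManagementFactory;
import java.util.jar.JarFile;

final class PluginLoader {

    /** Loads the files stored in the buried point plug-in into the JVM. */
    static void loadPlugin(Instrumentation inst) throws Exception {
        // Recover the plug-in's storage path from the -javaagent option that
        // was added to the target start command.
        String agentPath = ManagementFactory.getRuntimeMXBean().getInputArguments().stream()
                .filter(arg -> arg.startsWith("-javaagent:"))
                .map(arg -> arg.substring("-javaagent:".length()))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("no -javaagent option found"));

        // Make the plug-in's classes visible to the application class loader,
        // i.e. load the files stored in the plug-in into the container's memory.
        inst.appendToSystemClassLoaderSearch(new JarFile(agentPath));
    }

    private PluginLoader() { }
}
```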
And S12, after the class loading operation is finished, acquiring a class object file and acquiring a buried point configuration file stored in the specified database.
In this embodiment, the class object file is stored in the memory of the virtual machine container.
In this embodiment, the designated database may be any type of database, and the designated database is used to store the data of the burial point requirement of each associated application program, such as the burial point configuration file.
In this embodiment, the buried point configuration file records a buried point position, a buried point function, and the like.
In at least one embodiment of the invention, before the obtaining the buried point configuration file stored in the specified database, the method further comprises:
acquiring a historical buried point record, determining a buried point requirement according to the historical buried point record, and generating a buried point configuration file according to the buried point requirement; and/or
Collecting the embedded point requirements issued by the configuration page, and generating the embedded point configuration file according to the embedded point requirements issued by the configuration page.
The configuration page is used for acquiring the embedded point requirements configured by related workers in real time so as to realize real-time updating of the embedded point requirements.
Through the implementation mode, the configuration file is generated by combining the historical embedded point record and the embedded point requirement issued in real time, so that the embedded point can be embedded in a targeted manner according to the actual requirement.
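The patent does not fix a concrete format for the buried point configuration file; the sketch below assumes a simple properties-style layout in which each buried point file (class) name is mapped to the buried point function that should be invoked, purely as an illustration.

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

final class BuriedPointConfig {

    /**
     * Loads a hypothetical configuration of the form
     *   com.example.order.OrderService = com.example.track.Tracker.record("OrderService");
     * mapping a buried point file name to its buried point function.
     */
    static Map<String, String> load(Path configFile) throws IOException {
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(configFile)) {
            props.load(in);
        }
        Map<String, String> entries = new HashMap<>();
        for (String className : props.stringPropertyNames()) {
            entries.put(className, props.getProperty(className));
        }
        return entries;
    }

    private BuriedPointConfig() { }
}
```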
And S13, establishing a buried point record list according to the class object file and the buried point configuration file.
In at least one embodiment of the present invention, the creating a buried point record list according to the class object file and the buried point configuration file includes:
identifying a buried point file name from the buried point configuration file;
matching in the class object file according to the buried point file name;
determining the file with the file name of the embedded point in the class object file as a to-be-embedded point object;
and generating the buried point record list based on the object to be buried.
Through the implementation mode, the class object files are screened based on the embedded point configuration files, and subsequent pointed embedded points are facilitated.
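A sketch of this screening step might look as follows, with the loaded class names and the configuration map standing in for the class object files and the buried point configuration file; the names are assumptions.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

final class BuriedPointRecords {

    /**
     * Keeps only the class object files whose names appear in the buried point
     * configuration; the result is the buried point record list (the objects
     * to be buried).
     */
    static List<String> build(List<String> loadedClassNames, Map<String, String> config) {
        return loadedClassNames.stream()
                .filter(config::containsKey)
                .collect(Collectors.toList());
    }

    private BuriedPointRecords() { }
}
```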
And S14, performing class definition according to the embedded point record list to obtain a target class object file after embedding points.
Specifically, the class definition according to the embedded point record list, and obtaining the object file of the target class after embedding the point includes:
acquiring a function in the object to be buried;
acquiring a buried point function of the object to be buried from the buried point configuration file;
adding the embedding point function to the front of the function in the object to be embedded to obtain a code block of the object to be embedded;
and determining the code block of the object to be buried as the target class object file.
In the above embodiment, by adding the buried point function before each function of the object to be buried, the point burying logic is executed whenever the object to be buried is called. This is equivalent to modifying the underlying bytecode without changing the original code, so the original code is not affected while big data point burying is performed.
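The patent does not name a bytecode library for this class definition step; the sketch below uses Javassist as one common way to add a buried point function in front of every function of an object to be buried, and the tracking call itself is a hypothetical example.

```java
import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtMethod;

final class BuriedPointWeaver {

    /**
     * Rewrites one object to be buried and returns the resulting bytecode,
     * i.e. the target class object file after point burying.
     *
     * @param className        the buried point file name, e.g. "com.example.order.OrderService"
     * @param buriedPointCall  the buried point function taken from the configuration,
     *                         e.g. "com.example.track.Tracker.record(\"OrderService\");"
     */
    static byte[] weave(String className, String buriedPointCall) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        CtClass ctClass = pool.get(className);

        // Add the buried point function to the front of every function of the
        // object to be buried, without touching its original source code.
        for (CtMethod method : ctClass.getDeclaredMethods()) {
            method.insertBefore(buriedPointCall);
        }

        byte[] rewritten = ctClass.toBytecode();
        ctClass.detach();   // release the CtClass from the pool
        return rewritten;
    }

    private BuriedPointWeaver() { }
}
```

In an agent-based setup, the rewritten bytes would typically be handed back from a transformer such as the one registered in the earlier sketch.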
S15, when the object class object file is detected to be executed, acquiring the buried point data of the object class object file.
Through this implementation, non-invasive point burying of the application program can be realized: the original code is not modified, which effectively saves development cost, and since the logic of the original code is not changed, the operation logic of the whole code is not affected and the risk of abnormality is reduced.
In at least one embodiment of the present invention, after the obtaining of the buried point data of the target class object file, the method further includes:
acquiring the execution times of the target function in a preset time range from the buried point data;
calculating the daily activity of the target function according to the execution times and the preset time range;
when the daily activity is greater than or equal to a configuration threshold, determining target content corresponding to the target function;
and acquiring the associated content of the target content in each pre-configured period, and pushing the associated content.
Wherein, the preset time range can be configured by self-definition, such as 1 month.
Wherein, the configuration threshold can be configured in a self-defining way, such as 4.
Wherein, the configured period may be every day, every two days, etc., and the invention is not limited thereto.
For example: when it is determined from the buried point data that a user has clicked the page of product A 150 times within 1 month, the daily activity is 5, which is greater than the configuration threshold of 4. This indicates that the user frequently clicks and browses product A and is likely to be interested in it, so the associated products of product A are obtained and pushed every day for the user to browse, thereby realizing accurate pushing.
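A minimal sketch of this daily activity check, using the example figures above (150 executions over a 30-day window against a threshold of 4); the method names and the push step are illustrative assumptions.

```java
final class DailyActivityCheck {

    /** Returns executions per day over the preset time range. */
    static double dailyActivity(long executions, int windowDays) {
        return (double) executions / windowDays;
    }

    public static void main(String[] args) {
        double activity = dailyActivity(150, 30);   // 150 clicks in 1 month -> 5.0 per day
        double threshold = 4;                       // configuration threshold
        if (activity >= threshold) {
            // Determine the target content for the target function and push its
            // associated content in each pre-configured period (e.g. daily).
            System.out.println("daily activity " + activity + " >= " + threshold
                    + ": push associated content of product A");
        }
    }
}
```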
Of course, other data analysis can be performed by using the buried point data, which is not described herein.
It should be noted that, in order to further improve the security of the data and avoid malicious tampering, the buried point data may be stored in blockchain nodes.
According to the above technical scheme, the invention can establish a buried point plug-in and add it to the initial start command of the to-be-buried point application to obtain the target start command of that application, so that the buried point plug-in runs in the back-end system without being perceived when the application is started. A class loading operation is executed when the target start command is used to start the to-be-buried point application, so that the class object files can be processed subsequently. After the class loading operation is finished, the class object files are obtained and the buried point configuration file stored in the specified database is acquired, and a buried point record list is established according to the class object files and the buried point configuration file; the class object files are thereby screened on the basis of the buried point configuration file, which facilitates subsequent targeted point burying. Class definition is then performed according to the buried point record list to obtain the target class object files after point burying, without changing the original code, so the original code is not affected while big data point burying is carried out. When a target class object file is detected to be executed, its buried point data is acquired, thereby realizing non-invasive point burying of the application program.
FIG. 2 is a functional block diagram of a preferred embodiment of a big data point burying device of an application program of the present invention. The big data point burying device 11 of the application program comprises an establishing unit 110, an execution unit 111, an obtaining unit 112 and a class definition unit 113. A module/unit referred to in the present invention is a series of computer program segments that can be executed by the processor 13 and that perform a fixed function, and that are stored in the memory 12. In the present embodiment, the functions of the modules/units will be described in detail in the following embodiments.
The establishing unit 110 establishes a buried point plug-in, and adds the buried point plug-in to an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application.
In this embodiment, the buried point plug-in may be a jar file, such as bigdata-buried-point.jar.
In at least one embodiment of the present invention, the establishing unit 110 adding the buried point plug-in to an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application includes:
acquiring a storage path of the buried point plug-in;
and adding a storage path of the embedded point plug-in unit in the initial starting command to obtain the target starting command.
For example: when the initial start command is of the form java -jar <application>.jar and the established buried point plug-in is bigdata-buried-point.jar, the corresponding target start command is of the form java -javaagent:/xxx/bigdata-buried-point.jar -jar <application>.jar.
Through this implementation, when the to-be-buried point application is started, the buried point plug-in runs in the back-end system without being perceived.
The execution unit 111 executes a class loading operation when the target start command is used to start the application to be buried.
Specifically, the executing unit 111 executes the class loading operation, including:
starting a virtual machine container corresponding to the to-be-buried point application;
acquiring a storage path of the buried point plug-in from the target starting command;
calling the buried point plug-in from the storage path, and acquiring a file stored in the buried point plug-in;
and loading the file stored in the embedded point plug-in to the memory of the virtual machine container.
For example: the Virtual Machine container may be a JVM (Java Virtual Machine).
By the implementation mode, the class loading can be realized, so that the subsequent processing can be performed on the class object file.
After the class loading operation is completed, the obtaining unit 112 obtains the class object file and obtains the buried point configuration file stored in the specified database.
In this embodiment, the class object file is stored in the memory of the virtual machine container.
In this embodiment, the designated database may be any type of database, and the designated database is used to store the data of the burial point requirement of each associated application program, such as the burial point configuration file.
In this embodiment, the buried point configuration file records a buried point position, a buried point function, and the like.
In at least one embodiment of the present invention, before the acquisition of the buried point configuration file stored in the designated database, a historical buried point record is acquired, a buried point requirement is determined according to the historical buried point record, and the buried point configuration file is generated according to the buried point requirement; and/or
Collecting the embedded point requirements issued by the configuration page, and generating the embedded point configuration file according to the embedded point requirements issued by the configuration page.
The configuration page is used for acquiring the embedded point requirements configured by related workers in real time so as to realize real-time updating of the embedded point requirements.
Through the implementation mode, the configuration file is generated by combining the historical embedded point record and the embedded point requirement issued in real time, so that the embedded point can be embedded in a targeted manner according to the actual requirement.
The establishing unit 110 establishes a buried point record list according to the class object file and the buried point configuration file.
In at least one embodiment of the present invention, the creating unit 110 creates a buried point record list according to the class object file and the buried point configuration file, including:
identifying a buried point file name from the buried point configuration file;
matching in the class object file according to the buried point file name;
determining the file with the file name of the embedded point in the class object file as a to-be-embedded point object;
and generating the buried point record list based on the object to be buried.
Through the implementation mode, the class object files are screened based on the embedded point configuration files, and subsequent pointed embedded points are facilitated.
The class definition unit 113 performs class definition according to the buried point record list to obtain a target class object file after the buried point is obtained.
Specifically, the class definition unit 113 performs class definition according to the buried point record list, and obtaining the target class object file after the buried point includes:
acquiring a function in the object to be buried;
acquiring a buried point function of the object to be buried from the buried point configuration file;
adding the embedding point function to the front of the function in the object to be embedded to obtain a code block of the object to be embedded;
and determining the code block of the object to be buried as the target class object file.
In the above embodiment, by adding the buried point function before each function of the object to be buried, the point burying logic is executed whenever the object to be buried is called. This is equivalent to modifying the underlying bytecode without changing the original code, so the original code is not affected while big data point burying is performed.
When it is detected that the target class object file is executed, the obtaining unit 112 obtains the buried point data of the target class object file.
Through this implementation, non-invasive point burying of the application program can be realized: the original code is not modified, which effectively saves development cost, and since the logic of the original code is not changed, the operation logic of the whole code is not affected and the risk of abnormality is reduced.
In at least one embodiment of the present invention, after the embedded point data of the target class object file is obtained, the execution times of the target function in a preset time range are obtained from the embedded point data;
calculating the daily activity of the target function according to the execution times and the preset time range;
when the daily activity is greater than or equal to a configuration threshold, determining target content corresponding to the target function;
and acquiring the associated content of the target content in each pre-configured period, and pushing the associated content.
Wherein, the preset time range can be configured by self-definition, such as 1 month.
Wherein, the configuration threshold can be configured in a self-defining way, such as 4.
Wherein, the configured period may be every day, every two days, etc., and the invention is not limited thereto.
For example: when it is determined from the buried point data that a user has clicked the page of product A 150 times within 1 month, the daily activity is 5, which is greater than the configuration threshold of 4. This indicates that the user frequently clicks and browses product A and is likely to be interested in it, so the associated products of product A are obtained and pushed every day for the user to browse, thereby realizing accurate pushing.
Of course, other data analysis can be performed by using the buried point data, which is not described herein.
It should be noted that, in order to further improve the security of the data and avoid malicious tampering, the buried point data may be stored in blockchain nodes.
According to the above technical scheme, the invention can establish a buried point plug-in and add it to the initial start command of the to-be-buried point application to obtain the target start command of that application, so that the buried point plug-in runs in the back-end system without being perceived when the application is started. A class loading operation is executed when the target start command is used to start the to-be-buried point application, so that the class object files can be processed subsequently. After the class loading operation is finished, the class object files are obtained and the buried point configuration file stored in the specified database is acquired, and a buried point record list is established according to the class object files and the buried point configuration file; the class object files are thereby screened on the basis of the buried point configuration file, which facilitates subsequent targeted point burying. Class definition is then performed according to the buried point record list to obtain the target class object files after point burying, without changing the original code, so the original code is not affected while big data point burying is carried out. When a target class object file is detected to be executed, its buried point data is acquired, thereby realizing non-invasive point burying of the application program.
Fig. 3 is a schematic structural diagram of a computer device according to a preferred embodiment of the present invention for implementing a big data embedding method for an application program.
The computer device 1 may comprise a memory 12, a processor 13 and a bus, and may further comprise a computer program, such as a big data point burying program of an application program, stored in the memory 12 and executable on the processor 13.
It will be understood by those skilled in the art that the schematic diagram is merely an example of the computer device 1, and does not constitute a limitation to the computer device 1, the computer device 1 may have a bus-type structure or a star-shaped structure, the computer device 1 may further include more or less other hardware or software than those shown, or different component arrangements, for example, the computer device 1 may further include an input and output device, a network access device, etc.
It should be noted that the computer device 1 is only an example, and other electronic products that are currently available or may come into existence in the future, such as electronic products that can be adapted to the present invention, should also be included in the scope of the present invention, and are included herein by reference.
The memory 12 includes at least one type of readable storage medium, which includes flash memory, removable hard disks, multimedia cards, card-type memory (e.g., SD or DX memory, etc.), magnetic memory, magnetic disks, optical disks, etc. The memory 12 may in some embodiments be an internal storage unit of the computer device 1, for example a removable hard disk of the computer device 1. The memory 12 may also be an external storage device of the computer device 1 in other embodiments, such as a plug-in removable hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), etc. provided on the computer device 1. Further, the memory 12 may also include both an internal storage unit and an external storage device of the computer device 1. The memory 12 can be used not only for storing application software installed in the computer apparatus 1 and various types of data such as codes of a big data buried program of an application program, etc., but also for temporarily storing data that has been output or is to be output.
The processor 13 may be composed of an integrated circuit in some embodiments, for example, a single packaged integrated circuit, or may be composed of a plurality of integrated circuits packaged with the same or different functions, including one or more Central Processing Units (CPUs), microprocessors, digital Processing chips, graphics processors, and combinations of various control chips. The processor 13 is a Control Unit (Control Unit) of the computer device 1, connects various components of the entire computer device 1 by using various interfaces and lines, and executes various functions and processes data of the computer device 1 by running or executing programs or modules (e.g., a big data buried program for executing application programs, etc.) stored in the memory 12 and calling data stored in the memory 12.
The processor 13 executes the operating system of the computer device 1 and various installed application programs. The processor 13 executes the application programs to implement the steps in the above embodiments of the big data point burying method of an application program, such as the steps shown in fig. 1.
Illustratively, the computer program may be divided into one or more modules/units, which are stored in the memory 12 and executed by the processor 13 to accomplish the present invention. The one or more modules/units may be a series of computer readable instruction segments capable of performing certain functions, which are used to describe the execution of the computer program in the computer device 1. For example, the computer program may be divided into a building unit 110, an executing unit 111, an obtaining unit 112, a class defining unit 113.
The integrated unit implemented in the form of a software functional module may be stored in a computer-readable storage medium. The software functional module is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a computer device, or a network device) or a processor (processor) to execute a part of the big data embedding method of the application program according to the embodiments of the present invention.
The integrated modules/units of the computer device 1 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may be implemented by a computer program, which may be stored in a computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method embodiments may be implemented.
Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, U-disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), random-access Memory, or the like.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
The block chain is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. A block chain (Blockchain), which is essentially a decentralized database, is a series of data blocks associated by using a cryptographic method, and each data block contains information of a batch of network transactions, so as to verify the validity (anti-counterfeiting) of the information and generate a next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one line is shown in FIG. 3, but this does not mean only one bus or one type of bus. The bus is arranged to enable connection communication between the memory 12 and at least one processor 13 or the like.
Although not shown, the computer device 1 may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor 13 through a power management device, so that functions of charge management, discharge management, power consumption management and the like are realized through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The computer device 1 may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Further, the computer device 1 may further include a network interface, and optionally, the network interface may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used for establishing a communication connection between the computer device 1 and other computer devices.
Optionally, the computer device 1 may further comprise a user interface, which may be a Display (Display), an input unit, such as a Keyboard (Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable for displaying information processed in the computer device 1 and for displaying a visualized user interface.
It is to be understood that the described embodiments are for purposes of illustration only and that the scope of the appended claims is not limited to such structures.
Fig. 3 shows only the computer device 1 with the components 12-13, and it will be understood by a person skilled in the art that the structure shown in fig. 3 does not constitute a limitation of the computer device 1 and may comprise fewer or more components than shown, or a combination of certain components, or a different arrangement of components.
With reference to fig. 1, the memory 12 of the computer device 1 stores a plurality of instructions to implement a big data point burying method of an application program, and the processor 13 can execute the plurality of instructions to implement:
establishing a buried point plug-in, and adding the buried point plug-in into an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application;
executing class loading operation when the target starting command is used for starting the application of the point to be buried;
after the class loading operation is finished, obtaining a class object file and a buried point configuration file stored in a specified database;
establishing a buried point record list according to the class object file and the buried point configuration file;
carrying out class definition according to the embedded point record list to obtain a target class object file after embedding points;
and when the target class object file is detected to be executed, acquiring buried point data of the target class object file.
Specifically, the processor 13 may refer to the description of the relevant steps in the embodiment corresponding to fig. 1 for a specific implementation method of the instruction, which is not described herein again.
In the embodiments provided in the present invention, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The invention is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof.
The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference signs in the claims shall not be construed as limiting the claim concerned.
Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. A plurality of units or means recited in the present invention may also be implemented by one unit or means through software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. A big data embedding method of an application program is characterized by comprising the following steps:
establishing a buried point plug-in, and adding the buried point plug-in into an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application;
executing class loading operation when the target starting command is used for starting the application of the point to be buried;
after the class loading operation is finished, obtaining a class object file and a buried point configuration file stored in a specified database;
establishing a buried point record list according to the class object file and the buried point configuration file;
carrying out class definition according to the embedded point record list to obtain a target class object file after embedding points;
and when the target class object file is detected to be executed, acquiring buried point data of the target class object file.
2. The big data embedding method of the application program according to claim 1, wherein the adding the embedding point plug-in to an initial start command of the application to be embedded to obtain a target start command of the application to be embedded comprises:
acquiring a storage path of the buried point plug-in;
and adding a storage path of the embedded point plug-in unit in the initial starting command to obtain the target starting command.
3. The big data embedding method of an application program of claim 2, wherein said performing a class loading operation comprises:
starting a virtual machine container corresponding to the to-be-buried point application;
acquiring a storage path of the buried point plug-in from the target starting command;
calling the buried point plug-in from the storage path, and acquiring a file stored in the buried point plug-in;
and loading the file stored in the embedded point plug-in to the memory of the virtual machine container.
4. The big data burial method of application program as claimed in claim 1, wherein before the obtaining of the burial configuration file stored in the specified database, the method further comprises:
acquiring a historical buried point record, determining a buried point requirement according to the historical buried point record, and generating a buried point configuration file according to the buried point requirement; and/or
Collecting the embedded point requirements issued by the configuration page, and generating the embedded point configuration file according to the embedded point requirements issued by the configuration page.
5. The method of claim 1, wherein the creating a list of buried point records according to the class object file and the buried point configuration file comprises:
identifying a buried point file name from the buried point configuration file;
matching in the class object file according to the buried point file name;
determining the file with the file name of the embedded point in the class object file as a to-be-embedded point object;
and generating the buried point record list based on the object to be buried.
6. The big data embedding method of the application program as claimed in claim 5, wherein said performing class definition according to the embedding point record list, and obtaining the object class file after embedding point comprises:
acquiring a function in the object to be buried;
acquiring a buried point function of the object to be buried from the buried point configuration file;
adding the embedding point function to the front of the function in the object to be embedded to obtain a code block of the object to be embedded;
and determining the code block of the object to be buried as the target class object file.
7. The big data embedding method of the application program according to claim 1, wherein after the obtaining of the embedding data of the target class object file, the method further comprises:
acquiring the execution times of the target function in a preset time range from the buried point data;
calculating the daily activity of the target function according to the execution times and the preset time range;
when the daily activity is greater than or equal to a configuration threshold, determining target content corresponding to the target function;
and acquiring the associated content of the target content in each pre-configured period, and pushing the associated content.
8. A big data embedding device of an application program, which is characterized by comprising:
an establishing unit, which is used for establishing a buried point plug-in and adding the buried point plug-in into an initial start command of a to-be-buried point application to obtain a target start command of the to-be-buried point application;
the execution unit is used for executing class loading operation when the target starting command is used for starting the application of the point to be buried;
the acquisition unit is used for acquiring the class object file and acquiring the buried point configuration file stored in the specified database after the class loading operation is executed;
the establishing unit is further used for establishing a buried point record list according to the class object file and the buried point configuration file;
the class definition unit is used for carrying out class definition according to the embedded point record list to obtain a target class object file after embedding points;
the obtaining unit is further configured to obtain the buried point data of the target class object file when it is detected that the target class object file is executed.
9. A computer device, characterized in that the computer device comprises:
a memory storing at least one instruction; and
a processor executing instructions stored in the memory to implement the big data embedding method of an application program according to any one of claims 1 to 7.
10. A computer-readable storage medium characterized by: the computer-readable storage medium has stored therein at least one instruction that is executed by a processor in a computer device to implement the big data embedding method of an application program according to any one of claims 1 to 7.
CN202111440057.XA 2021-11-30 2021-11-30 Big data point burying method, device, equipment and medium for application program Pending CN114185776A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111440057.XA CN114185776A (en) 2021-11-30 2021-11-30 Big data point burying method, device, equipment and medium for application program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111440057.XA CN114185776A (en) 2021-11-30 2021-11-30 Big data point burying method, device, equipment and medium for application program

Publications (1)

Publication Number Publication Date
CN114185776A true CN114185776A (en) 2022-03-15

Family

ID=80602996

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111440057.XA Pending CN114185776A (en) 2021-11-30 2021-11-30 Big data point burying method, device, equipment and medium for application program

Country Status (1)

Country Link
CN (1) CN114185776A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114860600A (en) * 2022-05-12 2022-08-05 平安科技(深圳)有限公司 Visual data point burying method, device, equipment and storage medium
CN114860600B (en) * 2022-05-12 2024-05-28 平安科技(深圳)有限公司 Visual data embedded point method, device, equipment and storage medium
CN115134352A (en) * 2022-06-27 2022-09-30 重庆长安汽车股份有限公司 Method, device, equipment and medium for uploading data of buried points
CN115134352B (en) * 2022-06-27 2023-06-20 重庆长安汽车股份有限公司 Buried point data uploading method, device, equipment and medium
CN118642781A (en) * 2024-08-14 2024-09-13 杭州新中大科技股份有限公司 Method, device, equipment and storage medium for quickly starting application program

Similar Documents

Publication Publication Date Title
CN114185776A (en) Big data point burying method, device, equipment and medium for application program
CN113806434B (en) Big data processing method, device, equipment and medium
CN111949708A (en) Multi-task prediction method, device, equipment and medium based on time sequence feature extraction
CN115081538A (en) Customer relationship identification method, device, equipment and medium based on machine learning
CN111950621A (en) Target data detection method, device, equipment and medium based on artificial intelligence
CN113704665A (en) Dynamic service publishing method, device, electronic equipment and storage medium
CN113886204A (en) User behavior data collection method and device, electronic equipment and readable storage medium
CN115964307A (en) Transaction data automatic testing method, device, equipment and medium
CN114547696A (en) File desensitization method and device, electronic equipment and storage medium
CN111950707B (en) Behavior prediction method, device, equipment and medium based on behavior co-occurrence network
CN114169303A (en) Method, device, equipment and medium for editing table based on vue.js
CN114816371B (en) Message processing method, device, equipment and medium
CN111985545A (en) Target data detection method, device, equipment and medium based on artificial intelligence
CN114860349B (en) Data loading method, device, equipment and medium
CN115357666A (en) Abnormal business behavior identification method and device, electronic equipment and storage medium
CN114662005A (en) Message pushing method, device, equipment and storage medium for user behavior track
CN114911479A (en) Interface generation method, device, equipment and storage medium based on configuration
CN115101152A (en) Sample priority switching method, device, equipment and medium
CN115310979A (en) Data payment method and device, electronic equipment and storage medium
CN115934576B (en) Test case generation method, device, equipment and medium in transaction scene
CN115543214B (en) Data storage method, device, equipment and medium in low-delay scene
CN114818656B (en) Binary file analysis method, device, equipment and medium based on gray scale upgrading
CN113434365B (en) Data characteristic monitoring method and device, electronic equipment and storage medium
CN114139199A (en) Data desensitization method, apparatus, device and medium
CN114911464A (en) Code generation method, device and equipment based on domain drive and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination